Put this together with Recall and Microsoft is giving a tell on where they're headed with AI. I'd say this will be a linchpin in the surveillance state to come, and the wider computer industry is moving the same direction, shipping new machines with dedicated AI chips. There's an interesting phenomenon here: people are getting even lazier about critical thinking, while the falsehoods AI tells get softened by labeling them "hallucinations". When so much of what is published is propaganda and lies, then until AI develops true intelligence capable of weeding out those lies, it's as worthless as the megacorp propaganda media it's trained on. Consequently, it becomes a new avenue for distributing propaganda messaging, as well as for singling out those who resist.
Exclusive: Gathering of face and voice data went unnoticed for one month after it was automatically enabled for video conferencing app users in March
By Josh Taylor
The New South Wales education department was caught by surprise when Microsoft began collecting the voice and facial biometric data of school students using the Teams video conferencing app in March.
Late last year, Microsoft announced it would enable data collection by default, commencing in March, for a Teams feature known as voice and face enrolment.
Voice and face enrolment in Teams creates a voice and face “profile” for each participant in Teams meetings, which the company said improves the audio quality, reduces background noise and enables the software to tell who is speaking in meetings by recognising their voice and face.
The data is also fed into Microsoft's Copilot AI tool to improve the accuracy of transcriptions and summaries when those features are enabled in meetings.
The NSW education department website states Teams is used by schools as “a hub for teachers and students to engage, create, interact, and collaborate”.
“It’s a one-stop communication platform that combines chat, video meetings/lessons, file storage, assignments and integration of multiple applications,” the website states.
Guardian Australia can reveal that when voice and face enrolment for Teams was switched on in March, the department remained unaware of it for a month.
“A new Microsoft Teams feature that allowed voice and facial enrolment for people entering Teams meetings was quickly disabled across our network, and any face or voice recognition profiles that were created have been removed,” a spokesperson for the education department said.
The feature was switched off in April and the profiles were deleted within 24 hours of the department becoming aware that voice and facial enrolment was enabled.
The education department did not answer questions about how many students or staff had biometric data collected while the feature was available, or whether those affected had been informed.
One parent who alerted Guardian Australia to the matter said that, despite the department's reassurances that the data had been deleted and the feature switched off, other parents may not have been aware it had been collected in the first place.
Microsoft retains a copy of the data while a user is enrolled, and users can choose to delete their profile at any point. If a user deletes their Teams account, Microsoft states on its website that the biometric data is deleted within 90 days.
Rys Farthing, the director of policy and research at the research organisation Reset Tech Australia, described the collection of biometric data of children as “a real worry”.
“That young people’s biometric data was unnecessarily collected creates real concerns – those students now have a lifetime to live with those risks,” Farthing said.
“Was this data used to train their AI after it was collected? Are we sure it wasn’t disclosed or shared while it existed, and that all copies of it have been deleted? Data is like toothpaste, it’s hard to put it back in the tube once it’s been collected.
“This just shows why we need stronger protections around children’s data, especially around preventing excessive collection. It’s worrying stuff.”
Microsoft declined to comment.