
Meta has lots of voice assistant ideas but zip for biometric privacy


Can Meta Platforms get involved in speech recognition ethically — that is, while protecting the privacy of its subscribers?


When Meta CEO Mark Zuckerberg donned a 2001-grade avatar last week to make three speech-centered announcements related to the metaverse, noticeably absent was any mention of biometric privacy.


Even though it can be argued that Zuckerberg created Facebook parent Meta in part to diffuse the heat the social media platform was feeling for its data practices, he had nothing to say about privacy in Meta’s announcements.


Specifically, the company says it has created a new tool, BuilderBot, that will enable someone in Horizon, Meta’s metaverse, to import and create digital objects and features using only their voice. The voice assistant may not include a biometric authentication feature, but, if successful, it would rapidly accumulate a database of speech data.

And an AI model for chatting with virtual assistants, called Project CAIRaoke, aims to enable natural conversations between assistants and people.


Its engineers are also working on software to translate all written languages and, separately, a universal speech translator for instant spoken-word translation. (This last announcement is too speculative even for Meta to publish a marketing page about it.)

It is interesting to note that Zuckerberg showcases those designs on the same page with an ingratiating sop about how Meta has developed materials to educate people about the “many AI models that comprise an AI system.” Not Meta’s AI systems; just categories of code.


Yet nothing is said about biometric privacy at Meta.


A Recode article covering Zuckerberg’s announcement notes that breaches and privacy failures by Meta (including the Cambridge Analytica scandal) have been seen as especially egregious because Facebook’s entire business model is the harvesting of personal data.

Putting people in VR goggles opens a new chapter in biometric data collection, analysis and sales. Then subscribers create their own make-believe world — certainly the stuff of psychiatrists’ dreams. And, ultimately, people interact in that world with other people, imaginary obstacles and situations.


Yet, no assurances that any effort is being expended to safeguard the biggest cache of volunteered biometric information since Facebook itself?

When it comes to a 600-pound digital gorilla in the room, erasing it might be a better strategy than ignoring it.


By Jim Nash
