12 Jul, 2019 01:45

Outsourced spying: Google admits ‘language experts’ listen to ‘some’ assistant recordings

Google’s smart speakers are recording users when they least expect it, according to the temp-worker “language experts” the company hires to listen to the snippets – which include some of users’ most private moments.

Google is able to claim it does not listen to the recordings Google Home devices are constantly generating only because it contracts the job out to temp workers. These “language experts,” as they are called, use a collaborative system built by the company to share and analyze sound snippets, helping Google’s AI assistant decipher the nuances of human speech.

While Google emphasizes that it anonymizes the snippets, replacing the user’s name with a serial number, Belgian broadcaster VRT found that matching a voice snippet with its owner was not very difficult, given the ample supply of addresses and other sensitive information audible in the recordings. VRT listened to over 1,000 excerpts supplied by a Dutch contractor and discovered that more than 15 percent of them – 153 recordings in all – were made without the user’s knowledge.

In one “accidental” recording, a woman was in “definite distress,” the temp said. Other snippets included sex and pillow talk, fights, and professional phone calls packed with private information. While employees are instructed to treat account numbers and passwords as “sensitive,” they’re left to their own devices everywhere else, leading to potential errors in judgment…like leaking to the media, according to Google, which condemned the contractor who spoke to VRT while fiercely defending its own practices.

Insisting that Google has safeguards in place to prevent “false accepts” – recordings initiated without the user’s knowledge – Google Search product manager David Monsees wrote in a Thursday blog post that the use of “language experts” is “necessary to creating products like the Google Assistant” and claimed the experts only review 0.2 percent of audio fragments recorded by the device. Monsees warned the leaker that “Security and Privacy Response teams have been activated on the issue, are investigating, and …will take action.”

When Bloomberg reported that Amazon’s Alexa was using thousands of humans to transcribe and annotate recordings, many made without the user’s knowledge, Google gloated that its superior assistant anonymized and distorted audio snippets. However, the recordings VRT heard were not distorted at all: Google Home snippets were “clear,” and Google Assistant, the cellphone app version, produced “telephone quality” audio.

Google Home owners who expect the company to respect their privacy might be wise to consult the history of a company whose founders have made their distaste for the concept of privacy abundantly clear: an attempt to set up an AI “ethics council” lasted less than a week before collapsing, and a study published earlier this week showed that over 1,000 apps for Google's Android operating system collect data even when users deny them permission to do so. Earlier this year, users of the company's Nest Secure home security system discovered the device had a hidden microphone when a downloadable update activated the feature.
