Apple to tell Oireachtas committee about Siri reviews of users

Privacy fears to fore as tech giant to say analyses done on small subset of audio

Reviews of audio recordings captured by voice assistant Siri on phones, laptops and other Apple devices are only carried out on a “very small subset” of samples, Apple will tell an Oireachtas committee on Tuesday.

The company this year apologised for allowing contractors to listen to voice recordings of Siri users in order to grade them.

According to multiple former graders, recordings made accidentally by Siri were regularly sent for review, including those containing confidential information, illegal acts and even Siri users having sex.

Representatives from Apple will appear before the Oireachtas Committee on Communications on Tuesday to update TDs on privacy concerns.

“To improve voice assistants such as Siri, there is a need for human review of a very small sample of audio interactions. This helps to ensure that Siri understands users’ questions and provides the right answer,” the company’s opening statement, seen by The Irish Times, says.

‘Random identifier’

The company will say that “human review of audio samples has always been conducted on a very small subset of audio samples from Siri requests”.

“And the people reviewing the audio samples are not shown an Apple ID, phone number or email. All Siri requests are associated with a random identifier.”

The company says it no longer retains audio recordings of Siri interactions and that users can choose if they want to share their audio. It says that “only Apple employees, not contractors, will be allowed to listen to audio samples of Siri interactions for the limited review”.

The company also says it will “work to delete any recording which is determined to be the result of an inadvertent trigger of Siri”.

Users can also delete their dictation history.

Google Assistant

Google representatives will also appear before the committee. Google recently introduced a new policy under which users of Google Assistant, its home voice assistant, must opt in if they want to have their voice recorded or reviewed by humans. Google will tell the committee it is now automatically deleting more audio data in a bid to address privacy concerns.

Committee chair Hildegarde Naughton said there was growing concern about the recording of voices by devices.

“If the microphone is always on in our homes, what data is being stored and who can listen to that data or who is it being shared with? What profiles are being created from the data collected by these devices and who is this information being shared with?

“Visitors that come into the home or unsuspecting children may also have their data collected. We are all vulnerable when we do not know what is happening to our own data. We are exploring this issue in terms of data protection and whether stronger transparency and legislation is needed in this area.”