Apologetic Apple promises to do better after Siri privacy snafu

Apple said it will no longer retain audio recordings of interactions with Siri by default following a company review into privacy protections around the virtual assistant.

The technology giant said it would instead ask users to opt in to any use of audio samples when it resumes the programme later this year, and would no longer use contractors to listen to such recordings, restricting that work to Apple staff.

The announcement comes after a number of reports raised privacy concerns about recordings of user conversations with a range of voice-based assistants — including those from Amazon, Google and Microsoft — being listened to and analysed by human reviewers.

The technology firms each said the reviews were done as part of programmes to grade and analyse the quality of their voice and language recognition services, with the aim of improving their performance.

In a statement, Apple said: “At Apple, we believe privacy is a fundamental human right. We design our products to protect users’ personal data, and we are constantly working to strengthen those protections. This is true for our services as well. Our goal with Siri, the pioneering intelligent assistant, is to provide the best experience for our customers while vigilantly protecting their privacy.

“We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies. We’ve decided to make some changes to Siri as a result.”

Sorry

The tech giant apologised to users and confirmed that while it would no longer keep audio recordings by default, it would use “computer-generated transcripts” of interactions to help with its grading programme.

“As a result of our review, we realise we haven’t been fully living up to our high ideals, and for that we apologise,” the company said.

“As we previously announced, we halted the Siri grading programme. We plan to resume later this fall when software updates are released to our users — but only after making the following changes: first, by default, we will no longer retain audio recordings of Siri interactions. We will continue to use computer-generated transcripts to help Siri improve.

“Second, users will be able to opt in to help Siri improve by learning from the audio samples of their requests. We hope that many people will choose to help Siri get better, knowing that Apple respects their data and has strong privacy controls in place. Those who choose to participate will be able to opt out at any time.

“Third, when customers opt in, only Apple employees will be allowed to listen to audio samples of the Siri interactions. Our team will work to delete any recording which is determined to be an inadvertent trigger of Siri.”

The company added that it was “committed to putting the customer at the centre” of everything it does, including protecting user privacy.

Other tech firms facing the same privacy concerns have taken a similar approach to Apple: Google has also paused human review of audio recordings, while Amazon has confirmed it will let users opt out of having their audio reviewed by humans.

Facebook has also paused human reviews after acknowledging they had been used to analyse samples of recordings from the transcription feature within its Messenger app.

Source: techcentral.co.za