How to use Apple’s new Siri privacy controls in iOS 13.2
When we learned that teams of humans were listening in on people’s private conversations with Siri, we were shocked. Apple moved quickly to abandon the practice, and in iOS 13.2 it is providing users with new tools to better control who listens to what we tell our voice assistant.
Here is what you need to know:
Introducing Apple’s Siri privacy controls
When you first launch iOS 13.2 you’ll be presented with a new system message that gives you the chance to opt out of sharing recordings of your Siri requests with Apple.
It is fair to note that these recordings are, Apple claims, “not associated with your Apple ID and will only be stored for a limited period.”
However, when news of the grading process broke, we learned that at times Siri picks up personal information that people may not want to share.
The screen urges users to:
“Help improve Siri and Dictation by allowing Apple to store and review audio of your Siri and Dictation interactions on this iPhone and on any connected Apple Watch or HomePod. You can change this later in the settings for each device.”
You can decline to share your recordings at this stage.
If you agree to share but subsequently change your mind, there are now two Settings you should review:
The privacy tool
- Apple has placed one Siri tool in Settings>Privacy>Analytics & Improvements.
- Scroll down the list to find the item Improve Siri & Dictation.
- This is usually enabled by default.
The description of this feature urges you to:
“Help improve Siri and Dictation by allowing Apple to store and review audio of your Siri and Dictation interactions.”
What this means is that small anonymized snippets of conversation recorded by Siri will be kept by Apple for a limited period and shared with human workers for analysis and review. The team is aiming to help Siri become more accurate in its responses.
This may be fine for some users; others may feel somewhat uncomfortable, given that Siri sometimes activates accidentally and picks up conversations people never intended to share.
The search tool
Siri also searches for things for you. When it does, it retains recordings in the same way as described above, and these may also be sampled and heard by human operators. You can now delete this information from Apple’s servers:
Open Settings>Siri & Search
Tap Delete Siri & Dictation History
The description reads:
“Delete Siri & Dictation interactions currently associated with this iPhone from Apple servers. Data that has been sampled to help improve Siri and Dictation is no longer associated with this iPhone and will not be deleted.”
Apple has said it is applying more stringent controls over what human operators can access in terms of conversational snippets, is bringing the grading process in-house, and will otherwise limit the data reviewers can access.
Overall, these decisions do a better job of matching Apple’s public stance on privacy, including Apple CEO Tim Cook’s warning that tech is marching us into a surveillance state.
I’d urge Apple to also find some way to make it possible for users to review any data the company has that relates to their device.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.