WWDC: Apple Voice Control – what is it and what does it do?
Voice-based computing took a major step forward at WWDC 2019, when Apple announced a greatly enhanced, platform-wide Voice Control feature that will genuinely transform lives.
What is Voice Control?
Limited Voice Control has been available for some time as an option in Settings > General > Accessibility > Home/Side Button, but iOS 13 and macOS Catalina make it much more effective, introducing Siri dictation and support for label-, voice- and field-based controls.
This smarter implementation of Voice Control realises something that has long been a dream for many users.
While others have tried to achieve this, Apple appears to have reached the goal, which is little surprise given the company’s long track record of enabling people with system-level tools of this kind.
Voice Control lets you consistently control almost everything you do on your Mac, iPad and iPhone using just your voice. You can select items, launch apps, select controls within apps and trigger actions (send a message, save a document and so on), all using a set of dictation commands, UI labels and a grid overlay for items that can’t be reached any other way.
It’s clear, fast and fluid in operation, and while the version I saw was a little buggy, it still looks in good shape for the full launch in September.
I have little doubt Public Beta testers will want to make use of it when that beta ships in July.
What must developers do to support it?
Introduced at WWDC 2019, Voice Control drew a huge cheer from developers, who recognized that, as well as being a huge step for accessibility, it also means they must double down on using Apple’s built-in accessibility developer tools.
That means making sure user interface elements are correctly and descriptively labeled, and spending time figuring out how VoiceOver users will navigate, articulate and use those elements.
The power of Apple’s system is that it will still be possible to use apps that haven’t done that work — but it also means it will become that much easier to identify those apps that do put accessibility first.
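To give a rough sense of what that labeling work looks like, here’s a minimal UIKit sketch. The view controller, button and label strings are hypothetical examples, but isAccessibilityElement, accessibilityLabel and accessibilityHint are the standard UIAccessibility properties developers already use for VoiceOver:

```swift
import UIKit

class ComposeViewController: UIViewController {
    // Hypothetical button, used only to illustrate descriptive accessibility markup.
    let sendButton = UIButton(type: .system)

    override func viewDidLoad() {
        super.viewDidLoad()

        sendButton.setImage(UIImage(systemName: "paperplane"), for: .normal)

        // Expose the element to assistive tech and give it a clear, spoken name.
        sendButton.isAccessibilityElement = true
        sendButton.accessibilityLabel = "Send message"
        sendButton.accessibilityHint = "Sends the message you have composed"

        view.addSubview(sendButton)
    }
}
```

An icon-only button like this one is exactly the kind of element that is meaningless to voice-driven navigation until it carries a descriptive label.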
https://twitter.com/iliketweeting01/status/1135620645126049793?s=20
What platforms have it?
Voice Control will be available on Macs running macOS Catalina and on iPads and iPhones running iOS 13.
You won’t need to spend more cash on software or equipment to use it.
Gone are the days when the accessibility market meant the people who most needed access to tech had to spend even more money on their kit than anyone else – at least on Apple’s platforms.
How does it work?
Voice Control works using Siri dictation, on-device AI and a combination of grid- and label-based interface elements (all with support for VoiceOver), plus tools that let you interact with clickable and other on-screen items by voice.
New labels and grids let users interact with virtually any app using comprehensive navigation tools, with audio processing happening on-device.
The idea is that you get a set of standard commands (such as ‘Open’ or ‘Close’), supplemented by contextual controls that let you navigate more complex application behaviors.
Of course, not every app works alike.
Apple’s systems automatically number UI elements in apps, so you can invoke those elements by number or name, for example. When labels for in-app elements don’t exist, you’ll be able to use on-screen grids to navigate those items – you can also zoom in and out of apps and select items to drag and drop them between apps.
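On the developer side, iOS 13 also lets apps supply the names a Voice Control user can speak for an element. A minimal sketch follows; the button and the label strings are hypothetical, while accessibilityUserInputLabels is the UIKit property involved:

```swift
import UIKit

// Sketch: giving Voice Control users natural spoken names for a control (iOS 13+).
let newDocumentButton = UIButton(type: .system)
newDocumentButton.setTitle("+", for: .normal)

// The label VoiceOver reads aloud.
newDocumentButton.accessibilityLabel = "New document"

// Alternative names a Voice Control user can say, e.g. "Tap New" or "Tap Create document".
newDocumentButton.accessibilityUserInputLabels = ["New", "New document", "Create document"]
```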
The end result?
As the following video shows, you can launch apps, compose and despatch messages, make calls, design documents – you can do it all. You can also use your voice to initiate things like long presses, swipes and gestures.
https://youtu.be/v72nu602WXU
Is Voice Control private?
Voice Control is private. That’s because the system relies on on-device Siri language processing, which means none of your instructions are ever shared with Apple, or anyone else – there are no Voice Control servers for anyone to hack in search of people’s bank account authorization codes, for example.
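Voice Control itself is a system feature rather than an API developers call, but Apple exposes the same on-device-first approach to apps through the Speech framework in iOS 13. Here is a minimal sketch under that assumption; the function name and audio URL are hypothetical, and authorization handling is omitted for brevity:

```swift
import Speech

// Sketch: on-device speech recognition with the Speech framework (iOS 13+).
// Real code must first call SFSpeechRecognizer.requestAuthorization(_:).
func transcribeOnDevice(audioFileURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition is unavailable for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioFileURL)
    request.requiresOnDeviceRecognition = true  // keep the audio off Apple's servers

    // In real code, keep a reference to the returned task so it can be cancelled.
    _ = recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print(result.bestTranscription.formattedString)
        }
    }
}
```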
This is yet another illustration of the importance and capability of Apple’s on-device AI focus.
Final thoughts
Voice Control this effective will transform the way we interact with computers, giving many people a far more intuitive way to use Apple tech. For some, it will be life-changing.
Not only that, it also shows how future computer interfaces can be unleashed from the device and accessed in powerful new ways, with voice becoming the primary user interface.
Please follow me on Twitter, or join me in the AppleHolic’s bar & grill and Apple Discussions groups on MeWe.
Dear reader, this is just to let you know that as an Amazon Associate I earn from qualifying purchases.