Watch Apple’s Siri blast through requests with on-device processing


During WWDC’s privacy segment, Apple talked about moving Siri’s processing from the cloud to your device, using the “Neural Engine” built into Apple silicon. While it’s obviously better for privacy to run speech processing on your phone rather than on one of Apple’s servers, it can also improve speed and reliability, as Apple showed in its demo.
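Apple doesn’t expose Siri’s internals, but the same shift toward on-device speech recognition is visible in its public Speech framework, which lets third-party apps ask for recognition that never leaves the phone. Here’s a minimal, illustrative Swift sketch (the function name and audio file are hypothetical; the SFSpeechRecognizer APIs are real):

```swift
import Speech

// A minimal sketch, not Siri's actual pipeline: Apple's public Speech framework
// exposes the same kind of on-device recognition to third-party apps.
// (In a real app you would first call SFSpeechRecognizer.requestAuthorization.)
func transcribeOnDevice(audioURL: URL) {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else {
        print("On-device recognition isn't available for this locale or device")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    // Keep audio and transcription entirely on the device: no network round trip.
    request.requiresOnDeviceRecognition = true

    _ = recognizer.recognitionTask(with: request) { result, error in
        if let result = result, result.isFinal {
            print("Transcript:", result.bestTranscription.formattedString)
        } else if let error = error {
            print("Recognition failed:", error.localizedDescription)
        }
    }
}
```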


The power of on-device learning.

Now let’s see how fast it is when I try.

Compared to my demo, Apple’s is definitely snappier, partly because Apple doesn’t have to toggle Airplane Mode off and on for every request the way I do. (My phone still needs an internet connection for the requests that come afterward, but the on-device model doesn’t.) Full disclosure: my demo took a few tries. The first few times, the phone warned me that turning on Airplane Mode would make Siri unavailable, and I had to tap the Airplane Mode switch myself, since I couldn’t flip it with my voice.

Apple processing Siri requests on the device should help its users feel more confident in the privacy of their data: in 2019, we found out that contractors listened to some Siri requests, something that wouldn’t happen if those requests were handled on your phone alone. While Apple eventually tried to rectify that situation by being more transparent and making the review of Siri recordings opt-in, handling more Siri requests on the phone is a good way to make the service a little more reliable.