A new post in Apple’s Machine Learning Journal details the firm’s work on its signature digital assistant — and specifically, its “Hey Siri” functionality.
For one, the entry (Vol. 1, Issue 9) claims that Apple engineers chose “Hey Siri” as an activation cue because of its natural phrasing. The team wrote that the phrase was “so natural that even before this feature was introduced, users would invoke Siri using the home button and inadvertently prepend their requests with the words, ‘Hey Siri.’”
Even so, the team describes several ways Siri could be activated accidentally and how it worked to mitigate those annoyances. Using speaker recognition techniques, they said they tweaked Siri to focus on “who is speaking” rather than “what was spoken.” In other words, this is why “Hey Siri” is personalized to your own voice.
“The overall goal of speaker recognition (SR) is to ascertain the identity of a person using his or her voice,” the entry reads. The team says that it’s continuing to perfect the system’s way of dealing with false activations.
One of those methods is something that the team calls “user enrollment.” This breaks down into two categories: explicit enrollment and implicit enrollment.
- Explicit enrollment is, basically, the Hey Siri setup process — in which users are required to speak variations of the Hey Siri phrase several times.
- Implicit enrollment, on the other hand, is a speaker profile created and refined over a period of time, based on user interactions with Siri.
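The two enrollment styles can be illustrated with a minimal speaker-verification sketch. This is not Apple’s implementation; it assumes a hypothetical `embed` function that maps an utterance to a fixed-length voice vector, and the function names, threshold, and update weight are all invented for illustration.

```python
import math

def embed(utterance):
    """Hypothetical stand-in for a speaker-embedding model: maps a
    recording of "Hey Siri" to a fixed-length vector. Here the input
    is assumed to already be such a vector."""
    return utterance

def cosine_similarity(a, b):
    """How alike two voice vectors are (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def enroll(utterances):
    """Explicit enrollment: average the embeddings of the phrases the
    user speaks during setup into a single speaker profile."""
    vectors = [embed(u) for u in utterances]
    dim = len(vectors[0])
    return [sum(v[i] for v in vectors) / len(vectors) for i in range(dim)]

def update_profile(profile, utterance, weight=0.05):
    """Implicit enrollment: nudge the profile slightly toward each
    accepted utterance, refining it over time."""
    v = embed(utterance)
    return [(1 - weight) * p + weight * x for p, x in zip(profile, v)]

def accepts(profile, utterance, threshold=0.7):
    """Fire the trigger only if the speaker matches the profile,
    i.e. "who is speaking" rather than "what was spoken"."""
    return cosine_similarity(profile, embed(utterance)) >= threshold
```

In this toy model, a voice close to the enrolled profile passes the check while a very different one is rejected, which is the behavior that cuts down on accidental activations by other speakers.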
Of course, the team acknowledges that there’s still a lot of work to be done. Specifically, they pointed out that improving performance in large rooms and noisy environments remains one of their biggest challenges.
“Hey Siri” was first introduced as a feature in iOS 8 for the iPhone 6 family of devices. At the time, it could only be used while the handset was charging. Now, thanks to low-power processors that continuously listen for the trigger phrase, Hey Siri is effectively an “always-on” feature.
A previous post (Vol. 1, Issue 6) published on the platform back in October 2017 described how the “Hey Siri” functionality actually works in terms of picking up and detecting the trigger phrase.
Apple’s Machine Learning Journal is an online platform in which the company’s A.I. teams record and publish their research and work. Launched last summer, the journal has been populated by a handful of posts on complex machine-learning topics including facial detection, autonomous vehicle systems, and other work on Siri.