Apple is working to improve Siri by researching how the digital assistant can understand atypical speech.
The Wall Street Journal reports that voice assistant companies are aiming to better understand people who stutter. Apple has used 28,000 audio clips, collected from podcasts featuring speakers with atypical speech, to train Siri. An Apple representative says the collected data will be used to improve voice recognition for atypical speech patterns.
Apple has also added a ‘Hold to Talk’ Siri feature, which lets users control how long Siri listens to a command, so the assistant does not cut off people who stutter mid-sentence.
Siri can also ‘listen’ to commands entered via the keyboard through the ‘Type to Siri’ feature, introduced in iOS 11.
The Cupertino-based company intends to publish a more detailed report this week on how it plans to address the challenge.