The iPhone 16 Will Get a Better Mic for Siri’s New AI Powers
Credit: Miguel Tomás / Unsplash
After all, even if Siri suddenly gets a thousand times smarter and more capable, there’s only so much the voice assistant can do if it can’t actually hear what you’re saying.
According to analyst Ming-Chi Kuo, Apple recognizes this and plans to address it with a new microphone system in the iPhone 16 lineup. In a post on Medium, Kuo claims that “all iPhone 16 models will feature a significant upgrade in microphone specifications” that should “improve the Siri experience significantly.”
The key upgrade, Kuo says, is a better signal-to-noise ratio (SNR) that will help the iPhone more accurately pick up what the user is saying, even in noisy environments.
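For readers unfamiliar with the term, SNR is simply the ratio of the power of the sound you want (the user's voice) to the power of everything else the mic picks up, usually expressed in decibels. The short Python sketch below is purely illustrative — the tone and noise values are made up, and this is in no way Apple's implementation; it just shows how the figure is computed and why a higher number means a cleaner voice capture for Siri to work with.

```python
import numpy as np

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio in decibels: 10 * log10(P_signal / P_noise)."""
    signal_power = np.mean(signal**2)  # average power of the wanted audio
    noise_power = np.mean(noise**2)    # average power of the background
    return 10 * np.log10(signal_power / noise_power)

# Toy example: a pure 440 Hz "voice" tone alongside random background noise.
rng = np.random.default_rng(0)
t = np.linspace(0, 1, 16_000, endpoint=False)  # 1 second sampled at 16 kHz
voice = 0.5 * np.sin(2 * np.pi * 440 * t)
noise = 0.25 * rng.standard_normal(t.size)

print(f"SNR: {snr_db(voice, noise):.1f} dB")
# A better mic raises this number, giving speech recognition a cleaner
# signal to work with in noisy environments.
```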
The new microphones are also said to feature better water resistance. However, this is likely more about ensuring the mics don't get waterlogged and temporarily unable to pick up your voice than about keeping water out of the iPhone itself. Recent iPhone models already boast some of the best IP68 ratings of any smartphone on the market: they can withstand immersion in up to six meters of water for up to 30 minutes, well beyond the 1.5 meters of even the best Android flagships.
Kuo speculates that the new microphones indicate Apple plans to develop more AI capabilities around Siri and make those "a key selling point of the iPhone 16."
My latest survey indicates that all iPhone 16 models will feature a significant upgrade in microphone specifications. In addition to better water resistance, the key specification upgrade is a better signal-to-noise ratio (SNR) to improve the Siri experience significantly. It could indicate that Apple expects to integrate more AI/AIGC capabilities into Siri as a key selling point of the iPhone 16.
Ming-Chi Kuo
However, that speculation is grounded in other information Kuo has gleaned, including word from sources that Apple has recently reorganized its Siri team to focus on integrating generative AI and large language models (LLMs). Rather than the text-based "AppleGPT" system we've heard rumors about, Apple reportedly intends to make voice input "the key interface for AI/AIGC/LLM," and Kuo expects this will hold true not only for the iPhone but for smartphones in general.
Such a move makes a lot of sense in the mobile space, since speaking to an AI chatbot is far easier than typing to one. ChatGPT recently embraced voice input, and you can already call it up with Siri to get far better answers than traditional voice assistants can provide. Siri has long lagged behind Amazon Alexa and Google Assistant in this area, but ChatGPT is light-years ahead of them all when it comes to answering everyday questions.
Apple has repeatedly said that it already bakes AI into all of its products, with CEO Tim Cook emphasizing that "we view AI and machine learning as fundamental technologies" that permeate the iPhone experience through features such as Personal Voice, Live Voicemail, Fall Detection, and Crash Detection. At the same time, Apple disclosed that it spent $22.6 billion on research and development in the first half of 2023, with much of that reportedly going toward generative AI. A later report revealed that the company is spending millions of dollars a day on "conversational" AI that will undoubtedly be used to improve Siri.
So, while Cook likes to shrug off the notion that Apple is spending money "building chatbots," there's little doubt that generative AI will become a foundational part of Siri in much the same way that AI already powers many of the iPhone's other great features. Unlike competitors such as Google, Apple won't label it as AI; it will probably just keep calling it Siri.
[The information provided in this article has NOT been confirmed by Apple and may be speculation. Provided details may not be factual. Take all rumors, tech or otherwise, with a grain of salt.]