The iPhone's newest updates, such as the AI-powered features in the Photos app, suggest that Apple is gradually transforming into an AI company.
After more than a decade, autocorrect “fails” could be on their way out. Apple’s much-maligned spelling software is getting upgraded by artificial intelligence: Using sophisticated language models, the new autocorrect won’t just check words against a dictionary, but will be able to consider the context of the word in a sentence. In theory, it won’t suggest consolation when you mean consolidation, because it’ll know that those words aren’t interchangeable.
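Apple hasn't published how its new autocorrect works, but the general idea of context-aware correction can be sketched in a few lines. This toy example (entirely illustrative, with made-up word-pair counts standing in for the probabilities a real transformer model would compute) shows why "consolidation" beats "consolation" after the word "debt":

```python
# Illustrative sketch of context-aware autocorrect -- NOT Apple's
# implementation. A real system would use a neural language model;
# here, toy counts of (previous_word, word) pairs stand in for it.
BIGRAM_COUNTS = {
    ("a", "consolation"): 5,
    ("a", "consolidation"): 2,
    ("debt", "consolidation"): 40,
    ("debt", "consolation"): 0,
}

def pick_correction(prev_word: str, candidates: list[str]) -> str:
    """Return the candidate word that best fits the preceding word."""
    return max(candidates, key=lambda w: BIGRAM_COUNTS.get((prev_word, w), 0))

# After "debt", the model prefers "consolidation" over "consolation".
print(pick_correction("debt", ["consolation", "consolidation"]))
# → consolidation
```

A dictionary-only checker would accept either word; ranking candidates by how well they fit their neighbors is what lets the new autocorrect tell them apart.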
The next generation of autocorrect was one of several small updates to the iPhone experience that Apple announced earlier this month. The Photos app will be able to differentiate between your dog and other dogs, automatically recognizing your pup the same way it recognizes people who frequently appear in your pictures. And AirPods will get smarter about adjusting to background noise based on your listening over time.
All of these features are powered by AI—even if you might not know it from how Apple talks about them. Its conference unveiling the updates included zero mentions of AI, now a buzzword for tech companies of all stripes. Instead, Apple used more technical language such as machine learning or transformer language model. Apple has been quiet about the technology—so quiet that it has been accused of falling behind. Indeed, whereas ChatGPT can write halfway-decent business proposals, Siri can set your morning alarm and not much else. But Apple is pushing forward with AI in small ways, an incrementalist approach that might nonetheless be where this technology is headed.
Since ChatGPT debuted last fall, tech leaders have not been very subtle about AI's potential—for good and for evil. Sam Altman, the CEO of OpenAI, tweeted last month that AI "is the most amazing tool yet created." The Microsoft founder Bill Gates has called AI "the most important advance in technology since the graphical user interface." At a Google conference, Alphabet CEO Sundar Pichai said "AI" 27 times in a 15-minute speech. (He's also been known to say that AI will be "more profound" than fire.)
Apple, meanwhile, isn’t even pretending to talk a big game when it comes to AI. John Gruber, a longtime Apple follower who runs the technology blog Daring Fireball, told me that he doesn’t expect any of the machine-learning features Apple announced this year to significantly alter the iPhone-user experience. They’ll just make it nominally better. “We expect autocorrect to just work,” he told me over email. “We notice when it doesn’t.”
The new autocorrect, which will be available in an iOS upgrade later this year, is sort of like a less powerful ChatGPT in your pocket. Apple says the software will be better at fine-tuning itself to how we type, as well as at predicting which words and phrases we will use next. When you ask ChatGPT a question, you are accessing the same giant language model stored on the cloud that everyone else is. But the much smaller and more personalized language model that will now power autocorrect will live on your iPhone. Apple has not shared details on how the feature will work, and its exact technical approach is unclear, Tatsunori Hashimoto, a computer scientist at Stanford University, told me. Researchers, including Hashimoto, have been hard at work figuring out how to scale down large language models so that they fit on a mobile device.
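One common technique researchers use to shrink a model for a phone—though Apple has not said whether it relies on it—is weight quantization: storing each parameter as an 8-bit integer instead of a 32-bit float, roughly a fourfold reduction in size. A minimal sketch of the idea:

```python
# Hedged sketch of 8-bit weight quantization, one general approach to
# fitting language models on-device. Purely illustrative; Apple has not
# disclosed its method.

def quantize(weights, bits=8):
    """Map float weights to integers in [0, 2**bits - 1] plus a scale/offset."""
    lo, hi = min(weights), max(weights)
    scale = (hi - lo) / (2**bits - 1)
    q = [round((w - lo) / scale) for w in weights]
    return q, scale, lo

def dequantize(q, scale, lo):
    """Recover approximate float weights from the integer representation."""
    return [v * scale + lo for v in q]

weights = [-1.5, -0.25, 0.0, 0.7, 2.0]
q, scale, lo = quantize(weights)
restored = dequantize(q, scale, lo)

# Storage shrinks ~4x (8 bits vs 32 per weight); values come back
# only approximately, with error bounded by about half the scale step.
print(max(abs(a - b) for a, b in zip(weights, restored)))
```

The trade-off is a small loss of precision per weight in exchange for a model that fits in a phone's memory—exactly the kind of compromise on-device AI has to make.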