I've known Siri, Apple's voice assistant, for almost a dozen years, and yet I still can't recall a single meaningful conversation we've had. Conversely, ChatGPT and I have known each other for six months and yet we've talked about everything from the meaning of life to planning a romantic dinner for two and have even collaborated on programming and film projects. I mean, we have a relationship.
Siri's limits mean it still can't carry on a conversation or engage in a lengthy, project-oriented back-and-forth - though you can connect Siri to ChatGPT to give it a neural boost. For better or worse, the Siri we use today on our iPhones, iPads, MacBooks, Apple Watches, and Apple TVs isn't much different from the one we first encountered in 2011 on the iPhone 4s.
Six years ago, I wrote about Siri's first brain transplant, the moment when Apple started using machine learning to train Siri and improve its ability to respond to conversational queries. The introduction of machine learning and, soon after, an on-device neural engine in Apple's A11 Bionic chip on the iPhone 8 marked what I thought was a turning point for what was arguably the first consumer-grade digital assistant.
This programming and silicon helped Siri understand a question and its context, allowing it to move beyond rote answers to intelligent responses to more natural language questions.
Early Siri was no Her
Not being able to fully converse with Siri didn't seem like a big deal, even though we'd already seen the movie Her and understood what we could ultimately expect out of our chatbots.
It wasn't until that once-distant future was snapped into the present by OpenAI's GPT-3 and ChatGPT, though, that Siri's deficits were thrown into stark relief.
Despite Apple's best efforts, Siri has been idling in learning mode. Perhaps this is because Siri is still primarily built on Machine Learning and not Generative AI. It's the difference between learning and creating.
All the generative AI chatbots and image tools we're using today create something brand new out of prompts and, soon, out of art and images, too. They are not answer-bots; they're builder-bots.
I doubt any of this is lost on Apple. The question is, what will and can Apple do about it? I think we'll need look no further than its upcoming Worldwide Developers Conference (WWDC 2023). We're all fixated on the possible $3,000 mixed reality headset Apple might show off in June, but the company's most important announcements will surely revolve around AI.
"Apple must be under incredible pressure now that Google and Microsoft have rolled out their natural language solutions," Moor Insights CEO and Chief Analyst Patrick Moorhead told me over Twitter DM.
A more chatty Siri
As reported by 9to5Mac, Apple may already be - finally - working on its own language-generation update for Siri (Bobcat). Note that that's not the same as "generative AI." I think it means Siri will get a little better at casual banter, and I don't expect much more than that.
Unfortunately, Apple's own ethos may prevent it from catching up to GPT-3, let alone GPT-4. Industry watchers do not exactly expect a breakthrough moment.
"I do think what they do in AI will not necessarily be a leap as much as a calculated and more ethically driven approach to AI in Siri. Apple loves, lives, and dies by their privacy commitments and I expect no less in how they deliver a more AI-driven Siri," Creative Strategies CEO and Principal Analyst Tim Bajarin wrote me in an email.
Privacy above all else
Apple's steadfast adherence to user privacy may leave it hamstrung when it comes to true generative AI. Unlike Google and Microsoft, it doesn't have a massive search-engine-driven data store to draw on, nor is it training its AI on the vast ocean of Internet data. Apple does its machine learning on device. An iPhone and Siri know what they know about you based on what's on your phone, not on what Apple could learn from you and its 1.5 billion global iPhone users. Sure, developers can use Apple's ML tools to build and integrate new AI models in their apps, but they can't simply collect your data to learn more about you and help Apple deliver a better Siri AI.
As I wrote in 2016: "It’s also interesting to consider how Apple intentionally handicaps its own AI efforts. Your purchase habits data in iTunes, for instance, are not shared with any of Apple’s other systems and services."
Apple's local approach could handicap its possible generative AI efforts. As Moorhead told me, "I see most of the action on device and in the cloud. Apple is strong on the device but weak in the cloud, and this is where I think the company will struggle."
As I see it, Apple has a choice to make. Cede a little user privacy to finally transform Siri into the voice assistant we always wanted or stay the course with incremental AI updates that improve Siri but never let it rival ChatGPT.