Apple Intelligence feels like the HomePod all over again
Apple is playing catch-up in AI, just as it did with its smart speaker effort not so long ago
Apple’s Glowtime event served up an avalanche of new products and features centered on Apple Intelligence, the AI suite the company has been hyping for months. For a lowdown on everything you need to know, check out our iPhone 16 Pro hub. And for our first impressions, do read our hands-on iPhone 16 Pro review.
But amid all of the new and upcoming features, the rollout itself felt familiar. I kept noting how each new feature already had a counterpart at Google, or OpenAI, or Meta, or all three and more. I have more than a passing knowledge of what’s out there, but even so, Apple’s rush of features felt less like forging new, innovative ground, the way Apple did in the 2000s, and more like filling out a comparison chart. What it really felt like was watching Apple announce its HomePod smart speaker and its later iterations.
Siri blew everyone’s mind when it first came out, adding real power to the iPhone and setting a standard no one could reach for a while. When Google Assistant and Amazon Alexa came along, Siri suddenly wasn’t so special, though nobody was keen to use either on a smartphone the way they used Siri. Then came the Amazon Echo and Google Home (later rebranded Nest after an acquisition). Both companies poured resources not only into making appealing, relatively cheap smart speakers and displays but also into keeping their voice assistants equal to the task, often with better natural language processing, superior context retention, and deeper third-party integrations than anything Siri could bring to the fore.
The first Echo came out in 2014 and the first Google Home arrived two years later. Both companies quickly iterated on the voice assistants and on the hardware linking users to Alexa and Google Assistant. The first HomePod didn’t arrive until 2018, and it carried much of the rigidity that had long drawn complaints about Siri. The HomePod, while technically impressive in terms of sound quality, failed to compete with the Amazon Echo and Google Home. Both rivals had already solidified their place in homes, offering affordable smart speakers with expansive voice assistant capabilities that tied into a wider ecosystem of smart home devices. Apple’s HomePod was more expensive, more limited, and frankly late to the game. Even the later HomePod mini could only try to match what had already been available for a while from Amazon and Google.
Apple AI's steep competition
Apple Intelligence isn’t arriving with quite as dire a delay as Apple’s smart speaker and its voice assistant faced, but if you look at the list of AI features, you could repeat the words "Google/OpenAI/many more did it already" nearly every time. The advanced natural language understanding, photo editing tools, and enhanced smartphone controls have all already been announced or released by Google and others. They reflect a company that is still trying to close the gap left open by its AI rivals. Even Apple’s partnership news seems familiar. Embedding OpenAI’s models to give its features ChatGPT’s power is a good idea, but one that Microsoft and others have already pursued. Even Google, with its own stable of AI models, clearly had ChatGPT's abilities in mind when developing features for its Gemini AI assistant.
Only two ideas from Apple, one frivolous and one possibly important, struck me as unique, or at least notably different from what we’ve seen before. The custom Genmoji emoji are a cute idea that doesn’t seem quite as easy to replicate on Google-powered devices. More crucially, Apple made a point of how much AI processing will happen on-device and how it will lean on its Private Cloud Compute system to protect privacy and data security. That could be a big selling point for potential customers, even if it limits some of what the AI can do compared to a cloud-first approach. But even on-device processing as a smartphone selling point has already been done by Google, which made the same pitch when it debuted the Pixel 9.
Apple had a lot to say, and Apple Intelligence may bring some unique features to the table, but the company’s late entry and iterative approach to AI suggest that it is still playing catch-up. Much like the HomePod’s struggle to gain traction in a market dominated by earlier entrants, Apple’s AI tools—while carrying the Apple design polish and privacy focus many admire—seem designed more to match what others are already doing than to push the envelope further. Apple used to set the stage for the next big tech fad, but it will take more than fun custom emojis to retake that position; just ask the ten people who still have a HomePod.
Eric Hal Schwartz is a freelance writer for TechRadar with more than 15 years of experience covering the intersection of the world and technology. For the last five years, he served as head writer for Voicebot.ai and was on the leading edge of reporting on generative AI and large language models. He's since become an expert on generative AI products, such as OpenAI’s ChatGPT, Anthropic’s Claude, Google Gemini, and other synthetic media tools. His experience runs the gamut of media, including print, digital, broadcast, and live events. Now, he's continuing to tell the stories people want and need to hear about the rapidly evolving AI space and its impact on their lives. Eric is based in New York City.