Apple’s WWDC 2024 keynote on Monday focused on its new Apple Intelligence AI tools for the iPhone, iPad and Mac. Google’s I/O event last month was also all about AI, but Google’s approach to the topic was a hot casserole of nonsense that left me, a tech journalist of 13 years, scratching my head. Apple took a different approach in how it talked about AI.

Presenters at Google I/O threw around a wild amount of information, including a whole host of new brand names to figure out. Gem, Gemma, Gemini, Veo, Astra, LearnLM — I sat through Google’s event struggling even to understand which thing does what, let alone why I should care about any of it. I ended my working day feeling out of my depth, as if the tech world had left me behind.

And while Apple talked a lot about how its host of new AI features will work across all its platforms, it did so in a way that actually made sense. It took a “show, don’t tell” approach by giving real examples of what its new features will actually do and, crucially, how you’ll benefit from using them. 

We were given demos on how to ask the new AI-powered Siri to “make my photo pop” before seeing an image receive auto-edits. Or how the on-screen awareness allows you to just say “add this address to Mike’s contact card” when Mike texts you a new address. All of Apple’s examples were clear, understandable and showed how Apple continues to succeed in taking a customer-focused approach. 

Apple showed real, practical examples of its AI tools in use. 

Apple/Screenshot by CNET

Both Google’s I/O and Apple’s WWDC are fundamentally developer-focused events, designed to tell industry professionals about these updates so they can then roll them out to users. Google’s event was so pro-heavy, however, so full of jargon and assumed knowledge, that it alienated consumer tech fans like me. As a result, I’m not excited about Google’s AI promises — simply because I don’t really know what they are. And that means I may be less inclined to use Chrome or Gmail or to buy an Android phone that makes use of its AI tools — which is Google’s whole mission. 

Apple’s keynote instead spoke to me directly, not via indecipherable developer-talk. Tim Cook called AI on Apple devices "personal intelligence," and it felt like just that, while the impactful slogan "AI for the rest of us" appeared on screen behind Craig Federighi, almost as a callout to Google’s impenetrable event.

But more than that, Apple focused on issues that matter to me as a consumer. There was a lot of talk about privacy in AI and how my data will be protected. There was even an emphasis on cartoon-style generative images of your friends and family, rather than attempts at photorealistic ones, which would be both creepy as hell and potentially extremely problematic.

Google even kept its own on-screen tally of how many times it said "AI" (121), although by our count the number was even higher.

Google/Screenshot by CNET

Apple still talked directly to its developers, explaining, for example, that the SDK for its ChatGPT integration will be available for use in other apps, but it did so in a way that left me understanding exactly why this would benefit me when I use my iPhone.

At the end of Apple’s event I felt like I knew what had happened. I understood the upcoming products, and more than that, I’m excited to try them. I may well be excited to try Google’s products too once I get to know them, but Google was so busy tripping over itself, saying "AI" 140 times in its keynote, that I feel it forgot to tell me what those products even are.

By sticking to its customer-first approach, Apple has been able to break through the jargon and actually tell me why I should care, and in the constantly developing, endlessly confusing world of AI, that’s a huge win for Apple. 
