If there’s one thing Apple and Samsung can agree on, it’s that generative AI will change the way we use our phones. But when it comes to how, that’s where the two tech giants differ.

Apple on Monday announced Apple Intelligence, a collection of new AI-powered features that work across the iPhone, iPad and Mac to rewrite messages, generate images and ask Siri more complicated questions. On the surface, that may sound similar to Samsung’s Galaxy AI, which was announced in January for the Galaxy S24 and has since expanded to other devices.

While there are a few ways in which Apple Intelligence and Galaxy AI overlap — namely editing photos and summarizing, proofreading and rewriting texts and notes — Apple and Samsung generally have different ideas for what AI can bring to the smartphone experience. Apple views AI as connecting the dots between your apps, while Samsung applies AI to specific individual tasks, like language translation.


Apple and Samsung are far from the only tech behemoths integrating AI more deeply into their products. But as the world's two largest smartphone companies, they hold a lot of influence over how the technology will show up in the mobile devices we use every day.

Apple and Samsung have made it clear that there's a lot more to come with AI, so the versions of Apple Intelligence and Galaxy AI that we know today are likely just the start. And without actually trying Apple Intelligence, it's impossible to know how well it works and what it contributes to the iOS experience.

But even at this early stage, we’re getting a picture of how our phones’ software could evolve in the coming years. 

Apple puts a personal spin on generative AI


Personal context is a big focus for Apple Intelligence. 

Apple/Screenshot by CNET

Apple may be late to the generative AI game compared to Samsung and other rivals, but it’s trying to make up for that by emphasizing how Apple Intelligence can help you make sense of all the data, files, photos and messages on your phone. 

“It needs to be integrated in the experience you’re using all the time,” Craig Federighi, Apple’s senior vice president of software engineering, said in reference to AI during a press event following the WWDC keynote. “It needs to be intuitive, but it also needs to be informed by your personal context.”

A lot of this functionality is tied to the new version of Siri, which will soon be better at finding things on your device, such as a recipe that a friend may have sent to you. Even if you can’t remember whether the recipe was stored in a text message or a Notes file, Siri will be able to help you find it, Apple says.

Apple says Siri can index photos, calendar events and files and reference information from messages and emails. That means it should be able to extract the right details from your phone when needed, no matter where that data is stored. One example Apple showed involves filling out a form that requires your driver's license number. If there's a photo of your license stored on your phone, Siri can input the identification number for you.

Apple also wants Siri to go a step beyond finding things in apps. It wants the assistant to take action on your behalf, too. In its WWDC keynote, Apple demonstrated how you could ask Siri to pull up a photo of a specific person — “Show me my photos of Stacy in New York wearing her pink coat” — then edit that photo and add that image to a note just with your voice. Since Siri knows who you’re talking about based on your Photos app, it should be able to pull up the right picture.


Siri is getting a big upgrade this year. 

Getty Images/Viva Tung/CNET

The Photos app provides another example of Apple’s personalized approach to AI. A new feature will make it possible to build your own photo memories movie just by typing in a specific prompt such as: “Leo trick-or-treating with a spooky vibe,” or “Everything we ate in Japan.” 

There's a lot more to Apple Intelligence, but these features are some of the strongest examples of how Apple's approach differs from Samsung's. That's not to say it's an entirely unique perspective on how AI can be applied to the gadgets we rely on every day.

The idea of referencing information from texts and email is similar to Microsoft’s Copilot assistant for PCs. Google’s Gemini virtual helper is also getting an upgrade that enables it to reference content on screen, much like Siri. And given how closely Google and Samsung work together, that feature will likely arrive on Samsung devices before long. 

Google also discussed its vision for futuristic AI agents that can perform errands as complex as returning a pair of shoes for you by combing through your inbox for the right information. It's another sign that Apple isn't the only company that wants virtual assistants to handle tasks for us.

Samsung goes big on communication and productivity


The Galaxy S24 Ultra showing the new chat translation feature

Lisa Eadicicco/CNET

Samsung’s approach to AI on smartphones is more centered on communication and productivity. One of Galaxy AI’s headlining features when it debuted was the ability to translate calls in real time directly from Samsung’s native phone app. 

In addition to rewriting texts in a different tone and proofreading them, Galaxy AI also makes it possible to translate full conversations into different languages. The emphasis on translation in particular shows how Samsung and Apple’s approaches differ, considering Apple didn’t really discuss language translation during its WWDC keynote. 

Productivity and content creation are also a big part of Samsung's AI push so far. Generative Edit, which lets you remove unwanted objects from photos or resize and move them, was another highlight when Galaxy AI debuted in January. (Apple Intelligence will bring a similar feature to iPhones.)

Other examples include previewing video clips in slow motion, summarizing and translating notes and Circle to Search, which lets you launch a Google search for anything on screen by circling it.


The Galaxy S24 Ultra showing the new Generative Edit feature.

Lisa Eadicicco/CNET

What’s interesting, however, is that Apple’s Siri enhancements actually sound similar to Samsung’s original vision for its own Bixby voice assistant. Back in 2017, Samsung positioned Bixby as a means to navigate your phone and apps more easily rather than an assistant for quickly answering questions or setting timers. 

Over time, I imagine Apple's and Samsung's software features will continue to overlap. Samsung plans to give Bixby a generative AI upgrade, and Apple will surely expand Apple Intelligence to more apps and services. And while the execution may differ, both companies clearly see AI as a useful tool that can help us get things done more quickly on our phones. The question is whether Apple Intelligence and Galaxy AI will live up to those promises.

We’ll likely get our next peek at what Samsung and Apple have planned over the coming months, considering Samsung is expected to announce new foldable phones this summer and Apple typically releases new iPhones in September.  






