Google and Samsung just launched the AI features Apple couldn’t with Siri


Google just announced that Gemini will soon be able to take over certain multi-step tasks on your phone, like ordering food or hailing a car, starting with the Pixel 10, Pixel 10 Pro, and the just-announced Samsung Galaxy S26 phones. This all sounds a lot like the features Apple announced for Siri at the 2024 Worldwide Developers Conference – features Apple delayed in March 2025 and which still haven't shipped.

On stage, Sameer Samat, president of Android at Google, demoed how Gemini's new agent features could help manage a pizza order amid a busy family group chat. Samat asks Gemini to look at the thread and figure out what to order, then place the order with a delivery app. On screen – in a pre-recorded video, this wasn't live – you can see Gemini working out what everyone in the group chat wants and displaying it in a window. Then the user, via a voice request, asks Gemini to finalize the order, naming a specific pizzeria. Gemini taps through Grubhub to prepare the order, all still on screen. When the order is ready, Gemini sends an alert so the user can review it and press the submit button.

Apart from the fact that this scenario doesn't seem that hard to handle yourself in the Grubhub app (or even by calling the pizzeria and talking to a human), this is a potentially important moment for agentic AI. Google recently gave Gemini the ability to browse the web on users' behalf in Chrome, and bringing something similar directly to Android seems like a logical next step; Google clearly wants Gemini to be seen as a helpful agent and productivity partner rather than just a chatbot or a collection of AI models.

Assuming the Gemini agent features actually launch "soon" as Google promises and Apple doesn't pull a rabbit out of its hat, Google will beat Apple to the punch on some of its most impressive Apple Intelligence demos – also shown only in pre-recorded videos – from that WWDC 2024 keynote. One feature Apple showed off would have allowed Siri to understand what's on your screen and act on it, meaning you could ask Siri to add an address from a message thread to the contact card of the person you're texting with. Apple also demonstrated how Siri would be able to perform actions within and across apps for you. The company said Siri would even understand your personal context: you could ask it when your mom's flight lands, and the assistant would pull that information from an email and show it to you.

Nearly two years later, none of this is available yet. Before announcing the delay, Apple had even released an ad showcasing the features. And based on reporting from Bloomberg, some of them might not arrive until iOS 27.

Of course, there are still plenty of questions about Gemini's new capabilities. They actually have to ship. We'll have to try them to see whether they're as useful and functional as advertised – Google is calling this initial launch a "beta", so there may be rough edges. And we don't know how many developers will actually let Gemini navigate their apps on users' behalf, which Verge editor Nilay Patel likes to call the DoorDash problem. (Google says Gemini will work in "select ride-sharing and dining apps.")

But Google appears to have leapt well ahead of Apple, and Apple now has even more catching up to do.
