I took a ride in an Nvidia-powered autonomous Mercedes at GTC 2026 – and it’s convinced me this is the future

As AI and robotics become increasingly present in daily life, use cases are quickly moving from science fiction into the real world.
One of the most popular areas of interest is autonomous vehicles, self-driving cars that can navigate roads and get us to our destinations without needing to touch the steering wheel.
Take to the streets
Nvidia has been at the forefront of autonomous driving for some time, developing its Hyperion platform and the ecosystem around it. The company works with a number of the world’s largest automakers, including Geely, BYD, Nissan and Hyundai, and has long-standing collaborations with GM and Mercedes-Benz.
My ride lasted about 45 minutes through downtown San Jose, on a predefined route that gave me a taste of how the technology would work in various configurations and road conditions, including single- and multi-lane traffic in urban and suburban settings.
The Hyperion 8 technology we experienced operates at “Level 2” autonomy, meaning a human in the driver’s seat can intervene at any time and deactivate the system by touching the brake pedal – and in fact, the car required that person to touch the steering wheel every so often to confirm they weren’t distracted or asleep.
As anyone who has ridden in a self-driving car knows, the experience can be a little alarming at first (especially from the front passenger seat), but after the first few intersections I was able to relax and enjoy the incredibly luxurious car.
The car itself was equipped with 10 cameras and five radars, positioned around the front and rear of the vehicle, allowing it to perceive the world around it and react as needed. This was supported by Nvidia’s Alpamayo end-to-end stack, trained on real and synthetic data, with a fully traceable pipeline providing an additional layer of safety. Although the setup we experienced is not yet on sale, it is expected to become available later in 2026.
Several moments immediately impressed me, demonstrating the decision-making and intelligence of the software.
For example, when a city bus stopped unexpectedly to avoid a parked car, our vehicle signaled and moved into the adjacent lane, avoiding a collision. The car also spotted an elderly person starting to cross a residential street mid-block, away from any official crossing, and slowed down and moved over to ensure we never came close to contact.
The car was also able to anticipate an upcoming turn on our predefined route, moving into the correct lane a block early, meaning there was no need to cut into a line of turning vehicles or attempt a risky last-minute maneuver across other lanes.
A number of unprotected turns and stop-sign intersections also demonstrated how the technology let the car proceed efficiently while remaining ready to stop at any unexpected risk of collision.
Even more impressive, when a large truck (somewhat bizarrely) reversed abruptly from a parking lot, crossing our lane of traffic and moving into the far lane (a situation that left even our experienced driver-guide with an elevated pulse), the car was able to brake to avoid a collision.
Overall, my journey in the autonomous vehicle was undoubtedly a success: the car felt in control and able to react to the world around it, and I never felt unsafe. With a wider rollout planned in the coming years, it will be interesting to see how the technology evolves, and I look forward to trying it again soon.
