The Five Weirdest AI Inventions I Saw at CES 2026

AI is still the big thing in the tech world, but it's no longer the big new thing. It's been around long enough that simply integrating it into your product isn't enough to make it stand out, especially at the world's biggest tech show. As I walked this year's CES, the trend I noticed time and time again on the show floor was that AI is getting weird now. From personal holographic companions to a gaming monitor that essentially cheats for you, here are the five weirdest AI inventions I saw at CES 2026.
Razer gives you your own personal anime girl
Credit: Michelle Ehrhardt
At last year's CES, gaming company Razer introduced Project AVA, an AI esports coach concept that was just a disembodied voice living inside your laptop. Yawn. This year, the company is expanding its efforts by bringing AVA into the real world.
This year in the Razer suite, I had a conversation with "Kira," an anime "hologram" that lives in a small USB tube you can plug into your laptop. She noticed my orange sweater thanks to a camera installed in the tube, then asked me about the show and encouraged me to launch a round of Battlefield 6, where she gave me some generic loadout advice. I spoke to her using microphones also built into her tube, and she responded through her own speaker rather than the laptop's. Razer said this demo was more directed than the final product will be, which is why it brought up the game right away, but the end goal is for the new AVA to function as a versatile, compelling AI companion, so you don't have to use it just for gaming.
To that end, the company says it's "AI-agnostic," so you can plug your own model into it. The demo I played through clearly used Grok and generally felt like talking to that app's built-in AI companions, right down to the corny jokes. But Razer said you could theoretically use ChatGPT or Gemini instead.
As we chatted, Kira played animations courtesy of Animation Inc., which powers similar but more app-driven AI companions. In other words, the chatbot and the animations aren't really new here; what you'd actually be buying is the USB tube and the characters.
Credit: Michelle Ehrhardt
Kira isn't your only option for an AI companion here. She's a stereotypical anime gamer girl, but I also got to briefly see Zane, a muscular tattooed man with the deepest V-neck I've ever laid eyes on. You can immediately tell who the target audience for these two characters is, but if you want something tamer, you can also have your tube display the Razer logo surrounded by an audio waveform, which is simply called AVA (even though the project as a whole is also called AVA). The company is working on celebrity likenesses, too, with esports star Faker and influencer Sao already on board.
Razer said it's still figuring out how it will distribute these characters, but I'm told you'll get one set with your purchase, and you'll likely be able to buy more later.
As for pricing and availability, there's no word yet. Technically, this is still a concept, so it could go back to the drawing board. But Razer's website says it's hoping for a release in the second half of 2026, and you can drop $20 right now to reserve your device.
In short, if you strip away the features already built into apps you can download right now, the new Project AVA is essentially a talking hologram toy for your desktop. That's not a bad pitch, but unfortunately, I'm not sure "hologram" is the right word for it. Kira looked pretty flat to me, less like Princess Leia's projection and more like she was being displayed on an ordinary transparent screen that just happened to be stuck inside a cylinder. I don't think the novelty is quite there yet.
The gaming headset that uses AI to read your mind
Credit: Michelle Ehrhardt
Whenever I play a competitive game, instead of jumping straight into a match, I jump into a few practice sessions to warm up. It’s useful, but it takes time. The new Neurable x HyperX concept headset hopes to change that by helping you get locked in in just a few minutes.
Essentially, it looks like a normal gaming headset, but the ear cushions contain various sensors that are supposed to read your concentration levels. These are similar to the brain-computer interfaces you may have seen in sci-fi shows, the ones with a bunch of wires and electrode disks attached, but scaled down for the consumer market, with no scary wires in sight.
That's where AI comes in. Shrinking the sensors down this much means the headset gets fewer readings than larger lab-grade rigs, but Neurable says its models can still detect trends in those readings and translate them into useful data, while discarding the noise.
For gamers, this means it can walk you through a quick concentration exercise called "Prime," in which you focus on a cloud of dots as it shrinks into a solid orb. Once that's done, which took me about 90 seconds, you're supposed to be locked in and ready to play.
Unfortunately, I did worse in a practice shooter after priming than before, but that doesn't mean the data was useless. I did the exercise alongside a colleague whose score improved by about a third after priming, and with such a small sample size, there could be any number of reasons why I choked. The company said the headset could even help you practice working through choking like that.
Credit: Michelle Ehrhardt
And anyway, numbers are fun. That's why I'm particularly excited about the headset's plugin for streamers, which lets them display their concentration levels on screen so their chat can see them. I could easily imagine a community watching this data and heckling their favorite streamer to try to break their focus.
That said, it will be a while before you can actually buy it. It's still a concept for now, with no price or release promise. However, Neurable already offers a similar, non-gaming headset made with Master & Dynamic, which will be available soon, though without this gaming software. To learn more, read my full article here.
Lenovo’s laptop may nod when you ask it a question
Credit: Michelle Ehrhardt
This one is more of a hardware innovation, but it's a smart touch. At this CES, Lenovo showed off a laptop with a motorized hinge that can automatically close, open, and even rotate from side to side. It'll be released later this summer, but while the company was demoing the unit to me, it also showed off a prototype chatbot app it's building. The app uses ChatGPT for now, and it's still just a concept that won't ship with the laptop. But it was cute.
Essentially, as I talked with the app, the laptop displayed a large pair of animated eyes on the screen and used its hinge to nod or shake its head when I asked it questions. It also showed little animations in response to certain questions, like showing an umbrella when I asked about rainy weather.
It's still very early, but I was impressed that the software could recognize when an answer was affirmative and trigger the laptop's physical response accordingly. A lot of AI feels pretty disconnected from the real world, so anything that gives it a physical presence is probably a good idea if you want people to take it seriously.
The Lenovo AI gaming monitor that’s basically cheating
Credit: Michelle Ehrhardt
Also featured at CES this year, Lenovo's AI Frame gaming monitor is probably the most practical item on this list, almost to the point where it feels like cheating. Essentially, it fills most of its 21:9 screen with a regular 16:9 view of everything on your computer, then uses AI to display zoomed-in previews of critical game information across the rest.
For example, in a demo showing a MOBA (think League of Legends), the monitor zoomed in on the minimap. In a demo showing Counter-Strike 2, it zoomed in on the crosshair. Personally, I didn't find the zoomed-in map view very useful, but being able to constantly see what was essentially a sniper scope around my crosshair was a game-changer, as it worked with any weapon and made targets much easier to spot.
I could see Counter-Strike 2 developer Valve going as far as banning it if it ever hits the market, as it has already taken similar steps before. But it's still just a concept for now. Either way, it shows that companies are starting to understand how AI can actually help you in your games, beyond the kind of basic advice you probably already know.
XREAL’s new AR glasses can automatically convert any 2D content to 3D
Credit: Michelle Ehrhardt
Finally, my favorite AI invention at CES this year was probably XREAL's new REAL 3D technology. Built into its latest AR glasses, and already added to an existing pair via a firmware update, it uses AI to automatically estimate the depth of any 2D video source and convert it to 3D. Trying it for myself, it looked almost like an official feature.
When I used it to play Mario Kart World, I would have believed you if you'd told me Nintendo had added this mode itself. It also worked very well with James Cameron's Avatar, and there was no loading time to turn it on or off. There was also none of the blurring you might get with glasses-free 3D displays like the 3DS.
This is a great option for people who love watching games and movies in 3D, but might have trouble finding them now that 3D TVs and the Nintendo 3DS are mostly a thing of the past. Now you can just watch your existing 2D library, but in 3D.
The only problem you might run into is content that has no real depth to find. For example, Ralph Jodice of XREAL told me the software didn't quite know what to do when reading the original 8-bit Super Mario Bros., randomly emphasizing certain elements of the game without rhyme or reason. A hand-drawn illusion of depth, however, seems to work fine: Super Mario Bros. is entirely flat, but when I watched the pen-and-paper animation of Snow White and the Seven Dwarfs with this technology, it correctly separated the foreground characters from the background settings, even though everything on screen was entirely hand-drawn.



