Home Assistant Finally Lets You Undo and Redo in Automations


Home Assistant has officially shipped its October release. Home Assistant 2025.10 brings major quality-of-life improvements to the automation editor, along with smarter dashboards and even the ability for your connected AI model to generate images.
If you have ever built a complex automation, you know the pain of making a mistake only to realize that the only way to fix it was to close the editor and start over from scratch. That nightmare is finally over, because this release introduces the essential undo and redo functionality. You can now undo up to 75 steps of your editing history, and yes, the standard Ctrl+Z and Ctrl+Y keyboard shortcuts work perfectly.
Another major headache is resolved, because pasting with Ctrl+V is now dead simple. If you have copied an automation block (such as a trigger or an action), you can now select any other block and press Ctrl+V to paste the copied block directly below it. It is a small change, but a hugely welcome one.
The sidebar, which was introduced in the last release, is now thankfully resizable. The team also noticed that the "Repeat" building block in automations was trying to do too much, covering four different use cases in one complex block. To simplify things, Home Assistant has split it into four smaller, easier-to-understand blocks with clearer descriptions.
The new blocks are Repeat times, Repeat while, Repeat until, and Repeat for each. This is an excellent decision that makes complex loops in automations much more accessible, without changing the underlying structure for advanced users. Finally, the overflow menu is back in the main section of the editor, which makes essential actions like testing a condition much easier to reach.
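For reference, those four loop forms map onto the automation YAML that already exists today; the UI split changes how they are presented, not the syntax. A rough sketch (the entity IDs here are placeholders):

```yaml
actions:
  # Repeat times: run the sequence a fixed number of times
  - repeat:
      count: 3
      sequence:
        - action: light.toggle
          target:
            entity_id: light.porch
  # Repeat while: loop as long as the condition is true
  - repeat:
      while:
        - condition: state
          entity_id: binary_sensor.motion
          state: "on"
      sequence:
        - delay: "00:00:10"
  # Repeat until: loop until the condition becomes true
  - repeat:
      until:
        - condition: state
          entity_id: binary_sensor.front_door
          state: "off"
      sequence:
        - delay: "00:00:05"
  # Repeat for each: iterate over a list of items
  - repeat:
      for_each:
        - light.kitchen
        - light.hallway
      sequence:
        - action: light.turn_off
          target:
            entity_id: "{{ repeat.item }}"
```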
Home Assistant 2025.8 gave us the ability to generate data using an LLM, but now the AI gets much more creative, because it can generate images. The example the Home Assistant team showed was that every time your doorbell is pressed, you can get a notification with an instant cartoon version of the snapshot. It's pretty cute and opens up a ton of possibilities. I am really curious to see what wild and useful image-generation ideas the community comes up with.
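As a hedged sketch of what such an automation could look like: the `ai_task.generate_image` action is new in 2025.10, but the exact field names, the shape of the response, and the entity and notify-service names below are assumptions for illustration — check the AI Task documentation for your setup.

```yaml
# Hypothetical doorbell-to-cartoon automation sketch.
# binary_sensor.doorbell and notify.mobile_app_my_phone are
# placeholder names; the response structure is an assumption.
automation:
  - alias: "Cartoon doorbell notification"
    triggers:
      - trigger: state
        entity_id: binary_sensor.doorbell
        to: "on"
    actions:
      - action: ai_task.generate_image
        data:
          task_name: "Doorbell cartoon"
          instructions: >-
            Redraw the doorbell camera snapshot as a
            friendly cartoon scene.
        response_variable: cartoon
      - action: notify.mobile_app_my_phone
        data:
          message: "Someone is at the door!"
          data:
            image: "{{ cartoon.url }}"  # assumed result field
```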
Dashboards are also getting smarter with the introduction of suggested entities. A basic algorithm now tracks the entities you interact with the most and suggests relevant controls depending on the time of day. It's essentially about letting your home suggest what you should see, when you need to see it. Even better, you can pin these predicted entities directly onto one of your own manual dashboards.
For bilingual households, or anyone who wants separate local and cloud assistants, Home Assistant finally enables multiple wake words for ESPHome-based voice assistants. You can now define two wake words, each tied to its own assistant, for every voice satellite in your home. This means you could set "Okay Nabu" for a cloud-based French assistant and "Hey Jarvis" for a local English one.
Even better, Assist is becoming a little less chatty. If you issue a voice command and all the actions take place in the same area as the satellite device (like turning on a light in the same room), Assist will now play a simple confirmation beep instead of the full verbal response. It's super useful, because you can already see the light is on, so you don't need a full sentence confirming it.
Source: Home Assistant


