Prepare for non-human visitors!

Non-human visitors are not exactly new. Bots have been crawling the web since the '90s, but search is changing, and website owners must now prepare for AI agents visiting their sites too.
Don’t believe me? Jason Mayes, Web AI lead at Google, said as much at WordCamp Europe recently: “We are now entering the age where AI agents are gaining popularity.” Hostinger, one of the best web hosting providers, thinks so too, and has just launched a new tool that automatically creates an llms.txt file for WordPress sites.
AI agents are powered by large language models (LLMs) such as Google Gemini, Anthropic’s Claude, and OpenAI’s ChatGPT. These models can seemingly read and understand text like a human, but there are things we can do to help them better understand websites. Just as robots.txt and sitemap.xml help traditional search engines navigate and understand websites, llms.txt serves the same purpose for AI agents.
To find out more about AI agents, LLMs, and how they are reshaping search, I spoke to Saulius Lazaravičius, VP of Product at Hostinger, about the company's new llms.txt creation tool.

Traditionally, people find information online via search engines like Google, which rely on robots.txt and sitemap.xml files to crawl and index content. But with the rise of AI tools like ChatGPT and Claude, more users now get answers directly from large language models (LLMs), bypassing traditional search altogether.
This is where the llms.txt file comes into play. It serves as a map for AI systems, helping them identify and understand the most important parts of a website. The llms.txt file provides:
- A clear, prioritized list of the site's key pages
- Concise summaries of each page's content
- Links to more detailed, authoritative resources
Placed alongside robots.txt and sitemap.xml, the llms.txt file improves the way AI engines interpret complex site structures, potentially increasing a site's visibility in AI-generated answers.
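To make that concrete, here is a minimal illustrative llms.txt in the markdown format used by the llmstxt.org proposal; the site name, URLs, and descriptions are placeholders, not a Hostinger template:

```markdown
# Example Store

> Example Store sells handmade ceramics and ships worldwide.

## Key pages

- [Product catalogue](https://example.com/products): Full list of items with prices
- [Shipping and returns](https://example.com/shipping): Delivery times and return policy

## Optional

- [Blog](https://example.com/blog): Studio news and tutorials
```

The idea is that an AI system reading this file gets the prioritized pages and their summaries up front, rather than having to infer them by crawling the whole site.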
Is there any data yet showing the advantage of having llms.txt alongside robots.txt and sitemap.xml?
Adoption of llms.txt is still in its infancy: fewer than 1% of the top one million websites were using it at the beginning of 2025. However, the share of traffic coming from AI platforms keeps growing. For example, use of AI-driven search among adults in the United States is expected to double by 2028.
While hard data on llms.txt's effectiveness is still emerging, the broader concept of “SEO for AI”, also called generative engine optimization (GEO), is gaining traction. Website owners are increasingly looking for ways to make their content more accessible and relevant to AI systems. llms.txt is an early, proactive step in that direction.
What makes a good llms.txt file, and how can site owners create one with a single click?
A well-structured llms.txt file is clean, simple, and focused on surfacing a site's most valuable content for AI systems. It usually starts with the website's main address, followed by selected pages that AI models should prioritize. Optional descriptions can be added to clarify the structure or hierarchy of the content.
The file is hosted at the root of the website (for example, domain.tld/llms.txt) and is easy to set up, particularly with automated tools such as our one-click llms.txt file creator.
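As a quick sanity check that the file is actually reachable at the root, here is a minimal Python sketch; example.com is a placeholder domain, and it assumes the site is served over HTTPS:

```python
import urllib.error
import urllib.request


def fetch_llms_txt(domain: str) -> str | None:
    """Try to fetch /llms.txt from the site root; return its text, or None if absent."""
    url = f"https://{domain}/llms.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        # Covers 404s, connection failures, and timeouts.
        return None


if __name__ == "__main__":
    text = fetch_llms_txt("example.com")  # placeholder domain
    print(text if text is not None else "No llms.txt found at the site root")
```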
Importantly, implementing an llms.txt file has no negative impact on traditional SEO. It is a proactive, future-facing step that makes a site more accessible to AI tools, both now and down the line.
How soon do you see llms.txt becoming a web standard?
With AI playing a growing role in how people discover content, more companies will have to optimize their websites not only for search engines but also for AI systems. Adoption is expected to increase considerably over the coming months and years.
Whether llms.txt becomes a long-term standard remains to be seen. It could evolve into something more sophisticated, such as NLWeb or API-based solutions. But the concept of making content easily digestible for AI is here to stay.
At Hostinger, we are committed to giving our customers a competitive advantage. That is why we were among the first to offer automatic creation of llms.txt files, and we will continue to evolve our tools as the GEO landscape changes.
What else can website owners do to improve their site's visibility to AI?
Like traditional search engines, AI systems look for valuable, high-quality content. That means creating information that is genuinely useful to people, ensuring the site is fast, mobile-friendly, and easy to navigate, and making content technically accessible to crawl and index.
Every website owner must understand that AI-assisted browsing is here and growing. That means keeping up with what's new in the field of GEO and looking for tools that expose their website's content to LLMs. Today, llms.txt is a solid first step.
Looking further ahead, we believe websites may evolve toward Model Context Protocol (MCP) interfaces, where content is not only displayed for humans but also served via MCP-compatible APIs, with AI agents consuming it on behalf of users.
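As a rough illustration of that idea, here is a minimal sketch using the official MCP Python SDK, exposing one piece of site content as a tool an AI agent could call; the server name, tool, and returned text are hypothetical examples, not an actual Hostinger implementation:

```python
from mcp.server.fastmcp import FastMCP  # official MCP Python SDK

# Hypothetical server exposing a store's content directly to AI agents.
mcp = FastMCP("Example Store")


@mcp.tool()
def get_shipping_policy() -> str:
    """Return the store's shipping and returns policy as plain text."""
    return "We ship worldwide; delivery takes 5-10 business days."


if __name__ == "__main__":
    mcp.run()  # serves the tool over stdio to a connected agent
```

In this model, an AI agent asked about delivery times would call the tool and quote the answer on the user's behalf, instead of scraping the rendered page.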