Claude AI Will Soon Be Able to Control Your Browser (If You Let It)




Have you ever found yourself browsing the internet thinking, “if only an AI bot could do this for me”? I haven’t, but I imagine some people must, because Anthropic is now rolling out an experiment that allows exactly that for some Chrome users.

The company announced the new integration on Tuesday. Eligible users will now have access to a Chrome extension that, once activated, lets Claude AI see everything you do in your browser. Claude can use this context to better answer questions and requests, which you submit through the extension’s built-in chatbot window.

But while that’s one component of the feature, Anthropic’s vision goes far beyond a more helpful chatbot experience. In addition to more contextual interactions, Claude for Chrome can also take over your browser and perform actions for you. It really is the stuff of the future, even if I’m not sure it’s a future I actually want.

Here’s an example: let’s say you’re apartment hunting. Instead of opening Zillow yourself, you can click the Claude button in Chrome to launch the chatbot and tell it exactly what you’re looking for in a new home. As part of that request, you can ask Claude to search the listings on Zillow for you and share the best ones. According to Anthropic, Claude will do just that, and will even tell you which permissions it needs you to enable in the chatbot window to complete the task, such as reading page content on Zillow.com.

In another example, Anthropic shows a user asking Claude to find a well-reviewed restaurant on DoorDash that serves garlic noodles and add the dish to their cart. Claude walks through its steps, including what it sees on the DoorDash homepage, how it needs to search for “garlic noodles,” and even that it has to press “enter” to run the search.

If it works as advertised, it’s a bit wild that you can ask a chatbot like Claude to do things in a web browser like Chrome, and it will. But for most tasks, I don’t necessarily see the appeal. I suppose if you’re too busy to browse apartment listings yourself, or to find noodles to order for dinner, Claude for Chrome offers a multitasking opportunity. But I generally have no problem with these kinds of tasks. In fact, when I’ve had to find a new apartment or house, I’ve enjoyed the hunt; I also like picking a good restaurant for dinner. These aren’t things I necessarily need or want a bot to do for me, especially when the results are fairly subjective: how would Claude know which apartments suit me, or whether I’d prefer the noodles from one restaurant over another? I’d rather choose these things for myself.


Claude for Chrome and your security

Then there are the security concerns, about which Anthropic is transparent. The company acknowledges that AI browsers are susceptible to prompt injection attacks, a type of cyberattack in which bad actors embed malicious instructions for AI models. In its testing, the company found that before it implemented any of its safety measures, prompt injection attacks had a success rate of 23.6%. In one of these successful tests, Anthropic sent a malicious email containing instructions to delete all the emails in an inbox. Claude for Chrome read the email and followed the instructions. Not ideal.

But that’s without the safety measures in place, which Anthropic says it has been working on. These include user control over permissions at the site level, as well as user confirmation before Claude takes a “high-risk” action such as publishing content, making purchases, or sharing personal information. The company also improved Claude’s instructions for handling personal data, and blocked the bot from “high-risk” sites, such as those dealing with financial services, adult content, or pirated content. Anthropic is also working on more guardrails, which is why the feature is currently quite limited.

How to sign up for Claude for Chrome

At present, Anthropic is offering the initial test to just 1,000 Claude Max subscribers, a plan that costs $100 or $200 per month. The company will continue rolling out early access to more Max subscribers in the coming weeks, though I wouldn’t be surprised to eventually see the test open up to Pro subscribers ($20/month).

If you’re eligible, you can sign up for the waitlist now. While the aforementioned security guardrails are in place, the company warns that testers face the risk of malicious actors:

  • Accessing your accounts or files

  • Sharing your private information

  • Making purchases on your behalf

  • Taking actions you never intended
