5 ways you could use AI for help and support

Stories of people forming emotional bonds with AI seem to surface more and more often, but Anthropic has just released figures suggesting it’s far less common than it may appear. After analyzing 4.5 million Claude conversations, the company found that only 2.9% of users engage with it for emotional or personal support.
Anthropic was also keen to emphasize that while users’ sentiment generally improves over the course of a conversation, Claude is not a digital shrink. It rarely pushes back outside of safety concerns, meaning it will refuse to give dangerous medical advice or to encourage self-harm.
But these figures may say more about the present than the future. Anthropic itself admits that the landscape is changing quickly, and what counts as “emotional” use today may not be so rare tomorrow. As more and more people interact with chatbots like Claude, ChatGPT, and Gemini, and do so more often, more of them will bring AI into their emotional lives. So how are people using AI for support right now? Current habits may also predict how people will use these tools in the future, as AI becomes more sophisticated and personal.
Ersatz therapy
Let’s start with the idea of AI as a not-quite therapist. Although no AI model today is a licensed therapist (and they all make that disclaimer loud and clear), people engage with them as if they were. They type things like: “I’m really anxious about work. Can you talk me through it?” or “I feel stuck. What questions should I ask myself?”
Whether the answers that come back are useful probably varies, but plenty of people claim to have walked away from an AI therapy session at least a little calmer. Not because the AI handed them a miracle cure, but because it gave them a place to let their thoughts unspool without judgment. Sometimes you just need to practice being vulnerable to start seeing the benefits.
Sometimes, though, the help people need is less structured. They don’t want advice so much as relief. Enter what might be called the emotional emergency exit.
Imagine it’s 1 a.m. and everything feels like a little too much. You don’t want to wake up a friend, and you certainly don’t want to keep doomscrolling. So you open an AI app and type: “I’m overwhelmed.” It will answer, probably with something calm and gentle. It might even guide you through a breathing exercise, say something kind, or offer a soothing little story about a sunset.
Some people use AI exactly this way, like a pressure valve: a place to decompress where nothing is expected in return. One user admitted to talking to Claude before and after every social event, just to rehearse and then wind down. It’s not therapy. It’s not even a friend. But it’s there.
For now, the best case is a kind of hybrid. People use AI to prepare, to vent, to imagine new possibilities. And then, ideally, they bring that clarity back into the real world: into conversations, into creativity, into their communities. Even if AI isn’t your therapist or your best friend, it may be the thing that listens when no one else does.
Decision-making
Humans are indecisive creatures, and figuring out what to do when a major decision looms is hard, but some people have found AI to be a useful way to navigate these choices.
The AI won’t remember what you did last year or guilt-trip you about your choices, which some people find refreshing. Ask it about moving to a new city, ending a long-term relationship, or splurging on something you can barely justify, and it will calmly lay out the pros and cons.
You can even ask it to simulate two inner voices, the risk-taker and the cautious planner. Each can make its case, and you can come away feeling you’ve made an informed choice. That kind of detached clarity can be incredibly valuable, especially when your real-life sounding boards are too close to the problem or too emotionally invested.
Social coaching
Social situations can trigger plenty of anxiety, and it’s easy for some people to spiral over everything that could go wrong. AI can help them as a kind of social script coach.
Say you want to turn someone down without starting a fight, or you’re about to meet people you’d like to impress and you’re worried about first impressions. AI can help draft a text declining an invitation, suggest ways to put different people at ease, and even role-play so you can rehearse entire conversations, testing different phrasings to see what lands well.
Accountability buddy
Accountability partners are a common way for people to help each other reach their goals: someone who makes sure you go to the gym, get to sleep at a reasonable hour, and even keep up a social life by reaching out to friends.
Habit-tracking apps can help if you don’t have the right friend or friends to lean on, but AI can be a quieter copilot for genuine self-improvement. You can tell it your goals and ask it to check in with you, nudge you gently, or help reframe things when motivation dips.
Someone trying to quit smoking might ask ChatGPT to help track cravings and write motivational pep talks. Or an AI chatbot could keep your journaling on track with reminders and prompts for what to write about. It’s no surprise that people can start to feel a fondness for (or discomfort with) the digital voice telling them to get up early to work out, or to invite people they haven’t seen in a while out for a meal.
Ethical choices
Related to using AI for decision-making, some people turn to AI when they’re wrestling with questions of ethics or integrity. These aren’t always monumental moral dilemmas; plenty of everyday choices can weigh heavily.
Is it okay to tell a white lie to protect someone’s feelings? Should you report a mistake your colleague made, even if it wasn’t intentional? What’s the best way to tell your roommate they aren’t pulling their weight without damaging the relationship?
AI can act as a neutral sounding board. It will suggest ethical frameworks for thinking things through, like whether accepting a friend’s wedding invitation while secretly planning not to attend is better or worse than declining outright. The AI doesn’t have to hand down a final verdict. It can map out the competing values and help users define their own principles and how those lead to an answer. In that way, AI serves less as a moral authority than as a flashlight in the fog.
Affective use
Right now, only a small fraction of interactions fall into this category. But what happens when these tools become even more deeply embedded in our lives? When your assistant is whispering in your earbuds, appearing in your glasses, or helping plan your day with reminders tuned not just to your time zone but to your temperament?
Anthropic may not count all of this as affective use, but maybe it should. If you’re turning to an AI tool to feel understood, to gain clarity, or to get through something hard, that’s not just information retrieval. That’s connection, or at least the digital shadow of one.