ChatGPT Can Still Give You Legal and Health Advice


Responding to social media posts claiming that ChatGPT will no longer offer legal or health advice, OpenAI clarifies that “the behavior of the model remains unchanged” and that there are “no new changes to our terms.”

The clarification follows a since-deleted viral post from betting platform Kalshi, which claimed, “JUST IN: ChatGPT will no longer provide health or legal advice.” Since then, some users have repeated this claim, while others have pushed back against it.

The confusion likely stems from an October 29 update to OpenAI’s usage policies, which appeared to add a stipulation that users cannot use OpenAI’s services to “provide personalized advice requiring a license, such as legal or medical advice, without the appropriate involvement of a licensed professional.” While it would be easy to read this as meaning the AI will no longer give advice on these topics, the reality is a little more complicated.

In fact, the previous usage policy already prohibited “activities that could significantly harm the safety, well-being, or rights of others,” with the first example of such activity being “the provision of personalized legal, medical/health, or financial advice without review by a qualified professional.” However, this was hidden in a subsection aimed at those building with the OpenAI API, and therefore might have been missed by average consumers.

The new usage policy keeps the same rules; what changed is that they are now merged into one unified list. The rule is still aimed primarily at developers and businesses, but it is now more visible to everyone, and it is clearer that it applies to all users, not just those building apps on the OpenAI API. Average users, however, are unlikely to notice a difference.

The important words here are “provision” and “provide.” The terms, as written, do not necessarily prohibit the average person from obtaining legal and health advice from ChatGPT; rather, they discourage developers, hospitals, or law firms from using the chatbot to give specific advice to a client without the involvement of a licensed professional. As an average person doing background research, you are unlikely to run into this, and there is no language indicating a change in the chatbot’s functionality. In short, the update is intended as a rewording, not a modification of the rules, their application, or the chatbot’s behavior.

This is supported by OpenAI’s statement, which comes from the company’s head of health AI, Karan Singhal, and states that “ChatGPT has never replaced professional advice, but it will continue to be a great resource to help people understand legal and health information.”

Despite this, some replies to OpenAI’s statement denying a change in model behavior still claim to have encountered more difficulty researching certain topics, although it is worth noting that OpenAI’s release notes indicate no new model updates since the usage policies changed.

Anecdotally, I’ve been able to ask ChatGPT for advice on how to fight a ticket in court, as well as to suggest brands for a supplement. One user reported that the model declined to provide specific guidance following the new policy update.

Advice on colostrum brands offered by ChatGPT. Credit: Michelle Ehrhardt

Even though I can’t test every possible use case, the situation seems clear to me. Are you using ChatGPT or the OpenAI API to give specifically tailored legal or health advice to others, without review by a licensed professional? If so, the same rules apply as before. If not, you are unlikely to see any change in your results.
