‘Rectal garlic insertion for immune support’: Medical chatbots confidently give disastrously misguided advice, experts say


Popular AI chatbots often fail to recognize false health claims when those claims are couched in plausible, medical-sounding language, leading to questionable advice that could be dangerous to the general public, such as a recommendation that people insert garlic cloves into their butts, according to a study published in January in the journal The Lancet Digital Health. Another study, published in February in the journal Nature Medicine, found that chatbots were no better than a simple internet search.

The findings add to a growing body of evidence suggesting that such chatbots are not reliable sources of health information, at least for the general public, experts told Live Science.
