Gandalf_The_Grey
Apr 24, 2016
What you need to know
- Researchers based in Germany and Belgium recently asked Microsoft Copilot a range of common medical questions.
- Analysing the results, the researchers found that Microsoft Copilot offered scientifically accurate information only 54% of the time.
- The research also suggested that 42% of the answers generated could lead to moderate or mild harm, and 22% to severe harm or even death.
- It's another blow for "AI search," which has already seen search giant Google struggle with AI summaries recommending that users "eat rocks" and put glue on pizza.
The researchers conclude that, of course, you shouldn't rely on AI systems like Microsoft Copilot or Google's AI summaries (or, arguably, any website) for accurate medical information. The most reliable way to handle medical issues is, naturally, to consult a medical professional. But access to medical professionals isn't always easy, or in some cases even affordable, depending on the territory. AI systems like Copilot or Google could become the first port of call for many who can't access high-quality medical advice, and as such, the potential for harm is very real.
"42% of AI answers were considered to lead to moderate or mild harm, and 22% to death or severe harm." A damning research paper suggests that Bing / Microsoft Copilot AI medical advice may actually kill you.
Don't use AI as your doctor, says European research on Microsoft Copilot.
www.windowscentral.com