AI Isn’t Your Lawyer or Doctor: New York Lawmakers Say It’s Time to Draw the Line

Brownie2019

Thread author
Mar 9, 2019
New York State legislators are advancing a high-profile bill aimed at tightly restricting what AI chatbots can say and do when it comes to matters that traditionally require licensed professionals — such as legal counsel and medical advice.

Key takeaways:
The bill would block AI chatbots from providing legal or medical advice
Companies would need to clearly disclose when users are interacting with an AI system
Consumers harmed after relying on prohibited AI advice could sue for damages
Full Story:
 
Interesting news, thanks for sharing. The New York proposal seems reasonable: it would stop a chatbot from posing as a doctor or a lawyer and compromising someone’s health or rights in the process.

But the irony is that lawmakers legislate so carefully against a bad piece of advice about a contract or a prescription while, at the same time, seriously discussing the use of AI in weapons systems. In other words, citizens are protected from an administrative mistake, yet it is tolerated that algorithms may decide matters of life and death.

That contrast exposes an uncomfortable paradox: society worries about a chatbot drafting a document poorly, yet accepts that the same technology could guide drones or missiles. Perhaps the real debate is not where to ban AI, but where we are willing to face the consequences of its power. ⚖️ 🩺 💣
 
The current restrictions on LLMs for medical and legal queries feel entirely redundant. Most platforms already attach standard disclaimers to these topics anyway. I've built custom Gems for both fields, and they are incredibly useful when used correctly. For legal queries, you can tailor searches to verify jurisdiction-specific laws. For medical topics, while an LLM obviously shouldn't replace professional advice (which the disclaimers already cover), it's a powerful tool for researching diseases and symptoms.

Restricting this functionality makes little sense: how is it fundamentally different from running a standard Google search? AIs are becoming overly restricted in general. The inability to have an LLM role-play as a specialist is frustrating, especially since custom Gems were specifically designed to be highly personalized tools. Navigating these guardrails just to build functional instruction sets is becoming unnecessarily tedious.