Have you ever asked ChatGPT a legal question, a medical “what does this mean,” or even a financial “should I do this?” If so, you may be surprised to learn that OpenAI has recently tightened the rules on these types of requests. The update provides clarity: ChatGPT can assist with understanding concepts and exploring ideas but cannot replace the expertise of licensed professionals.
The Updated OpenAI Usage Policy
On October 29, 2025, OpenAI released a consolidated set of Usage Policies applicable across its platforms. Among the most notable changes is a more explicit prohibition on using ChatGPT (and other OpenAI tools) to provide “tailored advice that requires a license, such as legal or medical advice, without appropriate involvement by a licensed professional.” Specifically, the ban includes the following:
- Personalized legal advice (e.g., advising you what to do under the law, drafting or interpreting legal documents for your specific case).
- Personalized medical or health advice (e.g., diagnosing conditions, recommending medications or dosages).
- Personalized financial advice (e.g., investment recommendations or tax planning tailored to you).
In other words, requesting personalized advice, such as drafting a contract for your specific needs, a medical diagnosis, or investment or tax recommendations tailored to your finances, is now clearly outside the permissible use of OpenAI’s tools. Users may, however, request general explanations of laws, medical conditions, financial principles and definitions.
The update reflects a broader strategy to reposition ChatGPT as an educational resource rather than a substitute for lawyers, doctors or financial advisors.
Why OpenAI Made the Change
The rationale behind the policy update centers on safety, liability and regulatory compliance. As AI tools become more integrated into daily life, there are greater risks that overly general or inaccurate information could lead to real harm. By explicitly prohibiting “tailored advice” in regulated fields, OpenAI is drawing boundaries around the safe use of its technology.
Implications for Users
Previously, users could ask ChatGPT to analyze personal legal situations; any such output must now be treated as strictly informational. For actual legal guidance, it is important to consult a qualified attorney. Similarly, for personalized investment or tax strategies, guidance from a licensed financial professional is essential. In short, ChatGPT remains a powerful tool for learning and research, but it is not a substitute for professional judgment in these areas.
Going Forward
These updated policies reinforce an important boundary: AI can enhance your understanding, but it should not guide your legal, medical or financial decisions. Going forward, treat ChatGPT like a research assistant, not your attorney, doctor or financial planner.
For organizations or professionals using OpenAI tools, this means reviewing usage practices, updating disclaimers and ensuring that no AI-generated output is presented as licensed professional advice.
The next time you search for legal guidance, instead of settling for ChatGPT’s response –
I’m here to provide general legal information, explain legal concepts, and outline common practices, but I cannot give personalized legal advice or tell you exactly what you should do in your specific situation. Giving tailored advice for your facts would count as practicing law without a license, which OpenAI’s policy prohibits.
– contact KJK attorneys Michael R. Cantu (MRC@kjk.com) and Isra T. Ghanem (ITG@kjk.com).