Artificial intelligence (AI) has made its way into nearly every corner of our professional and personal lives. From research assistants to document-drafting aids, tools like ChatGPT promise efficiency and convenience. Yet, for lawyers, these tools carry unique dangers, both ethical and professional, that cannot be ignored.

Over the last year, we have all seen the headlines: lawyers sanctioned for filing pleadings generated by ChatGPT without verifying the citations. In one widely reported Georgia case, a lawyer filed an appellate brief filled with "hallucinated" case law: citations to case opinions that never existed. The Court of Appeals sanctioned the lawyer, referred the matter to the State Bar, and the resulting disciplinary matter became front-page news. It's a cautionary tale every Georgia lawyer should take seriously.

The Bar’s Response: A Special Committee on AI and Emerging Tech

Recognizing that courts and clients alike are raising concerns about AI use in the practice of law, the State Bar of Georgia has created the Artificial Intelligence and Emerging Technology Committee. The 27-member body has been tasked with reviewing the Georgia Rules of Professional Conduct and determining whether our existing rules adequately address lawyers’ use of AI.

The Committee will make recommendations to the Supreme Court of Georgia and the Board of Governors on whether new or revised rules are needed to address AI’s impact on practice, confidentiality, competence, and client protection. For now, the Committee’s formation signals that the Bar considers this issue a serious one — and that Georgia lawyers should, too.

OpenAI’s Policy Change: A Corporate Shift with Professional Consequences

Adding to the conversation, OpenAI, the company behind ChatGPT, recently updated its own policy, banning ChatGPT from providing professional legal or medical advice. Under the new rule, effective October 29, 2025, the platform is limited to educational or informational use only.

This may sound like a technology company being cautious, but it has larger implications for our profession. It shows that even the developers of AI understand the risks of crossing into areas that require licensing and accountability: risks that we, as lawyers, shoulder every day.

When an AI model generates advice that sounds authoritative but is inaccurate, misleading, or completely fabricated, clients can be harmed. And when those clients bring AI-generated material into our offices or courtrooms, it becomes our problem. Whether the advice came from “ChatGPT” or an “AI assistant” does not matter. The duty of competence, supervision, and diligence still rests squarely with us.

Where Risk and Responsibility Intersect

For lawyers and firms, the takeaway is simple: AI is a tool, not a substitute for professional judgment. Georgia’s Rules of Professional Conduct already require competence (Rule 1.1), honesty (Rule 4.1), diligence (Rule 1.3), and candor toward the tribunal (Rule 3.3). Each of these rules can be implicated if a lawyer uses AI carelessly, or even if a staff member does so without appropriate supervision.

This is why firms should now be reviewing how AI is used internally. Are associates or staff members relying on ChatGPT or similar tools for research or drafting? Are there written policies in place about AI use? Do engagement letters make clear that only licensed attorneys, not AI, provide the legal advice?

What Smart Firms Are Doing Now

Forward-thinking firms are responding in several practical ways:

– Auditing internal use
– Updating policies
– Revising engagement letters
– Training staff and associates
– Documenting AI involvement

These are not mere formalities. They are the first line of defense against malpractice claims and disciplinary grievances that can arise from misuse or misunderstanding of AI.

Why It Matters and How PLEPP Helps

Over 25 years of practice has shown me that most malpractice and disciplinary issues can be avoided through proactive education and structure. That’s why we created the Professional Liability & Ethics Protection Program (PLEPP) — a membership community for Georgia lawyers who want to stay ahead of risk while maintaining ethical excellence.

Within PLEPP, we focus on risk prevention, sharing strategies that help lawyers protect themselves, their firms, and their clients. Our latest members-only article dives deeply into the Bar’s new AI Committee, OpenAI’s policy shift, and the specific steps Georgia lawyers should be taking now to reduce exposure.

If you have ever wondered how to safely integrate technology into your practice or how to avoid being the next cautionary headline, this is the time to learn more.

Read the Full Members-Only Analysis

The full article, “OpenAI’s New Policy and the Bar’s Special Committee on AI: What Georgia Lawyers Need to Know,” is available exclusively to members of the Professional Liability & Ethics Protection Program.

Join PLEPP today to access the full analysis, quarterly risk-management resources, and member-only community Zoom discussions with updates on malpractice prevention, ethics, and technology trends affecting Georgia lawyers.

See if you qualify for PLEPP Membership now: Apply Today
