Bar Associations Threaten AI Tools for Pro Se Litigants With UPL Prosecutions

Think back to 2013. New companies were starting to use machine learning to mine trends and insights from legal databases. Their output looked an awful lot like legal advice, even if it was generated by algorithms. At the time, I worried that these companies might inadvertently run afoul of prohibitions on the unauthorized practice of law (UPL). Over lunch, I warned the CEO of one of the leading companies about these risks. He acknowledged my concern and said he would prepare a letter. Who knows what came of it. I outlined some of these concerns in a short article entitled Robot, Esq., in a book chapter, and in a post entitled “Emerging Ethical Issues for Legal Analytics.” Here’s an excerpt:

The fourth problem, and the second elephant in the room, is the unauthorized practice of law (UPL). Reading charts to offer advice on how a case should be resolved, or where it should be transferred, is at the heart of the practice of law. The fact that an algorithm spits out the advice doesn’t really matter. Non-lawyers, or even lawyers who do not work for a law firm, cannot give this type of advice. Data analytics firms should be careful about providing this type of personalized advice outside the context of an attorney-client relationship.

Although, for now, I’m not too worried about this last problem. The vast majority of UPL issues are avoided when a law firm or general counsel acts as an intermediary between the data analytics firm and the (non-lawyer) client. As long as a lawyer somewhere in the pipeline independently reviews the data-analytics recommendations and gives them his blessing, I don’t see any of this as a significant problem (although bad advice can still result in a malpractice suit). I’m working on another paper that analyzes paralegal law (this is actually a thing) and what types of legal tasks can be delegated to paralegals under attorney supervision.

But when data analytics companies try to serve consumers directly, as LegalZoom does, we run headlong into this problem. When there is no lawyer in the loop, things get difficult very quickly.

Flash forward to the present day. ChatGPT and similar AI tools can directly assist pro se litigants. Consider the best-laid plans of Joshua Browder, who created a system for contesting traffic tickets.

A British man who planned to have a “robot lawyer” help a defendant fight a traffic ticket has backed out after receiving threats of possible prosecution and jail time.

Joshua Browder, CEO of New York-based startup DoNotPay, created a way for people challenging traffic tickets to use artificial intelligence-generated arguments in court.

Here’s how it was supposed to work: the person challenging the speeding ticket would wear smart glasses that would record the court proceedings and dictate answers into the defendant’s ear from a small speaker. The system was powered by several leading AI text generators, including ChatGPT and DaVinci.

The first legal defense based on artificial intelligence was supposed to take place in California on February 22, but that is no longer happening.

This plan did not go well. Browder was allegedly threatened with UPL prosecution.

As word got out, according to Browder, uneasiness began to swirl among various state bar officials, and angry letters began to arrive.

“Multiple state bar associations have threatened us,” Browder said. “One even said a referral to the district attorney’s office, with prosecution and jail time, would be possible.”

Specifically, Browder said one state bar official noted that unauthorized practice of law is a misdemeanor in some states punishable by up to six months in county jail.

Lawyers are very good at using cartels to suppress competition. Legal tech companies, beware.
