An AI Lawyer Is About to Defend a Human in a U.S. Courtroom

Photo Illustration by Kelly Caminero / The Daily Beast / Getty

The cost of hiring an attorney to represent you in traffic court can often exceed the fine. And that’s assuming you can find a lawyer to take on such a low-stakes case. Why not save money and get advice from artificial intelligence instead?

That’s a solution Joshua Browder, CEO of consumer-liberation startup DoNotPay, is testing next month, when his company will pay two defendants headed to traffic court up to $1,000 each to wear smart glasses that will double as their attorneys.

Yes, we’re living in a simulation, and it involves sentient eyewear.

Well, partially sentient eyewear, at least. The glasses will record the proceedings, and a chatbot built on OpenAI’s GPT-3 (the language model famous for generating ballads and high school essays on demand) will offer legal arguments in real time, which the defendants have pledged to repeat, Browder told The Daily Beast. To keep judges from canceling the stunts, the locations of the hearings are being kept secret, and each defendant can opt out at any time if they wish.

“My goal is that the ordinary, average consumer never has to hire a lawyer again,” said Browder.

DoNotPay, founded by Browder in 2015 while he attended Stanford University, states on its website that its mission is to help consumers “fight against large corporations and solve their problems like beating parking tickets, appealing bank fees, and suing robocallers.” Its app is meant to help users navigate the modern-day bureaucracy that gets in the way of everything from canceling subscriptions, to disputing fines, to bringing litigation against anyone they wish to sue. The company started out helping users contest $100 parking tickets, but thanks to advances in AI, Browder said, it’s now helping clients fight bigger claims, like $10,000 medical bills.

The company’s latest trial will make use of CatXQ’s Smart Glasses. With square lenses and a spindly black frame, the glasses seem relatively unassuming at a glance, but they can connect to devices via Bluetooth and deliver sound straight to the wearer’s cochlea (the hearing organ in the inner ear) through bone conduction, similar to how some hearing aids work. The chatbot will run on the defendant’s phone as a regular app, absorbing audio through the device’s microphone and dictating legal arguments through the glasses.

Don’t expect the chatbot glasses to become a marketable product anytime soon, though, due to legal restrictions: practicing law in the United States, which includes giving legal advice and representing parties in court, requires a license, and many states have banned recording inside courtrooms.

Nonetheless, Browder sees his company’s new experiment as an opportunity to reconceptualize how legal services could be democratized with AI.

But legal experts warned that putting one’s rights in the hands of an algorithm is an ethically worrisome fix for insufficient or inequitable legal representation. The technology could create separate legal consequences for defendants far more complicated than a traffic ticket, and chatbots may not be the path to justice that Browder and others envision.

With Prejudice

GPT-3 is good at holding a conversation and spitting out some interesting ideas, but Browder admits it’s still bad at knowing the law. “It’s a great high school student, but we need to send it to law school,” he said.

Like any AI, GPT-3 must be trained properly. DoNotPay’s law school for bots takes the form of mock trials run by team members at the company’s Silicon Valley headquarters in Palo Alto. The algorithms are fed datasets of legal documents drawn from publicly available court records and DoNotPay’s own roster of 2.75 million cases dating back to its conception in 2015, according to Browder. The bot that goes before a judge is trained on recent traffic ticket cases from the same jurisdiction and some adjacent counties; a quarter of these cases come from DoNotPay’s own database, while the rest come from publicly available records.

But all AI carries the risk of bias, because society’s prejudices inevitably find their way into these datasets. If the cases an AI is trained on disproportionately found people of color guilty, the AI will start to associate guilt with certain races, Nathalie Smuha, a legal scholar and philosopher at KU Leuven in Belgium, told The Daily Beast.

“There is a risk that the systemic bias that already exists in the legal system will be exacerbated by relying on systems that reflect those biases,” she said. “So, you kind of have a loop, where it never gets better, because the system is already not perfect.” Similarly, not all legal cases are public, and the algorithm may only be trained on a subset restricted by specific dates or geography—which can distort the bot’s accuracy, Smuha added.

None of this is news to the American public. A 2017 study by Princeton researchers on the discretion of Florida police officers in issuing speeding tickets found that a quarter of officers showed racial bias. And the 2018 book Suspect Citizens, a 14-year analysis by political scientists of 20 million traffic stops in North Carolina, found that Black drivers were 95 percent more likely to be stopped.

Any AI trained on those datasets risks developing unfair biases against certain demographics, which could affect how it delivers legal advice in traffic court. Browder told The Daily Beast that DoNotPay has taken steps to reduce bias: the part of the bot responsible for absorbing a case’s substance and making legal arguments doesn’t know any personal information about the defendant other than vehicle type.

These bias concerns aren’t limited to fighting traffic tickets, either. If applied to more complex cases across the justice system, Browder’s automated legal utopia could lead to more serious systemic injustices against marginalized communities.

In fact, we’re already seeing this unfold. Criminal risk assessment tools that rely on socioeconomic factors like education, income, and housing are already used by some judges to inform sentencing, and have been found to worsen disparities. The NYPD uses predictive policing algorithms to decide where to deploy facial recognition technology, a practice Amnesty International has called “digital stop-and-frisk.” In 2013, The Verge reported on how the Chicago Police Department used a predictive policing program to label Robert McDaniel a “person of interest” in a shooting, despite his having no record of violence. And just last month, a facial recognition algorithm led to the wrongful arrest of a man in Louisiana.

When asked about algorithmic biases, Browder said that people can use AI to fight AI—the bot puts algorithms into the hands of civilians. “So, rather than these companies using it to charge fees, or these governments using it to put people in jail, we want people to be able to fight back,” he said. “Power to the people.”

Given the lack of regulation around AI, though, that outcome is unlikely to happen.

The Can of Worms

Bias aside, defendants could also end up in hot water simply for using the technology and recording the proceedings, which is uncharted territory for the legal community. “Is [Browder] going to help erase their criminal conviction for contempt?” Jerome Greco, a public defender in the Legal Aid Society’s digital forensics unit, told The Daily Beast.

While DoNotPay has committed to paying any fines or court fees for clients who use its chatbot services, Browder does worry about what could happen if the bot is rude to the judge, a misdemeanor that could normally land a person in jail. And Smuha predicts that a malfunctioning chatbot wouldn’t be an adequate excuse: “A courtroom is where you defend yourself and take responsibility for your actions and words—not a place to test the latest innovation.”

And of course, there’s the risk that the algorithm could simply mess up and provide the wrong answers. If an attorney fails to follow up on your case, there are several ways to hold them responsible, including filing complaints and suing. It’s far less clear what framework protects you if the chatbot botches your legal arguments. Are you to blame? The engineers who trained the bot? The biases in the training data?

Part of the problem, Smuha said, is that the technology is imperfect because software doesn’t understand what its data means. “Take the sentence ‘that man is not guilty,’” she said. “The software has no idea what ‘man’ is or what the concept of ‘guilty’ is.” That’s a stark contrast to the years of training and ethical standards that lawyers are held to. “There will be a risk that the system will speak nonsense.”

As a result, AI-enabled databases and pattern-spotting tools simply speed up the legal process, as opposed to determining a case’s outcome, “because the tech is just not accurate enough yet,” Smuha said.

Browder appears unfazed by such criticisms, even responding to them with bravado. Last week, he trolled the legal community on Twitter by promising $1 million to any person or attorney with an upcoming Supreme Court case who would follow the chatbot’s counsel. “I got so much hate from all the lawyers,” he said. He later deleted the tweet, saying he would raise the offer to $5 million.

Greco finds the whole spectacle unsettling, and takes issue with DoNotPay recruiting its test subjects from among poorer defendants who can’t afford a human attorney. “Using them as guinea pigs to test an algorithm? I have a real problem with that,” he said. “And I think it overlooks the other solution… Why don’t we put more money into people having proper representation?”

Browder, for his part, believes this is only the beginning for consumer rights. “Courts should allow it, because if people can’t afford lawyers, at least they can have some help.”
