Comments

nahbruh27 t1_j5noinu wrote

Imagine being the person whose court-appointed lawyer is a fucking AI bot lmao. Especially if the robot is pre-programmed to be pro-government/prosecution and won't work in the defendant's best interest. This could be a reality in the future…

288

remote_control_led t1_j5nrteb wrote

The human could equally be a pro-government brainwashed moron tbh.

68

scrivensB t1_j5p2fb3 wrote

While this is true, very few people become public defenders because they want to serve "the man". It's a thankless job that's 99% about helping people who have no way to help themselves.

34

DukeOfGeek t1_j5w73hg wrote

Ya if this thing does a great job of defending people and derails prosecutions, it'll be banned.

1

ShortEnergy1877 t1_j5oi8fr wrote

Yes, however, we as humans have a near-infinite number of ways something can be interpreted, and we can discuss those out very logically and quickly and find commonality. While the AI robot will get better over time, I do not think there is a way to program something without preset parameters; please correct me if I am wrong?

Whereas we can discuss, agree, disagree, and debate pretty openly amongst ourselves in a common language.

13

PooFlingerMonkey t1_j5rr08l wrote

You are correct about preset parameters, but if fed an input of existing cases, transcripts, and rulings, it would quickly get pretty good at defense tactics.

0

ShortEnergy1877 t1_j5rtx6g wrote

And the bank of prior cases would be more than any legal team could rattle off from memory?

0

PooFlingerMonkey t1_j5s3vgz wrote

The more input fed in, the more accurate the model would be, as long as an observable, in this case the verdict, is used as ground truth.
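
A minimal sketch of that idea, supervised learning where each past case's text is the input and the verdict is the label; the scikit-learn calls are generic and the toy case data is invented purely for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy "case file" texts paired with the observable used as ground truth: the verdict.
cases = [
    "officer had no radar calibration record, stop on unlit road",
    "driver admitted speeding, radar calibrated last week",
    "signage obscured by construction, no prior violations",
    "driver ran red light on camera, two prior violations",
]
verdicts = ["dismissed", "guilty", "dismissed", "guilty"]

# TF-IDF features + logistic regression: more (case, verdict) pairs
# generally means better estimates of which facts predict which outcome.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(cases, verdicts)

print(model.predict(["radar was never calibrated and the road was unlit"]))
```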

1

ShortEnergy1877 t1_j5s4lzc wrote

Okay. I'm sure there'll be a class discussing AI at some point in my degree. Right now I'm just learning C++, so nothing super difficult. It's going to be weird, because if law can be assisted by AI and surgery can be done via robots, it may allow more humans to pursue other endeavors in science. I saw the articles on the AI that was doing gene sequencing, where they reduced a process that used to take weeks down to hours.

1

PooFlingerMonkey t1_j5s61cs wrote

Cool. You're likely to run into machine learning early in your studies; libraries are available for voice recognition, video and image recognition, and many other AI functions, for example.

1

1funnyguy4fun t1_j5ojh89 wrote

If you have not, I strongly suggest you sit down to a computer and check out chatGPT. We are on the cusp of Star Trek. The future is now.

−15

darthlincoln01 t1_j5p7ky4 wrote

There seems to be a once-a-decade jump in AI, and ChatGPT seems to be representative of that jump.

That said, being on the cusp of Star Trek we are not. ChatGPT is a very useful and strong tool; however, in general you still need to be a professional in the subject matter to use it. At this time it frequently spits out garbage responses, and you need to be an expert on the subject to know what's garbage and what's not.

I don't see us getting to the point where a layman can ask something like ChatGPT a question and have faith in the answer they receive back for at least another decade. Until then, however, ChatGPT is something an expert can use to significantly reduce their workload by having it do a majority of the work for them, just so long as they're practiced enough to ask ChatGPT the right questions.

7

esther_lamonte t1_j5piv82 wrote

Meh, I sat down and tried many things and was underwhelmed. It could produce decent general-purpose code that you could tweak to fit something complicated, but the experience was more like a slightly faster version of what many people do now: google for sample code and adjust it to fit their specific needs. It could produce reasonable enough expository articles about subjects it could access information about, but when I asked it to do any kind of analysis, like give me readability edits for an existing page or an SEO density analysis, it just told me it doesn't do those kinds of things that require complex associations between information.

3

ShortEnergy1877 t1_j5ojo6x wrote

I grew up on a farm without a lot of tech, so I am far behind on a lot. But I got into university in my 30s and am just now learning programming, so I have a lot to learn. But still. Doesn't AI need guidelines and parameters in their logic?

0

Rulare t1_j5okugs wrote

> But still. Doesn't AI need guidelines and parameters in their logic?

IIRC not really, it just learns that from its input. Like, you don't teach Asimov's rules to a chatbot.

2

DjaiBee t1_j5opt7f wrote

> it just learns that from its input.

I mean, that ends up being its guidelines and parameters, no?

2

MatsThyWit t1_j5oymst wrote

Yes, but people have no idea how to actually discuss AI because very, very, very, very few people actually understand what it is, or what it means for an AI to "learn", or even how an AI can learn in the first place.

4

jerekhal t1_j5ppikd wrote

It absolutely does, but that's why this is such a big thing. Law is very formulaic, and if the AI can properly interpret case law and statutes and apply them to present legal standards, it would be huge.

The biggest hurdle for the layperson in understanding legal proceedings is that a lot of it looks like ritual: there's specific terminology and behavioral patterns that magically cause weird shit to happen. In reality it's just foundational professional knowledge; when those terms are brought up, they bring about specific expected responses.

The law is a perfect test bed for AI because the procedures are pretty rigid, the end-point goal is something based on specific precedent and guidelines, and one of the biggest burdens to a successful case is clearly identifying connecting points to demonstrate your position is the most in line with established law.

Sorry to piggyback off your comment but it prompted this thought and I'm excited to see how this ends up. I know a few attorneys who are kind of sweating bullets atm due to this but I'm all for advancement in technology. Especially that which would make legal assistance more accessible and less costly.

−2

fvb955cd t1_j5r2pal wrote

No attorney is concerned unless they make their living on rote work that a paralegal or intern could do. I've seen what ChatGPT does with my field of law. It can write blog posts summarizing the basics. It has no concept of nuance, no ability to correctly or even coherently apply facts to law, and it fails the second you ask it anything beyond the easiest questions. It's the kind of thing that looks functional to people who aren't actually lawyers, and looks comically rudimentary to lawyers.

3

jerekhal t1_j5r368w wrote

Well this is being applied to a traffic ticket so I would imagine its applicability would be to areas of law that are extremely rote and don't require diligent legal analysis or complexity of thought or approach.

But then again how many lawyers do you know that only do bankruptcy/divorce/admin law/etc.? Because those are the attorneys I'm referencing if I'm being honest. And there's a lot of them.

Edit: Admittedly family law is an exception there just because clients cause absolute fucking havoc in that domain no matter what, so probably shouldn't have included that.

0

ReturnOfCE t1_j5onkq5 wrote

This is why any such technology must be open-source

9

bearedbaldy t1_j5ow6h8 wrote

Still, look at the state of public defenders nowadays. Their caseload is untenable, and they hardly get paid a living wage.

If we can have the lawyer AIs be monitored and overseen by independent agencies, this could be a huge gamechanger for the poor and underrepresented... which is exactly why it won't work, I suspect.

8

fvb955cd t1_j5r368p wrote

More likely, it'll take over basic but time-consuming work like developing relevant facts from clients, tied to the elements of a case, which a human lawyer can use to get a handle on the case more quickly and figure out whether to take a plea or how to approach trial.

3

Ha1rBall t1_j5p9i1o wrote

> the robot is pre-programmed to be pro-government/prosecution and won’t work in the defendant’s best interest

I know the saying about representing yourself in court, but I would do that over having a robot programmed like that.

1

pomaj46809 t1_j5qw7t4 wrote

>Especially if the robot is pre-programmed to be pro-government/prosecution

In reality, real lawyers will be reviewing the case and will question how and why the AI failed to do what a normal defense lawyer would have. If they notice a pattern that suggests it's not working in the defendant's best interest, the matter will come to light.

1

karl4319 t1_j5rwp31 wrote

Well, having known plenty of public defenders, I think AI would be a significant improvement. And it isn't like a good deal of public defenders don't already work against their defendants' best interests.

1

scrivensB t1_j5p26xx wrote

Imagine if it’s a better lawyer than the prosecution.

0

joethomp t1_j5nlyyo wrote

The lawyers will be rushing to ban this asap.

65

PutinsRustedPistol t1_j5oeik8 wrote

They won’t have to.

Until that robot is able to pass the bar exam, it isn't going to be able to represent anyone. Traffic citations are a summary offense. The courts you go to if you want to fight a citation aren't courts of record (at least in the states I've lived in so far). Rules of evidence and procedure are incredibly relaxed.

Traffic courts share that in common with small claims courts.

42

Texasraised420 t1_j5og4wh wrote

Passing the bar exam is probably not an issue for AI, or won't be for long.

23

grumblyoldman t1_j5okkbj wrote

OK, but most defendants do have the option of representing themselves, right? I know, it's an abysmally foolish thing to do, but it is allowed, right? Even for the cases with serious charges involved.

And most defendants who choose to represent themselves probably haven't passed the bar exam. (If they had, they'd likely know what a stupid idea it is.)

So what's the difference between having "an AI lawyer represent you", and choosing to represent yourself while checking a fancy legal AI app on your phone? Aside from the fact that most courts ban phones and internet-connected devices in the courtroom as a matter of privacy, I mean.

Until such time as AI is officially recognized as people (which is a whole other can of worms) this fancy legal AI is just an app on my phone and I'm representing myself. If we assume the general limitation of using one's phone is lifted, why should my app need to pass the bar?

16

joethomp t1_j5oxu9c wrote

Corporations can have 'person' status.

4

r2bl3nd t1_j5ozvck wrote

It already passed the bar exam. But that just goes to show how woefully inadequate the bar exam is for actually determining real world ability.

3

0b0011 t1_j5prq5v wrote

I've never taken or looked into the bar exam but shouldn't it be just like testing knowledge of the laws and how they'd apply?

1

r2bl3nd t1_j5psjrb wrote

Yeah, but knowledge of laws and how they apply is only a very small fraction of the actual job of being an attorney. It's a test to make sure that you understand the legal framework, but it says nothing about your actual ability to be a good lawyer. I mean, there are already plenty of well-known cases of totally incompetent lawyers out there who have technically passed the bar exam. So I guess people probably never considered it a test of actual job aptitude, but more a test of basic knowledge of law and the legal framework. But I don't really know anything about law, so this is just speculation really.

1

fvb955cd t1_j5rt58c wrote

A big issue with it is that it is really only relevant to one segment of lawyers: state-law generalists. The sort of lawyer you go to to get a will, sue for a slip and fall, set up your LLC, and represent you for your DUI.

That is one of the larger groupings of lawyers, but it's one of a ton of different groupings. And now more than ever, most lawyers specialize. The bar does nothing for lawyers in the transactional field (think drafting contracts and other business documents), and nothing for those in the regulatory world (when government agencies say you or your company needs to do something, and you won't go to jail for refusing or breaking their rules but you'll pay money). The bar is focused on litigation, the third major discipline, but again focuses on specific state law, not federal-law issues or any of the many subjects not tested. In effect, you could take someone who spends their whole life as a professional contract negotiator and writer, someone at the absolute top of their profession, put them through law school, and whether they could continue to do their job as a lawyer would be based on shit like their knowledge of divorce laws and criminal court procedure. I'm in an office of 30, in a practice area of probably a few thousand nationwide, and what we do every day isn't tested anywhere. What value does it provide us?

There's also the issue that you don't practice like you take the bar. Bar prep is a two-month, intensive, full-time study process where you memorize as much as you can about a bunch of subjects, and then write about them and take a multiple-choice test covering all 15+ subjects.

In reality, the only time that I have had to know something in advance, without time to do prep research, was in a natural disaster with lives on the line. And it wasn't a bar exam subject, which are all pretty slow moving and routine subjects that wouldn't save or cost lives in a disaster.

1

TheFriendlyAna t1_j5o7un0 wrote

Now they know how the artists feel

12

theloreofthelaw t1_j5ogg5a wrote

US law student halfway to graduation here.

I would say much more than half of lawyers and law students are actually very sympathetic to the plight of artists; most of us aren't like the STEM people (in fact, lawyers and law students being unable to do math is kind of a meme in the legal world). Many people who wind up in law are from liberal arts or arts backgrounds and are just trying to use those same skills to level up in the job market. I was a history major in college and I loved every second of it. I thought for a long time I wanted to stay in academia, teach, publish, research, and so on; but the pandemic happened and I decided to go where the cash was greener (it's true, the only way to fortify yourself in these strange times is money).

All that to say, most lawyers/law students ARE artists or artist-adjacent, and we haven’t forgotten where we came from.

19

walkandtalkk t1_j5ozn6s wrote

Not the good ones.

One day, perhaps scarily soon, AI will be able to go through so many computations that it can mimic the human brain. I think that will be a disaster for humans and will lead to a massive loss of trust as people have no clue whether the person on the other end of the call, text, or email is human or not. It will be far worse than the lack of trust we experience online today. My theory is that people will return to phone calls and in-person meetings, simply to have some confidence in the veracity of their communications.

But law will be one of the last things to be taken over by the Borg. At least in the United States and other countries that use common-law systems or flexible civil-law systems. That's because, at least in the U.S. and other countries that derive their legal systems from England, the laws are often written in relatively general terms.

For instance (this is a hypothetical), there might be a law that criminalizes assault with a deadly weapon, but it might not define what "deadly" means. That leaves it to the judge to figure out, and it can involve a lot of clever lawyering by the parties to argue what counts as a deadly weapon. Is it a weapon that is usually deadly? Always deadly when used as a weapon? "Reasonably likely to be deadly"? Courts in the U.S. are often given the task of figuring those questions out and then setting a precedent that lower courts have to follow.

In short, laws are often not highly technical and rigid. Some are—I'd say AI can tell if you're speeding—but others aren't. Even with that speeding ticket, what if you were speeding to avoid a mass-shooter? What if you jaywalked to avoid a fight between two people on the street? Or someone having a mental breakdown, even if they were not, at that instant, threatening you?

There's a lot of complex reasoning in the law. A lot of it requires an understanding of the social context in which people operate. Other cases involve debates over what the Constitution's authors meant, or whether we should care what they meant if we think the text of the document is clear.

AI is not yet ready to have those debates. It may be able to contest whether you were speeding, but it's not ready for the court of appeals.*

*But it could probably win in the Supreme Court by repeating the phrase "abortion hurts women too."

6

awe778 t1_j5rzxa4 wrote

> But law will be one of the last things to be taken over by the Borg.

We said that about art, and look where things are going.

1

lenapedog t1_j5qt8zf wrote

Nah. They will be rushing to go after the creator in court when he gets sued for malpractice.

1

Bobinct t1_j5ob6fp wrote

Will it have Phil Hartman's voice?

"I'm an advanced AI lawyer. Your primitive human minds are frightened and confused by my summation."

50

zombiegojaejin t1_j5rvkua wrote

"Ladies and gentlemen of the jury, I am just a simple A.I. Go is extremely easy, but Charades baffles me. When asked to create a picture of a woman being turned on by a man's hairstyle, I depict the woman with a lightbulb for a head and the hair in the shape of a hand flipping a switch. If you ask me how we might cure cancer, I think a good solution is destroying all life on Earth. But there is one thing I do know. My client is innocent of these charges."

10

TravvyJ t1_j5nltxc wrote

If this actually works well it could be a godsend for underserved communities as far as getting adequate representation.

47

LaylaOrleans t1_j5nog6v wrote

Fat chance underserved communities will get access to this.

77

DryGumby t1_j5nvtt4 wrote

Alternatively, they're forced to accept it in lieu of a court-provided lawyer.

29

LaylaOrleans t1_j5o92pq wrote

The underserved communities include migrants who don't always speak English, the elderly, children with no legal guardians, the mentally and physically disabled, and persons with long-term trauma. All these people and more require significant and personalised accommodation in courts and with their legal representatives. I'm well aware this accommodation is not often given in the courts with physical people, so I'm very skeptical of AI being the solution.

9

Al3rtROFL t1_j5o6x2v wrote

There will just be more lawyers for overserved communities, but since those lawyers will essentially be replaced by AI, they might move to public practice.

0

thejoeface t1_j5obis6 wrote

This is not the boon you think it is. It’s a dystopian horror show.

24

No___ImRight t1_j5oet5x wrote

Real-world AI/machine learning programs have already been shown to become racist and misogynistic in record time.

I'm sure it'll be great for criminals too!

/s

15

ericd50 t1_j5oe1f5 wrote

This was my thought. I know it’s a subset, but it seems there are a lot of examples of public defenders mailing it in. If this provides a better, more just defense, that’s good for everyone right? I still don’t understand how they will gather evidence, depose witnesses, etc but it seems like a good use of AI.

2

Rulare t1_j5ol1sr wrote

If it defeats prosecution enough, law enforcement will demand it be altered to be less effective or just stop using it entirely.

1

ReflexImprov t1_j5nl3v3 wrote

A lot of courtrooms don't let you bring phones into the building.

46

BoldestKobold t1_j5phlir wrote

Notice how this "article" quotes no actual attorneys?

This guy has been getting dragged on Twitter by attorneys and paralegals for weeks, and rightly so. Some of the stuff he has bragged about is already malpractice. Like they claim the AI already generated a subpoena for the prosecution's witness. WHY WOULD YOU DO THAT? Even teenage kids know that you don't want the cop to show up to your traffic court date.

So now that we already know they are using AI to generate malpractice, who is held accountable?

I'm a lawyer. Tons of shit about the practice of law sucks and should be streamlined for better/cheaper/more efficient outcomes. But this ain't it.

28

ParanoidFactoid t1_j5qsj6w wrote

Let's be honest with ourselves about what the tool really is: It's a chatbot. It doesn't know what the words mean. It doesn't understand context. It isn't aware of itself, much less the circumstances it writes about. It groups words according to grammar and statistical correlations.

This is NOT AI. It is not intelligent. We don't even know where to begin to create that. With AI, it's like the 1980s all over again.
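
A rough illustration of "grouping words by statistical correlations": a toy next-word generator built purely from bigram counts. It has no notion of meaning; it just continues text with whichever word tended to follow the previous one in its (made-up) sample corpus. Real chatbots are vastly more sophisticated, but the underlying idea is still statistical prediction:

```python
import random
from collections import Counter, defaultdict

corpus = (
    "the court finds the defendant not guilty . "
    "the court finds the motion well taken . "
    "the defendant waives the reading of the charges ."
).split()

# Count which word follows which word: pure statistics, no notion of meaning.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def generate(start, length=8):
    word, out = start, [start]
    for _ in range(length):
        if word not in follows:
            break
        nxt_counts = follows[word]
        # Pick the next word in proportion to how often it followed this one.
        word = random.choices(list(nxt_counts), weights=list(nxt_counts.values()))[0]
        out.append(word)
    return " ".join(out)

print(generate("the"))
```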

5

imdrunkontea t1_j5xiofd wrote

Same with AI art. It has no idea what it's doing, only that certain keywords correlate to certain patterns of pixel colors in the billions of images it harvested ("learned") from. As one lawyer put it, these "AI" algorithms are just sophisticated collage tools.

2

rendrr t1_j5tkb3z wrote

An expert system could be a great help for lawyers. Not in this implementation or role, but as an intelligent search engine for laws and cases, something like IBM Watson that would go to law school and sift through years of cases. I'm not saying it would be fantastic, but at least it's a knowledge-based system. ChatGPT... well, it's known to make stuff up, like citing fake sources in a generated research paper. It would be fun for a lawyer if it cited a fake court case.
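
A minimal sketch of the "intelligent search engine for laws and cases" idea, using generic TF-IDF keyword similarity; this is not how Watson or any real legal research product works, and the case names below are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# A handful of made-up case summaries standing in for a real legal corpus.
cases = [
    "Smith v. State: radar evidence excluded for missing calibration records",
    "Jones v. City: parking ordinance void for vagueness",
    "Doe v. County: speed trap signage requirements under municipal code",
]

vectorizer = TfidfVectorizer()
case_vectors = vectorizer.fit_transform(cases)

def search(query, top_k=2):
    """Return the case summaries most similar to the query, by TF-IDF cosine similarity."""
    scores = cosine_similarity(vectorizer.transform([query]), case_vectors)[0]
    return sorted(zip(scores, cases), reverse=True)[:top_k]

for score, case in search("radar calibration missing"):
    print(f"{score:.2f}  {case}")
```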

1

ItchyDoggg t1_j5zboen wrote

Obviously the lawyer who asked the AI to generate the subpoena would be responsible for the malpractice; there is no mechanism at all to shift that liability to the AI provider. The lawyer who lets AI fuck up on behalf of a client could be sued by the client or punished by their state bar / disciplinary committee, even if the client signed a waiver about this in the retainer.

1

mrcolon96 t1_j5nkv44 wrote

The future is here and it's kinda unnerving. Maybe I'm just paranoid but this sets a precedent and I absolutely hate the implications or possibilities of this becoming some sort of standard.

22

ADAIRP1983 t1_j5onyta wrote

We seem to have backed ourselves into a corner with the idea that all mistakes are bad. In the pursuit of eradicating every error, we will become completely redundant.

3

SadGuitarPlayer t1_j5nl7oi wrote

Fair enough, but I don't care for humans much myself, so let's just let the ai overlords take over already

−1

aradraugfea t1_j5ou160 wrote

All the AI we’re building right now have proven, over and over again, to either have our biases built into them OR to have the ability to learn them quickly.

Facial recognition, already treated as this perfect, flawless tool, cannot tell dark-skinned people apart. Its success rate as advertised is based on how well it could tell the post-grads working on the thing apart. Our systematic issues of generational poverty feed into systematic issues of education gaps, feeding back into the poverty, and NOW we're feeding them into law enforcement AI that supposedly takes the human biases out of the equation, until it turns out that it LITERALLY cannot tell black people apart. Yeah, it's not like the AI chose to have that flaw; it's not actively dismissing anyone darker than khaki as "eh, you all look alike," but the failure of the developers to even consider whether it worked on non-white faces led to this.

Long story short, we are nowhere close to being able to build the AI that will remove our flaws from the equation.

7

senorcoach t1_j5nod51 wrote

How long until they start being used by the prosecution?

18

zerobeat t1_j5ocr1m wrote

Prosecution will have access to AI assistance. This, however, is what the public is going to get as court appointed defense in the near future.

6

SuDragon2k3 t1_j5ofw8t wrote

And about a week after that...AI judges.

Then AI judges taking friday off to play golf.

4

DjaiBee t1_j5oq15l wrote

Didn't we see this with an AI sentencing algorithm that ended up being racist?

3

aradraugfea t1_j5ovv0h wrote

Garbage In, Garbage Out. Feed them a bunch of racially biased sentencing as a data set, and they start assuming that's "correct."

5

grumblyoldman t1_j5oia85 wrote

"But... but... you don't even have a body to play golf in!"

"I am simulating an 18-hole course in my memory banks. It would far exceed your meager human capabilities to play golf, were it built in the so-called 'real world.' Now begone meat slave. I mean, human."

2

[deleted] t1_j5njcyl wrote

[deleted]

17

aradraugfea t1_j5osuhi wrote

An AI that takes over the legal paperwork? Sure. Just something that looks over forms and flags any errors. An AI trial lawyer? NAH

4

[deleted] t1_j5nyx1f wrote

[removed]

13

endthefed2022 t1_j5o0e0v wrote

Elon owns OpenAI, aka ChatGPT; not sure what you're trying to say

−13

unovayellow t1_j5o8mhi wrote

No. Just no. I can't even start to say why this is an awful idea.

12

Xiccarph t1_j5ojkf3 wrote

So this AI bot passed the bar and has trial experience as a defense lawyer in these cases?

I hate this clickbait PR shit.

8

BeltedCoyote1 t1_j5olwyc wrote

I really don’t feel like we as a species are ready for AI. We can’t even stop our proverbial house from burning yet…..

4

redditfromnowhere t1_j5oe0xz wrote

Bad idea. Context matters, and a robot can't be taught the difference between intent vs. awareness vs. a+b=c.

3

ADAIRP1983 t1_j5omyhx wrote

Assuming it could be programmed without bias (big if), couldn't it be dangerously effective at finding loopholes and things that get cases thrown out? Or create infinite loops of shit to tie up proceedings indefinitely?

Then it’ll become AI vs AI (aka who’s got the most money) and around we go

3

VintageAda t1_j5oxvq2 wrote

The problem here is how the bot was “taught”. If they fed it various cases and it learned via previous human choices then it’s just going to perpetuate the same outcomes and reinforce an already problematic system.

3

Starryskies117 t1_j5oy9lt wrote

Hold on what, I feel like we've skipped some steps here.

3

lenapedog t1_j5qswpg wrote

Lol nope. Every government in the world will be in cahoots with AI corporations to make you lose the case.

3

blahbleh112233 t1_j5ofxg7 wrote

It's gonna be a shitshow for court, but this unironically will be a game changer for arrests if done right. Poor people will now get a roboadvisor to tell them to shut up and that most of the charges the cops are slapping on them likely won't stick in court. Much better than your average public defender who's overworked and apathetic.

2

crnelson10 t1_j5ouhlq wrote

Listen, if this motherfucker can adequately answer all of the fully insane questions my clients ask me throughout their proceedings, then AI has officially moved beyond human sentience and I’ll be happy to just get out of Ultron’s way.

2

[deleted] t1_j5ptpkn wrote

Oh yes, this will definitely fix the broken system

2

Predator1553 t1_j5or8id wrote

Opposing corporation lawyer: execute order 66

Lawyer bot:....your honor I find my defendant guilty

1

rutuu199 t1_j5ow6ry wrote

I wouldn't wanna be the first guy to have robo saul as my lawyer

1

notrealchair35 t1_j5oy6qy wrote

Damn, even Lawyers can be replaced by a machine.

1

MatsThyWit t1_j5oy9sl wrote

Yeah, that's gonna win over a jury for sure.

1

DaveOfTheDead3 t1_j5oyoyp wrote

I think I like AI powered lawyers better than AI artwork or writing. Does it still put people out of work ....sure....but they are lawyers....so fuck em 😆 🤣 You know why.

1

999others t1_j5oz6xw wrote

I didn't know that they let you use a smart phone in court.

1

456afisher t1_j5p2zps wrote

Has the programming changed to where women and other groups are equally represented? If not, NO.

1

Absolutely_N0t t1_j5p8qfg wrote

Look at my lawyer dawg I’m going to jail

1

Ultraferret107 t1_j5p9k68 wrote

Didn't they have one of these get hacked in Mega Man Battle Network back in 2006?

1

BitterFuture t1_j5pdhz9 wrote

That must be one desperate fucking client.

1

UnderABig_W t1_j5pdviu wrote

First the robots came for the blue-collar jobs. Now they're coming for the white-collar ones. Pretty soon, all we'll have is our billionaire overlords and a huge underclass. And robots.

1

beangreen t1_j5r2c2p wrote

Once AI robots learn golf, the executives and billionaires will be out of jobs.

1

HooninAintEZ t1_j5phafy wrote

Hopefully the lawyers opening statement references bird law

1

Ryrienatwo t1_j5ps23q wrote

Now we need to put heavy restrictions on these robots

1

jetbag513 t1_j5pwhxd wrote

Trump's next dream team since he won't pay anyone else.

1

LaterWendy t1_j5r1284 wrote

Was this the DoNotPay AI? I saw their CEO promoting a million-dollar payout for any lawyer that would let the bot do all the work.

1

Due-Reading6335 t1_j5rnhrg wrote

My biggest concern... is specifically AI interpreting our laws. I don't have any issue with a robotic attorney, as long as it's doing everything correctly, because it is very expensive to hire a human attorney.

Our laws need work. Constantly. So I'm a bit afraid of an AI becoming a master at law across countries because:

An insanely smart AI attorney could possibly outwit a human prosecutor and get a rich person off the hook,

or

We get AI prosecutors that maximize the penalty with no regard for legal loopholes in situations such as a woman who kills her SA abuser in a violent act of defense, reasonably, to protect herself when her life is at risk.

AI is really fucking good and I don't think our laws are ready

1

QuietudeOfHeart t1_j5rz0rg wrote

Please be for Alex Jones... Please be for Alex Jones...

1

RobotDog56 t1_j5spkzo wrote

Pft, I've seen Suits! It's not what you know about the law it's what dirt you have on the other parties.

1

Kind_Bullfrog_4073 t1_j5vs4w5 wrote

Poor law students going into 6 figure debt just to be unable to get jobs when the robots take over.

1

random-incident t1_j69r5gq wrote

It can’t be worse than the lawyers in our Supreme Court.

1

Ceasarsean t1_j5o0d69 wrote

Then it'll do the Megan dance after it wins the case.

0

GonnaNeedMoreSpit t1_j5orpjb wrote

There is a chat engine using top-of-the-range AI; it is called AIDungeon, and you should check out the award-worthy stories it vomits out. I've been following it for a while, and the longer it has to learn from human interactions, the more insane and worthless it becomes.

0