Comments

ToDonutsBeTheGlory t1_j6n9r8d wrote

The president of Stanford is under investigation for potentially falsifying research findings.

96

mindofstephen t1_j6nox0h wrote

95% accuracy, so 5% of the time this software will be destroying people's lives and academic careers. Imagine getting a false positive, having 80 grand in school debt, and then getting kicked out of school over this.

91

sharkinwolvesclothin t1_j6oeiwf wrote

Universities can have other punishments for different forms of academic dishonesty besides kicking the student out. In fact, I've never heard of one that doesn't. Also, accuracy is not necessarily the same for positive and negative cases.
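
To put some rough numbers on that last point: with a modest base rate of essays that are actually AI-written, even a detector that is right 95% of the time can still flag a lot of innocent students. A quick back-of-the-envelope sketch (all numbers are illustrative assumptions, not the tool's published figures):

```python
# Illustrative only: every number here is an assumption, not the detector's real performance.
total_essays = 10_000
ai_fraction = 0.10            # assume 10% of submitted essays are actually AI-written
sensitivity = 0.95            # detector catches 95% of AI-written essays
false_positive_rate = 0.05    # and wrongly flags 5% of human-written essays

ai_essays = total_essays * ai_fraction
human_essays = total_essays - ai_essays

true_flags = ai_essays * sensitivity
false_flags = human_essays * false_positive_rate

# Of all flagged essays, how many belong to innocent students?
share_innocent = false_flags / (true_flags + false_flags)
print(f"Flagged essays: {true_flags + false_flags:.0f}")
print(f"Innocent students among them: {false_flags:.0f} ({share_innocent:.0%})")
```

With those made-up numbers, roughly a third of the flagged essays would belong to students who never touched ChatGPT, which is exactly why the distinction between accuracy on positive and negative cases matters.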

8

fjaoaoaoao t1_j6oxlb1 wrote

Yep. Especially at that low accuracy, there can be alternative interventions (e.g. resubmission), and perhaps even secondary backup checks. Hopefully the detection methods can improve, although it is a moving target…

Ultimately it's better for universities to hire more faculty and look to evaluation methods that are better for students and can't simply be replicated by AI.

5

raylolSW t1_j6ooe3n wrote

The top ones are known for kicking students out for any form of cheating.

Best case scenario, you just automatically fail the subject, but even a single subject is expensive as hell.

2

KSA_crown_prince t1_j6oro23 wrote

> having 80 grand in school debt

putting students into debt is a war crime tbh. Software/technology is only accelerating the need for these conversations, because there are so many lazy psychos in power who want to automate their decisions in the first place

4

fjaoaoaoao t1_j6owjf5 wrote

Agreed. Some form of debt is okay, especially for high-earning fields, but it shouldn't be as wild as it is now. Some people's lives are harmed by debt, and research shows people are psychologically maturing more slowly. At least there are some pathways for forgiveness.

3

qa_anaaq t1_j6ox2xu wrote

Lol 80 grand in school debt. Try 150.

3

DeathGPT t1_j6o1ept wrote

"ChatGPT, revise this with two grammatical errors, one UK word, and one run-on sentence, and write it to be less detectable by ChatGPT detectors. Write it how a human college student would write it." Result: <0%. This is why the founder of OpenAI said it's impossible to detect. Plus, unless you have 100% detectability, you can always deny it. Without 100% proof, colleges can't say without a doubt that you cheated, and that's the main issue.

The colleges doing this are just doing it for fun and to waste taxpayer dollars.

65

Spire_Citron t1_j6p4pn0 wrote

Yup. Also, a lot of these detectors just seem to detect writing that is formal and grammatically correct. There's nothing special about the way that it writes. It learned from things written by humans, after all.

17

sharkinwolvesclothin t1_j6ooyby wrote

Your main issue is absolutely trivial: just make a rule that anything detected by the chosen algorithm results in redoing the assignment in class without internet, or even just the following assignment, if you accept it as a helper but want to make sure they can do the work themselves.

−10

DeathGPT t1_j6ovt8e wrote

But what about the people who don't use ChatGPT, or who only use Grammarly, and the algorithm says they have a 70% match? Then what? Make them redo the assignment? How is that fair?

And the prompt I gave above would reduce the algorithm's detection to 0%, so your point is flawed: most of this ChatGPT-detection software and these sites are open to the public, rather than being proprietary tools only academia has access to.

Per OpenAI's CEO, humans adapted to using calculators in class; the same will be true with ChatGPT.

19

MelodiGreig t1_j6p1gd4 wrote

Dunno why this got downvoted, they're not wrong.

−9

Bakagami- t1_j6n7o5v wrote

This is just stupid. Ridiculous and stupid.

42

TheRidgeAndTheLadder t1_j6nsq5r wrote

Predictable though

They'll figure it out around the third rewrite

3

Bakagami- t1_j6ogrsv wrote

And then I'll rephrase some of the sentences. That's it. Besides, by the time they get AIs good enough to detect ChatGPT confidently, we'll have other, better alternatives.

3

ShortNjewey t1_j6ohpcw wrote

All things you would have available to you when solving a problem outside of a controlled testing environment. Why not make them available during testing and evaluation? The objective is to identify whether this person can solve problems IRL, where they have access to all these items. They should have the same access when being evaluated by a university or employer.

As a manager, if I task you with giving me the answer to 2+2, and you can do it faster by looking it up online than someone else can by doing mental math, then you are more valuable to me.

Further, if a university's evaluation criteria for students can be outsmarted by AI, then that points to a problem with the evaluation criteria, not the students.

It's not cheating. It's being resourceful.

21

Spire_Citron t1_j6p5j7m wrote

Generally, essays are a way for students to show that they have a deep understanding of the subject matter. I think it's fine if you use AI to edit it and improve your grammar, but if you're having it do the thinking for you when it comes to examining the information, it defeats the whole point.

8

ShortNjewey t1_j6p7c7k wrote

If I hire someone to provide rich content about specific topics, I'd encourage them to use tools that allow them to complete the job better, faster, and cheaper. This should be the perspective of educators as well.

The same pattern played out with earlier classroom tools like calculators and chalkboards: society (and educators) slowly accepted and acclimated to these tools, focusing on higher-level capabilities. I expect the same will happen here... eventually. Once AI is capable of providing valid and "well-thought-out content", the most valuable higher-level skill will be the ability to direct and control AI to get the desired result. The ability to 'ask the right questions'.

Most education is for the purpose of being a productive member of society in the workforce. In the end, if you can provide higher-quality contributions at a lower cost, then you are more valuable, regardless of the resources used (excluding plagiarism and illegal activity).

1

STJ608 t1_j6n855g wrote

Futile.

16

ShoonSean t1_j6o23hb wrote

By all means, continue to yell at the wind, civilization!

11

ChartNo3583 t1_j6ofdpr wrote

You can't detect ChatGPT, lmao. I can add any unique settings and it will write me 100% unique text. Of course, if you're dumb enough to use a default prompt like "write me a 1000-word text about [subject]" without adding anything, then it's easy to see ChatGPT's pattern.

8

spacehippieart t1_j6obn9p wrote

Can't you still use ChatGPT as a base and then edit it into your own words?

6

Hello_Hurricane t1_j6ogly9 wrote

Or have it generate a list of bullet points, then expand each point from there. Perhaps randomize the order in which you talk about each point, just to be safe.

6

dandaman910 t1_j6patxb wrote

Universities need to figure out what the role of humans will be in the future and form their curriculum around that. If they're just forcing people to learn the things that AI will be good at, then they're making themselves irrelevant. If we graduate only for no one to need us because AI can do it for free, then the course was pretty useless.

It's like teachers in the 90s saying we won't always have a calculator. Well, yeah, now we do, and there's no real situation where I won't use one, so learning the manual calculations was a waste of time. I could've spent that time learning how to better apply the technology I had access to.

6

TFenrir t1_j6natgl wrote

Should be interesting to see how well this works. I was reading a Twitter thread from someone who worked on some of the progenitor technology to this, and they weren't confident it would be anywhere near as accurate as claimed. This will be our first demo.

5

darkjediii t1_j6p1fzn wrote

How does the detection work? Does it also take samples of the student's previous writing and compare? That's the only thing I can think of that would make detection fairly accurate.

2

redroverdestroys t1_j6ody79 wrote

Any kid going to Stanford who uses this will know how to make it idiot-proof, lol. This is so dumb.

1

Redditing-Dutchman t1_j6o6w2y wrote

I know people are gonna hate on this, but these kinds of detectors will become super important in the future. Even more so for video.

−1

[deleted] t1_j6oc8lf wrote

[removed]

−3

maskedpaki t1_j6owh7z wrote

Good thing no one gives a fuck about poetry

−2