Comments

nosnevenaes t1_jdl0klr wrote

I think in 100 years people will look back at history and not understand the concept of privacy.

117

Better_Path5755 t1_jdl5thq wrote

Or they’ll understand it and wonder what took us so long to understand it.

13

Throwaway-tan t1_jdlc0va wrote

Or they'll understand it and wish we fought for it, then get punished by their boss for "attention theft" because their brainwave monitor said they weren't focused on their job for 30 seconds.

20

Amookoo t1_jdllmxc wrote

You assume people will have to work when these technologies exist en masse.

Very negative.

5

Throwaway-tan t1_jdllyz3 wrote

As the value of human life decreases I think some form of indentured servitude will continue to exist.

9

Amookoo t1_jdlmslk wrote

The thing is you are looking for pain. There will be no bondage but for those that seek it. Especially in a society with "Individual Brainwave Monitors"

−7

[deleted] t1_jdmhxdi wrote

[deleted]

3

Amookoo t1_jdp80bo wrote

Please don't whinge at me for your failure to keep yourself content. Life is a meaningless sack of placebo. It's futility. It's dogma all the way down, yet I'm still coming out happy. If you truly think this way, grab a gun and go solve some problems: yours or the actual ones preventing progress. If you find you can't, then strap yourself down for life, because it doesn't care how much you mope.

0

MockStarket t1_jdpkao4 wrote

That's deep man. Good luck in life is all I got to say. Hope you're trolling.

1

WildGrem7 t1_jdpq3q0 wrote

This is the kind of thinking that makes people think, "fuck everyone else, my shit is more important." Jeeze.

1

The_One_Who_Slays t1_jdl8se6 wrote

That's actually amazing. Imagine an ability to record the dreams THAT YOU ALWAYS FORGET ABOUT AFTER WAKING UP, GODDAMMIT!

57

Throwaway-tan t1_jdlc6qs wrote

Yeah because it won't almost exclusively be used to violate the integrity of one's mind for the purposes of legal persecution and maximising workforce compliance through thought monitoring.

94

ginja_ninja t1_jdlm0ri wrote

The attempted implementation of mind jannies will be the breaking point for society where heads start rolling

23

ThisZoMBie t1_jdnal25 wrote

“Eh, I don’t care, I have nothing to hide.”

> The attempted implementation of mind jannies will be the breaking point for society where heads start rolling

I highly doubt it

4

WildGrem7 t1_jdpq72r wrote

The fucking worst. I had a co-worker who would say this when we were talking about Snowden like 10 years ago. I couldn't believe people actually thought like this.

2

Chard069 t1_jdorrcj wrote

Thinking unsanctioned stuff is a severe offense. Think nice, now. Or else. 8-(

1

chocolatehippogryph t1_jdmiuev wrote

yeah man. We are on the precipice of horror and greatness...

Related anecdote: I met a German tech CEO guy, maybe 60-65, on an airplane once, and we were talking about potential near-future tech. I think we started talking about Neuralink, and my mind went to the possibilities for increasing accessibility for disabled people etc. He immediately started talking about how, if you could read people's minds, you could make sure they were paying attention at meetings and generally keep them focused and productive during work.

It was pretty horrifying, but I think this will happen. Wealthier people will see the benefits of technology-mind integration. For the poorest, it will just be another implement of control.

16

Throwaway-tan t1_jdmktew wrote

External "read only" brain-wave monitoring is one thing. Internal direct-interface chips are a whole other can of worms.

Computers are inherently insecure, and now you want to intrinsically tie your existence to one. OK, when someone ransomwares your free will, the government fires off a kill switch, or a rogue brain worm sends everyone into a bath-salts-style murder rage, I don't want to hear a peep from the optimists.

9

TheReverend5 t1_jdn5bxk wrote

There are already people receiving very beneficial therapies from secure and implanted brain-computer interfaces. The devices are built to make it impossible to deliver dangerous amounts of current.

The “optimists” in this case just have a better understanding of the current reality than you do.

1

avatarname t1_jdo15v3 wrote

They could already monitor computer screens, either with security cameras or some software, if you work in an office, yet at least where I'm from they don't, even though it's possible. They don't even monitor the time you log in or log off, at least for the white-collar work I do. Of course, it's different in manufacturing and warehouses etc., where people are treated as slaves... It has sadly always been the case that white-collar workers (especially those not in entry-level jobs) can slack off more than people who actually do hard physical work. I know part of that is that they think the freer we are, the better our brains will work and come up with million-dollar ideas, but still...

2

FeatheryBallOfFluff t1_jdm8y0u wrote

How is this in the futurology sub? On every thread with a new technology, everyone is hating on it because it will be used for oppression.

13

PLAAND t1_jdmeyim wrote

Because tools can be picked up by anyone, even shiny new ones, and we see very clearly who in the world has the power to pick up these tools and the kinds of things they tend to do with them.

11

Defiyance t1_jdmmsja wrote

Because if it can be used for that it will be used for that by the current pricks in charge. Maybe we should restructure our society before we come up with a bunch of tech out of a dystopian wet dream

5

Hiseworns t1_jdmc54a wrote

Well I mean, look around at how all current and even old technology is and has been used

4

Chard069 t1_jdoslth wrote

Electricity: Zap people and animals to death.
Mechanics: Crush people and critters to death.
Chemistry: Poison people and critters to death.
Mind-control: Scare people and animals to death.
Media: Bore people to death. Beware animals.
Politics: Bludgeon people to death. Run faster.

1

BaboonHorrorshow t1_jdn3z85 wrote

Because most Redditors are American and America is an inverted totalitarianism/oligarch-ruled dystopia.

4

Dentrius t1_jdntoo7 wrote

It's just some loud minority of people who think they're smarter than and above all the rest because they read or watched too much dystopian fiction and now can foresee the dark future!

2

BaboonHorrorshow t1_jdn3pg6 wrote

Yep, to say nothing of the volunteer thought-crimes police that would spring up.

They'll try to destroy people for saying the wrong thing on social media, even if that person apologizes.

Imagine if Twitter could see your humor brainwaves spike at some off-color joke - you could lose your job over a bad THOUGHT

3

Alekillo10 t1_jdmbrg1 wrote

Ugh… It would be like a crappier version of Total Recall… “You dreamt of killing your wife! You’re going to jail!”

2

Philosipho t1_jdmkb8z wrote

People decided it was a good idea to let citizens control the economy and government, because they wanted the opportunity to have that wealth and power themselves.

Society is just one big episode of r/LeopardsAteMyFace

1

Throwaway-tan t1_jdmlriq wrote

What? I'm not sure what your criticism is targeting... Is it that society is run by people?

Society has generally been a net positive for everyone. We went from subsistence and survivalism to plenitude and philosophy.

Even a feudal society is preferable to no society in my opinion.

It's not perfect, but I much prefer the fucked up society we have now when compared to "return to monke".

5

sqwuakler t1_jdmwxtf wrote

"Democracy is the worst form of government (except for all the others that have been tried)."

4

MistyDev t1_jdmo3o5 wrote

Even if this were possible, the 5th Amendment would absolutely protect against this kind of thing in the US.

I feel like you have to be unreasonably pessimistic to think that those would be the first areas where such a technology is used.

1

Throwaway-tan t1_jdmow4m wrote

The 5th Amendment only protects you from incriminating yourself in potential criminal proceedings.

It does not prevent your employer from mandating that you use it at work, and then any data gathered being subpoenaed.

Or let's say it becomes something more ubiquitous, like a smartphone: everyone uses it daily and all that data is gathered - your 5th Amendment isn't going to do shit.

3

-zero-below- t1_jdon37p wrote

Additionally, the 5th would only protect what you say. It doesn’t, for example, prohibit search or manipulation of your body. For example, fingerprints are not protected by the 5th. I don’t see why brain fingerprints would be.

1

[deleted] t1_jdlne2j wrote

[removed]

−11

urmomaisjabbathehutt t1_jdmkncx wrote

Will it be able to pull images of possible suspects from a subject's memory and recognize that the subject is familiar with those individuals?

That could be used for crime-solving, but an authoritarian government would also love to know which people a dissenter meets and relates with.

1

Inevitable_Syrup777 t1_jdormsh wrote

No, currently it would be using images in its own database. That would mean Harold Smith would simply be drawn as John Doe from the image database; John Doe is just training data and doesn't exist in real life in this instance. I saw the image results: from looking at a skyscraper, yes, it drew a skyscraper, but the skyscraper looked like the training image, not the real-life image seen by the person.

1

urmomaisjabbathehutt t1_jdpl13m wrote

Right, so at this point it's able to resolve the subject's mental image as a generic skyscraper based on comparisons to its own database.

The question would be whether the resolution becomes good enough for it to assess that the subject's mental image corresponds to one of the samples rather than something generic.

I guess that if the subject's mental image were something easily recognizable it may be easy even if the resolution is sketchy, but in any case this is a question of making improvements.

1

elehman839 t1_jdmt4om wrote

The claims are interesting, but far more modest than people here seem to realize. This is what they say about their evaluation process:

> we conducted two-way identification experiments: examined whether the image reconstructed from fMRI was more similar to the corresponding original image than randomly picked reconstructed image. See Appendix B for details and additional results.

So, if I understand correctly, they claim that if you take a randomly-generated image and an image generated by their system from an fMRI scan, then their generated image more closely matches what the subject actually saw than the randomly-generated image only 80% of the time.

This is statistically significant (random guessing would give only 50%), but the practical significance seems pretty low. In particular, that's waaaay far from a pixel-perfect image of what you're dreaming. The paper has only cherry-picked examples. The full evaluation results are apparently in Appendix B, which I cannot locate. (I'm wondering whether the randomly-generated images had some telling defect, for example.) Also, the paper seems measured, but this institution seems to seek press coverage very aggressively.
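If it helps intuition, here's a toy sketch of how a two-way identification score like that can be computed. This is purely illustrative (made-up feature vectors and cosine similarity as the stand-in metric), not the paper's actual pipeline:

```python
import numpy as np

def cosine_sim(a, b):
    # Cosine similarity between two feature vectors.
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def two_way_identification(recons, originals, rng):
    # Fraction of trials where a reconstruction is more similar to its own
    # original image than to a randomly picked different original.
    n = len(recons)
    wins = 0
    for i in range(n):
        j = rng.choice([k for k in range(n) if k != i])
        if cosine_sim(recons[i], originals[i]) > cosine_sim(recons[i], originals[j]):
            wins += 1
    return wins / n

rng = np.random.default_rng(0)
originals = rng.normal(size=(50, 128))           # stand-ins for "what was seen"
recons = originals + rng.normal(size=(50, 128))  # noisy "reconstructions"
acc = two_way_identification(recons, originals, rng)
# Chance level is 0.5; correlated reconstructions score well above it.
```

The point being: a trial counts as a "win" whenever the reconstruction sits closer to its true source image than to a random alternative, so 50% is chance and 80% is well above chance, without implying anything about pixel-level fidelity.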

4

The_One_Who_Slays t1_jdn2ac5 wrote

I mean, Rome wasn't built in a day. Just the fact that it's possible to do speaks volumes. As for seeking press coverage, it's understandable: could be them trying to secure more funding by getting more publicity, could be them being genuinely passionate about their tech, could be both. Time will tell.

Still, it's an interesting application of image-gen technology; to my surprise, it never even crossed my mind.

2

elehman839 t1_jdollr0 wrote

If anyone cares: I found Appendix B, but there wasn't much more helpful information. In particular, I don't understand how the randomly-generated images in their evaluation process were produced. And, as far as I can tell, the significance of the paper comes down to that detail.

  • If the randomly-generated images were systematically defective in any way, then the 80% result is meaningless.
  • On the other hand, if these randomly-generated images are fairly close to the image shown to the person in the fMRI-- but just differing in some subtle ways-- then 80% would be absolutely amazing.

Sooo... I think there's something moderately cool here, but I don't see a way to conclude more (or less) than that from their paper. Frustrating. :-/

2

The_One_Who_Slays t1_jdoodjv wrote

Yeah, some public trials would come in handy there. Show, don't tell, and all that.

1

nuclearbananana t1_jdlhqtw wrote

I'd rather not. The fleeting nature of dreams is part of what makes them special and surreal.

3

sanburg t1_jdn6g7o wrote

I can just see IKEA selling fMRI beds

1

m1cr05t4t3 t1_jdp34x3 wrote

Keep a small journal and a pen next to your bed. As soon as you wake up write down two words summing up what you were dreaming about. It's enough to allow you to remember the whole thing. You never lose your memories, just your ability to recall them. A little prompt hacking is often enough.

1

The_One_Who_Slays t1_jdp91rv wrote

I did that before, but I stopped, because I always go into excruciating detail and it takes a huge chunk out of my time. Just can't do the "two words summary" thing to save my life.

Plus, the idea of being able to watch a dream in a movie format is pretty amazing.

1

m1cr05t4t3 t1_jdp9gjj wrote

Oh I would totally buy a VR headset that replays my dreams. I would still write down two words, though, to remind me which 'movie' to pick.

(I would not want it to connect to the Internet, though. I'll do the updates with a USB or an SD card or something if a new version comes out.)

1

paperdahlia t1_jdlxx71 wrote

This idea seems scary, but I'm hoping it means we can use it to help people with disabilities that prevent them from communicating: e.g. dementia patients, nonverbal autistic folk, folks with ALS, coma patients, etc. And maybe we can even extend that to animals someday!

25

[deleted] t1_jdm47uq wrote

Even if I had a disability, I don't want anyone reading my mind. That's off limits.

12

Cubey42 t1_jdml7rs wrote

It could also upend our entire criminal justice system. Imagine the power of being able to subpoena someone and have their mind read. "Beyond a reasonable doubt" will come to an end.

−1

Lysmerry t1_jdo1m2c wrote

I think the images that run through your head could not be used no matter how convincing they are. Recalled memories are not video or photographic evidence. Someone asked about a murder would be very likely to visualize a murder. The human mind is messy.

2

Orc_ t1_jdm0hzk wrote

No. It can take those crappy images that actually "read" people's minds and make them look better.

"Stable Diffusion can read your mind!!!" is the silliest thing I've read this month, congratulations.

10

PM_ME_YOUR_EXERCISE t1_jdm1zxm wrote

I would love to see this applied to animals so I could talk with my cat

9

IgnobleQuetzalcoatl t1_jdmnlxm wrote

A few things to note based on the comments here.

(1) This isn't particularly new or noteworthy. This kind of thing has been done for at least a decade. They claim better results than previous efforts, but their examples don't appear categorically better. Setting aside previous efforts, the results here are just not that good. They kinda get a sense of what the participants are viewing, but that's it.

(2) This isn't mind-reading in the colloquial sense that people are interpreting it as. They are using brain activity while participants are actually viewing images, not while they are imagining them. That is a big difference and is much easier than anything that would generally be considered "mind-reading".

(3) Even if it were mind-reading, and even if it actually were high-fidelity, this requires a million-dollar MRI machine and having a participant basically bolted onto a sled for a couple of hours. All the comments by people talking about how we're all doomed and privacy is gone seem to be missing that fact.

7

andrew21w t1_jdn2t6a wrote

Thank you, sir (or lady), for pointing it out. As I said: people are way too quick to become doomeristic when they have zero idea what they're talking about.

1

albatros096 t1_jdlv7hh wrote

Eeee, what, am I reading minds when I analyze fMRI data?

5

mariegriffiths t1_jdmkh8z wrote

I just skim-read it. It looks like the brain stores a visual model and a semantic model. The first matches what is there, for real-time manipulation, and the semantic model gets stored for long-term use, like a cartoon version of what we see. Is this why cartoons work so well?

5

andrew21w t1_jdn0vyv wrote

So many people in this thread clearly didn't read the paper, just the title of the post.

4

MockStarket t1_jdpkqdy wrote

But my data is being farmed!!! I knew I should have been on Parler this whole time!

1

NickOnMars t1_jdm2pam wrote

Eventually we will have little helmets for cats/dogs/hamsters/ducks/pigs/etc so we can communicate with pets.

3

EnjoyableGamer t1_jdmczb1 wrote

Our brain's thoughts are mapped the same way as in an AI model, to some degree. Crazy.

3

Competitive_Dog_7007 t1_jdmpoxl wrote

Ok, but how would anyone go about testing this? This is a claim that sounds extremely outlandish and needs some solid proof backing it.

3

f08f2481 t1_jdmppkq wrote

Submitting a paper doesn't equal quality or truth. Not buying it without review and reproduction.

3

M4err0w t1_jdmu638 wrote

Is this still the one that essentially had people look at 10 pictures while being scanned, and then with that data it could more or less recreate which of those specific 10 images those specific people were looking at?

2

[deleted] t1_jdlmpio wrote

[deleted]

1

KisaruBandit t1_jdlobts wrote

Reading already requires a damn MRI machine, writing would require incredibly invasive surgical implants at minimum. Significant hardware issues. I wouldn't sweat it until we're at the deep neural implant stage of cyberpunk future.

2

zoinkability t1_jdm7pta wrote

Welp, time to watch Until the End of the World again

1

FlowersAndFourier t1_jdn1888 wrote

Sci-Fi Author: In my book I invented the Torment Nexus as a cautionary tale

Tech Company: At long last, we have created the Torment Nexus from classic sci-fi novel Don't Create The Torment Nexus

1

LINKfromTp t1_jdonwcq wrote

I swear, early Futurama had lots of thought put into it. I can imagine adding ads to dreams becoming a new thing when technology gets far enough. This is just part of the baby steps towards that reality.

1

Voidtoform t1_jdntnn4 wrote

I wonder what exactly it can tell; people all think differently: some think in words, some in a visual picture, and others spatially.

0