Submitted by Choice_Card t3_10jwmvq in Futurology

For something like taste, how would this be simulated? How could someone code a slice of pizza to taste like a slice of pizza? What about new, nonexistent foods? Could it become the best way for chefs to practice? Throw together your newest, complex dish without wasting resources? And this isn’t even mentioning other interactions like heat, cold, smell, etc.

34

Comments


itsgoingtobeebanned t1_j5n94dz wrote

2006 "You wouldn't steal a car"

2036 "You wouldn't download cocaine"

30

400Volts t1_j5neoh6 wrote

If someone had a good answer to this right now, they would be a top researcher or leading a multi-billion dollar startup.

But it'll probably be something along the lines of isolating the signals those things currently send/cause in our brains, then playing them back.

24
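That record-and-replay idea can be put into code purely as a thought experiment. Everything here is invented for illustration: the `SensoryInterface` class, its channel count and sample rate, and the random noise standing in for real neural data. No such API exists today.

```python
import random

class SensoryInterface:
    """Hypothetical neural read/write interface -- no such API exists today."""

    def __init__(self, channels=64, sample_rate_hz=1000):
        self.channels = channels
        self.sample_rate_hz = sample_rate_hz

    def record(self, duration_s):
        # Stand-in for capturing afferent sensory activity; here, just noise.
        rng = random.Random(0)
        n_samples = int(duration_s * self.sample_rate_hz)
        return [[rng.gauss(0, 1) for _ in range(self.channels)]
                for _ in range(n_samples)]

    def play_back(self, trace):
        # Stand-in for re-stimulating the same pathways with a stored trace.
        assert all(len(frame) == self.channels for frame in trace)
        return len(trace) / self.sample_rate_hz  # seconds replayed

iface = SensoryInterface()
pizza_trace = iface.record(duration_s=2.0)  # "taste of pizza" as raw signal
seconds = iface.play_back(pizza_trace)
print(f"Replayed {seconds:.1f}s of recorded sensation")
```

The real open question is everything the stand-ins hide: what the signal actually is, where to read it, and how to write it back safely.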

Choice_Card OP t1_j5nevb5 wrote

Time to become that person haha

6

400Volts t1_j5nexgj wrote

Honestly, I'd love to be lol

4

Choice_Card OP t1_j5nf1w9 wrote

It would be interesting to try and make it a communal effort, not just capitalized on by one person or group

4

400Volts t1_j5nfcxu wrote

Oh there's absolutely no way a single person or group could crack this on their own. This is one of those things that gets compiled from research papers over years until someone can reliably implement it.

A very interesting and important aspect of this, I think, would be the rise of a sort of medical computer science degree, since having write access to the brain leaves essentially zero margin for error.

13

Runktar t1_j5o8pcg wrote

Everything you experience is simply an electrical signal in your brain. Once we fully map what all these signals are and we make the proper tech we can simulate any experience.

12

Nopain59 t1_j5ozp9g wrote

Everyone then spends all their time having VR sex with everyone else. End of the advancement of civilization.

6

Extra-Confection-706 t1_j5rsjtz wrote

Who the hell cares about civilization, like it had any higher value? If apes are happy in their virtual reality, so be it.

6

Codydw12 t1_j5ry7ki wrote

Or it gets used as a vacation tool and people still live life normally

3

---nom--- t1_j5pde09 wrote

To map signals, you'd have to be able to identify which synapses contain which memories. You can bet your life that everyone is different and encodes things differently.

Although by monitoring someone doing a specific activity, like moving a limb, you could potentially mimic that person's action. So who knows, maybe we'll calibrate such a device using this method. But it may be difficult to target that part of the brain specifically.

Considering our smartphones and apps are so jam-packed with bugs, I don't hold much hope for such technology. Imagine the sort of bugs you could observe with such tech, and super invasive ones too.

1

BigZaddyZ3 t1_j5n4xo9 wrote

That probably won't happen until we perfect something like Neuralink (or some other technology that seamlessly connects our brains to a computer). It's one of the few things mentioned on this type of sub that actually is still a long time away.

11

[deleted] t1_j5oqwgs wrote

Neuralink itself is actually behind: another company already has an implant in a person with ALS so they can communicate. So the tech is not far off.

8

Choice_Card OP t1_j5n4z2o wrote

I know this could be classified as another AskReddit question, but I feel like if given the depth the conversation needs, it could be incredibly insightful.

3

hansolosaunt t1_j5naphr wrote

I wonder this often. I used to lucid dream a lot and the hardest senses to recreate in a dream were taste and smell.

3

HackDice t1_j5nbhzx wrote

The means to do so already exist in our bodies. It is more about reverse engineering the sensations into something that can be represented and replicated digitally than trying to "code a pizza". Once you have a baseline of sensations figured out, you can then experiment from there and technically 'create new tastes' if you will. At least that's what makes the most sense to me.

3

CragMcBeard t1_j5nhawp wrote

Virtual reality will be so advanced in the future that it can simulate being dehydrated from lack of water and starving from lack of adequate food. Oh wait, that will be reality. I got it mixed up.

3

fakingglory t1_j5rblw9 wrote

You read about how many monkeys Elon killed?

Now imagine how many humans need to die before you could taste virtual pizza.

2

Sicon3 t1_j5rlzxu wrote

So there is a lot of ongoing research on neural interfaces, but one obstacle that every experiment runs into is that nobody's brain is the same. Much like fingerprints, you can know where a specific feature probably is, but the individual pattern of neurons is fully unique. X input in Y location might make one person taste a roast beef sandwich, another may smell a grape, and a third might have a seizure. Suffice to say, any neural interface will need to be individually calibrated for its user, and I don't see broad-market experiences using such technology until such implants are universal. I think the system in Ready Player One, where you just use advanced haptics and rigs that allow for full movement, is far more plausible in our lifetimes.

2
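The per-user calibration problem described above can be illustrated with a toy sketch. The site names, percept labels, and `calibrate` helper are all hypothetical; real calibration would involve far more than building a lookup table, but the core idea of probing each site and recording what this particular user reports is the same.

```python
# Per-user calibration sketch: the same stimulation site can evoke a
# different percept in each brain, so the device probes each site and
# records what this user actually reports. All names are made up.
def calibrate(stimulation_sites, report_percept):
    """Build this user's site -> percept map by probing each site once.
    `report_percept` stands in for asking the user what they felt."""
    return {site: report_percept(site) for site in stimulation_sites}

# Two hypothetical users wired differently at the same sites.
user_a = {"site_1": "taste: roast beef", "site_2": "smell: grape"}
user_b = {"site_1": "smell: grape", "site_2": "taste: roast beef"}

map_a = calibrate(user_a.keys(), user_a.get)
map_b = calibrate(user_b.keys(), user_b.get)

# Invert each map so the device can target a desired percept per user.
inverse_a = {percept: site for site, percept in map_a.items()}
inverse_b = {percept: site for site, percept in map_b.items()}
print(inverse_a["smell: grape"], inverse_b["smell: grape"])  # different sites
```

The point of the sketch: the same experience ("smell: grape") maps to a different physical target for each user, which is why a shared, uncalibrated experience is so hard.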

Saeker- t1_j5p4h78 wrote

The movie Brainstorm (1983) might be of interest to folks considering this topic.

It centers on a brain-experience-capturing technology and explores the ramifications of several of its potential use cases.

1

Chaosfox_Firemaker t1_j5p7cyw wrote

The existence of full dive implies not only the ability to "send" sensations, but also to "read" signals, for stuff like motion intention. From there it's a "small" step to reading sensations off the sensory nerves of someone eating an actual thing during the dev process. That's a pretty big data packet, though, so you certainly don't want individual samples for every possible thing. You could probably use them as training data, though, for an algorithm that takes some simpler features from game data and procedurally spits out an appropriate sensory gestalt to the interface.

How closely that model reflects reality, who knows.

1
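The training-data idea above, recording a few real sensations and having an algorithm generalize to new ones, could look roughly like this toy sketch. The feature axes (sweet, salty, umami), the food names, and the similarity-weighted blend are all made up for illustration; a real system would presumably use a learned model rather than simple interpolation.

```python
# Each recorded sample pairs simple "game data" features with a captured
# sensory trace (here just a short list of floats standing in for the
# real signal). Foods and feature axes are invented for illustration.
samples = {
    "pizza":     {"features": (0.2, 0.8, 0.9), "trace": [1.0, 0.5, 0.2]},
    "ice_cream": {"features": (0.9, 0.1, 0.1), "trace": [0.1, 0.9, 0.7]},
    "miso_soup": {"features": (0.1, 0.7, 1.0), "trace": [0.8, 0.3, 0.4]},
}

def similarity(a, b):
    # Inverse-distance weight in (sweet, salty, umami) feature space.
    dist = sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return 1.0 / (dist + 1e-6)

def synthesize(features):
    """Blend recorded traces, weighted by feature similarity -- a crude
    stand-in for a model trained to generalize from a few recordings."""
    weights = {k: similarity(features, v["features"]) for k, v in samples.items()}
    total = sum(weights.values())
    n = len(next(iter(samples.values()))["trace"])
    return [sum(weights[k] * samples[k]["trace"][i] for k in samples) / total
            for i in range(n)]

# A novel food nobody ever recorded: sweet-salty with mild umami.
novel_trace = synthesize((0.6, 0.6, 0.3))
print([round(x, 2) for x in novel_trace])
```

Because the output is a convex blend of recorded traces, it can only interpolate between known sensations; whether genuinely new tastes could be synthesized this way is exactly the open question in the comment above.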

---nom--- t1_j5pbrt2 wrote

We're so far off from such things that I suspect we'll have to artificially grow modified human brains with an interface we can tap into.

Considering funding is waning for VR investment and games, I'm not so convinced VR will continue to advance. And going for things like taste is way outside our realm of possibility, and may always be. We still don't really understand the brain, and we may never, given how complex and vast it is; our own organic brains have their limits too.

We don't even have real AI, despite what many products claim; they're largely neural networks/algorithms. You can't even get ChatGPT to successfully continue a relatively easy sequence of numbers without it giving a false value. And it writes all stories in roughly the same structure. Heck, if you tell it to write Star Wars Episode 10, it'll resurrect Luke Skywalker.

1

Apprehensive_Arm6074 t1_j5pck6p wrote

Taste may be easier than we think. Have you been to the hospital and had them put saline solution in your IV? You can taste it... there you go, Elon, take it away!

Jokes aside, getting an injection to taste your game is a stretch. BUT if an implant controls how our subconscious brains "display" our five senses to our conscious mind during VR, we could "think" a sensation into being. I don't pretend to know what Neuralink is implementing, though.

Furthermore, another question follows: how would chronic gaming with false senses affect how we experience everyday life?

1

jamespherman t1_j5rkjaz wrote

I'm a visual systems neuroscientist working in the Department of Ophthalmology under José Alain Sahel at Pitt. José is working towards an international clinical trial of a visual cortical prosthesis. My own work concerns how we learn the meaning of visual information, and I hope to contribute to the prosthesis project by helping develop tools to aid in training patients to make use of their newly restored sensorium. Other researchers here at Pitt, such as Rob Gaunt, have done work on neural prosthetics for proprioception and touch. Other responders are right that the major approach used today is electrical microstimulation. However, in my opinion, optogenetic approaches, which use light to stimulate cellular (in this case neuronal) activity, hold more long-term promise. In fact, José recently demonstrated a retinal prosthesis based on transfecting an opsin into the normally light-insensitive retinal ganglion cells (RGCs) of a blind patient.

1

Nv1023 t1_j5s5rqz wrote

Spike to the back of the head in a chair, like The Matrix. Direct access to the brain somehow.

1

BinyaminDelta t1_j5szqc3 wrote

Machine learning and reverse engineering.

You sample "what things feel like" and then replay them, and with AI you can likely predict/extrapolate similar sensations.

There are only so many base sensations. Remember the joke about seven sneezes being one orgasm or whatever? It's not really so far off from being true.

1

DarkBlade230 t1_j5xqt0y wrote

I wonder how different bodies would feel. How would the program know how to simulate the difference?

1