
KnightOfNothing t1_jdtvjbm wrote

that's exactly all humans are and i don't understand how you could see anything "magical" about reality or anything inside it.

1

phyto123 t1_jdu4wp4 wrote

Most things in nature follow the Fibonacci sequence and golden ratio in their design, which I find fascinating, and the fact that I can ponder and appreciate the beauty in that is, to me, magical.
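The link between the two is easy to check numerically: ratios of consecutive Fibonacci numbers converge to the golden ratio φ ≈ 1.618. A quick sketch:

```python
def fib_ratios(n):
    """Return the ratios b/a of n successive Fibonacci pairs."""
    a, b = 1, 1
    ratios = []
    for _ in range(n):
        a, b = b, a + b
        ratios.append(b / a)
    return ratios

phi = (1 + 5 ** 0.5) / 2  # golden ratio, ~1.6180339887
print(fib_ratios(20)[-1])  # already agrees with phi to many decimal places
```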

4

BilingualThrowaway01 t1_jdudp5a wrote

Life always finds the path of least resistance through natural selection. It will always gradually tend towards being more efficient over time through evolutionary pressure. The Fibonacci sequence and golden ratio happen to be geometrically efficient ratios for many physical distributions, for example when placing leaves in a spiral so that they collect as much sunlight as possible.
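The leaf example can be made concrete. Plants that rotate each new leaf by the golden angle (360°/φ², about 137.5°) avoid ever stacking one leaf exactly above another, while a "nice" rational angle like 90° shades every fourth leaf. A minimal sketch (the angle values are the standard botany ones, not from the comment above):

```python
PHI = (1 + 5 ** 0.5) / 2
GOLDEN_ANGLE = 360 / PHI ** 2  # ~137.5 degrees

def min_angular_gap(divergence_deg, n_leaves=100):
    """Smallest angular distance between any two of n leaves placed
    successively divergence_deg apart around a stem (mod 360)."""
    angles = sorted((i * divergence_deg) % 360 for i in range(n_leaves))
    gaps = [angles[i + 1] - angles[i] for i in range(n_leaves - 1)]
    gaps.append(360 - angles[-1] + angles[0])
    return min(gaps)

# 90 degrees stacks leaf 5 exactly over leaf 1, so the smallest gap is 0
# (total shading); the irrational golden angle never repeats a position.
print(min_angular_gap(90))            # 0.0
print(min_angular_gap(GOLDEN_ANGLE))  # strictly positive
```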

4

phyto123 t1_jdvm93a wrote

Excellent explanation. I also find it fascinating that there is evidence our ancient ancestors built according to this natural order. Luxor Temple utilizes it from its first room to the last.

1

4354574 t1_jdu8nat wrote

We're conscious. Subjective experience is magical. The experience of emotions is magical. Being aware of experience is magical. If that isn't magical to you, then...sucks to be you. What is even the point of existing? You might as well just go through the motions until you die.

There is no evidence at all that AI is conscious.

2

Surur t1_jdubb31 wrote

How do you know you are not the only one who is conscious?

3

4354574 t1_jdunko1 wrote

I don't. It's the classic "problem of other minds". This is not an issue for Buddhism and the Yogic tradition, however, nor ultimately, at the highest level, for any of the mystical traditions, whether Sufism, Christian mysticism (St. John of the Cross and others), shamanism, the Kabbalah, etc. What matters to these traditions is what your own individual experience of being conscious is like. More precisely, from a subjective POV, there are no "other minds" - it's all the same mind experiencing itself as what it thinks are separate minds.

If your experience of being conscious is innately freeing, and infinite, and unified, and fearless, and joyous, as all of these traditions, cross-culturally and across time, claim the state called 'enlightenment' is, then whether there are other minds or not is academic. You help other people walk the path to enlightenment because they perceive *themselves* to be isolated, fearful, angry, grieving individual minds, and still perceive the idea that there are "other minds" as a problem.

In Buddhism, the classic answer to people troubled by unanswerable questions is that the question does not go away, but the 'questioner' does. You don't care about the answer anymore, because you've seen through the illusion that there was anyone who wanted an answer in the first place.

2

Surur t1_jdur5b3 wrote

Sure, but my point is that while you may be conscious, you cannot really measure it objectively in others; you can only choose whether or not to believe them when they say they are.

So when the AI says it's conscious....

3

audioen t1_jdw2frs wrote

The trivial counterargument is that I can write a python program that says it is conscious while being nothing of the sort, as it is literally just a program that always prints these words.
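The whole counterexample fits in one line; nobody would attribute inner experience to it, which is why a system's claim of consciousness carries no evidential weight on its own:

```python
# A "conscious" program: it always prints these words, and nothing more.
print("I am conscious.")
```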

It is too much of a stretch to regard a language model as conscious. It is deterministic -- it always predicts the same probabilities for the next token (word) if it sees the same input. It has no memory except the words already in its context buffer. It has no ability to spend more or less computation as a task demands more or less effort; data flows from input to output token probabilities through the exact same amount of work each time. (With the exception that as the input grows, processing takes longer, because the context matrix holding the input becomes bigger. Still, it is computation flowing through the same steps, accumulating into the same matrices, just applied to progressively more words/tokens sitting in the input buffer.)
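The determinism point is easy to illustrate with a toy next-token model: with fixed weights, the same context always yields the exact same probability distribution (real samplers only add randomness afterward, when drawing from that distribution). The table below is a made-up stand-in for trained weights:

```python
import math

# Toy "language model": a fixed bigram logit table standing in for weights.
LOGITS = {
    "the": {"cat": 2.0, "dog": 1.0, "end": 0.1},
    "cat": {"sat": 1.5, "ran": 1.0, "end": 0.2},
}

def next_token_probs(context):
    """Softmax over fixed logits: identical input, identical output."""
    logits = LOGITS[context]
    z = sum(math.exp(v) for v in logits.values())
    return {tok: math.exp(v) / z for tok, v in logits.items()}

# Two calls with the same input produce byte-identical distributions.
assert next_token_probs("the") == next_token_probs("the")
```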

However, we can probably design machine consciousness from the building blocks we have. We can give language models a scratch buffer they can use to store data and plan their replies in stages. We can give them access to external memory so they don't have to memorize the contents of Wikipedia; they can just learn language and use something like Google Search like the rest of us.
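That kind of system is mostly plumbing around the model. A skeletal sketch of the loop, where `llm` and `web_search` are hypothetical stand-ins for a model call and an external search tool (not any real API):

```python
def run_agent(task, llm, web_search, max_steps=5):
    """Wrap a bare language model with a scratch buffer and external memory.

    `llm` and `web_search` are hypothetical callables: llm(prompt) -> str,
    web_search(query) -> str. The scratch buffer lets the model plan in
    stages instead of answering in a single forward pass.
    """
    scratch = []  # the model's working notes, fed back in on each step
    for _ in range(max_steps):
        prompt = f"Task: {task}\nNotes so far: {scratch}\nNext action?"
        action = llm(prompt)
        if action.startswith("SEARCH:"):
            scratch.append(web_search(action[len("SEARCH:"):].strip()))
        elif action.startswith("ANSWER:"):
            return action[len("ANSWER:"):].strip()
        else:
            scratch.append(action)  # treat it as a planning/reflection note
    return "gave up"
```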

The language models themselves can stay simple, but systems built from them can display planning, learning from experience via self-reflection on prior performance, long-term memory, and other properties that at least sound like something approximating consciousness might be involved.

I'm just going to go out and say this: something like GPT-4 is probably like a 200-IQ human when it comes to understanding language. The way we test it suggests it struggles to perform tasks, but this is mostly an artifact of the architecture going directly from prompt to answer in a single step. The research right now is adding the ability to plan, edit and refine the AI's replies, sort of like how a human makes multiple passes over their emails, or realizes after writing for a bit that they said something stupid or wrong and goes back and erases the mistake. These are abilities we do not currently grant our language models. Once we do, their performance will most likely go through the roof.

0

4354574 t1_jdwkos3 wrote

Well, I don’t believe consciousness is computational. I think Roger Penrose’s quantum brain theory is more likely to be accurate. So if an AI told me it was conscious, I wouldn’t believe it. If consciousness arose from complexity alone, we should see signs of it in all sorts of complex systems, but we don’t, and there isn't even the slightest hint of it in AI. The AI people hate his theory because it means genuine machine consciousness is very far off.

0

Surur t1_jdwqof7 wrote

> If consciousness arose from complexity alone, we should have signs of it in all sorts of complex systems

So do you believe animals are conscious? If so, which is the most primitive animal you think is conscious, and do you think they are as conscious as you?

1

4354574 t1_jdx1c88 wrote

If you want to know more about what I think is going on, research Orchestrated Objective Reduction, developed by Penrose and anaesthesiologist Stuart Hameroff.

It is the most testable and therefore the most scientific theory of consciousness. It has made 14 predictions, which is 14 more than any other theory. Six of these predictions have been verified, and none falsified.

Anything else would just be me rehashing the argument of the people who actually came up with the theory, and I’m not interested in doing that.

1

Outrageous_Nothing26 t1_jdvbget wrote

Just calculate the probability of that arising from randomness. It's incredible; you see the answers and think it's easy because the problem was already solved for you.

2

KnightOfNothing t1_jdvcn5y wrote

no, i see the answer and think "wow, i really didn't care about the problem in the first place". sorry, but things in reality stopped impressing or interesting me many years ago.

1

Outrageous_Nothing26 t1_jdvcq2h wrote

Sounds like a skill issue or depression, one of the two

1

KnightOfNothing t1_jdvhuy8 wrote

you're not the first one to bring up "skill issue" when I've expressed my utter disappointment in all things real. is the human game of socializing, work and sleep really so much fun for you guys? is this limited world, lacking anything fantastical, really so impressive to all of you?

i've tried exceptionally hard to understand, but all my efforts have been for naught. The only rational conclusion is that there's something necessary to the human experience that i'm lacking, but it's so fundamental no one would even think of mentioning it.

1

Outrageous_Nothing26 t1_jdvi8qx wrote

Well, truth is, it doesn't really matter; we could be living in the magical world of Harry Potter and your anhedonia would do the same. I was just kidding with the skill issue, but it sounds like depression. I had something similar happen, but it's just my unsolicited opinion and it doesn't carry that much weight

2