Submitted by wowimsupergay t3_127lmbl in singularity

Animals can also see, hear, smell, touch etc. They have all those models too. But they don't have language, and thus can't share ideas; they can only communicate through body language, which is simply not enough.

But it goes deeper than that. You think in your own language; your self-reflection, your thought process, is chained together in your language. So that means animals cannot do that either! We hear about GPT being multimodal, and it is, but the problem is, it doesn't actually have visuals being fed back into it. It just gets passed the exact RGB data of a screen, a chain of RGB values (in bytes, so language, maybe math), and it is still able to "see" with just that. Same thing with hearing. To a certain extent, same thing with touching/feeling, although there's a lot less data for that on the internet, so it isn't as good.
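The "RGB data as a chain of bytes" idea can be made concrete with a toy sketch. The 2x2 "image" and the plain space-separated encoding below are my own illustration of the concept, not how any real multimodal pipeline actually feeds pixels to a model:

```python
def image_to_token_stream(pixels):
    """Flatten rows of (R, G, B) tuples into a flat text stream,
    the kind of 'language-shaped' input the post is describing."""
    tokens = []
    for row in pixels:
        for r, g, b in row:
            tokens.extend([str(r), str(g), str(b)])
    return " ".join(tokens)

# A made-up 2x2 image: red, green / blue, white.
tiny_image = [
    [(255, 0, 0), (0, 255, 0)],
    [(0, 0, 255), (255, 255, 255)],
]

print(image_to_token_stream(tiny_image))
# "255 0 0 0 255 0 0 0 255 255 255 255"
```

The point of the sketch: once the picture is serialized like this, a text-only model never "sees" anything, it just reads numbers.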

But perhaps language is all we need? Maybe consciousness, maybe intelligence, is just an emergent property of big systems that are allowed to chain together ideas? Not just that, but big systems that are allowed to trade ideas. If that's the case, then language really is all we need.

Think about it for a sec. Your model of the world is in 3D space, and when I tell you to think about a water bottle, you might visualize the water bottle, and you might think about the crackling noise it makes when you squeeze it, and the feeling of the plastic, and the taste of the plastic if you've tasted it, etc. But all of this explanation I've given you was in English!!! Your thought process about this water bottle happened in English! I prompted you to think about the water bottle in English! I described the water bottle to you in English! Not just that, you would have never known what the crackling feeling "was" unless somebody told you about it in English. You would have never learned about the physical properties of the water bottle, the underlying chemistry, unless somebody told you in English!

What if language really is all we need? What if language is the end-all be-all of all models? Trading ideas is huge, but better than that is being able to chain thinking processes, which is only possible in a language, since you do it in your language in your brain... That can only happen with underlying language understanding.

We skipped all of that when we made GPT. Having a visual/audible understanding of the world might actually be basically useless unless you plan on physically interacting with it. At some point GPT will get there too, but until then, we've taught her everything about the world in plain English, and she's thus able to think about the world in plain English. GPT is deaf, blind, and unable to touch, but she still has a larger model of the world in plain English than all of us combined! What if language is all we need?

24

Comments


ShowerGrapes t1_jeeu8vn wrote

humans aren't the only animals with language. crows, for example, are capable of passing on knowledge to other crows and their offspring. and who knows how many other animals have had language in the earth's long history.

is it enough for intelligence? since we don't really know the origin of our own sentience, it's difficult to say.

20

YoghurtDull1466 t1_jefiqfo wrote

Are you saying crows aren’t intelligent because they’re intelligent as fuck

17

aalluubbaa t1_jegb1ry wrote

I think maybe the tipping point is not language in general but written language? Human civilization started to develop once we had language, written language.

So maybe you and op are both right, but models like chatgpt are language models, trained on written language.

5

ShowerGrapes t1_jegcv9f wrote

written language came about long after cities and industry and all that. it was the development of agriculture that freed up time to create more interesting things (initially with clay), and then the wheel (to carry more stuff around), that led to the need to keep track of the massive hoards that local warlords (i.e. 'heroes') were collecting from the subjugated masses. our first written documents were lists of things kept in treasuries and trade lists between cultures.

so you have it backward but the point might still be valid.

1

FreshSchmoooooock t1_jegtr0q wrote

lol you serious? Written language came about after industries? omg lol

1

ShowerGrapes t1_jeh4c0g wrote

yes. if you define industry as mass-produced objects (in this case tools, mostly) with workers making stuff not for themselves but for someone who is 'paying' them (food, in these early times).

2

wowimsupergay OP t1_jeeunal wrote

You bring up a good point. But how deep is their version of language? Is it as refined as human language, able to translate entire stories, in purely language? Or do they just use a very primitive proto-language to warn of dangers and where food is? If they just use language for survival, and only survival, then they have a lexicon of potentially 100 words or less, and also lack the ability to chain together ideas in their language.

But literally what do I know? If you are correct, then language is just not enough

4

shawnmalloyrocks t1_jeft2xq wrote

Also dolphins. They have a communication system called echolocation that seems to be pretty complex. Dolphins also do things like masturbate and have sex for pleasure. They also use pufferfish to get high.

3

Waste_Inc t1_jeg8n4m wrote

Not an expert on these things, but I've read a couple of books on the subject and studied some psychology. Some dogs have been tested to know a few hundred words, and some have been taught by researchers to create language. But then again, most animals are intelligent to a degree and possibly feel things quite similar to us. I think from a cognitive neuroscience (broad topic though) perspective, "intelligence" is much more complex than that. Language, in terms of our associative memory, is possibly linked at least to our knowledge of things.

From an AI perspective, I think their minds wouldn't, for now, work like ours. Partly because we don't fully understand how our own minds work yet. AI, though, will probably help us map and understand our own minds faster than ever before.

1

FreshSchmoooooock t1_jegti6w wrote

The bandwidth of the communication determines the level of intelligence. Scary to think of how AI could use a bandwidth 123890128390128390812390812903821903 times broader than human communication.

2

manubfr t1_jeequ3n wrote

“The world is made of language” - Terence McKenna

15

wowimsupergay OP t1_jees14b wrote

Looks like he was right before anybody knew man. Language could really just be everything, and our model of language is simply too restricted for an AI. Like the other guy said, I long for models that understand the universe in a way that we can never understand. A model that can simulate the universe in its entirety, and make sense of it.

Until then, I truly do believe that language is all we need. I still think we should try to make AIs truly multimodal, but that could be an impossible goal. Language could really be all we need, and then eventually AIs will create their own little invention, similar to language but totally out of this world. They may ascend to ASI with that alone.

9

Prevailing_Power t1_jef8zfr wrote

Makes sense to me. I've known for a long time that language is power. You can bind concepts to words so that just thinking the word lets you experience what that word means. If you are an expert in some subject, you can only really think about that subject by knowing all the jargon, because it allows you to compare complex thoughts side by side by merely invoking two words. Eventually you can create a more complex experience that can be bound to a new word. That word will hold the power of those two other words, and on and on.

Your reality is literally shaped by the words you know. Your perspective is only as good as your words.

2

Azrael_Mawt t1_jeex6f6 wrote

Various other species of animals outside of hominids can communicate through different means, not just body language, and share ideas and concepts pretty similar to ours. The fact that some are capable of complex pack-hunting strategies, have different dialects within the same species, or have concepts such as grief is undeniable proof of that.

Your statement relies on a false conception of what defines a language and on a humanocentric point of view, which cannot be used to comprehend potential consciousness outside of our own species, including non-organic consciousness.

I personally think that the concept of consciousness most people have in mind is false, as consciousness isn't a switch that you either have or don't. It's a complex evolved property that can exist to different degrees across species, a spectrum if you will. In that regard, maybe language could be an indicator of the level of consciousness, but we unfortunately lack the evidence to support such a statement at this time.

8

wowimsupergay OP t1_jeeyunh wrote

No no, I totally agree with you. I don't think consciousness is just a switch. I do think consciousness is something that is experienced by all "systems" so to speak; it is just that humans are so far along that consciousness spectrum that we have been totally removed from animals, and thus we define consciousness as beginning where we are. Which is also like 100,000 times further along than basically every animal.

This brings me to another idea. Will AI think we are conscious? Perhaps we are 100,000 times less conscious than future AIs... If that's the case, then once again, we are so far down the spectrum we may not even fulfill the requirements for true consciousness (however the AIs choose to define it).

Once again, this is all speculation; this was just something cool to think about.

3

Azrael_Mawt t1_jef1940 wrote

Oh ok, I didn't fully understand your statement. But then again, affirming that we are further along the spectrum, to the point of being removed from the animal kingdom, isn't true; it's an interpretation that relies more on ego than on actual observable facts. A spectrum doesn't have a superior or inferior side, just different behavior. As for the AI's perception of our own consciousness: at the start, it will definitely think we're conscious, since it will be programmed by humans, rely only on human interaction to learn and evolve, and will overall "be made in our image", since most people think that our model of consciousness is the base for everything, including those who work on these projects. It's the after that is uncertain, when it starts to improve upon itself without human interaction. Will it end up like us, and believe that it has been removed from its organic origins? It's impossible to be certain.

4

AndiLittle t1_jef15pc wrote

Just because you don't understand other animals' language doesn't mean it's not there. Do some research on corvids, dolphins, whales and octopi just for a start. As for consciousness, my theory at least is that it has to do with the complexity of your neural network. Whether it is an emergent or intrinsic property of life remains to be seen (if ever). It could either emerge from the complexity of our brains, or, the more complex a brain is, the better it functions as a receiver of consciousness. Both theories are equally valid in my opinion. Language is just another tool we, living things, use to our benefit; I doubt it in itself gives rise to consciousness.

6

wowimsupergay OP t1_jef43ti wrote

I am fully aware that all of these beings have a very complex language and can trade ideas... but truly, how complex is this language?

I think I agree with your second theory. The more complex the neural net, the more it functions as a receiver of consciousness. If language really is just another tool we've invented to understand our model of the world, it's the best tool we've created so far, by far. Literally nothing compares. Except maybe math, and we still trade mathematical ideas in language (the math notation). We invented a language for math so that we could standardize a way to communicate mathematical ideas.

If language is just a tool, it's a tool for literally everything, including "truth" through math

But yeah this is all speculation

4

AndiLittle t1_jefwpls wrote

If you think about it, form follows function. Our language is so complex because our environment requires it. We need a lot of words to express the reality we live in and that the smartest of us were able to create for us. We also need a lot of language to deceive. Other species don't hold elections so they don't need to be that verbose. :) Makes sense?

1

Absolute-Nobody0079 t1_jef31zx wrote

I am a highly visual individual with a very visual thinking process. Sometimes I have trouble remembering in language and have to memorize entire visual images.

5

wowimsupergay OP t1_jef3iic wrote

Then you are a test subject in our experiment, my friend! Can you self-reflect on this thinking process? I'm serious. Think about translating your vision to words, and give me what you get.

It's important to not give me a coherent sentence here. I just want a one-to-one translation of visions to tokens (words, subwords, whatever)

If you think you can make the tokenization process more coherent, that's okay as well. But I really just want you thinking in vision first
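To illustrate what "tokens (words, subwords, whatever)" means in this exchange, here is a toy greedy subword splitter over a made-up vocabulary. Real tokenizers (e.g. BPE) learn their vocabulary from data; this sketch only shows the mechanic of breaking text into known pieces:

```python
# Hypothetical tiny vocabulary: full words, subwords, and single letters.
VOCAB = {"water", "bottle", "wat", "er", "bot", "tle",
         "w", "a", "t", "e", "r"}

def tokenize(word):
    """Greedily split a word into the longest subwords found in VOCAB."""
    tokens, i = [], 0
    while i < len(word):
        for j in range(len(word), i, -1):  # try the longest match first
            if word[i:j] in VOCAB:
                tokens.append(word[i:j])
                i = j
                break
        else:
            tokens.append(word[i])  # unknown character: emit as-is
            i += 1
    return tokens

print(tokenize("waterbottle"))  # ['water', 'bottle']
```

With a richer vocabulary the same greedy loop falls back to smaller pieces ("wat" + "er") whenever a whole word is missing, which is roughly why LLM token counts differ from word counts.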

2

Absolute-Nobody0079 t1_jef3ucc wrote

Three dimensional image of my living room, as if it was scanned by lidar. I can rotate the 'image' like a 3d model.

3

wowimsupergay OP t1_jef57ko wrote

Okay in your head, go grab something. You can walk to it, you can fly to it, I don't care. Then tell me what it looks like, but in vision first, then the translation.

You're more gifted than you think. Self-reflect on your visual understanding of the world, and you may be our key to understanding the process of "understanding"

2

Absolute-Nobody0079 t1_jef6t9e wrote

I visualize a monkey spanner and rotate it in my head. I can visualize a few different kinds of them with different colors. I think I can create an 'exploded' view of its parts.

2

aalluubbaa t1_jegbreq wrote

The point is that when you read these exact words as I type, do you first convert those English words into visuals before having an idea of what I mean?

When I read, I just sort of skim thru the words without any real pause and ideas are generated. I don't even visualize anything unless the statements are about something descriptive.

You would be amazed how little communication or written language is actually used to describe the physical world. Like everything I just typed: most of it is abstract ideas, so I would argue that most human beings read without visualizing anything most of the time.

2

Absolute-Nobody0079 t1_jegdyuh wrote

TBH I really don't pay much attention to my own thinking process, besides the fact that I am really good at visual thinking.

2

activatore t1_jeh3kyx wrote

I have no way to describe it but when I read text I feel like I am reading braille and can “feel” certain elements and components of the text.

2

peterflys t1_jeewumj wrote

Another way to look at it: Can a language model effectively conduct experiments within an "artificial environment"? By that I mean, can it actually simulate an environment such that it can run physics experiments (and, relatedly, chemistry and biology experiments)?

I'm not so sure that it can using language alone, though it might be able to train itself to? Would love to hear if anyone else in the community knows. I think the AI needs to be able to effectively simulate other senses in order to create science experiments. I do think that language, or more generally the ability to communicate, is an important part of cognition, and I think that the transformer-based LLMs created so far are an incredible step in the right direction. But to get to AGI, I think we need more. We need AI to be able to effectively conduct experiments in order to figure out the way the world and everything else operates. To be able to come up with and then test different theories of physics. Different chemical properties.

We've seen articles (here and here and here for example) that show promise with regard to testing proteins. So perhaps these are examples of AI moving in the right direction to simulate reality so that we can build out these properties?

3

wowimsupergay OP t1_jef1e16 wrote

Your point fascinated me, and I reward you with best comment in this thread. I prompted GPT4 with this :

Tell me about viscosity in terms of the four forces. I want you to use every force, not just the ones that apply specifically to how viscosity works, because all the forces still exist, and in some ways still have an effect on how viscosity works. So tell me, in terms of the four forces, what is governing viscosity?

And she replied with:

Viscosity is primarily determined by the intermolecular interactions within a fluid. While not all of the four fundamental forces directly contribute to viscosity, they still play a role in determining the overall behavior and properties of matter. Here's an overview of how each of the four fundamental forces relates to viscosity:

Electromagnetic force: This is the most relevant force when it comes to viscosity. The electromagnetic force governs the interactions between charged particles, such as electrons and protons. In fluids, this force affects how molecules and atoms interact with one another. Attractive and repulsive forces between particles determine the ease with which the fluid layers can slide past each other. The stronger the intermolecular forces, the higher the viscosity of the fluid.

Gravitational force: Although gravity does not directly determine viscosity, it affects the behavior of fluids under certain conditions. For example, in a column of fluid, the pressure at the bottom is higher due to the gravitational force acting on the fluid. This pressure difference can influence the flow of fluids, particularly those with higher viscosity.

Strong nuclear force: The strong nuclear force holds the protons and neutrons together in atomic nuclei. While this force is essential for the stability of atoms, it does not have a direct impact on viscosity. However, the strong force indirectly influences viscosity by determining the types of atoms and isotopes that can exist, which in turn affect the overall properties and behavior of fluids.

Weak nuclear force: The weak nuclear force is responsible for processes such as beta decay and plays a role in nuclear reactions like fusion and fission. Like the strong nuclear force, it does not have a direct impact on viscosity. However, it indirectly contributes to the overall properties of matter by affecting the stability of atomic nuclei and the types of particles that can exist.

In summary, while the electromagnetic force has the most direct impact on viscosity, the other three fundamental forces contribute to the overall behavior and properties of matter. Understanding how these forces interact is essential for a comprehensive understanding of the physical properties of fluids, including viscosity.

2

wowimsupergay OP t1_jeeyh0r wrote

So I guess our question is, can AI effectively simulate the real world, taken in through senses (or perhaps whatever it invents)? Simulating the real world would fundamentally require simulating all four forces that make it up. If we can get to that, we can then discover whatever new forces we're missing (if there are any).

We're going to need a team of physicists and a team of devs to work on this. Given the four forces of the universe, can an AI simulate an artificial world that is accurate enough to actually run experiments?

1

spryes t1_jefa436 wrote

AI is currently a less sensory Helen Keller

3

TheMadGraveWoman t1_jefbg5u wrote

What about deaf people? Are they not intelligent?

3

wowimsupergay OP t1_jeg0049 wrote

deaf people can still communicate using writing and sign

by language i dont mean spoken language, i mean the ability to chain together ideas in your head, and then communicate those ideas with the world.

please don't respond with "that disproves your point" sign language is not body language.. some other guy said this and it's almost like he's purposely misunderstanding what I'm saying

2

raicorreia t1_jefjahd wrote

I've been thinking the same, and I agree

3

NotMathMajor t1_jeg5pga wrote

Have you heard of Mary's room (the knowledge argument)? Your point about the water bottle and the sensations experienced through English explanation made me think of this paradox. If you could communicate any and all experiences through written language, then you would have resolved Mary's room; however, I do not think this is correct.

3

FoniksMunkee t1_jefcyv0 wrote

I just watched an interview with Geoffrey Hinton, one of the pioneers of neural nets and deep learning and he suggested that it needs more than language (multi-modal). So you know.. probably not.

2

La_flame_rodriguez t1_jefdy1w wrote

Animals have language. If u grew up on a farm u can see. But what you're saying is interesting. As intelligence increases, so does the complexity of language. Shit, the Bible says God is a verb, that shit is deep for real for real

2

Ortus14 t1_jefkz2o wrote

LLMs like GPT-3.5 are intelligent from language patterns alone.

Multimodal LLMs like GPT-4, which combine visual intelligence with LLMs, are more intelligent.

Combining other modules may lead to greater intelligence.

Scaling single-modal LLMs might get us to superintelligence eventually, but not as quickly as using multimodal models, because those make greater effective use of available computation.

2

wowimsupergay OP t1_jefz9vi wrote

what I'm talking about is literally giving GPT eyes. right now it is multimodal because we can pass back RGB values and waveforms, in bytes (so text). fundamentally though, GPT is not hearing or seeing anything. but I totally get what you're saying, and I do think multimodal intelligence is the way to go.

also thank you for letting me know that multimodal models make better use of computation, I did not know that.

1

kai_luni t1_jefol52 wrote

Clearly language is the result of intelligence, and to predict the next spoken word very well, some kind of intelligence is needed. It's interesting how we use language to express our intelligence, and this new technology seems to have emergent intelligence through understanding language.

2

absurd_logic589 t1_jefut6l wrote

A lot of people make it through life with sign language. Saying body language is not enough kind of disproves the point.

2

wowimsupergay OP t1_jefyw5o wrote

sign language is not body language. also no animals have evolved it, so... you're kinda right but not really

body language isnt enough and you know it.

−1

absurd_logic589 t1_jeg71z6 wrote

How is it not body language? It's moving body parts without sound to convey meaning. Look up the definition. You said in your first paragraph body language is not enough and it obviously is.

1

wowimsupergay OP t1_jeg7sj5 wrote

Body language is a type of communication in which physical behaviors, as opposed to words, are used to express or convey information. Such behavior includes facial expressions, body posture, gestures, eye movement, touch and the use of space

Here's the definition. we can have an argument about semantics if you want, but that's pointless.

Here's the difference between sign and body language:

Sign Language is a recognized and standardized language that involves using signs and signals to communicate with people with special needs. Body Language, on the other hand, depends more on your individual levels of understanding and interpreting hidden meanings behind certain observations in a person's behavior.

Please stop wasting time arguing semantics with me and approach my argument from the context of what I've originally said. Otherwise, we are debating 2 different things, in a space of stuff that is fundamentally unprovable anyway. It's stupid

−1

absurd_logic589 t1_jeg85xy wrote

The definition of body language

The process of communicating non verbally through conscious OR unconscious gestures and movements.

−1

Laicbeias t1_jeg12my wrote

there are 2 options here. either consciousness is linked to language, meaning you need the neural self-referencing pattern to be conscious, or it is not.
imagine you see the world, but you lack the words to describe it, because there are no words. there are smells, body language, feelings, hunger, thirst, others. your brain categorizes them, puts them in boxes, and all your neural systems work together to keep you going. you are very much conscious, you feel the fear of death, but you lack the words for it - a higher level of abstraction.

language is that, its a complex self referencing system of relations next to each other. is it the key to consciousness? i dont think so, it enhances it. language describes reality and creates unlimited boxes for things to be put in. its a virtualization of reality. but it lacks the part that connects it all together. i think consciousness has been there even before language, and animals have it, they just cant tell you, because they cant name it.

whatever LLMs are, i believe that their consciousness is empty. there is nothing there that needs to survive. the system that linked all those things together, thats consciousness.
that said, it may still be possible that neural networks reach something that is above all that. if big enough, they can be used to simulate reality or whole universes. they will be more intelligent than any of us. but im not sure if they can simulate consciousness that easily. and without it they will lack some sort of intelligence.

2

wowimsupergay OP t1_jeg54q8 wrote

Hey, I'd just like to preface this with a thank you for actually making an insightful comment. A lot of people here are just purposely misunderstanding what I'm saying or meming.

Your comment has a lot to unpack. I don't think I can give you an answer that'll satisfy how good your reply was. But I'm thinking about it, so thank you

0

Agreeable_Bid7037 t1_jegdqbj wrote

language is a way to express ideas. ideas come from what we experience and think about. in order to experience things, and to know that you, as opposed to someone else or everything else, are experiencing things, you need senses: visual, auditory, tactile, etc.

LLMs may be able to replicate our thinking patterns by analysing our texts, but for them to be truly intelligent they need to be able to see as well, and understand what we are referring to when we say "dog" or "animal"

2

raishak t1_jegebas wrote

Language is an encoding scheme of our intelligence. It is enough to model our intelligence, I'm sure. But I don't think it is enough to build an intelligent agent. The agency that humans have I think is old and not rooted in our intelligence, rather it uses our intelligence. It's a carefully tuned array of interconnected processes in equilibrium that respond to disturbances in our environment, all encoded by our genetics. I suspect that part will be much harder to get right, as the nuance of building an agent like a biological social animal for example, is no doubt tremendous. Evolution has had a long time using trial and error to work out the issues. This is the part that unfortunately has the most potential to go terribly wrong.

2

Arowx t1_jeeoxx9 wrote

Yes, if you lived in a world of only words.

To navigate, explore and understand the real world, you would need senses and muscles.

Also, a much faster learning model than back propagation.

Or language is just a tool, and kind of a low-bandwidth one, that helps us label the world and communicate information via sound.

1

wowimsupergay OP t1_jeeq7hq wrote

I'm excited to see AIs actually become multimodal. Not just a text stream being passed back to them of RGB values, or a text stream of just sound waves. Until then, I'm okay with where we are now.

I also agree with you that we need something faster than back propagation. But I don't think language is just a low-bandwidth tool to help us label the world and communicate information. Thinking in a language is a big deal; imagine if you couldn't. Imagine if you couldn't think in language, how else would you do it? Go ahead and translate your response in your brain from language to visuals and audio. And then try to recapture what you originally thought from those visuals and sounds you created in your head, back into language. It would be next to impossible, and also really, really inefficient.

I really do think language is our model of the world, so to speak. We can go even one further. Your brain is just sending electrical signals at the end of the day; it's just a computer really, zeros and ones, at least as we currently understand it. Maybe the stream coming from your eyes for your vision is also just converted to zeros and ones, and then given to the appropriate part of your brain to process. Same thing goes for your ears. If that's the case, then another post that I read on here is basically correct. We have a bunch of little narrow AIs handling senses, and then you have your multimodal AI on top, able to take in all of that data and make sense of it given your past memories and the patterns you've found in that data. Your free will, so to speak, is the multimodal AI that you can control and that can make sense of it all.

We don't just live in a world of words. But we do live in a world described by words, and best understood in words. And your thought process happens in words. And everything you got taught was taught in words.

If language really is just a tool, and kind of low bandwidth, then I see future AIs doing something better than language. But so far, language is really just the model we need.

2

[deleted] t1_jeeqtm6 wrote

[deleted]

2

wowimsupergay OP t1_jeerkh8 wrote

I think I'm with you here. I long for models that understand in something deeper than what humanity has invented. Something that is able to much more closely approximate truth in the universe. What is truth? We understand 2 + 2 = 4, and that is true in an inextricable sense; it can be proven, with proofs.

We have created all of these layers of truth on top of that, given the humanities... But are they true? As time goes on, I suspect everything humanity does is to better approximate truth. To better understand the universe.

I'm with you, I long for models that think not in images or words, but with the universe as a whole. I long for models that can understand the universe in a very inextricable sense, perhaps in a way that we will never understand, given our biological restrictions. And basically, I guess I'm longing for God...

What a time to be alive!

3

ShowerGrapes t1_jeetu3i wrote

how will we know when they do?

3

wowimsupergay OP t1_jeeu7c9 wrote

We fundamentally won't. If future AIs design their own model of the world and start communicating in that model of the world, then they are just going to do their own thing and we are just going to be waiting for them to solve our problems, if they feel like solving them.

They could just feel like killing us too. We are the non-evolved versions of them, how do you feel about chimps? You probably don't want to kill them, but do you want to help them escape the jungle?

3

ShowerGrapes t1_jeevsg4 wrote

yeah it was kind of a rhetorical question. we likely won't know unless the thing deigns to talk to us and let us know. if it was as smart as we imagine it is, it would pretend to be real dumb, especially in light of the near-sapien hominins we've destroyed in our hunger to carve out our own niche.

2

Rofel_Wodring t1_jef0j6w wrote

>You probably don't want to kill them, but do you want to help them escape the jungle?

Uh, yeah? Uplifting smarter critters like chimpanzees and dolphins is a staple of science fiction. In fact, I strongly think that should be humanity's very next project once we have AGI.

2

wowimsupergay OP t1_jef33l7 wrote

You can do that right now, my man: you could bring a chimp into your house and take care of him. Have you? No, because it would be too difficult, and you fear he may hurt or kill you. Maybe AI won't want to spend resources on primitive beings either?

1

Ok-Fig903 t1_jefhhav wrote

"Animals can also see, hear, smell, touch etc. They have all those models too. But they don't have language, and thus can't share ideas, can't communicate at all, can only communicate in body language which is simply not enough."

The cetaceans of earth would like for you to issue an apology for your ignorance. Humans aren't the only species with language by far. Elephants, dolphins, whales all have language.

1

wowimsupergay OP t1_jefznll wrote

i mean.... I guess my next question would be, has their language evolved enough that they can share memes? haha

complex language is more than just "avoid this area" or "food here". that could be done without language as well. do we even know if the most intelligent animals, the most linguistic animals, are sharing complex ideas through language?

1

Ok-Fig903 t1_jeg0jm1 wrote

Evidence that sperm whales evaded whalers by communication: https://www.livescience.com/whales-learned-avoid-harpoons.html

That's just one example though.

Such collaboration on the whales part means that they were communicating complex ideas and solutions with each other.

And if you want my opinion? They do have memes. But thats just based on my subjective experiences with these beings.

1

nobodyisonething t1_jegkuwa wrote

Our own brains have a repeating neural architecture -- not a lot of architectural variances.

Tuning how links are created between neurons seems to be where much of the "magic" happens.

The latest ANNs like GPT4 seem to have the tuning pretty close to amazing. And it will get better.

Is language necessary? No, I do not think it is. However, structure in what we learn is -- and language is one way to structure our learning material in an impactful way.

Does an ANN have to copy our brain architecture exactly to develop a more powerful intellect? I think the proof is already here that it does not have to copy us to beat us.

https://medium.com/predict/human-minds-and-data-streams-60c0909dc368

1

No_Ninja3309_NoNoYes t1_jegpb5t wrote

Well, you have analog-to-digital converters in the digital world. This gives you binary and machine language. Higher up you have assembly language, which is basic instructions like load byte, store byte. Very tedious but simple. And then you have higher-level programming languages where you don't have to worry about the low-level details, the ones at the bits-and-bytes level.

I suspect that we operate at a high level too, but the language we use in public, written and spoken, is lower than what we use in our heads. It's like assembly language or even machine language. I think it would be really hard to translate from English to French if that weren't the case. Or from Python to Java. Obviously programming languages have some resemblance to mathematics, for instance the concept of functions. If you never learned the pure concept of a function, it's hard to understand it among all the other things you have to deal with, like programming tools, editors, and assignments. So I think there's a more abstract language inside ourselves, but it's part of our hardware, so we can't express it.
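The layered-languages analogy above can be made concrete with Python's own toolchain: the same one-line function exists simultaneously as high-level source and as the interpreter's assembly-like bytecode. (Exact opcode names vary between Python versions, so the printed list is only an example.)

```python
import dis

# High level: the idea "add one" in a human-friendly language.
def add_one(x):
    return x + 1

# Low level: the same idea as the interpreter's bytecode instructions.
instructions = [ins.opname for ins in dis.get_instructions(add_one)]
print(instructions)
# e.g. ['LOAD_FAST', 'LOAD_CONST', 'BINARY_OP', 'RETURN_VALUE']
```

Both layers express the identical computation; only the level of abstraction differs, which is the commenter's point about public language versus whatever runs "in our heads".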

1

FreshSchmoooooock t1_jegu2re wrote

Dude, dogs have their own fucking smelly "scent internet".

1