theotherquantumjim t1_jedspfh wrote

The language and the symbols are simply the tools to learn the inherent truths. You can change the symbols but the rules beneath will be the same. Doesn’t matter if one is called “one” or “zarg” or “egg”. It still means one. As regards LLMs, I am very interested to see how far they can extend the context windows and whether there are possibilities for long-term memory.
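
A toy Python sketch of that point (a minimal illustration only; the “zarg”/“egg” names are just the ones from the example):

```python
# Toy sketch: the symbol is arbitrary, the rule underneath is not.
# "zarg" and "egg" are the example's made-up names for the numeral 1.
symbols = {"one": 1, "zarg": 1, "egg": 1}

# Whatever we call it, adding it to itself obeys the same rule.
for name, value in symbols.items():
    assert value + value == 2, f"{name} should still behave like 1"
    print(f"{name} + {name} = {value + value}")
```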

1

theotherquantumjim t1_je8ysa1 wrote

This is largely semantic trickery though. Using apples is just an easy way for children to learn the fundamental truth that 1+1=2. Your example doesn’t really hold up, since a pile of sand is not really a mathematical concept. What you are actually talking about is 1 billion grains of sand + 1 billion grains of sand. Put them together and you will definitely find 2 billion grains of sand. The fundamental mathematical principles hidden behind the language hold true.
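
As a minimal sketch of that arithmetic (the billion-grain counts are just the example’s numbers):

```python
# "A pile" isn't a mathematical object, but a count of grains is.
grains_per_pile = 1_000_000_000  # one billion grains, per the example

combined = grains_per_pile + grains_per_pile
assert combined == 2_000_000_000
print(f"{combined:,} grains")  # the addition holds at the grain level
```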

6

theotherquantumjim t1_je8shkz wrote

This is not correct at all. From a young age people learn the principles of mathematics, usually through the manipulation of physical objects. They learn numerical symbols and how these connect to real-world items, e.g. if I have 1 of anything and add 1 more to it, I have 2. Adding 1 more each time increases the symbolic value by 1 increment. That is a rule of mathematics that we learn very young and can apply in many situations.
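
A minimal sketch of that increment rule (the helper name is hypothetical, purely for illustration):

```python
# Increment rule: adding 1 more object raises the symbolic value
# by exactly one step, whatever the objects actually are.
def add_one(count: int) -> int:
    """One more object means the count goes up by 1."""
    return count + 1

count = 1               # start with 1 of anything
count = add_one(count)  # add 1 more to it
assert count == 2       # 1 + 1 = 2, independent of the objects
```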

4

theotherquantumjim t1_jdz19vm wrote

I think AGI (depending on your definition) is pretty close already. As you’ve alluded to, we may never get ASI. I’m not sure that matters really. Singularity suggests a point where the tech is indistinguishable from magic, e.g. nanotech, FTL travel, etc. I don’t think we need that kind of event to fundamentally reshape society, as others have said.

2

theotherquantumjim t1_jamo41v wrote

A recent study suggests otherwise, doesn’t it? It has yet to be confirmed independently, I guess, but hasn’t it very recently been posited (and maybe also evidenced) that black holes are driving expansion by returning energy to the quantum vacuum? Doesn’t this mean expansion would indeed be physical?

1

theotherquantumjim t1_j1hbwsd wrote

Reply to comment by fortunum in Hype bubble by fortunum

Is it not reasonable to posit that AGI doesn’t need consciousness though? Notwithstanding that we aren’t yet clear exactly what consciousness is, there doesn’t seem to be a logical requirement for AGI to have it. Having said that, I would agree that a “language mimic” is probably very far from AGI, and that some kind of LTM, as well as multimodal sensory input, cross-referencing and feedback, is probably a prerequisite.

6