wanfuse1234 t1_j9ntswj wrote
Reply to comment by Hostilis_ in Google announces major breakthrough that represents ‘significant shift’ in quantum computers by Ezekiel_W
Tech development starts as a nearly linear progression until it hits an inflection point, where growth quickly becomes nearly exponential before it levels off and the curve flattens out. We are about to reach that inflection point with this tech, and with it N² problems will become tractable, a whole new class of solvable problems, both good and bad, including AGI. We will likely reach the singularity within 10 years. Maybe 20.
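For concreteness, the growth pattern described here is essentially a logistic S-curve: slow near-linear start, steep near-exponential middle, then saturation. A minimal sketch, assuming illustrative parameters (ceiling L, steepness k, midpoint t0) that are not taken from the comment:

```python
import math


def s_curve(t: float, L: float = 1.0, k: float = 1.0, t0: float = 0.0) -> float:
    """Logistic function: capability level at time t, saturating at L."""
    return L / (1.0 + math.exp(-k * (t - t0)))


# Print a few samples to show the three phases: slow start, steep middle, plateau.
for t in range(-6, 7, 2):
    print(f"t={t:+d}  level={s_curve(t):.3f}")
```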
wanfuse1234 t1_jdhjhet wrote
Reply to The internal language of LLMs: Semantically-compact representations by Lesterpaintstheworld
Another method of compressing is a shorthand middle interpreter that strips out vowels that are not necessary for interpretation (removing some of them makes the text ambiguous), combined with Huffman coding, or even training the network on compressed data and passing its inputs in compressed form. There are better middle-ground compression methods than Huffman coding, including other algorithms and lookup-table substitutions; see the sketch below. I have two other methods I theorize are far better, ones that get closer to the entropy limit of compression, but I can't yet prove it because of financial constraints. There are other limits to developing an AGI that are also possible to overcome, but whether we should is the big question.
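As a rough illustration of that idea, here is a minimal sketch assuming a simple vowel-stripping shorthand followed by character-level Huffman coding. The function names (strip_vowels, huffman_code) and the choice to keep each word's first letter are my own assumptions for demonstration, not a method described in the comment.

```python
import heapq
from collections import Counter


def strip_vowels(text: str) -> str:
    """Shorthand pass: drop interior vowels, keep each word's first letter."""
    words = []
    for word in text.split():
        kept = word[0] + "".join(c for c in word[1:] if c.lower() not in "aeiou")
        words.append(kept)
    return " ".join(words)


def huffman_code(text: str) -> dict:
    """Build a Huffman table (character -> bit string) from character frequencies."""
    freq = Counter(text)
    if len(freq) == 1:  # degenerate case: only one distinct symbol
        return {next(iter(freq)): "0"}
    # Heap items are (frequency, tiebreak, tree); a tree is a char or a (left, right) pair.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    tiebreak = len(heap)
    while len(heap) > 1:
        f1, _, left = heapq.heappop(heap)
        f2, _, right = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, tiebreak, (left, right)))
        tiebreak += 1

    codes = {}

    def walk(node, prefix):
        # Assign "0"/"1" going down the tree; leaves are characters.
        if isinstance(node, tuple):
            walk(node[0], prefix + "0")
            walk(node[1], prefix + "1")
        else:
            codes[node] = prefix

    walk(heap[0][2], "")
    return codes


if __name__ == "__main__":
    prompt = "the quick brown fox jumps over the lazy dog"
    short = strip_vowels(prompt)       # e.g. "th qck brwn fx jmps ovr th lzy dg"
    table = huffman_code(short)
    bits = "".join(table[c] for c in short)
    print(short)
    print(f"{len(prompt) * 8} bits raw vs {len(bits)} bits after shorthand + Huffman")
```

The bitstream would still have to be mapped back into tokens the model can actually ingest, which is where the "middle interpreter" part of the idea would sit; that step is not shown here.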