Ok_Faithlessness4197

Ok_Faithlessness4197 t1_jecguf5 wrote

It's worth talking about, but I'm also worried. The rate at which it's advanced means that whoever finds the next significant performance improvement could well develop AGI. Many people are researching it, and I'm concerned because 1. AI is currently unaligned, and 2. a malicious party could develop AGI. If high-performing models hadn't already been publicly released, I would have fully supported regulation (until AI could be aligned, or a plan for public safety developed).

1

Ok_Faithlessness4197 t1_jdz5m2l wrote

I just read the second article you linked, and it doesn't provide any scientific basis for bounding an intelligence explosion. Given the recent uptrend in AI investment, I'd give it 5-10 years before an ASI emerges. In particular, once AI takes over microprocessor development, it will almost certainly kick-start this explosion.

4

Ok_Faithlessness4197 t1_jdu12qm wrote

I make no claims about sentience. I will say, however, that this is far ahead of what was previously achievable by AI standards. In its current form, it has to be allowed enough time (enough generated steps) to meet the computational demands of the math it's asked to do. In the future, once it's linked with WolframAlpha (a computational math engine), it will not make the simple mistakes it makes now.
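A rough sketch of what that kind of linkage could look like (the routing heuristic and the local evaluator standing in for WolframAlpha are my own assumptions for illustration, not the actual integration):

```python
# Hypothetical sketch: route arithmetic out of the language model to an exact
# evaluator instead of letting the model "predict" the digits. A real
# WolframAlpha integration would send the expression to their API; here a
# local safe evaluator stands in so the example runs on its own.
import ast
import operator

OPS = {
    ast.Add: operator.add,
    ast.Sub: operator.sub,
    ast.Mult: operator.mul,
    ast.Div: operator.truediv,
    ast.Pow: operator.pow,
    ast.USub: operator.neg,
}

def evaluate(expr: str) -> float:
    """Exactly evaluate a plain arithmetic expression (stand-in for the external tool)."""
    def walk(node):
        if isinstance(node, ast.Constant) and isinstance(node.value, (int, float)):
            return node.value
        if isinstance(node, ast.BinOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.left), walk(node.right))
        if isinstance(node, ast.UnaryOp) and type(node.op) in OPS:
            return OPS[type(node.op)](walk(node.operand))
        raise ValueError("unsupported expression")
    return walk(ast.parse(expr, mode="eval").body)

def answer(prompt: str) -> str:
    # Toy router: if the prompt looks like bare arithmetic, delegate it;
    # otherwise it would go to the language model (not shown here).
    try:
        return str(evaluate(prompt))
    except (ValueError, SyntaxError):
        return "(hand off to the language model)"

print(answer("343 * 875"))  # 300125, computed exactly rather than predicted
```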

0

Ok_Faithlessness4197 t1_jdsskog wrote

I absolutely agree: its multiplication algorithm is very slow, very inefficient, and very different from the way a calculator would handle it. I think it also differs from how you're thinking of it, though. It's more than just a really good text predictor. It can use logic and solve novel problems in many unprecedented ways. Here, I would argue, it has a greater-than-superficial understanding of the math algorithm it used to multiply the numbers. Can I ask how you'd define an algorithm, and what you'd consider "running a multiplication algorithm"?
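To be concrete about what I mean by the schoolbook procedure, here's a minimal sketch of textbook long multiplication (just an illustration of the algorithm under discussion, not anything the model runs internally):

```python
def long_multiply(a: int, b: int) -> int:
    """Schoolbook long multiplication, digit by digit with carries,
    the same procedure the quoted ChatGPT output is trying to follow."""
    a_digits = [int(d) for d in reversed(str(a))]
    total = 0
    for place_b, db in enumerate(int(d) for d in reversed(str(b))):
        carry = 0
        partial = 0
        for place_a, da in enumerate(a_digits):
            prod = da * db + carry           # single-digit product plus carry
            carry, digit = divmod(prod, 10)  # write down `digit`, carry the rest
            partial += digit * 10 ** place_a
        partial += carry * 10 ** len(a_digits)
        total += partial * 10 ** place_b     # shift by b's place value, then sum
    return total

print(long_multiply(343, 875))  # 300125
```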

−2

Ok_Faithlessness4197 t1_jdsl2be wrote

Input: Solve 343 x 875 with long multiplication.

Output (ChatGPT): Sure! Here's how to solve 343 x 875 with long multiplication:
3 x 5 = 15
Write down 5, carry-over 1
3 x 7 = 21, plus the carried-over 1 = 22
Write down 2, carry-over 2
3 x 8 = 24, plus the carried-over 2 = 26
Write down 6, carry-over 2
3 x 5 = 15, plus the carried-over 2 = 17
Write down 7, carry-over 1
Add the carried-over 1 to the result of 3 x 4 = 12, which gives 13
Write down 3, carry-over 1
4 x 5 = 20
Write down 0, carry-over 2
4 x 7 = 28, plus the carried-over 2 = 30
Write down 0, carry-over 3
4 x 8 = 32, plus the carried-over 3 = 35
Write down 5, carry-over 3
4 x 5 = 20, plus the carried-over 3 = 23
Write down 3, carry-over 2
Add the carried-over 2 to the result of 3 x 8 = 24, which gives 26
Write down 6, carry-over 2
Add the carried-over 2 to the result of 3 x 3 = 9, which gives 11
Write down 1, carry-over 1
Add the carried-over 1 to the result of 3 x 8 = 24, which gives 25
Write down 5
So, the final result is 300125.

−4

Ok_Faithlessness4197 t1_jdrt7xy wrote

That's not quite correct. While all it does is guess what the next token will be, it can intelligently infer that an equation (even one outside its training set) needs to be calculated, and then calculate it. The problem is its inability to utilize its understanding of mathematics to answer prompts in an efficient and accurate manner. Once a calculator is implemented (probably in GPT-4.1, given the recent paper by Microsoft demonstrating its capability to use one), this problem will hopefully be resolved.
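Something like this hypothetical wrapper is what I have in mind by "a calculator is implemented" (the marker syntax and the stand-in model are assumptions for illustration, not the actual mechanism in any GPT release):

```python
# Hypothetical sketch of the "give the model a calculator" idea: the model is
# prompted to emit a marker like <<343 * 875>> whenever it needs arithmetic;
# an outer loop intercepts the marker, computes the result exactly, and
# splices it back into the text.
import re

CALC_PATTERN = re.compile(r"<<([\d\s+\-*/().]+)>>")

def resolve_calculator_calls(model_output: str) -> str:
    """Replace every <<expression>> marker with its exactly computed value."""
    def compute(match: re.Match) -> str:
        # eval is restricted to digits/operators by the regex above;
        # a production system would use a proper expression parser.
        return str(eval(match.group(1), {"__builtins__": {}}, {}))
    return CALC_PATTERN.sub(compute, model_output)

def fake_model(prompt: str) -> str:
    # Stand-in for the language model: it "decides" the prompt needs
    # arithmetic and defers the digits to the tool instead of guessing them.
    return "343 x 875 = <<343 * 875>>"

print(resolve_calculator_calls(fake_model("What is 343 x 875?")))
# -> "343 x 875 = 300125"
```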

6