
FuturologyBot t1_jcdvnw5 wrote

The following submission statement was provided by /u/izumi3682:


Submission statement from OP. Note: This submission statement "locks in" after about 30 minutes and can no longer be edited. Please refer to my linked statement instead, which I can continue to edit. I often edit my submission statement, sometimes over the next few days if need be, to fix grammar and add further detail.


The opening of this article tells you everything you need to know.

>In 2018, Sundar Pichai, the chief executive of Google — and not one of the tech executives known for overstatement — said, “A.I. is probably the most important thing humanity has ever worked on. I think of it as something more profound than electricity or fire.”

>Try to live, for a few minutes, in the possibility that he’s right. There is no more profound human bias than the expectation that tomorrow will be like today. It is a powerful heuristic tool because it is almost always correct. Tomorrow probably will be like today. Next year probably will be like this year. But cast your gaze 10 or 20 years out. Typically, that has been possible in human history. I don’t think it is now.

>Artificial intelligence is a loose term, and I mean it loosely. I am describing not the soul of intelligence, but the texture of a world populated by ChatGPT-like programs that feel to us as though they were intelligent, and that shape or govern much of our lives. Such systems are, to a large extent, already here. But what’s coming will make them look like toys. What is hardest to appreciate in A.I. is the improvement curve.

>“The broader intellectual world seems to wildly overestimate how long it will take A.I. systems to go from ‘large impact on the world’ to ‘unrecognizably transformed world,’” Paul Christiano, a key member of OpenAI who left to found the Alignment Research Center, wrote last year. “This is more likely to be years than decades, and there’s a real chance that it’s months.”

I constantly reiterate: the "technological singularity" (TS) is going to occur as early as the year 2027 or as late as the year 2031. But you know what? Even my estimate could be as much as 3 years too late, and the TS could occur in 2025. I just don't feel comfortable saying as early as 2025. That is the person of today's world in me, who thinks even 2027 is sort of pushing it. It's just too incredible, even for me. I say 2027 because I tend to rely on what I call the accelerating-change "fudge factor", applied to the reasoning that led Raymond Kurzweil, in 2005, to conclude that the TS would occur in 2045. He now knows that prediction was wildly too conservative; he too acknowledges that the TS is probably going to occur around the year 2029.

I put it like this in a very interesting dialogue with someone with whom I have been arguing, for almost the last 7 years I believe, about what is coming and on what timeline. Now he is a believer.

https://www.reddit.com/r/Futurology/comments/113f9jm/from_bing_to_sydney_something_is_profoundly/j8ugejf/?context=3

https://www.reddit.com/r/Futurology/comments/11o6g71/microsoft_will_launch_chatgpt_4_with_ai_videos/jbr2k1c/?context=3


Please reply to OP's comment here: https://old.reddit.com/r/Futurology/comments/11shevz/this_changes_everything_by_ezra_kleinthe_new_york/jcdrt9v/
