Gaudrix

Gaudrix t1_je7fuom wrote

It's not a very good example.

99% of streamers don't make enough to survive as a single source of income

99% of all streamer revenue is earned by those in the top 0.1-0.2% of viewership

That's not a job. That's an incredibly lucky and fortunate situation.

There may be different and new things we will be able to do with AI, but 99% of people will never directly benefit financially from it, not counting UBI and profit sharing due to automation, which would be indirect. Compensation from UBI will also never exceed current earning potential pre-AGI, unless we are actually post-scarcity and the AGI can do everything for us. The early stages of AI rollout, what we are experiencing now, up until full post-scarcity will be dick for just about everyone. People with resources and capital will never willingly share if they don't need human labor.

14

Gaudrix t1_je08kxu wrote

This technology makes tutors nearly obsolete. Only small improvements in reliability and consistency are needed. It can already be configured to approach a text from the perspective of a certain level of skill or education. GPT-4 won't remove that many jobs, but GPT-5 will be able to fill in almost all of the gaps preventing that now.

6

Gaudrix t1_jd4mjmc wrote

As an AI language model, 🤣 movies before 2021, obviously

The Matrix (1999)

  • IMDb: 8.7/10
  • Rotten Tomatoes: Critics - 88%, Audience - 85%

2001: A Space Odyssey (1968)

  • IMDb: 8.3/10
  • Rotten Tomatoes: Critics - 92%, Audience - 89%

Blade Runner (1982)

  • IMDb: 8.1/10
  • Rotten Tomatoes: Critics - 89%, Audience - 90%

The Terminator (1984)

  • IMDb: 8.0/10
  • Rotten Tomatoes: Critics - 89%, Audience - 89%

Ex Machina (2014)

  • IMDb: 7.7/10
  • Rotten Tomatoes: Critics - 92%, Audience - 86%

Her (2013)

  • IMDb: 8.0/10
  • Rotten Tomatoes: Critics - 94%, Audience - 82%

Ghost in the Shell (1995)

  • IMDb: 8.0/10
  • Rotten Tomatoes: Critics - 96%, Audience - 89%

Transcendence (2014)

  • IMDb: 6.3/10
  • Rotten Tomatoes: Critics - 19%, Audience - 36%

A.I. Artificial Intelligence (2001)

  • IMDb: 7.2/10
  • Rotten Tomatoes: Critics - 74%, Audience - 64%

I, Robot (2004)

  • IMDb: 7.1/10
  • Rotten Tomatoes: Critics - 56%, Audience - 70%

It made an error sorting Transcendence there, lol; that one is also bad, but conceptually provocative. I've seen every movie on this list, so I'd say it's a pretty good list.

3

Gaudrix t1_j6n1gg9 wrote

It's not the same thing. Microsoft is already huge, and the percent growth on its capital investment is not even close to the disruptive capacity of OpenAI. Any increase in OpenAI's valuation does not directly impact Microsoft's; it's considerably diluted.

It's like eating the shit of the people at the table instead of eating at the table.

3

Gaudrix t1_j6mzofx wrote

The worst part out of all of this so far is that none of the best AI out there lets the public share in the profits. They are deemed research projects and non-profits to avoid bias, but something has to be done.

The people making them are getting rich, with cash infusions from investors in the billions. Yet these companies can't be invested in by the average person, and no public company truly owns them. So they are able to wipe out millions of jobs, and those people can't cover themselves by investing in their replacement. Only a select, very fortunate few will monetarily benefit from AI as it grows. The only way to make money off AI from the outside is to use it for a business or wait for UBI, which will probably arrive years later than it's needed.

It's the dawn of a new paradigm like the internet, and you can't invest in anything to ride the wave. Yet these projects and non-profits will 10x to 50x in a decade, and none of that productivity boon will be shared with the public. This will only lead to truly destitute economic situations, because nothing is in place to mitigate the fallout of lost and obsolete human labor. What we do in the next 5 years, legislatively and technologically, will dramatically affect the next several decades.

2

Gaudrix t1_j3q6b6k wrote

I think within 10 years we'll have something that at least makes us ask the question 🤔 "Is it conscious?" It might not be full AGI but proto-intelligence, about 80% there but not quite fully realized.

1

Gaudrix t1_j154nkx wrote

Yeah, I think people conflate the technological singularity and the AI singularity. It has nothing to do with not being able to go backwards or any other constraint. Technology can always be destroyed and lost. The entire planet could be destroyed at any instant.

The technological singularity came first, and it describes the confluence of different technologies reaching a stage where they begin to have compounding effects on progress, producing an explosion in both progress and trajectory.

The AI singularity specifically refers to the point at which AI becomes sentient and transitions into AGI. Past that point we have no clue what the repercussions of creating true artificial consciousness will be, especially if it has the ability to self-improve on shorter and shorter timetables.

We are living through the technological singularity, and when they look back 100 years from now they'll probably put the onset somewhere in the late 90s or early 2000s. Things are getting faster and faster with breakthroughs across many different sectors due to cross-pollination of technological progress.

4

Gaudrix t1_ir7uc04 wrote

Yeah, people are weird in this subreddit. Everyone is working off a different definition of what the singularity is and what it entails. The singularity is a point, but it's not possible to experience a point, just what comes before and what comes after. It's very hard to determine exactly where that point is, which is why it's easier to quantify the closeness or speed of approach than the singularity itself. I'd say we are firmly locked in and on an obviously accelerating trajectory. We are in the endgame.

14

Gaudrix t1_ir6otcr wrote

It honestly feels like we are living on that steep edge.

In just the last few years there have been at least three revolutionary cancer treatments, advances in fusion/solar/battery tech, and AI creation of art, video, physics sims, voices, faces, etc.

There are so many new breakthroughs that there isn't even enough time to profit off anything and make a product, because by the time you hit the market, a free solution has been made available and it's better. The next 50 years will look like the last 50 times 100.

We are living it.

48

Gaudrix t1_iqsr219 wrote

An AR device would replace the smartphone. Eventually the two devices would be just about the same thing, but a sleeker, more productivity-focused AR device would replace the phone outright, while VR would be left for work, gaming, and more involved entertainment applications.

Productivity would be the first big factor in adoption, I think. If it can replace multi-monitor setups and allow you to interact and work more efficiently, then it will be used.

The ideal version of VR is one where you don't have to move your body at all and you are no longer limited by it or your physical space. The VR we have now is really good, but it's early-stage compared to where it will be in a few decades.

I think a big leap in the short term would be very good eye tracking. It would allow for higher fidelity on the same hardware, and it opens the door to an incredibly fast human input device. Going forward you wouldn't need a mouse or controller to track and manually point at what you're looking at; you'd just need buttons, and as soon as you look at something you press the button and it clicks there (a rough sketch of that gaze-plus-button idea is below). That would let you interact with menus in VR and extended PC desktop views at an insanely accelerated pace compared to mouse or controller pointing. Typing on a virtual keyboard with your eyes would be faster than now but still not as fast as a physical keyboard; voice-to-text could be used for longer typing sessions.
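To make the gaze-plus-button idea concrete: the eye tracker reports where you're looking each frame, and a plain physical button commits a "click" on whatever UI element your gaze is resting on at that moment. Here's a minimal, purely illustrative Python sketch; GazePoint, UIElement, and the menu entries are made-up stand-ins, not any real headset SDK:

```python
# Illustrative gaze-plus-button selection: the eyes do the pointing,
# a physical button commits the click. All names here are hypothetical.

from dataclasses import dataclass

@dataclass
class GazePoint:
    x: float  # normalized viewport coordinates, 0..1
    y: float

@dataclass
class UIElement:
    name: str
    x: float       # top-left corner, normalized
    y: float
    width: float
    height: float

    def contains(self, gaze: GazePoint) -> bool:
        return (self.x <= gaze.x <= self.x + self.width and
                self.y <= gaze.y <= self.y + self.height)

def element_under_gaze(elements, gaze):
    """Return the first UI element the user is currently looking at, if any."""
    for element in elements:
        if element.contains(gaze):
            return element
    return None

def on_button_press(elements, gaze):
    """Commit a click at the gaze target the moment the button is pressed."""
    target = element_under_gaze(elements, gaze)
    if target is not None:
        print(f"Clicked: {target.name}")
    else:
        print("Button pressed, but gaze is not on any element")

if __name__ == "__main__":
    menu = [
        UIElement("Open desktop view", 0.10, 0.10, 0.25, 0.10),
        UIElement("Launch browser",    0.10, 0.25, 0.25, 0.10),
        UIElement("Settings",          0.10, 0.40, 0.25, 0.10),
    ]
    # Simulated gaze positions at the instant the button is pressed.
    on_button_press(menu, GazePoint(0.20, 0.30))  # -> Clicked: Launch browser
    on_button_press(menu, GazePoint(0.90, 0.90))  # -> no element under gaze
```

The point is that aiming and committing are decoupled: the eyes do the pointing for free, so the only input latency left is the button press itself.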

I think: half the weight, double the resolution while pushing higher framerates (>=144 fps), 50-100% more FOV, an eye-tracking input system, and comfortable enough to wear for several hours with no issue. I can imagine lying in bed and just working, or exploring the virtual net.

All of this is working towards a full neural connection for input and output, of course, but that might be several decades away, so there's no point in waiting for it. It's best to just make the best use of what we have.

12