
GlobusGlobus t1_j5wd3ux wrote

Some of Marcus' comments are so strange because he always thinks about AGI and seems to assume that other people think that way too. His critique of ChatGPT is that he doesn't think it is a fast path to AGI. He basically says we should scrap GPT and do other things. I agree that GPT is not a steep stepping stone towards AGI; I don't think GPT has much at all to do with AGI. But that is not the point! GPT-3 is a fantastic tool that solves lots of problems. Even if it never has anything to do with AGI, it is still worth an insane amount of money and will be extremely beneficial.

For me, GPT might be more important than AGI. Every time Marcus speaks, he just assumes that everyone's goal is AGI. It is very strange.

1

dasnihil t1_j5ybc3o wrote

if gpt is more important to you, that's okay. everyone has a mission and it doesn't have to be the same. there are physicists still going at it without caring much about gpt or agi. who cares man, we have a limited life and we'll all be dead sooner or later. relax.

1

GlobusGlobus t1_j5yct43 wrote

I am not convinced either way, but it is a strange, and clearly false, assumption that all ML has AGI as a goal. Most of the time people just want to solve a problem.

1

dasnihil t1_j5yd0ai wrote

that's fine, and it's a great tool like most tools humans have invented. i'd even say NNs and gradient descent are the greatest ideas so far. either way, we must keep going while society makes use of inventions along the way.

1
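Since the comment above name-checks gradient descent, here is a minimal sketch of the idea on a toy 1-D quadratic (nothing to do with GPT or any specific library, just the bare update rule):

```python
def gradient_descent(grad, x0, lr=0.1, steps=100):
    """Generic 1-D gradient descent: repeatedly step against the gradient."""
    x = x0
    for _ in range(steps):
        x -= lr * grad(x)  # x <- x - lr * f'(x)
    return x

# Toy objective: f(x) = (x - 3)^2, so f'(x) = 2 * (x - 3); minimum at x = 3.
x_min = gradient_descent(lambda x: 2 * (x - 3), x0=0.0)
print(round(x_min, 4))  # converges to 3.0
```

Training a neural network is the same loop in many dimensions: the gradient comes from backpropagation and the parameters are the weights.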