
Surur t1_j8vx5w4 wrote

Google has already done that and it works really well, though a bit slowly. There is no reason the technology cannot improve with time.

https://say-can.github.io/

https://youtu.be/ysFav0b472w

However, I think this idea is new and pretty cool.

> Without getting into details like neural networks, transformers, and whatnot, **I figure we can use the same tech to predict the next physical movement a robot does.** So if you were to construct a robot that looks like a human and has the same abilities, i.e. it can rotate and extend its limbs the same way, then given enough data it could learn to move like a human the same way ChatGPT can talk like a human.

> The input for this would be video footage and software that can identify limb movements. An easy way to start would be to tape a factory line where human workers perform some kind of repetitive movement. Next thing you know, we could have robots doing dishes and mopping the floor! Add ChatGPT-like abilities and it will be able to talk as well.

It would be like physical intelligence.
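To make the "next movement as next token" analogy concrete, here is a minimal sketch. This is my own illustration, not from SayCan or the quoted proposal: a real system would use a transformer over continuous joint angles, but a tiny Markov model over discretized poses shows the same next-step-prediction idea. The pose tuples and demonstration data are hypothetical.

```python
# Hypothetical sketch: treat recorded robot/human poses as "tokens" and
# predict the next pose from the current one, the way a language model
# predicts the next word. A bigram (Markov) model stands in for the
# transformer the actual proposal would use.
from collections import Counter, defaultdict

def train_next_pose_model(pose_sequences):
    """Count pose -> next-pose transitions from recorded demonstrations."""
    transitions = defaultdict(Counter)
    for seq in pose_sequences:
        for current, nxt in zip(seq, seq[1:]):
            transitions[current][nxt] += 1
    return transitions

def predict_next_pose(model, pose):
    """Return the most frequently observed pose following `pose`."""
    if pose not in model:
        return None
    return model[pose].most_common(1)[0][0]

# Toy "demonstrations": each pose is a tuple of discretized joint angles
# (shoulder, elbow), as might be extracted from video of a repetitive task.
demos = [
    [(0, 0), (10, 20), (20, 40), (10, 20), (0, 0)],
    [(0, 0), (10, 20), (20, 40), (30, 60)],
]
model = train_next_pose_model(demos)
print(predict_next_pose(model, (10, 20)))  # most common successor: (20, 40)
```

Scaling this up would mean replacing the lookup table with a sequence model conditioned on a long history of poses, which is exactly where the transformer architecture comes in.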


michaelscodingspot OP t1_j8vyl4o wrote

That's really cool, but not exactly what I meant. It uses language models to understand intent and to train the robot to talk, not to train its physical movements. To do that, and to train on human movement, the robot would need two arms and two legs.
