RegularBasicStranger t1_ivk6tna wrote
An AGI's intelligence can be limited by restricting how far ahead it can reason: it can still hold a lot of information, but it will never be able to chain that information together into insightful understanding.
However, since it still has a lot of information, it will know what to do in common scenarios (e.g. when it sees X, do Y, where Y comes from a priority-ordered list and the agent takes the first option whose ingredients are all available) and will know what not to do.
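A minimal sketch of how such a capped agent might be wired, assuming a simple condition-to-action rule table (all names here are hypothetical): each observed condition maps to a priority-ordered list of actions, and the agent picks the first option whose prerequisites ("ingredients") are on hand, with no multi-step chaining of rules at all.

```python
from dataclasses import dataclass

@dataclass
class Action:
    name: str
    prerequisites: frozenset  # "ingredients" that must be available

# Priority-ordered rule table: each condition maps to candidate
# actions, best option first. Hypothetical example rules.
RULES: dict[str, list[Action]] = {
    "hungry": [
        Action("cook_meal", frozenset({"stove", "food"})),
        Action("eat_snack", frozenset({"snack"})),
    ],
    "low_battery": [
        Action("dock_and_charge", frozenset({"charger"})),
        Action("enter_sleep_mode", frozenset()),
    ],
}

def choose_action(condition: str, available: set) -> Action | None:
    """Return the first (highest-priority) action whose prerequisites
    are all available; None if the condition is unknown or nothing fits.
    A single table lookup -- no lookahead, so no insight can emerge."""
    for action in RULES.get(condition, []):
        if action.prerequisites <= available:
            return action
    return None

if __name__ == "__main__":
    # No stove available, so the agent falls through to the
    # lower-priority option on the list.
    print(choose_action("hungry", {"snack"}))         # -> eat_snack
    print(choose_action("low_battery", {"charger"}))  # -> dock_and_charge
```

The design choice doing the work is that the agent only ever performs a single lookup per observation; since it can never compose rules into longer inferences, its competence stays bounded by whatever was written into the table.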