LeCodex t1_irupspb wrote
Reply to comment by MurderByEgoDeath in What happens in the first month of AGI/ASI? by kmtrp
I'm glad to see another fan of Popper and Deutsch in the midst of this sea of arrogantly confident errors about intelligence, AGI, knowledge, and so on.
Seeing so many people here parrot the misconceptions that are so prevalent in the field, I'm beginning to understand Deutsch's arguments in his "Why has AGI not been created yet?" video at a deeper level.
It's as if the people supposedly interested in bringing about AGI had decided to choose one of the worst epistemological frameworks they could find to get there (certainly worse than Popper's epistemology), then proceeded to lock themselves out of any error-correction mechanism in that regard. Now they're all wondering why their AIs can't generalize well, can't learn in an open-ended fashion, struggle with curiosity, suck at abductive reasoning... and for that matter, even deduction (since finding good proofs requires a serious dose of abduction), and are data-hungry...