michaelhoney t1_irzeinc wrote
Reply to comment by NightmareOmega in Why does everyone assume that AI will be conscious? by Rumianti6
Fair point: conscious things not made of evolved meat are still hypothetical, as far as we know. We don’t yet know what the secret sauce is.
michaelhoney t1_irvmyyl wrote
Reply to comment by moos14 in Why does everyone assume that AI will be conscious? by Rumianti6
There are definitely learned behaviours which are passed on socially, and some populations have them while others don’t. An example here in Australia is swooping magpies: in many areas they become homicidal during the nesting season, whereas where I live they are calm around humans all year round.
michaelhoney t1_irvml4y wrote
Reply to comment by NightmareOmega in Why does everyone assume that AI will be conscious? by Rumianti6
Being really fast is not the same as being sufficiently complex, though. “Complex, in the right way” is important.
michaelhoney t1_ivedxri wrote
Reply to comment by abudabu in Nick Bostrom on the ethics of Digital Minds: "With recent advances in AI... it is remarkable how neglected this issue still is" by Smoke-away
You’re treating the humans-doing-the-computation scenario as a reductio ad absurdum, but do you have even an order-of-magnitude sense of how long it would take humans to simulate an AGI? If you had a coherent sect of humans spending thousands of years performing rituals they couldn’t possibly understand, and those rituals nonetheless produced (very slow!) intelligent predictions…