Submitted by [deleted] t3_zqehz3 in Futurology
LizardWizard444 t1_j0xy97v wrote
I disagree, i don't think we can have a tech revolution fast enough before existing systems and tech almost kill us.
The first assumption people make is that capitalism will remain stable forever and that the game theory holding this big thing we call society together will stay stable indefinitely; you're promising that the system we live in (capitalism) and a stable majority of people will always find it worthwhile enough for everyone to stay. I don't think that's the case. Financial or even just generic disasters happen, and sometimes society handles them well and people get what they need, but one day it might not. There isn't anything built into the structure of capitalism that promises with absolute certainty that the market, businesses, or the government will provide for the needs of a big enough majority that everything will forever and always be business as usual. Perhaps someday a perfect storm of disasters occurs (say, an actually deadly disease and several extreme weather events all at the same time) and suddenly no one can buy or produce, and the whole thing breaks and stays broken for so long that people turn to something else to save them.
The next big assumption is that AI will never be good enough to do everything. If you had asked me two years ago, "Do you think AI will be able to automate art?", I'd probably have given a solid no. I assumed art would be one of the last things to be automated, if ever, but now there are news stories and posts raising serious concerns. I think AI-generated art is solid proof that we definitely don't know what AI will be able to do in five years, let alone whether it will become a main commercial source of art in the next few.
Honestly, in a grander and grimmer prediction about the capabilities of AI, a specialist on the question of "will AI kill us" (Eliezer Yudkowsky, to be specific) said that people having kids today might get to see their children graduate kindergarten. What I think is going to be necessary is taking AI alignment seriously and making sure the AI we build doesn't manage to end us in some catastrophic manner, and instead helps humanity rather than destroying it.
Overall, I think automation is part of this issue. Under capitalism as it is, businesses would much rather automate a job out of existence forever with AI solutions than keep those jobs around for the good of the people working them. New discoveries that might produce a job an AI can't do aren't being made fast enough to keep up with technology's ability to remove jobs permanently, and I doubt that striking a balance between truly new jobs like "bozon cutting" or "horizon deer" and AI's general trend of automating any job we can get enough data on is a sustainable solution.
When people raise the concern "will AI replace artists?", it's the first touch of something in the murky water just beneath us (humanity). To keep the metaphor going, I believe we can escape the jaws of runaway AI, but it's going to mean taking things like AI-generated art seriously. It definitely doesn't mean assuming that "new jobs" will miraculously emerge from nothing, or mindlessly playing around in the great unknown with artificial intelligence.