3Quondam6extanT9 t1_ismog67 wrote
The strangest part of AI-generated video is that if you look closely at any specific part of it, there is this fuzzy detachment in the arrangement of elements, like a rearranging dream in a cohesive state of flux: never quite stable, but never quite chaos.
3Quondam6extanT9 t1_is3exfn wrote
Reply to comment by subdep in Computer-connected human neurons learn to play video games by galactic-arachnid
You're confusing "biological" with "natural." For an intelligence to be naturally occurring, it would have to emerge independent of intentional design.
Regardless of its makeup, the fact that we design and build it automatically makes it an artificial construct.
3Quondam6extanT9 t1_irsnqn6 wrote
I think calling things dumb because you don't agree with a conceptual theory is dumb.
Nobody is really assuming it will just be conscious, but there are a myriad of reasons, both human- and technology-oriented, why understanding consciousness becomes beneficial.
Most people simply feel concerned at the possibility of it occurring because we are developing a literal intelligence. Since we know so little about what consciousness truly is, it becomes difficult to project or gauge what building an intelligence could lead to. It's a good thing to be aware of going into development, and until we know more about consciousness, it may be more ridiculous to assume we wouldn't create a conscious intelligence.
It's kind of like outwardly assuming that fission couldn't result in an explosion because you don't know how fission works.
3Quondam6extanT9 t1_irjr9gl wrote
Reply to comment by Lawjarp2 in Singularity, Protests and Authoritarianism by Lawjarp2
We'll continue hoping for the people of Iran. I wish we could do more.
3Quondam6extanT9 t1_irjm3c7 wrote
Reply to comment by Ezekiel_W in Singularity, Protests and Authoritarianism by Lawjarp2
This is not accurate. Protests vary in degree, location, context, and methods. Protests like Stonewall, May Day, Women's Suffrage, the Arab Spring in some places, and MOL all had different influences that directed history in some way and changed quite a lot.
It would be more accurate to say that, generally speaking, protests don't amount to much beyond bringing awareness to the public.
It's also not accurate for OP to claim that the current protests haven't amounted to much, since some are ongoing. The Iranian protests have spread throughout the country, from schools to businesses, and regime change is a potential outcome if the pressure remains.
3Quondam6extanT9 t1_iravwfo wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
You're right, it is speculation, and initially it would likely be no better than human influence.
However, limited improvement itself should be something we can write into code that, at the very least, is given the parameters to analyze and choose between the better options.
The AI that goes into deepfakes, image generation, and now video generation essentially takes different variables and applies them to the outcome through a set of instructions.
So it wouldn't be beyond the realm of possibility to program a system that can choose between a few options, with the understanding that each outcome carries some improvement with it.
That improvement could alter the speed at which it calculates projections or increase its database.
Call it handholding self-improvement to begin with. I would like to think that over time one could "speculate" that an increasingly complex system is capable of meeting these very limited conditions.
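To make that "handholding" idea a little more concrete, here's a rough toy sketch (everything in it, the names, the numbers, and the improvement estimates, is made up for illustration): the system never invents new options, it only picks whichever human-defined adjustment carries the highest estimated benefit.

```python
# Toy sketch of "handholding" self-improvement: the system only chooses
# among a small, human-defined set of candidate adjustments, each tagged
# with an estimated benefit. All names and numbers are hypothetical.

from dataclasses import dataclass
from typing import Callable, List


@dataclass
class SystemState:
    projection_speed: float = 1.0  # relative speed of calculating projections
    database_size: int = 1_000     # number of records available to the system


@dataclass
class CandidateImprovement:
    name: str
    estimated_gain: float                        # human-supplied benefit estimate
    apply: Callable[[SystemState], SystemState]  # how to apply the adjustment


def faster_projections(state: SystemState) -> SystemState:
    # Speed up projection calculations by 20%.
    return SystemState(state.projection_speed * 1.2, state.database_size)


def bigger_database(state: SystemState) -> SystemState:
    # Expand the database by 500 records.
    return SystemState(state.projection_speed, state.database_size + 500)


def handheld_self_improve(state: SystemState,
                          candidates: List[CandidateImprovement]) -> SystemState:
    """Apply the single candidate with the highest estimated gain.

    The system never invents new options; it only selects among the
    adjustments it was explicitly handed.
    """
    best = max(candidates, key=lambda c: c.estimated_gain)
    return best.apply(state)


if __name__ == "__main__":
    options = [
        CandidateImprovement("speed up projections", 0.3, faster_projections),
        CandidateImprovement("expand database", 0.2, bigger_database),
    ]
    state = handheld_self_improve(SystemState(), options)
    print(state)  # SystemState(projection_speed=1.2, database_size=1000)
```

Obviously that's nothing like open-ended self-improvement, but it shows how "choose the better option" can already be reduced to code with very limited, hand-set parameters.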
3Quondam6extanT9 t1_irando4 wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
We automate most systems currently through manual setup, so I can only assume this will continue until AI has developed enough to self-program, at least at a limited scale.
3Quondam6extanT9 t1_irajrw3 wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
In context to what the redditor was talking about, I'm not sure. I'm assuming they may be basing their perspectives on pop culture concepts like Skynet.
I don't think one AGI will take over "everything", but I do think various versions of AGI will become responsible for more automated systems throughout different sectors. It won't be a consistent one-size-fits-all, as some businesses and industries will adopt different approaches and lean into it more than others.
In fact I think we'll see an oversaturation of AGI being haphazardly applied or thrown at the wall to see what sticks.
It wouldn't be until an ASI emerges that it's "possible" for unification to occur at some level.
Until that point, though, I personally do not see it "taking over". But that's just me.
3Quondam6extanT9 t1_iracj87 wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
I didn't say it wasn't speculation, but that was never the point.
You're mentioning big data without considering the simple to moderate AI tasks which have been operating at different levels in different sectors for years. Not in terms of "return" but in efficient data management, calculation, logistics, and storage.
Those are basic automated operations that are barely considered AI but are still a function of business in day-to-day management.
But that's enterprise; we aren't even talking about sectors like entertainment and content creation, which utilize AI far more readily. We see a lot of AI going into systems that render and utilize recognition patterns, like deepfakes and rotoscoping.
Your perception of AI integration equaling zero return omits an entire world of operation and doesn't consider future integration. As I said, reductionist.
3Quondam6extanT9 t1_ira5vqw wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
I'm not targeting anyone; it's just that the overall dialogue between you two carried a slightly condescending tone with regard to the redditor's intelligence.
I'm sure you're familiar with the amount of AI out in the world and its different forms and uses under the development of different sectors and entities.
I think it would be virtually impossible to offer any concrete predictions about what exactly AI will "take over".
Your comment regarding business use of AI and its efficiency is fairly reductionist though. It assumes that the goal of a company is linear and that it will have to make a binary choice between human or AI influence.
Generally there is a slow integration of AI input as industry adopts models for software and calculation. It's not one or the other; it's a combination of the two to start, and over time you tend to see a gradual increase in the use of the AI model in those specific use cases.
3Quondam6extanT9 t1_ira2lv7 wrote
Reply to comment by matt_flux in The last few weeks have been truly jaw dropping. by Particular_Leader_16
Redditors aren't required to be genius-level professors. It's a social media platform. Expectations should be low, but that doesn't mean we discount everything being discussed or the people discussing it.
The context of "taking over everything" may be clear to the redditor and may be a rational conclusion based on their available knowledge.
I do think it's important to discuss what is meant without discouraging them from being involved in that discussion through passive aggressive remarks or slights to their intelligence.
That being said, I think they probably meant that, through a combination of integral human systems, AGI could replace the need for human interaction at various levels. To them that might mean government, enterprise, technology innovation, and utilities.
Personally I don't see it as being so straightforward as one AGI to rule them all, but in certain respects over the next few decades we could see industry adopting stronger AGI influence and control within various sectors.
There will be a lot of nuance and this is what some people may not recognize, thereby assuming it's a binary outcome.
3Quondam6extanT9 t1_iqqtmel wrote
Sounds like this is the beginning to a Lovecraftian nightmare. 😆
3Quondam6extanT9 t1_iqkmtvm wrote
Reply to comment by TinyBurbz in The Age of Magic Has Just Begun by Ohigetjokes
The only reason the public generally doesn't integrate with tech is the accessibility and cost of said technology. The more AI is developed in different areas under different circumstances, the more it becomes normalized, and even now AI is used on a daily basis with the population barely recognizing they use it.
As long as the populace isn't forced to pay a lot, and as long as the tech sector is capable of broader use cases and open-source systems, it will continue to happen and integrate into normal life.
3Quondam6extanT9 t1_issrxuq wrote
Reply to Talked to people minimizing/negating potential AI impact in their field? eg: artists, coders... by kmtrp
Are you an artist or coder?