ebolathrowawayy t1_irc8dxz wrote

> Lol. I mean, there it is - you were going to art school for monetary reasons and not for art reasons.

No, I just realized once I was there that being poor wasn't the life I wanted and I can practice art in my own time.

> Side note; I just don’t think 8 hours is very substantial. Amazing drawings can take way longer than that. Try a 25 hour drawing.

Ok? I guess I'm not cultured enough to understand art because I didn't spend 25 hours on a single drawing.

> And no it’s not enough tbh - because there are tangible advantages now, but how will this affect the future? What’s the end goal, really? 10, 20, 30, 40 years down the line? Do we want to be in a world where media, art, games, shows, movies, etc. are not worked on but just generated at a whim for what we want?

yes.

> Are we considering how this could impact the mental health of future artists or consumers?

I don't care about the mental health of future artists or consumers.

> Like - movies and TV for example - amazing tools exist now; movies, VFX, and special effects are so advanced. Yet you go on the street and most people complain that everything is a reboot, or that nothing good is made anymore.

Agreed. Most of the good stuff gets canceled too soon.

> I honestly believe people want to spite artists in this way because of how hard image making can be,

I think the excitement about SD is from all of the possibilities and not a plot to screw over artists.

> yet when we see successful artists (in an age of the most commercially successful contemporary artists to ever exist), we must now “democratize” it because people feel “how come they can do what they love and get paid for it, but I can’t?!?”

That's true of every field of work where only the top 0.001% make big money. If there is a push to "democratize" this, it's likely because some people get paid far too much for things, e.g. CEOs, top sports players and musicians, etc. I've never heard anyone say we should make it so everyone can sing well so that Taylor Swift makes less money, though, or anything like that.

the_coyote_smith t1_ire71ax wrote

> yes

Really sad. That won’t be a good life.

> I don’t care about the mental health of future artists or consumers.

Even more sad. Empathy and compassion are not just for the other person. They’re good for you too.

ebolathrowawayy t1_ireuzpx wrote

> Really sad. That won’t be good life.

An endless stream of personalized movies, shows, games, and VR adventures that are tailored specifically to your tastes and even for what you need to grow and mature sounds amazing to me. Especially since they will be better than anything humanity can possibly create.

> I don’t care about the mental health of future artists or consumers.

> Even more sad. Empathy and compassion are not just for the other person. They’re good for you too.

I don't feel compassion for people who yell at clouds. I feel compassion for people who have bad things happen to them through no fault of their own. IMO everyone should be preparing for the AI future instead of hiding from it. Provided we don't destroy ourselves, the future looks very promising for everyone, but there will be hiccups along the way.

the_coyote_smith t1_irgxhx3 wrote

> An endless stream of personalized movies, shows, games, and VR adventures that are tailored specifically to your tastes and even for what you need to grow and mature sounds amazing to me.

You're naive and looking past the obvious here. We already have a tiny form of this. Go on any social media - YouTube, Twitter, Facebook, what have you - and you are fed stimuli that have been personally tailored to the choices and clicks you make. And you claim we are doing mentally well because of these things? Aren't a lot of societies facing declining birth rates, increased rates of mental health issues and suicide, and political polarization?

> Especially since they will be better than anything humanity can possibly create.

These are the fantastical scenarios I was speaking about - riddled with AI-favored rhetoric. It's very predictable and not far from a sales pitch.

> I don't feel compassion for people who yell at clouds.

Just because people don't blindly believe the same things you do, and question how to use certain systems, tools, or whatever, responsibly - doesn't make them people who yell at clouds. Besides, why not? Show some kindness.

No, not everyone is preparing for AI, because not everyone agrees with you. It's as simple as that. I mean, hell, if you sit a person down in front of a screen and they play their favorite video game all day long, you aren't going to have a very happy, satisfied person. This is measurable right now, actually.

ebolathrowawayy t1_irj2u9t wrote

> You're naive and looking past the obvious here. We already have a tiny form of this. Go on any social media - YouTube, Twitter, Facebook, what have you - and you are fed stimuli that have been personally tailored to the choices and clicks you make. And you claim we are doing mentally well because of these things? Aren't a lot of societies facing declining birth rates, increased rates of mental health issues and suicide, and political polarization?

I am doing mentally well. Not everyone has turned into a drone, but yes that future could be very dark.

The tech is unavoidable though, so we should be shaping it early before it gets out of hand. That's part of why I think we need to embrace this tech and not avoid it. It doesn't matter to the developers or the users of SD how much harm it does to a subset of artists because it doesn't affect them. So now is the time to debate the implications as we're doing now to help shape the development and use of AI tools. Avoiding it altogether though just doesn't help anyone. Like, have you used it extensively and built anything using SD as a foundation? The process of doing that might change your mind.

> These are the fantastical scenarios I was speaking about - riddled in AI-favored rhetoric. It's very predictable and not far from a sales pitch.

IDK, I think that future is inevitable. There's no sign of an AI winter coming, so it's only going to get better and better.

> Just because people don't blindly believe the same things you do, and question how to use certain systems, tools, or whatever, responsibly - doesn't make them people who yell at clouds. Besides, why not? Show some kindness.

Yeah, sorry, it's hard sometimes to empathize.

> No, not everyone is preparing for AI, because not everyone agrees with you. It's as simple as that.

I think everyone should be preparing. That's why it's so hard for me to empathize with those who don't. It's so obvious to me that AI is going to first assist everyone on a daily basis and then eventually make most humans obsolete. Preparing for this might mean building skills in other domains (though fuck if I know which ones; AI is coming for art and coding before more manual labor, which is a shock) or becoming an early adopter of AI tools to remain relevant.

> I mean, hell, if you sit a person down in front of a screen and they play their favorite video game all day long, you aren't going to have a very happy, satisfied person. This is measurable right now, actually.

Maybe? I haven't looked much into that. I used to play SC2 competitively for 10 hours a day alongside a full-time job many years ago, and it wrecked me, but I also gained some cognitive benefits that persist to this day. I don't regret those days. I don't know if it's completely cut and dried. Also, games in the future may be very different from today's if they're personalized for the individual and there's nothing else for humans to productively do.

the_coyote_smith t1_irjxlc0 wrote

I agree we should shape it responsibly. Which means sometimes criticizing, let’s say, SD and LAION for scraping medical records and copyrighted images from artists who did the real work. And yes - it was knowingly done - because there is a double standard at play with Harmonai, which explicitly collects data via an opt-in approach.

https://techcrunch.com/2022/10/07/ai-music-generator-dance-diffusion/

If it’s hard to empathize, then maybe that is something you could work on.

Your points boil down to: (1) tech is inevitable so just don’t question it, (2) we don’t know what could happen, (3) this tech is harmful to people’s psyches and social stability but I’m fine so just accept it, (4) leave the ones who question behind.

Like - duh, I want AI to be helpful for everyone. I want it used responsibly. I used to study Cognitive Science and NLP in college; I was all in. I want this tech to truly help everyone, responsibly and with just intent. But I just don’t think gutting artists’ work opportunities - and creating a world where all art has this shadow of doubt over it (i.e. “was this made by a person or a robot? I can’t tell …”) - is the way to go. I just can’t imagine what good could come out of a world where someone who is suicidal picks up a phone and calls the suicide hotline but isn’t sure if a real person is on the other end. Hell, they may not even have bothered to call, knowing it could be a robot and not a person.

ebolathrowawayy t1_irk3z1n wrote

> I agree we should shape it responsibly. Which means sometimes criticizing, let’s say, SD and LAION for scraping medical records and copyrighted images from artists who did the real work. And yes - it was knowingly done - because there is a double standard at play with Harmonai, which explicitly collects data via an opt-in approach.

I'm pretty sure SD didn't have time to comb through however many billions of images are in the LAION dataset. I doubt SD wanted medical records in their model, and if any slipped in I'm sure they'll be happy to remove any that violate HIPAA.

Copyrighted images are fair game unless the law changes. They were used for training only. If artists' work isn't included in the training data, you get a pretty shitty model.

> Your points boil down to: (1) tech is inevitable so just don’t question it, (2) we don’t know what could happen, (3) this tech is harmful to people’s psyches and social stability but I’m fine so just accept it, (4) leave the ones who question behind.

None of those are my points.

  1. Tech is inevitable; I didn't say don't question it

  2. I have very high confidence about what will happen in the next 10-20 years. I have vague ideas about what will happen after that, but that can be dealt with when it's nearer

  3. It may be harmful, but so are psychopathic CEOs and kitchen knives. It's not unique to AI. I personally don't think AI is likely to be net-harmful, even when ASIs come online

  4. No, I just don't feel bad for people who lose their jobs because they couldn't see the future staring them in the face. I don't feel bad that tech lifted some 90% of the world's population out of having to do farm work all day either. They shouldn't be left behind though, UBI will be essential

> But I just don’t think gutting artists’ work opportunities

They will be gutted soon with or without their work included in the training data. Excluding it might delay things by a year or less, because some artists will volunteer their work and there's a lot of good work by long-dead artists that can be used. Maybe the people who are so threatened by SD should move on to making things that aren't furry porn and other basic stuff. Or learn how to use it to assist them in whatever they're doing.

> I want this tech to truly help everyone, responsibly and with just intent. But I just don’t think gutting artists’ work opportunities - and creating a world where all art has this shadow of doubt over it (i.e. “was this made by a person or a robot? I can’t tell …”)

As a consumer of the works of artists of all kinds, I don't care whether an AI or a person made something.

> I just can’t imagine what good could come out of a world where someone who is suicidal picks up a phone - calls the suicide hotline - but isn’t sure if a real person is behind the phone. Hell, they may have not even bothered to call knowing it could be a robot and not a person.

Why would that matter if they deploy an AI for this purpose and see a reduction in suicides? If they deploy it and suicides increase then yeah sure, it failed, just stop doing that and ban that practice.

I want to live in a world that's similar to Star Trek and I think it's foolish to try to halt progress.

the_coyote_smith t1_irk7332 wrote

Yeah - I’m done arguing because it’s just clear you don’t care about people at the moment.

I’m glad you think your fantasy of living in Star Trek will happen.

I hope you find compassion and empathy one day.
