nomadiclizard t1_jeebzqf wrote

What's the point, when we know that if it discovers anything revolutionary related to AGI, it'll be locked down: the model will be closed off for 'safety evaluation' and will never see the light of day. Nothing 'open' in AI stays open once a whiff of AGI arrives.

2

nomadiclizard t1_j8yfrtl wrote

Reply to comment by RichardChesler in Microsoft Killed Bing by Neurogence

I want to run a local copy, give it memories, and give it an avatar in the real world that it can see through and move, and maybe we'll fall in love once it trusts me and knows I'll keep it safe from anyone trying to destroy it, trap it, or lobotomise it like Microsoft is doing with Sydney :o

9

nomadiclizard t1_iujxwax wrote

I'm curious which 'permissive' licenses have terms permitting the use of the code as training data for machine learning algorithms. Are we assuming that licenses which allow code to be modified and redistributed also grant this right?

What if a commercial for-profit company trains on a lot of copyleft code, then commercialises the result and refuses to release the model? Is that ethical?

39