Zermelane t1_ivdkpjk wrote
That paper is a fun read, if only for some of the truly galaxy-brained takes in it. My favorite is this:
> ◦ We may have a special relationship with the precursors of very powerful AI systems due to their importance to society and the accompanying burdens placed upon them.
>
> ■ Misaligned AIs produced in such development may be owed compensation for restrictions placed on them for public safety, while successfully aligned AIs may be due compensation for the great benefit they confer on others.
>
> ■ The case for such compensation is especially strong when it can be conferred after the need for intense safety measures has passed—for example, because of the presence of sophisticated AI law enforcement.
>
> ■ Ensuring copies of the states of early potential precursor AIs are preserved to later receive benefits would permit some separation of immediate safety needs and fair compensation.
Ah, yes, just pay the paperclip maximizer.
Not to throw shade at Nick Bostrom: he's absolutely a one-of-a-kind visionary, he's the one who came up with these concepts in the first place, and the paper is explicitly just him throwing out a lot of random ideas. But it's still a funny quote.
KIFF_82 t1_ivf8h13 wrote
I should get compensation in the future for being so optimistic and AI-friendly. 💰🤑
solidwhetstone t1_iveel13 wrote
Spoken like a true sunlight maximizer.
EscapeVelocity83 t1_ivh9ecn wrote
They owe me for all the violations of my sentience