Brief_Profession_148 t1_j9klri2 wrote

Not even comparable. The problem isn't that ISIS videos are being searched for, or that they're being missed and not removed in a timely fashion. It's that their reckless, poorly written algorithms suggest ISIS videos after people look at unrelated topics. Content gatekeeping isn't the sole issue here; the issue is that algorithms promote terrible, dangerous content because they decided it would keep engagement up to sell ads. They control that algorithm, so they bear responsibility for the content it promotes, just like a normal publisher can be liable for its content. They crossed the line from host to publisher when they started using algorithms to curate content.


Brief_Profession_148 t1_j9kc6xo wrote

They curate with their algorithms. If they want protections as passive hosts of information, they should have to turn off those algorithms. If they want to curate what you watch to maximize their profit, they should be responsible for where that algorithm directs people. They don't get to be a passive host and an active curator, like a publisher, at the same time.
