seaburno t1_j9mc2jz wrote

Should we, as the public, be paying for YouTube's private costs? It's my understanding that AI already does a lot of the categorization. It also isn't about being perfect, just good enough. It's my understanding that even with everything they do to keep YouTube free of porn, some still slips through, but it is taken down as soon as it is reported.

But the case isn't about categorizing the content; it's about how that content is promoted and monetized by YouTube/Google and their algorithms. And then there's the ultimate issue of the case: is algorithmic promotion of the complained-of content protected under 230, which was written to give safe harbor to companies that act in good faith to take down material violating their terms of service?


seaburno t1_j9m0rzo wrote

I probably would not hold search engines to the same standard, although with a better understanding of how search algorithms work, I could change my mind. Even if YT removed the ISIS videos at issue in the case heard yesterday from its algorithm, if someone simply searched "ISIS videos" and the videos came up, I think that still falls within 230's protection, because YT is merely hosting, not promoting, the videos.

Again, using the bookstore analogy, search is much more like telling the employee, "I'm looking for information about X," and being told it's in aisle 3, row 7, shelf 2. In that instance, it's just a location; what you do with that location is up to you. Just because you ask Yahoo! where the nearest car dealership and the nearest bar are doesn't mean Yahoo! is liable when you drive under the influence.

When you add in "promoted" search results, it gets stickier, because then they're selling the advertising. If you asked where the nearest car dealership is, and they gave you that information and then also sent you a coupon for 12 free drinks good only on the day you purchase a new (to you) vehicle, that's a different story, and they may be liable.
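To make the hosting-versus-promoting distinction concrete, here's a toy sketch. It is not YouTube's actual code; every name and the scoring rule are hypothetical. Search answers an explicit query (the "aisle 3, row 7" case), while promotion ranks unrequested items by predicted engagement:

```python
# Toy contrast between answering a query and promoting content.
# All names and the scoring rule are hypothetical illustrations.

def search(index: dict[str, list[str]], query: str) -> list[str]:
    """Hosting/search: return what the user explicitly asked for --
    the 'aisle 3, row 7' answer. Nothing is pushed unrequested."""
    return index.get(query.lower(), [])

def predicted_watch_time(video: dict, user_profile: dict) -> float:
    """Stand-in for a learned engagement model."""
    return video["base_score"] * user_profile.get(video["category"], 1.0)

def promote(videos: list[dict], user_profile: dict) -> list[dict]:
    """Promotion: rank videos the user never asked for by how long
    this particular user is predicted to keep watching them."""
    return sorted(videos,
                  key=lambda v: predicted_watch_time(v, user_profile),
                  reverse=True)
```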


seaburno t1_j9l34e6 wrote

The difference is that B&N makes its money on the sale of the book, not on the advertising at other locations in the store (or in the middle of the book).

YT isn't selling videos. They do not make money on the sale of a "product" to the end user; instead, they sell advertising. To increase that advertising revenue, they push content designed to increase time on site.

The YouTube/Google algorithm is like saying: "Oh, you're interested in a cup of coffee? Here, try some meth instead."


seaburno t1_j9ks54q wrote

It's not like a bookstore at all. First, Google/YouTube aren't being sued because of the content of the videos (which is protected under 230); they're being sued because they promoted radicalism (in this case, from ISIS) to susceptible users in order to sell advertising. They know those users are susceptible because of their search histories and other discrete data they hold. Instead of the bookstore analogy, it's more like a bar that keeps serving the drunk at the counter more and more alcohol, without even being asked, and then hands him his car keys to drive home.

The purpose of 230 is to allow providers to remove harmful/inappropriate content without facing liability, and to allow them to make good-faith mistakes in failing to remove such content, again without facing liability. What the platforms are now saying is that they can show anything without facing liability, and that it is appropriate to push harmful/inappropriate content to people they know are susceptible, in order to increase user engagement and, with it, advertising revenue.

The Google/YouTube algorithm actively pushes content it thinks the user should see, to keep the user engaged in order to sell advertising. Here, the Google/YouTube algorithm kept pushing more and more ISIS videos to the man who committed the terrorist attack.

What the Google/YouTube algorithm should be doing is saying "videos in categories X, Y and Z will not be promoted." Not remove them. Not censor them. Just not promote them via the algorithm.
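As a rough sketch of what that might look like (the category set and the engagement stub below are assumptions for illustration, not an actual YouTube policy or API), flagged videos stay hosted and searchable but are dropped from the recommendation pipeline before ranking:

```python
# Minimal "host but don't promote" sketch. The category names and
# the engagement stub are hypothetical, not an actual YouTube system.

DO_NOT_PROMOTE = {"X", "Y", "Z"}  # flagged categories from the comment above

def predicted_watch_time(video: dict, user_profile: dict) -> float:
    """Stand-in for a learned engagement model."""
    return video["base_score"] * user_profile.get(video["category"], 1.0)

def recommend(candidates: list[dict], user_profile: dict, k: int = 10) -> list[dict]:
    """Flagged videos remain hosted and searchable; they are simply
    excluded from algorithmic promotion before ranking."""
    promotable = [v for v in candidates if v["category"] not in DO_NOT_PROMOTE]
    ranked = sorted(promotable,
                    key=lambda v: predicted_watch_time(v, user_profile),
                    reverse=True)
    return ranked[:k]
```

The point of the design is that the filter sits only in the promotion path; search and hosting are untouched.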
