Submitted by HowMyDictates t3_yfztiq in technology
Comments
gerkletoss t1_iu75f9w wrote
To get neonazi comments taken down from reddit you basically have to write a 3 page essay explaining the problem in the appeal. Ask me how I know.
cassy-nerdburg t1_iu7f8lx wrote
How do you know?
TigBiddiesMacDaddy t1_iu7i8ld wrote
Reminder that in the cats subreddit there was an entire civil war because one of the notorious five mods (a group of mods who are head mods in almost EVERY subreddit) called out another person in their group for basically empowering a neo-Nazi in multiple subreddits.
CrypoFiend t1_iu9hxxz wrote
They just switch locations on their VPN and create a new account
Musicferret t1_iua3n2g wrote
See, that’s fine. Right now, nothing happens at all. After 3-4 times, it should be an auto ban. Make them recreate accounts every day or two.
Nostradamaus_2000 t1_iu66zkz wrote
100%, social media is a failure. Still wonder how fact checkers got their degree! Or the keyboard warriors. Then let's tell the same lie 10 different times. Certainly a platform of hate mongers. It takes time to verify any truths these days. Platforms should be held accountable for the hate they spread, including the lies.
RealisticCurrent2405 t1_iu7cq2y wrote
Until AI gets near-sentient, good luck. It's a volume thing. It wouldn't be profitable to go through every post from everyone's racist uncle.
Socko-The-Sock t1_iu74ldj wrote
Got a threatening DM on my personal Facebook back when I had it. I reported the message and got an auto-reply that was basically that Tyler, the Creator "turn off your computer" tweet. Same with a lot of Instagram comments and messages I've received.
ButtFuckingGermans t1_iu8r5gy wrote
Same thing happened to me on reddit last year. Some unhinged guy was DMing me saying he will rape my newborn. I reported it, reddit said they found no rule violations.
lotusflower64 t1_iu9jo6x wrote
Can you report it to the police?
[deleted] t1_iu9lqi1 wrote
[removed]
ButtFuckingGermans t1_iu9mm2e wrote
Dude messaged me a few days later saying he's sorry and that his father beats him.
lotusflower64 t1_iu9ndad wrote
Hmmmm…. DMs today, school 💀 the next.
[deleted] t1_iu9quxi wrote
[removed]
[deleted] t1_iu9axjq wrote
[removed]
Bully-Rook t1_iu7ngjg wrote
Facebook is a toxic cesspool. Left and never looked back. Not sure how the hell so many people stay there "for the marketplace".
CrypoFiend t1_iu9hrx2 wrote
That will change as sellers now have to KYC. Imagine getting an audit from the IRS because you didn't report the used crib you sold to a neighbor for $20.
BigfootSF68 t1_iu9k21f wrote
It is the mall that won't die. Just because it is there.
The kids who didn't get to use America Online get to be stuck on Facebook. Too bad they didn't get to use Myspace.
dontpet t1_iu63n7q wrote
Reddit seems to fall into a different camp from what I can tell. There are subs that are pretty hairy but they seem more like a ghetto than a haven for hatred.
unresolved_m t1_iu6472k wrote
Reddit seems to have some strange bots/shills. I remember getting a lot of negative comments while talking about cops and minorities going missing.
HotpieTargaryen t1_iu67ojx wrote
The moderator system encourages misinformation and hate in many subs and there is no real oversight.
cleuseau t1_iu68yxn wrote
This is BS. They ban subs all the time. This is Musk pushing a narrative that YoU cAnT CoNtRoL ThE InTeRnEtS.
Yes they fail sometimes but it gets filtered. It is easy to filter. Do your F***ING job and filter the hate and disinformation Musk.
HotpieTargaryen t1_iu6avxs wrote
The reddit mod system is passing the buck and not taking responsibility. I do believe you can control for misinformation and hate on the internet, but not by giving control to random groups of anonymous people who squatted on a topic first and happen to be slightly more clever at promoting toxicity than the true sewage subs that get banned. Twitter has no excuse.
grrrrreat t1_iu68vvf wrote
Still easy to game if you want to push an agenda
OcculusSniffed t1_iu6dess wrote
Reddit is primarily intended to be moderated by upvotes and down votes, which gives power to the users. Some subs are overrun but new ones spring up constantly and there's an interesting lifecycle there.
Facebook, Twitter, Instagram, and any site with an algorithm that moderates content seems to think that user interaction is a sign of worth. So any piece of wrong information that attracts a response, negative or positive, gets promoted. I see constant promoted ads for things that are blatantly incorrect, or "tips" that just won't work, and they are flooded with comments about how they are bullshit. But that just pushes them further up the rankings.
Reddit isn't perfect, but it's a shit ton better than that.
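A minimal sketch of the ranking behavior described above (a hypothetical scoring function, not any platform's actual algorithm): when a feed scores posts purely by total engagement, angry replies and debunking comments count the same as approval, so a contested or false post outranks an accurate but quiet one.

```python
# Minimal sketch of engagement-only ranking (hypothetical; not any platform's real algorithm).
# Every interaction counts toward the score, so a post flooded with "this is wrong" replies
# still outranks a quietly accurate one.

from dataclasses import dataclass

@dataclass
class Post:
    title: str
    likes: int
    shares: int
    comments: int  # includes angry/debunking replies

def engagement_score(post: Post) -> float:
    # Naive score: every interaction is treated as a sign of "worth".
    return post.likes + 2 * post.shares + 3 * post.comments

feed = [
    Post("Accurate but dull tip", likes=40, shares=2, comments=5),
    Post("Blatantly wrong 'life hack'", likes=15, shares=30, comments=400),
]

for post in sorted(feed, key=engagement_score, reverse=True):
    print(f"{engagement_score(post):>6.0f}  {post.title}")
# The wrong post wins, because outrage in the comments is indistinguishable from interest.
```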
snap-erection t1_iu6vug1 wrote
Reddit is designed more around classic forum websites, to me it feels more like the next iteration of that as opposed to going from one person's profile to the next. It's also more about the content itself and not about the user.
HotpieTargaryen t1_iu67gof wrote
Yeah, but it’s the unmonitored sub/moderator system that allows this garbage to continue. r/politics mods have rules in place to protect political troll accounts and ban anyone that calls them out and there is zero oversight. Reddit isn’t better, it just hides the ball better.
verasev t1_iu68c6k wrote
I just got banned from there because I told one of those trolls they were full of shit after they said folks in r/politics were all college-educated pussies who never do any real work like blue-collar GED graduates do.
DarkDeSantis t1_iu64hfl wrote
Half of reddit has been banned, both from a user and a sub standpoint; it falls into the exact same camp. Censorship directly leads to bigotry and hate. Without checks and balances we are all doomed to extremism.
dontpet t1_iu65otg wrote
I'm so for keeping disinformation off of social media. The devil is always in the details, but there is objective truth out there, and there are dishonest, malevolent actors, and we are all better off pushing disinformation back at various levels.
NaturalNines t1_iu66ocd wrote
Problem is that's such a subjective standard. And reddit moderators are known for being of their own ideological slant.
If "disinformation" is defined as "I disagree" or "I'm more skeptical because I disagree, thus I'll enforce a higher standard" is toxic and shuts down the conversation. This creates a bubble that helps to radicalize and further creates disinformation by only presenting limited information.
bluekittydaemon t1_iu79929 wrote
As long as we're talking about fact checking as a part of moderating and not "we have to let white supremacists talk for ten minutes bc you got ten minutes".
NaturalNines t1_iu8j1gy wrote
As long as we're talking about actual white supremacists and not "I disagree so you're a white supremacist".
Goes both ways.
smitbret t1_iu6acir wrote
I see you're getting downvotes for being open minded.
Wouldn't be Reddit if it was any other way
NaturalNines t1_iu6b3sr wrote
According to upvotes I'm an incredibly nice person on every different subreddit... until politics. Then I'm an irredeemable asshole because I disagree.
And these radicals think that's convincing.
Marsdor t1_iu6d539 wrote
It's why I don't even post on political subs, just a bunch of doomsayers with no idea how to help things out.
NaturalNines t1_iu6efee wrote
I have very recently learned that lesson. I tried asking questions, just simple questions. They respond with such hostility I eventually would give a "You're a nutjob" response and BAM. Banned permanently. I'd even request why, pointing out that it took constant aggression that they ignored for me to finally insult back. And I got another insult from a mod, calling me a child and asking me why the rules don't apply to me while ignoring the rest. I check out that sub later? Nothing but constant violations of what they banned me for.
Social media is not for disagreement or free thought anymore. The lunatics took control of the asylum for now.
Marsdor t1_iu6fg0g wrote
Makes me miss the early 2000s, when people could disagree and not be labeled some -ist or -phobe, when people could actually debate and change others' perspectives. Nowadays that seems a distant reality in our current social landscape.
NaturalNines t1_iu6g0dx wrote
I mean.... there were problems then. I'm not sure I appreciate having been able to stumble across execution videos and other such horrible shit when I was a teenager.
But debate? Different story, haha. Yes, definitely prefer being able to have discussions. Voice an opinion, wrong or right, it got corrected, we all called each other every name but didn't care, free speech is great.
Bunch of crazy disinformation! Like remember how everyone heard that Marilyn Manson removed a rib to be able to suck his own dick? Fucking crazy! Oh well.
Marsdor t1_iu6gcui wrote
Yeah, it was a little on the rough side, and I'm glad we can filter the extreme unwanted content out, but we're definitely missing something from back then that worked better than what we have now.
_ilovemen t1_iu81n56 wrote
It’s called being young lol. You’ve just gotten older and now notice way more stuff.
Practical_Law_7002 t1_iu661vc wrote
I'd argue it's actually different because of the upvote/downvote system and seeing comment history and such since we collectively as users can weed out trolls easier than say Facebook.
TheKrakIan t1_iu6ebzn wrote
Maybe with Musk taking over twitter and FB losing money social media will finally fall off the face of the earth.
Maybe Tom knew what was gonna happen and just decided to bail.
whiplash81 t1_iu6erwf wrote
TikTok enters the chat
TheKrakIan t1_iu6f2zb wrote
TikTok has always been one thing, a way for China to collect data.
Not that all the other SM channels don't already have that and do inappropriate things with it.
NM, let Tik Tok fall as well.
whiplash81 t1_iu6fkq8 wrote
Trust me I'd love to see a downfall to all these social media giants.
I just don't think it'll happen without another one rising up to take its place.
TheKrakIan t1_iu6fpfx wrote
Agreedo burrito
Panther_Modern_ t1_iu76c7y wrote
It’s a feature
banananailgun t1_iu6irmh wrote
Social media sites don't deal well with hate speech and disinformation because hate speech and disinformation make them lots of money. The truth is just plain boring to a lot of people.
Just one easy example: It's so much more exciting for a lot of people to believe that there is a massive, nationwide (potentially international) conspiracy to steal the election from Trump than to accept the boring, hard truth that he simply lost the election. And think of the content that comes with both expanding and refuting the lie. The 2020 election was a godsend to news and social media because they can keep getting advertising dollars for it without having to discover and market a new story.
snap-erection t1_iu6wmdt wrote
No, the 2016 election was the godsend; news and social media companies absolutely love Trump for the 24/7 insane news and discussion that comes every time the guy opens his bitch mouth. And people really think the media has a bias against him. You wish. If they were against him you wouldn't hear about him. Look at the coverage Bernie Sanders got in both 2016 and 2020. Very comparable to that of Ron Paul. When the media really doesn't want a candidate (or even a movement) they will straight up blacklist them. At best there's the occasional pity article that goes like "he's got a movement with some support from (insert most unrelatable people to blue-collar Americans) and his policies are nice but here's why it won't work," etc.
sup_ty t1_iu9r6nt wrote
They're just going to do what benefits them and gains them more made up power and money.
suarezMiranda t1_iu8gdmg wrote
You have things slightly backward. Whenever people say that they get money from hate speech and disinformation I always think of the underpants gnomes in South Park.
Step 1: Destroy civilization. Step 2: ??? Step 3: Profit.
It lacks logic, and becomes flat-out wrong when you consider that advertisers don't want their ads beside negative sentiment. All things being equal, if you had two social media networks with similar demographics, advertisers would prefer their ads to be on the one that makes people happier and feel better. This has nothing to do with morals and everything to do with profit and sentiment analysis in marketing.
The issue is that people are inherently incapable of freely sharing and consuming information responsibly. They do not check against information that confirms their biases. Echo chambers form on a scale that is not possible to police without a state-sized apparatus. They are betting that ML models will solve this, but I don’t think that will be possible in the foreseeable future. I’m not worried that social media giants are evil and want to destroy the world. What worries me is that they don’t want this, are actually spending colossal sums of money on it, and are failing.
When Facebook, Twitter, and YouTube were young they were actually credited with aiding occupy movements and the Arab Spring. One Egyptian activist talked about Facebook specifically as a media where they could spread hope, and that the generations they were fighting against didn’t have the understanding to fight back. Well they developed that understanding somewhere along the way. That is how these platforms are built. That is the inevitable function of their form unless the state claims control and places it under their own moderating apparatus. Whether Zuckerberg is a saint or a lizard has 0 to do with the outcome of this type of technology.
lotusflower64 t1_iu9kfjd wrote
True. Check out the NewsBreak app one day if you have the time. Chock full of all kinds of hate imaginable. They never take anything down.
CrypoFiend t1_iu9qjsf wrote
Governments could use blockchain technology for elections. Then within seconds any claim of rigged elections would be defeated.
banananailgun t1_iu9qxfn wrote
The people who believe the 2020 US election was rigged were fine with the outcome of the 2016 election. None of this had anything to do with election integrity, and everything to do with not liking the outcome of the election.
CrypoFiend t1_iua0fbu wrote
That has nothing to do with wanting to improve security and clarity and reduce misinformation.
Unclear why you downvoted me, but I will return the favor.
banananailgun t1_iua11j4 wrote
Because no amount of evidence will make the mob believe that elections in the West are fair. That's why I downvoted you.
We could try your utopian blockchain plan, and hard core Trumpers would still cry foul until he was president again, even if they had to cheat to get him there.
CrypoFiend t1_iuaeocw wrote
That may be true. But it would be the hardest evidence available. Every voter would have a cryptographic key when they register. You could see every individual vote, and see how candidates fared by age, sex, and location. It is basically indisputable.
I do agree that some people will never accept the truth no matter how many facts are presented.
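As a rough illustration of the idea (a minimal sketch only; real elections would also need voter registration, ballot secrecy, signatures, and coercion resistance, none of which are shown here), a hash-chained ledger makes any after-the-fact edit to a recorded vote detectable:

```python
# Minimal sketch of a tamper-evident vote ledger (illustration only; a real system
# would also need voter registration, signatures, and ballot secrecy).

import hashlib
import json

def block_hash(body: dict) -> str:
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def append_vote(chain: list, voter_id: str, choice: str) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"voter_id": voter_id, "choice": choice, "prev_hash": prev}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["prev_hash"] != prev or block["hash"] != block_hash(body):
            return False
    return True

chain: list = []
append_vote(chain, "voter-001", "candidate A")
append_vote(chain, "voter-002", "candidate B")
print(verify(chain))                  # True
chain[0]["choice"] = "candidate B"    # tamper with a recorded vote
print(verify(chain))                  # False -- the chain no longer verifies
```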
Inphexous t1_iu69f1k wrote
Their concern is only running ads and making money. There's nothing social about it.
SpotifyIsBroken t1_iu657qb wrote
Twitter is about to be one of the worst when it comes to this (it's already bad).
jackflour449 t1_iu6ah06 wrote
Imagining a public forum only calmly accepting and agreeing with the dominant narrative is an absurd thing to expect. I don't know of any other outcome than wackiness when dealing with the public. Work in public service for a week and you'll see what dealing with the public is like.
downonthesecond t1_iu67wv5 wrote
>Facebook, TikTok, Twitter, and YouTube are failing to curb the spread of right-wing extremism and disinformation on their platforms and must immediately implement safeguards with the pivotal U.S. midterm elections less than two weeks away, a watchdog warned Thursday.
Is it their job to do any of that? Seems like some sites would allow it if it means more users.
Why do people only care when midterms are coming up?
whiplash81 t1_iu6f1j2 wrote
Safeguards needed to be put in place much sooner than "two weeks before the 2022 midterms."
This has been an issue for over a decade now.
FoolHooligan t1_iu69u0u wrote
They waited until Musk took over Twitter to tell us this.
ShadowPooper t1_iu6jjus wrote
It's not their job to police disinformation. Quit joining the two together.
[deleted] t1_iu6kq1y wrote
[deleted]
strangefolk t1_iu7gtvo wrote
Even the terms 'dis' or 'mis' information are totally political, started by folks on the establishment left who control the language. When I see people talking about 'fact checking disinformation' all I see is people who want to control the narrative. Just look at how the CDC guidelines for COVID have changed. In some cases what's labeled 'misinformation' today is accepted truth in 6 months. Hell, I'm still banned from a sub for questioning the utility of masks and I'm just a fat guy nobody on reddit. That's how viciously and quickly the censorship trickled down.
You see the right taking on the same language game now, which is always the cue that a new vocabulary and new definitions will be developed by the leftist academic establishment.
amish_fortnite_gamer t1_iu7mjsn wrote
- Misinformation (1580-90)
- Disinformation (1965-70)
strangefolk t1_iu7n6ht wrote
I've seen this before and I'm sure it's true. Cute, but neither term was in common usage. The real reason it's used is because it's a more polite way to call someone a liar and/or an idiot.
The goal, as defined by how we've seen it used rather than what people say it's for (like in your link), is to shoot down opposing views as wrongthink, not simply clarify a misunderstanding. That's what I mean when I say these terms are politicized.
amish_fortnite_gamer t1_iu7so3v wrote
You clearly played hooky on the day that they covered Latin root words in your English class. You are free to reject the accepted definitions for words, but don't act surprised when people disunderstand you the same.
strangefolk t1_iu7t2qc wrote
I don't have anything more to add. Per my previous comment, your etymology is a manipulative shell game engineered to distract.
amish_fortnite_gamer t1_iu7tqqf wrote
Disunderstand (dis-un-der-stand), verb:
- Refusal to concede an idea. Unwillingness to acknowledge or attempt to understand a given concept, principle, act, or activity for fear that such understanding or acknowledgement is antithetical to one's own principles.
- To fail to comprehend or understand why something is the way it is, when it is obvious that the situation should be otherwise or the situation defies logic or common sense. Similar in meaning to misunderstand; however, it implies that the speaker blames the source, often a person or group of people, for intentionally causing confusion or simply being too lazy to clarify the situation.
Mr_Dr_Prof_Patrick t1_iu7o8zb wrote
It makes sense that guidelines would change as covid changed
strangefolk t1_iu7ow0f wrote
Maybe, but it was never expressed that way.
Those items that were 'dangerous misinformation' one day just suddenly weren't censored anymore. There was never an adult conversation about what was censored, why, when, and why that perspective will no longer be physically removed from the conversation from this point onward. Thinking people notice when you don't let them have a full conversation and then suddenly change the rules. And I don't regard that as a coincidence or an honest mistake.
Mr_Dr_Prof_Patrick t1_iu7q6lj wrote
Never expressed that way where? By who? I thought there was pretty widely available + straightforward information about new variants, how they work, how the observed transmissibility was different, and on the other hand people harping on a guidance change without bothering to check the rationale.
I think you're conflating a lot of different people talking about a lot of different things. The guidelines themselves and how they change, third party companies and the terms of service they choose to set, all the other people with opinions to express. Where / how / with who were you expecting to see this adult conversation about what was censored?
strangefolk t1_iu7rguq wrote
>Where / how / with who were you expecting to see this adult conversation about what was censored?
Social media platforms have never been clear about what warrants a ban, including when the CDC changed its recommendations. Aren't these conversations the job of the corporate media?
But the arbitrary and malicious enforcement really showed it to be a political cudgel. You have to be a real partisan hack to cry about hate speech, ban Jordan Peterson, but then keep ISIS and the Taliban online. I dunno, in my perfect world none of it would be censored at any time, so there is no 'good' way to do it anyway.
[deleted] t1_iu64m07 wrote
[removed]
REiiGN t1_iu67rhq wrote
LOL, they're there to sell ad spaces you fucking morons. IT'S TO MAKE MONEY.
[deleted] t1_iu68lxi wrote
[removed]
LumpyDefinition4 t1_iu6p5x0 wrote
Instagram has to be one of the worst. Doesn’t even investigate anything. Rampant antisemitism on there all the time.
snap-erection t1_iu6xsae wrote
I genuinely feel like the whole internet and now media is in some Eternal September mode now. And it's because the human brain only tolerates a certain amount of negativity (of certain types, within context) per day, and is overwhelmed when it keeps happening. Now most people, and I mean like on the order of 90% or more, are decent and innocuous. Some are edgy and some are real pathetic scum who seemingly only live to make others angry. Now with the internet though, especially with how engagement works, it's not that hard to find 1000 people in a population of hundreds of millions who are complete drop dead fucking assholes whose every expression is an insult to the human experience. But if you had to personally deal with 1000 complete cunts in one day, or every other day, you would feel completely overwhelmed right? That's what I find modern living is so full of. Just too many people, and too many negative experiences just based on the fact that a small percentage of a huge number is still a pretty big number for an individual.
Anyway that's why I can relate to why fighting this type of dirt on social media platforms is such an exhausting, losing battle. It's really hard to do properly, and it doesn't actually make money. Not directly anyway. The best way to fight it is with good education that doesn't produce this many shitheads. That or make the internet hard to use again so that idiots aren't on there anymore.
kaybeesee t1_iu8j28x wrote
It’s on purpose.
[deleted] t1_iu68w6o wrote
[removed]
[deleted] t1_iu6an3h wrote
[removed]
rhodopensis t1_iu6iy7p wrote
“Fail on”
This suggests that it’s not the point to spread both.
cryptkeepers_nutsack t1_iu6jf14 wrote
Shocking. I'm sure that's about to improve.
bakedtaino t1_iu6pytg wrote
Yet we let them take over, use their services, and allow business to divert our traffic to them. Hmm.
Free_Return_2358 t1_iu7p9v6 wrote
You don’t say.
Dfiggsmeister t1_iu9nwg0 wrote
It’s not a failure, it’s a feature. They deliberately pushed it through with their algorithms.
[deleted] t1_iu624cf wrote
[deleted]
tjstarlit t1_iu6wjje wrote
I could not get the article to load.. anybody have a summary?
[deleted] t1_iu790vq wrote
[removed]
JC2535 t1_iu7nwt3 wrote
The algorithms that drive traffic are weighted to prefer more incendiary emotional responses. Anger and hate drive clicks and shares, as people are determined to convert their friends to their own point of view. Nobody hate-clicks on a cute kitten.
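A hypothetical sketch of the weighting being described (the weights and reaction names are made up, not any platform's actual formula): if anger-type reactions count for more than ordinary likes because they predict replies and shares, the most incendiary post tops the feed.

```python
# Hypothetical reaction-weighted ranking (illustrative weights only).
# Anger reactions are weighted higher because they correlate with replies and shares,
# so incendiary posts float to the top; the cute kitten never stands a chance.

REACTION_WEIGHTS = {"like": 1.0, "love": 1.0, "angry": 4.0}

def rank_score(reactions: dict) -> float:
    return sum(REACTION_WEIGHTS.get(kind, 1.0) * count for kind, count in reactions.items())

posts = {
    "Cute kitten photo": {"like": 300, "love": 120, "angry": 1},
    "Inflammatory political claim": {"like": 50, "love": 5, "angry": 400},
}

for title, reactions in sorted(posts.items(), key=lambda kv: rank_score(kv[1]), reverse=True):
    print(f"{rank_score(reactions):>7.0f}  {title}")
```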
[deleted] t1_iu7rd61 wrote
[removed]
PressFforAlderaan t1_iu91eb5 wrote
Twitter’s about to get a lot worse.
lotusflower64 t1_iu9kn13 wrote
They’ve already taken away the downvote button.
oldastheriver t1_iu9vmw7 wrote
The hundreds of trillions of dollars that Wall Street is pouring into the Internet still haven't found anywhere useful to go. When these companies start to act as though they had customers, then things will change. But right now all they do is shit on top of you. And that's because they have a monopoly. There are very obvious solutions and alternatives, but for some godforsaken reason people always prefer the dominant paradigm rather than the alternative that actually fits their needs. Once again, sooner or later economic reality has to come into play and you will see those things disappear also. Right now the whole world is looking to make a quick buck without having to do any work, and it will fail. And by the way, I just left Twitter.
Maximum_Passion1865 t1_iu9wbw8 wrote
Evil vanquished.
CEO_of_paint t1_iu64uu8 wrote
Good. That's how speech works in the real world.
snap-erection t1_iu6wrnh wrote
I tell you right now, in the real world there's lines that people cross and people fucking drop these people when that happens. Just like on social media. "Real world" my ass, just don't be a bigot.
[deleted] t1_iu7h8wy wrote
[removed]
snap-erection t1_iu7krgx wrote
I thought that for some time, until it became clear that the bad faith actors out there are taking us all for a ride. Don't believe these conservatives who go on about freedom of speech, or any freedom really. They have been the ones to curtail people's freedoms and steal their money from day one. It's always a smokescreen. Just like how the civil war in the US (which they are still butthurt about) was about "states' rights" ... to have slaves. Hmm. Similarly now, serial con artists and disinformation-peddling freaks and abusers cry about freedom of speech. Do any of these people give one fuck about actual whistleblowers and journalists out there? Of course not. These are not the issues that are being raised.
[deleted] t1_iu7mtp6 wrote
[removed]
[deleted] t1_iu7jiib wrote
[removed]
snap-erection t1_iu7ku0e wrote
Shut up or I'll tell your mom you're on the computer past bedtime.
[deleted] t1_iu7l2vc wrote
[removed]
[deleted] t1_iu67lyd wrote
[deleted]
[deleted] t1_iu6czdf wrote
[removed]
SirArthurPT t1_iu6y5ci wrote
The problem is: sites like those are OK with lies, hate, and misinformation, as long as it's their side's lies, hate, and misinformation...
joshberry90 t1_iu6y5ot wrote
You don't win an ideological war by silencing the opposition. That's called authoritarianism.
AlexanderJablonowski t1_iu7z8x9 wrote
Totalitarianism is also fitting.
AlexanderJablonowski t1_iu7yxqr wrote
Disinformation/misinformation = whatever the opposite political side says, according to the totalitarian leftists.
[deleted] t1_iu7zlfj wrote
[deleted]
Atomicjuicer t1_iu8ikh0 wrote
The problem is partially due to the "like" or upvote button. There need to be two separate buttons.
One to show you agree with the sentiment, and the other to show you enjoyed the statement.
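A minimal sketch of the two-button idea (hypothetical data model and field names): "agree" records sentiment and "enjoyed" records whether the post was worth reading, and only the latter feeds ranking.

```python
# Hypothetical two-signal reaction model: "agree" captures sentiment,
# "enjoyed" captures whether the post was worth reading; only the latter affects ranking.

from dataclasses import dataclass, field

@dataclass
class Reactions:
    agree: int = 0
    enjoyed: int = 0

@dataclass
class Post:
    title: str
    reactions: Reactions = field(default_factory=Reactions)

def rank_key(post: Post) -> int:
    # Deliberately ignore "agree" so popularity of an opinion doesn't decide visibility.
    return post.reactions.enjoyed

posts = [
    Post("Well-argued post I disagree with", Reactions(agree=10, enjoyed=250)),
    Post("Low-effort post I agree with", Reactions(agree=400, enjoyed=30)),
]

for post in sorted(posts, key=rank_key, reverse=True):
    print(post.reactions.enjoyed, post.title)
```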
CrypoFiend t1_iu9sf72 wrote
I have been banned from many subs on reddit. Here are a couple of reasons.
1 - I jokingly said Greg Abbott's middle name was "Wheels". Banned for bullying.
2 - I told a crypto sub that I believed their coin was a scam and would fall victim to a "pump and dump", since VC investors got 60% of the coins at a 96% discount compared to the ICO.
3 - Banned from a car sales sub for adding the true cost of options on a vehicle, with links to the cost.
4 - Banned from a politics sub for saying cancel culture has gone too far and that we need less extremism.
5 - Banned for blocking a mod instead of responding to his gaslighting attempt. It was another car sales sub, and I told the poster he was asking a sub full of car salesmen whether he should buy a car.
So until there are statistics for moderator actions, mod abuse will continue.
trading-abe t1_iu9vepe wrote
So what is the yardstick? Notes from Mao?
mmarollo t1_iu9y8ms wrote
Is Biden claiming the vax stopped transmission disinformation / misinformation? Recall that Pfizer last week admitted before the EU that they didn't even test for that at all.
If Biden's statement (along with numerous other officials at the time) isn't misinformation then what is?
Why was I banned from several Reddits for linking to peer-reviewed studies that contradicted the prevailing narrative, but have since become fully accepted science?
Does truth / factuality play any role in these definitions of "misinformation"? Because it sure as hell feels like misinformation is anything that contradicts the prevailing progressive left orthodoxy.
UltraMagat t1_iu7a5kf wrote
Is that their job? Didn't think so. If they're editorializing, then they're a publisher.
Musicferret t1_iu6ia5i wrote
No kidding! You can flag it, and it is virtually NEVER taken down. There needs to be some type of cumulative count of infractions by accounts that do this constantly. On Reddit, mods see a log of all infractions in a particular subreddit. On Facebook, it appears that every one of these "less bad" offences like misinformation is somehow viewed in a bubble.
Seriously, once a person has spent months spreading hate and obviously false information, why aren't they banned permanently?
Actually, that's a rhetorical question. Answer: $$$$$$$
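A minimal sketch of the cumulative-infraction idea raised in this thread (hypothetical threshold and helper names; not how any platform actually works): every upheld report increments a site-wide per-account counter, and crossing the threshold triggers an automatic ban.

```python
# Hypothetical cumulative infraction ledger with an auto-ban threshold
# (made-up numbers; illustrative only).

from collections import defaultdict

AUTO_BAN_THRESHOLD = 4  # e.g. "after 3-4 upheld reports, it should be an auto ban"

infractions: dict = defaultdict(int)
banned: set = set()

def record_infraction(account: str, reason: str) -> None:
    # Count the upheld report against the account, site-wide rather than per-subreddit.
    infractions[account] += 1
    print(f"{account}: infraction #{infractions[account]} ({reason})")
    if infractions[account] >= AUTO_BAN_THRESHOLD and account not in banned:
        banned.add(account)
        print(f"{account}: auto-banned")

for reason in ["hate speech", "misinformation", "harassment", "misinformation"]:
    record_infraction("repeat_offender", reason)
```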