gullydowny t1_ja42mp7 wrote

More children are seeking to leverage the internet early for financial and social gain, so many meme account administrators are young. George Locke, 20, a college student who began running meme accounts at age 13, the youngest age at which Instagram permits a user to have an account, said he has never posted gore, but has seen many other young people turn to those methods.

“I’d say over 70 percent of meme accounts are [run by kids] under the age of 18,” he said. “Usually when you start a meme account, you’re in middle school, maybe a freshman in high school. That’s the main demographic for meme pages, those younger teens. It’s super easy to get into, especially with the culture right now where it’s the grind and clout culture. There’s YouTube tutorials on it.”

Meta says it puts warning screens and age restrictions on disturbing content. “I don’t think there’s a world where all [meme pages and their followers] are 18-year-olds,” Locke said.

Jackson Weimer, 24, a meme creator in New York, said he began to notice more graphic content on meme pages last year, when Instagram began to push Reels content heavily in his Instagram feed. At first, meme pages were posting sexually explicit videos, he said. Then the videos became darker.

“Originally, these pages would use sexual content to grow,” he said, “but they soon transitioned to use gore content to grow their accounts even quicker. These gore Reels have very high engagement, there’s a lot of people commenting.”

Comments on an Instagram video generate engagement. “People die on my page,” one user commented on a meme page’s video of a man and a woman simulating sex, posted in hopes of drawing viewers. Other comments below graphic videos promoted child porn groups on the messaging app Telegram.

In 2021, Weimer and 40 other meme creators reached out to the platform to complain about sexually explicit videos shared by meme pages, warning the platform that pages were posting increasingly violative content. “I am a little worried that some of your co-workers at Instagram aren’t fully grasping how huge and widespread of an issue this is,” Weimer said in an email to a representative from the company, which he shared with The Post.

Instagram declined to meet with the creators about their concerns. The content shared by many large pages has only become more graphic and violent. “If I opened Instagram right now and scrolled for five seconds, there’s a 50 percent chance I’ll see a gore post from a meme account,” Weimer said. “It’s beheadings, children getting run over by cars. Videos of the most terrible things on the internet are being used by Instagram accounts to grow an audience and monetize that audience.”

A Meta spokesperson said that, since 2021, the company has rolled out a suite of controls and safety features for sensitive content, including demoting posts that contain nudity and sexual themes.

The rise in gore on Instagram appears to be organized. In Telegram chats viewed by The Post, the administrators for large meme accounts traded explicit material and coordinated with advertisers seeking to run ads on the pages posting graphic content. “Buying ads from nature/gore pages only,” read a post from one advertiser. “Buying gore & model ads!!” said another post by a user with the name BUYING ADS (#1 buyer), adding a moneybag emoji.

In one Telegram group with 7,300 members, viewed by The Post, the administrators of Instagram meme pages with millions of followers shared violent videos with each other. “Five Sinola [Sinaloa] cartel sicarios [hired killers] are beheaded on camera,” one user posted, sharing the beheading video. “ … Follow the IG,” he added, with a link to his Instagram page.

Sam Betesh, an influencer marketing consultant, said that the primary way these sorts of meme accounts monetize is by selling sponsored posts to OnlyFans marketing agencies which act as middlemen between meme pages and OnlyFans models, who generate revenue by posting pornographic content behind a paywall to subscribers. An OnlyFans representative declined to comment but noted that these agencies are not directly affiliated with OnlyFans.

Meme accounts are fertile ground for this type of advertising because of their often young male audience. OnlyFans models’ advertising options are limited on the broader web because of the sexual nature of their services. The higher the meme page’s engagement rate is, the more the page can charge the OnlyFans agencies for ads.

“The only place you can put one dollar in and get three dollars out is Instagram meme accounts,” Betesh said. “These agencies are buying so many meme account promos they’re not doing due diligence on all the accounts.”

OnlyFans models whose images were promoted in advertisements on meme pages said they were unaware that ads with their image were being promoted alongside violent content. Nick Almonte, who runs an OnlyFans management company, said that he does not purchase ads from any accounts that post gore, but he has seen gore videos pop up in his Instagram feed.

“We’ve had [OnlyFans] girls come to us and say ‘Hey, these guys are doing these absurd things to advertise me, I don’t want to be involved with the type of people they’re associated with,’” Almonte said. “This happens on a weekly basis.”

Meme accounts are potentially raking in millions by posting the violence, said Liz Hagelthorn, a meme creator who formerly ran the largest meme network on Instagram, consisting of 127 pages and a collective 300 million followers. Hagelthorn said none of her pages ever posted violence. But young, often teenage, meme account administrators see gore as a way to cash in, she said.

“With gore, the more extreme the content is, is what the algorithm is optimizing for,” she said. “Overall what you see is when people hate the content or disagree with the content they’re spending 8 to 10 percent longer on the post and it’s performing 8 to 10 percent better.”

Some pages posting graphic violence are making over $2 million a year, she estimated. “The meme industry is an extension of the advertising and influencer industry,” she said, “and it is a very lucrative industry. If you have a million followers, you make at a base $3,000 to $5,000 per post. Bigger meme pages can make millions a year.”

“This is organized,” said Weimer. “It’s not two people posting gore videos, it’s hundreds of people in group chats coordinating posting and account growth.”

The administrators for several accounts posting gore appear to be young men, which Hagelthorn said is expected because most meme administrators are in their teens or early 20s. “These meme page audiences are 13- to 17-year-olds, so the people who run the page are young,” Hagelthorn said.

Roberts, the assistant professor at UCLA, said that she worries about the effect this content and ecosystem is having on young people’s notions of morality.

“It seems like we’re raising a generation of adolescent grifters who will grow up having a totally skewed relationship of how to be ethical and make a living at the same time,” she said. “This is not normal and it’s not okay for young people to be exposed to it, much less be profiting from it.”

19

gullydowny t1_ja4320b wrote

The most disturbing thing I’ve seen is clips of that band Greta Van Susteren, something about them creeps me out

2