Submitted by retepretepretep t3_102zkm0 in Futurology

With everything that’s happening in the social media space right now — Elon Musk’s purchase of Twitter and the layoffs that followed, Meta’s mass layoffs and $36 billion gamble on the metaverse that has yet to yield results, and the earlier collective negative reaction to Instagram’s new algorithm — it’s easy to say that the once darling of the internet has lost its luster. Some might even say it’s already dead, or at least on the way there.

But if we look at similar events in the social media timeline — the rise and fall of Friendster, MySpace, Google+, Tumblr, Vine and many others that walked the same path — the space and all those in it are probably just changing. But what is it changing into? For some pundits, it’s transforming into what they call “recommendation media.”

Here, the main mode of content distribution is no longer users’ networks or social graphs. Instead, content is shared through centralised algorithms designed to attract the most attention, provoke the most emotion, and produce the most engagement. It’s a battle over what’s the best thing to watch, read, or listen to. And the winner gets all the views.
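In rough pseudocode terms, the shift is from "show me what the people I follow posted" to "show me whatever a model predicts I'll engage with." A minimal, purely illustrative sketch of that ranking step, with invented post fields and made-up weights (not any platform's actual scoring):

```python
# A toy sketch of engagement-based ranking: invented post fields and
# made-up weights, not any platform's actual scoring function.

posts = [
    {"author": "friend_alice",  "predicted_watch_time": 12, "predicted_shares": 0.1},
    {"author": "stranger_bob",  "predicted_watch_time": 95, "predicted_shares": 4.2},
    {"author": "brand_candyco", "predicted_watch_time": 40, "predicted_shares": 1.5},
]

def engagement_score(post):
    # Weighted blend of engagement predictions (the weights are illustrative).
    return post["predicted_watch_time"] + 10 * post["predicted_shares"]

# The feed is whatever scores highest, regardless of who you follow.
for post in sorted(posts, key=engagement_score, reverse=True):
    print(post["author"], round(engagement_score(post), 1))
```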

768

Comments


Harvey_Rabbit t1_j2wxegi wrote

Can I pay to have my algorithm maximize my happiness instead of someone else's profit? I'd legit pay for a news feed that showed me all the things I'd be interested in without feeding into all the things that are bad for my mental health.

453

12beatkick t1_j2xep8k wrote

The issue at hand is defining what “interested in” is. Algorithms have defined this as clicks/engagement. By all measures it is delivering. “If it bleeds, it leads” remains true in the current age because by and large that is what people are interested in.

62

Harvey_Rabbit t1_j2xgcj0 wrote

Right, it's in their interest to maximize my engagement. I want to subscribe to an algorithm that maximizes for making me a happier, more well rounded person, who maybe isn't on my phone so much.

38

mdjank t1_j2xwigt wrote

You don't need to pay for that. You just need to delete your social media accounts.

30

Harvey_Rabbit t1_j2xxv7s wrote

I guess I'm just wondering if this power can be used for good instead of evil. Like, if I'm trying to lose weight, make it show me things that make eating healthy and working out look fun and eating crap look bad. The way it is now, they might identify that I engage with pictures of candy or fast food, so they sell ads to crap food companies to get me to buy their crap food. Or, how about we incorporate smart watches into it, so they can measure my vitals? Say I want to get to bed at 10: I want my algorithm to start showing me soothing things at like 9:30. Over time, it might learn the kind of things that keep me up at night and avoid them.

15

mdjank t1_j2y0ffi wrote

Gamification of self-improvement activities is its own industry. You can go buy a piece of software that already does that.

All social media can do is share your progress or lack thereof.

Think of it this way. You're not going to stop alcoholics by putting a salad bar in a tavern and charging people to eat their salad.

9

Still_Study_6059 t1_j31cdei wrote

That wasn't entirely what he was saying though. Obese people usually come from obese environments, and so it is with other stuff. I need to read up on the science again, but commercials doing their best to make eating shit look great is definitely a thing.

What if you could pay to simply avoid that? In the Netherlands we've opened up the gambling market, and with that came a flurry of gambling advertisements. And lo and behold, we suddenly have a lot more people addicted to gambling, so now the ads are getting axed by 2025 or something.

Food works through the same mechanisms, as did smoking.

N=1 here, but if I watch the food channel or things like our version of the Great British Bake Off, I find myself craving "comfort food" (diabetes on a fork). I've rid my life completely of that and have found a couple of communities that actually are about that healthy lifestyle, and am doing much better now. From 110 to 80 kg at 1.87 m. Avoiding exposure has made that process infinitely easier for me.

Now imagine you could tailor social media to do that for you. Maybe you already can, btw, simply by using the current algorithms to look for health food etc.

1

mdjank t1_j32oxg0 wrote

I already explained how algorithms work in this post.

https://www.reddit.com/r/Futurology/comments/102zkm0/they_say_were_past_social_media_and_are_now_in/j2ycwdp?utm_medium=android_app&utm_source=share&context=3

Tailoring your own social media to work for you is possible. It would require disciplined responses directed by unbiased self-analysis. In other words, it's not bloody likely.

Then there's the question of limiting the dataset in your feed. You do not have direct control over the data in your feed. You can only control which people can publish to your feed.

You can cut people out of your feed for some level of success. The more people you cut, the less it is a "tool to keep you connected". It stops being social media.

The only sure way to keep from seeing material on social media is to not look at social media. You remove the drunk from the tavern. Change your environment by removing yourself from it.

2

Perfect-Rabbit5554 t1_j2yqbu1 wrote

In theory it's possible, but the incentives aren't there, and when the technology is developed, it'll come after the initial mind-flaying phase we're seeing now.

2

D2G23 t1_j2zw9rx wrote

I liked one kettlebell video, once, in 2018, and Instagram has shown me kb workouts every 5th video since. To be fair, I now swing a lot of bells, so…

2

0james0 t1_j2xs04d wrote

It shows you things you engage with. You essentially have to retrain it now.

If you see a video that you might usually watch, but that's going to be a negative for your brain, quit the app. Doing that is the ultimate flag for the algo, as the last thing it wants you to do is leave.

Then, after watching only positive videos, you'll end up with a feed full of only that.

6

collin-h t1_j2yhs0v wrote

at least on TikTok I 100% quickly skip past videos that I know might be interesting, because I just don't want to see more of them for the next hour.

a good algorithm is like a bonsai tree... you need to perpetually prune it into shape.

8

collin-h t1_j2yhlyh wrote

sure, just like I'm interested in what's going on with that terrible car wreck over there.... but that doesn't mean rubbernecking is my passion. c'mon algorithms.

3

futurespacecadet t1_j2xsz8g wrote

If a social media website made a happiness algorithm, people would gladly pay for it rather than know the site relies on making profit from advertisers.

6

mdjank t1_j2xze5c wrote

There are major problems with a happiness algorithm.

First, how do you measure a person's level of happiness? The person's emotional state is not a metric in the system.

An algorithm can decide if a piece of media is uplifting but it cannot say if that media would produce the desired effect on an individual. It can only predict the media's effect on a group of individuals.

You can ask individuals about their mental state and measure changes after presenting stimuli. That introduces all the problems of self reporting. e.g. People lie.

Second, a solution to happiness already exists. It's called "delete your social media". Any "happiness algorithm" has to compete with this as a solution.

"Delete your social media" is such an effective solution that Social Media will lie to you to make it seem incomprehensible. It tells you "social media is the only way to be connected with others" and "you're 'in the know' because you use social media and that makes you special".

4

futurespacecadet t1_j2y0sme wrote

Well, I don’t think there is some magic happiness algorithm; I’m just talking about the concept of it. What would create happiness in an algorithm form? I think control.

I think control over what people see is pivotal to how they interact, and I think we need to give back control to the users.

So maybe when you sign up, you can choose what you want to see. If you do want politics, maybe you can choose the level of politics you see. Do you want to be challenged, or do you want to be in a bubble? I mean, that in itself could cause problems.

But I also think we don’t need any of that. What people really liked was the fact that Facebook used to just be about connecting with your friends, purely a communication tool, before it was bloated with wanting to be everything else, like a marketplace and an advertisement center and pages for clubs, etc.

It’s the same thing that’s happening with LinkedIn. It used to be effective as just a job search tool, and now it is bloated with politics I don’t care about. I would rather have more services that do one specific thing than one service that tries to do it all, and I think that’s where people are getting overwhelmed and depressed.

2

mdjank t1_j2ycwdp wrote

The way statistical learners (algorithms) work is by using a labeled dataset of features to determine the probability a new entry should be labeled as 'these' or 'those'. You then tell it if it is correct or not. The weights of the features used in its determination are then adjusted and the new entry is added to the dataset.

The points you have control over are the labels used, the defined features and decision validation. The algorithm interprets these things by abstraction. No one has any direct control on how the algorithm correlates features and labels. We can only predict the probabilities of how the algorithm might interpret things.

In the end, the correlations drawn by the algorithm are boolean. 100% one and none of the other. All nuance is thrown out. It will determine which label applies most and that will become 'true'. If you are depressed, it will determine the most depressed you. If you are angry, it will determine the most angry you.

You can try to adjust feature and label granularity for a semblance of nuance. This only changes the time needed to determine 'true'. In the end, all nuance will still be lost and you'll be left with a single 'true'.

People already have the tools to control how their algorithms work. They just don't understand how the algorithms work so they misuse the tools they have.

Think about "Inside Out" by Pixar. You can try to make happy all the time but at some point you get happy and sad. The algorithm cannot make that distinction. It's either happy or sad.

1

Pseudonymico t1_j341oz6 wrote

> Second, a solution to happiness already exists. It's called "delete your social media". Any "happiness algorithm" has to compete with this as a solution.

>"Delete your social media" is such an effective solution that Social Media will lie to you to make it seem incomprehensible.

That really depends on who you are though. Social media really is a huge game changer for people who can’t get out of the house for whatever reason - in particular people with disabilities, parents of young children, and the elderly, along with other marginalised groups who can have trouble connecting in person such as the queer community. “Deleting your social media” for these kinds of groups means going back to being an isolated shut-in, and that’s part of why a lot just won’t do it. Regulating algorithms is a better solution by far imo.

0

mdjank t1_j34gvh4 wrote

Social media makes it easier for people to find their communities specifically because of the way statistical learners (algorithms) work. Statistical learners work by using statistics to predict the probabilities of specific outcomes. They match like with like. Regulating the functionality of statistical learners would require the invention of new math that supersedes everything we know about statistics.
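"Matching like with like" at its simplest is just a similarity score over interest vectors. A toy sketch, with invented users and plain cosine similarity (real systems are bigger, not different in kind):

```python
# A toy sketch of "matching like with like": users as interest vectors,
# similarity as plain cosine similarity. All data invented.

import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

# Interest vectors: [kettlebells, baking, gambling]
users = {
    "you":  [0.9, 0.1, 0.0],
    "anna": [0.8, 0.2, 0.0],
    "bram": [0.0, 0.1, 0.9],
}

# Surface the communities of whoever looks most like you.
matches = sorted(
    ((cosine(users["you"], vec), name) for name, vec in users.items() if name != "you"),
    reverse=True,
)
print(matches)  # anna ranks first: like is matched with like
```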

Regulation is easier said than done.

It is not possible to regulate how the algorithms work. That would be like trying to regulate the entropy of a thermodynamic system. John Nash won a Nobel Prize for his work on equilibria. Statistical learners solve Nash equilibria the hard way.

One thing people suggest is manipulating the algorithm's inputs. This only changes the time it takes to reach the same conclusions. The system will still decay into equilibrium.

Maybe it's possible to regulate how and where algorithms are implemented. Even then, you're still only changing the time it takes to solve the Nash Equilibrium. I would love to see someone disprove this claim. Disproving that claim would mean the invention of new math that can be used to break statistics. I would be in Vegas before the next sunrise with that math on my belt.

Any effective regulation on the implementation of statistical learners would be indistinguishable from people just deleting their social media. Without the Statistical Learners to help people more effectively sort themselves into communities, there is no social media. These algorithms are what defines social media.

To claim that people wouldn't be able to find their communities without social media is naive at best. People were finding their communities online long before social media used statistical learners to make it easier. If anything, social media was so effective that other methods could not compete. It has been around so long; it just seems like the only solution.

P.S. Your thinly veiled argumentum ad passiones isn't without effect. Still, logos doesn't care about your pathos.

0

Pseudonymico t1_j34hg0r wrote

> P.S. Your thinly veiled argumentum ad passiones isn't without effect. Still, logos doesn't care about your pathos.

Good grief, are we back in Plato’s Academy or something?

0

mdjank t1_j35exxw wrote

Going back to school might do you some good.

0

Pseudonymico t1_j35gr7p wrote

Argumentum ad latinum ≠ argumentum ad verecundiam

0

mdjank t1_j35n8pt wrote

Which begs the question: why do you think appealing to the needs of the downtrodden and infirm is a valid argument for not deleting your social media?

Or maybe you're mistaking a reference to a specific field of mathematics for an appeal to authority?

0

Gloriathewitch t1_j2yh213 wrote

This could also be bad for you, not necessarily for your mental health, but for people's world views becoming even more insular and echo-chamber-like.

The reality of life, unfortunately, is that a well-rounded person should have at least some experience with things they don't want to see, so that we can all have a good sense of empathy for how others live and experience life, and how they think.

There's a good reason people who travel have more empathy for different cultures and are usually less racist, and why people who are told their country is the best and never experience other cultures are less empathetic and often more racist.

I'm really wary of echo chambers like a lot of people are in nowadays because they radicalize people a lot.

6

Pistolf t1_j2yji1l wrote

Make an RSS feed, you don’t need to pay for that
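For example, a do-it-yourself chronological feed is only a few lines, assuming the third-party feedparser package and placeholder feed URLs:

```python
# A do-it-yourself feed: just the sources you chose, newest first.
# Assumes the third-party feedparser package (pip install feedparser)
# and placeholder feed URLs.

import time
import feedparser

FEEDS = [
    "https://example.com/healthy-cooking.rss",
    "https://example.com/kettlebell-blog.rss",
]

entries = []
for url in FEEDS:
    entries.extend(feedparser.parse(url).entries)

# Sort by published date when the feed provides one; no engagement model.
entries.sort(
    key=lambda e: time.mktime(e.published_parsed) if e.get("published_parsed") else 0.0,
    reverse=True,
)

for entry in entries[:20]:
    print(entry.get("title", "(untitled)"), "-", entry.get("link", ""))
```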

4

dogonix t1_j2y390h wrote

>Harvey_Rabbit

That's definitely part of the solution to the dilemma. It's necessary but not sufficient.

For an algorithmic recommendation engine to truly serve the interests of consumers, it has to not only be paid for directly by the end users but also:

1/ Be unbundled from the platforms.

2/ Be run locally by users instead of by a central organization.

The tech may not be ready yet, but it's a potential path out of all the manipulation currently going on (rough sketch below).
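A toy sketch of what point 2/ could look like in practice, with a hypothetical user-edited interest list doing the ranking on the user's own machine (the platform only supplies the items):

```python
# A toy sketch of point 2/: the ranking runs on the user's machine against
# a user-edited interest list. Keyword weights and items are hypothetical.

MY_INTERESTS = {"strength training": 2.0, "gardening": 1.5, "gambling": -5.0}

def my_score(title):
    t = title.lower()
    return sum(weight for keyword, weight in MY_INTERESTS.items() if keyword in t)

incoming = [
    "Gambling apps spend record sums on ads",
    "Beginner strength training mistakes",
    "Gardening in small apartments",
]

# The platform supplies the items; the goals of the ranking stay local.
for title in sorted(incoming, key=my_score, reverse=True):
    print(round(my_score(title), 1), title)
```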

3

krectus t1_j3091w8 wrote

I got bad news for you: that's what it already does. It feeds you what you want and you engage with it; that IS how it maximizes profits.

1

Chaos_Ribbon t1_j30on7k wrote

That's exactly what TikTok does. The problem is there's no such thing as being perfectly fed what you want without it impacting your mental health in a negative way. It can quickly lead to confirmation biases and prevent you from growing as a person when all you're receiving is information you want. It's also extremely addictive.

1

Bierbart12 t1_j2w3ar2 wrote

Pretty accurate to how some bots distribute media on Reddit

73

ClammyHandedFreak t1_j2x7hdv wrote

The amount of bot posts and replies is frightening these days on here. I’ve noticed it this week more than ever.

7

faithOver t1_j2x7w0d wrote

Right? The amount of accounts that are 2-3 days old. No karma. It's weird. But very noticeable.

5

Kyuckaynebrayn t1_j2xcu7e wrote

Joe Rogan and Jordan Peterson stan bots are out in full force lately. Media has generally been a right-wing disinformation tool for a few years now.

6

MpVpRb t1_j2xo6vj wrote

I love recommendation robots when they work in my interest. I was easily able to train the Spotify robot to suggest new music for me with a 100% success rate. YouTube works a bit worse; it suggests stuff I want to see about 50% of the time, and I haven't found the magic way to train it to be better. FB is the worst, sending me loads of crap despite my efforts to train it.

Recommendation robots would be better if they were more easily and reliably trained by the users

25

restivepluto397 t1_j301v8v wrote

TikTok is super trainable as well, but you have to be really deliberate about it. Even a couple seconds too long on one video could poison your recs.

9

pretendtotime t1_j2z3t1q wrote

My Discover Weekly hardly ever gives me new music I'm interested in. How'd you train the algorithm to give you more personalized stuff?

6

kev_ng t1_j5wyfmz wrote

How do you train the spotify algo?

1

TheLianeonProject t1_j2w8m58 wrote

>"It’s a battle of what’s the best thing to watch, read, or listen to"

YouTube thumbnails, where every influencer had to spend 25 minutes taking selfies of themselves with furrowed brows or mock gasps, would make me think these algorithms aren't really curating the best content.

22

ArtSchnurple t1_j2whvx0 wrote

Yeah "best" is not the right word here. "Worst" might be more accurate, if anything. Unfortunately, the algorithm determines that things that are dumb, false, and inspire anger get the most engagement and therefore get pushed on people more. Even more unfortunately, bad actors (Russia, most successfully and expertly) have realized they can use that to spread disinformation and get people at each other's throats. It's bad news all around.

7

fail-deadly- t1_j2x3ql6 wrote

Stickiest content, the stuff you’re most likely to overconsume, ads and all.

2

aphasial t1_j2w906i wrote

I mean... This is not news. Recommendation engines have been driving things since at least 2012, and basically as far back as Facebook's developers realizing that there were ways to alter user behavior based on automatic filters and prioritization being applied to the News Feed (second only to the "Share" button in importance to the rise of the modern hellscape).

Frankly, I think this near-concern-trolling awareness is only occurring because suddenly the "wrong" people are in control of the algorithms and platforms. Conservatives have been complaining for a decade now about the ability of social media to direct attention towards and away from what its designers desire.

21

DragoonXNucleon t1_j2wsd64 wrote

This wasn't true in the beginning of social media. Think about Digg, Fark or the old Facebook. Facebook and Twitter used to be about who you followed and who you liked. That's the only content you saw. If you wanted someone's content, you had to opt in.

Thats dead.

Now all social media feeds you what it wants, rather than what you request. It's not new this year, but it is new in the last decade.

25

aphasial t1_j2xcvqq wrote

Did Fark have anything beyond Fark and Total Fark that would affect filtering via account subscribe/follow links? I seem to remember it being a straight weblog stream. Slashdot had a following->newsfeed feature for things like blog stories, but I don't recall that it ever really got a ton of use (or maybe that was just me).

If your social circle (especially friends-of-friends) is large enough, then IMO there isn't much of a difference. On any given day I might have 1200x400 different users and posts available to me, and the FB algorithm has to sort them somehow, even before getting to the "out of the blue" or paid microtargeting ads it wants to show. If someone wants to manipulate your worldview, they can do it using existing opt-ins with the amount of metadata they've got.

0

MonasticMuff42 t1_j2wg5kd wrote

The reason for conservatives - specifically American Republicans - to be so concerned is that their election strategy has relied on a corner of the media space to be hyper conservative. Now, since Big Tech goes overwhelmingly for the other party, the tables might have turned in this respect. It will be another ten or twenty years before the shift is felt more fully, but their willingness to go with an outsider like Donald Trump back in 2016 shows, at least to me, that they were cognizant of the problems in the propaganda strategy which has consistently won them elections since the 1980s.

−5

aphasial t1_j2xc8yy wrote

>It will be another ten or twenty years before the shift is felt more fully, but their willingness to go with an outsider like Donald Trump back in 2016 shows, at least to me, that they were cognizant of the problems in the propaganda strategy which has consistently won them elections since the 1980s.

Hardly. Much more a result of things like this:

https://www.theatlantic.com/technology/archive/2012/11/did-facebook-give-democrats-the-upper-hand/264937/

https://www.scientificamerican.com/article/facebook-experiment-found-to-boost-us-voter-turnout/

If someone at FB (thanks for admitting that "Big Tech goes overwhelmingly for the other party") wants to lean on the algorithms and features, it'd be easy to rationalize. Just look at some of the internal discussions from the Twitter team from two years ago about the NY Post (or... anything else) for an example.

2

MonasticMuff42 t1_j2xqe62 wrote

Just because I pointed it out doesn't mean I'm in favor of it, nor that I think the Republicans are wrong to point it out. But let's be real, neither of the two parties is a stronghold of integrity and courage. If something benefits one party - in this case, the massive bias of Big Tech towards the current status quo Democratic Party - it will be pointed out and critiqued and challenged by the other party - in this case, Republicans raising the issue as one of freedom of speech in the digital polis. And they're not wrong. It's just that if they were in the Democrats' shoes, they wouldn't care one bit because the bias would be in their direction and would help them win elections.

Personally, reddit is the only social media I use and I find it easy to discover, join, and participate in subs where more balanced content is posted and there are nuanced views among the participants. But I've been using the platform for a long time - a decade or so, probably. I don't go on r/politics, for example, because it's an echo chamber.

Ultimately it's up to people to be more discerning about the content they read and the communities they join, but platforms should be making it easier for them to do so, and not harder. As it is they are funneling people into the so-called echo chamber and exploiting it. And we can see the results in our society.

1

collin-h t1_j2yhce7 wrote

In the early days of the internet there was a certain charm about finding that one interesting little website after hours of following rabbit holes. Now everything is force fed right down your throat from the 3 or 4 social websites you use... I kinda miss the "holy shit check out this crazy site I found" days of the late 90s early 2000s. But such is the way of technology and convenience.

17

bgva t1_j2ytf3n wrote

I'd love to go back to the Internet of about 20 years ago, give or take (Youtube didn't come along until about 06, so I'd lump that in there too).

3

DickieGreenleaf84 t1_j2w35j6 wrote

I think we're on the cusp of heading back TO social media via multiple apps that keep you in touch with those you actually care about. No more old high school buddies and no more algorithms. Look at Discord as a great example of this. Or Mastodon (haven't tried it yet).

14

Wanshu-t2 t1_j2w4dfx wrote

I agree. I see a trend of people starting to go back to different apps for different purposes. And because of this trend of decentralisation, a new need for integrating identity across apps is emerging. One example is how a Twitter ID can be linked to other apps' profiles, and Mastodon users can connect their ID to other Fediverse apps such as Bookwyrm and Peertube.

6

Dagamoth t1_j2xcq1y wrote

Corporate media is the term I like. A well curated presentation that the corporations in your life want you to consume, absorb, and base decisions on.

Remember when “Native Advertising” was new and magazines started having “articles” that were just advertisements mimicking an actual article? Now they have just removed the little label on the edge that said it was a paid advertisement.

14

bgva t1_j2yu0tf wrote

As a photographer who was told Reels/TikTok is the next wave, coming up with video content to find potential clients - in addition to trying to find photography clients - feels like another job atop my bread and butter.

I actually enjoy doing videos, and have been brainstorming for new ideas. But making content to satisfy a seemingly arbitrary algorithm is a pain in the ass.

Never mind the barrage of pop-ups and prompts that feel the need to overexplain everything to me like I'm 5. Just give me the simpler Internet from my college years in the early-2000s (minus the pop-up ads).

7

RufusCranium t1_j2w3isl wrote

I'd say it's because it was infiltrated with politics, religion, advertising, and begging.

6

Cactus_TheThird t1_j2wbujg wrote

So just humans in general with their neuroses

7

RufusCranium t1_j2wi3ik wrote

Well, I mean, I like swapping jokes outside of those contexts, exchanging pleasantries, playing games, sharing cat pics, and other stuff friendlies do. Using it as a platform to change the world, or maybe your portion of it, I don't think was the original intent. But, like anything, money's at the heart of it, and whoever's enabling its many facets is the one who'll get the most practical use out of it.

3

defcon_penguin t1_j2w515b wrote

Is that news? The news feed from social networks has been algorithmic for some years already.

3

Mobiggz t1_j2zpick wrote

I’m observing tailored content that I believe is being delivered by AI, more than just the algorithm itself.

3

MrZwink t1_j2x3u1t wrote

So is this just trying to rebrand the tainted name "social media"?

2

Explicit_Tech t1_j2zxr7h wrote

So exploiting the human brain for profits. It's not doing our society any favors.

2

krectus t1_j309cm0 wrote

Yep, that seems about right. Social media is now much more about engaging people with content than engaging people with other people.

2

iuytrefdgh436yujhe2 t1_j33w6v8 wrote

Timelines have become 'feeds' and one byproduct of this is that it almost doesn't matter who you're following anymore, because your feed just ends up being a mashup of things the system thinks you'll engage with. There's a weird effect that occurs within this where it extends to friends, too. When I first started using IG, for instance, my friends and I posted a fair amount, and it was a nice platform because it was a good way for a friend group to easily and conveniently share group photos and the like, and there was a genuine sense of human, friend-scale connection happening. (Facebook was much the same way, early on) But that function is severely diminished in today's experience. The amount of random algorithm engagement stuff has the added effect of reducing my interest in my actual friends' posts. Meanwhile, there's a 'wag the dog' effect happening as well, where people who are trying to promote their art or music or whatever increasingly feel they need to shape their content a certain way to appease the algorithm. The end result is the entire experience feels less human, less friendly, and altogether kind of pointless.

And yet, despite it all, the dumb truth of it is the mindless scrolling and moderate amusement from it is still addictive just the same. My personal usage of social media has evolved away from posting and toward lurk/scrolling and along the way I feel like it's made me more distant from friends and human connection where it used to actually feel like it was a feasible proxy for it.

Weird shit.

2

hashn t1_j2x1qb5 wrote

The goal of the algorithms is just to match you up with the right content on social media. It's not one or the other.

1

SpitefulAnxiety t1_j2yrh88 wrote

We’ll have feed curators like in Neal Stephenson’s Fall.

1

New-Investigator2309 t1_j3077ct wrote

Welcome to 20 years ago. If you were born before then, your mind is likely already owned by an algorithm.

1

Balla7a t1_j308ctd wrote

The rise and fall of Google+? But that thing never had a rise

1

Icommentor t1_j30rdc2 wrote

What the world needs is a social network where tons of users are secretly AIs. AND THEY LEAD LIVES LESS INTERESTING THAN YOURS.

Whatever you do, whether it's making instant coffee, tying your shoelaces, or having the hiccups, is usually impressive to them. You feel good about yourself and go on with your day.

1

Unhappy-Chest2187 t1_j36trqr wrote

It’s not really what’s ‘best’ but what gets engagement, and what usually gets engagement is the junk of the online world.

1

mentalflux t1_j304ijf wrote

Honestly I kind of prefer it this way. Hear me out.

The original "social media" was destined to fail. People need real face-to-face contact, or failing that at least a video or phone call to meet their social needs. Facebook with its feeds of posts and comments, Twitter and even Instagram all fail to really satisfy the human need for meaningful communication. The old special interest forums (for things like car enthusiasts, DIYers, gardeners, etc.) even did it better, because at least you could count on having a reasonably meaningful exchange and then get out without it trying to cannibalize your whole social life.

I much prefer the new social media as it is, being content recommendation feeds. It's all about being entertained and/or being informed, and doing so efficiently with content that is likely to be worth watching for you. There's no longer any pretense that you're going to be enjoying a human connection through social media. It's straight content injected into the veins. And when we need to socialize, we can do it properly in person like we were meant to. This system just feels more honest and clean to me.

0

khamelean t1_j2x6k9k wrote

That’s what social media has always been…have you been under a rock for the past 15 years??

−2

darkjackcork t1_j2w780f wrote

I can't stand another muh Elon Musk changed everything.

The honeymoon being over has nothing to do with him. He is the same person he was before.

The quantity of fainting spells among journalists, so embarrassing.

−4