How Facebook Broke Us
Mother Jones | March/April 2021
The social network has pushed poison on users for years. Now we know it did so deliberately.
By Monika Bauerlein and Clara Jeffery

Joyce Jones’ Facebook page is almost an archetype of what the social network is supposed to look like: Pictures of her kids, her kids’ friends, her sports teams, her kids’ friends’ sports teams. Videos of her husband’s sermons at New Mount Moriah Baptist Church. Memes celebrating achievement and solidarity, holiday greetings, public health messages. It’s what Mark Zuckerberg extols when he talks about how his company is all about “bringing people together.”

So when Jones decided to run for mayor in her Alabama town, it seemed obvious that she’d try to bring people together on Facebook. Her bid to be Montevallo’s first Black mayor, challenging a 12-year City Council incumbent, drew an enthusiastic, diverse crew of volunteers. They put up a campaign page, One Montevallo, and started posting cheery endorsements along with plugs for drive-in movies and recycling updates.

It was a historic moment for Montevallo, whose population (7,000) is two-thirds white and which sits in Shelby County, the infamous plaintiff in the Supreme Court case that gutted the Voting Rights Act in 2013. It was also a turning point for Jones, who grew up in the shotgun house her father had built on a dirt road far from the neighborhood where her grandmother cleaned houses. “My cousins and I would come with her,” the 45-year-old recalls. “We would do yardwork in the houses that she worked in. We never ever thought that living here was an option.”

“Now I’ve been living here for 17 years. We have a wonderful home. We have raised four wonderful children. And part of what I was being challenged with was: It’s not okay for me to make it out. I have to do something to make sure that other people have every opportunity. I ran because our kids needed to see you don’t have to be white and you don’t have to be a man to run for office in our town.”

But getting her campaign message out was tough. “We’re in a pandemic, so we couldn’t go to churches and meet people,” Jones says. Montevallo does not have a news outlet of its own, and the Shelby County Reporter, based in nearby Columbiana, has a single staff reporter for the 14 communities it covers. “For us, the fastest way to get news is through social media,” she says.

Jones is not quite sure how the rumors started, but she remembers how fast they spread. Facebook accounts popped up and shared posts to Montevallo community groups, implying she wanted to defund police (she does not). Someone made up a report of a burglary at her home, referencing her landlord’s name—to highlight that she was renting, she believes. Another account dredged up a bounced check she’d written for groceries as her family struggled during the 2008 recession.

“The algorithm, how fast the messages were shared and how quickly people saw them, that was just eye-opening to me,” Jones says. Her campaign would put up posts debunking the rumors, but the corrections were seen far fewer times than the attack posts. “It was so much more vitriolic, and it would get so many hits. It was just lightning fast.”

Soon, Jones noticed a chill around her. “I’d be going to the grocery store and people who would normally speak to you and be nice to you would avoid you. I’d go to a football game and people would avoid me. I was baffled by all that. It’s one thing to not know me, but it’s another to know me my whole life and treat me like the plague.”

One night her 16-year-old son, who had been hanging out at the park with a group of families he’d grown up with, called to ask her to pick him up. The adults had been talking about her, not realizing he was within earshot. When Jones came to get him, he told her, “For the first time, I felt like the Black kid.”

“What happens on Facebook doesn’t just stay on Facebook,” Jones says. “It comes off social media. You have to live with that.”

WHAT HAPPENED in Montevallo made far fewer headlines than the social media venom driving the Capitol riot on January 6. But it was part of the same toxic feedback loop that is breaking our elections, and that loop won’t end with the bans, on Donald Trump and others, that social platforms rolled out after the insurrection. The problem is not just—or even primarily—whether Trump or anyone else can post violent, racist, antidemocratic, and conspiratorial rants on these platforms. It’s that the platforms have actively pushed those rants into the feeds of millions, making lies viral while truth languishes unseen.

The technical term for this is algorithmic amplification, and it means pretty much what it says. Take Facebook: When you (or Mother Jones, or Trump) post something, no one else might see it unless they specifically navigate to your page. But Facebook’s algorithm analyzes each post, factoring in who you and your connections are, what you’ve looked at or shared before, and myriad other data points, and then it decides whether to show that post in someone else’s News Feed, the primary page you see when you log on. Think of it as a speed-reading robot that curates everything you see before you even open the app.
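The mechanism can be boiled down to a few lines of code. What follows is a deliberately simplified sketch, not Facebook’s actual ranking system, which is proprietary and vastly more complex; the signal names and weights here are hypothetical, chosen only to show how optimizing for predicted engagement surfaces the loudest posts first.

```python
def rank_feed(posts):
    """Order posts by a crude predicted-engagement score, highest first."""
    def score(post):
        return (
            1.0 * post["likes"]
            + 5.0 * post["comments"]   # replies weighted more heavily than likes
            + 10.0 * post["shares"]    # reshares weighted most of all
        )
    return sorted(posts, key=score, reverse=True)

# Three hypothetical posts from a local race like Montevallo's:
posts = [
    {"id": "debunk", "likes": 40, "comments": 2,  "shares": 1},
    {"id": "rumor",  "likes": 30, "comments": 25, "shares": 12},
    {"id": "sermon", "likes": 55, "comments": 5,  "shares": 2},
]

feed = rank_feed(posts)
print([p["id"] for p in feed])  # → ['rumor', 'sermon', 'debunk']
```

Even in this toy version, the inflammatory rumor, which provokes the most comments and reshares, lands at the top of the feed, while the correction sinks to the bottom—the dynamic Jones describes, where debunking posts were seen far fewer times than the attacks.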

The way social media companies tell it, their robots are benevolent, serving only your interests. You’ve clicked on your cousin’s recipe posts but not your friend’s fitness bragging? Here is more pasta and less Chloe Ting. You’ve shown an interest in Trump and also fanciful pottery? Here are some MAGA garden gnomes. The founding narrative of social media companies is that they merely provide a space for you, the user, to do and see what you want.

In reality, as the people who work at these companies know very well, technology reflects the biases of those who make it, but more than that, it reflects the bias of corporate imperatives. In Facebook’s case, those biases—chief among them, to grow faster than anyone else—have played out with especially high stakes, making the company one of the world’s most significant threats to democracy, human rights, and just plain decency.

Facebook has been proved to be a vehicle for election disinformation in many countries (see: Brexit, Trump, Duterte). It has been an organizing space and megaphone for violent extremism and genocidal hate (see: Kenosha, Myanmar, Sri Lanka). Its power is so far-reaching, it shapes elections in small-town Alabama and helps launch mobs into the Capitol. It reaches you whether or not you are on social media, because, as Jones says, what happens on Facebook doesn’t stay on Facebook.

With a new administration, a new Congress, and a growing recognition of this danger, one of the biggest battles of the next two years is likely to be over regulating social media. So far, the debate on this (thanks in part to Trump’s fury about his lies being asterisked and finally banned) has been about making companies liable for their content decisions—i.e., whether they should censor particular posts or users. But the far more important debate is about their algorithmic decisions, which shape our information universe more powerfully than any censor could.

As the scholar Renée DiResta of the Stanford Internet Observatory has put it, “Free speech is not the same as free reach.” The overwhelming evidence today is that free reach fuels disinformation, disinformation fuels violence, and social platforms are not only failing to stop it but fanning the flames.

BECAUSE ALGORITHMS are impersonal and invisible, it helps to picture them in terms of something more familiar, say, a teacher who grades on a curve. Nominally, the curve is unbiased—it’s just math! In practice, it reflects biases built into tests, the way we measure academic success, the history and environment kids bring to the test, and so forth.

That’s a given. But now suppose that one group of kids, call them yellow hats, figures out that the teacher rewards showy handwriting, and they bling up their cursive accordingly. They are rewarded and—because there can be only so many A’s in the class (or so many posts at the top of anyone’s Facebook feed)—the other kids’ grades suffer. The yellow hats have gamed the algorithm.
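The grading-curve analogy can be made concrete. The sketch below is hypothetical—the rubric, names, and numbers are invented for illustration—but it shows the general point: once any scoring rule is known, it can be gamed, and because the curve is zero-sum, the gamers’ gain is everyone else’s loss.

```python
def grade_on_curve(essays, top_n=2):
    """Give the top_n highest-scoring essays an A; everyone else a B."""
    def score(essay):
        # The teacher's (flawed) rubric rewards showy handwriting
        # three times as much as substance.
        return essay["substance"] + 3 * essay["flourish"]
    ranked = sorted(essays, key=score, reverse=True)
    return {e["name"]: ("A" if i < top_n else "B")
            for i, e in enumerate(ranked)}

essays = [
    {"name": "careful",    "substance": 10, "flourish": 1},
    {"name": "average",    "substance": 8,  "flourish": 1},
    {"name": "yellow_hat", "substance": 4,  "flourish": 5},  # blinged-up cursive
]

grades = grade_on_curve(essays)
print(grades)  # → {'yellow_hat': 'A', 'careful': 'A', 'average': 'B'}
```

Had the yellow hat written plainly, the average student would have earned the second A; by inflating the one signal the rubric overweights, the yellow hat claims it instead. Swap essays for posts and flourish for outrage, and this is the curve Facebook’s feed grades on.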
