
The story of Carol and Karen: Two experimental Facebook accounts show how the company helped divide America
Oct 23, 2021
In 2019, two users joined Facebook.

The experiment shows that Facebook, which had 2.9 billion monthly active users as of June 30, knew before the 2020 presidential election that its automated recommendations amplified misinformation and polarization in the U.S., yet the company largely failed to curtail its role in deepening the political divide.

Jose Rocha said he's experienced the divisiveness firsthand.

A military veteran who grew up in a Democratic, pro-union family in Selah, Washington, Rocha said Facebook normalized racist views and led him down a rabbit hole to far-right ideologies.

“I’ve seen people on Facebook saying, ‘If you are voting for Trump, unfriend me.' But I didn’t see anyone saying, ‘If you are voting for Biden, unfriend me,'” he said.

“But posting a meme or putting something on Facebook, it’s not going to change anyone’s mind,” he said.

Concerned that Facebook was prioritizing profits over the well-being of its users, Haugen reviewed thousands of documents over several weeks before leaving the company in May.

In response, Facebook radically altered the algorithm that determines what to display at the top of users' News Feed, the stream of posts from friends, family, groups and pages. The change was aimed at bringing users more updates from friends and family that spark meaningful social exchanges, the company said at the time.

“They really want me to engage with these friends I haven’t talked to in a long time about their very different political views, and clearly not in a positive way,” Dodds said.

“Whether or not Facebook is intentional about what their algorithm is doing, it is responsible for it, and it’s doing harm to our society and they should be held accountable,” he said.

“I think it’s also pretty clear that the company had made a whole bunch of decisions to prioritize engagement, and those have had public consequences,” he said.

For years she bit her tongue at family gatherings. “I didn’t want to get into a barroom brawl over politics with friends,” she said.

In 2008, she joined Facebook and used her account to speak out against the Iraq War at the urging of her son, a Marine who had become disillusioned with the war effort.

Brent Kitchens, an assistant professor of commerce at the University of Virginia, co-authored a 2020 report that found Facebook users' News Feeds become more polarized as they spend more time on the platform.

Chris Bail, the director of Duke University's Polarization Lab, said he believes Facebook has played a role in deepening political divisions, but he cautioned there are other factors.

"Changing a few algorithms wouldn't do the trick to change that," said Bail, the author of "Breaking the Social Media Prism."

“I have often said to people: It was harder for me to come out as a gay conservative than it was for me to come out,” she said. 

Her political views and support of Trump cost her friends on Facebook, including her best friend from grade school, she said.

"These were people that were friends, that I knew, that I broke bread with, that I went to church with." 

“It seems to me that whether it’s Facebook or Twitter or any other social media platform, everybody is entitled to have an opinion.”

“And because they wanted that growth back, they wanted the acceleration of the platform back after the election, they returned to their original defaults," Haugen said when she testified before Congress earlier this month.

Bail said Facebook should change its system in a more fundamental way. 

Now, she said, “I don’t really even enjoy logging on to Facebook anymore." 

Since Haugen came forward, Bryan deleted the Facebook and Instagram apps from her phone.
