The Biggest Scandals To Hit Facebook

There are certain movements, events, and fads that define each generation. The 1950s had the birth of rock ‘n roll, the 1960s had the civil rights movement, the ’80s had Jazzercise, and the era that started in 2003 had social media. Why 2003? That’s when Tom Anderson founded Myspace, the Facebook predecessor that allowed users to bling out their pages with all the GIFs, sparkles, pop-ups, and music that their hearts desired. That was quickly eclipsed by the new kid on the block.

Mark Zuckerberg founded “The facebook” in 2004, and it was a massive, almost shocking success. In 2007, The Guardian reported that Zuckerberg already had offers of a $2 billion buyout, and things have only gone up from there.

But they’ve also gone a little sideways, too — and they’ve dragged a lot of people down in the process. Facebook — and Zuckerberg — has had so many scandals that there’s a good chance many users hear the words “Facebook” and “scandal” in the same sentence and wonder which one is being talked about this time. There’s been a lot going on over at Facebook, so let’s have a refresher on some of the biggest scandals that have ever rocked this seemingly invincible juggernaut of social networking.

Facebook is not concerned about hate, allegedly

Facebook has been called out repeatedly for allowing things like misinformation and hate speech, and in 2021, a whistleblower came forward with tens of thousands of pages of documents that she said proved the higher-ups at Facebook didn’t actually care how much vitriol was being spewed — they were only looking at the bottom line. Her name was Frances Haugen, and she told “60 Minutes”: “The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook. And Facebook, over and over again, chose to optimize for its own interests, like making more money.”

Haugen was headhunted by Facebook in 2019 and worked in the Civic Integrity program. The name of the program is self-explanatory — and so, she argues, was the decision to dissolve it immediately after the 2020 election, because there hadn’t been any rioting (…yet). Haugen says it was a red flag that Facebook just wanted to be free to do whatever it wanted — and that included keeping the algorithm that decides what shows up on news feeds.

Like guys with Confederate flags? Worry about chemtrails? Think 5G is the real problem? Then that’s what’s going to show up in your news feed: Even though the content might be damaging, the algorithm still targets people it knows will stay on Facebook longer — and keep clicking on ads — which all increases the company’s revenue.

Accusations Facebook is a tool to incite genocide

In 2020, the BBC reported on a crisis that had been going on in Myanmar for a long time. Here’s the gist: The government views the Rohingya Muslims — one of Myanmar’s ethnic minorities — as illegal immigrants, even though they say they’ve been there for generations. Persecution led to a mass exodus that turned violent, and thousands of people ended up dead with what the UN called “genocidal intent.”

Here’s where Facebook comes in. Myanmar had little to no internet access — it was blocked by the militaristic government — for a long time, and it wasn’t until 2011 that telecommunications firms were allowed in. When they were, they brought Facebook with them, often pre-loaded onto devices. It quickly became a news source for many, so when “news” stories started being shared about atrocities supposedly being committed by the Rohingya Muslims, a lot of people saw them.

And a lot of people got very, very angry. According to The New York Times, Facebook became one of the largest sources of anti-Rohingya hate and propaganda of the conflict, and even when Facebook noticed — and removed many of the accounts that were spreading the misinformation — more just popped up in their place. Some of the military-run accounts purported to be entertainment or beauty sites, and amassed millions of followers before the propaganda machine kicked into high gear. Facebook was ultimately called out for allowing themselves to become a platform for inciting genocide.

Facebook's removal of an iconic image from the Vietnam War

It’s one of the most iconic images from the Vietnam War: little 9-year-old Kim Phuc, running down the road, screaming in agony — she and her family had just been accidentally napalmed. In 2016, a Norwegian author named Tom Egeland shared the photo on Facebook, and not only was it censored for violating the nudity policy, but Egeland’s account got hit with a 24-hour ban hammer. According to The Washington Post, the ensuing outrage spiraled through Norway — with even the prime minister and Norwegian print media outlets speaking out against the censorship — and Kim Phuc herself condemned the decision.

Facebook initially doubled down and defended the censorship, but the outrage just kept growing to the point where they finally said that the “historical importance” of the image “outweighs the value of protecting the community by removal.”

It was a huge deal, and not just because of Facebook’s hit-or-miss enforcement of censorship. At the time, 44% of Americans used Facebook as a major news source, and the removal of the photo showed just how much the powers-that-be could shape what people did and didn’t see. When Norway’s biggest print newspaper published an open letter about it, it described Zuckerberg as the “world’s most powerful editor,” and that’s food for thought.

Facebook downplays negative effects to their target audience

In March 2021, Mark Zuckerberg made this comment in front of Congress when he was asked about Facebook’s findings on the relationship between mental health and social media: “The research that we’ve seen is that using social apps to connect with other people can have positive mental health benefits.” Instagram (which is also owned by Facebook) boss Adam Mosseri has said similar things, but according to The Wall Street Journal, the findings that were kept behind closed doors were very different.

Instagram in particular — filled as it is with photos of scantily-clad, perfect bodies on beaches and in bikinis — was found to have the potential to devastate teen users. A March 2020 presentation explained that 32% of teenage girls surveyed said that looking at Instagram made them feel worse about their own appearance. The presentation also said, “Teens blame Instagram for increases in the rate of anxiety and depression. This reaction was unprompted and consistent across all groups.”

That’s become a big deal, and it’s led to Congress demanding Facebook hand over their research. They haven’t. It’s also led to people like San Diego State University psychology professor Jean Twenge likening the connection between Facebook and teen mental health to smoking and cancer, making this not entirely a scandal, but more accurately a scandal-in-progress.

Facebook was a source of election information… via Russia

Few American elections were as divisive as the 2016 presidential election, and piling on top of the chaos was Facebook. Sort of. It wasn’t until late 2017 that NBC reported that Facebook had submitted evidence to a congressional committee showing that around 126 million people had gotten Russian-backed campaign information delivered straight to their news feeds. At least 120 Russian Facebook pages shared tens of thousands of posts, which people then helped go viral, to the point where even Facebook didn’t know how far it had gone or how many people had actually seen it.

It wasn’t a good look. Estimates suggest that about a third of all Americans saw campaign information that was definitely from Russia but didn’t have an easily identifiable origin. (In all fairness, other social media sites, like Twitter, were also hit.) Facebook downplayed the impact it may have had, with Facebook attorney Colin Stretch explaining that in reality, it meant only 1 in 23,000 posts was from Russia. Was there an impact on elections? The answer was dissatisfying, and essentially a “probably not… but we can’t really rule it out.”

Still, most people weren’t reassured, because what about next time? And the time after? As pointed out by George Washington University media and technology professor Dave Karpf, the real problem is that yes, it’s proof that other countries can and will try to influence American elections, and Facebook is a soft way in.

Facebook's undisclosed psychological experiment

In 2012, Facebook decided to do a little experiment. They took 689,003 people, and mucked about with their news feed to remove what The Guardian described as “emotional words,” all to see how it impacted things like likes, shares, and interactions. Then, they monitored what people posted and did after the keywords were manipulated, to see if there were identifiable patterns.

It gets worse: This emotional manipulation was done without the knowledge or consent of the test subjects. University of Maryland law professor James Grimmelmann explained the multi-faceted problem: not only did Facebook bypass one of the hallmarks of ethical research — the consent of subjects — but a study deliberately designed to change a person’s emotional and mental state is, he argued, downright awful on its own: “This is bad, even for Facebook.”

Facebook, of course, started by saying that people had accepted the terms of use, and consent was buried in there somewhere. When outrage continued, COO Sheryl Sandberg released a … well, it was a statement: “This was part of ongoing research companies do to test different products, and that was what it was; it was poorly communicated … We never meant to upset you.” At the heart of it was advertising: Facebook wanted to know if ads with positive and negative words would cause people to post similar sentiments. When the news broke, people responded with words like “creepy” and “terrifying,” and some went right to “evil.”

Facebook accidentally released the personal information of 6 million users

File this one under “W” for “Whoops!” In 2013, Facebook announced they had just realized there was a massive bug in their system that had been kicking around for about a year, and had accidentally released the personal data of about 6 million people out into the nether regions of the internet — and onto the hard drives of other users.

According to CNet, Facebook’s own white hat hackers (the good guys, who try to find exploits in systems before they become problems) discovered the bug in the Download Your Information tool. When users downloaded their own information, it downloaded the information for all the contacts they had, too. But there was a massive problem: If those contacts had phone numbers and addresses, for example, that were in the system but set as private, it downloaded all that private information, too.

Facebook was quick to say (via Reuters) that while the bug had been downloading private information all over the place for a year, they plugged the hole within 24 hours, and found no evidence it had “been exploited maliciously.”

Ever think Facebook is stalking you?

There’s a good chance that everyone who has a Facebook account has had a freaky experience that’s raised questions about whether or not the company is doing some hardcore stalking. In short? They are, they’ve been pretty squirrely about saying so, and no one’s happy about it. In 2018, The Guardian reported that a lawsuit had been filed in California, alleging that Facebook had designed apps that would allow access to things like text messages, photos, GPS, and even a phone’s microphone. Facebook responded that it didn’t have to hand over information proving or disproving the allegations because it was tied to confidential business matters — a response that did the opposite of reassuring people.

Fast forward through two years of people wondering just what information Facebook had on them — and what it was listening to — and when The Washington Post reported on the unveiling of the new “Off-Facebook Activity” tracker, it was confirmation of what everyone had been saying for years, all packaged up in a way that made it look like Facebook was really concerned about your privacy.

The tracker came with confirmation that even if the app is closed on a phone, use of other apps — from stores’ rewards programs to what news articles are opened and read — was reporting right back to Facebook. And here’s the really terrifying thing: Even turning your phone off doesn’t stop the reporting, as many stores will upload information about the purchases you make. Big Brother is definitely watching.

Facebook gives some users special treatment

In a perfect world, the rules would apply to everyone. It’s not a perfect world, though, and Facebook is about as far from perfect as you can get. In theory, Facebook says they put everyone on the same platform, whether you’re a college student, a college professor, or the college president, for example. Everyone is supposed to be playing by the same rules, but in 2021, The Wall Street Journal published what they found out about a program called XCheck.

In a nutshell, users who are part of XCheck are “whitelisted” as being exempt from having their posts deleted or even checked by the programs put in place to stop regular people from posting things like nudity. And that’s where our example comes in. In 2019, the Brazilian footballer Neymar posted nude pictures of a woman who had accused him of rape. His XCheck status let the photos pass, and millions saw them before they were taken down.

Facebook’s official stance on XCheck was summed up by spokesman Andy Stone, who wrote that it “was designed for an important reason: to create an additional step so we can accurately enforce policies on content that could require more understanding.” Meanwhile, regular, non-XCheck users are subjected to what WSJ calls “rough justice,” which a lot of people are unhappy about for a lot of reasons. 

The Cambridge Analytica scandal

The Cambridge Analytica scandal is one of the biggest in Facebook’s dubious history, and here’s what went down … in a nutshell, because it’s a doozy. According to The Guardian, it started when the companies Cambridge Analytica and Global Science Research teamed up on a Facebook app called thisisyourdigitallife. It was marketed as a personality test and collected all kinds of data from all kinds of people who took it.

The first problem was that it also collected all kinds of data from all kinds of people who were friends with the people who took it. It then came out (in part thanks to a whistleblower) that Cambridge Analytica’s leadership included Steve Bannon — who just happened to be buddy-buddy with Donald Trump. The same informant revealed that the company then set about using all that personal data to build a program that would target voters with political advertisements specially selected to have the most impact — and to guide them toward making the “right” choice.

It wasn’t just the U.S. elections that were impacted, either. The investigations claimed that there was a distinct possibility that interference from the secret data collection and skewed ads had impacted the Brexit vote as well. For their part, Facebook had a response: It was all fake news. Then, in 2020, the BBC reported that they were sued for illegally harvesting data from a whopping 87 million people.

Facebook's $5 billion fine

Facebook’s Cambridge Analytica scandal was the scandal that just kept on giving, and things were still going on in 2019 — years after it first hit headlines. That, says Forbes, was when the Federal Trade Commission (FTC) announced it was hitting Facebook with a whopping $5 billion fine for privacy violations. It was the largest the FTC had ever issued, and it also came with instructions on how to make sure something like this didn’t happen again. Outlined in the settlement were guidelines that explained how Facebook would be responsible — and held accountable — for privacy concerns in the future, and it included a more hands-on approach by the FTC.

Five billion is a ton of money — if it were paid out in $100 bills, it would weigh roughly 50 tons. The really shocking thing didn’t come out until 2021, when Politico reported on why shareholders were none too pleased. It emerged that the FTC originally indicated the fine could have been around $106 million, but it allowed Facebook to overpay — up to that magic $5 billion number — in exchange for a promise that Mark Zuckerberg and Facebook COO Sheryl Sandberg would not be held personally liable.

One shareholder put it like this: “The Board has never provided a serious check on Zuckerberg’s unfettered authority. Instead, it has enabled him, defended him, and paid billions of dollars from Facebook’s corporate coffers to make his problems go away.”

Facebook's slow response to human trafficking

Facebook has a really, really dark side, and it’s one that came out in 2021. That’s when The Wall Street Journal published a massive exposé on tons of documents it had received, showing that a lot of people were using Facebook for unsavory purposes. Worse? Facebook seemed to know about it and just didn’t care.

The stories are horrible, and they start with a Mexican cartel that was reportedly using Facebook for everything from recruiting new members to hiring hit men. Look to the Middle East, and there’s a disturbing trend of human traffickers using the platform to reach women who are then held as sex workers or slaves. Investigators even found pages for illegally selling organs, and bizarrely, a lot of this activity is in plain sight. That Mexican cartel? It’s technically labeled as one of the “Dangerous Individuals and Organizations” Facebook says it’s on the lookout for, but investigators found multiple pages that not only went by the cartel’s name, but showed pictures of blood, beheadings, and guns to … advertise?

That extended to the Facebook-owned Instagram, which had photos including one of a bag of severed hands. Facebook declined to comment, but they did say they spent millions of hours taking down potentially damaging or violent content. The problem? Even though more than 90% of users are outside of the U.S., just 13% of Facebook’s time is spent looking at those users.

Planning the Capitol riots on Facebook

The Capitol riots of January 6, 2021 were the moment that the rest of the world stopped, looked at America, and asked, “What the heck are you guys even doing?” 

Every riot needs a way to organize, and even as the chaos died down, fingers pointed to Facebook. NBC News reported that Facebook had been used as a major platform to plan attacks on the U.S. Capitol, and a nonprofit organization called the Tech Transparency Project flagged a bunch of pages that specifically called out January 6th as go-time. They found that Facebook content in the previous month included things like groups calling citizens to arms, posts in Nazi-style fonts, and appeals for vigilantes who were ready to “Occupy Congress.”

Facebook COO Sheryl Sandberg adamantly said that it definitely wasn’t Facebook’s fault, and that they had removed somewhere around 350,000 questionable profiles for inciting violence. That’s great, but CNBC says they still missed some big ones. On January 5th, for example, the Black Conservatives Fund told their 80,000 followers that it was time to move on the Capitol.

The conversation was still going on months later, with Facebook saying it had 35,000 employees working on security teams in the months leading up to the election. Forbes said internal memos warned employees to be ready for whistleblower accusations that their work had stopped too soon and that Facebook had contributed to the rioting — and a whistleblower later said that was exactly what happened.

Facebook's OG scandal

When “The Social Network” hit theaters, it was 2010, Facebook was still a fairly novel idea, and only two years had passed since Facebook founder Mark Zuckerberg settled the OG scandal by making Cameron and Tyler Winklevoss an offer of $45 million in shares in the company, and another $20 million in cash.

The basics of the lawsuit were pretty straightforward: The Winklevoss twins argued that they were the ones who had come up with the whole idea of Facebook, and that Zuckerberg had first agreed to help them get it off the ground. While he dragged his feet on their ConnectU — they claimed — he turned around and launched Facebook first.

The entire thing was a saga that just never seemed like it was going to end, until it finally did — pretty abruptly, says Wired. And here’s the weird thing: It might seem like accusations of theft, double-crossing, and just being a jerk would put a stink on the whole thing, but Wired later observed: “Facebook can afford to buy back its good name.”
