This article can be found in the MyWallSt App, alongside an audio companion. Sign up today for a free account and get access to dozens of expertly written articles and analyst opinion pieces every month.
“The scope of Facebook’s misconduct is staggering”*
*This quote comes from the most recent lawsuit to land at Facebook’s door; it could, however, reasonably describe pretty much anything that has ever gone on at the company.
Facebook (NASDAQ: FB) is in a bit of a pickle. In the wake of its first dip in users since its formation, the stock has been in a free-fall that included the largest one-day loss of value in the history of the stock market. All of this, coupled with an overhaul of its primary focus to a new and unproven technology — along with accompanying name-change — paints the portrait of a company that is scrambling.
There are many reasons for investors to be worried. One could cite the rise to power of TikTok and how it has entrenched itself among younger users ahead of Facebook’s native apps. Or maybe the iOS 14.5 update which has hamstrung what was up until very recently the greatest advertising platform ever built (with Google announcing this week it will follow a similar path).
These are both big-ticket items and real causes of concern for the company that are sure to provide headwinds long into the future. However, with almost half the planet still a Facebook user and a deep and talented workforce with virtually infinite resources at its disposal, one would be brave to bet against the company turning it around. This will all play out over the coming months and years and is sure to be a fascinating watch from an investor’s perspective. While there’s definitely an interesting story in there, I’m instead using today’s Insight to kick the company while it’s down.
Today’s piece is inspired by an all-hands meeting in which Mark Zuckerberg updated the company’s — and I hope you take this next word with all the irony and sarcasm that it deserves — “values”. As reported by Alex Heath of The Verge:
- “Move fast” becomes “move fast together”,
- “Be bold” becomes “build awesome things”,
- “Be open” becomes “live in the future”,
- It added “focus on long term impact” and “be direct and respect your colleagues”,
- And the cherry on top: “Meta, metamates, me”, whatever the hell that means.
So, to celebrate the advent of a new era of “values” let loose on the world, let’s take a look at a sample of what the old ones brought us over the years. Here’s ‘The Life and Crimes of Facebook’.
Is Facebook Guilty of ‘Fanning Ethnic Violence’?
Let’s kick it off with a doozy that probably deserves a bit more air time. When a company we know and use every day gets sued for $150 billion for facilitating genocide, you’d think we’d hear a bit more about it. A lawsuit brought against Facebook claims that its algorithm promoted hate speech and that it refused to take down inflammatory posts that directly led to violence against Rohingya Muslims in Myanmar.
Facebook launched in the country in 2011, quickly becoming the front page of the internet for many. About half of Myanmar’s 53 million people use the platform, with many relying on it as their primary source of news. Its growth in popularity coincided with a rise in violence against the Rohingya Muslim minority. According to Médecins Sans Frontières, up to 10,000 Rohingya were killed in the Myanmar government’s ‘clearance operations’ in 2017, with up to 650,000 more forced to flee to refugee camps in Bangladesh. Research from The Guardian has revealed that at the time of the crisis, hate speech on Facebook exploded.
It becomes even more damning when one realizes that the company was aware of its role and admitted culpability, with an independent report commissioned by Zuck & Co stating: “Facebook has become a means for those seeking to spread hate and cause harm, and posts have been linked to offline violence”. In spite of its awareness of the problem, we saw no evidence of compensation, reparations, or any form of real contrition.
The class action in question claims Facebook was:
“willing to trade the lives of the Rohingya people for better market penetration in a small country in south-east Asia … In the end, there was so little for Facebook to gain from its continued presence in Burma [Myanmar], and the consequences for the Rohingya people could not have been more dire. Yet, in the face of this knowledge, and possessing the tools to stop it, it simply kept marching forward.”
You’d be forgiven for assuming that Facebook learned from its mistakes in Myanmar and would improve the foreign-language policing on its platform. But that’s just not how this story goes. Recent troubles in Ethiopia, which included the assassination of a popular singer and ensuing ethnic violence of a similar ilk to Myanmar’s, have also been shown to have been exacerbated by the promotion of hate speech and the glorification of violence on the platform. Vice has written an extensive piece on the subject here.
Both Myanmar and Ethiopia are examples of an endemic issue that has set down roots deep within the makeup of Facebook and how the company works. Whistleblower Frances Haugen claims 87% of Facebook’s spending on combating misinformation goes to English-language content, while only 9% of its users are English speakers. Its inability, or unwillingness, to truly police the misconduct that is allowed to roam free on the platform around the world has led to some dire and inhumane consequences.
The Facebook Files
The opening to The Facebook Files (paywalled) by the Wall Street Journal goes like this:
“Facebook Inc. knows, in acute detail, that its platforms are riddled with flaws that cause harm, often in ways only the company fully understands.”
The newspaper gathered a trove of internal documents and insider information to put together the pièce de résistance of hit pieces. Let’s fire through what they uncovered:
- Facebook exempted high-profile users from the community rules and standards that are followed by everyone else through a process known as XCheck.
- Through its own research, it became aware that Instagram was damaging to the mental health of teenage girls, while it outwardly played down the negative effects of the app.
- In the face of declining engagement in 2018, it changed the platform’s algorithm, which led to the promotion of certain combative posts, making its users angrier.
- It has provided inadequate responses to the use of the site for human trafficking, ethnic violence, drug trafficking, the selling of organs, and pornography.
- The platform became a trove for anti-vaccination content, in spite of Zuckerberg’s efforts to steer it in the opposite direction.
- It has actively strategized on how to recruit pre-teen users to its apps, in spite of the effects it knows it has on the mental health of teenagers.
- It has rejected efforts to undertake a systematic approach to incendiary and divisive posts as it would reduce engagement.
- The platform was found to be facilitating the dissemination of religious hatred in India.
- Facebook’s own research shows that it negatively affects 1 in 8 users, to the point that compulsive use impedes their work, sleep, parenting, or relationships.
- It has allowed stolen content to flourish, with internal research showing that 40% of traffic in 2018 went to plagiarized or recycled content.
To reduce each of these actions to a bullet point is an injustice to the severity of the evidence, but we’d be here till Christmas if I were to delve deeper. Much of this information is relayed in a series of podcasts if you would like to learn more.
What Was Facebook’s Facial Recognition Scandal?
A Valentine’s Day surprise is bringing up the rear in this timeline of sordid events, with the Texas Attorney General suing the company on Monday for “hundreds of billions of dollars in civil penalties” over the collection and misuse of biometric data of its 20 million Facebook-using citizens. The facial recognition technology used by the company is said to have violated the Lone Star State’s privacy protections, by “capturing biometric information from photos and videos that users uploaded without consent, disclosing the information to others and failing to destroy it within a reasonable time.”
While Meta has, of course, disputed the claims, there is plenty of precedent here on Texas’ side, with the social media giant paying a settlement of $650 million to the State of Illinois in 2020 over similar claims. The fact that the company shut down its facial recognition system in November 2021, deleting over a billion users’ individual facial recognition templates, speaks volumes about its culpability.
Creepy, dystopian vibes aside, this looks set to become the next in a long line of costly lawsuits that Facebook will pay out before continuing on its merry way. For a company that takes in over $100 billion in revenue a year, much of it coming as a direct result of these questionable practices, one must ask the question: when will the real accountability begin?
Does Facebook Have Any Accountability?
First things first, Meta isn’t perpetrating the atrocities that occur on its various channels. Bad actors will find any opportunity to spread their toxicity, whether that be on or offline. The company is not responsible for the crimes committed by these people. However, that is no excuse for inaction. In fact, the company’s reach gives it an opportunity to quell tensions and perhaps help ease much of the conflict that has found its way onto its platform.
Unsurprisingly, this opportunity is always left untaken. What has become abundantly clear is that, time and time again, the company identifies an abhorrent practice and yet refuses to intervene. It’s a callous and cowardly application of a “not my problem” attitude, and in taking it the company has not only avoided policing these actions but actually facilitated their proliferation. Its consistent refusal to act has compounded each and every problem that has arisen.
The actions listed above — or more accurately inactions — indicate a pattern of behavior that is entrenched in the “values” of the company, and it’s clearly coming from the very top. With 58% of the voting share as of last year, Mark Zuckerberg’s power over the direction of the business is unique for a company so large.
With his move to the metaverse, he is clearly trying to start afresh in an as-yet unconquered landscape that, in his mind, will free the company of the shackles of Apple and Google in which it currently finds itself. Yet, how scary does a more immersive and lifelike version of Facebook sound right now? It’s tough to see any real changes come to the fore while Zuckerberg remains in charge, and bringing the same pattern of behavior to the metaverse could produce an even more sinister platform than the one we have now.
To sign off, I’ve reworked some new “values” for Zuck that might better suit the company’s adventure into the metaverse:
- “Move fast together” becomes “Move fast to stop hate speech”,
- “Build awesome things” becomes “Build an algorithm that doesn’t reward pitting users against each other for the sake of engagement”,
- “Live in the future” becomes “In the future can we stop the incitement of violence in developing countries”,
- “Focus on long term impact” becomes “Focus on stopping human trafficking”,
- “Be direct and respect your colleagues” becomes “Respect the mental health of teenage girls”,
- And “Meta, metamates, me” becomes “Is there anyone here who can actually step up and take accountability for something we’ve done wrong?”.
Don’t forget that you can listen to this blog for free in the MyWallSt App.
Financial Analyst at MyWallSt
Michael's first and favorite stock is Square, which he sees becoming a massive player in the payments industry and a leader in the war on cash.