‘The Facebook Files’: The Response
Thank you to all those who gave feedback on the Facebook piece we did on last week's podcast. Some of it was incredibly well-considered and will certainly be kept in mind going forward.
In September 1982, a twelve-year-old girl in Chicago died suddenly after taking Tylenol. Several other people in the area died before it was discovered that someone had been lacing Tylenol capsules with cyanide. Johnson & Johnson's CEO at the time, James E. Burke, determined to put public safety ahead of profit, halted production and advertising of Tylenol products and ordered a nationwide recall of all 31 million bottles. Both the FBI and the Food and Drug Administration (FDA) tried to convince Burke that a recall was an overreaction, but he went ahead with it anyway. At the time, it was one of the biggest recalls in American history, costing the company over $100 million.
Tylenol was the biggest-selling over-the-counter painkiller in the country before the incidents. Burke recalled, "Innocent people were killed. Industry analysts and advertising experts told us that Tylenol was finished." Yet within a year, Tylenol had rebounded, thanks to the decisiveness and openness of the company's response. Burke subsequently introduced tamper-resistant packaging and replaced capsules with solid tablets to prevent further poisonings. The case is now renowned as one of the greatest examples of business leadership in history.
Every business makes mistakes. In the case of Johnson & Johnson, it just happened to be the victim of sabotage that, until then, no one had anticipated. But when it happened, the company didn't try to wash its hands of the problem. It got straight out in front of it and said: 'Mea culpa. This is how we're going to make sure it never happens again.' That's how great businesses and their leaders behave.
Since The Wall Street Journal started publishing 'The Facebook Files' over two weeks ago, we have seen two primary responses from Facebook. The first was written by its Vice President of Global Affairs, Nick Clegg.
A series of articles published by the Wall Street Journal has focused on some of the most difficult issues we grapple with as a company — from content moderation and vaccine misinformation, to algorithmic distribution and the well-being of teens. These are serious and complex issues, and it is absolutely legitimate for us to be held to account for how we deal with them.
At the heart of this series is an allegation that is just plain false: that Facebook conducts research and then systematically and willfully ignores it if the findings are inconvenient for the company. This impugns the motives and hard work of thousands of researchers, policy experts and engineers at Facebook who strive to improve the quality of our products, and to understand their wider (positive and negative) impact. It’s a claim which could only be made by cherry-picking selective quotes from individual pieces of leaked material in a way that presents complex and nuanced issues as if there is only ever one right answer.
That claim does clash with The Journal's reporting, which notes numerous occasions on which researchers hired by Facebook to investigate serious abuse on the platform saw their recommendations ignored by management.
From The Journal’s piece on human trafficking and domestic servitude:
The former police officer recommended that Facebook disable WhatsApp numbers associated with the rings, put in new policies about ads purchased anonymously and improve its artificial intelligence to better root out posts related to human trafficking, according to the documents. He added that Facebook should develop a network to prevent trafficking by sharing findings with other tech companies [...] In another memo, the Polish trafficking expert wrote that 18 months after it first identified the problem, Facebook hadn’t implemented systems to find and remove the trafficking posts.
The second rebuttal came from Facebook’s Head of Research Pratiti Raychoudhury. In response to accusations that Instagram causes mental health problems for teenage girls, Raychoudhury had this to say:
The slide in question, which The Wall Street Journal did not publish as part of their report and we’re releasing below, shows that Instagram helps many teens who are struggling with some of the hardest issues they experience. On 11 of the 12 issues in the slide referenced by the Journal, such as eating issues, loneliness, anxiety and sadness, teenage girls who said they experienced these challenges were more likely to say that Instagram made these issues better vs. worse.¹ The one exception was body image. While the headline in the internal slide does not explicitly state it, the research shows one in three of those teenage girls who told us they were experiencing body image issues reported that using Instagram made them feel worse — not one in three of all teenage girls. This is an important difference that is not explicit in the Journal’s reporting. And, among those same girls who said they were struggling with body image issues, 22% said that using Instagram made them feel better about their body image issues and 45.5% said that Instagram didn’t make it either better or worse (no impact).
Facebook employees have reportedly expressed surprise that neither of these responses has gone down particularly well with the press. As many have noted, the company appears to be taking the line that its research is either too complex to be understood or too informal to be taken seriously. And nowhere has Facebook had the actual researchers involved come forward to defend the company's actions. Even its former head of Civic Integrity has questioned the manner in which Facebook is addressing the issues.
Facebook certainly seems to have got itself caught between a rock and a hard place. In one breath it claims that the media is focusing only on the negative aspects of its internal reports; in the next, it refuses to release them in full for proper scrutiny. In fact, as I write this, Congress is hearing testimony from researchers who received cease-and-desist letters from Facebook while they gathered information on the microtargeting of political advertising, before being banned from the platform outright. Facebook claimed that was part of a Federal Trade Commission (FTC) order. The FTC promptly came out and said it had no idea what Facebook was talking about.
Perhaps the hard truth is that there aren't many positives for Facebook to share. Even its own reports, designed to shed a positive light on the company, have been shelved when the findings didn't align with its expectations.
From The New York Times:
In June, the company compiled a report on Facebook’s most-viewed posts for the first three months of 2021. But Facebook did not release the report. After the policy communications team discovered that the top-viewed link for the period was a news story with a headline that suggested a doctor had died after receiving the Covid-19 vaccine, they feared the company would be chastised for contributing to vaccine hesitancy, according to internal emails reviewed by The New York Times.
One person we haven't heard from yet is the company's founder and CEO, Mark Zuckerberg. And we won't, if Facebook has its way. The company has launched an internal project to shield Mr. Zuckerberg and other top executives from public scrutiny... the fruits of which the company was happy enough to share.