6 Questions After The New York Times’ Facebook Bombshell


On Wednesday afternoon, The New York Times published a blockbuster report (five bylines, 50 sources, 5,000 words) on the failures of Facebook’s management team during the past three years. It begins with Sheryl Sandberg yelling at one of her employees; it ends with her handwritten stage directions, captured by a Times photographer, as she sat before the Senate: “Slow, Pause, Determined.” The story, in other words, is not very flattering (and you should definitely read it). We did, and we have six follow-up questions that merit more investigation.

1) What is Sheryl Sandberg’s future at Facebook?

For most of Facebook’s history, Sandberg has avoided criticism. During the past year, most of the anger at the company has been directed at Zuckerberg. That has started to change recently. The Wall Street Journal reported, for example, on a “swat team” that Sandberg runs, tasked with identifying and preventing future catastrophes. The Times story, though, is the first to cast her as the central antagonist.

It was Sandberg, the story reports, who seethed after Alex Stamos, then Facebook’s chief security officer, disclosed to the audit committee of Facebook’s board that the extent of Russian interference was still unknown and unchecked. It was Sandberg who chastised Stamos for investigating the Russian campaign without the company’s approval. It was Sandberg who sided with Joel Kaplan, Facebook’s vice president for public policy, about leaving the Russians out of the company’s white paper on election interference, and Sandberg who encouraged Stamos to be less specific in his initial blog post about Russia’s propaganda campaign. Sandberg appealed to Senator Amy Klobuchar (D-MN) to dial down her attacks on Facebook. And Sandberg was the one who came out in support of the Stop Enabling Sex Trafficking Act, a decision the Times asserts was motivated in part by a desire to make other tech giants, like Google, look bad.

In its blog post, Facebook rejects that assertion. “Sheryl championed this legislation because she believed it was the right thing to do, and that tech companies need to be more open to content regulation where it can prevent real world harm,” the post reads.

The question now is whether the woman charged with solving Facebook’s hardest problems has caused a few too many of her own.

2) What other tech company is paying an opposition research firm to smear Apple?

One of the juiciest sentences in the piece involves an opposition research firm called Definers Public Affairs, run by Matt Rhoades, a former campaign manager for Mitt Romney. Definers also employs Tim Miller, a former spokesman for Jeb Bush and a contributor to Crooked Media, the company that runs Pod Save America. Facebook hired Definers to look into the funding behind its own critics.

During this period, a conservative news website called NTK Network, which the Times says is affiliated with Definers, published a number of stories critical of Apple. But, in the Times report, Miller also says that Definers’ Apple work is funded by a third technology company. In other words, Facebook paid Definers; Facebook was fighting Apple; Definers wrote stories critical of Apple; but another technology company was paying for those stories.

Facebook ended its contract with Definers on Wednesday evening, shortly after the Times story was published. The company, however, defended its work with the research firm. “The New York Times is wrong to suggest that we ever asked Definers to pay for or write articles on Facebook’s behalf – or to spread misinformation,” Facebook wrote in a blog post. “Our relationship with Definers was well known by the media – not least because they have on several occasions sent out invitations to hundreds of journalists about important press calls on our behalf.” And yet, despite the public nature of the relationship, a Facebook spokesperson couldn’t say for sure whether Zuckerberg and Sandberg were aware of Definers’ involvement.

This is not the first time Facebook has engaged in these sorts of tactics. In 2011, Facebook hired a public relations firm to plant unflattering stories about Google’s user privacy practices.

3) What will the relationship be between prominent Democrats and Facebook in the coming year?

The leadership of the Democratic Party has, generally, supported Facebook over the years. But as public opinion turns against the company, prominent Democrats have started to turn, too. At one moment in the story, the Times reporters describe Senate minority leader Chuck Schumer (D-NY) confronting Senator Mark Warner (D-VA). “In July, as Facebook’s troubles threatened to cost the company billions of dollars in market value, Mr. Schumer confronted Mr. Warner, by then Facebook’s most insistent inquisitor in Congress. Back off, he told Mr. Warner, according to a Facebook employee briefed on Mr. Schumer’s intervention. Mr. Warner should be looking for ways to work with Facebook, Mr. Schumer advised, not harm it.”

Last night, Schumer’s office offered an oddly implausible denial in a statement to CBS, saying, “Senator Schumer was worried that Facebook would bow to pressure from the right wing, who opposed Facebook’s purging of fake accounts and bots, and so he encouraged Sen. Warner to make sure the Intelligence Committee prioritized focusing on the company’s issues related to disinformation and future election interference.”

It’s bizarre to suggest that Warner, the senator who has been most aggressive in pursuing disinformation, needed Schumer’s encouragement to focus on it. This non-denial denial suggests the Times account was probably spot on. Just this week, Schumer was re-elected as Senate minority leader, meaning he’ll continue to play a key role in crafting Democrats’ response to Facebook and other tech giants. In the wake of this report, that work will undoubtedly be scrutinized. Warner’s office declined to comment on the issue.

4) What exactly was Facebook doing with its accusations about George Soros?

One of the darkest parts of the piece describes the way Facebook and Definers dealt with anti-Semitism. According to the report, Facebook worked to paint its critics as anti-Semitic (both Sandberg and Zuckerberg are Jewish), while simultaneously working to spread the idea that billionaire George Soros was supporting its critics—a classic tactic of anti-Semitic conspiracy theorists and other extremists. Russian propagandists have also sought to tie Soros to their opponents.

In July, during a meeting of the House Judiciary Committee, members of the group Freedom from Facebook sat in the audience, holding up signs depicting Zuckerberg and Sandberg as a two-headed octopus with its tentacles wrapped around the globe. The Times reports that a Facebook official alerted the Anti-Defamation League, which condemned the anti-Facebook group for being anti-Semitic.

But even as Facebook was fending off attacks it characterized as anti-Semitic, the Times alleges, the company was also helping spread claims that critics say are themselves anti-Semitic. Definers consistently tried to push research (including to WIRED) about the financial ties behind groups like Freedom from Facebook and the Open Markets Institute, which are funded in part by Soros.

On Wednesday, Patrick Gaspard, president of the Open Society Foundations, founded by Soros, sent a letter to Sandberg saying, “As you know, there is a concerted right-wing effort the world over to demonize Mr. Soros and his foundations, which I lead—an effort which has contributed to death threats and the delivery of a pipe bomb to Mr. Soros’ home. You are no doubt also aware that much of this hateful and blatantly false and anti-Semitic information is spread via Facebook.”

Facebook hasn’t denied engaging in these tactics, but the company defended its work against Freedom from Facebook in a blog post, writing, “The intention was to demonstrate that it was not simply a spontaneous grassroots campaign, as it claimed, but supported by a well-known critic of our company. To suggest that this was an anti-Semitic attack is reprehensible and untrue.” A Facebook spokesperson told WIRED that neither Zuckerberg nor Sandberg had any idea about “the Soros stuff.”

5) Did Facebook lie about what it knew about Russian operations on its platform during the 2016 election?

This is a complicated, tangled question! Facebook executives have testified before Congress that during the summer of 2016 they knew about Russian hacking attempts and about efforts by groups like Fancy Bear, also known as APT 28, to use Facebook to spread information stolen from politicians. But they’ve always maintained that they were unaware of the Russian propaganda campaigns run by the Internet Research Agency until the summer of 2017. As late as mid-July of 2017, Facebook told WIRED it had no evidence of Russian entities buying political ads in the US.

The Times story provides lots of damning evidence that Facebook’s management team was less interested than it should have been in learning the full extent of Russian operations on the platform. It describes Stamos as having run almost a rogue campaign to uncover the truth: “Acting on his own, [Stamos] directed a team to scrutinize the extent of Russian activity on Facebook.” When he told Sandberg about the work, she was angry. “Looking into the Russian activity without approval, she said, had left the company exposed legally. Other executives asked Mr. Stamos why they had not been told sooner.”

Stamos, for his part, tweeted Thursday that he was “never told by Mark, Sheryl or any other executives not to investigate.” Whether he was reprimanded for doing so of his own accord, as the Times reports, is unclear.

But being insufficiently interested in something is different from lying about it. That’s what makes the timeline here so important. It appears from the Times’ reporting that Facebook’s management had three different fights about how to deal with Russian operations. The first occurred after the election, when Stamos disclosed his rogue operation to Zuckerberg and Sandberg. The second occurred in the winter of 2017, when Facebook was deciding whether to specifically name Russia in the report Stamos published in April of 2017. The third occurred in September of 2017, over how to respond to the fake pages created by the Internet Research Agency.

The Times report gives great texture to those fights. But one element is confusing. The authors write that in January of 2017, as Stamos was pushing to publish a paper on the company’s findings, Kaplan objected. According to the Times, Kaplan argued that by implicating Russia, Facebook ran the risk of appearing to side with Democrats at a time when US intelligence agencies were already saying that Russia’s president had ordered a campaign to get Donald Trump elected. “And if Facebook pulled down the Russians’ fake pages, regular Facebook users might also react with outrage at having been deceived,” the Times writes. “His own mother-in-law, Mr. Kaplan said, had followed a Facebook page created by Russian trolls.”

The problem is that Facebook says it didn’t even know about those Russian pages in January of 2017. If it did, the company has lied repeatedly about what it knew and when. Facebook insists, however, that the Times confused something in the chronology. Through a spokesperson, Kaplan said his mother-in-law followed an inauthentic page originating in Macedonia, not Russia. The spread of Macedonian fake news pages was known to Facebook during the election. “They got the timing wrong on that. Period,” Facebook spokesperson Andy Stone said of the Times. One of the Times reporters, meanwhile, says they stand by the story.

6) Has the company fundamentally changed in the past two years?

In a lot of ways, Facebook has changed dramatically. Since the fall of 2017, when Facebook first publicly acknowledged Russian interference, the company has assumed a position of contrition. Zuckerberg, the company’s press-shy leader, has been on a seemingly nonstop apology tour, repeatedly telling members of the press and Congress that Facebook didn’t take a wide enough view of its responsibilities. It’s hired thousands more people on its safety and security team and is investing in automated tools to spot toxic content on the platform. Facebook is now proactively finding and suspending coordinated networks of accounts and pages aiming to spread propaganda, and telling the world about it when it does. The company has enlisted fact-checkers to help prevent fake news from spreading as far as it once did. Some of the changes have come at a financial cost to Facebook, like requiring political advertisers to go through a lengthy authorization process before they can pay Facebook to run ads. On Thursday afternoon, Facebook announced even more efforts to disincentivize sensationalist content and increase transparency about content removal.

The company’s central problem leading up to 2016, WIRED has argued (https://www.wired.com/story/inside-facebook-mark-zuckerberg-2-years-of-hell/), was a failure to recognize how the platform could be used for ill. Now, at least, the company has realized it. But the real question is whether any of these fixes are enough to address what seems to be a serious problem at Facebook. Even as the company has rolled out these technical and staffing changes, the Times report shows that its ruthless efforts to protect its reputation at all costs remain unchanged. That’s an issue that can’t be solved with better algorithms.

