News | Inside Twitter's Long, Slow Struggle to Police Bad Actors

Mohamed Hakim

When Twitter Inc. Chief Executive Jack Dorsey testifies before Congress this week, he’ll likely be asked how Twitter decides which content to allow on its platform.

To some Twitter users—and even some employees—it is a mystery.

In policing content on the site and punishing bad actors, Twitter relies primarily on its users to report abuses and has a consistent set of policies so that decisions aren’t made by just one person, its executives say.

Yet, in some cases, Mr. Dorsey has weighed in on content decisions at the last minute or after they were made, sometimes resulting in changes and frustrating other executives and employees, according to people familiar with the matter.
Understanding Mr. Dorsey’s role in making content decisions is crucial as Twitter tries to become more transparent with its 335 million users, as well as with lawmakers, about how it polices toxic content on its site.

In a hearing Wednesday morning before the Senate Intelligence Committee, Mr. Dorsey will appear alongside Facebook Inc. Chief Operating Officer Sheryl Sandberg to discuss how foreign actors can use the social-media platforms to spread misinformation and propaganda. Later in the day, the House Commerce Committee will question Mr. Dorsey individually about whether Twitter is silencing conservative voices.

The latter hearing “is about pulling back the curtain on Twitter’s algorithms, how the company makes decisions about content, and how those decisions impact Americans,” said Rep. Greg Walden (R., Ore.), the chairman of the House Commerce Committee.

Twitter and rival Facebook are increasingly caught in a Catch-22—criticized by some users for allowing hateful posts, and blasted by others who say that removing content curtails free speech.

Twitter has taken a different approach than Facebook, which has hired thousands of content reviewers in the last couple of years to review posts and has built out technology to flag inappropriate content. Twitter has a far smaller staff and typically investigates only harassment and abuse that has been reported by users.

Last month, after Twitter’s controversial decision to allow conspiracy theorist Alex Jones to remain on its platform, Mr. Dorsey told one person that he had overruled a decision by his staff to kick Mr. Jones off, according to a person familiar with the discussion. Twitter disputes that account and says Mr. Dorsey wasn’t involved in those discussions.

Twitter’s initial inaction on Mr. Jones, after several other major tech companies banned or limited his content, drew fierce backlash from the public and Twitter’s own employees, some of whom tweeted in protest.

A similar chain of events unfolded in November 2016, when the firm’s trust and safety team kicked alt-right provocateur Richard Spencer off the platform, saying he was operating too many accounts. Mr. Dorsey, who wasn’t involved in the initial discussions, told his team that Mr. Spencer should be allowed to keep one account and stay on the site, according to a person directly involved in the discussions.

Twitter says Mr. Dorsey doesn’t overrule staffers on content issues. The company declined to make Mr. Dorsey available.

“Any suggestion that Jack made or overruled any of these decisions is completely and totally false,” Twitter’s chief legal officer, Vijaya Gadde, said in a statement. “Our service can only operate fairly if it’s run through consistent application of our rules, rather than the personal views of any executive, including our CEO.”

In the coming weeks, the company plans to start showing users a picture of a tombstone in the place of a tweet that has been taken down as a way to signal that a user has violated a company policy, rather than a notice saying the tweet is unavailable. That step, which hasn’t been reported, is among a number of policy changes Twitter plans to make in the coming weeks, according to people familiar with the matter.

Mr. Dorsey has frequently promised that the company will do better in policing content. “We moved too slow. We are fixing,” Mr. Dorsey tweeted at one user in January 2017.

In October of that year, he tweeted: “We decided to take a more aggressive stance in our rules and how we enforce them.”

And then last month: “Truth is we’ve been terrible at explaining our decisions in the past. We’re fixing that.”

After a user flags a tweet, the company says a user-services team first decides whether to escalate the complaint to Twitter’s trust and safety team. The company doesn’t disclose how many of its more than 3,500 employees are on each team or the number of contractors it hires to moderate content. On a case-by-case basis, the trust and safety team may ask Ms. Gadde to participate.

Mr. Dorsey weighs in on the most high-profile cases, according to people familiar with the matter. The company says he participates in discussions about account issues on occasion but isn’t the final word.

Some current and former employees say Mr. Dorsey’s philosophical, arm’s-length leadership style has at times complicated decision-making. Mr. Dorsey is also CEO of the payments company Square Inc. and splits his days between the two offices. He generally delegates to his subordinates, but at times projects stall because either no one knows what he thinks or he doesn’t pull the trigger, according to people familiar with the matter. For example, Twitter took almost two years to decide how to expand beyond its 140-character limit for tweets, a delay many employees attribute to Mr. Dorsey’s indecision.

Twitter’s Chief Financial Officer Ned Segal said decision-making is a weak point for the company, telling analysts at a conference in May that Twitter is “getting better amongst ourselves at both making decisions and executing on them.”

A Twitter spokesman declined to comment on the character-length decision.

In the Alex Jones incident, Twitter’s vice president of global communications, Brandon Borrman, and other executives say Mr. Dorsey wasn’t involved in any of the decision-making because there technically wasn’t a decision to make: No one had flagged Mr. Jones’s content to Twitter as inappropriate, even though he had posted hate speech to other social-media sites. Mr. Borrman says he told Mr. Dorsey in a text message that the staff wasn’t planning to ban Mr. Jones.

On Aug. 14, the company did suspend Mr. Jones for seven days after CNN flagged tweets to the company. Many staffers viewed that as a half measure and complained that the firm hadn’t acted decisively. Mr. Jones’s account has since been restored.

With Mr. Spencer, Twitter shut down his accounts in November 2016 amid what Mr. Dorsey internally declared “an abuse emergency,” according to people familiar with the matter.

After Twitter reinstated one of Mr. Spencer’s accounts at Mr. Dorsey’s insistence the following month, many employees were upset about the decision, according to a person involved in it. At the company’s next all-hands meeting, known as “Tea Time,” one employee asked about the reinstatement. Mr. Dorsey instead turned the question over to Ms. Gadde, that person said.

In an interview, Mr. Spencer said he doesn’t recall being told by Twitter why his accounts had been shut down until Twitter offered to reinstate one of them. He said he found the company’s reasoning “a bit incredible,” but “I went with it because it’s a public space, it’s the way everyone can issue their own little press release.”

Mr. Borrman of Twitter says the company emailed Mr. Spencer soon after shutting down his accounts Nov. 15.