Fixing Facebook? Zuckerberg falls short of his New Year's goal
Facebook CEO Mark Zuckerberg started the year with a tough challenge: fix the problems plaguing the world's largest social network.
"The world feels anxious and divided, and Facebook has a lot of work to do -- whether it's protecting our community from abuse and hate, defending against interference by nation states, or making sure that time spent on Facebook is time well spent," Zuckerberg wrote on Facebook in January.
If the tech firm succeeded, Facebook would end 2018 on a much better path. But the cracks in Zuckerberg's social media empire only grew as scandals about data misuse, security and even Facebook's leadership piled up.
The social network has faced criticism many times since launching 14 years ago, but the public uproar reached new heights in 2018. Facebook's missteps, even as it tried to fix its problems, were yet another reminder of what happens when a company grows rapidly with little oversight. They also set the stage for another showdown between the tech powerhouse and lawmakers who have their own ideas on how to manage a platform used by 2.3 billion people every month.
"I think there's just a general growing consensus from both parties in Congress that self-policing is not going to work," Democratic Sen. Mark Warner of Virginia said in an interview.
Facebook pointed to a series of notes Zuckerberg published this year outlining what the tech firm has done to combat election meddling, as well as hate speech, misinformation and other offensive content. The social network pulled down more than 1.5 billion fake accounts, launched a database of political ads and announced the creation of a Supreme Court-like independent body to oversee content appeals.
But in many ways, Zuckerberg fell short of his New Year's resolution. UN investigators said Facebook played a role in spreading hate speech that fueled ethnic cleansing in Myanmar. Media outlets found loopholes and errors in Facebook's political ads database. Users questioned whether they should #DeleteFacebook after learning that Cambridge Analytica, a UK political consulting firm with ties to Donald Trump's 2016 presidential campaign, gathered data on as many as 87 million Facebook users without their permission.
In short, Facebook's problems ballooned out of the company's control.
"They created a platform where sharing was mindlessly easy and interacting with each other required almost no forethought at all," said Woodrow Hartzog, a law and computer science professor at Northeastern University. "As a result, there was massive sharing, including gushing of personal information that put lots of people at risk."
Facebook's controversies this year not only tarnished the company's already battered image but also fueled more distrust in the social network.
In the wake of the Cambridge Analytica scandal, Zuckerberg acknowledged there was a "breach of trust between Facebook and the people who share their data with us and expect us to protect it."
The public outcry over data misuse caught the attention of lawmakers, who asked Zuckerberg to give his first public testimony before both houses of Congress. Facebook vowed more changes, including removing developer access to a user's data for apps that haven't been used for three months. It started building a new tool so users could clear their browsing history on the social network.
But some privacy experts said Facebook's changes didn't go far enough. The social network, which makes money from ads targeted at users based on what they "like" and do on the internet, benefits when users share more information about themselves.
"Facebook is not new to privacy controversies. Each time something happens, there's some media attention," said Ari Ezra Waldman, a law professor and the director of the Innovation Center for Law and Technology at New York Law School. "Zuckerberg apologizes and says we're going to do better and we're going to get the trust of our users back. Then they make cosmetic changes and it goes back to business as usual."
Facebook's privacy woes weren't the only problems that fostered more distrust.
In September, the company disclosed a security breach that let attackers steal the personal information of 29 million Facebook users, including phone numbers, birth dates and hometowns.
The growing distrust in Facebook has put users in "a tough spot," said Jennifer Grygiel, an assistant professor at Syracuse University who studies social media.
"They know there are problems, but the platforms are woven into the basic infrastructures of their lives," Grygiel said.
Then in November, an investigation by The New York Times shined a harsh spotlight on how Zuckerberg and Facebook COO Sheryl Sandberg handled some of these scandals, escalating tensions with lawmakers and advocacy groups.
In the latest twist to that saga, The Times on Thursday reported that Sandberg asked the company's communications staff to dig into the finances of billionaire investor George Soros after he called both Facebook and Google "a menace." Facebook said in a statement that it had already begun researching Soros when Sandberg made her request.
Soros said that as "near-monopoly distributors" of information, the two big tech companies should be more heavily regulated.
Setting the stage
Some civil rights and advocacy groups are pushing Facebook to share more about how the company decides what to leave up or pull down.
"There is no way for us to know how big of an impact (if any) Facebook's efforts have had on the amount of hate online. That needs to be fixed," the Anti-Defamation League's vice president of innovation and strategy, Adam Neufeld, said in a statement. "Facebook must radically increase transparency about the amount of hate as well as their efforts to fight hate on their platform."
At the same time, social media companies are grappling with concerns that they're censoring free speech as they pull down accounts of conspiracy theorists such as Alex Jones. Facebook has also been accused of suppressing conservative voices, which the company denies doing.
And it isn't only Facebook that's seeing online hate spill into the real world. Twitter apologized in October for failing to remove an online threat made against former congressional press secretary Rochelle Ritchie by mail bombing suspect Cesar Sayoc. Some tech businesses cut ties with fringe social network Gab following revelations that the man arrested in the Pittsburgh synagogue shooting used the site to spew his hatred for Jews.
This has all driven lawmakers to look into possible regulation around privacy, competition and combating disinformation, although it's unclear what the end result will be.
Warner, who co-sponsored a bill to regulate online political ads, outlined 20 ideas for regulating social media and technology companies. One idea, he said, would be to require tech businesses to tell consumers how their data is being used, along with how much it's worth to the company every month.
"I think you'll see action in 2019," Warner said. "What has been one of the biggest frustrations to me is I constantly urged Facebook and others to work with Congress to get this right, because if we act on our own we could screw it up. What is evident is while they've superficially said yes they want to work with us, their actions have not demonstrated that."
Now Zuckerberg will have to prove yet again he's up for that challenge.