But with the company's vast reach has come another kind of problem: Facebook is becoming too big for its computer algorithms and its relatively small team of employees and contractors to manage the trillions of posts on its social network.
Earlier Wednesday, Mark Zuckerberg, the company's chief executive, acknowledged the problem. In a Facebook post, he said that over the next year, the company would add 3,000 people to the team that polices the site for inappropriate or offensive content, especially in the live videos the company is encouraging users to broadcast.
"If we're going to build a safe community, we need to respond quickly," he wrote. "We're working to make these videos easier to report so we can take the right action sooner -- whether that's responding quickly when someone needs help or taking a post down." He offered no details on what would change.
Facebook is also grappling with the limitations of its automated algorithms on other fronts, from the prevalence of fake news on the service to a News Feed that tends to show people information that reinforces their views rather than challenges them.
Despite Mr. Zuckerberg's pledge to do a better job in screening content, many Facebook users did not seem to believe that much would change. Hundreds of commenters on Mr. Zuckerberg's post related personal experiences of reporting inappropriate content to Facebook that the company declined to remove.
Zeynep Tufekci, an associate professor at the University of North Carolina who studies online speech issues, said that Facebook designed Live to notify a user's friends automatically about a live feed -- something guaranteed to appeal to publicity seekers of all sorts.
"It was pretty clear to me that this would lead to on-camera suicides, murder, abuse, torture," she said. "The F.B.I. did a pretty extensive study of school shooters: The infamy part is a pretty heavy motivator."
Facebook has no intention of dialing back its promotion of video, including Live. On a conference call with investors on Wednesday, the company said it would continue to rank video high in users' news feeds and add more advertising within live videos and clips.
"All policies need to recognize that distressing speech is sometimes the most important to a public conversation," said Lee Rowland, a senior staff attorney at the American Civil Liberties Union who works on free speech issues.
She said that the decision to hire more moderators could only help the company make better judgments, especially about live events, where fast decisions can be critical. "Humans tend to have more nuance and context than an algorithm," Ms. Rowland said.
But Ms. Rowland said Facebook must also be clearer with the public about its rules for making those calls.