As news organizations, tech companies, voters and politicians continue to raise questions about the potential impacts of "fake news" on the 2016 U.S. presidential election, Facebook has come under intense scrutiny for its role in disseminating misinformation. In response, the company yesterday unveiled four steps it plans to take to clamp down on unreliable news articles.

Noting that these efforts are tests that will be reviewed and adjusted over time, Adam Mosseri, Facebook's vice president of News Feed, said his company will "keep working on this problem for as long as it takes to get it right."

Among the changes Facebook is rolling out: easier tools for reporting hoaxes and fake news content; a new program launched with third-party fact-checking partners; algorithmic tweaks to identify misleading information; and new efforts targeting financial incentives that can encourage misinformation.

'A Greater Responsibility'

It remains to be seen whether these latest efforts help lower the heat that Facebook has been feeling regarding its massive impact on users' access to news and other information. Facebook CEO Mark Zuckerberg has repeatedly resisted suggestions that his company is a media organization, despite the Pew Research Center's finding last year that 63 percent of Facebook users and 63 percent of Twitter users said they get news from those sites.

"Facebook is a new kind of platform different from anything before it," Zuckerberg noted yesterday in an update on his Facebook page. "I think of Facebook as a technology company, but I recognize we have a greater responsibility than just building technology that information flows through."

A more recent Pew survey, published yesterday, found that 64 percent of U.S. adults said made-up news stories are causing "a great deal of confusion."

"We believe in giving people a voice and that we cannot become arbiters of truth ourselves, so we're approaching this problem carefully," Mosseri wrote yesterday in a Facebook newsroom post. "We've focused our efforts on the worst of the worst, on the clear hoaxes spread by spammers for their own gain, and on engaging both our community and third party organizations."

'More Work Beyond This'

One of the challenges in slowing the spread of false and misleading information online lies in deciding who, or what organization, gets to declare a story fake. Facebook has already drawn controversy on that front: a Gizmodo report in May quoted some of the company's human "news curators" as saying they regularly avoided right-leaning publications such as The Blaze and Breitbart. The report sparked outrage among some conservatives, and Facebook eventually dismissed its human news staff in favor of algorithmic news analysis.

Yesterday's announcement by Facebook has already been met with similar pushback from some conservative sites, with The Daily Caller today branding one of Facebook's new fact-checking partners, Snopes, as an "unreliable liberal fact-checker."

Snopes has drawn such criticism from the right before: after allegations in 2009 that the site was run by "very Democratic" operators, FactCheck.org put it to the test. FactCheck's analysis concluded that those allegations contained "a number of false claims" and that its past reviews of Snopes had found the site to be "solid and well-documented."

In addition to Snopes, Facebook will be working with a number of other third-party fact-checkers on the new initiative, all of them signatories to the International Fact-Checking Code of Principles of Poynter, the journalism organization. Signatories must commit to non-partisanship and fairness; transparency of sources; transparency of funding and organization; transparency of methodology; and, when required, "open and honest corrections."

"We have a responsibility to make sure Facebook has the greatest positive impact on the world," Zuckerberg noted yesterday. "This update is just one of many steps forward, and there will be more work beyond this."