Facebook is encouraging users to report what they believe to be “fake news.”

On Wednesday, the social media giant informed its users of several techniques to help “stop the spread of false news.”

The list offers 10 tips for verifying the truthfulness of an article, including being “skeptical of headlines,” looking out for “unusual formatting” and checking to see whether anyone else is reporting the story.

“If you see a story in News Feed that you believe is false, you can report it to Facebook,” the website says, describing exactly how to mark a post as “false news.”

In users’ News Feeds, suspected fake news stories will also be marked with a note stating, “Disputed by 3rd Party Fact-Checkers,” meaning third-party organizations such as The Washington Post, ABC News or Snopes have investigated and flagged the article.

Back in December, we reported the social media site was mounting an initiative to combat the spread of so-called fake news.

The site announced it would begin warning users if they tried to share stories that “fact checking organizations” had identified as fake.

Facebook CEO Mark Zuckerberg said the effort would “build a more informed community” and help “fight misinformation.”

“You’ll still be able to read and share the story, but you’ll now have more information about whether fact checkers believe it’s accurate. No one will be able to make a disputed story into an ad or promote it on our platform,” he wrote.

Facebook meanwhile has come under scrutiny in the UK for allowing users to post videos and images of illegal activity.

Website moderators are accused of refusing to take down “flagged” content, including material supporting terrorist acts, child sexual abuse imagery and pedophilic cartoons.

“Facebook went as far as promoting the content, [The Times] says, as its algorithms send out such pages and groups to other users if they are deemed to be of interest,” reports RT.

UK QC Julian Knowles has claimed Facebook could be held legally responsible for not removing such content.

“If someone reports an illegal image to Facebook and a senior moderator signs off on keeping it up, Facebook is at risk of committing a criminal offense because the company might be regarded as assisting or encouraging its publication and distribution,” Knowles told The Times.
