New Scientist, by Andrew Hetherington
A search for “fake” on Facebook has turned up thousands of “fake news” stories and “false stories” (the broader term for content with no factual basis), according to a new study.
But these are not the stories widely spread by dedicated fake news outlets. Instead, the research showed that the majority come from Facebook users who are trying to get news from their friends and family. These, according to the researchers, are the stories most likely to be spread by Facebook users.
Facebook has recently been accused of suppressing news content in an attempt to silence criticism of President Donald Trump.
And the company has repeatedly denied that it is intentionally suppressing news.
“Fake news is a phenomenon that has evolved over time to reach a new and disturbing degree. It is spreading faster than we can detect it,” said study co-author and Facebook researcher Mark Zuckerberg. “But fighting it is also one of the most important things we can do to protect our community.”
Facebook’s new policy on fake news is the first of its kind.
In the study, researchers analyzed more than 3,000 stories posted on Facebook over a two-year period.
They found that in the first half of 2016, more than one-third of the fake news stories were related to a political campaign, while a further 13% were about an economic event or policy change.
“In short, Facebook has made it much easier for fake news to spread by allowing its users to share fake stories, thereby giving users an easy way to disseminate them to their networks of friends,” the researchers wrote.
Facebook said the study’s findings had “no impact” on its policy on how its platform handles fake news and other content.
“We have long said that we will not tolerate fake news, and we will fight against it in the strongest possible way,” Facebook said in a statement.
“We continue to review how we use the data we collect on this topic, including by examining how we can make it easier for people to report suspicious content.”
Facebook said it has a number of tools in place to flag content that it believes violates its terms of service, such as when users share content that includes a threat or is false.
“To our knowledge, this is the only study of its type to directly assess the impact of fake news on Facebook’s community,” Facebook told ABC News.
The researchers concluded that “Facebook’s recent crackdown on fake stories is unlikely to have any real impact on the spread of fake stories”.