Fake news website

This article is about intentionally fraudulent websites. For satirical websites, see news satire.

Fake news websites are websites that publish hoaxes, propaganda, or disinformation to increase web traffic through sharing on social media. Unlike news satire, where humor is the object, fake news websites seek to increase their traffic by knowingly circulating false stories. Fake news websites have promoted misleading or factually incorrect information concerning the politics of several countries, including Germany, Indonesia and the Philippines, Sweden, China, Myanmar, Italy, France, Brazil, Australia, India, and the United States. Many of the false news sites are hosted in Russia, Macedonia, Romania, and the U.S.

One Pan-European newspaper, The Local, described the proliferation of fake news as a form of psychological warfare.[1] The European Parliament's Committee on Foreign Affairs called attention to the problem in 2016 when it passed a resolution warning that the Russian government was using think tanks, "pseudo-news agencies" and "Internet trolls" as forms of propaganda and disinformation to weaken confidence in Western institutions.[2]

In 2015, the Swedish Security Service, Sweden's national security agency, issued a report concluding Russia was utilizing the tactic to inflame "splits in society" through the proliferation of propaganda.[1] Sweden's Ministry of Defence tasked its Civil Contingencies Agency to combat fake news from Russia.[1] Fraudulent news affected politics in Indonesia and the Philippines, where there was simultaneously widespread usage of social media and limited resources to check the veracity of political claims.[3] German Chancellor Angela Merkel warned of the societal impact of "fake sites, bots, trolls".[4]

Fraudulent articles spread through social media during the 2016 U.S. presidential election.[5][6][7] Several officials within the United States Intelligence Community said that Russia was engaged in spreading fake news.[8] Computer security company FireEye concluded Russia used social media as cyberwarfare.[9] Google and Facebook banned fake sites from using online advertising.[10][11] U.S. President Barack Obama said a disregard for facts created a "dust cloud of nonsense".[12] Concern advanced bipartisan legislation in the U.S. Senate to authorize U.S. State Department action against foreign propaganda.[13] U.S. Senate Intelligence Committee member Ron Wyden said: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[13]

Prominent sources

Prominent among fraudulent news sites are those featuring false propaganda created by individuals in Russia,[14][2] Macedonia,[15][16] Romania,[17] and the United States.[18][19] Many of these websites are structured to deceive visitors into believing they are legitimate publications, mimicking the stylistic appearance of ABC News and MSNBC, while other pages are outright propaganda.[16]

Macedonia

The town of Veles in Macedonia
Fraudulent news stories during the 2016 U.S. election were traced by a BuzzFeed News investigation to teenagers in the town of Veles, Macedonia.

A significant amount of fraudulent news during the 2016 United States election cycle came from teenagers in Macedonia seeking to profit quickly from readers who believed their falsehoods.[15][20] An investigation by BuzzFeed revealed that over 100 websites spreading fraudulent articles supportive of Donald Trump were created by teenagers in the town of Veles, Macedonia.[21][22] The Macedonian teenagers experimented with fraudulent news about Bernie Sanders and other articles with a politically left or liberal slant; they quickly found that their most popular fraudulent writings were about Donald Trump.[21]

The Guardian performed its own independent investigation, reached the same conclusion as BuzzFeed News, and traced over 150 fraudulent news sites to the same town of Veles, Macedonia. One of the Macedonian teenagers, "Alex", was interviewed by The Guardian during the election cycle in August 2016 and said that regardless of whether Trump won or lost the election, fraudulent news websites would remain profitable. He explained that he often began his pieces by plagiarizing, copying and pasting content directly from other websites.[15]

One of the investigative journalists who exposed the ties between fraudulent websites and Macedonian teenagers, Craig Silverman of BuzzFeed News, told Public Radio International that false stories could net the Balkan teenagers a few thousand dollars per day, while their fake articles earned them on average a few thousand dollars per month.[23] Public Radio International reported that after the 2016 election season the teenagers from Macedonia would likely turn back to making money off fraudulent medical advice websites, which Silverman noted was where most of the youths had previously garnered clickbait revenue.[23]

The Associated Press tracked down an 18-year-old in Veles, Macedonia, and interviewed him about his tactics, meeting the fake news website operator at Gemdidzii Sports Hall in Veles. The teenager used Google Analytics to assess his web traffic and, over the course of one week, garnered 650,000 views. He regularly copied and pasted stories favorable to Donald Trump from a right-wing site called The Political Insider. The 18-year-old told the Associated Press he did not personally care for politics and was producing fake news merely to earn extra income and to gain experience in his chosen field of marketing. He said the onus should be on consumers to check information: "They can read it if they want to. I’m not the one pushing them to click on the article."[24] The Associated Press used DomainTools to confirm the teenager was behind several fake news websites, and to determine that approximately 200 websites focused on U.S. news were traced to Veles, Macedonia. The Associated Press reported that the majority of the fake news sites were composed of plagiarized material. In Veles, a town of 50,000, the local populace did not object to the additional income brought in by the fake news sites. Petar Peckov, a local reporter in the area, told the Associated Press the townspeople were happy the youths were working.[24]

Romania

"Ending the Fed", a site set up in March 2016 by Ovidiu Drobota reached over 3 million visitors a month according to Alexa Internet. At the time of the site's launch, Drobota was a 24-year-old Romanian web developer specialized in search engine optimization. The site propagated a false story in August 2016 about FOX News firing journalist Megyn Kelly. Facebook removed the article from its "Trending News" manually once it became clear the story was bogus. "Ending the Fed" held four out of the 10 most popular fake articles on Facebook related to the 2016 U.S. election in the three months prior to the election. Their associated Facebook page had 350,000 followers in November 2016.[17]

Russia

For more details on this topic, see Russian propaganda and Cyberwarfare by Russia.

Internet Research Agency

Beginning in fall 2014, The New Yorker writer Adrian Chen conducted a six-month investigation into Russian propaganda campaigns on the Internet orchestrated by a group that called itself the Internet Research Agency.[25] Evgeny Prigozhin, a close associate of Vladimir Putin, was behind the operation, which hired hundreds of individuals to work in Saint Petersburg supporting Russian government views online.[25]

The Internet Research Agency came to be regarded as a "troll farm", a term for propaganda operations that control many online accounts in order to create the artificial appearance of a grassroots organization.[25] Chen reported that the Russian government adopted Internet trolling as a tactic largely after observing the organic social media organization of the 2011 protests against Putin.[25]

Chen interviewed reporters and political activists in Russia, who told him that the Russian government's end goal in using fake news was not to persuade readers that it was factual, but simply to sow discord and chaos online.[25] Chen wrote: "The real effect, the Russian activists told me, was not to brainwash readers but to overwhelm social media with a flood of fake content, seeding doubt and paranoia, and destroying the possibility of using the Internet as a democratic space."[25]

EU regulation of Russian fake news

Building of the European Union's Committee on Foreign Affairs
The European Parliament's Committee on Foreign Affairs drew greater attention to the problem when it passed a resolution in November 2016 condemning "pseudo-news agencies ... social media and internet trolls" used by Russia.

In 2015, the Organization for Security and Co-operation in Europe released an analysis highly critical of disinformation campaigns by Russia designed to appear as legitimate news reporting.[26] These Russian propaganda campaigns were intended to interfere with Ukraine's relations with Europe after the removal of former Ukrainian president Viktor Yanukovych from power.[26] According to Deutsche Welle, "The propaganda in question employed similar tactics used by fake news websites during the U.S. elections, including misleading headlines, fabricated quotes and misreporting".[26] This propaganda motivated the European Union to create a special taskforce to deal with disinformation campaigns originating from Russia.[2][26][27]

Foreign Policy reported that the taskforce, called the East StratCom Team, "employs 11 mostly Russian speakers who scour the web for fake news and send out biweekly reviews highlighting specific distorted news stories and tactics."[28] The European Union voted to increase funding for the taskforce in November 2016.[28]

Deutsche Welle noted: "Needless to say, the issue of fake news, which has been used to garner support for various political causes, poses a serious danger to the fabric of democratic societies, whether in Europe, the U.S. or any other nation across the globe."[26]

In November 2016, the European Parliament Committee on Foreign Affairs passed a resolution warning of Russia's use of tools including "pseudo-news agencies ... social media and internet trolls" as forms of propaganda and disinformation in an attempt to weaken democratic values.[2] The resolution called on media analysts within the European Union to investigate, citing "the limited awareness amongst some of its member states, that they are audiences and arenas of propaganda and disinformation."[2] The resolution condemned Russian sources for publicizing "absolutely fake" news reports, and it passed on 23 November 2016 by a margin of 304 votes to 179.[29]

Observations

Gleb Pavlovsky, who helped create a propaganda program for the Russian government prior to 2008, told The New York Times in August 2016: "Moscow views world affairs as a system of special operations, and very sincerely believes that it itself is an object of Western special operations. I am sure that there are a lot of centers, some linked to the state, that are involved in inventing these kinds of fake stories."[30]

Anders Lindberg, a Swedish attorney and reporter, explained a common pattern of fake news distribution: "The dynamic is always the same: It originates somewhere in Russia, on Russia state media sites, or different websites or somewhere in that kind of context. Then the fake document becomes the source of a news story distributed on far-left or far-right-wing websites. Those who rely on those sites for news link to the story, and it spreads. Nobody can say where they come from, but they end up as key issues in a security policy decision."[30]

Counter-Disinformation Team

Logo of the United States Department of State
The United States Department of State spent eight months creating a unit to counter Russian disinformation campaigns against the U.S. before scrapping the program in September 2015.

The International Business Times reported that the United States Department of State had planned to use a unit formed to fight back against disinformation from the Russian government, but disbanded it in September 2015 after department heads failed to foresee the threat the propaganda would pose in the months immediately prior to the 2016 U.S. presidential campaign.[31] The State Department had put eight months of work into developing the counter-disinformation unit before deciding to scrap it.[31]

Titled the Counter-Disinformation Team, the program would have been a reboot of the Active Measures Working Group set up by the Reagan Administration, which previously operated under the auspices of the U.S. State Department and the United States Information Agency. The Counter-Disinformation Team was set up under the Bureau of International Information Programs of the U.S. State Department, and work on it began in the Obama Administration in 2014. Its intention was to combat propaganda from Russian sources such as Russia Today. A beta version of its website was ready to go live, and several staff members had been hired by the U.S. State Department, prior to the program's cancellation. United States Intelligence Community officials explained to former National Security Agency analyst and counterintelligence officer John R. Schindler that the Obama Administration decided to cancel the Counter-Disinformation Team because it was afraid of antagonizing the Russian government.[32][33]

Under Secretary of State for Public Diplomacy and Public Affairs Richard Stengel was the point person at the U.S. State Department for the Counter-Disinformation Team before it was canceled.[32][33] Stengel had prior experience with the matter, having written publicly for the U.S. State Department about the disinformation campaign by the Russian government and Russia Today.[34] After United States Secretary of State John Kerry called Russia Today a "propaganda bullhorn" for Russian president Vladimir Putin,[35] Russia Today insisted that the State Department give an "official response" to Kerry's statement.[34][36] In his response, Stengel wrote for the U.S. State Department that Russia Today engaged in a "disinformation campaign".[34][36] Stengel spoke out against the spread of fake news and explained the difference between reporting and propaganda: "Propaganda is the deliberate dissemination of information that you know to be false or misleading in order to influence an audience."[34][36]

A representative for the U.S. State Department, contacted regarding the closure of the Counter-Disinformation Team, told the International Business Times in a statement: "The United States, like many other countries throughout Europe and the world, has been concerned about Russia's intense propaganda and disinformation campaigns. We believe the free flow of reliable, credible information is the best defense against the Kremlin's attack on the truth."[31]

Peter Kreko of the Hungary-based Political Capital Institute spoke to the International Business Times about his work studying the Russian government's disinformation initiatives, and said: "I do think that the American [Obama] administration was caught not taking the issue seriously enough and there were a lot more words than action."[31] Kreko recounted that employees within the U.S. government told him they were exasperated by the "lack of strategy, efficiency and lack of taking it seriously" regarding the Russian government's information warfare against the United States.[31]

Further role in 2016 U.S. presidential election

Adrian Chen noticed an odd trend in December 2015 where pro-Russia accounts suddenly became supportive of Donald Trump during the 2016 election.

In December 2015, Adrian Chen observed a strange pattern: online accounts he had been monitoring as supportive of Russia had suddenly become highly supportive of 2016 U.S. presidential candidate Donald Trump.[14] Writer Andrew Weisburd and Clint Watts, a Foreign Policy Research Institute fellow and senior fellow at the Center for Cyber and Homeland Security at George Washington University,[37] wrote for The Daily Beast in August 2016: "Fake news stories from Kremlin propagandists regularly become social media trends."[14] Weisburd and Watts documented how a disinformation campaign spread from Russia Today and Sputnik News, "the two biggest Russian state-controlled media organizations publishing in English", to pro-Russian accounts on Twitter.[14] Prior to the election, U.S. national security officials told BuzzFeed News they were more anxious about Russia tampering with U.S. news than hacking the election itself.[38]

Citing Adrian Chen's prior research, Weisburd and Watts compared the tactics used by Russia during the 2016 U.S. election to those previously utilized by the Soviet Union against the U.S. during the Cold War.[14] They referenced the 1992 United States Information Agency report to the United States Congress, which warned about Russian propaganda campaigns called active measures.[14] Weisburd and Watts concluded these active measures became much easier for intelligence agents to carry out with the advent of social media.[14] Mark Galeotti, a senior research fellow at the Institute of International Relations Prague and scholar on Russian intelligence, agreed the Kremlin operations were a form of active measures.[8] The Guardian reported in November 2016 that the most strident Internet promoters of Trump were not U.S. citizens but paid Russian propagandists.[39] The paper estimated there were several thousand trolls engaged in the operation, and that their primary topics included promoting Trump and Putin and criticizing President Obama.[39]

Weisburd and Watts collaborated with colleague J. M. Berger and published a follow-up to their Daily Beast article in the online magazine War on the Rocks, titled "Trolling for Trump: How Russia is Trying to Destroy Our Democracy".[37][40] The three writers researched 7,000 social media accounts that promoted Donald Trump over a two-and-a-half-year period.[40] Their research detailed the techniques Internet trolls used to degrade the reputation of critics of Russian activities in Syria and to proliferate falsehoods about Hillary Clinton's health.[40] Watts explained the War on the Rocks analysis to CNN, saying the Russian propaganda effort targeted the alt-right movement, individuals from right-wing politics, and fascist groups.[37]

BuzzFeed News reported that Internet trolls financed by the Kremlin were open about their authorship and spread of fake news.[38] After each presidential debate, tens of thousands of Twitter bots promoted hashtags including #Trumpwon in an attempt to change online perceptions.[38] The Federal Bureau of Investigation released a statement to BuzzFeed News stating it was investigating the propaganda.[38] United States Intelligence Community officials told BuzzFeed News they believed the Russian government was engaged in spreading fake news.[8]

The United States Intelligence Community devoted resources to debating why Vladimir Putin chose the summer of 2016 to escalate active measures aimed at influencing domestic U.S. politics.[41] Director of National Intelligence James R. Clapper said that after the 2011–13 Russian protests, Putin's confidence in his long-term viability as a politician was damaged, and he decided to respond with the propaganda intelligence operation.[41] Former Central Intelligence Agency case officer Patrick Skinner explained that the true goal of the propaganda operation was to spread uncertainty, regardless of whether or not a particular fake statement had been debunked.[42] Aric Toler, an investigative analyst at Bellingcat, explained that fact-checking could simply draw further attention to the fake news.[42]

David DeWalt, the chairman of computer security company FireEye
David DeWalt, chairman of computer security company FireEye, concluded that the intelligence operation during the 2016 U.S. election by the Russian government was a new development in cyberwarfare by Russia.

U.S. Congressman Adam Schiff, Ranking Member of the House Permanent Select Committee on Intelligence, commented on Putin's aims and said the U.S. intelligence agencies were significantly concerned with Russian propaganda in the U.S.[41] Speaking about online disinformation that had appeared in Hungary, Slovakia, the Czech Republic, and Poland, Schiff said there had been an increase in the same behavior in the U.S.[41] Schiff concluded Russian propaganda intelligence operations would likely continue against the U.S. after the election.[41]

On 24 November 2016, The Washington Post reported that members of the Foreign Policy Research Institute[lower-alpha 1] had stated Russian propaganda during the election helped foment criticism of Hillary Clinton and support for Donald Trump.[5][6][7] The strategy involved social media users, Internet trolls working for hire, botnets, and organized websites in order to cast Clinton in a negative light.[5][6][7] Clint Watts, who monitored Russian propaganda, stated that its tactics were similar to Cold War era strategies applied to social media and that Russia's goal was to decrease trust in the U.S. government.[5] Watts's research, along with that of colleagues Andrew Weisburd and J.M. Berger, was published in November 2016.[5] These conclusions were confirmed by prior research from the Elliott School of International Affairs at George Washington University and by the RAND Corporation.[5] The Nation editor Katrina vanden Heuvel opined that the "hysteria being drummed up around Putin's alleged intervention in the campaign" was overblown, arguing that it was the broken U.S. electoral system, rather than propaganda from afar, that decided the election.[44]

In the same article, The Washington Post reported that the previously unknown group PropOrNot[lower-alpha 2] came to similar conclusions about involvement by Russia in propagating fake news during the 2016 U.S. election.[5][6] The Washington Post and PropOrNot received criticism from The Intercept,[46] Fortune,[43] Rolling Stone,[47] AlterNet,[48] Adrian Chen at The New Yorker,[45] and in an opinion piece in the paper itself, written by Katrina vanden Heuvel.[44]

Ari Shapiro of the National Public Radio program All Things Considered interviewed Washington Post journalist Craig Timberg, who explained that a massive number of botnets and paid Internet trolls were being used to increase the spread of fake news online.[49] Timberg said there were thousands of social media accounts working for Russia that functioned as a "massive online chorus".[49] Timberg stated Russia had a vested interest in the 2016 U.S. election due to its dislike of Hillary Clinton over the 2011–13 Russian protests.[49]

Bloomberg News reported that computer security company FireEye concluded the Russian government utilized social media as a weapon to influence perspectives on the U.S. election.[9] FireEye Chairman David DeWalt told Bloomberg News the 2016 intelligence operation by the Russian government was a new development in cyberwarfare by Russia.[9] FireEye CEO Kevin Mandia stated the tactics of Russian propaganda cyberwarfare changed significantly after fall 2014, shifting from covert computer hacking to more overt tactics with decreased concern for operational security or for being publicly revealed as an intelligence operation.[9]

United States

Homepage of the fake news website RealTrueNews, which states on its main page: "Everything on RealTrueNews Was A LIE".
RealTrueNews was intended to expose reader gullibility; its fiction was widely believed to be factual.[50][51][52]

U.S. News & World Report warned readers to be wary of popular fraudulent news sites composed of either outright hoaxes or propaganda, and recommended the website Fake News Watch for a listing of such problematic sources.[53]

Marco Chacon created the fake news website RealTrueNews to show his alt-right friends "how ridiculous" their gullibility toward such websites was.[50][51] In one story, Chacon wrote a fake transcript of Hillary Clinton's leaked speeches in which Clinton explains bronies to Goldman Sachs bankers.[50][51] Chacon was shocked when his fake article was treated as factual by Fox News and he heard his own creation on The Kelly File, hosted by Megyn Kelly.[50][51] Trace Gallagher repeated Chacon's story word for word, saying Clinton had called Bernie Sanders supporters a "bucket of losers", a phrase made up and written by Chacon himself.[50] Megyn Kelly apologized after emphatic denials from representatives for Hillary Clinton.[50][51][52]

After his fabricated stories were believed as factual and shared and viewed tens of thousands of times, Chacon told Brent Bambury of the CBC Radio One program Day 6 that he was so shocked at Internet consumers' ignorance that he felt it was like an episode of The Twilight Zone.[52] In an interview with ABC News, Chacon defended his site, saying it was only an over-the-top parody of other fake news sites meant to show how ridiculous they were: "The only way I could think of to have a conversation with these people is to say, 'if you have a piece of crazy fake news, look I got one too, and it’s even crazier, it’s absurd.'"[54]

Jestin Coler, from Los Angeles, is the founder and CEO of Disinfomedia, a company that owns many fake news websites. He had previously given interviews to multiple media organizations about fake news under a pseudonym, Allen Montgomery, in order to evade personal scrutiny.[18] With the help of tech-company engineer John Jansen, journalists from NPR uncovered Coler's identity, and after being identified as Disinfomedia's owner, Coler agreed to an interview.[18] Coler explained how the original intent of his project backfired: "The whole idea from the start was to build a site that could kind of infiltrate the echo chambers of the alt-right, publish blatantly false or fictional stories and then be able to publicly denounce those stories and point out the fact that they were fiction."[18] He stated his company attempted to write fraudulent reports from a left-wing perspective, but found those articles were not shared nearly as much as fake news from a right-wing point of view.[18] Coler told NPR that consumers of information must be more skeptical of content in order to combat fake news: "Some of this has to fall on the readers themselves. The consumers of content have to be better at identifying this stuff. We have a whole nation of media-illiterate people. Really, there needs to be something done."[18]

Paul Horner, a creator of fraudulent news stories, stated in an interview with The Washington Post that he was making approximately US$10,000 a month through advertisements linked to the fraudulent news.[19][55][56] He claimed to have posted a fraudulent advertisement to Craigslist offering thousands of dollars in payment to protesters, and to have written a story based on this which was later shared online by Trump's campaign manager.[19][55][56] Horner believed that when the stories were shown to be false, this would reflect badly on Trump's supporters who had shared them, but concluded "Looking back, instead of hurting the campaign, I think I helped it. And that feels [bad]."[57]

In a follow-up interview with Rolling Stone, Horner revealed that The Washington Post profile piece on him spurred greatly increased interest with over 60 interview requests from media including ABC News, CBS News, and Inside Edition.[58] Horner explained that his writing style was such that articles appeared legitimate at the top and became increasingly couched in absurdity as the reader progressed: "Most of my stuff, starts off, the first paragraph is super legit, the title is super legit, the picture is super legit, but then the story just gets more and more ridiculous and it becomes obvious that none of it is true."[58] Horner told Rolling Stone that he always placed his name as a fictional character in his fake articles.[58] He said he supported efforts to decrease fake news websites.[58]

Impacts by country

Fake news has influenced political discourse in multiple countries, including Germany,[4] Indonesia and the Philippines,[3] Sweden,[1] China,[59][60] Myanmar,[61][62] and the United States.[14]

Australia

Australia was plagued by fake stories shared as truth on Facebook, especially false news about Muslim religious practices in the country.[63] One group prominent on Facebook in the country focused on opposing halal, the Islamic rules regarding religious dietary restrictions.[63] The "Boycott Halal in Australia" group had about 100,000 members on its Facebook page in 2016.[63] The group publicized a satirical newspaper report in November 2014 and passed it off as truth.[63] Another page, for proponents of the Q Society, which refers to itself as "Australia's leading Islam-critical movement", frequently posted baseless fake statements.[63]

Brazil

Brazil faced increasing influence from fake news after the 2014 re-election of president Dilma Rousseff and her subsequent impeachment in August 2016.[63] BBC Brazil reported in April 2016 that sixty percent of the most-shared articles on Facebook about the impeachment proceedings against Rousseff were fake.[63]

In 2015, reporter Tai Nalon resigned from her position at Brazilian newspaper Folha de S Paulo in order to start the first fact-checking website in Brazil, called Aos Fatos (To The Facts).[63]

Nalon told The Guardian: "There is a lot of false news, but I would be cautious about saying the problem is similar to what happens in the USA."[63]

China

The government of China used the growing problem of fake news as a rationale for increasing Internet censorship in China in November 2016.[64] China took the opportunity to publish an editorial in its Communist Party newspaper The Global Times called "Western Media's Crusade Against Facebook", which criticized the "unpredictable" political problems posed by the freedoms enjoyed by users of Twitter, Google, and Facebook.[59] Chinese government leaders meeting in Wuzhen at the third World Internet Conference in November 2016 said fake news in the U.S. election justified adding more curbs to free and open use of the Internet.[60] Ren Xianliang, deputy minister at the Cyberspace Administration of China, said increasing online participation led to additional "harmful information" and that "intimidation and fraud are more common than ever".[65] Kam Chow Wong, a former Hong Kong law enforcement official and criminal justice professor at Xavier University, said at the conference: "It's a good move that the U.S. is trying to regulate social media; it’s overdue."[66] The Wall Street Journal noted that China's themes of Internet censorship became more relevant at the World Internet Conference due to the outgrowth of fake news: "China’s efforts to promote its concept of the internet had fresh resonance as Western minds now debate whether social media sites should screen out fake news".[67] Fake news from the 2016 U.S. election also spread to China, where articles popularized in the United States were translated into Chinese and shared within the country.[63]

France

France saw an uptick in disinformation and propaganda, primarily during election cycles.[63] Le Monde's fact-checking division, "Les décodeurs", was headed by Samuel Laurent, who told The Guardian in December 2016: "I think the French presidential election campaign [next spring] will be fraught with this type of thing."[63]

The country also faced controversy over fake websites providing false information about abortion.[63] The government's lower parliamentary body moved forward with intentions to ban such fake sites.[63] Laurence Rossignol, women's minister for France, informed parliament that although the fake sites "appear neutral and objective", in reality they were "deliberately seeking to trick women".[63]

Germany

German Chancellor Angela Merkel lamented the problem of fraudulent news reports in a November 2016 speech, days after announcing her campaign for a fourth term as leader of her country.[4] In a speech to the German parliament, Merkel was critical of such fake sites: "Something has changed -- as globalisation has marched on, (political) debate is taking place in a completely new media environment. Opinions aren't formed the way they were 25 years ago. Today we have fake sites, bots, trolls -- things that regenerate themselves, reinforcing opinions with certain algorithms and we have to learn to deal with them."[4] She warned that such fraudulent news websites were a force increasing the power of populist extremism.[4] Merkel called fraudulent news a growing phenomenon that might need to be regulated in the future.[4]

Bruno Kahl, chief of Germany's foreign intelligence agency, the Federal Intelligence Service, warned of the potential for cyberattacks by Russia in the 2017 German election.[68] He said the cyberattacks would take the form of the intentional spread of misinformation.[68] Kahl said the goal was to "elicit political uncertainty".[68] Hans-Georg Maassen, chief of Germany's domestic intelligence agency, the Federal Office for the Protection of the Constitution, said: "The information security of German government, administrative, business, science and research institutions is under permanent threat. ... Russian intelligence agencies are also showing a readiness to [carry out] sabotage."[68]

India

India had over 50 million accounts on the mobile messaging application WhatsApp in 2016.[63] After the country's prime minister announced in November 2016 that a 2,000-rupee currency note would be introduced, fake news went viral over WhatsApp claiming the note came equipped with spying technology that could track bills up to 120 meters below the earth.[63] India's reserve bank refuted the false claims, but not before they had spread to the country's mainstream news outlets.[63] Prabhakar Kumar of the Indian media research agency CMS told The Guardian: "Mainstream media in India is more impacted by the phenomena [of fake news] because they broadcast these kinds of stories without verifying. There is no standard policy for TV news and newspapers about the process of researching and publishing stories."[63]

Law enforcement officers in India arrested individuals on charges of creating fictitious articles, predominantly where the articles were likely to inflame societal conflict.[63] The country warned administrators of WhatsApp groups that they could be held liable for the proliferation of fake news.[63]

Indonesia and Philippines

Fraudulent news has been particularly problematic in Indonesia and the Philippines, where social media has an outsized political influence.[3] According to media analysts, "many developing countries with populations new to both democracy and social media" are particularly vulnerable to the influence of fraudulent news.[3] In some developing countries, "Facebook even offers free smartphone data connections to basic public online services, some news sites and Facebook itself — but limits access to broader sources that could help debunk fake news."[3]

Italy

Between October 1 and November 30, 2016, ahead of the Italian constitutional referendum, five of the ten referendum-related stories with the most engagement on social media (shares, likes, and comments on Facebook, plus shares on Twitter, LinkedIn, and Google+) were hoaxes or carried a misleading title.[70] Of the three stories with the most social media engagement, two were fake.[70]

Prime Minister of Italy Matteo Renzi met with U.S. President Barack Obama and with leaders of European nations at a meeting in Berlin, Germany in November 2016, and privately spoke with them about the pervasive problem of fake news.[69]

Propaganda grew more pervasive in advance of the constitutional referendum scheduled for 4 December 2016.[63] The influence became so problematic that a senior adviser to Matteo Renzi filed a defamation complaint against an anonymous Twitter user with the screenname "Beatrice di Maio".[63] Cyberwarfare propaganda against Renzi increased before the referendum date, and the Italian newspaper La Stampa drew attention to false reporting by Russia Today, which wrongly claimed that a rally in Rome held in support of Renzi was actually against him.[63]

The Hollywood Reporter and The New York Times reported on the Five Star Movement (M5S), an Italian political party founded by Beppe Grillo, which was said to manage a consortium of fake news websites amplifying Russian news sources, propaganda, and inflammatory conspiracy theories.[69][71] The Hollywood Reporter noted that the Five Star Movement's site TzeTze had 1.2 million fans on Facebook and regularly shared fake news articles and pieces supportive of Vladimir Putin, primarily cited to Russian government-owned news sources including Sputnik News.[71] TzeTze often plagiarized the Russian source, copying article titles and content directly from Sputnik News and re-posting them on its site.[72]

BuzzFeed News investigative journalists tracked TzeTze, another site critical of Renzi called La Cosa, and a blog by Beppe Grillo to the same technology company, Casaleggio Associati, which was started by Five Star Movement co-founder Gianroberto Casaleggio.[71] These Five Star Movement-controlled sources cross-post each other's articles and thereby amplify their reach to a wider audience.[72] Casaleggio's son Davide Casaleggio owns and manages TzeTze and La Cosa, in addition to the medical website La Fucina, which markets anti-vaccine conspiracy theories and medical cure-alls.[72] BuzzFeed News reporting discovered that the blog by Grillo, the websites for the Five Star Movement, and all the fake news sites operated by the party use the same IP addresses, in addition to identical Google Analytics and Google AdSense accounts.[72]

A former Google AdSense staff member analyzed the BuzzFeed News investigation and compared the network of fake news sites run by the Five Star Movement to the pro-Trump fake news sites BuzzFeed News had previously traced to one town in Macedonia.[72] The former staff member stated: "M5S talks a lot about transparency, but then as part of my job I realised that they are making so much money off this thing. When you look online there is no transparency about the amount of money they make with the blog and the sites. It’s all so mixed up. The leaders of the party are making money with a fake news aggregator. It’s like if Trump owned the Macedonian sites."[72]

In October 2016, the Five Star Movement disseminated a video from Kremlin-aligned Russia Today that falsely claimed to show thousands of people protesting the referendum scheduled for 4 December 2016; in fact, the video, which went on to receive 1.5 million views, showed people who supported the referendum.[71][72] According to BuzzFeed News, the fake news sites run by the Five Star Movement profit financially from the spread of such disinformation.[72]

President of the Italian Chamber of Deputies Laura Boldrini stated: "Fake news is a critical issue and we can’t ignore it. We have to act now."[69] Boldrini met on 30 November 2016 with Richard Allan, Facebook's vice president of public policy in Europe, to voice her concerns about the spread of fake news.[69] She said Facebook needed to admit that it functioned, for all intents and purposes, as a media organization: "They can’t pretend that they are just a platform. They are giant media companies."[69]

Myanmar

Fake news negatively affected individuals in Myanmar, leading to a rise in violence against Muslims in the country.[61][62] Online participation surged from one percent to 20 percent of Myanmar's total populace between 2014 and 2016.[61][62] Fake stories from Facebook grew so influential that, for those without Internet access, they were reprinted in paper periodicals called Facebook and The Internet, which simply regurgitated the website's newsfeed text, often without factual oversight.[62] False reporting about practitioners of Islam in the country correlated directly with increased attacks on Muslims in Myanmar and with protests against them.[61][62]

BuzzFeed News journalist Sheera Frenkel reported: "there has also been an increase in articles that demonize the country’s minority Muslim community, with fake news claiming that vast hordes of Muslim worshippers are attacking Buddhist sites. These articles, quickly shared and amplified on social media, have correlated with a surge in anti-Muslim protests and attacks on local Muslim groups."[61][62] Frenkel noted countries that were relatively newer to Internet exposure were more susceptible to the problem, writing: "Countries like Myanmar, which come online quickly and without many government-backed programs to teach safe internet habits — like secure passwords and not revealing personal details online — rank among the lowest in digital literacy. They are the most likely to fall for scams, hacks, and fake news."[62]

Sweden

Logo of the Swedish Security Service
The Swedish Security Service issued a report in 2015 identifying propaganda from Russia whose goal was to "create splits in society."

The Swedish Security Service issued a report in 2015 identifying propaganda from Russia infiltrating Sweden with the objective to "spread pro-Russian messages and to exacerbate worries and create splits in society."[1]

The Swedish Civil Contingencies Agency (MSB), part of the Ministry of Defence of Sweden, identified fake news reports targeting Sweden in 2016 which originated from Russia.[1] Swedish Civil Contingencies Agency official Mikael Tofvesson stated: "This is going on all the time. The pattern now is that they pump out a constant narrative that in some respects is negative for Sweden."[1]

The Local identified these tactics as a form of psychological warfare.[1] The newspaper reported the MSB identified Russia Today and Sputnik News as "important channels for fake news".[1] As a result of growth in this propaganda in Sweden, the MSB planned to hire six additional security officials to fight back against the campaign of fraudulent information.[1]

United States

U.S. President Barack Obama
U.S. President Barack Obama said, "If we can't discriminate between serious arguments and propaganda, then we have problems."

Fraudulent stories popularized on Facebook during the 2016 U.S. presidential election included a viral post claiming that Pope Francis had endorsed Donald Trump, and another claiming that actor Denzel Washington "backs Trump in the most epic way possible".[73]

Donald Trump's son and campaign surrogate Eric Trump, top national security adviser Michael T. Flynn, and then-campaign managers Kellyanne Conway and Corey Lewandowski shared fake news stories during the campaign.[57][74][75][76]

In a speech the day before Election Day in 2016, U.S. President Barack Obama commented on the significant problem of fraudulent information on social networks impacting elections: "The way campaigns have unfolded, we just start accepting crazy stuff as normal. And people, if they just repeat attacks enough and outright lies over and over again, as long as it’s on Facebook, and people can see it, as long as it’s on social media, people start believing it. And it creates this dust cloud of nonsense."[12][77]

Shortly after the election, Obama again commented on the problem, saying in an appearance with German Chancellor Angela Merkel: "If we are not serious about facts and what’s true and what's not, and particularly in an age of social media when so many people are getting their information in sound bites and off their phones, if we can't discriminate between serious arguments and propaganda, then we have problems."[75][78]

One prominent fraudulent news story released after the election, claiming that protesters at anti-Trump rallies in Austin, Texas, were "bused in", started as a tweet by one individual with 40 Twitter followers.[79] Over the next three days, the tweet was shared at least 16,000 times on Twitter and 350,000 times on Facebook, and promoted in the conservative blogosphere, before the individual stated that he had fabricated his assertions.[79]

BuzzFeed called the problem an "epidemic of misinformation".[22] According to BuzzFeed's analysis, the 20 top-performing election news stories from fraudulent sites generated more shares, reactions, and comments on Facebook than the 20 top-performing stories from 19 major news outlets.[80][81]

Howard Kurtz, host of the Fox News media-analysis television program Media Buzz, acknowledged fraudulent news was a serious problem.[81] Kurtz relied heavily upon the BuzzFeed analysis for his reporting on the controversy, writing that "Facebook is polluting the media environment with garbage".[81] Citing the BuzzFeed investigation, Kurtz pointed out: "The legit stuff drew 7,367,000 shares, reactions and comments, while the fictional material drew 8,711,000 shares, reactions and comments."[81] Kurtz concluded Facebook founder Mark Zuckerberg must admit the website is a media company: "But once Zuckerberg admits he’s actually running one of the most powerful media brands on the planet, he has to get more aggressive about promoting real news and weeding out hoaxers and charlatans. The alternative is to watch Facebook’s own credibility decline."[81]

Worries grew that fake news spread by the Russian government had swayed the outcome of the election, and representatives in the U.S. Congress took action to safeguard the national security of the United States by advancing legislation to monitor incoming propaganda from external threats.[13][82] On 30 November 2016, legislators approved a measure within the National Defense Authorization Act asking the U.S. State Department to take action against foreign propaganda through an interagency panel.[13][82] The legislation authorized funding of $160 million over a two-year period.[13]

The initiative was developed through a bipartisan bill written in March 2016 by U.S. Senators Chris Murphy and Rob Portman, titled the Countering Foreign Propaganda and Disinformation Act.[13] Portman stated: "This propaganda and disinformation threat is real, it’s growing, and right now the U.S. government is asleep at the wheel. The U.S. and our allies face many challenges, but we must better counter and combat the extensive propaganda and disinformation operations directed against us."[13] Murphy was interviewed by The Washington Post about the legislation and said: "In the wake of this election, it’s pretty clear that the U.S. does not have the tools to combat this massive disinformation machinery that the Russians are running."[13] United States Senate Select Committee on Intelligence member Senator Ron Wyden told The Washington Post: "There is definitely bipartisan concern about the Russian government engaging in covert influence activities of this nature."[13]

Members of the United States Senate Select Committee on Intelligence traveled to Ukraine and Poland in March 2016 and heard from officials in both countries on Russian operations to influence their affairs.[83] U.S. Senator Angus King told the Portland Press Herald that tactics used by Russia during the 2016 U.S. election cycle were analogous to those used against other countries as well.[83] King recalled: "We were told by various officials in both countries about the Russian standard practice of interfering with elections: planting fake news stories".[83] On 30 November 2016, King joined a letter in which seven members of the U.S. Senate Select Committee on Intelligence asked President Obama to publicize more information from the intelligence community on Russia's role in the U.S. election.[83][84] In an interview with CNN, Senator King warned against ignoring the problem: "I don't consider this a partisan issue. We can't just let it go and say that's history because they will keep doing it."[85]

Response

Google CEO comment and actions

A screenshot of a fake news story, falsely claiming Donald Trump won the popular vote in the 2016 United States presidential election
A screenshot of a fake news story, falsely claiming Donald Trump won the popular vote in the 2016 United States presidential election.[86][87]
Google CEO Sundar Pichai
Google CEO Sundar Pichai has said there should be "no situation where fake news gets distributed" and that it is possible fake news had some effect on the 2016 election.

In the aftermath of the 2016 U.S. presidential election, Google, along with Facebook, faced increased scrutiny over the role of fake-news websites in the election.[88] The top Google search result for the outcome of the race linked to a fraudulent news site.[89] The site, "70 News", had written a false headline and article claiming that Donald Trump won the popular vote against Hillary Clinton in the 2016 U.S. election.[86][87][88] Regarding the false results posted on "70 News", Google later stated that the site's prominence in search results was a mistake: "In this case we clearly didn't get it right, but we are continually working to improve our algorithms."[90] By Monday, November 14, the "70 News" result was the second link shown when searching for results of the race.[88]

When asked shortly after the election whether fraudulent news sites could have changed the election's results, Google CEO Sundar Pichai responded: "Sure" and went on to emphasize the importance of stopping the spread of fraudulent news sites: "Look, it is important to remember this was a very close election and so, just for me, so looking at it scientifically, one in a hundred voters voting one way or the other swings the election either way. ... From our perspective, there should just be no situation where fake news gets distributed, so we are all for doing better here."[91]

On 14 November 2016, Google responded to the growing problem of fraudulent news sites by banning such companies from profiting on advertising from traffic to false articles through its advertising program AdSense.[10][11][88] The company already had a policy of denying ads for dieting scams and counterfeit merchandise.[92] Google stated upon the announcement: "We’ve been working on an update to our publisher policies and will start prohibiting Google ads from being placed on misrepresentative content. Moving forward, we will restrict ad serving on pages that misrepresent, misstate, or conceal information about the publisher, the publisher’s content, or the primary purpose of the web property."[93] This built upon Google's existing advertising policies, under which misleading advertising was already banned from Google AdSense.[88][94] The ban was not expected to apply to news satire sites like The Onion, although some satirical sites might be inadvertently blocked under the new system.[88]

Facebook deliberations

Blocking fraudulent advertisers

Facebook CEO Mark Zuckerberg
Facebook CEO Mark Zuckerberg specifically recommended fact-checking website Snopes.com as a way to respond to fraudulent news on Facebook.

Facebook made a similar move the day after Google acted, blocking fake news sites from advertising on its platform.[11][88] Facebook explained its new policy: "We do not integrate or display ads in apps or sites containing content that is illegal, misleading or deceptive, which includes fake news. ... We have updated the policy to explicitly clarify that this applies to fake news. Our team will continue to closely vet all prospective publishers and monitor existing ones to ensure compliance."[93] The steps by both Google and Facebook were intended to deny ad revenue to fraudulent news sites; neither company took action to prevent the dissemination of false stories in search engine results pages or web feeds.[10][95]

In a post to his website on the issue, Facebook CEO Mark Zuckerberg said the notion that fraudulent news sites impacted the 2016 election was a "crazy idea".[96][97] He rejected the suggestion that his website played any role in the outcome of the election, describing the idea that it might have done so as "pretty crazy".[98] In a blog post, he stated that more than 99% of content on Facebook was authentic (i.e. neither fake news nor a hoax).[99] In the same blog post, he stated: "News and media are not the primary things people do on Facebook, so I find it odd when people insist we call ourselves a news or media company in order to acknowledge its importance."[100] Separately, Zuckerberg advised Facebook users to check the fact-checking website Snopes.com whenever they encounter fake news on Facebook.[101][102]

Top staff members at Facebook did not feel that simply blocking ad revenue from these fraudulent sites was a strong enough response to the problem, and they created a secret group to deal with the issue themselves.[96][97] In response to Zuckerberg's statement that fraudulent news did not impact the 2016 election, the secret group disputed the idea: "It’s not a crazy idea. What’s crazy is for him to come out and dismiss it like that, when he knows, and those of us at the company know, that fake news ran wild on our platform during the entire campaign season."[96][97] BuzzFeed reported that the secret task force included "dozens" of Facebook employees.[96][97]

Response

Facebook faced mounting criticism in the days after its decision to revoke advertising revenue from fraudulent news providers without taking further action on the matter.[103][104] After a week of negative coverage in the media, including assertions that the proliferation of fraudulent news on Facebook had handed the 2016 U.S. presidential election to Donald Trump, Mark Zuckerberg published a second post on the issue on 18 November 2016.[103][104] The post was a reversal of his earlier comments, in which he had discounted the impact of fraudulent news.[104]

Zuckerberg said there was an inherent difficulty in attempting to filter out fraudulent news: "The problems here are complex, both technically and philosophically. We believe in giving people a voice, which means erring on the side of letting people share what they want whenever possible."[103] The New York Times reported that measures being considered but not yet implemented by Facebook included "third-party verification services, better automated detection tools and simpler ways for users to flag suspicious content."[103] The 18 November post did not announce any concrete actions the company would definitively take, or when such measures would formally be put into use on the website.[103][104]

Many people commented positively under Zuckerberg's second post on fraudulent news.[105] National Public Radio observed that the changes Facebook was considering to identify fraud marked progress for the company into a new medium: "Together, the projects signal another step in Facebook's evolution from its start as a tech-oriented company to its current status as a complex media platform."[105] On 19 November 2016, BuzzFeed advised Facebook users that they could report posts from fraudulent news websites.[106] Users could do so by choosing the report option "I think it shouldn't be on Facebook", followed by "It’s a false news story."[106]

In November 2016, Facebook began assessing the use of warning labels on fake news.[107] The rollout was at first only available to a few users in a testing phase.[107] A sample warning read: "This website is not a reliable news source. Reason: Classification Pending".[107] TechCrunch analyzed the new feature during the testing phase and surmised that it might have a tendency toward false positives.[107]

Impact

Fake news proliferation on Facebook had a negative financial impact on the company. The Economist reported: "Brian Wieser of Pivotal Research recently wrote that the focus on fake news and the concerns over the measurement of advertising could well cut revenue growth by a couple of percentage points."[108]

Shortly after Mark Zuckerberg's second statement on fake news proliferation on his website, The New York Times reported that Facebook would assist the government of China with a version of its software in the country that would allow increased censorship by the government.[109] Barron's newspaper contributor William Pesek was highly critical of this move, writing: "By effectively sharing its fake news problem with the most populous nation, Facebook would be a pawn of [China’s President Xi] Jinping's intensifying censorship push."[109]

Fact-checking websites and journalists

For more details on this topic, see FactCheck.org, PolitiFact.com, and Snopes.com.

Fact-checking websites play a role in debunking fraudulent news reports.[110][111][112] Such sites saw large increases in readership and web traffic during the 2016 U.S. election cycle.[110][111] FactCheck.org,[lower-alpha 3] PolitiFact.com,[lower-alpha 4] Snopes.com,[lower-alpha 5] and "The Fact Checker" section of The Washington Post[lower-alpha 6] are prominent fact-checking websites that played an important role in debunking fraud.[101][110][112][118] The New Yorker writer Nicholas Lemann wrote on how to address fake news and called for increasing the roles of FactCheck.org, PolitiFact.com, and Snopes.com in the age of post-truth politics.[119] CNN media analyst Brian Stelter wrote: "In journalism circles, 2016 is the year of the fact-checker."[110]

Logo of PolitiFact
Fact-checking website PolitiFact.com was praised by rival fact-checking service FactCheck.org and recommended as a resource for readers to check before sharing a potentially fake story.

By the close of the 2016 U.S. election season, the fact-checking websites FactCheck.org, PolitiFact.com, and Snopes.com had each authored guides on how to respond to fraudulent news.[120][118][121] FactCheck.org advised readers to check the source, author, date, and headline of publications.[118] They recommended their colleagues Snopes.com, The Washington Post Fact Checker, and PolitiFact.com as important resources to consult before re-sharing a potentially fraudulent story.[118] FactCheck.org also admonished consumers to be wary of their own biases when viewing media they agree with.[118] PolitiFact.com announced they would tag stories as "Fake news" so that readers could view all the fraudulent stories they had debunked.[121] Snopes.com warned readers: "So long as social media allows for the rapid spread of information, manipulative entities will seek to cash in on the rapid spread of misinformation."[120]

The Washington Post's "The Fact Checker" section, which is dedicated to evaluating the truth of political claims, greatly increased in popularity during the 2016 election cycle. Glenn Kessler, who runs the Post's "Fact Checker", wrote that "fact-checking websites all experienced huge surges in readership during the election campaign."[111] The Fact Checker had five times more unique visitors than during the 2012 cycle.[111] Kessler cited research showing that fact-checks are effective at reducing "the prevalence of a false belief."[111] Will Moy, director of Full Fact, a London-based fact-checking website, said that debunking must take place over a sustained period of time to be truly effective.[111] Full Fact began work to develop multiple products in a partnership with Google to help automate fact-checking.[122]

FactCheck.org former director Brooks Jackson remarked that larger media companies had devoted increased focus to the importance of debunking fraud during the 2016 election: "It's really remarkable to see how big news operations have come around to challenging false and deceitful claims directly. It's about time."[110] FactCheck.org began a new partnership with CNN journalist Jake Tapper in 2016 to examine the veracity of reported claims by candidates.[110]

Angie Drobnic Holan, editor of PolitiFact.com, noted the circumstances warranted support for the practice: "All of the media has embraced fact-checking because there was a story that really needed it."[110] Holan was heartened that fact-checking garnered increased viewership for those engaged in the practice: "Fact-checking is now a proven ratings getter. I think editors and news directors see that now. So that's a plus."[110] Holan cautioned that heads of media companies must strongly support the practice of debunking, as it often provokes hate mail and extreme responses from zealots.[110]

On 17 November 2016, the International Fact-Checking Network (IFCN) published an open letter on the website of the Poynter Institute to Facebook founder and CEO Mark Zuckerberg, imploring him to utilize fact-checkers in order to help identify fraud on Facebook.[112][123] Created in September 2015, the IFCN is housed within the St. Petersburg, Florida-based Poynter Institute for Media Studies and aims to support the work of 64 member fact-checking organizations around the world.[124][125] Alexios Mantzarlis, co-founder of FactCheckEU.org and former managing editor of Italian fact-checking site Pagella Politica, was named director and editor of IFCN in September 2015.[124][125] Signatories to the 2016 letter to Zuckerberg featured a global representation of fact-checking groups, including: Africa Check, FactCheck.org, PolitiFact.com, and The Washington Post Fact Checker.[112][123] The groups wrote they were eager to assist Facebook root out fraudulent news sources on the website.[112][123]

In his second post on the matter on 18 November 2016, Zuckerberg responded to the fraudulent news problem by suggesting the use of fact-checking websites.[101][102] He specifically identified the fact-checking website Snopes.com, and pointed out that Facebook monitors links to such debunking websites in reply comments as one method of determining which original posts are fraudulent.[101][102] Zuckerberg explained: "Anyone on Facebook can report any link as false, and we use signals from those reports along with a number of others — like people sharing links to myth-busting sites such as Snopes — to understand which stories we can confidently classify as misinformation. Similar to clickbait, spam and scams, we penalize this content in News Feed so it's much less likely to spread."[101][102]
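Zuckerberg's description amounts to a signal-combination approach: user reports and comment links to debunking sites are aggregated, and posts that score highly are demoted in the feed. The sketch below is a minimal, hypothetical illustration of that idea only; the signal names, weights, and threshold are assumptions for the example and do not describe Facebook's actual system.

```typescript
// Hypothetical sketch of combining "reported as false" and "debunk link" signals
// to decide whether a post should be demoted. Weights and threshold are made up.

const DEBUNK_DOMAINS = ["snopes.com", "politifact.com", "factcheck.org"]; // example list

interface PostSignals {
  falseNewsReports: number;      // users who reported the link as false
  totalViews: number;            // impressions, used to normalize report counts
  debunkLinksInComments: number; // comments linking to debunking sites
  totalComments: number;
}

// Count how many comment URLs point at a known debunking site.
function countDebunkLinks(commentUrls: string[]): number {
  return commentUrls.filter(url =>
    DEBUNK_DOMAINS.some(domain => url.includes(domain))
  ).length;
}

function misinformationScore(s: PostSignals): number {
  const reportRate = s.totalViews > 0 ? s.falseNewsReports / s.totalViews : 0;
  const debunkRate = s.totalComments > 0 ? s.debunkLinksInComments / s.totalComments : 0;
  return 0.6 * reportRate + 0.4 * debunkRate; // arbitrary weights for illustration
}

function shouldDemoteInFeed(s: PostSignals, threshold = 0.05): boolean {
  return misinformationScore(s) >= threshold;
}

// Example with made-up numbers: heavy reporting plus debunk links triggers demotion.
const example: PostSignals = {
  falseNewsReports: 800,
  totalViews: 10000,
  debunkLinksInComments: countDebunkLinks([
    "https://www.snopes.com/fact-check/some-claim/",
    "https://www.politifact.com/factchecks/some-claim/",
    "https://example.com/unrelated-comment-link",
  ]),
  totalComments: 40,
};
console.log(shouldDemoteInFeed(example)); // true
```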

Society of Professional Journalists president Lynn Walsh said in November 2016 that the society would reach out to Facebook in order to provide assistance with weeding out fake news.[126] Walsh said Facebook should evolve and admit that it functioned as a large media company: "The media landscape has evolved. Journalism has evolved, and continues to evolve. So I do hope that while it may not be the original thought that Facebook had. I think they should be now."[126]

Proposed technology tools

New York magazine contributor Brian Feldman responded to an article by media communications professor Melissa Zimdars, and used her list to create a Google Chrome extension that would warn users about fraudulent news sites.[127] He invited others to use his code and improve upon it.[127]
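Feldman's extension itself is not reproduced here, but the basic mechanism of such a tool is straightforward: a content script compares the current page's hostname against a list of flagged domains and injects a warning banner on a match. The following is a minimal sketch under that assumption; the domain entries and banner text are placeholders, not Zimdars' list or Feldman's code.

```typescript
// Minimal content-script sketch of a "fake news warning" browser extension.
// Placeholder domains only; a real extension would load a curated list.

const FLAGGED_DOMAINS: string[] = [
  "example-fake-news.com",
  "another-hoax-site.co",
];

function isFlagged(hostname: string): boolean {
  // Match the domain itself or any subdomain of it.
  return FLAGGED_DOMAINS.some(
    d => hostname === d || hostname.endsWith("." + d)
  );
}

function showWarningBanner(): void {
  const banner = document.createElement("div");
  banner.textContent =
    "Warning: this site has been flagged as a possible fake news source.";
  banner.style.cssText =
    "position:fixed;top:0;left:0;right:0;z-index:99999;" +
    "background:#c0392b;color:#fff;padding:8px;text-align:center;";
  document.body.prepend(banner);
}

if (isFlagged(window.location.hostname)) {
  showWarningBanner();
}
```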

Slate magazine senior technology editor Will Oremus wrote that the attention paid to fraudulent news sites was obscuring a wider discussion about the negative impact on society of people who consume media only from one tailored viewpoint and thereby perpetuate filter bubbles.[128]

Upworthy co-founder and The Filter Bubble author Eli Pariser launched an open-source model initiative on 17 November 2016 to address false news.[129][130] Pariser began a Google Document to collaborate with others online on how to lessen the phenomenon of fraudulent news.[129][130] Pariser called his initiative: "Design Solutions for Fake News".[129] Pariser's document included recommendations for a ratings organization analogous to the Better Business Bureau, and a database on media producers in a format like Wikipedia.[129][130]

Writing for Fortune, Matthew Ingram agreed with the idea that Wikipedia could serve as a helpful model to improve Facebook's analysis of potentially fake news.[131] Ingram concluded: "If Facebook could somehow either tap into or recreate the kind of networked fact checking that Wikipedia does on a daily basis, using existing elements like the websites of Politifact and others, it might actually go some distance towards being a possible solution."[131]

Academic analysis

Writing for MIT Technology Review, Jamie Condliffe said that merely banning ad revenue from the fraudulent news sites was not enough action by Facebook to effectively deal with the problem.[20] He wrote: "The post-election furor surrounding Facebook’s fake-news problem has sparked new initiatives to halt the provision of ads to sites that peddle false information. But it’s only a partial solution to the problem: for now, hoaxes and fabricated stories will continue to appear in feeds."[20] Condliffe concluded: "Clearly Facebook needs to do something to address the issue of misinformation, and it’s making a start. But the ultimate solution is probably more significant, and rather more complex, than a simple ad ban."[20]

Indiana University informatics and computer science professor Filippo Menczer commented on the steps by Google and Facebook to deny fraudulent news sites advertising revenue: "One of the incentives for a good portion of fake news is money. This could cut the income that creates the incentive to create the fake news sites."[132] Menczer's research team developed an online tool, Hoaxy, to track the spread of unconfirmed assertions and related debunking efforts on the Internet.[133]

Dartmouth College political scientist Brendan Nyhan has criticized Facebook for "doing so little to combat fake news... Facebook should be fighting misinformation, not amplifying it."[80]

Zeynep Tufekci, a writer and academic
Zeynep Tufekci wrote for The New York Times that Facebook "policies entrench echo chambers and fuel the spread of misinformation."

Zeynep Tufekci wrote critically about Facebook's stance on fraudulent news sites in a piece for The New York Times, pointing out fraudulent websites in Macedonia profited handsomely off false stories about the 2016 U.S. election: "The company's business model, algorithms and policies entrench echo chambers and fuel the spread of misinformation."[134]

Merrimack College assistant professor of media studies Melissa Zimdars wrote an article "False, Misleading, Clickbait-y and Satirical 'News' Sources" in which she advised how to determine if a fraudulent source was a fake news site.[135] Zimdars identified strange domain names, lack of author attribution, poor website layout, the use of all caps, and URLs ending in "lo" or "com.co" as red flags of a fake news site.[135] In evaluating whether a website contains fake news, Zimdars recommends that readers check the "About Us" page of the website, and consider whether reputable news outlets are reporting on the story.[135]
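Zimdars' red flags are advice for human readers, but the cues she names translate naturally into simple heuristics. The sketch below encodes only those cues (unusual domain endings such as "lo" or "com.co", missing author attribution, all-caps headlines); the interface, function names, and example data are assumptions for illustration, not her methodology or a reliable classifier.

```typescript
// Illustrative heuristic checks modeled on the red flags Zimdars describes.

interface ArticleMetadata {
  url: string;
  author?: string; // undefined when no byline is given
  headline: string;
}

function redFlags(article: ArticleMetadata): string[] {
  const flags: string[] = [];
  const hostname = new URL(article.url).hostname;

  // Site names ending in "lo" or URLs ending in "com.co" were named as warning signs.
  const parts = hostname.split(".");
  const siteName = parts.length >= 2 ? parts[parts.length - 2] : hostname;
  if (siteName.endsWith("lo") || hostname.endsWith(".com.co")) {
    flags.push("suspicious domain ending");
  }
  // Lack of author attribution.
  if (!article.author || article.author.trim() === "") {
    flags.push("no author attribution");
  }
  // Headline written entirely in capital letters.
  const letters = article.headline.replace(/[^A-Za-z]/g, "");
  if (letters.length > 0 && letters === letters.toUpperCase()) {
    flags.push("all-caps headline");
  }
  return flags;
}

// Example: a hypothetical article tripping all three checks.
const suspect: ArticleMetadata = {
  url: "http://breaking-news.com.co/shock-story",
  headline: "YOU WON'T BELIEVE WHAT HAPPENED NEXT",
};
console.log(redFlags(suspect));
// ["suspicious domain ending", "no author attribution", "all-caps headline"]
```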

Education and history professor Sam Wineburg of the Stanford Graduate School of Education and colleague Sarah McGrew authored a 2016 study analyzing students' ability to discern fraudulent news from factual reporting.[136][137] The study took place over a year and drew on more than 7,800 responses from university, secondary, and middle school students in 12 U.S. states.[136][137] The researchers were "shocked" at the "stunning and dismaying consistency" with which students judged fraudulent news reports to be factual.[136][137] The study found that 82 percent of middle school students were unable to distinguish an advertisement labeled as sponsored content from an actual online news article.[138] The authors concluded that the solution was to educate online media consumers to behave like fact-checkers themselves and actively question the veracity of all sources they encounter.[136][137]

Scientist Emily Willingham proposed applying the scientific method to fake news analysis.[139] She had previously written on differentiating science from pseudoscience, and applied that logic to fake news.[139] Her recommended steps included: Observe, Question, Hypothesize, Analyze data, Draw conclusion, and Act on results.[139] Willingham suggested starting from the hypothesis "This is real news" and then forming a strong set of questions to attempt to disprove it.[139] These tests included checking the URL and the date of the article, evaluating reader and writer bias, double-checking the evidence, and verifying the sources cited.[139]

Media commentary

Full Frontal

Samantha Bee, host of the TV show Full Frontal
Samantha Bee went to Russia for her television show Full Frontal and met with individuals financed by the government of Russia to act as Internet trolls and attempt to manipulate the 2016 U.S. election in order to subvert democracy.

Samantha Bee went to Russia for her television show Full Frontal and met with individuals financed by the government of Russia to act as Internet trolls and attempt to manipulate the 2016 U.S. election in order to subvert democracy. The man and woman interviewed by Bee said they influenced the election by commenting on websites for the New York Post, The Wall Street Journal, The Washington Post, Twitter, and Facebook.[140][141][142] They kept their identities covert, maintaining cover personas separate from their real Russian names; the woman claimed in posts to be a housewife residing in Nebraska. They blamed consumers for believing everything they read online.[140][141][142]

Executive producers for Full Frontal told The Daily Beast that they relied upon writer Adrian Chen, who had previously reported on Russian trolls for The New York Times Magazine in 2015, as a resource to contact trolls in Russia willing to be interviewed by Bee. The Russian trolls wore masks on camera and asked Full Frontal producers to keep all of their fake accounts confidential so they would not be publicly identified. Full Frontal producers paid the Russian trolls to use the Twitter hashtag #SleazySam to troll the show itself, so the production staff could verify that the trolls were indeed able to manipulate content online as they claimed.[142]

After their research in Russia for a second segment on Full Frontal, the production staff concluded that Russian leader Vladimir Putin supported Donald Trump for U.S. President in order to subvert the system of democracy within the U.S.[142] Television producer Razan Ghalayini explained to The Daily Beast: "Russia is an authoritarian regime and authoritarian regimes don’t benefit from the vision of democracy being the best version of governance." Television producer Miles Kahn concurred, adding: "It’s not so much that Putin wants Trump. He probably prefers him in the long run, but he would almost rather the election be contested. They want chaos."[142]

Last Week Tonight

John Oliver said the problem of fraudulent news sites fed into a wider issue of echo chambers in the media, saying there was "a whole cottage industry specializing in hyper-partisan, sometimes wildly distorted clickbait."[53]

Other media

Critics contended that fraudulent news on Facebook may have been responsible for Donald Trump winning the 2016 U.S. presidential election, because most of the fake news stories Facebook allowed to spread portrayed him in a positive light.[99] Facebook is not liable for posting or publicizing fake content because, under the Communications Decency Act, interactive computer services cannot be held responsible for information provided by another Internet entity. Some legal scholars, such as Keith Altman, think that Facebook's huge scale creates such a large potential for fake news to spread that this law may need to be changed.[143] Writing for The Washington Post, Institute for Democracy in Eastern Europe co-director Eric Chenoweth pointed to "many 'fake news' stories that evidence suggests were generated by Russian intelligence operations".[144]

BBC News interviewed a fraudulent news site writer who went by the pseudonym "Chief Reporter (CR)" and defended his actions and possible influence on elections: "If enough of an electorate are in a frame of mind where they will believe absolutely everything they read on the internet, to a certain extent they have to be prepared to deal with the consequences."[145]

See also

Footnotes

  1. Fortune magazine described the Foreign Policy Research Institute as: "a conservative think tank known for its generally hawkish stance on relations between the U.S. and Russia"[43]
  2. The Washington Post and the Associated Press described PropOrNot as a nonpartisan foreign policy analysis group composed of persons with prior experience in the international relations, warfare, and information technology sectors.[5][6][7] Their spokesman, interviewed by Adrian Chen of The New Yorker, said they were composed of government officials and tech company employees who agreed "that Russia should not be able to fuck with the American people".[45]
  3. FactCheck.org, a nonprofit organization and a project of the Annenberg Public Policy Center of the Annenberg School for Communication at the University of Pennsylvania,[113] won a 2010 Sigma Delta Chi Award from the Society of Professional Journalists.[114]
  4. PolitiFact.com, run by the Tampa Bay Times,[115] received a 2009 Pulitzer Prize for National Reporting for its fact-checking efforts the previous year.[115]
  5. Snopes.com, privately run by Barbara and David Mikkelson, was given "high praise" by FactCheck.org, another fact-checking website;[116] in addition, Network World gave Snopes.com a grade of "A" in a meta-analysis of fact-checking websites.[117]
  6. "The Fact Checker" is a project by The Washington Post to analyze political claims.[110] Their colleagues and competitors at FactCheck.org recommended The Fact Checker as a resource to use before assuming a story is factual.[118]

References

  1. 1 2 3 4 5 6 7 8 9 10 "Concern over barrage of fake Russian news in Sweden", The Local, 27 July 2016, retrieved 25 November 2016
  2. 1 2 3 4 5 Lewis Sanders IV (11 October 2016), "'Divide Europe': European lawmakers warn of Russian propaganda", Deutsche Welle, retrieved 24 November 2016
  3. 1 2 3 4 5 Paul Mozur and Mark Scott (17 November 2016), "Fake News on Facebook? In Foreign Elections, That's Not New", The New York Times, retrieved 18 November 2016
  4. 1 2 3 4 5 6 "Merkel warns against fake news driving populist gains", Yahoo! News, Agence France-Presse, 23 November 2016, retrieved 23 November 2016
  5. 1 2 3 4 5 6 7 8 Timberg, Craig (24 November 2016), "Russian propaganda effort helped spread 'fake news' during election, experts say", The Washington Post, retrieved 25 November 2016, Two teams of independent researchers found that the Russians exploited American-made technology platforms to attack U.S. democracy at a particularly vulnerable moment
  6. 1 2 3 4 5 "Russian propaganda effort likely behind flood of fake news that preceded election", PBS NewsHour, Associated Press, 25 November 2016, retrieved 26 November 2016
  7. 1 2 3 4 "Russian propaganda campaign reportedly spread 'fake news' during US election", Nine News, Agence France-Presse, 26 November 2016, retrieved 26 November 2016
  8. 1 2 3 Ali Watkins and Sheera Frenkel (30 November 2016), "Intel Officials Believe Russia Spreads Fake News", BuzzFeed News, retrieved 1 December 2016
  9. 1 2 3 4 Strohm, Chris (1 December 2016), "Russia Weaponized Social Media in U.S. Election, FireEye Says", Bloomberg News, retrieved 1 December 2016
  10. 1 2 3 "Google and Facebook target fake news sites with advertising clampdown", Belfast Telegraph, 15 November 2016, retrieved 16 November 2016
  11. 1 2 3 Shanika Gunaratna (15 November 2016), "Facebook, Google announce new policies to fight fake news", CBS News, retrieved 16 November 2016
  12. 1 2 John Ribeiro (14 November 2016), "Zuckerberg says fake news on Facebook didn't tilt the elections", Computerworld, retrieved 16 November 2016
  13. 1 2 3 4 5 6 7 8 9 Timberg, Craig (30 November 2016), "Effort to combat foreign propaganda advances in Congress", The Washington Post, retrieved 1 December 2016
  14. 1 2 3 4 5 6 7 8 Weisburd, Andrew; Watts, Clint (6 August 2016), "Trolls for Trump - How Russia Dominates Your Twitter Feed to Promote Lies (And, Trump, Too)", The Daily Beast, retrieved 24 November 2016
  15. 1 2 3 Dan Tynan (24 August 2016), "How Facebook powers money machines for obscure political 'news' sites - From Macedonia to the San Francisco Bay, clickbait political sites are cashing in on Trumpmania – and they're getting a big boost from Facebook", The Guardian, retrieved 18 November 2016
  16. 1 2 Ben Gilbert (15 November 2016), "Fed up with fake news, Facebook users are solving the problem with a simple list", Business Insider, retrieved 16 November 2016, Some of these sites are intended to look like real publications (there are false versions of major outlets like ABC and MSNBC) but share only fake news; others are straight-up propaganda created by foreign nations (Russia and Macedonia, among others).
  17. 1 2 Townsend, Tess (21 November 2016), "Meet the Romanian Trump Fan Behind a Major Fake News Site", Inc. magazine, ISSN 0162-8968, retrieved 23 November 2016
  18. 1 2 3 4 5 6 Sydell, Laura (23 November 2016), "We Tracked Down A Fake-News Creator In The Suburbs. Here's What We Learned", All Things Considered, National Public Radio, retrieved 26 November 2016
  19. 1 2 3 THR staff (17 November 2016), "Facebook Fake News Writer Reveals How He Tricked Trump Supporters and Possibly Influenced Election", The Hollywood Reporter, retrieved 18 November 2016
  20. 1 2 3 4 Jamie Condliffe (15 November 2016), "Facebook's Fake-News Ad Ban Is Not Enough", MIT Technology Review, retrieved 16 November 2016
  21. 1 2 Craig Silverman and Lawrence Alexander (3 November 2016), "How Teens In The Balkans Are Duping Trump Supporters With Fake News", BuzzFeed News, retrieved 16 November 2016, As a result, this strange hub of pro-Trump sites in the former Yugoslav Republic of Macedonia is now playing a significant role in propagating the kind of false and misleading content that was identified in a recent BuzzFeed News analysis of hyperpartisan Facebook pages.
  22. 1 2 Ishmael N. Daro and Craig Silverman (15 November 2016), "Fake News Sites Are Not Terribly Worried About Google Kicking Them Off AdSense", BuzzFeed, retrieved 16 November 2016
  23. 1 2 Christopher Woolf (16 November 2016), "Kids in Macedonia made up and circulated many false news stories in the US election", Public Radio International, retrieved 18 November 2016
  24. 1 2 "In Macedonia's fake news hub, this teen shows how it's done", CBS News, Associated Press, 2 December 2016, retrieved 3 December 2016
  25. 1 2 3 4 5 6 Chen, Adrian (27 July 2016), "The Real Paranoia-Inducing Purpose of Russian Hacks", The New Yorker, retrieved 26 November 2016
  26. 1 2 3 4 5 Lewis Sanders IV (17 November 2016), "Fake news: Media's post-truth problem", Deutsche Welle, retrieved 24 November 2016
  27. European Parliament Committee on Foreign Affairs (23 November 2016), "MEPs sound alarm on anti-EU propaganda from Russia and Islamist terrorist groups" (PDF), European Parliament, retrieved 26 November 2016
  28. 1 2 Surana, Kavitha (23 November 2016), "The EU Moves to Counter Russian Disinformation Campaign", Foreign Policy, ISSN 0015-7228, retrieved 24 November 2016
  29. "EU Parliament Urges Fight Against Russia's 'Fake News'", Radio Free Europe/Radio Liberty, Agence France-Presse and Reuters, 23 November 2016, retrieved 24 November 2016
  30. 1 2 MacFarquhar, Neil (29 August 2016), "A Powerful Russian Weapon: The Spread of False Stories", The New York Times, p. A1, retrieved 24 November 2016
  31. 1 2 3 4 5 Porter, Tom (28 November 2016), "How and EU failings allowed Kremlin propaganda and fake news to spread through the West", International Business Times, retrieved 29 November 2016
  32. 1 2 Schindler, John R. (5 November 2015), "Obama Fails to Fight Putin's Propaganda Machine", New York Observer, retrieved 28 November 2016
  33. 1 2 Schindler, John R. (26 November 2016), "The Kremlin Didn't Sink Hillary—Obama Did", New York Observer, retrieved 28 November 2016
  34. 1 2 3 4 LoGiurato, Brett (29 April 2014), "Russia's Propaganda Channel Just Got A Journalism Lesson From The US State Department", Business Insider, retrieved 29 November 2016
  35. LoGiurato, Brett (25 April 2014), "RT Is Very Upset With John Kerry For Blasting Them As Putin's 'Propaganda Bullhorn'", Business Insider, retrieved 29 November 2016
  36. 1 2 3 Stengel, Richard (29 April 2014), "Russia Today's Disinformation Campaign", Dipnote, United States Department of State, retrieved 28 November 2016
  37. 1 2 3 Dougherty, Jill (2 December 2016), "The reality behind Russia's fake news", CNN, retrieved 2 December 2016
  38. 1 2 3 4 Frenkel, Sheera (4 November 2016), "US Officials Are More Worried About The Media Being Hacked Than The Ballot Box", BuzzFeed News, retrieved 2 December 2016
  39. 1 2 Benedictus, Leo (6 November 2016), "Invasion of the troll armies: from Russian Trump supporters to Turkish state stooges", The Guardian, retrieved 2 December 2016
  40. 1 2 3 "U.S. officials defend integrity of vote, despite hacking fears", WITN-TV, 26 November 2016, retrieved 2 December 2016
  41. 1 2 3 4 5 "Vladimir Putin Wins the Election No Matter Who The Next President Is", The Daily Beast, 4 November 2016, retrieved 2 December 2016
  42. 1 2 Schatz, Bryan, "The Kremlin Would Be Proud of Trump's Propaganda Playbook", Mother Jones, retrieved 2 December 2016
  43. 1 2 Ingram, Matthew (25 November 2016), "No, Russian Agents Are Not Behind Every Piece of Fake News You See", Fortune magazine, retrieved 27 November 2016
  44. 1 2 vanden Heuvel, Katrina (29 November 2016), "Putin didn't undermine the election. We did.", The Washington Post, retrieved 1 December 2016
  45. 1 2 "The Propaganda About Russian Propaganda". The New Yorker. 1 December 2016. Retrieved 3 December 2016.
  46. Ben Norton; Glenn Greenwald (26 November 2016), "Washington Post Disgracefully Promotes a McCarthyite Blacklist From a New, Hidden, and Very Shady Group", The Intercept, retrieved 27 November 2016
  47. Taibbi, Matt (28 November 2016), "The 'Washington Post' 'Blacklist' Story Is Shameful and Disgusting", Rolling Stone, retrieved 30 November 2016
  48. Blumenthal, Max (25 November 2016). "Washington Post Promotes Shadowy Website That Accuses 200 Publications of Being Russian Propaganda Plants". AlterNet. Retrieved 3 December 2016.
  49. 1 2 3 Shapiro, Ari (25 November 2016), "Experts Say Russian Propaganda Helped Spread Fake News During Election", All Things Considered, National Public Radio, retrieved 26 November 2016
  50. 1 2 3 4 5 6 Collins, Ben (28 October 2016), "This 'Conservative News Site' Trended on Facebook, Showed Up on Fox News—and Duped the World", The Daily Beast, retrieved 27 November 2016
  51. 1 2 3 4 5 Chacon, Marco (21 November 2016), "I've Been Making Viral Fake News for the Last Six Months. It's Way Too Easy to Dupe the Right on the Internet.", The Daily Beast, retrieved 27 November 2016
  52. 1 2 3 Bambury, Brent (25 November 2016), "Marco Chacon meant his fake election news to be satire — but people took it as fact", Day 6, CBC Radio One, retrieved 27 November 2016
  53. 1 2 Rachel Dicker (14 November 2016), "Avoid These Fake News Sites at All Costs", U.S. News & World Report, retrieved 16 November 2016
  54. Chang, Juju (29 November 2016), "When Fake News Stories Make Real News Headlines", ABC News, retrieved 29 November 2016
  55. 1 2 McAlone, Nathan (17 November 2016), "This fake-news writer says he makes over $10,000 a month, and he thinks he helped get Trump elected", Business Insider, retrieved 18 November 2016
  56. 1 2 Goist, Robin (17 November 2016), "The fake news of Facebook", The Plain Dealer, retrieved 18 November 2016
  57. 1 2 Dewey, Caitlin (17 November 2016), "Facebook fake-news writer: 'I think Donald Trump is in the White House because of me'", The Washington Post, ISSN 0190-8286, retrieved 17 November 2016
  58. 1 2 3 4 Hedegaard, Erik (29 November 2016), "How a Fake Newsman Accidentally Helped Trump Win the White House - Paul Horner thought he was trolling Trump supporters – but after the election, the joke was on him", Rolling Stone, retrieved 29 November 2016
  59. 1 2 Eunice Yoon and Barry Huang (22 November 2016), "China on US fake news debate: We told you so", CNBC, retrieved 28 November 2016
  60. 1 2 Cadell, Catherine (19 November 2016), China says terrorism, fake news impel greater global internet curbs, Reuters, retrieved 28 November 2016
  61. 1 2 3 4 5 Read, Max (27 November 2016), "Maybe the Internet Isn't a Fantastic Tool for Democracy After All", New York Magazine, retrieved 28 November 2016
  62. 1 2 3 4 5 6 7 Frenkel, Sheera (20 November 2016), "This Is What Happens When Millions Of People Suddenly Get The Internet", BuzzFeed News, retrieved 28 November 2016
  63. 1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20 21 22 23 24 25 Kate Connolly, Angelique Chrisafis, Poppy McPherson, Stephanie Kirchgaessner, Benjamin Haas, Dominic Phillips, and Elle Hunt (2 December 2016), "Fake news: an insidious trend that's fast becoming a global problem - With fake online news dominating discussions after the US election, Guardian correspondents explain how it is distorting politics around the world", The Guardian, retrieved 2 December 2016
  64. Orlowski, Andrew (21 November 2016), "China cites Trump to justify 'fake news' media clampdown. Surprised?", The Register, retrieved 28 November 2016
  65. Pascaline, Mary (20 November 2016), "Facebook Fake News Stories: China Calls For More Censorship On Internet Following Social Media's Alleged Role In US Election", International Business Times, retrieved 28 November 2016
  66. Rauhala, Emily (17 November 2016), "After Trump, Americans want Facebook and Google to vet news. So does China.", The Washington Post, retrieved 28 November 2016
  67. Dou, Eva (18 November 2016), "China Presses Tech Firms to Police the Internet - Third-annual World Internet Conference aimed at proselytizing China's view to global audience", The Wall Street Journal, retrieved 28 November 2016
  68. 1 2 3 4 Murdock, Jason (30 November 2016), "Russian hackers may disrupt Germany's 2017 election warns spy chief", International Business Times UK edition, retrieved 1 December 2016
  69. 1 2 3 4 5 6 Horowitz, Jason (2 December 2016), "Spread of Fake News Provokes Anxiety in Italy", The New York Times, retrieved 3 December 2016
  70. 1 2 "La notizia più condivisa sul referendum? È una bufala", Pagella Politica (in Italian), pagellapolitica.it, retrieved 2 December 2016
  71. 1 2 3 4 Anderson, Ariston (30 November 2016), "Italy's Populist Party Found to Be Leader in Europe for Fake News", The Hollywood Reporter, retrieved 3 December 2016
  72. 1 2 3 4 5 6 7 8 Alberto Nardelli and Craig Silverman (29 November 2016), "Italy's Most Popular Political Party Is Leading Europe In Fake News And Kremlin Propaganda", BuzzFeed News, retrieved 3 December 2016
  73. Alyssa Newcomb (15 November 2016), "Facebook, Google Crack Down on Fake News Advertising", NBC News, NBC News, retrieved 16 November 2016
  74. Drum, Kevin (17 November 2016), "Meet Ret. General Michael Flynn, the Most Gullible Guy in the Army", Mother Jones, retrieved 18 November 2016
  75. 1 2 Tapper, Jake (17 November 2016), "Fake news stories thriving on social media - Phony news stories are thriving on social media, so much so President Obama addressed it. CNN's Jake Tapper reports.", CNN, retrieved 18 November 2016
  76. Masnick, Mike (14 October 2016), "Donald Trump's Son & Campaign Manager Both Tweet Obviously Fake Story", Techdirt, retrieved 18 November 2016
  77. President Barack Obama (7 November 2016), Remarks by the President at Hillary for America Rally in Ann Arbor, Michigan, White House Office of the Press Secretary, retrieved 16 November 2016
  78. Gardiner Harris and Melissa Eddy (17 November 2016), "Obama, With Angela Merkel in Berlin, Assails Spread of Fake News", The New York Times, retrieved 18 November 2016
  79. 1 2 Maheshwari, Sapna (20 November 2016), "How Fake News Goes Viral", The New York Times, ISSN 0362-4331, retrieved 20 November 2016
  80. 1 2 Craig Silverman (16 November 2016), "Viral Fake Election News Outperformed Real News On Facebook In Final Months Of The US Election", BuzzFeed, retrieved 16 November 2016
  81. 1 2 3 4 5 6 Kurtz, Howard, "Fake news and the election: Why Facebook is polluting the media environment with garbage", Fox News, archived from the original on 18 November 2016, retrieved 18 November 2016
  82. 1 2 Porter, Tom (1 December 2016), "US House of representatives backs proposal to counter global Russian subversion", International Business Times UK edition, retrieved 1 December 2016
  83. 1 2 3 4 Miller, Kevin (1 December 2016), "Angus King: Russian involvement in U.S. election 'an arrow aimed at the heart of democracy'", Portland Press Herald, retrieved 2 December 2016
  84. Staff report (30 November 2016), "Angus King among senators asking president to declassify information about Russia and election", Portland Press Herald, retrieved 2 December 2016
  85. Jim Sciutto and Manu Raju (3 December 2016), "Democrats want Russian hacking intelligence declassified", CNN, retrieved 3 December 2016
  86. 1 2 Bump, Philip (14 November 2016), "Google's top news link for 'final election results' goes to a fake news site with false numbers", The Washington Post, retrieved 26 November 2016
  87. 1 2 Jacobson, Louis (14 November 2016), "No, Donald Trump is not beating Hillary Clinton in the popular vote", PolitiFact.com, retrieved 26 November 2016
  88. 1 2 3 4 5 6 7 Wingfield, Nick; Isaac, Mike; Benner, Katie (14 November 2016), "Google and Facebook Take Aim at Fake News Sites", The New York Times, retrieved 28 November 2016
  89. Sonam Sheth (14 November 2016), "Google looking into grossly inaccurate top news search result displayed as final popular-vote tally", Business Insider, retrieved 16 November 2016
  90. "Google to ban fake news sites from its advertising network", Los Angeles Times, Associated Press, 14 November 2016, retrieved 16 November 2016
  91. Avery Hartmans (15 November 2016), "Google's CEO says fake news could have swung the election", Business Insider, retrieved 16 November 2016
  92. "Google cracks down on fake news sites", The Straits Times, 15 November 2016, retrieved 16 November 2016
  93. 1 2 Richard Waters (15 November 2016), "Facebook and Google to restrict ads on fake news sites", Financial Times, retrieved 16 November 2016
  94. Sridhar Ramaswamy (21 January 2016), "How we fought bad ads in 2015", Google blog, Google, retrieved 28 November 2016
  95. Paul Blake (15 November 2016), "Google, Facebook Move to Block Fake News From Ad Services", ABC News, retrieved 16 November 2016
  96. 1 2 3 4 Gina Hall (15 November 2016), "Facebook staffers form an unofficial task force to look into fake news problem", Silicon Valley Business Journal, retrieved 16 November 2016
  97. 1 2 3 4 Frenkel, Sheera (14 November 2016), "Renegade Facebook Employees Form Task Force To Battle Fake News", BuzzFeed, retrieved 18 November 2016
  98. Shahani, Aarti (15 November 2016), "Facebook, Google Take Steps To Confront Fake News", National Public Radio, retrieved 20 November 2016
  99. 1 2 Cooke, Kristina (15 November 2016), Google, Facebook move to restrict ads on fake news sites, Reuters, retrieved 20 November 2016
  100. "Facebook's Fake News Problem: What's Its Responsibility?", The New York Times, Associated Press, 15 November 2016, retrieved 20 November 2016
  101. 1 2 3 4 5 Ohlheiser, Abby (19 November 2016), "Mark Zuckerberg outlines Facebook's ideas to battle fake news", The Washington Post, retrieved 19 November 2016
  102. 1 2 3 4 Vladimirov, Nikita (19 November 2016), "Zuckerberg outlines Facebook's plan to fight fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
  103. 1 2 3 4 5 Mike Isaac (19 November 2016), "Facebook Considering Ways to Combat Fake News, Mark Zuckerberg Says", The New York Times, retrieved 19 November 2016
  104. 1 2 3 4 Samuel Burke (19 November 2016), "Zuckerberg: Facebook will develop tools to fight fake news", CNNMoney, CNN, retrieved 19 November 2016
  105. 1 2 Chappell, Bill (19 November 2016), "'Misinformation' On Facebook: Zuckerberg Lists Ways Of Fighting Fake News", National Public Radio, retrieved 19 November 2016
  106. 1 2 Silverman, Craig (19 November 2016), "This Is How You Can Stop Fake News From Spreading On Facebook", BuzzFeed, retrieved 20 November 2016
  107. 1 2 3 4 Taylor Hatmaker and Josh Constine (1 December 2016), "Facebook quietly tests warnings on fake news", TechCrunch, retrieved 2 December 2016
  108. "False news items are not the only problem besetting Facebook", The Economist, 26 November 2016, retrieved 28 November 2016
  109. 1 2 Pesek, William (27 November 2016), "Will Facebook be China's propaganda tool?", The Japan Times, Barron's newspaper, retrieved 28 November 2016
  110. 1 2 3 4 5 6 7 8 9 10 Stelter, Brian (7 November 2016), "How Donald Trump made fact-checking great again", CNNMoney, CNN, retrieved 19 November 2016
  111. 1 2 3 4 5 6 Kessler, Glenn (10 November 2016), "Fact checking in the aftermath of a historic election", The Washington Post, retrieved 19 November 2016
  112. 1 2 3 4 5 Neidig, Harper (17 November 2016), "Fact-checkers call on Zuckerberg to address spread of fake news", The Hill, ISSN 1521-1568, retrieved 19 November 2016
  113. Hartlaub, Peter (24 October 2004), "Web sites help gauge the veracity of claims / Online resources check ads, rumors", San Francisco Chronicle, p. A1, retrieved 25 November 2016
  114. "Fact-Checking Deceptive Claims About the Federal Health Care Legislation - by Staff, FactCheck.org", 2010 Sigma Delta Chi Award Honorees, Society of Professional Journalists, 2010, retrieved 25 November 2016
  115. 1 2 Columbia University (2009), "National Reporting - Staff of St. Petersburg Times", 2009 Pulitzer Prize Winners, retrieved 24 November 2016, For "PolitiFact," its fact-checking initiative during the 2008 presidential campaign that used probing reporters and the power of the World Wide Web to examine more than 750 political claims, separating rhetoric from truth to enlighten voters.
  116. Novak, Viveca (10 April 2009), "Ask FactCheck - Snopes.com", FactCheck.org, retrieved 25 November 2016
  117. McNamara, Paul (13 April 2009), "Fact-checking the fact-checkers: Snopes.com gets an 'A'", Network World, retrieved 25 November 2016
  118. 1 2 3 4 5 6 Lori Robertson and Eugene Kiely (18 November 2016), "How to Spot Fake News", FactCheck.org, retrieved 19 November 2016
  119. Lemann, Nicholas (30 November 2016), "Solving the Problem of Fake News", The New Yorker, retrieved 30 November 2016
  120. 1 2 LaCapria, Kim (2 November 2016), "Snopes' Field Guide to Fake News Sites and Hoax Purveyors - Snopes.com's updated guide to the internet's clickbaiting, news-faking, social media exploiting dark side.", Snopes.com, retrieved 19 November 2016
  121. 1 2 Sharockman, Aaron (16 November 2016), "Let's fight back against fake news", PolitiFact.com, retrieved 19 November 2016
  122. Burgess, Matt (17 November 2016), "Google is helping Full Fact create an automated, real-time fact-checker", Wired magazine UK edition, retrieved 29 November 2016
  123. 1 2 3 The International Fact-Checking Network (17 November 2016), "An open letter to Mark Zuckerberg from the world's fact-checkers", Poynter Institute, retrieved 19 November 2016
  124. 1 2 Hare, Kristen (September 21, 2015), Poynter names director and editor for new International Fact-Checking Network, Poynter Institute for Media Studies, retrieved 20 November 2016
  125. 1 2 About the International Fact-Checking Network, Poynter Institute for Media Studies, 2016, retrieved 20 November 2016
  126. 1 2 Klasfeld, Adam (22 November 2016), "Fake News Gives Facebook a Nixon-Goes-to-China Moment", Courthouse News Service, retrieved 28 November 2016
  127. 1 2 Brian Feldman (15 November 2016), "Here's a Chrome Extension That Will Flag Fake-News Sites for You", New York Magazine, retrieved 16 November 2016
  128. Will Oremus (15 November 2016), "The Real Problem Behind the Fake News", Slate magazine, retrieved 16 November 2016
  129. 1 2 3 4 Morris, David Z. (27 November 2016), "Eli Pariser's Crowdsourced Brain Trust Is Tackling Fake News", Fortune magazine, retrieved 28 November 2016
  130. 1 2 3 Burgess, Matt (25 November 2016), "Hive mind assemble! There is now a crowdsourcing campaign to solve the problem of fake news", Wired magazine UK edition, retrieved 29 November 2016
  131. 1 2 Ingram, Matthew (21 November 2016), "Facebook Doesn't Need One Editor, It Needs 1,000 of Them", Fortune magazine, retrieved 29 November 2016
  132. "Google, Facebook move to curb ads on fake news sites", Kuwait Times, Reuters, 15 November 2016, retrieved 16 November 2016
  133. Menczer, Filippo (28 November 2016), "Fake Online News Spreads Through Social Echo Chambers", Scientific American, The Conversation, retrieved 29 November 2016
  134. Douglas Perry (15 November 2016), "Facebook, Google try to drain the fake-news swamp without angering partisans", The Oregonian, retrieved 16 November 2016
  135. 1 2 3 Cassandra Jaramillo (15 November 2016), "How to break it to your friends and family that they're sharing fake news", The Dallas Morning News, retrieved 16 November 2016
  136. 1 2 3 4 Domonoske, Camila (23 November 2016), "Students Have 'Dismaying' Inability To Tell Fake News From Real, Study Finds", National Public Radio, retrieved 25 November 2016
  137. 1 2 3 4 McEvers, Kelly (22 November 2016), "Stanford Study Finds Most Students Vulnerable To Fake News", National Public Radio, retrieved 25 November 2016
  138. Shellenbarger, Sue (21 November 2016), "Most Students Don't Know When News Is Fake, Stanford Study Finds", The Wall Street Journal, retrieved 29 November 2016
  139. 1 2 3 4 5 Willingham, Emily (28 November 2016), "A Scientific Approach To Distinguishing Real From Fake News", Forbes magazine, retrieved 29 November 2016
  140. 1 2 "Samantha Bee Interviews Russian Trolls, Asks Them About 'Subverting Democracy'", The Hollywood Reporter, 1 November 2016, retrieved 25 November 2016
  141. 1 2 Holub, Christian (1 November 2016), "Samantha Bee interviews actual Russian trolls", Entertainment Weekly, retrieved 25 November 2016
  142. 1 2 3 4 5 Wilstein, Matt (7 November 2016), "How Samantha Bee's 'Full Frontal' Tracked Down Russia's Pro-Trump Trolls", The Daily Beast, retrieved 25 November 2016
  143. Rogers, James (11 November 2016), "Facebook's 'fake news' highlights need for social media revamp, experts say", Fox News, retrieved 20 November 2016
  144. Chenoweth, Eric (25 November 2016), "Americans keep looking away from the election's most alarming story", The Washington Post, retrieved 26 November 2016
  145. "'I write fake news that gets shared on Facebook'", BBC News, BBC, 15 November 2016, retrieved 16 November 2016

Further reading

Wikinews has related news: Wikinews investigates: Advertisements disguised as news articles trick unknowing users out of money, credit card information

External links

Wikimedia Commons has media related to Fake news websites.
Look up spamvertise in Wiktionary, the free dictionary.