Government Surveillance Program Finds It Takes 13 Hours to Fact Check Twitter
A hoax analysis program doesn't have to look far for falsehood, but it has to be patient waiting for truth.
Anyone who has ever been on Twitter has seen falsehoods and lies. Hoaxy has seen more than that. A government-financed program, Hoaxy surveyed “Wrong Twitter,” the fact-averse swath of the social network, between October and February. The program found that it takes 13 hours, on average, for a fact check to put wrongs right.
The program compares tweets, re-tweets, quotes, and replies to corresponding information from fact-checking sites including snopes.com, opensecrets.org, truthorfiction.com, politifact.com, and hoaxslayer.com.
Some might remember the controversial accusations of Orwellian government censorship leveled at some of these same Indiana University researchers when their earlier program, dubbed Truthy, was released as an analytical tool for studying abuses of social media.
Truthy (named after Stephen Colbert’s made-up word “truthiness”) was seen by several commentators as a government tool for deciding what has news value and what doesn’t, or what is true and what is false, when in reality these are grey areas.
The researchers were quick to dismiss such claims, noting that the project’s federal funding was in line with many other academic efforts and insisting that the program was not motivated along partisan lines. But media outlets, especially in conservative circles, were particularly virulent in their attacks on the study.
Now, with this new program Hoaxy taking on similar subject matter, the researchers could see another debate on their hands.
One of the study’s findings is that fake news not only spreads faster than its fact-checking counterpart, but also that “fake news are dominated by very active users, while fact checking is a more grassroots activity.”
However, it’s not clear whether Hoaxy includes verified news organizations in this analysis, especially ones that are actually paid to report the news rather than doing so as a “grassroots” activity.
For example, the study cites one instance in which the researchers analyzed stories surrounding the January 14 death of actor Alan Rickman, specifically news sources that falsely reported he had not died.
“We used the keywords ‘alan’ and ‘rickman’ to match URLs from our database, and found 15 matches among fake news sources and two from fact-checking ones,” the study reads.
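The matching the researchers describe amounts to filtering a database of article URLs by keywords. A minimal sketch of that kind of case-insensitive keyword match (the function name and the example URLs here are hypothetical illustrations, not the study’s actual database or code):

```python
# Hypothetical sketch of keyword-based URL matching, in the spirit of
# the study's "alan"/"rickman" query. The URLs below are invented examples.
def match_urls(urls, keywords):
    """Return the URLs whose text contains every keyword, ignoring case."""
    lowered = [k.lower() for k in keywords]
    return [u for u in urls if all(k in u.lower() for k in lowered)]

fake_news_urls = [
    "http://example-hoax.com/alan-rickman-death-hoax",
    "http://example-hoax.com/celebrity-gossip-roundup",
]

print(match_urls(fake_news_urls, ["alan", "rickman"]))
# prints ['http://example-hoax.com/alan-rickman-death-hoax']
```

In practice a system at Hoaxy’s scale would query an indexed database rather than scan a list, but the filtering logic is the same.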
While there may have been only two news sources that directly addressed the 15 false reports, there were, without question, hundreds of publications covering the story with legitimate information.
But assuming the 13-hour turnaround time between fake and real news is accurate, that still represents a far better marker than the days of print, when the earliest a story could get a correction was in the next day’s paper, buried somewhere on the editorial page.
The researchers aim to continue their research into misinformation on social media. They concluded the paper by saying, “In the future we plan to study the active spreaders of fake news to see if they are likely social bots.”