Students build a fix for Facebook’s fake news problem in just 36 hours
Are we really in the “post-truth” age? In a year of flawed public polling and surprising election results, the rise of unverified news shared across social media is a major part of the problem. Access to factual information and the debunking of dodgy news stories is more important than ever.
A group of enterprising students in the USA came up with a solution to Facebook’s problem with fake news by inventing a Google Chrome extension that can automatically indicate whether stories being shared on the social network have been verified or not.
Qinglin Chen, Nabanita De, Anant Goel and Mark Craft created the ‘FiB’ extension (with the slogan “Let’s stop living a lie”) at a Princeton hackathon held on November 11-13, a marathon coding event where students have just 36 hours to create an original software or hardware project from scratch.
“Before we went to the hackathon there was lots of news on the US election,” De told FRANCE 24's Observers. “People were commenting on who won and why, and lots of people were blaming Facebook for the fake news being shared, which led to voters supporting a certain candidate. That struck us and we thought that we should do something about it.”
The group ended up winning the Google Moon Shot award for the most ambitious project at the hackathon, having decided to invent a solution before Facebook even admitted the problem existed.
The FiB team explained that their extension works like an ad blocker, reading only whatever content is on your screen currently and ‘scraping’ this information in order to analyse whether it is verified or not.
How does it work?
The algorithm works with two levels of checks. First, it checks the link against existing databases of malware and phishing sites, such as Google’s Safe Browsing service, and grants the article an initial ‘trust’ score. In the second stage, keywords taken from the content of the article are matched against other articles found via Google and Bing, producing a second reliability score. The software then combines the two scores to decide whether the story or link is reliable, and a tag appears on the post saying either ‘verified’ or ‘not verified’. If the programme is unable to verify a link, it offers alternative content.
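The two-stage check could be sketched roughly as follows. This is an illustrative reconstruction only: the function names, weights and threshold are assumptions for the sake of the example, not FiB’s actual code, and a real implementation would query the Google Safe Browsing and search APIs rather than a local blocklist.

```python
# Illustrative sketch of a two-stage link check in the style described
# above. All names, weights and thresholds are assumptions, not taken
# from the FiB extension's source code.

def source_score(url, blocklist):
    """Stage 1: score the link against known malware/phishing domains
    (the real extension checks databases such as Google Safe Browsing)."""
    domain = url.split("//")[-1].split("/")[0]
    return 0.0 if domain in blocklist else 1.0

def content_score(corroborating_hits, total_queries):
    """Stage 2: fraction of keyword searches (e.g. on Google and Bing)
    whose results corroborate the article's content."""
    if total_queries == 0:
        return 0.0
    return corroborating_hits / total_queries

def verdict(url, blocklist, hits, queries, threshold=0.6):
    """Combine the two scores equally and tag the post."""
    combined = 0.5 * source_score(url, blocklist) \
             + 0.5 * content_score(hits, queries)
    return "verified" if combined >= threshold else "not verified"

blocklist = {"scam-news.example"}
print(verdict("https://scam-news.example/story", blocklist, 5, 5))
# prints "not verified": the domain is blocklisted, so even full
# corroboration only yields a combined score of 0.5
```

In this sketch a blocklisted source caps the combined score below the threshold regardless of corroboration, which matches the idea of layering a source check on top of a content check.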
The team are convinced of the importance of their project. “Fake news is a serious problem and can lead, maybe indirectly, to bad things,” says De. “If you look at the number of people using Facebook and the percentage of them that perhaps haven’t had much of an education, then it is very easy to target this audience with fake news. It becomes too easy for people trying to sell things.”
From left to right: Anant Goel; Nabanita De; Qinglin Chen and Mark Craft
The day after the US election results came out, Mark Zuckerberg, the CEO of Facebook, dismissed the suggestion that the site could have influenced the presidential race as “a pretty crazy idea”. In a post on Facebook after the election, he said that only a small fraction of the content shared across the site was a hoax.
But just a few days after that announcement, Facebook banned fake news sites from profiting from advertising on the social network, following Google’s decision to do the same.
Open source project
The rules of the hackathon meant that the team was obliged to release the extension online as an open source project, meaning that other hackers can use the code and tweak it or improve it themselves.
The extension garnered a huge amount of attention in a very short space of time.
“We tried to set up a counter to see how many people were downloading it but it kept crashing. The last time that it crashed, we had got up to 50,000 users within two days,” says Goel. “Currently we’re scaling it up so that more people can access our services.”
What the team needs now is money to back the project – and time, seeing as they’re all full-time students. They are looking at ways of extending the programme, possibly to other browsers or social media networks.
They all have different opinions of what Facebook should do to tackle the spread of fake news.
“If Facebook starts to verify things itself, that becomes a form of censorship,” says Goel. “Facebook needs to partner up with a software company working on a pro bono basis, which is able to judge links much better than Facebook can as a for-profit company.”
“Users should be able to share whatever links they want,” says Craft, “but fake news shouldn’t be able to become trending news.”
The team is confident that the propagation of fake news will soon be under control. They’ve already made a very good start.
Debunking fake news is something of a speciality at The Observers. Read our guide on how to verify news stories, videos and photos on social media networks, and keep up to date with our latest debunks here.