Facebook Is Giving Its Users Trust Scores (Report)
By Janko Roettgers
LOS ANGELES (Variety.com) – Facebook is deciding whether it should trust its users: The company has begun assigning users individual trust scores ranging from zero to one, according to a new Washington Post report. The company is reportedly using this new system to better evaluate feedback from users who have flagged posts as fake news.
The goal of the system is to account for instances in which users report accurate news stories as false simply because they disagree with the stories' premise, product manager Tessa Lyons told the paper. The score is presumably also meant to counteract organized disinformation campaigns that rely on mass reporting of unwanted posts.
The new score is just one of numerous signals the company uses to mitigate these risks; Facebook also tracks how frequently users report posts as false, and collects data on which publishers its users deem trustworthy.
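Facebook hasn't disclosed how such signals are combined, but a minimal sketch can show the general idea: below, a hypothetical reporter score in the zero-to-one range is derived from how often a user's past fake-news reports were upheld by fact checkers, smoothed so that users with little reporting history land near the middle. The function name, weighting, and numbers are illustrative assumptions, not the company's method.

    # Hypothetical illustration only: Facebook has not revealed how its
    # trust score is computed. This sketch shows one generic way a
    # reporter score in [0, 1] could be derived from reporting history.

    def reporter_trust_score(reports_confirmed: int, reports_total: int,
                             prior_strength: float = 10.0) -> float:
        """Estimate how often a user's fake-news reports are upheld by
        fact checkers, blended toward a neutral 0.5 prior so that users
        with few reports aren't scored on tiny samples."""
        if reports_total < 0 or reports_confirmed > reports_total:
            raise ValueError("invalid report counts")
        # Laplace-style smoothing: observed accuracy plus a 0.5 prior.
        return (reports_confirmed + 0.5 * prior_strength) / (reports_total + prior_strength)

    # A user whose 40 reports were upheld 36 times scores high...
    print(round(reporter_trust_score(36, 40), 2))   # 0.82
    # ...while a mass reporter with few upheld reports scores low.
    print(round(reporter_trust_score(5, 200), 2))   # 0.05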
Facebook began offering users the ability to flag posts as false in 2015, and more recently started working with third-party fact checkers to review flagged posts. Lyons didn't go into detail about the other signals Facebook uses to generate its trust score, citing concerns that bad actors could use this information to bypass the company's filters.