Parkland Conspiracies Overwhelm the Internet's Broken Trending Tools

An algorithm change apparently designed to keep fake news out is precisely what put a Parkland conspiracy video at the top of YouTube's Trending section.
A Parkland conspiracy video sat at the top of YouTube's Trending section Wednesday, because algorithms are no match for bad faith.

Matt McClain/The Washington Post/Getty Images

It takes a special sort of heartlessness to create a conspiracy video about a teenage survivor of one of the deadliest school shootings in US history. But it takes a literally heartless algorithm to ensure that thousands, or even millions, of people see it.

For a brief period on Wednesday, YouTube awarded the top spot in its Trending section to a conspiracy video claiming that 17-year-old David Hogg, a survivor of the Marjory Stoneman Douglas High School shooting that killed 17 people, was in fact an actor. The prime placement of the video, which has since been removed, shocked YouTube users and members of the media alike. It shouldn't have. YouTube's screwup is only the latest to highlight the fundamental flaws of the algorithms that decide what gets surfaced across all social platforms.

On Trend

YouTube, Facebook, and Twitter all have a section designed to surface the most newsworthy, relevant information in the midst of a vast sea of content. But time and again, they have utterly failed. In the worst cases, the algorithms backing these trending sections drive bot-fueled hashtag campaigns promoting gun rights to the top of Twitter Trends, and fake news stories about former Fox News anchor Megyn Kelly into Facebook's Trending Topics portal. Human curation hasn't worked out much better. Reports that Facebook's curators suppressed news from conservative outlets in Trending Topics set off a two-year cascade of crises for the social network.

But even at their most benign, these algorithmically derived trends rarely serve their express purpose. Based largely on conversation volume, trending tools naturally drive the public consciousness toward topics of outrage; an outrageous topic trending only adds to the outrage. How many times have you clicked on a trending topic on Twitter, only to see an endless scroll of tweets decrying that the topic is trending in the first place? The conversation about the trend becomes the trend itself, an interminable loop of outrage that all started because some line of code decided to tell millions of people that the topic was important.
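None of the platforms publish their trending algorithms, but the failure mode is easy to reproduce in a toy model. The sketch below, with all names, data, and thresholds invented for illustration, scores topics purely on mention velocity, the way a volume-driven system implies; note that posts complaining about a trend carry the same hashtag and therefore feed the very score that keeps it trending.

```python
from collections import Counter

def trending_scores(current_posts, previous_posts):
    """Toy velocity-based trending: score = current mentions relative to
    a baseline window. Purely volume-driven, with no notion of *why*
    a topic is being discussed. (Illustrative only; no platform has
    published its actual trending algorithm.)"""
    current = Counter(tag for post in current_posts for tag in post["tags"])
    baseline = Counter(tag for post in previous_posts for tag in post["tags"])
    return {tag: count / (1 + baseline[tag]) for tag, count in current.items()}

# The feedback loop: a post decrying that #outrage is trending still
# carries #outrage, so the meta-conversation raises the score that
# keeps #outrage on the board.
window_now = [{"tags": ["#outrage"]}, {"tags": ["#outrage"]}, {"tags": ["#news"]}]
window_before = [{"tags": ["#news"]}]
print(trending_scores(window_now, window_before))  # #outrage spikes highest
```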

The Parkland video topping YouTube's trending page seems especially galling because it appears to have gotten there not by accident but as the result of an attempt on YouTube's part to fix fake news. YouTube says its system “misclassified” the conspiracy video “because the video contained footage from an authoritative news source.” Whatever minimal nuance was needed to block the Hogg conspiracy, algorithms lack it.
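YouTube hasn't explained the misclassification beyond that statement, but one plausible reading, offered here purely as an assumption rather than YouTube's documented design, is that matching footage from a known news outlet acts as a credibility signal for the entire upload. A minimal sketch of that failure mode:

```python
# Hypothetical reconstruction of the failure mode YouTube described.
# The fingerprint set and scoring rule are invented for illustration;
# YouTube's actual classifier is not public.
AUTHORITATIVE_FINGERPRINTS = {"cbs_parkland_clip", "wapo_vigil_footage"}

def looks_authoritative(video_segments):
    # Flaw: any matched segment vouches for the whole video, so a hoax
    # that splices in ten seconds of real news coverage passes the check.
    return any(seg in AUTHORITATIVE_FINGERPRINTS for seg in video_segments)

hoax = ["conspiracy_intro", "cbs_parkland_clip", "actor_claims"]
print(looks_authoritative(hoax))  # True: the hoax inherits the clip's credibility
```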

Though YouTube got most of the blame on Wednesday, Facebook ought to have shared it. David Hogg's name also appeared in the company's Trending Topics section. As of Wednesday afternoon, the first story that surfaced when users clicked his name was a news clip debunking rumors that Hogg is an actor. But just three results down sat another video, showing a visibly nervous Hogg stumbling over his words, with the caption, "This one is David hogg, the video that keeps coming down on YouTube. Seems like he's been scripted #davidhogg #actor #falseflag #censorship #floridashooting #florida."

Top videos under the trending topic "David Hogg," as seen on Facebook on February 21, 2018.

Facebook

Below that, Facebook ranked another conspiracy post by former Sports Illustrated swimsuit model Amber Smith as the top Public Post on the topic, above legitimate news sources like the Toronto Star and CBS Boston. Smith’s post reads in part, “Fascist-Book will take this down soon so view quickly.. David Hogg just 6 months ago was in an anti-gun rally (pictured, gee, no kidding!), he is not a student at the recent false flag event in Florida that was staged to take away your rights. Please, fight for your rights!”

Top public posts under the trending topic "David Hogg," as seen on Facebook on February 21, 2018.

Facebook

In a statement, Mary deBree, head of content policy at Facebook, said, “Images that attack the victims of last week's tragedy in Florida are abhorrent. We are removing this content from Facebook.”

It's a standard response that does little to prevent future disinformation campaigns from spreading on the platform, and does nothing to mitigate the damage that has already been done.

Half-Measures

The system is broken. It directly contributes to the spread of fake information that has plagued social media platforms for years. So why not scrap it? Why have a trending module at all? It's largely because of money, says Dipayan Ghosh, a fellow at the think tank New America who recently left his job on Facebook's privacy and public policy team. "The Facebook of 10 years or five years ago isn't the Facebook of today," says Ghosh. "This Facebook has grown tremendously in its size and influence around the world, and part of that is because of the promotion of particularly engaging content that attracts eyeballs and keeps them on the screen for long periods of time."

Facebook and YouTube's best answer so far, other than vague promises of algorithm improvements, has been for each to pledge to build a team of 10,000 moderators to take down problematic content. But more than 400 hours of content get uploaded to YouTube alone each minute, roughly 576,000 hours a day. Simply watching it all in real time would take tens of thousands of full-time reviewers; 10,000 moderators, split across every category of abuse, can't come close to keeping up.
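The arithmetic behind that claim, with the shift length and review speed as stated assumptions:

```python
# Back-of-envelope moderation math. The 400 hours/minute figure is from
# the article; the 8-hour shift and 1x-speed viewing are assumptions.
upload_hours_per_day = 400 * 60 * 24        # 576,000 hours of new video daily
moderators = 10_000
review_hours_per_day = moderators * 8       # 80,000 hours of viewing capacity
print(upload_hours_per_day / review_hours_per_day)  # ~7.2x more video than the team can even watch
```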

Twitter, meanwhile, announced Wednesday that it was making changes to the way automated accounts, or bots, are allowed to operate on the platform, which could have important repercussions for Twitter Trends, arguably the most easily gamed of the platforms' trending tools. Coordinated networks of these bots sync up to promote the same hashtag in rapid succession in order to push a given topic into the trending list.

As Clint Watts, a fellow at the Foreign Policy Research Institute and a former FBI special agent, recently put it during a congressional hearing on terrorism and social media, "The negative effects of social bots far outweigh any benefits. The anonymous replication of accounts that routinely broadcast high volumes of misinformation can pose a serious risk to public safety and, when employed by authoritarians, a direct threat to democracy."

Twitter has stopped short of banning bots entirely, but it will drastically limit the ways in which they can interact with one another. In a blog post, the company detailed a number of new limitations for third-party developers, designed to stop users from posting or liking simultaneously from multiple accounts, or from rallying multiple accounts behind a single hashtag all at once.
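Twitter's post describes policy, not implementation, but the kind of server-side check such a policy implies is straightforward to sketch. Everything below, from the window size to the threshold to the function names, is a hypothetical illustration, not Twitter's actual enforcement code:

```python
import time
from collections import defaultdict

WINDOW_SECONDS = 60        # hypothetical coordination window
MAX_DISTINCT_ACCOUNTS = 3  # hypothetical threshold

_recent = defaultdict(list)  # payload -> list of (timestamp, account_id)

def allow_action(account_id, payload, now=None):
    """Reject a post or like when the identical payload has just been
    submitted by too many distinct accounts: the simultaneous
    cross-account amplification Twitter's new rules target."""
    now = time.time() if now is None else now
    events = [(t, a) for t, a in _recent[payload] if now - t < WINDOW_SECONDS]
    accounts = {a for _, a in events}
    _recent[payload] = events + [(now, account_id)]
    return account_id in accounts or len(accounts) < MAX_DISTINCT_ACCOUNTS

# Four bots pushing the same hashtag inside a minute: the fourth is blocked.
results = [allow_action(f"bot{i}", "#trendme", now=1000 + i) for i in range(4)]
print(results)  # [True, True, True, False]
```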

It remains to be seen how effective any of these changes will be at cleaning up these trending tools. Hoaxers and trolls have, after all, found a way around almost every obstacle these platforms have put in their way up until now. Why should this time be any different?

By introducing the concept of what's trending, tech companies told their billions of users they were going to show them the news they needed to know. And yet at a time when social platforms have repeatedly fallen down on the job, it's worth wondering whether the public really needs their help.
