Separating the wheat from the chaff in an age of bots and trolls

In the age of ubiquitous connectivity and social media, information is at our fingertips. Unfortunately, so is misinformation, and it is often hard to tell one from the other.

A recent roundtable discussion sponsored by the South Big Data Hub examined the rapidly changing landscape for building online communities, sharing information, and creating what often appears to be a groundswell of support for particular points of view. 

The roundtable panelists have studied social media data and worked to understand its impacts for years. They have a wide range of experience in computer science, data science, and the behavioral and social sciences (for panelist bios, click here).

According to one of those panelists, Kathleen M. Carley, PhD, of Carnegie Mellon University, social media, easy access to the internet, and the proliferation of bots (web “robot” software that runs automated tasks over the internet) and trolls (people who seek arguments on the internet) have turned discourse about big ideas into “a wild west.” The cast of characters on this new frontier includes Anonymous, a loosely associated group of international network hackers and activists who generally oppose internet censorship, and social media platforms like Twitter and YouTube, which police their sites and evolve on a daily basis.

Increasingly, information from social media, posted by ordinary citizens rather than trained journalists, shapes the news, said Carley. Marketing campaigns are designed to go viral, spreading through networks and communities using social media. Misinformation can multiply, deep information and context are often hard to find, and bots and trolls busily create new online communities and shape public conversations.

How do bots work? They are often embedded deeply in online communities and used to link different communities so that their members receive the same information. They share posts between social media platforms and post stories that appear to come from legitimate news sites but often originate from sites created to appeal to a particular constituency or to give the appearance of wide support.
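This bridging behavior can be illustrated with a toy simulation: a bot account that belongs to two otherwise separate communities reposts a story from one into the other, so members of both end up seeing identical content. This is purely an illustrative model, not any panelist's actual code; the communities and account names are invented.

```python
# Toy model of bot-driven amplification: two communities that share
# no ordinary members, bridged only by a bot account ("bot_1").
communities = {
    "community_a": {"alice", "amir", "bot_1"},
    "community_b": {"bianca", "boris", "bot_1", "bot_2"},
}
bots = {"bot_1", "bot_2"}

def spread(origin, communities, bots):
    """Return the set of accounts that end up seeing a story posted
    in `origin`.

    A story is seen by every member of the community it was posted in;
    any bot that belongs to both the origin community and another
    community reposts it there, carrying it across the boundary.
    """
    seen = set(communities[origin])
    for name, members in communities.items():
        if name == origin:
            continue
        if any(b in members and b in communities[origin] for b in bots):
            seen |= members
    return seen

audience = spread("community_a", communities, bots)
print(sorted(audience))
# bianca and boris see the story even though they never follow anyone
# in community_a -- the bridging bot delivered it.
```

The point of the sketch is Carley's: because the same message arrives in several communities at once, it reads as a broad, organic groundswell rather than the output of a handful of automated accounts.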

“What this does is create a groundswell of what appears to be strong support for common ideas even though it is just fiction,” said Carley. “You get truth and fiction being spread to people so fast through a series of soundbites that there’s not an ability for them to check it.”

Panelist Nitin Agarwal, PhD, of the University of Arkansas at Little Rock referred to a recent Wired magazine cover story that profiled teenage bloggers in Macedonia, where the chance to make comparatively large sums of money motivates tech-savvy young people to create blogs that lift misinformation from sites with right-wing or alt-right viewpoints. Since recent research shows 34 percent of Americans trust the information they receive from social media and 14 percent consider social media their most important information source, these blogs can have real impact.

“This is a huge percentage when you consider the electorate; 1 percent can make a difference there,” Agarwal said.

Left to right: Panel moderator Lea Shanley of the South Big Data Hub, Nitin Agarwal, and Kathleen Carley. On screen are Huan Liu and Rand Waltzman.

Agarwal’s research team tracks blogs and bloggers, how fringe ideas make their way into the mainstream, and how these ideas are further shared through mainstream social media, taking full advantage of a mass communication system that supports one-to-one, one-to-many, and most importantly many-to-many communications.

Panelist Huan Liu, PhD, of Arizona State University has been a big data researcher for more than 20 years and is an expert on gleaning knowledge from data, but, he said, traditional methods of data analysis don’t work as well on social media data, partly because of the misinformation on social media channels. Furthermore, artificial intelligence, which is integrated into social media platforms and into bots, makes bots more “intelligent” and harder to detect, he said.

Data science related to social media must account for misinformation by detecting bots, thereby reducing their impact, according to Liu. However, traditional big data research tactics are not enough in the era of artificial intelligence.
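Liu's point about detecting bots can be sketched with a crude heuristic: flag an account whose posting volume spikes and whose posts are mostly duplicates of one another. The thresholds and data format below are invented for illustration and are far simpler than any real detector, which, as Liu notes, must contend with increasingly AI-driven bots.

```python
from collections import Counter

def looks_automated(posts, max_per_hour=50, dup_ratio=0.8):
    """Crude bot heuristic, for illustration only.

    `posts` is a list of (hour, text) tuples for one account. Flag the
    account if it bursts past `max_per_hour` posts in a single hour, or
    if at least `dup_ratio` of its posts duplicate an earlier post.
    """
    if not posts:
        return False
    per_hour = Counter(hour for hour, _ in posts)
    burst = max(per_hour.values())
    texts = Counter(text for _, text in posts)
    duplicates = sum(count - 1 for count in texts.values())
    return burst > max_per_hour or duplicates / len(posts) >= dup_ratio

# An account blasting the same link 60 times in one hour trips both checks;
# a handful of distinct posts spread over six hours trips neither.
bot_posts = [(0, "read this: example.com/story")] * 60
human_posts = [(h, f"thought #{h}") for h in range(6)]
print(looks_automated(bot_posts), looks_automated(human_posts))
```

Simple frequency-and-duplication rules like this are exactly what smarter bots evade, which is why Liu argues that big data techniques alone are no longer enough.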

“Big data alone is not enough,” said Liu. “We need to enable user-controlled information filters and checkers. We need to promote and build diverse collective intelligence.”

Manipulating the narrative

According to panelist Rand Waltzman, PhD, of the RAND Corporation, fake news is an old idea that has been given new life by social media and the internet. It is an example of “active measures,” a term coined by the KGB, he said.

“Active measures are techniques used to manipulate groups of people—anywhere from 10 to a billion—using information as a weapon, doing whatever the perpetrator wishes, while making those people think it was their idea,” said Waltzman. “It doesn’t matter whether the information is true, half true or false as long as it gets the job done.”

Active measure techniques, said Waltzman, are used in everything from criminal activity to mass marketing to political campaigns. “At some point, and I think we are almost there, active measures completely dominate the information environment and objective truth and reality become almost meaningless or irrelevant concepts.”

How can researchers and ordinary humans fight back? The panelists agreed that researchers need to look at the big picture, tracking multiple platforms and multiple sources of messages, rather than one social media platform or one post. With Twitter, for example, they need to study not individual tweets but whole conversations in order to understand context. As social media and the internet become more image-based, images must be tracked and analyzed to determine whether they have been altered to support a particular narrative.
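Studying whole conversations rather than individual posts amounts to grouping posts by the reply chain they belong to. A minimal sketch, using invented post ids (real platforms expose this reply metadata through their APIs):

```python
def conversation_root(post_id, reply_to):
    """Follow reply links upward to the post that started the thread.

    `reply_to` maps a post id to the id it replies to; a post absent
    from the map is a thread root.
    """
    while post_id in reply_to:
        post_id = reply_to[post_id]
    return post_id

def group_conversations(post_ids, reply_to):
    """Bucket posts by the root of their reply chain, so each bucket
    is one whole conversation."""
    threads = {}
    for pid in post_ids:
        threads.setdefault(conversation_root(pid, reply_to), []).append(pid)
    return threads

# t2 replies to t1, t3 replies to t2; t5 replies to t4.
reply_to = {"t2": "t1", "t3": "t2", "t5": "t4"}
threads = group_conversations(["t1", "t2", "t3", "t4", "t5"], reply_to)
print(threads)
# Two conversations: one rooted at t1, one rooted at t4.
```

Once posts are bucketed this way, an analyst can examine who steered each conversation and how a message's framing shifted along the thread, context that a single isolated post cannot provide.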

In addition, the panelists agreed that counterarguments and attempts to discredit information sources won't work on their own.

“It’s not so much about countering, but replacing one narrative with another,” said Waltzman. “All of these things exist because they fulfill a need of some sort. You can’t discredit or fight against it without providing an alternative. The question becomes how do you inject the alternative (narrative) and get it to replace the one you don’t like.”

To listen to the entire roundtable discussion, click here. To learn about upcoming South Big Data Hub roundtables and other events, please see our calendar of events.
