'Astroturfing'. You probably already know the term. If you don't, learn it now, along with the practice it describes: a centrally planned, sophisticated campaign deliberately made to look organic, crowdsourced and grassroots. It's a vivid metaphor for a now-standard method of influencing a chosen demographic. You can probably tell when you're the target of an astroturfing campaign, but as we grow in sophistication, so too do the astroturfers. The Georgia Institute of Technology offers some telltale signs of propagandist Twitter use, but do bear in mind – the PR agencies read this stuff too.
As Election Day 2012 draws nearer, the "Twitterverse" promises to light up again and again with explosions of political opinion. But which tweets are the genuinely expressed feelings of individual users and which are systematic disseminations of information meant to support or discredit an idea—the textbook definition of propaganda?
A new study out of the Georgia Tech School of Computer Science calls such patterns of communication "hyperadvocacy." The study identifies four characteristic behaviors of Twitter hyperadvocates, whose actions clearly separate them from the tweeting behavior of typical users. Associate Professor Nick Feamster directed the study, working with former postdoctoral researcher Cristian Lumezanu and Associate Professor Hans Klein of Georgia Tech's School of Public Policy.
The study examined tweets from two recent politically charged U.S. events: the 2010 U.S. Senate race in Nevada and the 2011 debate over raising the U.S. debt ceiling. Collecting tweets that used the hashtags #nvsen and #debtceiling, the researchers were able to gather approximately 80 percent of all tweets on those issues during the time frame under study. From a dataset of nearly 100,000 tweets for the two issues combined, Feamster and his colleagues identified the following behaviors that characterize propagandistic activities on Twitter by users on both sides of the partisan aisle:
Sending high volumes of tweets over short periods of time;
Retweeting while publishing little original content;
Quickly retweeting others' content; and
Coordinating with other, seemingly unrelated users to send duplicate or near-duplicate messages on the same topic simultaneously.
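The four behaviors above lend themselves to simple per-user heuristics. The sketch below is an illustrative Python implementation, not the study's actual methodology: the data format, threshold values and function names are all assumptions for the sake of example.

```python
from datetime import datetime, timedelta

# Illustrative thresholds -- NOT values from the Georgia Tech study.
TWEETS_PER_HOUR_LIMIT = 20   # "high volumes of tweets over short periods"
RETWEET_RATIO_LIMIT = 0.8    # "retweeting while publishing little original content"
RETWEET_DELAY_LIMIT = 60     # seconds; "quickly retweeting others' content"

# Assumed tweet record: (timestamp, text, is_retweet, retweet_delay_seconds)

def hyperadvocacy_flags(tweets):
    """Return which of the first three study behaviors a user's tweets exhibit."""
    if not tweets:
        return set()
    flags = set()
    times = sorted(t[0] for t in tweets)
    # Guard against a zero-length span for single-burst accounts.
    span_hours = max((times[-1] - times[0]).total_seconds() / 3600, 1 / 3600)
    if len(tweets) / span_hours > TWEETS_PER_HOUR_LIMIT:
        flags.add("high_volume")
    retweets = [t for t in tweets if t[2]]
    if len(retweets) / len(tweets) > RETWEET_RATIO_LIMIT:
        flags.add("mostly_retweets")
    if retweets and min(t[3] for t in retweets) < RETWEET_DELAY_LIMIT:
        flags.add("fast_retweets")
    return flags

def duplicate_messages(tweets_by_user, window_seconds=300):
    """Flag users posting identical text within a short window of another user
    (the fourth behavior: coordinated duplicate or near-duplicate messages)."""
    seen = {}          # text -> list of (user, timestamp)
    coordinated = set()
    for user, tweets in tweets_by_user.items():
        for ts, text, _, _ in tweets:
            for other_user, other_ts in seen.get(text, []):
                if other_user != user and \
                        abs((ts - other_ts).total_seconds()) <= window_seconds:
                    coordinated.update({user, other_user})
            seen.setdefault(text, []).append((user, ts))
    return coordinated
```

A real detector would of course need more care (near-duplicate matching rather than exact text equality, thresholds fitted to the observed baseline), but the structure mirrors the four behaviors one-to-one.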
"As social media become more and more ingrained in our culture, and as people use social media more as a source of information about the world, it's important to know the provenance of that information—where it's coming from and whether it can be trusted," Feamster said. "As a user, you might think the information you see is coming from lots of different sources, but in fact it can be part of an orchestrated campaign."
Indeed, the very aspect of Twitter that makes it appear less amenable to traditional propaganda also makes it difficult to address with traditional content analysis techniques. Historically, researchers could sift through the content of major media vehicles (The New York Times or Wall Street Journal, for instance) looking for "extreme" language, but such methods are often rendered meaningless in the world of social media, where the huge number of users makes it nearly impossible to identify a baseline "standard" language.
"Twitter is a sort of 'extreme democracy' – everyone's a publisher, and people can say whatever they want with no rejection or limit. It's complete freedom of expression," said Lumezanu, now a researcher at NEC Laboratories America in Princeton, N.J. "We had to come up with a way to identify hyperadvocate behavior that didn't try to politically valuate content, because in Twitter the content often can be misleading."
Source: Georgia Institute of Technology
Of course, sometimes it's nothing so sinister. The video reproduced above triggered all of the indicators mentioned in the study. Propaganda? Probably just a boost for the barman's job prospects.