CNetS alumnus Mihai Avram is the recipient of the 2020 Indiana University Distinguished Master’s Thesis Award for his work on Hoaxy and Fakey: Tools to Analyze and Mitigate the Spread of Misinformation in Social Media. This award recognizes a “truly outstanding” Master’s thesis based on criteria such as originality, documentation, significance, accuracy, organization, and style. Some of the findings in Mihai’s thesis have recently been published in the paper Exposure to social engagement metrics increases vulnerability to misinformation, in The Harvard Kennedy School Misinformation Review. Congratulations Mihai!
On 15 September 2020, The Washington Post published an article by Isaac Stanley-Becker titled “Pro-Trump youth group enlists teens in secretive campaign likened to a ‘troll farm,’ prompting rebuke by Facebook and Twitter.” The article reported on a network of accounts run by teenagers in Phoenix, who were coordinated and paid by an affiliate of conservative youth organization Turning Point USA. These accounts posted identical messages amplifying political narratives, including false claims about COVID-19 and electoral fraud. The same campaign was run on Twitter and Facebook, and both platforms suspended some of the accounts following queries from Stanley-Becker. The report was based in part on a preliminary analysis we conducted at the request of The Post. In this brief we provide further details about our analysis. Continue reading Evidence of a coordinated network amplifying inauthentic narratives in the 2020 US election
We are excited to announce version 1.3 of BotSlayer, our OSoMe cloud tool that lets journalists, researchers, citizens, and civil society organizations track narratives and detect potentially coordinated inauthentic information networks on Twitter in real time. Improvements and new features include better stability, a new alert system, a Mac installer, and many additions to the interface. This version is released in time for those who would like to use BotSlayer to monitor #Election2020 manipulation. Continue reading UPDATE: BotSlayer tool to expose disinformation networks
Indiana University’s Observatory on Social Media, funded in part last year with a $3 million grant from the John S. and James L. Knight Foundation, has named two new Knight Fellows. Matthew DeVerna and Harry Yaojun Yan will help advance the center’s ongoing investigations into how information and misinformation spread online. The Observatory on Social Media, or OSoMe (pronounced “awesome”), is a collaboration between CNetS in the Luddy School of Informatics, Computing and Engineering; The Media School; and the IU Network Science Institute. Congratulations to Harry and Matt! More…
Indiana University will establish a $6 million research center to study the role of media and technology in society. With leadership by CNetS faculty, the Observatory on Social Media will investigate how information and misinformation spread online. It will also provide students, journalists and citizens with resources, data and training to identify and counter attempts to intentionally manipulate public opinion. Major support for the center comes from the John S. and James L. Knight Foundation, which will contribute $3 million, as well as funds from the university. The center is a collaboration between the IU School of Informatics, Computing and Engineering, The Media School and the IU Network Science Institute. More…
UPDATE: This paper is ranked #3 most read among all articles published by Nature Communications in 2018
Analysis by CNetS researchers of information shared on Twitter during the 2016 U.S. presidential election has found that social bots played a disproportionate role in spreading misinformation online. The study, published in the journal Nature Communications, analyzed 14 million messages and 400,000 articles shared on Twitter between May 2016 and March 2017, a period that spans the end of the 2016 presidential primaries and the presidential inauguration on Jan. 20, 2017. Among the findings: a mere 6 percent of Twitter accounts that the study identified as bots were enough to spread 31 percent of the low-credibility information on the network. These accounts were also responsible for 34 percent of all articles shared from low-credibility sources. The study also found that bots played a major role in promoting low-credibility content in the first few moments before a story went viral. Continue reading Twitter bots play disproportionate role spreading misinformation
Congratulations to Rion Correia, who successfully defended his PhD dissertation on Prediction of Drug Interaction and Adverse Reactions, with data from Electronic Health Records, Clinical Reporting, Scientific Literature, and Social Media, using Complexity Science Methods. Dr. Correia’s research used network science, machine learning, and data science to uncover population-level associations of drugs and symptoms, useful for public health surveillance. His findings show that social media (Instagram and Twitter) and the electronic health records of an entire city in Southern Brazil can reveal how drug interactions vary across distinct groups. For instance, he identified gender biases and specific communities of interest in chronic disease (e.g., epilepsy and depression). In addition to complex networks and systems, his dissertation contributes to the fields of biomedical informatics and precision public health by leveraging heterogeneous data sources at multiple levels to understand population- and individual-level pharmacology differences and other public health problems.
Congratulations to Dimitar Nikolov, who successfully defended his PhD dissertation on Information Exposure Biases in Online Behaviors. Dr. Nikolov’s research explored the unintentional biases introduced by the filtering, ranking, and recommendation algorithms that mediate our online consumption of information. His findings show that our reliance on modern online technologies limits exposure to diverse points of view and makes us vulnerable to misinformation. In particular, he analyzed two massive Web traffic datasets to quantify the popularity and homogeneity bias of several popular online platforms, including social media, email, personalized news, and search engines. He also leveraged Twitter data to characterize the link between political partisanship and vulnerability to online pollution, such as fake news, conspiracy theories, and junk science. His dissertation contributes to the field of computational social science by putting the study of bias in information consumption, and of derived phenomena like political polarization, echo chambers, and online pollution, on a firmer quantitative foundation.
Filippo Menczer, a professor of computer science and informatics at CNetS, appeared on a panel of experts at the annual meeting of the American Association for the Advancement of Science in Washington, D.C., on Feb. 15 to discuss the emergence and dissemination of misinformation, and how it threatens society.
Menczer was part of a three-person panel and presented a talk, “Eight Ways Social Media Makes Society Vulnerable to Misinformation.” The talk provided an overview of ongoing network analytics, modeling, and machine learning efforts to study the viral spread of misinformation and to develop tools for countering the online manipulation of opinions. Menczer has previously developed systems such as Botometer, which detects social media bots, and Hoaxy, which maps the diffusion of low-credibility content. Continue reading CNetS researcher provides expertise on misinformation battle at AAAS conference
Researchers at CNetS, IUNI, and the Indiana University Observatory on Social Media have launched upgrades to two tools playing a major role in countering the spread of misinformation online: Hoaxy and Botometer. A third tool, Fakey, an educational game designed to make people smarter news consumers, also launches with the upgrades. Continue reading 3 new tools to study and counter online disinformation