We are looking for a research scientist to help run the Observatory on Social Media (OSoMe, pronounced awe•some) at Indiana University Bloomington (IUB). The official title of the position is Senior Project Coordinator (SPC). The Senior Project Coordinator will join the OSoMe senior management team — director Filippo Menczer, co-directors for research Betsi Grabe and Alessandro Flammini, co-directors for education Elaine Monaghan and John Paolillo, Dean James Shanahan, and associate director for technology Val Pentchev. The mission of the Observatory, which recently received a $6 million investment from the John S. and James L. Knight Foundation and Indiana University, is to study the media and technology networks that drive the online diffusion of misinformation and disinformation. OSoMe offers researchers worldwide access to data and tools for uncovering the vulnerabilities of the media ecosystem, and develops methods for increasing the resilience of citizens and democratic systems to manipulation.
First announced in September 2019, the new BotSlayer software to expose disinformation networks was designed and developed by CNetS faculty and students in collaboration with IUNI staff and the Observatory on Social Media. BotSlayer is an application that helps track and detect potential manipulation of information spreading on Twitter. It can be used by journalists, researchers, civil society organizations, corporations, and political candidates to discover new coordinated disinformation campaigns in real time. Read about how you can join the effort to spot the manipulation of social media.
Indiana University will establish a $6 million research center to study the role of media and technology in society. Led by CNetS faculty, the Observatory on Social Media will investigate how information and misinformation spread online. It will also provide students, journalists, and citizens with resources, data, and training to identify and counter attempts to intentionally manipulate public opinion. Major support for the center comes from the John S. and James L. Knight Foundation, which will contribute $3 million, along with funds from the university. The center is a collaboration among the IU School of Informatics, Computing and Engineering, The Media School, and the IU Network Science Institute.
UPDATE: This paper is ranked #3 most read among all articles published by Nature Communications in 2018
Analysis by CNetS researchers of information shared on Twitter during the 2016 U.S. presidential election has found that social bots played a disproportionate role in spreading misinformation online. The study, published in the journal Nature Communications, analyzed 14 million messages and 400,000 articles shared on Twitter between May 2016 and March 2017 — a period that spans the end of the 2016 presidential primaries and the presidential inauguration on Jan. 20, 2017. Among the findings: A mere 6 percent of Twitter accounts that the study identified as bots were enough to spread 31 percent of the low-credibility information on the network. These accounts were also responsible for 34 percent of all articles shared from low-credibility sources. The study also found that bots played a major role in promoting low-credibility content during the first few moments before a story went viral.
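The kind of aggregation behind these figures can be sketched with a few lines of Python. The account flags and share log below are invented toy data, not the study's dataset; the point is only to show how one computes the fraction of low-credibility shares attributable to accounts flagged as bots.

```python
from collections import Counter

# Hypothetical toy data (illustrative only, not the study's data):
# which accounts a bot detector flagged, and a log of low-credibility
# article shares by account.
bot_flags = {"a1": True, "a2": False, "a3": False, "a4": False}
shares = ["a1", "a1", "a2", "a1", "a3", "a1", "a4", "a1"]

# Fraction of accounts flagged as bots.
bot_accounts = {acc for acc, is_bot in bot_flags.items() if is_bot}
bot_fraction = len(bot_accounts) / len(bot_flags)

# Fraction of low-credibility shares produced by those accounts.
counts = Counter(shares)
bot_share_count = sum(n for acc, n in counts.items() if acc in bot_accounts)
share_fraction = bot_share_count / len(shares)

print(f"{bot_fraction:.0%} of accounts produced "
      f"{share_fraction:.0%} of low-credibility shares")
```

In this toy log, one flagged account out of four produces five of the eight shares, mirroring (on a tiny scale) the study's finding that a small minority of accounts accounted for a disproportionate share of low-credibility content.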
Congratulations to Clayton A. Davis, who successfully defended his PhD dissertation titled “Collect, Count, and Compare”: Expanding Access and Scope of Social Media Analysis. Dr. Davis’ work explored ways to facilitate research using massive social data through tools that are friendly for non-technical users, robust to manipulation by social bots, and that offer strict anonymity guarantees. His work has been featured on the cover of Communications of the ACM and quoted in top worldwide media venues. Web interfaces for his projects, including Botometer, Kinsey Reporter, and the Observatory on Social Media, have served millions of queries to thousands of Internet users. Davis has also made key pedagogical contributions, and co-authored a textbook on network science to be published later this year by Cambridge University Press.
Congratulations to Dimitar Nikolov, who successfully defended his PhD dissertation on Information Exposure Biases in Online Behaviors. Dr. Nikolov’s research explored the unintentional biases introduced by filtering, ranking, and recommendation algorithms that mediate our online consumption of information. His findings show that our reliance on modern online technologies limits exposure to diverse points of view and makes us vulnerable to misinformation. In particular, he analyzed two massive Web traffic datasets to quantify the popularity and homogeneity bias of several popular online platforms, including social media, email, personalized news, and search engines. He also leveraged Twitter data to characterize the link between political partisanship and vulnerability to online pollution, such as fake news, conspiracy theories, and junk science. His dissertation contributes to the field of computational social science by putting the study of bias in information consumption, and derived phenomena like political polarization, echo chambers, and online pollution, on a firmer quantitative foundation.
Filippo Menczer, a professor of computer science and informatics at CNetS, appeared on a panel of experts at the annual meeting of the American Association for the Advancement of Science in Washington, D.C., on Feb. 15 to discuss the emergence and dissemination of misinformation and how it threatens society.
Menczer was part of a three-person panel and presented a talk, “Eight Ways Social Media Makes Society Vulnerable to Misinformation.” The talk provided an overview of ongoing network analytics, modeling, and machine learning efforts to study the viral spread of misinformation and to develop tools for countering the online manipulation of opinions. Menczer has previously developed systems such as Botometer, which detects social media bots, and Hoaxy, which maps the diffusion of low-credibility content.
Onur Varol, a postdoctoral research associate at Northeastern University who earned his Ph.D. in Informatics from CNetS, has been honored with the University Distinguished Ph.D. Dissertation Award for 2018, which is the highest honor for research Indiana University bestows on its graduate students. “I am extremely happy to receive this award,” Varol said. “I would like to especially thank my advisor, Filippo Menczer, and the Informatics department for nominating me. I was lucky to be surrounded by the best advisors, collaborators, and research group I could imagine during my doctoral studies, and I am a proud IU alumni and a Hoosier.” Varol’s dissertation, “Analyzing Social Big Data to Study Online Discourse and Its Manipulation,” provided insights into analysis of online conversations and mechanisms used for their manipulation. Varol built machine learning frameworks like Botometer to detect social bots.
The spread of fake news is no game, but to recent CNetS graduate Mihai Avram, a game just might be the solution. As a graduate student in CNetS, Avram developed a mobile app called Fakey to help combat the spread of fake news on social media. It is available to download for both Android and iOS. The news literacy game places users in a simulated social media environment where they can share, “like” or fact-check articles. Users are given feedback for their actions and earn points if they share stories from legitimate news sources, or if they fact-check articles from low-credibility sources.
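The scoring rule described above can be sketched in a few lines of Python. This is a hypothetical reconstruction based only on this description; Fakey's actual point values and actions are not given here, so the `score` function and its values are invented for illustration.

```python
# Hypothetical Fakey-style scoring sketch (invented values, for
# illustration only): reward sharing credible stories and
# fact-checking low-credibility ones.
def score(action: str, credible: bool) -> int:
    """Return points earned for one action on one article."""
    if action == "share" and credible:
        return 1          # sharing a legitimate story earns a point
    if action == "fact_check" and not credible:
        return 1          # fact-checking a dubious story earns a point
    return 0              # anything else earns nothing

# A short simulated session: one good share, one good fact-check,
# and one mistaken share of a low-credibility article.
session = [("share", True), ("fact_check", False), ("share", False)]
total = sum(score(action, credible) for action, credible in session)
```

The rule rewards exactly the two behaviors the game is designed to teach, while giving no points for amplifying low-credibility content.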
Researchers at CNetS, IUNI, and the Indiana University Observatory on Social Media have launched upgrades to two tools playing a major role in countering the spread of misinformation online: Hoaxy and Botometer. A third tool, Fakey, an educational game designed to make people smarter news consumers, launches alongside the upgrades.