Thanks to support from the Indiana University Network Science Institute (IUNI) and Digital Science Center (DSC), the full content of the Twitter data repository from the Observatory on Social Media (OSoMe) is now available to all IU researchers. Many tools to detect social bots, study the spread of fake news, visualize meme diffusion networks, trends, and maps, as well as APIs to access this data, have been available to the general public since mid-2016. Now, however, the IU research community can access enhanced data and content from the large collection, based on a 10% sample of all public tweets. A dedicated portal allows IU faculty and students to submit queries to the OSoMe cluster based on hashtags, URLs, keywords, geo-coordinates, and other criteria. At any time the system can search and retrieve data from the previous 18 months. We hope this resource will spur and support new research projects in all areas of computing, natural, and social sciences. Click here to read how to get access and learn more about the data, or attend our Open Science Forum!
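As a rough illustration of the kind of query the portal supports, a request for tweets matching given hashtags within the rolling 18-month window might be assembled as below. Note that the function, field names, and date format here are hypothetical illustrations, not the actual OSoMe API:

```python
from datetime import datetime, timedelta

def build_query(hashtags, start=None, end=None):
    """Assemble a query payload for a hypothetical tweet-search endpoint.

    The repository can search roughly the previous 18 months, so the
    start date is clamped to that window (approximated as 18 * 30 days).
    """
    now = datetime.utcnow()
    earliest = now - timedelta(days=18 * 30)  # ~18-month retention window
    start = max(start or earliest, earliest)
    end = end or now
    return {
        # Normalize hashtags: drop the leading '#', lowercase for matching.
        "hashtags": [h.lstrip("#").lower() for h in hashtags],
        "start": start.strftime("%Y-%m-%d"),
        "end": end.strftime("%Y-%m-%d"),
    }

query = build_query(["#FakeNews", "#misinformation"])
print(query["hashtags"])  # ['fakenews', 'misinformation']
```

The same shape of payload could carry URLs, keywords, or geo-coordinates as additional fields; the portal documentation linked above describes the criteria actually supported.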
The Center for Complex Networks and Systems Research (cnets.indiana.edu) at Indiana University, Bloomington has an open postdoctoral position to study how information spreads through complex online social networks. The position is funded by the DARPA program on Computational Simulation of Online Social Behavior (SocialSim). The anticipated start date is January 1, 2018 (negotiable). The appointment is renewable annually for up to three years, subject to performance and funding. Continue reading Postdoctoral Fellowship: Simulation of Information Diffusion in Online Social Networks
A project from NaN and IUNI was among 20 selected (out of over 800 applications) to address the spread of misinformation with support from the Knight Prototype Fund. Led by Fil Menczer, Giovanni Ciampaglia, Alessandro Flammini and Val Pentchev, the project will integrate the Hoaxy and Botometer tools to uncover attempts to use Internet bots to boost the spread of misinformation and shape public opinion. The tool aims to reveal how this information is generated and broadcast, how it becomes viral, its overall reach, and how it competes with accurate information for placement on user feeds. The project will be supported by the Democracy Fund, which in March, along with partners Knight Foundation and Rita Allen Foundation, launched an open call for ideas around the question: How might we improve the flow of accurate information? The call sought projects that could be quickly built to respond to the challenges affecting the health of our news ecosystem and ultimately our democracy. The winning projects will receive a share of $1 million through the Knight Prototype Fund, a program focused on human-centered approaches to solving difficult problems.
Every year the Informatics Department awards a prize to an Associate Instructor (AI) who has excelled at teaching and service. For the 2016-2017 academic year, CNetS PhD candidate Clayton A. Davis was singled out among a crowd of outstanding nominees as being particularly deserving of this award. The nomination noted Clayton’s commitment to teaching and learning, the above-and-beyond work that he put into preparing creative assignments, and his overall excellence as an instructor. Clayton was “born to disseminate knowledge,” and we predict for him a brilliant career as a teacher and communicator, as well as researcher. Congratulations!
Congratulations to Onur Varol for successfully defending his dissertation, entitled “Analyzing Social Big Data to Study Online Discourse and its Manipulation,” on April 25, 2017, supervised by Filippo Menczer. Onur completed his PhD in the Complex Systems track of the Informatics PhD Program. He has accepted a postdoctoral position at Northeastern University’s Center for Complex Network Research.
The deluge of online and offline misinformation is overloading the exchange of ideas upon which democracies depend. Many have argued that echo chambers are increasingly constricting the ability of alternative perspectives to provide a check on one’s viewpoints. Suffering fragmentation and declining public trust, the Fourth Estate struggles to carry out its traditional editorial role distinguishing facts from fiction. Without those safeguards, fake news, conspiracy theories, and deceptive social bots proliferate, facilitating the manipulation of public opinion. Countering misinformation while protecting freedom of speech will require collaboration between stakeholders across industry, journalism, and academia. To foster such collaboration, the Workshop on Digital Misinformation will be held in conjunction with the 2017 International Conference on Web and Social Media (ICWSM) in Montreal, on May 15, 2017. Continue reading ICWSM 2017 Workshop on Digital Misinformation
If you get your news from social media, as most Americans do, you are exposed to a daily dose of hoaxes, rumors, conspiracy theories and misleading news. When it’s all mixed in with reliable information from honest sources, the truth can be very hard to discern.
Among the millions of real people tweeting about the presidential race, there are also many accounts run by software rather than people, known as “bots.” Politicians and regular users alike use these accounts to increase their follower bases and push messages. PBS NewsHour science correspondent Miles O’Brien reports on how CNetS computer scientists can analyze Twitter handles to determine whether or not they are bots.
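To give a flavor of feature-based bot detection, the toy heuristic below scores an account from a handful of profile features. This is a simplified sketch for illustration only; the features, thresholds, and weights are invented here and do not reflect the actual CNetS/Botometer classifier, which uses many more features and a trained machine-learning model:

```python
def toy_bot_score(account):
    """Toy heuristic (not the CNetS/Botometer method): combine a few
    account features into a score in [0, 1]; higher = more bot-like."""
    score = 0.0
    # Extremely high posting rates are hard for a human to sustain.
    if account["tweets_per_day"] > 100:
        score += 0.4
    # A bare default profile (no bio) adds suspicion.
    if not account["has_bio"]:
        score += 0.3
    # Following far more accounts than follow back suggests automation.
    if account["following"] > 10 * max(account["followers"], 1):
        score += 0.3
    return min(score, 1.0)

suspect = {"tweets_per_day": 500, "has_bio": False,
           "followers": 12, "following": 4000}
print(toy_bot_score(suspect))  # 1.0
```

A real classifier would replace these hand-picked rules with a model trained on labeled examples, but the underlying idea is the same: behavioral and profile signals, taken together, separate automated accounts from human ones.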