Two CNetS teams have received prestigious awards from Minerva, a research initiative of the Department of Defense that supports basic social science research on topics of particular relevance to U.S. national security. One of the two awards will develop Science Genome, a new quantitative framework to investigate the science of science using representation learning and graph embedding. The $4.4M project will take advantage of the availability of digitized bibliographic data sets and powerful computational methods, such as machine learning with deep neural networks, to tap into hidden information present in complex scholarly graphs. The project is led by YY Ahn and also includes Staša Milojević, Alessandro Flammini, and Fil Menczer (more…). The other award aims to understand the fundamental laws governing the dynamics of science: describing and predicting the evolution of scientific fields, defining and measuring the novelty of a scientific work, assembling successful teams to solve a specific task, and defining and measuring the impact of scholars’ research. The $5M project is led by a consortium of seven prominent science of science experts at four US institutions, including CNetS professor Santo Fortunato (more…). Both projects have potential applications in policy-making, for institutions and funding agencies.
Nature story on universality of impact metrics
A story in Nature discusses a recent paper (preprint) from CNetS members Jasleen Kaur, Filippo Radicchi and Fil Menczer on the universality of scholarly impact metrics. In the paper, we present a method to quantify the disciplinary bias of any scholarly impact metric, and we use it to evaluate a number of established metrics. We also introduce a simple universal metric that makes it possible to compare the impact of scholars across scientific disciplines. Mohsen JafariAsbagh integrated this metric into Scholarometer, a crowdsourcing system developed by our group to collect and share scholarly impact data. The Nature story highlights how one can use normalized impact metrics to rank all scholars, as illustrated in the widget shown here.
Scholarometer presented at WebSci2010
Scholarometer is becoming a more mature tool. The idea behind Scholarometer, crowdsourcing scholarly data, was presented at the Web Science 2010 Conference in Raleigh, North Carolina, along with some promising preliminary results. Recently added functionality includes a Chrome version, percentile calculations for all impact measures, export of bibliographic data in various standard formats, and heuristics to determine reliable tags and detect ambiguous names. Next up: an API to share annotation and impact data, and an interactive visualization of the interdisciplinary network.
Press discusses social tool to study scholarly impact
CNetS graduate student Diep Thi Hoang and associate director Filippo Menczer have developed a tool (called Scholarometer, previously Tenurometer in beta version) for evaluating the impact of scholars in their field. Scholarometer uses the h-index, which combines the volume of a scholar's output with the influence of that work, and adds the universal h-index proposed by Radicchi et al. to compare the impact of research across disciplines. This is enabled by a social mechanism in which users of the tool collaborate to tag scholars with their disciplines. “We have computer scientists, physicists, social scientists, people from many different backgrounds, who publish in lots of different areas,” says Menczer. However, the various communities have different citation practices and different publishing traditions, making it difficult to compare the influence of, say, a sociologist and a computer scientist. The universal h-index controls for these differences in publishing traditions, as well as for how much research scholars in various fields must produce to make an impact. Menczer is especially excited about the potential to help show how the disciplines are merging into one another. More from Inside Higher Ed… (Also picked up by ACM TechNews and CACM.)
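To make the idea concrete, here is a minimal sketch of the h-index together with a simple discipline-normalized variant. The normalization shown, dividing a scholar's h by the average h of scholars in the same discipline, is only an illustration of field rescaling; the function names and the field averages are hypothetical and this is not the exact formula from the Radicchi et al. paper.

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations."""
    h = 0
    for rank, c in enumerate(sorted(citations, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h


def normalized_h(h, field_avg_h):
    """Rescale a raw h by the average h of scholars in the same discipline
    (illustrative normalization, not the published formula)."""
    return h / field_avg_h


# Example: the same raw h-index signals more impact in a field
# where citations are scarce than in a citation-rich field.
h = h_index([10, 8, 5, 4, 3])            # raw h = 4
low_citation_field = normalized_h(h, 2.0)   # 2.0
high_citation_field = normalized_h(h, 8.0)  # 0.5
```

The point of the rescaling is that raw h values are only comparable within a discipline; dividing by a field baseline puts scholars from citation-rich and citation-poor fields on a common scale.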