Tag Archives: citation network

Nature story on universality of impact metrics

[Widget: top scholars from Scholarometer.] You can embed this top scholars widget from Scholarometer.

A story in Nature discusses a recent paper (preprint) from CNetS members Jasleen Kaur, Filippo Radicchi, and Fil Menczer on the universality of scholarly impact metrics. In the paper, we present a method to quantify the disciplinary bias of any scholarly impact metric and use it to evaluate a number of established metrics. We also introduce a simple universal metric that makes it possible to compare the impact of scholars across scientific disciplines. Mohsen JafariAsbagh integrated this metric into Scholarometer, a crowdsourcing system developed by our group to collect and share scholarly impact data. The Nature story highlights how normalized impact metrics can be used to rank all scholars, as illustrated in the widget shown here.

Science of Science

Scholarometer

Scholarometer is a social tool that facilitates citation analysis and helps evaluate the impact of an author's publications. One of the promises of Web Science is to leverage the wisdom of crowds to give rise to emergent, bottom-up semantics by making it easy for users to express relationships between arbitrary kinds of objects. Rather than starting with an ontology that determines the kinds of objects and relationships to be described and reasoned about, the idea is to give users the freedom to annotate arbitrary objects with arbitrary predicates, along with incentives for such annotations. Social tagging systems for images are one example, where the motivation can stem from the wish to organize and share one's photos or from entertaining games to guess one another's tags. The Scholarometer project explores a similar approach in the domain of scholarly publications: it provides a service to scholars by computing citation-based impact measures. This motivates users to provide disciplinary annotations for authors, which in turn can be used to compute measures that allow one to compare authors' impact across disciplinary boundaries. This crowdsourcing approach can lead to emergent semantic networks with which to study interdisciplinary annotations and trends. To learn more, please visit http://scholarometer.indiana.edu/about.html
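As a concrete illustration of the kind of citation-based measure Scholarometer computes, here is a minimal Python sketch of the standard h-index; this is illustrative code, not Scholarometer's actual implementation:

```python
def h_index(citations):
    """Largest h such that at least h papers have h or more citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    for rank, c in enumerate(counts, start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

# Five papers with these citation counts yield h = 3:
# three papers have at least 3 citations each, but not four with at least 4.
print(h_index([10, 8, 5, 2, 1]))  # -> 3
```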

Impact metrics

We proposed a method to quantify the disciplinary bias of any scholarly impact metric and used it to evaluate a number of established scholarly impact metrics. We also introduced a simple universal metric that makes it possible to compare the impact of scholars across scientific disciplines. This metric is now publicly available to scholars via Scholarometer.
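The universal metric rescales a scholar's h-index by the average h-index of scholars in the same discipline, h_s = h/⟨h⟩. The sketch below illustrates that rescaling on toy data; the names, numbers, and discipline averages are hypothetical placeholders, not values from the paper:

```python
from statistics import mean

# Hypothetical toy data: h-indices of scholars, grouped by discipline tag.
scholars = {
    "physics":   {"alice": 40, "bob": 25, "carol": 10},
    "sociology": {"dan": 15, "erin": 9, "frank": 6},
}

def universal_h(h, discipline):
    """Rescale an h-index by the average h-index in the scholar's discipline."""
    return h / mean(scholars[discipline].values())

# Raw h-indices are not comparable across fields; the rescaled values are.
print(universal_h(25, "physics"))    # bob:  25 / 25.0 = 1.0
print(universal_h(9, "sociology"))   # erin:  9 / 10.0 = 0.9
```

In Scholarometer, the per-discipline statistics come from the crowdsourced annotations described above.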

We also developed a method to decouple the roles of quantity and quality of publications in explaining how a certain level of impact is achieved. The method is based on the generation of a statistical baseline tailored to the academic profile of each researcher. As an illustration, we used it to capture the quality of the work of Nobel laureates irrespective of their number of publications, academic age, and discipline, even when traditional metrics indicate low impact in absolute terms. We further applied the methodology to almost a million scholars and over six thousand journals to measure the impact that cannot be explained by the volume of publications alone.
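The details of the baseline construction are in the papers; purely to convey the general idea, the sketch below builds a null model for one hypothetical researcher by repeatedly drawing the same number of papers from a field-wide pool of citation counts and comparing the observed h-index against the resulting distribution. All data and parameters here are illustrative assumptions:

```python
import random
from statistics import mean, stdev

def h_index(citations):
    counts = sorted(citations, reverse=True)
    return sum(1 for rank, c in enumerate(counts, start=1) if c >= rank)

def baseline_z(profile, field_pool, trials=10_000, seed=42):
    """z-score of the observed h-index against a null model that draws
    len(profile) citation counts at random from a field-wide pool."""
    rng = random.Random(seed)
    null = [h_index(rng.sample(field_pool, len(profile))) for _ in range(trials)]
    return (h_index(profile) - mean(null)) / stdev(null)

# Hypothetical field-wide pool of per-paper citation counts (skewed, as is
# typical of citation data), and one researcher's ten papers.
rng = random.Random(0)
field_pool = [int(rng.expovariate(0.1)) for _ in range(5000)]
profile = [60, 45, 30, 22, 18, 12, 9, 7, 4, 2]
print(round(baseline_z(profile, field_pool), 2))  # impact beyond volume alone
```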

Emergence of fields

The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? We developed an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from the splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several "science of science" theories exist, this is the first account of the emergence of disciplines that is validated against empirical data.
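To make the splitting-and-merging mechanism concrete, here is a deliberately simplified toy dynamic in the same spirit. It is not the published model: it operates directly on communities rather than on the full collaboration network, and all parameters are arbitrary:

```python
import random

rng = random.Random(1)
SPLIT_SIZE, MERGE_OVERLAP, STEPS = 20, 0.5, 200

communities = [set(range(5)), set(range(5, 10))]  # two seed "disciplines"
next_agent = 10

def try_merge(communities):
    """Merge the first pair of communities whose membership overlap is high."""
    for i, a in enumerate(communities):
        for b in communities[i + 1:]:
            if len(a & b) / min(len(a), len(b)) >= MERGE_OVERLAP:
                communities.remove(b)
                a |= b
                return True
    return False

for _ in range(STEPS):
    c = rng.choice(communities)
    if rng.random() < 0.7:            # a new scientist joins a community
        c.add(next_agent)
        next_agent += 1
    else:                             # a member also joins another community
        rng.choice(communities).add(rng.choice(sorted(c)))
    if len(c) > SPLIT_SIZE:           # an oversized community splits in two
        members = sorted(c)
        rng.shuffle(members)
        communities.remove(c)
        half = len(members) // 2
        communities += [set(members[:half]), set(members[half:])]
    while try_merge(communities):     # overlapping communities merge
        pass

print(f"{len(communities)} disciplines, sizes {sorted(len(c) for c in communities)}")
```

In the actual model, splitting and merging are driven by the structure of the evolving collaboration network itself.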

We are currently exploring signals from coauthorship and citation networks to predict the emergence and decline of scientific fields.

Team members

Fil Menczer, PI
Sandro Flammini
Stasa Milojevic
Santo Fortunato
Aditya Tandon
Diego R. Amancio
Filipi N. Silva
Wen Chen
Filippo Radicchi
Jasleen Kaur
Mohsen JafariAsbagh
Snehal Patil
Xiaoling Sun
Lino Possamai
Diep Hoang

Support

Our work on the emergence of fields is supported by US Navy grant N00174-17-1-0007.

Scholarometer presented at WebSci2010

Scholarometer is becoming a more mature tool. The idea behind Scholarometer, crowdsourcing scholarly data, was presented at the Web Science 2010 Conference in Raleigh, North Carolina, along with some promising preliminary results. Recently acquired functionality includes a Chrome version, percentile calculations for all impact measures, export of bibliographic data in various standard formats, and heuristics to determine reliable tags and detect ambiguous names. Next up: an API to share annotation and impact data, and an interactive visualization of the interdisciplinary network.

Press discusses social tool to study scholarly impact

Impact metrics based on user queries

CNetS graduate student Diep Thi Hoang and associate director Filippo Menczer have developed a tool (called Scholarometer, previously released in beta as Tenurometer) for evaluating the impact of scholars in their fields. Scholarometer uses the h-index, which combines scholarly output with the influence of the work, and adds the universal h-index proposed by Radicchi et al. to compare the impact of research across different disciplines. This is enabled by a social mechanism in which users of the tool collaborate to tag the disciplines of the scholars. “We have computer scientists, physicists, social scientists, people from many different backgrounds, who publish in lots of different areas,” says Menczer. However, the various communities have different citation practices and different publishing traditions, making it difficult to compare the influence of, say, a sociologist and a computer scientist. The universal h-index controls for these differences in publishing traditions, as well as for the amount of research scholars in various fields must produce to make an impact. Menczer is especially excited about the tool’s potential to show how disciplines are merging into one another. More from Inside Higher Ed… (Also picked up by ACM TechNews and CACM.)