Scholarometer is a social tool that facilitates citation analysis and helps evaluate the impact of an author’s publications. One of the promises of Web Science is to leverage the wisdom of the crowds to give rise to emergent, bottom-up semantics by making it easy for users to express relationships between arbitrary kinds of objects. Rather than starting with an ontology that determines the kinds of objects and relationships to be described and reasoned about, the idea is to give users the freedom to annotate arbitrary objects with arbitrary predicates, along with incentives for such annotations. Social tagging systems for images are one example, where the motivation can stem from the wish to organize and share one’s photos, or from entertaining games in which players guess one another’s tags.

The Scholarometer project explores a similar approach in the domain of scholarly publications. Scholarometer provides a service to scholars by computing citation-based impact measures. This motivates users to provide disciplinary annotations for authors, which in turn can be used to compute measures that make it possible to compare authors’ impact across disciplinary boundaries. This crowdsourcing approach can give rise to emergent semantic networks for studying interdisciplinary annotations and trends. To learn more, please visit http://scholarometer.indiana.edu/about.html
We proposed a method to quantify the disciplinary bias of any scholarly impact metric, and used it to evaluate a number of established scholarly impact metrics. We introduced a simple universal metric that makes it possible to compare the impact of scholars across scientific disciplines. This metric is now publicly available to scholars via Scholarometer.
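The core idea behind the universal metric is to normalize a scholar’s h-index by the typical h-index of scholars annotated with the same discipline. A minimal sketch, assuming a simple average as the discipline baseline (the function name and the sample values below are illustrative, not real data):

```python
from statistics import mean

def universal_impact(h, discipline_h_values):
    """Normalize a scholar's h-index by the average h-index of
    scholars in the same discipline (illustrative sketch only)."""
    return h / mean(discipline_h_values)

# Hypothetical discipline with average h-index 18:
peers = [12, 30, 8, 25, 15]
print(universal_impact(18, peers))  # → 1.0 (exactly average for the field)
```

A value above 1 indicates impact above the disciplinary average, making scores comparable across fields with very different citation practices.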
We also developed a method to decouple the roles of quantity and quality of publications in explaining how a certain level of impact is achieved. The method generates a statistical baseline tailored to the academic profile of each researcher. As an illustration, we used it to capture the quality of the work of Nobel laureates irrespective of their number of publications, academic age, and discipline, even when traditional metrics indicate low impact in absolute terms. We further applied the methodology to almost a million scholars and over six thousand journals to measure the impact that cannot be explained by the volume of publications alone.
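The baseline idea can be illustrated with a simple sampling sketch: compare an author’s h-index to the distribution of h-indices obtained by repeatedly drawing the same number of papers at random from a reference pool of citation counts. The pool construction and parameters below are illustrative assumptions, not the exact procedure of the published method:

```python
import random
from statistics import mean, stdev

def h_index(citations):
    """h = largest h such that h papers each have at least h citations."""
    cites = sorted(citations, reverse=True)
    return sum(1 for i, c in enumerate(cites, start=1) if c >= i)

def impact_z_score(author_citations, pool, trials=1000, seed=42):
    """Score an author's h-index against a null model built by sampling
    the same number of papers from a citation pool (illustrative sketch):
    a high z-score means impact beyond what publication volume explains."""
    rng = random.Random(seed)
    n = len(author_citations)
    baseline = [h_index(rng.sample(pool, n)) for _ in range(trials)]
    mu, sigma = mean(baseline), stdev(baseline)
    return (h_index(author_citations) - mu) / sigma
```

Because the baseline is drawn with the author’s own number of papers, the score separates quality from sheer quantity: publishing more papers shifts the null distribution along with the observed value.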
Emergence of fields
The birth and decline of disciplines are critical to science and society. How do scientific disciplines emerge? We developed an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from the splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several “science of science” theories exist, this is the first account of the emergence of disciplines that is validated against empirical data.
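As a toy illustration of the social mechanism, one can grow a collaboration network in which each new paper either recruits a newcomer or links two existing authors chosen preferentially by their past activity; communities in such a network can then split and merge as it grows. The rules and parameters below are illustrative assumptions, not the published model:

```python
import random

def grow_collaboration_network(steps=200, p_new=0.3, seed=1):
    """Toy sketch of a social model of science: each step adds one
    coauthored paper, either pairing an active author with a newcomer
    (probability p_new) or pairing two existing authors chosen in
    proportion to their number of past papers. Illustrative only."""
    rng = random.Random(seed)
    edges = {(0, 1)}          # start from a single coauthored paper
    papers = {0: 1, 1: 1}     # papers written by each author
    for _ in range(steps):
        nodes = list(papers)
        weights = [papers[n] for n in nodes]
        a = rng.choices(nodes, weights=weights)[0]
        if rng.random() < p_new:
            b = len(papers)   # a newcomer enters science
            papers[b] = 0
        else:
            b = rng.choices(nodes, weights=weights)[0]
            if b == a:
                continue      # skip degenerate single-author "pairs"
        edges.add((min(a, b), max(a, b)))
        papers[a] += 1
        papers[b] += 1
    return papers, edges
```

Running a community-detection algorithm on snapshots of such a network over time is one way to observe groups of collaborators splitting apart or merging, the mechanism the model associates with the birth and decline of disciplines.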
We are currently exploring signals from coauthorship and citation networks to predict the emergence and decline of scientific fields.
- J. Kaur, E. Ferrara, F. Menczer, A. Flammini, F. Radicchi (2015) Quality versus quantity in scientific impact. Journal of Informetrics 9(4): 800-808, doi:10.1016/j.joi.2015.07.008
- Jasleen Kaur, Mohsen JafariAsbagh, Filippo Radicchi, Filippo Menczer (2014) Scholarometer: a system for crowdsourcing scholarly impact metrics. Proceedings of the 2014 ACM conference on Web Science, 285-286
- Jasleen Kaur, Mohsen JafariAsbagh, Filippo Radicchi, Filippo Menczer (2014) Crowdsourced disciplines and universal impact. Proc. ACM WebSci14 Altmetrics workshop.
- Jasleen Kaur, Filippo Radicchi, Filippo Menczer (2014) On the use of sampling statistics to advance bibliometrics. Journal of Informetrics 8(2): 419-420, doi:10.1016/j.joi.2014.01.010
- Jasleen Kaur, Filippo Radicchi, and Filippo Menczer (2013) Universality of scholarly impact metrics. Journal of Informetrics 7(4): 924-932, doi:10.1016/j.joi.2013.09.002
- Presented at ECCS’13 satellite workshop on Quantifying Success
- Presented at Microsoft Research Faculty Summit 2013, panel on Publishing and Perishing in the 21st Century
- Discussed in Nature
- Xiaoling Sun, Jasleen Kaur, Stasa Milojevic, Alessandro Flammini and Filippo Menczer (2013) Social Dynamics of Science. Scientific Reports 3: 1069. doi:10.1038/srep01069
- Presented at International Symposium on Science of Science, Library of Congress, Washington, DC, 2016
- Presented at 2013 International Science of Team Science (SciTS) Conference
- Presented at ECCS’13 COVENANT workshop
- Xiaoling Sun, Jasleen Kaur, Lino Possamai and Filippo Menczer (2013) Ambiguous author query detection using crowdsourced digital library annotations. Information Processing & Management 49(2): 454-464. doi:10.1016/j.ipm.2012.09.001
- Jasleen Kaur, Diep Thi Hoang, Xiaoling Sun, Lino Possamai, Mohsen JafariAsbagh, Snehal Patil and Filippo Menczer (2012) Scholarometer: A Social Framework for Analyzing Impact across Disciplines. PLoS ONE 7(9): e43235. doi:10.1371/journal.pone.0043235
- Xiaoling Sun, Jasleen Kaur, Lino Possamai and Filippo Menczer (2011). Detecting Ambiguous Author Names in Crowdsourced Scholarly Data. Proceedings of 3rd IEEE Conference on Social Computing
- Diep Thi Hoang, Jasleen Kaur and Filippo Menczer (2010) Crowdsourcing Scholarly Data. Proceedings of the Web Science Conference (WebSci10): Extending the Frontiers of Society On-Line
Our work on the emergence of fields is supported by US Navy grant N00174-17-1-0007.