Tag Archives: bibliometrics

CNetS researchers study sleeping beauties

Why do some research papers remain dormant for years and then suddenly explode with great impact upon the scientific community? These “sleeping beauties” are the subject of a new study by CNetS researchers Qing Ke, Emilio Ferrara, Filippo Radicchi, and Alessandro Flammini, published in the Proceedings of the National Academy of Sciences. The study provides empirical evidence that a paper can truly be ahead of its time: a ‘premature’ topic may fail to attract attention even when it is introduced by authors who have already established a strong scientific reputation. The authors show that sleeping beauties can lie dormant for many decades and are more common than previously thought. The findings have been covered by media such as Nature and The New York Times. More…

Social Dynamics of Science

Read our latest paper, titled Social Dynamics of Science (doi:10.1038/srep01069), in Nature Scientific Reports. Authors Xiaoling Sun, Jasleen Kaur, Staša Milojević, Alessandro Flammini & Filippo Menczer ask: how do scientific disciplines emerge? No quantitative model to date allows us to validate competing theories on the different roles of endogenous processes, such as social collaborations, and exogenous events, such as scientific discoveries. Here we propose an agent-based model in which the evolution of disciplines is guided mainly by social interactions among agents representing scientists. Disciplines emerge from the splitting and merging of social communities in a collaboration network. We find that this social model can account for a number of stylized facts about the relationships between disciplines, scholars, and publications. These results provide strong quantitative support for the key role of social interactions in shaping the dynamics of science. While several “science of science” theories exist, this is the first account of the emergence of disciplines that is validated against empirical data.

Article in Nature on the plenitude of scientific performance metrics features the research of Bollen and Vespignani

The relatively new field of bibliometrics has experienced an explosion of research as scientists become more interested in developing metrics that can accurately measure scientists’ performance. The common but naive practice of tallying the number of journal citations a researcher accumulates has serious limitations: it ignores salient factors such as the weight of a citation as a function of a journal or database’s popularity, how well an article integrates with contemporaneous research, and individual productivity.

The article discusses Bollen’s concern that the scramble to uncover new metrics, and combinations of them, has obscured an equal need to define the concepts being measured more rigorously. It also addresses an approach taken by Vespignani and colleagues, who apply the concept of weighted citations to a network of over 400,000 papers published over 100 years in order to demonstrate the variable influence scientists have over the scientific community. Read more…

Scholarometer presented at WebSci2010

Scholarometer is becoming a more mature tool. The idea behind Scholarometer — crowdsourcing scholarly data — was presented at the Web Science 2010 Conference in Raleigh, North Carolina, along with some promising preliminary results. Recently acquired functionality includes a Chrome version, percentile calculations for all impact measures, export of bibliographic data in various standard formats, and heuristics to determine reliable tags and detect ambiguous names. Next up: an API to share annotation and impact data, and an interactive visualization of the interdisciplinary network.

Press discusses social tool to study scholarly impact

Impact metrics based on user queries

CNetS graduate student Diep Thi Hoang and associate director Filippo Menczer have developed a tool (called Scholarometer, previously Tenurometer in beta version) for evaluating the impact of scholars in their field. Scholarometer uses the h-index, which combines the scholarly output with the influence of the work, but adds the universal h-index proposed by Radicchi et al. to compare the impact of research in different disciplines. This is enabled by a social mechanism in which users of the tool collaborate to tag the disciplines of the scholars. “We have computer scientists, physicists, social scientists, people from many different backgrounds, who publish in lots of different areas,” says Menczer. However, the various communities have different citation methods and different publishing traditions, making it difficult to compare the influence of a sociologist and a computer scientist, for example. The universal h-index controls for differences in the publishing traditions, as well as the amount of research scholars in various fields have to produce to make an impact. Menczer is especially excited about the potential to help show how the disciplines are merging into one another. More from Inside Higher Ed… (Also picked up by ACM TechNews and CACM.)
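For readers unfamiliar with the metric, here is a minimal sketch of how an h-index can be computed, along with a field-normalized variant in the spirit of the Radicchi et al. rescaling. The function names are illustrative, and the assumption that normalization is a simple division of each paper's citation count by the field's average is a deliberate simplification for exposition; it is not Scholarometer's actual implementation.

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers with h or more citations each."""
    ranked = sorted(citations, reverse=True)
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # this paper still supports a larger h
        else:
            break  # papers are sorted, so no later paper can help
    return h


def universal_h_index(citations, avg_field_citations):
    """A simplified field-normalized variant (hypothetical helper):
    rescale each paper's citation count by the average citation count
    of its discipline before computing h, so that scholars from fields
    with very different citation traditions become comparable."""
    rescaled = [c / avg_field_citations for c in citations]
    return h_index(rescaled)
```

For example, a scholar with papers cited [10, 8, 5, 4, 3] times has an h-index of 4; rescaling by a field average of 2 citations per paper lowers the normalized index to 2, illustrating how the normalization discounts inflation in heavily citing fields.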