Novel bibliometric scores for evaluating research quality and output: a correlation study with established indexes
Int J Biol Markers 2016; 31(4): e451 - e455
Article Type: ORIGINAL RESEARCH ARTICLE
Authors: Valeria Scotti, Annalisa De Silvestri, Luigia Scudeller, Paola Abele, Funda Topuz, Moreno Curti
Novel bibliometric indexes (commonly known as altmetrics) are gaining interest within the scientific community and might represent an important alternative measure of research quality and output.
We evaluated how these new metrics correlate with established bibliometric indexes such as the impact factor (IF), currently used both as a measure of scientific production and as a criterion for research funding, and how they might help in assessing the impact of research.
We calculated altmetrics scores for all the articles published at our institution during a single year and examined the correlation between altmetrics scores and IFs as a measure of research quality and impact in all departments.
For all articles from the various departments published in a single year, the altmetrics score and the sum of all IFs showed a strong and significant correlation (Spearman’s rho 0.88). The correlation was significant also when the major components of altmetrics, including Facebook, Twitter and Mendeley, were analyzed. The implementation of altmetrics has been found to be easy and effective at both the researcher and librarian levels.
The novel bibliometric indexes known as altmetrics are consistent and reliable, and can complement or be considered a valid alternative to standard bibliometric indexes for benchmarking the output and quality of research for academic and funding purposes.
- Received on 15/04/2016
- Accepted on 23/05/2016
- Available online on 08/06/2016
- Published online on 23/12/2016
The problem of measuring the scientific and social impact of research publications has been of extreme interest to scientists and scholars since the inception of modern science, but open questions still remain on the efficacy of the current indexes. The journal impact factor (IF) and the h-index are the most well-known indicators based on citation analysis (1, 2). These indexes show several limitations; a crucial one is their lack of timeliness. It takes a long time (several months to years) for an article to be cited because it first must be read and cited by other researchers whose articles require additional time to be published. Furthermore, young researchers are generally disadvantaged since they have by definition published fewer articles than senior researchers and citations build up over time (3). The rapid growth of the World Wide Web has made these limits even more evident. The development of tools that are more Web 2.0 oriented has profoundly changed the scientific communication process (4). In this context, many Web tools are often referred to as “social media” given their role in supporting communication and building communities (Facebook, Twitter, etc.) (5). Altmetrics is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship (6), thus combining the traditional bibliometrics tools with the use of the Web (7).
By April of each year, all research centers in Italy recognized as IRCCS communicate their publication output to a central database of the Ministry of Health (MoH) for research funding purposes.
In this study we wanted to assess the validity of the new metrics as an index of the research impact. For this purpose we examined the correlation between IFs and altmetrics scores obtained in each department of our hospital, taking the Oncology Department as the reference. Our aim was to ascertain whether the altmetrics score could be used as a complementary or alternative index to evaluate the impact of research.
Materials and methods
Samples and tools
We analyzed all full-text articles published in 2013 in indexed journals (with a 2012 IF score) by researchers affiliated with our hospital. The list of articles was exactly the same as the one supplied to the MoH for funding purposes. Data were collected by searching PubMed and Web of Science, and researchers were asked to verify whether all the articles they had authored had been retrieved. For each retrieved article, the altmetrics score was searched on Altmetric.com using the article's PMID. The 646 papers published in 2013 were then grouped across departments, and both the IFs and altmetrics scores were summed.
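This per-article lookup is easy to script. As an illustrative sketch (not part of the original study workflow), the public Altmetric.com details API exposes a REST endpoint keyed by PMID that returns a JSON record containing an aggregate `score` field, and answers HTTP 404 when an article has attracted no online attention; the PMID in the docstring example is hypothetical:

```python
import json
import urllib.error
import urllib.request

# Public Altmetric.com details API, keyed by PubMed identifier.
API_BASE = "https://api.altmetric.com/v1/pmid/"

def altmetric_url(pmid):
    """Build the lookup URL for a given PMID, e.g. altmetric_url(24111222)."""
    return API_BASE + str(pmid)

def extract_score(record):
    """Pull the aggregate altmetrics score out of a JSON record (0 if absent)."""
    return record.get("score", 0)

def fetch_score(pmid):
    """Query the API; a 404 means the article has no altmetrics data yet."""
    try:
        with urllib.request.urlopen(altmetric_url(pmid)) as resp:
            return extract_score(json.load(resp))
    except urllib.error.HTTPError as err:
        if err.code == 404:  # no online attention recorded for this article
            return 0
        raise
```

Summing `fetch_score` over each department's PMID list would reproduce the per-department totals used in the correlation analysis below.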
For all these articles we recorded the altmetrics score, as originally formulated by Jason Priem in 2010 (8). Altmetrics tools capture information from HTML views and downloads of articles, blog posts, tweets, bookmarks, etc. Such information is provided in real time, and altmetrics reflect not only the impact of scientific research within the research community but also its impact on the public through social media (9).
Altmetric.com, PlumX, ImpactStory and PLoS Impact Explorer are currently the main tools that aggregate and provide article-level metrics (10, 11). In particular, the Altmetric.com badge function allows publishers to add altmetrics data with 2 simple lines of code added to the article HTML. Altmetric badges are currently used by leading publishers including Wiley, Sage, Springer, Nature, Wichtig and many others, which have integrated them into their Web pages. For the purpose of our study we selected Altmetric.com as a major tool aggregating data at the article level.
Quantitative variables are described as median and interquartile range (IQR), i.e., the 25th and 75th percentiles. The association between IF and the altmetrics score or its components is expressed through the nonparametric Spearman rho correlation coefficient. Furthermore, we correlated the number of citations in the Web of Science and the number of Mendeley readers or PubMed citations using the nonparametric Spearman rho coefficient. All analyses were performed with Stata 13 (StataCorp LP) and a p value below 0.05 was considered statistically significant (12).
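The analyses in this study were run in Stata, but the key statistic is easy to reproduce in any language. As a minimal sketch with made-up data, Spearman's rho is simply Pearson's correlation computed on average ranks:

```python
def average_ranks(values):
    """Rank values from 1..n, assigning tied values their average rank."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        # Find the run of positions i..j holding equal values (a tie group).
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average of 1-based ranks i+1 .. j+1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman_rho(x, y):
    """Spearman's rho: Pearson correlation of the rank-transformed data."""
    rx, ry = average_ranks(x), average_ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)
```

For perfectly monotone data the two rankings coincide and rho equals 1; for the per-department sums of IFs and altmetrics scores in this study, the observed value was 0.88.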
A total of 268 of the 646 papers (41.6%) had an altmetrics score, and 45 of the 53 departments obtained an altmetrics score (median 14, IQR 3-68). Among the components of the altmetrics score, Mendeley readers was the highest (a total of 2,403 readers across 45 departments; median 36, IQR 7-50), followed by tweeters (a total of 1,998 tweets across 45 departments; median 15, IQR 6-48) and Facebook walls (a total of 247 across 32 departments; median 5, IQR 2-9). Less frequent were news outlets (a total of 74 across 13 departments; median 4, IQR 2-8); bloggers (a total of 70 across 18 departments; median 3, IQR 1-5); CiteULike readers (a total of 53 across 17 departments; median 2, IQR 1-2); Google authors (a total of 23 across 9 departments; median 2, IQR 2-3); and F1000 reviews (a total of 18 across 13 departments; median 1, IQR 1-2). The results are summarized in
The median IF by department was 44 (IQR 14-128) and the correlation between the sum of altmetrics scores and the sum of IFs relative to all articles published in 2013 calculated for each of the departments of our institution was very high (Spearman’s rho 0.88; p<0.0001). Each point in
Correlation between the sum of altmetrics scores and the sum of IFs relative to all the articles published in 2013 calculated for each of the departments of our institution.
Furthermore, the correlation between IF and single major components of altmetrics such as Facebook (Spearman’s rho 0.80; p<0.0001), Twitter (Spearman’s rho 0.90, p<0.0001) and Mendeley (Spearman’s rho 0.90, p<0.0001) was very good.
Analysis of single departments
Looking at the impact of different departments, we focused on the Oncology Department to see where a single department with a typical output of relevant articles would fit in our analysis compared with the other departments.
Highlighting where the Oncology Department (in red in the graphs) stands (
Interestingly, while the Oncology Department seems to closely follow the trend of most departments, 2 departments had different altmetrics scores from the others: the Thromboembolic Disease Unit, which had higher Facebook, Twitter and Mendeley scores, and the Neurosurgery Clinic, with a high Facebook score. Also the Internal Medicine Unit (with papers about celiac disease) and the Hospital Management Department (particularly with a paper on exposure to pesticides or solvents and risk of Parkinson disease) had Facebook scores higher than expected on the basis of the IF. Besides these few, highly specific examples, there was very good and statistically significant agreement between altmetrics and IF.
Assessing the relevance of an article is becoming ever more important for researchers who lack the time to read all relevant papers. Traditionally, bibliometrics is the application of quantitative analysis and statistics to publications such as journal articles and their accompanying citations. The newly developed altmetrics have proven to be user friendly, graphic, self-explanatory even for nonspecialist readers, rapidly evolving, and interactive with the media and the public.
Institutions are also very interested in implementing the use of altmetrics data. For example, the Dublin Business School and the University of Tor Vergata will integrate PlumX into the school's institutional repository (IR), so that the impact of any research output added to the IR can be measured by PlumX. Many other institutions – such as the University of Cambridge, the University of Manchester, the University of South Australia and Aalto University in Finland – are using Altmetric.com for the evaluation of their impact, indicating that the faculty's research activity is an integral aspect of program accreditation and validation.
In our study, we documented a very good correlation of altmetrics with standard bibliometric indexes both at the institutional and departmental level, as already shown by Costas et al (13) and in a meta-analysis by Bornmann (14). A high percentage of manuscripts had their own altmetrics score, consistent with data shown in the biomedical field by Haustein et al (15, 16).
Altmetrics could act as a reliable tool in evaluating departments, and could be considered in addition to traditional metrics when managing funding activities. For this reason, altmetrics can greatly help institutions understand their impact on society. They may also help researchers and institutions to maximize the success of their own research efforts.
In our institution, high scores were obtained for many items, both within the research community (e.g., Mendeley readers) and among the general public (e.g., Twitter and Facebook users). Interestingly, in many cases, hot topics like thromboembolic disorders, neurodegenerative diseases and celiac disease had a greater impact on the general public than on the research community. It is thus becoming more and more evident that alternative metrics may play a crucial role in helping society, as well as patient communities, to retrieve reliable information. Together with knowledgeable scientific journalists, they could contribute to spreading relevant scientific results for the scientific education of the public. They may also highlight the value of the most successful research programs to their institutions, since altmetrics measure impact in real time (17). Showing how research is relevant to the general public is especially useful for institutions and foundations funded by public money, like the one considered in this study.
Limitations of altmetrics and implications of the study
There are certain limits that should be considered in the use of these new metrics. First, when dealing with citations there is no distinction between positive and negative comments and this fact could distort an article’s score. In the same way, social media may be particularly vulnerable to “gaming” by commercial services that sell Facebook posts, tweets or blog mentions (18, 19).
Another critical issue is that there is currently no standard for reporting altmetrics. For this reason, last year the National Information Standards Organization (NISO) received a grant to develop a standard in the field of these new metrics (20).
Last but not least, altmetrics – like bibliometrics before them – need time: the concept behind these metrics, together with the ongoing development of social media, is still evolving and not yet fully understood by the scientific community.
The data from our study come from a single institution, resulting in a smaller sample size compared with other studies. However, the in-depth analysis of the various departments of a single institution reduces the heterogeneity inherent in data coming from different institutions, and allows us to analyze a real-life situation and to measure in a pragmatic way the role this new metric could play in addition to the traditional ones.
The data resulting from this study indicate that altmetrics are useful and may well be considered as reliable metrics for measuring research. Furthermore, they could actually represent an interesting and relevant complement to citations, providing institutions and researchers with a new framework to evaluate not only their academic influence but their social impact. Together with traditional metrics, they could be a useful tool in guiding decision makers when funding public research. Nevertheless, further investigations are still needed to explore and understand how these new indexes can be used in the evaluation of research.
Garfield E. The history and meaning of the journal impact factor. JAMA 2006; 295(1): 90-93.
Hirsch JE. An index to quantify an individual's scientific research output. Proc Natl Acad Sci U S A 2005; 102(46): 16569-16572.
Priem J, Piwowar H, Hemminger B. Altmetrics in the wild: using social media to explore scholarly impact. Published March 2012. Available at http://adsabs.harvard.edu/abs/2012arXiv1203.4745P. Accessed February 14, 2016.
Priem J, Hemminger B. Scientometrics 2.0: new metrics of scholarly impact on the social Web. First Monday 2010; 15(7).
Torres-Salinas D, Cabezas-Clavijo Á, Jiménez-Contreras E. Altmetrics: nuevos indicadores para la comunicación científica en la Web 2.0. Comunicar 2013; 21(41): 53-60.
Priem J, Taraborelli D, Groth P, Neylon C. Altmetrics: a manifesto. Published October 26, 2010. Available at http://altmetrics.org/manifesto. Accessed February 14, 2016.
Priem J, Groth P, Taraborelli D. The altmetrics collection. PLoS One 2012; 7(11): e48753.
Priem J. Twitter. Published September 28, 2010. Available at https://twitter.com/jasonpriem/status/25844968813. Accessed February 14, 2016.
Cave R. Overview of the altmetrics landscape. In: Bernhardt BR, Hinds LH, Strauch KP, eds. Accentuate the positive! Charleston Conference Proceedings, 2012. Against the Grain Press 2013; 349-356.
Adie E, Roe W. Altmetric: enriching scholarly content with article-level discussion and metrics. Learn Publ 2013; 26(1): 11-17.
Rodgers E, Barbrow S. A look at altmetrics and its growing significance to research libraries. 2013. Available at http://hdl.handle.net/2027.42/99709. Accessed February 14, 2016.
Landis JR, Koch GG. The measurement of observer agreement for categorical data. Biometrics 1977; 33(1): 159-174.
Costas R, Zahedi Z, Wouters P. Do "altmetrics" correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. J Assoc Inf Sci Technol 2015; 66(10): 2003-2019.
Bornmann L. Alternative metrics in scientometrics: a meta-analysis of research into three altmetrics. Scientometrics 2015; 103(3): 1123-1144.
Haustein S, Thelwall M, Larivière V, Sugimoto CR. On the relation between altmetrics and citations in medicine (RIP). In: Hinze S, Lottmann A, eds. Proceedings of the 18th International Conference on Science and Technology Indicators (STI), Berlin, Germany, September 4-6, 2013; 164-166.
Haustein S, Peters I, Sugimoto CR, Thelwall M, Larivière V. Tweeting biomedicine: an analysis of tweets and citations in the biomedical literature. J Assoc Inf Sci Technol 2014; 65(4): 656-669.
Galligan F, Dyas-Correia S. Altmetrics: rethinking the way we measure. Serials Review 2013; 39(1): 56-61.
Lapinski S, Piwowar H, Priem J. Riding the crest of the altmetrics wave. 2013; 4(6): 292-300.
Barbaro A, Gentili D, Rebuffi C. Altmetrics as new indicators of scientific impact. 2014; 10(1): 3-6.
Alternative Metrics Initiative - National Information Standards Organization (NISO). Available at http://www.niso.org/topics/tl/altmetrics_initiative/. Accessed February 14, 2016.
- Scotti, Valeria 1, 3, * Corresponding Author (firstname.lastname@example.org)
- De Silvestri, Annalisa 2, 3
- Scudeller, Luigia 2, 3
- Abele, Paola 1
- Topuz, Funda 1
- Curti, Moreno 1
Center for Scientific Documentation, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy
Service of Biometry & Statistics Unit, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy
ALMT-ALL Metrics Team, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy