Novel bibliometric scores for evaluating research quality and output: a correlation study with established indexes

Int J Biol Markers 2016; 31(4): e451 - e455

Article Type: ORIGINAL RESEARCH ARTICLE

DOI:10.5301/jbm.5000217

OPEN ACCESS ARTICLE

Authors

Valeria Scotti, Annalisa De Silvestri, Luigia Scudeller, Paola Abele, Funda Topuz, Moreno Curti

Abstract

Introduction

Novel bibliometric indexes (commonly known as altmetrics) are gaining interest within the scientific community and might represent an important alternative measure of research quality and output.

Aims

We evaluate how these new metrics correlate with established bibliometric indexes such as the impact factor (IF), currently used as a measure of scientific production as well as a criterion for scientific research funding, and how they might be helpful in assessing the impact of research.

Methods

We calculated altmetrics scores for all the articles published at our institution during a single year and examined the correlation between altmetrics scores and IFs as a measure of research quality and impact in all departments.

Results

For all articles from the various departments published in a single year, the altmetrics score and the sum of all IFs showed a strong and significant correlation (Spearman’s rho 0.88). The correlation was significant also when the major components of altmetrics, including Facebook, Twitter and Mendeley, were analyzed. The implementation of altmetrics has been found to be easy and effective at both the researcher and librarian levels.

Conclusions

The novel bibliometric indexes known as altmetrics are consistent and reliable and can complement, or be considered a valid alternative to, standard bibliometric indexes for benchmarking research output and quality for academic and funding purposes.


Introduction

The problem of measuring the scientific and social impact of research publications has been of great interest to scientists and scholars since the inception of modern science, but questions remain about the adequacy of the current indexes. The journal impact factor (IF) and the h-index are the best-known indicators based on citation analysis (1, 2). These indexes have several limitations; a crucial one is their lack of timeliness: it takes a long time (several months to years) for an article to be cited, because it first must be read by other researchers, whose own articles then require additional time to be published. Furthermore, young researchers are generally disadvantaged, since by definition they have published fewer articles than senior researchers and citations build up over time (3). The rapid growth of the World Wide Web has made these limits even more evident. The development of tools that are more Web 2.0 oriented has profoundly changed the scientific communication process (4). In this context, many Web tools are often referred to as “social media” given their role in supporting communication and building communities (Facebook, Twitter, etc.) (5). Altmetrics is the creation and study of new metrics based on the Social Web for analyzing and informing scholarship (6), thus combining traditional bibliometric tools with the use of the Web (7).

Every year by April, all research centers in Italy (known as IRCCS) communicate to a central database (http://ricerca.cbim.it/index_en.html) maintained for the Italian Ministry of Health (MoH) the complete list of scientific publications in their research areas. This list of published work is one of the main factors the MoH uses to allocate resources for current research activities. The assessment is based on the IF, which is also used to allocate the funds received from the MoH to departments within each institution.

In this study we set out to assess the validity of the new metrics as an index of research impact. For this purpose we examined the correlation between IFs and altmetrics scores obtained in each department of our hospital, taking the Oncology Department as the reference. Our aim was to ascertain whether the altmetrics score could be used as a complementary or alternative index to evaluate the impact of research.

Materials and methods

Samples and tools

We analyzed all full-text articles published in 2013 in indexed journals (with a 2012 IF) by researchers affiliated with our hospital. The list of articles was the same as the one supplied to the MoH for funding purposes. Data were collected by searching PubMed and Web of Science, and researchers were asked to verify that all the articles they had authored had been retrieved. For each retrieved article, the altmetrics score was retrieved from Altmetric.com using the article's PMID. The 646 papers published in 2013 were then grouped by department, and both the IFs and the altmetrics scores were summed.
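
For illustration only, the lookup and aggregation steps described above could be scripted along the lines of the sketch below. This is not the workflow used in the study: it assumes the public Altmetric API v1 endpoint queried by PMID, and all PMIDs, department names and IF values shown are hypothetical.

```python
# A minimal, hypothetical sketch of the per-article lookup and per-department
# aggregation described above. It assumes the public Altmetric API v1
# (https://api.altmetric.com/v1/pmid/<pmid>), which returns 404 for articles
# with no altmetrics data; PMIDs, departments and IF values are illustrative.
import requests
from collections import defaultdict

def altmetric_score(pmid: str) -> float:
    """Return the Altmetric score for a PMID, or 0.0 if the article is not tracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/pmid/{pmid}", timeout=10)
    if resp.status_code == 404:        # no altmetrics data for this article
        return 0.0
    resp.raise_for_status()
    return float(resp.json().get("score", 0.0))

# (pmid, department, 2012 impact factor) tuples compiled from PubMed/Web of Science
articles = [
    ("24000001", "Oncology", 5.2),
    ("24000002", "Oncology", 3.1),
    ("24000003", "Internal Medicine", 4.7),
]

sum_if, sum_alt = defaultdict(float), defaultdict(float)
for pmid, dept, impact_factor in articles:
    sum_if[dept] += impact_factor
    sum_alt[dept] += altmetric_score(pmid)

for dept in sum_if:
    print(f"{dept}: sum IF = {sum_if[dept]:.1f}, sum altmetrics = {sum_alt[dept]:.1f}")
```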

For all these articles, we recorded the altmetrics score as formulated by Jason Priem in 2010 (8). Altmetrics tools capture information through metrics on HTML views and downloads of articles, blog posts, tweets, bookmarks, etc. Such information is provided in real time, and altmetrics reflect not only the impact of scientific research within the research community but also its impact on the public through social media (9).

Altmetric.com, PlumX, ImpactStory and PLoS Impact Explorer are currently the main tools that aggregate and provide article-level metrics (10, 11). In particular, the Altmetric.com badge feature allows publishers to add altmetrics data by inserting 2 simple lines of code into the article HTML. Altmetric badges are currently used by leading publishers including Wiley, Sage, Springer, Nature, Wichtig and many others, which have integrated them into their Web pages. For the purpose of our study, we selected Altmetric.com as a major tool aggregating data at the article level.

Statistical analysis

Quantitative variables are described as median and interquartile range (IQR), i.e., the 25th and 75th percentiles. The association between IF and the altmetrics score or its components is expressed through the nonparametric Spearman rho correlation coefficient. Furthermore, we correlated the number of citations in the Web of Science and the number of Mendeley readers or PubMed citations using the nonparametric Spearman rho coefficient. All analyses were performed with Stata 13 (StataCorp LP) and a p value below 0.05 was considered statistically significant (12).
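
As an illustration of the correlation analysis (performed with Stata 13 in the study), the sketch below shows the equivalent computation in Python with SciPy; the department-level sums used here are hypothetical example values, not study data.

```python
# Illustrative re-implementation of the correlation analysis (the authors used
# Stata 13). The department-level sums below are hypothetical example values.
from scipy.stats import spearmanr

sum_if  = [44, 128, 14, 230, 75, 9, 310, 52]    # sum of IFs per department
sum_alt = [14, 68, 3, 190, 40, 2, 260, 33]      # sum of altmetrics scores per department

rho, p_value = spearmanr(sum_if, sum_alt)
print(f"Spearman's rho = {rho:.2f}, p = {p_value:.4f}")
```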

Results

A total of 268 of the 646 papers (41.6%) had an altmetrics score, and 45 out of 53 departments obtained an altmetrics score (median 14, IQR 3-68). Among the components of the altmetrics score, Mendeley readers was the highest (2,403 readers across 45 departments; median 36, IQR 7-50), followed by Tweeters, with a total of 1,998 tweets across 45 departments (median 15, IQR 6-48), and Facebook walls, with a total of 247 across 32 departments (median 5, IQR 2-9). Less frequent were news outlets, with a total of 74 across 13 departments (median 4, IQR 2-8); bloggers, with a total of 70 across 18 departments (median 3, IQR 1-5); CiteULike readers, with a total of 53 across 17 departments (median 2, IQR 1-2); Google authors, with a total of 23 across 9 departments (median 2, IQR 2-3); and F1000 reviews, with a total of 18 across 13 departments (median 1, IQR 1-2). The results are summarized in Figure 1.

Fig. 1 - Score components.

The median IF by department was 44 (IQR 14-128). The correlation between the sum of altmetrics scores and the sum of IFs for all articles published in 2013, calculated for each department of our institution, was very high (Spearman’s rho 0.88; p<0.0001). Each point in Figure 2 represents an individual department.

Fig. 2 - Correlation between the sum of altmetrics scores and the sum of IFs relative to all articles published in 2013, calculated for each department of our institution.

Furthermore, the correlation between IF and single major components of altmetrics such as Facebook (Spearman’s rho 0.80; p<0.0001), Twitter (Spearman’s rho 0.90, p<0.0001) and Mendeley (Spearman’s rho 0.90, p<0.0001) was very good.

Analysis of single departments

To examine the impact of individual departments, we focused on the Oncology Department, in order to see where a single department with a typical output of relevant articles would fit in our analysis compared with the other departments.

Figure 3A shows the same correlation as Figure 2, but highlights where the Oncology Department (marked in red in the graphs) stands and where the outliers stand. Figures 3B, 3C and 3D show where the Oncology Department and the outliers stand when only Facebook, Mendeley and Twitter, respectively, are considered.

Fig. 3 - Correlation between the sum of altmetrics scores and the sum of IFs, highlighting where the Oncology Department (in red in all graphs) stands (A). Correlation between the sum of Facebook wall scores (B), Mendeley reader scores (C) and Twitter scores (D) and the sum of IFs relative to all articles published in 2013, calculated for each department of our institution.

Interestingly, while the Oncology Department seems to closely follow the trend of most departments, 2 departments had different altmetrics scores from the others: the Thromboembolic Disease Unit, which had higher Facebook, Twitter and Mendeley scores, and the Neurosurgery Clinic, with a high Facebook score. Also the Internal Medicine Unit (with papers about celiac disease) and the Hospital Management Department (particularly with a paper on exposure to pesticides or solvents and risk of Parkinson disease) had Facebook scores higher than expected on the basis of the IF. Besides these few, highly specific examples, there was very good and statistically significant agreement between altmetrics and IF.

Discussion

Assessing the importance of an article is becoming ever more crucial for researchers who lack the time to read all relevant papers. Traditionally, bibliometrics is the application of quantitative analysis and statistics to publications such as journal articles and their accompanying citations. The newly developed altmetrics have proven to be user friendly, visual, self-explanatory even for nonspecialist readers, rapidly evolving, and capable of interacting with the media, the public and users.

Institutions are also very interested in implementing the use of altmetrics data. For example, the Dublin Business School and the University of Tor Vergata will integrate PlumX within their institutional repositories (IR), so that the impact of any research output added to the IR can be measured by PlumX. Many other institutions (such as the University of Cambridge, the University of Manchester, the University of South Australia and Aalto University in Finland) are using Altmetric.com to evaluate their impact, indicating that the faculty’s research activity is an integral aspect of program accreditation and validation.

In our study, we documented a very good correlation of altmetrics with standard bibliometric indexes both at the institutional and departmental level, as already shown by Costas et al (13) and in a meta-analysis by Bornmann (14). A high percentage of manuscripts had their own altmetrics score, consistent with data shown in the biomedical field by Haustein et al (15, 16).

Altmetrics could act as a reliable tool in evaluating departments, and could be considered in addition to traditional metrics when managing funding activities. For this reason, altmetrics can greatly help institutions understand their impact on society. They may also help researchers and institutions to maximize the success of their own research efforts.

In our institution, a high score was obtained for many items, both within the research community (e.g., Mendeley readers) and among the general public (e.g., Twitter and Facebook users). Interestingly, in many cases, hot topics like thromboembolic disorders, neurodegenerative diseases and celiac disease had a greater impact on the general public than on the research community. It is thus becoming increasingly evident that alternative metrics may play a crucial role in helping society, as well as patient communities, to retrieve reliable information. Together with knowledgeable scientific journalists, they could therefore contribute to spreading relevant scientific results for the scientific education of the public. They may also highlight the value of the most successful research programs to their institutions, since altmetrics measure impact in real time (17). Showing how research is relevant to the general public is especially useful for institutions and foundations funded by public money, like the one considered in this study.

Limitations of altmetrics and implications of the study

There are certain limits that should be considered in the use of these new metrics. First, when dealing with citations there is no distinction between positive and negative comments and this fact could distort an article’s score. In the same way, social media may be particularly vulnerable to “gaming” by commercial services that sell Facebook posts, tweets or blog mentions (18, 19).

Another critical issue is that there is currently no standard for reporting altmetrics. For this reason, last year the National Information Standards Organization (NISO) received a grant to develop a standard in the field of these new metrics (20).

Last but not least, altmetrics, like bibliometrics before them, need time: the concept behind these metrics, together with the continuing development of social media, is still evolving and not yet fully understood by the scientific community.

The data from our study come from a single institution, resulting in a smaller sample size compared with other studies. However, the in-depth analysis of various departments within a single institution reduces the heterogeneity inherent in data coming from different institutions, and makes it possible to analyze a real-life situation and to measure, in a pragmatic way, the impact this new metric could have alongside the traditional ones.

Conclusion

The data resulting from this study indicate that altmetrics are useful and may well be considered reliable metrics for measuring research. Furthermore, they could represent an interesting and relevant complement to citations, providing institutions and researchers with a new framework to evaluate not only their academic influence but also their social impact. Together with traditional metrics, they could be a useful tool for guiding decision makers when funding public research. Nevertheless, further investigations are needed to explore and understand how these new indexes can be used in the evaluation of research.

Disclosures

Financial support: None.
Conflict of interest: The authors have no conflicts of interest related to this article.

References
  • 1. Garfield E The history and meaning of the journal impact factor. JAMA 2006 295 1 90 93
  • 2. Hirsch JE An index to quantify an individual’s scientific research output. Proc Natl Acad Sci USA 2005 102 46 16569 16572
  • 3. Priem J Piwowar H Hemminger B Altmetrics in the wild: using social media to explore scholarly impact. Published March, 2012. Available at http://adsabs.harvard.edu/abs/2012arXiv1203.4745P. Accessed February 14, 2016.
  • 4. Priem J Hemminger B Scientometrics 2.0: New metrics of scholarly impact on the social Web. First Monday 2010 15 7
  • 5. Torres-Salinas D Cabezas-Clavijo Á Jiménez-Contreras E Altmetrics: nuevos indicadores para la comunicación científica en la Web 2.0. Comunicar: Media Education Research Journal 2013 21 41 53 60
  • 6. Priem J Taraborelli D Groth P Neylon C Altmetrics: a manifesto. Published October 26, 2010. Available at http://altmetrics.org/manifesto. Accessed February 14, 2016.
  • 7. Priem J Groth P Taraborelli D The altmetrics collection. PLoS ONE 2012 7 11 e48753
  • 8. Priem J Twitter. Published September 28, 2010. Available at https://twitter.com/jasonpriem/status/25844968813. Accessed February 14, 2016.
  • 9. Cave R Overview of the Altmetrics landscape. In: Bernhardt BR, Hinds LH, Strauch KP. Accentuate the positive! Charleston Conference Proceedings, 2012. Against the Grain Press 2013; 349-356.
  • 10. Adie E Roe W Altmetric: enriching scholarly content with article-level discussion and metrics. Learn Publ 2013 26 1 11 17
  • 11. Rodgers E Barbrow S A look at altmetrics and its growing significance to research libraries. 2013. Available at http://hdl.handle.net/2027.42/99709. Accessed February 14, 2016.
  • 12. Landis JR Koch GG The measurement of observer agreement for categorical data. Biometrics 1977 33 1 159 174
  • 13. Costas R Zahedi Z Wouters P Do “altmetrics” correlate with citations? Extensive comparison of altmetric indicators with citations from a multidisciplinary perspective. J Assoc Inf Sci Technol 2015 66 10 2003 2019
  • 14. Bornmann L Alternative metrics in scientometrics: a meta-analysis of research into three altmetrics. Scientometrics 2015 103 3 1123 1144
  • 15. Haustein S Thelwall M Lariviere V Sugimoto CR On the relation between altmetrics and citations in medicine (RIP). In: Hinze S, Lottmann A, eds. Proceedings of the 18th International Conference on Science and Technology Indicators (STI), Berlin, Germany, September 4-6, 2013 164 166
  • 16. Haustein S Peters I Sugimoto C Thelwall M Larivière V Tweeting biomedicine: an analysis of tweets and citations in the biomedical literature. J Assoc Inf Sci Technol 2014 65 4 656 669
  • 17. Galligan F Dyas-Correia S Altmetrics: rethinking the way we measure. Serials Review 2013 39 1 56 61
  • 18. Lapinski S Piwowar H Priem J Riding the crest of the altmetrics wave. How librarians can help prepare faculty for the next generation of research impact metrics. College & Research Libraries News 2013 4 6 292 300
  • 19. Barbaro A Gentili D Rebuffi C Altmetrics as new indicators of scientific impact. Journal of the European Association of Health Information and Libraries 2014 10 1 3 6
  • 20. Alternative Metrics Initiative - National Information Standards Organization (NISO). Available at http://www.niso.org/topics/tl/altmetrics_initiative/. Accessed February 14, 2016.

Affiliations

  • Center for Scientific Documentation, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy
  • Service of Biometry & Statistics Unit, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy
  • ALMT-ALL Metrics Team, Fondazione IRCCS Policlinico San Matteo, Pavia - Italy
