Bibliometrics as a research assessment tool: impact beyond the impact factor
Abstract:

Introduction: While bibliometrics is frequently used as a tool for research assessment and is regarded as objective, quantitative and unobtrusive, the usability of bibliometric indicators has seldom been properly assessed, and their value has therefore been questioned. The aim of this thesis is to explore and develop the utility of bibliometrics as a research assessment tool. This is done through four studies that address the validity of bibliometrics as a research assessment tool. The issues investigated relate to field delineation (Study I), collaboration (II) and research performance (I/III/IV).

Materials and methods: The thesis is primarily based on data from the citation indices (CI) produced by Thomson Scientific. In the different studies, bibliometric indicators are calculated based on 24 223 (I), 62 104 (II), 6 142 055 (III) and 596 (IV) publications. To assess the validity of bibliometric indicators calculated using the CI, these are combined with data from PubMed (I) and compared with data from manual assessments (I), financial data (II) and a Swedish system for the identification and early assessment of new methods in health care (IV). Three new indicators are developed based on theoretical reasoning (III).

Results: Frequently used bibliometric methods do not allow valid assessment of the development of research areas (I) or of collaboration between academia and industry (II). There are also flaws associated with the current state-of-the-art performance indicator (III). At the same time, there are bibliometric methods available that can be used to identify research areas (I), and certain types of collaboration between academia and industry can be accurately described using bibliometric indicators (II). New performance indicators, which assign equal weight to all publications and control for the skewed and differing distribution of citations over publications, have been developed (III). Publications related to health care technologies that clinical experts deem very promising receive high scores on these indicators (IV).

Discussion: The results of this thesis show that uncritical assessments of research areas based on rudimentary article identification strategies, or collaboration analyses based solely on co-authorship data, may be misleading and thus provide incorrect information for decision-making. At the same time, correct use of refined bibliometric indicators may provide valuable background information for decision makers.
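The abstract does not specify how the new performance indicators (Study III) are constructed, only that they give every publication equal weight and control for the skewed citation distribution. As a minimal sketch of that general idea, not the thesis's actual indicators, one common approach is to score a unit by the share of its publications that fall in the top decile of its field's citation distribution; all names and citation counts below are hypothetical.

```python
def top_decile_share(unit_citations, field_citations):
    """Share of a unit's publications among the top 10% most cited
    publications of its field. Each publication counts equally, and the
    percentile threshold makes the score robust to the heavy skew of
    citation counts (a few papers collecting most citations)."""
    ranked = sorted(field_citations, reverse=True)
    k = max(1, len(ranked) // 10)      # size of the field's top decile
    threshold = ranked[k - 1]          # citations needed to enter the top 10%
    hits = sum(1 for c in unit_citations if c >= threshold)
    return hits / len(unit_citations)

# Hypothetical field with a typically skewed citation distribution
field = [0, 0, 1, 1, 2, 2, 3, 4, 8, 40]
unit = [0, 3, 9, 50]                   # publications of the unit under assessment
share = top_decile_share(unit, field)  # 1 of 4 papers in the field's top decile
```

Because every publication enters the numerator and denominator with equal weight, a single very highly cited paper cannot dominate the score the way it does with mean-based citation indicators.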