Bias against Novelty in Science: A Cautionary Tale for Users of Bibliometric Indicators
Research that explores uncharted waters has a high potential for major impact but also carries a higher uncertainty of having impact. Such explorative research is often described as taking a novel approach. This study examines the complex relationship between pursuing a novel approach and impact. Viewing scientific research as a combinatorial process, we measure novelty in science by examining whether a published paper makes first-time-ever combinations of referenced journals, taking into account the difficulty of making such combinations. We apply this newly developed measure of novelty to all Web of Science research articles published in 2001 across all scientific disciplines. We find that highly novel papers, defined to be those that make more (distant) new combinations, deliver high gains to science: they are more likely to be a top 1% highly cited paper in the long run, to inspire follow-on highly cited research, and to be cited in a broader set of disciplines. At the same time, novel research is also more risky, reflected by a higher variance in its citation performance. In addition, we find that novel research is significantly more highly cited in “foreign” fields but not in its “home” field. We also find strong evidence of delayed recognition of novel papers and that novel papers are less likely to be top cited when using a short time window. Finally, novel papers typically are published in journals with a lower than expected Impact Factor. These findings suggest that science policy, in particular funding decisions which rely on traditional bibliometric indicators based on short-term direct citation counts and Journal Impact Factors, may be biased against “high risk/high gain” novel research. The findings also caution against a mono-disciplinary approach in peer review to assess the true value of novel research.
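To make the combinatorial notion of novelty concrete, the following is a minimal sketch of counting first-time-ever pairings of referenced journals. The function name `novelty_score`, the toy journal names, and the use of simple set membership are illustrative assumptions; the published measure additionally weights new pairs by how difficult (distant) each combination is, which is omitted here.

```python
from itertools import combinations

def novelty_score(paper_refs, prior_pairs):
    """Count referenced-journal pairs that have never been co-cited before.

    paper_refs:  set of journal names referenced by the focal paper.
    prior_pairs: set of frozensets, each a journal pair already combined
                 in the earlier literature.
    Returns the number of new pairs and the pairs themselves.
    """
    new_pairs = [
        pair
        for pair in map(frozenset, combinations(sorted(paper_refs), 2))
        if pair not in prior_pairs
    ]
    return len(new_pairs), new_pairs

# Hypothetical example: a paper citing three journals, where exactly one
# pairing has not appeared in any previous paper's reference list.
prior = {
    frozenset({"J. Cell Biol.", "Nature"}),
    frozenset({"Nature", "Phys. Rev. Lett."}),
}
score, pairs = novelty_score({"J. Cell Biol.", "Nature", "Phys. Rev. Lett."}, prior)
print(score, [tuple(p) for p in pairs])  # -> 1 new combination
```

In the paper's setting, `prior_pairs` would be built from all journal pairs observed in the reference lists of earlier Web of Science articles, so the score reflects combinations that are genuinely new at the time of publication.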
Non-Technical Summaries
- Evaluating scientific impact using short citation windows and focusing only on the most prominent journals may fail to recognize the...
Published Versions
Wang, Jian & Veugelers, Reinhilde & Stephan, Paula, 2017. "Bias against novelty in science: A cautionary tale for users of bibliometric indicators," Research Policy, Elsevier, vol. 46(8), pages 1416-1436.