Quantitative Social Research

Anchoring effects in the assessment of papers


In this study, we empirically examine the assessment of cited papers within the framework of the anchoring-and-adjustment heuristic. We are interested in whether citation decisions are (mainly) driven by the quality of the cited references.

We undertook a survey of corresponding authors with an available email address in the Web of Science database. The authors were asked to assess the quality of papers that they had cited in previous papers. Authors were assigned to one of three treatment groups that received further information alongside the cited paper: citation information, information on the publishing journal (journal impact factor), or a numerical access code to enter the survey. The control group did not receive any further information.

Our results show that the quality assessments of papers seem to depend on the citation impact information of single papers. The other information (anchors), such as an arbitrary number (an access code) and journal impact information, did not play an important role in the assessments of papers (Bornmann et al., 2023).

The study was pre-registered. An article with the results was published in March 2023. At the moment (May 2023) we are planning a second survey in which we want to control for the actual quality of the papers better than was possible in the first survey.

Research team: Lutz Bornmann1, Christian Ganser2, Alexander Tekles1,2

1 Science Policy and Strategy Department, Administrative Headquarters of the Max Planck Society, Munich, Germany, 2 Department of Sociology

Image by Mudassar Iqbal on Pixabay