Journal of Data and Information Science, 2018, Vol. 3, Issue 4: 74-84. DOI: 10.2478/jdis-2018-0022
Research Paper     
A Local Adaptation in an Output-Based Research Support Scheme (OBRSS) at University College Dublin
Liam Cleere, Lai Ma
University College Dublin, Belfield, Dublin 4, Ireland

Abstract  

University College Dublin (UCD) has operated the Output-Based Research Support Scheme (OBRSS) since 2016. Adapted from the Norwegian model, the OBRSS awards individual academic staff using a points system based on the number of publications and doctoral students. This article describes the design and implementation of the OBRSS, including the creation of the ranked publication list and points system, as well as the infrastructure requirements. Results of the OBRSS are presented, focusing on the coverage of publications reported in the OBRSS ranked publication list and in Scopus, together with information about spending patterns. Challenges in evaluating the OBRSS in terms of fairness, transparency, and effectiveness are also discussed.



Key words: Output-Based Research Support Scheme; Norwegian model; Performance-based funding; Research assessment; Research information systems
Received: 22 August 2018      Published: 08 January 2019
Cite this article:

Liam Cleere, Lai Ma. A Local Adaptation in an Output-Based Research Support Scheme (OBRSS) at University College Dublin. Journal of Data and Information Science, 2018, 3(4): 74-84.

URL:

http://manu47.magtech.com.cn/Jwk3_jdis/10.2478/jdis-2018-0022     OR     http://manu47.magtech.com.cn/Jwk3_jdis/Y2018/V3/I4/74

Figure 1. OBRSS points.
Publication type | Points, Level 1 'normal' | Points, Level 2 'prestigious'
Book | 5 | 8
Journal Article | 1 | 3
Book Chapter | 1 | 3
Conference Publication | 0.5 | 2
Edited Book | 1 | 3
Other Publication | 0.5 | 2
Published Report | 1 | 3
Table 1 Points allocation per publication type.
Publication output-points = B × C × F × N, where
B = base points, allocated by publication type and by 'normal' or 'prestigious' channel (see Table 1);
C = collaboration factor (multiply by 1.25 if any author on the paper is international);
F = UCD author factor (multiply by 0.7 for two UCD academic staff on the paper, 0.6 for three, and 0.5 for four or more);
N = large-collaboration factor (multiply by 0.1 if the total number of authors on the paper exceeds 100).
Table 2 Calculation of publication output-points.
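To make the calculation concrete, the following is a minimal Python sketch of the formula in Tables 1 and 2; the function name obrss_points and its parameters are illustrative, not part of UCD's actual implementation.

```python
# Points per publication type (Table 1): (Level 1 'normal', Level 2 'prestigious')
POINTS = {
    "Book": (5, 8),
    "Journal Article": (1, 3),
    "Book Chapter": (1, 3),
    "Conference Publication": (0.5, 2),
    "Edited Book": (1, 3),
    "Other Publication": (0.5, 2),
    "Published Report": (1, 3),
}

def obrss_points(pub_type: str, prestigious: bool, international: bool,
                 ucd_authors: int, total_authors: int) -> float:
    """Sketch of publication output-points = B x C x F x N (Table 2)."""
    b = POINTS[pub_type][1 if prestigious else 0]  # B: base points by type and channel
    c = 1.25 if international else 1.0             # C: collaboration factor
    if ucd_authors >= 4:                           # F: UCD author factor
        f = 0.5
    elif ucd_authors == 3:
        f = 0.6
    elif ucd_authors == 2:
        f = 0.7
    else:
        f = 1.0
    n = 0.1 if total_authors > 100 else 1.0        # N: large-collaboration factor
    return b * c * f * n

# Example: a 'prestigious' journal article with an international co-author
# and two UCD academic staff scores 3 x 1.25 x 0.7 x 1 = 2.625 points.
print(obrss_points("Journal Article", True, True, 2, 5))  # 2.625
```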
UCD School (Discipline) | Scopus Total 2013-2017 | CRIS Total 2013-2017 | % Coverage in Scopus
Agriculture & Food Science | 868 | 1,085 | 80.0%
Archaeology | 73 | 202 | 36.1%
Architecture, Planning and Environmental Policy | 127 | 580 | 21.9%
Art History & Cultural Policy | 12 | 141 | 8.5%
Biology & Environmental Science | 448 | 572 | 78.3%
Biomolecular & Biomedical Science | 468 | 522 | 89.7%
Biosystems and Food Engineering | 470 | 665 | 70.7%
Business | 459 | 1,047 | 43.8%
Chemical & Bioprocess Engineering | 250 | 266 | 94.0%
Chemistry | 391 | 452 | 86.5%
Civil Engineering | 205 | 470 | 43.6%
Classics | 5 | 53 | 9.4%
Computer Science | 767 | 916 | 83.7%
Earth Sciences | 149 | 390 | 38.2%
Economics | 147 | 186 | 79.0%
Education | 83 | 178 | 46.6%
Electrical & Electronic Engineering | 730 | 860 | 84.9%
English, Drama & Film | 78 | 415 | 18.8%
Geography | 86 | 324 | 26.5%
History | 40 | 291 | 13.7%
Information & Communication Studies | 61 | 149 | 40.9%
Irish, Celtic Studies and Folklore | 3 | 145 | 2.1%
Languages, Cultures and Linguistics | 57 | 367 | 15.5%
Law | 52 | 495 | 10.5%
Mathematics & Statistics | 460 | 617 | 74.6%
Mechanical & Materials Engineering | 478 | 919 | 52.0%
Medicine | 1,867 | 2,451 | 76.2%
Music | 7 | 126 | 5.6%
Nursing, Midwifery & Health Systems | 241 | 534 | 45.1%
Philosophy | 98 | 246 | 39.8%
Physics | 1,325 | 1,403 | 94.4%
Politics & International Relations | 124 | 321 | 38.6%
Psychology | 277 | 622 | 44.5%
Public Health, Physiotherapy and Sports Science | 747 | 977 | 76.5%
Social Policy, Social Work and Social Justice | 128 | 437 | 29.3%
Sociology | 46 | 267 | 17.2%
Veterinary Medicine | 626 | 1,094 | 57.2%
Grand Total | 12,453 | 20,785 | 59.9%
Table 3 Comparison of publications for academic staff only, 2013 to 2017 inclusive; Scopus data from SciVal, 25 May 2018; CRIS data from UCD RMS Profiles, 22 June 2018.
UCD College Name | Scopus Total 2013-2017 | CRIS Total 2013-2017 | % Coverage in Scopus
College of Arts & Humanities | 228 | 1,556 | 14.7%
College of Business | 468 | 1,046 | 44.7%
College of Engineering & Architecture | 2,384 | 3,691 | 64.6%
College of Health and Agricultural Sciences | 4,017 | 6,021 | 66.7%
College of Science | 3,925 | 5,075 | 77.3%
College of Social Sciences & Law | 1,227 | 3,487 | 35.2%
Grand Total | 12,249 | 20,876 | 58.7%
Table 4 Comparison of publications for academic staff only, 2013 to 2017 inclusive; Scopus data from SciVal, 25 May 2018; CRIS data from UCD RMS Profiles, 22 June 2018.
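For reference, the '% Coverage in Scopus' column in Tables 3 and 4 is simply the Scopus total divided by the CRIS total for each unit. A minimal Python sketch, using two rows from Table 3 as sample data:

```python
# Coverage = publications indexed in Scopus / publications recorded in the CRIS.
# Sample data below are two rows taken from Table 3.
counts = {
    "Physics": (1_325, 1_403),  # (Scopus total, CRIS total), 2013-2017
    "Law": (52, 495),
}

for unit, (scopus, cris) in counts.items():
    print(f"{unit}: {scopus / cris:.1%} coverage in Scopus")

# Output:
# Physics: 94.4% coverage in Scopus
# Law: 10.5% coverage in Scopus
```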
Figure 2. Percentage shares of total output per College across the OBRSS categories: Prestigious Channel - Level 2; Normal Channel - Level 1; not recognised in the OBRSS publication list.
OBRSS category | 2016 Scheme (publications from 2013 to 2015) | 2017 Scheme (publications from 2014 to 2016) | Difference | % Difference
Prestigious Channel - Level 2 | 4,230 | 4,444 | 214 | 5.1%
Normal Channel - Level 1 | 4,267 | 6,323 | 2,056 | 48.2%
Not recognised in OBRSS publication list | 4,515 | 3,202 | -1,313 | -29.1%
Grand Total | 13,012 | 13,969 | 957 | 7.4%
Table 5 Number of publications per OBRSS category, per scheme year.
Journal list | 2016 | 2017 | Difference | % Difference
Prestigious Channel - Level 2 | 4,485 | 3,958 | -527 | -11.8%
Normal Channel - Level 1 | 38,544 | 39,128 | 584 | 1.5%
Grand Total | 45,045 | 45,103 | 58 | 0.1%
Table 6 Number of ranked journal, conference, and book series channels per OBRSS category, per scheme year.
Publisher list | 2016 | 2017 | Difference | % Difference
Prestigious Channel - Level 2 | 265 | 257 | -8 | -3.0%
Normal Channel - Level 1 | 2,190 | 2,200 | 10 | 0.5%
Grand Total | 2,455 | 2,457 | 2 | 0.1%
Table 7 Number of ranked publisher channels per OBRSS category, per scheme year.
Figure 3. Number of award recipients per college and average award value.