Journal of Data and Information Science  2019, Vol. 4 Issue (3): 55-72    DOI: 10.2478/jdis-2019-0015
Research Paper     
Methods and Practices for Institutional Benchmarking based on Research Impact and Competitiveness: A Case Study of ShanghaiTech University
Jiang Chang1†, Jianhua Liu1,2
1 Library and Information Services, ShanghaiTech University, Shanghai 201210, China
2 Beijing Wanfang Data Ltd., Beijing, 100038, China

Abstract  

Purpose: To develop and test a mission-oriented, multi-dimensional benchmarking method for a small-scale university aiming for internationally first-class basic research.

Design/methodology/approach: An individualized, evidence-based assessment scheme was employed to benchmark ShanghaiTech University against selected top research institutions, focusing on research impact and competitiveness at the institutional and disciplinary levels. Topic maps comparing ShanghaiTech with the corresponding top institutions were produced for ShanghaiTech's main research disciplines, providing opportunities for further exploration of strengths and weaknesses.

Findings: This study establishes a preliminary framework for assessing the mission of the university, together with assessment principles, assessment questions, and indicators. The analytical methods and data sources were tested and proved applicable and efficient.

Research limitations: To better fit the university's selective research foci, its schema of research disciplines needs to be reorganized, and benchmarking targets should include the top institutions in each discipline rather than only the universities leading overall rankings. The current reliance on research articles and certain databases may neglect important types of research output.

Practical implications: This study provides a working framework and practical methods for mission-oriented, individual, and multi-dimensional benchmarking that ShanghaiTech decided to use for periodical assessments. It also offers a working reference for other institutions to adapt. Further needs are identified so that ShanghaiTech can tackle them for future benchmarking.

Originality/value: This is an effort to develop a mission-oriented, individually designed, systematically structured, and multi-dimensional assessment methodology that differs from the composite indices often used.



Keywords: Differentiating evaluation; Mission-driven assessment; Evidence-based assessment; Benchmarking; Research performance
Received: 24 May 2019      Published: 02 September 2019
Cite this article:

Jiang Chang, Jianhua Liu. Methods and Practices for Institutional Benchmarking based on Research Impact and Competitiveness: A Case Study of ShanghaiTech University. Journal of Data and Information Science, 2019, 4(3): 55-72.

URL:

http://manu47.magtech.com.cn/Jwk3_jdis/10.2478/jdis-2019-0015     OR     http://manu47.magtech.com.cn/Jwk3_jdis/Y2019/V4/I3/55

Figure 1. Flow diagram of assessment process.
| Dimension | Category | Indicator | Description | Data source |
|---|---|---|---|---|
| Research output | Publication counts | Number of publications | Total number of research papers published | InCites, SciVal |
| Research impact | Counts of high-quality publications | % papers in Q1 journals | Percentage of publications in Q1 journals of the JIF quartiles | InCites |
| | | % highly cited publications | Percentage of papers ranked in the top 1% by citations within a given period | InCites, SciVal |
| | | % hot publications | Percentage of papers ranked in the top 1‰ by citations within a given period (articles and reviews) | ESI |
| | | % publications in CNS | Percentage of papers published in Cell, Nature and Science | WoS, Scopus |
| | Impact of citations | Times cited | Number of citations of the total publications | InCites, SciVal |
| | | H-index | H-index of the set of publications | WoS |
| | | Citation impact | Average (mean) number of citations per year | InCites, SciVal |
| | | Normalized citation impact | Citation impact normalized for subject, year, and document type | InCites, SciVal |
| Research topics (content analysis) | Knowledge maps | Content analysis based on research topics | Comparison of the structure of maps built from keyword pairs (reflecting research topics) between Inst. S and the benchmarking institutions | WoS, Scopus |
| | Contribution to research fronts | Publications as core papers in research fronts, with publishing-year comparison | Number of publications serving as core papers in research fronts, and comparison of their publishing years with the average publishing year of all core papers in each research front | ESI |
| Research connection | Collaboration | Main collaborators and research areas | Top collaborators with the most collaborative papers (including co-authors), and the most collaborative research areas with each collaborator | WoS, Scopus |

Table 1. Establishment of analytic dimensions and indicators for institutional benchmarking.
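Two of the citation-based indicators in Table 1 can be computed directly from a publication-level export. The following is a minimal sketch rather than the authors' code: the record fields (`citations`, `expected_citations`) and example values are assumptions, with the expected citation counts taken to be the subject-, year-, and document-type baselines reported by the data source.

```python
from typing import Dict, List

def h_index(citation_counts: List[int]) -> int:
    """H-index: the largest h such that h papers each have at least h citations."""
    h = 0
    for rank, c in enumerate(sorted(citation_counts, reverse=True), start=1):
        if c >= rank:
            h = rank
        else:
            break
    return h

def normalized_citation_impact(papers: List[Dict[str, float]]) -> float:
    """Mean ratio of actual to expected citations, where the expected value is
    the baseline for the paper's subject category, year, and document type."""
    ratios = [p["citations"] / p["expected_citations"]
              for p in papers if p["expected_citations"] > 0]
    return sum(ratios) / len(ratios) if ratios else 0.0

# Hypothetical publication records
papers = [
    {"citations": 12, "expected_citations": 8.0},
    {"citations": 3,  "expected_citations": 6.5},
    {"citations": 40, "expected_citations": 10.2},
]
print(h_index([int(p["citations"]) for p in papers]))    # 3
print(round(normalized_citation_impact(papers), 2))      # 1.96
```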
| Region | Institution | Publications | % documents in Q1 journals | % highly cited papers | % hot papers | % international collaboration | H-index | Citation impact | CNCI |
|---|---|---|---|---|---|---|---|---|---|
| International | Univ A | 19,508 | 63.50% | 3.60% | 0.20% | 69.30% | 160 | 14.7 | 2.03 |
| | Univ B | 6,898 | 65.00% | 4.90% | 0.50% | 50.60% | 116 | 16.9 | 2.41 |
| | Univ C | 19,034 | 64.80% | 5.40% | 0.40% | 51.50% | 190 | 17.8 | 2.47 |
| | Univ D | 19,834 | 62.50% | 3.60% | 0.30% | 68.60% | 156 | 13.9 | 2.01 |
| | Univ E | 22,013 | 66.00% | 5.70% | 0.50% | 55.90% | 198 | 17.9 | 2.54 |
| | Univ F | 35,249 | 64.80% | 4.70% | 0.40% | 55.60% | 221 | 17.4 | 2.39 |
| | Univ G | 14,252 | 63.30% | 2.80% | 0.20% | 66.40% | 112 | 12.5 | 1.82 |
| | Univ H | 16,885 | 65.30% | 4.40% | 0.30% | 52.90% | 165 | 17.1 | 2.15 |
| Domestic | Univ I | 31,673 | 50.40% | 2.40% | 0.10% | 32.70% | 139 | 8.9 | 1.45 |
| | Univ J | 20,278 | 52.00% | 2.30% | 0.20% | 33.00% | 128 | 9.9 | 1.42 |
| | Univ K | 27,699 | 45.90% | 1.40% | 0.10% | 31.70% | 108 | 7.2 | 1.19 |
| | Univ L | 15,927 | 48.50% | 1.60% | 0.00% | 31.30% | 102 | 9.0 | 1.24 |
| | Univ M | 19,123 | 52.10% | 2.60% | 0.10% | 30.10% | 121 | 9.7 | 1.43 |
| | Univ N | 32,958 | 47.60% | 1.60% | 0.10% | 18.30% | 123 | 7.3 | 1.21 |
| | Univ O | 2,074 | 53.10% | 2.30% | 0.10% | 41.50% | 48 | 7.6 | 1.6 |
| | SHTech-A | 1,555 | 55.00% | 3.80% | 0.10% | 41.00% | 48 | 9.0 | 1.75 |
| | SHTech-B | 660 | 52.00% | 3.00% | 0.00% | 37.30% | 27 | 5.6 | 1.58 |
| | SHTech-C | 225 | 50.70% | 5.30% | 0.00% | 46.70% | 21 | 6.2 | 1.9 |

Table 2. Overall difference of research performance between ShanghaiTech and the benchmarking institutions.
| Region | Institution | % total publications | % documents in Q1 journals | % highly cited papers | % hot papers | % international collaboration | H-index | Citation impact | CNCI |
|---|---|---|---|---|---|---|---|---|---|
| International | Univ A | 12.44% | 70.90% | 3.70% | 0.41% | 74.20% | 90 | 23.5 | 2.5 |
| | Univ B | 5.55% | 72.85% | 7.00% | 0.78% | 38.90% | 55 | 29.1 | 2.75 |
| | Univ C | 14.88% | 69.98% | 5.30% | 0.35% | 51.70% | 111 | 25.7 | 2.71 |
| | Univ D | 13.77% | 70.90% | 4.80% | 0.33% | 73.40% | 104 | 23.3 | 2.52 |
| | Univ E | 12.97% | 76.91% | 10.40% | 0.54% | 57.80% | 148 | 40.7 | 4.14 |
| | Univ F | 22.79% | 72.08% | 5.80% | 0.29% | 59.00% | 178 | 28.2 | 2.81 |
| | Univ G | 6.41% | 68.13% | 2.60% | 0.11% | 75.00% | 58 | 17.9 | 1.98 |
| | Univ H | 8.92% | 69.39% | 4.90% | 0.00% | 47.10% | 88 | 29.2 | 2.55 |
| | Median | 12.70% | 70.90% | 5.10% | 0.34% | 58.40% | 97 | 27 | 2.63 |
| Domestic | Univ I | 2.83% | 61.61% | 2.70% | 0.00% | 58.40% | 53 | 15.9 | 1.84 |
| | Univ J | 8.38% | 55.39% | 1.20% | 0.00% | 46.40% | 56 | 12.6 | 1.35 |
| | Univ K | 10.68% | 50.51% | 0.70% | 0.03% | 37.10% | 57 | 9.4 | 1.1 |
| | Univ L | 14.93% | 53.26% | 0.50% | 0.04% | 30.40% | 55 | 9.2 | 1.08 |
| | Univ M | 1.82% | 54.60% | 1.10% | 0.00% | 32.90% | 31 | 14.6 | 1.33 |
| | Univ N | 4.91% | 55.69% | 1.50% | 0.00% | 37.10% | 51 | 9.2 | 1.21 |
| | Univ O | 3.66% | 57.89% | 0.00% | 0.00% | 33.50% | 11 | 5.6 | 1.12 |
| | Median | 4.91% | 55.39% | 1.10% | 0.00% | 37.10% | 53 | 9.4 | 1.21 |
| | Total Median | 8.92% | 68.13% | 2.70% | 0.04% | 47.10% | 57 | 17.9 | 1.98 |
| | SHTech-A | 14.92% | 71.55% | 2.20% | 0.00% | 72.40% | 23 | 9.4 | 1.64 |
| | SHTech-B | 15.15% | 71.00% | 3.00% | 0.00% | 49.10% | 15 | 7.9 | 1.68 |
| | SHTech-C | 10.00% | 63.63% | 10.00% | 0.00% | 45.00% | 7 | 10.6 | 2.6 |

Table 3. Disciplinary difference (Molecular Biology & Genetics) of research performance between ShanghaiTech and the benchmarking institutions.
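The group medians reported in Table 3 (the "Median" and "Total Median" rows) can be reproduced directly from the indicator data. A minimal pandas sketch, with hypothetical column names and values standing in for the InCites/SciVal exports:

```python
import pandas as pd

# Hypothetical indicator records; real values would come from InCites/SciVal exports.
df = pd.DataFrame({
    "region":      ["International", "International", "Domestic", "Domestic"],
    "institution": ["Univ A", "Univ B", "Univ I", "Univ J"],
    "cnci":        [2.50, 2.75, 1.84, 1.35],
    "h_index":     [90, 55, 53, 56],
})

# Median per region (the "Median" rows) and over all benchmarking
# institutions (the "Total Median" row).
group_medians = df.groupby("region")[["cnci", "h_index"]].median()
total_median = df[["cnci", "h_index"]].median()
print(group_medians)
print(total_median)
```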
Figure 2. Keyword maps for Molecular Biology & Genetics of Univ E (left) and ShanghaiTech (right); mapped parameter: co-occurrence; unit of analysis: all keywords; occurrence threshold = 10 (left) and 2 (right); full counting.
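The maps in Figure 2 are built from keyword co-occurrence with full counting and an occurrence threshold. A minimal sketch of that preprocessing step, assuming hypothetical paper records (in practice the keyword lists come from WoS/Scopus records) before the network is loaded into a mapping tool:

```python
from collections import Counter
from itertools import combinations

# Each record is the keyword list of one paper (hypothetical data).
papers = [
    ["gene expression", "dna methylation", "cancer cells"],
    ["gene expression", "cancer cells", "cell proliferation"],
    ["crystal structures", "escherichia coli"],
]

# Occurrence count of each keyword (one count per paper).
occurrence = Counter(kw for kws in papers for kw in set(kws))

# Full counting: every keyword pair in a paper contributes one co-occurrence.
co_occurrence = Counter()
for kws in papers:
    for a, b in combinations(sorted(set(kws)), 2):
        co_occurrence[(a, b)] += 1

# Apply an occurrence threshold (Figure 2 uses 10 and 2) before mapping.
THRESHOLD = 2
kept = {kw for kw, n in occurrence.items() if n >= THRESHOLD}
edges = {pair: w for pair, w in co_occurrence.items()
         if pair[0] in kept and pair[1] in kept}
print(kept)
print(edges)
```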
1. alzheimer disease
2. cancer cells
3. cell cycle
4. cell lung cancer
5. cell proliferation
6. colorectal-cancer
7. crystal structures
8. dna methylation
9. embryonic stem-cells
10. epithelial-mesenchymal transition
11. escherichia coli
12. gene expression
13. human genome
14. inflammation
15. lung cancer
16. mammalian-cells
17. mesenchymal stem cells
18. molecular mechanism
19. mouse model
20. nf-kappa-b
21. oxidative stress
22. pluripotent stem-cells
23. progenitor cells
24. saccharomyces-cerevisiae
25. signaling pathway
26. skeletal muscle
27. stem cell
28. susceptibility loci
29. tumor growth
30. tumor suppression

Table 4. The most frequent keywords of Molecular Biology & Genetics of the benchmarking institutions and the year (2014-2018) of their first appearance in ShanghaiTech's papers.
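The year of first appearance reported in Table 4 can be derived by scanning ShanghaiTech's publication records in chronological order. A minimal sketch with hypothetical records; field layout and example keywords are assumptions:

```python
# For each frequent keyword of the benchmarking institutions, find the first
# year (2014-2018) in which it appears in ShanghaiTech's papers.
# Hypothetical records: (publication year, list of keywords).
shtech_papers = [
    (2015, ["gene expression", "mouse model"]),
    (2016, ["gene expression", "crystal structures"]),
    (2017, ["stem cell", "oxidative stress"]),
]
benchmark_keywords = ["gene expression", "crystal structures", "cell cycle"]

first_year = {}
for year, keywords in sorted(shtech_papers):
    for kw in keywords:
        first_year.setdefault(kw, year)  # keeps the earliest year seen

for kw in benchmark_keywords:
    print(kw, first_year.get(kw, "not yet used"))
# gene expression 2015
# crystal structures 2016
# cell cycle not yet used
```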
| Institution | Papers in research fronts | No later than average publication year (papers) | (%) | Earlier than average publication year (papers) | (%) |
|---|---|---|---|---|---|
| Univ A | 382 | 180 | 47.12% | 42 | 10.99% |
| Univ B | 221 | 85 | 38.46% | 15 | 6.79% |
| Univ C | 570 | 319 | 55.96% | 77 | 13.51% |
| Univ D | 399 | 192 | 48.12% | 40 | 10.03% |
| Univ E | 713 | 364 | 51.05% | 65 | 9.12% |
| Univ F | 921 | 470 | 51.03% | 86 | 9.34% |
| Univ G | 260 | 123 | 47.31% | 22 | 8.46% |
| Univ H | 436 | 234 | 53.67% | 53 | 12.16% |
| Univ I | 434 | 221 | 50.92% | 47 | 10.83% |
| Univ J | 275 | 148 | 53.82% | 23 | 8.36% |
| Univ K | 213 | 104 | 48.83% | 21 | 9.86% |
| Univ L | 134 | 61 | 45.52% | 10 | 7.46% |
| Univ M | 256 | 142 | 55.47% | 26 | 10.16% |
| Univ N | 272 | 131 | 48.16% | 27 | 9.93% |
| Univ O | 27 | 6 | 22.22% | 0 | 0.00% |
| SHTech-A | 32 | 13 | 40.63% | 0 | 0.00% |

Table 5. Number of papers of each institution serving as core papers in research fronts.
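The comparison underlying Table 5 contrasts the publication year of an institution's core papers with the average publication year of all core papers in each research front. A minimal sketch, assuming hypothetical ESI-style front data and treating "earlier" as a subset of "no later":

```python
from statistics import mean

# Hypothetical ESI-style data: for each research front, the publication years
# of all core papers and of the institution's own core papers.
fronts = {
    "front_1": {"all_years": [2015, 2016, 2016, 2017], "inst_years": [2016, 2017]},
    "front_2": {"all_years": [2014, 2015, 2017],       "inst_years": [2014]},
}

total = no_later = earlier = 0
for front in fronts.values():
    avg = mean(front["all_years"])  # average publication year of the front's core papers
    for y in front["inst_years"]:
        total += 1
        if y <= avg:
            no_later += 1
        if y < avg:
            earlier += 1

print(total, no_later, earlier)
print(f"{no_later / total:.2%} no later, {earlier / total:.2%} earlier than the front average")
```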
Figure 3. Map of the research productivity and competitiveness of ShanghaiTech's PIs.