MEASURING UNIVERSITY ENGAGEMENT

This article presents a model for the evaluation of scientific research output from the standpoint of university engagement with the socio-economic environment, based on a scientometric analysis of topical areas. The primary aim was to examine various interrelations between conventional and alternative scientometric indicators that most clearly reflect the relationship between universities, industry and society. Three countries and five topical research areas were chosen as the object of the study. A comparative analysis showed that conventional scientometric indicators correlate quite well with the indicators of social and commercial relevance of scientific research. However, since this relationship was not observed in the case of Brazil, an assumption was made about the influence of the national and disciplinary context. The evaluation of university engagement cannot be performed based exclusively on quantitative indicators, thus requiring qualitative assessment, e. g., peer review.


Introduction
Until recently, universities have enjoyed great academic freedom. The liberal governance model implied autonomy (delegation) based on trust [1]. Since the 19th century, governments and private sponsors have been allocating significant resources for the development of universities without requiring much accountability in return. At that time, there was no clear link in public consciousness between the progress of science and economic growth.
The Second World War convincingly demonstrated the ample possibilities of science. In addition, the post-war fertility boom stimulated expenditures on higher education [2]. The increased spending led to a demand for greater accountability, as society became interested in how its tax money was spent. People required that knowledge gained by pure science be practically useful. Industry, which directly or indirectly (through the tax system) funded science, also wanted to maximize the return on its spending.
Towards the end of the 20th century, the concept of the knowledge economy became the mainstream development paradigm. Within the framework of neoliberalism, science is increasingly considered a production process with its input and output parameters. The university has become a principal actor in the socio-economic system. Undeniably, the ties between the university, government, and business have existed long before. The theory of innovation, the backbone of which was laid by Schumpeter [3], can be divided into the following distinct areas:
- product design: diffusion of innovation [4];
- evolutionary: triple helix [5-9];
- organizational or strategic: open innovation [10-17], agile innovations [18];
- political: national and regional innovation systems [19-22].
The Triple Helix model proposed a new role for the university in the economy. The triple helix is applicable where institutional spheres overlap. It is at these points of overlap that the phenomenon of the endless frontier of new knowledge generation arises, which is a prerequisite for the evolutionary development of systems [9].
The demand for greater accountability in science raised the problem of new indicators for measuring research productivity. Until the 1990s, research performance had been primarily assessed using qualitative instruments such as peer review. However, the rapid development of information technologies, coupled with growing scholarly output, resulted in the dominance of scientometric (quantitative) indicators over qualitative ones.
Do the results of peer review and scientometric indicators coincide? The few studies conducted thus far have produced conflicting results. Mryglod et al. [23] found a strong correlation between quality and impact, although normalized per-head indicators showed a rather weak correlation; it was argued that scientometric indicators are not suitable for assessing research productivity in the social sciences and humanities. At the same time, Harzing [24] found a strong link between the results of peer review carried out at British universities in the framework of the REF (Research Excellence Framework) and the citation data retrieved from Microsoft Academic (MA). A recent study established that consistency between metrics and peer review is observed at the institutional level (rather than at the publication level), at least in the fields of physics, clinical medicine, public health, and health services & primary care [25]. Nevertheless, it should be accepted that the entire evaluation procedure is becoming more impersonal.
At almost the same time, at the turn of the century, the first university rankings began to appear. To a certain extent, they were designed to give a quantitative answer to the question of what should be done "in order to become Harvard". This presumption determined their bibliometric-based character; moreover, expert voting is also an impersonal procedure by nature. University rankings are a convenient quantitative tool, but their design predetermines their weaknesses. They are rather a marketing tool for attracting resources (human and financial); their value for improving research performance remains unclear [26]. Most university rankings embed the strong organizational profile of an American university; it therefore comes as no surprise that most of the first places are occupied by American universities [27]. Rankings create "weak expertise," which is a compromise between the interests of key stakeholders and the robustness of methodology [28]. The Three University Missions ranking from Moscow State University stands apart: it is one of the first large-scale attempts to assess the engagement of universities in solving societal problems. In this context, U-Multirank, which includes indicators of regional engagement and knowledge transfer, should also be mentioned.
Thus, the discussion around measuring university engagement in socio-economic processes continues. Bibliometric methods have limitations; at the same time, even ardent supporters of the peer review approach recognize the impossibility of using exclusively expert methods under conditions of rapidly increasing information flows. In this study, we aim to show the applicability of alternative indicators for research performance evaluation. To this end, we set out to investigate research areas in the technological frontier zone, where maximum commercially and socially relevant results can be expected.
The rest of the article is organized as follows: the next section presents a scientometric analysis of recent research in the field of university engagement; we then describe the applied methodology; the Results section summarizes the analysis of traditional and alternative scientometric indicators, as well as the correlation analysis. In the Discussion and Conclusion section, we provide an interpretation of the results, present examples of university cases and discuss the results of the Three University Missions ranking for 2019.

Recent Research
An analysis of recent literature was carried out using VOSviewer. In addition to citation and co-authorship analysis, this software product possesses text-mining functionality [29, 30]. At the first stage, we performed a topical search in the Scopus and Web of Science databases. Documents were taken for the five years 2014-2018. We identified terms that had occurred in combinations at least five times. Table 1 presents a comparative analysis of the results.

Subsequently, we opted for the database with better coverage, i. e., Scopus. At the next stage, we merged single-root words and synonyms and also eliminated words not carrying a thematic load (e. g., articles), denoting research methods (e. g., questionnaire, interview) or denoting a specific location (e. g., the United Kingdom, the United States). As a result, we obtained a scientometric map of 54 terms (Fig. 1).
The red cluster is the topical core. Note that most of "research" refers to university relations with society [31-37]; "innovation" [38] and "third mission" [39, 40] point to connections with industry. The blue cluster contains documents related to the educational foundations of university processes, such as "learning" and "curriculum" [41-44]. It also includes the organizational aspects of university processes: "organization and management" and "public relations" [45]. The green cluster represents the psychological foundations of higher education, with the centre of this class formed by the identity of the student [46, 47]. A small yellow cluster combines "academic engagement" with "academic achievement" and "academic performance." Academic engagement, including academic entrepreneurship, is often considered at the individual level [44]. Interestingly, the term connecting the red and blue clusters is "public health" [48, 49], which indicates the focus of modern economic, social, political and educational systems on maintaining human health and wellbeing. At the same time, "social justice" is the unifying term for all four clusters [50].
A complete list of terms is given in Appendix 1. Each link has its own strength, represented by a positive numerical value: the higher this value, the stronger the link. The total link strength attribute indicates the total strength of the co-occurrence links of a given term with all other terms. The average normalized citation score is a relative indicator: values are divided by their mean, so the average always equals 1 [51].
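As an illustration, the mean normalization described above can be sketched as follows; the citation counts here are hypothetical, and the helper function is ours, not part of VOSviewer:

```python
def mean_normalize(values):
    """Divide each value by the sample mean, so the normalized scores average to 1."""
    mean = sum(values) / len(values)
    return [v / mean for v in values]

# Hypothetical citation counts for five terms (mean = 8)
citations = [4, 10, 6, 12, 8]
scores = mean_normalize(citations)

print([round(s, 2) for s in scores])  # each score is relative to the mean
print(sum(scores) / len(scores))      # the average of the scores is always 1.0
```

A term with a score above 1 is thus cited more than the average term on the map, regardless of the absolute citation counts in the field.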

Methodology
The data were retrieved from the Scopus database for the period between 2014 and 2017. This period can be considered sufficient for the evaluation of research processes. Three countries were selected for analysis: the Netherlands, Brazil and Russia. The Netherlands represents a country with a developed economy. At the same time, the Netherlands features a developed university system, which not only produces high-quality research results but has also successfully commercialized its research. Brazil is a country with an emerging economy and a reasonably stable higher education system with a large share of the private sector. Russia, on the contrary, is characterized by the lion's share of public universities and large-scale attempts to improve the global competitiveness of its higher education system. For the purposes of the analysis, five areas were chosen where commercially and socially relevant results can be expected:
- Biochemistry;
- Computer Science;
- Energy;
- Engineering;
- Medicine.
At the first stage, we analysed the values of conventional scientometric indicators for the indicated countries and research domains:
- Scholarly output, an indicator of the relative strength of a research area for a given object of analysis.
- Citation, an indicator of research impact.
Citations were taken as normalized per paper. Further, we analysed two alternative indicators that show the link between scientific research and industry:
- Share of industry co-authored papers, i. e., papers with at least one author with a university affiliation and one author with an industry affiliation. This is an apparent link between university research and the economy; the advantage of this metric is its real-time availability.
- Scholarly output cited by patents. This indicator is available with a time lag (2 years minimum).
Finally, we introduced the number of mentions in the media as an indicator of the social relevance of research. To this end, we had to go down a level of analysis, because mentions in the media usually refer to the university (or author) rather than to the country or research area as a whole.
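The industry co-authorship indicator reduces to a simple count over affiliation types. A minimal sketch follows; the record format (one list of affiliation types per paper) is a simplifying assumption of ours, not the structure of the Scopus or SciVal data:

```python
def industry_coauthored_share(papers):
    """Share of papers with at least one university-affiliated
    and at least one industry-affiliated author."""
    def is_coauthored(affiliation_types):
        return "university" in affiliation_types and "industry" in affiliation_types
    hits = sum(1 for p in papers if is_coauthored(p))
    return hits / len(papers)

# Hypothetical sample: affiliation types of the authors of four papers
papers = [
    ["university", "university"],             # purely academic
    ["university", "industry"],               # counted
    ["industry"],                             # no university author
    ["university", "industry", "university"], # counted
]
print(industry_coauthored_share(papers))  # 0.5
```

Two of the four sample papers mix academic and industrial affiliations, giving a share of 0.5.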
We identified the 30 universities with the largest number of publications for each country and research area and used correlation analysis to search for possible relationships. We proceeded from the following hypotheses:
1. The number of publications in collaboration with industry positively correlates with the total scholarly output.
2. The number of mentions in the media is related to the total number of publications and/or citations.
3. The number of citations of scientific publications in patents positively correlates with the total number of citations of scientific publications of a university.
4. The number of publications co-authored by industry positively correlates with the number of citations of university publications in patents.
The citation indicator was taken as an absolute value, since the indicator of media references cannot be normalized per article.
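Testing each hypothesis amounts to computing a correlation coefficient over the per-university indicator values. A minimal sketch with a hand-rolled Pearson coefficient is shown below; the data are hypothetical (in the study itself, 30 universities per country and subject area were used, with data from SciVal):

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Hypothetical per-university values for hypothesis 1:
# total scholarly output vs. publications co-authored with industry
total_pubs    = [120, 340, 95, 410, 220, 180]
industry_pubs = [10, 41, 6, 55, 19, 15]

r = pearson(total_pubs, industry_pubs)
print(round(r, 3))  # close to 1 here, i.e. a strong positive correlation
```

In practice a statistics package (e.g. scipy.stats.pearsonr) would be used, which also reports a significance level; the sketch only illustrates the computation behind Tables 2-6.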

Results
The results of a comparative analysis of conventional scientometric indicators and indicators of the commercialization of research are presented in Figure 2.
Russia has an advantage in engineering and energy; these areas are based on the foundation laid down back in the Soviet times. At the same time, in medicine, the supreme position of the Netherlands is evident; Russia's lag in this area is particularly significant. The Netherlands is leading in terms of scientific impact in almost all analysed domains. A similar picture can be observed concerning the share of industry co-authored articles and the number of citations in patents. This similarity suggests the existence of a correlation between these indicators. For the correlation analysis, we selected 30 universities with the highest number of publications for each subject area and country. Tables 2-6 present the results of the correlation analysis.
There is an average correlation between the number of publications and media references in the fields of biochemistry, computer science and engineering. At the same time, Russia demonstrates a pronounced correlation between these indicators in all areas except medicine. In Brazil, however, these figures are not correlated with each other.
We found a moderate correlation between the number of publications in general and the number of publications in collaboration with industry. Again, in Russia, these indicators correlate in almost all areas.
The results of the correlation analysis of citations and media mentions are very similar to those presented in Table 2. Therefore, a reasonable assumption can be made about the correlation between the number of publications and the number of citations.
Citations correlate with the patent-citation count in almost all areas for Russia and the Netherlands; however, this relationship is not observed for Brazil. Thus, conventional scientometric indicators and indicators of social engagement correlate almost everywhere for Russia and moderately for the Netherlands. In Brazil, this relationship is absent in most cases. In addition, we analysed the relationship between the number of publications in collaboration with industry and the number of citations of university publications in patents.
We observed a very high correlation coefficient in the field of medicine for all the countries under study. Thus, the participation of practitioners in the preparation of a medical article is an essential condition for its use in a patent application.

Discussion and Conclusion
The results of the correlation analysis partially support the hypothesis about the relationship between conventional scientometric indicators and indicators of the social and commercial relevance of research. In Russia, these indicators correlate in almost all the analysed areas; in the Netherlands, we also observed a correlation, but not in all areas. In Brazil, the relationship between the indicators is absent in most cases. We also found a relatively strong correlation between the number of publications in collaboration with industry and the number of citations of scholarly output in patents. This relationship is most strongly expressed in the field of medicine.
On the basis of the obtained results, we argue that national and disciplinary contexts significantly influence the evaluation of university engagement. In each research domain, established traditions affect the number of publications, citations, industrial partnerships and knowledge transfer. At the same time, the activities of a university are influenced by the national economic, political and cultural context. Our results do not support the dichotomy of the global university vs. the local university. We can only talk about the matrix of a university's strategic choice (Fig. 3). In this figure, the horizontal axis represents the orientation towards research vs. education, while the vertical axis represents the orientation towards global vs. local markets.
It is essential that, under current conditions, a university cannot work exclusively at one of the poles horizontally; it can only make a strategic shift in one direction or the other. For example, it can be said that Harvard is somewhat more focused on education, while MIT is more focused on research and technology transfer. However, it is difficult to imagine that either of these institutions will completely abandon research or education, respectively. Universities opt either for the global or the local market. However, universities tend to be isomorphic: "they operate under similar incentive structures and imitate one another" [52].

The position of a locally engaged university also opens up plenty of strategic opportunities. An example is Zuyd University of Applied Sciences (the Netherlands), which is located on three campuses in Heerlen, Sittard and Maastricht. Zuyd is not included in the global university rankings. Its mission statement is short: "Professionals develop themselves with Zuyd." Zuyd University hosts 30 research centres, where associate professors, lecturers and students carry out practical and socially relevant research. They connect practice and education and contribute to innovation and R&D in the business sector. Research and knowledge transfer contribute to regional development and are designed in close cooperation with regional or Euregional government bodies, the business world and educational institutions.
In the global or the local market, the engagement mechanism works similarly. The thesis that the opposition between global and local universities is false is also supported by the results of the Three University Missions ranking. In the Top 10, we again observe the dominance of American universities, with Harvard and MIT in the first places (Table 7).

Table 7
Top 10 rankings of the Three University Missions*

It is interesting to note that the leading group is stable in composition (we compared it with the 2018 data); the only change is the emergence of Duke University in 10th place, replacing Columbia University.
We can assume that a modern university cannot function without a social mission and knowledge transfer. Nevertheless, we should note that this ranking still uses conventional scientometric indicators and only a few altmetrics, such as the number of views and visitors of the university website and the number of subscribers to the university's social media accounts. Most local universities remain out of sight due to low scientometric indicators (the ranking includes only 333 universities). In this case, we do need peer review analysis.
It is not by chance that there are many examples of engaged universities in the Netherlands. The Dutch university evaluation system, the Standard Evaluation Protocol (SEP), is focused on assessing not only the quality of research but also its social significance. In particular, it contains Table D1, where peers evaluate how effectively the university produces scientific knowledge for targeted social groups. The Dutch case is undoubtedly a positive experience, but it is not entirely clear how it can be scaled up. At the moment, we are not ready to offer a suitable organizational mechanism, but we are open to discussion with interested readers.

Fig. 1. Scientometric map of recent studies in university engagement. Source: authors' own analysis using VOSviewer

Fig. 2. Comparative analysis of conventional and alternative scientometric indicators. Source: authors' own analysis. Data source: SciVal by Elsevier.

Fig. 3. The matrix of a university's strategic choice. Source: authors' own analysis

Table 1. Results of literature search*

Table 2. Publications vs. Mass Media*
* Source: authors' own analysis. Data source: SciVal by Elsevier.

Appendix 1. A complete list of the terms*