Keywords: Performance-based research funding, national research evaluation system, innovation, research and innovation system.


Topicality. The implementation of Performance-Based Research Funding (PBRF) in EU Member States has recently been a priority for the development of research and innovation. This financial mechanism refers to the competitive organizational and institutional allocation of research funding.
Aim and tasks. The aim of the article is to develop the theoretical basis of Performance-Based Research Funding (PBRF) assessment in order to increase the effectiveness of scientific research, and to analyze the implementation of this mechanism in Ukraine.
Research results. Given the formation of academic freedom, the article clarifies the structure of performance-based research funding evaluation, the constellations of scientific activities, the categories of research activities, and the hypotheses concerning the effectiveness of national research evaluation systems (NRES). A taxonomy of efficiency indicators of scientific activity in Ukraine, New Zealand, and Sweden is considered.
The following factors explain efficiency differences between science systems: the level of competition; the share of project funding; performance-based funding systems; and national evaluation systems.
The following activities are excluded from the definition of research, except where they are used primarily for the support of, or as part of, research and experimental development activities: preparation for teaching; the provision of advice or opinion, except where it is consistent with the PBRF's definition of research; scientific and technical information services; general-purpose or routine data collection; standardisation and routine testing (but not standards development); feasibility studies (except those into research and experimental development projects); specialised routine medical care; and the commercial, legal, and administrative aspects of patenting, copyrighting, or licensing activities.
Conclusions. The theoretical basis of Performance-Based Research Funding (PBRF) assessment to increase the effectiveness of scientific research in Ukraine is developed. Performance-based research funding is governed by the following set of principles: comprehensiveness; respect for academic traditions; consistency; continuity; differentiation; credibility; efficiency; transparency; and complementarity. In our opinion, the Methodology for evaluating the effectiveness of scientific institutions of the National Academy of Sciences of Ukraine can be supplemented by three elements: a moderation panel, a comprehensive peer-review panel, and a PBRF audit.

Author Biographies


Dr. Sc. (Economics), Senior Researcher
Institute of Market Problems and Economic and Ecological Research of the
National Academy of Sciences of Ukraine
Frantsuzskiy Boulevard, 29, Odesa, Ukraine


Postgraduate Student
Institute of Market Problems and Economic and Ecological Research of the
National Academy of Sciences of Ukraine
Frantsuzskiy Boulevard, 29, Odesa, Ukraine

