DESIGNING A NATIONAL AND INTERNATIONAL RATING SYSTEM TO ANALYZE THE RESEARCH ACTIVITIES OF UNIVERSITY PROFESSORS AND TEACHERS

Authors

  • Bakhromov Sayfiddin Akbarovich, Associate Professor, Department of Computational Mathematics and Information Systems, National University of Uzbekistan; Meliyev Abduxabib Abdufayyoz o'g'li, Master's Student, National University of Uzbekistan

Keywords:

publishing activity, scholar profile, scientometric databases, ranking system, research activity, research competence, digital skills

Abstract

Participation in national and international rankings reflects a university's influence and competitiveness. Leading rankings evaluate universities' quality based on the publishing activities of their faculty. This paper analyzes the design of systems for assessing the research performance of university faculty and proposes a structural and functional model for a ranking system that incorporates research productivity, digital skills, and scientometric indicators.

The developed model ensures systematic monitoring of the transparency and effectiveness of research. Key metrics drawn from databases such as Scopus, Web of Science, and Google Scholar form the basis of the research performance evaluation. Together, these indicators assess faculty productivity and each member's contribution to the university's overall research impact.
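The abstract does not specify the system's exact scoring formula. As one illustration of the kind of scientometric indicator that databases such as Scopus and Google Scholar expose, the h-index can be computed directly from an author's per-paper citation counts. The following is a generic sketch of that standard metric, not the authors' ranking model:

```python
def h_index(citations):
    """Return the h-index: the largest h such that the author has
    at least h papers, each cited at least h times."""
    cited = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(cited, start=1):
        if count >= rank:
            h = rank  # the top `rank` papers all have >= rank citations
        else:
            break
    return h


# Example: citation counts for five hypothetical papers
print(h_index([10, 8, 5, 4, 3]))  # -> 4
```

A faculty ranking system would typically combine several such indicators (publication counts, citation-based indices, journal-level metrics) into a weighted composite score; the weighting is a design decision of the system being proposed.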

The study describes the first phase of implementing the "Structural Units Transparency Rating," including experimental ranking outcomes. Findings highlight the system's role in improving the visibility and dissemination of research outputs while supporting faculty in achieving higher ranking scores. The model fosters faculty engagement in international academic networks, enhancing the university's position in global and local rankings.


References

1. Aria, M., & Cuccurullo, C. (2017). Bibliometrix: An R-tool for comprehensive science mapping analysis. Journal of Informetrics, 11(4), 959-975.

2. Moed, H. F. (2005). Citation Analysis in Research Evaluation. Springer.

3. Bornmann, L., & Daniel, H. D. (2008). What do citation counts measure? A review of studies on citing behavior. Journal of Documentation, 64(1), 45-80.

4. Waltman, L., & van Eck, N. J. (2012). A new methodology for constructing a publication-level classification system of science. Journal of the American Society for Information Science and Technology, 63(12), 2378-2392.

5. Abramo, G., Cicero, T., & D'Angelo, C. A. (2011). The dangers of performance-based research funding in non-competitive higher education systems: The case of Italian universities. Scientometrics, 87(3), 641-654.

6. Hicks, D., Wouters, P., Waltman, L., de Rijcke, S., & Rafols, I. (2015). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429-431.

7. Egghe, L. (2006). Theory and practise of the g-index. Scientometrics, 69(1), 131-152.

8. Leydesdorff, L., & Opthof, T. (2010). Scopus's source normalized impact per paper (SNIP) versus a journal impact factor based on fractional counting of citations. Journal of the American Society for Information Science and Technology, 61(11), 2365-2369.

9. Harzing, A. W. (2010). The Publish or Perish Book: Your guide to effective and responsible citation analysis. Tarma Software Research.

10. Archambault, É., & Gagné, É. V. (2004). The use of bibliometrics in the social sciences and humanities. Science-Metrix, 1-34.

11. Martin, B. R. (2011). The Research Excellence Framework and the 'impact agenda': Are we creating a Frankenstein monster? Research Evaluation, 20(3), 247-254.

12. Glänzel, W., & Moed, H. F. (2002). Journal impact measures in bibliometric research. Scientometrics, 53(2), 171-193.

13. Van Raan, A. F. J. (2005). Fatal attraction: Conceptual and methodological problems in the ranking of universities by bibliometric methods. Scientometrics, 62(1), 133-143.

14. Butler, L. (2003). Modifying publication practices in response to funding formulas. Research Evaluation, 12(1), 39-46.

15. Garfield, E. (2006). The history and meaning of the journal impact factor. JAMA, 295(1), 90-93.

16. Spivakovsky, A., Vinnyk, M., Poltoratskiy, M., Tarasich, Y., Spivakovska, Y., Gardner, G., & Panova, K. (2019). Information system of scientific activity indicators of scientific organizations: Development status and prospects. CEUR Workshop Proceedings, 2393, 220-228. Retrieved from http://ceur-ws.org/Vol-2393/paper_260.pdf

Published

2024-12-29