The science-wide author databases of standardized citation indicators are a multidimensional ranking of the world's scientists, produced since 2015 by a team of researchers led by John P. A. Ioannidis at Stanford University.[1][2]
Main
Based on data from Scopus, these indicators draw on about 8 million records of scientists' citations to rank a subset of the 200,000 most-cited authors across all scientific fields. The result is commonly referred to as the Stanford ranking of the top 2% of scientists.[3]
The ranking is achieved via a composite indicator built on six citation metrics:
The total number of citations;
The Hirsch h-index;
The co-authorship-adjusted (Schreiber) hm-index;
The number of citations to papers as a single author;
The number of citations to papers as single or first author;
The number of citations to papers as single, first, or last author.
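The published methodology combines these metrics by log-normalizing each one against the largest value observed across all authors and summing the six terms. A minimal sketch in Python, assuming that normalization scheme (the metric names, helper function, and toy values below are illustrative, not the actual database schema):

```python
import math

# The six citation metrics feeding the composite indicator
# (abbreviations are illustrative, not the database's field names).
METRICS = ["nc", "h", "hm", "ncs", "ncsf", "ncsfl"]

def composite_score(author, maxima):
    """Sum of log-normalized metrics: log(1 + x) / log(1 + max_x) per metric."""
    return sum(
        math.log(1 + author[m]) / math.log(1 + maxima[m])
        for m in METRICS
    )

# Toy records for two hypothetical authors (values are invented).
authors = [
    {"nc": 12000, "h": 50, "hm": 30, "ncs": 800, "ncsf": 4000, "ncsfl": 9000},
    {"nc": 5000, "h": 35, "hm": 22, "ncs": 100, "ncsf": 1500, "ncsfl": 3000},
]

# Normalize each metric against its maximum observed value.
maxima = {m: max(a[m] for a in authors) for m in METRICS}

# Rank authors by descending composite score.
ranked = sorted(authors, key=lambda a: composite_score(a, maxima), reverse=True)
```

Because each of the six terms is bounded by 1, the composite score lies between 0 and 6, and an author who leads on every metric scores exactly 6.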
Data
Data (about 200,000 records) are freely downloadable from Elsevier through the International Center for the Study of Research (ICSR) Lab.[2][4][5]
Output
The index classifies researchers into 22 scientific fields and 174 sub-fields. Rankings are produced both career-long and for the most recent year, each with and without self-citations, resulting in four different configurations. Unlike the pure h-index, the composite indicator is sensitive to details of co-authorship and author position: papers written as single, first, or last author are given more weight. Many authors point to the importance of the index created by Ioannidis as an accurate, cheap, and simple description of research systems.[6][7][8]
Being listed in the Stanford ranking is regarded as prestigious and increases the visibility of scientists, which may in turn improve their networking opportunities and their chances of obtaining research funding.[9][10][11] Moreover, the ranking allows researchers to compare the citation behavior of their field with that of others.[6]
These articles variously draw on the methodological papers and the associated measure to discuss social aspects of publication activity, such as unequal access to publishing among different social or national groups, including gender bias,[13][20] or the properties of the underlying Scopus abstract and citation database.
Petersen, Kai; Ali, Nauman Bin (2021). "An analysis of top author citations in software engineering and a comparison with other fields". Scientometrics. 126 (11): 9147–9183. doi:10.1007/s11192-021-04144-1.
Singh, P. K. (1 January 2022). "t-index: entropy based random document and citation analysis using average h-index". Scientometrics. 127 (1): 637–660. doi:10.1007/s11192-021-04222-4. ISSN 1588-2861.
Tohalino, Jorge A. V.; Amancio, Diego R. (2022). "On predicting research grants productivity via machine learning". Journal of Informetrics. 16 (2). arXiv:2106.10700. doi:10.1016/j.joi.2022.101260.
Oliveira, Leticia de; Reichert, Fernanda; Zandonà, Eugenia; Soletti, Rossana C.; Staniscuaski, Fernanda (2021). "The 100,000 most influential scientists rank: The underrepresentation of Brazilian women in academia". Anais da Academia Brasileira de Ciências. 93. doi:10.1590/0001-3765202120201952.
Hodge, D. R.; Turner, P. R. (1 March 2023). "Who are the Top 100 Contributors to Social Work Journal Scholarship? A Global Study on Career Impact in the Profession". Research on Social Work Practice. 33 (3): 338–349. doi:10.1177/10497315221136623. ISSN 1049-7315.
Jones, A. W. (1 March 2022). "Highly cited forensic practitioners in the discipline legal and forensic medicine and the importance of peer-review and publication for admission of expert testimony". Forensic Science, Medicine and Pathology. 18 (1): 37–44. doi:10.1007/s12024-021-00447-0. ISSN 1556-2891. PMID 35129820.
Monte-Serrat, D. M.; Cattani, C. (1 June 2021). "Interpretability in neural networks towards universal consistency". International Journal of Cognitive Computing in Engineering. 2: 30–39. doi:10.1016/j.ijcce.2021.01.002. ISSN 2666-3074.
Baas, J.; Schotten, M.; Plume, A.; Côté, G.; Karimi, R. (1 February 2020). "Scopus as a curated, high-quality bibliometric data source for academic research in quantitative science studies". Quantitative Science Studies. 1 (1): 377–386. doi:10.1162/qss_a_00019. ISSN 2641-3337.
Szomszor, M.; Pendlebury, D. A.; Adams, J. (1 May 2020). "How much is too much? The difference between research influence and self-citation excess". Scientometrics. 123 (2): 1119–1147. doi:10.1007/s11192-020-03417-5. ISSN 1588-2861.
Wu, C. (2023). "The gender citation gap: Why and how it matters". Canadian Review of Sociology/Revue Canadienne de Sociologie. 60 (2): 188–211. doi:10.1111/cars.12428. ISSN 1755-618X. PMID 36929271.