Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World
Abstract
1. Introduction
Main Bibliographic Databases
2. Literature Overview
3. Evaluations and Comparisons of WoS and Scopus
3.1. Overall Content Coverage, Overlap and Variations between Disciplines
3.2. Coverage by Source and Document Types
3.3. Coverage of Regional and Non-English Literature
3.4. Coverage of Citations
3.5. Coverage of Patents
3.6. Coverage of Funding Information
3.7. Publication and Source Information
3.8. Author Information
3.9. Institution Information
3.10. Content Quality
3.10.1. Content Indexing and Inclusion Policies
3.10.2. Open Access Content and “Predatory” Journals
3.10.3. Errors in Publication Metadata
3.10.4. Inconsistencies in Subject Classification Schemes and Document Types
3.11. Comprehensiveness of Information Provided by the DBs’ Owners
Content Coverage Evaluation by Source Lists
3.12. Search and Online Analysis Capabilities
Data Export Limitations
4. Citation Impact Indicators Implemented in WoS and Scopus
4.1. Basic Journal Impact Indicators
4.2. Advanced Indicators
4.2.1. Normalized Indicators
4.2.2. Journal Prestige Indicators
4.3. H-Index
4.4. Recommendations for the Correct Choice of Journal Impact Indicators
5. Fundamental Concerns in Bibliometric Practices
6. Discussion
7. Conclusions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Abramo, G. Revisiting the Scientometric Conceptualization of Impact and Its Measurement. J. Informetr. 2018, 12, 590–597. [Google Scholar] [CrossRef] [Green Version]
- Agarwal, A.; Durairajanayagam, D.; Tatagari, S.; Esteves, S.; Harlev, A.; Henkel, R.; Roychoudhury, S.; Homa, S.; Puchalt, N.; Ramasamy, R.; et al. Bibliometrics: Tracking Research Impact by Selecting the Appropriate Metrics. Asian J. Androl. 2016, 18, 296. [Google Scholar] [CrossRef] [Green Version]
- Aksnes, D.W.; Langfeldt, L.; Wouters, P. Citations, Citation Indicators, and Research Quality: An Overview of Basic Concepts and Theories. SAGE Open 2019, 9. [Google Scholar] [CrossRef] [Green Version]
- Leydesdorff, L.; Wouters, P.; Bornmann, L. Professional and Citizen Bibliometrics: Complementarities and Ambivalences in the Development and Use of Indicators—a State-of-the-Art Report. Scientometrics 2016, 109, 2129–2150. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Moed, H.F.; Halevi, G. Multidimensional Assessment of Scholarly Research Impact. J. Assoc. Inf. Sci. Technol. 2015, 66, 1988–2002. [Google Scholar] [CrossRef] [Green Version]
- Bianco, M.; Gras, N.; Sutz, J. Academic Evaluation: Universal Instrument? Tool for Development? Minerva 2016, 54, 399–421. [Google Scholar] [CrossRef]
- De Rijcke, S.; Wouters, P.F.; Rushforth, A.D.; Franssen, T.P.; Hammarfelt, B. Evaluation Practices and Effects of Indicator Use—A Literature Review. Res. Eval. 2016, 25, 161–169. [Google Scholar] [CrossRef]
- Kun, Á. Publish and Who Should Perish: You or Science? Publications 2018, 6, 18. [Google Scholar] [CrossRef] [Green Version]
- Moral-Muñoz, J.A.; Herrera-Viedma, E.; Santisteban-Espejo, A.; Cobo, M.J. Software Tools for Conducting Bibliometric Analysis in Science: An up-to-Date Review. Prof. Inf. 2020, 29, e290103. [Google Scholar] [CrossRef] [Green Version]
- Zhu, J.; Liu, W. A Tale of Two Databases: The Use of Web of Science and Scopus in Academic Papers. Scientometrics 2020, 123, 321–335. [Google Scholar] [CrossRef] [Green Version]
- Li, K.; Rollins, J.; Yan, E. Web of Science Use in Published Research and Review Papers 1997–2017: A Selective, Dynamic, Cross-Domain, Content-Based Analysis. Scientometrics 2018, 115, 1–20. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Baas, J.; Schotten, M.; Plume, A.; Côté, G.; Karimi, R. Scopus as a Curated, High-Quality Bibliometric Data Source for Academic Research in Quantitative Science Studies. Quant. Sci. Stud. 2020, 1, 377–386. [Google Scholar] [CrossRef]
- Harzing, A.W.; Alakangas, S. Google Scholar, Scopus and the Web of Science: A Longitudinal and Cross-Disciplinary Comparison. Scientometrics 2016, 106, 787–804. [Google Scholar] [CrossRef]
- Aghaei Chadegani, A.; Salehi, H.; Md Yunus, M.M.; Farhadi, H.; Fooladi, M.; Farhadi, M.; Ale Ebrahim, N. A Comparison between Two Main Academic Literature Collections: Web of Science and Scopus Databases. Asian Soc. Sci. 2013, 9, 18–26. [Google Scholar] [CrossRef] [Green Version]
- Dallas, T.; Gehman, A.-L.; Farrell, M.J. Variable Bibliographic Database Access Could Limit Reproducibility. Bioscience 2018, 68, 552–553. [Google Scholar] [CrossRef] [Green Version]
- Frenken, K.; Heimeriks, G.J.; Hoekman, J. What Drives University Research Performance? An Analysis Using the CWTS Leiden Ranking Data. J. Informetr. 2017, 11, 859–872. [Google Scholar] [CrossRef]
- Vernon, M.M.; Andrew Balas, E.; Momani, S. Are University Rankings Useful to Improve Research? A Systematic Review. PLoS ONE 2018, 13, e0193762. [Google Scholar] [CrossRef] [Green Version]
- Moed, H.F. A Critical Comparative Analysis of Five World University Rankings. Scientometrics 2017, 110, 967–990. [Google Scholar] [CrossRef] [Green Version]
- Safón, V. Inter - Ranking Reputational Effects: An Analysis of the Academic Ranking of World Universities (ARWU) and the Times Higher Education World University Rankings (THE) Reputational Relationship. Scientometrics 2019, 121, 897–915. [Google Scholar] [CrossRef]
- Lim, M.A. The Building of Weak Expertise: The Work of Global University Rankers. High. Educ. 2018, 75, 415–430. [Google Scholar] [CrossRef] [Green Version]
- Lim, M.A.; Øerberg, J.W. Active Instruments: On the Use of University Rankings in Developing National Systems of Higher Education. Policy Rev. High. Educ. 2017, 1, 91–108. [Google Scholar] [CrossRef] [Green Version]
- Haddawy, P.; Hassan, S.U.; Abbey, C.W.; Lee, I.B. Uncovering Fine-Grained Research Excellence: The Global Research Benchmarking System. J. Informetr. 2017, 11, 389–406. [Google Scholar] [CrossRef]
- Gusenbauer, M. Google Scholar to Overshadow Them All? Comparing the Sizes of 12 Academic Search Engines and Bibliographic Databases. Scientometrics 2019, 118, 177–214. [Google Scholar] [CrossRef] [Green Version]
- Okhovati, M.; Sharifpoor, E.; Aazami, M.; Zolala, F.; Hamzehzadeh, M. Novice and Experienced Users’ Search Performance and Satisfaction with Web of Science and Scopus. J. Librariansh. Inf. Sci. 2017, 49, 359–367. [Google Scholar] [CrossRef]
- Ellegaard, O. The Application of Bibliometric Analysis: Disciplinary and User Aspects. Scientometrics 2018, 116, 181–202. [Google Scholar] [CrossRef] [Green Version]
- Waltman, L. A Review of the Literature on Citation Impact Indicators. J. Informetr. 2016, 10, 365–391. [Google Scholar] [CrossRef] [Green Version]
- Badia, G. Identifying “Best Bets” for Searching in Chemical Engineering: Comparing Database Content and Performance for Information Retrieval. J. Doc. 2018, 74, 80–98. [Google Scholar] [CrossRef]
- Carloni, M.; Tsenkulovsky, T.; Mangan, R. Web of Science Core Collection Descriptive Document. 2018. Available online: https://clarivate.libguides.com/ld.php?content_id=45175981 (accessed on 13 August 2020).
- Liu, W. The Data Source of This Study Is Web of Science Core Collection? Not Enough. Scientometrics 2019, 121, 1815–1824. [Google Scholar] [CrossRef]
- Valderrama-Zurián, J.C.; Aguilar-Moya, R.; Melero-Fuentes, D.; Aleixandre-Benavent, R. A Systematic Analysis of Duplicate Records in Scopus. J. Informetr. 2015, 9, 570–576. [Google Scholar] [CrossRef] [Green Version]
- Halevi, G.; Moed, H.; Bar-Ilan, J. Suitability of Google Scholar as a Source of Scientific Information and as a Source of Data for Scientific Evaluation—Review of the Literature. J. Informetr. 2017, 11, 823–834. [Google Scholar] [CrossRef]
- Martín-Martín, A.; Thelwall, M.; Orduna-Malea, E.; Delgado López-Cózar, E. Google Scholar, Microsoft Academic, Scopus, Dimensions, Web of Science, and OpenCitations’ COCI: A Multidisciplinary Comparison of Coverage via Citations. Scientometrics 2021, 126, 871–906. [Google Scholar] [CrossRef]
- Orduna-Malea, E.; Martín-Martín, A.; Delgado López-Cózar, E. Google Scholar as a Source for Scholarly Evaluation: A Bibliographic Review of Database Errors. Rev. española Doc. Científica 2017, 40, e185. [Google Scholar] [CrossRef] [Green Version]
- López-Cózar, E.D.; Robinson-García, N.; Torres-Salinas, D. The Google Scholar Experiment: How to Index False Papers and Manipulate Bibliometric Indicators. J. Assoc. Inf. Sci. Technol. 2014, 65, 446–454. [Google Scholar] [CrossRef]
- Herzog, C.; Hook, D.; Konkiel, S. Dimensions: Bringing down Barriers between Scientometricians and Data. Quant. Sci. Stud. 2020, 1, 387–395. [Google Scholar] [CrossRef]
- Hook, D.W.; Porter, S.J.; Herzog, C. Dimensions: Building Context for Search and Evaluation. Front. Res. Metrics Anal. 2018, 3, 23. [Google Scholar] [CrossRef]
- Thelwall, M. Dimensions: A Competitor to Scopus and the Web of Science? J. Informetr. 2018, 12, 430–435. [Google Scholar] [CrossRef] [Green Version]
- Bornmann, L. Field Classification of Publications in Dimensions: A First Case Study Testing Its Reliability and Validity. Scientometrics 2018, 117, 637–640. [Google Scholar] [CrossRef]
- Harzing, A.W. Two New Kids on the Block: How Do Crossref and Dimensions Compare with Google Scholar, Microsoft Academic, Scopus and the Web of Science? Scientometrics 2019, 120, 341–349. [Google Scholar] [CrossRef]
- Visser, M.; Jan Van Eck, N.; Waltman, L. Large-Scale Comparison of Bibliographic Data Sources: Scopus, Web of Science, Dimensions, Crossref, and Microsoft Academic. arXiv 2020, arXiv:2005.10732. [Google Scholar]
- Waltman, L.; Larivière, V. Special Issue on Bibliographic Data Sources. Quant. Sci. Stud. 2020, 1, 360–362. [Google Scholar] [CrossRef]
- Gusenbauer, M. Which Academic Search Systems Are Suitable for Systematic Reviews or Meta-Analyses? Evaluating Retrieval Qualities of Google Scholar, PubMed, and 26 Other Resources. Res. Synth. Methods 2020, 11, 181–217. [Google Scholar] [CrossRef] [Green Version]
- Bramer, W.M.; Rethlefsen, M.L.; Kleijnen, J.; Franco, O.H. Optimal Database Combinations for Literature Searches in Systematic Reviews: A Prospective Exploratory Study. Syst. Rev. 2017, 6, 245. [Google Scholar] [CrossRef]
- Wouters, P.; Thelwall, M.; Kousha, K.; Waltman, L.; de Rijcke, S.; Rushforth, A.; Franssen, T. The Metric Tide: Literature Review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management); HEFCE: Bristol, UK, 2015. [Google Scholar] [CrossRef]
- Walters, W.H. Citation-Based Journal Rankings: Key Questions, Metrics, and Data Sources. IEEE Access 2017, 5, 22036–22053. [Google Scholar] [CrossRef]
- Mingers, J.; Leydesdorff, L. A Review of Theory and Practice in Scientometrics. Eur. J. Oper. Res. 2015, 246, 1–19. [Google Scholar] [CrossRef] [Green Version]
- Mongeon, P.; Paul-Hus, A. The Journal Coverage of Web of Science and Scopus: A Comparative Analysis. Scientometrics 2016, 106, 213–228. [Google Scholar] [CrossRef]
- Leydesdorff, L.; de Moya-Anegón, F.; de Nooy, W. Aggregated Journal–Journal Citation Relations in Scopus and Web of Science Matched and Compared in Terms of Networks, Maps, and Interactive Overlays. J. Assoc. Inf. Sci. Technol. 2016, 67, 2194–2211. [Google Scholar] [CrossRef] [Green Version]
- Vera-Baceta, M.-A.; Thelwall, M.; Kousha, K. Web of Science and Scopus Language Coverage. Scientometrics 2019, 121, 1803–1813. [Google Scholar] [CrossRef]
- Aksnes, D.W.; Sivertsen, G. A Criteria-Based Assessment of the Coverage of Scopus and Web of Science. J. Data Inf. Sci. 2019, 4, 1–21. [Google Scholar] [CrossRef] [Green Version]
- Arencibia-Jorge, R.; Villaseñor, E.A.; Lozano-Díaz, I.A.; Calvet, H.C. Elsevier’s Journal Metrics for the Identification of a Mainstream Journals Core: A Case Study on Mexico. Libres 2016, 26, 1–13. [Google Scholar]
- Moed, H.F.; Markusova, V.; Akoev, M. Trends in Russian Research Output Indexed in Scopus and Web of Science. Scientometrics 2018, 116, 1153–1180. [Google Scholar] [CrossRef] [Green Version]
- Huang, C.-K.; Neylon, C.; Brookes-Kenworthy, C.; Hosking, R.; Montgomery, L.; Wilson, K.; Ozaygen, A. Comparison of Bibliographic Data Sources: Implications for the Robustness of University Rankings. Quant. Sci. Stud. 2020, 1, 445–478. [Google Scholar] [CrossRef]
- Burghardt, K.J.; Howlett, B.H.; Khoury, A.S.; Fern, S.M.; Burghardt, P.R. Three Commonly Utilized Scholarly Databases and a Social Network Site Provide Different, but Related, Metrics of Pharmacy Faculty Publication. Publications 2020, 8, 18. [Google Scholar] [CrossRef] [Green Version]
- Martín-Martín, A.; Orduna-Malea, E.; Delgado López-Cózar, E. Coverage of Highly-Cited Documents in Google Scholar, Web of Science, and Scopus: A Multidisciplinary Comparison. Scientometrics 2018, 9, 2175–2188. [Google Scholar] [CrossRef] [Green Version]
- Martín-Martín, A.; Orduna-Malea, E.; Thelwall, M.; López-Cózar, E.D. Google Scholar, Web of Science, and Scopus: A Systematic Comparison of Citations in 252 Subject Categories. J. Informetr. 2018, 12, 1160–1177. [Google Scholar] [CrossRef] [Green Version]
- Bar-Ilan, J. Tale of Three Databases: The Implication of Coverage Demonstrated for a Sample Query. Front. Res. Metrics Anal. 2018, 3, 6. [Google Scholar] [CrossRef] [Green Version]
- Pajić, D. On the Stability of Citation-Based Journal Rankings. J. Informetr. 2015, 9, 990–1006. [Google Scholar] [CrossRef]
- Hug, S.E.; Brändle, M.P. The Coverage of Microsoft Academic: Analyzing the Publication Output of a University. Scientometrics 2017, 113, 1551–1571. [Google Scholar] [CrossRef] [Green Version]
- Trapp, J. Web of Science, Scopus, and Google Scholar Citation Rates: A Case Study of Medical Physics and Biomedical Engineering: What Gets Cited and What Doesn’t? Australas. Phys. Eng. Sci. Med. 2016, 39, 817–823. [Google Scholar] [CrossRef]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Influence of Omitted Citations on the Bibliometric Statistics of the Major Manufacturing Journals. Scientometrics 2015, 103, 1083–1122. [Google Scholar] [CrossRef] [Green Version]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Empirical Analysis and Classification of Database Errors in Scopus and Web of Science. J. Informetr. 2016, 10, 933–953. [Google Scholar] [CrossRef]
- Krauskopf, E. Missing Documents in Scopus: The Case of the Journal Enfermeria Nefrologica. Scientometrics 2019, 119, 543–547. [Google Scholar] [CrossRef]
- Wang, Q.; Waltman, L. Large-Scale Analysis of the Accuracy of the Journal Classification Systems of Web of Science and Scopus. J. Informetr. 2016, 10, 347–364. [Google Scholar] [CrossRef] [Green Version]
- Olensky, M.; Schmidt, M.; van Eck, N.J. Evaluation of the Citation Matching Algorithms of CWTS and IFQ in Comparison to the Web of Science. J. Assoc. Inf. Sci. Technol. 2016, 67, 2550–2564. [Google Scholar] [CrossRef] [Green Version]
- Van Eck, N.J.; Waltman, L. Accuracy of Citation Data in Web of Science and Scopus. In Proceedings of the 16th International Conference on Scientometrics and Informetrics (ISSI 2017), Wuhan, China, 16–20 October 2017; pp. 1087–1092. [Google Scholar]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Errors in DOI Indexing by Bibliometric Databases. Scientometrics 2015, 102, 2181–2186. [Google Scholar] [CrossRef] [Green Version]
- Xu, S.; Hao, L.; An, X.; Zhai, D.; Pang, H. Types of DOI Errors of Cited References in Web of Science with a Cleaning Method. Scientometrics 2019, 120, 1427–1437. [Google Scholar] [CrossRef]
- Zhu, J.; Hu, G.; Liu, W. DOI Errors and Possible Solutions for Web of Science. Scientometrics 2019, 118, 709–718. [Google Scholar] [CrossRef]
- Aman, V. Does the Scopus Author ID Suffice to Track Scientific International Mobility? A Case Study Based on Leibniz Laureates. Scientometrics 2018, 117, 705–720. [Google Scholar] [CrossRef]
- Demetrescu, C.; Ribichini, A.; Schaerf, M. Accuracy of Author Names in Bibliographic Data Sources: An Italian Case Study. Scientometrics 2018, 117, 1777–1791. [Google Scholar] [CrossRef]
- Donner, P.; Rimmert, C.; van Eck, N.J. Comparing Institutional-Level Bibliometric Research Performance Indicator Values Based on Different Affiliation Disambiguation Systems. Quant. Sci. Stud. 2020, 1, 150–170. [Google Scholar] [CrossRef]
- Liu, W.; Hu, G.; Tang, L. Missing Author Address Information in Web of Science—An Explorative Study. J. Informetr. 2018, 12, 985–997. [Google Scholar] [CrossRef] [Green Version]
- Tang, L.; Hu, G.; Liu, W. Funding Acknowledgment Analysis: Queries and Caveats. J. Assoc. Inf. Sci. Technol. 2017, 68, 790–794. [Google Scholar] [CrossRef] [Green Version]
- Grassano, N.; Rotolo, D.; Hutton, J.; Lang, F.; Hopkins, M.M. Funding Data from Publication Acknowledgments: Coverage, Uses, and Limitations. J. Assoc. Inf. Sci. Technol. 2017, 68, 999–1017. [Google Scholar] [CrossRef] [Green Version]
- Paul-Hus, A.; Desrochers, N.; Costas, R. Characterization, Description, and Considerations for the Use of Funding Acknowledgement Data in Web of Science. Scientometrics 2016, 108, 167–182. [Google Scholar] [CrossRef] [Green Version]
- Liu, W.; Tang, L.; Hu, G. Funding Information in Web of Science: An Updated Overview. Scientometrics 2020, 122, 1509–1524. [Google Scholar] [CrossRef] [Green Version]
- Álvarez-Bornstein, B.; Morillo, F.; Bardons, M. Funding Acknowledgments in the Web of Science: Completeness and Accuracy of Collected Data. Scientometrics 2017, 112, 1793–1812. [Google Scholar] [CrossRef]
- Liu, W. Accuracy of Funding Information in Scopus: A Comparative Case Study. Scientometrics 2020, 124, 803–811. [Google Scholar] [CrossRef]
- Kokol, P.; Blažun Vošner, H. Discrepancies among Scopus, Web of Science, and PubMed Coverage of Funding Information in Medical Journal Articles. J. Med. Libr. Assoc. 2018, 106, 81–86. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Chavarro, D.; Ra, I. To What Extent Is Inclusion in TheWeb of Science an Indicator of Journal ‘Quality’? Res. Eval. 2018, 27, 106–118. [Google Scholar] [CrossRef]
- Giménez-Toledo, E.; Mañana-Rodríguez, J.; Sivertsen, G. Scholarly Book Publishing: Its Information Sources for Evaluation in the Social Sciences and Humanities. Res. Eval. 2017, 26, 91–101. [Google Scholar] [CrossRef]
- Linder, S.K.; Kamath, G.R.; Pratt, G.F.; Saraykar, S.S.; Volk, R.J. Citation Searches Are More Sensitive than Keyword Searches to Identify Studies Using Specific Measurement Instruments. J. Clin. Epidemiol. 2015, 68, 412–417. [Google Scholar] [CrossRef] [Green Version]
- Bates, J.; Best, P.; McQuilkin, J.; Taylor, B. Will Web Search Engines Replace Bibliographic Databases in the Systematic Identification of Research? J. Acad. Librariansh. 2017, 43, 8–17. [Google Scholar] [CrossRef] [Green Version]
- Powell, K.R.; Peterson, S.R. Coverage and Quality: A Comparison of Web of Science and Scopus Databases for Reporting Faculty Nursing Publication Metrics. Nurs. Outlook 2017, 65, 572–578. [Google Scholar] [CrossRef] [PubMed]
- Meho, L.I. Using Scopus’s CiteScore for Assessing the Quality of Computer Science Conferences. J. Informetr. 2019, 13, 419–433. [Google Scholar] [CrossRef]
- Grégoire, C.; Roberge, G.; Archambault, É. Bibliometrics and Patent Indicators for the Science and Engineering Indicators 2016─Comparison of 2016 Bibliometric Indicators to 2014 Indicators. In Science & Engineering Indicators 2016 (SEI 2016); Available online: https://science-metrix.com/?q=en/publications/reports&page=2#/?q=en/publications/reports/bibliometrics-and-patent-indicators-for-the-science-and-engineering-indicator-0 (accessed on 13 August 2020).
- Ochsner, M.; Hug, S.E.; Daniel, H. (Eds.) Assessment in the Humanities; Springer Nature: Switzerland, Zürich, 2016; ISBN 978-3-319-29016-4. [Google Scholar]
- Elsevier. Scopus Content Coverage Guide. Elsevier 2020. Available online: https://www.elsevier.com/solutions/scopus/how-scopus-works/content (accessed on 13 August 2020).
- Kousha, K.; Thelwall, M. Can Microsoft Academic Help to Assess the Citation Impact of Academic Books? J. Informetr. 2018, 12, 972–984. [Google Scholar] [CrossRef] [Green Version]
- Chapman, K.; Ellinger, A.E. An Evaluation of Web of Science, Scopus and Google Scholar Citations in Operations Management. Int. J. Logist. Manag. 2019, 30, 1039–1053. [Google Scholar] [CrossRef]
- Ko, Y.M.; Park, J.Y. An Index for Evaluating Journals in a Small Domestic Citation Index Database Whose Citation Rate Is Generally Very Low: A Test Based on the Korea Citation Index (KCI) Database. J. Informetr. 2013, 7, 404–411. [Google Scholar] [CrossRef]
- Moskaleva, O.; Pislyakov, V.; Akoev, M.; Shabanova, S. Russian Index of Science Citation: Overview and Review. Scientometrics 2018, 116, 449–462. [Google Scholar] [CrossRef]
- Clarivate. Introducing the Arabic Citation Index. Clarivate Analytics. 2020. Available online: https://clarivate.com/webofsciencegroup/solutions/arabic-citation-index/#:~:text=TheArabicCitationIndex (accessed on 21 September 2020).
- Mika, P.; Szarzec, J.; Sivertsen, G. Data Quality and Consistency in Scopus and Web of Science in Their Indexing of Czech Journals. In Proceedings of the 21st International Conference on Science and Technology Indicators (STI 2016), València, Spain, 1–7 September 2016; Rafols, I., Molas-Gallart, J., Castro-Martinez, Woolley, R., Eds.; pp. 78–86. [Google Scholar] [CrossRef] [Green Version]
- Mohammadi, J.; Rasoolzadeh Tabatabaei, K.; Janbozorgi, M.; Pasandideh, A.; Salesi, M. A Review of Scientific Outputs on Spirituality and Depression Indexed in Important Databases. Int. J. Med. Rev. 2018, 5, 41–46. [Google Scholar] [CrossRef]
- Patelli, A.; Cimini, G.; Pugliese, E.; Gabrielli, A. The Scientific Influence of Nations on Global Scientific and Technological Development. J. Informetr. 2017, 11, 1229–1237. [Google Scholar] [CrossRef] [Green Version]
- Van Raan, A.F.J. Patent Citations Analysis and Its Value in Research Evaluation: A Review and a New Approach to Map Technology-Relevant Research. J. Data Inf. Sci. 2017, 2, 13–50. [Google Scholar] [CrossRef] [Green Version]
- Chen, L. Do Patent Citations Indicate Knowledge Linkage? The Evidence from Text Similarities between Patents and Their Citations. J. Informetr. 2017, 11, 63–79. [Google Scholar] [CrossRef]
- Clarivate. DWPI Country/Region Coverage—Derwent. Available online: https://clarivate.com/derwent/dwpi-reference-center/dwpi-coverage/ (accessed on 25 November 2020).
- Fukuzawa, N.; Ida, T. Science Linkages between Scientific Articles and Patents for Leading Scientists in the Life and Medical Sciences Field: The Case of Japan. Scientometrics 2016, 106, 629–644. [Google Scholar] [CrossRef]
- Qi, Y.; Zhu, N.; Zhai, Y. The Mutually Beneficial Relationship of Patents and Scientific Literature: Topic Evolution in Nanoscience. Scientometrics 2018, 115, 893–911. [Google Scholar] [CrossRef]
- Mangan, R. Need Funding Data? Exploring Funding Data in Web of Science. 2019. Available online: https://wok.mimas.ac.uk/support/documentation/presentations/english_Funding_data_web_%0Aof_science.pdf (accessed on 26 October 2020).
- Hubbard, D.E.; Laddusaw, S. Acknowledgment of Libraries in the Journal Literature: An Exploratory Study. J. Data Inf. Sci. 2020, 5, 178–186. [Google Scholar] [CrossRef]
- Clarivate. Web of Science Core Collection—Quick Reference Guide. Clarivate Analytics. 2019. Available online: https://clarivate.libguides.com/ld.php?content_id=35888196 (accessed on 21 September 2020).
- Elsevier. Scopus Fact Sheet. 2018. Available online: https://www.elsevier.com/__data/assets/pdf_file/0017/114533/Scopus_GlobalResearch_Factsheet2019_FINAL_WEB.pdf (accessed on 21 September 2020).
- Meschede, C.; Siebenlist, T. Cross-Metric Compatability and Inconsistencies of Altmetrics. Scientometrics 2018, 115, 283–297. [Google Scholar] [CrossRef]
- Elsevier. Scopus Quick Reference Guide. Elsevier 2019. Available online: https://supportcontent.elsevier.com/RightNow%20Next%20Gen/Scopus/Files/Scopus_User_Guide.pdf (accessed on 21 September 2020).
- Chang, N. Web of Science: Platform Release 5.27. 2017. Available online: https://support.clarivate.com/ScientificandAcademicResearch/servlet/fileField?entityId=ka14N000000MrQBQA0&field=CA_Attachment_1__Body__s (accessed on 2 December 2020).
- Elsevier. Scopus to Launch Open Access Indicator for Journals on July 29, Elsevier Scopus Blog. Available online: https://blog.scopus.com/posts/scopus-to-launch-open-access-indicator-for-journals-on-july-29 (accessed on 25 November 2020).
- McCullough, R. What’s New on Scopus: Article Level Open Access Indicator Now at the Article Level and Other Exciting Changes. Available online: https://blog.scopus.com/posts/what-s-new-on-scopus-article-level-open-access-indicator-now-at-the-article-level-and-other (accessed on 25 November 2020).
- Bosman, J.; Kramer, B. Open Access Levels: A Quantitative Exploration Using Web of Science and OaDOI Data. PeerJ Prepr. 2018, 6, e3520v1. [Google Scholar] [CrossRef]
- Clarivate. Journal Citation Reports: Open Access Data Beta. 2020. Available online: https://clarivate.libguides.com/ld.php?content_id=54083756 (accessed on 25 November 2020).
- Clarivate. Journal Citation Reports—Descriptive Document. 2017. Available online: https://clarivate.libguides.com/ld.php?content_id=48842741 (accessed on 25 November 2020).
- Scopus. Scopus preview—Sources. Available online: https://www.scopus.com/sources.uri?zone=TopNavBar&origin=SearchAffiliationLookup (accessed on 26 November 2020).
- Clarivate. Web of Science Master Journal List—WoS MJL. Available online: https://mjl.clarivate.com/home (accessed on 25 November 2020).
- D’Angelo, C.A.; van Eck, N.J. Collecting Large-Scale Publication Data at the Level of Individual Researchers: A Practical Proposal for Author Name Disambiguation. Scientometrics 2020, 123, 883–907. [Google Scholar] [CrossRef]
- Reimann, P. Author Search BETA and Author Records BETA-Web of Science. External Release Documentation. 2019. Available online: https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/dlm_uploads/2019/10/ProfilesP2-external-release-notes.pdf (accessed on 25 November 2020).
- Elsevier. What is Scopus Preview?—Scopus: Access and Use Support Center. Available online: https://service.elsevier.com/app/answers/detail/a_id/15534/c/10543/supporthub/scopus/ (accessed on 26 November 2020).
- ORCID. Available online: https://orcid.org/ (accessed on 3 December 2020).
- Ioannidis, J.P.A.; Baas, J.; Klavans, R.; Boyack, K.W. A Standardized Citation Metrics Author Database Annotated for Scientific Field. PLoS Biol. 2019, e3000384. [Google Scholar] [CrossRef] [Green Version]
- Kawashima, H.; Tomizawa, H. Accuracy Evaluation of Scopus Author ID Based on the Largest Funding Database in Japan. Scientometrics 2015, 103, 1061–1071. [Google Scholar] [CrossRef]
- Elsevier. What Can I Do on an Affiliation Details Page? Available online: https://service.elsevier.com/app/answers/detail/a_id/11264/supporthub/scopus/ (accessed on 3 December 2020).
- Elsevier. About the Institution Profile Wizard. Available online: https://service.elsevier.com/app/answers/detail/a_id/25554/supporthub/scopus/related/1/ (accessed on 3 December 2020).
- Maddi, A. Measuring Open Access Publications: A Novel Normalized Open Access Indicator. Scientometrics 2020, 124, 379–398. [Google Scholar] [CrossRef]
- Jokic, M.; Mervar, A.; Mateljan, S. Scientific Potential of European Fully Open Access Journals. Scientometrics 2018, 1373–1394. [Google Scholar] [CrossRef]
- Science Europe. Plan S: Principles and Implementation. Available online: https://www.coalition-s.org/addendum-to-the-coalition-s-guidance-on-the-implementation-of-plan-s/principles-and-implementation/ (accessed on 4 December 2020).
- Wei, M. Research on Impact Evaluation of Open Access Journals. Scientometrics 2020, 122, 1027–1049. [Google Scholar] [CrossRef]
- Asai, S. The Effect of Collaboration with Large Publishers on the Internationality and Influence of Open Access Journals for Research Institutions. Scientometrics 2020, 124, 663–677. [Google Scholar] [CrossRef]
- Siler, K.; Frenken, K. The Pricing of Open Access Journals: Diverse Niches and Sources of Value in Academic Publishing. Quant. Sci. Stud. 2020, 1, 28–59. [Google Scholar] [CrossRef]
- Tennant, J.P.; Waldner, F.; Jacques, D.C.; Masuzzo, P.; Collister, L.B.; Hartgerink, C.H.J. The Academic, Economic and Societal Impacts of Open Access: An Evidence-Based Review. F1000Research 2016, 5, 632. [Google Scholar] [CrossRef] [PubMed]
- Ignat, T.; Ayris, P. Built to Last! Embedding Open Science Principles and Practice into European Universities. Insights UKSG J. 2020, 33. [Google Scholar] [CrossRef] [Green Version]
- Van Vlokhoven, H. The Effect of Open Access on Research Quality. J. Informetr. 2019, 13, 751–756. [Google Scholar] [CrossRef]
- Dorsch, I.; Askeridis, J.M.; Stock, W.G. Truebounded, Overbounded, or Underbounded? Scientists’ Personal Publication Lists versus Lists Generated through Bibliographic Information Services. Publications 2018, 6, 7. [Google Scholar] [CrossRef] [Green Version]
- Clarivate. Web of Science Journal Evaluation Process and Selection Criteria. Available online: https://clarivate.com/webofsciencegroup/journal-evaluation-process-and-selection-criteria/ (accessed on 25 November 2020).
- Clarivate. Web of Science Core Collection Journal Selection Process. 2019. Available online: https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/2019/08/WS369553747_Fact-Sheet_Core-Collection_V4_Updated1.pdf (accessed on 25 November 2020).
- Clarivate. Web of Science Core Collection: Journal Evaluation Criteria. 2019. Available online: https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/2019/08/WS366313095_EditorialExplanation_Factsheet_A4_RGB_V4.pdf (accessed on 25 November 2020).
- Clarivate. Web of Science Journal Citation Reports: Suppression Policy. 2020. Available online: https://clarivate.com/webofsciencegroup/wp-content/uploads/sites/2/2020/06/jcr-suppression-policy-2020.pdf (accessed on 25 November 2020).
- Clarivate. Editorial selection process—Web of Science Group. Available online: https://clarivate.com/webofsciencegroup/solutions/editorial/ (accessed on 25 November 2020).
- Testa, J. The Book Selection Process for the Book Citation Index in Web of Science. 2012. Available online: http://wokinfo.com/media/pdf/BKCI-SelectionEssay_web (accessed on 25 November 2020).
- Clarivate. Web of Science Conference Proceedings Selection Process. Available online: https://clarivate.com/webofsciencegroup/essays/web-science-conference-proceedings-selection-process/ (accessed on 26 November 2020).
- Testa, J. The Book Selection Process for the Book Citation Index in Web of Science. Available online: https://clarivate.com/webofsciencegroup/essays/selection-process-book-citation-index-web-science/ (accessed on 2 January 2021).
- Holland, K.; Brimblecombe, P.; Meester, W.; Steiginga, S. The Importance of High-Quality Content: Curation and Re- Evaluation in Scopus. Elsevier. 2019. Available online: https://www.elsevier.com/research-intelligence/resource-library/scopus-high-quality-content (accessed on 26 November 2020).
- Elsevier. Scopus—Content—Content Policy and Selection. Available online: https://www.elsevier.com/solutions/scopus/how-scopus-works/content/content-policy-and-selection (accessed on 26 November 2020).
- Cortegiani, A.; Ippolito, M.; Ingoglia, G.; Manca, A.; Cugusi, L.; Severin, A.; Strinzel, M.; Panzarella, V.; Campisi, G.; Manoj, L.; et al. Inflated Citations and Metrics of Journals Discontinued from Scopus for Publication Concerns: The GhoS(t)Copus Project. F1000Research 2020, 9, 415. [Google Scholar] [CrossRef]
- Elsevier. Content—How Scopus Works. Available online: https://www.elsevier.com/solutions/scopus/how-scopus-works/content (accessed on 25 November 2020).
- Clarivate. Title Suppressions. Available online: http://jcr.help.clarivate.com/Content/title-suppressions.htm (accessed on 25 November 2020).
- Krauskopf, E. An Analysis of Discontinued Journals by Scopus. Scientometrics 2018, 116, 1805–1815. [Google Scholar] [CrossRef]
- Mckiernan, E.C.; Bourne, P.E.; Brown, C.T.; Buck, S.; Kenall, A.; Lin, J.; Mcdougall, D.; Nosek, B.A.; Ram, K.; Soderberg, C.K.; et al. How Open Science Helps Researchers Succeed. elife 2016, 5, e16800. [Google Scholar] [CrossRef]
- Shen, C.; Björk, B.C. “Predatory” Open Access: A Longitudinal Study of Article Volumes and Market Characteristics. BMC Med. 2015, 13, 1–15. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Tenopir, C.; Dalton, E.; Christian, L.; Jones, M.; McCabe, M.; Smith, M.; Fish, A. Imagining a Gold Open Access Future: Attitudes, Behaviors, and Funding Scenarios among Authors of Academic Scholarship. Coll. Res. Libr. 2017, 78, 824–843. [Google Scholar] [CrossRef]
- González-Betancor, S.M.; Dorta-González, P. Publication Modalities ‘Article in Press’ and ‘Open Access’ in Relation to Journal Average Citation. Scientometrics 2019, 120, 1209–1223. [Google Scholar] [CrossRef] [Green Version]
- Copiello, S. The Open Access Citation Premium May Depend on the Openness and Inclusiveness of the Indexing Database, but the Relationship Is Controversial Because It Is Ambiguous Where the Open Access Boundary Lies. Scientometrics 2019, 121, 995–1018. [Google Scholar] [CrossRef]
- Piwowar, H.; Priem, J.; Larivière, V.; Alperin, J.P.; Matthias, L.; Norlander, B.; Farley, A.; West, J.; Haustein, S. The State of OA: A Large-Scale Analysis of the Prevalence and Impact of Open Access Articles. PeerJ 2018, 6. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Wang, X.; Liu, C.; Mao, W.; Fang, Z. The Open Access Advantage Considering Citation, Article Usage and Social Media Attention. Scientometrics 2015, 103, 555–564. [Google Scholar] [CrossRef] [Green Version]
- Holmberg, K.; Hedman, J.; Bowman, T.D.; Didegah, F. Do Articles in Open Access Journals Have More Frequent Altmetric Activity than Articles in Subscription-Based Journals? An Investigation of the Research Output of Finnish Universities. Scientometrics 2020, 122, 645–659. [Google Scholar] [CrossRef] [Green Version]
- O’Kelly, F.; Fernandez, N.; Koyle, M.A. Predatory Publishing or a Lack of Peer Review Transparency?—A Contemporary Analysis of Indexed Open and Non-Open Access Articles in Paediatric Urology. J. Pediatr. Urol. 2019, 15, 159.e1–159.e7. [Google Scholar] [CrossRef] [PubMed]
- Demir, S.B. Predatory Journals: Who Publishes in Them and Why? J. Informetr. 2018, 12, 1296–1311. [Google Scholar] [CrossRef]
- Green, T. Is Open Access Affordable? Why Current Models Do Not Work and Why We Need Internet-Era Transformation of Scholarly Communications. Learn. Publ. 2019, 32, 13–25. [Google Scholar] [CrossRef] [Green Version]
- Tenopir, C.; Dalton, E.; Fish, A.; Christian, L.; Jones, M.; Smith, M. What Motivates Authors of Scholarly Articles? The Importance of Journal Attributes and Potential Audience on Publication Choice. Publications 2016, 4, 22. [Google Scholar] [CrossRef] [Green Version]
- Asai, S. Market Power of Publishers in Setting Article Processing Charges for Open Access Journals. Scientometrics 2020, 123, 1037–1049. [Google Scholar] [CrossRef]
- Beall, J. Criteria for Determining Predatory Publishers. 2015. Available online: https://beallslist.net/wp-content/uploads/2019/12/criteria-2015.pdf (accessed on 25 November 2020).
- Tennant, J.P.; Crane, H.; Crick, T.; Davila, J.; Enkhbayar, A.; Havemann, J.; Kramer, B.; Martin, R.; Masuzzo, P.; Nobes, A.; et al. Ten Hot Topics around Scholarly Publishing. Publications 2019, 7, 34. [Google Scholar] [CrossRef] [Green Version]
- Teixeira da Silva, J.A.; Dobránszki, J.; Tsigaris, P.; Al-Khatib, A. Predatory and Exploitative Behaviour in Academic Publishing: An Assessment. J. Acad. Librariansh. 2019, 45, 102071. [Google Scholar] [CrossRef]
- Manca, A.; Martinez, G.; Cugusi, L.; Dragone, D.; Dvir, Z.; Deriu, F. The Surge of Predatory Open-Access in Neurosciences and Neurology. Neuroscience 2017, 353, 166–173. [Google Scholar] [CrossRef] [PubMed]
- Björk, B.C.; Kanto-Karvonen, S.; Harviainen, J.T. How Frequently Are Articles in Predatory Open Access Journals Cited. Publications 2020, 8, 17. [Google Scholar] [CrossRef] [Green Version]
- Gorraiz, J.; Melero-Fuentes, D.; Gumpenberger, C.; Valderrama-Zurián, J.C. Availability of Digital Object Identifiers (DOIs) in Web of Science and Scopus. J. Informetr. 2016, 10, 98–109. [Google Scholar] [CrossRef]
- Niel, G.; Boyrie, F.; Virieux, D. Chemical Bibliographic Databases: The Influence of Term Indexing Policies on Topic Searches. New J. Chem. 2015, 39, 8807–8817. [Google Scholar] [CrossRef] [Green Version]
- Zhu, J.; Liu, F.; Liu, W. The Secrets behind Web of Science’s DOI Search. Scientometrics 2019, 119, 1745–1753. [Google Scholar] [CrossRef]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. Do Scopus and WoS Correct “Old” Omitted Citations? Scientometrics 2016, 107, 321–335. [Google Scholar] [CrossRef]
- Meester, W.J.N.; Colledge, L.; Dyas, E.E. A Response to “The Museum of Errors/Horrors in Scopus” by Franceschini et Al. J. Informetr. 2016, 10, 569–570. [Google Scholar] [CrossRef]
- Franceschini, F.; Maisano, D.; Mastrogiacomo, L. The Museum of Errors/Horrors in Scopus. J. Informetr. 2016, 10, 174–182. [Google Scholar] [CrossRef]
- Hu, Z.; Tian, W.; Xu, S.; Zhang, C.; Wang, X. Four Pitfalls in Normalizing Citation Indicators: An Investigation of ESI’s Selection of Highly Cited Papers. J. Informetr. 2018, 12, 1133–1145. [Google Scholar] [CrossRef]
- Clarivate. Web of Science Core Collection: Early Access Articles. Available online: https://support.clarivate.com/ScientificandAcademicResearch/s/article/Web-of-Science-Core-Collection-Early-Access-articles?language=en_US (accessed on 2 December 2020).
- James, C.; Colledge, L.; Meester, W.; Azoulay, N.; Plume, A. CiteScore Metrics: Creating Journal Metrics from the Scopus Citation Index. Learn. Publ. 2019, 32, 367–374. [Google Scholar] [CrossRef] [Green Version]
- Leydesdorff, L.; Bornmann, L. The Operationalization of “Fields” as WoS Subject Categories (WCs) in Evaluative Bibliometrics: The Cases of “Library and Information Science” and “Science & Technology Studies”. J. Assoc. Inf. Sci. Technol. 2016, 67, 707–714. [Google Scholar] [CrossRef]
- Shu, F.; Julien, C.; Zhang, L.; Qiu, J.; Zhang, J.; Larivière, V. Comparing Journal and Paper Level Classifications of Science. J. Informetr. 2019, 13, 202–225. [Google Scholar] [CrossRef]
- Bornmann, L.; Marx, W. Critical Rationalism and the Search for Standard (Field-Normalized) Indicators in Bibliometrics. J. Informetr. 2018, 12, 598–604. [Google Scholar] [CrossRef] [Green Version]
- Leydesdorff, L.; Carley, S.; Rafols, I. Global Maps of Science Based on the New Web-of-Science Categories. Scientometrics 2013, 94, 589–593. [Google Scholar] [CrossRef] [Green Version]
- Waltman, L.; van Eck, N.J. Source Normalized Indicators of Citation Impact: An Overview of Different Approaches and an Empirical Comparison. Scientometrics 2013, 96, 699–716. [Google Scholar] [CrossRef]
- Campanario, J.M. Are Leaders Really Leading? Journals That Are First in Web of Science Subject Categories in the Context of Their Groups. Scientometrics 2018, 115, 111–130. [Google Scholar] [CrossRef]
- Huang, M.; Shaw, W.-C.; Lin, C. One Category, Two Communities: Subfield Differences in “Information Science and Library Science” in Journal Citation Reports. Scientometrics 2019, 119, 1059–1079. [Google Scholar] [CrossRef]
- Gómez-Núñez, A.J.; Vargas-Quesada, B.; de Moya-Anegón, F. Updating the SCImago Journal and Country Rank Classification: A New Approach Using Ward’s Clustering and Alternative Combination of Citation Measures. J. Assoc. Inf. Sci. Technol. 2016, 67, 178–190. [Google Scholar] [CrossRef]
- Leydesdorff, L.; Bornmann, L.; Zhou, P. Construction of a Pragmatic Base Line for Journal Classifications and Maps Based on Aggregated Journal-Journal Citation Relations. J. Informetr. 2016, 10, 902–918. [Google Scholar] [CrossRef] [Green Version]
- Waltman, L.; van Eck, N.J. A Systematic Empirical Comparison of Different Approaches for Normalizing Citation Impact Indicators. J. Informetr. 2013, 7, 833–849. [Google Scholar] [CrossRef] [Green Version]
- Bouyssou, D.; Marchant, T. Ranking Authors Using Fractional Counting of Citations: An Axiomatic Approach. J. Informetr. 2016, 10, 183–199. [Google Scholar] [CrossRef] [Green Version]
- Perianes-Rodriguez, A.; Ruiz-Castillo, J. A Comparison of the Web of Science and Publication-Level Classification Systems of Science. J. Informetr. 2017, 11, 32–45. [Google Scholar] [CrossRef] [Green Version]
- Milojević, S. Practical Method to Reclassify Web of Science Articles into Unique Subject Categories and Broad Disciplines. Quant. Sci. Stud. 2020, 1, 183–206. [Google Scholar] [CrossRef]
- Haunschild, R.; Schier, H.; Marx, W.; Bornmann, L. Algorithmically Generated Subject Categories Based on Citation Relations: An Empirical Micro Study Using Papers on Overall Water Splitting. J. Informetr. 2018, 12, 436–447. [Google Scholar] [CrossRef] [Green Version]
- Thelwall, M. Are There Too Many Uncited Articles? Zero Inflated Variants of the Discretised Lognormal and Hooked Power Law Distributions. J. Informetr. 2016, 10, 622–633. [Google Scholar] [CrossRef] [Green Version]
- Matthews, T. Web of Science Group: Welcome to our Training Portal. Available online: https://clarivate.libguides.com/home/welcome (accessed on 25 November 2020).
- Clarivate. Web of Science Service for UK Education. Available online: Wok.mimas.ac.uk (accessed on 25 November 2020).
- Clarivate. Web of Science Platform—Web of Science Group. Available online: https://clarivate.com/webofsciencegroup/solutions/webofscience-platform/ (accessed on 25 November 2020).
- Clarivate. Scientific and Academic Research-Web of Science Group. Available online: https://support.clarivate.com/ScientificandAcademicResearch/s/?language=en_US (accessed on 25 November 2020).
- Clarivate. Web of Science Core Collection: Quick Reference Cards (PDF). Available online: https://clarivate.libguides.com/woscc/guides (accessed on 26 November 2020).
- Ruccolo, M. Web of Science Core Collection: Web of Science: Summary of Coverage. Available online: https://clarivate.libguides.com/woscc/coverage (accessed on 25 November 2020).
- Clarivate. Journal Citation Reports—Infographic. Available online: https://clarivate.com/webofsciencegroup/web-of-science-journal-citation-reports-2020-infographic/ (accessed on 25 November 2020).
- Clarivate. Web of Science Core Collection—Web of Science Group. Available online: https://clarivate.com/webofsciencegroup/solutions/web-of-science-core-collection/ (accessed on 25 November 2020).
- Clarivate. Publishers. Available online: http://wokinfo.com/mbl/publishers/?utm_source=false&utm_medium=false&utm_campaign=false (accessed on 25 November 2020).
- Ruccolo, M. Web of Science Core Collection: Introduction. Available online: https://clarivate.libguides.com/woscc/basics (accessed on 2 January 2021).
- Clarivate. Backfiles. Available online: http://wokinfo.com/products_tools/backfiles/?utm_source=false&utm_medium=false&utm_campaign=false (accessed on 25 November 2020).
- Elsevier. About Scopus—Abstract and citation database. Available online: https://www.elsevier.com/solutions/scopus (accessed on 25 November 2020).
- Elsevier. Scopus: Access and use Support Center-Home. Available online: https://service.elsevier.com/app/home/supporthub/scopus/ (accessed on 25 November 2020).
- Elsevier. Product Releases | Elsevier Scopus Blog. Available online: https://blog.scopus.com/product-releases (accessed on 25 November 2020).
- Elsevier. Scopus tutorials—Scopus: Access and Use Support Center. Available online: https://service.elsevier.com/app/answers/detail/a_id/14799/kw/export/supporthub/scopus/related/1/ (accessed on 25 November 2020).
- Clarivate. Web of Science-Videos. Available online: https://videos.webofsciencegroup.com/ (accessed on 25 November 2020).
- Clarivate. InCites Indicators Handbook. 2018. Available online: https://incites.help.clarivate.com/Content/Resources/Docs/indicators-handbook-june-2018.pdf (accessed on 25 November 2020).
- Clarivate. Master Book List—MBL. Available online: http://wokinfo.com/mbl/ (accessed on 4 December 2020).
- Matthews, T. Regional Citation Indexes. Available online: https://clarivate.libguides.com/webofscienceplatform/rci (accessed on 4 December 2020).
- Matthews, T. Web of Science: Direct Links: Home. Available online: https://clarivate.libguides.com/c.php?g=648493&p=4547878 (accessed on 25 November 2020).
- Clarivate. Emerging Sources Citation Index Backfile (2005–2014). Available online: https://clarivate.com/wp-content/uploads/2018/05/M255-Crv_SAR_ESCI-Individual-infographic-002.pdf (accessed on 25 November 2020).
- Rovira, C.; Codina, L.; Guerrero-Solé, F.; Lopezosa, C. Ranking by Relevance and Citation Counts, a Comparative Study: Google Scholar, Microsoft Academic, WoS and Scopus. Futur. Internet 2019, 11, 9020. [Google Scholar] [CrossRef] [Green Version]
- Clarivate. Author Search Beta. Web of Science Core Collection—Quick Reference Guide. Clarivate Analytics 2019. Available online: https://clarivate.libguides.com/woscc/guides (accessed on 25 November 2020).
- Ruccolo, M. Web of Science Core Collection: Searching for an Institution. Available online: http://clarivate.libguides.com/woscc/institution (accessed on 3 December 2020).
- Elsevier. What is the Scopus Affiliation Identifier? Available online: https://service.elsevier.com/app/answers/detail/a_id/11215/supporthub/scopus/ (accessed on 25 November 2020).
- Elsevier. How Do I Email, Print, or Create a Bibliography, or Save Documents to PDF Format? Available online: https://service.elsevier.com/app/answers/detail/a_id/12009/supporthub/scopus/kw/bibliography/ (accessed on 4 December 2020).
- Elsevier. How Do I Export Documents from Scopus?—Scopus: Access and Use Support Center. Available online: https://service.elsevier.com/app/answers/detail/a_id/11234/kw/export/supporthub/scopus/related/1/ (accessed on 25 November 2020).
- Birkle, C.; Pendlebury, D.A.; Schnell, J.; Adams, J. Web of Science as a Data Source for Research on Scientific and Scholarly Activity. Quant. Sci. Stud. 2020, 1, 363–376. [Google Scholar] [CrossRef]
- Williams, R.; Bornmann, L. Sampling Issues in Bibliometric Analysis. J. Informetr. 2016, 10, 1225–1232. [Google Scholar] [CrossRef] [Green Version]
- Gu, X.; Blackmore, K.L. Recent Trends in Academic Journal Growth. Scientometrics 2016, 108, 693–716. [Google Scholar] [CrossRef]
- Mingers, J.; Yang, L. Evaluating Journal Quality: A Review of Journal Citation Indicators and Ranking in Business and Management. Eur. J. Oper. Res. 2017, 257, 323–337. [Google Scholar] [CrossRef]
- Setti, G. Bibliometric Indicators: Why Do We Need More than One? IEEE Access 2013, 1, 232–246. [Google Scholar] [CrossRef]
- Garfield, E. Citation Indexes for Science: A New Dimension in Documentation through Association of Ideas. Science 1955, 122, 108–111. [Google Scholar] [CrossRef] [PubMed]
- Larivière, V.; Sugimoto, C.R. The Journal Impact Factor: A Brief History, Critique, and Discussion of Adverse Effects. In Springer Handbook of Science and Technology Indicators; Glänzel, W., Moed, H.F., Schmoch, U., Thelwall, M., Eds.; Springer International Publishing: Cham, Switzerland, 2018; pp. 1–33. ISBN 978-3-030-02511-3. [Google Scholar]
- Ranjan, C.K. Bibliometric Indices of Scientific Journals: Time to Overcome the Obsession and Think beyond the Impact Factor. Med. J. Armed Forces India 2017, 73, 175–177. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Callaway, E. Publishing Elite Turns against Impact Factor. Nature 2016, 535, 210–211. [Google Scholar] [CrossRef] [Green Version]
- Moustafa, K. The Disaster of the Impact Factor. Sci. Eng. Ethics 2014, 21, 139–142. [Google Scholar] [CrossRef]
- Clarivate. Journal Citation Reports—Journal Impact Factor (JIF). Available online: https://clarivate.libguides.com/ld.php?content_id=55911188 (accessed on 25 November 2020).
- Ferrer-Sapena, A.; Sánchez-Pérez, E.A.; Peset, F.; González, L.-M.; Aleixandre-Benavent, R. The Impact Factor as a Measuring Tool of the Prestige of the Journals in Research Assessment in Mathematics. Res. Eval. 2016, 25, 306–314. [Google Scholar] [CrossRef] [Green Version]
- Alguliyev, R.; Aliguliyev, R.; Ismayilova, N. Impact Factor Penalized by Self-Citations. Appl. Inf. Commun. Technol. AICT 2016 Conf. Proc. 2017, 2–5. [Google Scholar] [CrossRef]
- Teixeira da Silva, J.A.; Memon, A.R. CiteScore: A Cite for Sore Eyes, or a Valuable, Transparent Metric? Scientometrics 2017, 111, 553–556. [Google Scholar] [CrossRef]
- Elsevier. How Are CiteScore Metrics Used in Scopus? Available online: https://service.elsevier.com/app/answers/detail/a_id/14880/supporthub/scopus/ (accessed on 5 December 2020).
- Bornmann, L.; Marx, W. Methods for the Generation of Normalized Citation Impact Scores in Bibliometrics: Which Method Best Reflects the Judgements of Experts? J. Informetr. 2015, 9, 408–418. [Google Scholar] [CrossRef] [Green Version]
- Moed, H.F. Measuring Contextual Citation Impact of Scientific Journals. J. Informetr. 2010, 4, 265–277. [Google Scholar] [CrossRef] [Green Version]
- Waltman, L.; van Eck, N.J.; van Leeuwen, T.N.; Visser, M.S. Some Modifications to the SNIP Journal Impact Indicator. J. Informetr. 2013, 7, 272–285. [Google Scholar] [CrossRef] [Green Version]
- Leydesdorff, L.; Opthof, T. Scopus’s Source Normalized Impact per Paper (SNIP) versus a Journal Impact Factor Based on Fractional Counting of Citations. J. Am. Soc. Inf. Sci. Technol. 2010, 61, 2365–2369. [Google Scholar] [CrossRef] [Green Version]
- Moed, H.F. Comprehensive Indicator Comparisons Intelligible to Non-Experts: The Case of Two SNIP Versions. Scientometrics 2016, 106, 51–65. [Google Scholar] [CrossRef] [Green Version]
- Colledge, L.; de Moya-Anegón, F.; Guerrero-Bote, V.; López-Illescas, C.; El Aisati, M.; Moed, H. SJR and SNIP: Two New Journal Metrics in Elsevier’s Scopus. Serials 2010, 23, 215–221. [Google Scholar] [CrossRef] [Green Version]
- Mingers, J. Problems with SNIP. J. Informetr. 2014, 8, 890–894. [Google Scholar] [CrossRef]
- Bergstrom, C.T. Eigenfactor: Measuring the Value and Prestige of Scholarly Journals. Coll. Res. Libr. News 2007, 68, 314–316. [Google Scholar] [CrossRef]
- González-Pereira, B.; Guerrero-Bote, V.P.; Moya-Anegón, F. A New Approach to the Metric of Journals Scientific Prestige: The SJR Indicator. J. Informetr. 2010, 4, 379–391. [Google Scholar] [CrossRef]
- Guerrero-Bote, V.P.; Moya-Anegón, F. A Further Step Forward in Measuring Journals’ Scientific Prestige: The SJR2 Indicator. J. Informetr. 2012, 6, 674–688. [Google Scholar] [CrossRef] [Green Version]
- Hirsch, J.E. An Index to Quantify an Individual’s Scientific Research Output. Proc. Natl. Acad. Sci. USA 2005, 102, 16569–16572. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Barnes, C. The H-Index Debate: An Introduction for Librarians. J. Acad. Librariansh. 2017, 43, 487–494. [Google Scholar] [CrossRef]
- Montazerian, M.; Zanotto, E.D.; Eckert, H. A New Parameter for (Normalized) Evaluation of H-Index: Countries as a Case Study. Scientometrics 2019, 118, 1065–1078. [Google Scholar] [CrossRef]
- Vîiu, G.A. A Theoretical Evaluation of Hirsch-Type Bibliometric Indicators Confronted with Extreme Self-Citation. J. Informetr. 2016, 10, 552–566. [Google Scholar] [CrossRef]
- Raheel, M.; Ayaz, S. Evaluation of H-Index, Its Variants and Extensions Based on Publication Age & Citation Intensity in Civil Engineering. Scientometrics 2018, 114, 1107–1127. [Google Scholar] [CrossRef]
- Ding, J.; Liu, C.; Asobenie, G. Exploring the Limitations of the h-Index and h-Type Indexes in Measuring the Research Performance of Authors. Scientometrics 2020, 122, 1303–1322. [Google Scholar] [CrossRef]
- Ghani, R.; Qayyum, F.; Tanvir, M.; Hermann, A. Comprehensive Evaluation of H-index and Its Extensions in the Domain of Mathematics. Scientometrics 2019, 118, 809–822. [Google Scholar] [CrossRef]
- Schubert, A.; Schubert, G. All along the H-Index-Related Literature: A Guided Tour. In Springer Handbook of Science and Technology Indicators. Springer Handbooks; Glänzel, W., Moed, H.F., Schmoc, U., Thelwall, M., Eds.; Springer: Cham, Switzerland, 2019; pp. 301–334. [Google Scholar] [CrossRef]
- Teixeira da Silva, J.A.; Dobránszki, J. Multiple Versions of the H-Index: Cautionary Use for Formal Academic Purposes. Scientometrics 2018, 115, 1107–1113. [Google Scholar] [CrossRef]
- Hu, G.; Wang, L.; Ni, R.; Liu, W. Which H-Index? An Exploration within the Web of Science. Scientometrics 2020, 123, 1225–1233. [Google Scholar] [CrossRef]
- Walters, W.H. Do Subjective Journal Ratings Represent Whole Journals or Typical Articles? Unweighted or Weighted Citation Impact? J. Informetr. 2017, 11, 730–744. [Google Scholar] [CrossRef]
- Moed, H.F.; Colledge, L.; Reedijk, J.; Moya-Anegon, F.; Guerrero-Bote, V.; Plume, A.; Amin, M. Citation-Based Metrics Are Appropriate Tools in Journal Assessment Provided That They Are Accurate and Used in an Informed Way. Scientometrics 2012, 92, 367–376. [Google Scholar] [CrossRef]
- Okagbue, H.I.; Teixeira da Silva, J.A. Correlation between the CiteScore and Journal Impact Factor of Top-Ranked Library and Information Science Journals. Scientometrics 2020, 124, 797–801. [Google Scholar] [CrossRef]
- Cockriel, W.M.; Mcdonald, J.B. The Influence of Dispersion on Journal Impact Measures. Scientometrics 2018, 116, 609–622. [Google Scholar] [CrossRef]
- Giuffrida, C.; Abramo, G.; D’Angelo, C.A. Are All Citations Worth the Same? Valuing Citations by the Value of the Citing Items. J. Informetr. 2019, 13, 500–514. [Google Scholar] [CrossRef] [Green Version]
- Elsevier. CiteScore Metrics: The Basics. 2018. Available online: https://www.elsevier.com/__data/assets/pdf_file/0007/652552/SC_FS_CiteScore-metrics-The-Basics.pdf (accessed on 9 October 2020).
- Krauskopf, E. Sources without a CiteScore Value: More Clarity Is Required. Scientometrics 2020, 122, 1801–1812. [Google Scholar] [CrossRef]
- Bornmann, L. Measuring Impact in Research Evaluations: A Thorough Discussion of Methods for, Effects of and Problems with Impact Measurements. High. Educ. 2017, 73, 775–787. [Google Scholar] [CrossRef] [Green Version]
- Antonoyiannakis, M. Impact Factors and the Central Limit Theorem: Why Citation Averages Are Scale Dependent. J. Informetr. 2018, 12, 1072–1088. [Google Scholar] [CrossRef] [Green Version]
- Wilsdon, J.; Allen, L.; Belfiore, E.; Campbell, P.; Curry, S.; Hill, S.; Jones, R.; Kain, R.; Kerridge, S.; Thelwall, M.; et al. The Metric Tide: Report of the Independent Review of the Role of Metrics in Research Assessment and Management; HEFCE: London, UK, 2015. [Google Scholar] [CrossRef]
- Desrochers, N.; Paul-Hus, A.; Haustein, S.; Mongeon, P.; Quan-haase, A.; Bowman, T.D.; Pecoskie, J.; Tsou, A.; Larivière, V. Authorship, Citations, Acknowledgments and Visibility in Social Media: Symbolic Capital in the Multifaceted Reward System of Science. Soc. Sci. Inf. 2018, 57, 233–248. [Google Scholar] [CrossRef]
- Ravenscroft, J.; Liakata, M.; Clare, A.; Duma, D. Measuring Scientific Impact beyond Academia: An Assessment of Existing Impact Metrics and Proposed Improvements. PLoS ONE 2017, 12, e0173152. [Google Scholar] [CrossRef] [Green Version]
- Reale, E.; Avramov, D.; Canhial, K.; Donovan, C.; Flecha, R.; Holm, P.; Larkin, C.; Lepori, B.; Mosoni-Fried, J.; Oliver, E.; et al. A Review of Literature on Evaluating the Scientific, Social and Political Impact of Social Sciences and Humanities Research. Res. Eval. 2018, 27, 298–308. [Google Scholar] [CrossRef] [Green Version]
- Tahamtan, I.; Bornmann, L. What Do Citation Counts Measure? An Updated Review of Studies on Citations in Scientific Documents Published between 2006 and 2018. Scientometrics 2019, 121, 1635–1684. [Google Scholar] [CrossRef] [Green Version]
- MacRoberts, M.H.; MacRoberts, B.R. The Mismeasure of Science: Citation Analysis. J. Assoc. Inf. Sci. Technol. 2018, 69, 474–482. [Google Scholar] [CrossRef]
- Wang, B.; Bu, Y.; Xu, Y. A Quantitative Exploration on Reasons for Citing Articles from the Perspective of Cited Authors. Scientometrics 2018, 116, 675–687. [Google Scholar] [CrossRef]
- Crothers, C.; Bornmann, L.; Haunschild, R. Citation Concept Analysis (CCA) of Robert K. Merton’s Book Social Theory and Social Structure: How Often Are Certain Concepts from the Book Cited in Subsequent Publications? Quant. Sci. Stud. 2020, 1, 675–690. [Google Scholar] [CrossRef] [Green Version]
- Lee, D.H. Predictive Power of Conference-Related Factors on Citation Rates of Conference Papers. Scientometrics 2019, 118, 281–304. [Google Scholar] [CrossRef]
- Xie, J.; Gong, K.; Li, J.; Ke, Q.; Kang, H.; Cheng, Y. A Probe into 66 Factors Which Are Possibly Associated with the Number of Citations an Article Received. Scientometrics 2019, 119, 1429–1454. [Google Scholar] [CrossRef]
- Tahamtan, I.; Safipour, A.; Khadijeh, A. Factors Affecting Number of Citations: A Comprehensive Review of the Literature. Scientometrics 2016, 107, 1195–1225. [Google Scholar] [CrossRef]
- Tahamtan, I.; Bornmann, L. Core Elements in the Process of Citing Publications: Conceptual Overview of the Literature. J. Informetr. 2018, 12, 203–216. [Google Scholar] [CrossRef] [Green Version]
- Bornmann, L.; Leydesdorff, L. Skewness of Citation Impact Data and Covariates of Citation Distributions: A Large-Scale Empirical Analysis Based on Web of Science Data. J. Informetr. 2017, 11, 164–175. [Google Scholar] [CrossRef] [Green Version]
- Uddin, S.; Khan, A. The Impact of Author-Selected Keywords on Citation Counts. J. Informetr. 2016, 10, 1166–1177. [Google Scholar] [CrossRef]
- Abramo, G.; D’Angelo, A.C.; Reale, E. Peer Review versus Bibliometrics: Which Method Better Predicts the Scholarly Impact of Publications? Scientometrics 2019, 121, 537–554. [Google Scholar] [CrossRef] [Green Version]
- Brito, R.; Rodríguez-Navarro, A. Evaluating Research and Researchers by the Journal Impact Factor: Is It Better than Coin Flipping? J. Informetr. 2019, 13, 314–324. [Google Scholar] [CrossRef] [Green Version]
- Lei, L.; Sun, Y. Should Highly Cited Items Be Excluded in Impact Factor Calculation? The Effect of Review Articles on Journal Impact. Scientometrics 2020, 122, 1697–1706. [Google Scholar] [CrossRef]
- Milojević, S.; Radicchi, F.; Bar-Ilan, J. Citation Success Index − An Intuitive Pair-Wise Journal Comparison Metric. J. Informetr. 2017, 11, 223–231. [Google Scholar] [CrossRef] [Green Version]
- Haddawy, P.; Hassan, S.U.; Asghar, A.; Amin, S. A Comprehensive Examination of the Relation of Three Citation-Based Journal Metrics to Expert Judgment of Journal Quality. J. Informetr. 2016, 10, 162–173. [Google Scholar] [CrossRef]
- Thelwall, M. Not Dead, Just Resting: The Practical Value of per Publication Citation Indicators. J. Informetr. 2016, 10, 667–670. [Google Scholar] [CrossRef] [Green Version]
- Traag, V.A.; Waltman, L. Systematic Analysis of Agreement between Metrics and Peer Review in the UK REF. Palgrave Commun. 2019, 5, 29. [Google Scholar] [CrossRef]
- Baccini, A.; De Nicolao, G. Do They Agree? Bibliometric Evaluation versus Informed Peer Review in the Italian Research Assessment Exercise. Scientometrics 2016, 108, 1651–1671. [Google Scholar] [CrossRef] [Green Version]
- Baccini, A.; De Nicolao, G.; Petrovich, E. Citation Gaming Induced by Bibliometric Evaluation: A Country-Level Comparative Analysis. PLoS ONE 2019, 14, e0221212. [Google Scholar] [CrossRef]
- Nazarovets, S. Controversial Practice of Rewarding for Publications in National Journals. Scientometrics 2020, 124, 813–818. [Google Scholar] [CrossRef]
- Schneider, J.W.; Aagaard, K.; Bloch, C.W. What Happens When National Research Funding Is Linked to Differentiated Publication Counts? A Comparison of the Australian and Norwegian Publication-Based Funding Models. Res. Eval. 2016, 25, 244–256. [Google Scholar] [CrossRef]
- Eisner, D.A. Reproducibility of Science: Fraud, Impact Factors and Carelessness. J. Mol. Cell. Cardiol. 2018, 114, 364–368. [Google Scholar] [CrossRef] [PubMed] [Green Version]
- Kulczycki, E.; Engels, T.C.E.; Pölönen, J.; Bruun, K.; Dušková, M.; Guns, R.; Nowotniak, R.; Petr, M.; Sivertsen, G.; Istenič Starčič, A.; et al. Publication Patterns in the Social Sciences and Humanities: Evidence from Eight European Countries. Scientometrics 2018, 116, 463–486. [Google Scholar] [CrossRef] [Green Version]
- Moher, D.; Naudet, F.; Cristea, I.A.; Miedema, F.; Ioannidis, J.P.A.; Goodman, S.N. Assessing Scientists for Hiring, Promotion, and Tenure. PLoS Biol. 2018, 16, e2004089. [Google Scholar] [CrossRef]
- ASCB. Read the declaration—DORA. Available online: https://sfdora.org/read/ (accessed on 25 November 2020).
- Hicks, D.; Wouters, P.; Waltman, L.; De Rijcke, S.; Rafols, I. Bibliometrics: The Leiden Manifesto for Research Metrics. Nature 2015, 520, 429–431. [Google Scholar] [CrossRef] [Green Version]
- Wilsdon, J.; Bar-Ilan, J.; Frodeman, R.; Lex, E.; Peters, I.; Wouters, P. Next-Generation Metrics: Responsible Metrics and Evaluation for Open Science; European Commission: Brussels, Belgium, 2017; ISBN 978-92-79-66130-3. [Google Scholar]
- Wouters, P.; Ràfols, I.; Oancea, A.; Kamerlin, L.S.C.; Holbrook, B.J.; Jacob, M. Indicator Frameworks for Fostering Open Knowledge Practices in Science and Scholarship; European Commission: Brussels, Belgium, 2019; ISBN 978-92-76-11882-4. [Google Scholar] [CrossRef]
- Bornmann, L.; Adams, J.; Leydesdorff, L. The Negative Effects of Citing with a National Orientation in Terms of Recognition: National and International Citations in Natural-Sciences Papers from Germany, the Netherlands, and the UK. J. Informetr. 2018, 12, 931–949. [Google Scholar] [CrossRef] [Green Version]
- Kosanović, B.; Šipka, P. Output in WoS vs. Representation in JCR of SEE Nations: Does Mother Thomson Cherish All Her Children Equally. J. Publ. Dev. Transit. Emerg. Ctries. 2013, 125–137. [Google Scholar] [CrossRef]
- Archambault, É.; Campbell, D.; Gingras, Y.; Larivière, V. Comparing Bibliometric Statistics Obtained from the Web of Science and Scopus. J. Am. Soc. Inf. Sci. Technol. 2009, 60, 1320–1326. [Google Scholar] [CrossRef]
- Robinson-Garcia, N.; Mongeon, P.; Jeng, W.; Costas, R. DataCite as a Novel Bibliometric Source: Coverage, Strengths and Limitations. J. Informetr. 2017, 11, 841–854. [Google Scholar] [CrossRef] [Green Version]
- Miguel, S.; Tannuri de Oliveira, E.; Cabrini Grácio, M. Scientific Production on Open Access: A Worldwide Bibliometric Analysis in the Academic and Scientific Context. Publications 2016, 4, 1. [Google Scholar] [CrossRef] [Green Version]
- Rodrigues, R.; Taga, V.; Passos, M. Research Articles about Open Access Indexed by Scopus: A Content Analysis. Publications 2016, 4, 31. [Google Scholar] [CrossRef] [Green Version]
- Clarivate. Release Notes—Web of Science Group. Available online: https://clarivate.com/webofsciencegroup/release-notes/wos/ (accessed on 29 December 2020).
- Zhang, N.; Wan, S.; Wang, P.; Zhang, P.; Wu, Q. A Bibliometric Analysis of Highly Cited Papers in the Field of Economics and Business Based on the Essential Science Indicators Database. Scientometrics 2018, 116, 1039–1053. [Google Scholar] [CrossRef]
| Covered Item | Scopus 1 [89] | WoS CC | Source of WoS CC Information 2 |
|---|---|---|---|
| Source titles (overall number) | >25,100 (active) | >21,400 (journals only) | [196] |
| Journals with impact metrics | ~23,500 | >12,100 (JCR) | [197] |
| Full Open Access sources | >5500 | ~5000 (JCR—1658) | [197,198] |
| Hybrid sources | n.i. | JCR—7487 | [197] |
| Conferences (number of events) | >120,000 | ~220,000 | [196] |
| Conferences (number of entries) | >9.8 M | >10 M | [28] |
| Books (overall number of items) | >210,000 | >119,000 | [196] |
| Book series | >850 | >150 (BkCI) | [28] |
| Patents (number of entries) | >43.7 M | Not included in WoS CC | [196] |
| Trade publications | 294 | Not indexed in WoS | [28] |
| Publishers | >5000 | >3300 (JCR), >600 (BkCI) | [114,199] |
| Author profiles | >16 M | n.i. | |
| Institutional profiles | ~70 M | >5000 | [200] |
| Overall number of core records (indexed in the DB) | >77.8 M | >79 M | [196] |
| Number of non-core records (cited in the core records, but not indexed) | >210 M | >177 M | [28] |
| Entries with cited references | >68 M | n.i. | |
| Cited references (overall number) | >1.7 B | >1.6 B | [196] |
| Cited references (covered time-frame) | since 1970 | since 1900 | [196] |
| Entries dating back to | 1788 | 1864 | [201] |
| Characteristic | JIF/Five-Year JIF (WoS CC) | ES (WoS CC) | AIS (WoS CC) | CS (Scopus) | SNIP (Scopus) | SJR (Scopus) |
|---|---|---|---|---|---|---|
| Calculation principle | ratio of citations and publications | based on Eigenvector centrality | based on Eigenvector centrality | ratio of citations and publications | ratio of citations and publications, normalized by citing densities in different disciplines | based on citation networks |
| Publication counting window | 2 years/5 years | 5 years | 5 years | 4 years | 3 years | 3 years |
| Citation counting window | 1 year | 1 year | 1 year | 4 years | 1 year | 1 year |
| Inclusion of journal self-citations | yes | no | no | yes | yes | limited to 33% |
| Normalized by number of papers in the journal (size-independent) | yes | no | yes | yes | yes | yes |
| Normalized by disciplines | no | no | no | no | yes | not directly; accounts for thematic closeness between journals |
| Normalized by prestige (weighted) | no | yes | yes | no | no | yes |
| Applicability | journals included in JCR | journals included in JCR | journals included in JCR | all serial sources (journals, book series, conference proceedings and trade publications) | all serial sources | all serial sources |
| Availability | requires an additional subscription to JCR | requires an additional subscription to JCR | requires an additional subscription to JCR | free (subscription not required) | free (subscription not required) | free (subscription not required) |
| Major drawbacks and limitations | differing document types included in numerator and denominator (susceptible to manipulation); short citation counting window (for classical JIF); not normalized by disciplines | inconvenient numerical value, which decreases as new journals are added to the database; not normalized by disciplines | named an article metric, but indicates the impact of an average article in a journal (not a particular article); not normalized by disciplines | not normalized by disciplines | named impact per paper, but indicates the impact of an average article in a journal (not a particular article) | complex calculation, difficult to interpret |
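The ratio-type indicators in the table above differ chiefly in their citation and publication counting windows. A minimal sketch of the two simplest cases follows, using purely illustrative, made-up counts for a hypothetical journal; the real indicators additionally restrict the denominator to citable document types, which this sketch omits.

```python
def journal_impact_factor(citations_to_prev_2y: int, citable_items_prev_2y: int) -> float:
    """Classical 2-year JIF: citations received in year Y to items published
    in years Y-1 and Y-2, divided by the citable items from those two years."""
    return citations_to_prev_2y / citable_items_prev_2y

def citescore(citations_4y: int, documents_4y: int) -> float:
    """CiteScore: citations received over a 4-year window to documents
    published in that same window, divided by the document count."""
    return citations_4y / documents_4y

# Illustrative counts only (not real data):
print(journal_impact_factor(150, 100))  # -> 1.5
print(citescore(480, 300))              # -> 1.6
```

The asymmetry visible here is the one the table records: JIF counts citations from a single year against a 2-year publication window, while CiteScore uses the same 4-year window for both counts, which makes its numerator and denominator cover identical document sets.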
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
© 2021 by the author. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).
Share and Cite
Pranckutė, R. Web of Science (WoS) and Scopus: The Titans of Bibliographic Information in Today’s Academic World. Publications 2021, 9, 12. https://doi.org/10.3390/publications9010012