The document discusses various citation databases and metrics for evaluating publications and journals. It describes Web of Science, Scopus, and Google Scholar as the major citation databases. It provides details on the coverage, citation data included, and analytical tools available for each database. The document also explains journal citation reports, which allow comparison of journals using citation data. Key metrics for journals are defined, including impact factor, eigenfactor, and article influence score. Quartile comparisons that enable evaluation of journal rankings are also outlined.
This document provides an overview of various bibliometric products and metrics that can be used to measure research impact, including journal impact factor, h-index, citation counts, and journal/article ranking tools from Journal Citation Reports, Scopus, and Google Scholar. It discusses the purpose and calculations of metrics like impact factor, eigenfactor, and source normalized impact per paper (SNIP). It also covers limitations of bibliometrics and recommends using multiple metrics and tools to evaluate research. Exercises are provided to help understand how to analyze journals, articles, and individual researchers using different bibliometric resources.
Bibliometrics literally means "book measurement" but the term is used about all kinds of documents (with journal articles as the dominant kind of document).
What is measured are not the physical properties of documents but statistical patterns in variables such as authorship, sources, subjects, geographical origins, and citations.
The document discusses citation indexing. It defines citation indexing as a process that detects relationships between documents through citations. When one document cites another document, there is a conceptual relationship between the ideas in the two documents. The document outlines the history and development of citation indexing, including the first citation index created by Frank Shepard and important contributions by Eugene Garfield. It describes the major citation indexes produced by the Institute for Scientific Information (ISI), now Thomson Reuters, including the Science Citation Index, Social Sciences Citation Index, and Arts and Humanities Citation Index.
Altmetrics are alternative metrics for measuring the impact of scholarly work that use social media and online mentions. They provide a more comprehensive view of impact across different sectors. Altmetric tools track how often research is mentioned online, including the number of tweets, likes, shares and saves. This provides a more timely measure of impact than traditional citations alone. While altmetrics don't tell the whole story, they can help understand broader impacts beyond academia. Librarians can help researchers understand and use altmetrics to track engagement and influence.
Scientometrics is the quantitative study of science, communication in science, and science policy. It involves the analysis, evaluation, and visualization of science using quantitative methods. Some key points:
- The term was coined in 1969 in Russia and gained recognition after the founding of the journal Scientometrics in 1978.
- It applies bibliometric methods to measure the impact and quality of science by analyzing patterns of citation and publication.
- Important indicators include the journal impact factor, h-index, and citation analysis which are used to evaluate scientific institutions, departments, and individual researchers.
- Tools like CiteSpace and Authormap are used to map and visualize citation networks and trends within literature.
The document discusses author level metrics and how they are used to measure the impact of individual authors. It defines author level metrics as citation metrics that measure the bibliometric impact of individual researchers. It also discusses different types of author level metrics, including article-level metrics, journal-level metrics, h-index, i10-index, g-index, and altmetrics. Finally, it discusses tools that can be used to measure author metrics, such as Google Scholar, Web of Science, Scopus, and Publish or Perish.
The document discusses citation analysis and its importance in measuring the quality and impact of research. Citation analysis evaluates the citations received by research papers, scientists, universities, and countries as a measure of scientific influence and productivity. Citations serve several purposes, such as acknowledging prior work, substantiating claims, and showing consideration of different opinions. Co-citation coupling and bibliographic coupling are methods to establish relationships between scholarly works based on their citations. Common citation metrics include the h-index and impact factor, which provide ways to quantify the impact of research, though they also have limitations. Overall, citation analysis through various metrics is an objective way to determine how influential and important a piece of research has been to the scientific community.
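The two coupling relations described above can be sketched in a few lines of Python. This is an illustrative sketch only; the paper IDs and reference sets below are hypothetical, not from the source.

```python
# Bibliographic coupling looks backward: it counts the references two
# papers share. Co-citation looks forward: it counts how often two papers
# are cited together by later works. All data here is hypothetical.

# Reference lists of two papers, keyed by paper ID.
references = {
    "paper_A": {"r1", "r2", "r3"},
    "paper_B": {"r2", "r3", "r4"},
}

def coupling_strength(a, b, refs):
    """Bibliographic coupling strength: size of the shared reference set."""
    return len(refs[a] & refs[b])

# Reference lists of later papers that may co-cite X and Y.
citing = {
    "later_1": {"X", "Y"},
    "later_2": {"X"},
    "later_3": {"X", "Y"},
}

def cocitation_count(x, y, citing_refs):
    """Co-citation count: number of later papers citing both x and y."""
    return sum(1 for cited in citing_refs.values() if {x, y} <= cited)

print(coupling_strength("paper_A", "paper_B", references))  # 2
print(cocitation_count("X", "Y", citing))                   # 2
```

Note the asymmetry in time: coupling strength is fixed once two papers are published, while co-citation counts grow as the later literature accumulates.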
This document discusses bibliometrics, scientometrics, citation analysis, and content analysis. It defines bibliometrics as the quantitative study of recorded information and describes common tools used in bibliometrics like the Science Citation Index. Key variables that are studied include authors, origin, sources, contents, and citations. Important laws and methods in bibliometrics are also summarized, such as Lotka's Law, Bradford's Law, and Zipf's Law. Scientometrics is defined as the quantitative study of science output using bibliometric methods. Citation analysis examines citation patterns and links between scholarly works.
This document discusses the concept of bibliometrics. It defines bibliometrics as the application of mathematical and statistical methods to books and publications. The document outlines the origin and development of bibliometrics, provides definitions, and discusses the scope, laws, software, applications, and limitations of bibliometric analysis.
Web of Science is an online scientific citation indexing service that allows users to search bibliographic databases for academic literature. It is owned by Clarivate Analytics and provides access to multiple databases that index thousands of scholarly journals, books, and conference proceedings. Some key points:
- Web of Science allows citation searching to find academic sources that have cited a particular work or have been cited by other works.
- It provides citation metrics like the h-index and citation reports that measure the impact and influence of authors, publications, and institutions.
- Advanced search features allow using Boolean operators, field tags, and other tools to construct complex queries across various databases within Web of Science.
The document discusses bibliometrics and bibliometric laws. It defines bibliometrics as the application of mathematical and statistical methods to bibliographic information, and discusses its origins and objectives. Three important bibliometric laws are covered: Lotka's law describes the frequency of publication by authors, Bradford's law explains the scattering of articles across journals, and Zipf's law concerns the frequency of word usage. In conclusion, bibliometrics is an important tool in library and information science for studying information processes and flows.
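Lotka's law, mentioned above, has a simple inverse-square form: if a field has A authors with exactly one paper each, roughly A / n² authors have exactly n papers each. A minimal sketch, using invented numbers:

```python
def lotka_expected(single_paper_authors, n):
    """Expected number of authors with exactly n papers under Lotka's
    inverse-square law, given the count of single-paper authors."""
    return single_paper_authors / n ** 2

# Hypothetical field with 100 single-paper authors.
for n in (1, 2, 3, 4):
    print(n, lotka_expected(100, n))  # 100.0, 25.0, ~11.1, 6.25
```

The steep drop-off is the point: productive authors are rare, and most authors contribute only once.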
This document provides an overview of bibliometrics and discusses several key bibliometric techniques and laws. It begins with definitions of bibliometrics and related terms. It then describes various bibliometric techniques including citation analysis, citation indexing, self-citation, publication counts, direct citations, bibliographic coupling, and co-citation coupling. Several bibliometric laws are also covered, such as Bradford's law of scattering, Lotka's law, and Zipf's law. The document concludes by discussing the foundations and uses of bibliometric studies, which include measuring relationships between authors, journals, and fields as well as research evaluation and management.
This document provides an overview and summary of the Web of Science database. It discusses that Web of Science is a platform consisting of literature search databases designed to support scientific research. It was envisioned by Eugene Garfield in the 1960s to connect scientists and scholars globally across disciplines. The document outlines the scope and impact of Web of Science, including that it indexes over 20,000 peer-reviewed journals. It also summarizes the specific databases subscribed to by the AUI Library, including the Web of Science Core Collection, MEDLINE, and SciELO Citation Index. Finally, it briefly describes some of the analysis and metric tools available through Web of Science, such as citation mapping and InCites journal metrics.
This document provides an overview of citation indexing and describes some key tools and concepts. Citation indexing traces the use of ideas across research by identifying papers that cite older publications. The Institute for Scientific Information pioneered citation indexing databases like the Web of Science. While comprehensive, the WoS has limitations in coverage of non-English language and developing world journals. The Indian Citation Index was created to index more Indian publications and support research evaluation in India. Impact factors are calculated based on citations in the Journal Citation Reports to measure journal influence.
Informetrics is the study of quantitative aspects of information and involves the measurement of information phenomena and the application of mathematical methods to information problems. It encompasses fields like bibliometrics, scientometrics, and webometrics. Informetrics is an interdisciplinary field that is used in library management, the sociology and history of science, science policy, and information retrieval. It examines information flows and follows informetric laws like Bradford's law of scattering, Lotka's law, and Zipf's law. Informetrics offers opportunities for information professionals and is becoming a scientific discipline.
Web of Science and Scopus: Understanding the Indexing System, by Dr. Sharad Chand
This article explains the Web of Science and Scopus indexing databases and their quality measures, providing basic insight into how to select a good-quality journal for publication.
POPSI (Postulate based permuted subject indexing) is a pre-coordinate indexing system developed by G. Bhattacharyya that uses an analytic-synthetic method and permutation of terms to approach documents from different perspectives. It is based on Ranganathan's postulates and classification principles. POPSI helps formulate subject headings, derive index entries, determine subject queries, and formulate search strategies. The main POPSI table contains notation used in the indexing process. Key steps include analysis, formalization, modulation, standardization, and generating organized and associative classification entries and references.
In the professional literature, the term consortia covers a wide range of cooperation situations and expressions, according to the different contexts and periods in which they were formed and developed. Cooperation has long been a precept of librarianship; however, libraries did not use it widely until about the 1980s. The prime reason for establishing a library consortium is to share physical resources between member organizations. Through consortia, libraries coordinate their purchasing to offer the best quality and quantity of resources to their patrons at a lower cost, and to make those resources available on users’ desktops. “Consortium purchasing is assisting libraries to deal with the increasing pressure of diminishing budgets, increasing user demand, and rising journal cost.”
This document discusses journal impact factors and how they are calculated. The Institute for Scientific Information (ISI) defines top journals and tracks the number of citations to their articles in their Journal Citation Reports (JCR). A journal's impact factor is calculated by dividing the number of citations in the current year to articles from the previous two years by the total number of articles published in the previous two years. Impact factors help researchers identify important journals in their field.
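The two-year calculation described above can be written out directly. The citation and article counts below are invented for illustration, not taken from any real journal.

```python
# Hypothetical counts for one journal in the current JCR year.
citations_to_prev_two_years = 450  # citations this year to items from the prior two years
items_prev_two_years = 150         # citable items published in those two years

# Impact factor = citations received / citable items, over the 2-year window.
impact_factor = citations_to_prev_two_years / items_prev_two_years
print(impact_factor)  # 3.0
```

So an impact factor of 3.0 means that, on average, each article from the previous two years was cited three times in the current year.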
This presentation discusses citation indexes and the h-index metric for measuring research impact. It begins by defining a citation index as an index of citations between publications, allowing users to see which documents cite earlier works. The presentation then covers: how to calculate the h-index using citations and publications; tools to find your own h-index such as Google Scholar and Publish or Perish; tips to increase your h-index like publishing in journals with high citation rates and citing your own work; and major citation indexes including the Science Citation Index, Social Sciences Citation Index, and Journal Citation Reports.
This document provides an introduction to open access resources for participants. It begins with welcoming the participants and laying out the structure of the paper. The paper will discuss the meaning and definitions of open access resources, their importance and types, the open access movement, the role of librarians, advantages, and conclusions. It introduces how the internet is transforming libraries and the emergence of electronic documents. Open access resources are knowledge resources made freely available online without subscription fees or access charges.
The h-index is a metric used to characterize both the productivity and impact of a researcher's publications. It is defined as the number of papers (h) that have been cited at least h times each. The h-index takes into account both the number of publications and the number of citations received. Several research databases, including Scopus, Web of Science, and Google Scholar, will calculate a researcher's h-index.
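The definition above translates directly into code: sort citation counts in descending order and find the largest rank h where the h-th paper still has at least h citations. The citation counts below are hypothetical.

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    h = 0
    for rank, cites in enumerate(sorted(citation_counts, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations each
```

Note how the metric caps the influence of any single paper: one highly cited article raises h by at most one, which is why the g-index and other variants exist.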
The presentation discusses theses, research papers, review articles, and technical reports: organization of theses and reports, formatting issues, citation methods, references, and effective oral presentation of research. It also covers quality indices of research publications: impact factor, immediacy index, h-index, and other citation indices. Verbal consent of Prof. Dr. C. B. Bhatt was obtained (at 4:15 pm on 26-11-2016 in Hall A-2, GTU, Chandkheda) to post the presentation online for the benefit of the research scholar community.
Scopus is Elsevier’s abstract and citation database, launched in 2004. It covers some 36,377 titles from approximately 11,678 publishers, of which 34,346 are peer-reviewed journals in four top-level subject fields: life sciences, social sciences, physical sciences, and health sciences.
Journal impact measures: the Impact Factor, by Torres Salinas
The seminar on impact measures will first shed light on the best-known and most controversial indicator, namely Garfield’s Journal Impact Factor. Its strengths and weaknesses, as well as its correct use, will be discussed thoroughly. Moreover, the corresponding analytical tool, Clarivate Analytics’ Journal Citation Reports, will be demonstrated.
Presented at the European Summer School for Scientometrics (ESSS), July 16, 2019, Louvain.
Impact factor of a journal as per Journal Citation Reports, SNIP, SJR, IPP, Cite..., by Omprakash Saini
The document discusses several metrics for evaluating journals:
- CiteScore measures citations received over a 3-year period divided by the number of items published in Scopus over that period.
- Impact Factor, from Journal Citation Reports, measures average citations over a 2-year period.
- SNIP accounts for differences in citation behavior between fields using a source-normalization approach.
- SJR measures influence based on citations weighted by the prestige of the citing journals over 3 years.
- Impact per Publication calculates citations in a year divided by the number of publications in the prior 3 years.
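At heart, several of these indicators are windowed citation averages that differ mainly in window length (SNIP and SJR then add field normalization or prestige weighting on top). A minimal sketch with hypothetical numbers:

```python
def windowed_average(citations_in_year, items_in_window):
    """Citations received in the target year divided by the number of items
    published in the preceding window (2 years for the Impact Factor,
    3 years for CiteScore and Impact per Publication)."""
    return citations_in_year / items_in_window

# Hypothetical journal: 2-year window for JIF, 3-year window for CiteScore.
jif = windowed_average(300, 100)        # 3.0
citescore = windowed_average(420, 150)  # 2.8
print(jif, citescore)
```

Because the window lengths and source databases differ, the two numbers for the same journal are not directly comparable; they should be compared only within the same metric.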
The document discusses the use of bibliometric data and citation metrics to evaluate research performance and support decision making. It notes the increasing importance of demonstrating research impact and return on investment. Thomson Reuters products like the Journal Citation Reports and Web of Science are positioned as providing objective citation and bibliometric data to help with research assessment and evaluation exercises. The document also provides examples of how this data can be used to analyze the research performance of institutions and individuals.
This document provides guidance on selecting an appropriate journal to publish research. It discusses factors to consider like the paper's content, intended audience, and journal scope. It also covers differences between indexed and non-indexed journals, as well as open access and subscription models. Metrics for evaluating journals are defined, including impact factor, eigenfactor, h-index, and quartiles. The differences between Scopus and Web of Science databases are outlined. Tools for preliminary journal searches like Ulrich's and journal finder databases are recommended. The presentation emphasizes understanding journal metrics and selection criteria before submitting to ensure matching research with a suitable publication outlet.
The document provides information about the Journal Citation Reports (JCR) from Clarivate. It discusses what the JCR is, how it can be used by publishers, librarians, researchers and data scientists, and some of the metrics it includes like impact factor, immediacy index, and cited half-life. It also summarizes some strategies for publishing, including aiming for high ranked journals, journals that are cited for a long time or quickly, and internationally recognized or government accredited journals. Key points are that context is important when using metrics, and the JCR can help evaluate journals and find related publications.
The document provides information about the Journal Citation Reports (JCR) from Clarivate. It discusses what the JCR is, how it can be used by publishers, librarians, researchers and data scientists, and some of the metrics it includes like impact factor, immediacy index, and cited half-life. It also summarizes some strategies for publishing, including aiming for high ranked journals, journals that are cited for a long time or quickly, and internationally recognized or government accredited journals. Key points are that context is important when using metrics, and the JCR can help evaluate journals and find related publications.
Quality Assurance for Journal GuidanceSmriti Arora
Definitions
What is the need for quality assurance in journals ?
Type of journals
Bibliometric indicators
How to identify credible journals ?
Predatory/cloned journals
This document provides information about accessing and using Journal Citation Reports (JCR) through Web of Science (WoS). JCR offers quantitative tools to evaluate and rank journals. It uses citation data from over 20,000 journals to demonstrate the most influential journals in different fields and categories. WoS provides access to JCR, allowing users to find journal impact factors and rankings. The document outlines how to create JCR reports for specific journals or browse categories to find the most impactful journals in different subject areas. It also explains several common metrics for measuring journal impact, including total citations, journal impact factor, and Eigenfactor score.
This document provides an overview of various bibliometric tools and metrics for measuring scientific output and impact. It discusses journal ranking metrics like impact factor, eigenfactor, SNIP, and SJR. It also covers article-level metrics including F1000 factors and citation analysis tools from Google Scholar, Web of Science, and Scopus. Additionally, it introduces author-level metrics such as the h-index and its variants that can be calculated using various databases and tools. Finally, the document briefly discusses altmetrics and ways to track scholarly impact on social media and the open web.
The document provides information about the Journal Citation Reports (JCR) and how journal metrics like the Journal Impact Factor (JIF) are calculated and presented. It discusses:
- How the JIF is calculated based on citations in the current year to articles from the previous two years, divided by total citable items published in the previous two years.
- Changes in the 2023 JCR release including expanding JIFs to more indexes and reporting JIFs to one decimal place instead of three to remove false precision.
- How ties in JIF values will impact category ranks and quartile distributions with the change to one decimal place.
- The calculation of the new Journal Citation Indicator
Presented to members of the Psychology department as part of the New Tricks Seminar series (February 2016)
• journal metrics using WoS and Scopus
• article level metrics in WoS, Scopus and Google Scholar, and from publishers and the differences in each. Touch on altmetrics.
• author metrics in the above. Touch on Publish or Perish
Tanya Williamson, Academic Liaison Librarian
This document provides an overview of metrics and tools used to assess scholarly impact and discusses current issues and trends. It covers journal-level metrics like journal impact factor and eigenfactor scores, article-level metrics including citations and altmetrics, and author-level metrics like the h-index. It explains how and where to find these various metrics, and highlights some alternative new metrics and changing expectations around research assessment and demonstrating broader impact.
A presentation delivered online to the Mountain Plains Management Conference at Cedar City, UT on Oct. 18, 2013.
Presented by: Jon Ritterbush of the Calvin T. Ryan Library at the University of Nebraska-Kearney.
This document discusses factors to consider when choosing where to publish research, including journal quality metrics and indexing. It covers:
1. Tools to help find an appropriate journal, including JournalGuide, JournalFinder, and others that search databases like Web of Science, Scopus, and Springer.
2. Common journal quality metrics like Journal Impact Factor (JIF), CiteScore, SCImago Journal Rank (SJR), Eigenfactor, and SNIP, and when each may be most useful.
3. Author impact metrics like the h-index, m-index, and i10-index to evaluate individual researcher impact over time and career stage.
Finding Journal Impact Factor using Journal Citation Reports Andiswa Mfengu
Journal Citation Reports - InCites provides journal impact factors and rankings across various disciplines. It collects citation data and calculates metrics like impact factor, immediacy index, and half-life for journals indexed in Web of Science. Impact factor is a measure of how frequently the average article in a journal is cited in a given year. It is calculated by dividing the number of times articles published in the last two years were cited by the total number of articles published in those two years. The document provides instructions on how to access Journal Citation Reports through the library database to search for journals, compare metrics between journals, and view trends in journal-level indicators over time.
The binding of cosmological structures by massless topological defectsSérgio Sacani
Assuming spherical symmetry and weak field, it is shown that if one solves the Poisson equation or the Einstein field
equations sourced by a topological defect, i.e. a singularity of a very specific form, the result is a localized gravitational
field capable of driving flat rotation (i.e. Keplerian circular orbits at a constant speed for all radii) of test masses on a thin
spherical shell without any underlying mass. Moreover, a large-scale structure which exploits this solution by assembling
concentrically a number of such topological defects can establish a flat stellar or galactic rotation curve, and can also deflect
light in the same manner as an equipotential (isothermal) sphere. Thus, the need for dark matter or modified gravity theory is
mitigated, at least in part.
The debris of the ‘last major merger’ is dynamically youngSérgio Sacani
The Milky Way’s (MW) inner stellar halo contains an [Fe/H]-rich component with highly eccentric orbits, often referred to as the
‘last major merger.’ Hypotheses for the origin of this component include Gaia-Sausage/Enceladus (GSE), where the progenitor
collided with the MW proto-disc 8–11 Gyr ago, and the Virgo Radial Merger (VRM), where the progenitor collided with the
MW disc within the last 3 Gyr. These two scenarios make different predictions about observable structure in local phase space,
because the morphology of debris depends on how long it has had to phase mix. The recently identified phase-space folds in Gaia
DR3 have positive caustic velocities, making them fundamentally different than the phase-mixed chevrons found in simulations
at late times. Roughly 20 per cent of the stars in the prograde local stellar halo are associated with the observed caustics. Based
on a simple phase-mixing model, the observed number of caustics are consistent with a merger that occurred 1–2 Gyr ago.
We also compare the observed phase-space distribution to FIRE-2 Latte simulations of GSE-like mergers, using a quantitative
measurement of phase mixing (2D causticality). The observed local phase-space distribution best matches the simulated data
1–2 Gyr after collision, and certainly not later than 3 Gyr. This is further evidence that the progenitor of the ‘last major merger’
did not collide with the MW proto-disc at early times, as is thought for the GSE, but instead collided with the MW disc within
the last few Gyr, consistent with the body of work surrounding the VRM.
The technology uses reclaimed CO₂ as the dyeing medium in a closed loop process. When pressurized, CO₂ becomes supercritical (SC-CO₂). In this state CO₂ has a very high solvent power, allowing the dye to dissolve easily.
Mending Clothing to Support Sustainable Fashion_CIMaR 2024.pdfSelcen Ozturkcan
Ozturkcan, S., Berndt, A., & Angelakis, A. (2024). Mending clothing to support sustainable fashion. Presented at the 31st Annual Conference by the Consortium for International Marketing Research (CIMaR), 10-13 Jun 2024, University of Gävle, Sweden.
Immersive Learning That Works: Research Grounding and Paths ForwardLeonel Morgado
We will metaverse into the essence of immersive learning, into its three dimensions and conceptual models. This approach encompasses elements from teaching methodologies to social involvement, through organizational concerns and technologies. Challenging the perception of learning as knowledge transfer, we introduce a 'Uses, Practices & Strategies' model operationalized by the 'Immersive Learning Brain' and ‘Immersion Cube’ frameworks. This approach offers a comprehensive guide through the intricacies of immersive educational experiences and spotlighting research frontiers, along the immersion dimensions of system, narrative, and agency. Our discourse extends to stakeholders beyond the academic sphere, addressing the interests of technologists, instructional designers, and policymakers. We span various contexts, from formal education to organizational transformation to the new horizon of an AI-pervasive society. This keynote aims to unite the iLRN community in a collaborative journey towards a future where immersive learning research and practice coalesce, paving the way for innovative educational research and practice landscapes.
Describing and Interpreting an Immersive Learning Case with the Immersion Cub...Leonel Morgado
Current descriptions of immersive learning cases are often difficult or impossible to compare. This is due to a myriad of different options on what details to include, which aspects are relevant, and on the descriptive approaches employed. Also, these aspects often combine very specific details with more general guidelines or indicate intents and rationales without clarifying their implementation. In this paper we provide a method to describe immersive learning cases that is structured to enable comparisons, yet flexible enough to allow researchers and practitioners to decide which aspects to include. This method leverages a taxonomy that classifies educational aspects at three levels (uses, practices, and strategies) and then utilizes two frameworks, the Immersive Learning Brain and the Immersion Cube, to enable a structured description and interpretation of immersive learning cases. The method is then demonstrated on a published immersive learning case on training for wind turbine maintenance using virtual reality. Applying the method results in a structured artifact, the Immersive Learning Case Sheet, that tags the case with its proximal uses, practices, and strategies, and refines the free text case description to ensure that matching details are included. This contribution is thus a case description method in support of future comparative research of immersive learning cases. We then discuss how the resulting description and interpretation can be leveraged to change immersion learning cases, by enriching them (considering low-effort changes or additions) or innovating (exploring more challenging avenues of transformation). The method holds significant promise to support better-grounded research in immersive learning.
The cost of acquiring information by natural selectionCarl Bergstrom
This is a short talk that I gave at the Banff International Research Station workshop on Modeling and Theory in Population Biology. The idea is to try to understand how the burden of natural selection relates to the amount of information that selection puts into the genome.
It's based on the first part of this research paper:
The cost of information acquisition by natural selection
Ryan Seamus McGee, Olivia Kosterlitz, Artem Kaznatcheev, Benjamin Kerr, Carl T. Bergstrom
bioRxiv 2022.07.02.498577; doi: https://doi.org/10.1101/2022.07.02.498577
EWOCS-I: The catalog of X-ray sources in Westerlund 1 from the Extended Weste...Sérgio Sacani
Context. With a mass exceeding several 104 M⊙ and a rich and dense population of massive stars, supermassive young star clusters
represent the most massive star-forming environment that is dominated by the feedback from massive stars and gravitational interactions
among stars.
Aims. In this paper we present the Extended Westerlund 1 and 2 Open Clusters Survey (EWOCS) project, which aims to investigate
the influence of the starburst environment on the formation of stars and planets, and on the evolution of both low and high mass stars.
The primary targets of this project are Westerlund 1 and 2, the closest supermassive star clusters to the Sun.
Methods. The project is based primarily on recent observations conducted with the Chandra and JWST observatories. Specifically,
the Chandra survey of Westerlund 1 consists of 36 new ACIS-I observations, nearly co-pointed, for a total exposure time of 1 Msec.
Additionally, we included 8 archival Chandra/ACIS-S observations. This paper presents the resulting catalog of X-ray sources within
and around Westerlund 1. Sources were detected by combining various existing methods, and photon extraction and source validation
were carried out using the ACIS-Extract software.
Results. The EWOCS X-ray catalog comprises 5963 validated sources out of the 9420 initially provided to ACIS-Extract, reaching a
photon flux threshold of approximately 2 × 10−8 photons cm−2
s
−1
. The X-ray sources exhibit a highly concentrated spatial distribution,
with 1075 sources located within the central 1 arcmin. We have successfully detected X-ray emissions from 126 out of the 166 known
massive stars of the cluster, and we have collected over 71 000 photons from the magnetar CXO J164710.20-455217.
When I was asked to give a companion lecture in support of ‘The Philosophy of Science’ (https://shorturl.at/4pUXz) I decided not to walk through the detail of the many methodologies in order of use. Instead, I chose to employ a long standing, and ongoing, scientific development as an exemplar. And so, I chose the ever evolving story of Thermodynamics as a scientific investigation at its best.
Conducted over a period of >200 years, Thermodynamics R&D, and application, benefitted from the highest levels of professionalism, collaboration, and technical thoroughness. New layers of application, methodology, and practice were made possible by the progressive advance of technology. In turn, this has seen measurement and modelling accuracy continually improved at a micro and macro level.
Perhaps most importantly, Thermodynamics rapidly became a primary tool in the advance of applied science/engineering/technology, spanning micro-tech, to aerospace and cosmology. I can think of no better a story to illustrate the breadth of scientific methodologies and applications at their best.
Farming systems analysis: what have we learnt?.pptx
Citation analysis
1. Citation analysis:
Web of science,
scopus
Masoud Mohammadi
Medical Informationist
Golestan University of Medical Sciences
Mohammadi.M
2. Citation Analysis
• Citation analysis is the study of the impact and assumed quality of
an article, an author, or an institution based on the number of
times works and/or authors have been cited by others
• Citation analysis is the examination of the frequency, patterns, and
graphs of citations in documents. It uses the pattern of citations,
links from one document to another document, to reveal
properties of the documents. A typical aim would be to identify
the most important documents in a collection. A classic example is
that of the citations between academic articles and books.[1][2]
The judgements produced by judges of law to support their
decisions refer back to judgements made in earlier cases, so citation
analysis in a legal context is important. Another example is
provided by patents, which contain prior art: citations of earlier
patents relevant to the current claim.
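The core counting idea above can be sketched with a toy citation graph; the papers and citation links are invented for illustration:

```python
from collections import Counter

# Hypothetical citation graph: each key cites the documents in its list.
citations = {
    "paper_A": ["paper_C", "paper_D"],
    "paper_B": ["paper_C"],
    "paper_C": ["paper_D"],
    "paper_E": ["paper_C", "paper_A"],
}

# Times-cited count: how often each document appears in others' reference lists.
times_cited = Counter(ref for refs in citations.values() for ref in refs)

# The most important (most-cited) document in this small collection:
most_cited, count = times_cited.most_common(1)[0]
print(most_cited, count)  # prints: paper_C 3
```

Real citation databases apply exactly this aggregation, only over millions of indexed records rather than a hand-built dictionary.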
3. Citation Databases
• Citation databases are databases that have been developed
for evaluating publications. The citation databases enable you
to count citations and check, for example, which articles or
journals are the most cited ones
• In a citation database you get information about who has cited
an article and how many times an author has been cited. You
can also list all articles citing the same source.
• The most important citation databases are
• “Web of Science”,
• “Scopus”
• “Google Scholar”
4. Web of Science
• Web of Science is owned and produced by Thomson Reuters (now Clarivate Analytics). WoS is composed of
three databases containing citations from international scientific journals:
• Arts & Humanities Citation Index - AHCI
• Social Sciences Citation Index - SSCI
• Science Citation Index - SCI
• Journal Coverage:
• Aims to include the best journals of all fields. Included journals are for example:
European Journal of Marketing, Journal of Finance, Strategic Management Journal
• Citation Coverage:
• Includes citations starting from the year 1945
• Citations can be counted in a simple or complex manner, with different results
• The Citation Report is easy to generate, but only counts references to articles that
are also in the database.
• Searching based on authors by using Cited Reference Search is more cumbersome,
but will also return references to items outside of the database, including articles,
conference proceedings, books, reports etc.
5. Web of Science – What is a citation?
[Diagram: a 2004 source article linked to its Cited References (earlier publications), the later articles in which it is Times Cited, and Related Records that share references with it.]
7. Journal Citation Reports®
• The Journal Citation Reports module within InCites allows you to
evaluate and compare journals using citation data drawn from
approximately 12,000 scholarly and technical journals and
conference proceedings from more than 3,300 publishers in over
60 countries. Journal Citation Reports is the only source of
citation data on journals, and includes virtually all specialties in
the areas of science, technology, and social sciences.
• JCR provides dynamic, interactive features supported by
visualizations of JCR indicators that allow researchers and
librarians, as well as publishers, to compare, evaluate and inform
their decisions on published research in varying fields.
8. Key features in the Journal Citation
Reports allow you to:
• Focus on desired subject categories, enabling you to
review journal titles and key performance indicators
in the category;
• Compare multiple journals based on a chosen
indicator;
• Evaluate the performance of journals in which you or
your organization has published research;
• Recognize trending journals in key research
categories;
• Identify the ideal journal in which to publish your
forthcoming research;
10. Below is only a simplified explanation of the metrics. For definitions and details, click the help in the JCR journal report pages.
Total Cites: Total number of citations to this journal in the JCR year
Impact Factor: On average, how many times an article in this journal is cited, based on articles published in the two previous years
5-Year Impact Factor: On average, how many times an article in this journal is cited, based on articles published in the five previous years
Immediacy Index: On average, how many times an article in this journal is cited in the same year it is published, based on last year's data (reflects more about the nature of the subject than journal quality)
Cited Half-Life: Indicates how far back the older articles in this journal are still being cited (reflects more about the nature of the subject than journal quality)
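As a worked illustration of the two-year Impact Factor definition above, with invented counts for a hypothetical journal:

```python
# Two-year Journal Impact Factor, using made-up numbers:
# citations received in the JCR year to items published in the
# two previous years, divided by citable items from those two years.
citations_to_prior_two_years = 450
citable_items_year_1 = 120
citable_items_year_2 = 130

impact_factor = citations_to_prior_two_years / (citable_items_year_1 + citable_items_year_2)
print(round(impact_factor, 1))  # prints: 1.8
```

The 5-Year Impact Factor is the same ratio computed over a five-year publication window.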
11. Citing Half-Life: Indicates how recent or how old the references cited by articles in this journal are (reflects more about the nature of the subject than journal quality)
Eigenfactor Score: A kind of enhanced 5-year impact factor that gives a higher score for citations from more influential journals and eliminates self-citations
Article Influence Score: Derived from the Eigenfactor Score by dividing the journal's share of total influence by its share of total articles. The neutral score is 1.00, so an Article Influence Score greater than 1.00 indicates that each article in the journal has above-average influence, and vice versa
12. Quartile Comparison (Q)
• Quartile Comparisons enable users to compare
various quartile ranks from metrics for chosen
journals within a given subject category. Users
can see how each journal ranks within a given
quartile, compared with other journals of their
choosing, so long as each is categorized within
the same subject. The ability to select multiple
quartile metrics simultaneously allows for a
comprehensive view of how each journal ranks
within metrics for a given year.
16. The data grid will always be the area that displays user's choices. The default view is
the categories, ranked by number of journals (to correspond to the default
visualization). This list can be sorted or customized. For users signed in with UNP, these
selections are preserved from session to session.
JOURNAL CITATION REPORTS
CATEGORIES BY RANK
17. On Categories by Rank, this is the left
navigation bar:
• "Go to Journal Profile" is a master search
tool.
• "Select Journals" and "Select Categories"
allow the user to filter the data grid based
on journal title or category name. Please
note that in the Categories by Rank
section, the resulting data grid will contain
category-level data and the visualization
will also display at the category level.
• "Select JCR Year" allows users to choose
the year of the category-level data they
wish to view.
• "Select Edition" allows users to choose
which edition, Science Citation Index-Expanded
or Social Science Citation Index,
they wish to browse.
• Clicking "Clear" will restore the default
view; clicking "Submit" will submit the
selections and refresh the data grid.
JOURNAL CITATION REPORTS
CATEGORIES BY RANK – filtering options
18. Clicking any hyperlinked value will pop up a window displaying the calculations behind that value.
Clicking on "Graph" in any column will bring up a graph of the relevant indicator or data point in
the space below the table.
JOURNAL CITATION REPORTS
CATEGORIES BY RANK
19. • Many filtering options (Journal Profile,
Journals, Categories, JCR Year, & Edition)
are the same as on the Categories by Rank
page.
• “Compare Journals” will take users to the
Compare Journals component.
• “View Title Changes” shows the title
changes for the selected JCR Year.
• “Select Category Scheme” allows users to
view ranked lists of journals using either
the WOS or ESI category scheme.
• “JIF Quartile” allows users to choose
which quartile’s journals to view.
• “Select Publisher” and “Select
Country/Territory” allow users to filter
based on publisher or country of origin.
• “Impact Factor Range” allows users to
choose their preferred range of Impact
Factor via a drop-down list.
JOURNAL CITATION REPORTS
JOURNAL BY RANK – filtering
20. Search for specific journals to create a
custom list.
Auto-completes any title you type in
JOURNAL CITATION REPORTS
JOURNAL BY RANK – filtering
21. Select a subject category or categories to see all
journals
JOURNAL CITATION REPORTS
JOURNAL BY RANK – filtering
22. Limit your analysis to only those journals in
certain quartiles of their subject area
Or limit to a specific range of
Impact Factors
JOURNAL CITATION REPORTS
JOURNAL BY RANK – filtering
23. The top of the Journal Profile Page contains the name of the journal, publishing
information, title information, category listings (hovering over the category name will
pop up Scope Notes for that category), languages, publication frequency, and whether
or not that journal is an Open Access (OA) title. Links to Current Contents Connect and
Ulrich's also appear in this section.
JOURNAL CITATION REPORTS
JOURNAL BY RANK – journal profile page
24. • Directly below the journal information is a table containing all of the key
indicators for that journal. This table contains data for all the years of
coverage. For years the journal was not covered or was suppressed, data
columns are marked as "Not Available." Also, data columns may indicate
"Not Available" if the particular indicator had not yet been included in
JCR.
JOURNAL CITATION REPORTS
JOURNAL BY RANK – key indicators
25. Analysis Tools
Analyze Results
extracts citation data from a selected field (e.g. source title,
country, author) and produces a report showing the values
in ranked order
Create Citation Report
shows aggregate citation statistics for a set of search results,
e.g. the breakdown of citations over the years and the average
citations per year
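The Analyze Results behaviour described above, ranking the values of a chosen field, can be sketched as follows; the records and source titles are invented for illustration:

```python
from collections import Counter

# Hypothetical search results: one record per article, tagged with its source title.
records = [
    {"title": "A", "source": "Journal of Informetrics"},
    {"title": "B", "source": "Scientometrics"},
    {"title": "C", "source": "Scientometrics"},
    {"title": "D", "source": "Journal of Informetrics"},
    {"title": "E", "source": "Scientometrics"},
]

# Rank source titles by number of records, as Analyze Results does for a chosen field.
ranked = Counter(r["source"] for r in records).most_common()
for source, n in ranked:
    print(f"{source}: {n}")
```

Swapping `"source"` for another field key (country, author) gives the corresponding ranked report.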
27. You can rank the search results with different field options, e.g. source titles
e.g. Rank the journals by
number of articles on
this topic
1. Analyze Results
Max number of items to display:
top 500 results
29. Scopus
• Scopus is owned and produced by Elsevier.
• Journal Coverage:
• Indexes more journals than Web of Science, including some
conference proceedings, which results in better coverage for
Economics and IT. Includes journals such as International Journal
of Entrepreneurship and Innovation Management, Human
Resource Management Journal, Journal of Economics and
Business, Journal of International Financial Markets, Institutions
and Money.
• Citation Coverage:
• Only indexes citations starting from 1996
• Only counts references to publications that are also in the
database
• Use Citation Tracker to generate analyses on up to 5000
publications
• Citations per year and per publication (with or without self-citations), h-index
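Since the slide mentions the h-index, here is a minimal sketch of how it is computed: the largest h such that the author has h publications with at least h citations each. The citation counts are invented:

```python
def h_index(citation_counts):
    """Largest h such that h papers have at least h citations each."""
    counts = sorted(citation_counts, reverse=True)
    h = 0
    for rank, cites in enumerate(counts, start=1):
        if cites >= rank:
            h = rank  # the paper at this rank still has enough citations
        else:
            break
    return h

# Four papers with at least 4 citations each, but not five with at least 5.
print(h_index([25, 8, 5, 4, 3, 1]))  # prints: 4
```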
30. SCImago
• The SCImago Journal & Country Rank is a publicly available
portal that includes the journals and country scientific
indicators developed from the information contained in the
Scopus® database (Elsevier B.V.). These indicators can be
used to assess and analyze scientific domains. Journals can
be compared or analysed separately. Country rankings may
also be compared or analysed separately. Journals can be
grouped by subject area (27 major thematic areas), subject
category (313 specific subject categories) or by country.
Citation data is drawn from over 21,500 titles from more
than 5,000 international publishers and country performance
metrics from 239 countries worldwide. The SJR portal also allows
you to embed significant journal metrics into your website as a
clickable image widget.
32. CiteScore
• Calculating CiteScore is simple and is based on the average
citations received per document. CiteScore is the number of
citations received by a journal in one year to documents
published in the three previous years, divided by the number
of documents indexed in Scopus published in those same
three years.
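Using invented numbers, the CiteScore calculation described above works out as:

```python
# CiteScore sketch with made-up numbers: citations received in the
# current year to documents published in the three previous years,
# divided by the number of documents indexed in those same three years.
citations_received = 900
documents_prior_three_years = 300

cite_score = citations_received / documents_prior_three_years
print(cite_score)  # prints: 3.0
```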
33. SJR (SCImago Journal Rank)
• SJR is weighted by the prestige of a journal. Subject field,
quality, and reputation of the journal have a direct effect
on the value of a citation.
• SJR assigns relative scores to all of the sources in a
citation network. Its methodology is inspired by the
Google PageRank algorithm, in that not all citations are
equal. A source transfers its own 'prestige', or status, to
another source through the act of citing it. A citation
from a source with a relatively high SJR is worth more
than a citation from a source with a lower SJR.
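The PageRank-style intuition above can be sketched with a toy power iteration. The graph, damping factor, and iteration count here are illustrative assumptions, not SJR's actual model:

```python
# Toy sketch of the PageRank-style idea behind SJR: prestige flows along
# citation links, so a citation from a high-prestige source counts for more.
def prestige(graph, damping=0.85, iterations=50):
    nodes = list(graph)
    score = {n: 1.0 / len(nodes) for n in nodes}
    for _ in range(iterations):
        new = {}
        for n in nodes:
            # Prestige transferred to n by every source that cites it,
            # divided by each source's number of outgoing citations.
            incoming = sum(score[m] / len(graph[m]) for m in nodes if n in graph[m])
            new[n] = (1 - damping) / len(nodes) + damping * incoming
        score = new
    return score

# Journal A is cited by both B and C, so it ends up with the highest score.
graph = {"A": ["B"], "B": ["A"], "C": ["A"]}
scores = prestige(graph)
print(max(scores, key=scores.get))  # prints: A
```

Note how B outranks C even though both receive citations from one source each: B's citation comes from the high-prestige journal A.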
34. SNIP (Source Normalized
Impact per Paper)
• SNIP measures a source’s contextual citation impact by weighting
citations based on the total number of citations in a subject field. It
helps you make a direct comparison of sources in different subject
fields.
• SNIP takes into account characteristics of the source's subject field,
which is the set of documents citing that source. SNIP especially
considers
• the frequency at which authors cite other papers in their
reference lists
• the speed at which citation impact matures
• the extent to which the database used in the assessment covers
the field’s literature
• SNIP is the ratio of a source's average citation count per paper and
the citation potential of its subject field.
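The ratio in the last bullet can be illustrated with invented numbers; the actual citation-potential calculation is more involved than this sketch:

```python
# SNIP sketch: a source's average citations per paper divided by the
# citation potential of its subject field (both values made up here).
raw_impact_per_paper = 4.0      # average citations per paper for the source
field_citation_potential = 2.0  # expected citations per paper in its field

snip = raw_impact_per_paper / field_citation_potential
print(snip)  # prints: 2.0
```

A SNIP above 1 means the source is cited more than would be expected for its field, which is what makes cross-field comparison possible.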