The Journal Citation Indicator is a new way to measure the citation impact of a journal’s recent
publications using a field-normalized calculation. This new approach provides a single value
that is easy to interpret and compare, complementing current journal metrics and further
supporting responsible use. Starting from the 2021 JCR release, it will be calculated for all
journals in the Web of Science Core Collection™.
Background
Since the publication of the first Journal Citation Reports (JCR)™ in 1976, the Journal Impact
Factor (JIF)™ has become a standard way to measure the citation impact of a journal. The JCR was
created to describe and define the network of journals as an aggregate of the article citation
network in the Science Citation Index™. It was intended to provide an objective measure
regarding scholarly use of journals to support both libraries and authors in publication
evaluation. The utility of the JIF expanded beyond these original purposes into other areas of
research assessment, helping authors choose where to publish papers and enabling publishers and
editors to monitor the success of their portfolios. Due to the rigorous and independent selection
process used, inclusion in the JCR has also become a hallmark of editorial quality and research
integrity, helping the research community identify trusted sources of scholarly content.
The JIF is simple and easy to calculate – all you need to know is the number of scholarly works
a journal published in the two preceding years (also referred to as citable items) and how many
citations those items received from papers published in the JCR data year. Various factors
influence how many citations can be accumulated, including the typical number of references made
in a paper, the age of the papers referenced, the total number of papers published and even the
meaning of a citation itself. Because of these differences, JIF comparisons should be made within
a category or between adjacent fields.
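The two-year calculation described above can be sketched in a few lines of code. This is a minimal illustration only; the journal figures below are hypothetical, not drawn from any real JCR data.

```python
def journal_impact_factor(citations_in_data_year: int,
                          citable_items_prior_two_years: int) -> float:
    """JIF = citations received in the JCR data year by items the journal
    published in the two preceding years, divided by the number of citable
    items published in those two years."""
    return citations_in_data_year / citable_items_prior_two_years

# Hypothetical journal: 1,200 citations received in 2020 to the 400
# citable items it published across 2018 and 2019.
print(journal_impact_factor(1200, 400))  # → 3.0
```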
In the last 20 years, the bibliometric community has devoted much attention to these issues of
interpretation and comparison, devising more sophisticated ways to measure citation impact
than by counting the number of citations. Among these, normalization has become the de facto
standard – rather than using a citation count as a measure of impact, the citations received by a
paper are compared against a cohort of similar papers and expressed as a ratio or percentile.
Three main factors have been identified that help us determine the relevant cohort:
• Field or discipline – compare papers only to others in an area with similar publication
volume, cited reference counts and cited reference ages
• Publication type – certain publication types, such as review papers, can attract more
citations than others, so they should be compared separately
• Year of publication – older papers will have had more time to accumulate citations
and cannot be compared to more recent papers
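The ratio form of normalization can be sketched as follows. All paper data here are hypothetical, and the cohort is defined simply as papers sharing the same field, document type and publication year – a simplification of the schemes used in practice.

```python
from statistics import mean

# Hypothetical papers: (field, doc_type, year, citations). Each paper's
# cohort is every paper sharing the same field, type and year.
papers = [
    ("Chemistry",   "Article", 2019, 10),
    ("Chemistry",   "Article", 2019, 30),
    ("Chemistry",   "Article", 2019, 20),
    ("Mathematics", "Article", 2019, 2),
    ("Mathematics", "Article", 2019, 4),
]

def cohort_mean(field: str, doc_type: str, year: int) -> float:
    """Average citations across the paper's cohort."""
    return mean(c for f, t, y, c in papers if (f, t, y) == (field, doc_type, year))

def normalized_impact(paper) -> float:
    """Citations expressed as a ratio against the cohort average."""
    field, doc_type, year, citations = paper
    return citations / cohort_mean(field, doc_type, year)

# The chemistry article with 20 citations scores 20/20 = 1.0, while the
# mathematics article with only 4 citations scores 4/3 ≈ 1.33 – the
# lower-citing field's paper ranks higher once normalized.
print(round(normalized_impact(papers[2]), 2))  # → 1.0
print(round(normalized_impact(papers[4]), 2))  # → 1.33
```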
Hence, it is now commonplace to see measures of citation impact expressed as a percentile (as
utilized in the Web of Science™ Author Impact Beamplots) or ratio. Both of these are included in
our analytics product InCites™ and are used in a variety of research evaluation settings to
measure citation impact of papers, individuals, institutions, funders and regions.
Therefore, the natural evolution for a journal citation impact metric is towards a normalized
indicator – one that accounts for variation and provides a number that can be more easily
interpreted and compared across disciplines.
Journal Citation Indicator
The Journal Citation Indicator is a new field-normalized metric that will be calculated for all
journals in the Web of Science Core Collection and will be published in the JCR. The value
represents the average category-normalized citation impact for papers published in the prior
three-year period. For example, the 2020 Journal Citation Indicator will be calculated for journals
that published citable items (i.e. research papers classified as articles or reviews in the Web of
Science) in 2017, 2018 and 2019, counting all citations they received from any document indexed
between 2017 and 2020, as shown in Figure 1.
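The averaging step at the heart of the metric can be sketched as below. The category-normalized citation impact (CNCI) values are hypothetical, and the per-paper normalization that produces them is not shown here.

```python
from statistics import mean

# Hypothetical CNCI values for a journal's citable items published in
# 2017-2019; the Journal Citation Indicator is their mean, so a value
# above 1.0 indicates above-average impact for the category.
cnci_values = [1.2, 0.8, 1.5, 0.9, 1.1]

journal_citation_indicator = mean(cnci_values)
print(round(journal_citation_indicator, 2))  # → 1.1
```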