
FAU Libraries' Scholarly Communication Program: Assessment of Research Impact

Assessment of research impact

"Research impact" typically is defined as the extent to which scholarly research is read, discussed, and used, both inside and outside academe.  Measuring impact is important for promotion and tenure, determining research quality, and to assess potential for grant funding. 

Journal-Level Metrics

Journal-level metrics assign a value (a score or a rank) to a journal. A number of bibliometric indicators focus on measuring the impact of scholarly journals, such as the Journal Impact Factor (JIF), Eigenfactor, CiteScore, SCImago Journal Rank (SJR), and Source Normalized Impact per Paper (SNIP). Depending on the discipline, other journal evaluation criteria can include publishing information, such as rate of acceptance, circulation, and where the journal is indexed (see the "Other Indicators" tab).

The Journal Impact Factor (JIF) is a measure of the frequency with which the "average article" published in a given scholarly journal has been cited in a particular year or period and is often used to measure or describe the importance of a particular journal to its field. 
See Journal Citation Reports: A Primer on the JCR and Journal Impact Factor 

What is the calculation for the Journal Impact Factor?
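For year X, the JIF is the number of citations received in X to items the journal published in (X-1) and (X-2), divided by the number of citable items (articles and reviews) it published in those two years. A minimal sketch in Python; the journal and its counts below are hypothetical:

```python
def journal_impact_factor(citations_to_prior_two_years, citable_items_prior_two_years):
    """JIF for year X = citations received in X to items published in (X-1) and (X-2),
    divided by the citable items (articles and reviews) published in those two years."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# Hypothetical journal: 600 citations in 2017 to its 2015-2016 items,
# which together comprised 150 citable items.
print(journal_impact_factor(600, 150))  # → 4.0
```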


How can I find the Journal Impact Factor for individual journals?

Journal Impact Factors can be found independently of JCR.  For major publishers, Journal Impact Factors can usually be found on the journal's homepage or publisher's site (see example). Often, this information can be found in the section "About the Journal."  The Library's subscription to Web of Science DOES NOT include access to JCR.

What is the underlying data?

Only journals that are selected for the Web of Science Science Citation Index Expanded (SCIE) and the Social Sciences Citation Index (SSCI) will be listed in JCR with Journal Impact Factors.  You can also check Ulrichsweb to see if a journal has been assigned a Journal Impact Factor; "Journal Citation Reports" will be listed under "Key Features." 

Warning: Disreputable journals may falsely display an Impact Factor on their websites. If you are unsure about a publisher, confirm with one of the resources above to see if the journal is included in Journal Citation Reports.

What are the limitations of the Journal Impact Factor?

Clarivate Analytics states, under the section “Misuse of the Journal Impact Factor”:

The JIF was originally conceived as an aid for libraries in deciding which journals to purchase. JIF is a journal-level metric, so it’s not appropriate to use as a proxy measure for any other entity. The fact of a journal being highly cited really tells us little or nothing about the specific authors who have published in that journal. It is more appropriate to use Web of Science or InCites to measure the output and influence of authors, institutions, regions, or documents.
(Journal Citation Reports: A Primer on the JCR and Journal Impact Factor, p.6)

Also see the 2012 San Francisco Declaration on Research Assessment (DORA).

The Eigenfactor, developed by Jevin West and Carl Bergstrom at the University of Washington, is intended to reflect the influence and prestige of journals. Citations from highly ranked journals are weighted to make a larger contribution to the score (i.e., the value of a single publication in a major journal vs. many publications in minor journals).

Where can I find the Eigenfactor?

You can find the Eigenfactor score at the Eigenfactor.org website.

What is the Calculation for the Eigenfactor?

The calculation is based on citations made in a given year to papers published in the prior five years: The Eigenfactor of journal J in year X is defined as the percentage of weighted citations received by J in X to any item published in (X-1), (X-2), (X-3), (X-4), or (X-5), out of the total citations received by all journals in the dataset.
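As a rough illustration of the ratio described above, with hypothetical numbers (the full Eigenfactor algorithm weights citations iteratively across the entire journal citation network, which is omitted here):

```python
def eigenfactor_share(weighted_citations_to_journal, total_weighted_citations):
    """Simplified Eigenfactor framing: journal J's percentage share of all
    weighted citations in the dataset made in year X to items published
    in years (X-1) through (X-5)."""
    return 100 * weighted_citations_to_journal / total_weighted_citations

# Hypothetical: J received 1,200 weighted citations out of 2,000,000 dataset-wide.
print(eigenfactor_share(1200, 2_000_000))  # → 0.06
```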

What is the underlying data?

Like the Journal Impact Factor, the Eigenfactor is based on data held in Clarivate's Journal Citation Reports.



CiteScore is an Elsevier product that calculates the average number of citations received in a calendar year by all items published in that journal in the preceding three years. CiteScore metrics are available for all serial titles indexed in the Scopus title list that have enough data available to calculate the metric. See About CiteScore and its derivative metrics.

Where can I find the CiteScore?
Elsevier's Journal Metrics site tracks CiteScore, SNIP, and SJR. You can refine by subject, title, and year.

What is the Calculation for CiteScore?

The number of citations received by a journal in one year to documents published in the three previous years, divided by the number of documents indexed in Scopus published in those same three years.
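The same ratio, sketched in Python with hypothetical counts:

```python
def citescore(citations_in_year, documents_prior_three_years):
    """CiteScore for year X: citations received in X to documents published in
    (X-1), (X-2), and (X-3), divided by the number of Scopus-indexed documents
    published in those same three years."""
    return citations_in_year / documents_prior_three_years

# Hypothetical journal: 1,500 citations in 2017 to the 500 documents
# it published in 2014-2016.
print(citescore(1500, 500))  # → 3.0
```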

Difference from Journal Impact Factor: CiteScore uses a three-year citation window rather than two, counts all document types in both the numerator and the denominator (the JIF denominator counts only "citable items"), and draws on Scopus rather than Web of Science data.

What is the underlying data?

Scopus indexes nearly 22,000 peer-reviewed journals, trade publications, book series, conference papers, and patents in the scientific, technical, medical, and social sciences (including arts and humanities).

Commentary on CiteScore
The Measure of All Things: Some Notes on CiteScore (The Scholarly Kitchen, 1/11/17). An evaluation posted by independent management consultant Joseph Esposito. 


SCImago Journal & Country Rank. The SCImago Journal Rank (SJR) expresses the average number of weighted citations received in the selected year by the documents published in the selected journal in the three previous years. Much like the Eigenfactor, a citation from an important journal will count more than one coming from a less important journal.

Where can I find the SJR?

Find journal ranks by subject category, region/country, and publication type at SCImago Journal & Country Rank.

What is the calculation for the SJR?

The SJR of journal J in year X is the number of weighted citations received by J in X to any item published in J in (X-1), (X-2) or (X-3), divided by the total number of articles and reviews published in (X-1), (X-2) or (X-3). For more details see Understanding indicators, tables and charts.
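A sketch of the ratio with hypothetical values; the per-citation prestige weights, which SJR derives from a PageRank-style computation over the Scopus citation network, are assumed already aggregated into the numerator:

```python
def sjr(weighted_citations, articles_and_reviews_prior_three_years):
    """SJR of journal J in year X: weighted citations received by J in X to items
    published in (X-1), (X-2), or (X-3), divided by the number of articles and
    reviews J published in those three years."""
    return weighted_citations / articles_and_reviews_prior_three_years

# Hypothetical: 840.0 weighted citations against 300 articles and reviews.
print(sjr(840.0, 300))  # → 2.8
```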

What is the underlying data?

SJR is based on citation data of the more than 20,000 peer-reviewed journals indexed by Scopus from 1996 onwards.


SNIP (Source Normalized Impact per Paper)

SNIP measures a source’s contextual citation impact by weighting citations based on the total number of citations in a subject field. It helps you make a direct comparison of sources in different subject fields. 

Calculation for the SNIP Indicator

SNIP is the ratio of a source's average citation count per paper and the citation potential of its subject field. See the Methodology.
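The ratio can be sketched directly; the values below are hypothetical:

```python
def snip(citations_per_paper, field_citation_potential):
    """SNIP: a source's average citations per paper divided by the citation
    potential of its subject field, so heavily citing fields (e.g., the life
    sciences) do not automatically inflate the score."""
    return citations_per_paper / field_citation_potential

# Hypothetical: 4.5 citations per paper in a field with citation potential 3.0.
print(snip(4.5, 3.0))  # → 1.5
```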

Where do I find SNIP?

Find journal ranks by subject category, title, or publisher at CWTS Journal Indicators. You can also find the SNIP metric at Elsevier's Journal Metrics site.

What is the underlying data?

SNIP is based on citation data of the more than 20,000 peer-reviewed journals indexed by Scopus from 1996 onwards.



Journal h-index is one measure of the quality of a journal and can be calculated using data from Web of Science, Scopus or Google Scholar. As with the impact factor, journal h-index does not take into account differing citation practices of fields (unlike the weighted SJR and SNIP) and so is best used to compare journals within a field. The h-index publication window can be selected to best suit the citation practices of a discipline. 

H-Index Calculation

An entity has an h-index of y if the entity has y publications that have all been cited at least y times (Hodge & Lacasse, 2011).

Where do I find a journal's H-Index?

Google Scholar Metrics. Go to Google Scholar and select Metrics from the left-hand menu. You can browse Classic Papers and Top Publications, or search for an individual journal.



Google Scholar Metrics currently cover articles published between 2012 and 2016, both inclusive. The metrics are based on citations from all articles that were indexed in Google Scholar in June 2017. This also includes citations from articles that are not themselves covered by Scholar Metrics.

Publish or Perish. Using data from Google Scholar, the freely available Publish or Perish software can be used to generate a variety of metrics.

Web of Science

  • From Search, enter the journal title (e.g., "organization studies") and select Publication Name from the drop-down menu
  • Set the desired publication window using the Timespan limit
  • Select Search
  • Check that the target title is the only journal listed under Refine Results > Source Titles in the left-hand column; if not, tick the box next to the target title and select Refine
  • Select Create Citation Report
    The Citation Report reflects citations to source records indexed within a product. See About Citation Report Citations.

Depending on the discipline, other journal evaluation criteria can include publishing information such as rate of acceptance, peer review process, circulation, and where the journal is indexed.

  • Check journal directories to find acceptance rates, time-to-publication information, peer review processes, and where the journal is indexed
  • You can use journal recommender tools to find journal information
  • The author services or "about the journal" pages will also have this information

See Choosing a Journal to find links to these resources.



Author-Level Metrics

Author-level metrics are bibliographic measurements that capture the productivity and cumulative impact of the output of individual authors, researchers, and scholars.  The h-index, proposed by Jorge E. Hirsch (2005), is a flexible, widely-used citation metric applicable to any set of citable documents.  Various databases (e.g., Web of Science) can calculate an h-index, but are likely to produce a different h for the same scholar, since databases vary in content and coverage.      

What is the h-index?
The h-index is a composite measure of the number of articles published by an author (productivity) and the number of citations to those publications (impact).  

How is the h-index used?  

  • grant funding
  • promotion and tenure
  • self-promotion 

What is the underlying data for the h-index? 
Multi-disciplinary databases (e.g., Web of Science and Scopus) use their own tools for tracking, analyzing, and visualizing the h-index according to their respective journal coverage.  Other databases, like Google Scholar (GS) and ResearchGate, also calculate an h-index, using author-created citation profiles.  The software Publish or Perish analyzes for an h-index using data derived from GS citation profiles.

How is the h-index different from other citation metrics?

  • The g-index, suggested by Leo Egghe (2006), is a variation of the h-index and is a measure of the largest number of highly-cited articles (g) for which the average number of citations is at least g.  [Note: See the tab "H-Index Limitations & Modifications" for application of the g-index].
  • The i10-index, developed by and found in Google Scholar, refers to the number of published papers with at least 10 citations.
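Both indexes above can be computed from a plain list of per-paper citation counts; a minimal sketch with a hypothetical publication list:

```python
def g_index(citations):
    """g-index: the largest g such that the top g papers together have at least
    g*g citations (i.e., they average at least g citations each)."""
    total, g = 0, 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        total += cites
        if total >= rank * rank:
            g = rank
    return g

def i10_index(citations):
    """i10-index: the number of papers with at least 10 citations."""
    return sum(1 for c in citations if c >= 10)

papers = [50, 30, 12, 8, 4, 2, 1]  # hypothetical citation counts
print(g_index(papers))    # → 7
print(i10_index(papers))  # → 3
```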

Further reading:
Egghe, L. (2006). Theory and practice of the g-index. Scientometrics 69(1), 131-152.

Hirsch, J. E. (2005). An index to quantify an individual's scientific research output. PNAS 102(46), 16569-16572.


How is the h-index calculated? 
The h-index is calculated manually or by using automatic tools found in subscription databases or in Google Scholar. Each database is likely to produce a different result for the same scholar, since databases vary in scope and years of coverage. 

To calculate the h-index manually, place articles in descending order according to the number of times they have been cited; the h-index is the highest rank at which the citation count is still greater than or equal to the rank.
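The manual procedure translates directly into code; a minimal sketch with hypothetical citation counts:

```python
def h_index(citations):
    """h-index: rank papers by citation count in descending order; h is the
    last rank at which the citation count is still >= the rank."""
    h = 0
    for rank, cites in enumerate(sorted(citations, reverse=True), start=1):
        if cites >= rank:
            h = rank
        else:
            break
    return h

# Hypothetical author with five papers cited 25, 8, 5, 4, and 3 times:
# the 4th-ranked paper has 4 citations (4 >= 4), the 5th has 3 (3 < 5).
print(h_index([25, 8, 5, 4, 3]))  # → 4
```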

For automatic calculation, use a database's citation-tracking functionality, such as generating a Citation Report in Web of Science. For Google Scholar, create an author profile, from which your h-index is derived.

What are limitations of the h-index?

  • Every co-author of a publication receives full citation credit, regardless of position in the author list
  • Can be manipulated via self-citations
  • Accounts only for work located in a database
  • A work's first citation can take years to materialize
  • Does not account for citation of highly-cited papers over time

Are there modifications to the h-index that address its limitations?

  • The g-index provides a means for highly-cited papers of an author to boost his or her lower-cited papers.  As such, the g-index gives more weight to highly-cited papers than does the h-index [see the tab "H-Index Limitations and Modifications" for application of the g-index].     

Citation Tracking and Article-Level Metrics

Article-level metrics (ALMs) measure the use and impact of individual scholarly articles. They include traditional impact measures (e.g., citation counts) and more contemporary measures (e.g., number of downloads). Measures such as downloads are known as alternative metrics ("altmetrics," or usage data); altmetrics are thus one type of article-level metric.

In the context of article-level metrics, citation counts are the number of times an article has been cited in other works.  While citation counts typically measure the degree to which a particular article is useful to other researchers in support of their work, these metrics are not a measure of the quality of a cited work, since a work can be cited for negative reasons (e.g., refutations). Additionally, citation counts are highly dependent upon particular disciplines and the number of researchers in them.  For example, more researchers work in neuroscience than in philosophy; as such, more papers are published in neuroscience than in philosophy, and thus neuroscience papers receive more citation counts than do philosophy papers.

Citation counts are found in discipline- and subject-specific indexes and databases. Note that citation counts are limited by their citation data source: since database coverage varies by content and discipline, a particular database will count only the citations contained within it, even if additional citations exist outside that database. Prominent citation indexes and databases include Web of Science, Google Scholar, and Scopus. See the FAU Libraries' Scholarly Publishing web page (and "Disciplinary Indexes" specifically) for a comprehensive list of disciplinary indexes and databases in which citation counts can be located.

An example of citation counts in Web of Science: 


...and in Google Scholar:


Meanwhile, Usage Count is one example of an alternative metric ("altmetric"):



Further reading:

Bartoli, A. & Medvet, E. (2014). Bibliometric Evaluation of Researchers in the Internet Age. Information Society 30(5), 349-354.

Garfield, E. et al. (1978). Citation data as science indicators. In: Elkana, Y. et al. (eds): Toward a Metric of Science: The Advent of Science Indicators. John Wiley, New York: pp. 179-207.

Citation tracking, or citation analysis, is an important tool used to trace scholarly research, measure impact, and inform tenure and funding decisions. The impact of an article is evaluated by counting the number of times other authors cite it in their work. Researchers do citation analysis for several reasons:

  • find out how much impact a particular article has had, by showing which other authors have cited it in their own papers
  • find out how much impact a particular author has had, by looking at the total number and frequency of citations to his or her work
  • discover more about the development of a field or topic (by reading the papers that cite a seminal work in that area)

The output from citation studies is often the only way that non-specialists in governments and funding agencies, or even those in different scientific disciplines, can judge the importance of a piece of scientific research (Johns Hopkins University Library Guide, 2018).



The flow, dissemination, and interaction of online research can now be tracked and analyzed beyond what was traditionally accepted as the signifiers of prestige and impact.

Altmetrics have the potential to answer these questions:

  • How many times is my work downloaded?
  • Who is reading my work?
  • Has my work been covered by news outlets?
  • Who is commenting on my work?
  • How is my work being shared?
  • Which countries are looking at my work?
  • Is my work making a social impact?

Examples of measurements:

  • Bookmarks or Saves to online reference managers such as Mendeley
  • Mentions in social network sites such as Twitter or Facebook or in Wikipedia
  • Discussions in blogs and media
  • Favorites or Likes in sites such as Slideshare, YouTube or Facebook
  • Recommendations in sites such as Figshare
  • Comments/annotations from readers in platforms such as PubMed Commons
  • Mentions in post-peer-review resources such as F1000Prime


The Altmetric Attention Score is generated by Altmetric. The colorful doughnut badge is integrated into many publisher sites and other search platforms. Altmetric offers some free tools for researchers, such as the Altmetric Bookmarklet.




Plum Analytics

Plum Analytics (Elsevier), like Altmetric, tracks altmetrics. Its visual score is the "Plum Print," which is integrated into platforms that have agreements with Elsevier (e.g., CINAHL, ScienceDirect).

The Libraries have a subscription to PlumX, a dashboard that allows FAU affiliates to track usage of their scholarly works and run reports. See PlumX at FAU for more information.




Impactstory, funded by the National Science Foundation, is an open-source website that helps researchers explore and share the online impact of their research. See an example profile.


PLOS Article Level Metrics

ALM Reports allows you to view article-level metrics for any set of PLOS articles.




Kudos is a free resource that tracks your alternative metrics and provides tools to better promote your work.





Researcher IDs and Profiles


Register NOW and get your unique ORCID identifier in 30 seconds. 

ORCID (Open Researcher and Contributor ID) is a free and open registry of unique identifiers for researchers and scholars. Unlike Google Scholar or ResearchGate, ORCID is NOT primarily a research profile system (although it can serve that purpose). In fact, having and using an ORCID iD will facilitate the maintenance of researcher profiles that you already have.

Signing up for an ORCID identifier and using it in your research workflows will ensure that you receive credit for your work, simplify manuscript submissions and improve author search results. ORCID is an increasingly important part of the global research infrastructure, with many funding bodies and publishers now making it a requirement.

Registered? Now, make the most out of your ORCID iD:

  • Add information that is important to distinguish you and improve the functionality of the ORCID search and link wizards: your name variations, multiple email addresses, and organizational affiliations. See Six Ways to Make Your ORCID iD Work for You!
  • You can set visibility settings at the item level. However, in order to allow interoperability with other services such as ImpactStory and PlumX, you will need to make your ORCID record public.
  • Add your works to your record in two ways: by using one or more of the Import Wizards and by adding works manually. 

See more instructional videos on the ORCID Vimeo channel.

ResearchGate is an online research community in which you can share updates about your research and publications, and obtain citation counts and your h-index.

To create a ResearchGate profile:

  • Go to ResearchGate
  • Click "Join for Free"
  • Select your research type
  • Add your institution and department, and confirm your authorship
  • Add your research interests and skills

If you have published in a Scopus (Elsevier) indexed journal, you have been assigned a Scopus ID. To find it, use the Scopus Author Search. You can link your Scopus ID to your ORCID account.

Your Google Scholar profile includes a list of articles you have placed into the profile, with "cited by" links for each article.  Google Scholar displays a graph showing citation activity, and calculates your total number of citations, as well as your h-index and i10-index.

To create your Google Scholar profile:

  • Go to Google Scholar
  • Click "My Citations."  Log in using your current Google account or create a new one.
  • Provide requested information (e.g., name, affiliation, etc.)
  • Add articles and select your update preferences 