
THE JOURNAL OF CELL BIOLOGY

JCB: EDITORIAL

© The Rockefeller University Press $30.00

The Journal of Cell Biology, Vol. 179, No. 6, December 17, 2007, 1091–1092. doi:10.1083/jcb.200711140


Show me the data

Mike Rossner,1 Heather Van Epps,2 and Emma Hill3

1 Executive Director, The Rockefeller University Press
2 Executive Editor, The Journal of Experimental Medicine
3 Executive Editor, The Journal of Cell Biology

The integrity of data, and transparency about their acquisition, are vital to science. The impact factor data that are gathered and sold by Thomson Scientific (formerly the Institute for Scientific Information, or ISI) have a strong influence on the scientific community, affecting decisions on where to publish, whom to promote or hire (1), the success of grant applications (2), and even salary bonuses (3). Yet, members of the community seem to have little understanding of how impact factors are determined, and, to our knowledge, no one has independently audited the underlying data to validate their reliability.

Calculations and negotiations

The impact factor for a journal in a particular year is declared to be a measure of the average number of times a paper published in the previous two years was cited during the year in question. For example, the 2006 impact factor is the average number of times a paper published in 2004 or 2005 was cited in 2006. There are, however, some quirks about impact factor calculations that have been pointed out by others (e.g., 1, 4, 5), but which we think are worth reiterating here:

1. The numerator of the impact factor contains every detectable citation to a journal's content from the previous two years, regardless of the article type (6). For example, the 2006 impact factor numerator contains all citations to all content published in 2004 and 2005. The denominator of the impact factor, however, contains only those articles designated by Thomson Scientific as primary research articles or review articles. Journal "front matter", such as Nature "News and Views", is not counted (4). Thus, the impact factor calculation contains citation values in the numerator for which there is no corresponding value in the denominator (a simple sketch of this calculation appears after this list).

2. Articles are designated as primary, review, or "front matter" by hand by Thomson Scientific employees examining journals (6) using various bibliographic criteria, such as keywords and number of references (7).

3. Some publishers negotiate with Thomson Scientific to change these designations in their favor (5). The specifics of these negotiations are not available to the public, but one can't help but wonder what has occurred when a journal experiences a sudden jump in impact factor. For example, Current Biology had an impact factor of 7.00 in 2002 and 11.91 in 2003. The denominator somehow dropped from 1032 in 2002 to 634 in 2003, even though the overall number of articles published in the journal increased (see ISI Web of Science, subscription required).

4. Citations to retracted articles are counted in the impact factor calculation (8). In a particularly egregious example, Woo Suk Hwang's stem cell papers in Science from 2004 and 2005, both subsequently retracted, have been cited a total of 419 times (as of November 20, 2007). We won't cite them again here to prevent the creation of even more citations to this work.

5. Because the impact factor calculation is a mean, it can be badly skewed by a "blockbuster" paper. For example, the initial human genome paper in Nature (9) has been cited a total of 5,904 times (as of November 20, 2007). In a self-analysis of their 2005 impact factor, Nature noted that 89% of their citations came from only 25% of the papers published (4).

When we asked Thomson Scientific if they would consider providing a median calculation in addition to the mean they already publish, they replied, "It's an interesting suggestion…The median…would typically be much lower than the mean. There are other statistical measures to describe the nature of the citation frequency distribution skewness, but the median is probably not the right choice." Perhaps so, but it can't hurt to provide the community with measures other than the mean, which, by Thomson Scientific's own admission, is a poor reflection of the average number of citations gleaned by most papers.

6. There are ways of playing the impact factor game, known very well by all journal editors, but played by only some of them. For example, review articles typically garner many citations, as do genome or other "data-heavy" articles (see example above). When asked if they would be willing to provide a calculation for primary research papers only, Thomson Scientific did not respond.
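To make points 1 and 5 concrete, here is a minimal Python sketch of the calculation as described above. The journal content and citation counts are invented for illustration (they are not Thomson Scientific data), and the article-type labels simply mirror the primary/review/front-matter designations discussed in the list. Every citation to the journal's 2004–2005 content enters the numerator, only primary and review articles enter the denominator, and the median per-article citation count is reported alongside the mean to show how a single blockbuster paper skews the latter.

```python
from statistics import mean, median

# Hypothetical 2004-2005 content of one journal. Article types follow the
# designations discussed in the editorial: primary, review, or front matter.
articles = [
    {"id": "a1", "type": "primary",      "citations_2006": 12},
    {"id": "a2", "type": "primary",      "citations_2006": 3},
    {"id": "a3", "type": "review",       "citations_2006": 40},
    {"id": "a4", "type": "primary",      "citations_2006": 1},
    # front matter: its citations count in the numerator, but the article
    # itself is excluded from the denominator
    {"id": "a5", "type": "front matter", "citations_2006": 9},
    {"id": "a6", "type": "primary",      "citations_2006": 850},  # a "blockbuster" paper
]

# Numerator: every detectable 2006 citation to 2004-2005 content, any article type.
numerator = sum(a["citations_2006"] for a in articles)

# Denominator: only articles designated primary research or review.
citable = [a for a in articles if a["type"] in ("primary", "review")]
denominator = len(citable)

impact_factor = numerator / denominator
print(f"2006 impact factor: {impact_factor:.2f}")  # 183.00 with these toy numbers

# The mean is dominated by the blockbuster; the median tells a different story.
counts = [a["citations_2006"] for a in citable]
print(f"mean citations per citable article:   {mean(counts):.1f}")
print(f"median citations per citable article: {median(counts):.1f}")
```

With these toy numbers the mean per-article citation count is 181.2 while the median is 12, the same kind of skew Nature reported in its 2005 self-analysis (4).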

Integrity

To us, as journal editors, data integrity means that data presented to the public accurately reflect what was actually observed. To help ensure this, The Rockefeller University Press instituted a policy of scrutinizing image data in accepted manuscripts for evidence of manipulation. We realize that image data are only one type of data we publish, but they are a type that can be easily examined for integrity. If a question is raised about the data in a figure, we ask the authors to submit the original data for examination by the editors. We consider it our obligation to protect the published record in this way.

Thomson Scientific makes its data for individual journals available for purchase. With the aim of dissecting the data to determine which topics were being highly cited and which were not, we decided to buy the data for our three journals (The Journal of Experimental Medicine, The Journal of Cell Biology, and The Journal of General Physiology) and for some of our direct competitor journals. Our intention was not to question the integrity of their data.

When we examined the data in the Thomson Scientific database, two things quickly became evident: first, there were numerous incorrect article-type designations. Many articles that we consider "front matter" were included in the denominator. This was true for all the journals we examined. Second, the numbers did not add up. The total number of citations for each journal was substantially fewer than the number published on the Thomson Scientific Journal Citation Reports (JCR) website (subscription required). The difference in citation numbers was as high as 19% for a given journal, and the impact factor rankings of several journals were affected when the calculation was done using the purchased data (data not shown due to restrictions of the license agreement with Thomson Scientific).
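The comparison described in the preceding paragraph can be outlined in code. The sketch below is entirely hypothetical: the journal names, citation totals, and citable-item counts are invented, since the licensed records cannot be reproduced here; it only illustrates how an impact factor recomputed from purchased per-article citation records can be set against the published figure to expose a citation shortfall and a change in ranking.

```python
# Hypothetical per-journal summaries derived from a purchased citation database.
# "purchased_citations": total citations found in the purchased records;
# "published_citations": the numerator implied by the published JCR impact factor;
# "citable_items": articles counted in the denominator.
journals = {
    "Journal A": {"purchased_citations": 8910, "published_citations": 11000, "citable_items": 500},
    "Journal B": {"purchased_citations": 8820, "published_citations": 9000, "citable_items": 450},
}

for name, d in journals.items():
    recomputed_if = d["purchased_citations"] / d["citable_items"]
    published_if = d["published_citations"] / d["citable_items"]
    shortfall = 1 - d["purchased_citations"] / d["published_citations"]
    print(f"{name}: published IF {published_if:.2f}, recomputed IF {recomputed_if:.2f}, "
          f"citation shortfall {shortfall:.0%}")
```

With these invented numbers, Journal A outranks Journal B on the published figures but falls below it once its 19% citation shortfall is applied.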

Your database or mine?

When queried about the discrepancy, Thomson Scientific explained that they have two separate databases—one for their "Research Group" and one used for the published impact factors (the JCR). We had been sold the database from the "Research Group", which has fewer citations in it because the data have been vetted for erroneous records. "The JCR staff matches citations to journal titles, whereas the Research Services Group matches citations to individual articles", explained a Thomson Scientific representative. "Because some cited references are in error in terms of volume or page number, name of first author, and other data, these are missed by the Research Services Group."

When we requested the database used to calculate the published impact factors (i.e., including the erroneous records), Thomson Scientific sent us a second database. But these data still did not match the published impact factor data. This database appeared to have been assembled in an ad hoc manner to create a facsimile of the published data that might appease us. It did not.

Opaque data

It became clear that Thomson Scientific could not or (for some as yet unexplained reason) would not sell us the data used to calculate their published impact factor. If an author is unable to produce original data to verify a figure in one of our papers, we revoke the acceptance of the paper. We hope this account will convince some scientists and funding organizations to revoke their acceptance of impact factors as an accurate representation of the quality—or impact—of a paper published in a given journal.

Just as scientists would not accept the findings in a scientific paper without seeing the primary data, so should they not rely on Thomson Scientific's impact factor, which is based on hidden data. As more publication and citation data become available to the public through services like PubMed, PubMed Central, and Google Scholar, we hope that people will begin to develop their own metrics for assessing scientific quality rather than rely on an ill-defined and manifestly unscientific number.

Correspondence to Mike Rossner.

References

1. Monastersky, R. 2005. The number that's devouring science. The impact factor, once a simple way to rank scientific journals, has become an unyielding yardstick for hiring, tenure, and grants. Chron. High. Educ. 52:A12.

2. Wells, W.A. 2007. The returning tide: how China, the world's most populous country, is building a competitive research base. J. Cell Biol. 176:376–401. doi:10.1083/jcb.200701150.

3. Editorial. 2006. Cash-per-publication is an idea best avoided. Nature. 441:786. doi:10.1038/441786a.

4. Editorial. 2005. Not-so-deep impact. Research assessment rests too heavily on the inflated status of the impact factor. Nature. 435:1003–1004. doi:10.1038/4351003a.

5. The PLoS Medicine Editors. 2006. The impact factor game. It is time to find a better way to assess the scientific literature. PLoS Med. 3:e291. doi:10.1371/journal.pmed.0030291.

6. Garfield, E. 1999. Journal impact factor: a brief review. Can. Med. Assoc. J. 161:979–980.

7. The Thomson Scientific Impact Factor. 1994. Thomson Scientific website, /free/essays/journalcitationreports/impactfactor/ (accessed November 29, 2007).

8. Liu, S.V. 2007. Hwang's retracted publication still contributes to Science's impact factor. Sci. Ethics. 2:44–45.

9. Lander, E.S., L.M. Linton, B. Birren, C. Nusbaum, M.C. Zody, J. Baldwin, K. Devon, K. Dewar, M. Doyle, W. FitzHugh, et al. 2001. Initial sequencing and analysis of the human genome. Nature. 409:860–921.
