Firm that tallies controversial journal impact scores moves to provide more context

The company that calculates journal impact factors wants to give users more information about the controversial metric. B. Douthitt/Science

Stung by years of criticism that its journal impact factors have distorted scholarly publishing, the private firm Clarivate Analytics, based in Philadelphia, Pennsylvania, this week rolled out an updated version of its Journal Citation Reports database that it says provides context useful for understanding journals’ characteristics and audiences.

Impact factors, which represent the number of citations to a journal’s articles divided by the number of articles published during a 2-year period, are widely used in academe as a yardstick of a journal’s prestige and reach. But the metric has plenty of critics. The complaints include worries that editors can too easily boost their journal’s ranking through a variety of strategies, and that impact factors can mislead: a few highly cited papers can drive much of a journal’s overall impact factor.

Although Clarivate, which has offices in the United States, the United Kingdom, Japan, and China, continues to publish journal impact factors in its Journal Citation Reports (JCR) database, the latest version, released 26 June, contains supplementary information that addresses some of this criticism. Most prominently, the page showing a journal’s impact factor now includes a distribution curve displaying the total number of articles and other items published in a journal versus the number of times each item was cited. The median number of citations for all of the journal’s research articles and review articles is also identified on the curve.
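The skew the critics describe is easy to see with a toy calculation. The sketch below uses entirely hypothetical citation counts for an imaginary journal; it is not Clarivate's actual methodology, only an illustration of how a mean-based figure like the impact factor diverges from the median when a couple of highly cited papers dominate.

```python
# Hypothetical citation counts for each article an imaginary journal
# published over the 2-year window (numbers invented for illustration).
from statistics import median

citations_per_article = [0, 1, 1, 2, 2, 3, 3, 4, 120, 150]  # two "hit" papers

# Impact-factor-style mean: total citations divided by number of items.
impact_factor = sum(citations_per_article) / len(citations_per_article)

# Median citations, the figure now marked on JCR's distribution curve.
median_citations = median(citations_per_article)

print(f"mean (impact-factor style): {impact_factor:.1f}")  # 28.6
print(f"median citations:           {median_citations}")   # 2.5
```

Two outlier papers lift the mean to 28.6, while the typical article in this made-up journal is cited only two or three times, which is exactly why the new distribution curves and median values add useful context.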
Including these graphs alongside impact factors is “clearly a step in the right direction,” says Stephen Curry, a structural biologist at Imperial College London. Because impact factors measure the average citation performance of papers in a journal, they tend to be driven up by a small number of highly cited papers and don’t reveal anything about the spread of citations across all a journal’s papers. In contrast, the distribution graphs give researchers a much better understanding of how often individual papers in a journal are actually cited than a single number can provide.
(In a 2016 preprint, Curry and colleagues, including then–Editor-in-Chief of Science Marcia McNutt, called on journals to publish these kinds of distribution graphs in an effort to increase transparency.)

Clarivate’s update also includes a number of other changes that together are meant to “give you a much more nuanced picture of what that journal contributes” to scholarly communication than the impact factor alone can convey, Marie McVeigh, product director of JCR, told ScienceInsider.

Users can drill down into the underlying data to see, for example, the titles of the most highly cited items and, in a separate list, the citations and articles that went into the calculation of the journal’s impact factor.

The dashboard also displays summary information characterizing a journal’s citations by type of article. This allows users to see, for example, what proportion came from research articles versus review articles. Another chart shows how the journal’s impact factor has fluctuated over recent years.

There is also summary information about the journal’s authors: tables group them by country and institution. That should interest authors looking to pitch their manuscripts to journals that serve diverse international constituencies, McVeigh said. In all, the database tracks 11,655 journals.

Overall, McVeigh said, Clarivate is “trying to pull this back from an obsessive use of the JIF … and support and allow this more contextualized use of this number that we’ve been producing now for 44 years. It’s been rather a shame to see so much rich, valuable data be thrown away just to look at that one number. Well, let’s make this data-rich and valuable and visible.”

The new tools provide researchers with better insight into individual journals, says John Tregoning, an immunologist at Imperial College London who recently wrote an opinion piece in Nature on the benefits and drawbacks of impact factors.
This information could be useful, for example, when deciding where to submit a paper. But he cautions that the impact factors and related metrics published by Clarivate remain measures of the journals themselves, and should not be used by grant agencies or employers to make judgments about the people who publish in those journals. “This cannot be used as a proxy for quality of … individual people or individual work,” he says. “There is no … single number that says person A is better than person B. It has to be about a judgment on the quality of their published science.”

Curry says he would now like to see journals print the citation distribution graphs made available by Clarivate. After he published his 2016 preprint, a handful of journals began to do so, including Nature and the Proceedings of the National Academy of Sciences, but not as many as he’d have liked. “Hopefully this will give a big boost to that,” Curry says. “If it’s provided to you ready-made, why wouldn’t you want to be transparent about citation performance?”

By Jeffrey Brainard, Matt Warren, Jun. 27, 2018, 4:05 PM
