“3 . . 2 . . 1 . . Impact [Factor]: Target [Academic Career] Destroyed!” (2012)
DOI: 10.1177/0883073812465014

Abstract: "Publish or perish" is the time-honored "principle" for academicians who race to accumulate lines under the "publications" section of a curriculum vitae. The original intent of publication-to inform others of findings and further scientific knowledge-has been corrupted by factors including (1) exponential growth of journals and the journal industry, fueled in part by intrusion of the Internet into all aspects of academic life; and (2) adoption of journal metrics (rather than written content) as the measure of … Show more

Cited by 30 publications (22 citation statements). References: 131 publications.

Citation statements:
“…Numerous scholars and editors have pointed out the biases and limitations of bibliometrics like the impact factor to evaluate journals (e.g., MacRoberts & MacRoberts, 1989; Opthof, 1997; Seglen, 1997; Phelan, 1999; Kurmis, 2003; Cameron, 2005; Garfield, 2006; Adler & Harzing, 2009; Vanclay, 2009; Brumback, 2012). It is also well known that the highly skewed nature of citations to articles in journals makes a journal's impact factor virtually unrelated to the citations of individual articles in that journal (Seglen, 1997; Starbuck, 2005; Garfield, 2006).…”
Section: Selecting Journals As Publication Outlets (mentioning)
confidence: 99%
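For context, the two-year impact factor these critiques target is, in essence, a mean over a heavily skewed citation distribution; the following sketch uses my own notation and is not drawn from any of the cited papers:

\[
\mathrm{IF}_Y \;=\; \frac{C_Y(Y-1) + C_Y(Y-2)}{N_{Y-1} + N_{Y-2}},
\]

where \(C_Y(y)\) denotes the citations received in year \(Y\) by items the journal published in year \(y\), and \(N_y\) is the number of citable items published in year \(y\). Because a few highly cited articles dominate the numerator, this journal-level mean says little about the citations any individual article will receive, which is the skew problem Seglen (1997) and Starbuck (2005) describe.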
“…Alberts (2013) argues that leaders in science need to accept responsibility for making substantive evaluations of individual scientific contributions and not rely on journal editors and impact factors of journals. Alberts (2013) and others (Adler & Harzing, 2009; Brumback, 2012; Brembs, Button, & Munafo, 2013) have nicely summarized how the misuse of bibliometric variables harms science and the individual scholars being evaluated using these metrics rather than the merit of their work. Cardinal et al.…”
Section: Selecting Journals As Publication Outlets (mentioning)
confidence: 99%
“…This database contains details of the journals' impact factors and a range of other bibliometrics. Whether or not one agrees with this measure of prestige, and some certainly do not (Brumback 2012), it is indisputable that impact factors are the primary heuristic tool for judging the quality of research output. As well as extracting data on the impact factors of each of these journals, I also searched the SHERPA RoMEO database, which summarises the approach taken to self-archiving manuscripts for over 18,000 academic journals.…”
Section: Self-archiving In Ecology and Evolution (mentioning)
confidence: 99%
“…This latter criterion is a bit cynically pragmatic, but nonetheless important in the current academic environment. Journal metrics such as the impact factor and other purported performance measures are well known to be unreliable, invalid, and sometimes just plain wrong (Brumback, 2012; Eyre-Walker & Stoletzki, 2013; Hantula, 2005); however, they still exert an inordinate influence over gullible administrators and the potential authors who have to answer to them. Too many institutions and funding agencies use journal metrics as the primary or sole indicator of scholarly “performance,” “productivity,” or “quality.” It does a great disservice to early or mid-career scholars if their work appears in a journal with low metrics.…”
(mentioning)
confidence: 99%