
1 in 5 Researchers Have Been Asked To Cite Superfluous Papers To Boost Journal Reputations

You might think that scientific journals are the arbiters of professionalism when it comes to publication practices, but a new survey indicates they have their sketchy side as well. While scientific journals are definitely interested in publishing valuable information and interesting studies, it seems they are also concerned with building and maintaining their reputations. This may be why 1 in 5 researchers reported being asked to add bogus citations to specific journals in their published papers.


It goes a little something like this: Imagine you’re a scientist, a snack scientist — a snackentest, if you will. You write a paper on potato chips, and Snacktastic: A Science Journal of Science wants to publish it. So you send it to them and you get some notes from an editor which amount to “Hey, so this is great, but do you think you could cite some more Snacktastic articles? Like, just go ahead and find some and bolt ’em on there and that’d be great.” The idea is that a glut of citations tends to boost a journal’s impact factor — a measure of a journal’s reputation derived from the number of articles that cite its articles. It’s not the worst thing in the world, but it’s a little bit shady and, apparently, not at all uncommon.
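For the curious, the standard impact factor is basically an average: citations in a given year to articles the journal published in the previous two years, divided by the number of articles it published in those two years. A toy sketch with made-up numbers for our imaginary Snacktastic:

```python
def impact_factor(citations_to_recent, articles_published):
    """Citations this year to the journal's articles from the previous
    two years, divided by the number of articles published in those years."""
    return citations_to_recent / articles_published

# Made-up numbers: 300 citations this year to the 200 articles
# Snacktastic published over the previous two years.
print(impact_factor(300, 200))  # 1.5
```

Every citation a coached author "bolts on" goes straight into that numerator, which is the whole appeal.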

In a survey conducted by Allen Wilhite and Eric Fong of the University of Alabama in Huntsville, 54,000 academics were questioned about their experience with citations: whether they’d ever been asked to fluff up their citation section, whether they knew that was a thing that happened, and what they thought about it. Of the 6,672 who responded (this is a good time to remember sampling bias), 40% reported familiarity with the practice, and 20% reported direct experience with it.

But perhaps most interesting was the researchers’ view of the practice versus the way they deal with it. Although 86% thought the practice was “inappropriate” and 81% thought it was bad for the journal’s reputation, 57% said they would totally add bogus citations if asked. So basically, it’s bad, and people mostly agree that it’s bad, but not enough to do anything about it. It’s understandable, considering that adding a few superfluous citations to your paper doesn’t really hurt it from a researcher’s perspective, and from a journal’s perspective, it’s only going to help to have more references out there.

Still, some people are making moves to control the issue, either by removing journals that are bloated with self-citations from ranking lists or by calculating impact factor without internal citations. Either way, it’s not a world-ending issue, but it does throw things off a bit, and is probably worth correcting for. More than anything it just lays bare that scientific journals can be starved for attention too.
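That second fix — ignoring internal citations — is just the same average with the journal’s citations to itself subtracted out first. Continuing the made-up Snacktastic numbers from the example above, where 60 of its 300 citations came from its own pages:

```python
def impact_factor_excluding_self(total_citations, self_citations, articles):
    """Impact factor with the journal's citations to itself removed,
    so coached self-citations no longer inflate the score."""
    return (total_citations - self_citations) / articles

# Same imaginary journal: 300 citations, 60 of them self-citations,
# 200 articles published in the window.
print(impact_factor_excluding_self(300, 60, 200))  # 1.2
```

Under that scheme, an editor asking you to cite more of their own journal gains nothing.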

So the next time you find yourself clamoring for attention, whether you’re handing out CDs of your band’s EP to uninterested pedestrians or sending off scores of unsolicited emails just trying to get one person to read your 500-page tome of heart-wrenching, soul-rending limericks, you can remember that you’re in good company; scientific journals can crave validation just like the rest of us. Now someone just needs to get a paper published in a scientific journal telling us whether or not we should be ashamed of ourselves.

(via Nature)
