Category Archives: Peer review

Retractions Brown Bag Discussion

Our next scholarly communication brown bag discussion will be about retractions. We hope to talk about how retractions get issued, how researchers find out about retracted articles, what happens to people who are involved in a retraction, and what impact this has on the research lifecycle.

In advance of that discussion, here are a few resources that may help provide some context.

We are also gathering anonymous feedback about retractions. If you have anything to share regarding retractions (even if you haven’t been directly involved), please consider taking the short survey.

We hope to see you on Wednesday, Nov. 7th, at 12:00pm in PCL Learning Lab 3.

OA Week 2018: Engaging Early and Often

A key component of scholarly communication is, in fact, communication. What’s the point of making information available if engagement doesn’t follow? One way to facilitate greater engagement with scholarly literature is to host preprint articles in institutional repositories and on preprint servers.

Preprints are typically defined as scholarly articles that have not yet undergone peer review but are ready to be submitted for publication. Generally, preprints include the same overall information as final published articles but lack the design elements and editorial review added in the journal publication process. Most importantly for open access, authors can, in most circumstances, freely post their preprint work online and make it available to anyone.

Preprints speed up the dissemination of scholarly literature by aligning with researcher timelines – not publisher timelines. Preprint servers like arXiv, bioRxiv, and OSF Preprints typically make author-submitted preprints available for viewing in just a few business days, allowing posted articles to be both timely and relevant to current discussions. In the age of social media and instant reporting, it’s important that scholarly research become available to enter public discourse more quickly.

[Infographic: positive aspects of preprints. Image by Daniela Saderi & Adam Lazenby / CC BY 4.0]

To come to any kind of consensus on scholarly research, we need a diverse range of individuals engaging with the research and giving feedback on findings. Open knowledge initiatives like PREreview, a web platform allowing for peer-reviewing of preprint articles, encourage scholarly conversation to occur between individuals whose voices have been historically excluded from this crucial process, such as early career and unaffiliated researchers. PREreview also provides valuable preprint feedback to “be compiled into a review and sent back to the authors, who then have the chance of integrating that feedback into their work” (Welcome to PREreview).

According to responses from about 500 faculty members in a recent UT Libraries’ survey, roughly 65% of faculty respondents have shared their scholarly research in “pre-print or e-print digital archives” in the past 5 years (Ithaka survey, Q10). Nearly 40% of those same respondents believed circulating preprint versions of their work to be “an important way for me to communicate my research findings with my peers” (Ithaka survey, Q12).


Considering submission of your preprint work to a preprint server? Double-check journal submission policies on SHERPA/RoMEO before doing so.
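If you want to script that check for a list of target journals, here is a minimal Python sketch that looks up a journal’s self-archiving policy through the SHERPA/RoMEO API. Treat it as a sketch only: the endpoint, parameter names, and response fields shown are assumptions modeled on the service’s documented REST interface, and you would need your own API key, so verify the details against the official SHERPA/RoMEO API documentation before relying on them.

# Minimal sketch: look up a journal's self-archiving policy via the SHERPA/RoMEO API.
# NOTE: the endpoint, parameter names, and response fields below are assumptions --
# check the official SHERPA/RoMEO API documentation before relying on them.
import json
import requests

API_KEY = "YOUR-API-KEY"  # assumed: registering with SHERPA/RoMEO provides a key
BASE_URL = "https://v2.sherpa.ac.uk/cgi/retrieve"  # assumed v2 retrieve endpoint

def lookup_journal(issn):
    """Fetch policy records for a journal identified by its ISSN."""
    params = {
        "item-type": "publication",
        "format": "Json",
        "api-key": API_KEY,
        # assumed filter syntax: a JSON-encoded list of [field, match, value] triples
        "filter": json.dumps([["issn", "equals", issn]]),
    }
    response = requests.get(BASE_URL, params=params, timeout=30)
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    data = lookup_journal("0028-0836")  # example ISSN used purely for illustration
    for record in data.get("items", []):
        title = record.get("title", [{}])[0].get("title", "unknown title")
        print(title)
        # Each policy record describes which article versions (e.g., the submitted
        # preprint) may be shared, where, and under what embargo conditions.
        for policy in record.get("publisher_policy", []):
            for permitted in policy.get("permitted_oa", []):
                print("  versions:", permitted.get("article_version"),
                      "| embargo:", permitted.get("embargo", {}))

The SHERPA/RoMEO website covers the same ground without any code; a script like this is mainly useful when you want to check policies for a long list of candidate journals in one pass.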

Interested in more results from the UT Libraries’ Ithaka S+R 2018 survey of faculty, graduate students, and undergraduate students? View the full results online.


Interesting new journal selection tool

The Cofactor Journal Selector Tool has been developed to allow researchers to find an appropriate journal for their paper. It allows researchers to select options for subject, peer review process, open access availability, speed of publication, and a few additional miscellaneous categories. After answering the questions (a process that takes just a minute or two) the system spits out a few recommendations of journals that meet your criteria.

Here’s a short article that gives an overview of the product: http://cofactorscience.com/blog/introducing-the-cofactor-journal-selector-tool

And here’s a link to the tool: http://cofactorscience.com/journal-selector

Publishers behaving badly – again

Once again, someone has uncovered a scam by unsavory publishers. The Ottawa Citizen recently published an article by Tom Spears about a sham article that was accepted by several fake publishers within 48 hours of being submitted. These journals claimed that articles went out for peer review, but there is no way peer review could be completed in that short a time period – which is why I call the publishers fake. And of course, after accepting the bogus article for publication, they required a payment (usually less than the charges at reputable publishers, but still a scam) to make the article available online.

The article that Tom submitted was a mishmash of cut-and-pasted sentences from geology papers and hematology papers. The references came from a wine chemistry article. In short, the article made no sense at all – the abstract contained the phrase ‘seismic platelets’, which is obviously made up – and it was almost entirely copied from other articles. No reputable journal should have even sent it out for peer review.

These fake publishers make it difficult for already overworked faculty to evaluate the articles they find, evaluate potential publication venues, and evaluate the work of their peers. Jeffrey Beall, a librarian in Colorado, maintains a list of these types of publishers that faculty can refer to, but I wonder whether there is more that libraries could be doing in this area to help the faculty and students we work with.

ScienceOpen

There is a new open access publisher called ScienceOpen. The idea behind ScienceOpen is to publish in all areas of science and to use post-publication peer review. Submitted articles go through a technical and ethical review, and accepted articles are then published online with a DOI after payment of an $800 publication fee.

Open peer review takes place after publication. Authors may invite suitable reviewers for their own manuscript, and editors or other ScienceOpen members may also invite peers to review the work. Unsolicited comments make up a separate portion of the public review system. The identity of all reviewers and commenters is visible at all times.

For more information about this process: https://www.scienceopen.com/external/how_does_it_work

Royal Society launching OA journal

The Royal Society of London will launch a new open access journal this fall, Royal Society Open Science (RSOS). RSOS will operate similarly to PLoS One, meaning it will publish research in all areas of science and mathematics and will base peer review on the quality of the research rather than the novelty of the subject.

For more information about this journal, see this article from The Guardian: http://www.theguardian.com/science/grrlscientist/2014/feb/18/royal-society-open-access-science-maths-new-journal

Publishers behaving badly

A recent article in Science Magazine, http://www.sciencemag.org/content/342/6154/60.summary, reveals a troubling problem with peer review at some scientific journals. The author, John Bohannon, submitted a bogus, scientifically and ethically flawed paper to 304 gold open access journals (journals that charge authors a fee to publish). So far 157 journals have accepted the article and 98 have rejected it, including PLoS One and a Hindawi journal. Publishers of journals that accepted the paper include some listed on Jeffrey Beall’s predatory open access publishers list as well as well-known publishers like Elsevier, Sage, and Wolters Kluwer.

While this is certainly a very interesting topic, without repeating the study on toll-access journals and on articles posted to institutional repositories, it’s rather difficult to say whether the problem is exclusive to OA journals or part of a larger problem in the scientific publishing community.