Category Archives: Publishing

A new way to evaluate research

The International Development Research Centre (IDRC) in Canada has developed a tool it hopes will better measure the quality and impact of research coming out of the Global South. The goal is to ensure that researchers working on projects with positive regional impact are evaluated on criteria that fit their context; metrics like citation count and h-index don’t necessarily capture the rigor and usefulness of that research.

The IDRC calls this tool Research Quality Plus (RQ+) and it has three parts:

  • Identify contextual factors – the political, data, and research environments, the maturity of the scientific field, and how much the project focuses on capacity strengthening
  • Articulate dimensions of quality – scientific integrity, legitimacy, importance, and positioning for use
  • Use rubrics & evidence – assessments must be systematic, comparable, and based on evidence (both qualitative and quantitative)

You can read more about this tool in Nature or on the IDRC website.
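As a rough illustration of the shape of such an assessment (this is not IDRC’s actual data model; the field names and the 1–8 rubric scale below are assumptions made for the sketch), the three parts could be captured in a small Python record:

    from dataclasses import dataclass, field

    # Hypothetical sketch of an RQ+ assessment record. Field names and the
    # rubric scale are invented for illustration; they are not IDRC's schema.
    @dataclass
    class RQPlusAssessment:
        # 1. Contextual factors identified up front
        context: dict = field(default_factory=lambda: {
            "political_environment": "",
            "data_environment": "",
            "research_environment": "",
            "field_maturity": "",
            "capacity_strengthening_focus": "",
        })
        # 2. Dimensions of quality, each scored on an assumed 1-8 rubric
        quality: dict = field(default_factory=lambda: {
            "scientific_integrity": None,
            "legitimacy": None,
            "importance": None,
            "positioning_for_use": None,
        })
        # 3. Qualitative and quantitative evidence supporting the scores
        evidence: list = field(default_factory=list)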

Editors4BetterResearch initiative

Researchers Chris Chambers, Corina Logan, and Brad Wyble have started an initiative called Editors4BetterResearch. They hope to create a database of journal editors who support reproducibility and open science. Right now they are soliciting feedback on their proposal and collecting names of editors who would like to be listed in the database. Their goal is to allow authors who value open and reproducible science to find editors who share their values.

June brown bag discussion

For our June discussion we’ll be talking about information security, privacy, social media, and algorithmic culture, and how those forces are affecting our society. If you’d like to read up on any of this before the June event, here are some helpful resources:

  • Antisocial Media: How Facebook Disconnects Us and Undermines Democracy, by Siva Vaidhyanathan (excerpt from the book)
  • Robyn Caplan, Content Standards and Their Consequences (talk from ER&L 2018)
  • Siva Vaidhyanathan, Antisocial Media (talk from ER&L 2018)
  • danah boyd, The Messy Reality of Algorithmic Culture (talk from ER&L 2018)

Stay tuned for more information about this session.

Petition against new machine learning journal

A large group of scholars has signed a petition protesting the creation of a new subscription journal from Nature, Nature Machine Intelligence.

https://openaccess.engineering.oregonstate.edu/home

They argue that most journals in the field are open access, with no charges for readers or authors. The signatories have pledged not to submit to, review for, or edit for the journal. I’m not sure how large the machine learning community is, but the petition had over 2,400 signatures as of May 2nd.

Nature has stated that it respects the opinions of the petitioners but believes the new subscription journal can coexist with existing outlets by providing a venue for rigorously peer-reviewed interdisciplinary work.

Florida State cancels big deal

Florida State University has announced it is canceling its “big deal” journal package with Elsevier. The dean of university libraries, Julia Zimmerman, released a statement about the cancellation. Instead of the big deal, FSU will subscribe to a subset of the journals it needs most. The Faculty Senate voted unanimously to approve the planned cancellation.

Florida State is the most recent university to announce a decision like this. SPARC (Scholarly Publishing and Academic Resources Coalition) maintains a list of institutions that have cancelled their big deals.

French Universities say goodbye to Springer

It looks like Couperin.org, a national consortium of French academic institutions, has decided to cancel its subscriptions to Springer journals. The consortium had been negotiating with Springer for over a year but was unable to reach an agreement on price: Couperin.org was advocating for a reduction in subscription price to account for the volume of article processing charges (APCs) being paid by French authors, while Springer had proposed a price increase. Access was supposed to be cut off on April 1st, but Springer has decided to keep access open while discussions continue. You may remember that a similar standoff happened between German universities and Elsevier.

An English translation of the announcement is available here (you may have to click on the UK flag). You can also find a short news item about the cancellation from The Scientist.

Sci-Hub and LibGen in Perspective

Please join us on Wednesday, Feb. 21st at noon in PCL Learning Lab 4 to hear UT Austin iSchool graduate student, Stephen McLaughlin, speak about Sci-Hub and LibGen. There will be plenty of time for discussion, so bring your questions.

Sci-Hub and LibGen in Perspective
Over the past decade, websites offering free, unauthorized copies of books and academic articles have grown rapidly. How are they maintained and used, and what might they mean for the future of scholarly publishing?

Event Flyer

Digital scholarship office hours spring semester

We are offering digital scholarship office hours again this semester. Gilbert Borrego, Allyssa Guzman, Jessica Trelogan and Colleen Lyon will be available to answer any questions you may have about digital scholarship, Texas Data Repository, Texas ScholarWorks, research data services, or scholarly publishing.

Here are the dates – all sessions are in PCL 1.124 (one floor below the entrance level of PCL):

  • Wednesday, Jan. 24th, 10:00-12:00
  • Wednesday, Feb. 28th, 10:00-12:00
  • Wednesday, Mar. 28th, 10:00-12:00
  • Wednesday, Apr. 25th, 10:00-12:00

The office hours are open to anyone at UT Austin – we’re thinking of them as an alternative to booking a consultation. No appointment is needed; just stop in during the two-hour window and chat with us. As a bonus, we’ll have sweet treats or snacks for anyone who stops by.

We are, of course, still available for consultation at any time via email, phone, or in person. You can continue to ask questions that way if you prefer.

 

Evaluating Journal Quality

With the proliferation of new journals enabled by online publishing, it can be difficult for researchers to know if a particular journal is worth publishing in. Here are two resources that could help librarians and researchers when looking into an unfamiliar journal.

The first is the Quality Open Access Market (QOAM). QOAM enlists academics to evaluate a journal’s online presence and the experience of publishing with that journal. The journal’s website is evaluated for editorial information, peer review, governance, and workflow; this evaluation results in a Base Score Card. Authors can share their experience of publishing with the journal, which results in a Valuation Score Card. The two score cards are combined to indicate whether the journal is a strong journal, a weaker journal, an opportunity for the publisher to improve, or a threat to authors. QOAM measures the quality of a journal’s publishing services, not the quality of the research it publishes.
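To make the mechanics concrete, here is a minimal Python sketch of how the two score cards might combine into those four labels. The 1–5 scale, the cut-off, and the exact mapping are assumptions made for the example; QOAM’s published aggregation differs in its details.

    # Hypothetical sketch of combining QOAM's two score cards into a label.
    # The 1-5 scale, 3.0 cut-off, and quadrant mapping are assumptions.
    def qoam_label(base_score: float, valuation_score: float,
                   threshold: float = 3.0) -> str:
        website_ok = base_score >= threshold        # Base Score Card (website)
        authors_ok = valuation_score >= threshold   # Valuation Score Card (authors)
        if website_ok and authors_ok:
            return "strong journal"
        if website_ok:
            return "weaker journal"
        if authors_ok:
            return "opportunity for the publisher to improve"
        return "threat to authors"

    print(qoam_label(4.2, 3.8))  # -> "strong journal"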

The other journal evaluation tool is the Journal Publishing Practices and Standards (JPPS) framework. This project was started to help journals from the Global South improve their international reputation. The criteria used to assess a journal include: publication of original research, a functional editorial board, verified involvement from the editorial and advisory boards, accuracy of the description of the peer review and quality control processes, availability of author and reviewer guidelines, and display of editorial and publishing policies. Assessed journals are assigned to one of six levels: inactive title, new title, no stars, one star, two stars, or three stars.
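Purely as a sketch of the idea, the six levels could be modeled as a function from a journal’s assessment results to a label. The “criteria met” counting and the thresholds below are invented for the example and are not the JPPS rules.

    # Hypothetical sketch: assign one of the six JPPS levels.
    # Thresholds and the criteria-counting approach are invented for illustration.
    def jpps_level(is_active: bool, issues_published: int,
                   criteria_met: int, criteria_total: int = 6) -> str:
        if not is_active:
            return "inactive title"
        if issues_published < 2:                 # too new to assess fully
            return "new title"
        share = criteria_met / criteria_total
        if share >= 0.9:
            return "three stars"
        if share >= 0.6:
            return "two stars"
        if share >= 0.3:
            return "one star"
        return "no stars"

    print(jpps_level(True, 10, 5))  # -> "two stars" (5/6 of criteria met)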

These tools are not white lists or black lists. They are designed to provide some information about the transparency and quality of the publication services of a given journal. They should be used in conjunction with disciplinary knowledge, consultation with colleagues, and the author’s own professional judgment.