The Cofactor Journal Selector Tool has been developed to help researchers find an appropriate journal for their paper. It allows researchers to select options for subject, peer review process, open access availability, speed of publication, and a few additional miscellaneous categories. After answering the questions (a process that takes just a minute or two), the researcher receives a short list of journals that meet their criteria.
Here’s a short article that gives an overview of the product: http://cofactorscience.com/blog/introducing-the-cofactor-journal-selector-tool
And here’s a link to the tool: http://cofactorscience.com/journal-selector
Liverpool University Press has announced a new OA publishing platform for scholars in the modern languages. The platform is called Modern Languages Open and currently has sections for Chinese/Asian languages, French and Francophone, German Studies, Hispanic Studies, Italian, Portuguese and Lusophone, and Russian and Eastern European Languages.
Access Modern Languages Open: http://www.modernlanguagesopen.org/index.php/mlo/issue/current
Our June brown bag discussion focused on altmetrics. Two tools on offer that could integrate with our UT Digital Repository are PlumX and Altmetric. Both of these tools provide information on downloads, saves, and social media mentions as part of the larger picture of research impact. They are also both subscription-based tools, and in a time of shrinking budgets it may be difficult to add one of them to our toolkit. One person suggested trying to get administration buy-in for paying for a tool like this.
Many participants were concerned about the lack of assessment information for altmetrics tools – if we don’t know how and what they are measuring, it’s difficult to evaluate their effectiveness. For instance, do any of the tools differentiate between something tweeted to 1 million followers and something tweeted to 10 followers? Is there any way of distinguishing scholarly tweets from popular tweets, and should that matter? And are Zotero and EndNote included in altmetrics, given that saving an article or book to a reference manager is how many scholars flag it for future use?
Altmetrics are a quantitative measure, just like traditional bibliometrics. Using both quantitative and qualitative measures to evaluate scholarship provides a much richer picture of a scholar’s work, but quantitative metrics are frequently used alone. The altmetrics tools also don’t really address citation tracking, which is a large part of the scholarly communication cycle. NSF and NIH have both widened the definition of what can be considered a research output, so metrics could be collected for non-traditional kinds of publications, such as data sets.
A final major issue brought up by participants was the lack of awareness about altmetrics among faculty and students. When thinking about education opportunities surrounding altmetrics, there was a desire to make sure those on tenure and promotion committees are aware of these tools, what they can measure, and what the measurements mean. Word of mouth was one option presented for getting information out about altmetrics; identifying early adopters was also put forth as an outreach strategy. The hardest part is catching faculty’s attention before the last minute. We’d like to be able to provide information about altmetrics before faculty are compiling their tenure packages, while there is still time to incorporate them in a meaningful way. Catching faculty interest is a topic of ongoing discussion – emails frequently get deleted before they are read, events on campus are usually not well attended, and flyers and brochures get limited attention. One faculty member candidly told us that going door-to-door is the only way to get faculty attention. One way around this is to approach graduate students, who may have more incentive to find ways to stand out among their peers and who may be more familiar with the social media that altmetrics tools draw on. Conducting a survey or using focus groups to elicit faculty and student opinions were also mentioned as ways of moving forward.
In the end, everyone agreed this is an issue that merits further attention. At least one library class this fall is likely to incorporate altmetrics in some way, and there was interest in holding a train-the-trainer event for library staff.
The American Society of Civil Engineers has hired a firm called Digimarc to police the uploading of publisher PDF versions of its articles to personal and university websites. Take-down notices have gone out to many universities around the world. For more information:
Information about these take-down notices from the University of California:
While it’s sad to see a publisher attacking the very people who keep its journals in business, this is a perfect example of why faculty and researchers should be aware of their rights when publishing any of their work.
Once again, someone has uncovered a scam by unsavory publishers. The Ottawa Citizen recently published an article by Tom Spears about a sham article that was accepted by several fake publishers within 48 hours of being submitted. These journals claimed that articles went out for peer review, but there is no way peer review could be completed in such a short time – which is why I call the publishers fake. And of course, after accepting the bogus article for publication, they required a payment (usually less than the charges at reputable publishers, but still a scam) to make the article available online.
The article that Tom submitted was a mishmash of sentences cut and pasted from geology and hematology papers, with references drawn from a wine chemistry article. In short, the article made no sense at all – the abstract contained the obviously invented phrase ‘seismic platelets’ – and it was almost entirely copied from other articles. No reputable journal should have even sent it out for peer review.
These fake publishers make it difficult for already overworked faculty to evaluate the articles they find, potential publication venues, and the work of their peers. Jeffrey Beall, a librarian in Colorado, maintains a list of these types of publishers that faculty can refer to, but I wonder whether there is more that libraries could be doing in this area to help the faculty and students we work with.