The currency of science is (duh!) publication. You do the work, you write a paper. Acceptance advances a researcher’s career in every way that counts: tenure, promotion, reputation, and funding.
But what if your research output isn’t a research publication? For programmers who labor in the trenches to advance their own and their colleagues’ science, that’s not a hypothetical question. The output of their research is code. And hard as it is to solve problems computationally, it’s even harder to do it well, requiring considerable testing, debugging, and documentation.
In the world of computational science, says Arfon Smith, who heads the Data Science Mission Office at the Space Telescope Science Institute in Baltimore, Maryland, creating, documenting, and releasing a polished bit of code effectively is a publication. But unless that software is described in a traditional research publication and cited, its authors will rarely receive their due academic credit.
As it turns out, few journals publish papers that simply document software. Usually, they require that the software be folded into a larger paper that uses it to drive some scientific advance.
That situation struck Smith, who used to head scientific efforts at GitHub, as inherently absurd. So, he decided to do something about it. In May 2016, he launched the Journal of Open Source Software. On 27 March, the journal was designated an affiliate of the Open Source Initiative, an open-source software advocacy group.
As described in a blog post announcing its formation, JOSS is a “developer-friendly journal for research software packages.” Article preparation should take no more than an hour, Smith says, with articles amounting effectively to abstracts and pointers to the online homes of the software they describe.
The journal is a “hack,” Smith writes (meaning, in programmer lingo, a hasty workaround). But it’s also legit: the journal has an ISSN, articles are assigned citable DOIs, there is a formal (and open) peer-review process, and the editorial board is top-flight.
“They pass the ‘sniff test’,” he says of JOSS articles, “but they are deliberately short. Because we think if you have gone through the effort of producing high-quality software, you shouldn’t have to go through the effort of creating the paper to announce it.”
The requirements for publication are few: Software must serve a research application, be available on a public repository such as GitHub or Bitbucket, have an open-source license, and include a short file containing the article title, summary, authors, affiliations, and references. Accepted packages must be archived on sites such as Zenodo or Figshare.
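That short metadata file is a Markdown document with a YAML header. A sketch of roughly what one might look like follows; the field names and values here are illustrative, not necessarily the journal’s exact schema:

```markdown
---
title: 'mypackage: Tools for analyzing widget data'
tags:
  - Python
  - data analysis
authors:
  - name: Jane Researcher
    affiliation: 1
affiliations:
  - name: Example University
    index: 1
date: 1 June 2017
bibliography: paper.bib
---

# Summary

A few paragraphs describing what the software does and the
research applications it serves, with references.
```

The summary, not a full methods section, is the paper: the software repository itself carries the substance.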
Once a paper is submitted, peer reviewers verify that the software is freely available, that it installs correctly, and that it functions as advertised. They also check that the software abides by the open-source ethos: that there are mechanisms for researchers to comment, post queries, and suggest improvements, for instance. And they ensure the software is well documented. (Researchers can also cross-submit articles that have already been accepted at rOpenSci; four or five authors have done so, says Smith.)
Because the publishing team is all-volunteer, considerable effort was put into automating the editorial process, Smith says. For instance, paper submissions through the JOSS web site automagically produce an “issue” on the journal’s GitHub repository, where the review process occurs in plain view. The team also built a GitHub bot called “@whedon” (named for screenwriter/director Joss Whedon) that performs tasks such as assigning editors and reviewers and updating article status.
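In outline, the automation Smith describes could look something like the following Python sketch, which turns a submission’s metadata into a GitHub review issue via the REST API. The helper functions, field names, and example values here are hypothetical; the real @whedon bot’s internals may well differ.

```python
import json
from urllib import request

def build_review_issue(paper):
    """Build the payload for a GitHub issue that opens a public review.

    `paper` is a dict of submission metadata (hypothetical fields).
    """
    body = (
        f"**Submitting author:** {paper['author']}\n"
        f"**Repository:** {paper['repo']}\n\n"
        "Reviewers: please confirm the software installs, runs as "
        "documented, and carries an open-source license."
    )
    return {
        "title": f"[REVIEW]: {paper['title']}",
        "body": body,
        "labels": ["review"],
    }

def open_issue(token, payload, repo="openjournals/joss-reviews"):
    """POST the issue to GitHub's REST API (needs a personal access token)."""
    req = request.Request(
        f"https://api.github.com/repos/{repo}/issues",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"token {token}",
            "Accept": "application/vnd.github+json",
        },
    )
    return request.urlopen(req)

# Build (but don't send) a payload for an imaginary submission.
payload = build_review_issue({
    "title": "mypackage: analysis tools",
    "author": "@someuser",
    "repo": "https://github.com/someuser/mypackage",
})
print(payload["title"])  # → [REVIEW]: mypackage: analysis tools
```

Because the review lives in an ordinary GitHub issue, reviewers and authors interact with it using the same tools they already use for software development.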
JOSS has received 126 submissions since mid-May 2016; 87 have been accepted and 38 are under review. No papers have yet been rejected, Smith says, though a few have been withdrawn because they did not describe research applications. Others were too under-developed to publish; these, too, were not rejected outright, he says, but put on hold.
“Software is a living entity; it should be enhanced and improved,” Smith says. “So, it’s not a problem to keep things open a long time.”
Smith admits JOSS is not a long-term solution but a temporary workaround to a system that really doesn’t work for computational scientists. In his blog, he calls it “a necessary hack for a crappy metrics system.” Ultimately, science must evolve beyond citation metrics and other traditional measures of academic success, he writes, but “that’s the long-term fix.” In the meantime, there’s JOSS.
Jeffrey Perkel is Technology Editor at Nature.