Open letter to DG Research & Innovation

      To:
      Directorate-General for Research and Innovation
      attn. Mr. Jean-Eric Paquet

      Address:
      European Commission
      1049 Bruxelles/Brussel
      Belgium

      Re:
       2017/RTD/A6/PP06481/2017

      Date:
      May 25th, 2018

Dear Mr. Paquet,

Please allow me to present our response to the European Commission's Call for Tenders N° 2017/RTD/A6/PP06481/2017, a.k.a. "Open Research Europe – The European Commission Open Research Publishing Platform". According to the documentation delivered, DG Research envisions "a publishing service of the highest technical and scientific standards with wide acceptance by research institutions, authors and other funders". The platform should result in a portfolio of an estimated 5,600 peer-reviewed articles (≈10% of the approximately 56,000 articles likely to be produced with Horizon 2020 grants during the remaining period), and in an unknown number of preprints.
      Any original scientific article based on research fully or partially funded by Horizon 2020 is supposed to be eligible. The whole effort is specified to be based on currently operational publishing infrastructure, but (parts of) the underlying technology should be released as open source. Furthermore, the service should be "portable": after running the service for four years, another host organisation should be able to take over the whole effort.

We have talked to a number of actors and stakeholders and believe that there is a disconnect between the intended outcome of this tender, the approach chosen and the ambitions Europe has (or needs to have) for the long-term development of science and research. The tender description is written from a rather limited perspective, closely confined within the current (problematic) situation, and anyone following this recipe is unlikely to reach the actually desired state of affairs. The hurried time schedule is based on an urgency that does not actually exist: if it were not for the requirements of the European Commission itself, one could ask whether even the authors would see the need for most of these articles to be independently published as significant standalone contributions to science.
      The admirable level of detail with which the procured approach is written down stands in some contrast to the lack of attention to the much more pressing and much larger needs of the science of today, let alone the science of tomorrow. Building a "publishing service of the highest technical and scientific standards with wide acceptance by research institutions, authors and other funders" is not something to be rushed into. We are convinced the stakes are even higher than the ambition laid out in the tender. We should not artificially limit the playing field to a small set of currently operational actors in a very specific domain, challenged to solve the wrong problems in the wrong constellation at very short notice. There are actually many good reasons to first try to understand and then redesign the larger role of publishing in science, given how different the circumstances are from the situation in which the original tradition of publishing scientific articles originated. The operational practices established in 1665, when the 'Royal Society of London for Improving Natural Knowledge' published the first issue of Philosophical Transactions, may have shaped science across the world since, but their overall benefit is no longer uncontested after three and a half centuries.

Especially in the last few decades the speed, scope, depth and complexity of scientific discovery have rapidly increased, and so has the importance of related activities outside of traditional academic and institutional venues. Globalisation and the rise of new economies have meant even more acceleration. While the classical form of the article has been retained as the dominant form of discourse, the scientific ecosystem as a whole has trouble dealing with this, and there seems to be a clear relation to how the publication paradigm has worked out in these new circumstances.

The key role of articles is in many cases no longer to merely inform interested readers of the latest scientific discoveries. Publishing articles has gained such systemic importance at the institutional and funding body level that getting enough publications into suitable venues has become a dominant factor in the careers and lives of today's scientists and researchers. An article is a 'deliverable' promised prior to the start of the research, and it has to be delivered in order for an organisation to get its funding, whether or not the publication is interesting.
      This bureaucratic pressure to deliver articles as "reports" is well illustrated by the notion of "publish or perish". The benefit to the general audience is in many cases a mere by-product of this internal 'institutional' necessity of writing papers. The publishing industry has capitalised significantly on this dependency. Over time, a multibillion euro industry evolved around the publication and exclusive exploitation of scientific articles. The excesses of this industry have finally led to the rise of the open access paradigm.

However, solving the exploitation of the institutional and individual dependency on publications is only a small part of the existential crisis of modern science. Under the increased pressure of upscaling, even more fundamental weaknesses of the traditional paradigms have started to become apparent. In some areas the economic stakes for external stakeholders like pharmaceutical companies are immense, leading to ghost authorship and other malpractices. There have been cases where individual researchers managed to be credited with publishing literally hundreds of scientific articles a year, proving how creative people can be in satisfying the requirements of funding bodies.
      This unfortunately coincides with the phenomenon of the reproducibility crisis. It is estimated that more than two-thirds of researchers have tried and failed to reproduce another scientist's published experiments. The reproducibility crisis is directly linked to the ongoing pressure on scientists to continually present breakthrough findings in articles in well-known journals as 'proof' of their quality. There is a push by editors, funding bodies and institutions for eye-catching novelties and 'innovation' in their publications. This creates a dominant disincentive against confirmatory work, which is essential to the long-term validity and hygiene of any research area. The significant career effects of such publications and the disproportionate amount of attention they receive also create a perverse incentive to publish (only) the most interesting, exciting, unexpected and news-worthy results. This means conveniently leaving other results behind, or not even bothering to undertake the work.
      Peer review has become a significant burden in itself, a thankless and costly task in terms of human effort. With the amount of scientific activity up several orders of magnitude and the "bureaucratisation" of scientific publishing described above, the principle of scientific priority turns out to be rather unsatisfactory as the main ordering principle of the body of knowledge of science. Even in a limited area of science, there is often more work being published than any individual could dive into or keep up with, even with the most unhealthy work ethic and unlimited resources. The constant flow of new articles, each pointing to ever more articles, may in theory provide something akin to a public 'ledger' of scientific claims of breakthroughs over time. A continuous flood of articles, however, does not result in adequate oversight, nor does it deliver a consolidated view of efforts, overlaps and contradictions.

The resulting stack of documents is hostile to real-world understanding and, more importantly, rather unsafe to rely on. Which of the millions of scientific claims made in the past, in articles referenced by other articles, were partially debunked soon after publication, possibly endangering the conclusions of a given article and all of its descendants? How many of them were never properly looked at in depth? Counting the total number of citations and references to an article does not make it possible to distinguish between the articles that prove results right and those that prove them (partially) wrong, nor does it say anything about the quality of their approaches.

The current publishing paradigm delivers a view through the rear-view mirror. Each of the documents referenced in a particular paper leaves the reader with the huge responsibility of reading up on an avalanche of documents published after each of those citations. Failure to do so may hide underlying assumptions that have since been proven wrong, computational methods that contained errors, or evidence that the original authors had been selective or biased. In combination with the reproducibility crisis mentioned before, this clearly poses a huge challenge, and one that really needs to be solved by altering the publication paradigm and changing the dynamics of the system.

We share your belief that a European scientific open publishing outlet is a great and potentially very important idea, but only if the collective solution we build as Europe recognises the actual challenges of today's science and capitalises on technological possibilities. Only that combination will allow it to disrupt the global state of affairs. It is clear that the current status quo of commercial publishing has a troubling relationship with science as a whole. If the European Commission wants to be a leading funder, a worthwhile ambition that can easily be read between the lines of the document, the me-too approach proposed is not the right one. In fact, it can be argued that the very principle of a tender goes against the collaborative approach which seems most suited for this type of challenge.

We are convinced there are significantly more worthwhile alternative options that would better serve the public interest and the purposes of the European Commission, and which would be realistically achievable within the same budget range. A 'blockbuster' tender in a rush to become operational is simply too blunt a mechanism, and we believe it is somewhat unsuitable for the actual innovation strategy behind it. To satisfy the immediate needs of H2020 and FP9, provisioning an existing open source publishing platform such as Editoria, combined with a light-weight repository infrastructure based on open source software such as CERN's Invenio, could make all of these publications available as such. Even at the usage load specified in the tender, approaching these publications as grey literature would cost less than 1% of the budget made available to the larger effort. And as specified, this would allow all articles to be accessed at no cost by anyone around the world. The success of grass-roots efforts like Sci-Hub proves that users can find their own way from there on.

We suggest leaving the remainder of the many 'conventional' requirements in the current document as they are, and allocating the rest of the budget to efforts that contribute more to the vision behind it. In the long run, a cascading set of texts (articles and letters) citing other texts will hardly suffice to maintain and curate global bodies of knowledge. While technologies such as Artificial Intelligence may become useful in the long run to support the maintenance of large knowledge graphs derived from papers as well as raw scientific content, finding more efficient ways of continuous human curation of knowledge will become an essential precondition to research. And while peer review prior to publication is of clear benefit, there is no need to limit peer review to the period before publication. The technology is clearly there to do a posteriori review, and even to provide updated versions of the published text based on such comments.

There is a unique opportunity to embark on a more open and inclusive approach that invites good ideas and talent from the scientific and engineering community, from the publishing community, from the free/libre open source community, from the archiving and digital heritage community, from social sciences, scientometrics and civil society. Through open calls you can enable individual researchers and developers, as well as small teams, to research and develop important new ideas that contribute to the establishment of the Next Generation Science infrastructure, and to investigate critical ideas on how to improve the scientific machinery. One could extend currently available open source infrastructure and tools to enable any individual or community to produce, review/audit/curate and maintain articles and to link them with related artefacts such as data sets and software. Or cultivate knowledge mining infrastructure which will significantly lower the threshold of discovering relevant content. Or create tools that integrate publishing with transparent and reproducible software environments for science, contributing to a more convenient and efficient vetting of outcomes.

Such an open approach, driven not by profit but by science and society, is in our belief vastly superior to merely tasking a single provider, which would likely end up monopolising aspects of the operational tasks involved. We would be happy to help you out with this in any way possible, building on the decades of experience we have in funding such work. The vision of open science is essential for the open information society our public benefit organisation stands for; this is a major challenge for society that no single organisation can solve. The open source methodology requires a different mindset than the traditional tender, but it is the only way to move out of the current predicament. The real challenge ahead of all of us is to converge the work of the global scientific community towards a distributed knowledge paradigm ready to take on the many challenges of the next millennium.

Kind regards,

Michiel Leenaars
Director of Strategy
NLnet foundation