The Richest Return of Wisdom
By Brian S. Cole
The real lesson I’ve gleaned from my time pursuing a PhD in biomedical research hasn’t been the research itself; indeed, many of my colleagues and I came into the program already equipped with extensive bench experience. The real eye-opener has been how science is communicated. When I was an undergraduate, assiduously repeating PCR after PCR that quietly and dutifully failed to put bands on a gel, I just assumed that experiments always worked in the well-funded, well-respected, well-published labs that wrote the papers we read in school. As an undergraduate, I had implicit trust in scientific publications; at the end of my PhD, I have implicit skepticism. It turns out I’m not alone.
The open access movement has taken on a new tone in the past year: growing recognition of the irreplicability of published findings[1] and the alarming prevalence of scientific misconduct[2] in highly cited journals has led to questioning of the closed review process. Such a process denies the public access to reviewers’ comments on the work, as well as to the editorial correspondence and decision process. The reality of the publication industry is selling ads and subscriptions, and it is likely that editors often override scientific input from peer reviewers when it throws a sexy new manuscript into question. The problem is that the public doesn’t get access to the review process, and closed peer review is tantamount to no peer review at all as far as accountability is concerned.
For these reasons, our current scientific publication platform has two large-scale negative consequences: the first economic, the second epistemic. First, intellectual property rights for publicly funded research are routinely transferred to nonpublic entities that then exploit those rights for profit. Second, there is insufficient interactivity within the scientific community and with the public, a result of the silo effect of proprietary journals. The open access revolution is gaining momentum on the gravity of these issues, but to date, open access journals and publishers have largely conformed to the existing model of journals and isolated manuscripts. While open access journals have enabled public access to scientific publications, they fail to provide the direly needed interactivity that the internet makes possible.
In the background of the open access revolution in science, a 70-year-old idea[3] about a new system for disseminating scientific publications was realized two decades ago in a publicly licensed code stack[4] that allows not just open review, but distributed and continuous open review with real-time version control and hypertext interlinking: not just citations, but links to the actual source. Imagine being able to publish a paper that anybody can review, suggest edits to, add links to, and discuss publicly, with every step of that ongoing process versioned and stored. If another researcher repeats your experiment, they can contribute their data. If you extend or strengthen the message of your paper with a future experiment, that can also be appended. Such a platform would utterly transform scientific publication from a series of soliloquies into an evolving cloud of interlinked ideas. We have had that technology for an alarmingly long time, given its lack of adoption by researchers who continue to grant highly cited journals ownership of work the public has already paid for.
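To make the “versioned and stored” point concrete, here is a minimal sketch in Python of how the complete edit trail of any page on a MediaWiki-backed site can be pulled through MediaWiki’s standard revisions API. The endpoint shown is Wikipedia’s public api.php, and the page title is only an example; a publication wiki built on the same stack would expose the same history for every paper, review, and appended dataset.

```python
# Minimal sketch: fetch a page's revision history (who edited it, when, and why)
# from any MediaWiki installation via the stock revisions API.
import requests

API = "https://en.wikipedia.org/w/api.php"  # every MediaWiki site exposes api.php

def revision_history(title, limit=10):
    """Return the most recent revisions of a wiki page, newest first."""
    params = {
        "action": "query",
        "prop": "revisions",
        "titles": title,
        "rvprop": "ids|timestamp|user|comment|size",
        "rvlimit": limit,
        "format": "json",
    }
    data = requests.get(API, params=params).json()
    page = next(iter(data["query"]["pages"].values()))
    return page.get("revisions", [])

if __name__ == "__main__":
    # Example page title; a publication wiki would use an article's title here.
    for rev in revision_history("Peer review"):
        print(rev["timestamp"], rev["user"], "-", rev.get("comment", ""))
```

Every entry carries an editor, a timestamp, and an edit summary, which is exactly the public, permanent record of review and revision that closed editorial correspondence hides.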
I’ve kicked around the idea of a WikiScience[5] publication system for a long time with a lot of scientists, and the concerns that came up were cogent and constructive. A testament to the tractability of a wiki replacement for our system of scientific publication is Wikipedia, one of the greatest gifts to humankind ever to grace the world wide web. The distributed review and discussion system that makes Wikipedia evolve does work, and most of us are old enough to remember a time when nobody thought it would.
But how can we assess impact and retain attribution in a distributed publication and review system such as a wiki? Journal impact factor and article-level metrics wouldn’t directly apply to a community-edited, community-reviewed scientific resource. Attribution and impact assessment are important challenges for any system that aims to replace our journal-and-manuscript method of disseminating scientific information. While a distributed scientific information system would not fit easily into the current metrics of publication impact, which are an intimate part of the funding, hiring, and promotion processes in academia, considering such a system presents an opportunity to explore innovative analyses of the relevance and impact of scientific research. Indeed, rethinking the evaluation of scientists and their work[6] is a pressing need even within the current publication system.
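Purely as a toy illustration of the kind of article-level analysis such a system would invite, and not a proposal for an actual metric, the sketch below tallies each contributor’s share of the net text added across a revision log of the sort returned by the query above. The field names and the example log are hypothetical.

```python
# Toy illustration only: estimate each editor's share of a page's growth by
# net bytes added per revision, given a revision log ordered newest-first.
from collections import defaultdict

def contribution_shares(revisions):
    """Map each editor to their fraction of net bytes added across revisions."""
    ordered = list(reversed(revisions))  # replay the history oldest-first
    added = defaultdict(int)
    previous_size = 0
    for rev in ordered:
        delta = rev["size"] - previous_size
        if delta > 0:
            added[rev["user"]] += delta
        previous_size = rev["size"]
    total = sum(added.values()) or 1
    return {user: round(bytes_added / total, 3) for user, bytes_added in added.items()}

if __name__ == "__main__":
    # Hypothetical revision log, newest first, as the MediaWiki API returns it.
    history = [
        {"user": "reviewer_b", "size": 5400},
        {"user": "author_a", "size": 5100},
        {"user": "author_a", "size": 4000},
    ]
    print(contribution_shares(history))  # {'author_a': 0.944, 'reviewer_b': 0.056}
```

Any real attribution scheme would need to be far more careful than byte counts, but the point stands: in a versioned, distributed system the raw material for such analyses is already public.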
We should be thinking about the benefit of the networked consciousness of online collectivism, not the startling failures of our current publication system to put scientific communication into the hands of the public that enabled it, or even the challenges of preserving integrity and attribution in a commons-based peer production system.[7] We are the generation that grew up with Napster and 4chan, the information generation, the click-on-it-and-it’s-mine generation, born into a world of unimaginable technological wealth. Surely we can do better than paywalls, closed peer review, and for-profit publishers. We owe it to everybody: as Emerson put it, “He who has put forth his total strength in fit actions, has the richest return of wisdom.”[8]
This article accompanies a feature piece about scientific publishing in the digital era and also appeared in the Penn Science Policy Group January 2015 newsletter.
[1] Ioannidis, John P. A. "Why Most Published Research Findings Are False." PLoS Medicine 2.8 (2005): e124.
[2] Stern, Andrew M., Arturo Casadevall, R. Grant Steen, and Ferric C. Fang. "Research: Financial Costs and Personal Consequences of Research Misconduct Resulting in Retracted Publications." eLife 3 (2014).
[3] Bush, Vannevar. "As We May Think." The Atlantic, 1 July 1945.
[4] "MediaWiki 1.24." MediaWiki. <http://www.mediawiki.org/wiki/MediaWiki_1.24>.
[5] "WikiScience." Wikimedia Meta-Wiki. <http://meta.wikimedia.org/wiki/WikiScience>.
[6] "San Francisco Declaration on Research Assessment." American Society for Cell Biology. <http://www.ascb.org/dora/>.
[7] "Chapter 3: Peer Production and Sharing." The Wealth of Networks. <http://cyber.law.harvard.edu/wealth_of_networks/3._Peer_Production_and_Sharing>.
[8] Emerson, Ralph W. "The American Scholar." <http://www.emersoncentral.com/amscholar.htm>.