from 30 November 2017 to 1 December 2017
The Square Meeting Centre
Europe/Brussels timezone
Connecting the building blocks for Open Science

Contribution type: Posters

Peer-review of the research flow

Speakers

  • Alessia BARDI
  • Paolo MANGHI


Description

An increasing number of researchers use ICT tools for the production and processing of research outcomes. Over the last decade, research infrastructures (organizational and technological facilities supporting research activities) have been investing in “e-infrastructures” that leverage ICT tools, services, guidelines and policies to support the digital practices of their communities of researchers. E-infrastructures are the place where researchers can define the boundaries of their digital laboratories, i.e. the subset of assets they use to run an experiment. Researchers run their digital experiments (e.g. simulations, data analysis) taking advantage of the digital laboratory assets and generate new research data and computational products (e.g. software, R algorithms, computational workflows) that can be shared with other researchers in the same community, to be discovered, accessed and reused.

The role of digital laboratories is therefore twofold: on the one hand, they support researchers in the advancement of science, offering the facilities needed for their daily activities; on the other hand, they foster the dissemination of research output within the research community, supporting discovery, access, sharing, and reuse of digital research products, including intermediate results of a research flow. These features are fundamental for an effective implementation of the Open Science paradigm. Digital laboratories set the conditions for novel peer-review methodologies, as well as scientific reward policies, which assess the research flow not only on the basis of the scientific article that describes the final results, but also include the other (intermediate) research products (data, software, workflows, negative results), so that science can be transparently and objectively assessed, possibly using machine-assisted processes.
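The notion of a research flow with intermediate products can be sketched as a minimal data model. The sketch below is purely illustrative (none of these class or field names come from an existing system): each experiment in the flow consumes assets and yields products, data, software, workflows, or even negative results, all of which remain visible for sharing and assessment.

```python
from dataclasses import dataclass, field
from typing import List

# Hypothetical, illustrative model of a research flow: names and
# structure are assumptions for the sake of the example, not an
# existing e-infrastructure API.

@dataclass
class Product:
    name: str
    kind: str  # e.g. "data", "software", "workflow", "negative-result"

@dataclass
class Experiment:
    name: str
    inputs: List[Product] = field(default_factory=list)
    outputs: List[Product] = field(default_factory=list)

@dataclass
class ResearchFlow:
    experiments: List[Experiment] = field(default_factory=list)

    def all_products(self) -> List[Product]:
        # Every intermediate and final product of the flow, in order.
        return [p for e in self.experiments for p in e.outputs]

# Example: a two-step flow whose intermediate dataset feeds an analysis.
raw = Product("raw-readings", "data")
cleaned = Product("cleaned-dataset", "data")
script = Product("analysis.R", "software")
flow = ResearchFlow([
    Experiment("cleaning", inputs=[raw], outputs=[cleaned]),
    Experiment("analysis", inputs=[cleaned, script],
               outputs=[Product("results-table", "data")]),
])
```

Under such a model, the intermediate dataset is a first-class product of the flow, discoverable and assessable on the same footing as the final results table.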
The presentation will describe our vision of a “research flow peer review”, identifying current solutions and open challenges and proposing future directions. The implementation of a full-fledged research flow peer-review methodology imposes requirements (tools and practices) that differ from those identified for reproducibility in Open Science. Reproducibility of science and its underlying principles are crucial to support transparent peer review, but existing practices are not enough to fully address research flow peer review. To support this kind of peer review, reviewers should evaluate science by means of a user-friendly environment that transparently relies on the underlying digital laboratory assets, hides their ICT complexity, and gives guarantees of repeatability and reproducibility recognised by the community.

We propose ideas towards the definition of a general framework for the representation of a research flow peer review, which could then be tailored to a given discipline of science. Such a framework could serve as scaffolding for new discipline-specific tools, which in turn should act as “real-time hooks” into the underlying digital laboratory (where scientists carry out their research flow), collecting the data and information needed for peer review. Such tools should abstract over the complexity of the specific research activity and offer user-friendly dashboards to examine the scientific process adopted, explore the ongoing research flow, and evaluate its intermediate experiments and related products.
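The “real-time hook” idea above can be sketched in a few lines. The sketch is a hypothetical illustration (class names, verdicts, and the callback signature are assumptions, not a proposed standard): as each step of the flow completes, the laboratory notifies a hook, which runs a machine-assisted repeatability check and records a verdict that a reviewer dashboard could later inspect.

```python
from dataclasses import dataclass
from typing import Callable, List

# Hypothetical sketch of a review hook into a digital laboratory.
# All names here are illustrative assumptions for this example.

@dataclass
class ReviewRecord:
    step: str
    verdict: str  # e.g. "repeatable" or "not-repeatable"

class ReviewHook:
    """Collects machine-assisted assessments as the flow progresses."""

    def __init__(self) -> None:
        self.records: List[ReviewRecord] = []

    def on_step_completed(self, step: str,
                          repeat: Callable[[], bool]) -> None:
        # Machine-assisted check: re-run the step and see whether it
        # yields the same outcome, then record the verdict.
        ok = repeat()
        self.records.append(
            ReviewRecord(step, "repeatable" if ok else "not-repeatable"))

# Example: the laboratory reports two completed steps; the second
# re-run diverges, so it is flagged for the human reviewer.
hook = ReviewHook()
hook.on_step_completed("cleaning", lambda: True)
hook.on_step_completed("analysis", lambda: False)
```

The point of the sketch is the division of labour: the hook gathers objective, repeatable evidence step by step, while the dashboard built on top of it lets the reviewer explore the ongoing flow without touching the underlying ICT machinery.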