FAIR Computational Workflows


The FAIR principles have laid a foundation for sharing and publishing digital assets and, in particular, data. They emphasize machine accessibility and require that all digital assets be Findable, Accessible, Interoperable, and Reusable. Workflows encode the methods by which the scientific process is conducted and through which data are created. It is therefore important that workflows both support the creation of FAIR data and themselves adhere to the FAIR principles.

Goals

The working group is seeking workflow developers and users to directly inform the standards, processes and recommendations that make computational workflows FAIR.

In this working group, we aim to:

  • Define FAIR principles for computational workflows that consider the complex lifecycle from specification to execution and data products
  • Define metrics to measure the FAIRness of a workflow
  • Define recommendations for FAIR workflow developers and systems
  • Define processes to automate FAIRness in workflows by recording necessary provenance data

Working Group meetings

This working group is open to anyone interested: please feel free to join the Workflows Community Initiative or attend one of our calls. To ensure that more people can get involved in this international effort, the Australian BioCommons is collaborating with the WCI to run a second working group meeting at a time convenient for the Asia-Pacific region. Everyone is welcome, regardless of your timezone.

  • Meeting schedule:
    • Monthly, 2nd Thursday, 17:00 CET (note: daylight saving time follows the EU)
    • Monthly, 2nd Thursday, 13:30 AEDT (note: daylight saving time follows Australia)
    • Monthly, 4th Thursday, 17:00 CET (note: daylight saving time follows the EU)
  • Meeting agendas/telecon: Meeting notes
  • Slack chat: #fair-computational-workflows on workflowscommunity.slack.com (invite link)

What are FAIR Computational Workflows?

Adapted from the article FAIR Computational Workflows (https://doi.org/10.1162/dint_a_00033):

Computational workflows describe the complex multi-step methods that are used for data collection, data preparation, analytics, predictive modelling, and simulation that lead to new data products.
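
As a rough, hypothetical illustration (not taken from the article), the sketch below expresses such a multi-step method as three chained Python functions; real computational workflows are usually written in dedicated workflow systems such as CWL, Nextflow, Snakemake, or Galaxy, and the file names used here are placeholders.

    # Toy three-step workflow: collection -> preparation -> analytics.
    import json
    import pathlib

    def collect(raw_path):
        """Data collection: read newline-delimited JSON records from disk."""
        return [json.loads(line) for line in pathlib.Path(raw_path).read_text().splitlines()]

    def prepare(records):
        """Data preparation: drop records missing the field we analyse."""
        return [r for r in records if "value" in r]

    def analyse(records):
        """Analytics: a simple summary that becomes a new data product."""
        values = [r["value"] for r in records]
        return {"n": len(values), "mean": sum(values) / len(values)}

    if __name__ == "__main__":
        summary = analyse(prepare(collect("raw_records.jsonl")))  # placeholder input file
        pathlib.Path("summary.json").write_text(json.dumps(summary, indent=2))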

Workflows can inherently contribute to the FAIR data principles: by processing data according to established metadata; by creating metadata themselves during the processing of data; and by tracking and recording data provenance.

These properties aid data quality assessment and contribute to secondary data usage. Moreover, workflows are digital objects in their own right.

We argue that FAIR principles for workflows need to address their specific nature in terms of their composition of executable software steps, their provenance, and their development.
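
As a minimal sketch (again hypothetical, not from the article) of how a workflow step can create metadata and record provenance while it runs, the function below writes a small provenance record next to each output. The field names are invented for illustration; real systems typically emit standardised provenance such as W3C PROV or RO-Crate.

    # Hypothetical step runner that records provenance alongside its output.
    import datetime
    import hashlib
    import json
    import pathlib
    import platform

    def run_step(step_name, input_path, output_path, transform):
        """Run one workflow step and write a provenance record next to its output."""
        data = pathlib.Path(input_path).read_text()
        result = transform(data)
        pathlib.Path(output_path).write_text(result)

        # Record what ran, on which input, in which environment, and what it produced.
        provenance = {
            "step": step_name,
            "ended": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "input": {"path": input_path,
                      "sha256": hashlib.sha256(data.encode()).hexdigest()},
            "output": {"path": output_path,
                       "sha256": hashlib.sha256(result.encode()).hexdigest()},
            "environment": {"python": platform.python_version(),
                            "host": platform.node()},
        }
        pathlib.Path(output_path + ".prov.json").write_text(json.dumps(provenance, indent=2))

    # Example use: run_step("uppercase", "input.txt", "output.txt", str.upper)

Checksums and environment details like these are what later allow a reader, or a machine, to assess data quality and reuse the outputs with confidence.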

This group is gathering community resources and literature on FAIR Computational Workflows. Feel free to suggest a change to help improve this page!

Cite FAIR Computational Workflows

Sean Wilkinson, Meznah Aloqalaa, Khalid Belhajjame, Michael R. Crusoe, Bruno de Paula Kinoshita, Luiz Gadelha, Daniel Garijo, Ove Johan Ragnar Gustafsson, Nick Juty, Sehrish Kanwal, Farah Zaib Khan, Johannes Köster, Karsten Peters-von Gehlen, Line Pouchard, Randy K. Rannow, Stian Soiland-Reyes, Nicola Soranzo, Shoaib Sufi, Ziheng Sun, Baiba Vilne, Merridee A. Wouters, Denis Yuen, Carole Goble (2024):
Applying the FAIR Principles to Computational Workflows.
arXiv:2410.03490 [cs.DL]
https://doi.org/10.48550/arXiv.2410.03490

Carole Goble, Sarah Cohen-Boulakia, Stian Soiland-Reyes, Daniel Garijo, Yolanda Gil, Michael R. Crusoe, Kristian Peters, Daniel Schober (2020):
FAIR Computational Workflows.
Data Intelligence 2(1):108–121
https://doi.org/10.1162/dint_a_00033

Members

The FAIR Computational Workflows working group is composed of 15 members.
