Bioinformatics pipelines are fragile ecosystems of software tools, scripts, and dependencies. This complexity commonly makes them hard to maintain, extend, or use outside their original development environment. Nextflow is a workflow framework and a domain-specific programming language which offers an alternative, and arguably superior, approach to developing, executing, and sharing pipelines. Nextflow provides seamless integration with code repositories and container image registries, and out-of-the-box support for various HPC schedulers and cloud compute services.
In this workshop you will learn about the building blocks of Nextflow and how to convert a simple pipeline from a set of shell scripts to a Nextflow workflow. You will also learn how to separate the pipeline logic from compute and software environment configuration to get you on your way towards deploying your workflows across clusters and clouds.
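To give a flavour of the conversion covered in the workshop, a shell command such as `fastqc sample.fastq.gz` might be wrapped as a Nextflow process along these lines (the tool, container image, and parameter names are illustrative, not part of the workshop material):

```nextflow
#!/usr/bin/env nextflow

// A hypothetical process wrapping a single shell step.
// The container directive lets the same pipeline run unchanged
// on a laptop, an HPC cluster, or in the cloud.
process FASTQC {
    container 'quay.io/biocontainers/fastqc:0.12.1--hdfd78af_0'

    input:
    path reads

    output:
    path "*_fastqc.zip"

    script:
    """
    fastqc ${reads}
    """
}

workflow {
    // params.reads would be supplied at run time, e.g. --reads 'data/*.fastq.gz'
    reads_ch = Channel.fromPath(params.reads)
    FASTQC(reads_ch)
}
```

Compute resources and software environments are then declared in separate configuration files rather than in the workflow logic itself, which is what makes the same pipeline portable across schedulers and clouds.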
Keywords: Nextflow, HPC, cloud, workflow, pipeline
Relevance: Relevant to those who script bioinformatics analyses and wish to make them more reproducible, reusable, and scalable in HPC and/or cloud contexts.
Research Scientist, CSIRO Aginformatics
Rad studied Computing Sciences at the University of East Anglia, Norwich, UK, where he also completed his PhD on algorithms and mathematical properties of phylogenetic trees and networks. For his postdoc he moved to the University of Adelaide, where he developed software tools and analysis pipelines for genomics and transcriptomics of bread wheat. He is now a Research Scientist in CSIRO's Aginformatics group, where he applies and develops tools and practices which facilitate reproducibility, reusability, portability, and scalability of analytical workflows.