Workflows for processing high-throughput sequencing data for variant discovery with GATK4 and related tools

Data pre-processing for variant discovery pipeline on Azure

This repository is an example of running the data pre-processing for variant discovery pipeline, based on the Best Practices data pre-processing pipeline by the Broad Institute of MIT and Harvard, on Cromwell on Azure.

Learn more about running your Cromwell WDL workflows on Azure in the Cromwell on Azure GitHub repository.

This repository is a fork of the original and contains all the changes required to run the WDL workflow on Cromwell on Azure.

Here you can find the WDL file and an example inputs JSON file with links to data hosted in a public Azure Storage account. You can reference the "datasettestinputs" storage account directly via relative paths, as shown in the inputs JSON file.
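For illustration only: Cromwell on Azure resolves container paths of the form /&lt;storage-account&gt;/&lt;container&gt;/&lt;blob&gt; as inputs. A hypothetical fragment of an inputs JSON using the public storage account might look like the following (the input key and blob path here are placeholders, not the exact values used in this repository's inputs file):

```json
{
  "PreProcessingForVariantDiscovery_GATK4.ref_fasta": "/datasettestinputs/dataset/references/hg38/v0/Homo_sapiens_assembly38.fasta"
}
```

Consult the actual processing-for-variant-discovery-gatk4.*.wgs.inputs.json files in this repository for the authoritative input names and paths.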

The processing-for-variant-discovery-gatk4.b37.trigger.json and processing-for-variant-discovery-gatk4.hg38.trigger.json trigger files are ready to use. You can start the workflow on your instance of Cromwell on Azure, using these instructions.
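A Cromwell on Azure trigger file is a small JSON document that points the service at a workflow and its inputs. A minimal sketch, assuming the standard Cromwell on Azure trigger fields (the URLs below are illustrative placeholders, not the actual locations used by this repository's trigger files):

```json
{
  "WorkflowUrl": "https://<storage-account>.blob.core.windows.net/inputs/processing-for-variant-discovery-gatk4.wdl",
  "WorkflowInputsUrl": "https://<storage-account>.blob.core.windows.net/inputs/processing-for-variant-discovery-gatk4.hg38.wgs.inputs.json",
  "WorkflowOptionsUrl": null,
  "WorkflowDependenciesUrl": null
}
```

Uploading a trigger file like this to the "workflows/new" path of your Cromwell on Azure storage account starts a run; see the Cromwell on Azure documentation for details.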

gatk4-data-processing

Purpose:

Workflows for processing high-throughput sequencing data for variant discovery with GATK4 and related tools.

processing-for-variant-discovery-gatk4:

The processing-for-variant-discovery-gatk4 WDL pipeline implements data pre-processing according to the GATK Best Practices.

Requirements/expectations:

  • Paired-end sequencing data in unmapped BAM (uBAM) format
  • One or more read groups, one per uBAM file, all belonging to a single sample (SM)
  • Input uBAM files must additionally comply with the following requirements:
    • filenames all have the same suffix (we use ".unmapped.bam")
    • files must pass validation by ValidateSamFile
    • reads are provided in query-sorted order
    • all reads must have an RG tag
  • Reference index files must be in the same directory as source (e.g. reference.fasta.fai in the same directory as reference.fasta)
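The requirements above can be partly checked up front. A minimal sketch (a hypothetical helper, not part of the workflow) that verifies the filename-suffix requirement for a list of input uBAMs; the ValidateSamFile, query-sort, and RG-tag checks need Picard/samtools and are noted in comments only:

```python
# Hypothetical pre-flight check for input uBAM filenames.
# Other requirements are tool-based and not checked here:
#   - ValidateSamFile (Picard/GATK) must pass on each file
#   - reads must be in query-sorted order
#   - every read must carry an RG tag

SUFFIX = ".unmapped.bam"

def check_ubam_names(filenames, suffix=SUFFIX):
    """Return the filenames that do NOT end with the expected suffix."""
    return [f for f in filenames if not f.endswith(suffix)]

if __name__ == "__main__":
    inputs = ["NA12878_A.unmapped.bam", "NA12878_B.unmapped.bam", "NA12878_C.bam"]
    print(check_ubam_names(inputs))  # the third file lacks the expected suffix
```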

Outputs:

  • A clean BAM file and its index, suitable for variant discovery analyses.

Software version requirements:

  • GATK 4 or later
  • BWA 0.7.15-r1140
  • Picard 2.16.0-SNAPSHOT
  • Samtools 1.3.1 (using htslib 1.3.1)
  • Python 2.7
  • Cromwell version support
    • Successfully tested on v37
    • Does not work on versions < v23 due to output syntax

Important Note:

  • The provided JSON is meant to be a ready-to-use example template for the workflow. It is the user's responsibility to correctly set the reference and resource input variables, using the GATK tool and tutorial documentation.
  • Please visit the User Guide site for further documentation on our workflows and tools.

LICENSING:

Copyright Broad Institute, 2019 | BSD-3. This script is released under the WDL open source code license (BSD-3) (full license text at https://github.com/openwdl/wdl/blob/master/LICENSE). Note, however, that the programs it calls may be subject to different licenses. Users are responsible for checking that they are authorized to run all programs before running this script.