alias ataxia='singularity exec -B /mnt/enigma:/mnt /mnt/ubuntu/ENIGMA-subcortical-volumes-ataxia/build/enigma-neuro.img run_pipeline'
```
to save on typing, then you can just do
```
ataxia recon
```
etc.
Options to `run_pipeline` are explained below.
The `datapath` should contain `input` and `output` directories. A `figures` directory will also be created.
The `licensepath` should contain a `license.txt` file with a FreeSurfer license (we are investigating including a license with the container).
On M3 you could use the licensepath `/usr/local/freesurfer/20160922/`.
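For concreteness, a minimal setup sketch; the `/mnt/data` datapath and the subject filename are hypothetical placeholders, and the license path shown is the M3 one mentioned above:
```
# Hypothetical datapath -- substitute your own location.
datapath=/mnt/data
licensepath=/usr/local/freesurfer/20160922

# The pipeline expects 'input' and 'output' inside the datapath;
# 'figures' will be created for you.
mkdir -p "$datapath/input" "$datapath/output"

# Put your T1 scans into the input directory as .nii.gz files.
cp /path/to/subject01_T1.nii.gz "$datapath/input/"

# The licensepath must contain a FreeSurfer license.txt.
ls "$licensepath/license.txt"
```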
Using run_pipeline
------------------
`run_pipeline` is a simple Python script to handle the various steps of the pipeline.
You can use `run_pipeline --help` or `run_pipeline recon --help` for more detail.
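With the `ataxia` alias from the top of this file, the same help is available through the shortened form:
```
ataxia --help
ataxia recon --help
```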
There are three steps to the pipeline; currently each is executed individually.
`run_pipeline recon` will look for `*.nii.gz` files in your input directory and process them with `recon-all`. It is smart enough that you can rerun it multiple times on different computers without overwriting things (suitable for processing on an HPC cluster). You can also modify it with `--oneonly` if you want it to process only one subject and exit (suitable for HPC systems where you need to provide an accurate estimate of walltime). The other option of note is `--retry`, for when recon fails for some reason (note that you should remove any output generated by `recon-all` from your directory first, or `recon-all` will fail again).
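A sketch of how `--oneonly` could be used on an HPC system; the SLURM directives, walltime, and memory values are illustrative assumptions, not part of the pipeline, so adapt them to your cluster:
```
#!/bin/bash
# Hypothetical batch script -- scheduler directives below are placeholders.
#SBATCH --job-name=ataxia-recon
#SBATCH --time=12:00:00
#SBATCH --mem=8G
#SBATCH --cpus-per-task=1

# Wrapper equivalent to the 'ataxia' alias (aliases do not expand in
# non-interactive scripts, so a function is used instead).
ataxia() {
    singularity exec -B /mnt/enigma:/mnt \
        /mnt/ubuntu/ENIGMA-subcortical-volumes-ataxia/build/enigma-neuro.img \
        run_pipeline "$@"
}

# Process exactly one unprocessed subject and exit, so the requested
# walltime only has to cover a single recon-all run; resubmit the job
# (or use a job array) until every subject has been processed.
ataxia recon --oneonly
```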
`run_pipeline stats` will generate histograms and look for outliers based on standard deviation. Of course this will fail if you only have one subject (you can't define the stddev for one data point!). The initial pipeline called for you to run `fslview` at this point; if you want to do so, you will need to do so manually.
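If you do want that manual check, something along these lines should work; the subject directory layout under `output/`, and the use of `mri_convert` to produce NIfTI files that `fslview` can read, are assumptions, so adjust the paths to what `recon` actually produced:
```
# Run the histogram/outlier step once recon has finished for all subjects.
ataxia stats

# Hypothetical manual check of one subject with fslview: convert the
# FreeSurfer volumes to NIfTI first, since fslview does not read .mgz.
subj=/mnt/data/output/subject01
mri_convert "$subj/mri/orig.mgz" subject01_orig.nii.gz
mri_convert "$subj/mri/aseg.mgz" subject01_aseg.nii.gz
fslview subject01_orig.nii.gz subject01_aseg.nii.gz
```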