Once you have a template, you can then configure the file as needed.
The template referred to here is the one generated in the previous section of the documentation ("Create data_settings.yml template file"). If you're using v1.8.0, there is a small bug in the template: leftover Git merge conflict markers from a merge error. You can simply delete the marker lines.
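For reference, Git merge conflict markers have the shape sketched below (the label after `>>>>>>>` varies). In the template, delete the three marker lines and keep the version of the intervening content that matches the rest of the file:

```
<<<<<<< HEAD
one version of the conflicting lines
=======
the other version of those lines
>>>>>>> other-branch
```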
You will need to use a text editor to fill in at least dataFormat, bidsBaseDir (if using BIDS data), outputSubjectListLocation, and subjectListName in your settings file before you build.
For this example, I'll show how to configure non-BIDS-formatted data.
# CPAC Data Settings File
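The header line above is the first line of the template. The rest of a non-BIDS configuration might look like the following sketch; the field names are the ones mentioned above, while the template fields, `{participant}` placeholder syntax, and all paths are illustrative and may differ between C-PAC versions:

```yaml
dataFormat: ['Custom']      # non-BIDS data; use ['BIDS'] together with bidsBaseDir for BIDS data
# Hypothetical path patterns locating each subject's scans:
anatomicalTemplate: /data/{participant}/anat.nii.gz
functionalTemplate: /data/{participant}/func.nii.gz
outputSubjectListLocation: /home/user/configs   # where the data configuration file will be written
subjectListName: my_study                       # basename for the generated data configuration
```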
Once your data_settings.yml file is ready, you can generate your data configuration file by running the following commands, binding any directories in your data_settings.yml to the same locations in the container:
"Binding any directories in your data_settings.yml" is what the lines

-B /path/to/data/in/data_settings/1 \
-B /path/to/data/in/data_settings/2 \

in that command do: each -B flag makes a host directory visible at the same path inside the Singularity container, so C-PAC can find the files your settings point to.
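These bind flags combine with the rest of the invocation. A sketch, assuming a Singularity image file named `cpac.sif`; the image name and paths are hypothetical, and the `cli -- utils data_config build` sub-command form should be checked against the C-PAC documentation for your version:

```shell
singularity run \
  -B /path/to/data/in/data_settings/1 \
  -B /path/to/data/in/data_settings/2 \
  -B /path/to/data_settings.yml:/data_settings.yml \
  cpac.sif /bids_dataset /outputs cli -- utils data_config build /data_settings.yml
```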
Finally, to run the Docker container with a specific data configuration file (instead of providing a BIDS data directory):
Note: we are still providing /bids_dataset to the bids_dir input parameter. However, this can be mapped to any directory on your machine, as C-PAC will not look for data there when you provide a data configuration YAML with the --data_config_file flag. In addition, if the dataset in your data configuration file is not in BIDS format, add the --skip_bids_validator flag at the end of your command to bypass the BIDS validation process.
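Concretely, such a run might look like the sketch below. The image tag `fcpindia/c-pac:latest` and all host paths are illustrative; /bids_dataset is mounted from a placeholder directory because it will not be read:

```shell
# Placeholder directory for bids_dir (ignored when --data_config_file is given)
docker run -i --rm \
  -v /tmp/empty_dir:/bids_dataset \
  -v /path/to/outputs:/outputs \
  -v /path/to/data_config.yml:/configs/data_config.yml \
  fcpindia/c-pac:latest /bids_dataset /outputs participant \
  --data_config_file /configs/data_config.yml \
  --skip_bids_validator
```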


Jon Clucas, MIS
Associate Software Developer, Computational Neuroimaging Lab, Child Mind Institute
646.625.4319 | childmind.org
| Question | Response |
| --- | --- |
| I encountered a problem when I ran the default C-PAC pipeline with the command `cpac run /nfs/turbo/jiankanggroup/hcp_rest/data /nfs/turbo/jiankanggroup/hcp_rest/try/newdata/outputs participant --data_config_file /nfs/turbo/jiankanggroup/hcp_rest/try/data_config_try.yml --skip_bids_validator`. It seems to get stuck while running "slice timing": after it showed the warnings below, the procedure did not continue and I did not get any output files. I am really confused by this problem and would appreciate any suggestions. | The amount of memory allocated appears to be insufficient for the combination of pipeline and data. Allocating more memory, reducing the size of the input data, providing some preprocessed data, or adjusting the pipeline to avoid unwanted memory-hungry steps are all potential solutions. See above for some further explanation. |
| I am wondering if I could just get ALFF and f/ALFF derivatives without doing any preprocessing using C-PAC. | I think only if you have already preprocessed data in BIDS format and provide that directory to C-PAC. This error is due to a pipeline configuration that requires data that are not provided and will not be generated by the given configuration. Either preprocessed data must be provided (for this, I believe BIDS format is the only way C-PAC can currently comprehend preprocessed data) or the preprocessing steps that feed into f/ALFF must be turned on. |
It reports "Available memory: 50.0 (GB)" when I set `--mem_gb 50`. I also set `maximum_memory_per_participant = 50` and `max_cores_per_participant = 20` in my pipeline_config file.
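If the run is hitting that memory ceiling, the run-level overrides can be passed alongside the original command. A sketch reusing the paths above; `--mem_gb` appears earlier in this thread, and `--n_cpus` is, to the best of my knowledge, the matching core-count flag in the C-PAC BIDS-app interface (verify against the docs for your version):

```shell
cpac run /nfs/turbo/jiankanggroup/hcp_rest/data \
  /nfs/turbo/jiankanggroup/hcp_rest/try/newdata/outputs participant \
  --data_config_file /nfs/turbo/jiankanggroup/hcp_rest/try/data_config_try.yml \
  --skip_bids_validator \
  --mem_gb 50 --n_cpus 20
```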


