The error `process_data.sh: line 105: cd: too many arguments` in the "Analysis pipelines with SCT" section of the SCT web tutorial

When I run the following section of the tutorial (running the script to batch-process the subjects):

Running the sample script (process_data.sh) using sct_run_batch - Spinal Cord Toolbox documentation

I got the following error:


data_batch-processing-of-subjects/multi_subject/output/process_data.sh: line 105: cd: too many arguments

This suggests there is some problem in process_data.sh itself.

Could you please give me some advice on how to solve this error?
I would appreciate it if you could help me.

Hi @Yuexiang_Ji,

I think I know what the problem is. To fix this issue, please open the process_data.sh script in a text editor, and make the following change to lines 104-107:

Before:

# Go to folder where data will be copied and processed
cd $PATH_DATA_PROCESSED
# Copy source images
rsync -avzh $PATH_DATA/$SUBJECT .

After:

# Go to folder where data will be copied and processed
cd "$PATH_DATA_PROCESSED"
# Copy source images
rsync -avzh "$PATH_DATA/$SUBJECT" .

The issue here is that the environment variables are missing double quotes ("). Because of this, if $PATH_DATA_PROCESSED contains a space, it gets split into two separate arguments, which causes the error from cd.
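
As a quick illustration (with a made-up path, not your actual one), here is what the shell does with and without the quotes:

# Hypothetical path containing a space, for illustration only
PATH_DATA_PROCESSED="/mnt/f/My Folder/output"

cd $PATH_DATA_PROCESSED     # word-splits into two arguments -> "cd: too many arguments"
cd "$PATH_DATA_PROCESSED"   # stays a single argument -> works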

Please let me know if this fixes your issue. If it does, we will add the fix to the tutorials. :slight_smile:

Kind regards,
Joshua


Dear @joshuacwnewton,

Thank you again for your prompt response and valuable assistance.
Following your advice, I will revise process_data.sh and try again.

Best regards,
Yuexiang Ji

Dear @joshuacwnewton,

Thank you for your advice.
Following it, I revised my process_data.sh, and the previous error I posted was solved.
However, another error remains.
I checked the content of the err.process_data_sub-05.log file in output/log.

Could you please give me some advice on how to solve this error?
I would appreciate it if you could help me.

Here is the content of the log file err.process_data_sub-05.log:


--
Spinal Cord Toolbox (6.0)

sct_check_dependencies -short
--


SYSTEM INFORMATION
------------------
SCT info:
- version: 6.0
- path: /home/xyj/sct_6.0
OS: linux (Linux-5.15.90.1-microsoft-standard-WSL2-x86_64-with-glibc2.35)
CPU cores: Available: 12, Used by ITK functions: 1
RAM: Total: 15649MB, Used: 672MB, Available: 14711MB
sending incremental file list

sent 257 bytes  received 18 bytes  550.00 bytes/sec
total size is 12.11M  speedup is 44,024.02

Looking for manual segmentation: /mnt/f/2.SCT_tutorial/8. Analysis pipelines with SCT/data_batch-processing-of-subjects/multi_subject/data/derivatives/labels/sub-05/anat/sub-05_T2w_seg-manual.nii.gz
Not found. Proceeding with automatic segmentation.

--
Spinal Cord Toolbox (6.0)

sct_deepseg_sc -i sub-05_T2w.nii.gz -c t2 -qc /mnt/f/2.SCT_tutorial/8. Analysis pipelines with SCT/data_batch-processing-of-subjects/multi_subject/output/qc -qc-subject sub-05
--

usage: sct_deepseg_sc -i <file> -c {t1,t2,t2s,dwi} [-h]
                      [-centerline {svm,cnn,viewer,file}]
                      [-file_centerline <str>] [-thr <float>] [-brain {0,1}]
                      [-kernel {2d,3d}] [-ofolder <str>] [-o <file>] [-r {0,1}]
                      [-v <int>] [-qc <str>] [-qc-dataset <str>]
                      [-qc-subject <str>]

Spinal Cord Segmentation using convolutional networks. Reference: Gros et al.
Automatic segmentation of the spinal cord and intramedullary multiple sclerosis
lesions with convolutional neural networks. Neuroimage. 2019 Jan 1;184:901-915.

MANDATORY ARGUMENTS:
  -i <file>             Input image. Example: t1.nii.gz
  -c {t1,t2,t2s,dwi}    Type of image contrast.

OPTIONAL ARGUMENTS:
  -h, --help            show this help message and exit
  -centerline {svm,cnn,viewer,file}
                        Method used for extracting the centerline:
                         svm: Automatic detection using Support Vector Machine
                         algorithm.
                         cnn: Automatic detection using Convolutional Neural
                         Network.
                         viewer: Semi-automatic detection using manual selection
                         of a few points with an interactive viewer followed by
                         regularization.
                         file: Use an existing centerline (use with flag
                         -file_centerline) (default: svm)
  -file_centerline <str>
                        Input centerline file (to use with flag -centerline
                        file). Example: t2_centerline_manual.nii.gz
  -thr <float>          Binarization threshold (between 0 and 1) to apply to the
                        segmentation prediction. Set to -1 for no binarization
                        (i.e. soft segmentation output). The default threshold
                        is specific to each contrast and was estimated using an
                        optimization algorithm. More details at:
                        https://github.com/sct-pipeline/deepseg-threshold.
  -brain {0,1}          Indicate if the input image contains brain sections (to
                        speed up segmentation). Only use with "-centerline cnn".
                        (default: 1 for T1/T2 contrasts, 0 for T2*/DWI
                        contrasts)
  -kernel {2d,3d}       Choice of kernel shape for the CNN. Segmentation with 3D
                        kernels is slower than with 2D kernels. (default: 2d)
  -ofolder <str>        Output folder. Example: My_Output_Folder  (default:
                        /mnt/f/2.SCT_tutorial/8. Analysis pipelines with
                        SCT/data_batch-processing-of-subjects/multi_subject/outp
                        ut/data_processed/sub-05/anat)
  -o <file>             Output filename. Example: spinal_seg.nii.gz
  -r {0,1}              Remove temporary files. (default: 1)
  -v <int>              Verbosity. 0: Display only errors/warnings, 1:
                        Errors/warnings + info messages, 2: Debug mode (default:
                        1)
  -qc <str>             The path where the quality control generated content
                        will be saved
  -qc-dataset <str>     If provided, this string will be mentioned in the QC
                        report as the dataset the process was run on
  -qc-subject <str>     If provided, this string will be mentioned in the QC
                        report as the subject the process was run on
sct_deepseg_sc: error: unrecognized arguments: Analysis pipelines with SCT/data_batch-processing-of-subjects/multi_subject/output/qc

The issue is again due to the space in the path name, together with a lack of double quotes around the PATH_QC environment variable. In this case, we have already fixed the issue in our repository (Improve environment variable quoting by only quoting the variables th… · spinalcordtoolbox/sct_tutorial_data@ede52e5 · GitHub), but we have not yet created a new release of the dataset, so the copy downloaded from the webpage does not have the double quotes yet. We will be updating this soon.
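
For reference, the repository fix simply quotes the QC-related variables where sct_deepseg_sc is called. If you wanted to patch your local copy by hand, the change would look something like this (illustrative only; the exact variable names in your copy of the script may differ):

Before:

# Unquoted: the space in the QC path splits it into extra arguments
sct_deepseg_sc -i sub-05_T2w.nii.gz -c t2 -qc $PATH_QC -qc-subject $SUBJECT

After:

# Quoted: the QC path is passed as a single argument
sct_deepseg_sc -i sub-05_T2w.nii.gz -c t2 -qc "$PATH_QC" -qc-subject "$SUBJECT"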

However, for a more permanent fix, I would actually recommend simply renaming the folder from 8. Analysis pipelines with SCT to 8.Analysis_pipelines_with_SCT, and avoiding spaces in path names going forward. This will avoid all related issues and keep you from having to edit the script further.
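
For example, from the parent directory shown in your logs, the rename could be done like this (adjust the path if your setup differs):

cd /mnt/f/2.SCT_tutorial
mv "8. Analysis pipelines with SCT" 8.Analysis_pipelines_with_SCT

You would then re-run sct_run_batch with the updated, space-free paths.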

Kind regards,
Joshua

Thank you for your message.
I have revised my path so that it does not contain any spaces, and it worked well.

I really appreciate your valuable advice.
