Compute CSA of conus medullaris region by normalizing PAM50 template

Hello,

The segmentation result for the conus region is not as good as for the cervical and thoracic spinal cord. Therefore, I used a different model, one that was suggested in this forum:

sct_deepseg -i t2.nii.gz -task seg_lumbar_sc_t2w

The result was good.
Then, I wanted to calculate the CSA of this result by normalizing to the PAM50 template as follows:

sct_process_segmentation -i t2_seg-manual.nii.gz -vertfile t2_seg-manual_labeled.nii.gz -perslice 1 -normalize-PAM50 1 -o csa_pam50_L.csv

The following is the error message:

Image header specifies datatype 'int16', but array is of type 'float64'. Header metadata will be overwritten to use 'float64'.
Compute shape analysis: 100%|██████████████| 540/540 [00:05<00:00, 106.37iter/s]
Traceback (most recent call last):
  File "/Users/ahn/sct_6.0/spinalcordtoolbox/scripts/sct_process_segmentation.py", line 521, in <module>
    main(sys.argv[1:])
  File "/Users/ahn/sct_6.0/spinalcordtoolbox/scripts/sct_process_segmentation.py", line 424, in main
    metrics_PAM50_space = interpolate_metrics(metrics, fname_vert_level_PAM50, fname_vert_level)
  File "/Users/ahn/sct_6.0/spinalcordtoolbox/metrics_to_PAM50.py", line 59, in interpolate_metrics
    metrics_inter = np.interp(x_PAM50, x, metric_values_level)
  File "<__array_function__ internals>", line 180, in interp
  File "/Users/ahn/sct_6.0/python/envs/venv_sct/lib/python3.9/site-packages/numpy/lib/function_base.py", line 1594, in interp
    return interp_func(x, xp, fp, left, right)
ValueError: array of sample points is empty
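For context, the final error in the traceback comes from plain NumPy, not from SCT itself: `np.interp(x, xp, fp)` raises this exact `ValueError` when the array of known sample points (`xp`, which here is `x`, the per-level slice positions in subject space) is empty. A minimal reproduction, making no assumptions about SCT internals:

```python
import numpy as np

# np.interp(x, xp, fp) evaluates fp at positions x, given known sample
# points (xp, fp). If xp is empty, there is nothing to interpolate from,
# and NumPy raises the error seen in the traceback above.
try:
    np.interp(np.linspace(0, 1, 5), np.array([]), np.array([]))
except ValueError as e:
    print(e)  # -> array of sample points is empty
```

In other words, for at least one vertebral level no matching slices were found in the subject image, which points to a problem upstream in the vertebral labeling.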

How can I solve this problem?

Sincerely

Sung Jun Ahn

sct_check_dependencies

SYSTEM INFORMATION
------------------

SCT info:
- version: 6.0
- path: /Users/ahn/sct_6.0
OS: osx (macOS-10.16-x86_64-i386-64bit)
CPU cores: Available: 4, Used by ITK functions: 4
RAM: Total: 16384MB, Used: 6350MB, Available: 10032MB

Hi @ahn,

Thank you for your question! My apologies for the late reply. I have just arrived back from winter holidays.

Could you please provide the input file (t2.nii.gz) so that I could try to reproduce the error?

Kind regards,
Joshua

Dear Joshua,

Greetings, and happy new year!

I am posting the full process of what I have done:

sct_label_vertebrae -i t2.nii.gz -s t2_seg-manual.nii.gz -c t2 -initlabel label_t3t4.nii.gz -qc "$SCT_BP_QC_FOLDER"

sct_label_utils -i t2_seg-manual_labeled.nii.gz -vert-body 11,18 -o labels_vert.nii.gz

sct_register_to_template -i t2.nii.gz -s t2_seg-manual.nii.gz -l labels_vert.nii.gz -c t2 -qc "$SCT_BP_QC_FOLDER"

sct_process_segmentation -i t2_seg-manual.nii.gz -z 40:400 -vertfile t2_seg-manual_labeled.nii.gz -perslice 1 -normalize-PAM50 1 -o csa_pam50_L.csv

This generates the same error as above.

The following link contains t2.nii.gz as well as the associated files:
(l_spine.zip - Google Drive)

Hi @ahn,

When I try this first step on the newest version of SCT, I get the following error:

Vertebral detection failed: Missing label or zero label for initial disc.

This is not the same error message as the one you received. However, this error should also occur on SCT v6.0 (which is what you have been using). I’m not sure why you have not received this error on your end, but here is my investigation into improving this initial labeling step:

Getting `sct_label_vertebrae` to work for your data
  • To give some background context: sct_label_vertebrae contains an intermediate straightening step, where the curved spinal cord will be transformed into a straight spinal cord (which helps with identification of vertebral discs). It is important to note that this straightening process is limited to the region occupied by the spinal cord segmentation.
  • This means that any labels that exist above or below the slices of the spinal cord segmentation will be lost during straightening.
  • In your case, you have provided the T3-T4 disc to -initlabel. I have visualized the position of the label using the green + cursor below:
    image
  • Because the label is above the topmost region of the segmentation, the label will be lost during straightening, causing the error of “Missing label” to occur.
  • This error should be fixable by using the T4-T5 disc as a landmark instead. I tried this using the following command, selecting the posterior tip of the disc that is immediately below your first label:
    sct_label_utils -i t2.nii.gz -create-viewer 12 -o label_t4t5.nii.gz
    sct_label_vertebrae -i t2.nii.gz -s t2_seg-manual.nii.gz -c t2 -initlabel label_t4t5.nii.gz
    
    And the automatic labeling was able to successfully complete:
    image
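The failure mode described above can be sketched in a few lines: a disc label whose slice position falls outside the z-range covered by the cord segmentation cannot survive the straightening step, which is what produces the “Missing label” error. A hypothetical sketch (the slice numbers below are made up for illustration):

```python
import numpy as np

# Hypothetical setup: the segmentation covers slices 40..400, and each
# disc label sits at some slice index along the inferior-superior axis.
seg_slices = np.arange(40, 401)
label_z = {"T3-T4": 410, "T4-T5": 395}  # made-up slice positions

for name, z in label_z.items():
    kept = seg_slices.min() <= z <= seg_slices.max()
    print(name, "kept" if kept else "lost during straightening")
```

This is why moving the initial label down one disc (from T3-T4 to T4-T5) resolves the error: the lower disc lies within the segmented region.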
Scaling issue (output labels 11.999, 12.999, etc.)

As an aside, the labeling appears to have slightly incorrect values (.9999).

image

This seems like it is potentially a bug in SCT, so I will investigate this separately in Issue #3232.

For now, I can quickly work around this issue by adding 0.001 to the image (so that the labels become 12.0001, 13.0001, etc.) then converting the image type to integer:

sct_maths -i t2_seg-manual_labeled.nii.gz -o t2_seg-manual_labeled_add.nii.gz -add 0.001
sct_image -i t2_seg-manual_labeled_add.nii.gz -o t2_seg-manual_labeled_uint8.nii.gz -type uint8
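To see why the +0.001 step matters: casting a floating-point image to an integer type truncates toward zero, so a label stored as 11.999… would become 11 rather than 12. Nudging the values just above the intended integer first makes the truncation land on the correct label. A small NumPy illustration (the values are made up to mimic the bug):

```python
import numpy as np

# Labels that should be 12 and 13, but were saved slightly below
# their intended integer values (mimicking the .999 bug).
labels = np.array([11.99999999, 12.99999999])

# Direct cast truncates toward zero -> wrong labels.
print(labels.astype(np.uint8))            # -> [11 12]

# Adding a small epsilon first pushes each value past its integer,
# so truncation now yields the correct labels.
print((labels + 0.001).astype(np.uint8))  # -> [12 13]
```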

image

Now that I have accurate vertebral labels, I can try your other steps (making sure to use the uint8 labeled vertebrae file):

sct_label_utils -i t2_seg-manual_labeled_uint8.nii.gz -vert-body 12,18 -o labels_vert.nii.gz
sct_register_to_template -i t2.nii.gz -s t2_seg-manual.nii.gz -l labels_vert.nii.gz -c t2
sct_process_segmentation -i t2_seg-manual.nii.gz -z 40:400 -vertfile t2_seg-manual_labeled_uint8.nii.gz -perslice 1 -normalize-PAM50 1 -o csa_pam50_L.csv

I am able to complete this process without error. So, I think the issue may have been with the vertebral labeling step.

That said, I do also want to mention that in SCT v6.1, we have added a dedicated label to the PAM50 template corresponding to the conus medullaris. You should be able to utilize this label to get more accurate CSA calculations in that region. Please refer to the following tutorial to see how you might be able to amend your processing steps: Registering lumbar images to the PAM50 template - Spinal Cord Toolbox documentation

Please let me know if you have any further questions or concerns. :slight_smile:

Kind regards,
Joshua


Dear Joshua,

Thank you for your kind reply.

I am not familiar with the methodology, but it seems the error occurred because I did not use the uint8 labeled vertebrae file.

Then, how can I generate the uint8 labeled vertebrae file, rather than just t2_seg-manual_labeled.nii.gz?

Sincerely

Sung Jun Ahn

Then, how can I generate the uint8 labeled vertebrae file, rather than just t2_seg-manual_labeled.nii.gz?

Since this may be a bug/issue on SCT’s end, I will need to do some more investigation to give a full answer, so that I can understand why a uint8 image isn’t being generated in the first place. I’ll look deeper at our code and then reply again later today. :slight_smile:

For now, there is the workaround I provided above (adding a small value to the image and then manually converting to uint8), though this isn’t a great solution. I hope I can provide something better for you.

Kind regards,
Joshua

Hi @ahn,

After some investigation, I have determined that the problem actually originates inside `sct_deepseg`, during the lumbar segmentation step.

Due to a highly technical issue, the lumbar segmentation image (t2_seg-manual.nii.gz) gets saved with a value of 0.99999999977 instead of 1.0. This issue then propagates to the vertebral labeling step.

So, the easiest short-term way to fix this issue is to make sure that the lumbar segmentation is binarized. You can do this by running the following command:

# Generate the lumbar segmentation (0.99999999977)
sct_deepseg -i t2.nii.gz -task seg_lumbar_sc_t2w
# Binarize the lumbar segmentation (1.0)
sct_maths -i t2_seg-manual.nii.gz -bin 0.5 -o t2_seg_bin.nii.gz

Then, use t2_seg_bin.nii.gz throughout the rest of your processing.
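For readers curious what the binarization does under the hood, `sct_maths -bin 0.5` is conceptually a simple threshold: every voxel above 0.5 becomes exactly 1, and everything else becomes 0, so the near-1 values (0.99999999977) are cleaned up. A NumPy sketch of the idea (whether the SCT threshold is strict or inclusive is an implementation detail that does not matter here, since the values are far from 0.5):

```python
import numpy as np

# A few voxel values mimicking the buggy segmentation output.
seg = np.array([0.0, 0.3, 0.99999999977, 1.0])

# Threshold at 0.5 and store as an integer type: the near-1 values
# become exactly 1, which keeps downstream labeling consistent.
seg_bin = (seg > 0.5).astype(np.uint8)
print(seg_bin)  # -> [0 0 1 1]
```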

We will update SCT so that this binarization step will not be needed in future versions of SCT. But, for now, please use this workaround. :slight_smile:

Kind regards,
Joshua

Dear Joshua

Many thanks. Everything completed successfully after I followed your suggestion.

As a neuroradiologist, perhaps my opinion could help in developing a better algorithm. Although the current segmentation is good enough to use, on visual inspection the segmentation produced by the seg_lumbar_sc_t2w model seems to slightly underestimate the CSA.

Once again, thank you very much.

Kind regards,

Sung Jun Ahn


Hi @ahn,

Although the current segmentation is good enough to use, on visual inspection the segmentation produced by the seg_lumbar_sc_t2w model seems to slightly underestimate the CSA.

Thank you kindly for this feedback. I will make sure to share your feedback with the team members who developed this approach. :hearts:

For posterity, further details of the approach can be found here: GitHub - ivadomed/lumbar_seg_EPFL: Model repository for lumbar segmentation from EPFL data

Thank you again,
Joshua