sct_compute_compression RuntimeWarning: Mean of empty slice

Dear SCT developers:

I’m running the sct_compute_compression step with the following parameters:

sct_compute_compression -i subject_01_T2_seg.nii -vertfile subject_01_T2_seg_labeled.nii -l subject_01_T2_compression_label.nii -metric diameter_AP -normalize-hc 1 -o ap_ratio_norm_PAM50.csv

However, I got the following error:

--
Spinal Cord Toolbox (6.2)

sct_compute_compression -i subject_01_T2_seg.nii -vertfile subject_01_T2_seg_labeled.nii -l subject_01_T2_compression_label.nii -metric diameter_AP -normalize-hc 1 -o ap_ratio_norm_PAM50.csv
--

Converting image from type 'uint8' to type 'float64' for linear interpolation
Compute shape analysis: 100%|██████████████| 469/469 [00:02<00:00, 160.17iter/s]
Aggregating metrics:   0%|                              | 0/9 [00:00<?, ?iter/s]/opt/sct_6.2/python/envs/venv_sct/lib/python3.9/site-packages/numpy/core/fromnumeric.py:3432: RuntimeWarning: Mean of empty slice.
  return _methods._mean(a, axis=axis, dtype=dtype,
/opt/sct_6.2/python/envs/venv_sct/lib/python3.9/site-packages/numpy/core/_methods.py:190: RuntimeWarning: invalid value encountered in double_scalars
  ret = ret.dtype.type(ret / rcount)
Aggregating metrics: 100%|██████████████████████| 9/9 [00:03<00:00,  2.32iter/s]

Done! To view results, type:
xdg-open .../01_T2/compression/subject_01_T2_seg_metrics.csv

Converting image from type 'uint8' to type 'float64' for linear interpolation
Compute shape analysis: 100%|██████████████| 469/469 [00:02<00:00, 168.53iter/s]
Aggregating metrics: 100%|██████████████████████| 9/9 [00:28<00:00,  3.12s/iter]

Done! To view results, type:
xdg-open .../01_T2/compression/subject_01_T2_seg_metrics_PAM50.csv


Compression at level 4 (slice 413)
diameter_AP_ratio = 7.945096932199814
diameter_AP_ratio_PAM50 = 8.36729342966882
diameter_AP_ratio_PAM50_normalized = 8.790033354978178

In addition, adding more compression points to the compression_label.nii file results in inconsistent metrics in the output subject_01_T2_seg_metrics_PAM50.csv file (for the same compression point).

Could you help me with this issue? I have just sent the images to Joshua via WeTransfer.

Thanks,

Roger

Here are the dependencies:

--
Spinal Cord Toolbox (6.2)

sct_check_dependencies 
--


SYSTEM INFORMATION
------------------
SCT info:
- version: 6.2
- path: /opt/sct_6.2
OS: linux (Linux-5.15.0-113-generic-x86_64-with-glibc2.35)
CPU cores: Available: 24, Used by ITK functions: 24
RAM: Total: 48148MB, Used: 6161MB, Available: 41257MB

OPTIONAL DEPENDENCIES
---------------------
Check FSLeyes version...............................[OK] (1.11.0)

MANDATORY DEPENDENCIES
----------------------
Check Python executable.............................[OK]
  Using bundled python 3.9.19 (main, Mar 21 2024, 17:11:28) 
[GCC 11.2.0] at /opt/sct_6.2/python/envs/venv_sct/bin/python
Check if data are installed.........................[OK]
Check if dipy is installed..........................[OK] (1.5.0)
Check if ivadomed is installed......................[OK] (2.9.9)
Check if matplotlib is installed....................[OK] (3.8.3)
Check if monai is installed.........................[OK] (1.3.0)
Check if nibabel is installed.......................[OK] (3.2.2)
Check if nilearn is installed.......................[OK] (0.10.2)
Check if nnunetv2 is installed......................[OK]
Check if numpy is installed.........................[OK] (1.23.5)
Check if onnxruntime is installed...................[OK] (1.17.0)
Check if pandas is installed........................[OK] (1.4.4)
Check if portalocker is installed...................[OK] (2.8.2)
Check if psutil is installed........................[OK] (5.9.8)
Check if pyqt5 (5.12.3) is installed................[OK] (5.12.3)
Check if pyqt5-sip is installed.....................[OK]
Check if pytest is installed........................[OK] (8.0.0)
Check if pytest-cov is installed....................[OK] (4.1.0)
Check if requests is installed......................[OK] (2.31.0)
Check if requirements-parser is installed...........[OK]
Check if scipy is installed.........................[OK] (1.12.0)
Check if scikit-image is installed..................[OK] (0.22.0)
Check if scikit-learn is installed..................[OK] (1.4.0)
Check if xlwt is installed..........................[OK] (1.3.0)
Check if tqdm is installed..........................[OK] (4.66.2)
Check if transforms3d is installed..................[OK] (0.4.1)
Check if urllib3 is installed.......................[OK] (2.2.0)
Check if pytest_console_scripts is installed........[OK]
Check if pyyaml is installed........................[OK] (6.0.1)
Check if voxelmorph is installed....................[OK] (0.2)
Check if wquantiles is installed....................[OK] (0.4)
Check if xlsxwriter is installed....................[OK] (3.1.9)
Check if spinalcordtoolbox is installed.............[OK]
Check ANTs compatibility with OS ...................[OK]
Check PropSeg compatibility with OS ................[OK]
Check if figure can be opened with PyQt.............QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-rmateu'
QStandardPaths: XDG_RUNTIME_DIR not set, defaulting to '/tmp/runtime-rmateu'
[OK]
Check if figure can be opened with matplotlib.......[OK] (Using GUI backend: 'QtAgg')

Dear Roger,

Thank you for raising this question!

I tried sct_compute_compression locally on your data, and it worked fine. Please note that “RuntimeWarning: Mean of empty slice” is just a warning, not an error. You can simply ignore it, as it does not affect the results.
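For illustration, the message comes from numpy itself whenever it averages an empty array (presumably a vertebral level with no slices to aggregate, in your case); the result is simply NaN for that entry. A minimal standalone reproduction, independent of SCT:

```python
# Minimal reproduction of the numpy warning (plain numpy, not part of SCT):
# averaging an empty array returns NaN and emits the same RuntimeWarnings
# that appear in the "Aggregating metrics" step of the log.
import warnings
import numpy as np

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    result = np.mean(np.array([]))  # no values to average -> NaN

print(result)  # nan
for w in caught:
    print(w.category.__name__, "-", w.message)
# RuntimeWarning - Mean of empty slice.
# RuntimeWarning - invalid value encountered in ... (exact wording depends on the numpy version)
```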

“In addition, adding more compression points to the compression_label.nii file results in inconsistent metrics in the output subject_01_T2_seg_metrics_PAM50.csv file (for the same compression point).”

Excellent point! To simulate this, I labeled one extra compression at slice 352 (along the S-I axis). The subject_01_T2_compression_label.nii file now contains two compression labels: slice 413 and slice 352. You are right that the metrics for the “original” compression (slice 413) now differ:

Single compression:

Compression at level 4 (slice 413)
diameter_AP_ratio = 8.880338656527387
diameter_AP_ratio_PAM50 = 9.24318083734561
diameter_AP_ratio_PAM50_normalized = 9.661568088278438

Multiple compressions:

Compression at level 6 (slice 352)
diameter_AP_ratio = -5.569680119718501
diameter_AP_ratio_PAM50 = -5.9922687788487305
diameter_AP_ratio_PAM50_normalized = -4.06842080955736

Compression at level 4 (slice 413)
diameter_AP_ratio = 3.29901895943584
diameter_AP_ratio_PAM50 = 2.939454846759737
diameter_AP_ratio_PAM50_normalized = 9.300828267066763

The reason is that, in the case of multiple compressions, sct_compute_compression uses the region 10 mm above the uppermost compression site and 10 mm below the lowest compression site, to ensure that only “non-compressed” levels are used for the normalization. For more details, please see section 2.1.1 in this paper.
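To make this concrete, here is a simplified sketch of that slice selection (not the actual SCT implementation; the slice thickness and the extent of the averaging window are illustrative assumptions, only the 10 mm distance comes from the explanation above). It shows how adding the label at slice 352 shifts the lower reference region, which changes the averaged “non-compressed” diameter and therefore the ratio reported for slice 413:

```python
# Simplified sketch (illustrative only, not the SCT source): with multiple
# compression labels, the "non-compressed" reference slices are taken
# 10 mm above the uppermost compression and 10 mm below the lowest one.
import numpy as np

def reference_slices(compressed, n_slices, pz_mm=0.8,
                     distance_mm=10.0, extent_mm=20.0):
    """Return (above, below) slice indices used as non-compressed references.

    pz_mm (slice thickness along S-I), distance_mm and extent_mm are
    illustrative parameters of this sketch, not SCT defaults.
    """
    dist = int(np.ceil(distance_mm / pz_mm))   # 10 mm margin, in slices
    ext = int(np.ceil(extent_mm / pz_mm))      # extent of the reference region
    top = max(compressed) + dist               # start of the upper reference region
    bottom = min(compressed) - dist            # end of the lower reference region
    above = np.arange(top, min(top + ext, n_slices))
    below = np.arange(max(bottom - ext, 0), bottom + 1)
    return above, below

# One vs. two compression labels (469 slices, as in the log): the lower
# reference region moves from roughly slices 375-400 down to 314-339.
for labels in ([413], [413, 352]):
    above, below = reference_slices(labels, n_slices=469)
    print(labels, "-> above:", above[0], "to", above[-1],
          "| below:", below[0], "to", below[-1])
```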

Best,
Jan