sct_smooth_spinalcord for post-mortem data

Dear SCT Developers,

Could you please advise on the best syntax to use for smoothing my ex-vivo data before template construction?

I went through your documentation, previous issues, and the command-line help, where I found the following syntaxes:

sct_smooth_spinalcord -i <input> -s <centerline/segmentation>
sct_maths -i data.nii -smooth <sigma> (a previous issue was about estimating the sigma value). If this is the right one, how should I choose the sigma value, please?
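
To illustrate, here are the kinds of calls I am considering (the -smooth flag and its comma-separated sigma values in mm are my assumption from the --help output, so please correct me if that is wrong):

```shell
# Candidate 1: smooth along the cord; the -smooth sigmas (mm, x,y,z) are assumed
sct_smooth_spinalcord -i data.nii.gz -s seg.nii.gz -smooth 0,0,3

# Candidate 2: plain Gaussian smoothing with an explicit per-axis sigma (mm)
sct_maths -i data.nii.gz -smooth 1,1,1 -o data_smooth.nii.gz
```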

In addition, I tried the syntax sct_smooth_spinalcord -i <input> -s <centerline/segmentation>, and the result does not seem to show any improvement at the top and bottom (please find screenshot).

Sorry, I couldn’t attach the images because the upload button is not working. (Could you please check whether it is working before I post the issue?)

Could you please help me to solve this issue?

Many thanks in advance,


Hi @ihattan,

Thank you for the question!

To best provide advice, it would help to be able to see the image data / screenshots.

So, to address your concern:

How is the upload button broken for you? I ask because when I try it myself, it seems to work OK:

Upload steps
  1. Press the upload button


  2. Choose and upload a file


  3. Result: t2.nii.gz (222.3 KB)

Could you please try one more time to upload your files/screenshots? And, if it doesn’t work, could you let us know which of the upload steps is broken?

Thank you kindly,


Thank you so much for providing the screenshots.

However, the screenshots are not labeled, so it is not clear to me which screenshots correspond to the input image, and which screenshots correspond to the output images. (It is also unclear which commands were used to generate the example outputs in your screenshots.)

To help with debugging/testing, are you able to upload the exact image files (nii.gz), as well as sharing the exact commands that you have tried?

Thank you kindly,


Hi @ihattan

In addition to @joshuacwnewton 's question, could you please also point us to the documentation you are referring to:

I went through your documentation, previous issues, and the command-line help, where I found the following syntax;

Please note that our procedures evolve over time, so it is possible that you were looking at an old doc. Our latest template-generation repository, which is actively maintained, is this one: GitHub - neuropoly/template: A framework for creating unbiased MRI templates of the spinal cord

Note to SCT team: we should add a link to that repo (GitHub - neuropoly/template: A framework for creating unbiased MRI templates of the spinal cord) in all other template-creation repos, so people are redirected to the latest and actively maintained one.


Hi @joshuacwnewton and @jcohenadad ,

Thank you very much for your prompt response and invaluable information. I apologize for any confusion. My question was about which SCT tool is best for smoothing my dataset before template generation. I used sct_smooth_spinalcord -i <input> -s <centerline/segmentation> on the following sample; the result is in the second screenshot.

Input T1 data:

Result after using the above syntax to smooth the data:

@jcohenadad I’m using ANTs/ at master · ANTsX/ANTs · GitHub to generate my T1, b0, and FA templates. I would try yours, but I found it a bit tricky and I have never used Python before. However, I’ll give it a go soon.

Here is the first iteration of the multivariate results so far using ANTs.

b0 averaged across 5 subjects after the smoothing step:



Do you think I should continue with this method of registration or use your framework? By the way, I used SCT tools for most of my preprocessing steps, except the template-generation step.

Many thanks in advance for your help and support,


Please find the data attached in the link below

Password is Canada-2023

The syntax that I used:

sct_smooth_spinalcord -i t1_masked.nii.gz -s t1_den_RPI_centerline.nii.gz

I would not smooth the data before template creation, given that: (i) a template-creation process involves multiple subjects, so ‘smoothing’ will naturally occur when aggregating and averaging all subjects together; (ii) smoothing will reduce your effective image resolution; (iii) sct_smooth_spinalcord involves straightening of the cord, which in general is fine, but in your case you have an extremely high resolution, and I’m afraid the through-slice interpolation will introduce more errors than if you were not smoothing.


Hi @jcohenadad,

I agree with your point of view, so I ran the template generation without smoothing, but the final template after 4 iterations looks like the following:

Do you think this damage and variability can be removed (or reduced) during registration? I tried multiple times but didn’t find an automatic way to solve it in the ANTs scripts.

Please see the previous message above for the last 3 screenshots after the smoothing step, and here without it.

Which ones are you referring to?


Screenshot from 2023-03-07 04-54-26

Sorry, but I don’t understand. Can you please elaborate/describe in more detail, rather than only posting a single picture without any explanation?


Sorry @jcohenadad for the misunderstanding I caused. My dataset has some artifacts from scanning, and when the images are averaged, the template quality is affected. From what I read, and as you said, these issues would be handled during the registration process. I thought the program would automatically find the local areas/voxels greater than 2× the standard deviation and mask them out when taking the template average during each iteration. In my case, ANTs didn’t handle these issues and the template quality is not quite perfect. I read in your documentation that smoothing the data could improve the quality; I’m not sure whether this step will help in my case or whether I should skip it.

What kind of artefact? This information is important for me to advise. E.g., if it is B1+ bias, then smoothing will absolutely not help.


Thanks @jcohenadad once again,

As you can see below, these datasets were scanned a long time ago. The last screenshot shows tissue damage. These effects appear like threads across the image after averaging. I don’t know if there is a way to replace the signal at each voxel with a weighted average of that voxel’s neighbors using your smoothing tools prior to averaging, and whether that would help.

I don’t think it is a good idea to compromise the entire image with smoothing (which, again, will reduce the effective resolution) to fix a very small proportion of your image. Moreover, smoothing will not fix the issue; it will only mitigate it.

One possibility would be to mask out this region during your template creation. If you have enough samples, the transition at the edge of the mask won’t be too visible. To make it even less visible, you could use a soft mask (i.e., 3D-smooth your mask using a 3D kernel; i.e., not with sct_smooth_spinalcord).
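
To sketch what I mean by a soft mask (the filenames and sigma value are placeholders; any tool that applies a true 3D Gaussian would do):

```shell
# Turn a binary artifact mask into a soft (feathered) mask by 3D Gaussian
# smoothing; the sigmas are in mm for x,y,z (placeholder values to tune)
sct_maths -i mask_bin.nii.gz -smooth 2,2,2 -o mask_soft.nii.gz
```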


Hi @jcohenadad

Thank you very much for your feedback. I tried what you suggested using FSL and got the result below. I used ITK-SNAP to mask out the affected region and used the 3D Gaussian kernel implemented in FSL. I’m not sure whether I did what you recommended correctly.

I tested all the commands below to try to find a solution for the above issue.


fslmaths 3889_t1.nii.gz -mas 3889_t1_itksmask1.nii.gz -kernel gauss 3 3889_t1_itksmask1_gauss3.nii.gz
fslmaths 3889_t1.nii.gz -mas 3889_t1_itksmask1.nii.gz -kernel gauss 3 -s 3889_t1_itksmask1_gauss3.nii.gz
fslmaths 3889_t1_itksmask1_gauss3.nii.gz -kernel gauss 3 -s 3889_t1_itksmask1_gauss3s.nii.gz
fslmaths 3889_t1_itksmask1_gauss3.nii.gz -kernel 3D -s 3889_t1_itksmask1_gauss3s.nii.gz
fslmaths 3889_t1_itksmask1_gauss3.nii.gz -kernel gauss 3 3889_t1_itksmask1_gauss3b.nii.gz
fslmaths 3889_t1_itksmask1_gauss3.nii.gz -kernel gauss 3 -fmean 3889_t1_itksmask1_gauss3b.nii.gz
fslmaths 3889_t1_itksmask1_gauss3.nii.gz -kernel gauss 1 -fmean 3889_t1_itksmask1_gauss1.nii.gz &
fslmaths 3889_t1_itksmask1.nii.gz -binv 3889_t1_itksmask1_binv.nii.gz
fslmaths 3889_t1.nii.gz -mul 3889_t1_itksmask1_binv.nii.gz -add 3889_t1_itksmask1_gauss1.nii.gz 3889_t1_corrected.nii.gz &
fslmaths 3889_t1.nii.gz -mul 3889_t1_itksmask1_binv.nii.gz 3889_t1_corrected_hole.nii.gz &
fslmaths 3889_t1.nii.gz -mul 3889_t1_itksmask1_binv.nii.gz -add 3889_t1_itksmask1_gauss1.nii.gz -add 3889_t1_itksmask1_gauss1.nii.gz 3889_t1_corrected_add22.nii.gz


I’m pretty sure there is an effective tool implemented in SCT that could handle this issue in the correct way.

In addition, I tried to install your GitHub - neuropoly/template: A framework for creating unbiased MRI templates of the spinal cord, but there is an administrator-permission issue. I’ll try to fix it soon.

Would there be any help from your team, please, to test template generation using the ANIMAL registration framework, part of the IPL longitudinal pipeline? This may improve our template result compared to ANTs.

I’m also confused about whether I should use the GitHub - neuropoly/template: A framework for creating unbiased MRI templates of the spinal cord pipeline or the exvivo-template/generate_template at master · sct-pipeline/exvivo-template · GitHub pipeline.

Many thanks in advance for all your help and support.




The smoothing does not look right-- you need to ignore the ‘zeroed’ voxels during your smoothing. Anyway, I don’t think that smoothing is a good idea.
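
For what it’s worth, one standard way to ignore the zeroed voxels is normalized convolution: smooth the masked image and the weight mask separately, then divide. A sketch with fslmaths, reusing the filenames from your commands above (the sigma of 2 mm is an arbitrary placeholder, and I am assuming 3889_t1_itksmask1.nii.gz marks the artifact region, so its inverse marks valid tissue):

```shell
# Weights: 1 in valid tissue, 0 in the artifact region (assumed semantics)
fslmaths 3889_t1.nii.gz -mul 3889_t1_itksmask1_binv.nii.gz -s 2 t1_num.nii.gz
fslmaths 3889_t1_itksmask1_binv.nii.gz -s 2 t1_den.nii.gz
# Dividing cancels the contribution of the zeroed voxels to the local average
fslmaths t1_num.nii.gz -div t1_den.nii.gz t1_smoothfill.nii.gz
# Paste the filled values back only inside the artifact region
fslmaths t1_smoothfill.nii.gz -mas 3889_t1_itksmask1.nii.gz t1_fill_in.nii.gz
fslmaths 3889_t1.nii.gz -mul 3889_t1_itksmask1_binv.nii.gz -add t1_fill_in.nii.gz 3889_t1_inpainted.nii.gz
```

That said, as noted above, I would still avoid smoothing the whole image.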

About your installation issue with the NIST pipeline, a few students on my team are currently looking into it; they will post additional information in the repo to help with the installation.
