NiftyReg Parameters in BrainReg


Has anyone come up with a systematic way of adjusting the NiftyReg parameters in BrainReg to get a more precise registration? I am currently repeating registration on the same file after changing a single parameter to compare outputs, but I’m sure there is a better way to go about optimizing the values.

I have read the descriptions of the parameters at Registration parameters - brainglobe. The descriptions are helpful for understanding what each parameter controls, but it is still unclear to me how adjusting a parameter up or down will affect the resulting registration.

To better understand, I reviewed the NiftyReg documentation, the aMAP article, and the Modat et al. paper on the deformation algorithms used. Much of the material was beyond what I can absorb in the time I have to devote to this, which is unfortunate, because I suspect that confidently adjusting the parameter values really requires understanding these articles. From the Methods section of the aMAP article I see that “a parameter search was performed”, and many of those values appear to be the defaults used for BrainReg registration; with my dataset, however, the defaults are pretty far off.

I am happy to help develop a systematic approach if none exists. I am also happy to share my results and data to help any way I can.



Hi Justin,

There isn’t a systematic approach for brainreg, although developing one would be great!

The original aMAP paper was released with the software that was used to perform the parameter search, but I guess that’s not too helpful for you.

As you’ve noticed, brainreg has a lot of parameters, so I would start by working out which step is the limiting one. If you run brainreg with the --debug flag, it will save the intermediate steps, as nifti files, in the niftyreg directory. If you haven’t worked with these before, they’ll open in Fiji, or in napari if you install the napari-medical-image-formats plugin. With this data you can check whether, for example, the initial affine registration step is performing well (load the brain_filtered.nii and affine_registered_atlas_brain.nii images and see if they roughly align).
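If you want to quantify “roughly align” rather than just eyeballing it, a quick Dice overlap between thresholded masks can help. This is only a sketch: the file names match the debug output mentioned above, and loading the volumes with nibabel is an assumption (any nifti reader will do):

```python
import numpy as np

def dice(a, b, thresh=0):
    """Dice overlap of two volumes after thresholding to binary masks.
    1.0 means identical masks, 0.0 means no overlap."""
    a_mask = a > thresh
    b_mask = b > thresh
    denom = a_mask.sum() + b_mask.sum()
    if denom == 0:
        return 1.0  # both masks empty: vacuously perfect overlap
    return 2.0 * np.logical_and(a_mask, b_mask).sum() / denom

# In practice, load the brainreg debug outputs, e.g. with nibabel:
#   import nibabel as nib
#   sample = nib.load("niftyreg/brain_filtered.nii").get_fdata()
#   atlas  = nib.load("niftyreg/affine_registered_atlas_brain.nii").get_fdata()
#   print(dice(sample, atlas))
```

A Dice score well below ~0.9 at this stage would suggest the affine step is the problem.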

If the affine step is working OK, you can focus on the freeform registration step, which runs much more quickly. You can’t run the steps independently in brainreg, but you can run the NiftyReg commands directly (e.g. by copying and pasting what gets saved in freeform.log). It’s on my to-do list to implement a napari plugin that allows the steps to be run independently and the parameters to be tweaked easily, but it’s still a way off.
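As a rough sketch of the copy-and-paste approach (purely illustrative: it assumes the NiftyReg freeform invocation, the reg_f3d binary, appears verbatim on one line of freeform.log), you could pull the command out programmatically before editing its parameters:

```python
def extract_niftyreg_command(log_text, binary="reg_f3d"):
    """Return the first log line containing the NiftyReg binary name,
    so it can be edited and re-run with different parameters."""
    for line in log_text.splitlines():
        if binary in line:
            return line.strip()
    return None  # command not found in this log
```

You would then read freeform.log from the brainreg output directory, pass its contents to this helper, and tweak the flags on the returned command line before re-running it.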

As well as focusing on one step (e.g. freeform registration), you can also speed up the optimisation by testing parameters with a lower-resolution atlas (e.g. 50um), and then testing only a small number of promising combinations at the higher resolutions.
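A coarse sweep at 50um could be scripted along these lines. This is only a sketch: the --bending-energy-weight and --grid-spacing flags come from the brainreg docs, but the paths, voxel sizes, parameter values, and atlas name here are placeholders you would replace with your own:

```python
from itertools import product

# Hypothetical parameter grid; negative grid spacings denote voxels.
bending_weights = [0.90, 0.95, 0.99]
grid_spacings = [-10, -15, -20]

# One brainreg invocation per combination, each writing to its own output
# directory so the results can be compared afterwards.
commands = [
    f"brainreg sample/ output/bw{bw}_gs{gs}/ -v 34.8 1.2 1.2 "
    f"--atlas allen_mouse_50um "
    f"--bending-energy-weight {bw} --grid-spacing {gs} --debug"
    for bw, gs in product(bending_weights, grid_spacings)
]

for cmd in commands:
    print(cmd)  # or launch each run, e.g. subprocess.run(cmd.split())
```

Nine runs at 50um are far cheaper than nine at 10um, and you only need to re-run the best one or two combinations at the resolution you actually care about.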

Lastly, it may be the case that your data is so different to what brainreg is designed for that some of our assumptions don’t hold. We preprocess (filter) the data in a specific way, and it may be worth experimenting with different types of preprocessing to better match the sample data to the atlas prior to registration.
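For example (purely illustrative; this is not brainreg’s actual filter), clipping the brightest voxels, which are often tracer rather than anatomy, and rescaling intensities is one cheap experiment:

```python
import numpy as np

def preprocess(volume, clip_percentile=99):
    """Clip very bright voxels (e.g. tracer signal) and rescale to [0, 1],
    so the sample's intensity profile is closer to the atlas template."""
    vol = volume.astype(float)
    ceiling = np.percentile(vol, clip_percentile)
    vol = np.clip(vol, 0, ceiling)
    vmin, vmax = vol.min(), vol.max()
    return (vol - vmin) / (vmax - vmin) if vmax > vmin else vol
```

You would apply something like this to the sample volume before handing it to the registration, and compare the result against the unfiltered run.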

Hope this helps in some way! It would be great to make brainreg more flexible, and allow users to more easily test parameter combinations, but I don’t think I’ll have the time for a while.



Thank you for the helpful suggestions. As you can see from the images, I believe there is a problem with the initial affine step. What parameters would you suggest be changed?
[image: affine reg]
[image: Brain Filtered]

Sorry, I think I told you to compare the wrong images. Could you try overlaying downsampled_filtered.nii and affine_registered_atlas_brain.nii? They should roughly overlap, but it doesn’t need to be too precise at this stage.

No problem. Here are the results. Would you say this is precise enough to proceed?


Yep, that looks fine. Your data looks to have relatively low signal-to-noise in some areas, which may be why the freeform registration isn’t working so well. Could you share the final registration result?

I agree about the signal-to-noise. We are still working on the clearing and imaging steps.

Oh yeah, that’s bad!

Longer term, I think we might have to investigate additional preprocessing for data like this. At the moment, we assume that the sample data is relatively similar to the atlas data.

It looks like the freeform registration is making it worse (compared to just affine), so I would try to constrain this more. I don’t have any direct suggestions, but things I would try are:

  • Increasing --bending-energy-weight
  • Increasing --grid-spacing (the values are negative to denote voxels, so to increase the spacing you actually make the number more negative, e.g. going from -10 to -20)
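To make that sign convention concrete, here is a toy helper (an illustration, assuming the NiftyReg behaviour that negative values are in voxels and positive values are in physical units, the same units as the voxel size). Going from -10 to -20 means a control point every 20 voxels instead of every 10, i.e. a coarser grid that constrains the deformation more:

```python
def grid_spacing_in_voxels(value, voxel_size):
    """Interpret a NiftyReg-style --grid-spacing value as a spacing in voxels.

    Negative values denote voxels directly; positive values are assumed to
    be physical units (the same units as voxel_size)."""
    if value < 0:
        return -value  # e.g. -10 -> a control point every 10 voxels
    return value / voxel_size
```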

You could also try registering with a different resolution atlas. What resolution is your data, and which atlas are you using?

For the record, the scan is of an ETC-cleared brain imaged in CUBIC-mount with a 20x Glyc 1.0NA objective and a confocal equipped with a spectral detector. The brain had RFP injected into the right PFC and GFP injected into the left. Horizontal-plane optical sections were taken with sequential 405/561nm and 488/640nm laser excitation, capturing tuned emission channels that each had a signal and a background component. Z-steps were 34.8um and the XY resolution was 1.2um/px.

I am troubleshooting with the 50um Allen brain atlas and have also tried the 10um atlas.

I think the images are not ideal for automated registration, as neither the tracer nor the background signal is uniform, and the clearing process, while fast, resulted in significant warping of the tissue, perhaps beyond what an existing algorithm can reasonably account for. As we improve our clearing and imaging processes, I am finding it helpful to also learn about, and hopefully help improve, the analysis end.

I increased the --bending-energy-weight from 0.95 to 0.99 and noticed improvement particularly in registering the dorsal surface.

I will next try reducing the value for --grid-spacing. Would you recommend combining a higher --bending-energy-weight value with a lower --grid-spacing value? Do you know what the limits for these parameters are? Thank you again for your help!

That looks a bit better, but still pretty bad!

Yes, I would start with that, but to be honest you just need to play around. The parameters are not well documented, but the bending energy weight is between 0 and 1, with higher weights reducing the warping, and the grid spacing probably has no limit other than the image size.