I’m looking for advice on how best to assess the chromatic aberration of a dual-camera microscope. We acquired image stacks of auto-fluorescent beads, and we know there are (at least) two kinds of **aberration**/**distortion**:

- Misalignment of the two cameras (rotation about the optical axis)
- Chromatic aberration due to the different wavelengths used in two channels

In the past, we’ve been correcting for these distortions by measuring a combination of:

- a **2D affine transformation** (to account for rotation, shear, and scaling in the focal plane)
- a **3D translation model** (to account for possible shifts in the z direction, along the optical axis)

This worked mostly to our satisfaction.
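For reference, such a combined model can be estimated from matched bead centers with scikit-image and NumPy. This is only a sketch: the bead coordinates below are simulated, and in practice `src`/`dst` would come from a spot detector run on each channel.

```python
import numpy as np
from skimage.transform import AffineTransform

# Hypothetical matched bead centers (N x 3, columns = x, y, z),
# e.g. obtained from a spot detector in each channel.
rng = np.random.default_rng(0)
src = rng.uniform(0, 512, size=(50, 3))

# Simulate channel 2: small rotation/scale in xy plus a constant z shift.
theta, dz = np.deg2rad(0.5), 1.2
rot = np.array([[np.cos(theta), -np.sin(theta)],
                [np.sin(theta),  np.cos(theta)]])
dst = src.copy()
dst[:, :2] = src[:, :2] @ rot.T * 1.01 + [3.0, -2.0]
dst[:, 2] += dz

# 2D affine transformation in the focal plane, estimated by least squares.
tform = AffineTransform()
tform.estimate(src[:, :2], dst[:, :2])

# 3D translation along the optical axis: mean residual z shift.
z_shift = np.mean(dst[:, 2] - src[:, 2])

print("affine matrix:\n", tform.params)
print("z shift:", z_shift)
```

The estimated affine matrix can then be applied to bead coordinates with `tform(points)`, and the z shift added separately.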

For some background, see also this related topic from 2015:

Now we have a project where we’d like to determine, as accurately as possible, the 3D distance between two corresponding points in the two channels, so I’d like to revisit how we measure the aberration.

I’d expect the transformation field (at least along the Z axis) to be somewhat **non-linear**, e.g. when looking at an xz view of a 3D stack:
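One simple way to test for this suspected non-linearity (a sketch, assuming matched bead centers are available; the measurements below are simulated) is to fit the per-bead z displacement as a function of z with polynomials of increasing degree and compare the residuals:

```python
import numpy as np

# Hypothetical per-bead measurements: z position in channel 1 and the
# measured z displacement of the matching bead in channel 2.
z = np.linspace(0, 40, 30)           # bead z positions (in um)
dz = 0.8 + 0.02 * z + 0.001 * z**2   # simulated non-linear z shift

# Fit constant, linear, and quadratic models and compare RMS residuals:
# a clearly smaller residual for the quadratic fit indicates that a pure
# translation (or even an affine) model along z is insufficient.
residuals = {}
for degree in (0, 1, 2):
    coeffs = np.polyfit(z, dz, degree)
    residuals[degree] = np.sqrt(np.mean((np.polyval(coeffs, z) - dz) ** 2))

print(residuals)
```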

So here are my questions:

- Does anybody have experience measuring/correcting this kind of aberration in 3D microscopy data?
- Is there anything available (in ImageJ, ITK, scikit-image, or any other tool) to measure non-linear (geometric) transformations and apply them to coordinates of a point cloud (i.e. bead centers), or to an image?
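One possible approach for the non-linear case, sketched here with SciPy (the bead coordinates are simulated, and the parameter choices are assumptions, not a tested pipeline): interpolate the displacement field between the two channels from the bead correspondences with a thin-plate spline, then evaluate it at arbitrary coordinates.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hypothetical bead centers in channel 1 (src) and the matched centers
# in channel 2 (dst), both N x 3 in (z, y, x) order.
rng = np.random.default_rng(1)
src = rng.uniform(0, 100, size=(80, 3))

# Simulate a smooth non-linear distortion: a z shift growing with z.
dst = src.copy()
dst[:, 0] += 0.5 + 0.001 * src[:, 0] ** 1.5

# Thin-plate-spline interpolation of the 3D displacement field;
# increase `smoothing` to regularize against bead-localization noise.
field = RBFInterpolator(src, dst - src,
                        kernel="thin_plate_spline", smoothing=0.0)

# Apply the correction to an arbitrary point cloud (e.g. spot centers).
points = rng.uniform(10, 90, size=(5, 3))
corrected = points + field(points)
print(corrected - points)
```

To warp a whole image rather than a point cloud, the same field could be evaluated on the voxel grid and used to resample with `scipy.ndimage.map_coordinates`; in 2D, `skimage.transform.PiecewiseAffineTransform` is another option.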

I briefly had a look at the source of `multiview-reconstruction` by @StephanPreibisch et al., but was overwhelmed by the complexity of the project (and the sparsity of javadoc in some of the classes). Is there maybe some example code illustrating how to interact with the API? Also, I was not sure whether `multiview-reconstruction` supports non-linear transformations or only affine ones?