ImageMoments in imagej-ops: potential miscalculations

FYI, here is a quote from an email from a colleague who is helping me with image analysis… (the results use the same random image used in the test class that tests HuMoments)

We’ll submit a PR to imagej-ops master when we are confident in our findings. Cheers


Great… Thanks! I will forward the link to the implementer. Maybe he can comment on this, too.

@jaywarrick Thanks for the heads up.

Please strongly encourage your colleague to file PR(s), as well as issues for any remaining unsolved bugs, against the Ops repository. It would really be a shame, and a waste of time, for the core Ops developers to have to redo these fixes from your description.

Fantastic, looking forward to it!

@jaywarrick

Hi, I’m the implementer of the image moments in ImageJ. First of all, thanks for your feedback. Could you give me some additional information?

I’m currently comparing our results with the results from OpenCV (OpenCV 2.4.11, to be exact) and, unfortunately, I get the same results as OpenCV for the moments, central moments, normalized central moments, and Hu moments. Could you tell me how you calculated the moments with MATLAB? Maybe then I can compare their implementation/results with ours.


We can likely create a small code repo of the MATLAB tests as well, so everything is transparent. The last couple of differences have been elusive. In general, our assumption has been that the imagej-ops framework was correct, but in the case described at the top it turned out to be something in Ops. The way he described it to me, it was a case of integer division of exponent parameters (e.g., 2/3) that should have been done in double precision. Hopefully we’ll track down the remaining discrepancies with HuMoment 5 and 7 (whether on the MATLAB end or the Ops end) and submit the PR and test code. If it proves to elude us past today, we’ll submit the test code and an issue describing the remaining discrepancies so others can take a crack at it alongside us.
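
To make the integer-division point concrete, here is a minimal Java sketch of that kind of pitfall (an illustration only, not the actual Ops code; the variable m00 and the exponent 2/3 are just placeholders):

```java
// Illustration of the integer-division pitfall described above
// (not the actual imagej-ops code).
public class IntegerDivisionExample {
    public static void main(String[] args) {
        double m00 = 1234.5; // hypothetical moment value, for illustration

        // Integer division: 2 / 3 evaluates to 0 before Math.pow is called,
        // so this computes m00^0 == 1.0.
        double wrong = Math.pow(m00, 2 / 3);

        // Floating-point division gives the intended exponent of ~0.6667.
        double right = Math.pow(m00, 2.0 / 3.0);

        System.out.println("wrong = " + wrong);  // 1.0
        System.out.println("right = " + right);  // ~115.1
    }
}
```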

It would be interesting to compare your code against OpenCV (the results in our unit tests).

Here are the benchmark results after the corrections, using the createImg() method of the ImageMomentsTest.java class to create the test image.

Hu moment benchmark results:
               1                2                3                4                5                6                7
ImageJ-Ops:    +0.00130386      +8.61540e-11     +2.40612e-14     +1.24688e-13     -6.61044e-27     +1.13102e-18     +1.71626e-27
MATLAB:        +0.00130386      +8.61540e-11     +2.40612e-14     +1.24688e-13     -6.61044e-27     +1.13102e-18     +1.71626e-27
Difference:    -4.33681e-19     +4.62963e-23     +9.55468e-27     -4.63976e-26     +2.94017e-39     -5.77779e-33     +1.55403e-39

A pull request illustrating the code changes is coming soon.

The remaining discrepancies between MATLAB and imagej-ops were due to numerical precision differences: summing with a for loop in MATLAB accumulates slightly more rounding error than MATLAB's vectorized matrix/vector math, and at these magnitudes that is enough to show up.
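
For anyone curious what such a precision difference looks like, here is a minimal Java sketch (illustrative only, not the Ops or MATLAB code) comparing sequential accumulation, as a plain for loop would do, with pairwise summation, which is closer to what vectorized libraries often use:

```java
// Sketch of accumulation-order sensitivity in floating-point sums
// (illustrative, not taken from imagej-ops or MATLAB).
import java.util.Random;

public class SummationOrderExample {

    // Plain left-to-right accumulation, as a for loop would do.
    static double sequentialSum(double[] a) {
        double s = 0.0;
        for (double v : a) {
            s += v;
        }
        return s;
    }

    // Pairwise (divide-and-conquer) summation, which usually accumulates
    // less rounding error than a single running sum.
    static double pairwiseSum(double[] a, int lo, int hi) {
        if (hi - lo <= 8) {
            double s = 0.0;
            for (int i = lo; i < hi; i++) {
                s += a[i];
            }
            return s;
        }
        int mid = (lo + hi) >>> 1;
        return pairwiseSum(a, lo, mid) + pairwiseSum(a, mid, hi);
    }

    public static void main(String[] args) {
        Random rng = new Random(42);
        double[] values = new double[1_000_000];
        for (int i = 0; i < values.length; i++) {
            values[i] = rng.nextDouble();
        }
        double seq = sequentialSum(values);
        double pair = pairwiseSum(values, 0, values.length);
        // The two sums agree to roughly 15 significant digits but are
        // typically not bit-identical; differences of that size are enough
        // to matter for quantities down at the 1e-27 scale.
        System.out.println("sequential: " + seq);
        System.out.println("pairwise:   " + pair);
        System.out.println("difference: " + (seq - pair));
    }
}
```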

Hope this helps.

-Dave

I see that you filed imagej/imagej-ops#301 for this. Thank you very much for that.

It appears that the tests use a tolerance of 1e-3, but many of the expected values are on the order of 1e-11 to 1e-27. The test image is an image of noise, so these moments end up very small. I suppose one approach would be to tighten the tolerance on the tests and/or to change the test image so that differences in the calculations are more noticeable. A relative tolerance instead of an absolute tolerance might also be appropriate for some test assertions (e.g., accurate to within a percent). Is that done?
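
For example, a relative-tolerance check could look something like this (a hypothetical JUnit 4 sketch, not code from the Ops test suite; the helper names and the tolerance value are made up for illustration):

```java
// Hypothetical helpers contrasting absolute vs. relative tolerance when
// expected values span many orders of magnitude (1e-3 down to 1e-27).
import static org.junit.Assert.assertEquals;
import static org.junit.Assert.assertTrue;

public class ToleranceExample {

    // Absolute tolerance: any expected value much smaller than 1e-3 passes
    // trivially, so e.g. 2.4e-14 could be wrong by orders of magnitude and
    // the assertion would still succeed.
    static void assertAbsolute(double expected, double actual) {
        assertEquals(expected, actual, 1e-3);
    }

    // Relative tolerance: require agreement to within a fraction of the
    // expected magnitude (e.g. relTol = 0.01 for "within one percent").
    static void assertRelative(double expected, double actual, double relTol) {
        double allowed = Math.abs(expected) * relTol;
        assertTrue("expected " + expected + " but was " + actual,
                Math.abs(expected - actual) <= allowed);
    }
}
```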
