Non-Local Means does not work on 16-bit images?

I found a peculiar behavior of the Non-Local Means filter (https://imagej.net/Non_Local_Means_Denoise) and I am not sure whether it is by design or a bug.

I have an approx. 1000 × 1000 pixel 16-bit image obtained by X-ray microCT (see attachment):
16bitComplete.tif (1.9 MB)

It has a dark (zero-value) edge all around.

I applied the Non-Local Means filter, always using the "Auto estimate sigma" option. I tested it on the full image and on square ROIs, both at 16-bit and after converting the image to 8-bit.
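
For reference, here is a Jython sketch of the procedure, runnable from the Fiji script editor (Language > Python). The plugin command string and option names are what the Macro Recorder should report for this plugin, but please double-check them in your installation; the two larger ROI sizes are placeholders (only the smallest, approx. 200 × 200 px, is quoted exactly below).

```python
from ij import IJ

NLM_CMD = "Non-local Means Denoising"          # assumed recorded command name
NLM_OPTS = "sigma=15 smoothing_factor=1 auto"  # 'auto' = "Auto estimate sigma"

imp = IJ.openImage("/path/to/16bitComplete.tif")  # adjust the path

# Full image at 16-bit
full16 = imp.duplicate()
IJ.run(full16, NLM_CMD, NLM_OPTS)
full16.setTitle("16bitComplete_NLM_Auto")
full16.show()

# Full image after conversion to 8-bit
full8 = imp.duplicate()
IJ.run(full8, "8-bit", "")
IJ.run(full8, NLM_CMD, NLM_OPTS)
full8.setTitle("8bitComplete_NLM_Auto")
full8.show()

# Centered square ROIs of decreasing size, cropped and then filtered at 16-bit
for size in [600, 400, 200]:
    crop = imp.duplicate()
    crop.setRoi((crop.getWidth() - size) / 2, (crop.getHeight() - size) / 2,
                size, size)
    IJ.run(crop, "Crop", "")
    IJ.run(crop, NLM_CMD, NLM_OPTS)
    crop.setTitle("16bit_%dpx_NLM_Auto" % size)
    crop.show()
```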

I found that at 16-bit the result depended strongly on the ROI size. On the full image the filter had no visible effect. As the ROI got smaller the filter started to work, but left many artifacts, except for the smallest ROI (approx. 200 × 200 pixels). At 8-bit, the ROI size had little effect; even the full image was nicely denoised, although some artifacts remained visible.

At 16-bit the result was always worse than at 8-bit, except for the smallest ROI, where the two results were visually similar. The attached images illustrate this. I did not attach the result for the full image at 16-bit, as the effect of the filter is invisible there.

16bitLargeSquare_NLM_Auto.tif (901.1 KB)
16BitMediumSquare_NLM_Auto.tif (322.2 KB)
16BitSmallSquare_NLM_Auto.tif (64.9 KB)

8bitComplete_NLM_Auto.tif (978.6 KB)
8BitLargeSquare_NLM_Auto.tif (450.6 KB)
8BiMediumSquare_NLM_Auto.tif (161.2 KB)
8BitSmallSquare_NLM_Auto.tif (32.6 KB)

I would appreciate any comments on this, as ideally I would like to operate on the full image at 16-bit.

Thank you!