Thanks for your answer, Derek.
Unfortunately, the Rescale option in the CIC module of my pipeline is set to 'No'. Rather than a trimmed-distribution problem, I think the issue comes from the call to `affine_transform` (from the `scipy.ndimage` module) inside the `centrosome.bg_compensate` module.
Indeed, according to the docs of `affine_transform`, the keywords `mode` and `cval` pad the result with zeros when the output coordinates point outside the boundaries of the input.

Extract from the `scipy.ndimage.affine_transform` docs:

> mode : str, optional
> Points outside the boundaries of the input are filled according to the given mode ('constant', 'nearest', 'reflect', 'mirror' or 'wrap'). Default is 'constant'.
> cval : scalar, optional
> Value used for points outside the boundaries of the input if mode='constant'. Default is 0.0
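To illustrate this padding behaviour (a minimal sketch with made-up data, not my actual image): upscaling a small constant image by 2 makes the last output rows/columns map outside the input grid, and the default `mode='constant'` fills them with `cval=0.0`:

```python
import numpy as np
from scipy.ndimage import affine_transform

img = np.full((4, 4), 5.0)

# Upscale by 2 into a 10x10 output.  Output rows/columns 8 and 9 map to
# input coordinates 4.0 and 4.5, outside the 4x4 input (valid indices 0..3),
# so with the default mode='constant' they are filled with cval=0.0.
out = affine_transform(img, np.diag([0.5, 0.5]), output_shape=(10, 10), order=1)
print(out[0, 0], out[-1, -1])  # 5.0 0.0
```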
Upon testing the `backgr` function (from `centrosome.bg_compensate`) with the test image attached to my original post, and with the settings from my pipeline (all keyword arguments of `backgr` left at their defaults except for `scale=2`), I indeed found the last column and the last row of the output set to 0.
Here is the last part of the `backgr` function, which I slightly modified in order to check the indexing and shapes:
```python
output = np.zeros(orig_shape, img.dtype)
print('output_shape =', tuple(clip_shape))
aff = affine_transform(res, inverse_transform,
                       output_shape=tuple(clip_shape),
                       order=3)
output[clip_imin:clip_imax, clip_jmin:clip_jmax] = aff
if input_mask is not None:
    output[~input_mask] = 0
```
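For what it's worth, here is a minimal sketch (using a made-up flat stand-in for `res` and the `inverse_transform` value from my console below) suggesting that passing `mode='nearest'` to that `affine_transform` call avoids the zeroed edge, since edge samples are then taken from the nearest input pixel instead of being padded with `cval=0`:

```python
import numpy as np
from scipy.ndimage import affine_transform

# Hypothetical stand-in for `res`: a flat negative background, half the
# size of the 750x750 output, like in my test case
res = np.full((375, 375), -1.0)
inverse_transform = np.diag([0.49933244, 0.49933244])

# Default call (mode='constant', cval=0.0), as in centrosome.bg_compensate:
# the edge values get pulled toward 0
filled = affine_transform(res, inverse_transform,
                          output_shape=(750, 750), order=3)

# Same call with mode='nearest': the flat background survives at the edges
clamped = affine_transform(res, inverse_transform,
                           output_shape=(750, 750), order=3, mode='nearest')
print(filled[-1, -1], clamped[-1, -1])
```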
And here is the call of the function and the printed results:

```
In : outp = backgr(img, scale=2)
RES SHAPE (375, 375)
output_shape = (750, 750)
clips 0 750 , 0 750
affshape (750, 750)
```
(the yellow pixels are all equal to 0, the others are negative here)
But I don't really get why this is happening, since as you can see the indexing is correct, and when manually calculating the correspondence between output and input coordinates, as explained in the `affine_transform` doc:

> Given an output image pixel index vector o, the pixel value is determined from the input image at position np.dot(matrix, o) + offset.

everything seems fine (the offset in the `backgr` call is null):
```
In : a = np.array([749, 749])
In : np.dot(inverse_transform, a)
array([ 374.,  374.])
```
where `inverse_transform` is
```
In : inverse_transform
array([[ 0.49933244,  0.        ],
       [ 0.        ,  0.49933244]])
```
Any idea what is going on? Could it be a floating-point rounding issue or something like that?
As for your other questions: I am not working with intensity images but with phase images reconstructed from a digital hologram, without any dye. This explains why I don't have a regular illumination function (it is not really an illumination function at all), and why there are concentric patterns, which are interference patterns. But the principle of the background correction remains valid/useful.
Finally, I also have a bunch of questions/remarks related to CellProfiler. I'll ask them here but will start a new thread if needed:
- Why can't we save, with the SaveImages module, 32-bit images with pixel values outside the range [-1, 1]? (I get the necessity to rescale for processing, but why for saving as well?)
- When I compare the WALL_TIME reported per worker and per module in the console with the actual time it takes to process one set of data, the real time is twice as long as the reported wall time. What am I missing here?
- I tried to use CellProfiler as a Python package and was wondering whether documentation is available for that (the wiki page https://github.com/CellProfiler/CellProfiler/wiki/CellProfiler-as-a-Python-package is a bit short).
- Same question for headless mode.
Thanks for your time and great work!