Illumination correction

cellprofiler

#1

Dear Team,

I have two sets of images: one containing 10 images of control cells and the other containing 10 images of treated cells. These are all confocal images, and we are looking for nuclear translocation of two proteins as a result of treatment. One protein is labeled with a red dye and the other with green; the nucleus is blue.

My question is:
should I calculate an illumination correction function for each image and apply it to that same image while running through the pipeline?
OR
should I calculate the illumination correction function from all images in one pipeline and then apply the saved function to each image?

What's suggested to get the best results?
And what would be the difference between the two methods?

Thanks
Mridul KK


#2

Hi,

Generally, you want to calculate the illumination function using all images generated on the same microscope, under the same experimental conditions. This is because illumination correction addresses an issue with the optics used in the experiment, which can vary over time, even from day to day. Ideally, the function is best estimated from a large set of images, so you can try creating one from all the images. Keep in mind that there must be a separate illumination function for each channel.

However, 10 images is not much to create an illumination function from; several hundred or thousands are usually better (e.g., from a 384-well plate). So if using all the images doesn't work (your 2nd option), you may need to create one for each image (your 1st option).
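
(For intuition, here is a minimal sketch of the "one function from all images" idea, assuming the images for a single channel are already loaded as a list of 2-D NumPy arrays; this is only an approximation of what CorrectIllumination_Calculate does, not its actual code:)

    import numpy as np
    from scipy.ndimage import median_filter

    def estimate_illumination(images, filter_size=100):
        """Rough per-channel illumination function from a set of images.

        images      -- list of 2-D arrays (one channel, same shape)
        filter_size -- median-filter width in pixels; should be larger
                       than a typical cell so cells are smoothed away
        """
        stack = np.stack(images).astype(float)
        mean_image = stack.mean(axis=0)        # individual cells average out over many images
        smoothed = median_filter(mean_image, size=filter_size)
        return smoothed / smoothed.min()       # rescale so the minimum is 1.0

    # 'Regular' method applied with 'Division':
    # corrected = raw_image / illum_fn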

Have a look at this thread. My colleague Kate does an excellent job of explaining the functionality of illumination correction in CellProfiler there.

Regards,
-Mark


#3

Hi Mark,

Thanks for the explanation; the thread by Kate was fantastic for understanding this.

Because of a couple of issues, I was thinking along different lines.

  1. In my untreated cells, translocation of the drug-induced protein to the nucleus ideally should not occur, or should occur only at very low basal levels, whereas this translocation should be much higher in treated cells. Attached are example figures (control & treated).

By eye and by western blot, we can see the difference much more clearly. However, the numerical values obtained by CP do not reflect the difference in that manner. Can you suggest something here?

I now have a total of 350+ images for the entire experiment. I'm running the illumination correction pipeline as follows:

LoadImages
CorrectIllumination_Calculate … for blue channel
SaveImages
CorrectIllumination_Calculate…for green channel
SaveImages
CorrectIllumination_Calculate…for red channel
SaveImages

In CorrectIllumination_Calculate, I'm using 'regular' with 'division' at the apply step. I tried this in combination with Gaussian filtering with automatic/100/200/300/50, etc., as well as median filtering with automatic/100/200/300/50.
Attached are the different illumination functions (gfp_illumfn). Can you please help me determine which are better than the others, and why?

Thanks a lot for teaching me these basic but important things.
Mridul KK






#4

Hi,
I was trying to upload images of the illumination functions, but it says the following:

Sorry, the board attachment quota has been reached.

Please let me know how I can deal with this.
Thanks
Mridul KK


#5

[quote=“mridulkk”]In CorrectIllumination_Calculate, I'm using 'regular' with 'division' at the apply step. I tried this in combination with Gaussian filtering with automatic/100/200/300/50, etc., as well as median filtering with automatic/100/200/300/50.
Attached are the different illumination functions (gfp_illumfn). Can you please help me determine which are better than the others, and why?[/quote]

Try using YouSendIt to upload the files and post the link here when done.

However, without seeing the files, the general principle when creating an illumination correction function is that the resultant function should look like the illumination pattern of an actual microscope. This typically means the following:

  • The function is usually brightest towards the center of the image and becomes darker towards the image edges.

  • The function varies in intensity by about 10-20% across the image. By this, I mean that the lowest value is 1 and the highest value is 1.1-1.2 (for example).

  • It should be fairly smooth across the image. If you can still see obvious blobs where the cells are, then you either have not smoothed the image enough (in which case, you can increase the artifact width size) or you need more images for the cumulative average.

Either the Average or the Median filters can give you functions that are satisfactory, but typically we prefer the Median filter because it is more statistically robust against outlier values. However, if you find that neither is giving you adequate results, you can use Fit Polynomial to give you a quadratic curve fit to the data, which is guaranteed to be smooth.
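
(As an illustration of why a polynomial fit is guaranteed to be smooth, here is a minimal sketch of a quadratic-surface fit by least squares; the exact polynomial CellProfiler fits may differ:)

    import numpy as np

    def fit_quadratic_surface(image):
        """Fit z = a + b*x + c*y + d*x^2 + e*x*y + f*y^2 to an image."""
        h, w = image.shape
        yy, xx = np.mgrid[0:h, 0:w]
        x = xx.ravel() / w                 # normalized coordinates keep the
        y = yy.ravel() / h                 # design matrix well conditioned
        A = np.column_stack([np.ones_like(x), x, y, x**2, x*y, y**2])
        coeffs, *_ = np.linalg.lstsq(A, image.ravel().astype(float), rcond=None)
        surface = (A @ coeffs).reshape(h, w)
        return surface / surface.min()     # rescale so the minimum is 1.0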

Hope this helps!
-Mark


#6

Hi Mark,

I used all types of filtering: median, Gaussian, and polynomial fitting too, as per your suggestion, but there were no significant improvements as such. What should I do now?

Also, how can we work better with dim images, if that's a problem?

I have also attached three illumination function images that I generated for the green (GFP) channel over the 350+ images, all taken under the same conditions on the same microscope. Two of them use median filtering with different artifact widths, and the third uses polynomial fitting. Which one is better and why, and what are the criteria for deciding?

Thanks
Mridul KK






#7

Hi,

You can see in the Illum_green1 image that the image has some higher-intensity blobs, which indicates that it is not as good a correction function as the Illum_green5 image (I only saw two images attached to your post). However, the fit polynomial looks OK, so that may be usable.

One important question, if you are using the 'regular' method, is whether the cells are, on average, evenly distributed across the field of view before you use or apply any of these functions. If the cells are all in the same location in all the images that you use to create the function, then all you will get is a highly smoothed version of the cellular image, which will be useless for illumination correction. The same is true for the 'background' method.

If the illumination correction is not helping, then it may be that the images don't need very much correction. Illumination correction will help the images if there is a systematic, image-wide distortion; if there isn't, it doesn't do much. Still, we often include this step anyway, just to be on the safe side.

Also, have you looked at the SBS example pipeline? It seems very similar to what you are doing…

Regards,
-Mark


#8

Hi,

You can see in the Illum_green1 image that the image has some higher-intensity blobs, which indicates that it is not as good a correction function as the Illum_green5 image (I only saw two images attached to your post). However, the fit polynomial looks closer to what a typical illumination correction function looks like, so that may be usable.

One important question, if you are using the 'regular' method, is whether the cells are, on average, evenly distributed across the field of view before you use or apply any of these functions. If the cells are in the same locations in all the images that you use to create the function, then all you will get is essentially a highly smoothed version of a single cellular image, which will be useless for illumination correction. The same is true for the 'background' method.

If the illumination correction is not helping, then it may be that the images don't need very much correction. Illumination correction will help the images if there is a systematic, image-wide distortion; if there isn't, it doesn't do much. Still, we often include this step anyway, just to be on the safe side.

Regards,
-Mark


#9

Hi Mark,

Thanks for your reply.

Now I understand illumination correction much better.

In our set of images, the cells are randomly located from image to image, and the number of cells also varies. But some cells are brighter than others, and I guess illumination correction is helping to pick out those dimmer cells. Is there a better way of doing this?

How can we work with dim images?

best regards
Mridul KK


#10

[quote=“mridulkk”]
In our set of images, the cells are randomly located from image to image, and the number of cells also varies. [/quote]

In this case, using illumination correction is appropriate, even if the amount of correction is small.

It may be helpful to think of illumination correction as being more about removing the contribution of the imaging device that could be producing these bright cells. For example, if the bright cells were always located in the center of the image, this could be because the microscope illumination is brightest in the center, and correcting for illumination might remove an otherwise excessive cell intensity.

However, if the cells in the center were always bright for some other, perhaps biological, reason, then correcting for it might not be helpful. It is probably worth checking to see whether these bright cells are scattered across the image, as opposed to being concentrated in a specific location.
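
(One way to do that check outside CellProfiler is to pool per-cell centroids and mean intensities over all images and look at where the brightest cells fall; a minimal sketch, assuming per-image label and intensity arrays are available and using scikit-image's regionprops:)

    import numpy as np
    from skimage.measure import regionprops

    def bright_cell_heatmap(label_images, intensity_images, image_shape,
                            n_bins=8, percentile=90):
        """2-D histogram of where the brightest cells sit, pooled over all images."""
        ys, xs, means = [], [], []
        for labels, intensities in zip(label_images, intensity_images):
            for region in regionprops(labels, intensity_image=intensities):
                ys.append(region.centroid[0])
                xs.append(region.centroid[1])
                means.append(region.mean_intensity)
        means = np.asarray(means)
        bright = means > np.percentile(means, percentile)   # keep the top ~10% of cells
        heatmap, _, _ = np.histogram2d(
            np.asarray(ys)[bright], np.asarray(xs)[bright], bins=n_bins,
            range=[[0, image_shape[0]], [0, image_shape[1]]])
        return heatmap   # a single hot spot suggests an optics effect rather than biology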

For illumination correction, dim images shouldn't make much of a difference; the main assumption is that the microscope optics and illumination function are relatively similar for all images. For dim images, the main concern will be segmentation of the cells themselves, and that will involve examining your settings in IdentifyPrimAutomatic (or a similar module).

Regards,
-Mark


#11

Hi Mark,

Thanks for such a prompt reply.

In our set of images, the bright cells are scattered across the image rather than concentrated in any specific location. So I guess the illumination correction function has a good job to do, and it's working well.

After applying the illumination correction function, how can I save each of the corrected images? I want to compare the original to the illumination-corrected image!

And is there any way to find the intensity of each pixel within each object, so that I can look at the intensity histogram of the pixels comprising an object such as a nucleus? I'm aware of the intensity histogram for the whole image.
This information would be a great help to us, if we can find it out.
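
(CellProfiler's measurements are per-object summaries rather than raw pixel lists, but as an illustration of the per-object pixel-histogram idea, here is a minimal sketch, assuming you have a nuclei label image and the matching grayscale image as NumPy arrays with intensities scaled to [0, 1]:)

    import numpy as np

    def per_object_pixel_histograms(label_image, intensity_image, n_bins=50):
        """Return {object label: (counts, bin_edges)} for every labeled object."""
        histograms = {}
        for obj in np.unique(label_image):
            if obj == 0:                      # 0 is background in a label image
                continue
            pixels = intensity_image[label_image == obj]
            # assumes intensities are scaled to [0, 1], as in CellProfiler
            histograms[obj] = np.histogram(pixels, bins=n_bins, range=(0.0, 1.0))
        return histograms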

regards
Mridul KK


#12

Hi,

Would you be able to post your pipeline plus perhaps a sample of the actual images that you would be using? (You can post them either here or use YouSendIt if the images are too large). This might help us determine why nucleus vs. cytoplasm intensities seem to be incorrect.

Also, have you had a chance to examine the SBS example pipeline? If so, has it been helpful?

Regards,
-Mark


#13

Hi Mark,

I have been through the SBS pipeline; that's where I started learning CP a few months ago. It was really a great help.

Here is the pipeline that I’m using:

Pixel Size: 1

Pipeline:
LoadImages
LoadSingleImage
LoadText
CorrectIllumination_Apply
CorrectIllumination_Apply
CorrectIllumination_Apply
IdentifyPrimAutomatic
IdentifyPrimAutomatic
IdentifySecondary
IdentifySecondary
IdentifyTertiarySubregion
IdentifyTertiarySubregion
MeasureCorrelation
MeasureObjectIntensity
MeasureObjectIntensity
MeasureObjectIntensity
MeasureObjectAreaShape
MeasureTexture
MeasureTexture
MeasureTexture
CalculateRatios
CalculateRatios
CalculateRatios
CalculateRatios
CalculateRatios
CalculateStatistics

Module #1: LoadImages revision - 2
How do you want to load these files? Text-Exact match
Type the text that one type of image has in common (for TEXT options), or their position in each group (for ORDER option): Ch2-T2
What do you want to call these images within CellProfiler? rawGFP
Type the text that one type of image has in common (for TEXT options), or their position in each group (for ORDER option). Type “Do not use” to ignore: Ch3-T1
What do you want to call these images within CellProfiler? (Type “Do not use” to ignore) rawDNA
Type the text that one type of image has in common (for TEXT options), or their position in each group (for ORDER option): Ch3-T3
What do you want to call these images within CellProfiler? rawIRF
Type the text that one type of image has in common (for TEXT options), or their position in each group (for ORDER option): Do not use
What do you want to call these images within CellProfiler? Do not use
If using ORDER, how many images are there in each group (i.e. each field of view)? 3
What type of files are you loading? individual images
Analyze all subfolders within the selected folder? No
Enter the path name to the folder where the images to be loaded are located. Type period (.) for default image folder. .
Note - If the movies contain more than just one image type (e.g., brightfield, fluorescent, field-of-view), add the GroupMovieFrames module. .

Module #2: LoadSingleImage revision - 4
This module loads one image for all cycles that will be processed. Typically, however, a different module (LoadImages) is used to load new sets of images during each cycle of processing. n/a
Enter the path name to the folder where the images to be loaded are located. Type period (.) for the default image folder, or type ampersand (&) for the default output folder. .
What image file do you want to load? Include the extension, like .tif ILLUM_Green.mat
What do you want to call that image? IllumGFP
What image file do you want to load? Include the extension, like .tif ILLUM_Blue.mat
What do you want to call that image? IllumDNA
What image file do you want to load? Include the extension, like .tif ILLUM_Red.mat
What do you want to call that image? IllumIRF
What image file do you want to load? Include the extension, like .tif Do not use
What do you want to call that image? Do not use

Module #3: LoadText revision - 2
What is the file containing the text that you want to load? A549-Sendai_Timepoints.txt
What would you like to call the loaded text? names
Enter the path name to the folder where the text file to be loaded is located. Type period (.) for the default image folder, or ampersand (&) for the default output folder. .

Module #4: CorrectIllumination_Apply revision - 3
What did you call the image to be corrected? rawGFP
What do you want to call the corrected image? CorrGreen
What did you call the illumination correction function image to be used to carry out the correction (produced by another module or loaded as a .mat format image using Load Single Image)? IllumGFP
How do you want to apply the illumination correction function? Divide
If you chose division, Choose rescaling method. Stretch 0 to 1

Module #5: CorrectIllumination_Apply revision - 3
What did you call the image to be corrected? rawDNA
What do you want to call the corrected image? CorrBlue
What did you call the illumination correction function image to be used to carry out the correction (produced by another module or loaded as a .mat format image using Load Single Image)? IllumDNA
How do you want to apply the illumination correction function? Divide
If you chose division, Choose rescaling method. Stretch 0 to 1

Module #6: CorrectIllumination_Apply revision - 3
What did you call the image to be corrected? rawIRF
What do you want to call the corrected image? CorrRed
What did you call the illumination correction function image to be used to carry out the correction (produced by another module or loaded as a .mat format image using Load Single Image)? IllumIRF
How do you want to apply the illumination correction function? Divide
If you chose division, Choose rescaling method. Stretch 0 to 1

Module #7: IdentifyPrimAutomatic revision - 12
What did you call the images you want to process? CorrBlue
What do you want to call the objects identified by this module? Nuclei
Typical diameter of objects, in pixel units (Min,Max): 30,50
Discard objects outside the diameter range? Yes
Try to merge too small objects with nearby larger objects? No
Discard objects touching the border of the image? Yes
Select an automatic thresholding method or enter an absolute threshold in the range [0,1]. To choose a binary image, select “Other” and type its name. Choosing “All” will use the Otsu Global method to calculate a single threshold for the entire image group. The other methods calculate a threshold for each image individually. “Set interactively” will allow you to manually adjust the threshold during the first cycle to determine what will work well. MoG Global
Threshold correction factor 1.2
Lower and upper bounds on threshold, in the range [0,1] 0.04,1
For MoG thresholding, what is the approximate fraction of image covered by objects? 20
Method to distinguish clumped objects (see help for details): Intensity
Method to draw dividing lines between clumped objects (see help for details): Intensity
Size of smoothing filter, in pixel units (if you are distinguishing between clumped objects). Enter 0 for low resolution images with small objects (~< 5 pixel diameter) to prevent any image smoothing. 5
Suppress local maxima within this distance, (a positive integer, in pixel units) (if you are distinguishing between clumped objects) Automatic
Speed up by using lower-resolution image to find local maxima? (if you are distinguishing between clumped objects) Yes
Enter the following information, separated by commas, if you would like to use the Laplacian of Gaussian method for identifying objects instead of using the above settings: Size of neighborhood(height,width),Sigma,Minimum Area,Size for Wiener Filter(height,width),Threshold Do not use
What do you want to call the outlines of the identified objects (optional)? Do not use
Do you want to fill holes in identified objects? Yes
Do you want to run in test mode where each method for distinguishing clumped objects is compared? No

Module #8: IdentifyPrimAutomatic revision - 12
What did you call the images you want to process? CorrGreen
What do you want to call the objects identified by this module? ThresholdedCells
Typical diameter of objects, in pixel units (Min,Max): 100,999999
Discard objects outside the diameter range? Yes
Try to merge too small objects with nearby larger objects? No
Discard objects touching the border of the image? No
Select an automatic thresholding method or enter an absolute threshold in the range [0,1]. To choose a binary image, select “Other” and type its name. Choosing “All” will use the Otsu Global method to calculate a single threshold for the entire image group. The other methods calculate a threshold for each image individually. “Set interactively” will allow you to manually adjust the threshold during the first cycle to determine what will work well. .03
Threshold correction factor 1
Lower and upper bounds on threshold, in the range [0,1] 0,1
For MoG thresholding, what is the approximate fraction of image covered by objects? 40
Method to distinguish clumped objects (see help for details): None
Method to draw dividing lines between clumped objects (see help for details): None
Size of smoothing filter, in pixel units (if you are distinguishing between clumped objects). Enter 0 for low resolution images with small objects (~< 5 pixel diameter) to prevent any image smoothing. Automatic
Suppress local maxima within this distance, (a positive integer, in pixel units) (if you are distinguishing between clumped objects) Automatic
Speed up by using lower-resolution image to find local maxima? (if you are distinguishing between clumped objects) Yes
Enter the following information, separated by commas, if you would like to use the Laplacian of Gaussian method for identifying objects instead of using the above settings: Size of neighborhood(height,width),Sigma,Minimum Area,Size for Wiener Filter(height,width),Threshold Do not use
What do you want to call the outlines of the identified objects (optional)? Do not use
Do you want to fill holes in identified objects? No
Do you want to run in test mode where each method for distinguishing clumped objects is compared? No

Module #9: IdentifySecondary revision - 3
What did you call the primary objects you want to create secondary objects around? Nuclei
What do you want to call the objects identified by this module? PropCells
Select the method to identify the secondary objects (Distance - B uses background; Distance - N does not): Propagation
What did you call the images to be used to find the edges of the secondary objects? For DISTANCE - N, this will not affect object identification, only the final display. CorrRed
Select an automatic thresholding method or enter an absolute threshold in the range [0,1]. To choose a binary image, select “Other” and type its name. Choosing “All” will use the Otsu Global method to calculate a single threshold for the entire image group. The other methods calculate a threshold for each image individually. Set interactively will allow you to manually adjust the threshold during the first cycle to determine what will work well. Otsu Global
Threshold correction factor 1
Lower and upper bounds on threshold, in the range [0,1] 0.02,1
For MoG thresholding, what is the approximate fraction of image covered by objects? 10
For DISTANCE, enter the number of pixels by which to expand the primary objects [Positive integer] 10
For PROPAGATION, enter the regularization factor (0 to infinity). Larger=distance,0=intensity 0.05
What do you want to call the outlines of the identified objects (optional)? Do not use
Do you want to run in test mode where each method for identifying secondary objects is compared? No

Module #10: IdentifySecondary revision - 3
What did you call the primary objects you want to create secondary objects around? Nuclei
What do you want to call the objects identified by this module? DistanceCells
Select the method to identify the secondary objects (Distance - B uses background; Distance - N does not): Distance - N
What did you call the images to be used to find the edges of the secondary objects? For DISTANCE - N, this will not affect object identification, only the final display. CorrRed
Select an automatic thresholding method or enter an absolute threshold in the range [0,1]. To choose a binary image, select “Other” and type its name. Choosing “All” will use the Otsu Global method to calculate a single threshold for the entire image group. The other methods calculate a threshold for each image individually. Set interactively will allow you to manually adjust the threshold during the first cycle to determine what will work well. Otsu Global
Threshold correction factor 1
Lower and upper bounds on threshold, in the range [0,1] 0,1
For MoG thresholding, what is the approximate fraction of image covered by objects? 10
For DISTANCE, enter the number of pixels by which to expand the primary objects [Positive integer] 6
For PROPAGATION, enter the regularization factor (0 to infinity). Larger=distance,0=intensity 0.05
What do you want to call the outlines of the identified objects (optional)? Do not use
Do you want to run in test mode where each method for identifying secondary objects is compared? No

Module #11: IdentifyTertiarySubregion revision - 1
What did you call the larger identified objects? DistanceCells
What did you call the smaller identified objects? Nuclei
What do you want to call the new subregions? DistCytoplasm
What do you want to call the outlines of the identified objects (optional)? Do not use

Module #12: IdentifyTertiarySubregion revision - 1
What did you call the larger identified objects? PropCells
What did you call the smaller identified objects? Nuclei
What do you want to call the new subregions? PropCytoplasm
What do you want to call the outlines of the identified objects (optional)? Do not use

Module #13: MeasureCorrelation revision - 3
Choose at least two image types to measure correlations between: CorrGreen
(All pairwise correlations will be measured) CorrBlue
CorrRed
Do not use
Choose objects within which to measure the correlations (Choosing Image will measure correlations across the entire images) Nuclei
PropCells
DistanceCells
ThresholdedCells
PropCytoplasm
DistCytoplasm

Module #14: MeasureObjectIntensity revision - 2
What did you call the greyscale images you want to measure? CorrGreen
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm

Module #15: MeasureObjectIntensity revision - 2
What did you call the greyscale images you want to measure? CorrBlue
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm

Module #16: MeasureObjectIntensity revision - 2
What did you call the greyscale images you want to measure? CorrRed
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm

Module #17: MeasureObjectAreaShape revision - 3
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm
Do not use
Would you like to calculate the Zernike features for each object (with lots of objects, this can be very slow)? No

Module #18: MeasureTexture revision - 2
What did you call the greyscale images you want to measure? CorrGreen
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm
What is the scale of texture? 1

Module #19: MeasureTexture revision - 2
What did you call the greyscale images you want to measure? CorrBlue
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm
What is the scale of texture? 1

Module #20: MeasureTexture revision - 2
What did you call the greyscale images you want to measure? CorrRed
What did you call the objects that you want to measure? Nuclei
ThresholdedCells
PropCells
DistanceCells
DistCytoplasm
PropCytoplasm
What is the scale of texture? 1

Module #21: CalculateRatios revision - 6
What do you want to call the ratio calculated by this module? The prefix “Ratio_” will be applied to your entry, or simply leave as “Automatic” and a sensible name will be generated RelA_nuc2Propcyto
Which object would you like to use for the numerator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrGreen
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Which object would you like to use for the denominator? PropCytoplasm
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrGreen
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Do you want the log (base 10) of the ratio? No

Module #22: CalculateRatios revision - 6
What do you want to call the ratio calculated by this module? The prefix “Ratio_” will be applied to your entry, or simply leave as “Automatic” and a sensible name will be generated RelA_nuc2Distcyto
Which object would you like to use for the numerator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrGreen
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Which object would you like to use for the denominator? DistCytoplasm
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrGreen
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Do you want the log (base 10) of the ratio? No

Module #23: CalculateRatios revision - 6
What do you want to call the ratio calculated by this module? The prefix “Ratio_” will be applied to your entry, or simply leave as “Automatic” and a sensible name will be generated IRF3_nuc2Propcyto
Which object would you like to use for the numerator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrRed
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Which object would you like to use for the denominator? PropCytoplasm
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrRed
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Do you want the log (base 10) of the ratio? No

Module #24: CalculateRatios revision - 6
What do you want to call the ratio calculated by this module? The prefix “Ratio_” will be applied to your entry, or simply leave as “Automatic” and a sensible name will be generated IRF3_nuc2Distcyto
Which object would you like to use for the numerator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrRed
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Which object would you like to use for the denominator? DistCytoplasm
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrRed
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Do you want the log (base 10) of the ratio? No

Module #25: CalculateRatios revision - 6
What do you want to call the ratio calculated by this module? The prefix “Ratio_” will be applied to your entry, or simply leave as “Automatic” and a sensible name will be generated RelA2IRF3_nuc
Which object would you like to use for the numerator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrGreen
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Which object would you like to use for the denominator? Nuclei
Which category of measurements would you like to use? Intensity
Which feature do you want to use? (Enter the feature number or name - see help for details) 1
For INTENSITY, AREAOCCUPIED or TEXTURE features, which image’s measurements would you like to use? CorrRed
For TEXTURE, RADIAL DISTRIBUTION, OR NEIGHBORS features, what previously measured size scale (TEXTURE OR NEIGHBORS) or previously used number of bins (RADIALDISTRIBUTION) do you want to use? 1
Do you want the log (base 10) of the ratio? No

Module #26: CalculateStatistics revision - 3
What did you call the grouping values you loaded for each image cycle? See help for details. names
Would you like to log-transform the grouping values before attempting to fit a sigmoid curve? No
If you want to save the plotted dose response data for each feature as an interactive figure in the default output folder, enter the filename here (.fig extension will be automatically added); otherwise, leave at “Do not use.” Note: the figures do not stay open during processing because it tends to cause memory issues when so many windows are open. Note: This option is not compatible with running the pipeline on a cluster of computers. Do not use
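
(For reference, the CalculateRatios modules above boil down to a per-cell ratio of intensity measurements between two object sets. Which measurement "feature 1" corresponds to depends on the module version; the minimal sketch below uses mean intensity for illustration and assumes nuclei and cytoplasm label images that share object labels, as IdentifyTertiarySubregion produces:)

    import numpy as np

    def nuc_to_cyto_ratio(nuclei_labels, cyto_labels, intensity_image):
        """Per-cell ratio of mean nuclear to mean cytoplasmic intensity."""
        ratios = {}
        for obj in np.unique(nuclei_labels):
            if obj == 0:                      # skip background
                continue
            nuc = intensity_image[nuclei_labels == obj]
            cyto = intensity_image[cyto_labels == obj]
            if nuc.size and cyto.size and cyto.mean() > 0:
                ratios[obj] = nuc.mean() / cyto.mean()
        return ratios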

Attached are the images. For YouSendIt, what is your email ID so I can send them there?
Thanks
Mridul KK






#14

Hi,

Could you please post the pipeline .mat file as an attachment instead of the text with the settings?

Also, while you've been working with the SBS pipeline, have you observed whether it has the same problem with the intensities that you are reporting?
-Mark


#15

Hi Mark,

I haven't looked for that problem with SBS. Unfortunately, I'm away at the moment and can't access that work from here, but I can definitely look into it once I'm back in the lab. Until then, I'm doing all the other trials in another lab, and all your explanations will help us tremendously.

Attached is the .mat file that I'm using.
Please feel free to modify it and let me know of any better modifications as well.

Can you also help me with the following (also asked in previous posts):

  1. After applying the illumination correction function, how can I save each of the corrected images? I want to compare the original to the illumination-corrected image!

  2. And is there any way to find the intensity of each pixel within each object, so that I can look at the intensity histogram of the pixels comprising an object such as a nucleus? I'm aware of the intensity histogram for the whole image.

Thanks a lot for helping me out.
Mridul KK
CPMAIN_PIPE_3ch_withRedch.mat (2.29 KB)


#16

Hi Mark,

I have attached the pipeline and example images, as you asked.
I'm waiting for your input and help.
Thanks
Mridul KK


#17

Hi,

I don't see all three of the channels that your pipeline references. It would be fine to see just one example image of each, and the actual image, not a screenshot of the CellProfiler image browser window.

For YouSendIt, I believe you can send it to yourself and then post the unique link to the location of the files.

-Kate