Coloc2 of Deconvolved data files

Hello image analysis experts,

I manage a small confocal core facility and am trying to help someone with their image analysis. The desired workflow is to collect multi-channel z-stack images on our Nikon confocal (.nd2 files, 12-bit), deconvolve the data in Huygens Professional (here she is saving as 16-bit TIFF files), and then run colocalization analysis of two channels (red and green) with the Coloc 2 plugin. However, we have run into some issues with the deconvolved files taking excessive amounts of time to analyze in Fiji.

If I test ImageJ’s preloaded sample images, the Coloc 2 plugin works relatively quickly for both single planes and z-stacks. Similarly, if I load the original Nikon z-stack files (which Bio-Formats reads in as 16-bit), the analysis takes a reasonable time for an ROI around a cell (~5 minutes). However, if I load the deconvolved images and use the same ROI, the analysis takes over an hour per cell, even on well-equipped analysis workstations. It seems to get stuck on step 2 of 11, running auto threshold regression. I have updated Fiji and ensured that the allocated memory is sufficiently high. With the user’s permission, I have uploaded sample images to Google Drive - please let me know if you have any trouble accessing them:
My understanding is that the colocalization of the red and green signals is not expected to be biologically strong in this example, but this is still the analysis workflow we want to establish.

Within the Coloc2 plugin, we are currently using the default settings as we are new to the plugin, but any advice on how to optimize the analysis would also be greatly appreciated.

Does anyone know why these deconvolved images should take so long for a single ROI analysis in Coloc2?

Thank you in advance for any and all advice!



Hey Corinne! I think I figured out the issue… it’s the Threshold regression option you have selected. When I run it on my system using the Bisection method, it only takes seconds to process. Costes is what is holding things up.

@chalkie666 or other colocalization experts may provide better details as to WHY this makes such a large difference in processing time… But I would check the numbers against your Nikon software - as you were doing before - using the Bisection method.

Also - these are the steps I took (just so we are all on the same page):

  1. Imported the images via Bio-Formats (File > Import > Bio-Formats).
  2. Drew a small ROI and saved it to the ROI Manager.
  3. In the Coloc 2 window, selected “ROI Manager” for the ROI or mask.
  4. Selected the Bisection option for Threshold regression.

I’ll try to look into this too - to get you a better answer as well… Just need some more time on my side. But at least you can play a bit now.




Thank you Ellen! I will take a look and pass on the info.

Do you know if there is a way to call Coloc 2 from a macro for batch processing? I know that we could save a folder of masks beforehand - but because the save window pops up prompting for user input after each analysis, I was not sure whether this is possible.




If you want to look into more detail, you can always check out the source code for Coloc 2 on GitHub. Specifically, the issue is in the AutoThresholdRegression class - this class automatically finds the thresholds used for the Pearson colocalization calculation. In this class, the Costes and Bisection implementations define the stepper used to move through the intensity range: Costes starts a SimpleStepper at the maximum of channel 1, while Bisection starts a BisectionStepper at the mid-point of channel 1. Bisection will be faster in the end…
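To make the difference concrete, here is a toy sketch in plain Python (not the actual Coloc 2 Java code - the maximum 45666 and target threshold 1200 are made-up numbers for illustration) contrasting a one-grey-level-at-a-time walk down from the maximum with interval halving:

```python
def linear_search(max_value, target):
    """Walk down from the maximum one grey level at a time (SimpleStepper-style)."""
    threshold = max_value
    steps = 0
    while threshold > target:
        threshold -= 1
        steps += 1
    return threshold, steps

def bisection_search(max_value, target):
    """Halve the remaining interval each iteration (BisectionStepper-style)."""
    lo, hi = 0, max_value
    steps = 0
    while hi - lo > 1:
        mid = (lo + hi) // 2
        if mid > target:
            hi = mid
        else:
            lo = mid
        steps += 1
    return hi, steps

# On a 16-bit range, the linear walk needs tens of thousands of iterations,
# while bisection converges in roughly log2(max_value) iterations.
print(linear_search(45666, 1200))
print(bisection_search(45666, 1200))
```

The real AutoThresholdRegression also runs a regression test at each candidate threshold, so each extra iteration is far more expensive than in this sketch, which only magnifies the gap.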



Yes - you can use Coloc 2 in a macro… you can use the Macro Recorder to see the function calls, and you can use Script Parameters to set up batch processing. Take a look at this example code you can use to process folders/subfolders of images:

// @File(label = "Input directory", style = "directory") input
// @File(label = "Output directory", style = "directory") output
// @String(label = "File suffix", value = ".tif") suffix

/*
 * Macro template to process multiple images in a folder.
 * See also the script templates for a version of this code
 * in the Python scripting language.
 */

processFolder(input);

// function to scan folders/subfolders/files to find files with the correct suffix
function processFolder(input) {
	list = getFileList(input);
	list = Array.sort(list);
	for (i = 0; i < list.length; i++) {
		if(File.isDirectory(input + File.separator + list[i]))
			processFolder(input + File.separator + list[i]);
		if(endsWith(list[i], suffix))
			processFile(input, output, list[i]);
	}
}

function processFile(input, output, file) {
	// Do the processing here by adding your own code.
	// Leave the print statements until things work, then remove them.
	print("Processing: " + input + File.separator + file);
	print("Saving to: " + output);
}
There is also another example of how to script programmatic execution of Coloc 2 here. Just go to Script Editor > Templates > Examples > Colocalisation (Groovy).

I think putting pieces of those examples together should do the trick.

eta :slight_smile:

For sure, Costes auto-threshold will be very slow for very high dynamic range images, e.g. 16-bit deconvolution result images. Why? The thresholds start at the max pixel value and step down by 1 in each iteration. So if the max value of the image is, e.g., 45666, it will take tens of thousands of iterations. Takes ages. Bisection is a much faster way to find the correct thresholds because it needs far fewer iterations to get to the same, hopefully, thresholds as Costes. Only noise would mess that up, and there is not much noise after proper deconvolution.
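The back-of-the-envelope arithmetic (illustrative only - real Coloc 2 run time also depends on the regression evaluated at each candidate threshold) looks like this:

```python
import math

# Example maximum grey value in a 16-bit deconvolved image (made-up number).
max_value = 45666

# Costes-style: step down one grey level per iteration from the maximum.
linear_steps = max_value

# Bisection-style: halve the remaining interval each iteration.
bisection_steps = math.ceil(math.log2(max_value))

print(linear_steps, bisection_steps)  # 45666 vs 16
```

Tens of thousands of candidate thresholds versus about sixteen - that is why the deconvolved 16-bit files stall on the auto threshold regression step while the lower-range originals do not.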