How to get connectivity of each particle

Hello everyone,

I would like to calculate connectivity. For this kind of image, connectivity is defined as the average number of particles connected to each particle. For example, the starred particle counts as 3; in the same way, I would like to calculate this for every particle. One way would be to somehow get the number of watershed lines around each particle in the binary image.
Please guide me to get this result.

Thank You,
Mirtunjay


Hey @Mirtunjay_Kumar,

this is a very interesting and challenging question! I recently programmed something similar for Fiji and I'm happy to share. Starting from this image

The following script blurs and thresholds the image, applies a binary watershed and connected-components labelling. Afterwards, it fills the gaps between the labels:

From this image it generates a touch matrix. This is an image in which a pixel is set to white (= 1) if the corresponding objects touch: if the pixel in row 5, column 4 is white, objects 4 and 5 touch. There is usually a white column at the very left, telling you that all objects touch the background.

You can sum this image in a special way (the sum in column n plus the sum in row n gives the number of neighbours object n touches, including the background) and write the result into a table:
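To illustrate the touch-matrix idea outside of CLIJ, here is a minimal NumPy sketch on a toy label image. Note the assumptions: 4-connectivity, hypothetical labels, and the matrix is filled symmetrically, so summing a single axis already gives the neighbour count per object (background included):

```python
import numpy as np

# Toy label image (hypothetical data): 0 = background, 1..3 = objects.
labels = np.array([
    [0, 1, 1, 0],
    [0, 1, 2, 2],
    [3, 0, 2, 2],
])

n = labels.max() + 1
touch = np.zeros((n, n), dtype=np.uint8)

# Compare each pixel with its right and bottom neighbour; where the
# labels differ, the two objects touch.
for a, b in [(labels[:, :-1], labels[:, 1:]),   # horizontal neighbours
             (labels[:-1, :], labels[1:, :])]:  # vertical neighbours
    differ = a != b
    touch[a[differ], b[differ]] = 1
    touch[b[differ], a[differ]] = 1

# Because this sketch fills the matrix symmetrically, one axis sum is enough:
neighbor_count = touch.sum(axis=0)
print(neighbor_count)  # → [3 2 2 1]
```

Here the background (index 0) touches three objects, object 1 touches the background and object 2, and so on.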

Here comes the script:

// open image, crop a region, convert it to 8-bit
open("C:/Users/rober/Downloads/1_003 - Copy.jpg");
makeRectangle(6, 5, 534, 536);
run("Duplicate...", " ");
run("8-bit");

// blur the image a bit
run("Gaussian Blur...", "sigma=3");

// threshold it and apply a binary watershed
setAutoThreshold("Otsu dark");
setOption("BlackBackground", true);
run("Convert to Mask");
run("Watershed");

// init GPU
run("CLIJ Macro Extensions", "cl_device=[GeForce RTX 2060 SUPER]");
Ext.CLIJ_clear();

// send input image to GPU
input = getTitle();
Ext.CLIJ_push(input);

// apply connected components analysis
labelled = "labelled";
Ext.CLIJx_connectedComponentsLabeling(input, labelled);

// fill gaps created by the watershed to make labels touch
extended_labels = "extended_labels";
Ext.CLIJ_maximum2DBox(labelled, extended_labels, 1.0, 1.0);

// visualise extended labels
Ext.CLIJ_pull(extended_labels);
run("glasbey_on_dark");

// generate a touch matrix
touch_matrix = "touch_matrix";
Ext.CLIJx_generateTouchMatrix(extended_labels, touch_matrix);

// show touch matrix
Ext.CLIJ_pull(touch_matrix);
for (i = 0; i < 5; i++) {
    run("In [+]");
}

// count touching neighbors and show in a table
neighbor_count = "neighbor_count";
Ext.CLIJx_countTouchingNeighbors(touch_matrix, neighbor_count);
Ext.CLIJx_image2DToResultsTable(neighbor_count);

// cleanup by the end
Ext.CLIJ_clear();
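The original post asks for a single connectivity value: the average number of neighbours per particle. Once the neighbour counts are in the table, that reduction is simple. A sketch with made-up counts, assuming (as the touch matrix above suggests) that every object touches the background, which should then not count as a particle neighbour:

```python
import numpy as np

# Hypothetical per-object neighbour counts as read from the results table;
# index 0 is the background entry of the touch matrix.
neighbor_count = np.array([5, 3, 2, 4, 3, 2])

# Drop the background entry and subtract the background contact
# from each object's count, then average.
connectivity = (neighbor_count[1:] - 1).mean()
print(connectivity)  # → 1.8
```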

In order to make it run in your Fiji, please activate the clij and clij2 update sites (Menu: Help > Update…):

You can read more about clij and clij2 online.

Let me know if it works!

Cheers,
Robert


It works flawlessly. One more query I have: is it possible to get an annotation showing which object is X1, X2, … etc.?


Hi @Mirtunjay_Kumar @haesleinhuepf, here is a similar question, 4-color filling. I need a Python solution. https://forum.image.sc/t/relabel-with-4-colors-like-map/33564


Yes. CLIJ also allows some basic measurements, such as the center of mass of objects. Thus, you can go through the labels and draw text at each center position using ImageJ methods:
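The same measurement can be sketched in plain Python with scipy.ndimage.center_of_mass; a toy label image stands in for the CLIJ label map here, just to illustrate where the text anchors come from:

```python
import numpy as np
from scipy import ndimage

# Toy label map (hypothetical): 0 = background, 1..3 = objects.
labels = np.array([
    [0, 1, 1, 0],
    [0, 1, 2, 2],
    [3, 3, 2, 2],
])

ids = [1, 2, 3]
# One (y, x) centroid per requested label, computed from unit weights.
centers = ndimage.center_of_mass((labels > 0).astype(float), labels, ids)
for i, (y, x) in zip(ids, centers):
    print(f"X{i} at x={x:.2f}, y={y:.2f}")
```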

Here is the updated script:

// open image, crop a region, convert it to 8-bit
open("C:/Users/rober/Downloads/1_003 - Copy.jpg");
makeRectangle(6, 5, 534, 536);
run("Duplicate...", " ");
original = getTitle();
run("Duplicate...", " ");
run("8-bit");

// blur the image a bit
run("Gaussian Blur...", "sigma=3");

// threshold it and apply a binary watershed
setAutoThreshold("Otsu dark");
setOption("BlackBackground", true);
run("Convert to Mask");
run("Watershed");

// init GPU
run("CLIJ Macro Extensions", "cl_device=[GeForce RTX 2060 SUPER]");
Ext.CLIJ_clear();

// send input image to GPU
input = getTitle();
Ext.CLIJ_push(input);

// apply connected components analysis
labelled = "labelled";
Ext.CLIJx_connectedComponentsLabeling(input, labelled);

// fill gaps created by the watershed to make labels touch
extended_labels = "extended_labels";
Ext.CLIJ_maximum2DBox(labelled, extended_labels, 1.0, 1.0);

// visualise extended labels
Ext.CLIJ_pull(extended_labels);
getStatistics(_, _, _, number_of_labels, _, _); // the maximum label value equals the number of labels
run("glasbey_on_dark");

// generate a touch matrix
touch_matrix = "touch_matrix";
Ext.CLIJx_generateTouchMatrix(extended_labels, touch_matrix);

// show touch matrix
Ext.CLIJ_pull(touch_matrix);
for (i = 0; i < 5; i++) {
    run("In [+]");
}

// Visualize numbering on original image
selectWindow(original);
for (i = 1; i <= number_of_labels; i++) {
	single_label = "single_label";
	Ext.CLIJx_labelToMask(extended_labels, single_label, i);
	Ext.CLIJ_centerOfMass(single_label);
	x = getResult("MassX", nResults - 1);
	y = getResult("MassY", nResults - 1);
	drawString("X" + i, x, y);
}
run("Clear Results");

// count touching neighbors and show in a table
neighbor_count = "neighbor_count";
Ext.CLIJx_countTouchingNeighbors(touch_matrix, neighbor_count);
Ext.CLIJx_image2DToResultsTable(neighbor_count);


// cleanup by the end
Ext.CLIJ_clear();

Cheers,
Robert


I wrote a Python solution.

import numpy as np
from numba import jit
from scipy.ndimage import generate_binary_structure

def neighbors(shape, conn=1):
    # flat-index offsets of each neighbour of a pixel, for the given connectivity
    dim = len(shape)
    block = generate_binary_structure(dim, conn)
    block[tuple([1]*dim)] = 0  # exclude the centre pixel itself
    idx = np.where(block>0)
    idx = np.array(idx, dtype=np.uint8).T
    idx = np.array(idx-[1]*dim)  # offsets relative to the centre
    acc = np.cumprod((1,)+shape[::-1][:-1])
    return np.dot(idx, acc[::-1])  # convert nd offsets to flat offsets

@jit(nopython=True)
def search(img, nbs):
    # collect every (label, neighbouring label) pair found in the image
    s, line = 0, img.ravel()
    rst = np.zeros((len(line),2), img.dtype)
    for i in range(len(line)):
        if line[i]==0: continue
        for d in nbs:
            if line[i+d]==0: continue
            if line[i]==line[i+d]: continue
            rst[s,0] = line[i]
            rst[s,1] = line[i+d]
            s += 1
    return rst[:s]

def connect(img, conn=1):
    # pad so that neighbour offsets never leave the image
    buf = np.pad(img, 1, 'constant')
    nbs = neighbors(buf.shape, conn)
    rst = search(buf, nbs)
    rst.sort()                     # order each pair (a, b) with a <= b
    return np.unique(rst, axis=0)  # drop the duplicates

if __name__ == '__main__':
    img = np.array([[1,1,1,1,1],
                    [1,1,2,2,1],
                    [1,3,2,2,1],
                    [1,3,1,1,4]])

    print(connect(img, 1))
    # >>> [(1,2), (1,3), (1,4), (2,3)]

    print(connect(img, 2))
    # >>> [(1,2), (1,3), (1,4), (2,3), (2,4)]
1. Building a touch matrix may be wasteful: if there are many labels, we must build an n² matrix.
2. We could use a sparse matrix instead, but Python's dictionary is very slow.

So I iterate over the image, put each relationship pair into a NumPy array, then sort along the last axis and take the unique rows along the first axis. It can be jit()-compiled, it is fast, and it supports n-dimensional images.
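The sort-then-unique trick described above is easy to see on a hand-made pair list:

```python
import numpy as np

# Raw contact pairs as collected while scanning (hypothetical data);
# every contact is found twice, once from each side.
pairs = np.array([[2, 1], [1, 2], [3, 1], [1, 3], [2, 1]])

pairs.sort()                       # sort within each pair: (2,1) -> (1,2)
unique = np.unique(pairs, axis=0)  # drop duplicate rows
print(unique)  # → [[1 2]
               #    [1 3]]
```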

Later I will write an ImagePy plugin.


Hi! The ImagePy plugin is finished:

the algorithm: https://github.com/Image-Py/imagepy/blob/master/imagepy/ipyalg/graph/connect.py

the plugin: https://github.com/Image-Py/imagepy/blob/master/imagepy/menus/Analysis/Region%20Analysis/connect_plg.py