Need help in preprocessing to binary image

I’ve been at this for a while and I’m admitting defeat. I know it is possible, I just can’t get it to work. See the attached electron microscope image (a small section to start, to keep it simple; the actual images are much larger and more complicated).

Line Detection (870.5 KB)

The image is a metal fracture surface. Due to the stress placed on the material, horizontal cracks form, and these need to be quantified (once the image is in binary form, I have a macro for that). The areas around them are dimpled, often described as “orange peel” in classrooms.

Need to:

  1. adjust Brightness/Contrast for maximum sharpness of the cracks (auto adjust seems to sharpen the dimples more)
  2. separate line segments from “dimples” (i.e. discard/threshold out dimples, or anything circular)
  3. discard any line segments that are not (roughly) horizontal. Reduce any shapes to a single line along the major axis (Draw out the Feret Diameter?).
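For reference, here is a rough sketch (plain Python/NumPy, outside ImageJ) of what I mean by step 3: keep a region only if its fitted major axis is near horizontal. The helper names are made up for illustration; ImageJ’s ellipse-fit “Angle” measurement gives the same kind of number.

```python
import numpy as np

def region_orientation_deg(mask):
    """Angle of a binary region's major axis, in degrees from horizontal
    (0 = horizontal, 90 = vertical), via the covariance eigenvector."""
    ys, xs = np.nonzero(mask)
    xs = xs - xs.mean()
    ys = ys - ys.mean()
    cov = np.cov(np.stack([xs, ys]))
    eigvals, eigvecs = np.linalg.eigh(cov)
    vx, vy = eigvecs[:, np.argmax(eigvals)]      # major-axis direction
    return abs(np.degrees(np.arctan2(vy, vx))) % 180.0

def is_roughly_horizontal(mask, tol_deg=20.0):
    """Keep a region only if its major axis is within tol_deg of horizontal."""
    angle = region_orientation_deg(mask)
    return angle < tol_deg or angle > 180.0 - tol_deg

mask = np.zeros((9, 9), dtype=bool)
mask[4, 1:8] = True                  # a horizontal streak
print(is_roughly_horizontal(mask))   # -> True
```
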

Hi Ajay

I took a quick (5-minute) try at processing your image. I can’t say I achieved 100% success, and perhaps you already tried all this…

  • Apply a smoothing filter (Gaussian Blur)
  • Apply a horizontal edge filter.
    I achieved this by running Process > Filters > Convolve with the following kernel:

-1 -2 -1
0 0 0
1 2 1

  • Apply Otsu automatic threshold
  • Analyze particles.

… maybe other people will have better ideas… I think the key is finding a combination of smoothing and edge filter that will bring out the horizontal cracks, then optimizing the particle filter…
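For intuition, here is a tiny sketch (plain Python/NumPy, illustrative only, not ImageJ code) of why that kernel highlights horizontal edges and ignores vertical ones:

```python
import numpy as np

# The kernel from the recipe above: top row negative, bottom row positive,
# so it responds where intensity changes from top to bottom (a horizontal
# edge) and cancels out where intensity changes left to right.
kernel = np.array([[-1, -2, -1],
                   [ 0,  0,  0],
                   [ 1,  2,  1]], dtype=float)

def filter2d(img, k):
    """Tiny 'valid'-mode correlation, enough to show what the kernel does."""
    kh, kw = k.shape
    h, w = img.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for y in range(out.shape[0]):
        for x in range(out.shape[1]):
            out[y, x] = np.sum(img[y:y + kh, x:x + kw] * k)
    return out

img_h = np.zeros((6, 6)); img_h[3:, :] = 1.0   # horizontal step edge
img_v = np.zeros((6, 6)); img_v[:, 3:] = 1.0   # vertical step edge

print(filter2d(img_h, kernel).max())           # -> 4.0 (strong response)
print(np.abs(filter2d(img_v, kernel)).max())   # -> 0.0 (no response)
```
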

Below is the BeanShell script that was recorded:

IJ.run(imp, "Gaussian Blur...", "sigma=2");
IJ.run(imp, "Convolve...", "text1=[-1 -2 -1\n0 0 0\n1 2 1\n] normalize");
IJ.run(imp, "Auto Threshold", "method=Otsu white");
IJ.run(imp, "Analyze Particles...", "size=50-Infinity circularity=0.00-0.50 show=Masks display add");

Hi @bnorthan

I tried your suggestions; my results led to excessive fragmentation. However, I do want to say that you are on the right track, which is FAR further than I would have ever gotten on my own.

For giggles, I applied a vertical Sobel filter on my whole image when the horizontal filter wasn’t getting me where I needed to go, and sure enough it highlighted a definite alignment at around 70°. I tried using a Fourier transform via FFT (and bandpass filters) but I can’t seem to get end results, even though immediately after passing through a vertical bandpass filter the vertical component looks much cleaner.

(That is of course, if I understand how or why or when the $#@%&* works. Sometimes I get an output, sometimes I get just an FFT which can be re-transformed, and sometimes I get nothing after a filter image. But then again, that seems to also be the case when I go Process > Filters > Convolve, where Preview and Filtering mysteriously stop working… weird)
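As I understand it (and I may be wrong), structures aligned in one direction concentrate their FFT energy along the perpendicular frequency axis, which would explain what the bandpass filter is doing. A toy sketch in plain Python/NumPy, just for intuition:

```python
import numpy as np

# Stripes running horizontally vary only along y, so their Fourier energy
# sits on the vertical frequency axis; an oriented bandpass keeps or
# removes exactly that part of the spectrum.
n = 64
y = np.arange(n)
stripes = np.tile(np.sin(2.0 * np.pi * 8 * y / n)[:, None], (1, n))

spec = np.abs(np.fft.fftshift(np.fft.fft2(stripes)))
peak_row, peak_col = np.unravel_index(np.argmax(spec), spec.shape)
print(peak_row, peak_col)   # peaks sit at (32 +/- 8, 32), i.e. on the vertical axis
```
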

This leads me to suspect a Gabor filter (or variant) may be where I need to go, but I am having troubles following the Gabor example macro (and there is no plugin).

I haven’t posted a full image in public, as I cannot, but I can privately send people the full image, if it helps.


Do you mean this Gabor filter example? Or a different macro?

I tried running the above macro. I couldn’t see an actual link to download the BeanShell script, so I had to copy and paste it into the Fiji script editor and explicitly save it as BeanShell before it would work.

Did you manage to get the Gabor filter working on your image? If so, and the results were not good, maybe tweaking the parameters below will get you better results? Or maybe you tried tweaking parameters and ran into problems?

// Sigma defining the size of the Gaussian envelope
sigma = 8.0;
// Aspect ratio of the Gaussian curves
gamma = 0.25;
// Phase
psi = Math.PI / 4.0 * 0;
// Frequency of the sinusoidal component
Fx = 3.0;
// Number of different orientation angles to use
nAngles = 5;
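For reference, outside of ImageJ those parameters combine into a kernel as a Gaussian envelope multiplied by a sinusoidal carrier. A rough NumPy sketch following the same formula as the example macro (`gabor_kernel` is just an illustrative helper, not an ImageJ function):

```python
import numpy as np

def gabor_kernel(size, sigma, gamma, psi, Fx, theta):
    """Gabor kernel = Gaussian envelope * cosine carrier, using the same
    parameter names as the macro: sigma (envelope size), gamma (aspect
    ratio), psi (phase), Fx (frequency), theta (orientation)."""
    sigma_x = sigma
    sigma_y = sigma / gamma            # aspect ratio stretches one axis
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    x_p = x * np.cos(theta) + y * np.sin(theta)   # rotated coordinates
    y_p = y * np.cos(theta) - x * np.sin(theta)
    envelope = np.exp(-0.5 * (x_p**2 / sigma_x**2 + y_p**2 / sigma_y**2)) \
               / (2.0 * np.pi * sigma_x * sigma_y)
    carrier = np.cos(2.0 * np.pi * Fx * x_p / size + psi)
    return envelope * carrier

# the parameter set from the listing above, at orientation 0
k = gabor_kernel(size=51, sigma=8.0, gamma=0.25, psi=0.0, Fx=3.0, theta=0.0)
print(k.shape)   # a 51x51 kernel, peaked at the center
```
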

Yes, that’s the only example I could find. I had trouble following the example code logic to implement it. It has a lot to do with the fact that I don’t have a good grasp of how to go from an explanation of image processing in frequency space to modifying the equation appropriately to get it to do what I need it to do. If it were a plugin already, I could alter the parameters experimentally until it started to make sense in my head. As it is, I’m stuck troubleshooting code before I even get to trying it out.

PS From a purely image-analysis perspective (without getting into ImageJ/Fiji), there are many options if I can wrap my head around the math. By variant I meant things like Log Gabor filter - Wikipedia. But the OCR example on the Chinese character in the article Gabor filter - Wikipedia illustrates what I hope the Gabor filter can do for me.

Hello Ajay,

Using the Gabor filter script you can try to enhance the horizontal structures in your image. You just need a bit of patience to find the right parameters. Have a look at the following results I obtained from your image using sigma = 9, gamma = 0.25, psi = 0, Fx = 2.0, nAngles = 10 and a filter size of 51x51:

Here is the exact script I used, which is a faster version than the one on the site, since it uses FFT:

import net.imglib2.img.ImagePlusAdapter;
import net.imglib2.img.Img;
import net.imglib2.img.display.imagej.ImageJFunctions;
import net.imglib2.type.numeric.real.FloatType;
import net.imglib2.algorithm.fft2.FFTConvolution;

import ij.*;
import ij.process.*;
import ij.plugin.filter.*;
import ij.plugin.ContrastEnhancer;
import ij.plugin.ZProjector;

/*
 * This script calculates a set of Gabor filters over the selected image.
 * Parameters: sigma, gamma, psi, Fx, nAngles
 */

start = System.currentTimeMillis();

// Sigma defining the size of the Gaussian envelope
sigma = 9;
// Aspect ratio of the Gaussian curves
gamma = 0.25;
// Phase
psi = Math.PI / 4.0 * 0;
// Frequency of the sinusoidal component
Fx = 2.0;

// Number of different orientation angles to use
nAngles = 10;

// copy original image and transform it to 32 bit 
originalImage = IJ.getImage();
originalImage = new ImagePlus(originalImage.getTitle(), originalImage.getProcessor().convertToFloat());
width = originalImage.getWidth();
height = originalImage.getHeight();

// Apply aspect ratio to the Gaussian curves
sigma_x = sigma;
sigma_y = sigma / gamma;

// Decide size of the filters based on the sigma
largerSigma = (sigma_x > sigma_y) ? (int) sigma_x : (int) sigma_y;
if(largerSigma < 1)
    largerSigma = 1;
ip = originalImage.getProcessor().duplicate();

sigma_x2 = sigma_x * sigma_x;
sigma_y2 = sigma_y * sigma_y;

// Create set of filters

filterSizeX = 51; //6 * largerSigma + 1;
filterSizeY = 51; //6 * largerSigma + 1;

middleX = (int) Math.round(filterSizeX / 2);
middleY = (int) Math.round(filterSizeY / 2);

is = new ImageStack(width, height);
kernels = new ImageStack(filterSizeX, filterSizeY);

rotationAngle = Math.PI/(double)nAngles;
// Rotate kernel from 0 to 180 degrees
for (i=0; i<nAngles; i++)
{
    theta = rotationAngle * i;
    filter = new FloatProcessor(filterSizeX, filterSizeY);
    for (int x=-middleX; x<=middleX; x++)
    {
        for (int y=-middleY; y<=middleY; y++)
        {
            xPrime = (double)x * Math.cos(theta) + (double)y * Math.sin(theta);
            yPrime = (double)y * Math.cos(theta) - (double)x * Math.sin(theta);
            a = 1.0 / ( 2.0 * Math.PI * sigma_x * sigma_y ) * Math.exp(-0.5 * (xPrime*xPrime / sigma_x2 + yPrime*yPrime / sigma_y2) );
            c = Math.cos( 2.0 * Math.PI * (Fx * xPrime) / filterSizeX + psi);
            filter.setf(x+middleX, y+middleY, (float)(a*c) );
        }
    }
    kernels.addSlice("kernel angle = " + theta, filter);
}

// Show kernels
ip_kernels = new ImagePlus("kernels", kernels);
ip_kernels.show();

// Apply kernels
for (i=0; i<nAngles; i++)
{
    ip2 = originalImage.duplicate();
    kernel = ImagePlusAdapter.wrap( new ImagePlus("", kernels.getProcessor( i+1 )) );
    image2 = ImagePlusAdapter.wrap( ip2 );

    // convolve in the frequency domain (this is what makes this version fast)
    c = new FFTConvolution( image2, kernel );
    c.run();

    ip2 = ImageJFunctions.wrap( image2, "" );
    is.addSlice("gabor angle = " + (180.0/nAngles*i), ip2.getProcessor() );
}

// Normalize filtered stack (it seems necessary to have proper results)
c = new ContrastEnhancer();
for(int i=1 ; i <= is.getSize(); i++)
    c.stretchHistogram(is.getProcessor(i), 0.4);

projectStack = new ImagePlus("filtered stack", is);
IJ.run(projectStack, "Enhance Contrast", "saturated=0.4 normalize normalize_all");

resultStack = new ImageStack(width, height);
zp = new ZProjector(projectStack);
// Project the stack with each of the six projection methods (0 = average ... 5 = median)
for (int i=0; i<=5; i++)
{
    zp.setMethod(i);
    zp.doProjection();
    resultStack.addSlice("Gabor_" + i
            + "_" + sigma + "_" + gamma + "_" + (int) (psi / (Math.PI/4)) + "_" + Fx,
            zp.getProjection().getChannelProcessor());
}

// Display filtered images
(new ImagePlus("Gabor, sigma="+sigma+" gamma="+gamma+ " psi="+psi, is)).show();

result = new ImagePlus("Gabor stack projections", resultStack);
IJ.run(result, "Enhance Contrast", "saturated=0.4 normalize normalize_all");
result.show();

end = System.currentTimeMillis();
IJ.log( "Gabor filter took " + (end-start) + "ms" );

@iarganda I wonder if a size of 51 isn’t big enough to capture the entire filter. Inside the convolve the filter will be padded (with 0 I think) and there will be a discontinuity at the edge, and the properties of the filter are different.

Your result actually isn’t bad… when I use 255 as the size the result is much blurrier
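Here is a quick back-of-envelope check (plain Python, just for intuition): with sigma = 9 and gamma = 0.25, the long axis of the envelope has sigma_y = 36 pixels, but a 51x51 window only reaches 25 pixels from the center, so a large part of the envelope gets truncated.

```python
import math

# With sigma = 9 and gamma = 0.25 the long axis of the Gaussian envelope
# has sigma_y = sigma / gamma = 36 pixels, while a 51x51 filter only
# reaches +/- 25 pixels from its center.
sigma_y = 9.0 / 0.25        # 36 pixels
half_width = 51 // 2        # 25 pixels

# Fraction of a 1-D Gaussian's mass that falls inside +/- half_width:
frac = math.erf(half_width / (sigma_y * math.sqrt(2.0)))
print(round(frac, 3))       # only about half the envelope fits in the window
```
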


@Ajay.Sood If I start with the image that @iarganda created, apply the horizontal edge filter, then take only positive values, then apply an automatic threshold, then count particles with min size=1000 I get an OK result (maybe this is getting a bit hacktastic… but it seems to be heading in the right direction at least…)

The script I used is below (it uses concepts from IJ2 and ops, and is based on scripts presented here).

# @OpService ops
# @UIService ui
# @Dataset inputData

from net.imglib2.img.display.imagej import ImageJFunctions
from net.imglib2.img.array import ArrayImgs
from ij import IJ
from ij.plugin import Duplicator


# create horizontal edge filter (same Sobel-style kernel as earlier in the thread)
kernelValues = [-1, -2, -1, 0, 0, 0, 1, 2, 1]
kernel = ArrayImgs.floats(3, 3)
i = 0
for t in kernel:
    t.setReal(kernelValues[i])
    i = i + 1
# end create horizontal edge filter

# convolve with edge filter
filtered=ops.filter().convolve(inputData, kernel)

# get rid of negative values (auto threshold will work better)
for p in filtered:
    if (p.getRealFloat()<0): p.setReal(0)

# threshold
thresholded = ops.threshold().ij1(filtered)

# convert to imagej1 imageplus so we can run analyze particles
impThresholded = ImageJFunctions.wrap(thresholded, "wrapped")
ui.show("thresholded", impThresholded)

# convert to mask and analyze particles (min size=1000 worked OK here)
minParticleSize = 1000
IJ.run(impThresholded, "Convert to Mask", "")
IJ.run(impThresholded, "Analyze Particles...", "size=" + str(minParticleSize) + "-Infinity show=[Overlay Masks]")

Hello @bnorthan and @iarganda,

One of the curses of my industry, being pulled off to fight another fire, has left me unable to try your suggested solutions right now. However, I appreciate the responses and wanted to take the time to thank you for looking into it. I will try them as soon as I can manage and get back to you.

Just a reminder: the image I posted publicly was a very small crop; the originals are about 8300×3200 pixels. I will have to play with the numbers to adjust for the more challenging areas. If you are curious, I can privately send you the full image.

In some of the images, I will have to figure out how to separate the cracks from extrusion marks and/or polishing marks, which are frustratingly in the same orientation as the cracks (all hail Murphy’s Law!). The good news is we’re also getting slightly better images because we’ve adjusted the electron microscope settings (increased kV) to get more depth information.

Hi @iarganda,

Thanks for writing this FFT-Gabor filter script. My BeanShell skills are zero, but I have found it useful by manually adjusting values. Are there plans for a Gabor (and Log Gabor) filter plugin for ImageJ? It could be useful for machine-learning-based segmentation if a systematically adjusted range of sigma, gamma, psi, frequency and orientation values could be entered using a macro. There seems to be one as part of the Weka 2D plugin, but only gamma is adjustable.



I’m glad it was useful!

To be honest, there are so many parameters that I didn’t implement more options in Trainable Weka Segmentation to avoid creating too many image features…