Error when processing a batch of images with TWS

I created a classifier (with TWS in Fiji) and want to apply it to other images in a folder via Java code. However, I always get an error like this:

Here is my code. I guess the problem is located in the line ImagePlus result = segmentator.applyClassifier(image);, which is the line 36 mentioned in the error panel, but I can't figure out the mistake. Can anyone give me some inspiration so that I know how to move on? Many thanks in advance.

import ij.IJ;
import ij.ImagePlus;
import ij.io.FileSaver;
import java.io.File;
import trainableSegmentation.WekaSegmentation;
import trainableSegmentation.utils.Utils;

// create segmentator
WekaSegmentation segmentator = new WekaSegmentation();

// load classifier
File modelPath = new File("C:\\Users\\yuanf\\Desktop\\sample\\model\\50bl50c_classifier.model");
segmentator.loadClassifier(modelPath.getPath());

// get list of input images
File inputDir = new File("C:\\Users\\yuanf\\Desktop\\sample\\Various Density");
File[] listOfFiles = inputDir.listFiles();

// define the output dir
File outputDir = new File("C:\\Users\\yuanf\\Desktop\\sample\\WekaResults");

for (File f : listOfFiles) {
	if (!f.isDirectory()) {
		ImagePlus image = IJ.openImage(f.getPath());
		if (image == null) {
			IJ.error("Could not open image: " + f.getPath());
		} else {
			// apply classifier and get result
			ImagePlus result = segmentator.applyClassifier(image);

			// assign same LUT as in GUI
			result.setLut(Utils.getGoldenAngleLUT());

			// save result as TIFF in output folder
			String outputFileName = f.getName().replaceFirst("[.][^.]+$", "") + ".tif";
			new FileSaver(result).saveAsTiff(outputDir.getPath() + File.separator + outputFileName);
		}
	}
}

Hello @Yuanfei_Mai and sorry for the late answer!

From the exception you got, you can see it is a memory problem. Probably your image is too large to fit all its features into memory. To prevent that, you can use the strategy described in this script. Let me know if you have questions!
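In short, the idea of that script is to let TWS classify the image tile by tile instead of building the feature stack for the whole image at once. A minimal sketch, assuming the tiled applyClassifier overload of WekaSegmentation (the 4×4 tile counts are only placeholders):

// classify tile by tile so only one tile's feature stack lives in memory at a time
int[] tilesPerDim = new int[ 2 ];
tilesPerDim[ 0 ] = 4;          // tiles along X (placeholder value)
tilesPerDim[ 1 ] = 4;          // tiles along Y (placeholder value)
boolean getProbs = false;      // false = label image, true = probability maps
// 0 = auto-detect the number of threads
ImagePlus result = segmentator.applyClassifier( image, tilesPerDim, 0, getProbs );

Each tile's feature stack is built and discarded in turn, so the peak memory roughly scales with the tile size rather than with the full image.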

Thank you for your help! The code finally works when I use smaller images, but all the results I get are fully red images. Do you know why?

It is hard to say without seeing any image. In principle, the classifier should work if the test images are similar in content and histogram to the training ones.

Hello @iarganda!
Finally, I successfully obtained segmented results by using smaller images. However, the whole image still can't be segmented by setting tiles. It seems the program still runs out of memory whether I set the tiles to 3 or 10. Is the memory usage the same no matter how I set the tiles, as long as I work on the same large image?

// apply classifier tile by tile (0 indicates number of threads is auto-detected)
boolean getProbs = false;   // false = label image, true = probability maps
int[] tilesPerDim = new int[ 2 ];
tilesPerDim[ 0 ] = 3;       // tiles along X
tilesPerDim[ 1 ] = 3;       // tiles along Y
ImagePlus result = segmentator.applyClassifier( image, tilesPerDim, 0, getProbs );

The smaller the tiles, the less memory you use. What are your image sizes?

I tried tiles = 1 again but it still ran out of memory. Maybe I need to try another machine with more RAM. The image I am testing is 665×665 micrometers (2048×2048 pixels), 16-bit, 8 MB.

How much RAM do you have? The image is not that large. You got an error using 10 tiles on each dimension?
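Just to give a rough sense of scale (these numbers are a guess on my part, since the real count depends on which training features are enabled): TWS keeps one 32-bit float image per feature, so the feature stack for the whole image needs roughly width × height × numberOfFeatures × 4 bytes, and tiling divides that by the number of tiles.

// back-of-the-envelope feature-stack estimate (illustrative numbers only)
long width = 2048, height = 2048;
int numFeatures = 80;                                  // hypothetical feature count
long bytes = width * height * numFeatures * 4L;        // one float per pixel per feature
System.out.println( bytes / ( 1024 * 1024 ) + " MB" ); // ~1280 MB for the full image
// with 3x3 tiles each tile needs roughly 1/9 of that (plus classifier overhead)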

Yes, I get the error at tiles = 10 on each dimension. I have 4 GB on my work machine. Is that too small?

What is the size of the image you used for training (on the same machine)?

The training image has the same size as the one I used for testing.

That is very surprising then. You shouldn’t have any problem. Can you send me the input and output images together with the trained model?

The model I developed, the training image, the segmentation results of training images and the testing image are all included in this file.

Thanks in advance for your patience and help.

Dear @Yuanfei_Mai,

Sorry for the late reply. I just tried to reproduce your error with your files, but it worked fine on my machine. I set the X tiles to 4, Y tiles to 4 and Z tiles to 0.

Did you check how much memory your Fiji is allowed to use? Have a look at Edit > Options > Memory & Threads > Maximum Memory.
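If you want to double-check from code what the JVM actually got (this is plain ImageJ/Java, not TWS-specific):

import ij.IJ;

// print the heap limit the JVM was started with
long maxHeap = Runtime.getRuntime().maxMemory();
IJ.log( "Max heap: " + ( maxHeap / ( 1024 * 1024 ) ) + " MB" );
// IJ.freeMemory() returns a short summary of the current memory usage
IJ.log( IJ.freeMemory() );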

I set 4000 MB as the maximum memory and 8 parallel threads. How much is yours?

My only difference is I use 8 parallel threads, maybe it is that.

Also, I remember that on some Windows systems the effective RAM was not determined by that number. Are you sure all that memory is actually being used?

I also used 8 parallel threads, and I am not sure whether all the memory is used. But anyway, thank you for all your help and your patient answers. I am very grateful for that! :+1:

Sorry, I was mistaken, I meant I use 4 parallel threads.