Swapping Slices and Timepoints for a .avi Image (ImageJ Plugin)

Hello,

I am working on a plugin for Fiji that tracks the motion of magnetic particles throughout a particle video by interfacing with the TrackMate plugin, and then uses the particle motion data to calculate the force the particles are experiencing due to an external magnetic gradient.

I have a batch of .avi videos that I am trying to analyze using my plugin. However, these videos have their slice and timepoint dimensions swapped: they contain only one timepoint and hundreds of slices. When I run the TrackMate plugin directly on one of these images, it detects this and asks me if I want to swap the z and t dimensions, to which I agree. However, I am having difficulty implementing the same functionality in my own plugin.

First, I would like a way to access the number of timepoints in the currently open video and store it in a variable. I would then use a conditional statement to check whether the number of timepoints is one, and if it is, swap the z and t dimensions. However, I have not been able to find any function that accesses this timepoint data. (I am writing my plugin in Java, in the Eclipse IDE.) Does anyone have a method they could recommend for accessing the timepoint data of a hyperstack video?
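In case a sketch helps: ImagePlus exposes the hyperstack dimensions directly, via imp.getNFrames() (number of timepoints) and imp.getNSlices() (number of z-slices). The needsSwap helper below is my own illustration of the check; only the getNFrames()/getNSlices() calls are actual ImageJ API, so in the plugin you would call it as needsSwap(imp.getNSlices(), imp.getNFrames()).

```java
// Decision logic for detecting a time-lapse whose frames were loaded as
// slices: a video with many z-slices but a single timepoint almost
// certainly has its z and t dimensions swapped.
public class DimensionCheck {

    // Returns true when the z and t dimensions look swapped.
    static boolean needsSwap(int nSlices, int nFrames) {
        return nFrames == 1 && nSlices > 1;
    }

    public static void main(String[] args) {
        System.out.println(needsSwap(300, 1)); // typical mis-read .avi
        System.out.println(needsSwap(1, 300)); // correctly read time-lapse
    }
}
```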

Second, I need a way to swap the z and t dimensions of my video from within the plugin. I tried using the macro command:
IJ.run(imp, "Re-order Hyperstack ...", "channels=[Channels (c)] slices=[Frames (t)] frames=[Slices (z)]");
However, when I try to run my plugin on one of my videos, I get the following error:
java.lang.IllegalArgumentException: Stack argument out of range: 1
Any advice you have on implementing either of the two methods mentioned above would be appreciated. I will provide a portion of my source code below. I would attach one of the video files I am working with, but it seems .avi files are not supported in these posts.
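For the swap itself, one possible cause of the IllegalArgumentException is that "Re-order Hyperstack ..." is being run on an image that ImageJ does not treat as a hyperstack (a plain single-channel stack). If so, the swap can be done directly with the real ImageJ calls imp.getDimensions() (which returns {width, height, nChannels, nSlices, nFrames}) and imp.setDimensions(nChannels, nSlices, nFrames). The swappedDims helper below is only my illustration of the index shuffle, not part of the ImageJ API:

```java
import java.util.Arrays;

public class SwapSketch {

    // imp.getDimensions() returns {width, height, nChannels, nSlices, nFrames}.
    // This returns the {nChannels, nSlices, nFrames} triple to pass to
    // imp.setDimensions(...) with the slice and frame counts exchanged.
    static int[] swappedDims(int[] dims) {
        return new int[] { dims[2], dims[4], dims[3] };
    }

    public static void main(String[] args) {
        // A 512x512 video read as 1 channel, 300 slices, 1 frame...
        int[] d = swappedDims(new int[] { 512, 512, 1, 300, 1 });
        // ...becomes 1 channel, 1 slice, 300 frames.
        System.out.println(Arrays.toString(d)); // [1, 1, 300]
    }
}
```

In the plugin this would amount to: int[] dims = imp.getDimensions(); then imp.setDimensions(dims[2], dims[4], dims[3]); guarded by the one-timepoint check.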

Source code (Initial portion):

//package name
package net.msu.kunzelab;

//Imports
import net.imagej.Dataset;
import net.imagej.ImageJ;
import net.imglib2.img.Img;
import net.imglib2.type.numeric.RealType;
import org.scijava.command.Command;
import org.scijava.plugin.Parameter;
import org.scijava.plugin.Plugin;

import java.lang.Math;
import java.io.File;
import java.util.Set;

import org.scijava.ItemIO;
import org.scijava.log.LogService;

import ij.IJ;
import ij.ImagePlus;
import ij.ImageStack;
import ij.WindowManager;

import io.scif.services.DatasetIOService;

import fiji.plugin.trackmate.Settings;
import fiji.plugin.trackmate.Spot;
import fiji.plugin.trackmate.Model;
import fiji.plugin.trackmate.SelectionModel;
import fiji.plugin.trackmate.TrackMate;
import fiji.plugin.trackmate.Logger;
import fiji.plugin.trackmate.detection.DetectorKeys;
import fiji.plugin.trackmate.detection.LogDetectorFactory;
import fiji.plugin.trackmate.tracking.sparselap.SparseLAPTrackerFactory;
import fiji.plugin.trackmate.tracking.TrackerKeys;
import fiji.plugin.trackmate.tracking.LAPUtils;
import fiji.plugin.trackmate.visualization.hyperstack.HyperStackDisplayer;
import fiji.plugin.trackmate.features.FeatureFilter;
import fiji.plugin.trackmate.features.FeatureAnalyzer;
import fiji.plugin.trackmate.FeatureModel;
import fiji.plugin.trackmate.features.spot.SpotContrastAndSNRAnalyzerFactory;
import fiji.plugin.trackmate.action.ExportStatsToIJAction;
import fiji.plugin.trackmate.io.TmXmlReader;
import fiji.plugin.trackmate.action.ExportTracksToXML;
import fiji.plugin.trackmate.io.TmXmlWriter;
import fiji.plugin.trackmate.features.ModelFeatureUpdater;
import fiji.plugin.trackmate.features.SpotFeatureCalculator;
import fiji.plugin.trackmate.features.spot.SpotContrastAndSNRAnalyzer;
import fiji.plugin.trackmate.features.spot.SpotIntensityAnalyzerFactory;
import fiji.plugin.trackmate.features.track.TrackSpeedStatisticsAnalyzer;
import fiji.plugin.trackmate.features.track.TrackLocationAnalyzer;
import fiji.plugin.trackmate.util.TMUtils;

//Create plugin class
@Plugin(type = Command.class, menuPath = "Plugins>Particle Video")
public class ParticleVideo<T extends RealType<T>> implements Command {
	 
	//add IO service
	@Parameter
	private DatasetIOService datasetIOService;
	
	//add log service
	@Parameter
	private LogService logService;
	
	//define input variables
	@Parameter(label = "Enter actual particle radius (microns):")
	private String radius = "";
	
	@Parameter(label = "Enter apparent particle radius (microns):")
	private String apparent_radius = "";
	
	@Parameter(label = "Enter solution viscosity (kg/(m*s)):")
	private String viscosity = "";
	
	@Parameter(label = "Enter particle mass (micrograms):")
	private String mass = "";
	
	@Parameter(label = "Enter particle detection threshold")
	private String threshold = "";
	
	@Parameter(label = "Enter pixel to um conversion (pixels/um)")
	private String conversion = "";
	
	//create output variables
	@Parameter(type = ItemIO.OUTPUT)
	private String particle_radius;
	
	@Parameter(type = ItemIO.OUTPUT)
	private String apparent_particle_radius;
	
	@Parameter(type = ItemIO.OUTPUT)
	private String solution_viscosity;
	
	@Parameter(type = ItemIO.OUTPUT)
	private String particle_mass;
	
	@Parameter(type = ItemIO.OUTPUT)
	private String detection_threshold;
	
	@Parameter(type = ItemIO.OUTPUT)
	private String conversion_factor;
	
	@Override
	public void run() {
		
		//Assign output variables & output to screen
		particle_radius = ""+ radius +" microns";
		apparent_particle_radius = ""+ apparent_radius +" microns";
		solution_viscosity = ""+ viscosity +" (kg/(m*s))";
		particle_mass = ""+ mass +" micrograms";
		detection_threshold = ""+ threshold +"";
		conversion_factor = ""+ conversion +" (pixels/um)";
		
		//Get image
		//final String file = IJ.getFilePath("Choose TIF Image Set");
		//final ImagePlus imp = IJ.openImage(file);
		final ImagePlus imp = IJ.getImage();
		
		//Instantiate Model Object
		final Model tmmodel = new Model();
		
		//Set logger
		tmmodel.setLogger(Logger.IJ_LOGGER);
		
		//Prepare settings object
		final Settings tmsettings = new Settings();
		tmsettings.setFrom(imp);
		
		//Pixel to um scale converted from string to double
		double scale = Double.parseDouble(conversion);
		
		//Set Calibration
		IJ.run(imp, "Set Scale...", "distance="+ scale +" known=1 unit=um");
		
		//Condition if frames are interpreted as slices - need to add conditionality!
		IJ.run(imp, "Re-order Hyperstack ...", "channels=[Channels (c)] slices=[Frames (t)] frames=[Slices (z)]");
		
		//Configure detector
		tmsettings.detectorFactory = new LogDetectorFactory();
		tmsettings.detectorSettings.put(DetectorKeys.KEY_DO_SUBPIXEL_LOCALIZATION, Boolean.TRUE);
		tmsettings.detectorSettings.put(DetectorKeys.KEY_RADIUS, Double.parseDouble(apparent_radius));
		tmsettings.detectorSettings.put(DetectorKeys.KEY_TARGET_CHANNEL, 1);
		tmsettings.detectorSettings.put(DetectorKeys.KEY_THRESHOLD, Double.parseDouble(threshold));
		tmsettings.detectorSettings.put(DetectorKeys.KEY_DO_MEDIAN_FILTERING, Boolean.FALSE);

		//Configure Tracker
		tmsettings.trackerFactory = new SparseLAPTrackerFactory();
		tmsettings.trackerSettings = LAPUtils.getDefaultLAPSettingsMap();
		tmsettings.trackerSettings.put(TrackerKeys.KEY_LINKING_MAX_DISTANCE, 60.0);
		tmsettings.trackerSettings.put(TrackerKeys.KEY_GAP_CLOSING_MAX_DISTANCE, 60.0);
		tmsettings.trackerSettings.put(TrackerKeys.KEY_GAP_CLOSING_MAX_FRAME_GAP, 2);
		
		tmsettings.addSpotAnalyzerFactory(new SpotIntensityAnalyzerFactory());
		tmsettings.addSpotAnalyzerFactory(new SpotContrastAndSNRAnalyzerFactory());
		
		tmsettings.addTrackAnalyzer(new TrackLocationAnalyzer());
		tmsettings.addTrackAnalyzer(new TrackSpeedStatisticsAnalyzer());
		
		tmsettings.initialSpotFilterValue = 1.0;
		
		//print(str(tmsettings));
		
		//Instantiate TrackMate
		final TrackMate trackmate = new TrackMate(tmmodel, tmsettings);
		
		//Execute all
		Boolean ok;
		ok = trackmate.checkInput();
		if (!ok) {
			IJ.log(trackmate.getErrorMessage());
		}
		ok = trackmate.process();
		if (!ok) {
			IJ.log(trackmate.getErrorMessage());
		}
		
		//Display results
		tmmodel.getLogger().log("Found " + (tmmodel.getTrackModel().nTracks(Boolean.TRUE)) + " tracks.");
		
		final SelectionModel tmselectionmodel = new SelectionModel(tmmodel);
		final HyperStackDisplayer displayer =  new HyperStackDisplayer(tmmodel, tmselectionmodel, imp);
		displayer.render();
		displayer.refresh();
		
		final FeatureModel fm = tmmodel.getFeatureModel();

I will get back to you asap this week.

Hello

Here is how I do the switching stack β†’ 2D over time:

final int nslices = imp.getStackSize(); // total number of images in the stack
final int nframes = nslices;
// If you have just 1 channel.
imp.setDimensions( 1, 1, nframes ); // 1 channel, 1 slice, nframes timepoints
imp.getCalibration().pixelDepth = 1.;
imp.getCalibration().frameInterval = 1.;