I feel like I must be overlooking something here. I have a middling-size TrackMate result XML file (~100 MB), the result of tracking (using the interactive GUI) only 1000-odd tracklets over 70 frames, but using the multi-channel add-on to compute expression across five channels. I'm then trying to read the TrackMate XML file in MATLAB (using the script that ships with Fiji) to pull out the expression for each track/channel. This has worked fine with smaller ~30 MB files, but now I'm getting a java.lang.OutOfMemoryError: GC overhead limit exceeded, with a call stack deep in various org.apache and net.sf libraries. I believe that means it's spending nearly all its time in garbage collection, presumably due to some quirk of the loader implementation, and Java has given up in disgust. The machine has 256 GB of RAM, so there's no lack of real memory. This doesn't seem an unreasonably large TrackMate file, so I wonder if there's some obvious gotcha I'm overlooking? The only workaround I can think of is to create a smaller file by re-running TrackMate from a script and preventing unnecessary features from being calculated. But it would be a drag to figure out how to get all the expression values, and nothing else, calculated from a script, only to then discover it still won't load, so I'm wondering if anyone else has any thoughts? Thanks
As a follow-up to this, if anyone is curious: by computing just the multi-channel intensity spot features (plus a couple needed for filtering tracks) with a script, I was able to shrink the 100 MB file to around 40 MB, and I could then run trackMateGraph to load the data without getting an error (though it took over an hour). Since the error was happening (if I remember correctly) in the spot-loading call of trackMateGraph, I don't think calling that directly instead of trackMateGraph would have helped. I am still curious whether there's a more robust way people like for loading expression spot features, assembled into tracks, into MATLAB for larger data sets. Do people just parse the spreadsheet-style spot summary export themselves?
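For what it's worth, one thing I've been considering is a streaming parse outside MATLAB, which never holds the whole document in memory. A minimal sketch in Python; the tag and attribute names here (SpotsInFrame, Spot, MEAN_INTENSITY_CH1, ...) are written from memory of the TrackMate format and may not match a given TrackMate version, so check them against your own file:

```python
# Streaming (incremental) parse of a TrackMate-style XML: spots are read one
# element at a time and discarded, so memory use stays roughly constant
# regardless of file size. Element/attribute names are illustrative.
import io
import xml.etree.ElementTree as ET

SAMPLE = b"""<TrackMate>
  <Model>
    <AllSpots>
      <SpotsInFrame frame="0">
        <Spot ID="1" MEAN_INTENSITY_CH1="10.5" MEAN_INTENSITY_CH2="3.2"/>
        <Spot ID="2" MEAN_INTENSITY_CH1="7.0" MEAN_INTENSITY_CH2="4.1"/>
      </SpotsInFrame>
    </AllSpots>
  </Model>
</TrackMate>"""

def stream_spot_intensities(stream,
                            channels=("MEAN_INTENSITY_CH1",
                                      "MEAN_INTENSITY_CH2")):
    """Yield (spot_id, {feature: value}) without building the full tree."""
    for event, elem in ET.iterparse(stream, events=("end",)):
        if elem.tag == "Spot":
            yield elem.get("ID"), {c: float(elem.get(c)) for c in channels}
            elem.clear()  # drop this element's contents before moving on

spots = dict(stream_spot_intensities(io.BytesIO(SAMPLE)))
print(spots["1"]["MEAN_INTENSITY_CH1"])  # 10.5
```

For a real file you'd pass an open file handle instead of the io.BytesIO sample, and dump the result to CSV or MAT for MATLAB to pick up.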
You are right that it is not a crazy-large file. I suspect an issue on the MATLAB side. How much memory does MATLAB allocate to the JVM in your configuration?
It was the default 768 MB. I didn't think to look at that setting, since it was a GC overhead error rather than a heap-size error, but it's worth trying when I next get a minute.
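For anyone who finds this later: the heap can be raised from MATLAB's Preferences > General > Java Heap Memory, or, on versions where that slider caps out too low, via a java.opts file in the MATLAB startup folder containing JVM flags (one per line). The exact location and limits vary by MATLAB version, and the 16g below is just an example value, not a recommendation:

```
-Xmx16g
```

A GC-overhead error is still an out-of-memory condition (the collector is thrashing near the heap ceiling), so a larger -Xmx is the first thing to try.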