Andor Dragonfly IMS files larger than 60 GB cannot be opened with Bio-Formats 5.9.2

Hi all,

We’re unable to open individual IMS files from an Andor Dragonfly when the file size is 60 GB or larger. We’re using NetCDF version 4.6.12. We were first made aware of this by Mark Scott, who works at the Institute for Molecular Bioscience in Brisbane; he said he has been in contact with Andor and the Bio-Formats team about this, but I thought it would be useful to post the error we’re getting in case it’s different, and perhaps others will come forward with similar experiences. The full stack trace from the Fiji importer is below, followed by a minimal snippet for reproducing the problem outside of Fiji.

Thanks,

Matt
java.io.IOException: java.io.EOFException: Reading Y:/Scott/20180928_MF31R1fullbaseMF31R2baseandfull_Protocol_F2.ims at 1225261679400853370 file length = 1225261679400853370
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:427)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:394)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:381)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:369)
at loci.formats.services.NetCDFServiceImpl.init(NetCDFServiceImpl.java:310)
at loci.formats.services.NetCDFServiceImpl.setFile(NetCDFServiceImpl.java:103)
at loci.formats.in.ImarisHDFReader.initFile(ImarisHDFReader.java:246)
at loci.formats.FormatReader.setId(FormatReader.java:1397)
at loci.plugins.in.ImportProcess.initializeFile(ImportProcess.java:499)
at loci.plugins.in.ImportProcess.execute(ImportProcess.java:142)
at loci.plugins.in.Importer.showDialogs(Importer.java:140)
at loci.plugins.in.Importer.run(Importer.java:76)
at loci.plugins.LociImporter.run(LociImporter.java:78)
at ij.IJ.runUserPlugIn(IJ.java:228)
at ij.IJ.runPlugIn(IJ.java:192)
at ij.Executer.runCommand(Executer.java:137)
at ij.Executer.run(Executer.java:66)
at java.lang.Thread.run(Thread.java:745)
Caused by: java.io.EOFException: Reading Y:/Scott/20180928_MF31R1fullbaseMF31R2baseandfull_Protocol_F2.ims at 1225261679400853370 file length = 1225261679400853370
at ucar.unidata.io.RandomAccessFile.readFully(RandomAccessFile.java:852)
at ucar.unidata.io.RandomAccessFile.readFully(RandomAccessFile.java:831)
at ucar.unidata.io.RandomAccessFile.readString(RandomAccessFile.java:1522)
at ucar.nc2.iosp.hdf5.FractalHeap.readIndirectBlock(FractalHeap.java:441)
at ucar.nc2.iosp.hdf5.FractalHeap.readIndirectBlock(FractalHeap.java:498)
at ucar.nc2.iosp.hdf5.FractalHeap.<init>(FractalHeap.java:182)
at ucar.nc2.iosp.hdf5.H5header$DataObject.processAttributeInfoMessage(H5header.java:2477)
at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2463)
at ucar.nc2.iosp.hdf5.H5header$DataObject.<init>(H5header.java:2294)
at ucar.nc2.iosp.hdf5.H5header.getDataObject(H5header.java:2129)
at ucar.nc2.iosp.hdf5.H5header.access$500(H5header.java:72)
at ucar.nc2.iosp.hdf5.H5header$DataObjectFacade.<init>(H5header.java:2175)
at ucar.nc2.iosp.hdf5.H5header.readGroupNew(H5header.java:4183)
at ucar.nc2.iosp.hdf5.H5header.access$1000(H5header.java:72)
at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2262)
at ucar.nc2.iosp.hdf5.H5header$H5Group.<init>(H5header.java:2220)
at ucar.nc2.iosp.hdf5.H5header.makeNetcdfGroup(H5header.java:494)
at ucar.nc2.iosp.hdf5.H5header.read(H5header.java:219)
at ucar.nc2.iosp.hdf5.H5iosp.open(H5iosp.java:127)
at ucar.nc2.NetcdfFile.<init>(NetcdfFile.java:1560)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:835)
at ucar.nc2.NetcdfFile.open(NetcdfFile.java:424)
… 17 more
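
In case it’s useful for anyone trying to reproduce this outside of Fiji, the same failure can be triggered with a few lines against the Bio-Formats Java API. This is only a sketch; the class name and file path are placeholders for one of the affected files:

```java
import loci.formats.ImageReader;

public class OpenLargeIms {
  public static void main(String[] args) throws Exception {
    // Placeholder path: point this at an affected (60 GB or larger) Dragonfly .ims file.
    String path = args.length > 0 ? args[0] : "Y:/Scott/example_dragonfly.ims";

    ImageReader reader = new ImageReader();
    try {
      // setId() is where the java.io.EOFException above is thrown for the large files.
      reader.setId(path);
      System.out.println("Opened OK: " + reader.getSizeX() + " x " + reader.getSizeY()
          + " x " + reader.getSizeZ() + ", " + reader.getSizeT() + " timepoints");
    } finally {
      reader.close();
    }
  }
}
```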


Dear All,

We have discovered the source of the IMS file-opening problems in Fiji, and it relates to the way metadata is stored in the HDF5 file. We were exceeding a size limit that exists in the NetCDF library but is not actually part of the HDF5 specification. We have implemented two changes to correct this.
First, we have relocated the Protocol metadata to a different field type with greater capacity. Second, we have given the user the option to avoid time-stamping every acquired frame, since the per-frame time stamps can also exceed the metadata field capacity. Time stamping is now off by default, so all the files we have tested open in Fiji and virtual stack mode works correctly. These fixes, along with some significant improvements to the focus drift correction functionality (PFS, AFC and ZDC), are released in Fusion 2.1.0.80, which is now available for download on MyAndor (http://my.andor.com/user/).
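For anyone who wants to check whether a particular file is affected without going through Fiji, something along the following lines exercises the same code path directly in the NetCDF Java library (this is the call Bio-Formats makes internally, as the stack trace above shows; the path and class name are placeholders):

```java
import ucar.nc2.NetcdfFile;

public class CheckImsWithNetcdf {
  public static void main(String[] args) {
    // Placeholder path: substitute the .ims file you want to test.
    String path = args.length > 0 ? args[0] : "Y:/Scott/example_dragonfly.ims";

    NetcdfFile ncfile = null;
    try {
      // Affected files fail here with the EOFException shown earlier in the thread.
      ncfile = NetcdfFile.open(path);
      System.out.println("Opened OK, root group: " + ncfile.getRootGroup());
    } catch (Exception e) {
      System.out.println("Failed to open: " + e);
    } finally {
      try { if (ncfile != null) ncfile.close(); } catch (Exception e) { /* ignore */ }
    }
  }
}
```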
For those who have existing data that needs to be imported to Fiji, we recommend one of the following:
1. Load the file back into Fusion and use Export to OME-TIFF.
2. Download HDFView (https://www.hdfgroup.org/downloads/hdfview/), load the file, and edit the metadata by removing the time stamps and/or the Protocol metadata (a rough programmatic sketch of the same edit follows this list).
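
As a rough illustration of option 2, the same edit can in principle be scripted with the HDF5 Java bindings that ship with HDFView (the hdf.hdf5lib JNI wrapper). The file path, group path and attribute names below are examples only; inspect your own file in HDFView first to see where the time stamps and Protocol metadata actually live, and work on a copy of the file:

```java
import hdf.hdf5lib.H5;
import hdf.hdf5lib.HDF5Constants;

public class StripImsAttributes {
  public static void main(String[] args) throws Exception {
    // All of these are placeholders: take the real file path, group path and
    // attribute names from your own file after inspecting it in HDFView.
    String path = "Y:/Scott/example_dragonfly_copy.ims";
    String group = "/DataSetInfo/TimeInfo";
    String[] attributesToRemove = { "TimePoint1", "TimePoint2" };

    long fileId = H5.H5Fopen(path, HDF5Constants.H5F_ACC_RDWR, HDF5Constants.H5P_DEFAULT);
    long groupId = H5.H5Gopen(fileId, group, HDF5Constants.H5P_DEFAULT);
    try {
      for (String name : attributesToRemove) {
        if (H5.H5Aexists(groupId, name)) {
          H5.H5Adelete(groupId, name);   // remove the oversized attribute
          System.out.println("Removed " + group + "/" + name);
        }
      }
    } finally {
      H5.H5Gclose(groupId);
      H5.H5Fclose(fileId);
    }
  }
}
```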
Please get in touch if you have further questions.