Fiji, CLIJ, etc. *native* on Apple Silicon (arm64) M1

I thought I would start a topic to consolidate some questions and info regarding image analysis on this consumer arm64 platform: Apple Silicon with macOS 11.4 (Big Sur). I think this platform is here to stay—disclosure: I own an M1 MBPro.

I want to focus on native arm64 only, as this offers greater performance.

I (and at least a couple of others) have been able to run Fiji/ImageJ natively by installing a “no JRE” version combined with a native JRE. See:

A native JRE is available via Homebrew (OpenJDK Java 11, Java 16):

or from Azul (Zulu Java8, 11, 13, 15, 16):

As I posted on Twitter, switching to native results in a ~40% performance increase in a simple CPU benchmark from @haesleinhuepf

However, switching to a native JRE brings issues: you can’t mix architectures. For example, my initial trials with CLIJ only worked in Rosetta, the emulation environment. When everything is emulated, everything works.

To fully leverage things—particularly in CLIJ, but probably other high performance analysis pipelines—one needs native components. For CLIJ this required compiling a native JOCL library (Java Bindings for OpenCL). I was able to do this—it was actually trivial—and @haesleinhuepf has added it to his CLIJ distribution and it works: ~8X speedup.

So my question to those more knowledgeable and those writing plugins: what other libraries, like JOCL, need to be recompiled?

So far I’ve also run into this issue when trying the CLIJ2 Richardson-Lucy deconvolution by @bnorthan and @haesleinhuepf
I’ve started an issue on CLIJ2-FFT:

Again, I’ve managed to compile the library, but a bit more help is needed I think, plus ultimately distribution.

Another one is StarDist, which uses TensorFlow-Java:

This one is definitely trickier!
There is an Apple TensorFlow fork with arm64 support, but getting from there to TF-Java is still not obvious.

What other plugins, libraries need recompiling? I’m willing to help/test, but my coding experience is really limited, mostly to copy-pasta. I can follow directions—when I read them—and have the basic toolchains installed (gcc, llvm, cmake, etc.).


Fewer people interested in new Macs than I thought :cry:

Anyhow, I’ve poked around a bit in Apple docs.
The good news is that you don’t need an Apple Silicon Mac to build for it and you can compile Universal (fat) versions of dylibs, et al.
Of course, to test native code, an Apple Silicon Mac will be needed.

To make a fat binary when building (e.g. with cmake), set both architectures in the compiler flags:

CFLAGS="-arch x86_64 -arch arm64"
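For CMake projects specifically, there is also a dedicated variable for this; a minimal sketch (the source/build paths in the commented command are placeholders):

```shell
# Route 1: architecture flags honored by configure/make-style builds.
export CFLAGS="-arch x86_64 -arch arm64"
export LDFLAGS="-arch x86_64 -arch arm64"
echo "$CFLAGS"

# Route 2: CMake's own variable (shown as a comment, since it needs a
# real CMake project to run against):
#   cmake -DCMAKE_OSX_ARCHITECTURES="x86_64;arm64" -S . -B build
```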

One catch is that a lot of build scripts that use

uname -m

to detect architecture will build just x86—or just arm64 (in my case).
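The detection pattern in question, sketched as a small shell function (the function name is illustrative):

```shell
# Map a machine string, as reported by `uname -m`, to single-arch
# compiler flags -- the pattern that defeats universal builds.
pick_arch_flags() {
  case "$1" in
    x86_64) echo "-arch x86_64" ;;
    arm64)  echo "-arch arm64" ;;
    *)      echo "" ;;
  esac
}

# A build script doing this only ever targets the host architecture:
pick_arch_flags "$(uname -m)"
```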

Alternately, it’s possible to combine binaries using lipo for distribution:

lipo -create -output universal_lib x86_lib arm_lib

TensorFlow Python support will come in macOS 12.0+


You beat me to posting that!
A bit of a wait: public betas this summer, release in the fall.
But really good news IMO—hopefully this makes TF-java support more likely.
I hope for some more interesting stuff to come out of WWDC as the week goes on.

In the short term, it means that the Apple TensorFlow fork is dead, so what works works and what doesn’t won’t.

So far I’ve found the following that look to need recompiling:

  1. The launcher. It actually works, but it sort of creates a second Fiji dock icon, a minor annoyance.
/Applications/> lipo -info ImageJ-macosx       
Non-fat file: ImageJ-macosx is architecture: x86_64

It’s not clear to me if this can be built by itself…

Edit: It seems Fiji uses the imagej launcher, which is now built using cmake:

The macOS section doesn’t seem to force an arch, so it may “just work”—I’ll take a look more closely when I get a chance!

  2. In /Applications/, in addition to the libclij2fft-related .dylib, there is also

Not sure where it’s used, but it’s built using cmake and well documented:

  3. In /Applications/ there are -universal.jar files, but judging by their dates they can’t be x86/arm64 fat:


The last 4 are all related, part of JogAMP—released in 2015, so definitely not arm64 universal. Interestingly this is a different JOCL than that used by CLIJ, perhaps @haesleinhuepf can comment?

The rest of the jars are less obvious—like the above-mentioned libtensorflow, which has no clear indicator that it’s not platform independent. Is there an easy way to check for native libraries inside? lipo -info fails on jars.


I don’t really know what JogAMP is. I assume it’s OpenGL-related. (Not my business :wink: )

In Polish we have a great phrase for that:
nie mój cyrk, nie moje małpy
Literally: “not my circus, not my monkeys”

How about CLIJ2-FFT?
Seems like maybe “your circus”? :wink:

Was this not what Java 3D was based on at some point?

Again, perhaps when the N5 components moved into SciJava?

I’m happy to support your efforts. But please be aware that neither I nor my collaborators have that target system. So these are low-priority projects at the moment.


Totally understand—I apologize if I came off as rude or demanding; I was hoping the emoji made that clear.
Your efforts with CLIJ and your engagement with the community here and on Twitter are nothing short of outstanding and hugely commendable! :smiling_face_with_three_hearts:
It’s a bit of a chicken-and-egg problem for everyone, but the situation is quite dynamic: a lot of the upstream libraries are increasingly becoming available natively (e.g. today I installed a native pandoc).
So I hope that soon it will just be a case of plugin authors downloading a native or fat library to use in their plugin, with no effort required on their end.

Re: libblosc.dylib
Turns out this one is already available native via homebrew (bottle = binary):

I just installed on my M1. Alas, the homebrew bottle is not fat!

/opt/homebrew/Cellar/c-blosc/1.21.0/lib > ls                  
libblosc.1.21.0.dylib libblosc.a            pkgconfig
libblosc.1.dylib      libblosc.dylib
/opt/homebrew/Cellar/c-blosc/1.21.0/lib > lipo -info libblosc.1.21.0.dylib
Non-fat file: libblosc.1.21.0.dylib is architecture: arm64

So this is a bit of a snag for someone on an x86 Mac.
These libraries (c-blosc and clFFT) are small, I can easily share a zip.

I believe that is one of @haesleinhuepf 100 or more circuses :grinning_face_with_smiling_eyes: :grinning_face_with_smiling_eyes:

clij-fft is an attempt to build some algorithms that use clfft. So you would need to compile clfft for Apple M1 first. Have you tried that?

After that you would need to add some info to 2 files in the clij2-fft project

  1. This file calls cmake and then make to build the native part of clij2-fft. You would need to add Apple M1 as a platform.

  2. clij2fftWrapper. This is needed to build the Java wrapper. Again, you need to add Apple M1 as a platform and specify where clFFT can be found.

It’s a complicated build, but so far it’s worked. Admittedly we need to put more work into making the build files neater and more consistent. Thus far clij2-fft has been an ‘experimental’ project, and the first step was to hack around, get the build working, and implement some FFT-based algorithms.


@bnorthan Thanks for jumping in!

clFFT is available from homebrew as M1 native. So that’s covered—that was the easy part: brew install clfft

I was able to do this in my fork and got an arm64 (M1) lib:

> lipo -info libclij2fft.dylib
Non-fat file: libclij2fft.dylib is architecture: arm64

(As mentioned above, I should go back and figure out how to make it make a fat version with x86 and arm64, but that’s for another day…)

OK, this wasn’t obvious to me—still isn’t—so I’ve not done it.
Do I also need CUDA?

That’s an NVIDIA thing, I think? So are we DOA?
Also, is this the step that will generate libjniclij2fftWrapper.dylib ?

It’s fine, thank you and @haesleinhuepf for your efforts with this.
I hope I can get the M1 steps done so that you or @haesleinhuepf can push a new version such that:


runs on my Mac—and other people’s M1 Macs!

You do not need CUDA. You do need opencl.lib, and I’ve set the build up to find it in the CUDA directories, because there is a copy distributed with CUDA. In your case it just needs to point to wherever opencl.lib or an equivalent is.

Long term we probably need to set this up in a more generic way, so it is not pointing to specific locations that could only exist on individual machines.

If you want to understand things better you may want to look at javacpp if you are not already familiar with it.


OK, so this was pretty easy: I was able to build a universal (x86_64/arm64) ImageJ-macosx launcher just by adjusting the cmake arch setting to include both x86_64 and arm64.
I’ve made an issue on the imagej/imagej-launcher GitHub where you can see more info:


A summary of the universal binary behavior (launching from command line to see information):

  1. my universal imagej-macosx (x86/arm64), with the current official JRE (x86):
    wrong-architecture error, system Java fallback, launches Fiji, no issues (<4 s launch)
  2. my universal imagej-macosx (x86/arm64), with no JRE:
    system Java fallback, launches Fiji, no issues (<4 s launch)
  3. my universal imagej-macosx (x86/arm64), with the native Azul 11 JRE inside:
    normal launch (<4 s), no issues, one Fiji in dock

So from where I sit, the universal imagej-macosx with native JRE inside is the best, but all cases work fine, as long as a system Java exists.
The missing case is universal launcher on x86 Mac, which I might be able to try this weekend. But if someone wants to try my binary, here’s a zip:

(it belongs in:

Thinking about it more, this is a regression, since being able to run Fiji under Rosetta can certainly be desired, especially in the early going (i.e. now).
I’m puzzled that Get Info does not offer Open using Rosetta, as is available for other universal (x86/arm64) apps.
However, the launcher can be forced into x86 mode and Fiji launches in Rosetta from the terminal:

arch -x86_64 ./ImageJ-macosx

More on the Apple Silicon TensorFlow front:
There are reports on Twitter of TF 2.5 and the new tensorflow-metal plugin working on Big Sur:

Alas, TF-Java is still problematic at the moment, but there is hope:


For those working in ImageJ, @Wayne has a ready download of ImageJ with a native Zulu Java 8. See:
It launches very fast and runs beautifully on my MBPro—no issue with extra dock icons or drag-and-drop.

Fiji (Zulu 11 arm64): 11.9, 10.7, 10.7
ImageJ (M1): 11.5, 10.9, 10.7