How to get ilastik neural network to use GPU

Hi!

I have a computer running Linux Mint 18 with a GeForce GTX 1080 Ti and CUDA 10 installed.
I can’t get ilastik to see the GPU when I hit “Get Devices” in step 2, “Server configuration”.
It does see the CPU and can use it to run the workflow.

I do get a PASS from NVIDIA’s deviceQuery and bandwidthTest.
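
For reference, I built and ran those samples roughly like this (the path may differ depending on where the CUDA 10.0 samples are installed):

cd /usr/local/cuda-10.0/samples/1_Utilities/deviceQuery
sudo make
./deviceQuery        # ends with Result = PASS
cd ../bandwidthTest
sudo make
./bandwidthTest      # also ends with Result = PASS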

Am I missing something obvious?

Hi @Buono,

I assume you have installed the server component, too, and it only shows you the CPU?

Hi!
I did install it as described here, quoted below:

conda create -n tiktorch-server-env -c ilastik-forge -c conda-forge -c pytorch tiktorch

It sees the CPU and is capable of running the whole workflow, but it never gives me the option of using the GPU.

What NVIDIA driver version do you have installed?
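
If it helps, nvidia-smi prints the driver version in its header. It might also be worth checking, from inside the tiktorch environment, whether PyTorch can see the GPU at all, with something like:

conda activate tiktorch-server-env
nvidia-smi
# quick check that PyTorch finds a CUDA device and which CUDA build it uses
python -c "import torch; print(torch.cuda.is_available(), torch.version.cuda)"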

Driver Version: 410.129
Cuda compilation tools, release 10.0, V10.0.130

Could you please try upgrading your driver to version 418 or newer? Here is a thread with a similar issue.
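
On an Ubuntu 16.04 based system like Mint 18, one way to do this (untested on my side; the exact package name may be nvidia-430 or nvidia-driver-430 depending on your setup) is via the graphics-drivers PPA:

sudo add-apt-repository ppa:graphics-drivers/ppa
sudo apt-get update
sudo apt-get install nvidia-430   # any driver >= 418 should do
sudo reboot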

It works!
I updated the driver as you suggested.
Now running:
Driver Version: 430.64
CUDA Version: 10.1

Blazing fast speed!

Thank you for the help.

Awesome!!! 🙂