I am currently training an Inception v3 network to classify images into 2 classes. My input data are fluorescence images containing 5 channels, one per marker. Since the Inception network only accepts 3 input channels, my first approach was to combine the channels into an RGB image, for example like this: (channel1 + channel4/2, channel2 + channel4/2, channel3). However, my eventual goal is to interpret what the network has learned, so I think it would make more sense to modify the Inception network to accept additional channels; otherwise, how would I disentangle the contributions of the different channels?
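For reference, a minimal sketch of that recombination, assuming channels-last float arrays in [0, 1] (the function name and the clipping are my own choices; note this particular mix drops channel 5):

```python
import numpy as np

def recombine_to_rgb(img):
    """Collapse a 5-channel image (H, W, 5) into RGB by mixing
    channel 4 into the red and green planes."""
    c1, c2, c3, c4 = img[..., 0], img[..., 1], img[..., 2], img[..., 3]
    rgb = np.stack([c1 + c4 / 2, c2 + c4 / 2, c3], axis=-1)
    return np.clip(rgb, 0.0, 1.0)  # keep values in a displayable range

img = np.random.rand(299, 299, 5).astype(np.float32)
print(recombine_to_rgb(img).shape)  # (299, 299, 3)
```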
I have been thinking of several approaches and would like your opinion on which would be the most relevant!
The first approach would consist of modifying only the first convolutional layer of the Inception network (keras-applications/inception_v3.py at bc89834ed36935ab4a4994446e34ff81c0d8e1b7 · keras-team/keras-applications · GitHub, line 169) so that its kernel accepts 5 input channels instead of 3, i.e. going from a 3x3x3 kernel to a 3x3x5 one (or perhaps enlarging the spatial extent too, e.g. 5x5x5; I don't really know what would make more sense). However:
(1) I have no idea whether it could cause any accuracy issue, because everything was built for 3 channels in the first place
(2) I feel I won't be able later to disentangle the discriminative power of each channel using, for instance, saliency maps, and say "this pixel in this channel is discriminative" rather than just "this combination of pixels across the 5 channels is discriminative"
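For what it's worth, approach 1 may not even require editing the source: a sketch assuming tf.keras, where passing a 5-channel `input_shape` with `weights=None` rebuilds the first kernel automatically (ImageNet weights only exist for 3 channels, so the network would train from scratch):

```python
import tensorflow as tf

# InceptionV3 accepts an arbitrary channel count when weights=None
model = tf.keras.applications.InceptionV3(
    weights=None,          # no pretrained weights exist for 5 channels
    include_top=False,     # keep only the feature-extraction part
    pooling="avg",
    input_shape=(299, 299, 5),
)

# The first conv kernel is still 3x3 spatially, but now has 5 input channels
first_conv = next(l for l in model.layers
                  if isinstance(l, tf.keras.layers.Conv2D))
print(first_conv.kernel.shape)  # (3, 3, 5, 32)
```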
The second approach I have been thinking of is to train 5 different Inception networks in parallel (only the feature-extraction part) with 5 different loss functions (one per channel), and to combine all the features before feeding everything into a dense classifier. It sounds very heavy, but this way I might be able to disentangle the discriminative power of each of the 5 channels; at the same time, I think I would lose the information carried by marker colocalization.
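To make approach 2 concrete, here is a sketch of the multi-branch wiring, trained end-to-end with a single joint loss rather than the 5 separate per-channel losses described above. The tiny conv stack in `make_branch` is a placeholder of my own to keep the example small; in practice each branch could be an `InceptionV3(weights=None, include_top=False, input_shape=(299, 299, 1))`:

```python
import tensorflow as tf
from tensorflow.keras import layers

def make_branch(name):
    """Placeholder single-channel feature extractor (stand-in for an
    InceptionV3 feature-extraction trunk); shows only the wiring."""
    inp = layers.Input(shape=(299, 299, 1), name=f"{name}_in")
    x = layers.Conv2D(16, 3, strides=2, activation="relu")(inp)
    x = layers.Conv2D(32, 3, strides=2, activation="relu")(x)
    x = layers.GlobalAveragePooling2D()(x)
    return inp, x

# one branch per fluorescence channel
branches = [make_branch(f"ch{i}") for i in range(5)]
inputs = [inp for inp, _ in branches]
features = [feat for _, feat in branches]

# concatenate the per-channel features and classify with a dense head
merged = layers.Concatenate()(features)
out = layers.Dense(2, activation="softmax")(merged)
model = tf.keras.Model(inputs, out)

# a 5-channel image is fed as a list of five (H, W, 1) tensors
x = tf.random.uniform((4, 299, 299, 5))
preds = model([x[..., i:i + 1] for i in range(5)])
print(preds.shape)  # (4, 2)
```

Because the features stay separated per branch until the concatenation, a saliency method applied to one branch's input attributes only to that channel.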
What do you think? Of course, any other ideas or approaches would be appreciated!