I am currently using Fiji to measure the fluorescence intensity of plants grown under controlled conditions. The plants are grown in a grid, with 20 plants per grid. At the moment we simply create an ROI and hit Measure to get the mean fluorescence intensity of the leaves; we are currently taking about 5 readings per plant. This process takes a lot of time, as there are over eighty plants and these data points need to be taken multiple times over the week, so I would like to automate it in some way.

I have tried creating a threshold and an overlay, and then using the ROI Manager to split and measure, but this produces what appears to be a randomly ordered set of measurements, and the values don't match what I get if I just select a region and measure. Since the plants tend to move over the course of a week, what I would like to do is make a grid of ROI boxes that measure the leaves but completely ignore the background, or a composite ROI box that I could expand out and that would automatically select ROI points based on the intensity of the image, so that the points measured are only the leaves and not the background. Are there any suggestions as to how I could automate such a process? I have included a picture so you can better understand what I am trying to do.
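The core of what you describe — a fixed grid of boxes where each box averages only the above-threshold (leaf) pixels and ignores the background — can be sketched outside Fiji as plain array logic. This is a minimal NumPy sketch of that idea, not Fiji code; the function name, grid layout, and toy threshold are all my own assumptions, and in Fiji itself the same thing would be done with a macro or script driving the ROI Manager.

```python
import numpy as np

def grid_masked_means(img, rows, cols, threshold):
    """Split img into rows x cols boxes; in each box, average only the
    pixels above `threshold` (the leaf pixels), ignoring background."""
    h, w = img.shape
    means = np.full((rows, cols), np.nan)  # NaN marks boxes with no leaf pixels
    for r in range(rows):
        for c in range(cols):
            box = img[r * h // rows:(r + 1) * h // rows,
                      c * w // cols:(c + 1) * w // cols]
            leaf = box[box > threshold]    # background falls below threshold
            if leaf.size:
                means[r, c] = leaf.mean()
    return means

# toy example: a 4x4 image with one bright "leaf" in the top-left box
img = np.zeros((4, 4))
img[0, 0] = 200.0
img[0, 1] = 100.0
print(grid_masked_means(img, 2, 2, threshold=50))
```

Because each box has a fixed position in the grid, the readings come out in a stable, predictable order, unlike the particle-splitting approach you tried.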
You may have a look at these older threads:
Maybe the Rosette Tracker (http://www.plant-image-analysis.org/software/rosettetracker) is of use although I don’t know how you are going to deal with overlapping plants.
Dr Ir K.R. Straatman
Senior Experimental Officer
Advanced Imaging Facility
Centre for Core Biotechnology Services
University of Leicester
Rosette Tracker worked perfectly for my issues. The plants in the picture I posted have already been done by hand; the entire series is finished. The next step of the experiment will begin in January, so I will be able to make adjustments to deal with the plants touching so the program can work. It did take me a few minutes to get the same intensity numbers I had from individual measurements: I had to convert my image from RGB to 8-bit black and white to get matching numbers. It appears the program was measuring the green channel, while the fluorescence I was measuring was in the red channel.
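The mismatch you hit is a common one: a grayscale conversion mixes all three channels, so it will not match a measurement taken on the channel that actually carries the signal. A small NumPy sketch of the difference (the toy pixel values are invented; in Fiji the equivalent is splitting the channels and measuring only the red one):

```python
import numpy as np

# toy RGB image: the fluorescence signal sits entirely in the red channel
rgb = np.zeros((2, 2, 3))
rgb[..., 0] = 120.0   # red channel carries the fluorescence
rgb[..., 1] = 10.0    # green channel is nearly dark

red_mean = rgb[..., 0].mean()   # measuring the channel of interest directly
gray = rgb.mean(axis=2)         # a naive RGB->gray average dilutes the signal
print(red_mean, gray.mean())
```

Measuring the red channel alone preserves the true signal level, while the averaged grayscale value is pulled down by the dark green and blue channels — which is why the numbers only matched once the conversion was handled consistently.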
Thank you again.
Looking at the image you posted, it strikes me how much more intense the plants (and the light reflected by the tray) in the middle are compared to the ones at the periphery.
My impression is that the lighting in the image is inhomogeneous. I presume that it comes from a single source located above the plant tray (side by side with the camera?). You should take this into account with some normalization procedure, or change to a diffused source that ensures a more even illumination pattern.
In addition, there might be other effects present: depending on the field of view and the distance of the camera to the subjects, you could get a series of biases in the intensity readings that share a common geometric cause: some parts of the imaged subject are closer to the camera than others. For one, the objects near the center and the ones near the periphery will always be at different distances from the camera lens. This is because you are imaging a “flat” array of plants in a tray, but the points equidistant from the lens lie on a spherical surface. Thus, the closer you bring the camera, or the wider the field of view, the less the former approximates the latter. On top of that, if some plants/leaves are taller than others, the closer the camera, the larger the relative differences in distance to the lens. In any case, every distance issue present will affect an otherwise even reading.
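To put a rough number on the geometric bias: for a camera (or point light) at height h above the tray center, a plant at radius r sits at distance sqrt(h² + r²), so under a simple inverse-square model its reading is scaled by h²/(h² + r²) relative to the center. The numbers below are illustrative only, not a calibration:

```python
import math

def distance_bias(h, r):
    """Relative intensity a point at radius r receives compared to the
    tray center, for a source/camera at height h, assuming a simple
    inverse-square falloff (a rough model, not a calibration)."""
    d_squared = h * h + r * r      # squared distance to the off-center point
    return h * h / d_squared

# e.g. camera 50 cm above the tray, plant 25 cm from center:
print(distance_bias(50, 25))  # 0.8 -> a 20% dimmer reading from geometry alone
```

This illustrates the point above: raising the camera (larger h) or keeping plants near the center (smaller r) pushes the factor back toward 1 and flattens the bias.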
If your measurement setup has some of these catches, you should account for them in order to get more reliable measurements.
I hope these observations are useful for your work.
Thanks for the observations; this is indeed something I have noticed and am working to correct when we start again with the next batch in January. Our current lighting is not a single bulb but a ring of 16 LEDs with 365 nm peak emission, each with a blue cobalt glass filter to block any detectable red from an individual bulb. Most of the issues come from our camera, which has a red filter on it and can often barely pick up the light. I recognize that our best bet is to rearrange the light bulbs, but finding good 365 nm LEDs has proven very difficult.
Maybe you can try to find/make some sort of “flat field” object (a piece of foamcore?) to image first, and then use that image to normalize the intensities of your plants. As long as it fluoresces homogeneously across its surface and covers the whole field of view, it could be a good start.
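The flat-field correction itself is a single division: divide the raw image by the flat-field image and rescale by the flat's mean so overall brightness is preserved. A minimal NumPy sketch, with invented toy values (in Fiji the same operation is available through image arithmetic):

```python
import numpy as np

def flat_field_correct(raw, flat):
    """Normalize `raw` by a flat-field image `flat` of a uniform target:
    divide out the illumination pattern, rescaling by the flat's mean
    so the overall brightness level is preserved."""
    flat = flat.astype(float)
    return raw * (flat.mean() / flat)

# toy example: illumination twice as strong in the left column
flat = np.array([[2.0, 1.0],
                 [2.0, 1.0]])
raw = np.array([[20.0, 10.0],   # a uniform subject seen through that lighting
                [20.0, 10.0]])
print(flat_field_correct(raw, flat))
```

A subject that is actually uniform comes out uniform after correction, which is exactly the test: image your foamcore sheet, apply the correction to it, and the result should be flat.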