How can I use ImageJ to assess ocular motility?

I want to assess synthetic eye motility in comparison to the normal eye in the same individual.
How far does the eye move from primary position?

Thank you very much in advance

Best
Mostafa

Hi @Mostafa_Diab,

It looks as if in ImageJ/FIJI you have discovered a tool that may suit your needs.

Referring to the Welcome message, in particular the sentence “The primary objective is to foster independent learning for everyone in the community.”, how would you go about solving this problem?

What are the goals, the constraints and the possible pitfalls? The more you have thought over an approach, the easier its implementation gets.

Hi @eljonco

Yes, you are right. I find ImageJ a good tool for evaluating my patients.

Unfortunately, I find myself illiterate in such programs, but I am trying to learn.

Are there simple resources for learning to use this program for measurements on human facial pictures?

All the best,

Mostafa

@Mostafa_Diab,

Although I haven’t seen such a piece of research passing by on the forum, there must be a wealth of research based on eye movement in the field of psychology. That may be the breeding ground for eye-tracking applications. From a hacking point of view, there is also a lot to read up on.

I realise that, strictly speaking, eye tracking is different from ocular motility, but I find it hard to believe the underlying mechanisms would differ.

Yet, if ImageJ or any other application is used for analysis of ocular motility, it is important to design a set of premises that allow validation of the final results. Think of the sample images you gave. What are your reference points, what is the distance between the camera and the subject, what is the key number you want to obtain, etc.? Using this as a starting point, sub-goals can be expressed: I want to find the centre of the eye. That can be split up into smaller goals, like ‘find the pupil’, etc.

Is the key number you are after perhaps the mm the pupil travelled in the image, the percentage of movement of the pupil, or the angle in the horizontal and the vertical? If so, with respect to what central pivot point? And how do you correct for more or less bulging of the eye, as that influences the geometry with respect to a 2D image?

@eljonco
I am planning to measure the distance that a certain landmark moves in horizontal and vertical gazes from primary position.
This landmark will be the lateral limbus (the edge of the coloured part of the eye) in inward movement, the medial limbus in outward movement, and the lower border of the pupil in up-gaze, all measured from the primary position of gaze (looking straight ahead) and compared with the contralateral normal eye. I want to set a scale: the horizontal diameter of the cornea (the coloured part) is known to be a fixed 11 mm.

The landmark is changed with each gaze to ensure good visibility.
The question is: how do I determine the primary position of each landmark in a picture showing a different gaze, so that I can measure the distance between the two points?

All the best,
Mostafa

@Mostafa_Diab,
That is a promising wording and approach. So you are going to use the cornea as a built-in tape measure to calibrate the image. That is (relatively) simple: drag a line from one side to the other and, using Analyze > Set Scale…, tell ImageJ that the known distance is 11 and the unit is mm.
For that image, the scale is then known. This has to be done for each image individually, however, as the distance to the camera may differ, so a global calibration is not appropriate. Do not tick that box.
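The per-image calibration described above boils down to a simple ratio, which ImageJ applies for you behind the scenes. A minimal Python sketch of the arithmetic, assuming the fixed 11 mm corneal diameter from this thread (function names and coordinates are my own, not an ImageJ API):

```python
import math

def mm_per_pixel(p1, p2, known_mm=11.0):
    """Scale factor from a line drawn across the cornea.

    p1, p2   -- (x, y) pixel coordinates of the two corneal margins
    known_mm -- assumed real-world corneal diameter (11 mm in this thread)
    """
    px = math.dist(p1, p2)  # pixel length of the drawn line
    return known_mm / px

def to_mm(pixel_distance, scale):
    """Convert any pixel distance measured in the SAME image to mm."""
    return pixel_distance * scale

# Hypothetical example: the cornea spans 220 px in this photo
scale = mm_per_pixel((100, 150), (320, 150))  # 11 mm / 220 px = 0.05 mm/px
print(to_mm(44, scale))  # a 44 px landmark displacement -> 2.2 mm
```

Because the scale is computed from the same photo the displacement is measured in, it automatically absorbs differences in camera distance between images, which is exactly why the global checkbox must stay unticked.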

Caveat (based on Google searches): there may be pathological situations where the cornea is not circular and thus most likely not 11 mm. Or am I wrong?

Things to consider during the recording: is the head fixed with respect to the camera?
The cornea is not symmetric if the eye does not look straight into the camera. (Compare the bottom-right image and the one above it: in the bottom one, the patient’s left eye has a smaller distance between the pupil and the white of the eye on the nose side than on the ear side, while in the image above it these two distances are equal.)

Do you consider modelling the eye as a sphere and fitting the measurements onto this sphere, or do you intend to follow the more practical approach of expressing the displacement in mm, taking for granted that the distortions are equal in every subject?
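If you do go the sphere route, converting a calibrated in-image displacement into a rotation angle is a one-line geometry step. A rough sketch, assuming the simplest possible model (a frontal landmark on a sphere viewed face-on, so its projected displacement is r·sin θ) and an eyeball radius of about 12 mm, which is a textbook average rather than a number from this thread:

```python
import math

def rotation_angle_deg(displacement_mm, globe_radius_mm=12.0):
    """Estimate how far the eye rotated from a landmark's projected shift.

    Models the landmark as a point on a sphere seen face-on by the camera,
    so projected displacement = r * sin(theta), hence theta = asin(d / r).
    globe_radius_mm=12.0 is an assumed average eyeball radius.
    """
    return math.degrees(math.asin(displacement_mm / globe_radius_mm))

print(round(rotation_angle_deg(6.0), 1))  # 6 mm on a 12 mm globe -> 30.0 degrees
```

This is deliberately crude (it ignores corneal bulge and the pivot point of the globe), but it shows how a mm measurement and an angle measurement relate, which matters when comparing subjects whose eyes differ in size.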

Many questions that hopefully help to get a clear picture (pun intended).

Oh, what would help is an annotated image of the above image with six panes that indicate your landmarks for the non-ophthalmologist on the forum, like me.

Very rare. However, we can measure real patients individually.

Sure, only eyes will move.

You are right, this is what I am going to measure.
The right eye is normal and moves fully to the right, while the left eye is the artificial eye and does not move fully. This discrepancy in motility between the two eyes in side gaze is my main concern.

I am not sure what is better; it is a relatively new method of assessment.


The big question to me is: how can we locate the site of a landmark as seen in primary position when evaluating the eyes in other gazes? In right gaze, we want to measure how far the lateral limbus moved from its position in primary gaze to its new position in right gaze.

To retrieve the displacement in mm, the right gaze might be measured from the limbus on the left side of both eyes (then we cannot speak of ‘lateral’, unfortunately), as this region is frontmost in the image and shows the least distortion caused by the ball shape of the eye.

Given the up-gaze image, where no disturbing flash reflection near the pupil is in sight, using Analyze > Plot Profile gets you this result:
profile
From this you can easily discern the pupil, which lies at x-positions 23–32 of the yellow selected line on the eye; its centre can then be calculated as the halfway point, position 27.5.

I know that in this image the x-position is not the most relevant; it just illustrates why the location of the pupil is so easy to find in an image without a flash reflection in it.
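The Plot Profile step can also be automated once the intensity values along the line are exported: the pupil shows up as the longest dark run. A minimal sketch with a toy profile mimicking the 23–32 pupil span from above (the threshold value is an arbitrary assumption to be tuned per image):

```python
def pupil_center(profile, threshold):
    """Return the x-centre of the longest below-threshold (dark) run.

    profile   -- list of intensity values along the selected line
    threshold -- intensities below this count as pupil (tune per image)
    """
    runs, start = [], None
    for x, v in enumerate(profile):
        if v < threshold and start is None:
            start = x                      # dark run begins
        elif v >= threshold and start is not None:
            runs.append((start, x - 1))    # dark run ends
            start = None
    if start is not None:                  # run reaches the end of the line
        runs.append((start, len(profile) - 1))
    lo, hi = max(runs, key=lambda r: r[1] - r[0])  # longest dark run = pupil
    return (lo + hi) / 2.0

# Toy profile: bright sclera/iris everywhere except a dark pupil at x 23-32
profile = [200] * 23 + [40] * 10 + [200] * 15
print(pupil_center(profile, threshold=100))  # -> 27.5, as in the thread
```

A flash reflection would split the dark run in two, which is exactly why the reflection-free up-gaze image is the easy case; handling reflections would need a small gap-bridging step on top of this.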

I sincerely wonder if a measurement shouldn’t be back-projected onto a model of the eyeball, allowing the displacement/motility to be expressed in terms of rotational angles. A quick-and-dirty measurement of the medial limbus shows an aspect ratio of 1.1, while in the up-gaze the aspect ratio has changed to 1.49. This is quite some distortion as a consequence of the projection of the sphere of the eyeball onto the flat surface of the camera sensor.
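That aspect ratio can itself serve as a crude angle estimate: a circle (the limbal margin) tilted by θ away from the camera is foreshortened by cos θ in one direction, so the projected ellipse has major/minor ≈ 1/cos θ. A sketch under that assumption only (it ignores corneal bulge and camera perspective, and the function name is my own):

```python
import math

def gaze_angle_from_aspect(aspect_ratio):
    """Estimate the tilt of the limbal circle from its projected ellipse.

    A circle tilted by theta projects to an ellipse with
    major/minor = 1 / cos(theta), so theta = acos(1 / aspect_ratio).
    Rough model only: ignores corneal bulge and perspective.
    """
    return math.degrees(math.acos(1.0 / aspect_ratio))

print(round(gaze_angle_from_aspect(1.1), 1))   # primary position in the thread, ~24.6 deg
print(round(gaze_angle_from_aspect(1.49), 1))  # up-gaze in the thread, ~47.8 deg
```

Plugging in the two measured ratios suggests the limbal plane tilted by roughly 23 degrees between primary position and up-gaze, which gives a feel for how large the 2D distortion really is.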

Sure, only eyes will move.

The six panes of your sample image suggest otherwise to my layman’s eye; we see nostrils in the two topmost-left images, but not in the other images. Even if it were due to the different moments of recording (topmost vs. 2nd and 3rd in the left panes), imho this cannot be explained by different cropping. I guess these images are from an article, as I can see a halftone mask? Original images work best; preferably they are recorded as raw and saved as TIFF or losslessly compressed images. These generally have the extension .tif.
