 # PointRoi input different from output point positions

Hi guys,

I was just wondering why the following Python code, executed in a current Fiji installation,

```python
from ij import IJ
from ij.gui import PointRoi

IJ.run("Blobs (25K)", "")

imp = IJ.getImage()

imp.setRoi(PointRoi([12, 22, 32, 42, 52], [15, 25, 35, 45, 55], 5))

pointRoi = imp.getRoi()

xArr = pointRoi.getXCoordinates()
yArr = pointRoi.getYCoordinates()

for i in range(pointRoi.getNCoordinates()):
    print "position: " + str(xArr[i]) + "/" + str(yArr[i])
```

results in

```
position: 0/0
position: 10/10
position: 20/20
position: 30/30
position: 40/40
```

Am I accessing the point positions in the wrong way?

Thanks,
Robert

If you use:

```python
xArr = pointRoi.getPolygon().xpoints
yArr = pointRoi.getPolygon().ypoints
```

instead, as in:

```python
from ij import IJ
from ij.gui import PointRoi

IJ.run("Blobs (25K)", "")

imp = IJ.getImage()

imp.setRoi(PointRoi([12, 22, 32, 42, 52], [15, 25, 35, 45, 55], 5))

pointRoi = imp.getRoi()

xArr = pointRoi.getPolygon().xpoints
yArr = pointRoi.getPolygon().ypoints

for i in range(pointRoi.getNCoordinates()):
    print "position: " + str(xArr[i]) + "/" + str(yArr[i])
```

Everything is working fine.
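For what it's worth, the output of the first snippet is consistent with `getXCoordinates()`/`getYCoordinates()` returning offsets relative to the ROI's bounding box rather than absolute image coordinates (my reading of the behaviour, not confirmed from the ImageJ source). If that is right, adding `pointRoi.getBounds().x`/`.y` to each value should also recover the absolute positions. A plain-Python sketch of that relationship, using the same point list as above:

```python
# Plain-Python sketch (no ImageJ needed) of the assumed behaviour:
# the obsolete getters seem to return offsets relative to the ROI's
# bounding box, whose origin is the minimum x/y of the points.
xs = [12, 22, 32, 42, 52]
ys = [15, 25, 35, 45, 55]

bx, by = min(xs), min(ys)        # bounding-box origin: (12, 15)

rel_x = [x - bx for x in xs]     # [0, 10, 20, 30, 40] -- the values printed above
rel_y = [y - by for y in ys]     # [0, 10, 20, 30, 40]

# Adding the bounding-box origin back recovers the absolute positions:
abs_x = [x + bx for x in rel_x]  # [12, 22, 32, 42, 52]
abs_y = [y + by for y in rel_y]  # [15, 25, 35, 45, 55]
```

This would explain why both coordinate lists start at 0 and step by 10 in the first snippet's output.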

I only found that the ImageJ API documentation marks `getXCoordinates()` and the related methods as obsolete: