Cloud matching of 2D points

Hello,
in the minimal example below, I am

  1. creating a cloud of random 2D points
  2. creating a 2D rigid transform
  3. applying the 2D transform to the source points
  4. finding point matches between the source and target points with the iterative closest point (ICP) algorithm from mpicbg.models.AbstractModel
  5. filtering with RANSAC and getting the transform from the model

but the found transformation does not match my initial transformation. Does somebody know what I am doing wrong?

Thanks for your help


from mpicbg.models import RigidModel2D, AffineModel2D, Point
from ij import IJ  # needed for IJ.log in the except branch below
from java.util import HashSet
from java.lang import Math
from java.awt.geom import AffineTransform
from jarray import array

def pointListToList(pointList): # [[1,2],[5,8]] to [1,2,5,8]  
    l = array(2 * len(pointList) * [0], 'd')
    for id, point in enumerate(pointList):
        l[2*id] = point[0]
        l[2*id+1] = point[1]
    return l
    
def listToPointList(l): # [1,2,5,8] to [[1,2],[5,8]]  
    pointList = []
    for i in range(len(l)/2):
        pointList.append([l[2*i], l[2*i+1]])
    return pointList

####### 1. creating a random cloud of 2D points
sourcePoints = [ [int(100*Math.random()), int(100*Math.random())] for i in range(30)]
sourceList = pointListToList(sourcePoints)
####### 2. creating a 2D transform
myTrans = AffineTransform.getTranslateInstance(200, 200)  # getTranslateInstance is static
####### 3. applying the 2D transform to the source to create the target
targetList = array(2 * len(sourcePoints) * [0], 'd')
myTrans.transform(sourceList, 0, targetList, 0, len(sourcePoints))

print 'sourceList', sourceList
print 'targetList', targetList

targetPoints = listToPointList(targetList)
sourcePoints = map(Point, sourcePoints)
targetPoints = map(Point, targetPoints)

####### 4. running the iterative closest point algorithm from RigidModel2D
model = RigidModel2D()
pointMatches = model.icp(sourcePoints, targetPoints)

for pointMatch in pointMatches:
    print pointMatch.p1.getL(), pointMatch.p2.getL() 

####### 5. Filtering with RANSAC and getting the transform
inliers = HashSet()
aff = None  # stays None if no model is found
try:
    modelFound = model.filterRansac(pointMatches, inliers, 1000, 10, 0)
    aff = model.createAffine()
except Exception, e:
    IJ.log('Model not found: ' + str(e))

print 'Found transformation', aff
print 'Original transformation', myTrans
print 'Number of inliers', len(inliers)

Hello @ThomasT

I think the problem is the matching of your points, because iterative closest point (ICP) takes the closest point in the target cloud as a match. If you have two points at a distance of 10 and you move the points by 10, ICP will match the first point with the second point.

In your example each source point gets moved by a certain distance, but the order of the points in the target cloud is the same as in the source cloud. If you skip the matching step, you should find the transformation with RANSAC.
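
For instance, a minimal sketch (assuming the sourcePoints/targetPoints lists of mpicbg Point objects from your script, still in matching order): build the matches by index and fit the model directly, without ICP.

from mpicbg.models import RigidModel2D, PointMatch
from java.util import ArrayList

# index-based matching: point i in the source corresponds to point i in the target
matches = ArrayList()
for i in range(len(sourcePoints)):
    matches.add(PointMatch(sourcePoints[i], targetPoints[i]))

model = RigidModel2D()
model.fit(matches)  # direct least-squares fit; RANSAC is only needed when there are outliers
print 'Recovered transformation', model.createAffine()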


A rather general question about this field, since I am trying to develop a little object recognition application in Matlab and I am a little new to the computer vision field:
which other methods besides RANSAC work well to estimate a geometric transformation?

And about that:

I don't fully get the point: when can I skip the matching phase?

Thank you very much,
Emanuele

RANSAC just tries to find the best model for a dataset that contains outliers.
You could use iterative closest point, but only if the correspondences are already really close. Most of the time the biggest problem is finding the matches in the first place. To find matches I would suggest SIFT.
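
For example, a rough sketch with mpicbg's SIFT implementation (ip1 and ip2 are assumed to be the ImageJ ImageProcessors of your two images; all SIFT parameters left at their defaults):

from java.util import ArrayList
from mpicbg.imagefeatures import FloatArray2DSIFT
from mpicbg.ij import SIFT

param = FloatArray2DSIFT.Param()
sift = SIFT(FloatArray2DSIFT(param))

fs1 = ArrayList()
fs2 = ArrayList()
sift.extractFeatures(ip1, fs1)  # ip1, ip2: ImageProcessors (assumed given)
sift.extractFeatures(ip2, fs2)

# candidate matches by the ratio of distances between the two nearest descriptors
candidates = FloatArray2DSIFT.createMatches(fs1, fs2, 0.92)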

You can skip the matching phase because you have an ordered source point cloud: if you move every point from the source cloud by a certain distance, point 1 from the source cloud corresponds to point 1 in the target cloud. You already know the perfect matching.


Thank you very much for your reply.
At the moment I am using SURF, but I can switch to/try SIFT too.
With RANSAC I am currently having this problem:
I collect many SURF features of an object (i.e. a remote control) from different photos of the object itself (more or less 70), and I use all these features (and their points) to match the object in a scene.
That way I find many matching points that look quite good.
Now I would like to use some estimator (and I read on the net that RANSAC does that) to remove the outliers; to do that I try to find an affine transform with RANSAC for each picture.
Here is the weird behavior: most of the time all the points collapse into a single point (or rather, into many points with the same coordinates); I really don't understand the reason.
I attach a picture of that, to be clearer I hope.

If I use fewer matching points, let's say 15 instead of 50, RANSAC doesn't collapse into a single point.

Thank you for your attention!
Emanuele Martini

Thanks @tibuch. I agree that there is a problem with the matching. Maybe I should have stated this, but I created this minimal example to debug a 2D bead registration pipeline; in a real case the clouds will not be ordered. I did not put it in the minimal example, but shuffling the target cloud or using rotations + translations does not change the result.

I agree that matching with features such as SIFT is robust, but it does not work well to register beads (small bright spots in 2D images, in my case). Are you sure that ICP should not be able to match two random clouds, regardless of whether "the correspondences are really close" or not? SPIM registration is probably the best reference in the ImageJ framework, and it uses ICP. I thought that I had extracted the basic workflow from their ICP approach, but apparently not. I should have a closer look at it again. (It is, for example, not clear to me yet whether mpicbg.models.AbstractModel.icp performs all iterations until convergence or whether I should iterate myself.)
Thanks for your help


I think SURF works as well.

RANSAC should not remove any points. It only finds the transformation between the point clouds with the most inliers.

As far as I can see, you don't find many correct matches in your example images. Most of them match to some wrong points, and it is often a many-to-one matching. As a rule of thumb, RANSAC only works well if the fraction of outliers is below 50%.

Another problem is that you try to find an affine transformation. An affine transformation maps parallel lines to parallel lines; I guess you actually want to find a homography.
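
In the mpicbg framework used in the scripts above, a sketch of fitting a homography instead would look like this (assuming pointMatches already holds your candidate matches):

from mpicbg.models import HomographyModel2D
from java.util import ArrayList

model = HomographyModel2D()
inliers = ArrayList()
# 1000 RANSAC iterations, 5 px max displacement for an inlier, min inlier ratio 0.1
modelFound = model.filterRansac(pointMatches, inliers, 1000, 5.0, 0.1)
if modelFound:
    print 'Homography found with', inliers.size(), 'inliers'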

I can't tell by eye, but you probably also have some lens distortion effects, which makes it even harder to find a perfect match.


Thank you very much for your help and suggestions,
I will try and let you know.
Emanuele

Yes, SIFT doesn't help if you do a bead-based registration.

The Interest Point Registration from SPIM registration is based on three steps:

  • Apply calibration (difference in xy and z resolution per pixel)
  • Affine bead-based registration
  • ICP registration based on nuclei

After the calibration step they apply the bead-based registration to find the overall model, and after that they use ICP to find an even better model.


Thanks @tibuch. So to summarize, if I want to find pair matches between points of two 2D clouds of beads such as in these pictures (30 deg rotation between them):

  1. Using ICP directly on the centers of mass of the beads is not going to work (I am still surprised)?
  2. I could use the bead-based registration implemented in the SPIM registration plugin. I would then first create the local geometric descriptors, followed by the kd-tree matching? I have the impression that it is a bit overkill, because if I understand correctly the need for the local matching descriptors came from the difficulty of matching 3D clouds acquired at different view angles.
    In this approach it seems to me that I would first need to find a way to convert my simple 2D cloud images to the spimdata format to use the functions from the spim interestpointregistration package.

Thanks for your help

The matching should be according to the numbers, but ICP will find the red-line matching.

You have a different view angle because your camera is rotated. Maybe you can find a 2D bead descriptor, or you have some knowledge about your camera/images and can apply an initial transformation followed by ICP.
ICP will work if the distance between matching beads is smaller than the distance between two beads in your image.
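
A sketch of that last suggestion (assuming a rough initial guess initialTrans, a java.awt.geom.AffineTransform, is known from your setup, and sourcePoints/targetPoints are lists of [x, y] pairs as in the script above):

from java.awt.geom import AffineTransform
from jarray import array
from mpicbg.models import RigidModel2D, Point

# pre-align the source cloud so that the remaining displacements are small
prealigned = []
for p in sourcePoints:
    src = array([p[0], p[1]], 'd')
    dst = array([0, 0], 'd')
    initialTrans.transform(src, 0, dst, 0, 1)  # apply the coarse guess
    prealigned.append(Point(dst))

# with the clouds roughly aligned, ICP only has to resolve a small residual
model = RigidModel2D()
pointMatches = model.icp(prealigned, map(Point, targetPoints))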


You can also use the Descriptor-based registration plugin in Fiji.

If you already have two point clouds, you can populate an ArrayList<DifferenceOfGaussianPeak<FloatType>> by using the DifferenceOfGaussianPeak constructor with the respective coordinates.
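
For example (a sketch, assuming points is a list of [x, y] pairs; only the positions matter, so the intensity is a dummy value):

from java.util import ArrayList
from mpicbg.imglib.algorithm.scalespace import DifferenceOfGaussianPeak
from mpicbg.imglib.algorithm.scalespace.DifferenceOfGaussian import SpecialPoint
from mpicbg.imglib.type.numeric.real import FloatType

peaks = ArrayList()
for x, y in points:
    # position as int[], dummy intensity, and the peak type (maximum)
    peaks.add(DifferenceOfGaussianPeak([int(x), int(y)], FloatType(1), SpecialPoint.MAX))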

Or you use the static methods of the Matching class:

Matching.extractCandidates()

followed by:

Matching.descriptorMatching()

The actual processing is done in the protected method pairwiseMatching in Matching.java:


Thanks @imagejan and @tibuch. That is becoming clearer. To finish the minimal example I am now populating the DoGs myself to feed them to Matching().descriptorMatching. I am getting closer, but the following piece of code does not work (sorry, I still have not found out how to embed code properly, can someone let me know?). I am getting the errors:

AttributeError: 'process.Matching' object has no attribute 'getCorrespondenceCandidates'
or 
AttributeError: 'process.Matching' object has no attribute 'pairwiseMatching'
or
comparePairs = m.descriptorMatching(dogpContainer, 2, DescriptorParameters(), 0)
at process.ComparePair.<init>(ComparePair.java:18)
at process.Matching.getComparePairs(Matching.java:642)
at process.Matching.descriptorMatching(Matching.java:440)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)


java.lang.NullPointerException: java.lang.NullPointerException

in a freshly downloaded Fiji (Win8, 64-bit). Am I missing something? Thanks.

from mpicbg.models import RigidModel2D, AffineModel2D, Point
from java.util import HashSet
from java.lang import Math
from java.awt.geom import AffineTransform
from jarray import array
import random
from mpicbg.imglib.algorithm.scalespace import DifferenceOfGaussianPeak
from mpicbg.imglib.algorithm.scalespace.DifferenceOfGaussian import SpecialPoint
from process import Matching
from java.util import ArrayList
from mpicbg.imglib.type.numeric.integer import IntType
from plugin import DescriptorParameters
from mpicbg.pointdescriptor.matcher import SimpleMatcher

def pointListToList(pointList): # [[1,2],[5,8]] to [1,2,5,8]  
	l = array(2 * len(pointList) * [0], 'd')
	for id, point in enumerate(pointList):
		l[2*id] = point[0]
		l[2*id+1] = point[1]
	return l
	
def listToPointList(l): # [1,2,5,8] to [[1,2],[5,8]]  
	pointList = []
	for i in range(len(l)/2):
		pointList.append([l[2*i], l[2*i+1]])
	return pointList

def pointsToDOGPs(points):
	DOGPs = ArrayList()
	for point in points:
		DOGPs.add(DifferenceOfGaussianPeak( [int(point[0]), int(point[1]) ] , IntType(255), SpecialPoint.MAX ))
	return DOGPs

####### 1. creating a random cloud of 2D points
sourcePoints = [ [int(100*Math.random()), int(100*Math.random())] for i in range(30)]
sourceList = pointListToList(sourcePoints)
####### 2. creating a 2D transform
myTrans = AffineTransform.getTranslateInstance(30, 40)
####### 3. applying the 2D transform to the source
targetList = array(2 * len(sourcePoints) * [0], 'd')
myTrans.transform(sourceList, 0, targetList, 0, len(sourcePoints))

targetPoints = listToPointList(targetList)
random.shuffle(targetPoints)

print 'sourcePoints', sourcePoints
print 'targetPoints', targetPoints

sourceDOGPs = pointsToDOGPs(sourcePoints)
targetDOGPs = pointsToDOGPs(targetPoints)

dogpContainer = ArrayList()
dogpContainer.add(sourceDOGPs)
dogpContainer.add(targetDOGPs)

dp = DescriptorParameters()
'''
print dp.brightestNPoints
print dp.channel1 
print dp.channel2 
print dp.correspondenceDirectory
print dp.dimensionality 
print dp.directory 
print dp.filterRANSAC
print dp.fixFirstTile 
print dp.fuse 
print dp.globalOpt 
print dp.inliers 
print dp.iterations 
print dp.lambda 
print dp.localization 
print dp.lookForMaxima 
print dp.lookForMinima 
print dp.max 
print dp.maxIterations
print dp.maxTrust
print dp.min 
print dp.minInlierFactor
print dp.minInlierRatio
print dp.minMaxType
print dp.minSimilarity
print dp.model
print dp.model1
print dp.model2
print dp.numNeighbors 
print dp.printAllSimilarities 
print dp.range 
print dp.ransacIterations
print dp.ransacThreshold 
print dp.reApply 
print dp.redundancy 
print dp.region 
print dp.regularize 
print dp.roi1 
print dp.roi2 
print dp.setPointsRois 
print dp.sigma 
print dp.sigma1 
print dp.sigma2 
print dp.significance 
print dp.silent 
print dp.similarOrientation 
print dp.storeModels 
print dp.storePoints 
print dp.threshold 
'''
m = Matching()

comparePairs = m.descriptorMatching(dogpContainer, 2, DescriptorParameters(), 0)
print 'comparePairs', comparePairs

# m.getCorrespondenceCandidates(1, SimpleMatcher(4), sourceDOGPs, targetDOGPs, RigidModel2D(), 2, 0, 0, 'explanation')

# m.pairwiseMatching(ArrayList<PointMatch> finalInliers, sourceDOGPs, targetDOGPs, 0, 0, DescriptorParameters(), 'explanation')

#print 'Found transformation', aff
print 'Original transformation', myTrans
#print 'Number of inliers', len(inliers)

descriptorMatching is a static method, so you should call it on the Matching class, not on an object instance:

comparePairs = Matching.descriptorMatching(dogpContainer, 2, DescriptorParameters(), 0)

Thanks. I agree, it is a static method, but I get the same error with Matching or Matching() (thanks for the python highlighting edit, I have seen the ''' python) …

AttributeError: type object 'process.Matching' has no attribute 'getCorrespondenceCandidates'
or
AttributeError: 'process.Matching' object has no attribute 'getCorrespondenceCandidates'
or
AttributeError: type object 'process.Matching' has no attribute 'pairwiseMatching'
or    
AttributeError: 'process.Matching' object has no attribute 'pairwiseMatching'

or

comparePairs = Matching.descriptorMatching(dogpContainer, 2, DescriptorParameters(), 0)
at process.ComparePair.<init>(ComparePair.java:18)
at process.Matching.getComparePairs(Matching.java:642)
at process.Matching.descriptorMatching(Matching.java:440)
at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
java.lang.NullPointerException: java.lang.NullPointerException

or

#comparePairs = Matching().descriptorMatching(dogpContainer, 2, DescriptorParameters(), 0)
at process.ComparePair.<init>(ComparePair.java:18)
at process.Matching.getComparePairs(Matching.java:642)
at process.Matching.descriptorMatching(Matching.java:440)
at sun.reflect.GeneratedMethodAccessor20.invoke(Unknown Source)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:497)
java.lang.NullPointerException: java.lang.NullPointerException

Well, and you should use a sensible set of DescriptorParameters, i.e. not just a freshly initialized object, but something along these lines:

dp = DescriptorParameters()
dp.model = RigidModel2D()
dp.dimensionality = 2
dp.numNeighbors = 5

comparePairs = Matching.descriptorMatching(dogpContainer, 2, dp, 0)
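
Each resulting ComparePair then carries the fitted model and the inlier matches, e.g. (with dp.model set to a RigidModel2D as above):

for pair in comparePairs:
    print 'model:', pair.model.createAffine(), 'inliers:', len(pair.inliers)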

Sorry, now it is clearer on my side:

#### 1.
I now use a more sensible set of DescriptorParameters, so I get in the log:

0<->1: Remaining inliers after RANSAC (RigidModel2D): 297 of 300 with average error 0.010000000000000009.

#### 2.

I overlooked that getCorrespondenceCandidates and pairwiseMatching are protected …

#### 3.
The following code now works. The found transform is less accurate when I use rotation angles different from pi/2, probably because the point coordinates are truncated to integers (int()) when I populate the DoG peaks, so rotated points lose their sub-pixel accuracy.
There are many DescriptorParameters; I will also have to explore which ones to adjust.
Thanks @imagejan and @tibuch for your help.

from mpicbg.models import RigidModel2D, AffineModel2D, Point
from java.util import HashSet
from java.lang import Math
from java.awt.geom import AffineTransform
from jarray import array
import random
from mpicbg.imglib.algorithm.scalespace import DifferenceOfGaussianPeak
from mpicbg.imglib.algorithm.scalespace.DifferenceOfGaussian import SpecialPoint
from process import Matching
from java.util import ArrayList
from mpicbg.imglib.type.numeric.integer import IntType
from plugin import DescriptorParameters

def pointListToList(pointList): # [[1,2],[5,8]] to [1,2,5,8]  
	l = array(2 * len(pointList) * [0], 'd')
	for id, point in enumerate(pointList):
		l[2*id] = point[0]
		l[2*id+1] = point[1]
	return l
	
def listToPointList(l): # [1,2,5,8] to [[1,2],[5,8]]  
	pointList = []
	for i in range(len(l)/2):
		pointList.append([l[2*i], l[2*i+1]])
	return pointList

def pointsToDOGPs(points):
	DOGPs = ArrayList()
	for point in points:
		DOGPs.add(DifferenceOfGaussianPeak( [int(point[0]), int(point[1]) ] , IntType(255), SpecialPoint.MAX ))
	return DOGPs

####### 1. creating a random cloud of 2D points
sourcePoints = [ [int(100*Math.random()), int(100*Math.random())] for i in range(300)]
sourceList = pointListToList(sourcePoints)
####### 2. creating a 2D transform
myTrans = AffineTransform.getTranslateInstance(300, 400)
myTrans.concatenate(AffineTransform.getRotateInstance(Math.PI/6))
# myTrans = AffineTransform.getRotateInstance(Math.PI/4)
####### 3. applying the 2D transform to the source
targetList = array(2 * len(sourcePoints) * [0], 'd')
myTrans.transform(sourceList, 0, targetList, 0, len(sourcePoints))

targetPoints = listToPointList(targetList)
random.shuffle(targetPoints)

# print 'sourcePoints', sourcePoints
# print 'targetPoints', targetPoints
####### 4. populating the difference of gaussian peaks
sourceDOGPs = pointsToDOGPs(sourcePoints)
targetDOGPs = pointsToDOGPs(targetPoints)

dogpContainer = ArrayList()
dogpContainer.add(sourceDOGPs)
dogpContainer.add(targetDOGPs)

####### 5. adjusting some matching parameters, probably not optimal yet
dp = DescriptorParameters()
dp.model = RigidModel2D()
dp.dimensionality = 2
dp.numNeighbors = 3
dp.brightestNPoints = 3
dp.maxIterations = 1000
dp.iterations = 1000
dp.lookForMaxima = True 
dp.max = 255
dp.ransacThreshold = 1000

'''
print 'dp.brightestNPoints', dp.brightestNPoints
print 'dp.channel1',  dp.channel1 
print 'dp.channel2',  dp.channel2 
print 'correspondenceDirectory', dp.correspondenceDirectory
print ' dimensionality ', dp.dimensionality 
print ' directory ', dp.directory 
print ' filterRANSAC ', dp.filterRANSAC
print ' fixFirstTile  ', dp.fixFirstTile 
print ' fuse ', dp.fuse 
print ' globalOpt ', dp.globalOpt 
print ' inliers ', dp.inliers 
print ' iterations ', dp.iterations 
print ' lambda ', dp.lambda 
print ' localization ', dp.localization 
print ' lookForMaxima ', dp.lookForMaxima 
print ' lookForMinima ', dp.lookForMinima 
print ' max ', dp.max 
print ' dp.maxIterations', dp.maxIterations
print ' maxTrust ', dp.maxTrust
print ' min ', dp.min 
print ' minInlierFactor ', dp.minInlierFactor
print ' minInlierRatio ', dp.minInlierRatio
print ' minMaxType ', dp.minMaxType
print ' minSimilarity ', dp.minSimilarity
print ' model ', dp.model
print ' model1 ', dp.model1
print ' model2 ', dp.model2
print ' numNeighbors ', dp.numNeighbors 
print  'AllSimilarities ', dp.printAllSimilarities 
print ' range ', dp.range 
print ' ransacIterations ', dp.ransacIterations
print ' ransacThreshold ', dp.ransacThreshold 
print ' reApply ', dp.reApply 
print ' redundancy ', dp.redundancy 
print ' region ', dp.region 
print ' regularize ', dp.regularize 
print ' roi1 ', dp.roi1 
print ' roi2 ', dp.roi2 
print ' setPointsRois ', dp.setPointsRois 
print ' sigma ', dp.sigma 
print ' sigma1 ', dp.sigma1 
print ' sigma2 ', dp.sigma2 
print ' significance ', dp.significance 
print ' silent ', dp.silent 
print ' similarOrientation ', dp.similarOrientation 
print ' storeModels ', dp.storeModels 
print ' storePoints ', dp.storePoints
print ' threshold ', dp.threshold 
'''
####### 6. performing matching
comparePairs = Matching.descriptorMatching(dogpContainer, 2, dp, 0)

print 'Found transformation', comparePairs[0].model.createAffine()
print 'Original transformation', myTrans
print 'Number of inliers', (str(len(comparePairs[0].inliers)) + '/' + str(len(sourcePoints)))

A post was split to a new topic: Looking for Iterative Closest Point code to align 2D data