I have a cloud of 3D points (x, y, z) and a set of 2D points (x, y). I need to match these points because I want to find correspondences between the 2D points (which come from an image) and the 3D model (which comes from CAD). Can someone help me? I hope my problem is clear.
Maybe you can specify what the 2D image is: a projection of the 3D model, but taken in real life? A subset of the 3D model? A cross-sectional plane through the 3D space? Sample images showing the input and desired output would be even more illustrative.
The 2D image is a simple photograph of a real target (in my case a satellite). So on one side I extracted features from these 2D images (x, y coordinates); on the other side I built a CAD model and extracted the points (X, Y, Z coordinates). Now I have to find correspondences, i.e. the matching between 2D points and 3D points, for my future analysis. The problem is finding an algorithm for 2D/3D matching. Thanks.
PS: I uploaded my 2D image.
I do not know the answer to your question, but I guess that you need to transform the CAD point cloud to match the pose of the object in the 2D photo.
I do not think you can do that without fiducial points in both sets to know exactly what transform you need to apply to the 3D data.
This sounds more like a computer graphics problem than an image processing problem to me. CG is largely about projecting virtual objects down onto a computer screen.
After a 3D manipulation (rotation, translation), project the 3D data onto a 2D plane and calculate the least-squares error between the calculated and observed 2D data sets (image points). Then iterate over the angles and distances of the 3D manipulation matrix and find the ‘best’ solution.
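A minimal numpy sketch of that iterate–project–score loop. The Euler-angle parameterization, the pinhole camera with unit focal length, and the synthetic points below are all my assumptions, not anything from the original posts; a real implementation would use the actual camera intrinsics and a proper optimizer.

```python
import numpy as np

def rotation_matrix(yaw, pitch, roll):
    """Rotation matrix from Z-Y-X Euler angles (radians) -- an assumed parameterization."""
    cz, sz = np.cos(yaw), np.sin(yaw)
    cy, sy = np.cos(pitch), np.sin(pitch)
    cx, sx = np.cos(roll), np.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx

def project(points_3d, R, t, focal=1.0):
    """Rigidly transform the model points, then apply a pinhole projection."""
    cam = points_3d @ R.T + t                # (N, 3) points in the camera frame
    return focal * cam[:, :2] / cam[:, 2:3]  # perspective divide -> (N, 2)

def reprojection_error(params, points_3d, points_2d, focal=1.0):
    """Sum of squared distances between projected model points and observed image points."""
    yaw, pitch, roll, tx, ty, tz = params
    proj = project(points_3d, rotation_matrix(yaw, pitch, roll),
                   np.array([tx, ty, tz]), focal)
    return np.sum((proj - points_2d) ** 2)

# Synthetic demo (assumed data, not the poster's): the true pose scores
# essentially zero error, a perturbed pose scores higher.
model = np.array([[0., 0., 0.], [1., 0., 0.], [0., 1., 0.],
                  [0., 0., 1.], [1., 1., 1.]])
true_pose = (0.3, -0.2, 0.1, 0.5, -0.3, 5.0)
observed = project(model, rotation_matrix(*true_pose[:3]), np.array(true_pose[3:]))
print(reprojection_error(true_pose, model, observed))             # ~0
print(reprojection_error((0, 0, 0, 0, 0, 5.0), model, observed))  # larger
```

In practice one would replace the brute-force iteration over angles and distances with `scipy.optimize.least_squares`, or, if even a handful of 2D–3D correspondences are already known, solve for the pose directly with a PnP solver such as OpenCV's `solvePnP`.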
Image processing could assist you in measuring how well areas in the projected and observed images correspond.