Basic PolynomialTransform.estimate() blows up with memory overflow

I am attempting to use scikit-image's PolynomialTransform to map a bunch of points to their new coordinates. Here is a minimal example:

import numpy as np
import skimage.transform as skt

# 10 coefficients per axis, i.e. an order-3 polynomial in 2D
A = np.random.normal(0, 1, 10)
B = np.random.normal(0, 1, 10)
pt = skt.PolynomialTransform(np.array([A, B]))

# 65536 point pairs to fit
src = np.random.uniform(0, 1, (65536, 2))
dst = np.zeros(src.shape)

pt.estimate(src, dst)

It crashes with

Unable to allocate 128. GiB for an array with shape (131072, 131072) and data type float64

Why does the estimate function have quadratic memory complexity with respect to the number of points to be mapped?

It looks like we do an SVD in this line of estimate:

And it also looks like the standard SVD is extremely memory hungry, see here:
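A quick back-of-the-envelope check (assuming the design matrix has one row per coordinate per point, i.e. 2 × 65536 = 131072 rows): a full SVD also materialises the square matrix of left singular vectors, and 131072 × 131072 × 8 bytes is exactly 128 GiB, which matches the allocation in the error above.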

By my reading, we could potentially fix this by adding a preserve_memory flag to the class init or to the estimate method (or both) that switches to a sparse SVD. Indeed, your example finishes instantly if I use scipy.sparse.linalg.svds in that line! :joy: So maybe sparse should be the default (I'm inclined to say yes). Would you mind creating an issue on github.com/scikit-image/scikit-image/issues, linking to this post, so that we can track the problem?
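To make that concrete, here is a rough sketch of the idea (not the actual scikit-image code, and the helper name below is made up): ask scipy.sparse.linalg.svds for only the smallest singular triplet instead of computing the full decomposition.

from scipy.sparse.linalg import svds

def smallest_right_singular_vector(A):
    # np.linalg.svd with the default full_matrices=True also builds the
    # square (2N, 2N) matrix of left singular vectors, which is what runs
    # out of memory here; svds only computes the k requested triplets.
    _, _, Vt = svds(A, k=1, which='SM')
    return Vt[0]

If estimate really does take the right singular vector belonging to the smallest singular value (the last row of V from the full SVD), something like this should be close to a drop-in replacement, modulo sign and ordering conventions.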

CC @stefanv @grlee77

If you don’t want to wait until the next release of scikit-image to take advantage of this, you can edit that line in your local copy. Let me know if you need help with that!

Hey, thanks for looking at this. I think I picked the wrong function: I did not want to estimate the transformation, I wanted to evaluate an existing transformation on a set of points. Can you hint at which function does that?

EDIT: Never mind, it was right there in the source code. Apparently I just needed to use the __call__() method.


Note that the syntax for __call__() is simply tf(coords), where tf is the estimated transform.
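For anyone landing here later, a minimal self-contained illustration of that (the coefficient values are just random placeholders):

import numpy as np
import skimage.transform as skt

# 10 coefficients per axis, i.e. an order-3 polynomial in 2D
params = np.random.normal(0, 1, (2, 10))
tf = skt.PolynomialTransform(params)

src = np.random.uniform(0, 1, (65536, 2))
dst = tf(src)        # same as tf.__call__(src); no estimate() needed
print(dst.shape)     # (65536, 2)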

@stefanv Thanks, that makes sense.

@jni I submitted a tiny issue as you requested. The problem is no longer relevant for me, since I only ran into it by mistake, but hopefully it is useful to somebody else.


Brilliant, thank you @aleksejs-fomins!