Why use a Signed Distance Function (SDF) as the Level Set Function?

In the level set segmentation algorithm, why is a Signed Distance Function (SDF) used as the level set function?

What’s more, why does the level set function deviate from a signed distance function during evolution?

Hi @SimyHsu, and welcome to the forum.

A very nice feature of signed distance functions is that the magnitude of the gradient is always 1.
When you evolve the level set, there are usually terms that need the normalized gradient, but if
you know the gradient magnitude is 1, you can skip the normalization step.
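As a quick numerical illustration (my own sketch in NumPy, not from the original post): for the SDF of a circle, phi(x, y) = sqrt(x² + y²) − R, central differences give |∇phi| ≈ 1 everywhere away from the kink at the center, so the normalization ∇phi / |∇phi| would be a no-op.

```python
import numpy as np

# SDF of a circle of radius R on a uniform grid.
h = 0.02
x = np.arange(-2, 2 + h, h)
X, Y = np.meshgrid(x, x, indexing="ij")
R = 1.0
phi = np.sqrt(X**2 + Y**2) - R  # signed distance to the circle

# Central-difference gradient and its magnitude.
gx, gy = np.gradient(phi, h, h)
grad_mag = np.sqrt(gx**2 + gy**2)

# Away from the non-differentiable point at the origin (and the grid
# boundary), |grad phi| equals 1 up to discretization error, so the
# normalization step grad / |grad| changes essentially nothing.
r = np.sqrt(X**2 + Y**2)
mask = (r > 0.25) & (r < 1.8)
print(np.abs(grad_mag[mask] - 1).max())  # small: below 1e-2
```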

I’m not sure what your background is, but there is a discussion of some reasons the deviation happens in this paper by Gomes & Faugeras (Section 2).

To summarize their first example, the PDE that evolves the level set applies everywhere, not just at the zero level set that represents the boundary. This means that every level set is being “squeezed” toward the zero level set, yielding a function that is no longer an SDF.
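You can watch this happen numerically. The sketch below (my own illustration, not code from the paper) evolves the circle SDF under mean curvature flow, phi_t = κ|∇phi|. Every level set is a circle shrinking at rate 1/radius, so inner and outer level sets move at different speeds, the spacing between them changes, and |∇phi| drifts away from 1.

```python
import numpy as np

# Start from the SDF of a circle of radius 1.
h = 0.02
x = np.arange(-2, 2 + h, h)
X, Y = np.meshgrid(x, x, indexing="ij")
phi = np.sqrt(X**2 + Y**2) - 1.0

def grad_mag_deviation(phi):
    """Max deviation of |grad phi| from 1 in an annulus around the contour."""
    gx, gy = np.gradient(phi, h, h)
    gm = np.sqrt(gx**2 + gy**2)
    r = np.sqrt(X**2 + Y**2)
    mask = (r > 0.5) & (r < 1.5)
    return np.abs(gm[mask] - 1).max()

before = grad_mag_deviation(phi)

# Explicit time stepping of phi_t = kappa * |grad phi| (mean curvature flow);
# eps guards the division where the gradient vanishes.
dt, eps = 1e-4, 1e-12
for _ in range(100):  # evolve to t = 0.01
    gx, gy = np.gradient(phi, h, h)
    gxx = np.gradient(gx, h, axis=0)
    gxy = np.gradient(gx, h, axis=1)
    gyy = np.gradient(gy, h, axis=1)
    gm2 = gx**2 + gy**2
    kappa = (gxx * gy**2 - 2 * gx * gy * gxy + gyy * gx**2) / (gm2**1.5 + eps)
    phi = phi + dt * kappa * np.sqrt(gm2)

after = grad_mag_deviation(phi)
print(before, after)  # the deviation grows by more than an order of magnitude
```

This drift is exactly why practical implementations periodically reinitialize the level set function back to a signed distance function (or add a penalty term that keeps |∇phi| near 1).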

Please follow up if you want to discuss more,