
change RemoveDuplicateVertices and calculateNormals function #15

Merged
tinevez merged 4 commits into imglib:main from ekatrukha:meshProcessing
Jan 20, 2026

Conversation

@ekatrukha
Contributor

@ekatrukha ekatrukha commented Jan 14, 2026

This PR suggests two changes.

First, in the calculate function of the RemoveDuplicateVertices class, add a check for whether any of the triangles become degenerate.
This is done with one extra condition: do any of the triangle's vertex indices become equal?
I.e., the triangle is "collapsed": two or three of its vertices coincide, resulting in zero area.
This can happen when the vertex coordinates are rounded to the specified precision.

The situation can arise, for example, when the characteristic triangle size is smaller than the provided precision value, e.g. after marching cubes, if one rounds with a precision larger than or equal to the pixel size.
These degenerate triangles are no problem for rendering, but they are a big problem if one then tries to calculate normals (which is often the case, since the function's output mesh no longer has normals).

(In principle, another degenerate case is possible, where all three vertices lie on the same line, which also gives zero area; this is not checked right now.)
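For clarity during review, the degeneracy condition boils down to something like the following. This is an illustrative sketch with made-up names, not the actual patch: after rounding, each original vertex is remapped to a deduplicated index, and a triangle is collapsed when at least two of its remapped indices coincide.

```java
public class CollapsedTriangleCheck {
    // A triangle is "collapsed" when at least two of its (remapped) vertex
    // indices coincide after coordinate rounding, giving it zero area.
    static boolean isCollapsed(long v0, long v1, long v2) {
        return v0 == v1 || v0 == v2 || v1 == v2;
    }

    public static void main(String[] args) {
        System.out.println(isCollapsed(3, 3, 7)); // prints true
        System.out.println(isCollapsed(1, 2, 3)); // prints false
    }
}
```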

Second, I suggest changing the calculateNormals function of the Meshes class.
The reasoning behind this is very nicely explained in this post.
As a quick reminder: right now, the function calculates each vertex normal by averaging the normals of the triangles that the vertex is part of.

It is currently a "plain" average, but the result actually looks better if the normals are weighted by the areas of the triangles.
These areas come for free from the cross product, so this also eliminates one extra loop over the triangles (a speed-up! again, see the post).

Additionally, it handles "degenerate" triangles gracefully: their normal length can be zero, and since there is no per-triangle normalization, there is no division by zero that would produce infinities and break the cumulative normal calculation.
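In case it helps, here is a self-contained sketch of the idea (illustrative code, not the actual patch): the raw cross product of a triangle's two edges has length equal to twice the triangle's area, so accumulating unnormalized cross products at each vertex and normalizing only once at the end gives area weighting for free, and a zero-area triangle simply contributes nothing.

```java
public class AreaWeightedNormals {
    // vertices: flat xyz array; tris: flat index triples; returns flat normals.
    static double[] vertexNormals(double[] v, int[] tris) {
        double[] n = new double[v.length];
        for (int t = 0; t < tris.length; t += 3) {
            int a = 3 * tris[t], b = 3 * tris[t + 1], c = 3 * tris[t + 2];
            double e1x = v[b] - v[a], e1y = v[b + 1] - v[a + 1], e1z = v[b + 2] - v[a + 2];
            double e2x = v[c] - v[a], e2y = v[c + 1] - v[a + 1], e2z = v[c + 2] - v[a + 2];
            // Raw cross product: its length is 2 * triangle area, which is
            // exactly the weight we want. A collapsed triangle yields (0,0,0).
            double cx = e1y * e2z - e1z * e2y;
            double cy = e1z * e2x - e1x * e2z;
            double cz = e1x * e2y - e1y * e2x;
            for (int i : new int[] { a, b, c }) {
                n[i] += cx;
                n[i + 1] += cy;
                n[i + 2] += cz;
            }
        }
        // Normalize once per vertex, at the very end.
        for (int i = 0; i < n.length; i += 3) {
            double len = Math.sqrt(n[i] * n[i] + n[i + 1] * n[i + 1] + n[i + 2] * n[i + 2]);
            if (len > 0) {
                n[i] /= len;
                n[i + 1] /= len;
                n[i + 2] /= len;
            }
        }
        return n;
    }

    public static void main(String[] args) {
        // Single triangle in the xy-plane: every vertex normal is (0, 0, 1).
        double[] v = { 0, 0, 0, 1, 0, 0, 0, 1, 0 };
        double[] n = vertexNormals(v, new int[] { 0, 1, 2 });
        System.out.println(n[2] + " " + n[5] + " " + n[8]); // prints 1.0 1.0 1.0
    }
}
```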

To illustrate this, here is a movie with multiple renderings of the example unperfect_sphere.stl mesh.
This mesh has some small triangles that become degenerate after RemoveDuplicateVertices.

20260114_imglib-mesh_illustration.mp4

From left to right

  1. original mesh with multiple duplicate vertices, with normals calculated
    using the current implementation of calculateNormals
  2. on this mesh, we first apply the current RemoveDuplicateVertices (precision 1)
    and then the current calculateNormals. Since some small triangles are degenerate,
    the normals of adjacent triangles are "black", i.e. infinity.
  3. on this mesh, we first apply the proposed RemoveDuplicateVertices (precision 1)
    and then the current calculateNormals. Since the degenerate triangles are gone,
    the normals of the "leftover" majority become "proper". But there are still some
    very small, "badly" oriented triangles that contribute to the vertex normals
    of the larger triangles (since their weight is the same as that of the large ones).
  4. proposed RemoveDuplicateVertices (precision 1) and then proposed calculateNormals.
    The effect of the small triangles on the large ones is negligible, since they are weighted by area.

To reproduce this issue, I used this branch of BVB, since I wanted to see the rendering.
This branch depends on the proposed version of imglib2-mesh.
If you load the provided stl file into this branch, it will render option 1);
if you load it again, option 2), etc.

If this is not enough, I can try to make an alternative, simpler test
somewhere else; let me know.

Cheers,
Eugene

unperfect_sphere.zip

@ekatrukha
Contributor Author

ekatrukha commented Jan 15, 2026

With the last commit, I've trimmed the output triangles of RemoveDuplicateVertices to only the "good" ones. It was already filtering out "collapsed" triangles, but it was still writing "empty" triangles to the output.

This already makes case 3) above look good (like case 4)).

But I would keep the changes to calculateNormals, since it is faster, nicer looking, and more robust with respect to "dirty" mesh data.
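For reference, the trimming amounts to something like this (an illustrative sketch with made-up names, not the actual commit): instead of leaving empty entries in the output, only triangles with three distinct indices are copied over.

```java
import java.util.ArrayList;
import java.util.List;

public class TrimTriangles {
    // Copy only non-collapsed triangles (three distinct vertex indices)
    // to the output, so no "empty" triangles remain.
    static List<int[]> keepGood(List<int[]> tris) {
        List<int[]> good = new ArrayList<>();
        for (int[] t : tris)
            if (t[0] != t[1] && t[0] != t[2] && t[1] != t[2])
                good.add(t);
        return good;
    }

    public static void main(String[] args) {
        List<int[]> tris = new ArrayList<>();
        tris.add(new int[] { 0, 1, 2 });
        tris.add(new int[] { 3, 3, 4 }); // collapsed after rounding
        System.out.println(keepGood(tris).size()); // prints 1
    }
}
```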

@tinevez tinevez merged commit 76993a6 into imglib:main Jan 20, 2026
1 check passed
@ekatrukha ekatrukha deleted the meshProcessing branch January 20, 2026 11:14
