SALD: Sign Agnostic Learning with Derivatives

Notice: This research summary and analysis were automatically generated using AI technology. For full accuracy, please refer to the original arXiv source.

💡 Research Summary

This paper introduces SALD (Sign‑Agnostic Learning with Derivatives), a method for learning implicit neural representations of 3‑D shapes directly from raw, unoriented data such as point clouds, triangle soups, or non‑manifold meshes. The approach builds on the earlier Sign‑Agnostic Learning (SAL) framework, which fits a neural network f(x;θ) to the unsigned distance function h(x)=min_{y∈X}‖x−y‖ using a loss that is invariant to the sign of the network output. SAL can recover a signed distance field from unsigned supervision, but because it constrains only function values, it may require many samples to pin down the network uniquely.
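The unsigned distance h above is the only supervision SAL and SALD need from the raw data. A minimal NumPy sketch of h for a toy point cloud (not the authors' code; the function name is ours):

```python
import numpy as np

def unsigned_distance(x, points):
    """h(x) = min_{y in X} ||x - y||: distance from query x to the raw samples X."""
    return np.min(np.linalg.norm(points - x, axis=1))

# Toy "raw scan": the four corners of the unit square, as an unoriented point cloud.
X = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])

print(unsigned_distance(np.array([0.5, 0.0]), X))  # 0.5: midway between two samples
print(unsigned_distance(np.array([0.0, 0.0]), X))  # 0.0: exactly on a sample point
```

Note that h carries no inside/outside information, which is why the learning loss must be agnostic to the sign of f.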

SALD extends this idea by also incorporating gradient information of the unsigned distance field. The authors define an unsigned similarity measure τ that, for scalars, is τ(a,b) = | |a| − b | and, for vectors, τ(a,b) = min{‖a−b‖, ‖a+b‖}. The full SALD loss is:

loss(θ) = E_{x∼D} τ(f(x;θ), h(x)) + λ E_{x∼D′} τ(∇_x f(x;θ), ∇_x h(x))

where D and D′ are probability distributions over space and λ > 0 balances the value term against the derivative term. Because τ compares its arguments only up to sign, f and −f attain the same loss, so the network can settle on a consistent signed distance function even though the supervision is unsigned.
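The sign-agnostic behaviour of τ is easy to verify numerically. A short sketch under our own naming (tau_scalar, tau_vector, and the Monte-Carlo estimator sald_loss are illustrative, not the paper's code):

```python
import numpy as np

def tau_scalar(a, b):
    """Unsigned similarity for scalars: tau(a, b) = | |a| - b |."""
    return abs(abs(a) - b)

def tau_vector(a, b):
    """Unsigned similarity for vectors: tau(a, b) = min(||a - b||, ||a + b||)."""
    return min(np.linalg.norm(a - b), np.linalg.norm(a + b))

def sald_loss(f_vals, f_grads, h_vals, h_grads, lam=0.1):
    """Monte-Carlo estimate of the SALD loss over sampled points."""
    value_term = np.mean([tau_scalar(a, b) for a, b in zip(f_vals, h_vals)])
    grad_term = np.mean([tau_vector(a, b) for a, b in zip(f_grads, h_grads)])
    return value_term + lam * grad_term

# Sign-agnosticism: f and -f score identically against the unsigned targets.
print(tau_scalar(+0.7, 0.7))            # 0.0
print(tau_scalar(-0.7, 0.7))            # 0.0 -- same loss with the sign flipped

g = np.array([0.6, 0.8])                # a unit gradient of h (defined a.e.)
print(tau_vector(-g, g))                # 0.0 -- gradients also match up to sign
```

The gradient term is what distinguishes SALD from SAL: it constrains the direction of ∇f (up to sign) and not just the magnitude of f, so fewer samples are needed to determine the network.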

