What is it about?

ZeroGrads is a new approach that helps solve problems in computer graphics where traditional methods can't be used because gradients are missing or undefined. Gradients express the direction of steepest improvement and are essential for optimization; when they aren't available, we can often replace the objective with an easier-to-handle function, known as a "surrogate." ZeroGrads automates this process by using a neural network to approximate the objective function, allowing it to optimize complex graphics problems. At each step of the optimization, it learns only the locally relevant part of the objective, around the current parameters, without needing pre-made datasets or pre-trained models. ZeroGrads works well for otherwise hard-to-optimize tasks such as rendering scenes, generating procedural models, or controlling physics-driven animations.
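
To make the idea concrete, here is a minimal sketch of the general local-surrogate recipe (this is an illustration in PyTorch, not the authors' implementation; the objective, network size, and hyper-parameters are all made up for the example): at every step, the black-box objective is sampled around the current parameters, a small neural network is fitted to those samples, and the parameters are then updated using the gradient of that network in place of the unavailable true gradient.

```python
# Minimal sketch of optimizing through a learned local surrogate.
# Not the authors' code; names and hyper-parameters are illustrative.
import torch

def blackbox_objective(theta: torch.Tensor) -> torch.Tensor:
    # Stand-in for a non-differentiable graphics loss (e.g. a rendering error);
    # evaluated without gradient tracking, as a true black box would be.
    with torch.no_grad():
        return ((theta - 2.0) ** 2).sum() + 0.5 * torch.sign(theta).sum()

dim = 4
theta = torch.zeros(dim, requires_grad=True)

# Small MLP that will stand in for the objective around the current theta.
surrogate = torch.nn.Sequential(
    torch.nn.Linear(dim, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 64), torch.nn.ReLU(),
    torch.nn.Linear(64, 1),
)
opt_theta = torch.optim.Adam([theta], lr=1e-2)
opt_surr = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

for step in range(500):
    # 1) Sample the black-box objective in a small neighbourhood of theta.
    samples = theta.detach() + 0.1 * torch.randn(32, dim)
    values = torch.stack([blackbox_objective(s) for s in samples]).unsqueeze(1)

    # 2) Fit the surrogate locally to these (parameters, loss) pairs.
    for _ in range(4):
        opt_surr.zero_grad()
        fit_loss = torch.nn.functional.mse_loss(surrogate(samples), values)
        fit_loss.backward()
        opt_surr.step()

    # 3) Update theta with the surrogate's gradient instead of the true one.
    opt_theta.zero_grad()
    surrogate(theta.unsqueeze(0)).sum().backward()
    opt_theta.step()
```

In the actual method the surrogate is refitted on the fly as the optimization moves through parameter space, which is what lets it scale to the high-dimensional rendering, procedural-modelling, and physics problems mentioned above.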


Why is it important?

ZeroGrads is important because it addresses a common challenge in computer graphics: many problems can't be solved with traditional optimization methods because their objectives lack well-defined gradients (at discontinuities), depend on integration (and therefore don't fit conventional automatic differentiation), or are simply implemented in a language without automatic-differentiation support, such as C++. In such cases, finding an optimal solution becomes difficult, especially for complex tasks like rendering or animation. By automatically building a neural approximation of the objective function, ZeroGrads makes it possible to optimize these hard-to-solve problems. This saves time and effort compared to hand-crafting surrogate functions, and it scales far better than conventional derivative-free optimizers such as genetic algorithms or simulated annealing. This advance could lead to better performance in fields like animation, game development, and simulation, where solving non-convex or non-differentiable problems is crucial.

Perspectives

We hope this article encourages the graphics community to explore alternative avenues of derivative estimation and to revisit ideas from the classic optimization literature for differentiable rendering and inverse problems.

Michael Fischer
University College London

Read the Original

This page is a summary of: ZeroGrads: Learning Local Surrogates for Non-Differentiable Graphics, ACM Transactions on Graphics, July 2024, ACM (Association for Computing Machinery). DOI: 10.1145/3658173.
