Physics Informed Neural Fields for Smoke Reconstruction with Sparse Data


High-fidelity reconstruction of dynamic fluids from sparse multi-view RGB videos remains a formidable challenge, due to the complexity of the underlying physics as well as the severe occlusion and complex lighting in the captured data. Existing solutions either assume knowledge of obstacles and lighting, or focus only on simple fluid scenes without obstacles or complex lighting, and are thus unsuitable for real-world scenes with unknown lighting conditions or arbitrary obstacles. We present the first method to reconstruct dynamic fluid phenomena by leveraging the governing physics (i.e., the Navier-Stokes equations) in an end-to-end optimization from a mere set of sparse video frames, without taking lighting conditions, geometry information, or boundary conditions as input. Our method provides a continuous spatio-temporal scene representation, using neural networks as the ansatz of the density and velocity solution functions for the fluid as well as the radiance field for static objects. With a hybrid architecture that separates static and dynamic content, fluid interactions with static obstacles are reconstructed for the first time without additional geometry input or human labeling. By augmenting time-varying neural radiance fields with physics-informed deep learning, our method benefits from the supervision of both images and physical priors. To achieve robust optimization from sparse input views, we introduce a layer-by-layer growing strategy that progressively increases the capacity of the resulting neural representation. Using our progressively growing models together with a newly proposed regularization term, we manage to disentangle the density-color ambiguity in the radiance field without overfitting. A pretrained density-to-velocity fluid model is further leveraged as a data prior to avoid suboptimal velocity solutions that underestimate vorticity while trivially fulfilling the physical equations.
Our method exhibits high-quality results with relaxed constraints and strong flexibility on a representative set of synthetic and real flow captures. Code and sample tests are at
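The physics-informed supervision described in the abstract can be sketched in a few lines: a coordinate network serves as the ansatz for density and velocity, and residuals of the density transport equation (d_t rho + u . grad rho = 0) and the incompressibility constraint (div u = 0) are penalized alongside the image loss. The sketch below is purely illustrative and is not the authors' implementation: it uses a random, untrained MLP and central finite differences in place of the automatic differentiation a real PINN setup would use, and all names are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)

# Tiny MLP ansatz: (x, y, z, t) -> (density, u, v, w). Weights are random
# here; in the actual method they are trained jointly with a rendering loss.
W1 = rng.normal(0.0, 1.0, (4, 32))
b1 = np.zeros(32)
W2 = rng.normal(0.0, 0.1, (32, 4))
b2 = np.zeros(4)

def field(p):
    """Map points p of shape (..., 4) to (density, velocity) of shape (..., 4)."""
    h = np.tanh(p @ W1 + b1)
    return h @ W2 + b2

def physics_residuals(p, eps=1e-4):
    """Finite-difference residuals of the transport equation and div(u) = 0
    (a real implementation would use autodiff instead of finite differences)."""
    f0 = field(p)
    rho, vel = f0[..., 0], f0[..., 1:4]
    grads = []  # partial derivatives of all outputs along x, y, z, t
    for i in range(4):
        dp = np.zeros(4)
        dp[i] = eps
        grads.append((field(p + dp) - field(p - dp)) / (2.0 * eps))
    g = np.stack(grads, axis=-2)  # shape (..., 4 input axes, 4 outputs)
    grad_rho = g[..., :3, 0]      # spatial gradient of density
    drho_dt = g[..., 3, 0]        # time derivative of density
    div_u = g[..., 0, 1] + g[..., 1, 2] + g[..., 2, 3]
    transport = drho_dt + (vel * grad_rho).sum(-1)
    return transport, div_u

# Sample space-time points and form a scalar physics penalty, which would be
# added to the image reconstruction loss during optimization.
pts = rng.uniform(-1.0, 1.0, (128, 4))
r_transport, r_div = physics_residuals(pts)
loss_phys = np.mean(r_transport**2) + np.mean(r_div**2)
print("physics residual penalty:", float(loss_phys))
```

The full method additionally couples these residuals with the NeRF volume-rendering loss and the pretrained density-to-velocity prior mentioned above; this sketch only shows the shape of the physics term.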



@article{chu2022physics,
  author  = {Chu, Mengyu and Liu, Lingjie and Zheng, Quan and Franz, Erik and Seidel, Hans-Peter and Theobalt, Christian and Zayer, Rhaleb},
  title   = {Physics Informed Neural Fields for Smoke Reconstruction with Sparse Data},
  journal = {ACM Transactions on Graphics (Proc. SIGGRAPH)},
  volume  = {41},
  number  = {4},
  pages   = {119:1--119:15},
  month   = {aug},
  year    = {2022},
}

Mengyu Chu, Lingjie Liu, Quan Zheng, Erik Franz, Hans-Peter Seidel, Christian Theobalt, and Rhaleb Zayer. 2022. Physics informed neural fields for smoke reconstruction with sparse data. ACM Transactions on Graphics (Proc. SIGGRAPH), vol. 41, no. 4, 119:1-119:15.


For questions or clarifications, please get in touch with:
Mengyu Chu<>
