DiffusionPDE: Generative PDE-Solving
Under Partial Observation

NeurIPS 2024
ICML 2024 AI for Science Workshop (Oral)

1University of Michigan 
2Stanford University 

Abstract

We introduce DiffusionPDE, a general framework for solving partial differential equations (PDEs) using generative diffusion models. In particular, we focus on scenarios where we do not have full knowledge of the scene necessary to apply classical solvers. Most existing forward or inverse PDE approaches perform poorly when the observations of the data or the underlying coefficients are incomplete, which is common for real-world measurements. In this work, we propose DiffusionPDE, which can simultaneously fill in the missing information and solve a PDE by modeling the joint distribution of the solution and coefficient spaces. We show that the learned generative priors lead to a versatile framework for accurately solving a wide range of PDEs under partial observation, significantly outperforming state-of-the-art methods in both forward and inverse directions.

Method Overview

In our framework, we train a diffusion model on the joint distribution of the complete observations of the coefficient (or initial state) and the solution (or final state), concatenated along the channel dimension. During inference, sparse observations are made on either the coefficient (or initial state) side, the solution (or final state) side, or both. The pre-trained model is then guided by these sparse observations and the PDE function to output the final full predictions of both the coefficient (or initial state) and the solution (or final state).
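The guidance step described above can be sketched as a toy reverse-diffusion loop. This is a minimal illustration under simplifying assumptions, not the paper's exact sampler: `denoiser`, `pde_residual_grad`, `zeta_obs`, and `zeta_pde` are placeholder names, the noise schedule is omitted, and the corrections are plain gradient steps on an observation loss and a PDE-residual loss.

```python
import numpy as np

def guided_denoise(x_T, denoiser, obs_mask, obs_values, pde_residual_grad,
                   steps=50, zeta_obs=1.0, zeta_pde=0.1):
    """Toy reverse-diffusion loop with sparse-observation and PDE guidance.

    At each step, the denoiser's estimate of the clean joint field
    (coefficient and solution channels) is corrected by the gradient of
    an observation loss on the known sparse points and by the gradient
    of a PDE-residual loss on the full field.
    """
    x = x_T.copy()
    for t in range(steps, 0, -1):
        x0_hat = denoiser(x, t)                   # estimate of the clean joint field
        g_obs = obs_mask * (x0_hat - obs_values)  # grad of 0.5 * ||mask * (x0 - obs)||^2
        g_pde = pde_residual_grad(x0_hat)         # grad of the PDE-residual loss
        x = x0_hat - zeta_obs * g_obs - zeta_pde * g_pde
    return x
```

With an identity denoiser and no PDE term, the loop simply pins the masked entries to their observed values while leaving the rest of the field to the prior; in the full method, the learned denoiser and the PDE residual fill in everything else.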



Denoising Process

Two samples of the denoising process are shown below.

Denoising Darcy Flow (left) and Navier-Stokes Equation for vorticity (right) with observations of the coefficient or the final state.

Solving Forward and Inverse Problems

We address both forward and inverse problems for various types of PDEs with partial observations and compare the efficacy of our approach with state-of-the-art methods including DeepONet (Lu et al., 2021), PINO (Li et al., 2021), FNO (Li et al., 2020), and PINNs (Raissi et al., 2019). We show the relative errors of all methods for both forward and inverse problems with 500 observation points in Table 1. Since the coefficients of Darcy Flow are binary, we evaluate the error rates of our predictions; non-binary data is evaluated using mean pixel-wise relative error. We report errors averaged across 1,000 random scenes and observations for each PDE. DiffusionPDE outperforms all other methods.
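The two metrics above can be made concrete with a short sketch. This is an illustrative implementation of the stated evaluation protocol (the threshold value for binarizing Darcy coefficients is an assumption):

```python
import numpy as np

def relative_error(pred, target):
    """Mean pixel-wise relative L2 error, for non-binary fields."""
    return np.linalg.norm(pred - target) / np.linalg.norm(target)

def binary_error_rate(pred, target, threshold=0.5):
    """Fraction of misclassified pixels after thresholding,
    for binary fields such as Darcy Flow coefficients."""
    return np.mean((pred > threshold) != (target > threshold))
```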

Several examples of the forward and inverse problems are shown below.

Bounded Darcy Flow with 500 observation points of the coefficient or the solution.
Non-bounded Navier-Stokes Equation for vorticity with 500 observation points of the initial state or the final state.
More Results

Comparisons with GraphPDE (Zhao et al., 2022)

Two samples of the inverse problem of bounded Navier-Stokes Equation for velocity with a random circular obstacle and 1% observation points at the final state.

Recovering Solutions Throughout a Time Interval

We retrieve all time steps throughout a time interval from continuous observations on sparse sensors.
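The observation structure here differs from the single-snapshot setting: each sensor is observed at every time step, so the observation mask is constant in space but spans the whole interval. A minimal sketch (array shapes and names are assumptions, e.g. a Burgers'-style solution stored as a time-by-space grid):

```python
import numpy as np

def sensor_observations(u, sensor_idx):
    """Extract continuous-in-time observations at sparse spatial sensors.

    u          -- (T, N) space-time solution, e.g. T time steps over N=128 grid points
    sensor_idx -- indices of the observed sensors, e.g. 5 out of 128

    Returns the (T, len(sensor_idx)) observed values and the (T, N) boolean
    mask used to guide the diffusion model over the full interval.
    """
    spatial_mask = np.zeros(u.shape[1], dtype=bool)
    spatial_mask[sensor_idx] = True
    mask = np.broadcast_to(spatial_mask, u.shape)  # same sensors at every time step
    return u[:, spatial_mask], mask
```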

Two samples of the Burgers' equation with continuous observations from 5 out of 128 sensors.

BibTeX


@misc{huang2024diffusionpdegenerativepdesolvingpartial,
    title={DiffusionPDE: Generative PDE-Solving Under Partial Observation}, 
    author={Jiahe Huang and Guandao Yang and Zichen Wang and Jeong Joon Park},
    year={2024},
    eprint={2406.17763},
    archivePrefix={arXiv},
    primaryClass={cs.LG},
    url={https://arxiv.org/abs/2406.17763}, 
}