Variational loss

Variational loss can be adopted when a PDE has a variational form, i.e., when the PDE arises as the Euler-Lagrange equation of a variational functional. For convex functionals such as the Dirichlet energy below, the resulting PDE is elliptic.

A typical variational loss is the Dirichlet energy \[ E(u):=\int_{\Omega} |\nabla u|^2 \,\mathrm{d} x, \] where \(\Omega\subset\mathbb{R}^n\) is a bounded open domain with smooth boundary. If we restrict the search space to \(u\in H^1(\Omega)\) subject to the boundary constraint \(u|_{\partial\Omega}=u_\mathrm{b}\) for a prescribed \(u_\mathrm{b}\in C^\infty(\partial\Omega)\), then the minimizer of \(E\) exists, is unique, and coincides with the solution of the Laplace equation on \(\Omega\) \[ \begin{cases} \Delta u(x) = 0 & x\in\Omega, \\ u(x) = u_\mathrm{b}(x) & x\in\partial\Omega. \end{cases} \]
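For completeness, a one-line first-variation computation explains why the minimizer solves the Laplace equation: for any test function \(\varphi\in C_c^\infty(\Omega)\),
\[ 0 = \left.\frac{\mathrm{d}}{\mathrm{d}\varepsilon} E(u+\varepsilon\varphi)\right|_{\varepsilon=0} = 2\int_{\Omega} \nabla u \cdot \nabla \varphi \,\mathrm{d} x = -2\int_{\Omega} (\Delta u)\,\varphi \,\mathrm{d} x, \]
so \(\Delta u = 0\) in \(\Omega\); the boundary condition is retained because \(\varphi\) vanishes on \(\partial\Omega\).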

Since we mainly work in \(\mathbb{R}^2\), we write the input variables as \(x,y\) instead of \(x\in\mathbb{R}^2\).

To illustrate how conveniently a PINN handles complicated domain geometries and boundary conditions, we choose \(\Omega\) to be a Koch snowflake.
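As an illustration, here is a minimal sketch (not the code used in our experiments) of how the boundary polygon of a Koch snowflake can be generated with NumPy; the function name `koch_snowflake`, the base triangle, and its placement are our own choices.

```python
import numpy as np

def koch_snowflake(level):
    """Boundary vertices (counter-clockwise) of a Koch snowflake at the given fractal level."""
    # Base equilateral triangle, counter-clockwise orientation.
    verts = np.array([[0.0, 0.0], [1.0, 0.0], [0.5, np.sqrt(3) / 2]])
    # Rotation by -60 degrees, so the new bumps point outward of a CCW polygon.
    rot = np.array([[0.5, np.sqrt(3) / 2],
                    [-np.sqrt(3) / 2, 0.5]])
    for _ in range(level):
        p, q = verts, np.roll(verts, -1, axis=0)   # edges p -> q
        a = p + (q - p) / 3                        # end of the first third of each edge
        b = p + 2 * (q - p) / 3                    # start of the last third of each edge
        c = a + (b - a) @ rot.T                    # peak of the new triangular bump
        verts = np.stack([p, a, c, b], axis=1).reshape(-1, 2)
    return verts

boundary = koch_snowflake(5)
assert len(boundary) == 3 * 4**5    # 3072 polygon sides / boundary vertices
```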

The boundary condition is chosen to be \[ u_\mathrm{b}(x,y)= \frac{x^2-y^2}{x^2+y^2}, \] which in polar coordinates reads \[ u_\mathrm{b}(r,\varphi)=\cos(2\varphi). \]
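Indeed, substituting \(x=r\cos\varphi\), \(y=r\sin\varphi\) gives \[ u_\mathrm{b}(r\cos\varphi, r\sin\varphi) = \frac{r^2\cos^2\varphi - r^2\sin^2\varphi}{r^2} = \cos(2\varphi). \]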

The collocation points are chosen to be the triangle vertices of the Koch snowflake at fractal level 5, so \(\Omega\) is approximated by a polygon with \(3\times4^5=3072\) sides. The vertices of the triangles lying in the interior of this polygon yield 45397 interior collocation points. The variational loss in this case is chosen to be \[ L_\mathrm{var}(\theta; A) = \sum_{j=1}^{45397} \left[ u_x^2(x_j,y_j;\theta) + u_y^2(x_j, y_j;\theta) \right], \hspace{1em} A = \{(x_j, y_j)\}_{j=1}^{45397} \subset \mathring{\Omega}. \]

This is a discretized (and rescaled) version of \( \iint_{\Omega} |\nabla u(x,y)|^2 \mathrm{d} x \mathrm{d} y \).
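As a sketch, this discretized Dirichlet energy can be evaluated with automatic differentiation; the snippet below assumes a PyTorch model \(u(x,y;\theta)\) that takes an \((N,2)\) tensor of interior collocation points, and the helper name `variational_loss` is ours.

```python
import torch

def variational_loss(model, interior_pts):
    """Sum of u_x^2 + u_y^2 over the interior collocation points A."""
    pts = interior_pts.clone().requires_grad_(True)    # shape (N, 2): columns x, y
    u = model(pts)                                     # shape (N, 1)
    grads = torch.autograd.grad(u.sum(), pts, create_graph=True)[0]   # (N, 2): [u_x, u_y]
    return (grads ** 2).sum()
```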

To enforce the boundary condition, we add the regression loss \[ L_\mathrm{reg}(\theta; B) = \sum_{j=1}^{3072} \left[ u(x_j,y_j;\theta) - u_\mathrm{b}(x_j, y_j) \right]^2, \hspace{1em} B = \{(x_j, y_j)\}_{j=1}^{3072} \subset \partial\Omega. \]

This is a discretized (and rescaled) version of \( \|u(x,y;\theta) - u_\mathrm{b}(x,y)\|_{L^2(\partial \Omega)}^2 \).
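A matching sketch for the boundary term, under the same assumptions (PyTorch model, \((M,2)\) tensor of boundary vertices); `boundary_loss` and `u_b` are hypothetical helper names.

```python
def u_b(x, y):
    """Exact boundary data (x^2 - y^2) / (x^2 + y^2)."""
    return (x**2 - y**2) / (x**2 + y**2)

def boundary_loss(model, boundary_pts):
    """Sum of (u - u_b)^2 over the boundary collocation points B."""
    pred = model(boundary_pts)                                         # (M, 1)
    target = u_b(boundary_pts[:, 0], boundary_pts[:, 1]).unsqueeze(1)  # (M, 1)
    return ((pred - target) ** 2).sum()
```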

The total loss can be defined as the sum \[ L_\mathrm{total}(\theta; A\cup B) = L_\mathrm{var}(\theta; A) + L_\mathrm{reg}(\theta; B). \]
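Putting the two terms together, a minimal training loop could look like the following; the network architecture, optimizer, learning rate, and number of steps are illustrative assumptions, not the settings used in the experiments.

```python
model = torch.nn.Sequential(
    torch.nn.Linear(2, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# interior_pts (A) and boundary_pts (B) are float tensors of shape (45397, 2) and (3072, 2).
for step in range(10_000):
    optimizer.zero_grad()
    loss = variational_loss(model, interior_pts) + boundary_loss(model, boundary_pts)
    loss.backward()
    optimizer.step()
```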

We can also compare with the solution obtained from the finite difference method.
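For reference, a minimal finite-difference sketch: Jacobi iteration for the 5-point Laplacian on a uniform grid, assuming the snowflake has already been rasterized into a boolean `interior_mask` and an array `u0` holding \(u_\mathrm{b}\) on the boundary cells (the rasterization step and the choice of Jacobi iteration over a direct solver are our simplifications).

```python
import numpy as np

def fd_laplace(interior_mask, u0, n_iter=20_000):
    """Jacobi iteration for Laplace's equation; values outside the mask stay fixed."""
    u = u0.astype(float).copy()
    for _ in range(n_iter):
        avg = 0.25 * (np.roll(u, 1, 0) + np.roll(u, -1, 0) +
                      np.roll(u, 1, 1) + np.roll(u, -1, 1))
        u[interior_mask] = avg[interior_mask]   # update interior points only
    return u
```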