Approximating functions, functionals, and operators using deep neural networks for diverse applications - presented by Prof. George Karniadakis

Slide at 32:51
DeepONet: learning nonlinear operators based on the universal approximation theorem of operators
[Slide: examples of operators to be learned: the antiderivative $s(x) = s_0 + \int_0^x u(\tau)\,d\tau$; an integral operator $\int k(t;\omega)\,y(t;\omega)\,dt$; the forced gravity pendulum $\frac{d^2 s}{dt^2} = -k \sin s + u(t)$; a stochastic elliptic PDE $\operatorname{div}\!\big(e^{b(x;\omega)}\,\nabla u(x;\omega)\big) = f(x)$; and a panel of elementary functions ($\cos x$, $\sin x$, $\ln x$, $\cos 2x$, $\log_{10} x$, $\cot x$, $\sin 2x$).]
Nature Machine Intelligence, March 2021
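For instance, the first operator on the slide, the antiderivative $G: u \mapsto s$ with $s(x) = s_0 + \int_0^x u(\tau)\,d\tau$, is straightforward to turn into training pairs. Below is a minimal sketch, assuming NumPy; sampling inputs as random truncated Fourier series is an illustrative stand-in for the Gaussian-random-field inputs used in the paper, and the names (sample_u, n_terms) are mine, not from the talk.

import numpy as np

rng = np.random.default_rng(0)
x = np.linspace(0.0, 1.0, 100)  # fixed sensor locations on [0, 1]

def sample_u(n_terms=5):
    # Illustrative input sampler: a random truncated Fourier series
    # (a stand-in for the Gaussian random fields used in the paper).
    u = np.zeros_like(x)
    for k in range(1, n_terms + 1):
        a, b = rng.normal(size=2)
        u += a * np.sin(2 * np.pi * k * x) + b * np.cos(2 * np.pi * k * x)
    return u / n_terms

u = sample_u()
s0 = 0.0
# s(x) = s0 + int_0^x u(tau) dtau, via the cumulative trapezoidal rule;
# the pair (u, s) is one training example for the antiderivative operator.
s = s0 + np.concatenate(([0.0], np.cumsum(0.5 * (u[1:] + u[:-1]) * np.diff(x))))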
References
  • 1. L. Lu et al. (2021). Learning nonlinear operators via DeepONet based on the universal approximation theorem of operators. Nature Machine Intelligence, 3, 218–229.
Summary (AI generated)

I define various types of operators: explicit mathematical operators, stochastic operators, fractional operators, and multiscale operators that act across the nano and continuum regimes; these multiscale operators involve heterogeneous mathematical descriptions. Can we approximate such operators with neural networks? Yes, and for the justification I will refer back to Chen and Chen.
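As background for that reference: the Chen and Chen result, as restated in Lu et al. (2021), guarantees that for any continuous nonlinear operator $G$ and any $\epsilon > 0$ there exist integers $n, p, m$, sensor points $x_1, \dots, x_m$, and constants $c_i^k, \xi_{ij}^k, \theta_i^k, \zeta_k$ and vectors $w_k$ such that

$$
\left| G(u)(y) \;-\; \sum_{k=1}^{p} \underbrace{\sum_{i=1}^{n} c_i^{k}\,\sigma\!\Big(\sum_{j=1}^{m} \xi_{ij}^{k}\, u(x_j) + \theta_i^{k}\Big)}_{\text{branch}} \; \underbrace{\sigma\big(w_k \cdot y + \zeta_k\big)}_{\text{trunk}} \right| \;<\; \epsilon
$$

for all input functions $u$ in a compact set and all query points $y$, where $\sigma$ is a continuous non-polynomial activation. The branch/trunk grouping shown here is the reading that motivates the DeepONet architecture.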

The reason this matters is that we transition from functions to operators. With functions, as in the successes on ImageNet, we map one finite-dimensional space to another finite-dimensional space. With operators, however, we map one infinite-dimensional function space to another infinite-dimensional function space. These operators could be explicit mathematical operators, integral kernels, dynamical systems described by ODEs or PDEs, stochastic PDEs, biological systems, or social systems; for some of these systems, no equations have even been written down.
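To make the branch/trunk structure concrete, here is a minimal sketch of an unstacked DeepONet in PyTorch. The dot-product combination of branch and trunk outputs follows the architecture described in Lu et al. (2021); the widths, depth, activation, and sensor count are illustrative assumptions, not the paper's configuration.

import torch
import torch.nn as nn

class DeepONet(nn.Module):
    """Minimal unstacked DeepONet: G(u)(y) ~ sum_k b_k(u) * t_k(y) + bias."""
    def __init__(self, n_sensors=100, p=40, width=64):
        super().__init__()
        # Branch net: encodes the input function u via its values at fixed sensors.
        self.branch = nn.Sequential(
            nn.Linear(n_sensors, width), nn.Tanh(), nn.Linear(width, p))
        # Trunk net: encodes the query location y where the output is evaluated.
        self.trunk = nn.Sequential(
            nn.Linear(1, width), nn.Tanh(), nn.Linear(width, p))
        self.bias = nn.Parameter(torch.zeros(1))

    def forward(self, u_sensors, y):
        b = self.branch(u_sensors)   # (batch, p): coefficients depending on u
        t = self.trunk(y)            # (batch, p): basis functions evaluated at y
        return (b * t).sum(dim=-1, keepdim=True) + self.bias

# Illustrative call: 8 input functions sampled at 100 sensors, one query point each.
model = DeepONet()
pred = model(torch.randn(8, 100), torch.rand(8, 1))  # shape (8, 1)

Note the design choice this sketch illustrates: the branch net sees only the discretized input function and the trunk net sees only the query coordinate, so a single trained network can evaluate the output function at arbitrary locations.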