Fix and clear documentation #96

Merged: 11 commits, Jul 7, 2024
36 changes: 36 additions & 0 deletions .buildkite/pipeline.yml
@@ -0,0 +1,36 @@
steps:
- label: "Julia 1"
plugins:
- JuliaCI/julia#v1:
version: "1"
- JuliaCI/julia-test#v1:
coverage: false # 1000x slowdown
agents:
queue: "juliagpu"
cuda: "*"
env:
GROUP: 'GPU'
timeout_in_minutes: 60
# Don't run Buildkite if the commit message includes the text [skip tests]
if: build.message !~ /\[skip tests\]/

- label: "Documentation"
plugins:
- JuliaCI/julia#v1:
version: "1"
command: |
julia --project=docs -e '
println("--- :julia: Instantiating project")
using Pkg; Pkg.develop(PackageSpec(path=pwd())); Pkg.instantiate()
println("+++ :julia: Building documentation")
include("docs/make.jl")'
agents:
queue: "juliagpu"
cuda: "*"
if: build.message !~ /\[skip docs\]/ && !build.pull_request.draft
timeout_in_minutes: 1000
env:
SECRET_DOCUMENTER_KEY: "v/2iYv+rW3iCgDU5Cp4DN1P51xQvXkC4aTO28pDLysKaNTVopjzsQNNtgJaswckg5ahdusdnAmlDRd8oMEobmyPM7PY7wppGdr6MHfUhwakJ7AAWAyAEIb+e+fNFuu+EIHS43bU+vkgzCgKojFEGwa5AFSW69rucnuwROXzIrQQ6GgXF4LayvDMDbUgYPjjnS2zAgfkWOR1L5UJ2ODuy3zK+VESS5YVFRNSmhAnT0EdS2AsenBwa25IgHsYkGO1fR/CL+1epFg7yzSOA+bExDWD0L4WkjKzU8qTlMAI2BAw+clJLhcpZyy4MdUG8hLmic0vvYjBOdjgqtl1f9xtMwQ==;U2FsdGVkX1+MBH0JHcFRDZ6tuHvLUrc8cxidXL6EZJZYrOllAtI7WHCEqx57m0T49q/ALas8vzZb9FJBGcjcrhCNu6jmbKNsBGNb7ZNRbex9qZCiZltv9g0cFU6xA7562CuYDhLxwIF4bXwszpYg/I+AhSCCIfaeaA1couJ0YTUxvKmW/VCR4/anH4SZD9HqblfMHa7N3FMo1VIEN91n+/xgyPNx3unQnre/UPDK0+Z7qjsYoXhqqQokJENdrsDWifjvdpWKh/1viGlMVzN0KFRYR0HIa8SN9wBuJAjVwgiz4SG7oGansuE1sowHon85iNsgIYr4k2WFpWY/f813EHeGvdFAu8/+/EJ42YvSHKJX2yOah1kw43OsBz/nBzQnffQNgG1/xz3WlqH5nXChlmfamzIaN4eERK2jQDIiPMTkbyPBNlDZ0TRHjzpWbpma2gMvnN2L0nyw6d7+yiop1DdYF/GTKuGXXR4I6dA1q5NdYvMQGmaXkDwqz2c6XLLGDPSbzfy8AOsqmAFfzCQTCVva96ynneBlpClEm2A2cW5l8N8/2Y/SfOguBKfFzjncPd+IK7BdHb0FFOhYmZVvcDglGp/Fy2snUe1JiSrtNNKA/CiRC5UQKcffTBP7qboSIOlamFKTR6GaNCYM3teLCodPiaV7mQuidCiFsR7WwORLRMpxkctMj08YZ+da8riUSrfXndXppZ5X1lNxSOQSmj4BaKVNkWb9h2pcoV3P6gD2PzvKiVsLRpqiLfIEuzfQxco2rU16DZj51zh28Gb8tJSbZWQB+pT2kOyVjBfI6AuWEJ3wkHWxyLntzXTUqc4WWnCu1wLPLWZWBA0rq28jh/jJE2NpGmYL5z8/+9T8i4RrSRwZWMbrSekiruSwrk12mWmbshVkArdSxR/Lk3AvvNetu4SccdDnD8CLxeTkdGYV1tU5OklWpxpPCjG8lI1oCCHCrEMTFu5rc/YeLbPcPCg4oKC5rqLpcJI/bmF+9fwqbhgULGfmHABqMZhs1fjvWEqkURCira0WsPHRVPrqRVqXaejpXbwWZbhdYHe9OoViBebj+6gm3l9KUthZBF+/acp5RvCk2KIH+GIhglVYdp9R3JZelRGTLZa1QGha2+RsnNhfE9Wp8ynjTQBknWUbCSCxOM+bjOTLyp1nZD+AjHtu56N30v8rGmBeWjae7t0e0lSxhdQmt5388R1TbeSxhPLGxTVW3X2WmAaIEnxzBWroTVELjchHq6CJIT7vF1UHFVmr4e3e8XH8o+9Mu3M36d5wAT9wT+ggWJTp+Oo48uZF+OBQ0erkNm37Qz307jPrLRO0UNsmuxawEGKATRgsJCWU1FE6YA+4CDXbcHBtj81flO+7MahO39/y1BLpHDt15sfhM94R8LLmjwsTQhk5kiiFHkxG1JT8I87fq3spO6hFXUFO04QSPYu6GeH6uca4COXV/lGbqv7ahTSZwKQR8EVoaiYHcAJpfK0gXXo9G4pe2xaEU6VP6TIB6wv6Vdag2JAOYt7v0pP08y3UiUMgfdPcdRwa/S7F68YT12gcKXgj/I9DAImyX01lfnMYNsE2w7vIPFgvDJ003uDiD+k5Dg6QPOBuEOryYpdaaDVpD8BKbKEMon7mxV3xJTQEXc84WXk+cSgR2a5fa8yyPn2AuZvc/ICyhxYNriMqd55b5ZP70jeBI0FJpZw00CvGroMuqUDCq7Kmf3tdKB8XHU7sJ/Q2OQEYdJkLSO9lFFu4iU856Xu4XWCwdFwH6CQh04/RFx8Sm18OCuczyHH/P4y930qQEqsjmjyJyU8+dQyGTZV08wPCcv23XAGOfBpnIJyaF44wW/rM6a+C+7dXZYFb1/viMpPxZg9C1ILl7gdo4l0QZxLsv/eVXthrDhSiG1AFQtG8QdS+JiQ2icxA9qpxrsdeCCxjhcZgkqV6MofMAnMr+mSZOS6FdVZcHskoi1M7Dq6iyVjNE9jrWnWWINiEwZ4PnulebI20xXsp27EcI9paHVmzfAKXs2+4P35GHPzGX7Foyx6rYkZkP/emhDjqInq2wUArkV21u7VL4AL4Rl01lAOyWdxfTGFVcrGDylXpKSZVTcJgmeCNdbq8uVHykjabgV+XytKWFf50m23bW5T7GZsBk+wuXe6vSrUfL3V5h29VbPorsMZV4p5a5Yn37aROD3pvb/Zex1WXSaxZyIn0z1hMyaElsiYvQ9T3jIwcjHrm/WEbcCPojNbJTGvXKAjcKNFuvMUAlpSNIWDcouKRD6ekOKlNhuJiDclLYKEj7Hj8n/btttruz1Z0G/Znpz7k6dzV7TlLU5bvmefX/yn0l5cMG70cb5111/Zv6twkWT7vXe2rX0DWzVst15DxplGID0VM4tDJO5lXTYNrvoVrK9rVCtAJtzlMLIF6zQxQWMxqhU7Sc7YC2nQJgl99igQFjqB1/BF2ZQZ+EZDFDuKSIrTsscN4xJTtjFSQkZWJT10RyIO0IamoBP2K6SVlRFG91+fC+vgyauqdvqBAWEQbQHsy3vyi3vxS3AeZFDLqeBuRU/L/aYiGde4ok9YXr6ab5d6Aj57bReDBt5r50GY1M/Aaz9gxLvfWBMp8EsZVHovItHL6BE8Ejk6C/nr2uDklktvbVPzE8YSh5BUfeH/PCymPTE6iCcu20tZOVxU1dhGgOkstRHJWgKi3RydyzyHihOsPZXpTsWzTmU/sc9rgbz4ypVRfk3s6hc2ClC2gxQEpVUMacaluvazXn8G24pvlhpEoX6rgjaILjkGiCTIuP4k9e3AQjIW6704Hj9o9j0E5GYpw0ed60ALkoCNoEW9X3f1DXN9x4lVefX2Ix522EOs6dKfbOJsJcI4C56MFlJkKbwETBt4YCumfqDafO98OKkSAHWQn+UvJraVE/64C5b7rJQOb00G9bxovDcyoGXZ28AyFA1abzvhWj7L9AFiOwe6sdouz7aQC2yTNZxNXbwZG5hJuMWv1TVM+TPxzQm/fvy60brAfPhE+vdJVbhzZ+LIwHVGwdfH8XlYJYcHj7sUM01ax0154Y6+V6/T8sJjDrXrHxUYhxJBJDDDsEDy6ZW891r/3mbdfk8faCxtFqD8YCJJaO7C43S/f4ELOtBq7VhFRLeNdLy4yfGcJLSVNDi2Mb26rsdxRIVo7ppekCQNNRSoNGWJUwmuiarbbti2+Kd
pJYib7KQH862VU1ki64GhlD6ZAO3S2PZ5RnPfBCZcsJ5fyMI8u637wk6kxoIGqsuH0A64gXp6qQo+z3FqRz2X35B9hrXTmshpewsUNMoFIz3WWijUbHDfmMyPAziX5ZDzEAX+C29CmaY6TACtN9KcYCQ+7/MQe8Ye8mHwFqCdNmpCNL9RWjweRykzPDO7M8nsCsnwsi2jOHy1u9C+KQaw+sN9yiJKxpITXWdJTwPwEKhTv0lL2AUW9h+ue77nh6eqrQy+p7FJELEWQePRKAf0ryk+GOX7uSiDuUqHdyPOhWGOlWdQ0IEurcNjVxC8NG4Md5wD2V+iQe49kCPeUDgjkm8YPt6HALSLIdRe7EfpJ9QAe8Z2yCYZuk0ckHcyHcPotSvCyMAY4zmke/GjjbBR3jpcFmGZT0PyD76+yEy5te86dH1wVT7gPozKAwH4Wt3l9xzBYEDx5WSGnC/AEe8bkHAQSUFdeFZIvt7/poonr5/DNBP33z2DEw675ym+Jx9wtYrgQK4HUy9mSf7BYGmt0hWy8s4+t8lOohxcIYytShJXybJA1L4904PA13pd5VJgDFGB2xrHQ/UywsAIGFYsv9fEkkH83pvRDAufmxcrB5DTf0cCul3qv7gI38Gh+FT+By98k9ucBCAzH0APthuM7ERrno0oEnASXeGNAkN/1vVcVSors6CJjFdB5I3+e2yub4ZLa99qXoyeTr4aR9oIuMcvcRagqpVzAwxirHu9mNlMQ1MNt41BbLPn36gfx5jWoUxZCwtNIvKXUfz8kAnjryUL0qCjrr1ZvFp0CC0P6tprpUiLfp2OUbz36PvGnbK3SaDkj05X/BUnLtRObl2o5YOdXZCuzTCGYP3GSbzI3ot3ps8N6RxhMCsN8xyn6yzojLuzm2hWhkgLH626KcZt0fvxWMUinJKIShnjJveWcX1FPQpYRv2k/EwF5lidmBI51DDtU+N9c34FMA3bgYyN2LwNP6HesAZAEtQ0GHYDXPJzjS2t01gVnb3ei7Gdm6GY4Tc0XimM/IIf68qfsESwMYGG2X8siGtM/kFJSxGXwbIAmwLq3wYO4TYvL0ZD4z0ObMpIOiQfmVJngitZgsFCrImrMpDV4nhGePS7nlu2SkLnTKN1CyQvoCrwfwSKGeqNtGsFeUY26zTS+c3h6X0pudKSt2zfIEl/yJBTfotKYtDT+GnYdXxsCo/RixxOTS7HpfNvarCFLnU6p"

env:
JULIA_PKG_SERVER: "" # it often struggles with our large artifacts
1 change: 0 additions & 1 deletion LICENSE.md
@@ -19,4 +19,3 @@ The HighDimPDE.jl package is licensed under the MIT "Expat" License:
> LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
> OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
> SOFTWARE.
>
2 changes: 2 additions & 0 deletions Project.toml
@@ -12,6 +12,7 @@ Functors = "d9f16b24-f501-4c13-a1f2-28368ffc5196"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
Reexport = "189a3867-3050-52da-a836-e630ba90ab69"
SciMLBase = "0bca4576-84f4-4d90-8ffe-ffa030f20462"
SciMLSensitivity = "1ed8b502-d754-442c-8d5d-10ac956f44a1"
SparseArrays = "2f01184e-e22b-5df5-ae63-d93ebab69eaf"
Statistics = "10745b16-79ce-11e8-11f9-7d13ad32a3b2"
@@ -33,6 +34,7 @@ LinearAlgebra = "1.10"
Random = "1.10"
Reexport = "1.2.2"
SafeTestsets = "0.1"
SciMLBase = "2.42"
SciMLSensitivity = "7.62"
SparseArrays = "1.10"
Statistics = "1.10"
15 changes: 10 additions & 5 deletions README.md
@@ -4,7 +4,7 @@
[![codecov](https://codecov.io/gh/SciML/HighDimPDE.jl/branch/main/graph/badge.svg)](https://codecov.io/gh/SciML/HighDimPDE.jl)
[![Build Status](https://github.com/SciML/HighDimPDE.jl/workflows/CI/badge.svg)](https://github.com/SciML/HighDimPDE.jl/actions?query=workflow%3ACI)

[![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor's%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
[![ColPrac: Contributor's Guide on Collaborative Practices for Community Packages](https://img.shields.io/badge/ColPrac-Contributor%27s%20Guide-blueviolet)](https://github.com/SciML/ColPrac)
[![SciML Code Style](https://img.shields.io/static/v1?label=code%20style&message=SciML&color=9558b2&labelColor=389826)](https://github.com/SciML/SciMLStyle)

# HighDimPDE.jl
@@ -13,12 +13,13 @@

$$
\begin{aligned}
(\partial_t u)(t,x) &= \int_{\Omega} f\big(t,x,{\bf x}, u(t,x),u(t,{\bf x}), ( \nabla_x u )(t,x ),( \nabla_x u )(t,{\bf x} ) \big) d{\bf x} \\
& \quad + \big\langle \mu(t,x), ( \nabla_x u )( t,x ) \big\rangle + \tfrac{1}{2} \text{Trace} \big(\sigma(t,x) [ \sigma(t,x) ]^* ( \text{Hess}_x u)(t, x ) \big).
(\partial_t u)(t,x) &= \int_{\Omega} f\big(t,x,{\bf x}, u(t,x),u(t,{\bf x}), ( \nabla_x u )(t,x ),( \nabla_x u )(t,{\bf x} ) \big) d{\bf x} \\
& \quad + \big\langle \mu(t,x), ( \nabla_x u )( t,x ) \big\rangle + \tfrac{1}{2} \text{Trace} \big(\sigma(t,x) [ \sigma(t,x) ]^* ( \text{Hess}_x u)(t, x ) \big).
\end{aligned}
$$

where $u \colon [0,T] \times \Omega \to \mathbb{R}, \Omega \subseteq \mathbb{R}^{d}$ is subject to initial and boundary conditions, and where $d$ is large.

## Tutorials and Documentation

For information on using the package,
@@ -27,22 +28,26 @@ For information on using the package,
the documentation, which contains the unreleased features.

## Installation

Open Julia and type the following

```julia
using Pkg;
Pkg.add("HighDimPDE")
```

This will download the latest version from the git repository and install all of its dependencies.

## Getting started

See the documentation and the `test` folder; a minimal sketch of the workflow is shown below.
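
The following sketch illustrates the pattern described in the documentation: define the coefficient functions, build a problem, pick a solver algorithm, and call `solve`. The argument signatures of `μ`, `σ`, `g` and `f` and all numerical values below are placeholders, not taken from this repository; consult the `PIDEProblem` and solver docstrings for the exact forms.

```julia
using HighDimPDE

d = 10                         # dimension of the problem
x0 = fill(0.0f0, d)            # point at which u(t, x) is evaluated
tspan = (0.0f0, 0.5f0)         # time span

g(x) = sum(x .^ 2)             # initial condition u(0, x) = g(x)
μ(x, p, t) = 0.0f0             # drift coefficient (placeholder signature)
σ(x, p, t) = 0.1f0             # diffusion coefficient (placeholder signature)
f(args...) = 0.0f0             # nonlinear term; splatted to avoid committing to a signature

prob = PIDEProblem(μ, σ, x0, tspan, g, f)  # constructor pattern used in the docs
alg = MLP()                                # multilevel Picard solver, needs no neural network
sol = solve(prob, alg)
```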

## Reference
- Boussange, V., Becker, S., Jentzen, A., Kuckuck, B., Pellissier, L., Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. [arXiv](https://arxiv.org/abs/2205.03672) (2022)

- Boussange, V., Becker, S., Jentzen, A., Kuckuck, B., Pellissier, L., Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. [arXiv](https://arxiv.org/abs/2205.03672) (2022)

<!-- - Becker, S., Braunwarth, R., Hutzenthaler, M., Jentzen, A., von Wurstemberger, P., Numerical simulations for full history recursive multilevel Picard approximations for systems of high-dimensional partial differential equations. [arXiv](https://arxiv.org/abs/2005.10206) (2020)
- Beck, C., Becker, S., Cheridito, P., Jentzen, A., Neufeld, A., Deep splitting method for parabolic PDEs. [arXiv](https://arxiv.org/abs/1907.03452) (2019)
- Han, J., Jentzen, A., E, W., Solving high-dimensional partial differential equations using deep learning. [arXiv](https://arxiv.org/abs/1707.02568) (2018) -->

<!-- ## Acknowledgements
`HighDimPDE.jl` is inspired from Sebastian Becker's scripts in Python, TensorFlow, and C++. Pr. Arnulf Jentzen largely contributed to the theoretical developments of the solver algorithms implemented. -->
9 changes: 7 additions & 2 deletions docs/Project.toml
@@ -1,10 +1,15 @@
[deps]
CUDA = "052768ef-5323-5732-b1bb-66c8b64840ba"
cuDNN = "02a925ec-e4fe-4b08-9a7e-0d78e3d38ccd"
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
Flux = "587475ba-b771-5e3f-ad9e-33799f191a9c"
HighDimPDE = "57c578d5-59d4-4db8-a490-a9fc372d19d2"
LinearAlgebra = "37e2e46d-f89d-539d-b4ee-838fcccc9c8e"
Member: missing cuda and nnlibcuda

Contributor Author: Added CUDA: cad2d35. We do not need NNlibCUDA.

StochasticDiffEq = "789caeaf-c7a9-5a7d-9973-96adeb23e2a0"

[compat]
CUDA = "5"
CUDA = "5.4.2"
cuDNN = "1.3.2"
Documenter = "1"
Flux = "0.13, 0.14"
Flux = "0.14.16"
StochasticDiffEq = "6.66"
2 changes: 1 addition & 1 deletion docs/make.jl
@@ -6,7 +6,7 @@ cp("./docs/Project.toml", "./docs/src/assets/Project.toml", force = true)
include("pages.jl")

makedocs(sitename = "HighDimPDE.jl",
authors = "Victor Boussange",
authors = "#",
pages = pages,
clean = true, doctest = false, linkcheck = true,
format = Documenter.HTML(assets = ["assets/favicon.ico"],
12 changes: 9 additions & 3 deletions docs/pages.jl
@@ -5,7 +5,7 @@ pages = [
"Solver Algorithms" => ["MLP.md",
"DeepSplitting.md",
"DeepBSDE.md",
"NNStopping.md",
"NNStopping.md",
"NNKolmogorov.md",
"NNParamKolmogorov.md"],
"Tutorials" => [
@@ -14,7 +14,13 @@ pages = [
"tutorials/mlp.md",
"tutorials/nnstopping.md",
"tutorials/nnkolmogorov.md",
"tutorials/nnparamkolmogorov.md",
"tutorials/nnparamkolmogorov.md"
],
"Feynman Kac formula" => "Feynman_Kac.md",
"Extended Examples" => [
"examples/blackscholes.md"
],
"Solver Algorithms" => ["MLP.md",
"DeepSplitting.md",
"DeepBSDE.md"],
"Feynman Kac formula" => "Feynman_Kac.md"
]
9 changes: 6 additions & 3 deletions docs/src/DeepBSDE.md
@@ -1,16 +1,19 @@
# [The `DeepBSDE` algorithm](@id deepbsde)

### Problems Supported:
1. [`ParabolicPDEProblem`](@ref)

1. [`ParabolicPDEProblem`](@ref)

```@autodocs
Modules = [HighDimPDE]
Pages = ["DeepBSDE.jl", "DeepBSDE_Han.jl"]
```

## The general idea 💡
The `DeepBSDE` algorithm is similar in essence to the `DeepSplitting` algorithm, with the difference that

The `DeepBSDE` algorithm is similar in essence to the `DeepSplitting` algorithm, with the difference that
it uses two neural networks to approximate both the solution and its gradient.

## References
- Han, J., Jentzen, A., E, W., Solving high-dimensional partial differential equations using deep learning. [arXiv](https://arxiv.org/abs/1707.02568) (2018)

- Han, J., Jentzen, A., E, W., Solving high-dimensional partial differential equations using deep learning. [arXiv](https://arxiv.org/abs/1707.02568) (2018)
65 changes: 42 additions & 23 deletions docs/src/DeepSplitting.md
@@ -1,8 +1,9 @@
# [The `DeepSplitting` algorithm](@id deepsplitting)

### Problems Supported:
1. [`PIDEProblem`](@ref)
2. [`ParabolicPDEProblem`](@ref)

1. [`PIDEProblem`](@ref)
2. [`ParabolicPDEProblem`](@ref)

```@autodocs
Modules = [HighDimPDE]
@@ -13,76 +14,93 @@ The `DeepSplitting` algorithm reformulates the PDE as a stochastic learning prob

The algorithm relies on two main ideas:

- The approximation of the solution $u$ by a parametric function $\bf u^\theta$. This function is generally chosen as a (Feedforward) Neural Network, as it is a [universal approximator](https://en.wikipedia.org/wiki/Universal_approximation_theorem).
- The approximation of the solution $u$ by a parametric function $\bf u^\theta$. This function is generally chosen as a (Feedforward) Neural Network, as it is a [universal approximator](https://en.wikipedia.org/wiki/Universal_approximation_theorem).

- The training of $\bf u^\theta$ by simulated stochastic trajectories of particles, through the link between linear PDEs and the expected trajectory of associated Stochastic Differential Equations (SDEs), explicitly stated by the [Feynman Kac formula](https://en.wikipedia.org/wiki/Feynman–Kac_formula).
- The training of $\bf u^\theta$ by simulated stochastic trajectories of particles, through the link between linear PDEs and the expected trajectory of associated Stochastic Differential Equations (SDEs), explicitly stated by the [Feynman Kac formula](https://en.wikipedia.org/wiki/Feynman%E2%80%93Kac_formula).

## The general idea 💡

Consider the PDE

```math
\partial_t u(t,x) = \mu(t, x) \nabla_x u(t,x) + \frac{1}{2} \sigma^2(t, x) \Delta_x u(t,x) + f(x, u(t,x)) \tag{1}
```
with initial conditions $u(0, x) = g(x)$, where $u \colon \R^d \to \R$.

with initial conditions $u(0, x) = g(x)$, where $u \colon \R^d \to \R$.

### Local Feynman Kac formula

`DeepSplitting` solves the PDE iteratively over small time intervals by using an approximate [Feynman-Kac representation](@ref feynmankac) locally.

More specifically, considering a small time step $dt = t_{n+1} - t_n$ one has that

```math
u(t_{n+1}, X_{T - t_{n+1}}) \approx \mathbb{E} \left[ f(t, X_{T - t_{n}}, u(t_{n},X_{T - t_{n}}))(t_{n+1} - t_n) + u(t_{n}, X_{T - t_{n}}) | X_{T - t_{n+1}}\right] \tag{3}.
```

One can therefore use Monte Carlo integrations to approximate the expectations

```math
u(t_{n+1}, X_{T - t_{n+1}}) \approx \frac{1}{\text{batch\_size}}\sum_{j=1}^{\text{batch\_size}} \left[ u(t_{n}, X_{T - t_{n}}^{(j)}) + (t_{n+1} - t_n)\sum_{k=1}^{K} \big[ f(t_n, X_{T - t_{n}}^{(j)}, u(t_{n},X_{T - t_{n}}^{(j)})) \big] \right]
```


### Reformulation as a learning problem

The `DeepSplitting` algorithm approximates $u(t_{n+1}, x)$ by a parametric function ${\bf u}^\theta_n(x)$. It is advised to let this function be a neural network ${\bf u}_\theta \equiv NN_\theta$ as they are universal approximators.

For each time step $t_n$, the `DeepSplitting` algorithm
For each time step $t_n$, the `DeepSplitting` algorithm

1. Generates the particle trajectories $X^{x, (j)}$ satisfying [Eq. (2)](@ref feynmankac) over the whole interval $[0,T]$.
1. Generates the particle trajectories $X^{x, (j)}$ satisfying [Eq. (2)](@ref feynmankac) over the whole interval $[0,T]$.

2. Seeks ${\bf u}_{n+1}^{\theta}$ by minimizing the loss function
2. Seeks ${\bf u}_{n+1}^{\theta}$ by minimizing the loss function

```math
L(\theta) = ||{\bf u}^\theta_{n+1}(X_{T - t_n}) - \left[ f(t, X_{T - t_{n-1}}, {\bf u}_{n-1}(X_{T - t_{n-1}}))(t_{n} - t_{n-1}) + {\bf u}_{n-1}(X_{T - t_{n-1}}) \right] ||
```

This way, the PDE approximation problem is decomposed into a sequence of separate learning problems.
In `HighDimPDE.jl` the right parameter combination $\theta$ is found by iteratively minimizing $L$ using **stochastic gradient descent**.
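
For instance, a small feedforward network and the corresponding algorithm object could be set up as in the following sketch; the layer sizes and the optimizer are illustrative choices, while the `DeepSplitting` constructor pattern follows the tip further down this page.

```julia
using Flux, HighDimPDE

d = 10                                  # dimension of the problem
nn = Flux.Chain(Dense(d, 30, tanh),
                Dense(30, 30, tanh),
                Dense(30, 1))           # parametric approximation u^θ of u
opt = Flux.Adam(1e-2)                   # optimizer driving the stochastic gradient descent
alg = DeepSplitting(nn, opt = opt)      # add mc_sample and K for non-local problems
```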

!!! tip

To solve with `DeepSplitting`, one needs to provide the following to `solve` (see the sketch after this list):
- `dt`
- `batch_size`
- `maxiters`: the number of iterations for minimizing the loss function
- `abstol`: the absolute tolerance for the loss function
- `use_cuda`: if you have a Nvidia GPU, recommended.

- `dt`
- `batch_size`
- `maxiters`: the number of iterations for minimizing the loss function
- `abstol`: the absolute tolerance for the loss function
- `use_cuda`: if you have a Nvidia GPU, recommended.
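
Putting these keywords together, a call could look like the sketch below, assuming a problem `prob` and an algorithm `alg` constructed as in the snippets on this page; whether `dt` is passed positionally or as a keyword is an assumption to check against the solver docstring.

```julia
sol = solve(prob, alg,
            0.1f0,              # dt: time step of the splitting scheme
            batch_size = 1000,  # number of simulated trajectories per gradient step
            maxiters = 500,     # iterations of the loss minimization per time step
            abstol = 1f-4,      # absolute tolerance on the loss
            use_cuda = false)   # set to true to train on an Nvidia GPU
```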

## Solving point-wise or on a hypercube

### Pointwise

`DeepSplitting` allows obtaining $u(t,x)$ at a single point $x \in \Omega$ with the keyword `x`.

```julia
prob = PIDEProblem(μ, σ, x, tspan, g, f,)
prob = PIDEProblem(μ, σ, x, tspan, g, f)
```

### Hypercube

Yet more generally, one wants to solve Eq. (1) on a $d$-dimensional cube $[a,b]^d$. This is offered by `HighDimPDE.jl` with the keyword `x0_sample`.

```julia
prob = PIDEProblem(μ, σ, x, tspan, g, f; x0_sample = x0_sample)
```

Internally, this is handled by assigning a random variable as the initial point of the particles, i.e.

```math
X_t^\xi = \int_0^t \mu(X_s^x)ds + \int_0^t\sigma(X_s^x)dB_s + \xi,
```

where $\xi$ is a random variable uniformly distributed over $[a,b]^d$. This way, the neural network is trained on the whole hypercube $[a,b]^d$ instead of at a single point.
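
For example, a uniform sampler over the hypercube could be constructed as in the sketch below; using `UniformSampling` for `x0_sample` is an assumption based on the sampling types mentioned in the non-local tip further down, and `μ`, `σ`, `x`, `tspan`, `g`, `f` are assumed defined as before.

```julia
d = 10
a, b = fill(-0.5f0, d), fill(0.5f0, d)            # corners of the hypercube [a, b]^d
x0_sample = UniformSampling(a, b)                 # ξ uniformly distributed over [a, b]^d
prob = PIDEProblem(μ, σ, x, tspan, g, f; x0_sample = x0_sample)
```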

## Non-local PDEs

`DeepSplitting` can solve for non-local reaction diffusion equations of the type

```math
\partial_t u = \mu(x) \nabla_x u + \frac{1}{2} \sigma^2(x) \Delta u + \int_{\Omega}f(x,y, u(t,x), u(t,y))dy
```
@@ -93,13 +111,14 @@ The non-localness is handled by a Monte Carlo integration.
u(t_{n+1}, X_{T - t_{n+1}}) \approx \sum_{j=1}^{\text{batch\_size}} \left[ u(t_{n}, X_{T - t_{n}}^{(j)}) + \frac{(t_{n+1} - t_n)}{K}\sum_{k=1}^{K} \big[ f(t, X_{T - t_{n}}^{(j)}, Y_{X_{T - t_{n}}^{(j)}}^{(k)}, u(t_{n},X_{T - t_{n}}^{(j)}), u(t_{n},Y_{X_{T - t_{n}}^{(j)}}^{(k)})) \big] \right]
```

!!! tip
In practice, if you have a non-local model, you need to provide the sampling method and the number $K$ of MC integration through the keywords `mc_sample` and `K`.
```julia
alg = DeepSplitting(nn, opt = opt, mc_sample = mc_sample, K = 1)
```
`mc_sample` can be either `UniformSampling(a, b)` or `NormalSampling(σ_sampling, shifted)`.
!!! tip


In practice, if you have a non-local model, you need to provide the sampling method and the number $K$ of MC integration through the keywords `mc_sample` and `K`.
`alg = DeepSplitting(nn, opt = opt, mc_sample = mc_sample, K = 1)`
`mc_sample` can be either `UniformSampling(a, b)` or `NormalSampling(σ_sampling, shifted)`.

## References
- Boussange, V., Becker, S., Jentzen, A., Kuckuck, B., Pellissier, L., Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. [arXiv](https://arxiv.org/abs/2205.03672) (2022)
- Beck, C., Becker, S., Cheridito, P., Jentzen, A., Neufeld, A., Deep splitting method for parabolic PDEs. [arXiv](https://arxiv.org/abs/1907.03452) (2019)

- Boussange, V., Becker, S., Jentzen, A., Kuckuck, B., Pellissier, L., Deep learning approximations for non-local nonlinear PDEs with Neumann boundary conditions. [arXiv](https://arxiv.org/abs/2205.03672) (2022)
- Beck, C., Becker, S., Cheridito, P., Jentzen, A., Neufeld, A., Deep splitting method for parabolic PDEs. [arXiv](https://arxiv.org/abs/1907.03452) (2019)