
[PoC]: Add neural operator ROM proof-of-concept for GSoC 2026 #35

Draft
Ady0333 wants to merge 1 commit into gridap:main from Ady0333:poc/neural-operator-rom

Conversation

@Ady0333
Contributor

@Ady0333 Ady0333 commented Mar 20, 2026

Description

Classical reduced order models (POD-Galerkin, greedy RBM) work well for PDEs with smooth parameter dependence, but struggle when the solution manifold is nonlinear or high-dimensional. For parametric problems requiring thousands of evaluations (optimization, uncertainty quantification), even reduced models can be too slow. Traditional ROMs also require intrusive access to PDE operators, limiting their use with legacy codes.
Example Issue: A parametric elasticity problem with 5 material parameters needs 10,000 solves for Monte Carlo sampling. POD-Galerkin with 50 basis functions still takes hours because each evaluation requires assembling and solving the reduced system.


Solution

This PoC introduces a NeuralOperatorROMs module that learns the parameter→solution mapping directly from FEM snapshots using DeepONet. The DeepONet architecture splits into branch (encodes parameters μ) and trunk (encodes mesh coordinates x), then combines via inner product to predict all DOF values in one pass. For fixed meshes, we precompute the trunk output once, making inference a single matrix-vector multiply—completely bypassing FEM assembly.
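The branch-trunk split and the precomputed-trunk trick described above can be sketched in a few lines of plain Julia. This is a hedged, minimal illustration with random weights and illustrative names, not the PR's actual Lux.jl layer:

```julia
using LinearAlgebra, Random

# Minimal DeepONet sketch (names are illustrative, not the PR's API).
# branch: parameters μ ∈ R^p  ->  latent coefficients b ∈ R^r
# trunk:  coordinate x ∈ R^d  ->  latent basis values t ∈ R^r
# prediction at DOF i: u_i(μ) = dot(branch(μ), trunk(x_i))
struct TinyDeepONet
    Wb::Matrix{Float64}; bb::Vector{Float64}   # branch: one dense layer
    Wt::Matrix{Float64}; bt::Vector{Float64}   # trunk:  one dense layer
end

branch(m::TinyDeepONet, μ) = tanh.(m.Wb * μ .+ m.bb)
trunk(m::TinyDeepONet, x)  = tanh.(m.Wt * x .+ m.bt)

# Offline, once per fixed mesh: evaluate the trunk on all N DOF coordinates.
# X is d×N; the resulting trunk matrix T is N×r.
precompute_trunk(m, X) = reduce(vcat, (trunk(m, x)' for x in eachcol(X)))

# Online: a single matrix-vector product, no FEM assembly.
predict(T, m, μ) = T * branch(m, μ)

Random.seed!(0)
r, p, d, N = 8, 3, 2, 100
m = TinyDeepONet(randn(r, p), randn(r), randn(r, d), randn(r))
T = precompute_trunk(m, rand(d, N))   # offline stage
u = predict(T, m, rand(p))            # online stage, O(N·r)
@show size(T) size(u)                 # (100, 8) and (100,)
```

With the trunk matrix cached, each new parameter query costs one small branch forward pass plus one N×r matrix-vector product, which is where the claimed speedup over repeated FEM assembly comes from.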


What This PoC Demonstrates

  • Non-intrusive training: Only needs (parameter, solution) pairs from existing Gridap solvers.
  • Mesh-independent architecture: DeepONet works on arbitrary FE meshes without modification.
  • Massive speedup: 25× faster than full FEM solve through precomputed trunk optimization.
  • Seamless integration: Predictions convert to FEFunction via standard Gridap interface.
  • Strong generalization: 5.6% error on held-out parameters with just 60 training snapshots.
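The non-intrusive training claim above boils down to a plain data-collection loop. A hedged sketch, with a stand-in `fake_solve` replacing the actual Gridap solve plus `get_free_dof_values` extraction (sampler and names are illustrative):

```julia
using Random
Random.seed!(1)

# Stand-in for the FEM stage: in the PoC this would call an existing Gridap
# solver and extract the DOF vector with get_free_dof_values(uh).
fake_solve(μ) = μ[1] .* sin.(range(0, π; length = 50)) .+ μ[2]

# Latin hypercube sampling on [0,1]^p: one jittered stratum per sample,
# independently permuted in each parameter dimension.
lhs(n, p) = reduce(hcat, ((randperm(n) .- rand(n)) ./ n for _ in 1:p))'

μs = lhs(60, 2)                                          # 2×60 parameter samples
S  = reduce(hcat, (fake_solve(μs[:, j]) for j in 1:60))  # 50×60 snapshot matrix
@show size(μs) size(S)
```

The columns of `S` are exactly the (parameter, solution) pairs the model trains on; no access to the assembled operators is needed.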

Testing

319 tests verify:

  • Snapshot generation extracts DOFs correctly via get_free_dof_values.
  • DeepONet forward pass produces correct output shapes.
  • Training loop converges (loss decreases monotonically).
  • Predictions reconstruct valid FEFunction objects.
  • End-to-end: parametric Poisson achieves 5.6% L2 error, 25× speedup.

All tests pass. Lint clean.


Running It

cd GridapROMs.jl
julia --project=. -e 'using Pkg; Pkg.test()'
julia --project=. examples/poisson_deeponet.jl

Next Steps

This PoC proves the approach works. The full GSoC implementation would:

  • Scale to 5-10 parameter problems with 100K+ DOFs.
  • Add graph neural operator option (leverage mesh topology).
  • Benchmark against POD-Galerkin on elasticity, Stokes, convection-diffusion.
  • Implement mini-batch training for large snapshot datasets.
  • Add comprehensive tutorial and API documentation.
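The mini-batch item above could start from something like this shuffled column-batch iterator over the snapshot matrix (illustrative sketch; none of these names are part of the PoC):

```julia
using Random
Random.seed!(3)

# Yield (parameter, snapshot) column blocks in a fresh random order.
function minibatches(μs, U, batchsize)
    order = randperm(size(U, 2))
    ((μs[:, order[i:min(i + batchsize - 1, end)]],
      U[:,  order[i:min(i + batchsize - 1, end)]]) for i in 1:batchsize:size(U, 2))
end

μs, U = rand(3, 60), rand(50, 60)
batches = collect(minibatches(μs, U, 16))
@show length(batches) size(batches[end][2])  # 4 batches; the last holds 12 columns
```

Shuffling per epoch and keeping parameter and snapshot columns paired is the main point; everything else would be handled by the optimiser loop.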

This is a proof-of-concept for the GSoC 2026 "Reduced Order Modelling with Neural Operators" project.

Proof of concept for GSoC 2026 "Reduced Order Modelling with Neural Operators".

Adds a new NeuralOperatorROMs submodule that implements a DeepONet-based
non-intrusive reduced order model for parametric PDEs:

- Snapshots.jl: Parameter sampling (LHS), FEM solve loop, free-DOF
  coordinate extraction via cell connectivity
- DeepONet.jl: Branch-trunk architecture as a Lux.jl layer with
  precomputed trunk matrix for O(N·p) online inference
- Training.jl: MSE training loop with normalization, early stopping,
  and Zygote AD
- Reconstruction.jl: Predicted DOFs → Gridap FEFunction reconstruction

Integration points:
- src/GridapROMs.jl: includes NeuralOperatorROMs module
- src/Exports.jl: re-exports all public symbols via @publish
- Project.toml: adds Lux, Optimisers, Zygote dependencies
- test/runtests.jl: adds NeuralOperatorROMs test suite (319 tests)
- examples/poisson_deeponet.jl: end-to-end demo on parametric Poisson

Signed-off-by: Aditya <ady0333@gmail.com>
@Ady0333 Ady0333 force-pushed the poc/neural-operator-rom branch from 17d5b54 to 757ea70 on March 20, 2026 04:10
@Ady0333 Ady0333 marked this pull request as draft March 20, 2026 04:11
@nichomueller
Collaborator

Hi @Ady0333
Thank you for this PR. Nonlinear ROMs are definitely an extension I have in mind for this package. I don't have time at the moment; next week I'll have a look at the code you propose. We can even have a chat about this if you'd like.
Cheers!

@Ady0333
Contributor Author

Ady0333 commented Mar 25, 2026

Thanks @nichomueller! I appreciate you taking the time to review this. I'm definitely interested in discussing nonlinear ROMs and would be happy to chat about it whenever you have time next week. Looking forward to your feedback on the PR!
