Data-Driven Finite Element Method as Foundation Model
Recent advances in machine learning have opened the door to foundation models for scientific computing—general-purpose, physics-grounded models designed to scale across problems, geometries, and applications. Drawing inspiration from traditional finite element methods (FEM), we propose a unifying framework called Data-Driven FEM (DD-FEM), in which trainable components are assigned to local subdomains and then composed through interface constraints. This modular, compositional architecture enables generalization to arbitrary global configurations, reuse across problems, and consistent physical grounding.
DD-FEM builds on principles long valued in numerical physics—modularity, locality, and generalizability—but reimagines them through the lens of machine learning. Rather than hand-coding the behavior of each element, DD-FEM learns local representations (e.g., within small subdomains) from data and governing physics, while maintaining compatibility and consistency at interfaces. This enables the construction of extensible, reusable, and fast surrogates that act as scientific building blocks—not just single-use approximators.
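The compositional idea can be illustrated with a minimal sketch. This is not the DD-FEM implementation itself: the learned local maps are replaced by random linear stand-ins, and the setup (two subdomains, a scalar interface variable, a trace-matching constraint) is a hypothetical toy problem. It only shows the pattern of solving local models independently and coupling them through an interface constraint.

```python
import numpy as np

# Toy stand-ins for *learned* local components: fixed linear maps from a
# shared interface value z and a parameter mu to each subdomain's state.
rng = np.random.default_rng(0)
A_left = rng.normal(size=(4, 2))   # maps [z, mu] -> interior state (left)
A_right = rng.normal(size=(4, 2))  # maps [z, mu] -> interior state (right)

def local_solve(A, z, mu):
    """Local 'element' model: interior state from interface value + parameter."""
    return A @ np.array([z, mu])

def compose(z, mu):
    """Compose the local models; the interface constraint requires that the
    boundary traces (here: first interior entry of each subdomain) agree."""
    u_l = local_solve(A_left, z, mu)
    u_r = local_solve(A_right, z, mu)
    return u_l, u_r, u_l[0] - u_r[0]   # residual of the interface constraint

# Solve the interface constraint for z. The residual is affine in z for
# these linear stand-ins, so one secant/Newton step finds the root exactly.
mu = 0.7
r0 = compose(0.0, mu)[2]
r1 = compose(1.0, mu)[2]
z_star = -r0 / (r1 - r0)
u_l, u_r, res = compose(z_star, mu)
print(abs(res) < 1e-10)   # interface constraint satisfied at z_star
```

In the actual framework the local maps are nonlinear and learned from data and governing physics, and the interface system is a nonlinear algebraic constraint rather than a scalar root-find, but the assembly pattern is the same.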
🧱 From Theory to Implementation: DD NM-ROMs
As a concrete realization of the DD-FEM concept, I augmented the domain-decomposition nonlinear manifold reduced-order model (DD NM-ROM) framework. This approach trains shallow, sparse autoencoders on small subdomains (e.g., 2×2 patches), which are then composed into larger, full-domain solvers using algebraic constraints on interface states. By decomposing both training and inference, the method avoids the scalability bottlenecks of monolithic architectures while preserving the system’s underlying physical structure.
- Achieves 700× speedup with <1% error
- Enables fast training on small subdomains (e.g., 2×2 patches) and inference on much larger composed domains (e.g., 10×10)
- Supports composability, parallelism, and cross-problem generalization
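The train-small, infer-large property can be sketched as follows. This is a hypothetical numpy stand-in, not the DD-NM-ROM code: the trained shallow sparse autoencoder is replaced by a random masked linear decoder, and only the tiling/reuse pattern is shown. The key point is that one local decoder, trained once at patch scale, is reused across every patch of an arbitrarily large grid.

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in for a trained shallow *sparse* autoencoder on one small subdomain:
# a masked linear decoder from a few latent coordinates to the patch's DOFs.
n_full, n_latent = 16, 3                       # 4x4 DOFs per patch, 3 latent
mask = rng.random((n_full, n_latent)) < 0.4    # sparsity pattern on decoder
W_dec = rng.normal(size=(n_full, n_latent)) * mask

def decode_patch(q):
    """Decode one patch's latent state q to its full-order local state."""
    return W_dec @ q

def decode_grid(Q, grid=(10, 10)):
    """Reuse the SAME local decoder on every patch of a larger grid; the
    training cost depends only on the patch size, not the grid size."""
    return {ij: decode_patch(Q[ij]) for ij in np.ndindex(*grid)}

# One latent vector per patch of a 10x10 composed domain.
Q = {ij: rng.normal(size=n_latent) for ij in np.ndindex(10, 10)}
U = decode_grid(Q)
print(len(U), U[(0, 0)].shape)   # 100 patches, each with 16 local DOFs
```

In the real framework the decoders are nonlinear and the per-patch latent states are coupled through algebraic constraints on shared interface states, which is what turns the tiled patches into a single full-domain solver.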
📚 Selected Publications
Y. Choi, S. W. Cheung, Y. Kim, P.‑H. Tsai, A. N. Diaz, I. Zanardi, S. W. Chung, D. M. Copeland, C. Kendrick, W. Anderson, T. Iliescu, and M. Heinkenschloss.
“Defining Foundation Models for Computational Science: A Call for Clarity and Rigor”.
arXiv preprint (May 2025).
I. Zanardi, A. N. Diaz, S. W. Chung, M. Panesi, and Y. Choi.
“Scalable nonlinear manifold reduced order model for dynamical systems”.
arXiv preprint (Nov. 2024).
NeurIPS
💻 Scientific Software
DD-NM-ROM
Open-source PyTorch framework for building, training, and composing domain-decomposed nonlinear manifold ROMs.
GitHub (LLNL)