Implicit Fixed-Point Iteration
📐 Definition
Given an implicit update written as a fixed-point equation $y = G(y)$ (for example, backward Euler, where $G(y) = y_n + h\,f(t_{n+1}, y)$), fixed-point iteration generates the sequence $y^{(k+1)} = G(y^{(k)})$,
starting from an initial guess $y^{(0)}$ and repeating until convergence.
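A minimal sketch of the generic loop, assuming a contraction map G supplied by the caller (the names picard, tol, and max_iter are illustrative, not part of any library API):

```python
import numpy as np

def picard(G, y0, tol=1e-10, max_iter=50):
    """Iterate y <- G(y) until successive updates differ by less than tol."""
    y = np.asarray(y0, dtype=float)
    for _ in range(max_iter):
        y_next = G(y)
        if np.linalg.norm(y_next - y) <= tol:
            return y_next
        y = y_next
    return y  # may not have converged within the iteration budget
```

For instance, picard(np.cos, 1.0) converges to the fixed point of cos, roughly 0.7391.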
Domain and Codomain
Applies on Banach spaces (e.g., $\mathbb{R}^n$ or function spaces) where $G$ maps the space into itself and is contractive on the relevant neighborhood. The limit, when it exists, lies in the same space.
⚙️ Key Properties
Contraction Condition and Convergence
If $G$ is a contraction on a neighborhood of the fixed point, i.e. $\|G(u) - G(v)\| \le L\,\|u - v\|$ with $L < 1$,
then the iteration converges to the unique fixed point $y^*$ in that neighborhood, with linear rate controlled by $L$.
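Since the fixed point satisfies $y^* = G(y^*)$, the contraction bound immediately yields the linear rate:

```latex
\|y^{(k+1)} - y^*\| = \|G(y^{(k)}) - G(y^*)\| \le L\,\|y^{(k)} - y^*\|
\quad\Longrightarrow\quad
\|y^{(k)} - y^*\| \le L^{k}\,\|y^{(0)} - y^*\|.
```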
Cost and Implementation
Each iteration requires one evaluation of $G$ (often one drift evaluation and an algebraic update). No Jacobian is required, making Picard iteration attractive for matrix-free solves when contractivity holds.
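To illustrate the per-iteration cost, a backward Euler map can be built as a closure around the drift; each call to G costs exactly one drift evaluation plus an algebraic update (a sketch with illustrative names, not Oakfield's internals):

```python
import numpy as np

def backward_euler_map(f, t_next, y_n, h):
    """Return the Picard map G(y) = y_n + h * f(t_next, y) for one implicit step."""
    def G(y):
        return y_n + h * f(t_next, y)  # one drift evaluation, no Jacobian
    return G
```

Passing this map to the picard loop sketched above performs the implicit solve entirely matrix-free.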
Failure Modes
If $G$ is not contractive in the region of interest, the iteration may diverge, converge very slowly, or stagnate; damping (under-relaxation) or switching to Newton-type methods can help.
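A common remedy is damped (under-relaxed) iteration, which trades speed for robustness via a relaxation parameter $\omega$:

```latex
y^{(k+1)} = (1 - \omega)\, y^{(k)} + \omega\, G(y^{(k)}), \qquad 0 < \omega \le 1.
```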
🎯 Special Cases and Limits
For backward Euler on $y' = f(t, y)$, the map is $G(y) = y_n + h\,f(t_{n+1}, y)$; convergence requires $h$ small enough for contractivity (roughly $h\,L_f < 1$, where $L_f$ is a Lipschitz constant of $f$). For linear $G(y) = M y + c$, convergence occurs iff the spectral radius $\rho(M) < 1$, with the error decaying as $\rho(M)^k$.
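A quick numerical check of the linear case (illustrative values only): with $\rho(M) = 0.6$ the iterate error shrinks by roughly that factor per sweep.

```python
import numpy as np

M = np.array([[0.5, 0.2],
              [0.1, 0.4]])                   # eigenvalues 0.6 and 0.3, so rho(M) = 0.6
c = np.array([1.0, -1.0])
y_star = np.linalg.solve(np.eye(2) - M, c)   # exact fixed point of y = M y + c

y = np.zeros(2)
for k in range(20):
    y = M @ y + c                            # one Picard sweep
print(np.linalg.norm(y - y_star))            # ~ rho(M)**20 * ||y0 - y_star||
```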
🔗 Related Functions
Provides the nonlinear solve for implicit integrators such as backward Euler and Crank–Nicolson; Newton or quasi-Newton iterations accelerate convergence when contractivity is weak.
Usage in Oakfield
Oakfield uses fixed-point (Picard) iteration internally to implement its implicit integrators:
- Backward Euler (backward_euler) and Crank–Nicolson (crank_nicolson) iterate a fixed number of times (or until the residual falls below tolerance), starting from an explicit predictor; see the sketch after this list.
- No Jacobians/Newton: the current implementations are matrix-free and rely on repeated drift evaluations rather than derivative-based solves.
- Adaptive mode can reduce dt between attempts when convergence is not achieved within the iteration budget.
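The sketch below mirrors the control flow described above, assuming a generic drift f(t, y); the names step_backward_euler, max_picard, and tol are hypothetical and do not reflect Oakfield's actual API:

```python
import numpy as np

def step_backward_euler(f, t, y, dt, max_picard=10, tol=1e-8):
    """One backward Euler step solved by fixed-point iteration (illustrative sketch).

    Returns (y_next, converged); an adaptive caller could reduce dt and retry
    whenever converged is False.
    """
    t_next = t + dt
    y_new = y + dt * f(t, y)                # explicit (forward Euler) predictor
    for _ in range(max_picard):
        y_prev = y_new
        y_new = y + dt * f(t_next, y_new)   # matrix-free Picard sweep
        if np.linalg.norm(y_new - y_prev) <= tol:
            return y_new, True
    return y_new, False
```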
Historical Foundations
📜 Picard Iteration
Picard iteration emerged alongside foundational existence and uniqueness theory for ODEs, where successive approximations provide both a constructive method and a proof technique.
🔬 Fixed-Point Theorems
Banach’s fixed-point theorem (contraction mapping principle) formalized sufficient conditions for convergence and uniqueness in complete metric spaces, directly supporting modern convergence analysis.
📚 References
- Ortega & Rheinboldt, Iterative Solution of Nonlinear Equations in Several Variables
- Hairer & Wanner, Solving Ordinary Differential Equations II: Stiff and Differential-Algebraic Problems