Oakfield Operator Calculus Function Reference Site

Implicit Fixed-Point Iteration

Given an implicit update u = G(u) (for example, u = u^{n} + \Delta t\,F(u, t_{n+1})), fixed-point iteration generates

u^{(m+1)} = G\!\big(u^{(m)}\big)

starting from an initial guess u^{(0)} until convergence.
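The update above can be sketched in a few lines. This is a minimal scalar illustration, not Oakfield's API; the name `fixed_point` and the tolerance defaults are ours.

```python
import math

def fixed_point(G, u0, tol=1e-12, max_iter=100):
    """Iterate u <- G(u) from u0 until successive iterates agree to tol."""
    u = u0
    for _ in range(max_iter):
        u_new = G(u)
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("fixed-point iteration did not converge")

# G(u) = cos(u) is contractive near its fixed point (the Dottie number, ~0.739),
# so the iteration converges from a nearby initial guess.
root = fixed_point(math.cos, 1.0)
```

The same loop works verbatim on vectors if `G` maps arrays to arrays and the stopping test uses a norm instead of `abs`.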

Applies on Banach spaces (e.g., \mathbb{R}^m or function spaces) where G maps the space into itself and is contractive on the relevant neighborhood. The limit, when it exists, lies in the same space.


If G is a contraction on a neighborhood,

\|G(u) - G(v)\| \le L\,\|u - v\|, \qquad 0 \le L < 1,

then the iteration converges to the unique fixed point, with linear rate controlled by L.
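The linear rate is easy to observe numerically. A hypothetical check, using the affine contraction G(u) = 0.5u + 1 (Lipschitz constant L = 0.5, fixed point u* = 2), where each sweep shrinks the error by exactly L:

```python
# Measure the per-iteration error ratio for G(u) = 0.5 * u + 1.
u_star = 2.0          # fixed point: u* = 0.5 * u* + 1
u = 0.0
errors = []
for _ in range(10):
    u = 0.5 * u + 1.0
    errors.append(abs(u - u_star))

# Consecutive error ratios equal the contraction constant L = 0.5.
ratios = [errors[k + 1] / errors[k] for k in range(len(errors) - 1)]
```

For nonlinear G the ratio only approaches |G'(u*)| asymptotically, but the same diagnostic applies.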

Each iteration requires one evaluation of G (often one drift evaluation and an algebraic update). No Jacobian is required, making Picard iteration attractive for matrix-free solves when contractivity holds.

If G is not contractive in the region of interest, the iteration may diverge, converge very slowly, or stagnate; damping or switching to Newton-type methods can help.
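Damping replaces the update with a convex combination u ← (1 − ω)u + ωG(u), which can restore contractivity when G alone is not contractive. A sketch under our own naming (not an Oakfield routine):

```python
def damped_picard(G, u0, omega=0.5, tol=1e-12, max_iter=200):
    """Damped fixed-point update u <- (1 - omega) * u + omega * G(u)."""
    u = u0
    for _ in range(max_iter):
        u_new = (1.0 - omega) * u + omega * G(u)
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    raise RuntimeError("damped Picard iteration did not converge")

# G(u) = -1.5 * u + 2.5 has slope -1.5, so plain Picard diverges; its fixed
# point is u* = 1. With omega = 0.5 the damped map has slope
# (1 - 0.5) + 0.5 * (-1.5) = -0.25, which is contractive.
u_fixed = damped_picard(lambda u: -1.5 * u + 2.5, 0.0, omega=0.5)
```

Note that damping changes the iteration but not its fixed points, since u = (1 − ω)u + ωG(u) holds exactly when u = G(u).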


For backward Euler on u_t = F(u, t), G(u) = u^{n} + \Delta t\,F(u, t_{n+1}); convergence requires \Delta t\,\|F'\| to be small enough for contractivity. For the linear map G(u) = Bu + c, the iteration converges iff the spectral radius \rho(B) < 1, with asymptotic error decay like \rho(B)^m (and bounded by \|B\|^m when \|B\| < 1).
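The linear criterion can be verified directly: iterating u ← Bu + c with \rho(B) < 1 converges to the solution of (I − B)u* = c. A small illustrative check (the matrix below is arbitrary, chosen so \rho(B) = 0.6):

```python
import numpy as np

# Linear fixed-point map G(u) = B u + c with spectral radius rho(B) < 1.
B = np.array([[0.5, 0.2],
              [0.1, 0.4]])
c = np.array([1.0, 2.0])

rho = max(abs(np.linalg.eigvals(B)))  # spectral radius; convergence iff < 1

u = np.zeros(2)
for _ in range(200):
    u = B @ u + c

# The limit solves (I - B) u* = c.
u_exact = np.linalg.solve(np.eye(2) - B, c)
```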


Provides the nonlinear solve for implicit integrators such as backward Euler and Crank–Nicolson; Newton or quasi-Newton iterations accelerate convergence when contractivity is weak.


Oakfield uses fixed-point (Picard) iteration internally to implement its implicit integrators:

  • Backward Euler (backward_euler) and Crank–Nicolson (crank_nicolson) iterate a fixed number of times (or until the residual falls below tolerance), starting from an explicit predictor.
  • No Jacobians/Newton: the current implementations are matrix-free and rely on repeated drift evaluations rather than derivative-based solves.
  • Adaptive mode can reduce dt between attempts when convergence is not achieved within the iteration budget.
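The scheme the bullets describe, i.e. Picard sweeps on G(u) = u^n + \Delta t\,F(u, t_{n+1}) starting from an explicit predictor, can be sketched as follows. This is an illustrative reimplementation, not Oakfield's internal code; the name `backward_euler_step` and its defaults are assumptions.

```python
def backward_euler_step(F, u_n, t_next, dt, sweeps=50, tol=1e-12):
    """One backward Euler step solved by fixed-point (Picard) iteration.

    Solves u = u_n + dt * F(u, t_next), starting from a forward Euler
    predictor and sweeping up to `sweeps` times or until the update
    falls below `tol`.
    """
    u = u_n + dt * F(u_n, t_next)           # explicit predictor
    for _ in range(sweeps):
        u_new = u_n + dt * F(u, t_next)     # Picard sweep: u <- G(u)
        if abs(u_new - u) < tol:
            return u_new
        u = u_new
    return u  # budget exhausted; an adaptive driver might retry with smaller dt

# Linear test problem u' = -5 u: the implicit solve gives u1 = u0 / (1 + 5 dt),
# and dt * |F'| = 0.5 < 1, so the iteration is contractive.
dt = 0.1
u1 = backward_euler_step(lambda u, t: -5.0 * u, 1.0, dt, dt)
```

A Crank–Nicolson step differs only in the map, G(u) = u^n + (\Delta t/2)\,(F(u^n, t_n) + F(u, t_{n+1})).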

Picard iteration emerged alongside foundational existence and uniqueness theory for ODEs, where successive approximations provide both a constructive method and a proof technique.

Banach’s fixed-point theorem (contraction mapping principle) formalized sufficient conditions for convergence and uniqueness in complete metric spaces, directly supporting modern convergence analysis.


  • Ortega & Rheinboldt, Iterative Solution of Nonlinear Equations in Several Variables
  • Hairer, Nørsett, Wanner, Solving Ordinary Differential Equations II