tessl/pypi-pyamg

Algebraic Multigrid (AMG) solvers for large sparse linear systems with Python interface

docs/krylov-methods.md

Krylov Iterative Methods

Complete suite of Krylov subspace methods for iterative solution of linear systems. These methods are optimized for use with AMG preconditioning and provide the acceleration layer for PyAMG solvers.

Capabilities

GMRES Family

Generalized Minimal Residual methods for general (nonsymmetric) linear systems.

def gmres(A, b, x0=None, tol=1e-5, restart=None, maxiter=None, 
          M=None, callback=None, residuals=None, **kwargs):
    """
    Generalized Minimal Residual method.
    
    Solves linear system Ax = b using GMRES with optional
    restarting and preconditioning. Most robust Krylov method
    for general matrices.
    
    Parameters:
    - A: sparse matrix or LinearOperator, coefficient matrix
    - b: array-like, right-hand side vector
    - x0: array-like, initial guess (default: zero vector)
    - tol: float, convergence tolerance (relative residual)
    - restart: int, restart parameter (default: no restart)
    - maxiter: int, maximum iterations (default: n)
    - M: sparse matrix or LinearOperator, preconditioner
    - callback: callable, function called after each iteration
    - residuals: list, stores residual norms if provided
    - orthog: str, orthogonalization method ('mgs' or 'householder'), passed via **kwargs
    
    Returns:
    tuple: (x, info) where:
        - x: array, solution vector
        - info: int, convergence flag:
            * 0: successful convergence
            * >0: convergence not achieved in maxiter iterations
    
    Raises:
    ValueError: if A and b have incompatible dimensions
    """

def gmres_householder(A, b, x0=None, tol=1e-5, restart=None,
                      maxiter=None, M=None, **kwargs):
    """
    GMRES with Householder orthogonalization.
    
    Variant of GMRES using Householder reflections for
    orthogonalization, providing better numerical stability
    at higher computational cost.
    
    Parameters: (same as gmres)
    
    Returns:
    tuple: (x, info)
    """

def gmres_mgs(A, b, x0=None, tol=1e-5, restart=None,
              maxiter=None, M=None, **kwargs):
    """
    GMRES with Modified Gram-Schmidt orthogonalization.
    
    GMRES variant using Modified Gram-Schmidt for 
    orthogonalization, balancing stability and efficiency.
    
    Parameters: (same as gmres)
    
    Returns:
    tuple: (x, info)
    """

def fgmres(A, b, x0=None, tol=1e-5, restart=None, maxiter=None,
           M=None, **kwargs):
    """
    Flexible GMRES method.
    
    Variant of GMRES that allows variable preconditioning,
    useful with inner-outer iteration schemes and adaptive
    preconditioning strategies.
    
    Parameters: (same as gmres)
    - M: can vary between iterations for flexible preconditioning
    
    Returns:
    tuple: (x, info)
    """

Usage Examples:

import pyamg
import numpy as np

# Basic GMRES solve
A = pyamg.gallery.poisson((100, 100))
b = np.random.rand(A.shape[0])
x, info = pyamg.krylov.gmres(A, b, tol=1e-8)
print(f"GMRES converged: {info == 0}")

# GMRES with AMG preconditioning
ml = pyamg.ruge_stuben_solver(A)
M = ml.aspreconditioner()
x, info = pyamg.krylov.gmres(A, b, M=M, tol=1e-10)

# GMRES with restart
x, info = pyamg.krylov.gmres(A, b, restart=30, maxiter=100)

# Monitor convergence
residuals = []
x, info = pyamg.krylov.gmres(A, b, residuals=residuals)
print(f"{len(residuals)} residual norms recorded")

# Flexible GMRES for variable preconditioning
x, info = pyamg.krylov.fgmres(A, b, M=M)

Conjugate Gradient Family

Methods for symmetric positive definite systems with optimal convergence properties.

def cg(A, b, x0=None, tol=1e-5, maxiter=None, M=None, 
       callback=None, residuals=None, **kwargs):
    """
    Conjugate Gradient method.
    
    Optimal Krylov method for symmetric positive definite
    systems. Requires fewer operations per iteration than
    GMRES and has theoretical convergence guarantees.
    
    Parameters:
    - A: sparse matrix or LinearOperator (must be SPD)
    - b: array-like, right-hand side vector
    - x0: array-like, initial guess
    - tol: float, convergence tolerance  
    - maxiter: int, maximum iterations (default: n)
    - M: sparse matrix or LinearOperator, SPD preconditioner
    - callback: callable, iteration callback function
    - residuals: list, stores residual norms
    
    Returns:
    tuple: (x, info)
    
    Raises:
    ValueError: if A is not SPD or dimensions are incompatible
    """

def cr(A, b, x0=None, tol=1e-5, maxiter=None, M=None, **kwargs):
    """
    Conjugate Residual method.
    
    Variant of CG that minimizes A-norm of residual rather
    than Euclidean norm. Can be more robust for indefinite
    systems but requires A to be symmetric.
    
    Parameters: (same as cg, but A need only be symmetric)
    
    Returns:
    tuple: (x, info)
    """

def cgnr(A, b, x0=None, tol=1e-5, maxiter=None, M=None, **kwargs):
    """
    Conjugate Gradient Normal Residual method.
    
    Applies CG to normal equations A^T A x = A^T b.
    Works for rectangular or singular systems but can
    have poor conditioning.
    
    Parameters:
    - A: sparse matrix (can be rectangular or singular)
    - b: array-like, right-hand side
    - M: preconditioner for A^T A system
    
    Returns:
    tuple: (x, info)
    """

def cgne(A, b, x0=None, tol=1e-5, maxiter=None, M=None, **kwargs):
    """
    Conjugate Gradient Normal Equation method.
    
    Applies CG to normal equations A A^T y = b, x = A^T y.
    Alternative formulation for least squares problems.
    
    Parameters: (same as cgnr)
    
    Returns:
    tuple: (x, info)
    """

Usage Examples:

# CG for SPD system
A = pyamg.gallery.poisson((80, 80))
b = np.random.rand(A.shape[0])
x, info = pyamg.krylov.cg(A, b, tol=1e-10)

# CG with AMG preconditioning (most efficient combination)
ml = pyamg.smoothed_aggregation_solver(A)
M = ml.aspreconditioner() 
x, info = pyamg.krylov.cg(A, b, M=M)

# Conjugate residual for a Hermitian (possibly indefinite) system
A_indef = pyamg.gallery.gauge_laplacian(50)  # gauge Laplacian on a 50x50 grid
b_indef = np.random.rand(A_indef.shape[0])
x, info = pyamg.krylov.cr(A_indef, b_indef)

# Normal equations for rectangular system
A_rect = np.random.rand(100, 80)
b_rect = np.random.rand(100)
x, info = pyamg.krylov.cgnr(A_rect, b_rect)

BiConjugate Methods

Methods for nonsymmetric systems using bi-orthogonalization.

def bicgstab(A, b, x0=None, tol=1e-5, maxiter=None, M=None,
             callback=None, **kwargs):
    """
    Biconjugate Gradient Stabilized method.
    
    Stabilized variant of BiCG that avoids erratic convergence
    behavior. Good general-purpose method for nonsymmetric
    systems when GMRES memory requirements are prohibitive.
    
    Parameters:
    - A: sparse matrix or LinearOperator (general matrix)
    - b: array-like, right-hand side vector
    - x0: array-like, initial guess
    - tol: float, convergence tolerance
    - maxiter: int, maximum iterations  
    - M: sparse matrix or LinearOperator, preconditioner
    - callback: callable, iteration callback
    
    Returns:
    tuple: (x, info)
    
    Notes:
    - Memory requirements: O(n) vs O(kn) for GMRES
    - Can experience irregular convergence behavior
    - May break down for certain matrices
    """

Usage Examples:

# BiCGSTAB for a nonsymmetric system (Poisson plus a random sparse perturbation)
import scipy.sparse as sp
A = (pyamg.gallery.poisson((60, 60)) + 0.1 * sp.random(3600, 3600, density=0.01)).tocsr()
b = np.random.rand(A.shape[0])
x, info = pyamg.krylov.bicgstab(A, b, tol=1e-8)

# With preconditioning
ml = pyamg.ruge_stuben_solver(A)
M = ml.aspreconditioner()
x, info = pyamg.krylov.bicgstab(A, b, M=M)

Other Iterative Methods

Additional Krylov and related methods for special cases.

def steepest_descent(A, b, x0=None, tol=1e-5, maxiter=None, **kwargs):
    """
    Steepest descent method.
    
    Simple gradient descent method, mainly for educational
    purposes. Very slow convergence but robust and simple.
    
    Parameters:
    - A: sparse matrix (should be SPD for convergence)
    - b: array-like, right-hand side
    - x0: array-like, initial guess
    - tol: float, convergence tolerance
    - maxiter: int, maximum iterations
    
    Returns:
    tuple: (x, info)
    
    Notes:
    - Convergence rate depends on condition number
    - Not recommended for practical use
    - Useful for teaching and algorithm comparison
    """

def minimal_residual(A, b, x0=None, tol=1e-5, maxiter=None,
                     M=None, **kwargs):
    """
    Minimal Residual method.
    
    Minimizes residual norm at each iteration without
    orthogonality constraints. Can be useful for certain
    problem classes.
    
    Parameters: (same as other Krylov methods)
    
    Returns:
    tuple: (x, info)
    """

Usage Examples:

# Steepest descent (educational)
A = pyamg.gallery.poisson((30, 30))
b = np.random.rand(A.shape[0])
x, info = pyamg.krylov.steepest_descent(A, b, maxiter=1000)

# Minimal residual
x, info = pyamg.krylov.minimal_residual(A, b)

Method Selection Guidelines

Problem Type Recommendations

  • SPD Systems: Conjugate Gradient with AMG preconditioning
  • General Symmetric: Conjugate Residual or GMRES
  • Nonsymmetric: GMRES (small-medium) or BiCGSTAB (large)
  • Indefinite: GMRES with appropriate preconditioning
  • Singular/Rectangular: CGNR or CGNE

Preconditioning Strategy

# Optimal combination: CG + SA-AMG for SPD
A, B = pyamg.gallery.linear_elasticity((40, 40))  # returns matrix and near-nullspace modes
b = np.random.rand(A.shape[0])
ml = pyamg.smoothed_aggregation_solver(A, B=B)
x, info = pyamg.krylov.cg(A, b, M=ml.aspreconditioner())

# General problems: GMRES + classical (Ruge-Stuben) AMG
A = pyamg.gallery.poisson((50, 50))
b = np.random.rand(A.shape[0])
ml = pyamg.ruge_stuben_solver(A)
x, info = pyamg.krylov.gmres(A, b, M=ml.aspreconditioner())

Performance Considerations

  • Memory: CG < BiCGSTAB < GMRES(restart) < GMRES(no restart)
  • Robustness: GMRES > BiCGSTAB > CG
  • Speed per iteration: CG > BiCGSTAB > GMRES
  • Preconditioning: Essential for practical performance

Convergence Monitoring

def monitor_convergence(xk):
    # pyamg invokes the callback with the current iterate after each iteration
    residual = np.linalg.norm(b - A @ xk)
    print(f"Iteration residual: {residual:.2e}")

x, info = pyamg.krylov.gmres(A, b, callback=monitor_convergence)

Common Issues

  • Stagnation: Try different method or better preconditioner
  • Breakdown: Switch from BiCGSTAB to GMRES
  • Memory: Use restarted GMRES or BiCGSTAB
  • Slow convergence: Improve preconditioning or initial guess

Install with Tessl CLI

npx tessl i tessl/pypi-pyamg
