CS 5220

Applications of Parallel Computers

Lumped parameter models

Prof David Bindel

Lumped parameter simulations

Examples include:

  • SPICE-level circuit simulation
    • nodal voltages vs. voltage distributions
  • Structural simulation
    • beam end displacements vs. continuum field
  • Chemical concentrations in stirred tank reactor
    • concentrations in tank vs. spatially varying concentrations

Lumped parameter simulations

  • Typically ordinary differential equations (ODEs)
  • Constraints: differential-algebraic equations (DAEs)

Often (not always) sparse.

Sparsity

Sparsity plot and associated graph

Consider ODEs \(x' = f(x)\) (special case: \(f(x) = Ax\))

  • Dependency graph: edge \((i,j)\) if \(f_j\) depends on \(x_i\)
  • Sparsity means each \(f_j\) depends on only a few \(x_i\)
  • Often arises from physical or logical locality (see the example below)
  • Corresponds to \(A\) being sparse (mostly zeros)
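
For example (a standard model problem), consider a 1D chain in which each component interacts only with its two neighbors:

$$ f_j(x) = x_{j-1} - 2 x_j + x_{j+1}. $$

Each row of \(A\) then has at most three nonzeros (\(A\) is tridiagonal), and the dependency graph is a path.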

Sparsity and partitioning

Sparsity plot and associated graph, partitioned

Want to partition sparse graphs so that

  • Subgraphs are the same size (load balance)
  • Cut size is minimal (minimize communication)

We’ll talk more about this later.

Types of analysis: Static

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: \(f(x_*) = 0\)

  • Boils down to \(Ax = b\) (e.g. for Newton-like steps; worked out below)
  • Can solve directly or iteratively
  • Sparsity matters a lot!
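
To spell out the Newton connection: for nonlinear \(f\), each Newton-like step for \(f(x_*) = 0\) solves a linear system with the Jacobian,

$$ f'(x_k)\, \Delta x_k = -f(x_k), \qquad x_{k+1} = x_k + \Delta x_k, $$

so the workhorse is an \(Ax = b\) solve with \(A = f'(x_k)\), which inherits the sparsity of the dependency graph.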

Types of analysis: Dynamic

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: \(x(t)\) for many \(t\)

  • Involves time stepping (explicit or implicit)
  • Implicit methods involve linear/nonlinear solves
  • Need to understand stiffness and stability issues

Types of analysis: Modal

Consider \(x' = f(x)\) (special case: \(f(x) = Ax + b\)).

Might want: eigenvalues of \(A\) or \(f'(x_*)\)
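
Why eigenvalues? Near an equilibrium \(x_*\), writing \(x = x_* + u\) and linearizing gives \(u' \approx f'(x_*)\, u\). If \(f'(x_*)\) is diagonalizable with eigenpairs \((\lambda_i, v_i)\), then

$$ u(t) = \sum_i c_i e^{\lambda_i t} v_i, $$

so eigenvalues with positive real part signal instability, and imaginary parts give oscillation frequencies.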

Explicit time stepping

  • Example: forward Euler $$ x_{k+1} = x_k + (\Delta t) f(x_k) $$
  • Next step depends only on earlier steps
  • Simple algorithms (sketched in C below)
  • May have stability/stiffness issues
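
Here is a minimal sketch of forward Euler in C for the linear case \(f(x) = Ax\); the caller-supplied matvec function pointer and its signature are illustrative, not from the slides.

    #include <stdlib.h>

    /* March x' = A*x with forward Euler: x_{k+1} = x_k + dt*A*x_k.
       The sparse matvec (y = A*x) is supplied by the caller. */
    void forward_euler(int n, double dt, int nsteps, double* x,
                       void (*matvec)(int n, const double* x, double* y))
    {
        double* Ax = (double*) malloc(n * sizeof(double));
        for (int k = 0; k < nsteps; ++k) {
            matvec(n, x, Ax);               /* Ax = A * x_k                 */
            for (int i = 0; i < n; ++i)     /* x_{k+1} = x_k + dt * A * x_k */
                x[i] += dt * Ax[i];
        }
        free(Ax);
    }

Note that the only interaction with \(A\) is through the matvec (for instance, a CSR kernel like the one sketched later), which is one reason the sparse matvec dominates the run time.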

Implicit time stepping

  • Example: backward Euler $$ x_{k+1} = x_k + (\Delta t) f(x_{k+1}) $$
  • Next step depends on itself and on earlier steps
  • Algorithms involve solves — complication, communication!
  • Can take larger time steps, but each step costs more (see the linear case below)
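
For the linear case \(f(x) = Ax + b\), the per-step cost is easy to see: backward Euler requires a linear solve at every step,

$$ x_{k+1} = x_k + \Delta t\,(A x_{k+1} + b) \quad\Longrightarrow\quad (I - \Delta t\, A)\, x_{k+1} = x_k + \Delta t\, b. $$

For nonlinear \(f\), each step is a nonlinear system, typically handled by Newton iteration with a sparse linear solve inside each Newton step.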

A common kernel

In all cases, lots of time in sparse matvec:

  • Iterative linear solvers: repeated sparse matvec
  • Iterative eigensolvers: repeated sparse matvec
  • Explicit time marching: matvecs at each step
  • Implicit time marching: iterative solves (involving matvecs)

We need to figure out how to make matvec fast!

An aside on sparse matrix storage

  • Sparse matrix \(\implies\) mostly zero entries
    • Can also have “data sparseness” — representation with less than \(O(n^2)\) storage, even if most entries nonzero
  • Could be implicit (e.g. directional differencing)
  • Sometimes explicit representation is useful
  • Easy to get lots of indirect indexing!
  • Compressed sparse storage schemes help

Example: Compressed sparse row

Illustration of compressed sparse row format
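
As a concrete sketch of how the format in the figure is used, here is a CSR matrix-vector product in C (the struct layout and field names are illustrative):

    /* Compressed sparse row (CSR): the nonzeros of row i occupy
       positions row_ptr[i] .. row_ptr[i+1]-1 of the val and col arrays. */
    typedef struct {
        int     n;        /* number of rows                 */
        int*    row_ptr;  /* row start offsets (length n+1) */
        int*    col;      /* column index of each nonzero   */
        double* val;      /* value of each nonzero          */
    } csr_matrix;

    /* y = A*x for a CSR matrix A */
    void csr_matvec(const csr_matrix* A, const double* x, double* y)
    {
        for (int i = 0; i < A->n; ++i) {
            double yi = 0.0;
            for (int jj = A->row_ptr[i]; jj < A->row_ptr[i+1]; ++jj)
                yi += A->val[jj] * x[A->col[jj]];  /* indirect access to x */
            y[i] = yi;
        }
    }

The access x[A->col[jj]] is exactly the indirect indexing mentioned above, and it is what makes sparse matvec performance sensitive to memory layout.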

This can be even more compact:

  • Could organize by blocks (block CSR)
  • Could compress column index data (16-bit vs 64-bit)
  • Various other optimizations — see OSKI

Summary

  • ODE and DAE models widely used in engineering
  • Different analyses: static, dynamic, modal
  • Sparse linear algebra is often key