Yousef Saad -- PWS Publishing -- 1996

• Preface , xiii
• Acknowledgments, xiv
• Suggestions for Teaching, xv

• CHAPTER 1: BACKGROUND IN LINEAR ALGEBRA , 1
• 1.1 Matrices , 1
• 1.2 Square Matrices and Eigenvalues , 3
• 1.3 Types of Matrices , 4
• 1.4 Vector Inner Products and Norms , 6
• 1.5 Matrix Norms , 8
• 1.6 Subspaces, Range, and Kernel , 9
• 1.7 Orthogonal Vectors and Subspaces , 10
• 1.8 Canonical Forms of Matrices , 15
• 1.8.1 Reduction to the Diagonal Form, 15
• 1.8.2 The Jordan Canonical Form, 16
• 1.8.3 The Schur Canonical Form, 17
• 1.8.4 Application to Powers of Matrices, 19
• 1.9 Normal and Hermitian Matrices , 21
• 1.9.1 Normal Matrices, 21
• 1.9.2 Hermitian Matrices, 24
• 1.10 Nonnegative Matrices, M-Matrices , 26
• 1.11 Positive-Definite Matrices , 30
• 1.12 Projection Operators , 33
• 1.12.1 Range and Null Space of a Projector, 33
• 1.12.2 Matrix Representations, 35
• 1.12.3 Orthogonal and Oblique Projectors, 35
• 1.12.4 Properties of Orthogonal Projectors, 37
• 1.13 Basic Concepts in Linear Systems , 38
• 1.13.1 Existence of a Solution, 38
• 1.13.2 Perturbation Analysis, 39
• Exercises and Notes , 41

• CHAPTER 2: DISCRETIZATION OF PDEs , 44
• 2.1 Partial Differential Equations , 44
• 2.1.1 Elliptic Operators, 45
• 2.1.2 The Convection Diffusion Equation, 47
• 2.2 Finite Difference Methods , 47
• 2.2.1 Basic Approximations, 48
• 2.2.2 Difference Schemes for the Laplacean Operator, 49
• 2.2.3 Finite Differences for 1-D Problems, 51
• 2.2.4 Upwind Schemes, 51
• 2.2.5 Finite Differences for 2-D Problems, 54
• 2.3 The Finite Element Method , 55
• 2.4 Mesh Generation and Refinement , 61
• 2.5 Finite Volume Method , 63
• Exercises and Notes , 66

• CHAPTER 3: SPARSE MATRICES , 68
• 3.1 Introduction , 68
• 3.2 Graph Representations , 70
• 3.2.1 Graphs and Adjacency Graphs, 70
• 3.2.2 Graphs of PDE Matrices, 72
• 3.3 Permutations and Reorderings , 72
• 3.3.1 Basic Concepts, 72
• 3.3.2 Relations with the Adjacency Graph, 75
• 3.3.3 Common Reorderings, 75
• 3.3.4 Irreducibility, 83
• 3.4 Storage Schemes , 84
• 3.5 Basic Sparse Matrix Operations , 87
• 3.6 Sparse Direct Solution Methods , 88
• 3.7 Test Problems , 88
• Exercises and Notes , 91

• CHAPTER 4: BASIC ITERATIVE METHODS , 95
• 4.1 Jacobi, Gauss-Seidel, and SOR , 95
• 4.1.1 Block Relaxation Schemes, 98
• 4.1.2 Iteration Matrices and Preconditioning, 102
• 4.2 Convergence , 104
• 4.2.1 General Convergence Result, 104
• 4.2.2 Regular Splittings, 107
• 4.2.3 Diagonally Dominant Matrices, 108
• 4.2.4 Symmetric Positive Definite Matrices, 112
• 4.2.5 Property A and Consistent Orderings, 112
• 4.3 Alternating Direction Methods , 116
• Exercises and Notes , 119

• CHAPTER 5: PROJECTION METHODS , 122
• 5.1 Basic Definitions and Algorithms , 122
• 5.1.1 General Projection Methods, 123
• 5.1.2 Matrix Representation, 124
• 5.2 General Theory , 126
• 5.2.1 Two Optimality Results, 126
• 5.2.2 Interpretation in Terms of Projectors, 127
• 5.2.3 General Error Bound, 129
• 5.3 One-Dimensional Projection Processes , 131
• 5.3.1 Steepest Descent, 132
• 5.3.2 Minimal Residual (MR) Iteration, 134
• 5.3.3 Residual Norm Steepest Descent, 136
• 5.4 Additive and Multiplicative Processes , 136
• Exercises and Notes , 139

• CHAPTER 6: KRYLOV SUBSPACE METHODS -- PART I , 144
• 6.1 Introduction , 144
• 6.2 Krylov Subspaces , 145
• 6.3 Arnoldi's Method , 147
• 6.3.1 The Basic Algorithm, 147
• 6.3.2 Practical Implementations, 149
• 6.4 Arnoldi's Method for Linear Systems (FOM) , 152
• 6.4.1 Variation 1: Restarted FOM, 154
• 6.4.2 Variation 2: IOM and DIOM, 155
• 6.5 GMRES , 158
• 6.5.1 The Basic GMRES Algorithm, 158
• 6.5.2 The Householder Version, 159
• 6.5.3 Practical Implementation Issues, 161
• 6.5.4 Breakdown of GMRES, 165
• 6.5.5 Relations between FOM and GMRES, 165
• 6.5.6 Variation 1: Restarting, 168
• 6.5.7 Variation 2: Truncated GMRES Versions, 169
• 6.6 The Symmetric Lanczos Algorithm , 174
• 6.6.1 The Algorithm, 174
• 6.6.2 Relation with Orthogonal Polynomials, 175
• 6.7 The Conjugate Gradient Algorithm , 176
• 6.7.1 Derivation and Theory, 176
• 6.7.2 Alternative Formulations, 180
• 6.7.3 Eigenvalue Estimates from the CG Coefficients, 181
• 6.8 The Conjugate Residual Method , 183
• 6.9 GCR, ORTHOMIN, and ORTHODIR , 183
• 6.10 The Faber-Manteuffel Theorem , 186
• 6.11 Convergence Analysis , 188
• 6.11.1 Real Chebyshev Polynomials, 188
• 6.11.2 Complex Chebyshev Polynomials, 189
• 6.11.3 Convergence of the CG Algorithm, 193
• 6.11.4 Convergence of GMRES, 194
• 6.12 Block Krylov Methods , 197
• Exercises and Notes , 202

• CHAPTER 7: KRYLOV SUBSPACE METHODS -- PART II , 205
• 7.1 Lanczos Biorthogonalization , 205
• 7.1.1 The Algorithm, 205
• 7.1.2 Practical Implementations, 208
• 7.2 The Lanczos Algorithm for Linear Systems , 210
• 7.3 The BCG and QMR Algorithms , 210
• 7.3.1 The Biconjugate Gradient Algorithm, 211
• 7.3.2 Quasi-Minimal Residual Algorithm, 212
• 7.4 Transpose-Free Variants , 214
• 7.4.1 Conjugate Gradient Squared, 215
• 7.4.2 BICGSTAB, 217
• 7.4.3 Transpose-Free QMR (TFQMR), 221
• Exercises and Notes , 227

• CHAPTER 8: METHODS RELATED TO THE NORMAL EQUATIONS , 230
• 8.1 The Normal Equations , 230
• 8.2 Row Projection Methods , 232
• 8.2.1 Gauss-Seidel on the Normal Equations, 232
• 8.2.2 Cimmino's Method, 234
• 8.3 Conjugate Gradient and Normal Equations , 237
• 8.3.1 CGNR, 237
• 8.3.2 CGNE, 238
• 8.4 Saddle-Point Problems , 240
• Exercises and Notes , 243

• CHAPTER 9: PRECONDITIONED ITERATIONS , 245
• 9.1 Introduction , 245
• 9.2 Preconditioned Conjugate Gradient , 246
• 9.2.1 Preserving Symmetry, 246
• 9.2.2 Efficient Implementations, 249
• 9.3 Preconditioned GMRES , 251
• 9.3.1 Left-Preconditioned GMRES, 251
• 9.3.2 Right-Preconditioned GMRES, 253
• 9.3.3 Split Preconditioning, 254
• 9.3.4 Comparison of Right and Left Preconditioning, 255
• 9.4 Flexible Variants , 256
• 9.4.1 Flexible GMRES, 256
• 9.4.2 DQGMRES, 259
• 9.5 Preconditioned CG for the Normal Equations , 260
• 9.6 The CGW Algorithm , 261
• Exercises and Notes , 263

• CHAPTER 10: PRECONDITIONING TECHNIQUES , 265
• 10.1 Introduction , 265
• 10.2 Jacobi, SOR, and SSOR Preconditioners , 266
• 10.3 ILU Factorization Preconditioners , 269
• 10.3.1 Incomplete LU Factorizations, 270
• 10.3.2 Zero Fill-in ILU (ILU(0)), 275
• 10.3.3 Level of Fill and ILU(p), 278
• 10.3.4 Matrices with Regular Structure, 281
• 10.3.5 Modified ILU (MILU), 286
• 10.4 Threshold Strategies and ILUT , 287
• 10.4.1 The ILUT Approach, 288
• 10.4.2 Analysis, 289
• 10.4.3 Implementation Details, 292
• 10.4.4 The ILUTP Approach, 294
• 10.4.5 The ILUS Approach, 296
• 10.5 Approximate Inverse Preconditioners , 298
• 10.5.1 Approximating the Inverse of a Sparse Matrix, 299
• 10.5.2 Global Iteration, 299
• 10.5.3 Column-Oriented Algorithms, 301
• 10.5.4 Theoretical Considerations, 303
• 10.5.5 Convergence of Self Preconditioned MR, 305
• 10.5.6 Factored Approximate Inverses, 307
• 10.5.7 Improving a Preconditioner, 310
• 10.6 Block Preconditioners , 310
• 10.6.1 Block-Tridiagonal Matrices, 311
• 10.6.2 General Matrices, 312
• 10.7 Preconditioners for the Normal Equations , 313
• 10.7.1 Jacobi, SOR, and Variants, 313
• 10.7.2 IC(0) for the Normal Equations, 314
• 10.7.3 Incomplete Gram-Schmidt and ILQ, 316
• Exercises and Notes , 319

• CHAPTER 11: PARALLEL IMPLEMENTATIONS , 324
• 11.1 Introduction , 324
• 11.2 Forms of Parallelism , 325
• 11.2.1 Multiple Functional Units, 325
• 11.2.2 Pipelining, 326
• 11.2.3 Vector Processors, 326
• 11.2.4 Multiprocessing and Distributed Computing, 326
• 11.3 Types of Parallel Architectures , 327
• 11.3.1 Shared Memory Computers, 327
• 11.3.2 Distributed Memory Architectures, 329
• 11.4 Types of Operations , 331
• 11.4.1 Preconditioned CG, 332
• 11.4.2 GMRES, 332
• 11.4.3 Vector Operations, 333
• 11.4.4 Reverse Communication, 334
• 11.5 Matrix-by-Vector Products , 335
• 11.5.1 The Case of Dense Matrices, 335
• 11.5.2 The CSR and CSC Formats, 336
• 11.5.3 Matvecs in the Diagonal Format, 339
• 11.5.4 The Ellpack-Itpack Format, 340
• 11.5.5 The Jagged Diagonal Format, 341
• 11.5.6 The Case of Distributed Sparse Matrices, 342
• 11.6 Standard Preconditioning Operations , 345
• 11.6.1 Parallelism in Forward Sweeps, 346
• 11.6.2 Level Scheduling: the Case of 5-Point Matrices, 346
• 11.6.3 Level Scheduling for Irregular Graphs, 347
• Exercises and Notes , 350

• CHAPTER 12: PARALLEL PRECONDITIONERS , 353
• 12.1 Introduction , 353
• 12.2 Block-Jacobi Preconditioners , 354
• 12.3 Polynomial Preconditioners , 356
• 12.3.1 Neumann Polynomials, 356
• 12.3.2 Chebyshev Polynomials, 357
• 12.3.3 Least-Squares Polynomials, 360
• 12.3.4 The Nonsymmetric Case, 363
• 12.4 Multicoloring , 365
• 12.4.1 Red-Black Ordering, 366
• 12.4.2 Solution of Red-Black Systems, 367
• 12.4.3 Multicoloring for General Sparse Matrices, 368
• 12.5 Multi-Elimination ILU , 369
• 12.5.1 Multi-Elimination, 370
• 12.5.2 ILUM, 371
• 12.6 Distributed ILU and SSOR , 374
• 12.6.1 Distributed Sparse Matrices, 374
• 12.7 Other Techniques , 376
• 12.7.1 Approximate Inverses, 377
• 12.7.2 Element-by-Element Techniques, 377
• 12.7.3 Parallel Row Projection Preconditioners, 379
• Exercises and Notes , 380

• CHAPTER 13: DOMAIN DECOMPOSITION METHODS , 383
• 13.1 Introduction , 383
• 13.1.1 Notation, 384
• 13.1.2 Types of Partitionings, 385
• 13.1.3 Types of Techniques, 386
• 13.2 Direct Solution and the Schur Complement , 388
• 13.2.1 Block Gaussian Elimination, 388
• 13.2.2 Properties of the Schur Complement, 389
• 13.2.3 Schur Complement for Vertex-Based Partitionings, 390
• 13.2.4 Schur Complement for Finite-Element Partitionings, 393
• 13.3 Schwarz Alternating Procedures , 395
• 13.3.1 Multiplicative Schwarz Procedure, 395
• 13.3.2 Multiplicative Schwarz Preconditioning, 400
• 13.3.3 Additive Schwarz Procedure, 402
• 13.3.4 Convergence, 404
• 13.4 Schur Complement Approaches , 408
• 13.4.1 Induced Preconditioners, 408
• 13.4.2 Probing, 410
• 13.4.3 Preconditioning Vertex-Based Schur Complements, 411
• 13.5 Full Matrix Methods , 412
• 13.6 Graph Partitioning , 414
• 13.6.1 Basic Definitions, 414
• 13.6.2 Geometric Approach, 415
• 13.6.3 Spectral Techniques, 417
• 13.6.4 Graph Theory Techniques, 418
• Exercises and Notes , 422
• References, 425
• Index, 439