Study guide for Midterm 2

Here is a non-exhaustive list of questions you should be able to answer as you prepare for the second midterm. The midterm will cover chapters 1-4.

Linear Least Squares

  • How can you solve a least-squares problem using the SVD?
  • Given an SVD of the matrix and a right-hand side, how would you find the 2-norm of the residual of a least-squares problem?
  • How can least squares problems be solved via the normal equations? What are the advantages and disadvantages of that?
  • On what factors does the conditioning of a linear least-squares problem depend?
  • What is a QR factorization, when does it exist, and is it unique?
  • What is an orthogonal matrix?
  • What is a projection matrix?
  • What are the classical and modified Gram-Schmidt processes? What can you say about their stability?
  • What is a Householder reflector matrix, what properties does it have and how is it computed?
  • What is the Householder QR factorization algorithm, and what can you say about its stability?
  • How many Householder iterations are needed for a QR factorization and what is the total cost?
  • What is a Givens rotation matrix, what properties does it have and how is it computed?
  • How can Givens rotations be used to factorize a sparse matrix?
  • How does one solve linear least squares problems using a QR factorization?
  • What is Cholesky QR and how is it used for solving a linear least squares problem?
  • How can you solve a rank-deficient linear least squares problem?
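As a study aid, here is a minimal sketch (assuming NumPy is available) comparing three of the solution approaches asked about above: solving min ||Ax - b||_2 via the SVD, via a QR factorization, and via the normal equations, and computing the 2-norm of the residual both directly and from the SVD.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((8, 3))   # overdetermined, full-rank system
b = rng.standard_normal(8)

# SVD approach: with the thin SVD A = U S V^T, x = V diag(1/s) U^T b
U, s, Vt = np.linalg.svd(A, full_matrices=False)
x_svd = Vt.T @ ((U.T @ b) / s)

# QR approach: A = QR, then solve the triangular system R x = Q^T b
Q, R = np.linalg.qr(A)
x_qr = np.linalg.solve(R, Q.T @ b)

# Normal equations: (A^T A) x = A^T b -- cheaper, but squares the condition number
x_ne = np.linalg.solve(A.T @ A, A.T @ b)

# For this well-conditioned A, all three agree with the reference solver
x_ref = np.linalg.lstsq(A, b, rcond=None)[0]

# 2-norm of the residual: directly as ||b - Ax||_2, and via the SVD using
# ||r||_2^2 = ||b||^2 - ||U^T b||^2 (the residual is orthogonal to range(A))
r_direct = np.linalg.norm(b - A @ x_svd)
r_svd = np.sqrt(np.linalg.norm(b) ** 2 - np.linalg.norm(U.T @ b) ** 2)
print(r_direct, r_svd)
```

The matrix and right-hand side here are made up for illustration; the point is that the residual norm can be read off from the SVD without forming x at all, which is one of the questions above.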

Eigenvalue Problems

  • What is an eigenvector? an eigenvalue of a matrix? (i.e. know the definition)
  • What is a similarity transformation?
  • What is the relationship between the SVD and the eigenvalue decomposition?
  • When are eigenvectors linearly independent?
  • What are the Jordan and Schur forms?
  • What is a normal matrix? a defective matrix? a diagonalizable matrix?
  • What is an eigenvalue multiplicity? what is a complex eigenvalue pair?
  • What is power iteration? What can be obtained using power iteration?
  • What is normalized power iteration? What problem does it address?
  • Given an approximate eigenvector, how can you estimate eigenvalues?
  • What is the Rayleigh Quotient? Rayleigh Quotient iteration?
  • What are inverse iteration and shifted inverse iteration? Which eigenvalue does each find?
  • What is the conditioning of eigenvalues and eigenvectors in an eigenvalue problem?
  • What are the eigenvalues of an upper triangular matrix and how do you compute the eigenvectors?
  • What is orthogonal iteration? QR iteration? how are they related?
  • How can one reduce a matrix to Hessenberg form and why is it helpful?
  • How can one incorporate shifting into QR iteration?
  • What is deflation?
  • What is a Krylov subspace? how is it related to a Companion matrix?
  • What is the Arnoldi method? the Lanczos method?
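As a study aid, here is a minimal sketch (assuming NumPy) of normalized power iteration and the Rayleigh quotient from the questions above: the iteration converges to an eigenvector of the dominant (largest-magnitude) eigenvalue, and the Rayleigh quotient x^T A x / x^T x estimates that eigenvalue. The example matrix is made up for illustration.

```python
import numpy as np

A = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 2.0]])    # symmetric, so eigenvalues are real

x = np.ones(3)                    # arbitrary starting vector
for _ in range(100):
    x = A @ x
    x /= np.linalg.norm(x)        # normalization prevents overflow/underflow

rayleigh = x @ A @ x              # x has unit norm, so no division is needed

# Compare against the dominant eigenvalue from a dense symmetric eigensolver
lam_max = np.max(np.linalg.eigvalsh(A))
print(rayleigh, lam_max)
```

Dropping the normalization step illustrates the problem that normalized power iteration addresses: the iterates grow (or shrink) geometrically and eventually overflow (or underflow).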