Study guide for Midterm 3

Here is a non-exhaustive list of questions you should be able to answer as you prepare for the third midterm. The midterm will cover material from the latter part of chapter 4 through chapter 7, with some minor review questions from chapters 1-4.

Eigenvalue problems

  • What is a Krylov subspace? How is it related to a Companion matrix?
  • What is the Arnoldi method? What is the Lanczos method? (See the Arnoldi sketch after this list.)
  • What is the cost of an iteration in Arnoldi and Lanczos?
  • What is a generalized eigenvalue problem? How is Cholesky factorization used when B is SPD?
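
For reference, here is a minimal sketch of the Arnoldi iteration in Python/NumPy (the matrix A, starting vector b, and subspace dimension k are hypothetical inputs): it builds an orthonormal basis Q of the Krylov subspace span{b, Ab, ..., A^(k-1) b} with one matrix-vector product per step, together with the small upper Hessenberg matrix H.

```python
import numpy as np

def arnoldi(A, b, k):
    """Sketch of the Arnoldi iteration (hypothetical helper, not course code)."""
    n = len(b)
    Q = np.zeros((n, k + 1))
    H = np.zeros((k + 1, k))
    Q[:, 0] = b / np.linalg.norm(b)
    for j in range(k):
        v = A @ Q[:, j]                       # one matrix-vector product per iteration
        for i in range(j + 1):                # modified Gram-Schmidt against earlier basis vectors
            H[i, j] = Q[:, i] @ v
            v = v - H[i, j] * Q[:, i]
        H[j + 1, j] = np.linalg.norm(v)
        if H[j + 1, j] < 1e-12:               # breakdown: the Krylov subspace is invariant
            return Q[:, :j + 1], H[:j + 1, :j + 1]
        Q[:, j + 1] = v / H[j + 1, j]
    return Q, H
```

The eigenvalues of the leading k-by-k block of H (the Ritz values) approximate eigenvalues of A; when A is symmetric, the orthogonalization loop collapses to a three-term recurrence, which is the Lanczos method.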

Nonlinear Equations

  • What is a nonlinear equation and how is it different from a linear equation? Give examples.
  • In general, what form do we want for our nonlinear equation and what exactly are we looking for in terms of solutions?
  • Given a system of nonlinear equations, how many solutions are there? Can this be estimated a priori?
  • What is a fixed point of a function?
  • Define multiplicity and state it mathematically.
  • What is the condition number of root finding? Can you use any form of conditioning (i.e. relative or absolute)?
  • What is the Jacobian? What is the Jacobian at the root of the nonlinear function?
  • What is the formal definition of a convergence rate? How does one classify different kinds of convergence rates?
  • What is interval bisection? What is its convergence rate? Given a tolerance, can you tell in advance how many iterations it needs? (See the bisection sketch after this list.)
  • For a given function, how many fixed point problems can you use to find a root?
  • What condition do you need for your fixed point function so that it is locally convergent?
  • What happens to a fixed-point scheme if the derivative in the previous condition is zero at the fixed point?
  • What is Newton's method? State it mathematically. Define its convergence rate. Is it always this rate? (See the Newton/secant sketch after this list.)
  • What is the secant method? State it mathematically. Define its convergence rate. Is it always this rate?
  • When would you prefer to use the secant method over Newton's method and vice versa?
  • What is inverse interpolation? Why is it useful for root finding problems in one dimension?
  • What is a safeguarding method?
  • What methods can be used to find the roots of a polynomial?
  • State Newton's method in multiple dimensions and define conditions for convergence.
  • State Broyden's method in multiple dimensions and define conditions for convergence.
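
As a concrete companion to the bisection question above, here is a minimal sketch (the function f, bracket [a, b], and tolerance tol are hypothetical inputs). Because the bracket halves every step, the number of iterations needed to reach a width tol is known in advance, roughly ceil(log2((b - a)/tol)), which is linear convergence with rate 1/2.

```python
import math

def bisect(f, a, b, tol=1e-8):
    """Interval bisection on [a, b]; assumes f(a) and f(b) have opposite signs."""
    if f(a) * f(b) > 0:
        raise ValueError("f(a) and f(b) must have opposite signs")
    # The bracket width after k steps is (b - a) / 2**k, so the iteration
    # count needed for a given tolerance can be computed up front:
    n_iter = math.ceil(math.log2((b - a) / tol))
    for _ in range(n_iter):
        m = 0.5 * (a + b)
        if f(a) * f(m) <= 0:   # sign change in [a, m]: keep the left half
            b = m
        else:                  # otherwise the root is in [m, b]
            a = m
    return 0.5 * (a + b)

# Example: approximates sqrt(2) as the root of x^2 - 2 on [1, 2]
print(bisect(lambda x: x**2 - 2, 1.0, 2.0))
```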
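
And a minimal 1D sketch of Newton's method and the secant method (f, the derivative df, the starting guesses, and the tolerances are all hypothetical inputs). Newton needs the derivative and converges quadratically near a simple root; the secant method replaces the derivative with the slope through the last two iterates and converges superlinearly (rate about 1.618).

```python
def newton(f, df, x0, tol=1e-12, max_iter=50):
    """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
    x = x0
    for _ in range(max_iter):
        step = f(x) / df(x)
        x -= step
        if abs(step) < tol:
            break
    return x

def secant(f, x0, x1, tol=1e-12, max_iter=50):
    """Secant method: replace f'(x_k) with the finite-difference slope."""
    f0, f1 = f(x0), f(x1)
    for _ in range(max_iter):
        x2 = x1 - f1 * (x1 - x0) / (f1 - f0)
        x0, f0 = x1, f1
        x1, f1 = x2, f(x2)
        if abs(x1 - x0) < tol:
            break
    return x1

# Example: both solve x^3 - x - 2 = 0 (root near 1.52)
print(newton(lambda x: x**3 - x - 2, lambda x: 3 * x**2 - 1, 1.5))
print(secant(lambda x: x**3 - x - 2, 1.0, 2.0))
```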

Optimization

  • What is the difference between linear and nonlinear programming?
  • What is a convex function? What is a local/global minimum? What is a coercive function?
  • What is a critical point? What is the relation between critical point and local/global minimum?
  • Given an unconstrained optimization problem, what are the necessary conditions for a minimum?
  • Given a constrained optimization problem, what are the necessary conditions for a minimum?
  • What is the Lagrange function (Lagrangian) of a constrained problem? What are the necessary conditions for a critical point of the Lagrange function?
  • What is the general sensitivity and conditioning of optimization problems?
  • What is the golden section search method? Under what conditions can it be used? What is its convergence rate? (See the golden section sketch after this list.)
  • What is successive parabolic interpolation? What is its convergence rate?
  • What is the steepest descent method? What is its convergence rate? What are its advantages and disadvantages? (See the steepest descent sketch after this list.)
  • What is safeguarding in 1D optimization?
  • What is Newton’s method (1D and $n$D)? What is its convergence rate? What is its restriction?
  • What are Quasi-Newton methods? What are their advantages over Newton's method and in general? What is their convergence rate in general?
  • What is the BFGS method at a high level? What are its advantages? What is its convergence rate in general?
  • What is the conjugate gradient method at a high level? What advantages does it have?
  • What is the Gauss-Newton method? What is the Levenberg-Marquardt method? What are their advantages? What properties do they have?
  • What is sequential quadratic programming?
  • What is the active set method?
  • What are penalty functions? What are barrier functions?
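
Here is a minimal sketch of golden section search (the function f, bracket [a, b], and tolerance are hypothetical inputs). It assumes f is unimodal on [a, b], needs no derivatives, reuses one interior function value per step, and shrinks the bracket by the constant factor (sqrt(5) - 1)/2 ≈ 0.618, i.e. linear convergence.

```python
import math

def golden_section(f, a, b, tol=1e-8):
    """Golden section search for a minimum of a unimodal f on [a, b]."""
    tau = (math.sqrt(5) - 1) / 2              # bracket shrink factor, about 0.618
    x1, x2 = a + (1 - tau) * (b - a), a + tau * (b - a)
    f1, f2 = f(x1), f(x2)
    while b - a > tol:
        if f1 > f2:                           # minimum lies in [x1, b]
            a, x1, f1 = x1, x2, f2            # reuse x2 as the new left interior point
            x2 = a + tau * (b - a)
            f2 = f(x2)
        else:                                 # minimum lies in [a, x2]
            b, x2, f2 = x2, x1, f1            # reuse x1 as the new right interior point
            x1 = a + (1 - tau) * (b - a)
            f1 = f(x1)
    return 0.5 * (a + b)

# Example: minimum of (x - 2)^2 + 1 on [0, 5]
print(golden_section(lambda x: (x - 2)**2 + 1, 0.0, 5.0))
```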
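
And a minimal steepest descent sketch with a backtracking (Armijo) line search (f, its gradient grad, the starting point, and the step-size constants are hypothetical choices). It needs only gradients, but converges linearly and slows down when the Hessian is ill conditioned, which is the usual contrast with Newton and Quasi-Newton methods.

```python
import numpy as np

def steepest_descent(f, grad, x0, tol=1e-6, max_iter=10000):
    """Steepest descent: step along -grad f, with a simple backtracking line search."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        g = grad(x)
        if np.linalg.norm(g) < tol:           # stop when the gradient is (nearly) zero
            break
        alpha = 1.0
        # Backtracking: halve alpha until a sufficient-decrease (Armijo) condition holds
        while f(x - alpha * g) > f(x) - 1e-4 * alpha * (g @ g):
            alpha *= 0.5
        x = x - alpha * g
    return x

# Example: ill-conditioned quadratic f(x, y) = x^2 + 10*y^2 with minimum at the origin
f = lambda x: x[0]**2 + 10 * x[1]**2
grad = lambda x: np.array([2 * x[0], 20 * x[1]])
print(steepest_descent(f, grad, [3.0, 1.0]))
```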

Interpolation

  • State the interpolation problem and some of its applications.
  • What are basis functions, in a general sense?
  • How many different smooth interpolants can you have for a given set of data? What about piecewise interpolants?
  • How does the choice of points and basis function affect the conditioning of the Vandermonde system? (See the interpolation sketch after this list.)
  • What are advantages and disadvantages of the monomial, Lagrange and Newton bases?
  • What are orthogonal polynomials? How can polynomials be orthogonalized?
  • What is the advantage of Chebyshev points over uniformly spaced points? How do they relate to Chebyshev polynomials?
  • On what does the worst case error for polynomial interpolation depend and how can it be reduced?
  • What is the difference between Hermite interpolation and cubic spline interpolation?
  • How many conditions/equations and unknowns are associated with a spline interpolant?
  • How do B-splines use basis functions for piecewise interpolation?
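
Finally, a minimal sketch tied to the Vandermonde-conditioning and Chebyshev-point questions above (the degree n = 20, the Runge function, and the use of scipy's BarycentricInterpolator are illustrative choices, not part of the course notes). It compares the conditioning of the monomial-basis Vandermonde matrix and the worst-case interpolation error of the Runge function for equally spaced versus Chebyshev points.

```python
import numpy as np
from scipy.interpolate import BarycentricInterpolator

runge = lambda x: 1.0 / (1.0 + 25.0 * x**2)     # classic Runge example on [-1, 1]

n = 20                                          # polynomial degree (illustrative choice)
x_uni = np.linspace(-1.0, 1.0, n + 1)           # equally spaced points
x_cheb = np.cos(np.pi * np.arange(n + 1) / n)   # Chebyshev (extreme) points

xx = np.linspace(-1.0, 1.0, 2001)               # fine grid to estimate the max error
for name, pts in [("uniform", x_uni), ("Chebyshev", x_cheb)]:
    cond = np.linalg.cond(np.vander(pts))       # monomial-basis Vandermonde conditioning
    p = BarycentricInterpolator(pts, runge(pts))
    err = np.max(np.abs(runge(xx) - p(xx)))     # worst-case interpolation error
    print("%-9s points: cond(V) = %.1e, max |f - p| = %.1e" % (name, cond, err))
```

With uniform points the error for this example grows near the interval endpoints (the Runge phenomenon), while Chebyshev points keep it small; switching from the monomial basis to a better-conditioned representation helps the linear-system conditioning further.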