Study guide for Midterm 2

Here is a non-exhaustive list of questions you should be able to answer as you prepare for the second midterm. The midterm will cover chapters 5-7, with some minor review questions from chapters 1-4.

Nonlinear Equations

  • What is a nonlinear equation and how is it different from a linear equation? Give examples
  • In general, what form do we want for our nonlinear equation, and what exactly are we looking for in terms of solutions?
  • Given a nonlinear system of equations, how many solutions do you have? Can this be estimated a priori?
  • What is a fixed point of a function?
  • Define multiplicity and state it mathematically
  • What is the condition number of root finding? Can you use any form of conditioning (i.e. relative or absolute)? Explain
  • What is the Jacobian? Evaluated at the root, what can we expect the Jacobian to be?
  • Define convergence rate and state the formula. Classify the different kinds of convergence rates and give examples
  • What is interval bisection? What is its convergence rate? Given a tolerance, can you tell in advance how many iterations it needs? (A short code sketch of bisection and Newton's method follows this list.)
  • For a given function, how many fixed point problems can you use to find a root?
  • What condition do you need for your fixed point function so that it is locally convergent?
  • What happens to a fixed point scheme if the quantity in the previous condition is zero at the root?
  • What is Newton's method? State it mathematically. Define its convergence rate. Is it always this rate?
  • What is the secant method? State it mathematically. Define its convergence rate. Is it always this rate?
  • When would you prefer to use the secant method over Newton's method, and vice versa?
  • What is inverse interpolation? Is it always a good idea to fit a higher-order polynomial in this fashion?
  • Explain what a safeguarded method is
  • Describe some methods to find the roots of a polynomial
  • State Newton's method in multiple dimensions and define conditions for convergence
  • State Broyden's method in multiple dimensions and define conditions for convergence
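
To make the 1D root-finding questions concrete, here is a minimal Python sketch (a study aid, not taken from the course notes) of interval bisection and Newton's method applied to the sample function f(x) = x^2 - 2; the test function, tolerance, and iteration caps are illustrative assumptions.

    import math

    def bisection(f, a, b, tol=1e-10, max_iter=100):
        """Interval bisection: halve [a, b] while keeping a sign change of f."""
        fa = f(a)
        assert fa * f(b) < 0, "f must change sign on [a, b]"
        for _ in range(max_iter):
            m = (a + b) / 2
            fm = f(m)
            if fm == 0 or (b - a) / 2 < tol:
                return m
            if fa * fm < 0:            # root lies in [a, m]
                b = m
            else:                      # root lies in [m, b]
                a, fa = m, fm
        return (a + b) / 2

    def newton(f, df, x0, tol=1e-10, max_iter=50):
        """Newton's method: x_{k+1} = x_k - f(x_k) / f'(x_k)."""
        x = x0
        for _ in range(max_iter):
            step = f(x) / df(x)
            x -= step
            if abs(step) < tol:
                return x
        return x

    f = lambda x: x**2 - 2             # simple root at sqrt(2)
    df = lambda x: 2 * x
    print(bisection(f, 1.0, 2.0))      # linear convergence: about one bit per step
    print(newton(f, df, 1.0))          # quadratic convergence near a simple root
    print(math.sqrt(2))                # reference value

Because the bisection bracket halves at every step, the number of iterations needed to reach a given tolerance can be predicted in advance from the initial bracket width.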

Optimization

  • What is the defining difference between linear and nonlinear programming?
  • What is a convex function? What is a local/global minimum? What is a coercive function?
  • What is a critical point? What is the relation between a critical point and a local/global minimum?
  • Given an unconstrained optimization problem, what is the requirement for a critical point to be a minimum?
  • Given a constrained optimization problem, what is the necessary condition for a minimum?
  • What is the Lagrange function (form) of a constrained problem? What is the necessary condition for a critical point of the Lagrange function? For inequality constraints, what are the KKT optimality conditions?
  • What can be said in general about the sensitivity and conditioning of an optimization problem?
  • What is the golden section search method? Under what conditions can it be used? What is its convergence rate? (A code sketch follows this list.)
  • What is Successive Parabolic Interpolation? What is its convergence rate?
  • What is the Steepest Descent Method? How do you compute the step length alpha? What is its convergence rate? What are its advantages and disadvantages? (A second code sketch follows this list.)
  • What are Safeguarded Methods?
  • What is Newton's Method (1D and $n$D)? What is its convergence rate? What are its restrictions?
  • What are Quasi-Newton Methods? What are their advantages over Newton's Method in general? What is their convergence rate in general?
  • What is the BFGS Method at a high level? What are its advantages? What is its convergence rate in general?
  • What is the Conjugate Gradient Method at a high level? What advantages does it have?
  • What is the Gauss-Newton Method? What is the Levenberg-Marquardt Method? What are their advantages? What properties do they have?
  • What is Sequential Quadratic Programming (the SQP method)?
  • What is Quadratic Programming?
  • What is the Active Set Strategy?
  • What are Penalty Methods? What are Barrier Methods?
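
As a concrete reference for the 1D minimization questions, here is a minimal Python sketch (my own illustration, not taken from the text) of golden section search on a made-up unimodal function; the test function and tolerance are assumptions.

    import math

    def golden_section_search(f, a, b, tol=1e-8):
        """Golden section search: shrink a bracket [a, b] around the minimum of a
        unimodal f by the constant factor tau each step (linear convergence)."""
        tau = (math.sqrt(5) - 1) / 2            # about 0.618
        x1 = a + (1 - tau) * (b - a)
        x2 = a + tau * (b - a)
        f1, f2 = f(x1), f(x2)
        while b - a > tol:
            if f1 > f2:                         # minimum lies in [x1, b]
                a, x1, f1 = x1, x2, f2
                x2 = a + tau * (b - a)
                f2 = f(x2)
            else:                               # minimum lies in [a, x2]
                b, x2, f2 = x2, x1, f1
                x1 = a + (1 - tau) * (b - a)
                f1 = f(x1)
        return (a + b) / 2

    f = lambda x: (x - 1.3) ** 2 + 0.5          # unimodal, minimum at x = 1.3
    print(golden_section_search(f, 0.0, 3.0))   # approximately 1.3

Only one new function evaluation is needed per iteration, because one of the two interior points is always reused.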
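
And here is a minimal sketch (again my own, with a made-up quadratic objective) of steepest descent with an exact line search: for f(x) = (1/2) x^T A x - b^T x with A symmetric positive definite, the step length is alpha_k = (g_k^T g_k) / (g_k^T A g_k), where g_k is the gradient at x_k.

    import numpy as np

    def steepest_descent_quadratic(A, b, x0, tol=1e-10, max_iter=500):
        """Steepest descent on f(x) = 0.5 x^T A x - b^T x with exact line search."""
        x = x0.astype(float)
        for _ in range(max_iter):
            g = A @ x - b                        # gradient of the quadratic
            if np.linalg.norm(g) < tol:
                break
            alpha = (g @ g) / (g @ (A @ g))      # exact line-search step length
            x = x - alpha * g                    # move along the negative gradient
        return x

    A = np.array([[3.0, 1.0], [1.0, 2.0]])       # made-up SPD matrix
    b = np.array([1.0, 1.0])
    print(steepest_descent_quadratic(A, b, np.zeros(2)))   # approaches [0.2, 0.4]
    print(np.linalg.solve(A, b))                            # reference solution

Convergence is linear, and the rate degrades as the condition number of A grows, which is the usual disadvantage cited for the method.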

Interpolation

  • State the interpolation problem and some of its applications
  • What do basis functions mean in a general sense? Think of basis vectors or axes
  • How many different smooth interpolants can you have for a given set of data? What about piecewise interpolants?
  • What defines the conditioning of the interpolation problem as well as its existence? Is this true for any basis?
  • Define the monomial basis and use it to interpolate a set of points
  • Define the Lagrange basis and use it to interpolate a set of points
  • Define the Newton basis and use it to interpolate a set of points
  • What are divided differences? Use them to interpolate a set of points (a code sketch follows this list)
  • What are orthogonal polynomials? Can you use any method learned in Chapter 3 to orthogonalize a set of polynomials? Pick a set and prove it
  • State the advantages and disadvantages of each basis (i.e. monomial, Lagrange, Newton, orthogonal) for interpolation
  • What are Chebyshev points? Are they preferable to equispaced points? Explain
  • In interpolating a continuous function, what is the maximum error you can attain? Can you provide a bound?
  • What is the difference between Hermite interpolation and cubic spline interpolation?
  • Using a quadratic spline, define the equations and continuity conditions. How many conditions are left free?
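
To practice the divided-difference and Newton-basis questions, here is a minimal Python sketch (an illustrative example with made-up data, not taken from the text) that builds the coefficients of the Newton-form interpolant from a divided-difference table and evaluates it by nested multiplication.

    def divided_differences(x, y):
        """Return the Newton-form coefficients f[x0], f[x0,x1], ..., f[x0,...,xn]."""
        n = len(x)
        coef = list(y)
        for j in range(1, n):
            # update column j of the divided-difference table in place
            for i in range(n - 1, j - 1, -1):
                coef[i] = (coef[i] - coef[i - 1]) / (x[i] - x[i - j])
        return coef

    def newton_eval(coef, x_nodes, t):
        """Evaluate the Newton-form interpolant at t by nested multiplication."""
        result = coef[-1]
        for k in range(len(coef) - 2, -1, -1):
            result = result * (t - x_nodes[k]) + coef[k]
        return result

    x = [0.0, 1.0, 2.0, 3.0]            # made-up interpolation points
    y = [1.0, 2.0, 5.0, 10.0]           # values of t**2 + 1 at those points
    c = divided_differences(x, y)
    print(c)                            # [1.0, 1.0, 1.0, 0.0]
    print(newton_eval(c, x, 1.5))       # reproduces 1.5**2 + 1 = 3.25

Adding a new data point only appends one more coefficient to the table, which is one of the advantages of the Newton basis asked about above.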