## Course Description

This one-semester graduate course will cover basic scientific computing for computer scientists, mathematicians, and students in other areas where numerical computation is used. We will discuss mathematical principles and algorithms, and also practical issues of reliability, library use, and performance on modern hardware. There is no required text; instead, we will use a set of lecture notes developed by Jonathon Goodman, which will be posted as we go.

Prerequisites are linear algebra, multivariable calculus, and some elementary probability theory. Students should also be able to read and write simple programs in C/C++, and should know (or learn) a little MATLAB or Octave.

The grade will be based on weekly homework assignments, most of which involve computing, and a final examination. Students are encouraged to discuss problems and help each other with assignments, but all software and final write-ups must be done individually. Part of the grade will be based on the clarity and correctness of the code.

## Course Outline

This outline is subject to revision.

- Week 1: Sources of error.
  - Sources of error: roundoff, truncation error, incomplete convergence, statistical error, program bugs.
  - Computer floating point arithmetic and the IEEE standard.
  - Conditioning, condition number, and error amplification.
  - Stable and unstable computational algorithms.

- Weeks 2 and 3: Basic numerical analysis.
  - Taylor series as asymptotic expansions.
  - Finite difference approximations to derivatives.
  - Asymptotic error expansions and order of accuracy.
  - Richardson extrapolation.
  - Local low order polynomial interpolation.
  - Methods for integration on a uniform mesh: rectangle rule, trapezoid rule, midpoint rule, Simpson's rule.
  - Local error, error cancellation, and global error.
  - Convergence study as a correctness check.
  - Using Richardson error estimation to achieve a specified level of accuracy.

- Weeks 4 and 5: Numerical linear algebra.
  - Review of linear algebra, vector and matrix norms.
  - Condition number of a system of linear equations, condition number of a matrix.
  - Condition number of the symmetric and nonsymmetric eigenvalue problem (done lightly).
  - Improving the condition number: scaling and balancing.
  - The LU factorization and its use for systems of linear equations.
  - Computing the factors by Gauss elimination.
  - Pivoting.
  - The Cholesky factorization.
  - Band matrices.

- Week 6: Discrete Fourier transform and applications.
  - The algebraic definition of the DFT.
  - The physical and graphical interpretation of the modes.
  - Application to convolution and filtering.
  - What the FFT algorithm achieves (but not the algorithm itself).

- Week 7: Nonlinear equations and optimization I, Newton's method.
  - Graphical and analytical definition of Newton's method in one dimension.
  - Newton's method in higher dimensions.
  - Local quadratic convergence.
  - The problem of local minima and of equations with no solution.
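Newton's method in one dimension fits in a few lines of C. A minimal sketch (the function names and the x^2 - 2 example are mine):

```c
#include <math.h>

/* Newton's method in one dimension: x <- x - f(x)/f'(x).
   Converges quadratically once x is close to a root with f'(x*) != 0;
   from a poor starting guess it can diverge or stall, which motivates
   the safeguards covered in Week 8. */
double newton(double (*f)(double), double (*df)(double),
              double x0, double tol, int maxit) {
    double x = x0;
    for (int i = 0; i < maxit; i++) {
        double fx = f(x);
        if (fabs(fx) < tol) break;
        x -= fx / df(x);   /* solve the linear model of f at x */
    }
    return x;
}

/* Example: f(x) = x^2 - 2, whose positive root is sqrt(2). */
double f2(double x)  { return x * x - 2.0; }
double df2(double x) { return 2.0 * x; }
```

Starting from x0 = 1, the iterates 1.5, 1.4167, 1.41422, ... roughly double the number of correct digits each step, the quadratic convergence described above.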

- Week 8: Nonlinear optimization II, robust software.
  - The problem of convergence from a poor starting guess.
  - Line search.
  - Modified Cholesky factorization.
  - Robust software for optimization.
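A minimal sketch of a backtracking (Armijo) line search in one dimension; the sufficient-decrease constant `alpha` and the example function `phi_example` are illustrative choices of mine, not from the course notes.

```c
#include <math.h>

/* Backtracking line search along a descent direction.  phi(t) is the
   objective as a function of the step length t, phi0 = phi(0), and
   dphi0 = phi'(0) < 0.  Starting from the full step t = 1, halve t
   until the Armijo sufficient-decrease condition
       phi(t) <= phi(0) + alpha * t * phi'(0)
   holds, so a step is only accepted if it actually reduces phi. */
double backtrack(double (*phi)(double), double phi0, double dphi0,
                 double alpha) {
    double t = 1.0;
    while (phi(t) > phi0 + alpha * t * dphi0 && t > 1e-12)
        t *= 0.5;
    return t;
}

/* Example: minimize f(x) = x^2 from x0 = 10 along d = -f'(x0) = -20,
   so phi(t) = f(x0 + t d) = (10 - 20 t)^2. */
double phi_example(double t) { double x = 10.0 - 20.0 * t; return x * x; }
```

Here the full step t = 1 overshoots (phi increases back to 100), so the search halves once and accepts t = 0.5, which lands exactly at the minimum. This safeguard is what lets Newton-type methods converge from poor starting guesses.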

- Week 9: Time stepping methods for dynamical systems (ODEs).
  - The time stepping (marching) approach to dynamics.
  - Forward and backward Euler.
  - Determining the accuracy of time stepping methods; local truncation error.
  - Midpoint and trapezoid rules.
  - Four-stage Runge-Kutta: what it is and what it achieves.
  - Adaptive time step selection based on local error estimation.
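Forward Euler and the classical four-stage Runge-Kutta method side by side, as a C sketch on the model problem y' = -y (an illustrative example of mine):

```c
#include <math.h>

/* One step of forward Euler for y' = f(t, y): first-order accurate. */
double euler_step(double (*f)(double, double), double t, double y, double h) {
    return y + h * f(t, y);
}

/* One step of the classical four-stage Runge-Kutta method: four
   evaluations of f per step, fourth-order accurate. */
double rk4_step(double (*f)(double, double), double t, double y, double h) {
    double k1 = f(t, y);
    double k2 = f(t + 0.5 * h, y + 0.5 * h * k1);
    double k3 = f(t + 0.5 * h, y + 0.5 * h * k2);
    double k4 = f(t + h, y + h * k3);
    return y + h / 6.0 * (k1 + 2.0 * k2 + 2.0 * k3 + k4);
}

/* Model problem y' = -y, y(0) = 1, with exact solution e^{-t}. */
double decay(double t, double y) { (void)t; return -y; }
```

Marching from t = 0 to t = 1 with h = 0.1, Euler's error is about 2e-2 while RK4's is under 1e-6; halving h cuts the first by about 2 and the second by about 16, which is how order of accuracy shows up in a convergence study.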

- Week 10: Principles of numerical software, performance and reliability.
  - Software tools: debuggers, profilers, etc.
  - Understanding the hardware: instruction-level parallelism and caches.
  - Coding for performance.
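A small illustration of why caches matter (a sketch of mine, not a tuned library routine): two orderings of the same matrix-multiply triple loop. Both compute identical results, but the i-k-j version walks B and C with unit stride, so consecutive accesses stay on the same cache lines and it is typically several times faster for large n.

```c
/* C = A * B for n x n row-major matrices, two loop orders. */
void matmul_ijk(int n, const double *a, const double *b, double *c) {
    for (int i = 0; i < n; i++)
        for (int j = 0; j < n; j++) {
            double s = 0.0;
            for (int k = 0; k < n; k++)
                s += a[i*n + k] * b[k*n + j];   /* strides through b by n */
            c[i*n + j] = s;
        }
}

void matmul_ikj(int n, const double *a, const double *b, double *c) {
    for (int i = 0; i < n * n; i++) c[i] = 0.0;
    for (int i = 0; i < n; i++)
        for (int k = 0; k < n; k++) {
            double aik = a[i*n + k];
            for (int j = 0; j < n; j++)
                c[i*n + j] += aik * b[k*n + j]; /* unit stride in b and c */
        }
}
```

A profiler makes the difference visible: the arithmetic is the same, so the gap is entirely memory traffic.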

- Week 11: Monte Carlo I, basics.
  - Review of basic probability: random variables, probability densities, expectation, independence, etc. (very quick).
  - What pseudorandom number generators do and how they do it.
  - Discrete events and coin tossing.
  - Generating exponential and Gaussian (normal) random variables: Box-Muller.
  - Error bars and the central limit theorem.

- Week 12: Monte Carlo II, variance reduction (cheating).
  - Philosophy of variance reduction: find different random variables with the same expectation.
  - Control variates.
  - Antithetic variates.
  - Importance sampling (very lightly).