Linear Algebra Dr. Jorge Eduardo Basilio
FINAL EXAM Study Guide
- See Canvas for logistics!
Material Covered
- Ch 0: review the study guide from Exam 1
- Ch 1 & 2: Euclidean Spaces & Subspaces
- RESTUDY THIS MATERIAL SINCE I’LL ASK SIMILAR QUESTIONS BUT FOR INNER PRODUCT SPACES (IPS)
- In other words, review the study guide from Exam 1
- Ch 3: Linear Transformations
- Linear transformations T: R^n → R^m
- Standard matrix: [T] = [ T(e_1) | … | T(e_n) ]
- Again, restudy this since chapter 5 is very similar!
- Definitions from 2.9: can ask about diagonal, triangular, symmetric, and transpose matrices, etc.
- Pay close attention to symmetric matrices (A is symmetric iff A^T = A). I like using these for “new” problems on the final. No HW is assigned, but I’d look at some of the problems from this section. There are some cool proofs I can ask that are not too long.
- Properties of the transpose: let A and B be matrices of compatible sizes and c a scalar. Then:
- (A^T)^T = A
- (A + B)^T = A^T + B^T
- (cA)^T = c A^T
- (AB)^T = B^T A^T (notice the order switch!)
- A is invertible iff A^T is invertible, and in that case (A^T)^{-1} = (A^{-1})^T
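As a quick sanity check (a NumPy sketch of my own, not course code), the transpose properties can be verified numerically on random matrices:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((3, 3))  # random 3x3 matrices are invertible
B = rng.standard_normal((3, 3))  # with probability 1
c = 2.5

assert np.allclose(A.T.T, A)                      # (A^T)^T = A
assert np.allclose((A + B).T, A.T + B.T)          # (A + B)^T = A^T + B^T
assert np.allclose((c * A).T, c * A.T)            # (cA)^T = c A^T
assert np.allclose((A @ B).T, B.T @ A.T)          # (AB)^T = B^T A^T -- order switch!
assert np.allclose(np.linalg.inv(A.T),
                   np.linalg.inv(A).T)            # (A^T)^{-1} = (A^{-1})^T
```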
- Ch 4: Vector Spaces & Subspaces
- Definitions, axioms
- Proofs of basic results, e.g. ones involving the zero vector
- Know what cardinality, span, and basis mean for infinite sets
- Subspaces
- Ch 5: Linear Transformations
- Linear transformations T: V → W
- Matrix representation: [T]_B,B’ = [ [T(v_1)]_B’ | … | [T(v_n)]_B’ ], where B = {v_1, …, v_n}
- 1-1, onto, ker, range
- Isomorphisms
- Hopefully, all of this is familiar and it should feel like a “victory lap”: you just need to focus on using the new terminology of abstract vector spaces
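To make the matrix-representation recipe concrete, here is a small sketch (my own example, not from the course): T = d/dx from P_2 to P_1, with bases B = {1, x, x^2} and B’ = {1, x}. Column j of the matrix is the B’-coordinate vector of T applied to the j-th basis vector of B.

```python
import numpy as np

def deriv_coords(coeffs):
    """Differentiate a0 + a1*x + a2*x^2 and return its
    B'-coordinates [a1, 2*a2] relative to B' = {1, x}."""
    a0, a1, a2 = coeffs
    return np.array([a1, 2 * a2], dtype=float)

B = np.eye(3)  # coordinate vectors of the basis 1, x, x^2
M = np.column_stack([deriv_coords(v) for v in B])
print(M)  # [[0. 1. 0.]
          #  [0. 0. 2.]]

# Check: multiplying by [T]_B,B' agrees with differentiating directly.
p = np.array([5.0, 3.0, 4.0])               # 5 + 3x + 4x^2
assert np.allclose(M @ p, deriv_coords(p))  # p' = 3 + 8x
```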
- Review the Study Guide for Exam 3 on determinants
- Know a method to compute them by hand, especially since it is needed for the eigentheory (i.e. finding eigenvalues)
- Be able to find the eigenvalues by computing determinants
- Be able to find the eigenspaces by setting up a matrix and using RREF (you can use your calculator or ask me to compute it with Sage)
- Be able to find eigenvalues using geometry (i.e. without computations!) for projections and reflections
- EigenTheory and Diagonalization is an important topic. It is extremely likely that I’ll put more problems on the final from this section!
- Ch 9: Inner Product Spaces
- Definitions: inner product (4 axioms), inner product space, norm, unit circle, distance, orthogonal set of vectors, orthonormal set of vectors, orthogonal basis, orthonormal basis, orthogonal complement W^⊥ of a subspace W
- Examples of inner products to know:
- Dot products in R^n and “weighted dot products”
- Using matrices: ⟨A, B⟩ = tr(A^T B) on a matrix space
- Using integrals: for f, g in C[a, b], ⟨f, g⟩ = ∫_a^b f(x) g(x) dx
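The integral inner product can be computed exactly on polynomials with stdlib rationals. A sketch (the helper `inner` and the interval [-1, 1] are my choices, not course code); polynomials are coefficient lists [a0, a1, ...] meaning a0 + a1 x + ...:

```python
from fractions import Fraction

def inner(f, g, a=-1, b=1):
    """<f, g> = integral from a to b of f(x) g(x) dx, exactly."""
    # multiply the polynomials: coefficient of x^k in f*g
    prod = [Fraction(0)] * (len(f) + len(g) - 1)
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            prod[i + j] += Fraction(fi) * Fraction(gj)
    # integrate term by term: integral of x^k is x^(k+1)/(k+1)
    return sum(c * (Fraction(b)**(k + 1) - Fraction(a)**(k + 1)) / (k + 1)
               for k, c in enumerate(prod))

x  = [0, 1]      # f(x) = x
x2 = [0, 0, 1]   # g(x) = x^2
print(inner(x, x2))  # 0   -> x is orthogonal to x^2 on [-1, 1]
print(inner(x, x))   # 2/3 -> ||x||^2
```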
- Be able to prove classical results:
- Cauchy-Schwarz Inequality, Triangle Inequality (norm version and distance version), Generalized Pythagorean Theorem
- If S is orthogonal (or O.N.) then S is LI. Know how to prove this.
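You must be able to prove these; still, a numeric spot-check can help you remember which way each inequality goes (a sketch with random vectors of my choosing):

```python
import numpy as np

rng = np.random.default_rng(1)
u = rng.standard_normal(4)
v = rng.standard_normal(4)

# Cauchy-Schwarz: |<u, v>| <= ||u|| ||v||
assert abs(u @ v) <= np.linalg.norm(u) * np.linalg.norm(v)

# Triangle inequality (norm version): ||u + v|| <= ||u|| + ||v||
assert np.linalg.norm(u + v) <= np.linalg.norm(u) + np.linalg.norm(v)

# Generalized Pythagorean Thm needs orthogonality, so build
# w = component of v orthogonal to u, then check equality.
w = v - (u @ v) / (u @ u) * u
assert abs(u @ w) < 1e-9
assert np.isclose(np.linalg.norm(u + w)**2,
                  np.linalg.norm(u)**2 + np.linalg.norm(w)**2)
```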
- Be prepared to do Gram-Schmidt Algorithm (see ICA11#2)
- Be able to do this in R^n (n=3, 4, or 5)
- Be able to do this on other IPS, like polynomial space, matrix space, or function spaces.
- Keep in mind when doing GSA:
- Build the orthogonal set first
- Use clever modifications along the way (i.e. you may replace any vector with a scalar multiple of itself to get rid of fractions)
- Build the O.N. set at the very end, i.e. normalize each vector only after you’ve built the orthogonal set
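The recipe above can be sketched in code (the function name `gram_schmidt` and the example vectors are mine, not course material): orthogonalize first, normalize only at the end.

```python
import numpy as np

def gram_schmidt(vectors):
    """Return an orthonormal list spanning the same space as `vectors`
    (assumed linearly independent), via the Gram-Schmidt Algorithm."""
    orthogonal = []
    for v in vectors:
        w = np.array(v, dtype=float)
        for u in orthogonal:
            w = w - (w @ u) / (u @ u) * u  # subtract projections onto earlier vectors
        orthogonal.append(w)
    # normalize each vector only after the whole orthogonal set is built
    return [w / np.linalg.norm(w) for w in orthogonal]

q = gram_schmidt([[1, 1, 0], [1, 0, 1], [0, 1, 1]])
Q = np.column_stack(q)
assert np.allclose(Q.T @ Q, np.eye(3))  # columns are orthonormal
```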
- Can ask questions involving W and W^⊥ in V
- Orthogonal decomposition of a vector v in V: v = w + z, where w is in W and z is in W^⊥
- Be able to show this decomposition is unique
- Dimension Theorem in an inner product space: dim(W) + dim(W^⊥) = dim(V)! Key is that you find an O.N. basis for W and extend it to an O.N. basis for V. (statement only)
- Projection mappings: proj_W and proj_W^⊥
- Know how to find proj_W if you have an O.N. basis for W. Use the formula [proj_W] = QQ^T, where Q is the matrix whose columns are the vectors from the O.N. basis for W only (warning: use only the vectors in a basis for W, not the entire basis for V)
- Recall that the eigenvalues and eigenvectors of projection maps are easy to find! The geometry of the maps reveals all!
- Also, projection maps turn out to be diagonalizable.
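A sketch tying these last points together (my own example: W is a plane in R^3 with an assumed O.N. basis): build Q, form QQ^T, and check the properties that make projections special.

```python
import numpy as np

# O.N. basis for a plane W in R^3 (columns of Q are for W only!)
u1 = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)
u2 = np.array([0.0, 0.0, 1.0])
Q = np.column_stack([u1, u2])

P = Q @ Q.T                  # standard matrix of proj_W

assert np.allclose(P @ P, P)  # projections are idempotent
assert np.allclose(P, P.T)    # and symmetric

# Geometry gives the eigenvalues: 1 on W, 0 on W^perp,
# so P is diagonalizable with eigenvalues {0, 1, 1}.
assert np.allclose(np.sort(np.linalg.eigvalsh(P)), [0.0, 1.0, 1.0])

# Orthogonal decomposition v = w + z with w in W, z in W^perp:
v = np.array([3.0, 1.0, 2.0])
w, z = P @ v, v - P @ v
assert np.isclose(w @ z, 0.0)
```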