Daily Reading
- Final Exam
- Practice Problems for the Final
- Bonus material on Principal Component Analysis
- This video by Luis Serrano is a really good explanation of a technique called Principal Component Analysis. It combines the ideas behind eigenvectors and orthogonal decomposition to explain large datasets in a compact way.
- Notice that all those “projections” he talks about are just done with our dot product/projection operator. Same with “drawing a perpendicular”. So this is all trying to find the right orthogonal basis to use for our orthogonal decomposition.
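- As a concrete sketch of that projection operator (my own NumPy illustration, not from the video): projecting one vector onto another takes just a couple of dot products.

  ```python
  import numpy as np

  a = np.array([3.0, 1.0])   # the direction we project onto
  b = np.array([2.0, 4.0])   # the vector being decomposed

  # Projection of b onto a, using only dot products:
  proj = (np.dot(a, b) / np.dot(a, a)) * a

  # Orthogonal decomposition: b = proj + perp, with perp perpendicular to a
  perp = b - proj

  print(proj)             # the component of b along a
  print(np.dot(perp, a))  # 0 (up to rounding): perp really is perpendicular
  ```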
- From 5:00 to about 12:15 he does an overview of the statistics concepts of mean, variance, and covariance. This isn’t really about the linear algebra, but you might find it useful anyway.
- From about 14:00 to 20:45 he does a quick overview of linear transformations, matrices, eigenvectors, and eigenvalues. You know this stuff already, but it might also be a really good review.
- At the 21-minute mark, notice how the eigenvectors automatically form an orthogonal basis!
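- You can check this for yourself (a quick NumPy sketch of my own, not from the video): covariance matrices are symmetric, and the spectral theorem says a symmetric matrix's eigenvectors can be chosen to form an orthogonal basis.

  ```python
  import numpy as np

  # A covariance matrix is symmetric, so the spectral theorem applies:
  # its eigenvectors form an orthogonal (even orthonormal) basis.
  C = np.array([[2.0, 1.0],
                [1.0, 2.0]])

  vals, vecs = np.linalg.eigh(C)  # eigh is the symmetric-matrix routine

  v1, v2 = vecs[:, 0], vecs[:, 1]
  print(np.dot(v1, v2))           # 0 (up to rounding): an orthogonal basis
  ```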
- A good explanation of the purpose of PCA from the perspective of data science and machine learning. It avoids a lot of the mathematical technicalities and focuses on how we can use a computer to do the computations for us, but it has some great examples and visualizations.
- At 18:00, notice he’s just talking about projection with a dot product.
- Here’s a more technical explanation of how PCA works if you want to see what we might have done with another month.
- He relates this all to something called Singular Value Decomposition, which is also cool, but don’t worry about it too much.
- In step 4 at about 8:00, when he writes $CV = VD$, that’s just another way of writing the diagonalization $V^{-1}CV = D$.
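- If you want to see that identity numerically (my own sketch, not from the video), both forms are easy to verify with NumPy:

  ```python
  import numpy as np

  C = np.array([[2.0, 0.0],
                [1.0, 3.0]])

  vals, V = np.linalg.eig(C)  # columns of V are eigenvectors of C
  D = np.diag(vals)

  # CV = VD; multiplying on the left by V^{-1} gives V^{-1} C V = D.
  print(np.allclose(C @ V, V @ D))                 # True
  print(np.allclose(np.linalg.inv(V) @ C @ V, D))  # True
  ```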
- Finally, this video has nothing to do with PCA, but is super cool. Cymatics is a way we can see the eigenvectors in some function space in the real world. There’s a linear transformation called the “Laplacian”, and each tone they play corresponds to a different eigenvector of this transformation. The ripples and shapes we see are effectively graphs of these eigenfunctions.
- Here’s a behind-the-scenes explaining what’s going on.
- Tuesday, April 28: Skim the part of §7.3 that covers the Gram-Schmidt process. Read §7.4.
- Thursday, April 23: Read §7.2 and the first half of §7.3, up through the beginning of the Gram-Schmidt Process.
- Tuesday, April 21: No Class for Founders Day
- Thursday, April 16: Read §6.5-7.1.
- Homework due
- Slides
- Video
- Some cute Markov Chain visualizations
- A nice video using Markov chains to solve a Chess-inspired problem.
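- If you want to play with one yourself, here's a minimal Markov chain sketch of my own (not from either video): repeatedly applying the transition matrix settles onto the eigenvector with eigenvalue 1.

  ```python
  import numpy as np

  # Transition matrix: entry (i, j) is the probability of moving from
  # state j to state i, so each column sums to 1.
  P = np.array([[0.9, 0.5],
                [0.1, 0.5]])

  x = np.array([1.0, 0.0])  # start in state 0 with certainty

  for _ in range(50):       # apply the transition matrix over and over
      x = P @ x

  print(x)                  # approaches the steady-state distribution

  # That steady state is exactly an eigenvector of P with eigenvalue 1:
  vals, vecs = np.linalg.eig(P)
  v = vecs[:, np.argmax(vals.real)].real
  print(v / v.sum())        # the same distribution, normalized to sum to 1
  ```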
- Tuesday, April 14: Read § 6.3-6.4.
- **TEST DUE Monday, April 13**
- Practice problems
- Test
- Please try to take only 90 minutes for this test.
- If you need more time, change to a different color of pen or do something else to mark it clearly.
- You can use your notes and anything I’ve posted here, but please don’t use other resources.
- Thursday, April 9: Read §6.1-6.2.
- Tuesday, April 7: Read §5.2-5.3.
- Slides
- Video
- Watch this video on eigenvectors. Note we haven’t covered change of basis yet, but we will soon.
- Thursday, April 2: Read §4.4 and §5.1.
- Tuesday, March 31: Read the rest of §4.3. (Added later: read the first couple of pages of §4.4 as well if you get the chance.)
- You might find this Khan Academy video helpful as well.
- Or this 3blue1brown video.
- Slides
- Video
- Thursday, March 26: Read §4.3 of the notes through page 79.
- Tuesday, March 24: Watch (at least) the first nine minutes of this video on YouTube, and look over §4.2 in the notes.
Course Goals
In this course we will learn the basics of linear algebra, integrating three different perspectives. We will study the geometry of vectors and higher-dimensional spaces, the algebra of real vectors and matrices, and the formal systems of vector spaces and linear transformations. We will see how the three perspectives interrelate, and how each can be used to better understand the other two.
Mathematics is a fundamentally linguistic activity. In this course we will learn to speak and write the language of mathematics; understanding, writing, and communicating proofs will be a substantial portion of the course.
Topics will include: geometry and manipulation of vectors, lines, and planes; systems of linear equations; vector spaces and linear transformations; matrix arithmetic, inverses, and determinants; bases, spanning sets, and linear independence; and eigenvectors and eigenvalues.
The course syllabus is available here.
Course Notes
- Complete Notes
- Notation Index
- Introduction
- Section 1: Systems of Linear Equations
- Section 2: Vector Spaces
- Section 3: Bases
- Section 4: Linear Transformations
- Section 5: Eigenvectors
- Section 6: Similarity and Change of Basis
- Section 7: Inner Products and Geometry
Homework
- Homework 1, due on Thursday, January 30
- Homework 2, due on Thursday, February 6
- Homework 3, due on Thursday, February 13
- Homework 4, due on Thursday, February 27
- Homework 5, due on Thursday, March 5
- Homework 6, due on Thursday, March 26
- Homework 7, due on Thursday, April 2
- Homework 8, due on Thursday, April 9
- Homework 9, due on Thursday, April 16
- Homework 10, “due” on Tuesday, April 28
Tests
Tentative midterm dates are February 20 and March 31.
The final exam will be held in our usual classroom according to the official schedule.
- Test 1 on February 20
- Test 2 due Monday, April 13
- Final Exam on Tuesday, May 5
Textbook
The official textbook for this course is Linear Algebra: A Modern Introduction, 4th edition, by David Poole. The ISBN is 978-1-285-46324-7.
I do not plan to rely heavily on this text during the course, since I plan to present the material from a different perspective. I will be posting my course notes on the course website as we go, and you can use those as a reference for the material.
You shouldn’t need to purchase the book to complete the course, but I will be using it as my primary reference as I put the course notes together.
If you would like more references or other perspectives, you may wish to check out:
- Linear Algebra: Ideas and Applications by Richard Penney, available online through the library
- Linear Algebra: Step by Step by Kuldeep Singh, available online through the library
- A First Course in Linear Algebra by Rob Beezer, available free online
- Linear Algebra by Jim Hefferon, available free online