Assignment 4: Linear Algebra Review
1. (20 points)
Consider the linear transformation \(f(x)\) on \(\mathbb{R}^3\) that takes the standard basis \(\left\{e_1,e_2,e_3\right\}\) to \(\left\{v_1,v_2,v_3\right\}\) where
\[\begin{split}v_1=\left(\begin{matrix}10\\-10\\16\end{matrix}\right), v_2=\left(\begin{matrix}2\\-5\\20\end{matrix}\right) \textrm {and } v_3=\left(\begin{matrix}1\\-4\\13\end{matrix}\right)\end{split}\]
- Write a matrix \(A\) that represents the same linear transformation. (4 points)
- Compute the rank of \(A\) using two different methods (do not use `matrix_rank`!). (4 points)
- Find the eigenvalues and eigenvectors of \(A\). (4 points)
- What is the matrix representation of \(f\) with respect to the eigenbasis? (8 points)
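A minimal sketch of one possible approach (the `1e-10` tolerance and the choice of SVD and QR for the two rank checks are assumptions of this sketch, not requirements of the prompt):

```python
import numpy as np

# The columns of A are the images v1, v2, v3 of the standard basis vectors.
A = np.array([[ 10.0,   2.0,   1.0],
              [-10.0,  -5.0,  -4.0],
              [ 16.0,  20.0,  13.0]])

# Rank, method 1: count the nonzero singular values.
s = np.linalg.svd(A, compute_uv=False)
print("rank via SVD:", int(np.sum(s > 1e-10)))

# Rank, method 2: count the nonzero diagonal entries of R in A = QR.
# (Unpivoted QR is fine as a sketch; a pivoted QR is more robust in general.)
Q, R = np.linalg.qr(A)
print("rank via QR: ", int(np.sum(np.abs(np.diag(R)) > 1e-10)))

# Eigenvalues and eigenvectors.
lam, V = np.linalg.eig(A)
print("eigenvalues:", lam)

# With respect to the eigenbasis (the columns of V), f is represented by
# D = V^{-1} A V, which is diagonal when A is diagonalizable.
D = np.linalg.inv(V) @ A @ V
print(np.round(D, 6))
```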
2. (20 points)
You are given the following x-y coordinates (first column is x, second is y):
array([[ 0. , 4.12306991],
[ 3. , -15.47355729],
[ 4. , -11.68725507],
[ 3. , -20.33756693],
[ 5. , -6.06401989],
[ 6. , 32.79353057],
[ 8. , 82.48658405],
[ 9. , 84.02971858],
[ 4. , -1.30587276],
[ 8. , 68.59409878]])
- Find the coefficients \((a, b, c)\) of the least-squares fit of a quadratic function \(y = a + bx + cx^2\) to the data.
- Plot the data and fitted curve using `matplotlib`.

Note: use the `numpy.linalg.lstsq` function to solve this.
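A sketch along these lines (the variable names and plotting choices are illustrative):

```python
import numpy as np
import matplotlib.pyplot as plt

data = np.array([[0., 4.12306991],
                 [3., -15.47355729],
                 [4., -11.68725507],
                 [3., -20.33756693],
                 [5., -6.06401989],
                 [6., 32.79353057],
                 [8., 82.48658405],
                 [9., 84.02971858],
                 [4., -1.30587276],
                 [8., 68.59409878]])
x, y = data[:, 0], data[:, 1]

# Design matrix with columns 1, x, x^2, so that X @ (a, b, c) approximates y.
X = np.c_[np.ones_like(x), x, x**2]
(a, b, c), *_ = np.linalg.lstsq(X, y, rcond=None)
print("a, b, c =", a, b, c)

# Data together with the fitted quadratic.
xs = np.linspace(x.min(), x.max(), 200)
plt.scatter(x, y, label="data")
plt.plot(xs, a + b * xs + c * xs**2, "r-", label="quadratic fit")
plt.legend()
plt.show()
```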
3. (20 points)
Consider the following system of equations:
\[\begin{split}\begin{align*}
2x_1 - x_2 + x_3 &= 6\\
-x_1 + 2x_2 - x_3 &= 2\\
x_1 - x_2 + x_3 &= 1
\end{align*}\end{split}\]
- Consider the system in matrix form \(Ax=b\) and define \(A\), \(b\) in numpy.
- Show that \(A\) is positive-definite.
- Use the appropriate matrix decomposition function in numpy and back-substitution to solve the system. Remember to use the structure of the problem to determine the appropriate decomposition.
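A sketch of the intended pattern, assuming `scipy.linalg.solve_triangular` is acceptable for the two triangular solves (they could equally be written as short forward- and back-substitution loops):

```python
import numpy as np
from scipy.linalg import solve_triangular

A = np.array([[ 2., -1.,  1.],
              [-1.,  2., -1.],
              [ 1., -1.,  1.]])
b = np.array([6., 2., 1.])

# A is symmetric, and a symmetric matrix is positive-definite
# exactly when all of its eigenvalues are positive.
print("symmetric:", np.allclose(A, A.T))
print("eigenvalues:", np.linalg.eigvalsh(A))  # all should be > 0

# For a symmetric positive-definite A the natural decomposition is Cholesky:
# A = L L^T; solve L y = b (forward substitution), then L^T x = y (back substitution).
L = np.linalg.cholesky(A)
y = solve_triangular(L, b, lower=True)
x = solve_triangular(L.T, y, lower=False)
print("x =", x)  # should satisfy A @ x == b (here x = [5, 3, -1])
```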
4. (40 points)
You are given the following data, to which you will fit a quadratic polynomial:
x = np.arange(10)
y = np.array([ 1.58873597, 7.55101533, 10.71372171, 7.90123225,
-2.05877605, -12.40257359, -28.64568712, -46.39822281,
-68.15488905, -97.16032044])
- Find the least squares solution by using the normal equations \(A^T A \hat{x} = A^T y\). (5 points)
- Write your own gradient descent optimization function to find the least squares solution for the coefficients \(\beta\) of a quadratic polynomial. Do not use a gradient descent algorithm from a package such as `scipy.optimize` or `scikit-learn`. You can use a simple for loop: start with the parameters `beta = np.zeros(3)`, use a learning rate \(\alpha = 0.0001\), and run for 100000 iterations. (15 points)
- Plot the data together with the fitted polynomial. (10 points)
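A sketch under one common convention, minimizing \(L(\beta) = \tfrac{1}{2}\lVert A\beta - y\rVert^2\) so that the gradient is \(A^T(A\beta - y)\); with this scaling, the prescribed \(\alpha = 0.0001\) and 100000 iterations converge for these data (other loss scalings may require a different \(\alpha\)):

```python
import numpy as np
import matplotlib.pyplot as plt

x = np.arange(10)
y = np.array([  1.58873597,   7.55101533,  10.71372171,   7.90123225,
               -2.05877605, -12.40257359, -28.64568712, -46.39822281,
              -68.15488905, -97.16032044])

# Design matrix for a quadratic polynomial: columns 1, x, x^2.
A = np.c_[np.ones_like(x, dtype=float), x, x**2]

# (a) Normal equations: solve (A^T A) x_hat = A^T y.
x_hat = np.linalg.solve(A.T @ A, A.T @ y)
print("normal equations:", x_hat)

# (b) Gradient descent on L(beta) = 0.5 * ||A beta - y||^2,
#     whose gradient is A^T (A beta - y).
beta = np.zeros(3)
alpha = 0.0001
for _ in range(100000):
    beta = beta - alpha * A.T @ (A @ beta - y)
print("gradient descent:", beta)  # should be close to x_hat

# (c) Data together with the fitted polynomial.
xs = np.linspace(0, 9, 200)
plt.scatter(x, y, label="data")
plt.plot(xs, beta[0] + beta[1] * xs + beta[2] * xs**2, "r-", label="fit")
plt.legend()
plt.show()
```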