Elimination with matrices

  • We use elimination to find the solution of a system of linear equations
    • The elimination method is also popular among software packages such as MATLAB
  • We perform row operations to make the matrix upper triangular
  • We could do the same with the equations themselves (without putting them in a matrix)
    • But we don’t want to keep writing variable names
  • Process
    • Iteratively select the next diagonal element as the pivot
    • Make all entries below it 0
    • Repeat for the next pivot
  • The allowed row operations are the ones that do not change the solution set
    • Switching rows
    • Multiplying a row by a nonzero constant
    • Adding a multiple of one row to another
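The process above can be sketched in NumPy; this is a minimal forward-elimination routine (with row switching to avoid small pivots), run on a made-up 3×3 system:

```python
import numpy as np

def forward_eliminate(A, b):
    """Reduce [A | b] to upper-triangular form using row operations."""
    A = A.astype(float).copy()
    b = b.astype(float).copy()
    n = len(b)
    for k in range(n):
        # Switch rows so the pivot is the largest entry in the column.
        p = k + np.argmax(np.abs(A[k:, k]))
        A[[k, p]], b[[k, p]] = A[[p, k]], b[[p, k]]
        # Subtract multiples of the pivot row to zero the entries below it.
        for i in range(k + 1, n):
            m = A[i, k] / A[k, k]
            A[i, k:] -= m * A[k, k:]
            b[i] -= m * b[k]
    return A, b

A = np.array([[1, 2, 1], [3, 8, 1], [0, 4, 1]])
b = np.array([2, 12, 2])
U, c = forward_eliminate(A, b)
x = np.linalg.solve(U, c)  # back substitution on the triangular system
```

Once the matrix is upper triangular, back substitution (here delegated to `np.linalg.solve`) recovers the unknowns from the last row upward.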

What happens when we have 2 variables and 100 equations (observations)?

  • Here are four terms
    • Underdetermined system (no. of equations < no. of unknowns)
    • Overdetermined system (no. of equations > no. of unknowns)
    • Consistent system (one or more sets of values satisfy the equations)
    • Inconsistent system (no set of values satisfies the equations)
  • An overdetermined system is generally inconsistent, except when the extra equations add no new information, e.g.
    • The same equation multiplied by a constant
    • One equation is a linear combination of the others
  • More specifically, according to the Rouché–Capelli theorem, any system of linear equations (underdetermined or otherwise) is inconsistent if and only if the rank of the augmented matrix is greater than the rank of the coefficient matrix.
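The Rouché–Capelli rank test is easy to apply with NumPy; here is a sketch on a made-up 3×2 system that is overdetermined but still consistent (its second equation is just twice the first):

```python
import numpy as np

# Hypothetical system: x + y = 2, 2x + 2y = 4, x - y = 0
A = np.array([[1.0,  1.0],
              [2.0,  2.0],   # a multiple of the first equation
              [1.0, -1.0]])
b = np.array([2.0, 4.0, 0.0])

rank_A = np.linalg.matrix_rank(A)
rank_Ab = np.linalg.matrix_rank(np.column_stack([A, b]))

# Inconsistent iff rank of the augmented matrix exceeds rank of A.
consistent = (rank_A == rank_Ab)
```

Here both ranks are 2, so the system is consistent (x = y = 1 satisfies all three equations).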

Snippets from Prof. Gilbert Strang’s course

Elimination matrices – we can create a matrix for each elimination operation

Remember – multiplying on the left is a row operation; multiplying on the right is a column operation

Permutation matrix (P) – exchange the rows of the identity matrix

It is easy to find the inverses of these elimination matrices – just reverse the operation (e.g., add back the multiple that was subtracted)
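A quick NumPy sketch of these ideas, on a made-up 3×3 matrix: an elimination matrix E for one row operation, its easily written inverse, and a permutation matrix applied on the left (rows) versus the right (columns):

```python
import numpy as np

# E performs "row2 := row2 - 3*row1" when multiplied on the left.
E = np.eye(3)
E[1, 0] = -3.0

# Its inverse just adds the multiple back.
E_inv = np.eye(3)
E_inv[1, 0] = 3.0

# Permutation matrix: identity with its first two rows exchanged.
P = np.eye(3)[[1, 0, 2]]

A = np.array([[1.0, 2.0, 1.0],
              [3.0, 8.0, 1.0],
              [0.0, 4.0, 1.0]])

EA = E @ A   # left multiply  -> row operation (zeroes A[1, 0])
AP = A @ P   # right multiply -> column operation (swaps columns 0 and 1)
```

Note that E and its inverse differ only in the sign of the off-diagonal entry, which is why elimination matrices are so easy to invert.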

Reference

These are notes from Prof. Gilbert Strang’s lecture on MIT open courseware. https://ocw.mit.edu/courses/mathematics/18-06-linear-algebra-spring-2010/video-lectures/lecture-1-the-geometry-of-linear-equations/

Matrix Definitions

Jacobian Matrix

  • For a vector-valued function f : R^n → R^m, the Jacobian J is the m × n matrix of first-order partial derivatives, J_ij = ∂f_i/∂x_j

Gradient Vector

  • For a scalar function f : R^n → R, the gradient ∇f is the vector of first-order partial derivatives (∂f/∂x_1, …, ∂f/∂x_n)

Hessian Matrix

  • For a scalar function f : R^n → R, the Hessian H is the n × n matrix of second-order partial derivatives, H_ij = ∂²f/∂x_i∂x_j

The Hessian matrix is the Jacobian of the gradient.
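This relationship can be checked numerically; the sketch below uses a made-up test function f(x, y) = x²y + y³, computes its gradient by central differences, and then takes the finite-difference Jacobian of that gradient:

```python
import numpy as np

def f(v):
    # Hypothetical test function: f(x, y) = x^2 * y + y^3
    return v[0] ** 2 * v[1] + v[1] ** 3

def grad(p, h=1e-6):
    """Central-difference gradient of f at point p."""
    g = np.zeros(2)
    for i in range(2):
        e = np.zeros(2)
        e[i] = h
        g[i] = (f(p + e) - f(p - e)) / (2 * h)
    return g

def jacobian_of_grad(p, h=1e-4):
    """Finite-difference Jacobian of the gradient = Hessian of f."""
    H = np.zeros((2, 2))
    for j in range(2):
        e = np.zeros(2)
        e[j] = h
        H[:, j] = (grad(p + e) - grad(p - e)) / (2 * h)
    return H

p = np.array([1.0, 2.0])
H = jacobian_of_grad(p)
# Analytic Hessian of f is [[2y, 2x], [2x, 6y]], i.e. [[4, 2], [2, 12]] at (1, 2)
```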

Symmetric Matrix

  • A matrix is symmetric if it equals its own transpose: A^T = A


Hermitian Matrix

  • A matrix is Hermitian if its transpose equals its entrywise complex conjugate: A^T = A_bar (equivalently, A equals its own conjugate transpose)

Positive Semi-definite Matrices

  • One definition:
    • Matrix M ∈ L(V) is positive semi-definite iff
      • M is symmetric
      • v^T * M * v >= 0 for all v ∈ V
  • The following conditions are equivalent [0]
    • v^T * M * v >= 0 for all v ∈ V
    • All eigenvalues of M are non-negative
    • There exists a matrix B such that B^T * B = M
  • Application
    • A twice-differentiable function of n variables is convex if and only if its Hessian is positive semi-definite (PSD) everywhere on its (convex) domain [2] [3]
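The equivalent conditions can be illustrated numerically; this sketch builds a PSD matrix as B^T B from a random B (so the third condition holds by construction) and checks the other two:

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
M = B.T @ B  # symmetric PSD by construction

# 1) v^T M v >= 0, spot-checked on random vectors
vs = rng.standard_normal((100, 3))
quad_ok = all(v @ M @ v >= -1e-10 for v in vs)

# 2) all eigenvalues non-negative (eigvalsh is for symmetric matrices)
eig_ok = bool(np.all(np.linalg.eigvalsh(M) >= -1e-10))

# 3) a factor with B^T B = M exists -- here we already built one
factor_ok = bool(np.allclose(B.T @ B, M))
```

The small negative tolerances (-1e-10) allow for floating-point round-off; an exact-arithmetic check would compare against zero.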


References

[0] : https://www.cse.iitk.ac.in/users/rmittal/prev_course/s14/notes/lec11.pdf

[1] : http://www.princeton.edu/~amirali/Public/Teaching/ORF363_COS323/F15/ORF363_COS323_F15_Lec2.pdf

[2] : https://wiki.math.ntnu.no/_media/tma4180/2016v/note2.pdf

[3] : https://math.stackexchange.com/questions/946156/proving-convexity-of-a-function-whose-hessian-is-positive-semidefinite-over-a-co