This article was last updated on the evening of May 22, 2024.
Notes on Mathematical Methods for Physicists
Chapter 2: Determinants & Matrices
Determinants
We begin the study of matrices by solving linear equations; this will lead us first to determinants and then to matrices. The concept of the determinant and its notation were introduced by the renowned German mathematician and philosopher Gottfried Wilhelm Leibniz.
Homogeneous Linear Equations
Suppose three unknowns $x_1, x_2, x_3$ are connected by three homogeneous linear equations,
$$a_1 x_1 + a_2 x_2 + a_3 x_3 = 0,$$
$$b_1 x_1 + b_2 x_2 + b_3 x_3 = 0,$$
$$c_1 x_1 + c_2 x_2 + c_3 x_3 = 0.$$
The problem is to determine under what conditions there is any solution apart from the trivial one $x_1 = x_2 = x_3 = 0$.
If the volume spanned by the coefficient vectors $\mathbf{a} = (a_1, a_2, a_3)$, $\mathbf{b} = (b_1, b_2, b_3)$, $\mathbf{c} = (c_1, c_2, c_3)$ is not zero, i.e. if the determinant of the coefficients does not vanish, then there is only the trivial solution $x_1 = x_2 = x_3 = 0$.
Conversely, if the aforementioned determinant of the coefficients vanishes, then one of the row vectors is a linear combination of the other two, and a nontrivial solution exists.
This is Cramer's Rule for homogeneous linear equations.
Inhomogeneous Linear Equations
Simple example: two inhomogeneous equations in two unknowns,
$$a_1 x_1 + a_2 x_2 = a_3, \qquad b_1 x_1 + b_2 x_2 = b_3,$$
have the solution
$$x_1 = \frac{\begin{vmatrix} a_3 & a_2 \\ b_3 & b_2 \end{vmatrix}}{\begin{vmatrix} a_1 & a_2 \\ b_1 & b_2 \end{vmatrix}}, \qquad x_2 = \frac{\begin{vmatrix} a_1 & a_3 \\ b_1 & b_3 \end{vmatrix}}{\begin{vmatrix} a_1 & a_2 \\ b_1 & b_2 \end{vmatrix}},$$
provided the denominator determinant is nonzero. Each unknown is a ratio of determinants, the numerator being formed by replacing the corresponding column of coefficients by the right-hand sides.
This is Cramer's Rule for inhomogeneous linear equations.
Definitions
Before defining a determinant, we need to introduce some related concepts and definitions: permutations of the indices $1, 2, \ldots, n$, their parity (even or odd, according to the number of pairwise interchanges needed to reach them), and the Levi-Civita symbol $\varepsilon_{i_1 i_2 \cdots i_n}$, which is $+1$ for an even permutation of $1, 2, \ldots, n$, $-1$ for an odd permutation, and $0$ if any index is repeated.
We now define a determinant of order $n$ as the quantity
$$\det(\mathsf{A}) = \sum_{i_1 i_2 \cdots i_n} \varepsilon_{i_1 i_2 \cdots i_n}\, a_{1 i_1} a_{2 i_2} \cdots a_{n i_n},$$
where the sum runs over all $n!$ permutations of the column indices.
The determinant so defined is a single number computed from the $n^2$ elements of a square array; it is commonly written as the array enclosed between vertical bars.
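As a sanity check on this definition, here is a minimal Python sketch (the helper names `perm_sign` and `det_by_permutations` are mine, not from the text) that evaluates a determinant directly as the signed sum over all $n!$ permutations:

```python
from itertools import permutations
from math import prod

def perm_sign(p):
    """Sign of a permutation: +1 if even, -1 if odd (count inversions)."""
    inv = sum(p[i] > p[j] for i in range(len(p)) for j in range(i + 1, len(p)))
    return -1 if inv % 2 else 1

def det_by_permutations(a):
    """Determinant as the signed sum over all n! permutations of the columns."""
    n = len(a)
    return sum(perm_sign(p) * prod(a[i][p[i]] for i in range(n))
               for p in permutations(range(n)))

print(det_by_permutations([[1, 2], [3, 4]]))                   # -2
print(det_by_permutations([[2, 0, 1], [1, 3, 0], [0, 1, 4]]))  # 25
```

Because the sum has $n!$ terms, this is hopeless beyond small $n$; the practical evaluation schemes appear later in these notes.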
Properties of Determinants
Take determinants of order 3 as an illustration; the following standard properties hold for determinants of any order:
The determinant is unchanged under transposition, $\det(\mathsf{A}^T) = \det(\mathsf{A})$, so every statement about rows applies equally to columns.
Interchanging two rows (or two columns) changes the sign of the determinant.
Multiplying every element of one row by a constant $k$ multiplies the determinant by $k$.
A determinant with two equal (or proportional) rows vanishes.
Adding a multiple of one row to another row leaves the determinant unchanged.
Laplacian Development by Minor
The fact that a determinant of order $n$ can be expanded as a linear combination of determinants of order $n-1$ is known as the Laplacian development by minors. The minor $M_{ij}$ is the determinant of order $n-1$ obtained by deleting row $i$ and column $j$; the signed quantity $(-1)^{i+j} M_{ij}$ is called the cofactor of $a_{ij}$. Developing along row $i$,
$$\det(\mathsf{A}) = \sum_{j=1}^{n} (-1)^{i+j}\, a_{ij}\, M_{ij}.$$
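The development by minors translates directly into a recursive procedure. A small sketch in pure Python (the function name `det_laplace` is mine), expanding always along the first row:

```python
def det_laplace(a):
    """Determinant by Laplacian development along the first row."""
    n = len(a)
    if n == 1:
        return a[0][0]
    total = 0
    for j in range(n):
        # Minor M_0j: delete row 0 and column j.
        minor = [row[:j] + row[j + 1:] for row in a[1:]]
        total += (-1) ** j * a[0][j] * det_laplace(minor)
    return total

print(det_laplace([[2, 0, 1], [1, 3, 0], [0, 1, 4]]))  # 25
```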
Linear Equation Systems
For the equation system
$$\sum_{j=1}^{n} a_{ij} x_j = h_i, \qquad i = 1, \ldots, n:$$
We define $D = \det(a_{ij})$, the determinant of the coefficients, and $D_k$ as the determinant obtained from $D$ by replacing its $k$th column with the right-hand sides $h_1, \ldots, h_n$.
Then we have, provided $D \neq 0$,
$$x_k = \frac{D_k}{D}, \qquad k = 1, \ldots, n.$$
This is Cramer's Rule.
If all the $h_i$ vanish (the homogeneous case), a nontrivial solution exists only if $D = 0$, in agreement with the earlier discussion.
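A short sketch of Cramer's Rule in code (assuming numpy is available; `cramer_solve` is an illustrative name, and the determinants are delegated to `np.linalg.det`):

```python
import numpy as np

def cramer_solve(a, h):
    """Solve A x = h by Cramer's Rule: x_k = D_k / D (assumes D != 0)."""
    a = np.asarray(a, dtype=float)
    h = np.asarray(h, dtype=float)
    d = np.linalg.det(a)
    x = np.empty(len(h))
    for k in range(len(h)):
        ak = a.copy()
        ak[:, k] = h            # D_k: replace column k by the right-hand sides
        x[k] = np.linalg.det(ak) / d
    return x

a = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
h = [3, 5, 9]
print(cramer_solve(a, h))               # agrees with np.linalg.solve(a, h)
```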
Determinants & Linear Dependence
If the coefficients of $n$ linear forms in $n$ variables are written as a square array, the forms are linearly dependent if and only if the determinant of that array vanishes; a nonvanishing determinant therefore signals linear independence.
Linearly Dependent Equations
Situation 1
All the equations are homogeneous (which means all the right-hand-side quantities $h_i$ vanish). If the determinant of the coefficients also vanishes, the equations determine only the ratios of the unknowns: one unknown can be assigned arbitrarily and the remainder solved in terms of it.
Situation 2
A second case is where we have (or combine equations so that we have) the same linear form in two equations, but with different values of the right-hand quantities $h_i$. In that case the two equations are mutually inconsistent, and the system has no solution.
Situation 3
A third, related case is where we have a duplicated linear form, but with a common value of $h$. The duplicate equation then adds no new information: we are left with only $n-1$ independent equations for $n$ unknowns, and the solution contains one arbitrary parameter.
Numerical Evaluation
There are many methods to evaluate determinants, including machine computation. Here we use Gauss elimination, a versatile procedure that serves for evaluating determinants, for solving linear equation systems, and (as we will see later) even for matrix inversion.
Gauss elimination: use row operations (which leave the determinant invariant, apart from a sign change on each row interchange) to bring the determinant into a form in which all the entries in its lower triangle vanish. The determinant is then simply the product of the diagonal elements.
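A minimal sketch of this evaluation in Python/numpy (the partial-pivoting choice of the largest available pivot is a standard numerical safeguard, not something the procedure above strictly requires):

```python
import numpy as np

def det_gauss(a):
    """Determinant by Gauss elimination with partial pivoting."""
    a = np.array(a, dtype=float)     # work on a copy
    n = len(a)
    sign = 1.0
    for k in range(n):
        p = k + int(np.argmax(np.abs(a[k:, k])))  # largest pivot at/below row k
        if a[p, k] == 0.0:
            return 0.0                            # whole column zero: singular
        if p != k:
            a[[k, p]] = a[[p, k]]                 # row interchange flips the sign
            sign = -sign
        # Subtract multiples of row k from the rows below it;
        # this leaves the determinant unchanged.
        a[k + 1:] -= np.outer(a[k + 1:, k] / a[k, k], a[k])
    return sign * float(np.prod(np.diag(a)))      # product of diagonal elements

print(det_gauss([[2, 0, 1], [1, 3, 0], [0, 1, 4]]))   # 25.0
```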
Matrices
Matrices are two-dimensional arrays of numbers or functions that obey definite rules of addition and multiplication. Unlike determinants, they are not reducible to single numbers, and they provide a natural language for linear transformations.
Basic Definitions
A matrix is a set of numbers or functions arranged in a two-dimensional square or rectangular array. A matrix with $m$ rows and $n$ columns is called an $m \times n$ matrix, and the element in row $i$, column $j$ is written $a_{ij}$.
A matrix for which $m = n$ is called a square matrix; a matrix with a single column is a column vector, and one with a single row is a row vector.
Equality
If $\mathsf{A}$ and $\mathsf{B}$ are matrices, $\mathsf{A} = \mathsf{B}$ means that $a_{ij} = b_{ij}$ for every $i$ and $j$, which is possible only if the two matrices have the same shape.
Addition, Subtraction
Addition and subtraction are defined only for matrices of the same shape and proceed element by element: $(\mathsf{A} \pm \mathsf{B})_{ij} = a_{ij} \pm b_{ij}$. Matrix addition is therefore commutative and associative.
Multiplication (by a Scalar)
Here we have $(\alpha \mathsf{A})_{ij} = \alpha\, a_{ij}$ for every element.
Note that the definition of multiplication by a scalar causes each element of matrix $\mathsf{A}$ to be multiplied by $\alpha$. This contrasts with determinants, where a factor $\alpha$ multiplies only a single row, so that $\det(\alpha \mathsf{A}) = \alpha^n \det(\mathsf{A})$ for an $n \times n$ matrix.
Matrix Multiplication (Inner Product)
Matrix multiplication is not an element-by-element operation like addition or multiplication by a scalar. The inner product of matrices $\mathsf{A}$ and $\mathsf{B}$ is defined as
$$(\mathsf{AB})_{ij} = \sum_{k} a_{ik}\, b_{kj},$$
which requires the number of columns of $\mathsf{A}$ to equal the number of rows of $\mathsf{B}$.
This definition causes the $ij$ element of $\mathsf{AB}$ to be formed from the entire $i$th row of $\mathsf{A}$ and the entire $j$th column of $\mathsf{B}$; as a consequence, matrix multiplication is in general not commutative, $\mathsf{AB} \neq \mathsf{BA}$.
It is useful to define the commutator of $\mathsf{A}$ and $\mathsf{B}$,
$$[\mathsf{A}, \mathsf{B}] = \mathsf{AB} - \mathsf{BA},$$
which, as stated above, will in many cases be nonzero.
But matrix multiplication is associative, meaning that $(\mathsf{AB})\mathsf{C} = \mathsf{A}(\mathsf{BC})$, and it is distributive over addition: $\mathsf{A}(\mathsf{B} + \mathsf{C}) = \mathsf{AB} + \mathsf{AC}$.
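A quick numerical illustration of these facts: with numpy one can verify on explicit $2 \times 2$ matrices (chosen arbitrarily here) that $\mathsf{AB} \neq \mathsf{BA}$ while $(\mathsf{AB})\mathsf{C} = \mathsf{A}(\mathsf{BC})$:

```python
import numpy as np

a = np.array([[1, 2], [3, 4]])
b = np.array([[0, 1], [1, 0]])
c = np.array([[2, 0], [1, 1]])

print(a @ b)                    # [[2 1] [4 3]]
print(b @ a)                    # [[3 4] [1 2]]  -- not equal: AB != BA
print(a @ b - b @ a)            # the commutator [A, B], nonzero here
print(np.array_equal((a @ b) @ c, a @ (b @ c)))   # True: associativity
```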
Unit Matrix
By direct matrix multiplication, it is possible to show that a square matrix with elements of value unity on its principal diagonal (the elements $a_{ii}$) and zeros elsewhere leaves unchanged any matrix it multiplies;
note that it is not a matrix all of whose elements are unity. Giving such a matrix the name unit matrix and the symbol $\mathsf{1}$, we have $(\mathsf{1})_{ij} = \delta_{ij}$ and
$$\mathsf{1A} = \mathsf{A1} = \mathsf{A}.$$
Remember that $\mathsf{1}$ always denotes this diagonal matrix of appropriate dimension, never an array filled entirely with ones.
The previously introduced null matrices have only zero elements, so it is also obvious that for all $\mathsf{A}$ of suitable shape, $\mathsf{A} + \mathsf{O} = \mathsf{A}$ and $\mathsf{AO} = \mathsf{OA} = \mathsf{O}$.
Diagonal Matrices
If a matrix has nonzero elements only on its principal diagonal, it is called a diagonal matrix. Any two diagonal matrices of the same dimension commute with each other under multiplication.
Matrix Inverse
It will often be the case that, given a square matrix $\mathsf{A}$, we need a matrix $\mathsf{A}^{-1}$, called the inverse of $\mathsf{A}$, such that
$$\mathsf{A}\mathsf{A}^{-1} = \mathsf{A}^{-1}\mathsf{A} = \mathsf{1}.$$
Every nonzero real (or complex) number has a multiplicative inverse, but the analogous statement fails for matrices: a nonzero matrix need not possess an inverse, and a square matrix without an inverse is called singular.
If $\det(\mathsf{A}) = 0$, the rows of $\mathsf{A}$ are linearly dependent, so some nontrivial linear combination of them is the zero row. Suppose an inverse $\mathsf{B}$ nevertheless existed, with $\mathsf{AB} = \mathsf{1}$; applying that same combination to the rows of $\mathsf{AB}$ would make the identical combination of the rows of $\mathsf{1}$ vanish.
Since we started with a matrix $\mathsf{1}$ whose rows are manifestly linearly independent, no nontrivial combination of its rows can vanish.
This is inconsistent with the nonzero combination coefficients we assumed, so a matrix with vanishing determinant can have no inverse.
The algebraic properties of real and complex numbers (including the existence of inverses for all nonzero numbers) define what mathematicians call a field. The properties we have identified for matrices are different; they form what is called a ring.
A closed, but cumbersome, formula for the inverse of a matrix exists; it expresses the elements of $\mathsf{A}^{-1}$ in terms of the cofactors of $\mathsf{A}$:
$$(\mathsf{A}^{-1})_{ij} = \frac{(-1)^{i+j} M_{ji}}{\det(\mathsf{A})},$$
where $M_{ji}$ is the minor of $a_{ji}$.
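A direct transcription of the cofactor formula into code (numpy; `inverse_by_cofactors` is an illustrative name) makes its cost obvious, since every element requires a determinant of order $n-1$:

```python
import numpy as np

def inverse_by_cofactors(a):
    """Inverse via the cofactor formula: (A^-1)_ij = (-1)^(i+j) M_ji / det(A).

    Fine for small matrices, but far slower than Gauss-Jordan for large n."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    inv = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            # M_ji: delete row j and column i.
            minor = np.delete(np.delete(a, j, axis=0), i, axis=1)
            inv[i, j] = (-1) ** (i + j) * np.linalg.det(minor)
    return inv / np.linalg.det(a)

a = [[2, 1, 0], [1, 3, 1], [0, 1, 4]]
print(inverse_by_cofactors(a) @ np.asarray(a))   # ~ unit matrix
```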
We describe here a well-known method that is computationally more efficient than the equation above, namely the Gauss-Jordan procedure.
Example: Gauss-Jordan Matrix Inversion
The Gauss-Jordan method is based on the fact that there exist matrices $\mathsf{M}_{\mathrm{L}}$ such that left-multiplication by $\mathsf{M}_{\mathrm{L}}$ multiplies a row of $\mathsf{A}$ by a constant, interchanges two rows, or replaces a row by the sum of itself and a multiple of another row.
By using these transformations, the rows of a matrix can be altered (by matrix multiplication) in the same way as we did to the elements of determinants. If the product $\mathsf{M}$ of such elementary transformations reduces $\mathsf{A}$ to the unit matrix, then from $\mathsf{MA} = \mathsf{1}$ it follows that $\mathsf{M} = \mathsf{A}^{-1}$.
What we need to do is to find out how to reduce $\mathsf{A}$ to $\mathsf{1}$ by row operations, applying every operation simultaneously to a copy of the unit matrix, which thereby becomes $\mathsf{A}^{-1}$.
Write, side by side, the matrix $\mathsf{A}$ to be inverted and the unit matrix $\mathsf{1}$, and apply each of the following row operations to both arrays simultaneously.
Multiply the rows as necessary to set to unity all elements of the first column of the left matrix.
Subtracting the first row from the second and third rows, we obtain zeros below the diagonal in the first column of the left matrix.
Divide the second row by its diagonal element and use it to clear the remaining elements of the second column.
Divide the third row by its diagonal element and clear the third column in the same way; the left matrix is now $\mathsf{1}$, and the right matrix, having undergone the identical operations, has become $\mathsf{A}^{-1}$. A runnable sketch of the full procedure follows.
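Here is that sketch, a minimal Gauss-Jordan inversion in Python/numpy (the pivoting step guards against a zero diagonal element and is a standard refinement of the hand procedure above):

```python
import numpy as np

def gauss_jordan_inverse(a):
    """Invert A by reducing [A | 1] to [1 | A^-1] with row operations."""
    a = np.asarray(a, dtype=float)
    n = len(a)
    aug = np.hstack([a, np.eye(n)])           # A and 1 written side by side
    for k in range(n):
        p = k + int(np.argmax(np.abs(aug[k:, k])))   # partial pivoting
        if aug[p, k] == 0.0:
            raise ValueError("matrix is singular")
        aug[[k, p]] = aug[[p, k]]
        aug[k] /= aug[k, k]                   # set the pivot to unity
        for i in range(n):                    # clear the rest of column k
            if i != k:
                aug[i] -= aug[i, k] * aug[k]
    return aug[:, n:]                         # right half is now A^-1

a = [[3, 2, 1], [2, 3, 1], [1, 1, 4]]
print(gauss_jordan_inverse(a) @ np.asarray(a))   # ~ unit matrix
```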
Derivatives of Determinants
The formula giving the inverse of a matrix in terms of its minors enables us to write a compact formula for the derivative of a determinant whose elements $a_{ij}$ depend on a variable $x$: the derivative of $\det(\mathsf{A})$ with respect to any single element $a_{ij}$ is the corresponding cofactor, $(-1)^{i+j} M_{ij} = \det(\mathsf{A})\,(\mathsf{A}^{-1})_{ji}$.
Applying now the chain rule to allow for the dependence of every element on $x$, we reach
$$\frac{d}{dx}\det(\mathsf{A}) = \sum_{ij} \det(\mathsf{A})\,(\mathsf{A}^{-1})_{ji}\,\frac{d a_{ij}}{dx} = \det(\mathsf{A})\;\mathrm{trace}\!\left(\mathsf{A}^{-1}\frac{d\mathsf{A}}{dx}\right).$$
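A numerical check of this formula on a made-up one-parameter family $\mathsf{A}(x)$ (the specific matrix below is only an example; the comparison is against a centered finite difference):

```python
import numpy as np

def a_of_x(x):
    """An arbitrary one-parameter family of 2x2 matrices A(x)."""
    return np.array([[np.cos(x), x], [x**2, 2.0 + np.sin(x)]])

def da_dx(x):
    """Element-by-element derivative dA/dx."""
    return np.array([[-np.sin(x), 1.0], [2.0 * x, np.cos(x)]])

x, h = 0.7, 1e-6
a = a_of_x(x)
# Compact formula: d(det A)/dx = det(A) * trace(A^-1 dA/dx)
analytic = np.linalg.det(a) * np.trace(np.linalg.inv(a) @ da_dx(x))
numeric = (np.linalg.det(a_of_x(x + h)) - np.linalg.det(a_of_x(x - h))) / (2 * h)
print(analytic, numeric)   # the two values agree to high accuracy
```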
Systems of Linear Equations
Note that if $\mathsf{A}$ is nonsingular, the linear system $\mathsf{A}\mathbf{x} = \mathbf{h}$ has the unique solution $\mathbf{x} = \mathsf{A}^{-1}\mathbf{h}$.
This tells us two things: (a) that if we can evaluate $\mathsf{A}^{-1}$, we can solve the system for any right-hand side $\mathbf{h}$; and (b) that a unique solution exists precisely when the inverse exists.
The result is important enough to be emphasized: A square matrix $\mathsf{A}$ is nonsingular, and the system $\mathsf{A}\mathbf{x} = \mathbf{h}$ uniquely solvable, if and only if $\det(\mathsf{A}) \neq 0$.
Determinant Product Theorem
The Product Theorem is that
$$\det(\mathsf{AB}) = \det(\mathsf{A})\,\det(\mathsf{B}).$$
Note that setting $\mathsf{B} = \mathsf{A}^{-1}$ gives $\det(\mathsf{A})\det(\mathsf{A}^{-1}) = \det(\mathsf{1}) = 1$, so $\det(\mathsf{A}^{-1}) = 1/\det(\mathsf{A})$, confirming once more that a singular matrix can have no inverse.
Rank of a Matrix
The concept of matrix singularity can be refined by introducing the notion of the rank of a matrix. If the elements of a matrix are viewed as the coefficients of a set of linear forms, a square matrix is assigned a rank equal to the number of linearly independent forms that its elements describe. Thus, a nonsingular $n \times n$ matrix has rank $n$, while a singular $n \times n$ matrix has rank less than $n$.
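For example (numpy's `matrix_rank` estimates rank numerically from singular values):

```python
import numpy as np

a = np.array([[1, 2, 3],
              [2, 4, 6],     # = 2 x (row 1): the rows are linearly dependent
              [0, 1, 1]])
print(np.linalg.det(a))            # ~0: the matrix is singular
print(np.linalg.matrix_rank(a))    # 2: only two independent linear forms
```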
Transpose, Adjoint, Trace
Transpose
The transpose of a matrix is the matrix that results from interchanging its row and column indices. This operation corresponds to subjecting the array to reflection about its principal diagonal. If a matrix is not square, its transpose will not even have the same shape as the original matrix. The transpose of $\mathsf{A}$ is denoted $\mathsf{A}^{T}$, with elements $(\mathsf{A}^{T})_{ij} = a_{ji}$.
Note that transposition will convert a column vector into a row vector. A matrix that is unchanged by transposition is called symmetric.
Adjoint
The adjoint of a matrix $\mathsf{A}$, written $\mathsf{A}^{\dagger}$, is formed by both transposing the matrix and complex conjugating its elements: $(\mathsf{A}^{\dagger})_{ij} = a_{ji}^{*}$.
Trace
The trace, a quantity defined for square matrices, is the sum of the elements on the principal diagonal. Thus, for an $n \times n$ matrix $\mathsf{A}$,
$$\mathrm{trace}(\mathsf{A}) = \sum_{i=1}^{n} a_{ii}.$$
Some properties of the trace:
$$\mathrm{trace}(\mathsf{A} + \mathsf{B}) = \mathrm{trace}(\mathsf{A}) + \mathrm{trace}(\mathsf{B}), \qquad \mathrm{trace}(\mathsf{AB}) = \mathrm{trace}(\mathsf{BA}).$$
The second property holds even if $\mathsf{AB} \neq \mathsf{BA}$; it implies that the trace of any commutator vanishes, $\mathrm{trace}([\mathsf{A}, \mathsf{B}]) = 0$.
Considering the trace of the matrix product $\mathsf{ABC}$ and grouping the factors as $(\mathsf{AB})\mathsf{C}$ or as $\mathsf{A}(\mathsf{BC})$, the second property gives $\mathrm{trace}(\mathsf{ABC}) = \mathrm{trace}(\mathsf{CAB}) = \mathrm{trace}(\mathsf{BCA})$.
Repeating this process, we also find that the trace of a product of any number of matrices is invariant under cyclic permutation of its factors (but not under arbitrary reorderings).
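These trace identities are easy to verify numerically on random matrices (numpy sketch):

```python
import numpy as np

rng = np.random.default_rng(0)
a, b, c = (rng.standard_normal((3, 3)) for _ in range(3))

print(np.trace(a @ b), np.trace(b @ a))   # equal even though AB != BA
print(np.trace(a @ b - b @ a))            # trace of a commutator: ~0
# Cyclic invariance: trace(ABC) = trace(CAB) = trace(BCA)
print(np.trace(a @ b @ c), np.trace(c @ a @ b), np.trace(b @ c @ a))
```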
Operations on Matrix Products
There are some properties of the determinant and trace of a matrix product:
$$\det(\mathsf{AB}) = \det(\mathsf{BA}), \qquad \mathrm{trace}(\mathsf{AB}) = \mathrm{trace}(\mathsf{BA}),$$
whether or not $\mathsf{A}$ and $\mathsf{B}$ commute.
For other operations on matrix products, the order of the factors is reversed:
$$(\mathsf{AB})^{T} = \mathsf{B}^{T}\mathsf{A}^{T}, \qquad (\mathsf{AB})^{\dagger} = \mathsf{B}^{\dagger}\mathsf{A}^{\dagger}, \qquad (\mathsf{AB})^{-1} = \mathsf{B}^{-1}\mathsf{A}^{-1}.$$
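The reversal-of-order rules can likewise be spot-checked on random complex matrices:

```python
import numpy as np

rng = np.random.default_rng(1)
a = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))
b = rng.standard_normal((3, 3)) + 1j * rng.standard_normal((3, 3))

# Order reverses under transpose, adjoint, and inverse:
print(np.allclose((a @ b).T, b.T @ a.T))                        # True
print(np.allclose((a @ b).conj().T, b.conj().T @ a.conj().T))   # True
print(np.allclose(np.linalg.inv(a @ b),
                  np.linalg.inv(b) @ np.linalg.inv(a)))         # True
```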
Matrix Representation of Vectors
I have nothing to say here, because it is easy to understand. (I am going to use column vectors to represent vectors, so that matrices act on them from the left.)
Orthogonal Matrices
A real matrix is termed orthogonal if its transpose is equal to its inverse. Thus, if $\mathsf{S}$ is orthogonal,
$$\mathsf{S}^{-1} = \mathsf{S}^{T}, \qquad \text{i.e.} \qquad \mathsf{S}\mathsf{S}^{T} = \mathsf{S}^{T}\mathsf{S} = \mathsf{1}.$$
Since, for any matrix, $\det(\mathsf{S}^{T}) = \det(\mathsf{S})$, the product theorem applied to $\mathsf{S}\mathsf{S}^{T} = \mathsf{1}$ gives $\det(\mathsf{S})^2 = 1$, so the determinant of an orthogonal matrix is $\pm 1$.
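For instance, the familiar $2 \times 2$ rotation matrix is orthogonal with determinant $+1$ (numpy check):

```python
import numpy as np

theta = 0.3
s = np.array([[np.cos(theta), -np.sin(theta)],    # a plane rotation, the
              [np.sin(theta),  np.cos(theta)]])   # classic orthogonal matrix

print(np.allclose(s.T @ s, np.eye(2)))   # True: S^T = S^-1
print(np.linalg.det(s))                  # +1 (a reflection would give -1)
```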
Unitary Matrices
A matrix whose adjoint is also its inverse is identified as unitary. One way of expressing this relationship is
$$\mathsf{U}^{\dagger} = \mathsf{U}^{-1}, \qquad \text{i.e.} \qquad \mathsf{U}\mathsf{U}^{\dagger} = \mathsf{U}^{\dagger}\mathsf{U} = \mathsf{1}.$$
If all the elements of a unitary matrix are real , the matrix is also orthogonal.
Since for any matrix $\det(\mathsf{U}^{\dagger}) = \det(\mathsf{U})^{*}$, the product theorem applied to $\mathsf{U}\mathsf{U}^{\dagger} = \mathsf{1}$ shows that $|\det(\mathsf{U})|^{2} = 1$: the determinant of a unitary matrix is a complex number of unit magnitude.
We observe that if $\mathsf{U}_1$ and $\mathsf{U}_2$ are unitary, so is their product, since $(\mathsf{U}_1\mathsf{U}_2)^{\dagger}(\mathsf{U}_1\mathsf{U}_2) = \mathsf{U}_2^{\dagger}\mathsf{U}_1^{\dagger}\mathsf{U}_1\mathsf{U}_2 = \mathsf{1}$.
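A small check on a concrete unitary matrix (this particular $\mathsf{U}$ is $\exp(i\varphi\,\sigma_x)$, chosen only for illustration):

```python
import numpy as np

phi = 0.5
u = np.array([[np.cos(phi), 1j * np.sin(phi)],
              [1j * np.sin(phi), np.cos(phi)]])   # unitary 2x2 matrix

print(np.allclose(u.conj().T @ u, np.eye(2)))     # True: U-dagger = U^-1
print(abs(np.linalg.det(u)))                      # 1.0: |det U| = 1
```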
Hermitian Matrices
A matrix is identified as Hermitian, or, synonymously, self-adjoint, if it is equal to its adjoint. To be self-adjoint, a matrix $\mathsf{H}$ must be square and must satisfy $\mathsf{H}^{\dagger} = \mathsf{H}$, i.e. $h_{ij} = h_{ji}^{*}$.
We see that the principal diagonal elements must be real.
Note that if two matrices $\mathsf{A}$ and $\mathsf{B}$ are Hermitian, their product $\mathsf{AB}$ is Hermitian only if $\mathsf{A}$ and $\mathsf{B}$ commute, because $(\mathsf{AB})^{\dagger} = \mathsf{B}^{\dagger}\mathsf{A}^{\dagger} = \mathsf{BA}$.
Extraction of a Row or Column
It is useful to define column vectors $\hat{\mathbf{e}}_i$ whose elements are $(\hat{\mathbf{e}}_i)_j = \delta_{ij}$, i.e. a single unit entry in position $i$ and zeros elsewhere. Then $\mathsf{A}\hat{\mathbf{e}}_j$ extracts the $j$th column of $\mathsf{A}$, the row vector $\hat{\mathbf{e}}_i^{T}\mathsf{A}$ extracts its $i$th row, and $\hat{\mathbf{e}}_i^{T}\mathsf{A}\,\hat{\mathbf{e}}_j = a_{ij}$.
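A concrete illustration (numpy):

```python
import numpy as np

a = np.array([[1, 2, 3],
              [4, 5, 6],
              [7, 8, 9]])
e2 = np.array([0.0, 1.0, 0.0])   # hat-e_2: unit entry in position 2

print(a @ e2)          # [2. 5. 8.]  -- the second column of A
print(e2 @ a)          # [4. 5. 6.]  -- the second row of A
print(e2 @ a @ e2)     # 5.0         -- the element a_22
```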