Skew-symmetric matrix

In linear algebra, a skew-symmetric (or antisymmetric) matrix is a square matrix A whose transpose is also its negative; that is, it satisfies the equation:

A^T = −A


or in component form, if A = (a_ij):

a_ij = −a_ji   for all i and j.

For example, the following matrix is skew-symmetric:

    [  0   2  -1 ]
    [ -2   0  -4 ]
    [  1   4   0 ]

Compare this with a symmetric matrix, whose transpose is the same as the matrix: A^T = A.
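The defining property is easy to verify numerically. The following sketch uses NumPy (a tooling choice not made by the article itself) to check both a skew-symmetric and a symmetric matrix:

```python
import numpy as np

# A sample 3x3 skew-symmetric matrix: zero diagonal, a_ij = -a_ji.
A = np.array([[ 0.,  2., -1.],
              [-2.,  0., -4.],
              [ 1.,  4.,  0.]])

# The defining property: the transpose equals the negative.
print(np.array_equal(A.T, -A))  # True

# A symmetric matrix, by contrast, equals its own transpose.
S = np.array([[1., 7.],
              [7., 3.]])
print(np.array_equal(S.T, S))  # True
```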

Properties

Sums and scalar products of skew-symmetric matrices are again skew-symmetric. Hence, the n×n skew-symmetric matrices form a vector space; its dimension is n(n−1)/2.
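Closure under sums and scalar products can be checked directly; the dimension count n(n−1)/2 corresponds to one free entry per strictly-upper-triangular slot. A small NumPy sketch (illustrative only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

def random_skew(n, rng):
    """Build a random n x n skew-symmetric matrix from an arbitrary one."""
    M = rng.standard_normal((n, n))
    return (M - M.T) / 2

A, B = random_skew(n, rng), random_skew(n, rng)

# Closure: sums and scalar multiples stay skew-symmetric.
C = 3 * A + B
print(np.allclose(C.T, -C))  # True

# Dimension n(n-1)/2: one free entry per strictly-upper-triangular slot.
print(n * (n - 1) // 2)  # 6
```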

If A is a skew-symmetric matrix and B is any square matrix of the same size, then the triple product B^T A B is skew-symmetric.
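The reason is a one-line transpose calculation: (B^T A B)^T = B^T A^T B = −B^T A B. A quick numeric confirmation, sketched in NumPy:

```python
import numpy as np

rng = np.random.default_rng(1)
A = rng.standard_normal((3, 3))
A = (A - A.T) / 2                # make A skew-symmetric
B = rng.standard_normal((3, 3))  # B can be any square matrix

# (B^T A B)^T = B^T A^T B = -(B^T A B), so the product is skew-symmetric.
P = B.T @ A @ B
print(np.allclose(P.T, -P))  # True
```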

The "skew-symmetric component" of a square matrix A is the matrix (A − A^T)/2; the "symmetric component" of A is (A + A^T)/2; the matrix A is the sum of its symmetric and skew-symmetric components.
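This decomposition can be computed and verified directly; the NumPy sketch below is illustrative:

```python
import numpy as np

rng = np.random.default_rng(2)
A = rng.standard_normal((3, 3))

sym  = (A + A.T) / 2   # symmetric component
skew = (A - A.T) / 2   # skew-symmetric component

print(np.allclose(sym.T, sym))     # True
print(np.allclose(skew.T, -skew))  # True
print(np.allclose(sym + skew, A))  # True: A is the sum of the two parts
```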

A real matrix A is skew-symmetric if and only if x^T A x = 0 for all real vectors x.

All main diagonal entries of a skew-symmetric matrix have to be zero, and so the trace is zero.
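Both facts (the vanishing quadratic form and the zero diagonal, hence zero trace) are easy to check numerically; a NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(3)
A = rng.standard_normal((4, 4))
A = (A - A.T) / 2   # random skew-symmetric matrix

# x^T A x = 0 for every real vector x ...
x = rng.standard_normal(4)
print(np.isclose(x @ A @ x, 0.0))  # True

# ... and the diagonal (hence the trace) is zero.
print(np.allclose(np.diag(A), 0.0))  # True
print(np.isclose(np.trace(A), 0.0))  # True
```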

The determinant of a skew-symmetric matrix

Let A be an n×n skew-symmetric matrix. The determinant of A satisfies

det(A) = det(A^T) = det(−A) = (−1)^n det(A).


In particular, if n is odd, the determinant vanishes. This result is called Jacobi's theorem, after Carl Gustav Jacobi (Eves, 1980).

The even-dimensional case is more interesting. It turns out that the determinant of A for n even can be written as the square of a polynomial in the entries of A, a theorem due to Thomas Muir:

det(A) = Pf(A)^2.


This polynomial is called the Pfaffian of A and is denoted Pf(A). Thus the determinant of a real skew-symmetric matrix is always non-negative.
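Both halves of this section can be illustrated numerically. For a 4×4 skew-symmetric matrix the Pfaffian has the closed form Pf(A) = a_12 a_34 − a_13 a_24 + a_14 a_23 (1-based indices); the NumPy sketch below checks Jacobi's theorem in odd dimension and det(A) = Pf(A)^2 in dimension four:

```python
import numpy as np

# Odd dimension: the determinant vanishes (Jacobi's theorem).
A3 = np.array([[ 0.,  2., -1.],
               [-2.,  0., -4.],
               [ 1.,  4.,  0.]])
print(np.isclose(np.linalg.det(A3), 0.0))  # True

# Even dimension: det(A) = Pf(A)^2.  For a 4x4 skew-symmetric matrix,
# Pf(A) = a12*a34 - a13*a24 + a14*a23 (1-based indices).
rng = np.random.default_rng(4)
A4 = rng.standard_normal((4, 4))
A4 = (A4 - A4.T) / 2
pf = (A4[0, 1] * A4[2, 3]
      - A4[0, 2] * A4[1, 3]
      + A4[0, 3] * A4[1, 2])
print(np.isclose(np.linalg.det(A4), pf**2))  # True
```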

Spectral theory

The eigenvalues of a skew-symmetric matrix always come in pairs ±λ (except in the odd-dimensional case, where there is an additional unpaired 0 eigenvalue). For a real skew-symmetric matrix the nonzero eigenvalues are all purely imaginary and thus are of the form iλ_1, −iλ_1, iλ_2, −iλ_2, … where each λ_k is real.

Real skew-symmetric matrices are normal matrices (they commute with their adjoints) and are thus subject to the spectral theorem, which states that any real skew-symmetric matrix can be diagonalized by a unitary matrix. Since the nonzero eigenvalues of a real skew-symmetric matrix are purely imaginary, it is not possible to diagonalize one by a real matrix. However, it is possible to bring every skew-symmetric matrix to a block diagonal form by an orthogonal transformation. Specifically, every 2n×2n real skew-symmetric matrix can be written in the form A = QΣQ^T, where Q is orthogonal and Σ is block diagonal with 2×2 blocks of the form

    [  0    λ_k ]
    [ -λ_k   0  ]

for real λ_k. The nonzero eigenvalues of this matrix are ±iλ_k. In the odd-dimensional case Σ always has at least one row and column of zeros.
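The eigenvalue structure can be observed numerically: the spectrum of a random real skew-symmetric matrix has (up to rounding error) no real part, and its imaginary parts come in ± pairs. A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(5)
A = rng.standard_normal((4, 4))
A = (A - A.T) / 2

# Eigenvalues are purely imaginary and come in conjugate pairs +-i*lambda_k.
ev = np.linalg.eigvals(A)
print(np.allclose(ev.real, 0.0))                         # True: no real part
print(np.allclose(np.sort(ev.imag), np.sort(-ev.imag)))  # True: +- pairs
```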

Alternating forms

An alternating form φ on a vector space V over a field K is defined (if K doesn't have characteristic 2) to be a bilinear form

φ : V × V → K

such that

φ(v, w) = −φ(w, v).

Such a φ will be represented by a skew-symmetric matrix, once a basis of V is chosen; and conversely an n×n skew-symmetric matrix A on K^n gives rise to an alternating form sending (v, w) to v^T A w.
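The correspondence in the matrix direction is easy to illustrate: define φ(v, w) = v^T A w for a skew-symmetric A and check the sign flip. A NumPy sketch:

```python
import numpy as np

rng = np.random.default_rng(6)
A = rng.standard_normal((3, 3))
A = (A - A.T) / 2

def phi(v, w):
    """Bilinear form represented by the skew-symmetric matrix A."""
    return v @ A @ w

v, w = rng.standard_normal(3), rng.standard_normal(3)
print(np.isclose(phi(v, w), -phi(w, v)))  # True: the form is alternating
```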

Infinitesimal rotations

Skew-symmetric matrices form the tangent space to the orthogonal group O(n) at the identity matrix. In a sense, then, skew-symmetric matrices can be thought of as infinitesimal rotations.

Another way of saying this is that the space of skew-symmetric matrices forms the Lie algebra o(n) of the Lie group O(n). The Lie bracket on this space is given by the commutator

[A, B] = AB − BA.

It is easy to check that the commutator of two skew-symmetric matrices is again skew-symmetric: [A, B]^T = B^T A^T − A^T B^T = BA − AB = −[A, B].
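A quick numeric confirmation of this closure property, sketched in NumPy:

```python
import numpy as np

rng = np.random.default_rng(7)
A = rng.standard_normal((3, 3)); A = (A - A.T) / 2
B = rng.standard_normal((3, 3)); B = (B - B.T) / 2

# Lie bracket [A, B] = AB - BA of two skew-symmetric matrices.
C = A @ B - B @ A
print(np.allclose(C.T, -C))  # True: the bracket stays skew-symmetric
```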

The matrix exponential of a skew-symmetric matrix A is then an orthogonal matrix R:

R = exp(A) = I + A + A^2/2! + A^3/3! + …
The image of the exponential map of a Lie algebra always lies in the connected component of the Lie group that contains the identity element. In the case of the Lie group O(n), this connected component is the special orthogonal group SO(n), consisting of all orthogonal matrices with determinant 1. So R = exp(A) will have determinant +1. It turns out that every orthogonal matrix with unit determinant can be written as the exponential of some skew-symmetric matrix.
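This can be illustrated numerically. The sketch below computes exp(A) by truncating its power series (adequate for a small, well-scaled matrix; a production code would use a dedicated routine such as SciPy's expm) and checks that the result is orthogonal with determinant +1:

```python
import numpy as np

rng = np.random.default_rng(8)
A = rng.standard_normal((3, 3))
A = (A - A.T) / 2

# Matrix exponential via its power series, exp(A) = sum_k A^k / k!.
# (Truncated series: fine here because A is small and well-scaled.)
R = np.zeros_like(A)
term = np.eye(3)
for k in range(1, 30):
    R = R + term          # add the A^(k-1)/(k-1)! term
    term = term @ A / k   # next term of the series

# R is orthogonal with determinant +1, i.e. a rotation in SO(3).
print(np.allclose(R.T @ R, np.eye(3)))    # True
print(np.isclose(np.linalg.det(R), 1.0))  # True
```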

References

Eves, Howard (1980). Elementary Matrix Theory. Dover Publications.
