Linear algebra

Linear algebra is the branch of mathematics concerned with the study of vectors, vector spaces (also called linear spaces), linear maps (also called linear transformations), and systems of linear equations. Vector spaces are a central theme in modern mathematics; thus, linear algebra is widely used in both abstract algebra and functional analysis. Linear algebra also has a concrete representation in analytic geometry and it is generalized in operator theory. It has extensive applications in the natural sciences and the social sciences, since nonlinear models can often be approximated by linear ones.

History

The history of modern linear algebra dates back to the early 1840s. In 1843, William Rowan Hamilton introduced quaternions, which describe mechanics in three-dimensional space. In 1844, Hermann Grassmann published his book Die lineale Ausdehnungslehre (see References). Arthur Cayley introduced matrices, one of the most fundamental linear algebraic ideas, in 1857. Despite these early developments, linear algebra was developed primarily in the twentieth century.

Matrices were poorly defined before the development of ring theory within abstract algebra. With the coming of special relativity, many practitioners gained an appreciation of the subtleties of linear algebra. Furthermore, the routine application of Cramer's rule to solve partial differential equations led to the inclusion of linear algebra in standard coursework at universities. E.T. Copson wrote, for instance:
When I went to Edinburgh as a young lecturer in 1922, I was surprised to find how different the curriculum was from that at Oxford. It included topics such as Lebesgue integration, matrix theory, numerical analysis, Riemannian geometry, of which I knew nothing...

—E.T. Copson, Preface to Partial Differential Equations, 1973

Francis Galton initiated the use of correlation coefficients in 1888. Often more than one random variable is in play, and the variables may be cross-correlated. In the statistical analysis of multivariate random variables, the correlation matrix is a natural tool. Thus, the statistical study of such random vectors helped establish matrix usage.
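
As a small illustration of this use of matrices (the data below are invented for the example, and NumPy's corrcoef routine is assumed to be available), the correlation matrix of a set of jointly observed variables collects all pairwise correlation coefficients into one symmetric table:

    import numpy as np

    # Hypothetical data: three variables observed over five trials,
    # one row per variable.
    observations = np.array([
        [2.0, 4.0, 6.0, 8.0, 10.0],   # variable X
        [1.0, 3.0, 2.0, 5.0,  4.0],   # variable Y
        [9.0, 7.0, 6.0, 4.0,  2.0],   # variable Z, roughly decreasing with X
    ])

    # Entry (i, j) is the correlation coefficient between variable i and
    # variable j; the diagonal entries are 1 and the matrix is symmetric.
    R = np.corrcoef(observations)
    print(R)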

Elementary introduction

Linear algebra had its beginnings in the study of vectors in Cartesian 2-space and 3-space. A vector, here, is a directed line segment, characterized by both its magnitude, represented by length, and its direction. Vectors can be used to represent physical entities such as forces, and they can be added to each other and multiplied by scalars, thus forming the first example of a real vector space.
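
As a small numerical sketch (the particular vectors and values are illustrative assumptions, not taken from the article), the snippet below treats vectors in 3-space as triples of real numbers and exercises the two operations that make them a real vector space, addition and multiplication by a scalar:

    import numpy as np

    # Two vectors in Cartesian 3-space, e.g. forces acting at a point.
    u = np.array([1.0, 2.0, 0.5])
    v = np.array([-0.5, 1.0, 3.0])

    # Vector addition: components are added coordinate-wise.
    print(u + v)            # [0.5  3.   3.5]

    # Scalar multiplication: every component is scaled by the same real number.
    print(2.5 * u)          # [2.5   5.    1.25]

    # The magnitude (length) of a vector, one of its two defining features.
    print(np.linalg.norm(u))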

Modern linear algebra has been extended to consider spaces of arbitrary or infinite dimension. A vector space of dimension n is called an n-space. Most of the useful results from 2- and 3-space can be extended to these higher-dimensional spaces. Although many people cannot easily visualize vectors in n-space, such vectors or n-tuples are useful in representing data. Since vectors, as n-tuples, are ordered lists of n components, data can be summarized and manipulated efficiently in this framework. For example, in economics one can use an 8-dimensional vector, or 8-tuple, to represent the Gross National Product (GNP) of eight countries. To display the GNP of eight countries for a particular year, one fixes an order for the countries, for example (United States, United Kingdom, France, Germany, Spain, India, Japan, Australia), and uses a vector (v1, v2, v3, v4, v5, v6, v7, v8) in which each country's GNP appears in its respective position.
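
To make the bookkeeping concrete, here is a minimal sketch of such an 8-tuple (the GNP figures are made up purely for illustration) together with the kind of component-wise manipulation vectors allow, such as rescaling every entry by a single number:

    import numpy as np

    countries = ["United States", "United Kingdom", "France", "Germany",
                 "Spain", "India", "Japan", "Australia"]

    # Hypothetical GNP figures for one year, in billions of dollars,
    # stored in the same order as the countries above.
    gnp = np.array([21000.0, 3100.0, 2900.0, 4200.0,
                    1400.0, 3300.0, 5100.0, 1600.0])

    # Scalar multiplication converts every entry at once,
    # e.g. into another currency at an assumed exchange rate.
    gnp_other_currency = 0.92 * gnp

    # The position of each component identifies its country.
    print(countries[5], gnp[5])   # India's entry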

A vector space (or linear space), as a purely abstract concept about which theorems are proved, is part of abstract algebra, and is well integrated into this discipline. Some striking examples of this are the group of invertible linear maps or matrices, and the ring of linear maps of a vector space. Linear algebra also plays an important part in analysis, notably, in the description of higher order derivatives in vector analysis and the study of tensor products and alternating maps.

In this abstract setting, the scalars by which an element of a vector space can be multiplied need not be numbers. The only requirement is that the scalars form a mathematical structure called a field. In applications, this field is usually the field of real numbers or the field of complex numbers. Linear maps take elements from one linear space to another (or to the same space), in a manner that is compatible with the addition and scalar multiplication given on the vector space(s). The set of all such transformations is itself a vector space. If a basis for a vector space is fixed, every linear transformation can be represented by a table of numbers called a matrix. The detailed study of the properties of, and algorithms acting on, matrices, including determinants and eigenvectors, is considered to be part of linear algebra.
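
A brief sketch of these definitions (the particular map is an illustrative choice, not one from the article): with respect to the standard basis of R^2, rotation by 90 degrees is represented by a 2-by-2 matrix, and applying that matrix respects addition and scalar multiplication, which is exactly the compatibility required of a linear map.

    import numpy as np

    # Matrix representing counterclockwise rotation by 90 degrees
    # with respect to the standard basis of R^2.
    A = np.array([[0.0, -1.0],
                  [1.0,  0.0]])

    u = np.array([1.0, 2.0])
    v = np.array([3.0, -1.0])
    c = 4.0

    # Compatibility with vector addition: T(u + v) == T(u) + T(v)
    print(np.allclose(A @ (u + v), A @ u + A @ v))   # True

    # Compatibility with scalar multiplication: T(c u) == c T(u)
    print(np.allclose(A @ (c * u), c * (A @ u)))     # True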

One can say quite simply that the linear problems of mathematics, those that exhibit linearity in their behavior, are the ones most likely to be solved. For example, differential calculus does a great deal with linear approximation to functions. The difference from nonlinear problems is very important in practice.
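
For instance, the best linear approximation of a differentiable function f near a point a is f(a) + f'(a)(x - a). The sketch below (the choice of function and point is an illustrative assumption) compares this approximation with the exact value for f(x) = sin x near a = 0:

    import math

    # Linear approximation of f near a:  f(x) ≈ f(a) + f'(a) * (x - a).
    # Here f(x) = sin(x), so f'(x) = cos(x), and we expand about a = 0.
    a = 0.0
    f_a = math.sin(a)        # 0
    fprime_a = math.cos(a)   # 1

    for x in (0.1, 0.5, 1.0):
        linear = f_a + fprime_a * (x - a)   # reduces to x when a = 0
        print(x, math.sin(x), linear)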

The general method of finding a linear way to look at a problem, expressing it in terms of linear algebra, and solving it, if need be by matrix calculations, is one of the most widely applicable methods in mathematics.
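
As a minimal sketch of that workflow (the system of equations below is made up for illustration), one writes the linear equations as a single matrix equation Ax = b and solves it numerically:

    import numpy as np

    # The system   2x +  y = 5
    #               x + 3y = 10
    # written as A @ [x, y] = b.
    A = np.array([[2.0, 1.0],
                  [1.0, 3.0]])
    b = np.array([5.0, 10.0])

    x = np.linalg.solve(A, b)   # solves the matrix equation directly
    print(x)                    # [1. 3.]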

Generalisation and related topics

Since linear algebra is a successful theory, its methods have been developed in other parts of mathematics. In module theory one replaces the field of scalars by a ring. In multilinear algebra one deals with the 'several variables' problem of mappings that are linear in each of a number of different variables, which leads inevitably to the tensor concept. In the spectral theory of operators, control of infinite-dimensional matrices is gained by applying mathematical analysis in a theory that is not purely algebraic. In all these cases the technical difficulties are much greater.

Note

1. ^ The existence of a basis is straightforward for finitely generated vector spaces, but in full generality it is logically equivalent to the axiom of choice.

References

History

  • Fearnley-Sander, Desmond, Hermann Grassmann and the Creation of Linear Algebra, American Mathematical Monthly 86 (1979), pp. 809–817.
  • Grassmann, Hermann, Die lineale Ausdehnungslehre ein neuer Zweig der Mathematik: dargestellt und durch Anwendungen auf die übrigen Zweige der Mathematik, wie auch auf die Statik, Mechanik, die Lehre vom Magnetismus und die Krystallonomie erläutert, O. Wigand, Leipzig, 1844.
