Chapter 1 Vector spaces

In your first course in linear algebra, you likely worked a lot with vectors in two and three dimensions, where they can be visualized geometrically as objects with magnitude and direction (and drawn as arrows). You probably extended your understanding of vectors to include column vectors; that is, \(n\times 1\) matrices of the form \(\vv=\bbm v_1\\v_2\\\vdots\\v_n\ebm\text{.}\)
Using either geometric arguments (in \(\R^2\) or \(\R^3\)) or the properties of matrix arithmetic, you would have learned that these vectors can be added by adding corresponding components, and multiplied by scalars (that is, real numbers) by multiplying each component of the vector by the scalar.
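For instance, in \(\R^3\) these componentwise operations give
\begin{equation*}
\bbm 1\\2\\3\ebm + \bbm 4\\-1\\0\ebm = \bbm 5\\1\\3\ebm, \qquad 2\bbm 1\\2\\3\ebm = \bbm 2\\4\\6\ebm\text{.}
\end{equation*}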
It’s also likely, although you may not have spent too long thinking about it, that you looked at the properties obeyed by the addition and scalar multiplication of vectors (or, for that matter, matrices). For example, you may have made use of the fact that order of addition doesn’t matter, or that scalar multiplication distributes over addition. You may have also experienced some frustration due to the fact that for matrices, order of multiplication does matter!
It turns out that the algebraic properties satisfied by vector addition and scalar multiplication are not unique to vectors as they were understood in your first course in linear algebra. In fact, many types of mathematical object exhibit similar behaviour. Examples include matrices, polynomials, and even functions.
Linear algebra, as an abstract mathematical topic, begins with a realization of the importance of these properties. Indeed, these properties, established as theorems for vectors in \(\R^n\text{,}\) become the axioms for the abstract notion of a vector space. The advantage of abstracting these ideas is that any proofs we write that depend only on these axioms will automatically be valid for any set of objects satisfying those axioms. That is, a result that is true for vectors in \(\R^2\) is often also true for vectors in \(\R^n\text{,}\) and for matrices, and polynomials, and so on. Mathematicians like to be efficient, and prefer to establish a result once in an abstract setting, knowing that it will then apply to many concrete settings that fit into the framework of the abstract result.