Eigenvalues and eigenvectors

In linear algebra, it is often important to know which vectors have their directions unchanged by a given linear transformation. An eigenvector (/ˈaɪɡən-/ EYE-gən-) or characteristic vector is such a vector. Thus an eigenvector $\mathbf{v}$ of a linear transformation $T$ is scaled by a constant factor $\lambda$ when the linear transformation is applied to it: $T(\mathbf{v}) = \lambda \mathbf{v}$. The corresponding eigenvalue, characteristic value, or characteristic root is the multiplying factor $\lambda$.
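
As a minimal worked example (an illustrative matrix and vector chosen for this sketch, not drawn from the text above): the matrix below scales the vector $(1, 1)$ by a factor of $3$ without rotating it, so $(1, 1)$ is an eigenvector with eigenvalue $3$.

```latex
% Worked instance of the defining relation T(v) = \lambda v
% (matrix and vector are illustrative choices).
A = \begin{pmatrix} 2 & 1 \\ 0 & 3 \end{pmatrix},
\qquad
\mathbf{v} = \begin{pmatrix} 1 \\ 1 \end{pmatrix},
\qquad
A\mathbf{v} = \begin{pmatrix} 2\cdot 1 + 1\cdot 1 \\ 0\cdot 1 + 3\cdot 1 \end{pmatrix}
            = \begin{pmatrix} 3 \\ 3 \end{pmatrix}
            = 3\,\mathbf{v}.
```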

Geometrically, vectors are multi-dimensional quantities with magnitude and direction, often pictured as arrows. A linear transformation rotates, stretches, or shears the vectors upon which it acts. Its eigenvectors are those vectors that are only stretched, with no rotation or shear. The corresponding eigenvalue is the factor by which an eigenvector is stretched or squished. If the eigenvalue is negative, the eigenvector's direction is reversed.[1]
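
The geometric picture can be checked numerically. The sketch below is an illustrative example, not part of the original article; it assumes NumPy and two hand-picked matrices. A horizontal shear leaves vectors along its invariant axis unrotated (so they are eigenvectors), while a reflection has an eigenvalue of −1 that reverses the eigenvector's direction.

```python
import numpy as np

# Illustrative sketch: a horizontal shear leaves vectors on the x-axis
# unrotated, so they are eigenvectors (here with eigenvalue 1).
shear = np.array([[1.0, 1.0],
                  [0.0, 1.0]])

v_eigen = np.array([1.0, 0.0])   # lies on the shear's invariant axis
v_other = np.array([0.0, 1.0])   # gets sheared; its direction changes

print(shear @ v_eigen)   # [1. 0.] -> same direction, eigenvalue 1
print(shear @ v_other)   # [1. 1.] -> direction changed, not an eigenvector

# A negative eigenvalue reverses the eigenvector's direction:
reflect = np.array([[-1.0, 0.0],
                    [ 0.0, 1.0]])
print(reflect @ v_eigen) # [-1. 0.] -> reversed, eigenvalue -1
```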

The eigenvectors and eigenvalues of a transformation serve to characterize it, and so they play important roles in all the areas where linear algebra is applied, from geology to quantum mechanics. In particular, it is often the case that a system is represented by a linear transformation whose outputs are fed back in as inputs to the same system (feedback). In such an application, the largest eigenvalue is of particular importance, because it governs the long-term behavior of the system after many applications of the linear transformation, and the associated eigenvector is the steady state of the system.
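
A minimal sketch of that feedback behavior, assuming a hypothetical 2×2 system matrix and NumPy: repeatedly applying the matrix (power iteration) drives the state toward the eigenvector of the largest-magnitude eigenvalue, which here acts as the steady state.

```python
import numpy as np

# Illustrative sketch: feeding a system's output back in as input
# (power iteration) converges to the dominant eigenvector.
A = np.array([[0.9, 0.2],
              [0.1, 0.8]])          # hypothetical system matrix

x = np.array([1.0, 0.0])            # arbitrary starting state
for _ in range(100):
    x = A @ x                       # apply the transformation (feedback step)
    x = x / np.linalg.norm(x)       # renormalize; only the direction matters

dominant_eigenvalue = x @ A @ x     # Rayleigh quotient estimate for unit x
print(x)                            # steady-state direction (eigenvector)
print(dominant_eigenvalue)          # approximately the largest eigenvalue of A
```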
