# Eigenvalues and Eigenvectors

Eigenvalues and eigenvectors are among the most powerful tools in linear algebra.

In the previous article, we talked about matrices and linear transformations in the context of solving systems of equations. But there are many cases where we have a linear transformation from a space to itself, and in those cases we can apply the transformation repeatedly.

Repeatedly applying a linear transformation can be quite time consuming. Furthermore, while the columns of a matrix tell us where the basis vectors are sent, it’s difficult to interpret what a matrix is doing geometrically. The solution to both of these problems is to rewrite everything in the eigenbasis.
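To make "repeatedly applying" concrete, here's a minimal numpy sketch (the particular matrix and starting vector are illustrative choices, not from the article): applying a transformation $n$ times one step at a time is the same as multiplying by the $n$-th matrix power.

```python
import numpy as np

# An arbitrary 2x2 linear transformation of the plane to itself.
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

v = np.array([1.0, 0.0])

# Apply the transformation ten times, one multiplication per step.
for _ in range(10):
    v = A @ v

# Equivalently, compute the tenth matrix power once and apply it.
w = np.linalg.matrix_power(A, 10) @ np.array([1.0, 0.0])

print(np.allclose(v, w))  # both approaches give the same vector
```

Either way, the cost of the naive loop grows with the number of applications, which is part of what makes the eigenbasis attractive.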

# Prerequisites

While this is part of a series, you only need the material covered in the previous article.

# Motivation

While I’ve given some general motivation in the intro, it will become clearer with some examples.

# Markov Chains

Say you live in a city where the weather on one day depends on the weather of the previous day with the following rules:

• If it’s sunny today, there’s a 75% chance it will be sunny tomorrow and a 25% chance it will be cloudy tomorrow.
• If it’s cloudy today, there’s a 50% chance it will be sunny tomorrow and a 50% chance it will be cloudy tomorrow.
• Seasons don’t affect the weather. I’m making this assumption for simplicity, not out of necessity; if you want, you can work out a day-by-day change in the probabilities.

On a scale of Phoenix, Arizona to Seattle, Washington, how sunny would you say this city is? To approach the problem, start with a 2D vector whose entries are the probability that a given day is sunny and the probability that it is cloudy; the entries sum to 1. If you’re stuck, consider what this probability vector looks like when the probability that today is sunny is 100%.
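The rules above can be sketched numerically (the matrix name `P` and the starting vector are my labels for this sketch, not fixed by the article): encode the transition probabilities as columns of a matrix, start from "certainly sunny today," and step the chain forward day by day.

```python
import numpy as np

# Column 1: today is sunny; column 2: today is cloudy.
# Rows give tomorrow's probabilities, matching the rules above.
P = np.array([[0.75, 0.50],   # P(sunny tomorrow)
              [0.25, 0.50]])  # P(cloudy tomorrow)

# Start from certainty: it is sunny today.
v = np.array([1.0, 0.0])

# Step the chain forward twenty days.
for _ in range(20):
    v = P @ v

print(v)  # → approximately [0.667, 0.333]
```

Notice that the day-by-day probabilities settle down to roughly two-thirds sunny regardless of how far you run the loop; that limiting vector is exactly the kind of object the eigenbasis will explain.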