That's a good observation, and it is indeed true for many Markov chains. But your counterexample of the identity matrix is not quite right; every vector is an eigenvector of the identity, so there is no "realignment" needed.

More generally, you're asking when the iteration `x_+ = Ax` converges to a fixed point that is an eigenvector of A. This can happen in a few different ways. The obvious one is that A has an eigenvector `v` with eigenvalue 1, while all other eigenvalues have magnitude < 1. Then the other components die out under repeated application of A, leaving only the `v` component in the limit (provided the initial condition has a nonzero component along `v`).
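
As a quick illustration (a made-up numpy sketch, not anything from the original question): the matrix below has eigenvalues 1 (eigenvector `[1, 1]`) and 0.5 (eigenvector `[1, -1]`), so the `[1, -1]` component of any starting vector halves on each step and only the eigenvalue-1 direction survives.

```python
import numpy as np

# Made-up 2x2 matrix: eigenvalues are 1 (eigenvector [1, 1])
# and 0.5 (eigenvector [1, -1]).
A = np.array([[0.75, 0.25],
              [0.25, 0.75]])

x = np.array([0.9, 0.1])  # arbitrary initial vector
for _ in range(50):
    x = A @ x  # the eigenvalue-0.5 component shrinks by half each step

print(x)  # ~[0.5, 0.5], i.e. a multiple of the eigenvalue-1 eigenvector
```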

For Markov chains, we get this structure from the Perron-Frobenius theorem, which applies to non-negative irreducible matrices. Irreducible means that the transition graph of the Markov chain is strongly connected. If that's the case, there is a unique eigenvector with eigenvalue 1, called the stationary distribution. If the chain is also aperiodic, all other eigenvalues have magnitude < 1, so every initial condition converges to it. (Aperiodicity matters: a deterministic 2-cycle is irreducible and has a unique stationary distribution, but an initial condition concentrated on one state oscillates forever instead of converging.)
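
A sketch of that convergence, using a made-up 3-state chain (all entries positive, so it's irreducible and aperiodic; columns sum to 1 so that `x_+ = Px` maps distributions to distributions):

```python
import numpy as np

# Made-up 3-state chain. Every entry is positive, so the transition graph
# is strongly connected (irreducible) and aperiodic; Perron-Frobenius then
# gives a unique stationary distribution. Columns sum to 1, matching the
# x_+ = Px convention above.
P = np.array([[0.5, 0.2, 0.3],
              [0.3, 0.6, 0.1],
              [0.2, 0.2, 0.6]])

for x0 in (np.array([1.0, 0.0, 0.0]), np.array([0.0, 0.0, 1.0])):
    x = x0
    for _ in range(200):
        x = P @ x
    print(x)  # both starting distributions converge to the same vector
```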

If A is not irreducible, the transition graph splits into multiple components, and the limiting distribution can depend on which component the initial condition lives in. Going back to the n x n identity matrix: its graph has n components (completely disconnected, with every self-transition probability equal to 1). So every initial condition is already stationary, because applying the identity never changes anything.
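
To make the reducible case concrete (again a made-up example): a block-diagonal chain with two disconnected components converges to a different stationary vector depending on where the initial mass sits.

```python
import numpy as np

# Made-up reducible chain: two disconnected 2-state components
# (block diagonal), each aperiodic. Columns sum to 1.
P = np.array([[0.9, 0.5, 0.0, 0.0],
              [0.1, 0.5, 0.0, 0.0],
              [0.0, 0.0, 0.2, 0.7],
              [0.0, 0.0, 0.8, 0.3]])

for x0 in (np.array([1.0, 0.0, 0.0, 0.0]),   # mass in component {0, 1}
           np.array([0.0, 0.0, 1.0, 0.0])):  # mass in component {2, 3}
    x = x0
    for _ in range(200):
        x = P @ x
    print(x)  # a different stationary distribution for each component
```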



Thank you, some really good info to go on and research.



