How can I get the symbolic steady state vector of a Markov Chain?

203 views (last 30 days)
Shih-Wen Liu on 17 Jun 2022
Commented: Bruno Luong on 8 Aug 2022
Hello, does anyone know how to obtain the symbolic steady-state vector (i.e., the long-term probability of each state) of this Markov Chain example in MATLAB?
The demonstration ends without showing how to get the steady-state vector.
Any help with this problem would be much appreciated.

Answers (1)

John D'Errico on 7 Aug 2022
Edited: John D'Errico on 7 Aug 2022
Easy, peasy. For example, consider a simple Markov process described by the 3x3 transition matrix T.
T = [.5 .2 .3;.1 .4 .5;.1 .1 .8]
T = 3×3
    0.5000    0.2000    0.3000
    0.1000    0.4000    0.5000
    0.1000    0.1000    0.8000
There are no absorbing states. We can check that this is indeed the transition matrix of a Markov chain: the rows all sum to 1, and no element is greater than 1 or less than zero.
sum(T,2)
ans = 3×1
     1
     1
     1
What are the steady-state probabilities?
[V,D] = eig(T')
V = 3×3
    0.2357    0.4082    0.0000
    0.2357    0.4082   -0.7071
    0.9428   -0.8165    0.7071
D = 3×3
    1.0000         0         0
         0    0.4000         0
         0         0    0.3000
Take the eigenvector that corresponds to the unit eigenvalue. In this case, it is the first eigenvector.
P = V(:,1)';
Normalize so the elements sum to 1.
format long g
P = P/sum(P)
P = 1×3
0.166666666666666 0.166666666666667 0.666666666666667
Those are the steady-state probabilities for this system. We can verify that multiplying by T does not change P.
P*T
ans = 1×3
0.166666666666666 0.166666666666667 0.666666666666667
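As a small aside (not part of the original answer), the unit eigenvalue can also be picked out programmatically instead of by inspection. A minimal sketch, using the numeric T above:
% Sketch: select the eigenvector whose eigenvalue is closest to 1,
% then normalize it into a probability vector.
[V,D] = eig(T');
[~,k] = min(abs(diag(D) - 1));   % index of the eigenvalue nearest 1
P = V(:,k)';
P = P/sum(P)                     % steady-state probabilities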
I won't do your homework for you, but you can easily enough see how to proceed from here.
4 Comments
Walter Roberson on 8 Aug 2022
John, with symbolic coefficients, is it going to be possible to find the entry with eigenvalue 1?
Bruno Luong on 8 Aug 2022
You don't need to compute eigenvalues; you can compute this instead, which may be easier to do symbolically:
ss = null(T.'-eye(size(T))).';
ss = ss/sum(ss)
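For instance, here is a minimal sketch of this approach on a fully symbolic chain (the transition matrix and the symbols p and q are hypothetical, not from the thread; requires the Symbolic Math Toolbox):
% Hypothetical symbolic transition matrix; rows sum to 1 by construction.
syms p q real
T = [1-p  p    0;
     0    1-q  q;
     q    0    1-q];
ss = null(T.' - eye(size(T))).';   % left null space of (T - I)
ss = simplify(ss/sum(ss))          % normalize so the entries sum to 1
For p and q strictly between 0 and 1 the chain is irreducible, so the null space is one-dimensional and ss is the unique stationary distribution.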
