Talk:Markov chain

Markov Model - a generalisation or a special case?[edit]

The subsection Markov model presents Markov models as a generalisation of Markov chains. This is contrary to the subsection itself being placed in the section Special types of Markov chains. Which type of model is a subset of the other? Which is the more general framework? AVM2019 (talk) 19:37, 14 June 2022 (UTC)[reply]

To me, the term Markov model is a bit vague and could refer to any model having some sort of Markov property. A Markov process would refer to a family of random variables indexed by time, with the Markov property being the usual memorylessness. And a Markov chain would be even more specific, referring to a Markov process on a discrete state space. By default, that would refer to a discrete-time process -- otherwise I would indicate it explicitly by saying continuous-time Markov chain.
These conventions might vary a bit -- although I'm pretty confident everyone I know working in the field would agree with this. In the end it doesn't matter too much because I don't think this can be a real source of confusion. But, to come back to your original question, it's definitely {Markov chains} ⊂ {Markov models}, whatever the latter means.
Malparti (talk) 12:30, 16 September 2022 (UTC)[reply]
Yep, Markov chains are a type of Markov model. We could change the section header "Special types of Markov chains". Perhaps "Related models" would work. I don't think it makes sense to reorganise the content, even if we do have to find a section name that covers both subtypes and supertypes of Markov chains. — Bilorv (talk) 10:04, 19 September 2022 (UTC)[reply]

Convergence speed to the stationary distribution[edit]

I am studying the Markov chain now and I found some possible typo.

> π(k) approaches to π as k → ∞

I believe it should be "π(k) approaches a1*π as k → ∞".

> In other words, π = ui ← xPP...P = xP^k as k → ∞.

It should be π = u1 instead of ui. Here, u1 represents the eigenvector of P corresponding to lambda1=1.

— Preceding unsigned comment added by Jgdyjb (talkcontribs) 05:02, 15 August 2023 (UTC)[reply]
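The claim under discussion — that xP^k converges to the stationary distribution π, which is the left eigenvector u1 of P for the eigenvalue λ1 = 1 — is easy to check numerically. A minimal sketch, assuming a hypothetical 2-state transition matrix (not from the article):

```python
import numpy as np

# Hypothetical 2-state transition matrix; each row sums to 1.
P = np.array([[0.9, 0.1],
              [0.5, 0.5]])

# The stationary distribution pi is the left eigenvector of P for
# eigenvalue 1, i.e. the (right) eigenvector of P.T, normalised to sum to 1.
vals, vecs = np.linalg.eig(P.T)
u1 = np.real(vecs[:, np.argmax(np.real(vals))])
pi = u1 / u1.sum()

# Power iteration: x P^k approaches pi for any initial distribution x,
# at a rate governed by the second-largest eigenvalue.
x = np.array([1.0, 0.0])
for _ in range(50):
    x = x @ P

print(np.allclose(x, pi))  # True
```

For this matrix π = (5/6, 1/6) and the second eigenvalue is 0.4, so after 50 steps the error is far below floating-point noise.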

Thanks for this. I believe you are correct and that I've fixed it here. In future, when you find a problem you need to fix it yourself as generally no-one else will. — Bilorv (talk) 09:33, 21 August 2023 (UTC)[reply]

This can be far more comprehensible for students.[edit]

Students may need to understand how to apply Markov chain computations to the following example. A shepherd has a flock of sheep, three pastures and two sheep dogs, and grazes seven days a week. He grazes the pastures on a rotating basis, working one dog at a time on alternating days. Say one dog is black and one is white. My student needs to compute when the shepherd will work the black sheep dog in pasture 1 on a Thursday, or on all Thursday occurrences. I can make the chain more interesting by adding variables, and apply it to retail sales, cell phone tracking and re-stocking needs, as a mathematical application.

My interest in this discussion derives from an appreciation of exactly how good the IDEA encryption algorithm is, understanding that it derives from substitution on a password-dependent Markov chain with an astronomical period.

While it may be theoretically vulnerable, it is exceedingly difficult to cryptanalyze, and it should be understood by ENTRY LEVEL USERS to be very adequate, even for life and death uses. Formfollows (talk) 04:45, 3 October 2023 (UTC)[reply]