Talk:Markov property

From Wikipedia, the free encyclopedia

Brand new article[edit]

The old article called Markov property didn't even correctly define the property, confusing it with the process. That was a mess. So I created a new article. linas (talk) 22:29, 30 August 2008 (UTC)[reply]


So what does sigma stand for in the definition?

Doubts[edit]

Does someone have a reference that actually uses the term "Markov property" in the sense supposedly used here? The reference to Feller doesn't, I think ... The chapter/section mentioned seems to be about Markov processes and, while "Markov property" is in the index, it points to a definition on other pages in connection with the "memoryless" property of the exponential distribution.

If the definition here is relevant to anything, what is the role of the index j? For a "Markov property" to hold does the stated condition have to hold for every j, or for only one j? Presumably Nj must exclude at least one other member besides j, but is it enough for this to apply for only one j?

Melcombe (talk) 17:58, 22 January 2009 (UTC)[reply]
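For reference, the condition being asked about above is usually stated as the local Markov property of a random field on a graph, and it is required to hold for every vertex j, not just one. A sketch of the standard formulation, using the neighbourhood notation N_j from the question (V denotes the full vertex set):

```latex
% Local Markov property: for EVERY vertex j of the graph, X_j is
% conditionally independent of all non-neighbours given its neighbours:
X_j \;\perp\!\!\!\perp\; X_{V \setminus (N_j \cup \{j\})} \;\big|\; X_{N_j}
\qquad \text{for all } j \in V .
```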

Mangled sentence[edit]

if a stochastic process of random variables determining a set of probabilities which can be factored in such a way that the Markov property is obtained, then [...]

The part between "if" and "then" is not a sentence. What was it intended to say? Michael Hardy (talk) 21:11, 21 October 2009 (UTC)[reply]


This article also says:

In a broader sense, if a stochastic process of random variables determining a set of probabilities which can be factored in such a way that the Markov property is obtained, then that process is said to have the Markov-type property; this is defined in detail below.

But it's not defined below.

Again, what was intended here? Michael Hardy (talk) 21:15, 21 October 2009 (UTC)[reply]

I'm not a probabilist, but all my books on probability define a sequence of random variables as having the Markov property if P(X_{n+1} ∈ A | X_1, …, X_n) = P(X_{n+1} ∈ A | X_n), not going any further back. I can see why you might want to extend the definition to the one given here, but has anyone else found any citations for it? Thudso (talk) 18:00, 11 December 2009 (UTC)[reply]

I have seen both definitions but cannot remember the reference. I will look for it tonight.
It is useful to note, however, that neither definition is more general than the other. The definition you give above can imply the seemingly "more general" one by simply grouping neighboring random variables in the sequence together, forming a new sequence (of vector-valued random variables) for which the above Markov property is satisfied. Paul Laroque (talk) 01:29, 21 December 2009 (UTC)[reply]
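The grouping construction described above can be made concrete. This is only an illustrative sketch with a made-up deterministic second-order rule (`step_order2` is hypothetical, not from any of the references mentioned): pairing neighbouring values turns a second-order chain into a first-order one.

```python
# Sketch: a sequence where each value depends on the last TWO states
# (so it is NOT first-order Markov), converted into a first-order chain
# by grouping neighbouring variables into overlapping pairs.

def step_order2(prev2, prev1):
    """Hypothetical second-order rule: next value depends on two states back."""
    return (prev2 + prev1) % 3  # deterministic, purely for illustration

# Build a sequence with the second-order dependence.
seq = [0, 1]
for _ in range(10):
    seq.append(step_order2(seq[-2], seq[-1]))

# Group neighbours: Y_n = (X_n, X_{n+1}).  The pair sequence is
# first-order Markov, since Y_{n+1} is a function of Y_n alone.
pairs = list(zip(seq, seq[1:]))

def step_pairs(y):
    a, b = y
    return (b, step_order2(a, b))  # next pair depends only on the current pair

for y, y_next in zip(pairs, pairs[1:]):
    assert step_pairs(y) == y_next  # first-order property holds for the pairs
print("pair-chain is first-order Markov on this sample")
```

The same trick works for any fixed finite memory: an order-k chain becomes a first-order chain on k-tuples.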

New definition plus strong MP[edit]

I have made a concise definition of the Markov property. In my view the old version digressed too much; there is no need to talk about Markov networks and such in the definition. If anyone is keen on adding it, I suggest they do so under a new section.

I have not referenced any of my edits, as I do not know the format, nor do I know how, but there are plenty of books that define the MP and SMP similarly. A few that spring to mind: Lévy Processes by Bertoin, Markov Processes by Ethier and Kurtz. If there are any problems, do let me know. I will try to add to this article more when I have time. Hyperbola 21:51, 23 January 2010 (UTC) —Preceding unsigned comment added by Hyperbola (talkcontribs)

I have reverted these changes since they led to something that was essentially what is in (or should be in) the article Markov process. What is needed here is something that will deal with the idea of a "Markov property" in the context of something other than "time". Melcombe (talk) 11:29, 25 January 2010 (UTC)[reply]
I am quite confused as to what you mean. The definitions of the MP and SMP I gave are both correct and concise. The bog-standard definition is the one I gave (it applies to any time, be it discrete or continuous). The old article seems to me to be misinforming and confusing, as it does not have a correct general definition of a MP. I cannot see a problem with my edit. Unless you actually have any valid objections and make them clear to me, I will revert the page back. Hyperbola 17:24, 25 January 2010 (UTC) —Preceding unsigned comment added by Hyperbola (talkcontribs)
The point is that the changes you made would have led to the content here being essentially the same as the content of Markov process and other existing articles. There is no point in having two articles saying the same thing. As structured, the existing version here points readers to several other articles that contain more or less correct stuff about time-based Markov processes, leaving the other topics to be discussed here.
If you were planning to merge in other articles, you haven't said so, or indicated how the other articles would be referred to in what you were constructing. There is certainly some need to discuss the Markov property in contexts other than "time", both networks and multi-dimensional space ... and this is exactly the sort of content you were intent on destroying. You might want to consider placing the sort of material you wanted to include directly in the Markov process article, where it would seem more immediately relevant. Melcombe (talk) 10:43, 26 January 2010 (UTC)[reply]
I am sorry, but the Markov property is a property that is used to describe "time"-based things. If you wish to talk about such things as the global, simple, or local Markov property, then I suggest you start a new article, as simply entering the term "Markov property" into Google Scholar will show you that all authors use the term to refer to stochastic processes. It would be fine to add as a subsection, but it is hugely misleading to make the whole article about that. The Markov process article is off, I think. It just shows how horrid the current state of the article is when you read Ito diffusion and decide to follow the link here.
I seem to gather that this is a matter of semantics but the issue is that the term "Markov property" refers to stochastic processes. The definition here in the article is wrong, unclear and generally not very helpful. This article is not about Gibbs measures or Markov networks, but about the Markov property. I suggest you either address these concerns in the article, or even simpler, revert the article and add new sections instead of reverting the article to something that is of no help to anyone who wishes to know anything about the MP. Hyperbola 01:14, 27 January 2010 (UTC) —Preceding unsigned comment added by Hyperbola (talkcontribs)
One issue is that the term "Markov property" has been used to refer to stochastic processes, but it has also been used for other things, for example: for graphical chain models in The Chain Graph Markov Property by Frydenberg, and Local Markov Property for Models Satisfying Composition Axiom by C Kang and J Tian, and Markov property of Gaussian states of canonical commutation relation algebras; and in other ways, for example The Linear Markov Property in Credibility Theory, where the different thing is the "linear" part. All these were found by a simple Google search. All the different ways in which "Markov property" is used need to be represented on WP ... you can't just appropriate it for one specialised usage.
The other issue is duplication. There is no point in having the same stuff in two/many articles.
Melcombe (talk) 13:35, 27 January 2010 (UTC)[reply]
Yes, I am well aware of these uses of the X Markov property, but the term "Markov property" in itself is not any of those. Just as a smooth manifold is not the definition of a manifold, neither are those the definitions of a Markov property. It is somewhat funny that in all the articles you presented, the term "Markov property" on its own is used to refer to time-indexed RVs. The article is a mess and you seem to want to keep it that way instead of being productive. As I have pointed out before, this article is misleading, incorrect and confusing. I suggest that you either help to make the article better, or let me do so. Hyperbola (talk) 00:11, 10 February 2010 (UTC)[reply]
I suggest the following.
  • Either replace the article Markov process with a redirect here or, better, remove from that article anything more than an informal definition of the Markov property, but link to this article for a formal definition, and
  • Use this article (Markov property) to start with informal discussion and move on to formal definitions on appropriate spaces.
However, a Wikipedia article is not a "definition", so there must be something about how and why the concept of "Markov property" is used. It obviously can't be restricted just to "time", as there are people who use the term for other purposes. But the immediate concern is not to have two articles containing what should be essentially the same stuff, and there is no point in just considering the one article in isolation.
Melcombe (talk) 11:38, 10 February 2010 (UTC)[reply]
Yes, I do agree that the uses need to be covered, but the definition of the MP is needed too. I have not been working with Markov random fields, so I cannot provide examples there, but there is a rich source of applications for the strong Markov property, which is indeed the more useful of the two.
I have no preference for either of these solutions so do which you think is best and inform me too. Hyperbola (talk) 15:31, 10 February 2010 (UTC)[reply]

As discussed above, I have rearranged the putative context of Markov property and Markov process, so that a full formal specification would appear in the former of these, and with the recently added material restored to this article. Melcombe (talk) 11:31, 16 February 2010 (UTC)[reply]


March 2011[edit]

I'm still somewhat confused by this whole series of articles, but after a fair bit of googling, would I be right in thinking that 'a Markov property is a property that, regardless of previous states, can predict a future state in a stochastic process'?

I'm sure it's grossly oversimplified, but I only need a loose definition for a glossary in a report and I'm having trouble finding one. will 01:17, 21 march 2011 (UTC)

This is pretty poor. "Can predict" is far too general. After all, one could use the number 100 to predict anything one likes, so 100 "can predict" a future state in a stochastic process ... that doesn't make it a Markov property. Melcombe (talk) 09:59, 21 March 2011 (UTC)[reply]
Thanks for the reply; how would it be made into a more accurate statement, then?
will 20:07, 21 March 2011 (UTC)
Having slept and re-read the articles and your feedback, am I any more accurate in thinking:
'A Markov property is the present state in a stochastic process whereby the chance of a particular future state is dependent on the present, regardless of any previous states.'
E.g.: in a game of 'Rock, Paper, Scissors' your choice would be a Markov property, because your opponent is more likely to choose something to beat your current choice in the next round.
Or am I still way off the mark...
will 188.39.33.102 (talk) 03:15, 23 March 2011 (UTC)[reply]
R=P-S is not a Markov property.Capodistria (talk) 06:46, 23 March 2011 (UTC)[reply]
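For what it's worth, the Rock-Paper-Scissors discussion above can be illustrated with a small sketch (a hypothetical example, not from the article): if a player's strategy is "always play what beats the opponent's last move", then the sequence of moves is a Markov chain, because the next move is determined by the current state alone. The Markov property is a property of the process, not "a property" in the sense used in the question above.

```python
# Hypothetical "beat the last move" strategy: the next move depends
# ONLY on the current move, so the resulting process is a Markov chain.
BEATS = {"rock": "paper", "paper": "scissors", "scissors": "rock"}  # value beats key

def next_move(opponent_last):
    """Distribution of the next state depends only on the current state
    (here it is even deterministic)."""
    return BEATS[opponent_last]

state = "rock"
history = [state]
for _ in range(6):
    state = next_move(state)
    history.append(state)

print(history)  # cycles rock -> paper -> scissors -> rock -> ...
```

A strategy that looked at, say, the last three of the opponent's moves would break the first-order Markov property of the move sequence.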

November 2011 - Definitions of terms[edit]

You know what I just LOVE?

I really love it when an article about math, computer science, statistics, etc. has a whole bunch of terms that are never defined.

This article needs someone to come in and describe for us what the various variables and symbols in the text are referring to.

It would be really nice to have the ℝ defined, and maybe the Ω, or β (that is a beta, and not a 'B', isn't it... See, that is what I am talking about*), or the funny-shaped 'F'. Heck, it would have been nice to know what the 't' means, even though this traditionally means "time" in math. I'm pretty sure that the P is a "Probability" (ℙ). Dang, even the X isn't defined here.

In fact, the whole series of articles on Markov's work needs to have this done. Some at least let you know what some of the variables are defined as, but most just have these arcane strings of letters and symbols that leave anyone who doesn't have a DEEP math background scratching their heads, completely unable to make ANY sense of the article without spending an hour or so searching (possibly in futility) for what these terms mean.

  • and regardless of whether it is a 'B' or a 'Beta' it needs to be defined. Not everyone who comes to this page will be some math geek or computer nerd who knows what these terms mean or if they are referring to real world objects or other conceptual objects/subjects. I mean, please give us a SLIGHT clue as to what a Borel sigma-algebra is... just a hint, you don't have to cite the whole page, but not having to go link-diving really helps.

Matthew R Bailey (talk) 15:16, 18 November 2011 (UTC)[reply]


Mistake in definition of strong Markov property[edit]

The statement of the strong Markov property is wrong (in the case tau := t it reduces to "X_{t+} is independent of F_t", which is definitely wrong!). The essential statement is that X_{tau+} is independent of F_{tau} given X_{tau}. In this formulation the statement is only reasonable for time-homogeneous processes (as mentioned). However, the statement "X_{tau+} independent of F_{tau} given the pair (tau, X_{tau})" makes sense for any Markov process. Moreover, there is a small mistake in the definition of F_{tau}: the intersection of tau and A does not make sense - it should be the intersection of A with the event {tau = t}. — Preceding unsigned comment added by 141.5.26.78 (talk) 11:38, 5 April 2013 (UTC)[reply]
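The corrections proposed above correspond to the standard formulations (a sketch; notation follows the comment, with tau a stopping time and (F_t) the filtration):

```latex
% The stopped sigma-algebra: events decidable by time tau,
\mathcal{F}_\tau = \bigl\{ A \in \mathcal{F} :
    A \cap \{\tau \le t\} \in \mathcal{F}_t \ \text{for all } t \bigr\},
% and the strong Markov property for a time-homogeneous process:
% given X_\tau, the post-tau process is independent of the past up to tau,
(X_{\tau+s})_{s \ge 0} \;\perp\!\!\!\perp\; \mathcal{F}_\tau \;\big|\; X_\tau .
```

(In discrete time the condition A ∩ {τ = n} ∈ F_n for all n gives an equivalent definition of F_τ, which may be what the comment's {tau = t} refers to.)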

Misleading notion of time[edit]

I don't like the notion of time the description of the Markov property gives here.

The Markov property is here explained in terms of future and past. This can be very confusing, since it can give a notion of an absolute time, which is not the case. A Markov process is characterised by its behaviour in relation to its preceding states, so relative to some state which can be in the past or not. In many cases the states of a process are not even functions of time, as for instance in the case of a DNA sequence or text. I think it makes more sense to define it as a dependence on only its single preceding state. This is the way Vaseghi goes in his book "Advanced Digital Signal Processing", and I find that much better since it removes the unnecessary confusion about time. — Preceding unsigned comment added by Mijamaka (talkcontribs) 15:44, 10 April 2017 (UTC)[reply]
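The DNA example mentioned above can be sketched concretely: the Markov property only needs an ordered index, not physical time. Here the index is position in a made-up string (the sequence below is hypothetical, not real data), and we estimate the first-order transition frequencies P(next base | current base).

```python
# First-order Markov model of a sequence indexed by POSITION, not time:
# estimate P(next base | current base) from observed adjacent pairs.
from collections import Counter, defaultdict

dna = "ACGTACGGTACCA"  # hypothetical sequence for illustration

counts = defaultdict(Counter)
for cur, nxt in zip(dna, dna[1:]):
    counts[cur][nxt] += 1  # count each adjacent pair (X_n, X_{n+1})

# Normalise counts into conditional distributions.
probs = {
    base: {nxt: c / sum(ctr.values()) for nxt, c in ctr.items()}
    for base, ctr in counts.items()
}
print(probs["A"])  # conditional distribution of the base following an 'A'
```

Nothing in the construction refers to time; "preceding" just means the previous position in the index order, which is the point being made above.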

Alternative formulation[edit]

It might be a good idea to make precise where the formula in the "Alternative formulation" section comes from. The reference given states a formulation using conditional expectations, but it seems to have been interpreted in a weird way. (In the book, both indexes appear in the formulation.) 194.214.86.1 (talk) 13:46, 11 October 2023 (UTC)[reply]