Talk:Stochastic convergence: Difference between revisions
Revision as of 10:20, 5 July 2007
Workgroup category or categories: Mathematics Workgroup, Physics Workgroup, Chemistry Workgroup [Categories OK]
Article status: Developing article: beyond a stub, but incomplete
Underlinked article?: No
Basic cleanup done?: Yes
Checklist last edited by: Greg Woodhouse 20:33, 28 June 2007 (CDT), Ragnar Schroder 11:13, 28 June 2007 (CDT)
To learn how to fill out this checklist, please see CZ:The Article Checklist.
Work in progress
This page is a work in progress; I'm struggling with the LaTeX and some other details, so there may be actual errors at the moment.
Ragnar Schroder 12:04, 28 June 2007 (CDT)
Almost sure convergence
The definition in the text doesn't make sense. In general, something is true almost surely (a.s.) if it is true with a probability of 1. It is almost surely true that a randomly chosen number is not 4. Greg Woodhouse 12:08, 28 June 2007 (CDT)
- Greg, some would argue that if the probability of it not being true is ≤ 0.01‰, the convergence to a value holds, and the tail can be forgotten. Compare to series. Robert Tito | Talk 12:44, 28 June 2007 (CDT)
I understand. It's a technical concept from measure theory. If two functions (say f and g) are equal except on a set of measure 0, we say f = g almost everywhere. This is important because their (Lebesgue) integrals over a given set will always be equal, and it is convenient to identify such functions, because then

d(f, g) = ∫ |f − g| dμ

defines a metric on L¹(μ), giving it the structure of a metric space. If we didn't identify functions that were equal almost everywhere (i.e., treat them as the same function), the purported metric would not be positive definite.
On a foundational level, probability theory is essentially measure theory (and distributions are just measurable functions that, when integrated over a set A, give the probability of A). It is just a convention that probability theorists say "almost surely" instead of "almost everywhere"; the meanings are the same. Of course, I'm almost sure :) you already know this! Greg Woodhouse 13:28, 28 June 2007 (CDT)
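A quick numerical sketch of the point above (my construction, not from the discussion): approximating the L¹ distance d(f, g) = ∫₀¹ |f − g| by a Riemann sum. The functions f and g below differ only at the single point x = 0.5, a set of measure zero, so their distance vanishes in the limit; the "metric" cannot tell them apart, which is exactly why such functions are identified.

```python
def l1_distance(f, g, n=100_000):
    """Riemann-sum approximation of the integral of |f - g| over [0, 1]."""
    return sum(abs(f(i / n) - g(i / n)) for i in range(n)) / n

f = lambda x: x
g = lambda x: 99.0 if x == 0.5 else x   # equal to f almost everywhere

print(l1_distance(f, g))  # tiny (the lone point contributes ~99/n), -> 0 as n grows
print(l1_distance(f, f))  # exactly 0.0
```

Without the identification, d(f, g) = 0 for f ≠ g, so d would fail positive definiteness, as noted above.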
Almost sure convergence
I agree with all this, but I think stochastic convergence is a concept that should be accessible to intelligent laymen, not just math/natsci/tech guys.
- Yes, but this is the talk page. :-) Greg Woodhouse 17:05, 28 June 2007 (CDT)
Therefore, I try hard to avoid reference to hard-core math like measure theory in the beginning of the article, such things should come at the end, after the non-expert has gotten as enlightened as possible wrt the basic ideas.
I've corrected the TeX code in the definition. The definition is standard textbook fare, but I really don't like it; I'll try to find one that's more intuitive.
Ragnar Schroder 16:30, 28 June 2007 (CDT)
- Ragnar, please do: science, especially encyclopedic science, should be available and understandable to as many as possible, even to the level where analogies are used to visualize a point when the analogy isn't scientifically correct. If it helps laymen understand a topic, that is what I would call academic freedom in the educational and didactical sense. Please continue and make it easier. Robert Tito | Talk 16:49, 28 June 2007 (CDT)
- Yes, the definition is better, but what does it mean? I assume X_i represents a stochastic process of some sort. Are you saying that, as an ordinary function of i, the limit is a with probability 1? If it's just a function, why is probability involved?
- That's what I mean when I say I don't like the compact expression - it's too confusing, especially when the limit itself is a stochastic variable. The formal definition used in my old textbook is P({ω : lim_{i→∞} X_i(ω) = a}) = 1. Grad students may enjoy getting something like that thrown at them, but not a general audience. I hope the definition section is clear now.
Ragnar Schroder 14:04, 3 July 2007 (CDT)
- That makes more sense. It isn't actually stated that a is a stochastic process. I'll be honest: When I first looked at this, I thought: "What does this mean? It looks like you're saying that a sequence of stochastic processes converges to a number?" Greg Woodhouse 14:38, 3 July 2007 (CDT)
- Or are you saying that as i → ∞, the distance |X_i - a| approaches 0 with probability 1 (whatever that might mean)? The definition still needs to be fleshed out a bit.
- On an intuitive level, I think it's clear enough what you mean: the variable X_i represents a "random walk" that gets you closer and closer to a. You don't know where you will be after i steps, but you do know that the probability that you will be any sizeable distance from a becomes vanishingly small as i approaches infinity. How can you translate that into mathematical language? Greg Woodhouse 17:03, 28 June 2007 (CDT)
- You focus on the convergence in probability implication of a.s. convergence here. I think a detailed discussion of the relation between the two should be mentioned after the introduction to the various modes of convergence, in a new subsection to the /* Relations between the different modes of convergence */ section.
Ragnar Schroder 14:04, 3 July 2007 (CDT)
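Greg's intuition above can be sketched numerically (my construction; the choice of X_i as the mean of i fair coin flips, so that a = 0.5, is an assumption): for a fixed ε, the fraction of independent runs that land farther than ε from a shrinks as i grows, which is convergence in probability.

```python
import random

def running_mean_at(i, rng):
    """Mean of i fair coin flips (each 0 or 1)."""
    return sum(rng.random() < 0.5 for _ in range(i)) / i

def prob_far_from(a, eps, i, runs=1000, seed=1):
    """Fraction of independent runs whose i-step mean lands farther than eps from a."""
    rng = random.Random(seed)
    far = sum(abs(running_mean_at(i, rng) - a) > eps for _ in range(runs))
    return far / runs

for i in (10, 100, 1000):
    print(i, prob_far_from(a=0.5, eps=0.1, i=i))  # the fraction shrinks as i grows
```

Note this estimates P(|X_i - a| > ε) at each fixed i, across runs; almost sure convergence is the stronger statement about individual sample paths.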
I don't understand this fragment:
- "The number a may also be the outcome of a stochastic variable X. In that case the compact notation P(lim_{i→∞} X_i = X) = 1 is often used."
Isn't it the case that if a stochastic variable converges almost surely, the limit is a deterministic constant and not a stochastic variable? That is, P(lim_{i→∞} X_i = X) = 1 can only be true if X takes some value with probability one and is thus not really a stochastic variable. Or do you implicitly allow for the possibility that X depends on the {X_i}? -- Jitse Niesen 10:13, 5 July 2007 (CDT)
- I'm glad I'm not the only one that finds this notation confusing. Unless I'm wrong, this is just another way of saying that a sequence of measurable functions converges to a measurable function X a.e. (pointwise), convergence in probability is just convergence in measure, and convergence in the pth order mean is just Lp convergence. The strong law of large numbers (the hypotheses of which I can look up when I get home) ensures convergence a.e. (I think!). I also think(!) that it applies to i.i.d. random variables, and in particular to Bernoulli trials. But I'm a little out of my league at this level of probability theory. Greg Woodhouse 11:20, 5 July 2007 (CDT)
- The compact notation P(lim_{i→∞} X_i = X) = 1 *is* confusing. Unfortunately, it seems to be traditional, so I guess it has to be included, even though a small rather than a capital X would be more intuitive. Think of it this way: the value of X is obtained in advance, f.i. by throwing a die. Then somehow the sequence converges to that value.
- However, this is a simplification - the value of X may be unknown in advance, and the sequence then may basically tell us more and more about what the outcome of X would have been, had we performed that particular random experiment.
- I'll try to think of a simple, concrete example to illustrate this.
Ragnar Schroder 11:14, 5 July 2007 (CDT)
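A toy construction of the kind described above (my sketch, not from the discussion; the choices of a die roll for X and the perturbation 1/i are assumptions): the limit X is itself random, and every sample path of X_i = X + 1/i converges to the realized value of X, so P(lim X_i = X) = 1 holds with X a genuine stochastic variable.

```python
import random

def sample_path(i_max, seed):
    """Draw X (one die throw) and the sequence X_i = X + 1/i that reveals it."""
    rng = random.Random(seed)
    x = rng.randint(1, 6)                              # X: outcome of the die
    return x, [x + 1.0 / i for i in range(1, i_max + 1)]

x, path = sample_path(1000, seed=42)
print(x, path[-1])  # the final X_i is within 1/1000 of the realized X
```

Here X depends on the sample point but not on i, illustrating Jitse's point: on each path the limit is the (random) value X took in advance.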
examples?
Can anyone think of a non-trivial example of a.s. convergence (i.e., one where the sequence doesn't eventually become constant)? Greg Woodhouse 20:59, 28 June 2007 (CDT)
- Many chemical processes and descriptions thereof use stochastics; I will have to see if I can come up with a decent example. But as an example (top of my head), it can be used to derive much of physical chemistry. Robert Tito | Talk
Well, I did a bit of looking, and the strong law of large numbers has as its conclusion a.s. convergence. (I didn't know that. I was aware of the weak law of large numbers, but never bothered to look up the strong version.) Anyway, I added this as a third example. Greg Woodhouse 21:52, 28 June 2007 (CDT)
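The strong-law example mentioned above can be sketched as follows (my sketch; the fair-coin setup with p = 0.5 is an assumption): along a single sample path, the running mean of Bernoulli trials settles toward p, and the SLLN says this happens for almost every path, i.e. the running mean converges to p almost surely.

```python
import random

def running_means(n, p=0.5, seed=7):
    """Running means of n Bernoulli(p) trials along a single sample path."""
    rng = random.Random(seed)
    total = 0
    means = []
    for i in range(1, n + 1):
        total += rng.random() < p
        means.append(total / i)
    return means

m = running_means(100_000)
print(m[9], m[999], m[-1])  # the running mean settles toward p = 0.5
```

This is non-trivial in the sense asked for: the sequence never becomes constant, yet converges a.s.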
- An example of non-trivial a.s. convergence:
- Assume a guy has two sources of income, one stochastic. On a given day j he gets X_j dollars from the stochastic one, and a fixed c dollars from the other. Assume the first one converges almost surely to 0. Then obviously his total income converges almost surely to c.
Ragnar Schroder 23:20, 28 June 2007 (CDT)
I've deleted example 3. Greg Woodhouse 23:38, 28 June 2007 (CDT)
Categories
I'm uncertain about the categories for this article; it's now listed under Mathematics, Physics, and Chemistry. AFAIK, stochastic differential equations are used a lot in economics, maybe more than in f.i. chemistry?
- Other areas include computer science, operations research, and mathematical biology. If we tried to include every area where these ideas can be applied, the list could become a bit unwieldy! On the other hand, that doesn't mean we shouldn't include certain core application areas; there really are no hard and fast rules. Greg Woodhouse 20:52, 28 June 2007 (CDT)
Relationships between the types of convergence
I commented out the last claim in this section because I believe it is false as stated. Perhaps you had something different in mind here. Greg Woodhouse 21:34, 28 June 2007 (CDT)
Oops...I've got mud on my face here! Looking a bit more closely, I see that you're saying that Lp convergence implies convergence in measure. This is true. Greg Woodhouse 21:44, 28 June 2007 (CDT)
Added example 3 is under wrong convergence
Example 3 of the Almost sure convergence section should be moved to the Convergence in probability section.
Rather surprisingly, the sequence does NOT converge a.s., only in probability, AFAIK.
Ragnar Schroder 22:58, 28 June 2007 (CDT)
- Ok. It DOES converge a.s. as well as in probability. A.s. convergence follows from the strong law of large numbers, convergence in probability from Khinchin's theorem, aka the weak law of large numbers.
- I think the example was very good, and should be put back either where it was, or under convergence in probability.
Ragnar Schroder 22:09, 30 June 2007 (CDT)