Talk:Stochastic convergence



Article Checklist for "Stochastic convergence"
Workgroup category or categories: Mathematics Workgroup, Physics Workgroup, Chemistry Workgroup [Categories OK]
Article status: Developing article: beyond a stub, but incomplete
Underlinked article? No
Basic cleanup done? Yes
Checklist last edited by: Greg Woodhouse 20:33, 28 June 2007 (CDT); Ragnar Schroder 11:13, 28 June 2007 (CDT)

To learn how to fill out this checklist, please see CZ:The Article Checklist.






Work in progress

This page is a work in progress. I'm struggling with the LaTeX and some other things, so there may be actual errors at the moment.

Ragnar Schroder 12:04, 28 June 2007 (CDT)

Almost sure convergence

The definition in the text doesn't make sense. In general, something is true almost surely (a.s.) if it is true with a probability of 1. It is almost surely true that a randomly chosen number is not 4. Greg Woodhouse 12:08, 28 June 2007 (CDT)
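To put that example in symbols (taking, say, X uniformly distributed on [0, 10] as the "randomly chosen number"):

P(X = 4) = 0, \qquad P(X \neq 4) = 1,

so the event \{X \neq 4\} holds almost surely, even though the outcome X = 4 is not excluded.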

Greg, some would argue that if the probability of it being not true is ≤ 0.01‰, the convergence to a value holds and the tail can be forgotten. Compare to series. Robert Tito |  Talk  12:44, 28 June 2007 (CDT)

I understand. It's a technical concept from measure theory. If two functions (say f and g) are equal except on a set of measure 0, we say f = g almost everywhere. This is important because their (Lebesgue) integrals over a given set will always be equal, and it is convenient to identify such functions, because then if we define

d(f, g) = \int |f - g| \, d\mu,

this defines a metric on L^1, giving it the structure of a metric space. If we didn't identify functions that are equal almost everywhere (i.e., treat them as the same function), the purported metric would not be positive definite.
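To spell out that last point: if f and g differ only on a set of measure zero, then

d(f, g) = \int |f - g| \, d\mu = 0

even though f and g are distinct functions, so d would vanish on a pair of unequal elements; positive definiteness is recovered precisely by declaring almost-everywhere-equal functions to be the same element of the space.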

On a foundational level, probability theory is essentially measure theory (and distributions are just measurable functions that, when integrated over a set A, give the probability of A). It is just a convention that probability theorists use the phrase "almost surely" instead of "almost everywhere"; the meanings are the same. Of course, I'm almost sure :) you already know this! Greg Woodhouse 13:28, 28 June 2007 (CDT)
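In symbols, the dictionary is simply that a probability space (\Omega, \mathcal{F}, P) is a measure space with total measure P(\Omega) = 1, and a statement holds almost surely when

P(\{\omega \in \Omega : \text{the statement fails at } \omega\}) = 0,

which is exactly "almost everywhere" with respect to the measure P.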


Almost sure convergence

I agree with all this, but I think stochastic convergence is a concept that should be accessible to intelligent laymen, not just math/natsci/tech guys.

Yes, but this is the talk page. :-) Greg Woodhouse 17:05, 28 June 2007 (CDT)

Therefore, I try hard to avoid references to hard-core math like measure theory in the beginning of the article; such things should come at the end, after the non-expert has gotten as enlightened as possible w.r.t. the basic ideas.

I've corrected the TeX code in the definition. The definition is standard textbook fare, but I really don't like it; I'll try to find one that's more intuitive.

Ragnar Schroder 16:30, 28 June 2007 (CDT)

Ragnar, please do. Science - especially encyclopedic science - should be available and understandable to as many as possible, even to the level where analogies are used to visualize a point, even when the analogy isn't scientifically correct. If it helps laymen understand a topic, that is what I would like to call academic freedom in an educational and didactical sense. Please continue and make it easier. Robert Tito |  Talk  16:49, 28 June 2007 (CDT)
Yes, the definition is better, but what does it mean? I assume X_i represents a stochastic process of some sort. Are you saying that as an ordinary function of i, the limit is a with probability 1? If it's just a function, why is probability involved? Or are you saying that as i \to \infty, the distance |X_i - a| approaches 0 with probability 1 (whatever that might mean)? The definition still needs to be fleshed out a bit.
On an intuitive level, I think it's clear enough what you mean: the variable X_i represents a "random walk" that gets you closer and closer to a. You don't know where you will be after i steps, but you do know that the probability that you will be any sizeable distance from a becomes vanishingly small as i approaches infinity. How can you translate that into mathematical language? Greg Woodhouse 17:03, 28 June 2007 (CDT)
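For what it's worth, one standard textbook way of writing down the two readings above, writing X_i for the sequence and a for the limit: almost sure convergence is

P\left(\lim_{i \to \infty} X_i = a\right) = 1,

while the "probability of being any sizeable distance from a becomes vanishingly small" intuition is convergence in probability,

\lim_{i \to \infty} P(|X_i - a| > \varepsilon) = 0 \quad \text{for every } \varepsilon > 0;

the first condition implies the second, but not conversely.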

Categories

I'm uncertain about the categories for this article; it's now listed under Mathematics, Physics and Chemistry. AFAIK, stochastic differential equations are used a lot in economics, maybe more than in, for instance, chemistry?

Other areas include computer science, operations research and mathematical biology. If we tried to include every area where these ideas can be applied, the list could become a bit unwieldy! On the other hand, that doesn't mean we shouldn't include certain core application areas; there really are no hard and fast rules. Greg Woodhouse 20:52, 28 June 2007 (CDT)

examples?

Can anyone think of a non-trivial example of a.s. convergence (i.e., one where the sequence doesn't eventually become constant)? Greg Woodhouse 20:59, 28 June 2007 (CDT)
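For what it's worth, the strong law of large numbers provides one such family of examples: if X_1, X_2, \ldots are i.i.d. with E|X_1| < \infty, then the sample means

\bar{X}_n = \frac{1}{n} \sum_{i=1}^{n} X_i \longrightarrow E[X_1] \quad \text{almost surely},

and for any non-degenerate distribution of the X_i the sequence \bar{X}_1, \bar{X}_2, \ldots is almost surely not eventually constant.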