Thursday, December 10, 2009

Does the variance of a normal distribution ever plateau as the # of events considered approaches infinity?

I would like to know whether the probability of an outlier appearing approaches zero as the number of events considered approaches infinity.






I'm not quite sure what you mean. The possibility of an outlier is always going to be there. The proportion of outliers would approach the probability of an outlier. So if you consider an outlier to be any value that is more than 3 SDs away from the mean, then the probability of an outlier is always going to be about 0.0027, regardless of what the previous values are or how many there are. It never becomes zero, though.
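If you want to check that 0.0027 figure yourself, here is a quick Python sketch (standard library only; the 3-SD cutoff is just the conventional choice used above):

```python
import math

# For a normal distribution, P(|X - mu| > k*sigma) = erfc(k / sqrt(2)).
k = 3.0
p_outlier = math.erfc(k / math.sqrt(2))
print(p_outlier)  # ~0.0027, i.e. about 0.27% of draws land beyond 3 SDs
```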



edit: When people without a background in mathematics answer an advanced mathematical question, don't put too much weight on their answers. I think some of these people are being sarcastic. Infinity is, of course, not a number but a concept.



In general education statistics classes, we do not mention the fact that, say, the Central Limit Theorem requires taking a limit as the sample size goes to infinity; we simply say that if n is large, then the sampling distribution of the sample mean is approximately normal. So "approaching infinity" is basically saying that if a value gets large enough, then the value of a function gets predictably close to a certain value. That is, of course, what a limit is in words, not a real mathematical definition.
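As a rough illustration of that hand-waving, here is a simulation sketch (numpy assumed; the exponential population and the sample sizes are arbitrary choices): the distribution of the sample mean of a skewed population loses its skew as n grows, which is the Central Limit Theorem in action.

```python
import numpy as np

rng = np.random.default_rng(0)

# Population: Exp(1), which is strongly skewed (skewness = 2).
# The CLT says the sample mean's distribution becomes approximately
# normal as n grows, so its skewness should shrink toward 0 (~2/sqrt(n)).
for n in (5, 50, 500):
    means = rng.exponential(scale=1.0, size=(20_000, n)).mean(axis=1)
    z = (means - means.mean()) / means.std()
    print(n, round((z ** 3).mean(), 3))
```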






A variance will never plateau, and infinity (as mind-bending as it sounds) cannot be approached.
Is it even possible to approach infinity? Something's either infinite or it's not; there's no in-between. 5,000 is, relatively speaking, just as distant from infinity as 5,000,000. I suppose what you get is a reduction in proportions: when the number of events considered doubles, the variance would, in a perfect world, halve. While the variance might appear to approach zero, the scale simply shrinks, and your differences become more and more relative. But the number of events can never be "close" to infinity, no matter how large it is, unless it IS infinity. The variance will never plateau, and the outlier will always be proportionally distant from the mean; what changes is simply a severe reduction of scale.
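The doubling/halving intuition above is essentially the variance of the sample mean, Var(mean) = sigma^2 / n. Here is a small simulation sketch (numpy assumed, with arbitrary sample sizes) to see it:

```python
import numpy as np

rng = np.random.default_rng(1)
sigma2 = 4.0  # the population below is Normal(0, 2), so its variance is 4

# Var(sample mean) = sigma^2 / n: doubling n should roughly halve it,
# but it only reaches zero in the limit, never at any finite n.
for n in (100, 200, 400, 800):
    means = rng.normal(loc=0.0, scale=2.0, size=(10_000, n)).mean(axis=1)
    print(n, round(means.var(), 5), sigma2 / n)
```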
My gut answer is no, but when I looked at this page:



http://en.wikipedia.org/wiki/Normal_dist...



my head almost exploded. I think you will like it.
