What Does It Mean for a Coin to Be Fair?

Approaches to probability

Classical: This approach is based on a set of 'equally likely' outcomes. Upon tossing, a coin can show H or T, and we assume (reasonably in most cases) that H and T are equally likely outcomes. Upon rolling, a die can show one through six spots, and we assume the six possible outcomes are equally likely. The probability of an event is defined to be the number of 'favorable' outcomes divided by the number of possible outcomes. So P(H) = 1/2 for the coin. For the die, P(Two) = 1/6 and P(Even) = 3/6 = 1/2.

[Complication: One can say that the idea of 'equally likely' already requires the idea of 'probability', so that this isn't really a satisfactory approach. However, it works nicely for fairly played games of chance and was the basis of many of the early results in probability theory.]

Frequentist: This approach is based on an experiment that is repeatable infinitely many times in such a way that repetitions do not influence one another. The probability of an event E is defined as the 'limit' of the fraction

    p_n = #(Occurrences of E so far) / #(Repetitions so far)

as n becomes very large. In practice, this approach requires a lot of work: repetitions and tallying of results. Furthermore, the 'limit' cannot be an 'ordinary' mathematical limit such as

    lim a_n = lim (1 + 2/n)^n = e^2.

In this limit, we can write down in advance the value of a_n for any n. For example, a_4 = (1 + 2/4)^4 = 1.5^4 = 5.0625. But in the probability 'limit' we don't know the value of p_n until we have done some experimentation. The theory to make sense of "lim p_n" is based on the Law of Large Numbers, which we will review soon.

A simulation can give a good idea of the value of lim p_n. We can randomly generate a sequence of 0s and 1s (representing Hs and Ts), find p_n after each coin toss, and see what happens after, say, 5000 tosses.

Copyright © 2012, 2019 by Bruce E. Trumbo. All rights reserved. This is a draft.
Corrections/comments/permissions: [email protected]
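The contrast between the two kinds of 'limit' can be checked numerically. A short R sketch (the particular n values chosen here are arbitrary) tabulates a_n = (1 + 2/n)^n in advance for several n, something we can never do for p_n:

```r
# 'Ordinary' limit: a_n = (1 + 2/n)^n can be computed in advance
# for any n, unlike the probability 'limit' p_n.
n = c(4, 100, 10000, 1000000)
a = (1 + 2/n)^n
cbind(n, a)      # a_4 = 5.0625; a climbs toward e^2 as n grows
exp(2)           # e^2 = 7.389056...
```

No experimentation is needed: the value of a_n is fully determined by the formula.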
[Complications: If we were tossing a real coin, we would have to be sure tosses are 'fair' and 'independent', and that the coin does not change its nature through wear after many tosses. If we do this by computer, we have to be sure that we have a source of 0s and 1s (pseudorandom generator) that behaves like 'honest' repeated coin tosses.]

x = sample(0:1, 5000, replace = TRUE)   # vector of 5000 random 0s and 1s
s = cumsum(x);  n = 1:5000;  p = s/n    # num, denom, ratio vectors
TAB = cbind(n, x, s, p)                 # binds 4 column vectors to make matrix
head(TAB);  tail(TAB)                   # first 6 and last 6 rows of matrix TAB

        n x s         p
[1,]    1 1 1 1.0000000    # for small n, values of p fluctuate greatly
[2,]    2 0 1 0.5000000
[3,]    3 0 1 0.3333333
[4,]    4 1 2 0.5000000
[5,]    5 1 3 0.6000000
[6,]    6 1 4 0.6666667

           n x    s         p
[4995,] 4995 0 2501 0.5007007    # but eventually 'converge' to very nearly 1/2
[4996,] 4996 0 2501 0.5006005
[4997,] 4997 1 2502 0.5007004
[4998,] 4998 0 2502 0.5006002
[4999,] 4999 0 2502 0.5005001
[5000,] 5000 1 2503 0.5006000

Each time the program is run, it will give slightly different results for p_5000, but 5000 is a long way from ∞, so we can imagine that p_n will become even closer to 1/2 after very many thousands of coin tosses.
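A rough guide to how close p_n 'should' be to 1/2 comes from the standard deviation of a sample proportion, sqrt(0.25/n) for a fair coin. This sketch (a standard binomial fact, not derived in these notes) computes the interval within which roughly 95% of runs will land:

```r
# For n fair tosses, p_n has mean 1/2 and standard deviation
# sqrt(0.25/n); most runs land within about 2 sd of 1/2.
n = 5000
sd.p = sqrt(0.25/n)                # about 0.00707
c(0.5 - 2*sd.p, 0.5 + 2*sd.p)     # roughly (0.486, 0.514) for n = 5000
```

The observed p_5000 = 0.5006 above sits comfortably inside this range.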
A 'trace', which is a plot of p_n against n, illustrates the process of convergence.

plot(n, p, type="l", lwd=2, ylab="Fraction of Heads", xlab="Nr of Tosses")
abline(h=.5, col="green2")

The trace wobbles around a lot at the left of the graph, but seems to converge to 1/2 at the right.
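Since each run of the simulation produces a different sequence of tosses, each run produces a different trace. A sketch that overlays several independent traces (the seed and the colors here are arbitrary choices) shows that all of them wobble differently for small n yet settle near 1/2:

```r
# Overlay five independent traces of p_n against n.
set.seed(1)   # arbitrary seed, just for reproducibility
plot(NULL, xlim = c(1, 5000), ylim = c(0, 1),
     xlab = "Nr of Tosses", ylab = "Fraction of Heads")
for (i in 1:5) {
  x = sample(0:1, 5000, replace = TRUE)
  lines(1:5000, cumsum(x)/(1:5000), col = i)
}
abline(h = .5, col = "green2")
```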