# MAS2903 Introduction to Bayesian Statistics: Solutions to the June 2017 Exam

## Question A1

a. The answers are:

i. Exponential
ii. Gamma
iii. (0.00, .00)
iv. Lower variance
v. 1.16
vi. 0

(1 mark for each part.) Unseen but should be easy.

b. Which interpretation to use here is a matter of opinion; marks are awarded for any answer with a sensible justification. Example responses:

i. Frequency interpretation: we can repeatedly observe whether this event occurs many times under similar conditions.
ii. Classical interpretation: each time we pick a card, any card in the pack is equally likely by symmetry, and the probability can be derived from this.

(2 marks for each part: 1 for an interpretation, 1 for a good reason.) Similar to coursework questions.

## Question A2

a. Let $S_1, S_2, S_3$ denote the events that the passport is real, forged by Fred, and forged by Carol respectively, and let $R$ denote the event that the passport is believed to be real. The question supplies the information:

$$\Pr(S_1) = \Pr(S_2) = \Pr(S_3) = 1/3,$$

$$\Pr(R \mid S_1) = 0.99, \qquad \Pr(R \mid S_2) = 0.8, \qquad \Pr(R \mid S_3) = 0.4.$$

Using the law of total probability:

$$\Pr(R) = \sum_i \Pr(R \mid S_i)\Pr(S_i) = \frac{0.99 + 0.8 + 0.4}{3} = 0.73.$$

Bayes' theorem is $\Pr(S_i \mid R) = \Pr(R \mid S_i)\Pr(S_i)/\Pr(R)$. Hence:

$$\Pr(S_1 \mid R) = \frac{0.33}{0.73} \approx 0.452, \qquad \Pr(S_2 \mid R) = \frac{0.8/3}{0.73} \approx 0.365, \qquad \Pr(S_3 \mid R) = \frac{0.4/3}{0.73} \approx 0.183.$$

(1 for the mathematical statement + 1 for using Bayes' theorem + 1 mark for each correct answer.) Similar to examples in the course.

b. Let $RF$ denote the event that the first expert believes the passport is real and the second believes it is false. Using independence:

$$\Pr(RF \mid S_i) = \Pr(R \mid S_i)\,\bigl(1 - \Pr(R \mid S_i)\bigr).$$

Thus:

$$\Pr(RF \mid S_1) = 0.0099, \qquad \Pr(RF \mid S_2) = 0.16, \qquad \Pr(RF \mid S_3) = 0.24.$$

Now the law of total probability gives $\Pr(RF) = (0.0099 + 0.16 + 0.24)/3 \approx 0.137$, and Bayes' theorem gives:

$$\Pr(S_1 \mid RF) \approx 0.024, \qquad \Pr(S_2 \mid RF) \approx 0.390, \qquad \Pr(S_3 \mid RF) \approx 0.586.$$

(2 for deriving the new conditional probabilities of RF + 1 mark for each correct answer.) Unseen.

## Question A3

a. The likelihood is

$$f(\mathbf{x} \mid \theta) = \prod_{i=1}^{n} \frac{e^{-\theta}\theta^{x_i}}{x_i!} \propto e^{-n\theta}\theta^{n\bar{x}} = e^{-7\theta}\theta^{12}.$$

(2 marks)

b. To find the MLE, first find the log-likelihood:

$$\log f(\mathbf{x} \mid \theta) = -7\theta + 12\log\theta + \text{const}.$$

Now differentiate:

$$\frac{d}{d\theta}\log f(\mathbf{x} \mid \theta) = -7 + \frac{12}{\theta}.$$

Setting this equal to zero and solving for $\theta$ gives $\hat\theta = 12/7$.
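The Bayes' theorem calculation in Question A2(a) is easy to check numerically. A minimal sketch, assuming the three sources are equally likely a priori ($\Pr(S_i) = 1/3$), as in the solution above:

```python
# Numerical check of Question A2(a).
# Assumption: equal prior probabilities Pr(S_i) = 1/3 for the three
# sources (real, forged by Fred, forged by Carol).
priors = [1 / 3, 1 / 3, 1 / 3]
pr_r_given_s = [0.99, 0.8, 0.4]  # Pr(R | S_i) from the question

# Law of total probability: Pr(R) = sum_i Pr(R | S_i) Pr(S_i)
pr_r = sum(p * q for p, q in zip(pr_r_given_s, priors))

# Bayes' theorem: Pr(S_i | R) = Pr(R | S_i) Pr(S_i) / Pr(R)
posterior = [p * q / pr_r for p, q in zip(pr_r_given_s, priors)]

print(round(pr_r, 2))                    # Pr(R) = 0.73
print([round(p, 3) for p in posterior])  # [0.452, 0.365, 0.183]
```

The same pattern, with $\Pr(RF \mid S_i)$ in place of $\Pr(R \mid S_i)$, reproduces part (b).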
Check this is the global maximum, e.g. by noting that $f(\mathbf{x} \mid \theta = 12/7) > 0$ but $f(\mathbf{x} \mid \theta = 0) = 0$ and $\lim_{\theta\to\infty} f(\mathbf{x} \mid \theta) = 0$. (Showing that the second derivative is less than zero for all $\theta$ is another valid approach here.)

(1 for the log-likelihood, 1 for finding the root, 1 for checking it is the global maximum; alternatively the marks can be awarded if students use other methods.)

c. The posterior is

$$\pi(\theta \mid \mathbf{x}) \propto \pi(\theta)\,f(\mathbf{x} \mid \theta) \propto \theta e^{-b\theta} \cdot e^{-7\theta}\theta^{12} = \theta^{13} e^{-(7+b)\theta}.$$

This is a $\text{Ga}(14, b+7)$ distribution.

(1 for Bayes' theorem, 1 for some correct working, 1 for the correct answer)

d. From the formula sheet, the posterior mode is $\frac{13}{b+7}$. Solving $\frac{13}{b+7} = \frac{12}{7}$ gives $b = 7/12$. We also need $b > 0$ from the definition of the Gamma distribution.

(1 for using the posterior mode formula, 1 for the correct answer.) Unseen but should be easy.

## Question A4

a.i. This distribution has the correct mean. Also, if $\mu \sim N(a, b^2)$ then $\Pr(a - 2b \le \mu \le a + 2b) \approx 0.95$, so $\Pr(10 \le \mu \le 50) \approx 0.95$.

(1 for a comment on the mean, 1 for a comment on the other values.) Unseen question on basic material.

a.ii. This distribution has the wrong mode.

(1 mark for a sensible comment [e.g. correct mean but the variance seems wrong, with an explanation why], 1 if this shows the prior is wrong.) Unseen question on basic material.

b. Is it reasonable to expect to see $\mu \ge 60$ only around 1 time in a thousand? (1 mark)

c. Using the update formulae from the formula sheet, the posterior is $N(B, 1/D)$ where

$$D = \frac{1}{100} + \frac{10}{100} = \frac{11}{100}, \qquad B = \frac{\frac{1}{100}\times 30 + \frac{10}{100}\times 8}{D} = 10.$$

(1 for using the formula, 1 for the correct answer)

d. The posterior is $\mu \mid \mathbf{x} \sim N(10, 100/11)$ truncated to $\mu \ge 5$. This has density

$$\pi(\mu \mid \mathbf{x}) \propto \begin{cases} \exp\left[-\frac{11}{200}(\mu - 10)^2\right] & \text{for } \mu \ge 5 \\ 0 & \text{otherwise.} \end{cases}$$

(2 for using truncation of the previous posterior, 1 for the final density.) Standard coursework question.

## Question B5

a. A prior is conjugate for a specific model if the posterior is in the same family of distributions. This is the case here, as a Gamma prior leads to a Gamma posterior.

(1 for a correct definition, 1 for using it to get the correct answer.)
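The Normal update in Question A4(c) takes only a few lines to verify. This sketch assumes the values implied by the working above: a $N(30, 100)$ prior, $n = 10$ observations with known variance $100$, and sample mean $\bar{x} = 8$:

```python
# Normal-Normal update from Question A4(c).
# Assumed values (read off the worked solution above):
# prior mu ~ N(30, 100), data variance 100, n = 10, sample mean 8.
prior_mean, prior_var = 30.0, 100.0
sigma2, n, xbar = 100.0, 10, 8.0

# Precision-weighted update: D is the posterior precision,
# B the posterior mean, so the posterior is N(B, 1/D).
D = 1 / prior_var + n / sigma2                        # 11/100
B = (prior_mean / prior_var + n * xbar / sigma2) / D  # 10

print(round(D, 2), round(B, 6))  # 0.11 10.0
```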
b.i. The credible interval must satisfy:

$$\int_a^b \pi(\theta \mid \mathbf{x})\,d\theta = 0.95 \;\Longrightarrow\; \int_a^b \frac{h^{*g^*}\theta^{g^*-1}\exp(-h^*\theta)}{\Gamma(g^*)}\,d\theta = 0.95 \;\Longrightarrow\; \int_a^b \frac{\theta^{5}e^{-\theta}}{\Gamma(6)}\,d\theta = 0.95.$$

(1 for the general integral, 2 for rearranging to the correct answer.) Unseen question applying standard material.

b.ii. An HDI must also satisfy (in this case) $\pi(a \mid \mathbf{x}) = \pi(b \mid \mathbf{x})$.

(2 marks. Other correct conditions are also acceptable.)

c. Sequential Bayesian inference can be used here. Use the posterior given $\mathbf{x}$, $\text{Ga}(g^*, h^*)$, as the prior for the $\mathbf{y}$ data. Using the update formulae, this produces a new posterior $\text{Ga}(g^{**}, h^{**})$ where

$$g^{**} = g^* + m = g + n + m, \qquad h^{**} = h^* + m\bar{y} = h + n\bar{x} + m\bar{y}.$$

(1 mark for applying sequential Bayes, 1 for using the update formulae, 1 for the correct answer. Other correct approaches are also acceptable.)

d. The likelihood is

$$f(\mathbf{x}\mid\theta) = \prod_{i=1}^n \theta e^{-\theta x_i} = \theta^n \exp(-\theta n \bar{x}).$$

Hence the log-likelihood and its derivatives are:

$$\log f(\mathbf{x}\mid\theta) = n\log\theta - \theta n\bar{x},$$

$$\frac{\partial}{\partial\theta}\log f(\mathbf{x}\mid\theta) = \frac{n}{\theta} - n\bar{x}, \qquad \frac{\partial^2}{\partial\theta^2}\log f(\mathbf{x}\mid\theta) = -\frac{n}{\theta^2}.$$

Solving $\frac{\partial}{\partial\theta}\log f(\mathbf{x}\mid\theta) = 0$ gives the maximum likelihood estimate $\hat\theta = 1/\bar{x}$ (n.b. this is the global maximum, as $\frac{\partial^2}{\partial\theta^2}\log f(\mathbf{x}\mid\theta) < 0$ for all $\theta$). The observed information is

$$J(\theta) = -\frac{\partial^2}{\partial\theta^2}\log f(\mathbf{x}\mid\theta) = \frac{n}{\theta^2}.$$

The general form of the asymptotic posterior is $\theta \sim N(\hat\theta, J(\hat\theta)^{-1})$. So in this particular case it is

$$N\!\left(\frac{1}{\bar{x}},\; \frac{1}{n\bar{x}^2}\right).$$

(1 mark for the likelihood, 2 for the log-likelihood and derivatives, 2 for the MLE, 2 for the observed information, 2 for the general form of the asymptotic posterior, 1 for the final answer.) An example from the notes, but remembering all the details may be challenging.

## Question B6

a. The likelihood is

$$f(\mathbf{x}\mid\theta) = \prod_{i=1}^n \binom{k}{x_i}\theta^{x_i}(1-\theta)^{k-x_i} = \left[\prod_{i=1}^n \binom{k}{x_i}\right] \theta^{n\bar{x}}(1-\theta)^{n(k-\bar{x})},$$

as required.

(1 for the correct approach, 1 for some correct answer)

b. The sufficient statistic is $\bar{X}$ by the factorisation theorem, since we can factorise the likelihood function as $f(\mathbf{x}\mid\theta) = h(\mathbf{x})\,g(\bar{x},\theta)$ where

$$h(\mathbf{x}) = \prod_{i=1}^n \binom{k}{x_i}, \qquad g(\bar{x},\theta) = \theta^{n\bar{x}}(1-\theta)^{n(k-\bar{x})}.$$

(1 for using the factorisation theorem, 1 for the correct answer)
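The sequential update in Question B5(c) can be illustrated numerically. The prior parameters and data summaries below are made-up values for illustration, not figures from the exam paper:

```python
# Sketch of the sequential update in Question B5(c) for the
# exponential model with a Ga(g, h) prior.
# Assumption: g, h and the data summaries are hypothetical values.
g, h = 2.0, 1.0    # prior parameters
n, xbar = 5, 1.2   # first sample: size and mean
m, ybar = 3, 0.9   # second sample: size and mean

# Stage 1: posterior after x is Ga(g + n, h + n*xbar).
g1, h1 = g + n, h + n * xbar
# Stage 2: use Ga(g1, h1) as the prior for y.
g2, h2 = g1 + m, h1 + m * ybar

# Processing both samples in a single batch gives the same answer.
g_batch, h_batch = g + n + m, h + n * xbar + m * ybar
print(g2 == g_batch, h2 == h_batch)  # True True
```

This is exactly the point of the marking note: sequential and batch updates lead to the same $\text{Ga}(g + n + m,\; h + n\bar{x} + m\bar{y})$ posterior.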
c. This is true because

$$E(\bar{X}\mid\theta) = E\!\left(\frac{1}{n}\sum_{i=1}^n X_i \,\Big|\, \theta\right) = \frac{1}{n}\sum_{i=1}^n E(X_i\mid\theta) \quad \text{(by linearity of expectation)} \quad = \frac{1}{n}\,nk\theta = k\theta,$$

using the expectation of the binomial distribution, as required.

(1 for the correct approach, 1 for the details)

d. First we need the log-likelihood:

$$\log f(\mathbf{x}\mid\theta) = \log c(\mathbf{x}) + n\bar{x}\log\theta + n(k-\bar{x})\log(1-\theta).$$

Next we want its second derivative:

$$\frac{\partial}{\partial\theta}\log f(\mathbf{x}\mid\theta) = \frac{n\bar{x}}{\theta} - \frac{n(k-\bar{x})}{1-\theta}, \qquad \frac{\partial^2}{\partial\theta^2}\log f(\mathbf{x}\mid\theta) = -\frac{n\bar{x}}{\theta^2} - \frac{n(k-\bar{x})}{(1-\theta)^2}.$$

Now we need to find Fisher's information $I(\theta)$, which is:

$$I(\theta) = E\!\left[-\frac{\partial^2}{\partial\theta^2}\log f(\mathbf{x}\mid\theta) \,\Big|\, \theta\right] = \frac{n E[\bar{X}\mid\theta]}{\theta^2} + \frac{n\bigl(k - E[\bar{X}\mid\theta]\bigr)}{(1-\theta)^2} = \frac{nk\theta}{\theta^2} + \frac{nk(1-\theta)}{(1-\theta)^2} \quad \text{(using part c)} \quad = nk\left[\frac{1}{\theta} + \frac{1}{1-\theta}\right] = \frac{nk}{\theta(1-\theta)}.$$

Finally, the Jeffreys prior is

$$\pi(\theta) \propto \sqrt{I(\theta)} \propto \theta^{-1/2}(1-\theta)^{-1/2},$$

as required.

(2 for the log-likelihood, 2 for the derivatives, 4 for Fisher's information, 2 for the Jeffreys prior)

e. No, because it is a $\text{Be}(1/2, 1/2)$ distribution.

(1 for a correct answer + a valid reason)

f. The posterior density is

$$\pi(\theta\mid\mathbf{x}) \propto \pi(\theta)\,f(\mathbf{x}\mid\theta) \propto \theta^{-1/2}(1-\theta)^{-1/2}\,\theta^{2.5}(1-\theta)^{12.5} = \theta^{2}(1-\theta)^{12}.$$

This is a $\text{Be}(3, 13)$ distribution.

(1 for Bayes' theorem, 1 for correct working, 1 for stating the final distribution)

B6 is basically a standard example from the notes, but using an unseen distribution (binomial). This will make part d challenging.

## Question B7

a.i. The likelihood is

$$f(x = 6 \mid \theta) = \frac{\theta^6 e^{-\theta}}{6!},$$

evaluated at each of the two rates given in the question: $\theta = 8$ (busy) and the normal rate $\theta_N$. Then, by Bayes' theorem, the probability of busy activity is

$$\Pr(\theta = 8 \mid X = 6) = \frac{\Pr(X=6\mid\theta=8)\Pr(\theta=8)}{\Pr(X=6\mid\theta=8)\Pr(\theta=8) + \Pr(X=6\mid\theta=\theta_N)\Pr(\theta=\theta_N)} \approx 0.38.$$

Thus the probability of normal activity is $\Pr(\theta = \theta_N \mid X = 6) \approx 0.62$.

(1 for the likelihood, 2 for Bayes' theorem, 1 for the correct answer)

a.ii. Since tropical storms occur at the same rate in August as they do in July, we have

$$f(y \mid x=6) = \sum_{\theta\in\{\theta_N,\,8\}} f(y\mid\theta)\,\pi(\theta\mid x=6) = 0.62\,\frac{\theta_N^{\,y}e^{-\theta_N}}{y!} + 0.38\,\frac{8^y e^{-8}}{y!}.$$

(2 for the correct predictive formula, 1 for the correct answer)

a.iii. Taking the values of $y$ with the highest predictive probabilities, we see that $[0, 8]$ gives close to 95% coverage.

(2 for the correct method, 1 for the correct answer. Other intervals with the correct probability are also acceptable.)
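The Fisher information result in Question B6(d) can be checked numerically for a single observation: the expectation of the negative second derivative of the log-likelihood should match $k/(\theta(1-\theta))$. The values of $k$ and $\theta$ below are illustrative, not from the exam:

```python
from math import comb

# Numerical check of Question B6(d) for one Bin(k, theta) observation:
# E[-d^2/dtheta^2 log f] should equal k / (theta * (1 - theta)).
# Assumption: k and theta are hypothetical illustrative values.
k, theta = 10, 0.3

def pmf(x):
    """Binomial(k, theta) probability mass function."""
    return comb(k, x) * theta**x * (1 - theta)**(k - x)

# -d^2/dtheta^2 log f(x | theta) = x/theta^2 + (k - x)/(1 - theta)^2,
# so take its expectation over the binomial distribution of x.
expected_info = sum(pmf(x) * (x / theta**2 + (k - x) / (1 - theta)**2)
                    for x in range(k + 1))

closed_form = k / (theta * (1 - theta))
print(abs(expected_info - closed_form) < 1e-9)  # True
```

For $n$ independent observations the information simply scales by $n$, giving the $nk/(\theta(1-\theta))$ stated above.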
b. The posterior is

$$\pi(\theta\mid x=1) \propto \theta(1-\theta)^{22} \cdot \theta(1-\theta)^{4} \propto \theta^{2}(1-\theta)^{26},$$

and so $\theta\mid x=1 \sim \text{Be}(3, 27)$. The predictive probability function for $Y$ is, for $y = 0, 1, \dots, 5$:

$$f(y\mid x=1) = \int_\Theta f(y\mid\theta)\,\pi(\theta\mid x=1)\,d\theta = \int_0^1 \binom{5}{y}\theta^{y}(1-\theta)^{5-y}\,\frac{\theta^{2}(1-\theta)^{26}}{B(3,27)}\,d\theta = \frac{\binom{5}{y}}{B(3,27)}\int_0^1 \theta^{y+2}(1-\theta)^{31-y}\,d\theta = \binom{5}{y}\frac{B(y+3,\,32-y)}{B(3,27)},$$

that is, $Y\mid x=1 \sim \text{BetaBin}(5, 3, 27)$, as required.

(2 for recognising the Binomial model, … for the posterior, 2 for stating the predictive distribution as an integral, … for rearranging this correctly to get the correct answer.)

All parts of B7 are very similar to examples in the course notes, but will still be challenging.
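As a sanity check, the $\text{BetaBin}(5, 3, 27)$ predictive distribution derived above should sum to one over $y = 0, \dots, 5$; a minimal sketch:

```python
from math import comb, gamma

# Check of the Beta-Binomial predictive in Question B7(b):
# f(y | x=1) = C(5, y) * B(y+3, 32-y) / B(3, 27) for y = 0, ..., 5,
# which should form a valid probability distribution.

def beta_fn(a, b):
    """Beta function expressed via the Gamma function."""
    return gamma(a) * gamma(b) / gamma(a + b)

pred = [comb(5, y) * beta_fn(y + 3, 32 - y) / beta_fn(3, 27)
        for y in range(6)]

print(abs(sum(pred) - 1) < 1e-9)  # True
```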
