
CS 505 – Spring 2021 – Assignment 1 (100 points) – Probability Basics

Problems due 11:59PM EST, February 7.

Submit in Blackboard by 11:59PM EST, February 7.

Please indicate the names of those you collaborated with.

Every late day will reduce your score by 20 points.

After 2 days (i.e., if you submit on the 3rd day after the due date), it will be marked 0.

For all questions, you must show how you derive your answer.

Problem 1. (2 pts) Use the definition of conditional probability, P(S|T) = P(S, T) / P(T), to prove the chain rule:

P(S, T, U) = P(S|T, U) P(T|U) P(U).

Problem 2. (8 pts) Suppose you are locked out of the main ship, like Dave was, by the ship's AI system HAL; but this time, HAL is giving you a chance to return safely to the main ship. Without your knowledge, HAL has connected a tunnel to your space pod that can lead you back to the main ship, but he did not tell you which of the three hatchways in your pod leads to this tunnel. If you open a wrong hatchway, you will be sucked out of the pod and into space without your space helmet! HAL asked you to choose one of the three hatchways, but at the last second before you open it, he tells you which hatchway (among the two you did not pick) will lead to your death (perhaps he's being nice? Or maybe he wants to make you waver and not survive!). The question now is whether you should stick with your current hatchway or switch to the other, as-yet-unidentified one. What should you do to have the highest probability of reaching the tunnel and getting back to the main ship safe and sound, so you can finally disconnect HAL?

Consider the event that the hatchway you originally picked leads to safety, then consider the probability that you get out safely given that you decide to switch or to stay. Work the probability out using conditional probability. You must show how you arrive at your choice.
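This does not replace the required conditional-probability derivation, but a quick Monte Carlo sketch can sanity-check your conclusion; the function name and trial count below are illustrative choices:

```python
import random

def simulate(switch, trials=100_000):
    """Estimate the survival probability for the hatchway puzzle by simulation."""
    wins = 0
    for _ in range(trials):
        safe = random.randrange(3)   # hatchway leading to the tunnel
        pick = random.randrange(3)   # your initial choice
        # HAL reveals a deadly hatchway that is neither your pick nor the safe one
        revealed = next(h for h in range(3) if h != pick and h != safe)
        if switch:
            # move to the one remaining hatchway
            pick = next(h for h in range(3) if h != pick and h != revealed)
        wins += (pick == safe)
    return wins / trials
```

Comparing `simulate(switch=True)` against `simulate(switch=False)` over many trials should make the better strategy apparent.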

Problem 3. (10 pts) Let A and B be random variables and f a function. Prove that E[f(A)] = E[E[f(A)|B]].
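A small discrete sanity check of this identity on a made-up joint distribution (the table and the function f below are arbitrary illustrations, not part of the proof):

```python
# Check E[f(A)] = E[E[f(A)|B]] on a tiny binary joint distribution.
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}  # P(A=a, B=b)
f = lambda a: a * a + 1

# Left-hand side: E[f(A)] directly from the joint
lhs = sum(p * f(a) for (a, b), p in joint.items())

# Right-hand side: average the conditional expectations E[f(A)|B=b] over P(B)
pB = {}
for (a, b), p in joint.items():
    pB[b] = pB.get(b, 0.0) + p
rhs = 0.0
for b0, pb in pB.items():
    cond = sum(p * f(a) for (a, b), p in joint.items() if b == b0) / pb
    rhs += pb * cond
# lhs and rhs agree, as the identity predicts
```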

Problem 4. (10 pts) Suppose we have a sample of real values s1, s2, s3, ..., sn, each sampled from the p.d.f. p(s), where

p(s) = βe^(−βs) if s ≥ 0, and 0 otherwise,

and β is an unknown parameter. Derive the maximum likelihood estimate of β. Assume that all si in our sample are larger than 1.
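Once you have a closed form, you can check it numerically against a coarse grid search over β on a synthetic exponential sample; the true β, sample size, and grid resolution below are arbitrary choices:

```python
import math
import random

random.seed(0)
true_beta = 0.4
# Synthetic sample from the exponential density beta * exp(-beta * s)
sample = [random.expovariate(true_beta) for _ in range(5000)]

def log_likelihood(beta, data):
    """Log-likelihood of the exponential density at the given beta."""
    return sum(math.log(beta) - beta * s for s in data)

# Coarse grid search for the maximizing beta
betas = [0.01 * k for k in range(1, 200)]
beta_hat = max(betas, key=lambda b: log_likelihood(b, sample))
# beta_hat should land near true_beta, and near whatever closed form you derive
```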

Problem 5. (3 pts) Suppose we have 3 variables S, T, U. If

P(U|S) = 0.9

P(U|T) = 0.6

Can you compute P(U|S, T)? If not, just write that there is not enough information.


Problem 6. (4 pts) If instead, we have the following information

P(U|S) = 0.9 P(U|T) = 0.6

P(S) = 0.4 P(T) = 0.5

Can you compute P(U|S, T)? If not, just write that there is not enough information.

Problem 7. (3 pts) If instead we have the following information:

P(U, S) = 0.3 P(S) = 0.5 P(T) = 1

Can we compute P(U|S, T)? If not, just write that there is not enough information.

Problem 8. (5 pts) Suppose there is a gathering of people wearing different-colored shirts. Two thirds of the people wearing green shirts are laughing, and one tenth of the gathering consists of green-shirt-wearing people. Only one in five of the people not wearing green shirts are laughing. What is the probability that a randomly chosen person in the gathering is a laughing, green-shirt-wearing person?

Problem 9. (5 pts) Two thirds of baby animals in this world are fluffy. If a baby animal is fluffy, then it is more likely to be cute! That is, the probability that a baby animal is cute given that it is fluffy is 0.8, while the probability that it is cute given that it is not fluffy is 0.1. A randomly picked baby animal is cute! What is the probability that it is fluffy?

Problem 10. (5 pts) One third of all animals in a poorly maintained zoo in Indonesia are hungry. A hungry animal is more likely to be cranky, and a cranky animal is more likely to be scary: the probability that an animal is cranky given that it is hungry is 0.7, while the probability that it is cranky given that it is not hungry is 0.1; the probability that an animal is scary given that it is cranky is 0.9, while the probability that it is scary given that it is not cranky is 0.05. What is the probability that a hungry, scary animal is cranky?
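For Problems 9 and 10, a small Bayes-rule helper can be used to check hand computations; the numbers in the usage comment are placeholders, not the problems' values:

```python
def posterior(prior, like_pos, like_neg):
    """P(H | E) via Bayes' rule for a binary hypothesis H:
    prior = P(H), like_pos = P(E | H), like_neg = P(E | not H)."""
    evidence = prior * like_pos + (1 - prior) * like_neg
    return prior * like_pos / evidence

# Placeholder example: posterior(0.5, 0.9, 0.2) = 0.45 / 0.55
```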

Problem 11. (10 pts) Given:

P(A) = 1/2
P(B|A) = 1/10
P(C|A) = 0
P(D|B) = 1/2
P(B|notA) = 1/2
P(C|notA) = 1/2
P(D|notB) = 1

and D is conditionally independent of A given B, and C is conditionally independent of D given A. Compute P(C, D).
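A brute-force enumeration can check a hand computation here, assuming (and this reading is an assumption of the sketch, to be justified from the stated independences) that C depends only on A and D depends only on B, so the joint factors as P(A) P(B|A) P(C|A) P(D|B):

```python
from itertools import product

# Conditional tables keyed by the value of the conditioning variable.
pA = {True: 0.5, False: 0.5}
pB_given_A = {True: 0.1, False: 0.5}
pC_given_A = {True: 0.0, False: 0.5}
pD_given_B = {True: 0.5, False: 1.0}

# Sum P(a) P(b|a) P(C=True|a) P(D=True|b) over the hidden variables a, b.
p_CD = 0.0
for a, b in product([True, False], repeat=2):
    pb = pB_given_A[a] if b else 1 - pB_given_A[a]
    p_CD += pA[a] * pb * pC_given_A[a] * pD_given_B[b]
```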


Problem 12. Consider two machines programmed to generate compliments, each of which outputs one of 6 words, and a game where a user can press a button on a machine and out comes a compliment to make his/her day. The higher (i.e., the more positive) the compliment, the higher the user's satisfaction. The satisfaction scores for the words are (in order from least positive to most positive): adequate (score: 1), fine (score: 2), nice (score: 3), good (score: 4), great (score: 5), fantastic (score: 6). One of the machines has not developed its own intelligence, so it works as programmed (i.e., the vanilla machine) and always selects one of the 6 words uniformly at random, while the other (i.e., the AI machine) has developed an understanding that users will love it much more and keep it around longer if it always gives the highest compliment, so it strongly prefers to give the highest compliment each time:

P(word) = 1/3 if word = fantastic,
P(word) = 2/15 if word ∈ {adequate, fine, nice, good, great}.

A professor has decided to purchase these two machines to compliment his students; but because he doesn't always want to give them the highest compliment, he uses another machine (i.e., the mood machine) that will, depending on the professor's mood, press the button of either the vanilla machine or the AI one. When the professor is not feeling great, the mood machine pushes the button of the vanilla machine; this happens with probability m.

a. (5 pts) What is the expectation of the satisfaction score (in terms of m)?

b. (10 pts) What is the variance of the score in terms of m?

c. (4 pts) To generalize the above, we can think of a sample space containing several distributions: Pk(W) = P(W|M = k), k = 1, ..., n (i.e., the two compliment machines above), where M also has a distribution P(M = k) (i.e., the mood machine). Formulate P(W) in terms of Pk(W) and P(M).

d. (8 pts) Formulate E(W) in terms of E(W|M). Simplify your answer as much as possible.

e. (8 pts) Formulate Var(W) in terms of Var(W|M) and E(W|M). Simplify your answer as much as possible.
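For parts (a) and (b), a simulation of the professor's mixture can sanity-check your algebra; the seed, trial count, and any mixing value m you pass in are arbitrary choices for illustration:

```python
import random

random.seed(42)
SCORES = [1, 2, 3, 4, 5, 6]          # adequate .. fantastic
AI_WEIGHTS = [2/15] * 5 + [1/3]      # the AI machine's distribution over scores

def draw_score(m):
    """One button press of the mood machine: vanilla with probability m, else AI."""
    if random.random() < m:
        return random.choice(SCORES)                          # uniform vanilla machine
    return random.choices(SCORES, weights=AI_WEIGHTS)[0]      # skewed AI machine

def estimate(m, trials=200_000):
    """Monte Carlo estimates of the mean and variance of the satisfaction score."""
    xs = [draw_score(m) for _ in range(trials)]
    mean = sum(xs) / trials
    var = sum((x - mean) ** 2 for x in xs) / trials
    return mean, var
```

For example, `estimate(0.5)` returns Monte Carlo estimates of the mean and variance at m = 0.5, which you can compare against your formulas from (a) and (b).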
