Optional Assignment 2

Written Assignment - Posterior Probabilities and Bayesian Networks

Max points:
The assignment should be submitted via Canvas


Task 1 

20 points

You are a meteorologist who places temperature sensors all over the world, and you set them up so that they automatically e-mail you, each day, the high temperature for that day. Unfortunately, you have forgotten whether you placed a certain sensor S in Maine or in the Sahara desert (but you are sure you placed it in one of those two places). The probability that you placed sensor S in Maine is 5%. The probability of a daily high temperature of 80 degrees or more is 20% in Maine and 90% in the Sahara. Assume that the daily high for any day is conditionally independent of the daily high for the previous day, given the location of the sensor.

Part a: If the first e-mail you got from sensor S indicates a daily high over 80 degrees, what is the probability that the sensor is placed in Maine?

Part b: If the first e-mail you got from sensor S indicates a daily high over 80 degrees, what is the probability that the second e-mail also indicates a daily high over 80 degrees?

Part c: What is the probability that the first three e-mails all indicate daily highs over 80 degrees?
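The three parts above can be checked numerically. The following is a minimal sketch (not part of the assignment) that applies Bayes' rule for Part a, the total-probability rule with conditional independence for Part b, and conditioning on the sensor's location for Part c, using the numbers given in the task.

```python
# Sketch of the Task 1 computations; variable names are illustrative only.
P_maine = 0.05              # prior P(sensor is in Maine)
P_sahara = 0.95             # prior P(sensor is in the Sahara)
P_hot_maine = 0.20          # P(daily high >= 80 | Maine)
P_hot_sahara = 0.90         # P(daily high >= 80 | Sahara)

# Part a: posterior P(Maine | first e-mail reports >= 80), via Bayes' rule.
p_hot = P_maine * P_hot_maine + P_sahara * P_hot_sahara
p_maine_given_hot = P_maine * P_hot_maine / p_hot

# Part b: P(second >= 80 | first >= 80). Given the location, the days are
# conditionally independent, so we weight by the Part a posterior.
p_second_given_first = (p_maine_given_hot * P_hot_maine
                        + (1 - p_maine_given_hot) * P_hot_sahara)

# Part c: P(first three e-mails all >= 80), conditioning on the location.
p_three = P_maine * P_hot_maine**3 + P_sahara * P_hot_sahara**3

print(p_maine_given_hot, p_second_given_first, p_three)
```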


Task 2

10 points.

In a certain probability problem, we have 11 variables: A, B1, B2, ..., B10. Based on these facts:

Part a: How many numbers do you need to store in the joint distribution table of these 11 variables?

Part b: What is the most space-efficient way (in terms of how many numbers you need to store) representation for the joint probability distribution of these 11 variables? How many numbers do you need to store in your solution? Your answer should work with any variables satisfying the assumptions stated above.


Task 3

15 points


Figure 1: A Bayesian Network of 5 Variables.

For the given Bayesian network, calculate the value of P(A | B = f, E = t).


Task 4

25 points

Figure 2: A decision tree for estimating whether the patron will be willing to wait for a table at a restaurant.

Part a: Suppose that, on the entire set of training samples available for constructing the decision tree of Figure 2, 80 people decided to wait, and 20 people decided not to wait. What is the initial entropy at node A (before the test is applied)?
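For reference, the entropy of a label distribution can be checked with a short helper. This is an illustrative sketch, not part of the assignment; the counts 80 and 20 come from the task statement.

```python
import math

def entropy(counts):
    """Shannon entropy (base 2) of a class distribution given as counts."""
    total = sum(counts)
    return -sum(c / total * math.log2(c / total) for c in counts if c > 0)

# Entropy at node A before any test: 80 "wait" vs. 20 "not wait".
h_A = entropy([80, 20])
print(round(h_A, 4))
```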

Part b: As mentioned in the previous part, at node A 80 people decided to wait, and 20 people decided not to wait.

What is the information gain for the weekend test at node A? 

Part c: In the decision tree of Figure 2, node E uses the exact same test (whether it is weekend or not) as node A. What is the information gain, at node E, of using the weekend test?

Part d: We have a test case of a hungry patron who came in on a rainy Tuesday. Which leaf node does this test case end up in? What does the decision tree output for that case?

Part e: We have a test case of a not hungry patron who came in on a sunny Saturday. Which leaf node does this test case end up in? What does the decision tree output for that case?


Task 5

20 points

Class   A   B   C
X       1   2   1
X       2   1   2
X       3   2   2
X       1   3   3
X       1   2   2
Y       2   1   1
Y       3   1   1
Y       2   2   2
Y       3   3   1
Y       2   1   1

We want to build a decision tree that determines whether a certain pattern is of type X or type Y. The decision tree can only use tests that are based on attributes A, B, and C. Each attribute has 3 possible values: 1, 2, 3 (we do not apply any thresholding). We have 10 training examples, shown in the table above (each row corresponds to one training example).

What is the information gain of each attribute at the root? Which attribute achieves the highest information gain at the root?
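As a sanity check, the root-level information gain of each attribute can be computed directly from the 10 training examples in the table. This is an illustrative sketch, not part of the assignment: gain(attr) = H(root) - sum over values of (weighted entropy of the subset with that value).

```python
import math
from collections import Counter

# The 10 training examples from the table: (class, A, B, C).
data = [
    ("X", 1, 2, 1), ("X", 2, 1, 2), ("X", 3, 2, 2), ("X", 1, 3, 3), ("X", 1, 2, 2),
    ("Y", 2, 1, 1), ("Y", 3, 1, 1), ("Y", 2, 2, 2), ("Y", 3, 3, 1), ("Y", 2, 1, 1),
]

def entropy(labels):
    """Shannon entropy (base 2) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * math.log2(c / n) for c in Counter(labels).values())

def info_gain(attr_index):
    """Information gain of splitting the full data set on one attribute."""
    root = entropy([row[0] for row in data])
    remainder = 0.0
    for value in {row[attr_index] for row in data}:
        subset = [row[0] for row in data if row[attr_index] == value]
        remainder += len(subset) / len(data) * entropy(subset)
    return root - remainder

for name, idx in (("A", 1), ("B", 2), ("C", 3)):
    print(name, round(info_gain(idx), 4))
```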


Task 6

10 points

Suppose that, at a node N of a decision tree, we have 1000 training examples. There are four possible class labels (A, B, C, D) for each of these training examples.

Part a: What is the highest possible and lowest possible entropy value at node N?
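For intuition on Part a: with four class labels, entropy is maximized by a uniform distribution over the labels and minimized when one label holds all the probability. A minimal sketch (not part of the assignment) checking both extremes:

```python
import math

def entropy(probs):
    """Shannon entropy (base 2) of a probability distribution."""
    return sum(-p * math.log2(p) for p in probs if p > 0)

# Highest possible: 250 examples in each of A, B, C, D -> log2(4) = 2 bits.
print(entropy([0.25, 0.25, 0.25, 0.25]))

# Lowest possible: all 1000 examples share a single label -> 0 bits.
print(entropy([1.0]))
```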

Part b: Suppose that, at node N, we choose an attribute K. What is the highest possible and lowest possible information gain for that attribute?