Assignment 10

This assignment need not be submitted. It provides examples of material that CSE 5360 students can be tested on but that was not part of any assignment.

NOTE: CSE 4308 students will not be tested on this material.


Task 1 (45 points).

We have a binary classification problem where the two classes are A and B, a pattern is denoted as x, and P(A | x) = 0.9 for every x.

Part a: What is the error rate of a true Bayes classifier, averaged over all examples? In other words, what is the probability that the Bayes classifier will give the wrong answer for a random x? Justify your answer.

Part b: What is the error rate of a nearest neighbor classifier? In other words, what is the probability that the nearest neighbor classifier will give the wrong answer for a random x? Justify your answer.

Part c: What is the error rate of a 3-nearest neighbor classifier (i.e., a k-nearest neighbor with k=3)? In other words, what is the probability that the 3-nearest neighbor classifier will give the wrong answer for a random x? Justify your answer.
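Because P(A | x) is the same for every x, the asymptotic error rates of these three classifiers can be sanity-checked with a small Monte Carlo simulation. This is only a sketch for checking your answers, not a substitute for the requested justifications; it models each neighbor's label as an independent draw from P(A | x), which is the standard asymptotic (infinite-data) assumption:

```python
import random

random.seed(0)
P_A = 0.9          # P(A | x), identical for every x, as stated in the task
TRIALS = 100_000

def draw_label():
    """Sample a true class label given P(A | x) = 0.9."""
    return "A" if random.random() < P_A else "B"

bayes_err = nn_err = nn3_err = 0
for _ in range(TRIALS):
    true_label = draw_label()
    # Bayes classifier: always output the more probable class, A.
    if true_label != "A":
        bayes_err += 1
    # 1-NN: asymptotically, the nearest neighbor's label is itself
    # a draw from P(A | x).
    if draw_label() != true_label:
        nn_err += 1
    # 3-NN: majority vote over three independent neighbor labels.
    votes = [draw_label() for _ in range(3)]
    majority = "A" if votes.count("A") >= 2 else "B"
    if majority != true_label:
        nn3_err += 1

print(bayes_err / TRIALS, nn_err / TRIALS, nn3_err / TRIALS)
```

Running this and comparing the three estimates against your derived probabilities is a quick way to catch an algebra mistake.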


Task 2 (10 points).

At the M-step of the EM algorithm, we recompute the mean and std of every Gaussian by taking weighted averages over all training objects. What would happen if we changed that step, to take unweighted averages instead of weighted averages?
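To make the contrast concrete, here is a minimal sketch of both versions of the mean/std update for a single one-dimensional Gaussian. The data points and the responsibilities (the E-step weights) are made up for illustration:

```python
import math

# Hypothetical one-dimensional data points and E-step responsibilities
# w[i] = P(this Gaussian | x_i) for one component of the mixture.
x = [1.0, 1.2, 0.9, 5.0, 5.2]
w = [0.99, 0.98, 0.97, 0.05, 0.02]

# Weighted M-step update (the standard EM formulas):
wsum = sum(w)
mean_w = sum(wi * xi for wi, xi in zip(w, x)) / wsum
std_w = math.sqrt(sum(wi * (xi - mean_w) ** 2 for wi, xi in zip(w, x)) / wsum)

# Unweighted version (the modification the task asks about): every
# training object counts equally, no matter how unlikely it is to
# belong to this Gaussian.
mean_u = sum(x) / len(x)
std_u = math.sqrt(sum((xi - mean_u) ** 2 for xi in x) / len(x))

print(mean_w, std_w)   # dominated by the points near 1.0
print(mean_u, std_u)   # pulled toward the overall mean of all the data
```

Note that with unweighted averages every Gaussian receives the same mean and std, which is worth keeping in mind when you write your answer.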


Task 3 (10 points)

Suppose that, at a node N of a decision tree, we have 1000 training examples. There are four possible class labels (A, B, C, D) for each of these training examples.

Part a: What are the highest and lowest possible entropy values at node N?

Part b: Suppose that, at node N, we choose an attribute K. What are the highest and lowest possible information gains for that attribute?
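The entropy of a class distribution can be computed directly from the per-class counts; a short sketch (the two example splits of the 1000 examples are chosen to illustrate the extremes):

```python
import math

def entropy(counts):
    """Entropy (in bits) of a class distribution given per-class counts."""
    total = sum(counts)
    return sum(-(c / total) * math.log2(c / total) for c in counts if c > 0)

# With four class labels, entropy is maximized when the 1000 examples
# are split evenly across A, B, C, D, and minimized when they all
# share a single label.
print(entropy([250, 250, 250, 250]))   # even split over 4 classes
print(entropy([1000, 0, 0, 0]))        # all examples in one class
```

For Part b, recall that information gain is the parent's entropy minus the weighted average entropy of the children, so its extremes follow from the extremes of the child entropies.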


Task 4 (10 points)

Your boss at a software company gives you a binary classifier (i.e., a classifier with only two possible output values) that predicts, for any basketball game, whether the home team will win or not. This classifier has 28% accuracy, and your boss assigns you the task of improving it so that its accuracy is better than 60%. How do you achieve that task? Can you guarantee achieving better than 60% accuracy?
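One property of binary classifiers worth verifying empirically before answering: the accuracy of a classifier and the accuracy of the classifier that outputs the opposite label always sum to 1. A minimal sketch with made-up game outcomes and predictions:

```python
# Hypothetical ground truth and predictions for 10 games
# (1 = home team wins, 0 = home team loses); the values are made up.
truth = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
preds = [0, 1, 0, 0, 1, 0, 1, 1, 0, 1]

def accuracy(y_true, y_pred):
    """Fraction of predictions that match the ground truth."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

# Inverting every output of a binary classifier complements its accuracy.
flipped = [1 - p for p in preds]
print(accuracy(truth, preds))     # -> 0.1
print(accuracy(truth, flipped))   # -> 0.9
```

Think about how this property applies to a classifier whose accuracy is 28%, and whether the resulting guarantee holds for new, unseen games.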

Task 5 (10 points)

Consider the training set for a pattern classification problem given below.

Attribute 1   Attribute 2   Class
    15            28          A
    20            10          B
    18            32          A
    32            15          B
    25            15          B

Assuming we want to build a Pseudo-Bayes classifier for this problem using one-dimensional Gaussians (with the Naive Bayes assumption) to approximate the required probabilities, calculate the probability density functions required.
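Under the Naive Bayes assumption, each class/attribute pair gets its own one-dimensional Gaussian, fit by the mean and standard deviation of that attribute's values within that class. A sketch that computes these parameters from the table above (it uses the population standard deviation; whether the course convention is population or sample std is an assumption you should check):

```python
import math

# Training set from the task: (attribute 1, attribute 2, class)
data = [(15, 28, "A"), (20, 10, "B"), (18, 32, "A"),
        (32, 15, "B"), (25, 15, "B")]

def gaussian_params(values):
    """Mean and population standard deviation of a list of values."""
    m = sum(values) / len(values)
    var = sum((v - m) ** 2 for v in values) / len(values)
    return m, math.sqrt(var)

# One Gaussian per (class, attribute) pair, as Naive Bayes requires.
params = {}
for label in ("A", "B"):
    for attr in (0, 1):
        vals = [row[attr] for row in data if row[2] == label]
        params[(label, attr)] = gaussian_params(vals)

def pdf(v, mean, std):
    """One-dimensional Gaussian density N(v; mean, std)."""
    return math.exp(-((v - mean) ** 2) / (2 * std ** 2)) / (std * math.sqrt(2 * math.pi))

for key, (m, s) in sorted(params.items()):
    print(key, m, s)
```

Each entry of `params` defines one of the four required density functions, which can then be evaluated on a new pattern via `pdf`.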