Computer Science 246/446

Homework

Spring 2005

Homework 7
Due Friday April 15, 5pm

Implement EM to train an HMM for the data from homework 4. The model should have four hidden states with Gaussian observation probabilities. Does the HMM model the data better than the original mixture of Gaussians?
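
The sketch below is one possible starting point, not a reference solution: Baum-Welch (EM) for a fully connected 4-state HMM with one Gaussian per state, written in Python/NumPy. It assumes hw4.dat loads with np.loadtxt as a (T, 2) array and treats the whole file as a single observation sequence; the per-iteration log-likelihood it prints can be compared against the mixture-of-Gaussians likelihood from homework 4.

import numpy as np
from scipy.stats import multivariate_normal

def baum_welch(X, K=4, n_iter=50, seed=0):
    """EM for an HMM with Gaussian emissions on a (T, D) observation sequence."""
    rng = np.random.default_rng(seed)
    T, D = X.shape
    # Random initialization: uniform start probs, random transitions,
    # means drawn from the data, covariances set to the data covariance.
    pi = np.full(K, 1.0 / K)
    A = rng.random((K, K))
    A /= A.sum(axis=1, keepdims=True)
    mu = X[rng.choice(T, K, replace=False)]
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(D) for _ in range(K)])

    for it in range(n_iter):
        # Emission densities B[t, k] = N(x_t; mu_k, Sigma_k)
        B = np.column_stack([multivariate_normal.pdf(X, mu[k], Sigma[k]) for k in range(K)])

        # Forward pass with per-step scaling to avoid underflow.
        alpha = np.zeros((T, K)); c = np.zeros(T)
        alpha[0] = pi * B[0]; c[0] = alpha[0].sum(); alpha[0] /= c[0]
        for t in range(1, T):
            alpha[t] = (alpha[t-1] @ A) * B[t]
            c[t] = alpha[t].sum(); alpha[t] /= c[t]

        # Backward pass using the same scaling constants.
        beta = np.zeros((T, K)); beta[-1] = 1.0
        for t in range(T - 2, -1, -1):
            beta[t] = (A @ (B[t+1] * beta[t+1])) / c[t+1]

        gamma = alpha * beta                      # state posteriors, rows sum to 1
        xi = np.zeros((K, K))                     # expected transition counts
        for t in range(T - 1):
            xi += np.outer(alpha[t], B[t+1] * beta[t+1]) * A / c[t+1]

        # M-step: re-estimate all parameters from expected counts.
        pi = gamma[0]
        A = xi / xi.sum(axis=1, keepdims=True)
        Nk = gamma.sum(axis=0)
        mu = (gamma.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (gamma[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(D)

        print(f"iter {it}: log-likelihood = {np.log(c).sum():.3f}")
    return pi, A, mu, Sigma

# X = np.loadtxt('hw4.dat')   # assumes a whitespace-separated file of 2-D points
# baum_welch(X, K=4)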

Homework 6
Due Thursday April 7 in class
Homework 5
Due Thursday March 31 in class
Homework 4
Due Friday March 4

Implement EM fitting of a mixture of Gaussians on the two-dimensional data set hw4.dat. You should (at a minimum) try different numbers of mixture components, as well as tied vs. separate covariance matrices for the Gaussians. Which model seems to fit the data best?
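
As a hedged starting point, here is a minimal EM loop for a K-component Gaussian mixture in Python/NumPy, with a tied flag that pools the component covariances into a single shared matrix. The loading line assumes hw4.dat is a whitespace-separated file of 2-D points; the choice of K and a proper convergence check (a fixed iteration count is used here) are left to you.

import numpy as np
from scipy.stats import multivariate_normal

def em_gmm(X, K, tied=False, n_iter=100, seed=0):
    """EM for a K-component Gaussian mixture on (N, D) data; tied=True shares one covariance."""
    rng = np.random.default_rng(seed)
    N, D = X.shape
    w = np.full(K, 1.0 / K)                              # mixture weights
    mu = X[rng.choice(N, K, replace=False)]              # means initialized at random data points
    Sigma = np.array([np.cov(X.T) + 1e-6 * np.eye(D) for _ in range(K)])

    for it in range(n_iter):
        # E-step: responsibilities r[n, k] = P(z_n = k | x_n)
        dens = np.column_stack([w[k] * multivariate_normal.pdf(X, mu[k], Sigma[k]) for k in range(K)])
        ll = np.log(dens.sum(axis=1)).sum()
        r = dens / dens.sum(axis=1, keepdims=True)

        # M-step: weighted maximum-likelihood estimates of weights, means, covariances.
        Nk = r.sum(axis=0)
        w = Nk / N
        mu = (r.T @ X) / Nk[:, None]
        for k in range(K):
            diff = X - mu[k]
            Sigma[k] = (r[:, k, None] * diff).T @ diff / Nk[k] + 1e-6 * np.eye(D)
        if tied:
            # Pool the component covariances into a single shared matrix.
            shared = (Nk[:, None, None] * Sigma).sum(axis=0) / N
            Sigma = np.array([shared] * K)

        print(f"K={K} tied={tied} iter {it}: log-likelihood = {ll:.3f}")
    return w, mu, Sigma

# X = np.loadtxt('hw4.dat')   # assumes a whitespace-separated file of 2-D points
# em_gmm(X, K=3, tied=False)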

Homework 3
Due Friday Feb 25

Implement ICA in Matlab, and use it to unmix the sound sources in the following file: hw3b.dat (note: new file 2/21/05). This file is a matrix of observations from 3 sensors over 10000 time steps; your algorithm should unmix them into 3 sources.
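
The assignment asks for Matlab; purely as an illustration of the algorithmic steps, here is a sketch of symmetric FastICA with a tanh nonlinearity in Python/NumPy: center and whiten the observations, then alternate the fixed-point update with symmetric decorrelation. The orientation of hw3b.dat (rows as time steps vs. rows as sensors) is an assumption noted in the comments.

import numpy as np

def fast_ica(X, n_iter=200, seed=0):
    """Symmetric FastICA with a tanh nonlinearity; X is (n_sensors, T)."""
    rng = np.random.default_rng(seed)
    n, T = X.shape
    # Center and whiten so the mixed signals have identity covariance.
    X = X - X.mean(axis=1, keepdims=True)
    d, E = np.linalg.eigh(np.cov(X))
    V = E @ np.diag(1.0 / np.sqrt(d)) @ E.T          # whitening matrix
    Z = V @ X

    # Start from a random orthogonal unmixing matrix.
    W, _ = np.linalg.qr(rng.standard_normal((n, n)))
    for _ in range(n_iter):
        Y = W @ Z
        G = np.tanh(Y)
        # FastICA fixed-point update followed by symmetric decorrelation.
        W_new = (G @ Z.T) / T - np.diag((1 - G**2).mean(axis=1)) @ W
        u, _, vt = np.linalg.svd(W_new)
        W = u @ vt
    return W @ Z, W @ V                               # recovered sources, full unmixing matrix

# A = np.loadtxt('hw3b.dat')   # assumes a whitespace-separated matrix, one row per time step
# S, W = fast_ica(A.T)         # transpose so rows are sensors, columns are time steps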

Homework 2
Due Friday Feb 11

Implement the Decision Tree algorithm described in class, including the MDL criterion. Test it using the following dataset: voting2.dat. Turn in your code along with a report describing how the algorithm worked, and any observations about the types of trees it learned. You should try varying (at least) the amount of training data (the training/test split is up to you), whether MDL is used, and how the description length is weighted.
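
One way to structure the tree learner is sketched below in Python. It handles categorical attributes such as the y/n/? votes in voting2.dat, splits greedily on description-length reduction, and stops when no split saves more data bits than weight times the model bits it costs. The coding scheme here (n times entropy for the data, log2 of the attribute count plus one bit per branch for the model) is a simplified stand-in, not necessarily the MDL criterion from class; the weight parameter corresponds to the description-length weighting the assignment asks you to vary, and parsing voting2.dat is left to you since its format is not shown here.

import numpy as np
from collections import Counter
from math import log2

def entropy(labels):
    """Entropy (bits) of a list of class labels."""
    n = len(labels)
    return -sum(c / n * log2(c / n) for c in Counter(labels).values())

def data_bits(labels):
    """Bits to encode the labels at a leaf: n * entropy, a common MDL surrogate."""
    return len(labels) * entropy(labels) if labels else 0.0

def grow(rows, labels, attrs, use_mdl=True, weight=1.0):
    """Recursively grow a tree over categorical attributes.

    A split is made only if the data bits it saves exceed weight * the model
    bits needed to describe the split (attribute choice plus one bit per branch).
    """
    majority = Counter(labels).most_common(1)[0][0]
    if len(set(labels)) == 1 or not attrs:
        return majority                                   # leaf: predict majority class
    best, best_cost = None, data_bits(labels)             # cost of stopping here as a leaf
    for a in attrs:
        values = set(row[a] for row in rows)
        split_data = sum(data_bits([l for row, l in zip(rows, labels) if row[a] == v])
                         for v in values)
        model = log2(len(attrs)) + len(values)            # simplified model description length
        cost = split_data + (weight * model if use_mdl else 0.0)
        if cost < best_cost:
            best, best_cost = a, cost
    if best is None:
        return majority
    rest = [a for a in attrs if a != best]
    children = {}
    for v in set(row[best] for row in rows):
        idx = [i for i, row in enumerate(rows) if row[best] == v]
        children[v] = grow([rows[i] for i in idx], [labels[i] for i in idx],
                           rest, use_mdl, weight)
    return (best, children, majority)

def predict(tree, row):
    """Follow the tree; fall back to the stored majority label for unseen values."""
    while isinstance(tree, tuple):
        attr, children, majority = tree
        tree = children.get(row[attr], majority)
    return tree

# Hypothetical usage, assuming rows is a list of attribute tuples and labels the class
# for each record, both parsed from voting2.dat:
# tree = grow(rows, labels, attrs=list(range(len(rows[0]))), use_mdl=True, weight=1.0)
# errors = sum(predict(tree, r) != l for r, l in zip(test_rows, test_labels))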

Homework 1
Due Thursday Jan 27


gildea @ cs rochester edu
April 7, 2005