Computer Science @ Rochester
Wednesday, December 15, 2010
2:00 PM
Computer Studies Building Room 601
Bin Wei
University of Rochester
Graphical Models for Heterogeneous Transfer Learning and Co-reference Resolution
Traditional supervised machine learning requires labeled data for the specific problem of interest. There have been many attempts to reduce this requirement, such as approaches based on semi-supervised learning. In recent years, researchers have begun to consider a new strategy known as transfer learning, where labeled data from an old problem (called the source task) is used to assist the learning of a new but related problem (the target task).

In this thesis, we mainly consider an extreme case of transfer learning that we denote heterogeneous transfer learning, in which the feature spaces of the source and target tasks are disjoint. We first consider the cross-lingual text classification task, where we need to train a classifier for Chinese but only have labeled data in English. We adapt the structural correspondence learning (SCL) algorithm to this problem. We then generalize the SCL algorithm into a multi-task transfer learning strategy and propose the use of a restricted Boltzmann machine (RBM), a special type of probabilistic graphical model, as an implementation. We also give a preliminary theoretical analysis of the strategy by combining previous work on general transfer learning and multi-task learning.
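As a rough illustration of the RBM idea, and not the thesis's actual model, the Python sketch below trains an RBM with one-step contrastive divergence (CD-1) on a binary feature matrix; one can imagine the visible units spanning the concatenated feature spaces of the source and target tasks, with the learned hidden units serving as a shared representation for classifiers on both. All names, dimensions, and hyperparameters here are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def train_rbm(V, n_hidden=100, lr=0.05, epochs=10):
    """V: binary data matrix (examples x visible units)."""
    n_vis = V.shape[1]
    W = 0.01 * rng.standard_normal((n_vis, n_hidden))
    b_v = np.zeros(n_vis)          # visible biases
    b_h = np.zeros(n_hidden)       # hidden biases
    for _ in range(epochs):
        # Positive phase: hidden activations given the data.
        p_h = sigmoid(V @ W + b_h)
        h = (rng.random(p_h.shape) < p_h).astype(float)
        # Negative phase: one Gibbs step back to a "reconstruction".
        p_v = sigmoid(h @ W.T + b_v)
        p_h_recon = sigmoid(p_v @ W + b_h)
        # CD-1 update: data statistics minus reconstruction statistics.
        W += lr * (V.T @ p_h - p_v.T @ p_h_recon) / V.shape[0]
        b_v += lr * (V - p_v).mean(axis=0)
        b_h += lr * (p_h - p_h_recon).mean(axis=0)
    return W, b_v, b_h

# The hidden activations sigmoid(V @ W + b_h) can then be used as shared
# features when training supervised classifiers for either task.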

Finally, we study the problem of co-reference resolution using another kind of graphical model, the conditional random field (CRF). We show that a previously proposed ranking approach, which produces state-of-the-art results, can be viewed as a special case of this model. We go on to show how using a CRF allows us to easily incorporate other NLP tasks, such as non-anaphoric identification and noun phrase boundary detection.
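To give a rough sense of how an antecedent-ranking approach can be read as a log-linear, CRF-style conditional model (the feature function, the NULL candidate, and all names below are illustrative assumptions, not the thesis's formulation), each anaphoric mention selects one antecedent candidate with probability proportional to exp(w . f(mention, candidate)), and training uses the usual observed-minus-expected feature gradient.

import numpy as np

def antecedent_distribution(w, feature_vectors):
    """feature_vectors: one feature array per candidate antecedent; a special
    NULL candidate can stand in for a non-anaphoric mention."""
    scores = np.array([w @ f for f in feature_vectors])
    scores -= scores.max()                     # numerical stability
    probs = np.exp(scores)
    return probs / probs.sum()

def log_likelihood_gradient(w, feature_vectors, gold_idx):
    """Gradient of the conditional log-likelihood for one mention:
    observed features minus expected features under the model."""
    probs = antecedent_distribution(w, feature_vectors)
    expected = sum(p * f for p, f in zip(probs, feature_vectors))
    return feature_vectors[gold_idx] - expected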