
Graduate Exam Abstract


Puoya Tabaghi

M.S. Final
June 9, 2016, 11:00 am - 12:00 pm
ENGR C101B
MIXTURE OF FACTOR MODELS FOR JOINT DIMENSIONALITY REDUCTION AND CLASSIFICATION

Abstract: In many areas such as machine learning, pattern recognition, information retrieval, and data mining, one is interested in
extracting a low-dimensional representation that truly captures the properties of the original high-dimensional data. One
example application is extracting representative low-dimensional features of underwater objects from
sonar imagery suitable for detection and classification. This is a difficult problem owing to several factors, including
variations in the operating and environmental conditions, the presence of spatially varying clutter, and variations in object
shape, composition, and orientation.
The goal of this work is to develop a novel probabilistic method using a mixture of factor models for simultaneous
nonlinear dimensionality reduction and classification. The proposed method provides a supervised probabilistic
approach suitable for analyzing labeled high-dimensional data with complex structure by exploiting a set of low-
dimensional latent variables that are both discriminative and generative. With the aid of these low-dimensional latent
variables, a mixture of linear models is introduced to locally linearize the unknown nonlinear manifold on which the high-
dimensional data supposedly lie. An optimal linear classifier is then built in the latent variable domain to separate
the support of the latent variable associated with each class. Introducing these hidden variables allows us to derive the
joint probability density function of the data and class label, reduce the data dimension, and perform clustering,
classification, and parameter estimation. This probabilistic approach provides a mechanism to traverse between the
input space and the latent (feature) space in both directions, as well as to cluster and classify data. The method can be
viewed as nonlinear manifold-based inference using a mixture of factor models.
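
For concreteness, one standard mixture-of-factor-analyzers formulation consistent with this description (the notation here is illustrative and need not match the thesis) generates an observation x from component k and latent variable z as

    x | (z, k) = \Lambda_k z + \mu_k + \epsilon_k,   z ~ N(0, I_q),   \epsilon_k ~ N(0, \Psi_k),

so that the marginal density of the data is a constrained Gaussian mixture,

    p(x) = \sum_{k=1}^{K} \pi_k \, N(x; \mu_k, \Lambda_k \Lambda_k^T + \Psi_k),

in which each component covariance has low-rank-plus-diagonal structure and the loading matrix \Lambda_k defines one local linear piece of the manifold.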
A supervised training procedure based on the Expectation-Maximization (EM) and steepest descent algorithms is then introduced
to derive the maximum likelihood (ML) estimates of the unknown parameters. It is shown that the parameters of the dimensionality
reduction model can be estimated using the EM algorithm, whereas those of the classifier are estimated using
gradient descent. The introduction of latent variables not only helps in representing the pdf of the data and reducing their
dimension, but also in parameter estimation via the EM algorithm, which finds ML estimates of the
parameters when the available data is incomplete.
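
As an illustration of the density-model half of this training procedure, below is a minimal NumPy sketch of EM for a mixture of factor analyzers in the style of Ghahramani and Hinton (1996). It is a generic sketch under assumed notation, not the thesis's implementation, and it omits the gradient-descent classifier update described above.

import numpy as np
from scipy.stats import multivariate_normal

def mfa_em(X, K, q, n_iter=50, seed=0):
    """EM for a mixture of factor analyzers (generic sketch).
    X: (n, d) data; K: number of components; q: latent dimension."""
    rng = np.random.default_rng(seed)
    n, d = X.shape
    pi = np.full(K, 1.0 / K)                        # mixing weights
    mu = X[rng.choice(n, K, replace=False)].copy()  # component means
    Lam = rng.normal(scale=0.1, size=(K, d, q))     # factor loadings
    Psi = np.full(d, X.var(axis=0).mean())          # shared diagonal noise

    for _ in range(n_iter):
        # E-step: responsibilities under each low-rank-plus-diagonal Gaussian
        logr = np.empty((n, K))
        for k in range(K):
            cov = Lam[k] @ Lam[k].T + np.diag(Psi)
            logr[:, k] = np.log(pi[k]) + multivariate_normal.logpdf(X, mu[k], cov)
        R = np.exp(logr - logr.max(axis=1, keepdims=True))
        R /= R.sum(axis=1, keepdims=True)

        # M-step: closed-form updates for pi, mu, Lam, Psi
        Nk = R.sum(axis=0)
        pi = Nk / n
        Psi_new = np.zeros(d)
        for k in range(K):
            cov = Lam[k] @ Lam[k].T + np.diag(Psi)
            beta = Lam[k].T @ np.linalg.inv(cov)    # E[z|x,k] = beta (x - mu_k)
            Ez = (X - mu[k]) @ beta.T               # (n, q) posterior latent means
            # Sufficient statistics for the augmented latent [z; 1], which lets
            # Lam_k and mu_k be updated jointly
            M = np.zeros((q + 1, q + 1))
            M[:q, :q] = Nk[k] * (np.eye(q) - beta @ Lam[k]) \
                        + (R[:, k, None] * Ez).T @ Ez
            M[:q, q] = M[q, :q] = (R[:, k, None] * Ez).sum(axis=0)
            M[q, q] = Nk[k]
            Ez1 = np.hstack([Ez, np.ones((n, 1))])
            A = np.linalg.solve(M, (R[:, k, None] * Ez1).T @ X).T  # (d, q+1)
            Lam[k], mu[k] = A[:, :q], A[:, q]
            Psi_new += (R[:, k, None] * X * (X - Ez1 @ A.T)).sum(axis=0)
        Psi = Psi_new / n
    return pi, mu, Lam, Psi

Once fitted, the posterior latent means E[z | x, k] supply the low-dimensional features on which a latent-space classifier would then be trained by gradient descent.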
A comprehensive study is carried out to assess the performance of the proposed method using two different data sets.
The first data set consists of Synthetic Aperture Sonar (SAS) images of model-generated underwater objects
superimposed on background clutter. These images correspond to two different object types, namely a cylinder (target)
and a block (non-target). The signatures of each object are synthetically generated and placed at aspect
angles ranging from 1 to 180 degrees for each object type. The goal of the classifier is to assign target versus non-target labels to these
image snippets. The second data set consists of two sets of facial images of different individuals; each image set
contains two series of 93 images of the same person at different poses. The goal of the classifier in this case is to
identify each individual correctly. The dimensionality reduction performance of the proposed method is compared to
two relevant dimensionality reduction methods, namely Probabilistic PCA (PPCA) and Mixture of Probabilistic PCA
(MPPCA), while its classification performance is benchmarked against the Support Vector Machine (SVM).
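
To convey the flavor of these benchmarks, the following is a minimal scikit-learn sketch of the reduce-then-classify baseline pipeline, with ordinary PCA standing in for PPCA and synthetic data standing in for the sonar and face images; the thesis's actual data sets and the MPPCA baseline are not reproduced here.

from sklearn.datasets import make_classification
from sklearn.decomposition import PCA
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC

# Synthetic stand-in for high-dimensional image snippets (two classes)
X, y = make_classification(n_samples=600, n_features=100, n_informative=10,
                           random_state=0)
Xtr, Xte, ytr, yte = train_test_split(X, y, test_size=0.3, random_state=0)

# Reduce to a low-dimensional feature space, then classify, mirroring the
# PPCA + SVM baseline described above
clf = make_pipeline(PCA(n_components=10), SVC(kernel="rbf"))
clf.fit(Xtr, ytr)
print("held-out accuracy: %.3f" % clf.score(Xte, yte))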


Adviser: Prof. Mahmood R. Azimi-Sadjadi
Co-Adviser: Prof. Louis Scharf
Non-ECE Member: Prof. Michael Kirby, Mathematics
Member 3: Dr. Ali Pezeshki, ECE
Additional Members: N/A

Publications:
P. Tabaghi and M. R. Azimi-Sadjadi, "Class-preserving manifold learning for detection and classification," in Proc. 2015 International Joint Conference on Neural Networks (IJCNN), IEEE, 2015.


Program of Study:
ECE 514
ECE 516
ECE 651
ECE 513
ECE 652
STAT 530
ECE 656
ECE 699