Undergraduate Program - Department of Electrical and Computer Engineering
PATTERN RECOGNITION
Description
The course covers the most popular pattern recognition techniques in the literature, as they are typically employed in a number of practical applications. In more detail, the following topics are covered:
- Decision theory and the Bayesian approach to classification.
- Maximum likelihood parameter estimation and the expectation maximization algorithm.
- Nearest-neighbor classifiers.
- Bayesian networks.
- Linear and non-linear classifiers.
- Neural networks.
- Support vector machines.
- Decision trees.
- Markov chains and hidden Markov models.
- Classifier combination.
- Feature selection based on various approaches.
- Data transforms and feature vector dimensionality reduction.
- Basic concepts in clustering.
- Basic clustering algorithms, including K-means, sequential, and agglomerative clustering.
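As a taste of the first two topics above, a Bayesian classifier with maximum likelihood parameter estimation can be sketched in a few lines. The course's hands-on work uses Matlab; the sketch below is in Python purely for illustration, using hypothetical one-dimensional data for two classes: Gaussian class-conditional densities are fit by maximum likelihood, and a point is assigned to the class maximizing prior times likelihood.

```python
import math

# Hypothetical one-dimensional training data for two classes (illustration only).
class_data = {
    "a": [1.0, 1.2, 0.8, 1.1, 0.9],
    "b": [3.0, 2.8, 3.2, 3.1, 2.9],
}

# Maximum-likelihood estimates: sample mean and (biased) sample variance,
# plus class priors from the relative class frequencies.
total = sum(len(xs) for xs in class_data.values())
params = {}
for label, xs in class_data.items():
    mean = sum(xs) / len(xs)
    var = sum((x - mean) ** 2 for x in xs) / len(xs)
    params[label] = (mean, var, len(xs) / total)

def gaussian_pdf(x, mean, var):
    return math.exp(-(x - mean) ** 2 / (2 * var)) / math.sqrt(2 * math.pi * var)

def classify(x):
    # Bayes decision rule: pick the class maximizing prior * likelihood.
    return max(params, key=lambda c: params[c][2] * gaussian_pdf(x, params[c][0], params[c][1]))

print(classify(1.05))  # near the mean of class "a"
print(classify(2.95))  # near the mean of class "b"
```

The same decision rule generalizes directly to higher-dimensional features and more classes; only the density model changes.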
Subject area
Signals, Communications, and Networking
Learning Outcomes
This course introduces students to the basic concepts and algorithms of pattern recognition, as they are typically employed in a number of practical applications, such as speech and audio recognition, image and video analysis, biometrics, and bioinformatics. The course covers the most commonly used classification algorithms, feature selection and transformation methods, and data clustering. It provides numerous examples to familiarize students with these concepts, as well as practical computational tools in the Matlab framework that further demonstrate them.
Students successfully completing this class will have mastered the main concepts and algorithms in the field of pattern recognition. For example, they will be able to:
- Design and implement pattern recognition systems for a wide variety of applications, including recognition and segmentation of images, recognition of speech sounds, etc.
- Extract and select appropriate features of reduced dimensionality from a wide variety of data, including speech, audio, images, and video.
- Obtain class-conditional parametric distributions of data features on the basis of labeled data, using a number of techniques, such as maximum likelihood and maximum a posteriori estimation, as well as the expectation-maximization algorithm in the case of partially observed data.
- Implement, train, and test a number of classifiers, including Gaussian mixture models, neural networks, support vector machines, decision trees, and hidden Markov models.
- Perform data clustering in an unsupervised manner by means of various algorithms, such as K-means, sequential, and agglomerative clustering.
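The last outcome above can be illustrated with a minimal K-means sketch (again in Python rather than the course's Matlab, on made-up one-dimensional data): the algorithm alternates between assigning each point to its nearest centroid and moving each centroid to the mean of its assigned points, until the centroids stop changing.

```python
# Minimal K-means on made-up one-dimensional data (illustration only).
points = [1.0, 1.1, 0.9, 5.0, 5.2, 4.8]
centroids = [0.0, 6.0]  # initial guesses for K = 2 clusters

for _ in range(100):
    # Assignment step: each point goes to its nearest centroid.
    clusters = [[] for _ in centroids]
    for p in points:
        nearest = min(range(len(centroids)), key=lambda i: abs(p - centroids[i]))
        clusters[nearest].append(p)
    # Update step: each centroid moves to the mean of its cluster
    # (an empty cluster keeps its previous centroid).
    new_centroids = [sum(c) / len(c) if c else centroids[i]
                     for i, c in enumerate(clusters)]
    if new_centroids == centroids:  # converged
        break
    centroids = new_centroids

print(centroids)  # roughly [1.0, 5.0]
```

In practice the result depends on the initial centroids, which is why the course also treats sequential and agglomerative alternatives.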