Course Information
What you can expect to learn in this course
Consider self-driving cars, speech-based web search, and machines that grade essays. We have moved far from the days when chess was the hallmark of machine intelligence: AI today is a matter of everyday use, and some of the largest technology firms are riding this boom. How does one build such "intelligent" systems? What are the basic principles driving research in these areas? In this course, you will learn the elements of machine learning, optimization, search, and probabilistic reasoning, and explore projects that apply state-of-the-art approaches to real problems in NLP, vision, and robotics. By the end, you will have a grasp of some of the fundamental principles that will drive technology in the coming years.
Course Pedagogy
The first part of the course will introduce these topics through lectures and homeworks, integrating class presentations and discussions. This will be followed by projects, whose topics will be decided around week 6. Each project should investigate some topic of current research interest, and you are expected to be able to communicate its key ideas to others in the course.
Owing to the project nature of the course, anyone wishing to drop needs to do so before project groups are formed. You will certainly NOT be able to drop the course at a very late date.
Text
There is no single text, though we will be loosely following [RN]: Artificial Intelligence: A Modern Approach, Stuart Russell and Peter Norvig, 3rd edition.
We will also use selected chapters from: Kevin Murphy, Machine Learning: A Probabilistic Perspective (2012); Christopher Bishop, Pattern Recognition and Machine Learning (2006); and Duda, Hart, and Stork, Pattern Classification (2nd ed., 2001). You are also expected to read some research papers as part of this course.
Prerequisites
CS210 / ESO 211 Data Structures and Algorithms;
MSO 201 Probability and Statistics.
Grading Scheme
- Two Written Exams: 15% + 15%
- Course Discussions, Quizzes, and Homework: 15-20%
- Final Project: 45-55%. Approximate breakup:
  - Proposal: ~5%
  - In-class presentation / discussions: ~10%
  - Final poster / demo: ~15%
  - Report: ~20%
Projects
- To be done in teams of one or two persons (random assignment).
- Read and present a recent research paper from the list of suggested papers. Write a brief review. (around week 4)
- Define your project topic and present a project proposal. (around week 6)
- Implement the project over approximately six weeks.
- Projects from past years, assignments, solutions, and other details may be seen under the old course page.
Plagiarism policy
Plagiarism: Ideas, diagrams, and text taken from other people's work must be clearly attributed; there should be no possibility of a misunderstanding that the work was done as part of your project. As per department policy, in addition to a course-specific penalty, all instances of plagiarism, however small, must be recorded and reported to the DOAA. Significant plagiarism will result in an F grade.
Course Topics
| Topic | Week | References |
| --- | --- | --- |
| INTRO: AI systems today. Evolution of AI from symbolic reasoning to learning systems. The idea of an agent. | week 1 | RN ch. 1, 2 |
| Learning approaches to AI: linear function modeling via regression; uncertainty: probabilistic models, classification, information theory. Case study: PCA, Eigenfaces, face recognition. | weeks 2-3 | Bishop ch. 1; Murphy ch. 1, 2; RN ch. 2, 18 |
| State of AI: readings from current research | week 4 | Paper reviews based on the list of suggested papers |
| LEARNING: regression vs. classification; unsupervised vs. supervised; clustering; generative vs. discriminative; manifold dimensionality reduction. STATISTICAL LEARNING: naive Bayes; k-NN; multi-layer perceptrons and deep learning; kernel methods and SVMs. | weeks 4-5 | RN ch. 13, 18, 20; Duda & Hart ch. 2; Bishop ch. 1 |
| SENSING: vision - image formation, gradient and motion cues, learning backgrounds, tracking | week 6 | RN ch. 24; Marr ch. 1 |
| ACTION: robotics - articulated and mobile robots; motion planning, task planning | week 7 | RN ch. 25 |
| SEARCH, CONSTRAINT PROPAGATION, and LOGIC | week 8 | RN ch. 3, 5, 6 |
| LANGUAGE: natural language processing fundamentals | week 9 | RN ch. 22, 23, 26 |
| PROJECTS: proposal, presentations, final demos | weeks 10-12 | Relevant research papers |
| CLOSURE: AI and cognition; philosophy | week 12b | |