This advanced course brings together methods from Machine Learning, Computer Vision, Robotics, and Human-Computer Interaction to enable interactive machines to perceive and act in complex, social environments. For example, consider Shutter, the robot photographer shown below:
The robot was created to study human-robot interactions in public environments. It needs to perceive human users, reason about their behavior, and communicate with them. Students enrolled in this course will work with virtual models and copies of this robot for their assignments and group projects.
Part of the course will examine approaches for perception with different sensing devices and algorithms; the other part will focus on methods for decision making and applied machine learning for control. The course combines lectures, readings and presentations of state-of-the-art papers, discussions, programming assignments, and a final team project.
Instructor: Marynel Vázquez (marynel.vazquez at yale.edu)
T.F.: Sasha Lew (a.lew at yale.edu)
Class Hours: Tuesdays & Thursdays, 1:00pm - 2:15pm
Class Location: Becton Center CO31
Canvas Link: https://yale.instructure.com
Office Hours (beginning on 9/9):
Marynel - Fridays, 10:00am - 10:45am (AKW 402)
Sasha - Mondays, 4:30pm - 5:30pm ET (AKW 411)
At the end of this course, students will have gained an understanding of:
- the challenges involved in building autonomous, interactive systems, like the robot shown above;
- the limitations and advantages of various sensing techniques; and
- well-established frameworks for sequential decision making and robot development.
The assignments will teach students about the Robot Operating System (ROS v.1). Students will also be able to demonstrate their ability to work in a team and to communicate scientific content to a peer audience.
CPSC 201, CPSC 202, and CPSC 470 or 570 (or an equivalent introductory course about Artificial Intelligence/Machine Learning).
Understanding of probability, differential calculus, and linear algebra is expected for this course. Programming assignments require proficiency in Python. High-level familiarity with C++ is beneficial for working with the Shutter robot, although students are not expected to program in C++ for their assignments or final projects.
Students who do not fit this profile may be allowed to enroll with the permission of the instructor.
The following topics will be covered in the course:
- Sensor Design Choices for Interactive Systems
- Projective Geometry for Computer Vision
- Bayesian Filtering (e.g., for Tracking)
- Deep Learning for Function Approximation of Complex Physical and Social Phenomena
- Decision Making
- Markov Decision Processes
- Imitation Learning
- (Inverse) Reinforcement Learning
See the Schedule for more details.
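As a small taste of the Bayesian filtering topic above, the sketch below implements a one-dimensional Kalman filter, the simplest Gaussian instance of the predict/update cycle covered in class. The function name, noise values, and measurement sequence are illustrative assumptions, not taken from the course materials:

```python
# Minimal 1D Kalman filter: estimate a scalar state from noisy measurements.
# Assumes a static-state motion model; all numeric values are illustrative.

def kalman_1d(measurements, q=0.01, r=1.0, mu0=0.0, var0=1000.0):
    """Filter a sequence of noisy scalar measurements.

    q: process noise variance, r: measurement noise variance,
    mu0, var0: mean and variance of the initial Gaussian belief.
    """
    mu, var = mu0, var0
    estimates = []
    for z in measurements:
        # Predict: with a static-state model the mean is unchanged,
        # and uncertainty grows by the process noise.
        var += q
        # Update: fuse the predicted belief with the measurement z.
        k = var / (var + r)        # Kalman gain
        mu = mu + k * (z - mu)
        var = (1.0 - k) * var
        estimates.append(mu)
    return estimates

noisy = [1.2, 0.9, 1.1, 1.05, 0.95]
print(kalman_1d(noisy))  # estimates settle near the true value (~1.0)
```

The same predict/update structure generalizes to the multivariate filters used for tracking people with real sensors, which the course develops in full.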
The course grade will be based on:
Student Presentations (10%). Students will get the opportunity to present academic papers in class. Presentations will be graded based on clarity, how well students answer questions from the rest of the class, and how well they relate papers to other course material.
Programming Assignments (36%). There will be 3 individual programming assignments (with extra questions for students enrolled in CPSC 559). Students will have 3 late days in total, to be used as needed to extend an assignment deadline by 1, 2, or up to 3 days past the original deadline; the late days do not all need to be used on the same assignment.
Term Project (44%). Students will work in groups on a semester-long project creating a new capability for Shutter (the robot shown at the top of this website). Projects are expected to be several thousand lines of code and implemented at a high level of proficiency in Python and with ROS tools. Project grading will be based on 5 deliverables: project proposal (8%), project milestone (10%), demo (4%), project presentation (10%), and final report and supplementary material (7% + 5%).
Video quizzes (5%). For some classes, students will be assigned to watch pre-recorded lectures by the instructor and to complete quizzes about the material. The lowest quiz grade will not count towards the final grade.
Participation (5%). Being engaged and asking questions will be rewarded.
Note on CPSC 559 vs. CPSC 459: This course is double-numbered, which means that it can be taken as either 559 or 459. Graduate students must take it as 559; undergraduates must take it as 559 if they are enrolled in the combined MS-BS program. Those taking it as 559 must complete extra questions in the programming assignments and do more paper presentations.
Policy on late days: The late days are only valid for the programming assignments. Once the 3 late days are exhausted, assignments will be penalized 50% of the grade for every 24 hours past the deadline. No other late assignments will be permitted without a Dean's excuse.
The class sessions will be divided into lectures and student presentations. Recent technical papers will be used as the main text reference, along with a few chapters of the books:
- Probabilistic Robotics, by Sebastian Thrun, Wolfram Burgard, and Dieter Fox
- Reinforcement Learning: An Introduction, by Richard S. Sutton and Andrew G. Barto
- Deep Learning, by Ian Goodfellow, Yoshua Bengio, and Aaron Courville
- Multiple View Geometry in Computer Vision, by Richard Hartley and Andrew Zisserman (online version is accessible through Yale University Library)
Students need to have access to a computer running Ubuntu 20.04 for completing programming assignments and final projects. The course staff will provide access to shared computers and robots in the laboratory of the Interactive Machines Group for the purposes of this course.
In addition to using the shared computers in the laboratory of the Interactive Machines Group, students can run Ubuntu 20.04 natively on their own computer. We generally do not recommend virtual machines: students in past editions of this course have had trouble running their code in a VirtualBox virtual machine. Parallels has worked better, but on recent Apple Silicon (M1/M2) machines it requires an ARM version of Linux, which will likely require recompiling some ROS packages.