Autonomous Vehicle Perception

Contact address: lec.av.perception.ftm(at)ed.tum.de
| Number | 0000001362 |
|---|---|
| Type | Lecture |
| Scope | 2 SWS |
| Semester | Summer semester 2025 |
| Language of instruction | English |
| Position in curricula | See TUMonline |
Learning Outcomes
After participating in the course, students will have a comprehensive overview of perception methods and applications for autonomous vehicles. In particular, students are able to:
- Understand the working principles of different sensor modalities and apply calibration methods to prevent spatial and temporal misalignments.
- Remember different map representations, understand the simultaneous localization and mapping problem, and apply appropriate methods to address it.
- Analyze various detection problems, select and apply appropriate methods to camera and point cloud data, and evaluate their results.
- Understand the concepts of tracking and prediction, select and apply appropriate methods, and evaluate their results.
- Understand the motivation for End-to-End perception and analyze scenarios for teleoperated driving applications.
Description
This lecture covers all relevant aspects of perception for autonomous driving and demonstrates the associated practical applications.
1. Perception Sensors:
Sensor Modalities, Working Principles, Advantages and Disadvantages
2. Sensor Calibration:
Sensor Systems, Requirements, Spatial Calibration, and Temporal Synchronization
3. Mapping:
Map Representations, State Estimation, and Bayesian Filtering
4. Localization:
Probabilistic Localization, Feature Matching, and Global Referencing
5. SLAM:
Problem Definition, Registration, EKF, Particle Filters, and Graph-based SLAM
6. Static Object Detection:
Semantic Segmentation, Instance Segmentation, Lane Detection, and Evaluation
7. Dynamic Object Detection:
Datasets, Camera Detection, Point Cloud Detection, and Performance Metrics
8. Sensor Fusion:
Data-, Feature-, and Object-Level Fusion, as well as Occupancy Prediction
9. Tracking:
Association Problem, Filtering, Model-, and Learning-based Tracking
10. Prediction:
Knowledge-, Deep Learning-, and Reinforcement Learning-based Prediction
11. End-to-End Perception:
Environment Models, Differentiable Software Stack, and Explainability
12. Teleoperated Driving:
Disengagements, ODD Extension, and Perception Modification
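The Bayesian filtering and state-estimation ideas running through topics 3 to 5 can be illustrated in miniature with a one-dimensional Kalman filter. The following is an illustrative sketch, not course material; all noise variances and measurements are invented:

```python
# Illustrative sketch (not official course material): a 1-D Kalman filter,
# the simplest instance of the Bayesian state estimation covered in the
# Mapping and SLAM topics. The state is a vehicle position along a line;
# all numeric values below are invented for illustration.

def kalman_1d(mu, var, motion, motion_var, z, meas_var):
    """One predict-update cycle of a 1-D Kalman filter."""
    # Predict: shift the belief by the commanded motion, inflate uncertainty.
    mu_pred = mu + motion
    var_pred = var + motion_var
    # Update: fuse prediction and measurement z via the Kalman gain.
    k = var_pred / (var_pred + meas_var)
    mu_new = mu_pred + k * (z - mu_pred)
    var_new = (1.0 - k) * var_pred
    return mu_new, var_new

mu, var = 0.0, 1.0          # initial belief: position 0, variance 1
for z in [1.2, 2.1, 2.9]:   # noisy position measurements after unit motions
    mu, var = kalman_1d(mu, var, motion=1.0, motion_var=0.25, z=z, meas_var=0.5)
    print(f"position estimate: {mu:.2f}, variance: {var:.3f}")
```

Note how the variance shrinks with each update: the filter becomes more confident as it fuses more measurements, which is the core intuition behind the probabilistic localization and SLAM methods treated in the lecture.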
Prerequisites
- Foundations of Autonomous Vehicles [ED150017]
- Basic Python programming is necessary for understanding the code examples and homework. We recommend an online Python course, e.g., at Codecademy.
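As an indication of the level assumed, "basic Python" here means functions, loops, and list operations of roughly the following kind. This is a hypothetical snippet, not taken from the course:

```python
# Hypothetical snippet illustrating the basic Python level assumed for the
# homework (not course material): filter simulated 2-D lidar points by range.
import math

def within_range(points, max_range):
    """Keep only (x, y) points whose Euclidean distance is <= max_range."""
    return [p for p in points if math.hypot(p[0], p[1]) <= max_range]

points = [(1.0, 2.0), (30.0, 40.0), (3.0, 4.0)]
print(within_range(points, max_range=10.0))  # the (30.0, 40.0) point is dropped
```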
Teaching and Learning Methods
The module consists of a lecture and an exercise. While the lecture presents the theoretical foundations, the exercise focuses on the transfer of knowledge towards practical applications and scientific problems.
In the lecture, content is delivered through presentation and demonstration, and more complex topics are derived step by step and illustrated. Targeted questions that require students to transfer their knowledge give them the opportunity to speak up and discuss possible solutions. In this way, students deepen their understanding of the challenging tasks of perception.
The exercise builds problem-solving skills and practical experience. To this end, calculation and coding examples enable students to analyze, evaluate, and apply their knowledge to theoretical and practical problems. Moreover, a flipped-classroom design fosters critical thinking, scientific work, and the discussion of scientific papers.
Office hours are available for answering questions on individual lectures, practice sessions, and homework, which can be attended in person or online.
Assessment
The module examination is a written exam (duration: 90 minutes; permitted aids: non-programmable calculator, dictionary for non-native speakers). The exam uses knowledge questions, comprehension questions, and calculation tasks to test whether students can, for example,
- understand the working principles of different sensor modalities and apply calibration methods to prevent spatial and temporal misalignments,
- remember different map representations, understand the simultaneous localization and mapping problem, and apply appropriate methods to address it,
- analyze various detection problems, select and apply appropriate methods to camera and point cloud data, and evaluate their results,
- understand the concepts of tracking and prediction, select and apply appropriate methods, and evaluate their results,
- understand the motivation for End-to-End perception and analyze scenarios for teleoperated driving applications.
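A calculation task of the kind described might, for instance, ask students to apply an extrinsic calibration to a sensor measurement. The following is a hypothetical sketch of such a computation (all mounting values are invented, not an actual exam question):

```python
# Hypothetical example of a calibration-style calculation (not an actual exam
# question): transform a point from the lidar frame into the vehicle frame
# using an extrinsic calibration, here a 2-D rotation plus translation.
import math

def lidar_to_vehicle(point, yaw, translation):
    """Apply x' = R(yaw) @ x + t to a 2-D point (x, y)."""
    c, s = math.cos(yaw), math.sin(yaw)
    x, y = point
    tx, ty = translation
    return (c * x - s * y + tx, s * x + c * y + ty)

# Invented mounting: lidar 1.5 m ahead of the vehicle origin, rotated 90°.
p_vehicle = lidar_to_vehicle((2.0, 0.0), yaw=math.pi / 2, translation=(1.5, 0.0))
print(p_vehicle)
```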
Recommended Literature
- S. Pendleton et al., “Perception, Planning, Control, and Coordination for Autonomous Vehicles”, 2017
- D. Watzenig et al., “Automated Driving: Safer and More Efficient Future Driving”, 2017
- H. Winner et al., “Handbook of Driver Assistance Systems”, 2014
- R. Fan et al., “Autonomous Driving Perception: Fundamentals and Applications”, 2023