Sensor Fusion and Navigation for Autonomous Systems
For autonomous systems to move through their environment, engineers must design, simulate, test, and deploy algorithms that perceive the surroundings, track moving objects, and plan the system's own motion. This workflow is critical for a wide range of systems, including self-driving cars, warehouse robots, and unmanned aerial vehicles (UAVs). In this talk, you will learn how to use MATLAB® and Simulink® to develop perception, sensor fusion, localization, multi-object tracking, and motion planning algorithms. Topics covered include:
- Perception algorithm design using deep learning
- Fusing sensor data (cameras, lidar, and radar) to maintain situational awareness
- Mapping the environment and localizing the vehicle using SLAM algorithms
- Path planning with obstacle avoidance
- Path following and control design
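To give a flavor of the fusion idea behind the topics above, here is a minimal sketch, in Python rather than the MATLAB/Simulink tooling the talk covers, of a constant-velocity Kalman filter that sequentially fuses two noisy position sensors into one track. The function name, time step, and noise variances are illustrative assumptions, not anything from the talk.

```python
import numpy as np

def fuse_tracks(zs_a, zs_b, dt=0.1, var_a=4.0, var_b=1.0):
    """Fuse two noisy position-sensor streams with a 1D constant-velocity
    Kalman filter. All parameters are illustrative assumptions."""
    F = np.array([[1.0, dt], [0.0, 1.0]])  # state transition for (pos, vel)
    Q = 0.01 * np.eye(2)                   # process noise covariance
    H = np.array([[1.0, 0.0]])             # both sensors measure position only
    x = np.array([[zs_a[0]], [0.0]])       # initialize state from first reading
    P = np.eye(2)                          # initial state covariance
    estimates = []
    for za, zb in zip(zs_a, zs_b):
        # Predict the state forward one time step
        x = F @ x
        P = F @ P @ F.T + Q
        # Sequentially update with each sensor's measurement
        for z, r in ((za, var_a), (zb, var_b)):
            S = H @ P @ H.T + r            # innovation covariance
            K = P @ H.T / S                # Kalman gain
            x = x + K * (z - (H @ x).item())
            P = (np.eye(2) - K @ H) @ P
        estimates.append(x[0, 0])
    return estimates
```

In practice, weighting each sensor by its noise variance (`var_a`, `var_b`) is what lets the fused track outperform either sensor alone; multi-object tracking adds data association and track management on top of this single-track core.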
Published: 18 Oct 2020