Sensor Fusion and Navigation for Autonomous Systems
For autonomous systems to move through their environment, engineers need to design, simulate, test, and deploy algorithms that perceive the surroundings, track moving objects, and plan the system's course of motion. This workflow is critical for a wide range of systems, including self-driving cars, warehouse robots, and unmanned aerial vehicles (UAVs). In this talk, you will learn how to use MATLAB® and Simulink® to develop perception, sensor fusion, localization, multi-object tracking, and motion planning algorithms. Topics covered include:
- Perception algorithm design using deep learning
- Fusing sensor data (cameras, lidar, and radar) to maintain situational awareness
- Mapping the environment and localizing the vehicle using SLAM algorithms
- Path planning with obstacle avoidance
- Path following and control design
Published: 18 Oct 2020
Featured Product
Navigation Toolbox