[RADAR + LIDAR Sensor Fusion Processing Pipeline]
NANODEGREE COURSE (Self-Driving Car Engineer - Sensor Fusion)
LESSON 1 : Introduction and Sensors
Meet the team at Mercedes who will help you track objects in real-time with Sensor Fusion.
LESSON 2 : Kalman Filters
Learn from the best! Sebastian Thrun will walk you through the usage and concepts of a Kalman Filter using Python.
- Sensor Fusion and Object Tracking using an Extended Kalman Filter Algorithm — Part 1 (mithi)
- Sensor Fusion and Object Tracking using an Extended Kalman Filter Algorithm — Part 2 (mithi)
- UDACITY SDCE Nanodegree Term 2: Kalman Filters for Sensor Fusion
- Kalman Filter, Extended Kalman Filter, Unscented Kalman Filter
- Kalman Filter: Predict, Measure, Update, Repeat.
- Make sense of Kalman Filter
- What is a Kalman filter and why is there an unscented version?
- How a Kalman filter works, in pictures
- Udacity Self-Driving Cars: Extended Kalman Filters — my bits
- Extended Kalman Filters for Dummies
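The predict–measure–update loop the articles above describe can be sketched in a few lines of Python. This is a minimal 1-D illustration with made-up noise values, not the course code:

```python
# Minimal 1-D Kalman filter sketch (illustrative only): predict with a
# random-walk model, then update with a noisy position measurement.

def predict(x, p, q):
    """Predict step: state unchanged, uncertainty grows by process noise q."""
    return x, p + q

def update(x, p, z, r):
    """Update step: blend prediction with measurement z (noise variance r)."""
    k = p / (p + r)          # Kalman gain
    x = x + k * (z - x)      # corrected state estimate
    p = (1 - k) * p          # reduced uncertainty
    return x, p

x, p = 0.0, 1000.0           # vague prior: unknown position
for z in [5.1, 4.9, 5.0, 5.2]:
    x, p = predict(x, p, q=0.01)
    x, p = update(x, p, z, r=0.5)
# after a few measurements the estimate settles near 5.0 and p shrinks
```

Note how the gain k starts near 1 (trust the measurement, since the prior is vague) and falls as the variance p shrinks.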
LESSON 3 : C++ Checkpoint
- Are you ready to build Kalman Filters with C++? Take these quizzes to find out.
LESSON 4 : Lidar and Radar Fusion with Kalman Filters in C++
- In this lesson, you'll build a Kalman Filter in C++ that's capable of handling data from multiple sources. Why C++? Its performance makes real-time object tracking with a Kalman Filter practical.
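A key piece of fusing radar with a Kalman Filter is the measurement function h(x), which maps the cartesian state (px, py, vx, vy) into the polar measurement space (rho, phi, rho_dot). A hedged Python sketch, with a function name of my own and example values taken from the radar line in the data-format section below:

```python
import math

# Illustrative radar measurement function h(x) (not the course starter code):
# maps a cartesian state into range, bearing, and range rate.

def cartesian_to_polar(px, py, vx, vy):
    rho = math.hypot(px, py)             # range: distance to the object
    phi = math.atan2(py, px)             # bearing: angle from the x-axis
    rho_dot = (px * vx + py * vy) / rho  # range rate: radial velocity
    return rho, phi, rho_dot

# Ground truth from the radar example line below: x=8.6, y=0.25, vx=-3.00029, vy=0
rho, phi, rho_dot = cartesian_to_polar(8.6, 0.25, -3.00029, 0.0)
# The result lands close to the measured radar values on that line
# (8.46642, 0.0287602, -3.04035) -- the gap is the sensor noise.
```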
LESSON 5 (Project) : Extended Kalman Filter Project
In this project, you'll apply everything you've learned so far about Sensor Fusion by implementing an Extended Kalman Filter in C++!
LESSON 6 : Unscented Kalman Filters
Extended Kalman Filters handle non-linear motion by linearizing around the current estimate, but that approximation degrades when motion is highly non-linear — and real objects rarely move linearly. With Unscented Kalman Filters, you'll be able to accurately track non-linear motion without linearizing at all!
LESSON 7 (Project) : Unscented Kalman Filter Project
Put your skills to the test! Use C++ to code an Unscented Kalman Filter capable of tracking non-linear motion.
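The key idea behind the UKF — the unscented transform — can be illustrated in one dimension with plain Python: propagate a handful of sigma points through a non-linear function and recover the transformed mean and variance, instead of linearizing as the EKF does. The parameter choice lambda = 3 − n follows the common Julier/Uhlmann heuristic; the names here are my own:

```python
import math

# Illustrative 1-D unscented transform (not the course code).

def unscented_transform(mean, var, f, lam=2.0):
    n = 1                                  # state dimension
    spread = math.sqrt((n + lam) * var)
    sigmas = [mean, mean + spread, mean - spread]
    w0 = lam / (n + lam)                   # weight for the central point
    wi = 1.0 / (2.0 * (n + lam))           # weight for each spread point
    ys = [f(s) for s in sigmas]
    y_mean = w0 * ys[0] + wi * (ys[1] + ys[2])
    y_var = (w0 * (ys[0] - y_mean) ** 2
             + wi * ((ys[1] - y_mean) ** 2 + (ys[2] - y_mean) ** 2))
    return y_mean, y_var

# For x ~ N(0, 1) pushed through f(x) = x**2, the transform recovers
# E[x^2] = 1, which a first-order linearization around the mean would miss
# (the linearized f has zero slope at x = 0).
m, v = unscented_transform(0.0, 1.0, lambda x: x * x)
```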
CarND-Mercedes-SF-Utilities
- Tools for processing, visualizing, and analyzing Sensor Fusion data.
File contents
[SENSOR ID] [SENSOR RAW VALUES] [TIMESTAMP] [GROUND TRUTH VALUES]
Example 1: a line with three measurements from a radar sensor in polar coordinates, followed by a timestamp in Unix time, followed by the "ground truth" — the actual position and velocity in cartesian coordinates (four state variables).
R 8.46642 0.0287602 -3.04035 1477010443399637 8.6 0.25 -3.00029 0
(R) (rho) (phi) (drho) (timestamp) (real x) (real y) (real vx) (real vy)
Example 2: a line with two measurements from a lidar sensor in cartesian coordinates, followed by a timestamp in Unix time, followed by the "ground truth" — the actual position and velocity in cartesian coordinates (four state variables).
L 8.44818 0.251553 1477010443449633 8.45 0.25 -3.00027 0
(L) (x) (y) (timestamp) (real x) (real y) (real vx) (real vy)
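A small Python helper for reading these lines might look like this. The field names are my own labels inferred from the format description above, not part of the official utilities:

```python
# Parse one measurement line from the data file described above.
# Radar lines carry three measurements (rho, phi, drho); lidar lines
# carry two (x, y); both end with a timestamp and four ground-truth values.

def parse_line(line):
    tok = line.split()
    if tok[0] == "R":                                # radar
        meas = tuple(float(v) for v in tok[1:4])
        ts = int(tok[4])
        gt = tuple(float(v) for v in tok[5:9])
    else:                                            # "L": lidar
        meas = tuple(float(v) for v in tok[1:3])
        ts = int(tok[3])
        gt = tuple(float(v) for v in tok[4:8])
    return tok[0], meas, ts, gt

sensor, meas, ts, gt = parse_line(
    "L 8.44818 0.251553 1477010443449633 8.45 0.25 -3.00027 0")
```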