Object tracking with LIDAR, Radar, sensor fusion and Extended Kalman Filter

Parts 1–3 of the original write-up are recorded in the Kalman filter GitBook.

4. Object motion tracking in 2D by fusing noisy measurements from LIDAR and Radar sensors

This part uses noisy measurements from Lidar and Radar sensors, fused with an Extended Kalman Filter, to estimate the 2D motion of a bicycle.

4.1 State Vector X & motion model

The state has four elements: position in x and y, and the velocity in x and y.

The linear motion model in the matrix form:
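The formula image is missing here; a sketch of the standard constant-velocity model in matrix form, consistent with the four-element state above:

```latex
x' = F x + \nu, \qquad
\begin{pmatrix} p_x' \\ p_y' \\ v_x' \\ v_y' \end{pmatrix} =
\begin{pmatrix}
1 & 0 & \Delta t & 0 \\
0 & 1 & 0 & \Delta t \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{pmatrix}
\begin{pmatrix} p_x \\ p_y \\ v_x \\ v_y \end{pmatrix} + \nu
```

Here ν is the process noise, discussed below.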

Motion noise and process noise refer to the same thing: the uncertainty in the object’s position when predicting its location.

The model assumes the velocity is constant between time intervals, but in reality an object’s velocity can change due to acceleration. The model accounts for this uncertainty via the process noise.
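The prediction step described above can be sketched in Python. The `noise_ax`/`noise_ay` acceleration variances and the structure of Q are the standard constant-velocity-model assumptions, not values taken from this project:

```python
import numpy as np

def predict(x, P, dt, noise_ax=9.0, noise_ay=9.0):
    """One Kalman filter prediction step for the constant-velocity model.

    x: state [px, py, vx, vy]; P: 4x4 state covariance.
    noise_ax / noise_ay: assumed acceleration-noise variances.
    """
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1, 0],
                  [0, 0, 0, 1]], dtype=float)
    dt2, dt3, dt4 = dt**2, dt**3, dt**4
    # Q captures the uncertainty from unmodeled acceleration (process noise).
    Q = np.array([[dt4/4*noise_ax, 0,              dt3/2*noise_ax, 0],
                  [0,              dt4/4*noise_ay, 0,              dt3/2*noise_ay],
                  [dt3/2*noise_ax, 0,              dt2*noise_ax,   0],
                  [0,              dt3/2*noise_ay, 0,              dt2*noise_ay]])
    x = F @ x             # x' = F x  (the motion model)
    P = F @ P @ F.T + Q   # propagate the uncertainty
    return x, P
```

For example, a bicycle at (1, 2) moving at (0.5, 0.5) is predicted at (1.5, 2.5) after Δt = 1 s.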

4.2 Laser Measurements

The raw LIDAR sensor output is a point cloud, but in this project the point cloud has been pre-processed and the bicycle’s (x, y) position is already extracted.

Definition of LIDAR variables:

  • z is the measurement vector. For a lidar sensor, the z vector contains the position-x and position-y measurements.

  • H is the matrix that projects your belief about the object’s current state into the measurement space of the sensor.

    • For lidar, this is a fancy way of saying that we discard velocity information from the state variable since the lidar sensor only measures position:
    • The state vector x contains [px, py, vx, vy], whereas the z vector contains only [px, py].
    • Multiplying H by x allows us to compare our belief, x, with the sensor measurement, z.
  • What does the prime notation in the x vector represent?
    • The prime notation, like px′, means you have already done the prediction step but have not yet done the measurement update.
    • In other words, the object was at px. After time Δt, you calculate where you believe the object will be based on the motion model and get px′.
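A minimal sketch of the lidar measurement update, assuming the standard Kalman filter update equations; H discards the velocity components, as described above:

```python
import numpy as np

# H projects the 4-D state [px, py, vx, vy] into the 2-D lidar
# measurement space [px, py]: the velocity terms are simply dropped.
H = np.array([[1., 0., 0., 0.],
              [0., 1., 0., 0.]])

def lidar_update(x, P, z, R):
    """Standard KF measurement update for a lidar (x, y) measurement.

    z: measured [px, py]; R: 2x2 lidar measurement-noise covariance.
    """
    y = z - H @ x                    # innovation: measurement minus predicted measurement
    S = H @ P @ H.T + R              # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)   # Kalman gain: weighs prediction vs. measurement
    x = x + K @ y
    P = (np.eye(4) - K @ H) @ P
    return x, P
```

With equal prediction and measurement uncertainty, the updated position lands halfway between the predicted and measured positions.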

4.3 Radar Measurements

The Radar sensor output is defined by the measured distance to the object (range), its orientation (angle), and its radial velocity (range rate).

Definition of Radar variables:

  • The range, ρ, is the radial distance to the object.

    • The range is the magnitude of the position vector, which can be defined as ρ = sqrt(px² + py²).
  • The bearing, φ = atan(py/px), is referenced counter-clockwise from the x-axis, so an object below the x-axis has a negative φ.

  • The range rate, ρ˙, is the projection of the velocity, v, onto the line of sight, L, from the vehicle to the object.

To fuse the Radar measurements, defined in the polar coordinate system, with the LIDAR measurements, defined in the Cartesian coordinate system, one of the two must be transformed.

In this project the LIDAR measurements are transformed from the Cartesian into the polar coordinate system using the standard Cartesian-to-polar mapping.
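A sketch of that Cartesian-to-polar mapping, assuming the range/bearing/range-rate definitions from section 4.3 (atan2 is used instead of atan so that φ lands in the correct quadrant):

```python
import math

def cartesian_to_polar(px, py, vx, vy):
    """Map a Cartesian state into the radar (polar) measurement space.

    Returns (rho, phi, rho_dot): range, bearing, and range rate.
    """
    rho = math.sqrt(px**2 + py**2)
    phi = math.atan2(py, px)              # bearing in (-pi, pi], CCW from the x-axis
    rho = max(rho, 1e-6)                  # guard against division by zero
    rho_dot = (px * vx + py * vy) / rho   # velocity projected onto the line of sight
    return rho, phi, rho_dot
```

For example, an object at (3, 4) with velocity (1, 2) gives ρ = 5 and ρ˙ = (3·1 + 4·2)/5 = 2.2.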

4.4 Overview of the Kalman Filter Algorithm Map

The Kalman Filter algorithm will go through the following steps:

A. first measurement

the filter will receive initial measurements of the bicycle’s position relative to the car.

These measurements will come from a radar or lidar sensor.

B. initialize state and covariance matrices

the filter will initialize the bicycle’s position based on the first measurement.

then the car will receive another sensor measurement after a time period Δt.

C. predict

the algorithm will predict where the bicycle will be after time Δt.

One basic way to predict the bicycle location after Δt is to assume the bicycle’s velocity is constant; thus the bicycle will have moved velocity * Δt.

In the extended Kalman filter lesson, we will assume the velocity is constant; in the unscented Kalman filter lesson, we will introduce a more complex motion model.

D. update

the filter compares the “predicted” location with what the sensor measurement says.

The predicted location and the measured location are combined to give an updated location.

The Kalman filter will put more weight on either the predicted location or the measured location depending on the uncertainty of each value.

E. repeat

then the car will receive another sensor measurement after a time period Δt.

The algorithm then does another predict and update step.
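Steps A–E above can be sketched as a driver loop. The `init`/`predict`/`update` callables and the (timestamp, z) measurement tuples are illustrative assumptions, not this project's actual interfaces:

```python
def run_filter(measurements, init, predict, update):
    """Skeleton of the filter loop described in steps A-E.

    measurements: iterable of (timestamp, z) pairs.
    init(z), predict(state, dt), update(state, z): caller-supplied callables.
    """
    state = None
    prev_t = None
    for t, z in measurements:
        if state is None:
            # A + B: initialize the state from the first measurement.
            state = init(z)
        else:
            # C: predict where the object will be after dt seconds.
            dt = t - prev_t
            state = predict(state, dt)
            # D: combine the prediction with the new measurement.
            state = update(state, z)
        prev_t = t                       # E: wait for the next measurement
    return state
```

The same loop handles both lidar and radar: only the `update` callable needs to dispatch on the sensor type.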
