# How to define bespoke MeasurementModel with trackingKF

3 views (last 30 days)
William Campbell on 2 April 2021
Commented: William Campbell on 21 May 2021
Hello, I am fusing radar and mono-camera data in a Kalman filter as part of a tracking algorithm. I am using the sensor input dataset from the MATLAB example “Forward Collision Warning Using Sensor Fusion”, which is in a Cartesian coordinate frame.
With an extended Kalman filter ('trackingEKF') I can supply a bespoke measurement function that switches depending on whether the sensor ID is radar or camera, since the filter options accept function handles (see below, where 'trackingEKF' calls the bespoke measurement function 'cvmeas2DFullState'), and it runs fine.
```matlab
filter = trackingEKF('State', H1 * ste, ...
    'StateCovariance', steCov, ...
    'MeasurementNoise', detectionClusters.MeasurementNoise(1:4,1:4), ...
    'StateTransitionFcn', @constvel2d, ...
    'MeasurementFcn', @cvmeas2DFullState, ...
    'StateTransitionJacobianFcn', @constvel2djac, ...
    'MeasurementJacobianFcn', @cvmeas2DFullStateJac, ...
    'ProcessNoise', Q);
```
```matlab
% Measurement function:
function meas = cvmeas2DFullState(state, sensorID, varargin)
% The measurements depend on the sensor type, which is reported by
% the MeasurementParameters property of the objectDetection. The following
% two sensorID values are used:
%   sensorID = 1: video objects, the measurement is [x;vx;y], i.e. vy is not measured.
%   sensorID = 2: radar objects, the measurement is [x;vx;y;vy].
% The state depends on the motion model used (here, constant velocity):
%   Constant velocity state = [x;vx;y;vy]
switch sensorID
    case 1 % vision data, where vy = 0
        H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 0];
        meas = H*state(:);
    case 2 % radar data, full state measured
        H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
        meas = H*state(:);
end
end
```
However, if I use a linear Kalman filter ('trackingKF') with the filter option 'MotionModel', 'Custom', the 'MeasurementModel' property does not accept function handles, so I cannot call the same bespoke measurement function with sensor-dependent switching. Is there any way to implement the switching measurement model as in 'trackingEKF'?
Also, if I use an extended Kalman filter ('trackingEKF') with a linear dynamic model and a linear sensor model, both in a Cartesian coordinate frame (as in the MATLAB example “Forward Collision Warning Using Sensor Fusion”), where the raw radar and camera data are provided in the Cartesian frame, would this be expected to perform like a linear Kalman filter?
##### 1 Comment
William Campbell on 21 May 2021
Brilliant - thanks for your comments and confirmation of my earlier points.
Best Regards,
William.


### Accepted Answer

Elad Kivelevitch on 20 May 2021
Hi,
The linear Kalman filter, trackingKF, uses only a single measurement matrix, so the way to use two different measurement models is to set the appropriate one yourself before every call to correct. In other words:
```matlab
% First set the measurement model based on sensor ID:
switch sensorID
    case 1
        KF.MeasurementModel = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 0];
    case 2
        KF.MeasurementModel = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1];
end
% Then, call correct with the measurement from the right sensor:
correct(KF, z);
```
Note: this assumes you want to use the filter directly. If you want to use the filter inside a tracker, that is not possible, because the tracker acts on all the detections at the same time and you have no access midway through the step to modify the filter properties.
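The switching described above can be sketched as a complete predict/correct loop over a batch of detections. This is an illustrative sketch, not code from the thread: the variable names (`detections`, the 0.1 s timestep) and the use of `SensorIndex` to tell the sensors apart are assumptions for the example.

```matlab
% Illustrative sketch: driving a trackingKF directly (outside a tracker),
% switching MeasurementModel per detection. Timestep and names are assumed.
T = 0.1; % assumed fixed timestep baked into the custom motion model
KF = trackingKF('MotionModel', 'Custom', ...
    'StateTransitionModel', [1 T 0 0; 0 1 0 0; 0 0 1 T; 0 0 0 1], ...
    'State', zeros(4,1));

Hvision = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 0]; % sensorID = 1, vy not measured
Hradar  = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1]; % sensorID = 2, full state

for k = 1:numel(detections)
    predict(KF);
    % Pick the measurement matrix before each correct call:
    if detections{k}.SensorIndex == 1
        KF.MeasurementModel = Hvision;
    else
        KF.MeasurementModel = Hradar;
    end
    correct(KF, detections{k}.Measurement);
end
```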
---
When you use trackingEKF with linear functions, you will get results similar to trackingKF with a 'Custom' motion model, assuming that the dt baked into the StateTransitionModel is the same as the dt you use in the predict call to trackingEKF. That is because the custom motion model in trackingKF assumes a time-invariant state transition model (and process noise).
Of course, if you are using trackingKF directly (not within a tracker), you can change the StateTransitionModel before each call to predict(KF). It looks like this:
```matlab
% T is the timestep for the predict, which is known to you.
% StateTransitionModel follows the 2-D constant velocity convention: [x;vx;y;vy].
KF.StateTransitionModel = [1 T 0 0; 0 1 0 0; 0 0 1 T; 0 0 0 1];
predict(KF);
```
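The equivalence claimed above can be sketched directly: configure a trackingEKF with linear state transition and measurement functions whose Jacobians are the constant matrices F and H, and a trackingKF with the same matrices as its custom model. This is an illustrative comparison, not from the thread; the numeric values are assumptions.

```matlab
% Sketch: a trackingEKF with linear functions next to an equivalent trackingKF.
% With matching initial states, covariances, and noise settings, the two
% filters apply the same linear update equations.
T = 0.1;
F = [1 T 0 0; 0 1 0 0; 0 0 1 T; 0 0 0 1]; % constant-velocity transition
H = [1 0 0 0; 0 0 1 0; 0 1 0 0; 0 0 0 1]; % full-state measurement

KF = trackingKF('MotionModel', 'Custom', ...
    'StateTransitionModel', F, 'MeasurementModel', H, 'State', zeros(4,1));

EKF = trackingEKF('State', zeros(4,1), ...
    'StateTransitionFcn',         @(x) F*x, ...
    'MeasurementFcn',             @(x) H*x, ...
    'StateTransitionJacobianFcn', @(x) F, ...
    'MeasurementJacobianFcn',     @(x) H);

z = [10; 1; 5; 0]; % an example measurement
predict(KF);  correct(KF,  z);
predict(EKF); correct(EKF, z);
% Because the Jacobians are exactly F and H, the EKF linearization is exact
% and the two corrected states should agree (given identical noise settings).
```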
---
##### 2 Comments
William Campbell on 20 May 2021
Thanks for your comments - I am going to fuse mono-camera data with radar data:
The mono-camera measurement vector is [x;y], and possibly only [y], since the range accuracy of mono vision is assumed to be poor. I was thinking of entering the vision measurement vector as [0;0;y;0] with very high covariance values for the x, vx, and vy states (say 1000) to bias these states toward the radar data within the filter. I am hoping this will allow me to use the EKF within a tracker, since I would like to try different trackers (e.g. JPDA) and also a constant acceleration motion model.
Am I correct in assuming mono vision is not accurate in range estimation and should only be used for lateral position data (in 2-D form)? I was hoping to improve the radar's lateral accuracy with mono vision, on account of the radar's lower azimuth accuracy.
Elad Kivelevitch on 20 May 2021
Hi William,
Everything you wrote makes sense.
If you want to use the tracker, I suggest doing it with trackingEKF.
The measurement model is as you defined it and the covariance is as you said - use a large value for the dimensions that are not measured.
Unfortunately, due to limitations imposed by code generation, we have to lock the size of the measurement model. That's why you have to have the 4-element vector in the vision measurement even though your range rate measurement is not accurate. You can use a high value in the corresponding covariance element to mitigate that. Depending on the problem and the rest of the measurement uncertainty values, I recommend using a value that is about 2-3 orders of magnitude higher than the other uncertainty values. That should work fine.
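The padding-plus-inflated-covariance idea above can be sketched when building the vision detection. This is an illustrative example, not from the thread: `yMeasured`, `timestamp`, and all numeric values are assumptions, and the scale factor follows the "2-3 orders of magnitude" guidance in uncertainty (standard-deviation) terms.

```matlab
% Sketch: pad the vision measurement to the full 4-element vector [x;vx;y;vy]
% and inflate the covariance of the unmeasured dimensions (assumed values).
sigmaY = 0.5;                  % assumed uncertainty of the lateral measurement
sigmaBig = 1e2 * sigmaY;       % ~2 orders of magnitude larger (per the advice)

zVision = [0; 0; yMeasured; 0];                          % only y is trusted
Rvision = diag([sigmaBig^2, sigmaBig^2, sigmaY^2, sigmaBig^2]);

visionDet = objectDetection(timestamp, zVision, ...
    'MeasurementNoise', Rvision, 'SensorIndex', 1);
```

The large variances on x, vx, and vy mean the filter gain for those components is driven almost entirely by the radar updates, which is the biasing effect described above.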

