How to use filters other than the simple Kalman filter in the Motion-Based Multiple Object Tracking example

I have found the Motion-Based Multiple Object Tracking Example very useful in various problems. The example states at the end: "The likelihood of tracking errors can be reduced by using a more complex motion model, such as constant acceleration, or by using multiple Kalman filters for every object. Also, you can incorporate other cues for associating detections over time, such as size, shape, and color. "
I would like to try different filters, such as those listed in the MATLAB documentation as usable with the predict and correct functions:
Filter for object tracking, specified as one of these objects:
How would this be incorporated here? Would it involve vision.KalmanFilter? How?
Are there any examples available of these, or of the other cues for "associating detections over time, such as size, shape, and color"? I searched the community and could not find any.
Thank you


Elad Kivelevitch, 28 April 2022
Hi Peter,
Thanks for the question.
The example that you refer to uses the vision.KalmanFilter object, which is a linear Kalman filter that assumes that both the motion and the measurements are modeled as linear models. Furthermore, the example uses some helper functions to associate new measurements with existing tracked objects, initialize new tracked objects, update existing ones, and delete ones that are no longer present.
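To make the starting point concrete, here is a minimal sketch of how the linear filter in that example is typically created with configureKalmanFilter and stepped with predict and correct. The centroid coordinates and noise values below are illustrative assumptions, not the example's exact numbers.

```matlab
% Illustrative sketch: the linear Kalman filter workflow the example uses.
centroid = [100, 150];                    % first detected object centroid (assumed)
kalmanFilter = configureKalmanFilter('ConstantVelocity', centroid, ...
    [200, 50], [100, 25], 100);           % initial error, motion noise, measurement noise
predictedLocation = predict(kalmanFilter);             % predict the next centroid
correctedLocation = correct(kalmanFilter, [103, 152]); % correct with a new detection
```

The helper functions in the example wrap exactly these two calls (plus the association step) for every tracked object.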
There are two ways to move forward from this example to other filters and models. The first way is to still use the same helper functions, and replace the vision.KalmanFilter with one of the filters you listed. As you correctly noted, for a filter to be compatible, it must provide a few object functions (methods):
  • predict - to predict the object state from one time step to the next.
  • correct - to correct the object state with a new measurement.
  • distance - to aid in computing the association cost that is used in the association stage.
All the filters above support these methods. The easiest one to convert to, and the one I recommend starting with, is trackingKF, which is very similar to vision.KalmanFilter. You will need to define a bounding box model; for an example of how to do that, please see:
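As a rough sketch of the drop-in replacement, a trackingKF with a 2-D constant-velocity model exposes the same predict/correct/distance methods the helper functions rely on. The state values and detection below are assumed for illustration:

```matlab
% Hedged sketch: replacing vision.KalmanFilter with trackingKF,
% tracking the bounding box centroid under a 2-D constant-velocity model.
filter = trackingKF('MotionModel', '2D Constant Velocity', ...
    'State', [100; 0; 150; 0]);        % [x; vx; y; vy], centroid at (100, 150)
predState = predict(filter, 1);        % predict ahead by dt = 1 frame
detection = [103, 152];                % newly measured centroid (assumed)
d = distance(filter, detection);       % association cost for this pairing
corrState = correct(filter, detection);% correct with the new measurement
```

Note that trackingKF stores the state in interleaved position/velocity order ([x; vx; y; vy]), which differs from some vision.KalmanFilter configurations, so the helper functions that read the state may need a small adjustment.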
After doing that, if you want to try using EKF or UKF, you will need to define the appropriate motion and measurement model functions. You can look at the constvel and cvmeas functions for inspiration. Then simply use the filter with these models by defining the StateTransitionFcn and MeasurementFcn accordingly.
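For example, a trackingEKF can be constructed directly from the built-in constant-velocity models before you swap in your own nonlinear bounding-box functions. The state and measurement values here are assumptions for illustration:

```matlab
% Hedged sketch: an EKF using the built-in constant-velocity models.
% Replace @constvel / @cvmeas with your own function handles for a
% nonlinear bounding-box model.
ekf = trackingEKF('StateTransitionFcn', @constvel, ...
    'MeasurementFcn', @cvmeas, ...
    'State', [100; 0; 150; 0; 0; 0]);  % [x; vx; y; vy; z; vz] (assumed)
predict(ekf, 1);                       % predict ahead by dt = 1
correct(ekf, [103; 152; 0]);           % cvmeas measures [x; y; z] position
```

trackingUKF takes the same StateTransitionFcn and MeasurementFcn properties, so switching between the two is a one-line change.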
You can stop here, or you can decide to move to the next step.
The next step could be replacing all the tracking helper functions with a tracker. Once again, I recommend looking at the example to see how to set up a tracker and how to run it. You can use any of the following trackers: trackerGNN, trackerJPDA, and trackerTOMHT, with any of the filters listed in the question. To choose a filter, simply define the FilterInitializationFcn. You may want to look at the FilterInitializationFcn used in the example I linked to above.
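A minimal sketch of that replacement, assuming the built-in initcvekf initialization function and illustrative threshold values (your confirmation/deletion settings would depend on your frame rate and detector):

```matlab
% Hedged sketch: replacing the hand-rolled track management with trackerGNN.
% initcvekf builds a constant-velocity trackingEKF from an objectDetection.
tracker = trackerGNN('FilterInitializationFcn', @initcvekf, ...
    'ConfirmationThreshold', [2 3], ...  % confirm after 2 hits in 3 updates (assumed)
    'DeletionThreshold', [5 5]);         % delete after 5 consecutive misses (assumed)
detection = objectDetection(0, [100; 150; 0]);  % time 0, measured position (assumed)
tracks = tracker({detection}, 0);               % step the tracker at t = 0
```

The tracker then handles association, track initialization, confirmation, and deletion internally, replacing the example's helper functions in one object.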
Finally, to learn more about tracking and trackers, please look at the documentation for the Sensor Fusion and Tracking Toolbox.
Good luck
  4 comments
Peter, 2 May 2022
Sorry to bother you again. I have looked at the examples you listed, and I realize that I do not have enough fundamental background to figure this out. Could you recommend books, tutorials, etc.? Although I did 3 years of math in college, I probably remember about 1.
Thank you,


More Answers (1)

Peter, 29 April 2022
Thank you again. This will give me a lot of things to try out.
