A journal of IEEE and CAA, publishing high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 12 Issue 12
Dec. 2025

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 19.2, Top 1 (SCI Q1)
  • CiteScore: 28.2, Top 1% (Q1)
  • Google Scholar h5-index: 95, Top 5
Citation: X. Hao, Y. Xia, H. Yang, and Z. Zuo, “Target tracking by cameras and millimeter-wave radars: A confidence information fusion method,” IEEE/CAA J. Autom. Sinica, vol. 12, no. 12, pp. 2486–2498, Dec. 2025. doi: 10.1109/JAS.2025.125405

Target Tracking by Cameras and Millimeter-Wave Radars: A Confidence Information Fusion Method

doi: 10.1109/JAS.2025.125405
Funds: This work was supported in part by the National Natural Science Foundation of China (62303061, 62403353, U25A20460), the Beijing Natural Science Foundation Haidian Original Innovation Joint Fund Project (L252035), and the National Key Laboratory of Science and Technology on Space-Born Intelligent Information Processing (TJ-02-22-03).
Abstract: This paper addresses the confidence fusion problem of a camera and a millimeter-wave (MMW) radar for target tracking in intelligent driving systems. Local camera and radar estimators are designed by analyzing the measurement characteristics of each sensor. The radar estimates are aligned to the camera sampling times, and the Kuhn-Munkres method is used to match the local camera and radar estimates for fusion. Next, to exploit the camera's low false-detection rate and the radar's low miss-detection rate, mass functions are introduced to model the detection performance of the two sensors. Based on these mass functions and Dempster-Shafer (D-S) evidence theory, confidence fusion is performed sequentially to decide whether each target exists. A weighted maximum likelihood fusion estimator is then designed for the matched targets based on the a priori positioning accuracy of the local camera and radar estimates. Finally, experimental results on road vehicle tracking show that the proposed confidence fusion method expands the detection range and significantly reduces false targets.
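For readers who want a concrete picture of the matching step, here is a minimal Python sketch, not the paper's implementation: it assumes 2-D position estimates, a Euclidean cost, and SciPy's linear_sum_assignment as the Kuhn-Munkres solver; the gate threshold and sample coordinates are hypothetical.

    import numpy as np
    from scipy.optimize import linear_sum_assignment

    def match_estimates(cam_xy, radar_xy, gate=3.0):
        """Pair camera (N x 2) and time-aligned radar (M x 2) estimates.

        Runs the Kuhn-Munkres (Hungarian) algorithm on pairwise Euclidean
        distances and keeps only pairs that fall inside the gate.
        """
        cost = np.linalg.norm(cam_xy[:, None, :] - radar_xy[None, :, :], axis=2)
        rows, cols = linear_sum_assignment(cost)  # optimal one-to-one matching
        return [(int(i), int(j)) for i, j in zip(rows, cols) if cost[i, j] <= gate]

    cam = np.array([[10.2, 1.1], [25.0, -2.3]])   # camera estimates (m), hypothetical
    rad = np.array([[10.5, 1.0], [60.0, 0.4]])    # aligned radar estimates (m)
    print(match_estimates(cam, rad))              # -> [(0, 0)]; the rest stay unmatched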
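The sequential existence test can likewise be illustrated with Dempster's rule over the frame {exists (E), absent (N)} plus the uncertain set (EN); the mass values below are illustrative stand-ins, not the detection-performance models identified in the paper.

    def dempster(m1, m2):
        """Combine two mass functions given as dicts with keys 'E', 'N', 'EN'."""
        # Conflict mass: one sensor supports existence, the other absence.
        K = m1['E'] * m2['N'] + m1['N'] * m2['E']
        norm = 1.0 - K
        fused = {
            'E': (m1['E'] * m2['E'] + m1['E'] * m2['EN'] + m1['EN'] * m2['E']) / norm,
            'N': (m1['N'] * m2['N'] + m1['N'] * m2['EN'] + m1['EN'] * m2['N']) / norm,
        }
        fused['EN'] = m1['EN'] * m2['EN'] / norm  # remaining uncertainty
        return fused

    m_cam = {'E': 0.60, 'N': 0.10, 'EN': 0.30}    # camera: few false detections
    m_rad = {'E': 0.70, 'N': 0.05, 'EN': 0.25}    # radar: few missed detections
    print(dempster(m_cam, m_rad))                 # fused mass on E is about 0.87

A target would then be declared present once the fused mass on E clears a decision threshold.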
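Finally, the weighted maximum likelihood fusion of a matched pair can be sketched as inverse-covariance weighting; the paper derives its weights from the a priori positioning accuracy of each sensor, so the covariances below (camera accurate laterally, radar accurate in range) are assumptions for illustration.

    import numpy as np

    def ml_fuse(x_cam, P_cam, x_rad, P_rad):
        """Fuse two Gaussian position estimates by inverse-covariance weighting."""
        W_cam, W_rad = np.linalg.inv(P_cam), np.linalg.inv(P_rad)
        P = np.linalg.inv(W_cam + W_rad)           # fused covariance
        x = P @ (W_cam @ x_cam + W_rad @ x_rad)    # weighted ML estimate
        return x, P

    x_cam = np.array([10.2, 1.1]); P_cam = np.diag([0.80, 0.05])  # good lateral accuracy
    x_rad = np.array([10.5, 1.0]); P_rad = np.diag([0.05, 0.60])  # good range accuracy
    x, P = ml_fuse(x_cam, P_cam, x_rad, P_rad)
    print(x)  # ~[10.48, 1.09]: follows the radar in range, the camera laterally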

     


