A journal of IEEE and CAA that publishes high-quality papers in English on original theoretical/experimental research and development in all areas of automation
Volume 11 Issue 5
May 2024

IEEE/CAA Journal of Automatica Sinica

  • JCR Impact Factor: 11.8, Top 4% (SCI Q1)
  • CiteScore: 17.6, Top 3% (Q1)
  • Google Scholar h5-index: 77, Top 5
Citation: D. Xiong, D. Zhang, Y. Chu, Y. Zhao, and X. Zhao, “Intuitive human-robot-environment interaction with EMG signals: A review,” IEEE/CAA J. Autom. Sinica, vol. 11, no. 5, pp. 1075–1091, May 2024. doi: 10.1109/JAS.2024.124329

Intuitive Human-Robot-Environment Interaction With EMG Signals: A Review

doi: 10.1109/JAS.2024.124329
Funds: This work was supported by the National Key Research and Development Program of China (2022YFF1202500, 2022YFF1202502, 2022YFB4703200, 2023YFB4704700, 2023YFB4704702), the National Natural Science Foundation of China (U22A2067, U20A20197, 61773369, 61903360, 92048302, 62203430), the Self-Planned Project of the State Key Laboratory of Robotics (2023-Z05), and a project funded by the China Postdoctoral Science Foundation (2022M723312)
Abstract
Electromyography (EMG) signals have long been explored for intuitive interaction with human-centered robots, yet a gap remains between scientific research and real-life applications. Previous studies have focused mainly on EMG decoding algorithms and have seldom considered the dynamic relationship between the human, the robot, and an uncertain environment in real-life scenarios. To fill this gap, this paper presents a comprehensive review of EMG-based techniques for human-robot-environment interaction (HREI) systems. The general processing framework is summarized, and three interaction paradigms are introduced: direct control, sensory feedback, and partial autonomous control, with EMG-based intention decoding treated as a module within each paradigm. Five key issues in this field are discussed: precision, stability, user attention, compliance, and environmental awareness. Several important directions, including EMG decomposition, robust algorithms, HREI datasets, proprioception feedback, reinforcement learning, and embodied intelligence, are proposed to pave the way for future research. To the best of our knowledge, this is the first review of EMG-based methods from the perspective of the HREI system. It provides a novel and broader view of how to improve the practicability of current myoelectric interaction systems by considering factors in human-robot interaction, robot-environment interaction, and state perception through human sensations, which previous reviews have not addressed.
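
As context for the direct-control paradigm mentioned in the abstract, the following is a minimal, illustrative sketch of a conventional pattern-recognition myoelectric pipeline: overlapping analysis windows, Hudgins-style time-domain features, a linear discriminant classifier, and a fixed mapping from predicted class to robot command. The window length, feature set, classifier choice, and command table are assumptions made for illustration, not the configuration of any particular study cited below.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis


def td_features(window):
    """Hudgins-style time-domain features per channel.
    window: (n_samples, n_channels) raw EMG segment."""
    mav = np.mean(np.abs(window), axis=0)                        # mean absolute value
    wl = np.sum(np.abs(np.diff(window, axis=0)), axis=0)         # waveform length
    zc = np.sum(np.diff(np.sign(window), axis=0) != 0, axis=0)   # zero crossings
    rms = np.sqrt(np.mean(window ** 2, axis=0))                  # root mean square
    return np.concatenate([mav, wl, zc, rms])


def segment(emg, win=200, step=50):
    """Overlapping analysis windows, e.g. 200-sample windows with 50-sample increments."""
    return [emg[s:s + win] for s in range(0, len(emg) - win + 1, step)]


# Hypothetical class-to-command table for a simple hand/wrist device.
COMMANDS = {0: "rest", 1: "hand_open", 2: "hand_close", 3: "wrist_flex", 4: "wrist_extend"}


def train(recordings):
    """recordings: list of (emg, label) calibration trials, one label per trial."""
    X = np.array([td_features(w) for emg, _ in recordings for w in segment(emg)])
    y = np.array([label for emg, label in recordings for _ in segment(emg)])
    return LinearDiscriminantAnalysis().fit(X, y)


def decode_stream(clf, emg_buffer):
    """Decode each new analysis window and yield the mapped robot command."""
    for window in segment(emg_buffer):
        label = int(clf.predict(td_features(window)[None, :])[0])
        yield COMMANDS[label]   # forwarded to the robot's low-level controller
```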

References

    [1]
    D. Farina and A. Holobar, “Characterization of human motor units from surface EMG decomposition,” Proc. IEEE, vol. 104, no. 2, pp. 353–373, Feb. 2016. doi: 10.1109/JPROC.2015.2498665
    [2]
    M. Chen and P. Zhou, “A novel framework based on FastICA for high density surface EMG decomposition,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 24, no. 1, pp. 117–127, Jan. 2016. doi: 10.1109/TNSRE.2015.2412038
    [3]
    D. Xiong, D. Zhang, X. Zhao, and Y. Zhao, “Deep learning for EMG-based human-machine interaction: A review,” IEEE/CAA J. Autom. Sinica, vol. 8, no. 3, pp. 512–533, Mar. 2021. doi: 10.1109/JAS.2021.1003865
    [4]
    Z. Zhu, J. Li, W. J. Boyd, C. Martinez-Luna, C. Dai, H. Wang, H. Wang, X. Huang, T. R. Farrell, and E. A. Clancy, “Myoelectric control performance of two degree of freedom hand-wrist prosthesis by able-bodied and limb-absent subjects,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 30, pp. 893–904, Mar. 2022. doi: 10.1109/TNSRE.2022.3163149
    [5]
    H. E. Williams, A. W. Shehata, M. R. Dawson, E. Scheme, J. S. Hebert, and P. M. Pilarski, “Recurrent convolutional neural networks as an approach to position-aware myoelectric prosthesis control,” IEEE Trans. Biomed. Eng., vol. 69, no. 7, pp. 2243–2255, Jul. 2022. doi: 10.1109/TBME.2022.3140269
    [6]
    K. Z. Zhuang, N. Sommer, V. Mendez, S. Aryan, E. Formento, E. D’Anna, F. Artoni, F. Petrini, G. Granata, G. Cannaviello, W. Raffoul, A. Billard, and S. Micera, “Shared human-robot proportional control of a dexterous myoelectric prosthesis,” Nat. Mach. Intell., vol. 1, pp. 400–411, Sept. 2019. doi: 10.1038/s42256-019-0093-5
    [7]
    Y. Zhuang, Y. Leng, J. Zhou, R. Song, L. Li, and S. W. Su, “Voluntary control of an ankle joint exoskeleton by able-bodied individuals and stroke survivors using EMG-based admittance control scheme,” IEEE Trans. Biomed. Eng., vol. 68, no. 2, pp. 695–705, Feb. 2021. doi: 10.1109/TBME.2020.3012296
    [8]
    Y. Zhuang, S. Yao, C. Ma, and R. Song, “Admittance control based on EMG-driven musculoskeletal model improves the human-robot synchronization,” IEEE Trans. Ind. Inf., vol. 15, no. 2, pp. 1211–1218, Feb. 2019. doi: 10.1109/TII.2018.2875729
    [9]
    G. Durandau, W. F. Rampeltshammer, H. Van Der Kooij, and M. Sartori, “Neuromechanical model-based adaptive control of bilateral ankle exoskeletons: Biological joint torque and electromyogram reduction across walking conditions,” IEEE Trans. Robot., vol. 38, no. 3, pp. 1380–1394, Jun. 2022. doi: 10.1109/TRO.2022.3170239
    [10]
    L. Zhang, Z. Li, Y. Hu, C. Smith, E. M. G. Farewik, and R. Wang, “Ankle joint torque estimation using an EMG-driven neuromusculoskeletal model and an artificial neural network model,” IEEE Trans. Autom. Sci. Eng., vol. 18, no. 2, pp. 564–573, Apr. 2021. doi: 10.1109/TASE.2020.3033664
    [11]
    L. Bi, A. G. Feleke, and C. T. Guan, “A review on EMG-based motor intention prediction of continuous human upper limb motion for human-robot collaboration,” Biomed. Signal Process. Control, vol. 51, pp. 113–127, May 2019. doi: 10.1016/j.bspc.2019.02.011
    [12]
    M. Simão, N. Mendes, O. Gibaru, and P. Neto, “A review on electromyography decoding and pattern recognition for human-machine interaction,” IEEE Access, vol. 7, pp. 39564–39582, Mar. 2019. doi: 10.1109/ACCESS.2019.2906584
    [13]
    H. Xu and A. Xiong, “Advances and disturbances in sEMG-based intentions and movements recognition: A review,” IEEE Sensors J., vol. 21, no. 12, pp. 13019–13028, Jun. 2021. doi: 10.1109/JSEN.2021.3068521
    [14]
    T. Bao, S. Q. Xie, P. Yang, P. Zhou, and Z.-Q. Zhang, “Toward robust, adaptive and reliable upper-limb motion estimation using machine learning and deep learning–A survey in myoelectric control,” IEEE J. Biomed. Health Inf., vol. 26, no. 8, pp. 3822–3835, Aug. 2022. doi: 10.1109/JBHI.2022.3159792
    [15]
    N. Jiang, C. Chen, J. He, J. Meng, L. Pan, S. Su, and X. Zhu, “Bio-robotics research for non-invasive myoelectric neural interfaces for upper-limb prosthetic control: A 10-year perspective review,” Natl. Sci. Rev., vol. 10, no. 5, p. nwad048, Feb. 2023. doi: 10.1093/nsr/nwad048
    [16]
    O. W. Samuel, M. G. Asogbon, Y. Geng, A. H. Al-Timemy, S. Pirbhulal, N. Ji, S. Chen, P. Fang, and G. Li, “Intelligent EMG pattern recognition control method for upper-limb multifunctional prostheses: Advances, current challenges, and future prospects,” IEEE Access, vol. 7, pp. 10150–10165, Jan. 2019. doi: 10.1109/ACCESS.2019.2891350
    [17]
    D. Kim, K. Koh, G. Oppizzi, R. Baghi, L.-C. Lo, C. Zhang, and L.-Q. Zhang, “EMG-based simultaneous estimations of joint angle and torque during hand interactions with environments,” IEEE Trans. Biomed. Eng., 2021. doi: 10.1109/TBME.2021.3134204
    [18]
    J. M. Hahne, M. Markovic, and D. Farina, “User adaptation in myoelectric man-machine interfaces,” Sci. Rep., vol. 7, no. 1, p. 4437, Jun. 2017. doi: 10.1038/s41598-017-04255-x
    [19]
    R. E. Johnson, K. P. Kording, L. J. Hargrove, and J. W. Sensinger, “Does EMG control lead to distinct motor adaptation?” Front. Neurosci., vol. 8, p. 302, Sept. 2014.
    [20]
    J. Vogel, A. Hagengruber, M. Iskandar, G. Quere, U. Leipscher, S. Bustamante, A. Dietrich, H. Höppner, D. Leidner, and A. Albu-Schäffer, “EDAN: An EMG-controlled daily assistant to help people with physical disabilities,” in Proc. IEEE/RSJ Int. Conf. Intelligent Robots and Systems, Las Vegas, NV, USA, 2020, pp. 4183–4190.
    [21]
    S. Došen, C. Cipriani, M. Kostić, M. Controzzi, M. C. Carrozza, and D. B. Popović, “Cognitive vision system for control of dexterous prosthetic hands: Experimental evaluation,” J. NeuroEng. Rehabil., vol. 7, no. 1, p. 42, Aug. 2010. doi: 10.1186/1743-0003-7-42
    [22]
    M. A. Powell, R. R. Kaliki, and N. V. Thakor, “User training for pattern recognition-based myoelectric prostheses: Improving phantom limb movement consistency and distinguishability,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 22, no. 3, pp. 522–532, May 2014. doi: 10.1109/TNSRE.2013.2279737
    [23]
    Y. Su, M. H. Fisher, A. Wolczowski, G. D. Bell, D. J. Burn, and R. X. Gao, “Towards an EMG-controlled prosthetic hand using a 3-D electromagnetic positioning system,” IEEE Trans. Instrum. Meas., vol. 56, no. 1, pp. 178–186, Feb. 2007. doi: 10.1109/TIM.2006.887669
    [24]
    M. Sartori, D. G. Llyod, and D. Farina, “Neural data-driven musculoskeletal modeling for personalized neurorehabilitation technologies,” IEEE Trans. Biomed. Eng., vol. 63, no. 5, pp. 879–893, May 2016. doi: 10.1109/TBME.2016.2538296
    [25]
    D. Farina and F. Negro, “Accessing the neural drive to muscle and translation to neurorehabilitation technologies,” IEEE Rev. Biomed. Eng., vol. 5, pp. 3–14, Jan. 2012. doi: 10.1109/RBME.2012.2183586
    [26]
    Y. Yu, C. Chen, X. Sheng, and X. Zhu, “Wrist torque estimation via electromyographic motor unit decomposition and image reconstruction,” IEEE J. Biomed. Health Inform., vol. 25, no. 7, pp. 2557–2566, Jul. 2021. doi: 10.1109/JBHI.2020.3041861
    [27]
    D. Xiong, D. Zhang, X. Zhao, Y. Chu, and Y. Zhao, “Synergy-based neural interface for human gait tracking with deep learning,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 29, pp. 2271–2280, Oct. 2021. doi: 10.1109/TNSRE.2021.3123630
    [28]
    S. K. Khare and U. R. Acharya, “Adazd-Net: Automated adaptive and explainable Alzheimer’s disease detection system using EEG signals,” Knowl. Based Syst., vol. 278, p. 110858, Oct. 2023. doi: 10.1016/j.knosys.2023.110858
    [29]
    S. K. Khare, V. M. Gadre, and U. R. Acharya, “ECGPsychNet: An optimized hybrid ensemble model for automatic detection of psychiatric disorders using ECG signals,” Physiol. Meas., vol. 44, no. 11, p. 115004, Nov. 2023. doi: 10.1088/1361-6579/ad00ff
    [30]
    S. K. Khare, V. Bajaj, N. B. Gaikwad, and G. R. Sinha, “Ensemble wavelet decomposition-based detection of mental states using electroencephalography signals,” Sensors, vol. 23, no. 18, p. 7860, Sept. 2023. doi: 10.3390/s23187860
    [31]
    S. Pancholi and A. M. Joshi, “Time derivative moments based feature extraction approach for recognition of upper limb motions using EMG,” IEEE Sensors Lett., vol. 3, no. 4, p. 6000804, Apr. 2019.
    [32]
    M. F. Wahid, R. Tafreshi, and R. Langari, “A multi-window majority voting strategy to improve hand gesture recognition accuracies using electromyography signal,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 2, pp. 427–436, Feb. 2020. doi: 10.1109/TNSRE.2019.2961706
    [33]
    F. Duan, L. Dai, W. Chang, Z. Chen, C. Zhu, and W. Li, “sEMG-based identification of hand motion commands using wavelet neural network combined with discrete wavelet transform,” IEEE Trans. Ind. Electron., vol. 63, no. 3, pp. 1923–1934, Mar. 2016. doi: 10.1109/TIE.2015.2497212
    [34]
    R. N. Khushaba, A. H. Al-Timemy, A. Al-Ani, and A. Al-Jumaily, “A framework of temporal-spatial descriptors-based feature extraction for improved myoelectric pattern recognition,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 25, no. 10, pp. 1821–1831, Oct. 2017. doi: 10.1109/TNSRE.2017.2687520
    [35]
    X. Zhai, B. Jelfs, R. H. M. Chan, and C. Tin, “Self-recalibrating surface EMG pattern recognition for neuroprosthesis control based on convolutional neural network,” Front. Neurosci., vol. 11, p. 379, Jul. 2017. doi: 10.3389/fnins.2017.00379
    [36]
    M. Simão, P. Neto, and O. Gibaru, “EMG-based online classification of gestures with recurrent neural networks,” Pattern Recogn. Lett., vol. 128, pp. 45–51, Dec. 2019. doi: 10.1016/j.patrec.2019.07.021
    [37]
    W. Sun, H. Liu, R. Tang, Y. Lang, J. He, and Q. Huang, “sEMG-based hand-gesture classification using a generative flow model,” Sensors, vol. 19, no. 8, p. 1952, Apr. 2019. doi: 10.3390/s19081952
    [38]
    A. Ameri, M. A. Akhaee, E. Scheme, and K. Englehart, “Regression convolutional neural network for improved simultaneous EMG control,” J. Neural Eng., vol. 16, no. 3, p. 036015, Apr. 2019. doi: 10.1088/1741-2552/ab0e2e
    [39]
    Y. Zhao, Z. Zhang, Z. Li, Z. Yang, A. A. Dehghani-Sanij, and S. Xie, “An EMG-driven musculoskeletal model for estimating continuous wrist motion,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 12, pp. 3113–3120, Dec. 2020. doi: 10.1109/TNSRE.2020.3038051
    [40]
    E. A. Clancy, L. Liu, P. Liu, and D. V. Z. Moyer, “Identification of constant-posture EMG-torque relationship about the elbow using nonlinear dynamic models,” IEEE Trans. Biomed. Eng., vol. 59, no. 1, pp. 205–212, Jan. 2012. doi: 10.1109/TBME.2011.2170423
    [41]
    Q. Ding, J. Han, and X. Zhao, “Continuous estimation of human multi-joint angles from sEMG using a state-space model,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 25, no. 9, pp. 1518–1528, Sept. 2017. doi: 10.1109/TNSRE.2016.2639527
    [42]
    N. E. Krausz, D. Lamotte, I. Batzianoulis, L. J. Hargrove, S. Micera, and A. Billard, “Intent prediction based on biomechanical coordination of EMG and vision-filtered gaze for end-point control of an arm prosthesis,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 6, pp. 1471–1480, Jun. 2020. doi: 10.1109/TNSRE.2020.2992885
    [43]
    C. Shi, D. Yang, J. Zhao, and H. Liu, “Computer vision-based grasp pattern recognition with application to myoelectric control of dexterous hand prosthesis,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 9, pp. 2090–2099, Sept. 2020. doi: 10.1109/TNSRE.2020.3007625
    [44]
    C. Shi, L. Qi, D. Yang, J. Zhao, and H. Liu, “A novel method of combining computer vision, eye-tracking, EMG, and IMU to control dexterous prosthetic hand,” in Proc. IEEE Int. Conf. Robotics and Biomimetics, Dali, China, 2019, pp. 2614–2618.
    [45]
    P. G. S. Alva, S. Muceli, S. Farokh Atashzar, L. William, and D. Farina, “Wearable multichannel haptic device for encoding proprioception in the upper limb,” J. Neural Eng., vol. 17, no. 5, p. 056035, Oct. 2020. doi: 10.1088/1741-2552/aba6da
    [46]
    M. D’Alonzo, F. Clemente, and C. Cipriani, “Vibrotactile stimulation promotes embodiment of an alien hand in amputees with phantom sensations,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 23, no. 3, pp. 450–457, May 2015. doi: 10.1109/TNSRE.2014.2337952
    [47]
    M. A. Schweisfurth, M. Markovic, S. Dosen, F. Teich, B. Graimann, and D. Farina, “Electrotactile EMG feedback improves the control of prosthesis grasping force,” J. Neural Eng., vol. 13, no. 5, p. 056010, 2016. doi: 10.1088/1741-2560/13/5/056010
    [48]
    J. Gonzalez, H. Soma, M. Sekine, and W. Yu, “Psycho-physiological assessment of a prosthetic hand sensory feedback system based on an auditory display: A preliminary study,” J. Neuroeng. Rehab., vol. 9, no. 1, p. 33, Jun. 2012. doi: 10.1186/1743-0003-9-33
    [49]
    M. A. Wilke, C. Hartmann, F. Schimpf, D. Farina, and S. Dosen, “The interaction between feedback type and learning in routine grasping with myoelectric prostheses,” IEEE Trans. Haptics, vol. 13, no. 3, pp. 645–654, Jul.–Sept. 2020. doi: 10.1109/TOH.2019.2961652
    [50]
    J. D. Brown, A. Paek, M. Syed, M. K. O’Malley, P. A. Shewokis, J. L. Contreras-Vidal, A. J. Davis, and R. B. Gillespie, “Understanding the role of haptic feedback in a teleoperated/prosthetic grasp and lift task,” in Proc. World Haptics Conf., Daejeon, Korea (South), 2013, pp. 271–276.
    [51]
    C. Cipriani, F. Zaccone, S. Micera, and M. C. Carrozza, “On the shared control of an EMG-controlled prosthetic hand: Analysis of user-prosthesis interaction,” IEEE Trans. Robot., vol. 24, no. 1, pp. 170–184, Feb. 2008. doi: 10.1109/TRO.2007.910708
    [52]
    J. Mouchoux, M. A. Bravo-Cabrera, S. Dosen, A. F. Schilling, and M. Markovic, “Impact of shared control modalities on performance and usability of semi-autonomous prostheses,” Front. Neurorobot., vol. 15, p. 768619, Dec. 2021. doi: 10.3389/fnbot.2021.768619
    [53]
    R. Volkmar, S. Dosen, J. Gonzalez-Vargas, M. Baum, and M. Markovic, “Improving bimanual interaction with a prosthesis using semi-autonomous control,” J. Neuroeng. Rehabil., vol. 16, no. 1, p. 140, Nov. 2019. doi: 10.1186/s12984-019-0617-6
    [54]
    M. Markovic, S. Dosen, D. Popovic, B. Graimann, and D. Farina, “Sensor fusion and computer vision for context-aware control of a multi degree-of-freedom prosthesis,” J. Neural Eng., vol. 12, no. 6, p. 066022, Nov. 2015. doi: 10.1088/1741-2560/12/6/066022
    [55]
    F. Parietti and H. H. Asada, “Independent, voluntary control of extra robotic limbs,” in Proc. IEEE Int. Conf. Robotics and Automation, Singapore, 2017, pp. 5954–5961.
    [56]
    G. Liu, L. Wang, and J. Wang, “A novel energy-motion model for continuous sEMG decoding: From muscle energy to motor pattern,” J. Neural Eng., vol. 18, no. 1, p. 016019, Feb. 2021. doi: 10.1088/1741-2552/abbece
    [57]
    J. Li, G. Li, Z. Chen, and J. Li, “A novel EMG-based variable impedance control method for a tele-operation system under an unstructured environment,” IEEE Access, vol. 10, pp. 89509–89518, Aug. 2022. doi: 10.1109/ACCESS.2022.3200696
    [58]
    H. J. B. Witteveen, E. A. Droog, J. S. Rietman, and P. H. Veltink, “Vibro- and electrotactile user feedback on hand opening for myoelectric forearm prostheses,” IEEE Trans. Biomed. Eng., vol. 59, no. 8, pp. 2219–2226, Aug. 2012. doi: 10.1109/TBME.2012.2200678
    [59]
    J. Mouchoux, S. Carisi, S. Dosen, D. Farina, A. F. Schilling, and M. Markovic, “Artificial perception and semiautonomous control in myoelectric hand prostheses increases performance and decreases effort,” IEEE Trans. Robot., vol. 37, no. 4, pp. 1298–1312, Aug. 2021. doi: 10.1109/TRO.2020.3047013
    [60]
    D. Chen, X. Zhou, J. Li, J. He, X. Yu, L. Zhang, and W. Qi, “A muscle teleoperation system of a robotic rollator based on bilateral shared control,” IEEE Access, vol. 8, pp. 151160–151170, Aug. 2020. doi: 10.1109/ACCESS.2020.3016841
    [61]
    F. S. Botros, A. Phinyomark, and E. J. Scheme, “Electromyography-based gesture recognition: Is it time to change focus from the forearm to the wrist?” IEEE Trans. Ind. Inf., vol. 18, no. 1, pp. 174–184, Jan. 2022. doi: 10.1109/TII.2020.3041618
    [62]
    S. Pancholi and A. M. Joshi, “Improved classification scheme using fused wavelet packet transform based features for intelligent myoelectric prostheses,” IEEE Trans. Ind. Electron., vol. 67, no. 10, pp. 8517–8525, Oct. 2020. doi: 10.1109/TIE.2019.2946536
    [63]
    D. Xiong, D. Zhang, X. Zhao, Y. Chu, and Y. Zhao, “Learning non-Euclidean representations with SPD manifold for myoelectric pattern recognition,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 30, pp. 1514–1524, May 2022. doi: 10.1109/TNSRE.2022.3178384
    [64]
    W. Wei, Q. Dai, Y. Wong, Y. Hu, M. Kankanhalli, and W. Geng, “Surface-electromyography-based gesture recognition by multi-view deep learning,” IEEE Trans. Biomed. Eng., vol. 66, no. 10, pp. 2964–2973, Oct. 2019. doi: 10.1109/TBME.2019.2899222
    [65]
    M. Atzori, M. Cognolato, and H. Müller, “Deep learning with convolutional neural networks applied to electromyography data: A resource for the classification of movements for prosthetic hands,” Front. Neurorobot., vol. 10, p. 9, Sept. 2016.
    [66]
    I. J. R. Martinez, A. Mannini, F. Clemente, A. M. Sabatini, and C. Cipriani, “Grasp force estimation from the transient EMG using high-density surface recordings,” J. Neural Eng., vol. 17, no. 1, p. 016052, Feb. 2020. doi: 10.1088/1741-2552/ab673f
    [67]
    A. Ameri, E. N. Kamavuako, E. J. Scheme, K. B. Englehart, and P. A. Parker, “Support vector regression for improved real-time, simultaneous myoelectric control,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 22, no. 6, pp. 1198–1209, Nov. 2014. doi: 10.1109/TNSRE.2014.2323576
    [68]
    G. Hajian and E. Morin, “Deep multi-scale fusion of convolutional neural networks for EMG-based movement estimation,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 30, pp. 486–495, Feb. 2022. doi: 10.1109/TNSRE.2022.3153252
    [69]
    C. Caulcrick, W. Huo, E. Franco, S. Mohammed, W. Hoult, and R. Vaidyanathan, “Model predictive control for human-centred lower limb robotic assistance,” IEEE Trans. Med. Robot. Bionics, vol. 3, no. 4, pp. 980–991, Nov. 2021. doi: 10.1109/TMRB.2021.3105141
    [70]
    G. Li, Z. Li, J. Li, Y. Liu, and H. Qiao, “Muscle-synergy-based planning and neural-adaptive control for a prosthetic arm,” IEEE Trans. Artif. Intell., vol. 2, no. 5, pp. 424–436, Oct. 2021. doi: 10.1109/TAI.2021.3091038
    [71]
    J. Han, Q. Ding, A. Xiong, and X. Zhao, “A state-space EMG model for the estimation of continuous joint movements,” IEEE Trans. Ind. Electron., vol. 62, no. 7, pp. 4267–4275, Jul. 2015. doi: 10.1109/TIE.2014.2387337
    [72]
    G. G. Peña, L. J. Consoni, W. M. Dos Santos, and A. A. G. Siqueira, “Feasibility of an optimal EMG-driven adaptive impedance control applied to an active knee orthosis,” Robot. Auton. Syst., vol. 112, pp. 98–108, Feb. 2019. doi: 10.1016/j.robot.2018.11.011
    [73]
    C. Xie, Q. Yang, Y. Huang, S. W. Su, T. Xu, and R. Song, “A hybrid arm-hand rehabilitation robot with EMG-based admittance controller,” IEEE Trans. Biomed. Circuits Syst., vol. 15, no. 6, pp. 1332–1342, Dec. 2021. doi: 10.1109/TBCAS.2021.3130090
    [74]
    B. J. E. Misgeld, M. Lüken, R. Riener, and S. Leonhardt, “Observer-based human knee stiffness estimation,” IEEE Trans. Biomed. Eng., vol. 64, no. 5, pp. 1033–1044, May 2017. doi: 10.1109/TBME.2016.2587841
    [75]
    H. Dimitrov, A. M. J. Bull, and D. Farina, “Real-time interface algorithm for ankle kinematics and stiffness from electromyographic signals,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 28, no. 6, pp. 1416–1427, Jun. 2020. doi: 10.1109/TNSRE.2020.2986787
    [76]
    C. Antfolk, M. D’Alonzo, B. Rosén, G. Lundborg, F. Sebelius, and C. Cipriani, “Sensory feedback in upper limb prosthetics,” Expert Rev. Med. Devices, vol. 10, no. 1, pp. 45–54, Jan. 2013. doi: 10.1586/erd.12.68
    [77]
    A. Ninu, S. Dosen, S. Muceli, F. Rattay, H. Dietl, and D. Farina, “Closed-loop control of grasping with a myoelectric hand prosthesis: Which are the relevant feedback variables for force control?” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 22, no. 5, pp. 1041–1052, Sept. 2014. doi: 10.1109/TNSRE.2014.2318431
    [78]
    S. Cheng, A. Yi, U. X. Tan, and D. Zhang, “Closed-loop system for myoelectric hand control based on electrotactile stimulation,” in Proc. 3rd Int. Conf. Advanced Robotics and Mechatronics, Singapore, 2018, pp. 486–490.
    [79]
    Y. He, O. Fukuda, S. Ide, H. Okumura, N. Yamaguchi, and N. Bu, “Simulation system for myoelectric hand prosthesis using augmented reality,” in Proc. IEEE Int. Conf. Robotics and Biomimetics, Macau, China, 2017, pp. 1424–1429.
    [80]
    Y. Fang, D. Zhou, K. Li, and H. Liu, “Interface prostheses with classifier-feedback-based user training,” IEEE Trans. Biomed. Eng., vol. 64, no. 11, pp. 2575–2583, Nov. 2017. doi: 10.1109/TBME.2016.2641584
    [81]
    R. B. Woodward and L. J. Hargrove, “Adapting myoelectric control in real-time using a virtual environment,” J. Neuroeng. Rehabil., vol. 16, no. 1, p. 11, Jan. 2019. doi: 10.1186/s12984-019-0480-5
    [82]
    L. Van Dijk, C. K. Van Der Sluis, H. W. Van Dijk, and R. M. Bongers, “Learning an EMG controlled game: Task-specific adaptations and transfer,” PLoS One, vol. 11, no. 8, p. e0160817, 2016. doi: 10.1371/journal.pone.0160817
    [83]
    J. M. Hahne, S. Dähne, H. J. Hwang, K. R. Müller, and L. C. Parra, “Concurrent adaptation of human and machine improves simultaneous and proportional myoelectric control,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 23, no. 4, pp. 618–627, Jul. 2015. doi: 10.1109/TNSRE.2015.2401134
    [84]
    M. Hamaya, T. Matsubara, T. Noda, T. Teramae, and J. Morimoto, “Learning assistive strategies from a few user-robot interactions: Model-based reinforcement learning approach,” in Proc. IEEE Int. Conf. Robotics and Autom., Stockholm, Sweden, 2016, pp. 3346–3351.
    [85]
    M. Hamaya, T. Matsubara, T. Noda, T. Teramae, and J. Morimoto, “Learning task-parametrized assistive strategies for exoskeleton robots by multi-task reinforcement learning,” in Proc. IEEE Int. Conf. Robotics and Autom., Singapore, 2017, pp. 5907–5912.
    [86]
    A. Gibson and P. Artemiadis, “Object discrimination using optimized multi-frequency auditory cross-modal haptic feedback,” in Proc. 36th Annu. Int. Conf. IEEE Engineering in Medicine and Biology Society, Chicago, IL, USA, 2014, pp. 6505–6508.
    [87]
    A. Gibson and P. Artemiadis, “Neural closed-loop control of a hand prosthesis using cross-modal haptic feedback,” in Proc. IEEE Int. Conf. Rehabilitation Robotics, Singapore, 2015, pp. 37–42.
    [88]
    J. Gonzalez and W. Yu, “Multichannel audio aided dynamical perception for prosthetic hand biofeedback,” in Proc. IEEE Int. Conf. Rehabilitation Robotics, Kyoto, Japan, 2009, pp. 240–245.
    [89]
    Y. Tsubouchi and K. Suzuki, “BioTones: A wearable device for EMG auditory biofeedback,” in Proc. Annu. Int. Conf. IEEE Engineering in Medicine and Biology, Buenos Aires, Argentina, 2010, pp. 6543–6546.
    [90]
    W. Sangngoen, W. Sroykham, S. Khemthong, W. Jalayondeja, Y. Kajornpredanon, and S. Thanangkul, “Effect of EMG biofeedback on muscle activity in computer work,” in Proc. 5th Biomedical Engineering Int. Conf., Muang, Thailand, 2012, pp. 1–4.
    [91]
    J. Gonzalez, H. Suzuki, N. Natsumi, M. Sekine, and W. Yu, “Auditory display as a prosthetic hand sensory feedback for reaching and grasping tasks,” in Proc. Annu. Int. Conf. IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 2012, pp. 1789–1792.
    [92]
    J. He, D. Zhang, N. Jiang, X. Sheng, D. Farina, and X. Zhu, “User adaptation in long-term, open-loop myoelectric training: Implications for EMG pattern recognition in prosthesis control,” J. Neural Eng., vol. 12, no. 4, p. 046005, Jun. 2015. doi: 10.1088/1741-2560/12/4/046005
    [93]
    H. Zhang, Y. Zhao, F. Yao, L. Xu, P. Shang, and G. Li, “An adaptation strategy of using LDA classifier for EMG pattern recognition,” in Proc. 35th Annu. Int. Conf. IEEE Engineering in Medicine and Biology Society, Osaka, Japan, 2013, pp. 4267–4270.
    [94]
    J. W. Sensinger, B. A. Lock, and T. A. Kuiken, “Adaptive pattern recognition of myoelectric signals: Exploration of conceptual framework and practical algorithms,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 17, no. 3, pp. 270–278, Jun. 2009. doi: 10.1109/TNSRE.2009.2023282
    [95]
    G. Ghazaei, A. Alameer, P. Degenaar, G. Morgan, and K. Nazarpour, “Deep learning-based artificial vision for grasp classification in myoelectric hands,” J. Neural Eng., vol. 14, no. 3, p. 036025, May 2017. doi: 10.1088/1741-2552/aa6802
    [96]
    D. G. Kanishka Madusanka, R. A. R. C. Gopura, Y. W. R. Amarasinghe, and G. K. I. Mann, “Hybrid vision based reach-to-grasp task planning method for trans-humeral prostheses,” IEEE Access, vol. 5, pp. 16149–16161, Jul. 2017. doi: 10.1109/ACCESS.2017.2727502
    [97]
    N. E. Krausz and L. J. Hargrove, “Sensor fusion of vision, kinetics, and kinematics for forward prediction during walking with a transfemoral prosthesis,” IEEE Trans. Med. Robot. Bionics, vol. 3, no. 3, pp. 813–824, Aug. 2021. doi: 10.1109/TMRB.2021.3082206
    [98]
    B. Zhu, D. H. Zhang, Y. Q. Chu, and X. G. Zhao, “A novel limbs-free variable structure wheelchair based on face-computer interface (FCI) with shared control,” in Proc. Int. Conf. Robotics and Automation, Philadelphia, PA, USA, 2022, pp. 5480–5486.
    [99]
    A. Dwivedi, D. Shieff, A. Turner, G. Gorjup, Y. Kwon, and M. Liarokapis, “A shared control framework for robotic telemanipulation combining electromyography based motion estimation and compliance control,” in Proc. IEEE Int. Conf. Robotics and Autom., Xi’an, China, 2021, pp. 9467–9473.
    [100]
    J. Luo, Z. Lin, Y. Li, and C. Yang, “A teleoperation framework for mobile robots based on shared control,” IEEE Robot. Autom. Lett., vol. 5, no. 2, pp. 377–384, Apr. 2020. doi: 10.1109/LRA.2019.2959442
    [101]
    R. V. Godoy, A. Dwivedi, B. Guan, A. Turner, D. Shieff, and M. Liarokapis, “On EMG based dexterous robotic telemanipulation: Assessing machine learning techniques, feature extraction methods, and shared control schemes,” IEEE Access, vol. 10, pp. 99661–99674, Sept. 2022. doi: 10.1109/ACCESS.2022.3206436
    [102]
    J. Starke, P. Weiner, M. Crell, and T. Asfour, “Semi-autonomous control of prosthetic hands based on multimodal sensing, human grasp demonstration and user intention,” Robot. Auton. Syst., vol. 154, p. 104123, Aug. 2022. doi: 10.1016/j.robot.2022.104123
    [103]
    M. Markovic, S. Dosen, C. Cipriani, D. Popovic, and D. Farina, “Stereovision and augmented reality for closed-loop control of grasping in hand prostheses,” J. Neural Eng., vol. 11, no. 4, p. 046001, Jun. 2014. doi: 10.1088/1741-2560/11/4/046001
    [104]
    Y. He, R. Shima, O. Fukuda, N. Bu, N. Yamaguchi, and H. Okumura, “Development of distributed control system for vision-based myoelectric prosthetic hand,” IEEE Access, vol. 7, pp. 54542–54549, Apr. 2019. doi: 10.1109/ACCESS.2019.2911968
    [105]
    J. Kuhn, J. Ringwald, M. Schappler, L. Johannsmeier, and S. Haddadin, “Towards semi-autonomous and soft-robotics enabled upper-limb exoprosthetics: First concepts and robot-based emulation prototype,” in Proc. Int. Conf. Robotics and Autom., Montreal, Canada, 2019, pp. 9180–9186.
    [106]
    K. D. Katyal, M. S. Johannes, T. G. McGee, A. J. Harris, R. S. Armiger, A. H. Firpi, D. McMullen, G. Hotson, M. S. Fifer, N. E. Crone, R. J. Vogelstein, and B. A. Wester, “HARMONIE: A multimodal control framework for human assistive robotics,” in Proc. 6th Int. IEEE/EMBS Conf. Neural Engineering, San Diego, CA, USA, 2013, pp. 1274–1278.
    [107]
    E. Rahimian, S. Zabihi, A. Asif, D. Farina, S. F. Atashzar, and A. Mohammadi, “FS-HGR: Few-shot learning for hand gesture recognition via electromyography,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 29, pp. 1004–1015, May 2021. doi: 10.1109/TNSRE.2021.3077413
    [108]
    Z. Yu, Y. Lu, Q. An, C. Chen, Y. Li, and Y. Wang, “Real-time multiple gesture recognition: Application of a lightweight individualized 1D CNN model to an edge computing system,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 30, pp. 990–998, Apr. 2022. doi: 10.1109/TNSRE.2022.3165858
    [109]
    I. Khan, X. Zhang, M. Rehman, and R. Ali, “A literature survey and empirical study of meta-learning for classifier selection,” IEEE Access, vol. 8, pp. 10262–10281, Jan. 2020. doi: 10.1109/ACCESS.2020.2964726
    [110]
    Q. Ding, X. Zhao, J. Han, C. Bu, and C. Wu, “Adaptive hybrid classifier for myoelectric pattern recognition against the interferences of outlier motion, muscle fatigue, and electrode doffing,” IEEE Trans. Neural Syst. Rehabil. Eng., vol. 27, no. 5, pp. 1071–1080, May 2019. doi: 10.1109/TNSRE.2019.2911316
    [111]
    H. P. Saal and S. J. Bensmaia, “Biomimetic approaches to bionic touch through a peripheral nerve interface,” Neuropsychologia, vol. 79, pp. 344–353, Dec. 2015. doi: 10.1016/j.neuropsychologia.2015.06.010
    [112]
    J. Xu, L. Xu, Y. Li, G. Cheng, J. Shi, J. Liu, and S. Chen, “A multi-channel reinforcement learning framework for robotic mirror therapy,” IEEE Robot. Autom. Lett., vol. 5, no. 4, pp. 5385–5392, Oct. 2020. doi: 10.1109/LRA.2020.3007408
    [113]
    A. Buerkle, A. Al-Yacoub, W. Eaton, M. Zimmer, T. Bamber, P. Ferreira, E. M. Hubbard, and N. Lohse, “An incremental learning approach to detect muscular fatigue in human-robot collaboration,” IEEE Trans. Hum. Mach. Syst., vol. 53, no. 3, pp. 520–528, Jun. 2023. doi: 10.1109/THMS.2023.3259139
    [114]
    W. Huang, C. Wang, R. Zhang, Y. Li, J. Wu, and F. Li, “VoxPoser: Composable 3D value maps for robotic manipulation with language models,” in Proc. 7th Conf. Robot Learning, Atlanta, GA, USA, 2023, pp. 540–562.
    [115]
    A. Phinyomark, P. Phukpattaranont, and C. Limsakul, “Feature reduction and selection for EMG signal classification,” Expert Syst. Appl., vol. 39, no. 8, pp. 7420–7431, Jun. 2012. doi: 10.1016/j.eswa.2012.01.102
    [116]
    M. A. Oskoei and H. Hu, “Myoelectric control systems–A survey,” Biomed. Signal Process. Control, vol. 2, no. 4, pp. 275–294, Oct. 2007. doi: 10.1016/j.bspc.2007.07.009
    [117]
    M. Ison and P. Artemiadis, “Proportional myoelectric control of robots: Muscle synergy development drives performance enhancement, retainment, and generalization,” IEEE Trans. Robot., vol. 31, no. 2, pp. 259–268, Apr. 2015. doi: 10.1109/TRO.2015.2395731
    [118]
    J. Chen, Y. Geng, Z. Chen, J. Z. Pan, Y. He, W. Zhang, I. Horrocks, and H. Chen, “Zero-shot and few-shot learning with knowledge graphs: A comprehensive survey,” Proc. IEEE, vol. 111, no. 6, pp. 653–685, Jun. 2023. doi: 10.1109/JPROC.2023.3279374
    [119]
    T. T. Nguyen, N. D. Nguyen, and S. Nahavandi, “Deep reinforcement learning for multiagent systems: A review of challenges, solutions, and applications,” IEEE Trans. Cybern., vol. 50, no. 9, pp. 3826–3839, Sept. 2020. doi: 10.1109/TCYB.2020.2977374
    [120]
    S. K. Khare, V. Blanes-Vidal, E. S. Nadimi, and U. R. Acharya, “Emotion recognition and artificial intelligence: A systematic review (2014–2023) and research recommendations,” Inform. Fusion, vol. 102, p. 102019, Feb. 2024. doi: 10.1016/j.inffus.2023.102019
    [121]
    H. Yaacob, F. Hossain, S. Shari, S. K. Khare, C. P. Ooi, and U. R. Acharya, “Application of artificial intelligence techniques for brain-computer interface in mental fatigue detection: A systematic review (2011–2022),” IEEE Access, vol. 11, pp. 74736–74758, Jul. 2023. doi: 10.1109/ACCESS.2023.3296382
    [122]
    S. K. Khare, S. March, P. D. Barua, V. M. Gadre, and U. R. Acharya, “Application of data fusion for automated detection of children with developmental and mental disorders: A systematic review of the last decade,” Inform. Fusion, vol. 99, p. 101898, Nov. 2023. doi: 10.1016/j.inffus.2023.101898
    [123]
    J. Duan, S. Yu, H. L. Tan, H. Zhu, and C. Tan, “A survey of embodied AI: From simulators to research tasks,” IEEE Trans. Emerging Top. Comput. Intell., vol. 6, no. 2, pp. 230–244, Apr. 2022. doi: 10.1109/TETCI.2022.3141105
    [124]
    H. P. Liu, D. Guo, F. C. Sun, and X. Y. Zhang, “Morphology-based embodied intelligence: Historical retrospect and research progress,” Acta Autom. Sinica, vol. 49, no. 6, pp. 1131–1154, Jun. 2023.
    [125]
    X. Wang, G. Chen, G. Qian, P. Gao, X. Y. Wei, Y. Wang, Y. Tian, and W. Gao, “Large-scale multi-modal pre-trained models: A comprehensive survey,” Mach. Intell. Res., vol. 20, no. 4, pp. 447–482, Jun. 2023. doi: 10.1007/s11633-022-1410-8



    Highlights

    • This paper presents a comprehensive review of EMG-based human-robot-environment interaction (HREI) and introduces three interaction paradigms: direct control, sensory feedback, and partial autonomous control (a minimal shared-control sketch follows this list)
    • It provides a novel and broader perspective from the viewpoint of system interaction and user engagement rather than intention decoding alone, considering factors in human-robot interaction, robot-environment interaction, and state perception through human sensations
    • Five key issues that are crucial for the overall performance of an HREI system, including precision, stability, user attention, compliance, and context awareness, are summarized and discussed
    • Several future directions, including EMG decomposition, robust algorithms, HREI datasets, proprioception feedback, reinforcement learning, and embodied intelligence, are proposed to pave the way for future research
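
To make the third paradigm above concrete, here is a minimal, illustrative sketch of shared (partially autonomous) control: the command sent to the robot blends an EMG-decoded velocity with an autonomous controller's velocity according to an arbitration weight. The blending rule, the confidence-based weight, and the 0.5 authority cap near obstacles are assumptions made for illustration, not a method proposed in the paper.

```python
import numpy as np


def arbitrate(decoder_confidence, near_obstacle):
    """Return alpha in [0, 1]: the authority granted to the human command."""
    alpha = float(np.clip(decoder_confidence, 0.0, 1.0))
    if near_obstacle:            # shift authority toward the autonomy near obstacles
        alpha = min(alpha, 0.5)
    return alpha


def shared_command(v_human, v_auto, decoder_confidence, near_obstacle):
    """Blend the EMG-decoded velocity with the autonomous controller's velocity."""
    alpha = arbitrate(decoder_confidence, near_obstacle)
    return alpha * np.asarray(v_human) + (1.0 - alpha) * np.asarray(v_auto)


# Example: an uncertain decode near an obstacle hands most of the authority to the robot.
v_cmd = shared_command(v_human=[0.2, 0.0, 0.0], v_auto=[0.0, 0.1, 0.0],
                       decoder_confidence=0.4, near_obstacle=True)
```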
