IEEE/CAA Journal of Automatica Sinica
Volume 4, Issue 4, Oct. 2017

Zhentao Liu, Min Wu, Weihua Cao, Luefeng Chen, Jianping Xu, Ri Zhang, Mengtian Zhou and Junwei Mao, "A Facial Expression Emotion Recognition Based Human-robot Interaction System," IEEE/CAA J. Autom. Sinica, vol. 4, no. 4, pp. 668-676, Oct. 2017. doi: 10.1109/JAS.2017.7510622

A Facial Expression Emotion Recognition Based Human-robot Interaction System

doi: 10.1109/JAS.2017.7510622
Funds: This work was supported by the National Natural Science Foundation of China (61403422, 61273102), the Hubei Provincial Natural Science Foundation of China (2015CFA010), and the 111 Project (B17040).
Abstract: A facial expression emotion recognition based human-robot interaction (FEER-HRI) system is proposed, for which a four-layer system framework is designed. The FEER-HRI system enables robots not only to recognize human emotions, but also to generate facial expressions that adapt to human emotions. A facial emotion recognition method based on 2D-Gabor features, the uniform local binary pattern (LBP) operator, and a multiclass extreme learning machine (ELM) classifier is presented, and is applied to real-time facial expression recognition on robots. Facial expressions of robots are represented by simple cartoon symbols displayed on an LED screen mounted on the robots, which can be easily understood by humans. Four scenarios, i.e., guiding, entertainment, home service, and scene simulation, are performed in the human-robot interaction experiments, in which smooth communication is realized through facial expression recognition of humans and facial expression generation by robots within 2 seconds. As prospective applications, the FEER-HRI system can be applied in home service, smart home, safe driving, and so on.
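The final stage of the recognition pipeline named above (2D-Gabor features, uniform LBP histograms, then a multiclass ELM) can be illustrated with a small sketch. The snippet below is a minimal extreme learning machine classifier in Python/NumPy, assuming the Gabor/LBP feature vectors have already been extracted; the function names `train_elm` and `predict_elm` and all parameter values are illustrative, not the authors' implementation.

```python
import numpy as np

def train_elm(X, T, n_hidden=64, seed=0):
    """Train a single-hidden-layer ELM.

    X: (n_samples, n_features) feature vectors (e.g. Gabor/LBP histograms).
    T: (n_samples, n_classes) one-hot target matrix.
    """
    rng = np.random.default_rng(seed)
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights, never trained
    b = rng.normal(size=n_hidden)                # random hidden biases, never trained
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))       # sigmoid hidden-layer activations
    beta = np.linalg.pinv(H) @ T                 # output weights by linear least squares
    return W, b, beta

def predict_elm(X, W, b, beta):
    """Return the predicted class index for each row of X."""
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return np.argmax(H @ beta, axis=1)           # pick the strongest output node
```

The defining property of an ELM is that the input weights `W` and biases `b` stay random; only the output weights `beta` are solved in closed form, which is what makes training fast enough for real-time interaction.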

     

