IEEE/CAA Journal of Automatica Sinica, Volume 9, Issue 8
| Citation: | T. Liu, M. W. Hu, S. N. Ma, Y. Xiao, Y. Liu, and W. T. Song, “Exploring the effectiveness of gesture interaction in driver assistance systems via virtual reality,” IEEE/CAA J. Autom. Sinica, vol. 9, no. 8, pp. 1520–1523, Aug. 2022. doi: 10.1109/JAS.2022.105764 | 
 
| [1] | National Highway Traffic Safety Administration (NHTSA), “Visual-manual NHTSA driver distraction guidelines for in-vehicle electronic devices,” 2019. [Online]. Available: https://www.nhtsa.gov/risky-driving/distracted-driving. | 
| [2] | B. Metz, A. Landau, and M. Just, “Frequency of secondary tasks in driving – Results from naturalistic driving data,” Safety Science, vol. 68, pp. 195–203, 2014. doi: 10.1016/j.ssci.2014.04.002 | 
| [3] | S. Rümelin and A. Butz, “How to make large touch screens usable while driving,” in Proc. 5th Int. Conf. Automotive User Interfaces and Interactive Vehicular Applications, 2013, pp. 48–55. | 
| [4] | Y. Saito, M. Itoh, and T. Inagaki, “Driver assistance system with a dual control scheme: Effectiveness of identifying driver drowsiness and preventing lane departure accidents,” IEEE Trans. Human-Machine Systems, vol. 46, no. 5, pp. 660–671, Oct. 2016. doi: 10.1109/THMS.2016.2549032 | 
| [5] | Microsoft, “Kinect,” [Online]. Available: https://developer.microsoft.com/zh-cn/windows/kinect/. Accessed on: Jun. 2022. | 
| [6] | Ultraleap, “Leap Motion Controller,” [Online]. Available: https://www.ultraleap.com/product/leap-motion-controller/. Accessed on: Jun. 2022. | 
| [7] | K. May, T. Gable, and B. Walker, “A multimodal air gesture interface for in vehicle menu navigation,” in Proc. 6th Int. Conf. Automotive User Interfaces and Interactive Vehicular Applications, 2014, pp. 1–6. | 
| [8] | A. W. von Laack, “Measurement of sensory and cultural influences on haptic quality perception of vehicle interiors,” BoD – Books on Demand, 2014. | 
| [9] | G. Young, H. Milne, D. Griffiths, E. Padfield, R. Blenkinsopp, and O. Georgiou, “Designing mid-air haptic gesture controlled user interfaces for cars,” Proc. ACM Hum.-Comput. Interact., vol. 4, no. 81, pp. 1–23, Jun. 2020. | 
| [10] | L. Angelini, F. Carrino, S. Carrino, M. Caon, O. Khaled, J. Baumgartner, A. Sonderegger, D. Lalanne, and E. Mugellini, “Gesturing on the steering wheel: A user-elicited taxonomy,” in Proc. 6th Int. Conf. Automotive User Interfaces and Interactive Vehicular Applications, 2014, pp. 1–8. | 
| [11] | A. Sharma, J. Roo, and J. Steimle, “Grasping microgestures: Eliciting single-hand microgestures for handheld objects,” in Proc. CHI Conf. Human Factors Computing Syst., 2019, pp. 1–13. | 
| [12] | Y. Xiao and R. He, “The intuitive grasp interface: Design and evaluation of micro-gestures on the steering wheel for driving scenario,” Univ. Access Inf. Soc., vol. 19, p. 433, Apr. 2020. | 
| [13] | B. Barbara, W. Jia, C. Deepa, M. Linda, C. Michael, and V. Federico, “Brain-based limitations in attention and secondary task engagement during high-fidelity driving simulation among young adults,” Neuroreport, vol. 31, no. 8, pp. 619–623, 2020. doi: 10.1097/WNR.0000000000001451 | 
| [14] | E. Chan, T. Seyed, W. Stuerzlinger, X. Yang, and F. Maurer, “User elicitation on single-hand microgestures,” in Proc. CHI Conf. Human Factors Computing Syst., 2016, pp. 3403–3414. | 
| [15] | S. Mattes, “The Lane-Change-Task as a tool for driver distraction evaluation,” in Proc. Quality of Work and Products in Enterprises of the Future, 2003, pp. 57–60. | 
| [16] | N. Dahlbäck, A. Jönsson, and L. Ahrenberg, “Wizard of Oz studies,” in Proc. 1st Int. Conf. Intelligent User Interfaces, 1993, pp. 193–200. | 
| [17] | S. Hart, “NASA-Task Load Index (NASA-TLX); 20 years later,” in Proc. Human Factors and Ergonomics Society Annu. Meeting, Oct. 2006, vol. 50, no. 9, pp. 904–908. |