Author

Shoya Ishimaru, Kai Kunze, Koichi Kise, Jens Weppner, Andreas Dengel, Paul Lukowicz, Andreas Bulling

Abstract

We demonstrate how information about eye blink frequency and head motion patterns derived from Google Glass sensors can be used to distinguish different types of high-level activities. While it is well known that eye blink frequency is correlated with user activity, our aim is to show that (1) eye blink frequency data from an unobtrusive, commercial platform that is not a dedicated eye tracker is good enough to be useful, and (2) adding head motion pattern information significantly improves the recognition rates. The method is evaluated on a data set recorded from eight participants performing five activities (reading, talking, watching TV, mathematical problem solving, and sawing), achieving 67% recognition accuracy with eye blink frequency alone and 82% when extended with head motion patterns.
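The abstract's core idea can be illustrated with a minimal sketch: extract a blink-frequency feature and a head-motion feature per recording window, then assign the window to the nearest activity prototype in feature space. This is not the authors' implementation; the feature definitions, centroid values, and nearest-centroid rule are illustrative assumptions standing in for the paper's actual features and classifier.

```python
# Hypothetical sketch, NOT the paper's code: combine a blink-frequency
# feature with a head-motion feature and classify by nearest centroid.
import math


def blink_frequency(blink_times, window_s):
    """Blinks per minute within a recording window of window_s seconds."""
    return len(blink_times) / window_s * 60.0


def head_motion_energy(gyro_samples):
    """Mean absolute angular velocity: a crude head-motion feature."""
    return sum(abs(v) for v in gyro_samples) / len(gyro_samples)


def extract_features(blink_times, gyro_samples, window_s):
    """2-D feature vector: (blinks per minute, head-motion energy)."""
    return (blink_frequency(blink_times, window_s),
            head_motion_energy(gyro_samples))


def nearest_centroid(feature, centroids):
    """Return the activity whose centroid is closest in feature space."""
    return min(centroids, key=lambda act: math.dist(feature, centroids[act]))


# Toy per-activity centroids (invented numbers, for illustration only).
CENTROIDS = {
    "reading": (8.0, 0.2),   # few blinks, little head motion
    "talking": (20.0, 1.5),  # frequent blinks and head movement
    "sawing":  (15.0, 3.0),  # strong, repetitive head motion
}

if __name__ == "__main__":
    feat = extract_features(blink_times=[1.0, 9.5, 17.0],
                            gyro_samples=[0.1, -0.2, 0.15, -0.1],
                            window_s=30.0)
    print(nearest_centroid(feat, CENTROIDS))  # prints "reading"
```

In practice a learned classifier over many windows would replace the hand-set centroids, but the sketch captures the point made in the abstract: the head-motion dimension separates activities (e.g. sawing vs. reading) that blink frequency alone confuses.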

BibTex

@inproceedings{Ishimaru:In:2014:7614,
	author = {Shoya Ishimaru and Kai Kunze and Koichi Kise and Jens Weppner and Andreas Dengel and Paul Lukowicz and Andreas Bulling},
	title = {In the Blink of an Eye: Combining Head Motion and Eye Blink Frequency for Activity Recognition with Google Glass},
	year = {2014},
	pages = {15:1--15:4},
	publisher = {ACM},
	keywords = {Google Glass, IMU, activity recognition, blink frequency, head mounted sensor, infrared proximity sensor}
}