Gernot Bahle, Paul Lukowicz, K. Kunze, K. Kise


In this paper we investigate how vision-based devices (cameras or the Kinect controller) that happen to be in the user's environment can be used to improve and fine-tune on-body sensor systems for activity recognition. We imagine a user with his on-body activity recognition system passing through a space with a video camera (or a Kinect), picking up some information, and using it to improve his system. The general idea is to correlate an anonymous "stick figure"-like description of the motion of the user's body parts, provided by the vision system, with the sensor signals as a means of analyzing the sensors' properties. As an example, we demonstrate how such a correlation can be used to determine, without the need to train any classifiers, on which body part a motion sensor is worn.
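The placement-detection idea from the abstract can be sketched as follows: for each limb tracked by the vision system, compare its camera-derived motion signal against the worn sensor's signal and pick the limb with the strongest normalized cross-correlation. This is a minimal illustration of the general principle, not the paper's actual pipeline; the function name and signal representation are assumptions for the example.

```python
import numpy as np

def best_matching_body_part(sensor_signal, limb_signals):
    """Hypothetical helper: return the limb whose camera-derived motion
    correlates best with the worn sensor's signal.

    sensor_signal: 1-D numpy array from the on-body sensor.
    limb_signals:  dict mapping limb name -> 1-D numpy array of the
                   corresponding stick-figure motion (same length).
    """
    best_limb, best_score = None, -np.inf
    # Z-normalize the sensor signal once (small epsilon avoids divide-by-zero).
    a = (sensor_signal - sensor_signal.mean()) / (sensor_signal.std() + 1e-9)
    for limb, ref in limb_signals.items():
        b = (ref - ref.mean()) / (ref.std() + 1e-9)
        # Normalized cross-correlation, maximized over all lags so that a
        # constant sensor-to-camera delay does not hurt the match.
        score = np.max(np.correlate(a, b, mode="full")) / len(a)
        if score > best_score:
            best_limb, best_score = limb, score
    return best_limb
```

Taking the maximum over lags makes the comparison robust to a fixed temporal offset between the camera and the sensor clock, which is the kind of mismatch one would expect between independent devices.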


@inproceedings{Bahle:I:2013:7124,
	title = {I see you: How to improve wearable activity recognition by leveraging information from environmental cameras},
	author = {Gernot Bahle and Paul Lukowicz and K. Kunze and K. Kise},
	year = {2013},
	month = {4},
	pages = {409--412},
	publisher = {IEEE},
	keywords = {body sensor networks; computer vision; correlation theory; gait analysis; image motion analysis; image sensors; object recognition; video cameras; anonymous stick figure; body sensor system; correlation method; environmental camera; motion sensor; sensor signal; user b}
}