Author

Widyawan, Gerald Pirkl, Daniele Munaretto, Carl Fischer, Chunlei An, Paul Lukowicz, Martin Klepal, Andreas Timm-Giel, Joerg Widmer, Dirk Pesch, Hans Gellersen

Abstract

We present a novel, multimodal indoor navigation technique that combines pedestrian dead reckoning (PDR) with relative position information from wireless sensor nodes. It is motivated by emergency response scenarios where no fixed or pre-deployed global positioning infrastructure is available and where typical motion patterns defeat standard PDR systems. We use RF and ultrasound beacons to periodically re-align the PDR system and reduce the impact of incremental error accumulation. Unlike previous work on multimodal positioning, we allow the beacons to be dynamically deployed (dropped by the user) at previously unknown locations. A key contribution of this paper is to show that even though the beacon locations are not known (in terms of absolute coordinates), they significantly improve the performance of the system. This effect is especially relevant when a user re-traces (parts of) the path he or she had previously travelled, or lingers and moves around in an irregular pattern at a single location for an extended period of time. Both situations are common and relevant for emergency response scenarios. We describe the system architecture and the fusion algorithms, and provide an in-depth evaluation in a large-scale, realistic experiment.
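The core intuition — that beacons dropped at *unknown* absolute positions can still bound PDR drift when a path is retraced — can be illustrated with a toy simulation. This is only a 1-D sketch of the idea, not the paper's fusion algorithm: each PDR step adds a systematic drift, a dropped beacon stores the (already drifted) position estimate at drop time, and re-encountering that beacon snaps the estimate back to the stored value.

```python
import math

def simulate(realign=True):
    """Toy 1-D walk: out along positions 0..4, then retrace back to 0.
    Returns the absolute position error after each step."""
    true_pos, est = 0.0, 0.0
    beacons = {}  # beacon location -> position estimate at drop time
    path = [0, 1, 2, 3, 4, 3, 2, 1, 0]  # walk out, then retrace
    errors = []
    for target in path:
        step = target - true_pos       # ideal step length
        true_pos = target
        est += step + 0.1              # PDR step with systematic drift
        if realign and target in beacons:
            est = beacons[target]      # re-align on a re-encountered beacon
        elif realign:
            beacons[target] = est      # drop a new beacon here
        errors.append(abs(est - true_pos))
    return errors

drift = simulate(realign=False)
fused = simulate(realign=True)
print(max(drift), max(fused))
```

Without re-alignment the error grows with every step; with re-alignment it is capped at the error accumulated up to the moment each beacon was dropped — consistent with the abstract's claim that retracing a previously travelled path is exactly where the beacons help most, even without absolute coordinates.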

BibTex

@article{Widyawan:Virtual:2012:6501,
	number = {3}, 
	year = {2012}, 
	title = {Virtual lifeline: Multimodal sensor data fusion for robust navigation in unknown environments}, 
	journal = {Pervasive and Mobile Computing}, 
	volume = {8}, 
	pages = {388--401}, 
	publisher = {Elsevier}, 
	author = {Widyawan and Gerald Pirkl and Daniele Munaretto and Carl Fischer and Chunlei An and Paul Lukowicz and Martin Klepal and Andreas Timm-Giel and Joerg Widmer and Dirk Pesch and Hans Gellersen}, 
	keywords = {Virtual lifeline; Navigation; Sensor data fusion; Unknown environments}
}