Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality

Austin Erickson, Dirk Reiners, Gerd Bruder, Greg Welch: Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality. In: Proceedings of the 2021 International Symposium on Mixed and Augmented Reality, pp. 1-4, IEEE, Forthcoming.

Abstract

Mediated perception systems relay sensory signals from the user's environment to the user's sensory channels. Such systems have great potential for enhancing the user's perception by augmenting and/or diminishing incoming sensory signals according to the user's context, preferences, and perceptual capability. They also allow the user's perception to be extended, enabling them to sense signals typically imperceptible to human senses, such as regions of the electromagnetic spectrum beyond visible light. However, to mediate extrasensory data to the user effectively, we need to understand when and how such data should be presented.

In this paper, we present a prototype mediated perception system that maps extrasensory spatial data into visible light displayed within an augmented reality (AR) optical see-through head-mounted display (OST-HMD). Although the system is generalized so that it could support any spatial sensor data with minor modifications, we chose to test it using thermal infrared sensors. The system improves upon previous extended perception augmented reality prototypes in that it can project egocentric sensor data in real time onto a 3D mesh, generated by the OST-HMD, that represents the user's environment. We present the lessons learned through iterative improvements to the system, as well as a performance analysis and recommendations for future work.
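To make the core mediation step concrete, the sketch below maps a grid of thermal readings to false-color RGB pixels of the kind such a system might overlay on the display. This is a minimal hypothetical example, not the authors' implementation; the temperature range, the blue-to-red colormap, and the function name are all assumptions for illustration.

```python
def thermal_to_rgb(frame, t_min=20.0, t_max=40.0):
    """Map temperature readings (deg C) to a simple blue-to-red false color.

    frame: list of rows of temperature readings (assumed units/range).
    Returns a matching grid of (r, g, b) tuples in 0..255.
    """
    span = max(t_max - t_min, 1e-6)  # avoid division by zero
    out = []
    for row in frame:
        out_row = []
        for t in row:
            # Clamp and normalize the reading to [0, 1].
            x = min(max((t - t_min) / span, 0.0), 1.0)
            # Cold maps to blue, hot to red; green peaks mid-range.
            r = int(255 * x)
            g = int(255 * (1.0 - abs(2.0 * x - 1.0)))
            b = int(255 * (1.0 - x))
            out_row.append((r, g, b))
        out.append(out_row)
    return out

pixels = thermal_to_rgb([[20.0, 30.0, 40.0]])
```

In a full pipeline, each colored pixel would then be projected onto the environment mesh using the thermal camera's pose and intrinsics; that projection step is omitted here.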

BibTeX (Download)

@inproceedings{Erickson2021b,
title = {Augmenting Human Perception: Mediation of Extrasensory Signals in Head-Worn Augmented Reality},
author = {Austin Erickson and Dirk Reiners and Gerd Bruder and Greg Welch},
url = {https://sreal.ucf.edu/wp-content/uploads/2021/08/ismar21d-sub1093-i6.pdf},
year = {2021},
date = {2021-10-04},
booktitle = {Proceedings of the 2021 International Symposium on Mixed and Augmented Reality},
pages = {1--4},
organization = {IEEE},
series = {ISMAR 2021},
abstract = {Mediated perception systems relay sensory signals from the user's environment to the user's sensory channels. Such systems have great potential for enhancing the user's perception by augmenting and/or diminishing incoming sensory signals according to the user's context, preferences, and perceptual capability. They also allow the user's perception to be extended, enabling them to sense signals typically imperceptible to human senses, such as regions of the electromagnetic spectrum beyond visible light. However, to mediate extrasensory data to the user effectively, we need to understand when and how such data should be presented.

In this paper, we present a prototype mediated perception system that maps extrasensory spatial data into visible light displayed within an augmented reality (AR) optical see-through head-mounted display (OST-HMD). Although the system is generalized so that it could support any spatial sensor data with minor modifications, we chose to test it using thermal infrared sensors. The system improves upon previous extended perception augmented reality prototypes in that it can project egocentric sensor data in real time onto a 3D mesh, generated by the OST-HMD, that represents the user's environment. We present the lessons learned through iterative improvements to the system, as well as a performance analysis and recommendations for future work.},
keywords = {A-ae, A-gb, A-gfw, P-EICAR, P-EPICAR, SREAL},
pubstate = {forthcoming},
tppubtype = {inproceedings}
}