WFTV covers research done in UCF SREAL Lab (click to see video)

 

We thought you might enjoy seeing a glimpse of the research being conducted in UCF's SREAL Lab. This research is supported by National Science Foundation grants 1560302 (UCF) and 1800961 (collaborative with 1800947 to the University of Florida and 1800922 to Stanford University), and by Office of Naval Research grant N00014-18-1-2927.

XR Advance, now with Episode Guides

 

XR ADVANCE: AN EXTENDED REALITY VIDEO SERIES

Hosted by Dr. Stephen Gilbert and the Virtual Reality Applications Center at Iowa State University, XR Advance welcomed extended reality experts to present innovative research in a series of videos.

Dr. Welch was invited to be one of the extended reality experts to present. His presentation, “Identifying User Physical States,” was Episode 8 in the series and can be viewed here.

Jason Hochreiter defended his PhD in Computer Science on June 24, 2019

 

FINAL ORAL EXAMINATION
OF

Jason Hochreiter
BS, University of Central Florida, 2011
MS, University of Central Florida, 2014
for the degree of
DOCTOR OF PHILOSOPHY
IN COMPUTER SCIENCE
June 24, 2019, 11:00 AM
Partnership III 233

Dissertation Committee:

Dr. Gregory Welch, Chairman welch@ucf.edu
Dr. Juan Cendan Juan.Cendan@ucf.edu
Dr. Laura Gonzalez Laura.Gonzalez@ucf.edu
Dr. Joseph LaViola Jr. jjl@cs.ucf.edu
Dr. Gerd Bruder Gerd.Bruder@ucf.edu

DISSERTATION RESEARCH IMPACT

The flat screens of today’s smartphones allow for integrated electronic touch sensing. Such electronic touch sensing methods are impractical to implement on non-planar rear-projection displays. This dissertation introduces a generalizable camera-based method for touch input on such rear-projection displays, allowing for touch interactions with complex virtual content registered to the surfaces. In a human-subject study, we demonstrate several advantages of this paradigm compared to others, including improved touch performance and decreases in cognitive load. We are particularly inspired by patient care: despite the importance of touch for both diagnostic and therapeutic purposes, modern high-fidelity mannequins and other patient simulators are typically unable to naturally respond to touch input, which can limit the effectiveness of training. Our research focuses on supporting touch input in a general way, making it suitable for patient simulation and other applications.

SELECTED PUBLICATIONS & PATENTS

Touch sensing on non-parametric rear-projection surfaces: A physical-virtual head for hands-on healthcare training, Jason Hochreiter, Salam Daher, Arjun Nagendran, Laura Gonzalez, Greg Welch, in Proceedings of IEEE Virtual Reality, 2015.

Optical touch sensing on non-parametric rear-projection surfaces for interactive physical-virtual experiences, Jason Hochreiter, Salam Daher, Arjun Nagendran, Laura Gonzalez, Greg Welch, in Presence: Teleoperators and Virtual Environments, 2016.

A systematic survey of 15 years of user studies published in the intelligent virtual agents conference, Nahal Norouzi, Kangsoo Kim, Jason Hochreiter, Myungho Lee, Salam Daher, Gerd Bruder, Greg Welch, in Proceedings of the 18th ACM International Conference on Intelligent Virtual Agents, 2018.

Physical-virtual agents for healthcare simulation, Salam Daher, Jason Hochreiter, Nahal Norouzi, Laura Gonzalez, Gerd Bruder, Greg Welch, in Proceedings of the 18th ACM International Conference on Intelligent Virtual Agents, 2018.

Cognitive and touch performance effects of mismatched 3D physical and visual perceptions, Jason Hochreiter, Salam Daher, Gerd Bruder, Greg Welch, in Proceedings of IEEE Virtual Reality, 2018.

Optical touch sensing on non-parametric rear-projection surfaces, Jason Hochreiter, in Proceedings of IEEE Virtual Reality, 2018.

Patents:

2014, System for Detecting Sterile Field Events and Related Methods, 9808549B2

DISSERTATION
Multi-touch detection and semantic response on non-parametric rear-projection surfaces

Modern interfaces supporting touch input are ubiquitous. Typically, such interfaces are implemented on integrated touch-display surfaces with simple geometry that can be mathematically parameterized, such as planar surfaces and spheres; for more complicated non-parametric surfaces, such parameterizations are not available. In this dissertation, we introduce a method for generalizable optical multi-touch detection and semantic response on uninstrumented non-parametric rear-projection surfaces using an infrared-light-based multi-camera multi-projector platform.

In this paradigm, touch input allows users to manipulate complex virtual 3D content that is registered to and displayed on a physical 3D object. Detected touches trigger responses with specific semantic meaning in the context of the virtual content, such as animations or audio responses. The broad problem of touch detection and response can be decomposed into three major components: determining if a touch has occurred, determining where a detected touch has occurred, and determining how to respond to a detected touch. Our fundamental contribution is the design and implementation of a relational lookup table architecture that addresses these challenges through the encoding of coordinate relationships among the cameras, the projectors, the physical surface, and the virtual content.  Additionally, we present and evaluate two algorithms for touch detection and localization utilizing the lookup table architecture.
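
To make the lookup table architecture concrete, the following is a minimal illustrative sketch in Python. It is not the dissertation's implementation; the dictionary-based structure, class names, and semantic-region callbacks are assumptions made for illustration only. It shows how a single calibrated camera pixel could be related to a physical surface point, a projector pixel, and a virtual-content texture coordinate, and how a detected touch might then be mapped to a semantic response.

    # Minimal sketch: a relational lookup table associating camera pixels,
    # 3D surface points, virtual-content texture coordinates, and projector
    # pixels. Structure and names are illustrative assumptions.

    from dataclasses import dataclass


    @dataclass(frozen=True)
    class SurfaceEntry:
        surface_point: tuple   # (x, y, z) point on the physical surface
        texture_uv: tuple      # (u, v) coordinate into the virtual content texture
        projector_px: tuple    # (projector_id, px, py) pixel that lights the point


    class RelationalLookupTable:
        """Associates each calibrated camera pixel with its related coordinates."""

        def __init__(self):
            # Key: (camera_id, cx, cy); value: the SurfaceEntry seen along that ray.
            self._table = {}

        def add_correspondence(self, camera_id, cam_px, entry):
            self._table[(camera_id, *cam_px)] = entry

        def lookup(self, camera_id, cam_px):
            return self._table.get((camera_id, *cam_px))


    def respond_to_touch(lut, camera_id, cam_px, semantic_regions):
        """Map a detected touch (a camera pixel) to a named semantic response."""
        entry = lut.lookup(camera_id, cam_px)
        if entry is None:
            return None
        for name, contains in semantic_regions.items():
            if contains(entry.texture_uv):
                return name  # e.g. trigger an animation or audio cue for this region
        return None


    if __name__ == "__main__":
        lut = RelationalLookupTable()
        # One correspondence, as would be produced offline by camera/projector calibration.
        lut.add_correspondence(0, (412, 230), SurfaceEntry(
            surface_point=(0.11, 0.04, 0.32),
            texture_uv=(0.46, 0.52),
            projector_px=(1, 800, 455)))
        regions = {"cheek": lambda uv: 0.4 <= uv[0] <= 0.6 and 0.4 <= uv[1] <= 0.6}
        print(respond_to_touch(lut, 0, (412, 230), regions))  # -> cheek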

We demonstrate touch-based interactions on several physical parametric and non-parametric surfaces, and we evaluate both system accuracy and the accuracy of typical users in touching desired targets on these surfaces. In a formative human-subject study, we present an exploratory application of this method in patient simulation. A second study highlights the advantages of touch input on content-matched physical surfaces achieved by our method, such as decreases in induced cognitive load, increases in system usability, and increases in user touch performance.

SELECTED AWARDS & HONORS:

2012, UCF Presidential Doctoral Fellowship
2018, IEEE Virtual Reality Doctoral Consortium
2018, Graduate Travel or Presentation Fellowship

Jason Hochreiter: Defense Announcement

Divine Maloney (2016 Summer Intern in the SREAL Lab at UCF) receives Microsoft Research Ada Lovelace Fellowship

 
Divine Maloney

 

The Ada Lovelace Fellowship is a new Microsoft Research fellowship established to support diverse talent pursuing doctorates in computing-related fields. It provides three years of funding for second-year PhD students from groups underrepresented in computing.

“My hope is that the Ada Lovelace Fellowship will inspire an abundance of unconventional problem solvers and thought leaders—those who don’t fit the mold but who are excited to address challenges in their respective communities and global society,” said Divine Maloney of Clemson University.

Maloney’s research seeks to understand the role of implicit biases in embodied virtual avatars and to establish guidelines to minimize unwanted biases in the design of avatars and other immersive virtual reality content.

More information on Microsoft Research Ada Lovelace and PhD Fellowships.

More information on Ada Lovelace Fellows.

Professor Welch was interviewed for a British Sunday Times article, “Where technology breakthroughs could take us next.”

 

“There is no single factor that will make VR adoption go critical mass,” says Professor Gregory Welch, chair at the University of Central Florida College of Nursing. “The US military has just bought $100m of HoloLens units which it is using for training and UAV [unmanned aerial vehicle] surveys. It is also being used to train medical professionals in childbirth. AR has been around since the 1960s and the technology is improving all the time. When the hardware can be packaged like eye glasses, I suspect people will be more comfortable, but it still lacks a killer app.”

Read the full article here.

Find the article online here.

 

Gerd Bruder earned Best Paper Award at ICAT-EGVE 2018

 

"Prof. Gerd Bruder received the Best Paper Award at the ICAT-EGVE 2018 conference in Limassol, Cyprus, in collaboration with researchers from the University of Hamburg, Germany.

 

Susanne Schmidt, Gerd Bruder, and Frank Steinicke. "Effects of Embodiment on Generic and Content-Specific Intelligent Virtual Agents as Exhibition Guides," in Proceedings of the International Conference on Artificial Reality and Telexistence and Eurographics Symposium on Virtual Environments (ICAT-EGVE), 2018, pp. 161-169, Limassol, Cyprus.

Sungchul Jung received the Best Paper Award at ACM SUI 2018 in Berlin

 
Former UCF student Sungchul Jung, PhD, had a great conference at SUI, where he was honored with the Best Paper Award. Co-awardees include Prof. Charlie Hughes (SREAL) and Prof. Gerd Bruder (SREAL). Congratulations to our Kiwi colleague down under and his co-authors.
 
Sungchul is currently a Postdoctoral Fellow in the Human Interface Technology Laboratory at the University of Canterbury (New Zealand).
 
Sungchul Jung, Gerd Bruder, Pamela J. Wisniewski, Christian Sandor, and Charles E. Hughes. "Over My Hand: Using a Personalized Hand in VR to Improve Object Size Estimation, Body Ownership, and Presence," in Proceedings of the ACM Symposium on Spatial User Interaction (SUI), 2018.