IEEE VR 2020 Tutorial: Developing Embodied Interactive Virtual Characters for Human-Subjects Studies

  • Title: Developing Embodied Interactive Virtual Characters for Human-Subjects Studies
  • Zoom: https://zoom.us/j/536530816
  • Twitch: https://www.twitch.tv/ieeevr2020_tutorial_embod
  • Organizers:
  • Summary
    • Embodied interactive virtual characters, such as virtual humans or animals, are widely used in Virtual/Augmented/Mixed Reality (VAMR) applications, and researchers have developed different types of embodied virtual characters and studied their effects on the user’s perception and behavior. This tutorial provides the audience with background knowledge on embodied interactive virtual character research and shows how to develop such characters for their own applications, with a particular focus on human-in-the-loop systems (the Wizard of Oz paradigm). The tutorial first reviews prior interactive virtual character research on the social influence these entities exert over users, e.g., the sense of social presence, trust, and collaboration, and discusses the recent trend toward convergence among intelligent virtual agents (IVAs), MR, and the Internet of Things (IoT), i.e., virtual characters that interact with their physical surroundings. We will also share our recent research findings and lessons from our 5+ years of experience researching interactive virtual characters and running user studies at the Synthetic Reality Lab (SREAL), University of Central Florida (UCF). The tutorial then explains how to develop virtual characters in Unity using third-party assets and plugins, such as Mixamo and Rogo Digital’s LipSync. Following step-by-step instructions with the provided materials, attendees will build a simple interactive virtual character that they can control through conventional 2D user interfaces for human-in-the-loop studies (see the sketch following this summary). The tutorial also explains how to develop a sensing module that captures the current state of the surrounding environment, creating a realistic connection between the physical and virtual worlds: for example, an Arduino board with a couple of sensors, e.g., a wind sensor, can detect wind in the real environment and trigger coherent events in the virtual environment, such as blowing a virtual ball across a table.
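    • As a taste of the human-in-the-loop portion, here is a minimal Wizard-of-Oz control sketch in Unity C#. It assumes a character rigged with an Animator that exposes "Wave" and "Nod" triggers and an AudioSource holding a pre-recorded greeting clip; these names and the keyboard bindings are illustrative assumptions, not the tutorial's exact setup, and a lip-sync plugin such as Rogo Digital's LipSync would be driven alongside the audio playback.

```csharp
using UnityEngine;

// Minimal Wizard-of-Oz controller sketch: a hidden experimenter puppets
// pre-authored character behaviors through a conventional 2D interface
// (keyboard here for simplicity) while the participant interacts with
// the character. Trigger names and key bindings are assumptions.
public class WizardOfOzController : MonoBehaviour
{
    [SerializeField] private Animator characterAnimator; // character's Animator
    [SerializeField] private AudioSource voiceSource;    // character's voice output
    [SerializeField] private AudioClip greetingClip;     // pre-recorded greeting

    void Update()
    {
        // Experimenter presses number keys to fire animation triggers.
        if (Input.GetKeyDown(KeyCode.Alpha1))
            characterAnimator.SetTrigger("Wave");

        if (Input.GetKeyDown(KeyCode.Alpha2))
            characterAnimator.SetTrigger("Nod");

        if (Input.GetKeyDown(KeyCode.Alpha3))
        {
            // Play a pre-recorded clip; a lip-sync component would be
            // triggered with the same clip to animate the mouth.
            voiceSource.PlayOneShot(greetingClip);
        }
    }
}
```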
  • Intended Audience
    • The tutorial will be of interest to students, faculty, and researchers who are interested in interactive virtual character research and want to develop such characters for their own user studies. A basic understanding of programming and the Unity editor, which can easily be obtained from the many online Unity lectures, is sufficient to follow the tutorial, so attendees without a technical background are also welcome and encouraged to join.
  • Expected Outcomes
    • The audience for this tutorial can expect to leave with the following:
      • A basic understanding of embodied interactive virtual character research, its impact on human perception and behavior, and the recent trends and potential.
      • An understanding of how to prototype an embodied interactive virtual character.
      • An understanding of how to integrate an environmental sensing module with the embodied virtual character prototype (see the sketch after this list).
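    • Below is a minimal sketch of such a sensing bridge in Unity C#, assuming the Arduino prints one wind-speed reading per line over USB serial. The port name ("COM3"), baud rate (9600), message format, and force scaling are illustrative assumptions, and System.IO.Ports requires Unity's .NET 4.x API compatibility level.

```csharp
using System.IO.Ports;
using UnityEngine;

// Minimal sensing-module sketch: read wind-speed values that an Arduino
// prints over USB serial and apply a matching force to a virtual ball,
// so a real breeze produces a coherent virtual event. Port name, baud
// rate, and message format are assumptions for illustration.
public class WindSensorBridge : MonoBehaviour
{
    [SerializeField] private Rigidbody virtualBall;       // the ball to push
    [SerializeField] private string portName = "COM3";    // adjust per machine
    [SerializeField] private float forceScale = 0.5f;     // tuning constant

    private SerialPort port;

    void Start()
    {
        port = new SerialPort(portName, 9600);
        port.ReadTimeout = 10; // keep Update() from blocking on reads
        port.Open();
    }

    void Update()
    {
        try
        {
            string line = port.ReadLine(); // e.g., "3.2" from the Arduino
            if (float.TryParse(line, out float windSpeed))
            {
                // Push the ball in the direction the physical sensor faces.
                virtualBall.AddForce(transform.forward * windSpeed * forceScale);
            }
        }
        catch (System.TimeoutException)
        {
            // No new reading this frame; ignore.
        }
    }

    void OnDestroy()
    {
        if (port != null && port.IsOpen) port.Close();
    }
}
```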
  • Attendees are required to bring their own laptops for the tutorial.
  • Schedule (2:00pm–5:30pm, March 22, 2020, EDT)
  • Download Project: Entire Unity Project for Tutorial
  • Tutorial videos: SREAL UCF YouTube
  • Thank you for your participation!