2007
@inproceedings{RSBH07,
title = {Focus+Context Resolution Adaption for Autostereoscopic Displays},
author = {Timo Ropinski and Frank Steinicke and Gerd Bruder and Klaus H. Hinrichs},
editor = {Andreas Butz and Brian D. Fisher and Antonio Krüger and Patrick Olivier and Shigeru Owada},
year = {2007},
date = {2007-01-01},
booktitle = {Smart Graphics},
volume = {4569},
pages = {188--193},
publisher = {Springer},
series = {Lecture Notes in Computer Science},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{SBF07,
title = {A Multimodal Locomotion User Interface for Immersive Geospatial Information Systems},
author = {Frank Steinicke and Gerd Bruder and Harald Frenz},
url = {https://sreal.ucf.edu/wp-content/uploads/2017/02/SBF07.pdf},
year = {2007},
date = {2007-01-01},
booktitle = {Proceedings of GI-Days},
pages = {289--293},
abstract = {In this paper we present a new multimodal locomotion user interface that enables users to travel through 3D environments displayed in geospatial information systems, e.g., Google Earth or Microsoft Virtual Earth. With the proposed interface, geospatial data can be explored in immersive virtual environments (VEs) using stereoscopic visualization on a head-mounted display (HMD). With suitable tracking approaches the entire body can be tracked in order to support natural traveling by real walking. Moreover, intuitive devices are provided for two-handed interaction to complete the navigation process. We introduce the setup as well as the corresponding interaction concepts.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{SRBH07c,
title = {3D Modeling and Design Supported via Interscopic Interaction Strategies},
author = {Frank Steinicke and Timo Ropinski and Gerd Bruder and Klaus H. Hinrichs},
url = {https://sreal.ucf.edu/wp-content/uploads/2017/02/SRBH07c.pdf},
year = {2007},
date = {2007-01-01},
booktitle = {Proceedings of HCI International},
volume = {4553},
pages = {1160--1169},
publisher = {Springer},
series = {Lecture Notes in Computer Science},
abstract = {3D modeling applications are widely used in many application domains ranging from CAD to industrial or graphics design. Desktop environments have proven to be a powerful user interface for such tasks. However, the rising complexity of 3D datasets exceeds the possibilities provided by traditional devices and two-dimensional displays. Thus, more natural and intuitive interfaces are required. But in order to gain the users' acceptance, technology-driven solutions that require inconvenient instrumentation, e.g., stereo glasses or tracked gloves, should be avoided. Autostereoscopic display environments in combination with 3D desktop devices enable users to experience virtual environments more immersively without annoying devices. In this paper we introduce interaction strategies with special consideration of the requirements of 3D modelers. We propose an interscopic display environment with accompanying user interface strategies that allow displaying and interacting with both monoscopic content, e.g., 2D elements, and stereoscopic content, which is beneficial for the 3D environment that has to be manipulated. These concepts are discussed with special consideration of the requirements of 3D modelers and designers.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{SBHR07,
title = {Simultane 2D/3D User Interface Konzepte f\"{u}r Autostereoskopische Desktop-VR Systeme},
author = {Frank Steinicke and Gerd Bruder and Klaus H. Hinrichs and Timo Ropinski},
url = {https://sreal.ucf.edu/wp-content/uploads/2017/02/SBHR07.pdf},
year = {2007},
date = {2007-01-01},
booktitle = {Proceedings of the GI Workshop on Virtual and Augmented Reality (GI VR/AR)},
pages = {125--132},
publisher = {Shaker},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{SBH07,
title = {[POSTER] Hybrid Traveling in Fully-Immersive Large-Scale Geographic Environments},
author = {Frank Steinicke and Gerd Bruder and Klaus H. Hinrichs},
url = {https://sreal.ucf.edu/wp-content/uploads/2017/02/SBH07.pdf},
year = {2007},
date = {2007-01-01},
booktitle = {Proceedings of the ACM Symposium on Virtual Reality and Software Technology (VRST) (Poster Presentation)},
pages = {229--230},
abstract = {In this paper we present hybrid traveling concepts that enable users to navigate immersively through 3D geospatial environments displayed by applications such as Google Earth or Microsoft Virtual Earth. We propose a framework which makes it possible to integrate virtual reality (VR) based interaction devices and concepts into such applications that do not support VR technologies natively. In our proposed setup the content displayed by a geospatial application is visualized stereoscopically on a head-mounted display (HMD) for immersive exploration. The user's body can be tracked using appropriate technologies in order to support natural traveling through the VE via a walking metaphor. Since the VE usually exceeds the dimensions of the area in which the user can be tracked, we propose different strategies to map the user's movement into the virtual world. Moreover, intuitive devices and interaction techniques are presented for two-handed interaction to enrich the navigation process. We describe the technical system setup as well as the integrated interaction concepts, and discuss scenarios based on existing geospatial visualization applications.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
2006
@inproceedings{RSBH06,
title = {Simultaneously Viewing Monoscopic and Stereoscopic Content on Vertical-Interlaced Autostereoscopic Displays},
author = {Timo Ropinski and Frank Steinicke and Gerd Bruder and Klaus H. Hinrichs},
year = {2006},
date = {2006-01-01},
booktitle = {Proceedings of the ACM International Conference and Exhibition on Computer Graphics and Interactive Techniques (SIGGRAPH) (Conference DVD)},
publisher = {ACM Press},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}
@inproceedings{SRHB06,
title = {A Multiple View System for Modeling Building Entities},
author = {Frank Steinicke and Timo Ropinski and Klaus H. Hinrichs and Gerd Bruder},
url = {https://sreal.ucf.edu/wp-content/uploads/2017/02/SRHB06.pdf},
year = {2006},
date = {2006-01-01},
booktitle = {Proceedings of the International Conference on Coordinated & Multiple Views in Exploratory Visualization},
pages = {69--78},
publisher = {IEEE Press},
abstract = {Modeling virtual buildings is an essential task in the city planning domain, and several aspects have an essential influence on it. Planners have to deal with different types of potentially multiform datasets; moreover, they have to consider certain guidelines and constraints that are imposed on the development areas to which buildings are related. The planning process can be divided into different subtasks with varying requirements regarding the interaction techniques used for their accomplishment. To incorporate these aspects, multiple view systems have shown enormous potential for providing efficient user interfaces. In this paper, we present strategies for modeling virtual building entities via a multiple view system as part of a 3D decision support system that enables the intuitive generation and evaluation of building proposals. City planners have been involved in the design process of the system, in particular of the multiple view concepts. Therefore, each view of the system, which visualizes different aspects of the underlying models, meets the demands of city planners. Furthermore, we present both coupled and uncoupled interaction techniques between different views with respect to the requirements of the city planning domain.},
keywords = {},
pubstate = {published},
tppubtype = {inproceedings}
}