Exploring Dynamical Systems
Collaborative augmented reality
Project duration: 1996-1997
Contact: Anton Fuhrmann
Augmented reality was used for the interactive visualization of complex dynamical systems. Using this platform, multiple researchers were able to collaborate on the analysis of phase space representations of dynamical systems to gain insight into this complex topic.

Scientific visualization is an important tool for scientists dealing with very hard problems that often exceed the limits of imagination. Real-time exploration of three-dimensional mathematical structures using augmented reality allows researchers to rapidly examine many aspects of a problem and leads to faster understanding. We have studied dynamical systems, but the same technology may be used for the exploration of any kind of scientific data.

We needed to create a system capable of displaying dynamical systems in real time for multiple users. However, the computation of such data is time-consuming, which is at odds with the real-time requirements. Furthermore, integration with an existing desktop visualization system, AVS, was desired, because many data sets and visualization tools were already available on this platform.

The Studierstube platform was extended to communicate with AVS via a dedicated network interface. While a tightly coupled real-time display loop provided rapid visual feedback, a separate communication loop between Studierstube and AVS was used for sending steering commands to AVS and computed geometry back to Studierstube. This architecture, together with simple direct manipulation facilities for the data, led to a very responsive solution.

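The decoupled-loop idea above can be sketched as follows. This is a minimal illustration, not the actual Studierstube/AVS implementation: a worker thread and two in-process queues stand in for the visualization server and the network interface, and all names are hypothetical.

```python
import queue
import threading
import time

steering_commands = queue.Queue()   # display loop -> visualization server
computed_geometry = queue.Queue()   # visualization server -> display loop

def visualization_server():
    """Stand-in for the slow AVS side: it receives steering commands
    and returns computed geometry only after a noticeable delay."""
    while True:
        cmd = steering_commands.get()
        if cmd is None:                 # shutdown sentinel
            break
        time.sleep(0.05)                # stand-in for expensive computation
        computed_geometry.put(f"geometry for {cmd}")

def display_loop(frames):
    """Fast render loop: it never blocks on the server, only polls the
    geometry queue and keeps re-rendering the last known scene."""
    received = []
    steering_commands.put("start streamsurface")
    for _ in range(frames):
        try:
            received.append(computed_geometry.get_nowait())  # non-blocking
        except queue.Empty:
            pass                        # no update yet; keep old scene
        time.sleep(0.005)               # stand-in for rendering one frame
    steering_commands.put(None)
    return received

server = threading.Thread(target=visualization_server, daemon=True)
server.start()
updates = display_loop(frames=30)
server.join()
print(updates)
```

The key property mirrored here is that the display loop's frame rate is independent of how long the server takes: new geometry is merged in whenever it arrives, and stale geometry is shown in the meantime.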
This page is maintained by Anton Fuhrmann. It was last updated on June 15, 2000.

Typical setup
Details on the implementation of the
AVS Interface.
Animations
The following videos show what collaboration looks like from the point of view of a third user. They were generated by simulating that user's view with a tracked video camera and overlaying the video footage in real time with Studierstube-rendered graphics. The graphics were rendered in real time on an SGI O2 with an R5000 and secondary cache; the visualization server was an SGI Indigo2 with an R10000.
Two users collaborating:
Starting a streamsurface on the RTorus, as seen from a third user.
Quicktime movie 2.6MB
Main input device is the Personal Interaction Panel (PIP), an augmented prop.
Quicktime movie 2.5MB
This happens when you put your head through an object. Note how the clipping that occurs in front of the viewpoint can be used to take a look inside complex structures.
Quicktime movie 1MB
If you have any comments, please send a message to fuhrmann@cg.tuwien.ac.at.