14.4.1.1 Overall Architecture
During the earlier stages of developing our mixed reality environments, the focus
was on the technology needed for working prototypes, balancing custom against
off-the-shelf hardware and software tools. Based on our experiences in the case
studies, we developed a more generic platform called the V2_ Game Engine (VGE).
We engineered the platform to enable artists and designers to create content for
mixed reality environments.
On a technical level, the platform allows flexible combinations of the large number
of components needed to realise mixed reality environments. We chose to work with
as many readily available software libraries and hardware components as possible,
enabling us to focus on the actual application within the scope of mixed reality.
Figure 14.1 provides an overview of all components that are part of VGE.
The basis of VGE is the Ogre real-time 3D rendering engine.⁹ We use Ogre to
create the virtual 3D images that we want to place in our mixed reality environ-
ment. Next to Ogre we use OpenAL, the Bullet physics engine, SDL and several
smaller libraries needed to control our sensors and cameras to create the final mixed
reality result. Most of these available software libraries are written in C or C++,
but in order to rapidly design programs and experiment with different aspects of
these libraries we used Scheme as a scripting language to effectively glue these
components together. Scheme wraps the various libraries and connects them in a
very flexible manner, making it possible to change most aspects of the environment
at run time. Not only parameters but also the actual routines being executed can
be changed at run time. This allows the engineers to rapidly test and prototype
all parts of the mixed reality system. Components that run on different computers
communicate using the Open Sound Control (OSC) protocol.
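
To illustrate this glue layer, here is a minimal sketch of a C++ host that embeds
a Scheme interpreter and looks up its per-frame routine by name on every call, so
that redefining the routine at run time takes effect on the very next frame. GNU
Guile, the routine name on-frame, and the redefinition trigger are assumptions
made for the example; the chapter does not specify which Scheme implementation
VGE uses.

    // Sketch only: assumes GNU Guile as the embedded Scheme.
    // Build with: g++ glue.cpp $(pkg-config --cflags --libs guile-3.0)
    #include <libguile.h>
    #include <cstdio>

    int main() {
        scm_init_guile();
        // Initial definition of a per-frame routine, supplied as a script.
        scm_c_eval_string("(define (on-frame t) (* t 2.0))");

        for (int frame = 0; frame < 3; ++frame) {
            // Look up the routine by name on every call, so a redefinition
            // received at run time (e.g. over a REPL socket) is picked up
            // immediately, without recompiling the host.
            SCM fn = scm_variable_ref(scm_c_lookup("on-frame"));
            SCM result = scm_call_1(fn, scm_from_double(frame));
            std::printf("frame %d -> %f\n", frame, scm_to_double(result));

            if (frame == 1)  // simulate an engineer redefining the routine live
                scm_c_eval_string("(define (on-frame t) (+ t 100.0))");
        }
    }
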
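The OSC wire format itself is simple: each message carries an address string and
a type-tag string, both NUL-padded to four-byte boundaries, followed by the
arguments in big-endian byte order. The sketch below encodes one such message by
hand; the address /vge/head/pose is a hypothetical example, not an address
documented for VGE.

    // Sketch of the OSC 1.0 message encoding; typically sent over UDP.
    #include <cstdint>
    #include <cstring>
    #include <string>
    #include <vector>

    // Append a string padded with NULs to a 4-byte boundary (OSC rule).
    static void pad_string(std::vector<uint8_t>& buf, const std::string& s) {
        buf.insert(buf.end(), s.begin(), s.end());
        size_t padded = ((s.size() + 1) + 3) & ~size_t(3);  // incl. terminator
        buf.insert(buf.end(), padded - s.size(), 0);
    }

    // Append a float32 in big-endian byte order, as OSC requires.
    static void append_float(std::vector<uint8_t>& buf, float f) {
        uint32_t bits;
        std::memcpy(&bits, &f, sizeof bits);
        for (int shift = 24; shift >= 0; shift -= 8)
            buf.push_back(uint8_t(bits >> shift));
    }

    // Encode one OSC message: address, type-tag string, float arguments.
    std::vector<uint8_t> osc_message(const std::string& address,
                                     const std::vector<float>& args) {
        std::vector<uint8_t> buf;
        pad_string(buf, address);
        pad_string(buf, "," + std::string(args.size(), 'f'));
        for (float f : args) append_float(buf, f);
        return buf;  // ready to hand to a UDP socket
    }

    // Example: report a (hypothetical) tracked head position.
    // auto packet = osc_message("/vge/head/pose", {x, y, z});
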
Artists and designers can use the Blender 3D authoring tool to model the geometry
of the real space and add virtual objects to it. The OSC-based metaserver
middleware makes it possible to feed these data into the VGE rendering engine in
real time, making them directly visible and audible to a user wearing a mixed
reality system, and vice versa: the measured position and orientation of the user
can be shown in real time in the modelling environment. Because VGE can be
controlled via the OSC protocol, it is relatively straightforward to control it
from other applications, notably the Max/MSP visual programming environment,
which is used extensively by artists and interaction designers.
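
On the receiving side, being controllable over OSC essentially means dispatching
incoming messages on their address. The following sketch shows that pattern; the
dispatcher type and the example address are hypothetical illustrations, not the
actual VGE address space.

    #include <functional>
    #include <map>
    #include <string>
    #include <vector>

    // Map OSC addresses to engine callbacks; an external tool such as
    // Max/MSP then drives the engine simply by sending messages to
    // these addresses. All names here are hypothetical.
    using Handler = std::function<void(const std::vector<float>&)>;

    struct OscDispatcher {
        std::map<std::string, Handler> handlers;

        void on(const std::string& address, Handler h) {
            handlers[address] = std::move(h);
        }

        // Called for every decoded message; unknown addresses are ignored.
        void dispatch(const std::string& address,
                      const std::vector<float>& args) const {
            auto it = handlers.find(address);
            if (it != handlers.end()) it->second(args);
        }
    };

    // Example wiring:
    // OscDispatcher osc;
    // osc.on("/vge/object/position", [](const std::vector<float>& a) {
    //     /* move the object to (a[0], a[1], a[2]) */
    // });
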
14.4.1.2 Perspectives
As is clear from the case studies, our main research focus is on the first-person
perspective, i.e. that of the player. But because all virtual content is absolutely
positioned with respect to the real world, it is relatively easy to set up a
third-person perspective.
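
A minimal sketch of what this takes, assuming the Ogre 1.x scene-graph API on
which VGE is built: since all content is placed in absolute world coordinates, a
third-person view is just a second camera and viewport onto the same scene. The
camera name and pose below are hypothetical.

    #include <Ogre.h>

    void addThirdPersonView(Ogre::SceneManager* scene,
                            Ogre::RenderWindow* window) {
        // The scene graph is shared; only the viewpoint differs.
        Ogre::Camera* cam = scene->createCamera("ThirdPersonCam");
        cam->setNearClipDistance(0.1f);

        Ogre::SceneNode* node =
            scene->getRootSceneNode()->createChildSceneNode();
        node->attachObject(cam);
        node->setPosition(0.0f, 5.0f, 10.0f);  // behind and above the player
        node->lookAt(Ogre::Vector3::ZERO, Ogre::Node::TS_WORLD);

        // A second viewport renders the same world from the new pose.
        window->addViewport(cam, /*ZOrder=*/1);
    }
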
⁹ http://www.ogre3d.org