PUMA
Istituto di Scienza e Tecnologie dell'Informazione     
Chessa S., Barsocchi P., Lenzi S. INTERMEDIA - Software module for mobile mixed reality content environment. Interactive Media with personal networked devices. Deliverable D13.1, 2008.
 
 
Abstract (English)
The aim of this document is the definition of the software (and partially hardware) architecture of the Intermedia wearable mixed reality system.

2007: As a proof of concept, we developed Chloe@University, a mobile mixed reality guide system for indoor environments, as part of the ongoing INTERMEDIA demonstrators. A mobile computing device is hidden inside a jacket, and the user selects a destination inside a building through wearable input buttons on a sleeve. A 3D virtual assistant then appears in the see-through HMD and guides the user to his/her destination; the user simply follows the virtual guide. Chloe@University also suggests the most suitable virtual character based on user preferences and profiles. Depending on the user profile, different security levels and content authorizations are provided. For indoor location tracking, several sensor-based methods (WiFi, RFID and ZigBee) are integrated into the system to ensure maximum flexibility.

2008: The new client features a general restyling and a code rewrite, resulting in a more coherent design (graphically, architecturally and in software) and better responsiveness. The localization modules performed more robustly than in the previous year, user input was more intuitive and natural, 3D visualization was faster and more attractive, and new functionalities (such as the bookmark interface and the remote media controller) offered the user a wider set of options. The new jacket design is lighter and more comfortable than the previous year's, and the software/hardware interface is simpler and more intuitive to use. Interaction with other frameworks, such as the Bookmark system, has been extended: users can take a picture of a poster, a CD cover, etc. with their mobile phone and send it to a central server, which analyzes the input and provides specific feedback to the ChloeClient2008. In this way, a user can photograph a CD cover and have the corresponding MP3 files play back automatically through the client's integrated sound manager.
Users can also take a snapshot of a colleague's picture, and the ChloeClient2008 will automatically guide them to his/her office, adding a new destination to the list of available ones. None of these operations requires the user to interact directly with the client; they are performed transparently. The remainder of this document is organized as follows. Section 3 describes our envisioned scenario. Section 4 discusses the overall software architecture and the main modules integrated into the mobile mixed reality system, and then explains the details and current status of each software module. Section 5 presents the hardware specification. Sections 6 and 7 conclude the document with lessons learned from the development of the system and future work.
Subject: Dynamic Networking
Context-aware Network
QoS
C.2 COMPUTER-COMMUNICATION NETWORKS



 


For further information, contact: Librarian http://puma.isti.cnr.it
