Creating and evaluating embodied interactive experiences: case studies of full-body, sonic and tactile enaction.

School of Science | Doctoral thesis (article-based) | Defence date: 2015-12-11
Date
2015
Language
en
Pages
87 + app. 73
Series
Aalto University publication series DOCTORAL DISSERTATIONS, 205/2015
Abstract
This thesis contributes to the field of embodied and multimodal interaction by presenting the development of several original interactive systems. Using a constructive approach, a variety of real-time user interaction situations were designed and tested: two cases of bodily interaction between a human and a virtual character, two interactive sonifications of trampoline jumping, collaborative interaction in mobile music performance, and tangible and tactile interaction with virtual sounds. While diverse in application, all the explored interaction techniques belong to the context of augmentation and are grounded in the theory of embodiment and in strategies for natural human-computer interaction (HCI). The cases are contextualized under the umbrella of enaction, a paradigm of cognitive science that treats the user as an embodied agent situated in an environment and coupled to it through sensorimotor activity. This activity of sensing and action is studied through different modalities, auditory, tactile and visual, and combinations of these. The designed applications aim at natural interaction with the system, being full-body, tangible and spatially aware. Sonic interaction in particular is explored in the contexts of music creation, sports and auditory display.

These technology-mediated scenarios are evaluated in order to understand what the adopted interaction techniques bring to the user experience and how they modify impressions and enjoyment. The publications also discuss the enabling technologies used in the development, including motion tracking and custom-programmed hardware for the tactile-sonic and tangible-sonic interaction. Results show that combining full-body interaction with auditory augmentation and sonic interaction can modify perception, observed behavior and emotion during the experience. Combining spatial interaction with tangible interaction or tactile feedback provides a multimodal experience of exploring a mixed-reality environment in which audio can be accessed and manipulated through natural interaction. Embodied and spatial interaction brings playfulness to mobile music improvisation, shifting the focus of the experience from music-making towards movement-based gaming. Finally, two novel implementations of full-body interaction based on the enactive paradigm are presented. In these scenarios of enaction the participant is motion-tracked and a virtual character, rendered as a stick figure, is displayed on a screen in front of her. Results from the user studies show that involvement of the body is crucial for understanding the behavior of a virtual character or of a digital representation of the self in a gaming scenario.
Description
Supervising professor
Takala, Tapio, Prof., Aalto University, Department of Computer Science, Finland
Thesis advisor
Takala, Tapio, Prof., Aalto University, Department of Computer Science, Finland
Keywords
embodied interaction, enaction, multimodal, sonic interaction, full-body, audio augmented reality, avatars
Other note
Parts
  • [Publication 1]: Roberto Pugliese, Klaus Lehtonen. A framework for motion based bodily enaction with virtual characters. In 11th International Conference on Intelligent Virtual Agents (IVA 2011), Reykjavik, Iceland, Lecture Notes in Computer Science, Volume 6895, pp. 162-168, September 2011.
    DOI: 10.1007/978-3-642-23974-8_18
  • [Publication 2]: Roberto Pugliese, Archontis Politis, Tapio Takala. Spatial rendering of audio-tactile feedback for exploration and object interaction in virtual environments. In Proceedings of the 9th Sound and Music Computing Conference, Copenhagen, Denmark, pp. 241-248, July 2012.
  • [Publication 3]: Roberto Pugliese, Koray Tahiroğlu, Callum Goddard, James Nesfield. A qualitative evaluation of augmented human-human interaction in mobile group improvisation. In Proceedings of the International Conference on New Interfaces for Musical Expression, University of Michigan, Ann Arbor, May 2012.
  • [Publication 4]: Luca Turchet, Roberto Pugliese, Tapio Takala. Physically based sound synthesis and control of jumping sounds on an elastic trampoline. In Proceedings of ISon 2013, 4th Interactive Sonification Workshop, Fraunhofer IIS, Erlangen, Germany, pp. 87-94, December 2013.
  • [Publication 5]: Roberto Pugliese, Archontis Politis, Tapio Takala. ATSI: augmented and tangible sonic interaction. In Proceedings of the Ninth International Conference on Tangible, Embedded, and Embodied Interaction, Stanford, California, USA, ACM, pp. 97-104, January 2015.
    DOI: 10.1145/2677199.2680550
  • [Publication 6]: Roberto Pugliese, Tapio Takala. Sonic trampoline: how audio feedback impacts the user’s experience of jumping. IEEE MultiMedia, Volume 22, no. 1, pp. 74-79, January-March 2015.
    DOI: 10.1109/MMUL.2015.13
  • [Publication 7]: Roberto Pugliese, Klaus Förger, Tapio Takala. Game experience when controlling a weak avatar in full-body enaction. In 15th International Conference on Intelligent Virtual Agents (IVA 2015), Delft, Netherlands, Lecture Notes in Computer Science, Volume 9238, pp. 418-431, August 2015.
    DOI: 10.1007/978-3-319-21996-7_45