PhD

Introduction

In recent decades, computational devices have evolved from simple calculators into pervasive artifacts with which we share most aspects of our lives, to the point that it is hard to imagine otherwise. Yet this change in the role of computers was not accompanied by a change in interaction: in most cases we still depend on screens, keyboards, and mice. These legacy interfaces are not only often inadequate for new tasks; they also preserve the separation between the digital and physical realms, which we now consider counterparts of a single reality.

During this PhD, we focused on dissolving the separation between the physical and the digital: first by extending the reach of digital tools into the physical environment, then by creating hybrid artifacts (physical-digital emulsions), and finally by supporting transitions between different mixed realities, increasing immersion only when needed.

The final objective of this work is to augment the experience of reality. This comprises supporting interaction not only with the external world, but also with the internal one. The following sections briefly describe our contributions towards that overarching goal.

Tangible Viewports: Extending the reach of screens

Desktop computers host powerful applications; a good example is Computer-Aided Design (CAD) software, which enables the creation of complex 3D models. When creating physical artifacts, 3D printing enables the materialization of these digital models, yet once printed, the physical artifacts are both static and disconnected from their digital versions. To address this, we brought the printed objects to life using Spatial Augmented Reality (SAR), and extended the reach of the mouse cursor onto the objects when they are placed in front of the screen. This enables the use of traditional desktop applications, such as Photoshop to change an object's appearance, or creative coding to give it dynamic behavior. Since they are augmented physical objects, it is also possible to freely manipulate and interact with them. More importantly, we noticed that as soon as objects could be placed in front of the screen, the display and the applications shown on it became part of the user's environment. This lets users choose the interaction modality that best suits the task at hand, instead of being constrained by an isolated interface.
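To give a flavor of what extending the cursor onto a physical object involves, the sketch below casts a ray from the viewpoint through the on-screen cursor and intersects it with an augmented object, here simplified to a sphere. This is a hypothetical illustration under assumed geometry (a shared 3D frame for eye, cursor, and object), not the actual implementation described in the paper.

```python
import math

def cursor_to_object(cursor, eye, sphere_center, radius):
    """Hypothetical sketch: cast a ray from the viewpoint through the
    on-screen cursor and intersect it with an augmented object (a sphere).
    Returns the 3D surface point hit, or None if the cursor misses it."""
    # Ray direction from the eye through the cursor position.
    d = [c - e for c, e in zip(cursor, eye)]
    norm = math.sqrt(sum(v * v for v in d))
    d = [v / norm for v in d]
    # Ray-sphere intersection: solve |eye + t*d - center|^2 = radius^2.
    oc = [e - s for e, s in zip(eye, sphere_center)]
    b = 2 * sum(di * oi for di, oi in zip(d, oc))
    c = sum(oi * oi for oi in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None  # the cursor is not over the object
    t = (-b - math.sqrt(disc)) / 2  # nearest intersection along the ray
    return [e + t * di for e, di in zip(eye, d)]

# A cursor aligned with the object's center hits its near surface;
# a cursor off to the side misses and stays a regular screen cursor.
hit = cursor_to_object((0, 0, 1), (0, 0, 0), (0, 0, 5), 1)
miss = cursor_to_object((0, 1, 1), (0, 0, 0), (0, 0, 5), 1)
```

In the real system the surface point would then be used to apply the desktop application's action (e.g., painting) at that location on the object.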

This work was published and presented at TEI’16 [1].


Tangible Viewports: Getting Out of Flatland in Desktop Environments

Complementing mixed reality modalities

Using Augmented Reality (AR) to complement the experience of the physical world is a promising approach, yet different AR technologies have different strengths and limitations. As a consequence, we face a trade-off when choosing a given technology. For instance, Spatial Augmented Reality (SAR) enables the environment to be augmented without instrumenting the user, at the cost of being constrained by the properties of the available physical surfaces. In contrast, Head-Mounted Displays for Virtual Reality (HMD-VR) provide total control over what the user sees, but they isolate users from their environment. In this work, we complemented SAR with HMD-VR, using both to interact with the same augmented scene and enabling seamless transitions between them. This lets users benefit from both natural interaction and virtual operation over the augmented space.

This work was published and presented at 3DUI’17 [2], where it received the Best Tech-note Award.


Towards a Hybrid Space Combining Spatial Augmented Reality and Virtual Reality

The work presented at 3DUI was a work-in-progress, and was later extended to support see-through devices and both direct and tool-based interaction, such as pen and paper. The full work explores how we can progressively increase the user's immersion to support additional digital tools, while keeping the interaction anchored in the physical space.

A paper describing the final version of the system was presented at UIST’17 [3].

One Reality: Augmenting How the Physical World is Experienced by Combining Multiple Mixed Reality Modalities

The combination of physical and virtual spaces was also explored in the context of asymmetric collaboration. Complex industrial settings, such as the aerospace industry, already use Virtual Reality to ease the conception and iteration of ideas. Yet the decisions involved are usually made by experts discussing in physical meeting rooms. Building on our previous work, we explored the possibility of bridging these two spaces to improve mutual awareness and communication. This proof of concept was presented at VRST’17 [6].

Towards Seamless Interaction between Physical and Virtual Locations for Asymmetric Collaboration

Inner Garden: Augmented artifacts for well-being

Just as augmented artifacts can extend the reach of the digital realm into the physical one, we can extend the reach of our internal processes into the environment. Inner Garden is a mixed reality sandbox influenced by the user's physiological activity. Created as a tangible meditation metaphor, Inner Garden enables users to construct a landscape with their bare hands, and then breathe life into it. For instance, the user's breathing is mapped to the sea waves, while the life and weather are influenced by their cardiac coherence (that is, a positive correlation between breathing and heart-rate fluctuations, associated with a relaxed state). The artifact was designed and iterated using both guidelines found in the literature and interviews with meditation experts and practitioners. Our takeaway from this process is that, when designed carefully, technology can support well-being.

The objective of Inner Garden was to explore the support of introspection and mindfulness through augmented artifacts. It was first presented as a work-in-progress at TEI’16 [4], and the full paper was accepted at CHI’17 [5] with an Honorable Mention (top 5%). Inner Garden and other Potioc projects were also used to promote the combination of Tangible User Interfaces and Spatial Augmented Reality to support introspection [7,8].


Inner Garden: Connecting Inner States to a Mixed Reality Sandbox for Mindfulness

Overview

The physical and digital realms are far richer and more powerful than what current interaction between them involves. We share the vision of a world where the physical and the digital, and humans and their environment, are not opposites, but counterparts of a unified reality. This thesis will discuss the aforementioned contributions, as well as other projects (currently in progress) towards this overarching goal.

Thesis Defense

The thesis defense took place on 15/12/2017 in Bordeaux, France.

Members of the Jury

President of the jury: Nicolas Roussel
Rapporteur: Anatole Lécuyer
Rapporteur: Eva Hornecker
Examiner: Jürgen Steimle
Examiner: Nadine Couture
Director: Martin Hachet

Manuscript

You can obtain the PhD manuscript HERE.

Indexed Publications

  1. Renaud Gervais, Joan Sol Roo, Martin Hachet. Tangible Viewports: Getting Out of Flatland in Desktop Environments. TEI’16, International Conference on Tangible, Embedded and Embodied Interaction, Eindhoven, Netherlands. February 2016.
  2. Joan Sol Roo, Martin Hachet. Towards a Hybrid Space Combining Spatial Augmented Reality and Virtual Reality. 3DUI’17, IEEE Symposium on 3D User Interfaces, California, USA. March 2017. (Best Tech-note Award)
  3. Joan Sol Roo, Martin Hachet. One Reality: Augmenting How the Physical World is Experienced by Combining Multiple Mixed Reality Modalities. UIST’17, ACM Symposium on User Interface Software and Technology, Quebec, Canada. October 2017.
  4. Joan Sol Roo, Renaud Gervais, Martin Hachet. Inner Garden: an Augmented Sandbox Designed for Self-Reflection. TEI’16, International Conference on Tangible, Embedded and Embodied Interaction, Eindhoven, Netherlands. February 2016. (Work in Progress)
  5. Joan Sol Roo, Renaud Gervais, Jérémy Frey, Martin Hachet. Inner Garden: Connecting Inner States to a Mixed Reality Sandbox for Mindfulness. CHI’17, ACM Conference on Human Factors in Computing Systems, Denver, USA. May 2017. https://chi2017.acm.org. (Best Paper Honorable Mention)
  6. Damien Clergeaud*, Joan Sol Roo*, Martin Hachet, Pascal Guitton. Towards Seamless Interaction between Physical and Virtual Locations for Asymmetric Collaboration. VRST’17, 23rd ACM Symposium on Virtual Reality Software and Technology, Gothenburg, Sweden. November 2017. *Shared first co-authorship.

Other Documents

  7. Renaud Gervais, Joan Sol Roo, Jérémy Frey, Martin Hachet. Introspectibles: Tangible Interaction to Foster Introspection. CHI’16 Computing and Mental Health Workshop, California, USA. May 2016. http://mentalhealth.media.mit.edu/2016-2/
  8. Joan Sol Roo, Renaud Gervais, Jérémy Frey, Martin Hachet. Augmented Human Experience: Spatial Augmented Reality and Physiological Computing. ETIS, European Tangible Interaction Studio Winter School, Fribourg, Switzerland. January 2016. http://www.etis.estia.fr/etis-2016.html