26 June 2019, 15:30 – 18:00

Invited Talk

15:30 – 16:30

Eye Tracking in Mixed Reality and its Promises for Spatial Research

Sophie Stellmach (Senior Scientist @ Microsoft, HoloLens team)

Abstract: Mixed Reality headsets such as “HoloLens 2” and “Magic Leap” blend the virtual with the real world and provide tremendous new possibilities for users and researchers alike – especially given their integrated eye and hand tracking as well as spatial mapping capabilities. On the one hand, this provides users with entirely new ways to engage with both their virtual and real environments. On the other hand, it provides a rich and powerful toolset for researchers to, for example, investigate visual attention for spatial research in a much more complex space – whether that is navigating more efficiently through a virtual 2D or 3D map placed in your real environment, finding your way through an unknown building, or exploring new ways to investigate shared visual attention in a complex 3D space. As a member of the “HoloLens 2” team, I will talk about the possibilities and challenges that I see for Augmented and Mixed Reality headsets – in particular with respect to how spatial research can benefit in various ways from this innovative technology.

Sophie Stellmach is a Senior Scientist at Microsoft, where she explores entirely new ways to engage with and blend our virtual and physical realities in products such as Microsoft HoloLens. With a PhD from the Technical University of Dresden, Germany, and a strong research background in Human-Computer Interaction, she bridges the gap between software engineering and innovative UX and multimodal interaction design. Sophie has been an avid eye tracking researcher for over a decade, with work ranging from eye tracking for psychophysiological logging systems and novel ways to visualize visual attention to the multimodal gaze-supported interactions she explored in her PhD.

Paper presentations

16:30 – 16:50

GeoGCD: Improved Visual Search via Gaze-Contingent Display [ACM DL]

Kenan Bektaş (Zurich University of Applied Sciences Center for Aviation), Arzu Çöltekin (University of Applied Sciences and Arts Northwestern Switzerland, Institute for Interactive Technologies), Jens Krüger (University of Duisburg-Essen and University of Utah), Andrew T. Duchowski (Clemson University Visual Computing), Sara Irina Fabrikant (University of Zürich Department of Geography and Digital Society Initiative)

16:50 – 17:10

Eye gaze and head gaze in collaborative games [ACM DL]

Oleg Špakov (Tampere University), Howell Istance (Tampere University), Kari-Jouko Räihä (Tampere University), Tiia Viitanen (Tampere University of Applied Sciences), Harri Siirtola (Tampere University)

17:10 – 17:30

Attentional orienting in virtual reality using endogenous and exogenous cues in auditory and visual modalities [ACM DL]

Rébaï Soret (ISAE-SUPAERO, Université de Toulouse, France), Pom Charras (Université Paul Valéry-Montpellier III, France), Christophe Hurter (ENAC, Toulouse, France), Vsevolod Peysakhovich (ISAE-SUPAERO, Université de Toulouse, France)

17:30 – 17:50

POI-Track: Improving Map-Based Planning with Implicit POI Tracking [ACM DL]

Fabian Göbel (ETH Zurich), Peter Kiefer (ETH Zurich)

17:50 – 18:00, short paper

Gaze awareness improves collaboration efficiency in a physical collaborative assembly task [ACM DL]

Haofei Wang (The Hong Kong University of Science and Technology), Bertram Shi (The Hong Kong University of Science and Technology)