HCI Deep Dives

HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human-Computer Interaction (HCI). AI-generated from the latest publications in the field, each episode offers an in-depth discussion of topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you’re a researcher, a practitioner, or simply curious about the intersection of technology and the human senses, this podcast delivers thought-provoking insights and ideas to keep you at the forefront of HCI.

Listen on:

  • Apple Podcasts
  • YouTube
  • Podbean App
  • Spotify
  • Amazon Music
  • iHeartRadio
  • PlayerFM
  • Podchaser
  • BoomPlay

Episodes

Friday Oct 25, 2024

Akifumi Takahashi, Yudai Tanaka, Archit Tamhane, Alan Shen, Shan-Yuan Teng, and Pedro Lopes. 2024. Can a Smartwatch Move Your Fingers? Compact and Practical Electrical Muscle Stimulation in a Smartwatch. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 2, 1–15. https://doi.org/10.1145/3654777.3676373
Smartwatches gained popularity in the mainstream, making them today’s de-facto wearables. Despite advancements in sensing, haptics on smartwatches is still restricted to tactile feedback (e.g., vibration). Most smartwatch-sized actuators cannot render strong force-feedback. Simultaneously, electrical muscle stimulation (EMS) promises compact force-feedback but, to actuate fingers, requires users to wear many electrodes on their forearms. While forearm electrodes provide good accuracy, they detract EMS from being a practical force-feedback interface. To address this, we propose moving the electrodes to the wrist—conveniently packing them in the backside of a smartwatch. In our first study, we found that by cross-sectionally stimulating the wrist in 1,728 trials, we can actuate thumb extension, index extension & flexion, middle flexion, pinky flexion, and wrist flexion. Following, we engineered a compact EMS that integrates directly into a smartwatch’s wristband (with a custom stimulator, electrodes, demultiplexers, and communication). In our second study, we found that participants could calibrate our device by themselves ~50% faster than with conventional EMS. Furthermore, all participants preferred the experience of this device, especially for its social acceptability & practicality. We believe that our approach opens new applications for smartwatch-based interactions, such as haptic assistance during everyday tasks.
https://dl.acm.org/doi/10.1145/3654777.3676373

Friday Oct 25, 2024

Md Touhidul Islam, Noushad Sojib, Imran Kabir, Ashiqur Rahman Amit, Mohammad Ruhul Amin, and Syed Masum Billah. 2024. Wheeler: A Three-Wheeled Input Device for Usable, Efficient, and Versatile Non-Visual Interaction. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 31, 1–20. https://doi.org/10.1145/3654777.3676396
Blind users rely on keyboards and assistive technologies like screen readers to interact with user interface (UI) elements. In modern applications with complex UI hierarchies, navigating to different UI elements poses a significant accessibility challenge. Users must listen to screen reader audio descriptions and press relevant keyboard keys one at a time. This paper introduces Wheeler, a novel three-wheeled, mouse-shaped stationary input device, to address this issue. Informed by participatory sessions, Wheeler enables blind users to navigate up to three hierarchical levels in an app independently using three wheels instead of navigating just one level at a time using a keyboard. The three wheels also offer versatility, allowing users to repurpose them for other tasks, such as 2D cursor manipulation. A study with 12 blind users indicates a significant reduction (40%) in navigation time compared to using a keyboard. Further, a diary study with our blind co-author highlights Wheeler’s additional benefits, such as accessing UI elements with partial metadata and facilitating mixed-ability collaboration.
https://dl.acm.org/doi/10.1145/3654777.3676396

Friday Oct 25, 2024

Shwetha Rajaram, Nels Numan, Balasaravanan Thoravi Kumaravel, Nicolai Marquardt, and Andrew D Wilson. 2024. BlendScape: Enabling End-User Customization of Video-Conferencing Environments through Generative AI. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 40, 1–19. https://doi.org/10.1145/3654777.3676326
Today’s video-conferencing tools support a rich range of professional and social activities, but their generic meeting environments cannot be dynamically adapted to align with distributed collaborators’ needs. To enable end-user customization, we developed BlendScape, a rendering and composition system for video-conferencing participants to tailor environments to their meeting context by leveraging AI image generation techniques. BlendScape supports flexible representations of task spaces by blending users’ physical or digital backgrounds into unified environments and implements multimodal interaction techniques to steer the generation. Through an exploratory study with 15 end-users, we investigated whether and how they would find value in using generative AI to customize video-conferencing environments. Participants envisioned using a system like BlendScape to facilitate collaborative activities in the future, but required further controls to mitigate distracting or unrealistic visual elements. We implemented scenarios to demonstrate BlendScape’s expressiveness for supporting environment design strategies from prior work and propose composition techniques to improve the quality of environments.
https://dl.acm.org/doi/10.1145/3654777.3676326

Thursday Oct 24, 2024

Ximing Shen, Youichi Kamiyama, Kouta Minamizawa, and Jun Nishida. 2024. DexteriSync: A Hand Thermal I/O Exoskeleton for Morphing Finger Dexterity Experience. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 102, 1–12. https://doi.org/10.1145/3654777.3676422
Skin temperature is an important physiological factor for human hand dexterity. Leveraging this feature, we engineered an exoskeleton, called DexteriSync, that can dynamically adjust the user’s finger dexterity and induce different thermal perceptions by modulating finger skin temperature. This exoskeleton comprises flexible silicone-copper tube segments, 3D-printed finger sockets, a 3D-printed palm base, a pump system, and a water temperature control with a storage unit. By realising an embodied experience of compromised dexterity, DexteriSync can help product designers understand the lived experience of compromised hand dexterity, such as that of elderly and/or neurodivergent users, when designing daily necessities for them. We validated DexteriSync via a technical evaluation and two user studies, demonstrating that it can change skin temperature, dexterity, and thermal perception. An exploratory session with design students and an autistic individual with compromised dexterity demonstrated that the exoskeleton provided a more realistic experience than video education and allowed them to gain higher confidence in their designs. The results advocate for the efficacy of experiencing embodied compromised finger dexterity, which can promote an understanding of the related physical challenges and lead to more persuasive designs for assistive tools.
https://dl.acm.org/doi/10.1145/3654777.3676422

Wednesday Oct 23, 2024

Andreia Valente, Dajin Lee, Seungmoon Choi, Mark Billinghurst, and Augusto Esteves. 2024. Modulating Heart Activity and Task Performance using Haptic Heartbeat Feedback: A Study Across Four Body Placements. In Proceedings of the 37th Annual ACM Symposium on User Interface Software and Technology (UIST '24). Association for Computing Machinery, New York, NY, USA, Article 25, 1–13. https://doi.org/10.1145/3654777.3676435
This paper explores the impact of vibrotactile haptic feedback on heart activity when the feedback is provided at four different body locations (chest, wrist, neck, and ankle) and with two feedback rates (50 bpm and 110 bpm). A user study found that the neck placement resulted in higher heart rates and lower heart rate variability, and higher frequencies correlated with increased heart rates and decreased heart rate variability. The chest was preferred in self-reported metrics, and neck placement was perceived as less satisfying, harmonious, and immersive. This research contributes to understanding the interplay between psychological experiences and physiological responses when using haptic biofeedback resembling real body signals.
https://dl.acm.org/doi/10.1145/3654777.3676435

Tuesday Oct 22, 2024

Junlei Hong, Tobias Langlotz, Jonathan Sutton, and Holger Regenbrecht. 2024. Visual Noise Cancellation: Exploring Visual Discomfort and Opportunities for Vision Augmentations. ACM Trans. Comput.-Hum. Interact. 31, 2, Article 22 (April 2024), 26 pages. https://doi.org/10.1145/3634699
Acoustic noise control or cancellation (ANC) is a commonplace component of modern audio headphones. ANC aims to actively mitigate disturbing environmental noise for a quieter and improved listening experience, working by digitally controlling the frequency and amplitude characteristics of sound. Much less explored is visual noise and active visual noise control, which we address here. We first explore visual noise and scenarios in which visual noise arises based on findings from four workshops we conducted. We then introduce the concept of visual noise cancellation (VNC) and how it can be used to reduce identified effects of visual noise. In addition, we developed head-worn demonstration prototypes to practically explore the concept of active VNC with selected scenarios in a user study. Finally, we discuss the application of VNC, including vision augmentations that moderate the user’s view of the environment to address perceptual needs and to provide augmented reality content.
https://dl.acm.org/doi/10.1145/3634699

Monday Oct 21, 2024

Tim Duente, Dennis Stanke, Moritz Klose, Benjamin Simon, Ibraheem Al-Azzawi, and Michael Rohs. 2024. Shock Me The Way: Directional Electrotactile Feedback under the Smartwatch as a Navigation Aid for Cyclists. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 274 (September 2024), 25 pages. https://doi.org/10.1145/3676521
Cycling navigation is a complex and stressful task as the cyclist needs to focus simultaneously on the navigation, the road, and other road users. We propose directional electrotactile feedback at the wrist to reduce the auditory and visual load during navigation-aided cycling. We designed a custom electrotactile grid with 9 electrodes that is clipped under a smartwatch. In a preliminary study we identified suitable calibration settings and gained first insights about a suitable electrode layout. In a subsequent laboratory study we showed that a direction can be encoded with a mean error of 19.28° (σ = 42.77°) by combining 2 adjacent electrodes. Additionally, by interpolating with 3 electrodes a direction can be conveyed with a similar mean error of 22.54° (σ = 43.57°). We evaluated our concept of directional electrotactile feedback for cyclists in an outdoor study, in which 98.8% of all junctions were taken correctly by eight study participants. Only one participant deviated substantially from the optimal path, but was successfully navigated back to the original route by our system.
https://dl.acm.org/doi/10.1145/3676521
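The core idea of the paper above, conveying an arbitrary direction by splitting intensity between neighboring electrodes, can be illustrated with a minimal sketch. This assumes a simplified circular electrode layout and a linear interpolation scheme; the function name, the 8-electrode ring, and the weighting are hypothetical illustrations, not the authors' 9-electrode grid or calibration procedure.

```python
def electrode_intensities(angle_deg, n_electrodes=8):
    """Map a target direction (degrees) to per-electrode intensity weights
    by linearly interpolating between the two nearest electrodes on a
    hypothetical circular layout. Weights sum to 1."""
    step = 360.0 / n_electrodes
    angle = angle_deg % 360.0
    lower = int(angle // step) % n_electrodes       # nearest electrode below
    upper = (lower + 1) % n_electrodes              # nearest electrode above
    frac = (angle - lower * step) / step            # 0 → all lower, 1 → all upper
    intensities = [0.0] * n_electrodes
    intensities[lower] = 1.0 - frac
    intensities[upper] = frac
    return intensities
```

For example, a direction of 22.5° on an 8-electrode ring falls exactly between electrodes 0 and 1, so both receive half intensity; the paper's studies suggest such blending conveys directions with a mean error of roughly 20°.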

Sunday Oct 20, 2024

Steeven Villa, Yannick Weiss, Niklas Hirsch, and Alexander Wiethoff. 2024. An Examination of Ultrasound Mid-air Haptics for Enhanced Material and Temperature Perception in Virtual Environments. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 243 (September 2024), 21 pages. https://doi.org/10.1145/3676488
Rendering realistic tactile sensations of virtual objects remains a challenge in VR. While haptic interfaces have advanced, particularly with phased arrays, their ability to create realistic object properties like state and temperature remains unclear. This study investigates the potential of Ultrasound Mid-air Haptics (UMH) for enhancing the perceived congruency of virtual objects. In a user study with 30 participants, we assessed how UMH impacts the perceived material state and temperature of virtual objects. We also analyzed EEG data to understand how participants integrate UMH information physiologically. Our results reveal that UMH significantly enhances the perceived congruency of virtual objects, particularly for solid objects, reducing the feeling of mismatch between visual and tactile feedback. Additionally, UMH consistently increases the perceived temperature of virtual objects. These findings offer valuable insights for haptic designers, demonstrating UMH's potential for creating more immersive tactile experiences in VR by addressing key limitations in current haptic technologies.
https://dl.acm.org/doi/10.1145/3676488

Sunday Oct 20, 2024

The use of wearable sensor technology opens up exciting avenues for both art and HCI research, providing new ways to explore the invisible link between audience and performer. To be effective, such work requires close collaboration between performers and researchers. In this article, we report on the co-design process and research insights from our work integrating physiological sensing and live performance. We explore the connection between the audience’s physiological data and their experience during the performance, analyzing a multi-modal dataset collected from 98 audience members. We identify notable moments based on HRV and EDA, and show how the audience’s physiological responses can be linked to the choreography. The longitudinal changes in HRV features suggest a strong connection to the choreographer’s intended narrative arc, while EDA features appear to correspond with short-term audience responses to dramatic moments. We discuss the physiological phenomena and implications for designing feedback systems and interdisciplinary collaborations.
https://dl.acm.org/doi/10.1145/3557887
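The HRV features discussed above are typically computed from the intervals between successive heartbeats. As a minimal sketch, here is RMSSD, a standard time-domain HRV measure; this is a generic textbook formula, not the authors' actual analysis pipeline, and the input data is made up for illustration.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences (RMSSD) of RR intervals,
    in milliseconds: a common short-term heart rate variability feature.
    Requires at least two intervals."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))
```

A perfectly steady heartbeat yields an RMSSD of zero, while greater beat-to-beat variation (often associated with relaxation) yields larger values.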

Sunday Oct 20, 2024

Daniel Geißler, Hymalai Bello, Esther Zahn, Emil Woop, Bo Zhou, Paul Lukowicz, and Jakob Karolus. 2024. Head 'n Shoulder: Gesture-Driven Biking Through Capacitive Sensing Garments to Innovate Hands-Free Interaction. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 265 (September 2024), 20 pages. https://doi.org/10.1145/3676510
Distractions caused by digital devices are increasingly causing dangerous situations on the road, particularly for more vulnerable road users like cyclists. While researchers have been exploring ways to enable richer interaction scenarios on the bike, safety concerns are frequently neglected and compromised. In this work, we propose Head 'n Shoulder, a gesture-driven approach to bike interaction without affecting bike control, based on a wearable garment that allows hands- and eyes-free interaction with digital devices through integrated capacitive sensors. It achieves an average accuracy of 97% in the final iteration, evaluated on 14 participants. Head 'n Shoulder does not rely on direct pressure sensing, allowing users to wear their everyday garments on top or underneath, not affecting recognition accuracy. Our work introduces a promising research direction: easily deployable smart garments with a minimal set of gestures suited for most bike interaction scenarios, sustaining the rider's comfort and safety.
https://dl.acm.org/doi/10.1145/3676510

Copyright 2024 All rights reserved.
