HCI Deep Dives

HCI Deep Dives is your go-to podcast for exploring the latest trends, research, and innovations in Human-Computer Interaction (HCI). Each episode is AI-generated from recent publications in the field and offers an in-depth discussion of topics like wearable computing, augmented perception, cognitive augmentation, and digitalized emotions. Whether you're a researcher, a practitioner, or simply curious about the intersection of technology and human senses, this podcast delivers thought-provoking insights and ideas to keep you at the forefront of HCI.

Listen on:

  • Apple Podcasts
  • YouTube
  • Podbean App
  • Spotify
  • Amazon Music
  • iHeartRadio
  • PlayerFM
  • Podchaser
  • BoomPlay

Episodes

Tuesday Oct 22, 2024

Junlei Hong, Tobias Langlotz, Jonathan Sutton, and Holger Regenbrecht. 2024. Visual Noise Cancellation: Exploring Visual Discomfort and Opportunities for Vision Augmentations. ACM Trans. Comput.-Hum. Interact. 31, 2, Article 22 (April 2024), 26 pages. https://doi.org/10.1145/3634699
Acoustic noise control or cancellation (ANC) is a commonplace component of modern audio headphones. ANC aims to actively mitigate disturbing environmental noise for a quieter and improved listening experience, and works by digitally controlling the frequency and amplitude characteristics of sound. Much less explored are visual noise and active visual noise control, which we address here. We first explore visual noise and scenarios in which visual noise arises, based on findings from four workshops we conducted. We then introduce the concept of visual noise cancellation (VNC) and how it can be used to reduce the identified effects of visual noise. In addition, we developed head-worn demonstration prototypes to practically explore the concept of active VNC with selected scenarios in a user study. Finally, we discuss the application of VNC, including vision augmentations that moderate the user's view of the environment to address perceptual needs and to provide augmented reality content.
https://dl.acm.org/doi/10.1145/3634699
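
To ground the analogy the paper builds on, here is a minimal numpy sketch of the anti-phase principle behind acoustic noise cancellation; the tone frequency and amplitude are arbitrary illustration values, not taken from the paper.

```python
import numpy as np

# Simulate 0.1 s of a disturbing 200 Hz "noise" tone at 16 kHz.
fs = 16_000
t = np.arange(0, 0.1, 1 / fs)
noise = 0.8 * np.sin(2 * np.pi * 200 * t)

# Classic ANC: emit the same signal with inverted phase (frequency and
# amplitude matched digitally), so the two waves cancel each other out.
anti_noise = -noise
residual = noise + anti_noise

print(f"residual RMS: {np.sqrt(np.mean(residual**2)):.6f}")  # 0 in this idealized case
```

VNC carries the same measure-then-counteract idea into the visual domain via head-worn displays.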

Monday Oct 21, 2024

Tim Duente, Dennis Stanke, Moritz Klose, Benjamin Simon, Ibraheem Al-Azzawi, and Michael Rohs. 2024. Shock Me The Way: Directional Electrotactile Feedback under the Smartwatch as a Navigation Aid for Cyclists. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 274 (September 2024), 25 pages. https://doi.org/10.1145/3676521
Cycling navigation is a complex and stressful task as the cyclist needs to focus simultaneously on the navigation, the road, and other road users. We propose directional electrotactile feedback at the wrist to reduce the auditory and visual load during navigation-aided cycling. We designed a custom electrotactile grid with 9 electrodes that is clipped under a smartwatch. In a preliminary study we identified suitable calibration settings and gained first insights about a suitable electrode layout. In a subsequent laboratory study we showed that a direction can be encoded with a mean error of 19.28° (σ = 42.77°) by combining 2 adjacent electrodes. Additionally, by interpolating with 3 electrodes a direction can be conveyed with a similar mean error of 22.54° (σ = 43.57°). We evaluated our concept of directional electrotactile feedback for cyclists in an outdoor study, in which 98.8% of all junctions were taken correctly by eight study participants. Only one participant deviated substantially from the optimal path, but was successfully navigated back to the original route by our system.
https://dl.acm.org/doi/10.1145/3676521
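
As a rough illustration of the direction-encoding idea, here is a minimal sketch of linear amplitude interpolation between two adjacent electrodes on a ring; the eight-electrode ring layout and the linear weighting are simplifying assumptions, not the paper's 9-electrode grid or its calibration.

```python
import numpy as np

def direction_to_weights(angle_deg: float, n_electrodes: int = 8) -> np.ndarray:
    """Map a target bearing to intensities of the two adjacent electrodes
    on a ring, using linear amplitude interpolation."""
    spacing = 360.0 / n_electrodes
    pos = (angle_deg % 360.0) / spacing      # fractional electrode index
    lower = int(np.floor(pos)) % n_electrodes
    upper = (lower + 1) % n_electrodes
    frac = pos - np.floor(pos)
    weights = np.zeros(n_electrodes)
    weights[lower] = 1.0 - frac              # closer electrode gets more current
    weights[upper] += frac
    return weights

print(direction_to_weights(22.5))  # midway between electrodes: both at 0.5
print(direction_to_weights(90.0))  # exactly on an electrode: single channel
```

The resulting weights would modulate per-electrode stimulation intensity, with per-wearer calibration as the paper's preliminary study suggests.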

Sunday Oct 20, 2024

Steeven Villa, Yannick Weiss, Niklas Hirsch, and Alexander Wiethoff. 2024. An Examination of Ultrasound Mid-air Haptics for Enhanced Material and Temperature Perception in Virtual Environments. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 243 (September 2024), 21 pages. https://doi.org/10.1145/3676488
Rendering realistic tactile sensations of virtual objects remains a challenge in VR. While haptic interfaces have advanced, particularly with phased arrays, their ability to create realistic object properties like state and temperature remains unclear. This study investigates the potential of Ultrasound Mid-air Haptics (UMH) for enhancing the perceived congruency of virtual objects. In a user study with 30 participants, we assessed how UMH impacts the perceived material state and temperature of virtual objects. We also analyzed EEG data to understand how participants integrate UMH information physiologically. Our results reveal that UMH significantly enhances the perceived congruency of virtual objects, particularly for solid objects, reducing the feeling of mismatch between visual and tactile feedback. Additionally, UMH consistently increases the perceived temperature of virtual objects. These findings offer valuable insights for haptic designers, demonstrating UMH's potential for creating more immersive tactile experiences in VR by addressing key limitations in current haptic technologies.
https://dl.acm.org/doi/10.1145/3676488

Sunday Oct 20, 2024

The use of wearable sensor technology opens up exciting avenues for both art and HCI research, providing new ways to explore the invisible link between audience and performer. To be effective, such work requires close collaboration between performers and researchers. In this article, we report on the co-design process and research insights from our work integrating physiological sensing and live performance. We explore the connection between the audience's physiological data and their experience during the performance, analyzing a multi-modal dataset collected from 98 audience members. We identify notable moments based on heart rate variability (HRV) and electrodermal activity (EDA), and show how the audience's physiological responses can be linked to the choreography. The longitudinal changes in HRV features suggest a strong connection to the choreographer's intended narrative arc, while EDA features appear to correspond with short-term audience responses to dramatic moments. We discuss the physiological phenomena and implications for designing feedback systems and interdisciplinary collaborations.
https://dl.acm.org/doi/10.1145/3557887
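
For readers unfamiliar with the features mentioned, below is a minimal sketch of RMSSD, a standard short-term HRV feature computed from inter-beat (RR) intervals; the sample intervals are invented for illustration, and the study's exact feature set is not reproduced here.

```python
import numpy as np

def rmssd(rr_ms: np.ndarray) -> float:
    """Root mean square of successive differences of RR intervals (ms),
    a standard short-term heart rate variability feature."""
    diffs = np.diff(rr_ms)
    return float(np.sqrt(np.mean(diffs ** 2)))

# Illustrative inter-beat intervals in milliseconds (not study data).
rr = np.array([812, 798, 830, 805, 821, 809], dtype=float)
print(f"RMSSD: {rmssd(rr):.1f} ms")
```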

Sunday Oct 20, 2024

Daniel Geißler, Hymalai Bello, Esther Zahn, Emil Woop, Bo Zhou, Paul Lukowicz, and Jakob Karolus. 2024. Head 'n Shoulder: Gesture-Driven Biking Through Capacitive Sensing Garments to Innovate Hands-Free Interaction. Proc. ACM Hum.-Comput. Interact. 8, MHCI, Article 265 (September 2024), 20 pages. https://doi.org/10.1145/3676510
Distractions caused by digital devices increasingly create dangerous situations on the road, particularly for more vulnerable road users like cyclists. While researchers have been exploring ways to enable richer interaction scenarios on the bike, safety is frequently neglected and compromised. In this work, we propose Head 'n Shoulder, a gesture-driven approach to bike interaction that does not affect bike control, based on a wearable garment that allows hands- and eyes-free interaction with digital devices through integrated capacitive sensors. The final iteration achieves an average accuracy of 97%, evaluated with 14 participants. Head 'n Shoulder does not rely on direct pressure sensing, so users can wear their everyday garments on top or underneath without affecting recognition accuracy. Our work introduces a promising research direction: easily deployable smart garments with a minimal set of gestures suited for most bike interaction scenarios, sustaining the rider's comfort and safety.
https://dl.acm.org/doi/10.1145/3676510
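
As a sketch of how such a garment's signals might be classified, the toy pipeline below extracts simple per-channel statistics from windowed capacitive data and trains an off-the-shelf classifier; the channel count, window length, gesture set, and classifier choice are all assumptions for illustration, not the authors' recognition pipeline.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

RNG = np.random.default_rng(0)
N_CHANNELS, WIN = 4, 50   # assumed: 4 capacitive channels, 50-sample windows

def features(window: np.ndarray) -> np.ndarray:
    """Per-channel mean, std, and peak-to-peak of one capacitance window."""
    return np.concatenate([window.mean(0), window.std(0), np.ptp(window, 0)])

# Placeholder training data; a real system would record labeled windows
# from the garment while the wearer performs each gesture.
X = np.stack([features(RNG.normal(size=(WIN, N_CHANNELS))) for _ in range(200)])
y = RNG.integers(0, 3, size=200)   # e.g., head-left / head-right / shoulder

clf = RandomForestClassifier(n_estimators=50).fit(X, y)
print(clf.predict(X[:5]))
```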

Sunday Oct 13, 2024

Today we deep dive into the ISWC 2024 Best Paper Award winner.
 
Tactile feedback mechanisms enhance the user experience of modern wearables by stimulating the sense of touch and enabling intuitive interactions. Electro-tactile stimulation-based tactile interfaces stand out due to their compact form factor and ability to deliver localized tactile sensations. Integrating force sensing with electro-tactile stimulation creates more responsive bidirectional systems that are beneficial in applications requiring precise control and feedback. However, current research often relies on separate sensors for force sensing, increasing system complexity and raising challenges in system scalability. We propose a novel approach that utilizes 3D-printed modified surfaces as the electro-tactile electrode interface to sense applied force and deliver feedback simultaneously without additional sensors. This method simplifies the system, maintains flexibility, and leverages the rapid prototyping capabilities of 3D printing. The functionality of this approach is validated through a user study (N=10), and two practical applications are proposed, both incorporating simultaneous sensing and tactile feedback.
https://dl.acm.org/doi/10.1145/3675095.3676612
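
One plausible reading of "sensing and feedback through the same electrode" is that applied force changes the contact interface, which can be read out electrically. The sketch below is entirely hypothetical: the calibration values are invented, and it does not reproduce the paper's actual transduction mechanism.

```python
import numpy as np

# Hypothetical calibration pairs: electrode contact resistance (kΩ)
# measured at known applied forces (N). Values are illustrative only.
cal_resistance_kohm = np.array([120.0, 95.0, 74.0, 60.0, 52.0])
cal_force_n = np.array([0.0, 0.5, 1.0, 1.5, 2.0])

def estimate_force(resistance_kohm: float) -> float:
    """Interpolate applied force from contact resistance: pressing the
    printed surface increases contact area, lowering resistance."""
    return float(np.interp(resistance_kohm,
                           cal_resistance_kohm[::-1], cal_force_n[::-1]))

def stimulation_amplitude(force_n: float, max_ma: float = 2.0) -> float:
    """Scale stimulation current with estimated force (clipped at max)."""
    return min(max_ma, max_ma * force_n / 2.0)

f = estimate_force(80.0)
print(f"force ≈ {f:.2f} N -> stimulate at {stimulation_amplitude(f):.2f} mA")
```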

Sunday Oct 13, 2024

Today we deep dive into an ISWC 2024 Honorable Mention.
Self-recording eating behaviors is a step towards a healthy lifestyle recommended by many health professionals. However, the current practice of manually recording eating activities using paper records or smartphone apps is often unsustainable and inaccurate. Smart glasses have emerged as a promising wearable form factor for tracking eating behaviors, but existing systems primarily identify when eating occurs without capturing details of the eating activities (e.g., what is being eaten). In this paper, we present EchoGuide, an application and system pipeline that leverages low-power active acoustic sensing to guide head-mounted cameras to capture egocentric videos, enabling efficient and detailed analysis of eating activities. By combining active acoustic sensing for eating detection with video captioning models and large-scale language models for retrieval augmentation, EchoGuide intelligently clips and analyzes videos to create concise, relevant activity records on eating. We evaluated EchoGuide with 9 participants in naturalistic settings involving eating activities, demonstrating high-quality summarization and significant reductions in the video data needed, paving the way for practical, scalable eating activity tracking.
https://dl.acm.org/doi/10.1145/3675095.3676611
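
Below is a high-level sketch of the pipeline stages the abstract describes, with stand-in functions; none of the names, thresholds, or data structures come from EchoGuide itself.

```python
from dataclasses import dataclass

@dataclass
class Clip:
    start_s: float
    end_s: float

def detect_eating(acoustic_frame: dict) -> bool:
    """Stand-in for the low-power active acoustic eating detector."""
    return acoustic_frame.get("chewing_score", 0.0) > 0.5   # assumed threshold

def caption(clip: Clip) -> str:
    """Stand-in for a video captioning model run on the camera clip."""
    return f"eating activity between {clip.start_s:.0f}s and {clip.end_s:.0f}s"

def summarize(captions: list) -> str:
    """Stand-in for LLM-based retrieval augmentation / summarization."""
    return " | ".join(captions)

log = []
for t, frame in enumerate([{"chewing_score": 0.9}, {"chewing_score": 0.1}]):
    if detect_eating(frame):                  # 1. acoustic sensing gates...
        log.append(caption(Clip(t, t + 30)))  # 2. ...camera capture + captioning
print(summarize(log))                         # 3. concise record of the eating activity
```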

Sunday Oct 13, 2024

Today we deep dive into an ISWC 2024 Honorable Mention.
We present RetailOpt, a novel opt-in, easy-to-deploy system for tracking customer movements offline in indoor retail environments. The system uses readily accessible information from customer smartphones and retail apps, including motion data, store maps, and purchase records. This eliminates the need to install or maintain additional hardware and ensures customers retain full control of their data. Specifically, RetailOpt first uses inertial navigation to recover relative trajectories from smartphone motion data. The store map and purchase records are then cross-referenced to identify a list of visited shelves, providing anchors that localize the relative trajectories in the store through continuous and discrete optimization. We demonstrate the effectiveness of our system in five diverse environments. The system, if successful, would produce accurate customer movement data, essential for a broad range of retail applications including customer behavior analysis and in-store navigation.
https://dl.acm.org/doi/pdf/10.1145/3675095.3676623
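
As a simplified stand-in for the trajectory-anchoring step, the sketch below rigidly aligns a relative trajectory onto known shelf positions with a 2D least-squares (Kabsch) fit; the paper's actual continuous and discrete optimization is richer, and the coordinates here are invented.

```python
import numpy as np

def align_2d(traj_pts: np.ndarray, anchor_pts: np.ndarray):
    """Least-squares rigid alignment (rotation + translation) of sampled
    trajectory points onto known shelf anchors (2D Kabsch). A simplified
    stand-in for the paper's continuous/discrete optimization."""
    mu_t, mu_a = traj_pts.mean(0), anchor_pts.mean(0)
    H = (traj_pts - mu_t).T @ (anchor_pts - mu_a)
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T
    if np.linalg.det(R) < 0:   # keep a proper rotation (no reflection)
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = mu_a - R @ mu_t
    return R, t

# Illustrative data: relative trajectory believed to pass three shelves.
traj = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0]])
shelves = np.array([[2.0, 3.0], [2.0, 4.0], [1.0, 4.0]])  # store-map coordinates
R, t = align_2d(traj, shelves)
print((traj @ R.T) + t)   # trajectory localized in the store map
```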

Saturday Oct 12, 2024

Today we deep dive into a publication that received a UbiComp 2024 Distinguished Paper Award.
Applying customized epidermal electronics closely onto the human skin offers the potential for biometric sensing and unique, always-available on-skin interactions. However, iterating designs of an on-skin interface from schematics to physical circuit wiring can be time-consuming, even with tiny modifications; it is also challenging to preserve skin wearability after repeated alteration. We present SkinLink, a reconfigurable on-skin fabrication approach that allows users to intuitively explore and experiment with the circuitry adjustment on the body. We demonstrate SkinLink with a customized on-skin prototyping toolkit comprising tiny distributed circuit modules and a variety of streamlined trace modules that adapt to diverse body surfaces. To evaluate SkinLink's performance, we conducted a 14-participant usability study to compare and contrast the workflows with a benchmark on-skin construction toolkit. Four case studies targeting a film makeup artist, two beauty makeup artists, and a wearable computing designer further demonstrate different application scenarios and usages.
https://dl.acm.org/doi/10.1145/3596241

Saturday Oct 12, 2024

Today we deep dive into a publication that received a UbiComp 2024 Distinguished Paper Award.
We present MoCaPose, a novel wearable motion capture (MoCap) approach that continuously tracks the dynamic poses of the wearer's upper body through multi-channel capacitive sensing integrated into fashionable, loose-fitting jackets. Unlike conventional wearable IMU MoCap based on inverse dynamics, MoCaPose decouples the sensor position from the pose system. MoCaPose uses a deep regressor to continuously predict the 3D upper-body joint coordinates from 16-channel textile capacitive sensors, unbound by specific applications. The concept was implemented through two prototyping iterations to first solve the technical challenges, then establish the textile integration through fashion-technology co-design towards a design-centric smart garment. A 38-hour dataset of synchronized video and capacitive data from 21 participants was recorded for validation. The motion tracking result was validated on multiple levels, from statistics (R² ≈ 0.91) and motion tracking metrics (MPJPE ≈ 86 mm) to usability in pose and motion recognition (0.9 F1 for 10-class classification with unsupervised class discovery). The design guidelines impose few technical constraints, allowing the wearable system to be design-centric and use-case-specific. Overall, MoCaPose demonstrates that textile-based capacitive sensing, with its unique advantages, can be a promising alternative for wearable motion tracking and other relevant wearable motion recognition applications.
https://dl.acm.org/doi/10.1145/3580883
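
MPJPE (mean per-joint position error) is a standard pose-tracking metric; the sketch below shows how such a figure is computed. The arrays are synthetic rather than the MoCaPose dataset, and the 14-joint count is an assumption.

```python
import numpy as np

def mpjpe(pred: np.ndarray, gt: np.ndarray) -> float:
    """Mean per-joint position error: average Euclidean distance between
    predicted and ground-truth 3D joint coordinates, here in millimetres.
    Shapes: (frames, joints, 3)."""
    return float(np.linalg.norm(pred - gt, axis=-1).mean())

rng = np.random.default_rng(1)
gt = rng.normal(size=(100, 14, 3)) * 500      # synthetic poses, mm
pred = gt + rng.normal(size=gt.shape) * 50    # simulated regressor error, mm
print(f"MPJPE: {mpjpe(pred, gt):.1f} mm")
```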
