SESSION 2.5.6 Roundtable. From Vibration to Visualization: Sensemaking within Multimodal Technologies
This panel discussion brings together scholars of STS, engineering, and media studies to explore how technologies are reshaping embodied experience across sensory domains. Panelists will examine developments in multisensory representation, from vibrotactile musical devices to screen-reader-friendly data visualizations, asking how haptic and audio technologies can facilitate new forms of sensory engagement. Topics will span historical and contemporary work in multimodal representation, including Paul Bach-y-Rita’s pioneering tactile-visual sensory substitution, the IMAGE project’s multimodal AI-powered displays, and Jeff Blum’s MIMIC device. The conversation will also address the fragmentation of haptic effects editing software and multisensory data representation, asking why attempts at standardization have struggled and what this means for the field. Insights from perceptual psychology and embodied cognition, such as attunement techniques in auditory and tactile perception (e.g., human echolocation, vibrational cueing in cane navigation), will provide broader context for the implications of these technologies. By merging critical perspectives from history, sociology, and disability studies, the panel will synthesize insights about these novel technologies to chart the future of mediated social touch and perceptual sensory research.
Speakers:
• Kyle Keane (University of Bristol, UK)
• Mark Paterson (University of Pittsburgh, USA)
• David Parisi (NYU, USA)
• Crystal Lee (MIT, USA)
Discussion