Visual elements dominate the modern UX. There is much to see, but far less to feel. As we design across multiple devices, we are also learning to design across multiple senses. Because sound connects to powerful sensory and emotional centers in the brain, it can help us create more satisfying and soulful experiences. Yet traditional design models overlook and undervalue the role of sound.
This session explores new ways of thinking about sound as an integrated design element that plays a key role in orchestrating a multisensory product experience. Developed in the Design R&D group at Microsoft, this model grounds traditional elements of sound design within a sensory design framework. The approach is intuitive and accessible, and it can inspire all designers to create more beautiful, functional, and inclusive experiences.
Matthew Bennett is Head of the Sound + Sensory Design Program at Microsoft, where he leads development of the next-generation sound design language for the company’s portfolio of software, services, devices, and platforms. Since creating the program in 2014, he has been pioneering a new model for product sound design that grounds traditional elements of the discipline within a broader sensory design framework. He works with teams across the company to develop sensory-driven interaction models and create design systems that support intuitive and inclusive experiences.