It feels odd to say, but the design industry seems to have a blind spot. The vast majority of the conversation centers on what's seen, and our technology and tools follow suit. We talk about layers, button states, and even layout placements in terms of what's seen. Yet little to no attention is paid to the other senses.
That gap becomes obvious with a device like the Apple Watch. It's really a nice piece of hardware, but for all it can do to hear you, the response more often than not has to be seen and/or tapped on the user's end. And if your action needs to continue on the iPhone, you have to tap a button to acknowledge the hand-off and then (non-verbally) start the action over again on the iPhone. It's all visual (never mind the broader failure to respect context).
This rears its head again with connected speakers like Google Home and Amazon's Echo. These are voice-led platforms, yet there's still a reliance on visuals for the initial setup, and even for building the experiences. They are certainly on their way to being voice-first, but until the authoring tools live in the same sensory space as the consumption and productivity actions, designing for voice-first will also land in the palette of what's seen.
The attention that dedicated accessibility professionals and automotive designers pay to the non-visual strikes me as a better place to design from. There, the visual isn't the whole of the experience; other senses add something to the base experience that enables a better contextual grasp of the moment. Turn signals still do their click; CVTs fake engine-rev sounds; haptic feedback stands in where sight can't. Adding sound and touch (and, for the clothing, automotive-rental, and food industries, smell) to the design palette opens a door to the kind of immersion that tells a person they are using something that's part of a larger system. That there's a symmetry and balance happening, and the context was designed to engage (or disengage) their senses toward this.
Strip away the things you see, and can one say many companies have actually designed anything at all? The feel of an Apple Watch is certainly different from a Rolex's. But that mechanical sound evokes something the temperature and weight of the Apple Watch's chassis cannot (or will not). What happens when you strip away what you can see and lean in to evoke another sense, something more than sight? Design that meets this question should invite a different kind of attention indeed.