From Usability Matters by Matt Lacey

In this article, you will learn about designing apps that don’t rely on single sense interaction, but instead take advantage of multiple senses to provide a better user experience.


Figure: Physical and audio output support changes on screen

We interact with our environments using multiple senses. The same should be true of the way people interact with your app.

Apps can interact with three of the five senses of the people using them. Devices don’t come with the ability to create smells or tastes, so you don’t need to consider these (yet). I’m not sure how people would feel about sniffing or licking their phones, and when Steve Jobs said Apple made items on the screen “look so good, you’ll want to lick them” this probably isn’t what he meant.

In this article, I’m going to talk about how your app can interact with the senses of hearing and touch by considering the sounds it makes and how people physically interact with it.

Don’t leave your app silent. Give it a voice.

The way your app communicates with people has a lot of parallels with a conversation between two people:

  • You can communicate some things more clearly with words and sounds than with visuals alone.
  • Certain sounds connote specific meanings. Using them for something else can be confusing.
  • You need to consider any other sounds coming from the device.

Audible feedback should accompany important visual information on the screen. Adding this feedback is part of giving your app a voice, but, as with a person learning to speak, the app must be taught to speak at the right time, and in a suitable way.

In a busy environment, full of distractions, it can be easy to miss something. If you and I were both in a crowded room, you might not notice me waving to try and get your attention. If I shouted your name, the sound would indicate to you that there was something for you to be aware of, and you’d look around to see me waving. A similar scenario exists with apps. If something important happens in an app, an appropriate sound can serve as an additional cue that there’s something which requires your attention.

Don’t rely on sounds alone to alert a person to a problem or to a change; use sounds to confirm actions, such as pressing a button, sending a message, or changing a setting.

The use of sound isn’t the only way to communicate information; it serves as a complement. Just as shouting and waving worked together to get your attention in the busy environment, using a combination of outputs makes it harder for the person using the app to miss anything. This also helps anyone with limited use of a sense. For example, a person with limited or temporarily restricted vision may miss a change in color, but would hear an associated sound.

The opposite is also true. A person may be in a noisy environment and not hear an alert, but could see something on a screen.
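The multi-channel principle above can be sketched in code. This is a minimal, illustrative model (the `Channel`, `Cue`, and `DeviceSettings` types are hypothetical, not from any platform SDK): every important event is paired with visual, audible, and haptic cues, and device-level settings filter out disabled channels while the visual cue always survives.

```kotlin
// Illustrative model: every important event maps to cues on all three
// channels, and device settings decide which ones actually fire.
enum class Channel { VISUAL, AUDIO, HAPTIC }

data class Cue(val channel: Channel, val name: String)

data class DeviceSettings(
    val soundEnabled: Boolean = true,
    val vibrationEnabled: Boolean = true,
)

// Returns the cues that should fire for an event. The visual cue is
// always kept, so an event is never communicated through zero channels.
fun cuesFor(event: String, settings: DeviceSettings): List<Cue> {
    val all = listOf(
        Cue(Channel.VISUAL, "banner:$event"),
        Cue(Channel.AUDIO, "sound:$event"),
        Cue(Channel.HAPTIC, "vibrate:$event"),
    )
    return all.filter { cue ->
        when (cue.channel) {
            Channel.VISUAL -> true
            Channel.AUDIO -> settings.soundEnabled
            Channel.HAPTIC -> settings.vibrationEnabled
        }
    }
}
```

Routing all feedback through one function like this also makes it easy to audit that no event relies on a single sense.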

Consider the listener and their environment

I once visited the studio of an audio engineer who specialized in making radio commercials. He had a lot of expensive and sophisticated equipment, but would listen to what he was working on with an old, cheap, low-quality set of speakers. When asked why he wasn’t making use of the equipment available to him, he explained that he had equipment capable of producing a clear sound in a room with no other noises, but that wasn’t the typical experience of the people who’d be listening to the adverts. They’d be using devices of varying quality in environments filled with other noises. He needed to be sure that what he was producing worked for them.


Ears and speakers differ from person to person and device to device, and not everyone will hear the same thing. You need to allow for this and not rely on testing your app in a single, controlled environment.

I previously worked on an app used solely in a warehouse filled with automated conveyors and other machinery that required the use of ear protection on site. In this environment, there was no way to incorporate sound into the app. Such environments are rare, and it isn’t acceptable to exclude audio output from your app purely on the basis that some people may not hear it in every situation. As the demand for high-quality app experiences continues to grow, sound output will become increasingly noticeable by its absence.

It’s not enough to make any old noise. The sounds that your app makes must be meaningful and match the context of use:

  • Don’t reuse system-level sounds for different tasks or functions. No matter how much you like the system sound effect for receiving a message, you’ll only confuse people if you use it for something else within your app.
  • Use positive sounds for positive events and negative sounds for when something fails or isn’t possible. You don’t want to use an “uh-oh” sound for when something is successful or a cheering sound for when something fails.
  • Avoid repeating sounds in quick succession, particularly in negative scenarios. A repeated failure sound can worsen negative emotions when a person is becoming frustrated, because they can’t achieve the result they require or a task isn’t possible.
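The last point, avoiding sounds repeated in quick succession, can be implemented as a simple throttle. This is an illustrative sketch (the class name and the two-second window are arbitrary choices, not platform behavior) that suppresses a sound when the same one played too recently; the clock is injected so the logic is testable.

```kotlin
// Suppresses a sound if the same sound played within `minGapMillis`.
// Time is passed in rather than read from a system clock, so the
// behavior can be verified deterministically.
class SoundThrottle(private val minGapMillis: Long = 2000) {
    private val lastPlayed = mutableMapOf<String, Long>()

    fun shouldPlay(sound: String, nowMillis: Long): Boolean {
        val last = lastPlayed[sound]
        if (last != null && nowMillis - last < minGapMillis) return false
        lastPlayed[sound] = nowMillis
        return true
    }
}
```

A throttle like this is most valuable for failure sounds, where repetition compounds frustration; success sounds can usually tolerate a shorter window.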

Add sounds that are simple, impactful, and purposeful. Use them intentionally and with discretion. Too many can be overwhelming and lessen their impact. If you start by adding a few, add them in ways that correspond to the most important actions and events in the app.

Sounds can also be distracting. Always have a way to control or disable them. Background audio is more common in games than in other apps, but if your app plays background music, allow it to be controlled separately from sound effects.

If your app includes background music, don’t start it automatically at launch when something else is already playing. It’s frustrating to be listening to music, an audiobook, or a podcast and have it cut off because a newly launched app started its own background music. Your app should respect the person’s choice of what to listen to.
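The launch-time decision itself is small, but worth isolating so it is applied consistently. A sketch, assuming the platform can report whether other audio is active (on Android this would come from `AudioManager.isMusicActive`; the function name below is illustrative):

```kotlin
// Decide whether to start the app's own background music at launch.
// `otherAudioPlaying` would come from a platform query such as
// Android's AudioManager.isMusicActive; `musicEnabled` is the app's
// own user-facing music setting.
fun shouldStartBackgroundMusic(
    otherAudioPlaying: Boolean,
    musicEnabled: Boolean,
): Boolean = musicEnabled && !otherAudioPlaying
```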

Haptic feedback starts with vibration

Haptic, or touch-based, feedback has several important considerations.

  • Haptic feedback is an area where development is still taking place. Expect new capabilities from future devices.
  • Take advantage of haptic capabilities if available on a device.
  • Avoid confusion by not reusing feedback someone could mistake for something else.
  • Multiple methods of communication are important. If one or more is disabled at a device level, you can still rely on the others.

Haptic feedback involves more than making the device shake. Haptics is the science of touch. It can allow a person to identify what they’re touching or receive feedback based on shape, size, texture, and resistance. The most advanced haptic feedback systems can give the impression of touching more than a flat piece of glass on a device, but such capabilities have yet to make it into mainstream devices. Most devices your app runs on will be limited to using vibration, although the iPhone 7 introduced a “taptic engine” to provide more tactile sensations and we can expect more devices to include similar functionality in future.

A small number of devices offer a high level of control over the motors that cause a device to vibrate. For these devices, there are libraries of predefined vibration patterns of varying strength and duration, which allow different games to respond consistently to similar actions. Most devices, however, are limited to letting you control the length of a vibration.

If the platform your app runs on has standard vibration patterns, you should use these to match the behavior of your app to its context. If there’s no direct equivalent, base your action on something similar and don’t reuse a pattern already used for something else. You don’t, for example, want someone experiencing a vibration in your app and then thinking it means that they have a new email. Picking arbitrary values for the strength and duration of vibrations is highly unlikely to produce the best results, and you should test this with actual users before releasing widely.

Don’t use the same vibration pattern for every action and event within an app. A vibration that only tells the person “something happened” is a sub-par experience. At the same time, not everyone is capable of distinguishing subtle variations between patterns, so even a small set of patterns should clearly differentiate positive events from negative ones.
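One way to keep positive and negative events distinguishable even for people who can’t feel subtle differences is to give them structurally different waveforms, for example a single short tap versus a long double buzz. The timings below follow the off/on-milliseconds convention used by Android’s `VibrationEffect.createWaveform`, but the specific numbers are illustrative choices, not recommendations from any platform.

```kotlin
// Off/on millisecond timings, in the style of Android's
// VibrationEffect.createWaveform(longArrayOf(...), -1).
// The two patterns differ in structure, not just intensity, so they
// stay distinguishable: one short tap vs. buzz-pause-buzz.
val positivePattern = longArrayOf(0, 40)            // single short tap
val negativePattern = longArrayOf(0, 250, 120, 250) // long buzz, pause, long buzz

fun patternFor(success: Boolean): LongArray =
    if (success) positivePattern else negativePattern
```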

Giving your app the ability to communicate through the sense of touch means it can signal an important event in three ways: visually, audibly, and through touch. Communicating in multiple ways reduces the chance of a person missing an important event, even if they’re unable to fully use all their senses. It isn’t necessary to always output to all three senses, but your app should never use only one.

Some people don’t like their device vibrating and disable it at an operating system level. Your app must still work in this case and you should be sure to test for this. You should also allow the person using your app to control the use of vibration with a setting.

That’s all for this article.

If you would like to read more about creating great mobile app experiences, check out the whole book on liveBook here and see this Slideshare presentation.