Touchscreens are everywhere. In fact, many of our daily interactions with digital interfaces now happen through a flat glass surface. Yet apart from size, resolution, flexibility and scratch resistance, the touchscreen displays on our devices have not changed much.
Meanwhile, operating systems keep evolving and gaining more advanced functionality. With iOS 7, Apple started adopting translucency and animation to create distinct layers that increase the sense of depth and establish hierarchy.
So maybe it’s time for the hardware to evolve too. We should make use of new technologies to take the touchscreen experience to the next level. It is ironic that we use ‘touchscreen’ to describe a display that provides visual rather than tactile feedback.
While touch is considered one of the five senses, the impression of touch is actually made up of several elements, such as pressure, skin stretch, vibration and temperature. This is all possible thanks to the roughly 17,000 mechanoreceptors in each of our hands.
One of the main advantages of touchscreens over other input methods is their relative ease of use: they allow direct manipulation of the objects on the screen, which makes the interaction more intuitive.
However, the absence of tactile feedback creates some usability problems. The two main issues concern finger positioning and confirmation.
Firstly, we need to look at the screen as we tap to make sure we hit the right button. This is crucial in critical environments such as in-car navigation, where the user is situationally impaired.
The lack of physical buttons also makes it impossible to touch type or enter data without looking at the screen. Device manufacturers have already implemented vibration-based haptics in an attempt to simulate the physical-button experience. They use vibratory feedback to mimic the resistance of pressing an actual button. Yet this solution provides feedback only after a key has been pressed, and does not help the user locate their fingers on the flat glass surface in the first place.
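The limitation described above can be made concrete with a small sketch. The names here (`Key`, `handle_touch`, the 20 ms pulse length) are illustrative, not any manufacturer's real API: the vibration fires only once a touch lands inside a key's bounds, so it can confirm a press but can never help position the finger beforehand, and a miss gives no feedback at all.

```python
# Illustrative sketch of vibration-only key feedback (not a real API):
# the pulse is triggered only *after* a touch is registered inside a
# key's bounds, which is exactly why it cannot aid finger positioning.
from dataclasses import dataclass


@dataclass
class Key:
    label: str
    x: int
    y: int
    w: int
    h: int  # on-screen bounds in pixels

    def contains(self, tx: int, ty: int) -> bool:
        return self.x <= tx < self.x + self.w and self.y <= ty < self.y + self.h


def handle_touch(keys, tx, ty, vibrate):
    """Return the pressed key's label, firing a short pulse on a hit."""
    for key in keys:
        if key.contains(tx, ty):
            vibrate(duration_ms=20)  # feedback arrives only after the hit
            return key.label
    return None  # a miss produces no feedback at all
```

A touch at (10, 10) on a key spanning (0, 0)–(40, 40) registers and pulses; a touch outside every key returns `None` silently, which is the positioning problem in miniature.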
Another problem is that users cannot rest their fingers on the display, as each touch is registered as input, which can lead to arm and finger fatigue. Let’s also not forget that considerable segments of the population cannot operate touchscreens at all: the visually impaired, and people with conditions that affect fine motor skills, such as Parkinson’s disease or arthritis.
Over the past few years, however, a number of new touchscreen technologies have been developed in an attempt to address these issues. Companies like Tactus, Senseg and Disney Research have all been working on different ways of implementing tactile feedback.
Tactus have engineered a thin layer that sits on top of the touchscreen and is formed of microchannels that fill with liquid to raise buttons and other interface elements.
Senseg’s solution, on the other hand, uses the electrostatic force between the screen and the user’s finger to simulate the feeling of textures.
Disney Research has created TeslaTouch, a tactile rendering algorithm that simulates rich 3D features such as bumps and edges. It does so by reproducing the lateral friction that stretches the skin when we slide a finger over a real texture.
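The core idea behind this kind of friction rendering can be sketched in a few lines. This is a simplified illustration, not Disney's actual algorithm: it raises the friction level in proportion to the slope of a virtual height map under the finger, so that the edges of a virtual bump "drag" the skin the way real edges do, while flat regions feel smooth. The height map, base friction and gain values are made up for the example.

```python
# Simplified sketch of slope-proportional friction rendering (not the
# actual TeslaTouch implementation). A 1-D virtual height map stands in
# for the texture; friction rises where the surface slopes.

def friction_level(height_map, x, base=0.2, gain=1.0):
    """Friction at sample x: base level plus a term for the local slope."""
    if x <= 0 or x >= len(height_map) - 1:
        return base  # treat the edges of the map as flat
    slope = (height_map[x + 1] - height_map[x - 1]) / 2.0  # central difference
    return base + gain * abs(slope)
```

Sliding across `[0, 0, 0, 1, 2, 2, 2]`, the finger feels the base friction of 0.2 on the flat stretches and a raised level of 1.2 on the ramp, which is perceived as an edge.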
It’s also worth mentioning the Braille phone project by Sumit Dagar. In this case, a grid of pins elevates and depresses to display Braille.
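How such a pin grid might be driven is easy to sketch. Each character maps to the raised dots of a standard six-dot Braille cell (dots 1–3 down the left column, 4–6 down the right); the dot assignments below follow standard Grade 1 Braille, but only a few letters are mapped, and the `cell_pins` helper is a hypothetical interface, not part of Dagar's project.

```python
# Sketch of driving a Braille pin grid: map a character to the raised
# pins of a six-dot cell. Dot patterns follow standard Grade 1 Braille;
# only a handful of letters are included for illustration.

BRAILLE_DOTS = {  # letter -> set of raised dot numbers
    "a": {1}, "b": {1, 2}, "c": {1, 4}, "d": {1, 4, 5}, "e": {1, 5},
}


def cell_pins(char):
    """Return a 3x2 grid of booleans: True where a pin should be raised.

    Dot layout:  1 4
                 2 5
                 3 6
    """
    dots = BRAILLE_DOTS.get(char.lower(), set())
    return [[d in dots for d in row] for row in ((1, 4), (2, 5), (3, 6))]
```

For example, `cell_pins("a")` raises only the top-left pin (dot 1), and an unmapped character leaves every pin lowered.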
These technologies may currently be too costly to implement at scale, but they open new and exciting opportunities for future devices. Firstly, the overall accessibility of touchscreens would be greatly enhanced by providing both tactile and visual feedback.
Adopting these technologies would also allow us to operate user interfaces without looking at the screen. This would be particularly useful whilst driving, or in other contexts like gaming.
These are some of the most obvious implementations of these new technologies, but they could also help reinforce the sense of depth and hierarchy provided by the latest mobile UIs.
We would be able to perceive the relative positioning and hierarchy of on-screen objects with both sight and touch. Adding textures and edges to interface components such as sliding panels, buttons, sliders and overlays would also address some of the discoverability issues intrinsic to flat design.
We are becoming increasingly reliant on digital technology to get work done, communicate and be entertained. Let’s borrow some of the sensory richness of the analogue realm to take our digital experiences to the next level.