IoT with Built-in Tactile Interaction – Can I Touch It? Sure, Go Ahead!

When everything is connected (without wires), one needs the core essence of communication: "FEEDBACK". And what could be better than a combination of visual and touch-based interactions? I can't imagine IoT without TOUCH.

Note: IoT is a broad phenomenon; I have picked up the "emotional" aspect of our daily life and how IoT could impact it, as other technologies have already overpowered us (humans).

1. Introduction

“Is your phone on vibration mode?” “Hey, can you feel this cloth? It’s so smooth.” “I went to this amusement park recently and tried their new bike ride game. It is awesome. It felt as if I were riding the bike for real, and that shooting game was amazing – it almost felt like using a real gun in a real environment.”

These are some of the emotions/reactions of people who come in contact with devices that use “tactile” feedback as a medium of interaction. So why are we interested in “tactile interaction”, and how does it impact the design process?

When a child holds the remote for a toy car for the first time, he explores the controls and switches to see the reaction transmitted to the car. Once he is aware of the function, he hardly looks at the controls; a mapping of the controls has formed, which acts as imagery and is complemented by the auditory feedback. The visual and auditory feedback reinforces the tactile environment, but what could lead to the next level of interaction is the introduction of “TOUCH”, the tangible axis, also called haptic interaction.

Some physical examples: the interactive table, where one can use a phone to talk to other devices; the raised footpath slabs at metro stations, bus stops and hospitals that guide the visually impaired; and the car steering wheel that vibrates when the driver exceeds the speed limit.

The debate is whether sound is sufficient to evoke the right emotions, or whether touch plays a vital role in a digital environment.

2. Psychology of TOUCH

Pinch, Pinch, New Pinch – The human body processes information at two levels, physical and perceptual.

  1. Physical – The signal travelling from the touch receptors to the nervous system.
  2. Perceptual – The interpretation of that information.

These can be broken down into three perceptions:

  1. Tactile – The sensation on the skin, e.g. a feather touch.
  2. Kinesthetic – The sense of body position (e.g. the left/right shift of the body as it follows a road curve).
  3. Haptic – The combination of tactile and kinesthetic, felt when rock climbing or watching a 4D movie, where water, smoke and material sensations are combined with lights, moving images and sound.
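The taxonomy above can be sketched as a small data structure. This is purely illustrative; the `Perception` enum and `combine` helper are names I am introducing here, not an established API:

```python
from enum import Enum, auto
from typing import Optional

class Perception(Enum):
    """The three touch perceptions described above."""
    TACTILE = auto()      # sensation on the skin, e.g. a feather touch
    KINESTHETIC = auto()  # sense of body position, e.g. leaning into a curve
    HAPTIC = auto()       # combination of tactile and kinesthetic

def combine(tactile: bool, kinesthetic: bool) -> Optional[Perception]:
    """Classify which perception a stimulus engages (illustrative only)."""
    if tactile and kinesthetic:
        return Perception.HAPTIC
    if tactile:
        return Perception.TACTILE
    if kinesthetic:
        return Perception.KINESTHETIC
    return None
```

The 4D movie example engages both channels at once, which is why it lands in the haptic category.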

3. Practicalities of Tactile Interaction

A pilot flying a plane on auto mode versus taking manual control of it – in both situations high-risk decisions have to be made, but precision can only be achieved by a human hand (knowledge) combined with the visual information fed back by the system. Thus the relationship between touch and sight runs deep in final decision-making.

There has been considerable research in this field; some of the more interesting studies are listed below.

Vision versus Touch

  • Variation in judgment of shapes – Rock and Victor (1964)
  • Seeing and touching to compare shape relationships and object details (the wooden block experiment). Results showed a diversity of responses and suggested that visual dominance was not always significant – McDonnell and Duffett (1972)

Visual to tactile mapping

  • Klatzky and Lederman (1987) – Their theory is that haptics and vision are simply different ways of perceiving, although at a much higher cognitive level there may be some convergence of the haptic system into the visual. For example, when an infant sees and feels a soft toy, each channel evokes different emotions and reactions. Materials, surfaces and textures cannot be felt by seeing; they need to be touched for the brain to form an imagery of the object.
  • Another research showed that substance dimensions (e.g. hardness and texture) can be extracted quickly and reliably, whereas structural information is extracted slowly and is error prone.

A very good example of this is the use of touch books by toddlers.

Line Symbols

  • Dating back to the Stone Age, the line drawing has come a long way. Combine it with 3D (sculptures, engraved surfaces) and you get a haptic/tactile environment. “Which line is shorter?” – the optical-illusion imagery we have all come across.
  • Bentzen and Peck (1979) researched which styles of lines are easiest to follow by tracing with a finger. The findings were that single lines, rough or smooth, are preferable to double lines in tactile displays that do not have intersecting lines. Another observation was that a single narrow line intersected by a double one is an undesirable feature.
  • Lederman and Campbell (1983) explored the use of raised lines in tangible graph displays for blind people. They observed that some people used both hands to explore the graphs, while others kept one hand at the start point for reference.

Point Symbols

Joining the dots to reveal a figure – I believe most of us have drawn one. Similarly, finding hidden objects in a picture is known as the figure-ground problem. In a tactile environment it translates to raised versus incised.

  • Nolan and Morris (1971) experimented with blind readers tracing tactile symbols. The result: raised figures were traced more easily than incised figures – reading is faster when the surface is raised.

Areal Symbols

A tactile diagram uses either a texture or a tactile pattern to communicate information, e.g. the raised surface on a button, or the textures industrial designers use to give users feedback while handling a device. Other examples are the ‘smoothness’ of a baby’s skin, the ‘roughness’ of sandpaper, the ‘softness’ of cashmere, the ‘rubberiness’ of elastic and the ‘slipperiness’ of ice.

  • According to Lederman and Kinch (1979) there are around forty tactile patterns that are easy to recognize, but only up to eight can be used together; it is very difficult to find more than eight patterns that work as a group without ambiguities creeping in.
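That eight-pattern guideline is concrete enough to enforce mechanically. Below is a minimal sketch of a legend check for a tactile map; `validate_legend` is a hypothetical helper I am introducing, with the threshold taken from the Lederman and Kinch figure quoted above:

```python
MAX_PATTERNS = 8  # Lederman and Kinch (1979): at most eight patterns per group

def validate_legend(patterns):
    """Check a tactile-map legend against the eight-pattern guideline.

    Illustrative sketch only: takes the pattern names used in a diagram,
    deduplicates them, and rejects legends that exceed the limit.
    """
    unique = set(patterns)
    if len(unique) > MAX_PATTERNS:
        raise ValueError(
            f"{len(unique)} distinct patterns exceed the limit of {MAX_PATTERNS}")
    return unique
```

A designer could run this over a diagram's symbol list before production, the same way a style linter flags too many font faces on a page.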

Strategies of Exploration

  • Berla (1972) defines three problem areas in the use of tactile diagrams by blind people: legibility, organization and strategies for exploration. Of these, the last is of particular interest as it concerns how individuals adopt different methods for exploring diagrams.
  • Research has catalogued the following exploration strategies that users adopt, though none of them is ideal:
  1. Horizontal-unidirectional
  2. Horizontal-bidirectional
  3. Asymmetrical horizontal scan
  4. Vertical-unidirectional
  5. Vertical-bidirectional
  6. Perimeter
  7. Bounded
  8. Density distribution scan
  9. Spoked wheel scan

Braille Symbols

  • Close your eyes and the whole world vanishes – that is what a blind person experiences. So how does one read without sight? “Braille” – devised in 1825 by Louis Braille, a blind Frenchman himself. An experiment on braille by Millar (1985) showed that braille letters can be read and recognized significantly faster with cells of dots than with joined lines.

Design Principles for Tactile Interaction

  1. Key design principles used in tactile mapping include: a combination of static overlays and a resistive touch-pad so one can ‘feel’ the structure of the page layout, audio feedback, display size, visual-to-tactile mapping, simplicity of symbol design and, perhaps most significantly, empty space.

4. Tactile Interaction in Human Computer Interface

Now, how do we utilize this know-how of tactile interaction and leverage it for the human-computer interface? What we get are different display environments: static tactile displays, dynamic tactile displays and force-feedback technology.

  • Static and Dynamic tactile displays

A digital installation where sensors capture the number of people present in a room and give visual feedback, or an interactive dance floor that lets you create your own visual patterns, are examples of static displays. On the other hand, a dynamic display such as a refreshable braille display would be a great help in assistive reading, though the technology is currently a challenge.
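At its core, a refreshable braille display does one thing: map each character to the set of dots its pins must raise. A minimal sketch of that mapping, using the real Unicode Braille Patterns block (U+2800, where dot n is encoded as bit n−1) so the cells can be printed; the `DOTS` table covers only the first ten braille letters:

```python
# Dots raised for the first ten braille letters (standard 6-dot cell).
DOTS = {
    "a": (1,), "b": (1, 2), "c": (1, 4), "d": (1, 4, 5), "e": (1, 5),
    "f": (1, 2, 4), "g": (1, 2, 4, 5), "h": (1, 2, 5),
    "i": (2, 4), "j": (2, 4, 5),
}

def to_braille(text: str) -> str:
    """Render lowercase text as Unicode braille cells.

    The Unicode Braille Patterns block starts at U+2800; raising
    dot n corresponds to setting bit n-1 of the code point offset.
    """
    cells = []
    for ch in text:
        mask = sum(1 << (dot - 1) for dot in DOTS[ch])
        cells.append(chr(0x2800 + mask))
    return "".join(cells)
```

A real display driver would push the same dot masks to the pin actuators instead of building a string, refreshing the cells as the reader advances.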

  • Haptic Display Technology

Computer games (the force-feedback joystick – now you see the connection to haptic technology), airline simulators and virtual-reality parachute trainers are existing technologies helping people enhance overall experience and learning. The useful aspect is the dynamic nature of these displays.
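One common technique behind such force-feedback devices is the "virtual spring": each control-loop tick, the device computes a restoring force proportional to how far the handle has pushed into a virtual surface. A minimal sketch of that per-tick computation; the function name and the stiffness value are illustrative, not from any particular device API:

```python
def spring_force(position: float, surface: float = 0.0,
                 stiffness: float = 200.0) -> float:
    """Hooke's-law restoring force pushing the handle back to a virtual wall.

    position  -- current handle position along one axis (metres)
    surface   -- where the virtual wall sits (metres)
    stiffness -- spring constant k in N/m (illustrative value)
    """
    penetration = position - surface
    if penetration <= 0:              # not touching the wall: no force
        return 0.0
    return -stiffness * penetration   # push back out of the wall
```

A real haptic device runs this loop at a high rate (on the order of a kilohertz) so the wall feels rigid rather than spongy; that update rate is exactly the "dynamic nature" mentioned above.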

5. What’s the future?

A difficult question to answer, but the examples listed below suggest a leap both in experience building and in new design strategies, and in how we will all stay connected while interacting with different object forms and textures.

Listed below are some articles I came across that directly highlight the potential of tactile interaction:

  1. Touch board PCB transforms tactile interaction into sound

  2. Megaface

  3. Synaesynth converts color into music at the push of a button

  4. Nuance: Dancing with Light

How about being able to move pixels around (physical pixels for communication) in the same way kids use Lego blocks to build objects? “ENDLESS” possibilities across domains.

“I believe you are wearing this for the first time – New Pinch.” Well, you pinch on your phone and your friend senses it on the other side… wow, now that relates to IoT.

Vikas Swarankar (I319893)

User Experience Design Expert

SAP Labs, Gurgaon, India
