by Sanjiv Sirpal
Or why they popularized the 'Look to Touch Screen'.
On June 29, 2007 - a mere 8 years ago - Apple released the iPhone. A fantastic product by any definition. I became a fanboy on two levels: first as a consumer, and second as a designer. The interaction system they developed was jaw-dropping, amazing, and overwhelming. It was from the future.
The iPhone is here to stay and nothing will change that (for a while at least). I've watched people use their phones, and it's amazing to see how they go about their personal choreographed routines. I see patterns in how they hold it, how they use it, and how they relate to it. It almost looks like a dance sometimes. It's powerful, because it makes you stop whatever you're doing, pour your focus and energy into the small "look to touch screen," and become completely engrossed. Touch, tap and swipe.
It's asynchronous, and opaque.
For as long as we have had hands, we have been using them to interact with the environment around us. People, places and things. Through our hands and feet, coupled with our senses, we have developed real-time interactions with the analog world. And as we repeated our actions, our bodies remembered (in different parts of our brains).
We touch people. Hold their hands, embrace them, heal them, hold them, guide them, and even push them.
We walk through places and spaces, moving both horizontally and vertically. On the surface, through water, and even the air.
We craft, collect, hold, use, play with, caress, pull, push, pry, carry... a million things.
The iPhone took advantage of touch screen technology, and our ease of interacting with the physical world, to build a bridge to the new digital frontier. Apple popularized the 'Look to Touch Screen'.
You gotta look at it to use it.
Touch screen technology enabled the creation of an interaction system that allowed for direct manipulation of digital "stuff" with our digits. We first needed to look at where we wanted to touch, then touch - and as a result, the content on the screen did something, or nothing. Swipe, touch, pinch, and spread are gestures applied to the surface of the touch screen. These, in turn, have a million interpretations in software.
So what's the point?
These devices are not transparent, and they are not synchronous. And the future that seems to be swerving toward us will need transparent, synchronous interaction systems that work with us - with our appendages, muscles, and muscle memory.
You can drive a car, without looking at it.
A car's interaction system is learned. Once learned, people become highly efficient at using it to manipulate the vehicle, and as a result, the car becomes an extension of the body. Cars are highly synchronous, rely on muscle memory, and in general use standard methods of interaction.
The world does not need VR helmets or augmented reality systems that each come with their own proprietary interaction systems. There needs to be a common language that can be learned and then applied to that digital space, so we can truly have transparent and synchronous interaction systems.
For wearables to truly be a part of us, they can't expect us to look at them to use them. They need to feel like a part of us; they need to be interacted with blindly, with a squeeze, a subtle shift in their physical form, or by manipulating their edges.
VR/augmented reality, wearables, and IoT will weave a fabric of the digital world onto, or into, the analog world. The gravity of these technologies needs to be biased toward the analog world, not the digital.
Originally Posted here.