From punch cards to keyboards was a big leap. When the mouse joined the keyboard on the first Apple Macintosh in 1984, it was a watershed moment for point-and-click computing. By the early 2000s, the iPod's Touch Wheel was making waves, only to be superseded by the Click Wheel. Today, the Click Wheel itself is buried and forgotten. Human-Machine Interaction (HMI) is now all about touch, gesture, voice and biometrics. There are devices like Amazon's Echo, powered by the Alexa voice assistant, that don't even have a screen – they simply respond to natural language. Devices like the head-mounted display (HMD) combine many of these capabilities – visual, audio and text – to take HMI one step further into the future.
Have you noticed the clear trend as HMI has evolved over the years? People and their capabilities are becoming more important than the machines. Modern computer interfaces are studying natural human interaction and making a supreme effort to understand and mimic it.
Everyone remembers when computer users had to invoke awkward DOS commands like chkdsk at the C:\> prompt. Today, the tide has turned. Computers are learning to interpret sophisticated (and even somewhat ambiguous) requests like “What does my day look like tomorrow?” by pulling together context, cognitive intelligence, location data and speech-to-text technologies.
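To make the contrast concrete, here is a toy sketch of how an assistant might resolve an ambiguous request by combining it with context. The function name, the rule-based matching and the context fields are all illustrative assumptions, not how any real assistant is implemented:

```python
from datetime import date, timedelta

def interpret(utterance: str, today: date, location: str) -> dict:
    """Toy intent parser: resolves an ambiguous spoken request using
    context (current date, user location). Purely illustrative."""
    text = utterance.lower()
    result = {"intent": "unknown", "location": location}
    # Crude keyword matching stands in for real natural-language understanding.
    if "day look like" in text or "schedule" in text:
        result["intent"] = "calendar_summary"
    # Context resolves relative words like "tomorrow" to an actual date.
    if "tomorrow" in text:
        result["date"] = today + timedelta(days=1)
    elif "today" in text:
        result["date"] = today
    return result

print(interpret("What does my day look like tomorrow?",
                date(2024, 5, 1), "Bangalore"))
```

The point of the sketch is that the utterance alone is under-specified; only with the surrounding context (today's date, the user's location) does "tomorrow" become an answerable query.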
Sensors for fragrance and taste are being researched and will soon be developed. Today’s capacitive touch sensing technology is one-way: we touch a screen (even with varying force) to convey a variety of commands. As the technology develops, touch sensing will become two-way – we will be able to feel the texture of cloth remotely, for example. One area where touch sensing is already making rapid progress is medical prosthetics, where the wearer of a prosthetic hand can be given a sense of touch by sensors that stimulate nerve bundles[i]. Technologies like these will soon seep into broader applications of touch sensing.
With machines capable of conveying a sense of touch, fragrance and even taste, we will be able to communicate with them the same way we communicate with other humans.
It is a two-way street: as we begin to use innately natural methods of interacting with machines, machines will also become capable of more “human-like” behavior. They will use a mix of Artificial Intelligence (AI), Augmented Reality (AR), Virtual Reality (VR) and Mixed Reality (MR) to usher in an era that truly begins to erase the lines between humans and machines, improving usability and effectiveness. The first extensive use of these new interfaces will appear where human-to-human interaction is most intensive. From an enterprise perspective, that means customer care, which sees the highest volume of user interactions, and employee training – both pervasive across industries. These two functions will, in all likelihood, be the first to make HMI more supportive, rewarding and motivating.