Paradigm Shift in Human-Machine Interaction
Touchless sensing and gesture (or gestural) recognition are two fast-growing technologies, both part of the paradigm shift in human-machine interfaces. Their typical use is to detect the motion of a person within an area covered by a sensor; to capture gestures, a camera reads the movements of the user.
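The camera-based detection described above often starts with a simple idea: compare successive frames and flag regions where pixel values change. The sketch below illustrates that frame-differencing step in Python; the function name, frame layout, and thresholds are hypothetical, chosen only to show the principle.

```python
# Illustrative sketch: motion detection by frame differencing.
# Frames are modeled as 2D grids of grayscale values (0-255).
# All names and thresholds here are hypothetical, not from any specific product.

def detect_motion(prev_frame, curr_frame, threshold=25, min_changed=5):
    """Return True if enough pixels changed between two consecutive frames."""
    changed = sum(
        1
        for prev_row, curr_row in zip(prev_frame, curr_frame)
        for p, c in zip(prev_row, curr_row)
        if abs(p - c) > threshold
    )
    return changed >= min_changed

# A static scene, then the same scene after a bright blob (e.g., a hand)
# enters the sensor's field of view.
still = [[10] * 8 for _ in range(8)]
moved = [row[:] for row in still]
for r in range(2, 5):
    for c in range(2, 5):
        moved[r][c] = 200  # 9 pixels change brightness

print(detect_motion(still, still))  # identical frames: no motion
print(detect_motion(still, moved))  # blob exceeds the change threshold
```

Real gesture pipelines go further, tracking the changed region over time and matching its trajectory against gesture templates, but frame differencing is the usual first stage.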
Detecting Human Emotions & More
Another emerging capability is the detection of human emotions, which then becomes part of the human-machine interaction. These technologies are finding their way into everyday consumer products: kitchen sinks, desk lamps, garbage cans, soap dispensers – you name it.
Huge Market Growth Underway
As these technologies mature, they are appearing in more and more applications and products both in the US and abroad. The market is estimated to have reached $6 billion in 2015 and, at a 20%-30% compound annual growth rate (CAGR), is projected to reach $30 billion by 2023.
New products are arriving steadily. For example, PMD Technologies has developed “time-of-flight” (TOF) depth sensors integrated into its PhotonICs 19k-S3 chip. Logitech’s ZeroTouch is a touchless in-car system that lets Alexa read emails aloud and lets users dictate replies. Gesture recognition and touchless sensing are now being integrated into smartphones and computers and are expected to be a common feature within the next couple of years. The next iPhone, the iPhone 8, will reportedly have in-display fingerprint sensors and facial and gesture recognition capabilities enabled by laser technology. Finally, hand gesture recognition is now being applied to still images and video feeds using AForge.NET, a C#-based framework.
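The time-of-flight principle behind depth sensors like PMD's is straightforward: the sensor emits a light pulse, times how long the reflection takes to return, and converts that round-trip time into distance. A minimal sketch of the calculation, with a hypothetical function name and example timing chosen purely for illustration:

```python
# Illustrative sketch of the time-of-flight (TOF) depth principle:
# distance = (speed of light x round-trip time) / 2.
# The function name and the example pulse timing are hypothetical.

C = 299_792_458  # speed of light in m/s

def tof_distance(round_trip_seconds: float) -> float:
    """Distance to the reflecting surface, in meters."""
    return C * round_trip_seconds / 2

# A pulse returning after ~6.67 nanoseconds reflects off a surface ~1 m away.
print(round(tof_distance(6.671e-9), 2))  # prints 1.0
```

Because the timings involved are nanoseconds per meter, practical TOF chips measure phase shift of a modulated signal rather than timing individual pulses directly, but the distance relationship is the same.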
In the following video, Jeff Johnson explains automating image and shape gesture recognition for Unity.