
Man versus Machine Reaches Symbiotic State; Eye Tracking and Gesture Applications to Revolutionize Patient Care

ABI Research
2016-07-12 00:55

OYSTER BAY, New York, July 12, 2016 /PRNewswire/ -- Gesture, eye tracking, and proximity sensor technologies will mark the next stage of innovation in machine design, finds ABI Research. A broader and more competitive ecosystem spurred by smartphone and tablet sensor integration—forecast to hit close to $5 billion in 2016—will create massive opportunities in automotive, consumer electronics, and healthcare. Healthcare, in particular, shows the largest untapped opportunity for eye tracking and gesture applications in patient care.


"The same way that touchscreens eclipsed the PC mouse, gesture and eye tracking sensors will transform the way people interact with machines, systems, and their environment," says Jeff Orr, Research Director for ABI Research. "Healthcare professionals are relying on these sensors to move away from subjective patient observations and toward more quantifiable and measurable prognoses, revolutionizing patient care."

Eye tracking sensors can help detect concussions and head trauma, identify autism in children before they can speak, and enable vision therapy programs that retrain the learned aspects of vision for children with early learning challenges. Similarly, gesture sensors are translating sign language into speech, giving doctors a hands-free way to manipulate imaging during surgical procedures, and providing a natural means to navigate virtual experiences.

Both established and startup companies are involved in the human-machine interface revolution. Sensor innovation is coming from Hillcrest Labs, NXP, and Synaptics, among others. Atheer, Bluemint Labs, eyeSight, Google, Intel, Leap Motion, Microsoft, Nod Labs, RightEye, and Tobii Group have also all recently announced creative gesture, proximity, and eye tracking solutions.

"Healthcare is only one industry poised to benefit from reinventing the user interface," adds Orr. "The larger competitive ecosystem for perceptual sensors is forging opportunities in consumer appliances, autonomous driving, musical instruments, gaming, retail, and even hazardous locations."

These findings are from ABI Research's Eye Tracking, Gestures and Proximity Sensor Applications report (https://www.abiresearch.com/market-research/product/1022551-eye-tracking-gestures-and-proximity-applic/) and Human-Machine Interfaces webinar (https://www.abiresearch.com/webinars/human-machine-interface/). The report is part of the company's Wearables & Devices sector (https://www.abiresearch.com/market-research/practice/wearables-devices/), which includes research, data, and analyst insights.

About ABI Research

ABI Research stands at the forefront of technology market research, providing business leaders with comprehensive research and consulting services to help them implement informed, transformative technology decisions. Founded more than 25 years ago, the company's global team of senior and long-tenured analysts delivers deep market data forecasts, analyses, and teardown services. ABI Research is an industry pioneer, proactively uncovering ground-breaking business cycles and publishing research 18 to 36 months in advance of other organizations. For more information, visit www.abiresearch.com.

Contact Info: Mackenzie Gavel

Tel: +1.516.624.2542

Email: pr@abiresearch.com

Source: ABI Research