Gesture sensing draws attention as devices build intelligent man-machine interfaces

  According to EETTaiwan, citing a recent report by market research firm IHS, a Google Trends query for "gesture sensing" shows a clear rise in search interest between 2013 and 2015. In fact, several companies with strong platform capabilities began laying out gesture-sensing technology as early as 2009; after years of refinement and waiting, results started to appear one after another between 2013 and 2015, which is naturally when the media, the industry, and the supply chain began to show interest.

  Gesture sensing essentially uses hand movements as the medium of a human-computer interface; like voice and touch, it can serve as a natural user interface. After the keyboard and mouse gave way to the touch screen, gesture and voice became the next human-machine interfaces to draw attention. The ultimate human-machine interface would be built on machine intelligence: users could converse in natural language, or do nothing at all, and rely on the artificial intelligence behind the machine to understand them.

  However, AI has not matured in recent years to the point where it can be applied to everyday electronic products; even if the opportunity arose, the related legal and ethical issues would inevitably spark heated debate, and even opposition, in society. A situation closer to present reality is this: after the touch screen became the new human-machine interface, our electronic products, IoT devices, machines, and equipment have been fitted with ever more sensors. These sensors become either human-machine interfaces themselves or the foundation of big data; back-end algorithms, and eventually the artificial intelligence developed in the future, will be what truly makes these devices "intelligent".

  IHS notes that users first began paying attention to gesture control with the Nintendo Wii game console in 2006. With the birth of the Microsoft Kinect in 2010, the technology reached a new milestone, because for the first time users could interact with a device freehand, without holding any controller. From the fourth quarter of 2013 onward, mainstream home game consoles, including the Microsoft Xbox and Sony PlayStation, supported freehand gestures, making the human-machine interface of the Nintendo Wii U look comparatively traditional.

  Intel's RealSense, after appearing at CES for more than two years, was finally built into laptops and other devices in 2015. Judging from the current maturity of the technology, however, gesture-controlled human-machine interfaces have not yet matched, in performance or in applications, the stunning debut that projected-capacitive touch made in 2007. Even if they eventually do, changing consumer behaviour still takes time; moreover, on some devices gesture may remain a supplementary human-computer interface rather than the main one.

  IHS says gesture and the touch screen are not substitutes for each other; each natural user interface has its own appropriate scenarios. The touch screen suits mobile phones and tablets, where it is the main and most intuitive user interface; since users there can already touch the screen, gesture is redundant, or at best supplementary. For devices whose screens users cannot reach, however, gesture is quite an appropriate human-machine interface.

  For example, a smart TV may be implemented in the display itself or through a set-top box, but in either case users sit at a certain distance from the screen, so a touch screen is not appropriate. By contrast, unless one insists on sticking with the remote control as the human-machine interface, gesture is expected to be quite a suitable approach.

  

Figure: Gesture is well suited as the man-machine interface for smart TVs

  Emerging immersive applications, such as augmented reality (AR) and virtual reality (VR), run on devices whose screens do not offer an area large enough for convenient touch operation, so in their usage contexts gesture is quite a suitable man-machine interface.

  Smartwatches face a similar difficulty: to stay comfortable to wear, wearables have even smaller screens than mobile devices, and their operating situations are more awkward than a phone's, so gesture and touch can complement each other and become mutually reinforcing. In 2015 Google released Wrist Gestures for Android Wear, which uses the built-in inertial MEMS sensors to let users operate the watch simply by twisting the wrist.
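  As a rough illustration of how a wrist-twist gesture might be picked out of inertial data, below is a minimal Python sketch; it is not Google's Android Wear implementation, and the thresholds, the sample format, and the detect_wrist_twist helper are all hypothetical.

    # Minimal sketch: spotting a "wrist twist" in gyroscope samples.
    # NOT the Android Wear implementation -- thresholds and the sample
    # format are illustrative assumptions only.
    def detect_wrist_twist(gyro_samples, rate_threshold=4.0, window=0.4):
        """gyro_samples: iterable of (timestamp_s, forearm_axis_rad_per_s)."""
        pending = None  # (time, sign) of the first strong rotation seen
        for t, rate in gyro_samples:
            if abs(rate) < rate_threshold:
                continue
            sign = 1 if rate > 0 else -1
            if pending is not None and sign != pending[1] and t - pending[0] <= window:
                yield t          # flick out and back within the window
                pending = None
            else:
                pending = (t, sign)

    # Hypothetical usage with a pre-recorded snippet of samples:
    samples = [(0.00, 0.1), (0.10, 5.2), (0.25, -5.8), (0.40, 0.2)]
    for hit in detect_wrist_twist(samples):
        print("wrist twist detected at t = %.2f s" % hit)

  The idea is simply that a deliberate twist shows up as a fast rotation around the forearm axis followed by an opposite swing back; anything subtler is left to the real product's signal processing.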

  Furthermore, many future IoT-based smart home appliances may have no screen at all, or only a small screen for status messages. Apart from controlling them through a phone or a set-top box, or by voice, gesture would be an appropriate human-computer interface.

  Take Apple's HomeKit platform as an example: paired with the iPhone, it has gradually given the smart-home-appliance ecosystem a clear outline. Apple also acquired PrimeSense, a gesture-technology company, back in November 2013, and has spent the two years since consolidating the related patents. Google's Project Soli, announced in May 2015, is even more innovative: using the principle of radar reflection, it shrinks the entire gesture-sensing function onto a single chip less than 1 cm on each side.

  

Figure: Google's Project Soli uses radar reflection to shrink gesture sensing onto a chip less than 1 cm in length and width

  Gesture technology can cover different levels of functionality, which in turn affects the sensor design. Many phones already have built-in inertial MEMS sensors, which can be used for simple gesture applications, such as waving the phone to skip to the previous or next song, shaking the phone to play dice games, or flipping the phone face-down to decline a call. Likewise, the proximity sensor can also be used for gestures. These, however, do not represent the real potential of gesture sensing; they exist only because the sensors are already built in and application developers added some creativity.
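  As an example of the kind of simple inertial-sensor gesture mentioned above, here is a small hypothetical Python sketch of "flip the phone face-down to decline a call"; the sample format and thresholds are assumptions for illustration, not any vendor's actual API.

    # Sketch: "flip face-down to decline a call" from accelerometer readings.
    # Hypothetical -- the readings list and the 0.8 g threshold are illustrative.
    G = 9.81  # gravity in m/s^2

    def detect_face_down_flip(accel_samples, hold_samples=5):
        """accel_samples: iterable of (x, y, z) accelerations in m/s^2."""
        was_face_up = False
        face_down_run = 0
        for x, y, z in accel_samples:
            if z > 0.8 * G:                      # lying roughly flat, screen up
                was_face_up = True
                face_down_run = 0
            elif z < -0.8 * G and was_face_up:   # screen down after being up
                face_down_run += 1
                if face_down_run >= hold_samples:  # ignore brief jolts
                    return True
            else:
                face_down_run = 0
        return False

    # Hypothetical usage: a short recording that ends with the phone face-down.
    readings = [(0, 0, 9.7)] * 10 + [(1, 2, -9.6)] * 8
    if detect_face_down_flip(readings):
        print("flip detected: decline incoming call")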

  Gesture sensors that build a 3D depth map are the ones with a chance to shine in the future. Once a sensor can construct a depth map, it gives the device not only gesture capability but also "vision". For example, Intel RealSense, the first-generation Microsoft Kinect, and the current Project Tango all use a similar depth-sensing principle (structured light), but with different enhancements and adjustments they extend into different application areas.
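  To make the idea of a depth map concrete, the following Python sketch shows one crude way a hand held in front of a depth sensor could be isolated; the numpy array here is synthetic, and real RealSense, Kinect, or Tango frames come through each product's own SDK, which this sketch does not reproduce.

    # Sketch: isolating the nearest object (e.g. a hand) in a depth frame.
    # The frame below is synthetic; thresholds are illustrative assumptions.
    import numpy as np

    def segment_nearest_blob(depth_mm, band_mm=150, max_range_mm=1200):
        """depth_mm: 2D array of per-pixel distances in millimetres (0 = no data)."""
        valid = (depth_mm > 0) & (depth_mm < max_range_mm)
        if not valid.any():
            return np.zeros_like(depth_mm, dtype=bool)
        nearest = depth_mm[valid].min()
        # keep everything within band_mm of the closest surface
        return valid & (depth_mm <= nearest + band_mm)

    # Synthetic 120x160 frame: background at ~1 m, a "hand" patch at ~0.4 m.
    frame = np.full((120, 160), 1000, dtype=np.int32)
    frame[40:80, 60:100] = 400
    mask = segment_nearest_blob(frame)
    print("hand pixels:", int(mask.sum()))  # 40 * 40 = 1600

  A real gesture pipeline would go much further (tracking the hand over time, extracting fingertips or skeleton joints), but the point is that once per-pixel depth exists, the device can begin to "see" shapes rather than merely detect motion.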

  RealSense emphasizes gesture, machine vision, and even iris recognition; Project Tango uses a dedicated computer-vision chip to recognize the surrounding space; and Kinect strengthens its whole-body skeleton-node algorithms, expanding motion sensing from hand gestures to the entire body.

  After several years of development, gesture as a man-machine interface has gradually moved from the application-development and niche stages into devices that ordinary users see every day. Gesture is not meant to replace the touch screen; rather, building on the related sensors, it is becoming, like touch, another mature natural user interface.

  Bringing in sensors with "visual" capability not only gives equipment the ability to respond to hand gestures, it also makes the devices more intelligent. As IHS puts it, on the one hand we are building human-machine interfaces for smart devices, but on the other hand we are also giving these devices "sensory perception", and that is the key to making them intelligent.
