4 Gesture navigation
4.5 Gestures in practice

Gesture recognition can be seen as one of the first ways in which computers can understand humans, or at least human body language. It may lead to a major shift in which input devices such as the keyboard and mouse become a minority. Using gesture recognition, it is possible to point a finger at the computer screen and have the cursor move accordingly.

Gesture navigation has become increasingly popular, mainly on smartphones, tablets and modern TVs. Everybody knows the swipe gesture for scrolling up/down or moving left/right. There are already applications offering sets of gestures for navigation on multi-touch devices, such as controlling the volume, locking the screen, controlling a music player, taking a screenshot, and navigating to home, back, recent apps and the menu.
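The swipe gesture mentioned above is typically classified from just the start and end points of a touch: the dominant axis of movement decides the direction. A minimal sketch of such a classifier, with illustrative names and thresholds that are assumptions rather than any particular platform's API, could look like this:

```python
# Hypothetical sketch of swipe classification from two touch points.
# The function name and the pixel threshold are illustrative assumptions.

def classify_swipe(start, end, min_distance=50):
    """Classify a touch as 'left', 'right', 'up', 'down', or None.

    start, end: (x, y) screen coordinates of touch-down and touch-up.
    min_distance: minimum travel in pixels to count as a swipe at all.
    """
    dx = end[0] - start[0]
    dy = end[1] - start[1]
    if max(abs(dx), abs(dy)) < min_distance:
        return None  # too short a movement: treat it as a tap, not a swipe
    if abs(dx) > abs(dy):  # the dominant axis decides the direction
        return "right" if dx > 0 else "left"
    return "down" if dy > 0 else "up"
```

Real gesture frameworks additionally consider velocity and multi-finger input, but the dominant-axis comparison is the core idea.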

The application area of gestures is not limited to navigation; as an example, a tennis training application using 3D gesture recognition was published by Cristian García Bauza and his team.

As an example, Samsung introduced gesture recognition in their flagship TV back in 2013. It consisted of a hand-tracking algorithm with recognition of a “click” gesture performed by closing the palm. To switch channels or adjust the volume, the user first had to present a raised hand to the TV, then steer the shown cursor (a remnant of the computer era) to either side of the screen where buttons appeared. Then, to increase the volume for instance, the user had to “click” with his/her hand several times while keeping it in position until the desired volume was reached. This approach could be considered intuitive, at least by computer-savvy users, but it could hardly serve as an example of a natural gesture.
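The interaction above can be modelled as a small state machine driven by per-frame hand poses: a raised open palm engages cursor tracking, closing the palm starts a “click”, and reopening it completes one. The sketch below is a simplified illustration, not Samsung's actual implementation; the per-frame input ('open'/'closed'/'none') is an assumed abstraction over a real hand-tracking pipeline.

```python
# Illustrative state machine for the palm-"click" interaction described
# above. Class and state names are hypothetical.

class HandClickRecognizer:
    def __init__(self):
        self.state = "idle"   # no hand detected yet
        self.clicks = 0       # completed open -> closed -> open cycles

    def update(self, hand):
        """Feed one per-frame hand pose: 'open', 'closed', or 'none'."""
        if hand == "none":
            self.state = "idle"       # hand lost: reset the interaction
        elif self.state == "idle" and hand == "open":
            self.state = "tracking"   # raised open palm engages the cursor
        elif self.state == "tracking" and hand == "closed":
            self.state = "pressed"    # closing the palm starts a click
        elif self.state == "pressed" and hand == "open":
            self.state = "tracking"   # reopening the palm completes it
            self.clicks += 1
        return self.state
```

Repeatedly closing and opening the palm while hovering over the volume button then increments `clicks`, matching the repeated “clicks” the user had to perform to reach the desired volume.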