4 System control in applications
4.3 Gestures for smartphones (and other applications)

Many mobile-phone applications have designed and implemented intuitive gestures that let users guess which movement they should make in order to run a specific command. Gestures also let designers build cleaner interfaces: new interfaces are now often designed with few or no clickable buttons, leaving more room for visual ideas. Buttons cannot disappear from mobile applications for good, as they play a crucial role in driving calls to action. However, where gestures feel more natural and intuitive and simplify user interaction, they should be implemented [5].

An example of a mobile application controlled with gestures is Google Maps, or any navigation system used on a mobile phone. The Google Maps app gives users the opportunity to apply various gestures to control certain functions. For example, to zoom the map in or out on the screen, you can slide a finger down or up, respectively.
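The drag-to-zoom behaviour described above can be illustrated with a minimal sketch. The function name, the exponential mapping, and the sensitivity constant are assumptions chosen for illustration; this is not Google's implementation or API.

```python
import math

def zoom_from_drag(start_y: float, current_y: float,
                   base_zoom: float, sensitivity: float = 0.01) -> float:
    """Map vertical finger travel to a new zoom level (illustrative only).

    On a screen whose y axis grows downward, dragging the finger down
    (current_y > start_y) zooms in, dragging up zooms out.
    """
    delta = current_y - start_y                  # pixels of vertical travel
    # An exponential mapping keeps zooming smooth and never negative.
    return base_zoom * math.exp(sensitivity * delta)

# Dragging 100 px down multiplies the zoom by e ≈ 2.72:
print(zoom_from_drag(start_y=200, current_y=300, base_zoom=1.0))
```

An exponential rather than linear mapping is a common design choice here, because equal finger travel then produces the same *relative* zoom change at every zoom level.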

Another application is Clear, an iOS task-management app. The remarkable thing about this application is that it has no buttons, so it is controlled entirely by gestures: it uses taps and swipes to add and remove tasks from a to-do list.
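A gesture-only interface like this amounts to a mapping from recognized gestures to list operations. The toy model below sketches that idea; the gesture names and behaviour are illustrative assumptions, not Clear's actual implementation.

```python
from dataclasses import dataclass, field

@dataclass
class TodoList:
    """Toy model of a gesture-driven to-do list (illustrative only)."""
    tasks: list = field(default_factory=list)
    done: list = field(default_factory=list)

    def handle(self, gesture: str, item: str) -> None:
        # Dispatch a recognized gesture to the matching list operation.
        if gesture == "pull_down":        # pull down at the top: new task
            self.tasks.insert(0, item)
        elif gesture == "swipe_right":    # swipe right: mark complete
            if item in self.tasks:
                self.tasks.remove(item)
                self.done.append(item)
        elif gesture == "swipe_left":     # swipe left: delete outright
            if item in self.tasks:
                self.tasks.remove(item)

todo = TodoList()
todo.handle("pull_down", "buy milk")
todo.handle("pull_down", "write report")
todo.handle("swipe_right", "buy milk")
print(todo.tasks, todo.done)   # ['write report'] ['buy milk']
```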

There are certainly more applications based on gesture navigation and control, but we have listed only the most widely used.

image
Google Maps with gesture navigation

Other applications using gesture control (...)

Many applications are developed for professional coaching (e.g. golf, baseball). Users do not need any additional hardware such as keyboards or joysticks: the applications provide three-dimensional body and hand motion capture in real time. For this purpose, the Kinect is used; it relies on a depth camera for motion control. While the Kinect primarily focuses on capturing body pose, Leap Motion developed a short-range gesture-capture device using a stereo infrared camera. The Leap Motion controller can track fine gestures of both hands at a high frame rate, enabling applications such as drawing and manipulating small objects in virtual space. Some PC vendors have partnered with Leap Motion to provide a natural user interface (NUI) in desktop applications such as Computer Aided Design (CAD) [3].
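Trackers of this kind expose 3-D joint positions (e.g. palm centre and fingertips), from which an application can derive simple gesture features. The sketch below counts extended fingers from such positions; it is a generic geometric illustration under assumed coordinates in metres, not the vendor SDK of Kinect or Leap Motion.

```python
def count_extended_fingers(fingertips, palm, threshold=0.12):
    """Count fingertips farther than `threshold` metres from the palm.

    A fingertip far from the palm centre is treated as extended;
    a fist keeps all fingertips close.  Illustrative heuristic only.
    """
    def dist(a, b):
        # Euclidean distance between two 3-D points.
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    return sum(1 for tip in fingertips if dist(tip, palm) > threshold)

palm = (0.0, 0.0, 0.0)
open_hand = [(0.15, 0.02, 0.0)] * 5   # all fingertips ~15 cm from the palm
fist      = [(0.05, 0.01, 0.0)] * 5   # all fingertips ~5 cm from the palm
print(count_extended_fingers(open_hand, palm))  # 5
print(count_extended_fingers(fist, palm))       # 0
```

Real SDKs provide richer data (bone orientations, grab/pinch strength), but many practical gestures reduce to thresholded geometric features like this one.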

Several software vendors also provide SDKs (software development kits) or middleware that let application developers easily integrate gesture and pose recognition into their applications (e.g. Gestoos, eyeSight).
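Such middleware typically exposes a callback-style API: the application registers a handler per gesture and the SDK invokes it when recognition fires. The class and method names below are illustrative assumptions about that common pattern, not the actual Gestoos or eyeSight API.

```python
class GestureRecognizer:
    """Minimal callback-registration pattern of the kind gesture
    middleware commonly offers (names are illustrative)."""

    def __init__(self):
        self._handlers = {}

    def on(self, gesture, handler):
        # Register a handler to run whenever `gesture` is recognized.
        self._handlers.setdefault(gesture, []).append(handler)

    def emit(self, gesture, **event):
        # Called by the recognition engine when a gesture is detected.
        for handler in self._handlers.get(gesture, []):
            handler(**event)

recognizer = GestureRecognizer()
log = []
recognizer.on("swipe_left", lambda **e: log.append(("previous_page", e)))
recognizer.emit("swipe_left", confidence=0.93)
print(log)   # [('previous_page', {'confidence': 0.93})]
```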

To reduce driver distraction and increase traffic safety, automobile manufacturers are coming up with touch-less hand-gesture interfaces (for example, BMW's camera-based gesture control system). Such an interface reduces the need for drivers to reach out to the dashboard control panel; it is a more natural way to control the infotainment system and helps keep the driver's eyes on the road.

Hand gestures are also a viable way to guide drone operations outdoors, so drones can fly independently of a remote control (e.g. summoning the drone back by waving your hands). An example is Spark, a drone from DJI controlled by hand gestures: all you need are your hands to command it. You can order it to distance itself, snap a selfie of you, or freely explore the skies, all with gestures. Spark can sense objects up to 5 m ahead of it, to automatically avoid unpleasant collisions [4].
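At the application level, gesture control of a drone reduces to a lookup from recognized gestures to flight commands, with unknown gestures safely ignored. The gesture and command names below are hypothetical illustrations, not DJI's actual gesture set or protocol.

```python
# Illustrative mapping from recognized hand gestures to drone commands;
# the names are assumptions for this sketch, not DJI's actual interface.
GESTURE_COMMANDS = {
    "palm_toward_drone": "hold_position",
    "wave":              "return_to_pilot",
    "frame_fingers":     "take_selfie",
    "palm_push_away":    "move_away",
}

def command_for(gesture: str) -> str:
    # Unrecognized gestures map to a no-op so the drone never reacts
    # to spurious detections -- a sensible safety default.
    return GESTURE_COMMANDS.get(gesture, "no_op")

print(command_for("wave"))        # return_to_pilot
print(command_for("thumbs_up"))   # no_op
```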

image
Spark