That Nokia has been interested in developing a wearable glass is a known fact. We had earlier reported two patents, one of which talks about displaying information on a see-through display while the other talks about eye-movement tracking with the help of a sensor. The latest patent granted to Nokia talks about an improved method of interaction with such a near-to-eye display (NED). Noting that existing methods don’t provide effective interaction mechanisms, the patent in question describes a more advanced method and equipment. The patent also clearly labels such a near-to-eye display as mobile device eyewear.

Head-worn displays, i.e. near-to-eye display devices, particularly mobile device eyewear, enable an individual to see natural or electronically produced images while simultaneously allowing the individual to see the real world through the display. In addition, head-worn display devices may comprise technology allowing the individual to interact with the display device, i.e. to control the device. Existing control mechanisms may comprise, for example, motion sensors, gaze tracking systems or touch sensors. However, existing control mechanisms are often inconvenient or hard to use.

Now, an improved method and technical equipment implementing the method have been invented. Various aspects of the invention include a method, an apparatus and a computer program, which are characterized by what is stated in the independent claims. Various aspects of examples of the invention are set out in the claims.

The method of interaction with such a wearable glass display has two aspects. The first is tracking the gaze of the eye, and the second is detecting hand movement when there is little or no movement of the eye. One camera detects the gaze and blinking of an eye, and the cursor moves with the gaze, while another camera detects hand movement. Once the cursor has followed the gaze to a menu option or icon, a hand gesture or a blink of the eye selects that menu option or clicks the icon.

According to a first aspect, a method is provided comprising: tracking the gaze of an eye of a user by a first camera, wherein a position of the gaze determines a place for the cursor on the screen of a near-to-eye display (NED) and the movement of the gaze is configured to move the cursor on the screen; when the movement of the gaze is detected to be still or moving only slightly, observing a gesture of one hand of the user by a second camera; and executing a further operation if at least one gesture of the hand is detected.
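In plainer terms, the claimed flow can be pictured as a small per-frame loop: the gaze point places the cursor, and a hand gesture is only acted on once the gaze is (nearly) still. The sketch below is a minimal illustration of that logic; the threshold value, data structures and function names are assumptions for illustration, not anything specified in the patent.

```python
from dataclasses import dataclass
import math

# Assumed threshold (in pixels per frame) below which the gaze counts as
# "still or moving only slightly"; the patent does not give a value.
GAZE_STILL_THRESHOLD = 5.0

@dataclass
class Cursor:
    x: float = 0.0
    y: float = 0.0

def update_frame(cursor, gaze_xy, hand_gesture):
    """One iteration of the claimed flow: the gaze places the cursor, and a hand
    gesture from the second camera is only acted on while the gaze is still."""
    gx, gy = gaze_xy
    movement = math.hypot(gx - cursor.x, gy - cursor.y)
    cursor.x, cursor.y = gx, gy                      # cursor follows the gaze point
    if movement <= GAZE_STILL_THRESHOLD and hand_gesture is not None:
        return hand_gesture                          # a further operation would run here
    return None

# Example: the gaze settles on one spot, then a fist gesture triggers an operation.
cursor = Cursor()
for gaze_xy, gesture in [((100, 80), None), ((101, 80), None), ((101, 81), "fist")]:
    trigger = update_frame(cursor, gaze_xy, gesture)
    if trigger:
        print("execute further operation for gesture:", trigger)
```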

The detected gesture is a movement of a fist, shaking of a palm, or movement of a palm towards or away from the second camera. According to an embodiment, the further operation is selecting a menu option or a functional icon displayed on the screen of the near-to-eye display if the cursor is on the menu option or functional icon. According to an embodiment, the further operation is zooming in or out of the view of the screen of the near-to-eye display. According to an embodiment, the further operation is returning, removing or deleting. According to an embodiment, the method further comprises observing blinking of the eye of the user if no gesture of the hand is detected and if the cursor is on the area of a menu option or a functional icon displayed on the screen, and selecting the menu option or the functional icon if blinking of the eye of the user is detected.
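Putting the listed embodiments together, gesture recognition maps a detected hand gesture to one of the further operations, with a blink of the eye as a fallback selector when no gesture is seen and the cursor rests on a menu option or icon. Below is a hedged sketch of that dispatch; the gesture names and operation labels are made up for illustration and are not the patent's vocabulary.

```python
# Hypothetical mapping from detected gestures to the "further operations" the
# embodiments describe; gesture names and operation labels are illustrative only.
GESTURE_OPERATIONS = {
    "fist_movement": "select the item under the cursor",
    "palm_shake": "return / remove / delete",
    "palm_towards_camera": "zoom in",
    "palm_away_from_camera": "zoom out",
}

def resolve_operation(gesture, eye_blinked, cursor_on_item):
    """Prefer a hand gesture; if none was detected, fall back to blink-to-select
    when the cursor rests on a menu option or functional icon."""
    if gesture in GESTURE_OPERATIONS:
        return GESTURE_OPERATIONS[gesture]
    if gesture is None and eye_blinked and cursor_on_item:
        return "select the item under the cursor"
    return None

print(resolve_operation("palm_towards_camera", eye_blinked=False, cursor_on_item=False))  # zoom in
print(resolve_operation(None, eye_blinked=True, cursor_on_item=True))                     # select
```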

The eye-tracking camera may be “an inside-looking infrared video camera”, while the second camera may be “any video camera that may be trained on a hand of the user”. On detection of a hand gesture, the “manual hand gesture control” gets activated and takes over control of the UI. As soon as the hand disappears from the view of the second camera, eye gaze control reactivates and controls the device again.

The first camera may be an inside-looking infrared video camera that may track the gaze of one eye of a user of the NED. A cursor is arranged at the view point of the eye on a see-through screen, wherein the screen is at least part of at least one lens of the NED.

The second camera may be any video camera that may be trained on a hand of the user; in other words, the second camera may be a so-called side down-looking camera observing gestures of the hand. When at least one gesture is detected by the second camera after activation, the second camera may inform the user interface (UI) software that a hand gesture has been detected, and hand gesture control will be activated. Thus, when the second camera detects the hand, manual control takes over from gaze control. On the other hand, when the hand disappears from the view of the second camera, the gaze reassumes control of the UI.
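The handover between the two input modes thus reduces to a simple rule driven by whether the second camera currently sees a hand. Here is a minimal sketch of that rule, assuming the UI software merely receives a "hand visible" flag from the gesture camera (an assumption, since the patent does not spell out the interface):

```python
from enum import Enum, auto

class ControlMode(Enum):
    GAZE = auto()  # the inside-looking infrared camera drives the cursor
    HAND = auto()  # the down-looking camera's gestures drive the UI

def control_mode(hand_visible: bool) -> ControlMode:
    """Hand in view: manual gesture control takes over.
    Hand out of view: gaze control reassumes control of the UI."""
    return ControlMode.HAND if hand_visible else ControlMode.GAZE

# Example handover sequence: GAZE -> HAND -> HAND -> GAZE
for visible in (False, True, True, False):
    print(control_mode(visible))
```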

The patent in question is WO2014015521A1.