A method for selecting and activating a particular menu displayed in a client's region of a monitor screen by use of an image cognition is disclosed. Using an image-capturing device such as a camera attached to a system, a user's image is recognized at real time and displayed on an initial screen of a monitor. The user makes a direct hand motion while viewing his own image displayed on the initial screen, and when a desired menu icon is designated among a variety of menu icons arrayed on the initial screen, the system guides the user's hand image to the corresponding menu icon for its selection. When the user makes a particular body motion to activate the selected menu, the system recognizes the motion for thereby activating the selected menu.
21. An application menu selecting and activating method using image cognition, comprising the steps of:
displaying a user's image on a client region of a display screen, the user's image being displayed on a first area of the display screen;
displaying a menu image on pattern regions of the display screen, the menu image being displayed on a second area of the display screen, wherein the first area and the second area are separately positioned on the display screen;
determining whether a pattern position on a screen is positioned in a predetermined pattern region by scanning the predetermined pattern region;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu, wherein the pattern is displayed and moved on the second area of the display screen, the pattern being moved between the pattern regions in response to a user's gesture displayed on the first area of the display screen.
1. An application menu selecting and activating method using image cognition, comprising the steps of:
recognizing a pattern position on a screen using a pattern cognition function executed per predetermined time period;
selecting a menu when the recognized pattern position is within a certain pattern region on the screen, the pattern region containing the menu; and
activating the selected menu.
2. The application menu selecting and activating method of
3. The application menu selecting and activating method of
4. The application menu selecting and activating method of
5. The application menu selecting and activating method of
6. The application menu selecting and activating method of
7. The application menu selecting and activating method of
a body grabbable by the user;
a first pattern portion formed on a side end of the body;
a second pattern portion disposed at an outer end of the first pattern portion and guidable through the first pattern portion; and
a button for guiding the second pattern portion into and out of the first pattern portion.
8. The application menu selecting and activating method of
9. The application menu selecting and activating method of
10. The application menu selecting and activating method of
11. The application menu selecting and activating method of
12. The application menu selecting and activating method of
13. An application menu selecting and activating method using image cognition, comprising the steps of:
determining a pattern position on a screen by scanning a predetermined pattern region on the screen;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu.
14. The application menu selecting and activating method of
15. The application menu selecting and activating method of
16. An application menu selecting and activating method, comprising the steps of:
determining whether a pattern on a screen is positioned in a predetermined pattern region by scanning the predetermined pattern region on the screen;
selecting a menu in the pattern region in which the pattern is positioned; and
activating the selected menu, wherein the pattern is an indication rod, the indication rod including a first pattern portion and a second pattern portion, the first pattern portion being used in the selecting of the menu, the second pattern portion being used in the activating of the selected menu.
17. The application menu selecting and activating method of
18. The application menu selecting and activating method of
19. An application menu selecting and activating apparatus using image cognition, comprising:
a camera for capturing an image; and
display means for displaying the image received from the camera on a screen, for designating particular regions of the screen for displaying respectively a plurality of predetermined menus, and for selecting a menu from the plurality of predetermined menus when a pattern is positioned on its corresponding region.
20. An application menu selecting and activating method using image cognition, comprising the steps of:
recognizing a user's image in real time;
displaying the user's image on a client region of a display screen;
recognizing a pattern position by a pattern cognition per predetermined time period;
selecting a menu when the recognized pattern position is within a certain pattern region containing predetermined menus; and
activating the selected menu.
22. An application menu selecting and activating apparatus using image cognition, comprising:
a camera for capturing a user's image in real time;
display means for displaying the user's image received from the camera on a client region and for designating a particular region of the externally applied image;
means for selecting a required menu when a pattern is positioned on a corresponding region; and
means for activating the selected menu.
23. The method of claim 21, wherein the activating is performed by recognizing the user's gesture.
24. The method of claim 23, wherein the recognizing the user's gesture is performed by comparing a previously captured user image with a currently captured user image.
25. The method of claim 21, wherein the activating is performed by determining a predetermined stationed time lapse.
This application is a Reissue of U.S. Pat. No. 6,160,899. More than one reissue application has been filed for the reissue of U.S. Pat. No. 6,160,899. The Reissue application numbers are Ser. Nos. 13/027,619 and 13/048,945 (the present application).
1. Field of the Invention
The present invention relates to a method of selecting and activating an application menu, and more particularly, to an improved method of application menu selection and activation through image cognition, wherein a menu is selected and activated in correspondence to a user's motion while the motion image of the user is recognized at real time by an image-capturing device such as a camera.
2. Description of the Background Art
In order to select and activate a particular item from a list of application menus displayed on a monitor screen, a computer generally employs an input device such as a keyboard, mouse, or touchpad.
Under a touch-screen method, the moment a user directly touches a desired menu item in the menu list displayed on the monitor screen, the menu item becomes activated.
As another example, a pointer-type wireless control device employing an infrared transmission device is used to select and activate a menu item. Such a device works with a plurality of sensors provided at corner portions of a monitor: a phase difference is calculated from the infrared signal generated by the transmission unit, and coordinate values are obtained so that the transmitter can move a pointer to a desired position, thereby selecting and activating the required menu item.
However, such conventional technologies require an additional, external device for menu selection and activation.
Further, in the case of the touch-screen and the pointer-type wireless control device, a plurality of sensors must disadvantageously be provided at corner portions of the monitor.
The present invention is directed to solving the conventional disadvantages.
Accordingly, it is an object of the present invention to provide a method of application menu selection and activation using image cognition which is capable of selecting and activating a menu list in response to a user's motion or a particular device movement while recognizing a user's image at real time by use of an image-capturing device such as a camera.
According to an embodiment of the present invention, using an image-capturing device such as a camera attached to a system, a user's image is recognized at real time and displayed on an initial screen of a monitor. The user makes a direct hand motion while viewing his own image displayed on the initial screen, and when a desired menu icon is designated among a variety of menu icons arrayed on the initial screen, the system guides the user's hand image to the corresponding menu icon for its selection. When the user makes a particular body motion to activate the selected menu, the system recognizes the motion for thereby activating the selected menu.
In the above-described embodiment, a pattern wearable on a finger may be employed so as to accurately recognize a user's specific motion. When the user indicates a desired menu icon wearing the pattern on his finger, the system guides the user's hand image on the screen to move toward the corresponding menu icon for the menu selection. As described in the above-described embodiment, when the user makes a particular body motion to activate the selected menu, the system recognizes the motion for thereby activating the selected menu.
According to another embodiment of the present invention, a particular pattern grabbable by a user is employed. When the user indicates a desired menu icon, the system guides the user's hand image displayed on the screen to move to the corresponding menu icon for its selection, and when the user operates a menu activating member provided in the pattern itself, the system responds, whereby the selected menu becomes activated.
The object and advantages of the present invention will become more readily apparent from the detailed description given hereinafter. However, it should be understood that the detailed description and specific examples, while indicating preferred embodiments of the invention, are given by way of illustration only, since various changes and modifications within the spirit and scope of the invention will become apparent to those skilled in the art from this detailed description.
The present invention will become better understood with reference to the accompanying drawings which are given only by way of illustration and thus are not limitative of the present invention, wherein:
On the initial screen serving as a client's window region, a plurality of menu lists are displayed in the form of icons 11, 12, 13, . . . , 16. A user's image is displayed on the entire initial screen together with the menu lists.
The menu icons 11, 12, 13, . . . , 16 are displayed on the left of the screen, and the dotted squares B1, B2, B3, . . . , B6 enclosing the icons 11, 12, 13, . . . , 16, respectively, are pattern regions for pattern cognition and they do not appear on the real screen.
When the system 2 starts operation, the user's image captured by the camera 1 is displayed on the monitor screen. Accordingly, the user can view his own image being displayed on the screen as shown in
Since the user's own image is displayed on the screen, the user can easily locate his hand, as if he were standing in front of a mirror.
The steps by which a menu icon is selected and the selected menu icon is activated following the user's hand movement will now be described.
When the user moves his hand toward the region of menu icons 11, 12, 13, . . . , 16, the user's hand image on the screen also moves toward the menu icons.
In the meantime, the system 2 continuously checks the screen color within the plurality of pattern regions B1, B2, B3, . . . , B6 (Step S22). Since the user's hand is flesh-colored and the screen background is not, when the user moves his hand to a certain pattern region B2, the color in the pattern region B2 changes to flesh color. The system 2 checks whether the screen color within the pattern regions B1, B2, B3, . . . , B6 has converted to flesh color, thereby determining that the user's hand is positioned on a particular menu icon (Step S23).
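The region-scanning idea of Steps S22 and S23 can be sketched as follows. This is a minimal illustration, not code from the patent: the frame representation, the flesh-tone thresholds, and the majority rule are all assumptions.

```python
def is_flesh(rgb):
    """Crude flesh-tone test on an (R, G, B) tuple; thresholds are assumptions."""
    r, g, b = rgb
    return r > 150 and g > 90 and b > 70 and r > g > b

def occupied_regions(frame, regions):
    """Return the names of pattern regions whose pixels have turned flesh-colored.

    frame: dict mapping (x, y) -> (R, G, B)
    regions: dict mapping region name -> list of (x, y) pixel coordinates
    """
    hits = []
    for name, pixels in regions.items():
        flesh = sum(is_flesh(frame[p]) for p in pixels)
        if flesh > len(pixels) // 2:  # majority of the region's pixels are flesh
            hits.append(name)
    return hits

# Toy frame: region B2 is covered by a hand-colored patch, B1 by background.
frame = {(0, 0): (30, 60, 120), (1, 0): (200, 140, 110)}
regions = {"B1": [(0, 0)], "B2": [(1, 0)]}
print(occupied_regions(frame, regions))  # ['B2']
```

In a real system the frame would come from the camera and the test would run once per scan period over every region B1 through B6.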
In
In the next step, if the user nods his head, the system 2 recognizes the nodding through a gesture cognition device provided within the system 2 and accordingly activates the selected menu icon 12 (Steps S25, S26).
Meanwhile, in order for the system to recognize the user's gesture, pattern cognition using a moving image is required. That is, the system continuously captures the user's image, the captured moving image is preprocessed, and the previously captured image is compared with the presently captured image to extract the characteristics of the two images, whereby the nodding of the user's head can be determined from the extracted characteristics.
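The frame-comparison step above can be sketched with simple frame differencing. This is an illustrative assumption, not the patent's implementation: grayscale frames are lists of lists, and both thresholds are invented for the example.

```python
def frame_diff(prev, curr, thresh=30):
    """Count pixels whose intensity changed by more than `thresh` between frames."""
    changed = 0
    for row_p, row_c in zip(prev, curr):
        for p, c in zip(row_p, row_c):
            if abs(p - c) > thresh:
                changed += 1
    return changed

def head_moved(prev, curr, min_changed=2):
    """Decide from the difference whether the tracked region moved (e.g. a nod)."""
    return frame_diff(prev, curr) >= min_changed

prev = [[10, 10], [10, 10]]
curr = [[90, 95], [10, 10]]   # top row changed: the head region shifted
print(head_moved(prev, curr))  # True
```

A practical system would restrict the comparison to the head region and smooth the decision over several frames.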
The method in the above-described embodiment activates a menu by recognizing the user's gesture. Alternatively, menu activation can be performed when a particular pattern stays within a certain pattern region for a certain time period: by adding a function to the system, the stationed time of the particular pattern is counted, and when a predetermined time elapses the menu becomes activated.
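The dwell-time alternative can be sketched as a counter over periodic scans. The scan period, the 2.0-second dwell threshold, and the region names are assumptions for illustration only.

```python
DWELL_SECONDS = 2.0  # assumed activation threshold

def dwell_activate(samples, period=0.5, dwell=DWELL_SECONDS):
    """Return the region to activate once the pattern has stayed in it long enough.

    samples: region name observed at each scan (one scan per `period` seconds),
             or None when the pattern is outside every region.
    """
    stay, current = 0.0, None
    for region in samples:
        if region == current and region is not None:
            stay += period            # pattern stayed put: accumulate dwell time
        else:
            current, stay = region, 0.0  # pattern moved: restart the counter
        if stay >= dwell:
            return current
    return None

print(dwell_activate(["B2", "B2", "B2", "B2", "B2"]))  # 'B2'
print(dwell_activate(["B1", "B2", "B1", "B2", None]))  # None
```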
In the menu selection mode using recognition of the user's hand motion, an erroneous operation may occur when the system's inaccurate recognition mistakes an arm motion for a hand motion. In order to overcome such erroneous operation of the system, a simple pattern can be worn on the user's finger.
As shown in
As further shown in
The indication rod as shown in
With reference to
First, the data with regard to the first pattern portion 12 and the second pattern portion 13 are set in the system.
Step S31 is identical to Step S21 in
Next, when the user presses the button 14 of the indication rod, the second pattern portion 13 is externally exposed from the first pattern portion 12. When the system detects the exposure of the second pattern portion 13, the selected menu icon 12 becomes activated. Likewise, when such an indication rod having the first and second pattern portions is employed, the system does not require the gesture cognition function described in the first embodiment of the present invention.
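The indication-rod flow can be summarized as a small state machine: the first pattern portion drives selection, and the appearance of the second pattern portion triggers activation. This sketch is an assumption for illustration; the patent does not specify this structure.

```python
def rod_step(state, first_region, second_visible):
    """Advance the indication-rod state machine by one scan.

    state: currently selected region, or None.
    first_region: region the first pattern portion sits in (or None).
    second_visible: True when the button exposes the second pattern portion.
    Returns (new_state, activated_region_or_None).
    """
    if first_region is not None:
        state = first_region          # selection follows the first pattern
    if second_visible and state is not None:
        return state, state           # button press activates the selected menu
    return state, None

state, fired = rod_step(None, "B2", False)   # first pattern enters B2: select
state, fired = rod_step(state, None, True)   # button pressed: activate B2
print(fired)  # B2
```

Because activation is signaled by the second pattern rather than a body motion, no gesture cognition is needed in this flow.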
In
As shown in
When the user moves his hand leftward to select a menu, the system 52 recognizes the leftward motion so that the hand in the user's image displayed in the image block 53 makes a leftward movement, and accordingly the user's desired menu is selected by checking the screen color of the pattern region 54.
When the user moves his hand, the user's image is displayed inside the image block 63 and the system causes the pointer 64 to move in response to the user's hand movement. Here, the pointer serves as a mouse pointer mainly employed in the window's operating system in a computer.
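Driving the pointer from the hand position found in the image block amounts to a coordinate scaling, sketched below. The block and screen dimensions are illustrative assumptions.

```python
def hand_to_pointer(hand_xy, block_size, screen_size):
    """Scale a hand coordinate inside the image block to full-screen coordinates."""
    bx, by = block_size
    sx, sy = screen_size
    x, y = hand_xy
    return (x * sx // bx, y * sy // by)

# Hand at the center of a 160x120 image block maps to the center of a 640x480 screen.
print(hand_to_pointer((80, 60), (160, 120), (640, 480)))  # (320, 240)
```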
The method of menu selection and activation using image cognition according to the preferred embodiments of the present invention may also replace the mouse-oriented menu selection and activation in prevalent window system computers.
As the present invention may be embodied in several forms without departing from the spirit or essential characteristics thereof, it should also be understood that the above-described embodiments are not limited by any of the details of the foregoing description, unless otherwise specified, but rather should be construed broadly within the spirit and scope of the invention as defined in the appended claims, and therefore all changes and modifications that fall within the metes and bounds of the claims, or equivalents of such metes and bounds, are intended to be embraced by the appended claims.