US20120213408A1 - System of controlling device in response to gesture - Google Patents
- Publication number: US20120213408A1 (application No. US 13/455,095)
- Authority: United States (US)
- Prior art keywords
- gesture
- attribute
- background target
- control
- background
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/01—Input arrangements or combined input and output arrangements for interaction between user and computer
- G06F3/017—Gesture based interaction, e.g. based on a set of recognized hand gestures
Definitions
- The present invention relates to a system of controlling a device in response to a gesture.
- In order for a human being to control a device in a naturally intuitive manner, several technologies that control a device using a gesture have been proposed. However, in the case that different controls are to be performed using the same gesture (for example, different controls for the same device, or control of a plurality of different devices), the following problems exist.
- In one case, a gesture may be required for every function. However, it is difficult to associate an intuitive gesture with a certain function (for example, a "power on" function), and it is not easy for a human being to remember and correctly perform all the gestures corresponding to all the functions. In another case, a control or a device must first be selected. This requires a method for proposing candidates of the control or the device in advance, with a selection gesture performed on that basis, so a gesture that should be naturally intuitive becomes laborious and inefficient.
- These problems may be partly solved by giving a specific gesture different meanings through one of several methods. One such method uses a voice together with a gesture, as disclosed in JP-A-10-31551; that is, it uses the voice at the point in time when the gesture is recognized, or language information obtained through voice recognition. With voices, however, false recognition due to noise or individual differences is a significant problem, and the user still needs to remember correspondences between controls or devices and plural types of words, so the user's inconvenience is not substantially removed.
- An advantage of some aspects of the invention is to provide a control system that enhances the convenience of a technology enabling different controls using the same gesture.
- A system of controlling a device according to the invention includes: a gesture recognition unit which recognizes a gesture; an attribute recognition unit which recognizes an attribute of a background target of the recognized gesture; and a command transmitting unit which generates a control command on the basis of a combination of the recognized gesture and the background target attribute and transmits the control command to a device.
- The gesture is not limited to motion of a human body, but may be performed with anything (for example, a pointing stick).
- Herein, the background target is, for example, at least one of a shape, a letter (including a numeral or symbol), a color and a pattern. The background target may be specified on the basis of a photographed image of the background of the gesture, or may be expressed as information read from a storage medium (for example, an RFID (radio frequency identification) tag) installed in a variety of targets.
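The core dispatch described above — choosing a control from the combination of a recognized gesture and a background target attribute — can be sketched as a simple lookup table. The following is an illustrative sketch only; the gesture names, attribute names and command tuples are assumptions for the example, not taken from the patent text.

```python
# Hypothetical setting information: each (gesture, background attribute)
# combination maps to a (control target, control) pair, as the command
# transmitting unit is described as doing. All names are illustrative.
SETTINGS = {
    ("swipe_right", "red"):  ("air_conditioner", "power_on"),
    ("swipe_right", "blue"): ("air_conditioner", "power_off"),
}

def build_command(gesture, attribute):
    """Return a (device, control) command, or None if no setting matches."""
    return SETTINGS.get((gesture, attribute))
```

The same swipe gesture yields different commands depending only on the background attribute, which is the point of the invention.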
- FIG. 1 illustrates functional blocks and process flows of a control system according to a first embodiment of the invention.
- FIG. 2 illustrates functional blocks and process flows of the control system according to a second embodiment of the invention.
- FIG. 3 illustrates functional blocks and process flows of the control system according to a third embodiment of the invention.
- FIG. 4 illustrates a process performed according to a fourth embodiment of the invention.
- FIG. 5A illustrates a process performed according to a fifth embodiment of the invention.
- FIG. 5B illustrates a process performed as a modified example according to the fifth embodiment of the invention.
- FIG. 6 illustrates a process performed according to a sixth embodiment of the invention.
- FIG. 1 illustrates functional blocks and process flows of a control system 100 according to a first embodiment of the invention.
- A device 109 is provided as a control target. The device 109 may be any of a variety of apparatuses, for example, a personal computer, a television apparatus, a printer, a projector, electronic paper, an illumination device, or an air conditioner.
- In the present embodiment, the control target device 109 is, for example, a single device, but it may instead be a plurality of devices, as in a third embodiment to be described later.
- The control system 100 recognizes a color as the background target of a gesture and transmits, to the device 109, a control command corresponding to the combination of the gesture and the attribute of the recognized color. A user therefore controls the device 109 by performing a gesture over a target having a certain color; that is, the user chooses the colored target over which to perform the gesture according to how the device 109 is to be controlled.
- The thing bearing the background target of the gesture may be, for example, a wall, paper, a device, a human body or some part thereof (such as the head, a hand or a leg), an image projected by a projector or the like, or a display image on the screen of a mobile phone, a personal computer, electronic paper or the like.
- The control system 100 includes a gesture sensor 108, a camera 107, a gesture recognition unit 105, an attribute recognition unit 104, a setting information storing unit 103, and a command transmitting unit 106.
- The gesture recognition unit 105, the attribute recognition unit 104 and the command transmitting unit 106 may each be provided as a hardware circuit, as a function realized by a microprocessor executing a computer program, or as a combination of the two. Further, the gesture recognition unit 105, the attribute recognition unit 104, the setting information storing unit 103 and the command transmitting unit 106 may be provided in a single apparatus or may be distributed over a plurality of apparatuses.
- The gesture sensor 108 detects a gesture and may include, for example, a camera, a laser pointer or the like. A signal indicating the detected gesture is input from the gesture sensor 108 to the gesture recognition unit 105.
- The camera 107 is installed in a predetermined place (for example, on the ceiling of a room) to photograph the background of the gesture. A signal indicating the image photographed by the camera 107 is input from the camera 107 to the attribute recognition unit 104. The photographed image may be a moving image or a still image, and a plurality of cameras 107 may be provided.
- A special camera which detects only light having a specific wavelength may be employed as the camera 107. In this case, a color material emitting light of that wavelength is applied to the surface of the thing bearing the background target attribute of the gesture.
- The gesture recognition unit 105 recognizes the gesture on the basis of the signal input from the gesture sensor 108 and notifies the command transmitting unit 106 of information indicating the recognized gesture.
- The gesture recognition unit 105 also recognizes the position at which the gesture is performed (hereinafter, the gesture position). The gesture position is recognized, for example, on the basis of which gesture sensor 108 among a plurality of gesture sensors 108 detected the gesture. The gesture recognition unit 105 notifies the attribute recognition unit 104 of information indicating the recognized gesture position.
- The gesture position may be expressed in a coordinate system determined, for example, by the installation position of the sensors. For example, it may be expressed in an xyz coordinate system in which the longitudinal and transverse axes of the room are the x-axis and the y-axis, respectively, and the height is the z-axis. The background target of the gesture then exists at a position whose z coordinate is lower than that of the gesture position.
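The coordinate convention above — floor plane as x/y, height as z, with the background target lying below the gesture — can be sketched as follows. This is an illustrative sketch; the region shape, the radius and all function names are assumptions, not from the patent.

```python
# Sketch: given a gesture position (x, y, z) in the room's coordinate
# system, anything at a lower z inside a small x/y circle around the
# gesture is a candidate background target. Radius is an assumption.

def background_search_region(gesture_pos, radius=0.3):
    """Return a search region on surfaces below the gesture position."""
    x, y, z = gesture_pos
    return {"x": x, "y": y, "radius": radius, "max_z": z}

def is_candidate_background(point, region):
    """True if `point` (x, y, z) lies below the gesture inside the region."""
    dx = point[0] - region["x"]
    dy = point[1] - region["y"]
    within_xy = (dx * dx + dy * dy) ** 0.5 <= region["radius"]
    return point[2] < region["max_z"] and within_xy
```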
- The attribute recognition unit 104 analyzes the photographed image input from the camera 107 on the basis of the gesture position notified from the gesture recognition unit 105, so as to recognize the attribute of the background target of the gesture, and notifies the command transmitting unit 106 of information indicating the recognized background target attribute.
- The setting information storing unit 103 is a storage resource (for example, a volatile or nonvolatile memory) which stores setting information in advance. The setting information indicates which control is to be performed when a certain gesture and a certain background target attribute are recognized, and may be edited by a user from a specific console.
- The command transmitting unit 106 specifies, from the setting information in the setting information storing unit 103, the control and the control target corresponding to the combination of the gesture notified from the gesture recognition unit 105 and the background target attribute notified from the attribute recognition unit 104. The command transmitting unit 106 then generates a control command for performing the specified control on the specified control target (device 109) and transmits the generated control command to the device 109.
- In this example, the device 109 is an air conditioner, and the setting information indicates that the control target is an air conditioner; a red color or a blue color represents the power switch; a blue color represents "ON"; and a red color represents "OFF".
- The color attributes serving as background targets are two colors, red and blue, and the things bearing them are a red paper 112 and a blue paper 113 (these may be papers on which a red region and a blue region are printed using a color material emitting light of a special wavelength, or may be commercially available colored papers).
- The gestures performed over the red paper 112 and the blue paper 113 are the same (for example, a gesture in which a hand is moved horizontally from left to right, as indicated by an arrow).
- The gesture sensor 108 detects the gesture (S101), and a signal indicating the detected gesture is input to the gesture recognition unit 105.
- The gesture recognition unit 105 recognizes the gesture and its position on the basis of the signal input from the gesture sensor 108 (S102), notifies the command transmitting unit 106 of information indicating the recognized gesture (S103), and notifies the attribute recognition unit 104 of information indicating the recognized gesture position (S104).
- The attribute recognition unit 104 obtains, from the camera 107, a photographed image of the background of the notified gesture position, that is, an image showing the blue paper 113 (S105). For example, the attribute recognition unit 104 causes the camera 107 to photograph the background with a region of a predetermined size including the gesture position as its field of view. In this case, the image of the background target is obtained after the gesture is recognized, which decreases the possibility that the background target is hidden in the image by the part that was moved in the gesture. The attribute recognition unit 104 analyzes the photographed image to recognize the blue color as the background target attribute (S106), and notifies the command transmitting unit 106 of information indicating that the recognized background target attribute is the blue color (S107).
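Step S106 — recognizing the color attribute from the photographed region — might be implemented along the following lines. This is a deliberately minimal sketch: an "image" here is just a list of RGB tuples, and the color names and thresholds are assumptions for illustration, not values from the patent.

```python
from collections import Counter

def classify_pixel(rgb):
    """Map one RGB pixel to a named color (thresholds are assumptions)."""
    r, g, b = rgb
    if r > 200 and g < 100 and b < 100:
        return "red"
    if b > 200 and r < 100 and g < 100:
        return "blue"
    if r > 200 and g > 200 and b < 100:
        return "yellow"
    return "other"

def dominant_color(pixels):
    """Return the most frequent recognized color in the region, or None."""
    counts = Counter(classify_pixel(p) for p in pixels)
    counts.pop("other", None)  # unrecognized pixels don't vote
    return counts.most_common(1)[0][0] if counts else None
```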
- The command transmitting unit 106 specifies, from the setting information in the setting information storing unit 103, the control "power switch on" corresponding to the combination of the notified gesture and the background target attribute of the blue color (S108).
- The command transmitting unit 106 generates a control command for turning on the air conditioner 109 (S109) and transmits the generated control command to the air conditioner 109 (S110), whereby the air conditioner 109 is turned on.
- The red paper 112 and the blue paper 113 may be placed wherever the user desires, for example, on a wall of the room, and may also be moved to another room.
- If the gesture is performed over the red paper 112 or the blue paper 113 in that other room, the power of the air conditioner installed in that room is switched on or off. That is, the red paper 112 and the blue paper 113 may serve as a common power switch for the air conditioners of a plurality of rooms.
- A control system 100 may be provided in every room, but from the viewpoint of reducing the number of components, it is preferable that only the camera 107, the gesture sensor 108 and the device 109 be installed in every room while the gesture recognition unit 105, the attribute recognition unit 104, the setting information storing unit 103 and the command transmitting unit 106 are shared among the plurality of rooms.
- In this embodiment, the control is determined by the combination of the gesture and the background target attribute, so different controls can be performed with the same gesture. In addition, since the control to be performed can be intuitively inferred from the background target attribute, the burden on the user of memorizing the correspondence between such combinations and the performed controls is reduced.
- FIG. 2 illustrates functional blocks and process flows of a control system 200 according to the second embodiment of the invention.
- In the second embodiment, a single sheet of yellow paper 214 is used as the power switch (the thing bearing the background target) of an air conditioner 109. A method of performing a gesture for each of power on and power off of the air conditioner 109 is written on the yellow paper 214 (for example, as shown in FIG. 2, an arrow indicating the direction in which the hand is to be moved).
- With the yellow paper 214, a down gesture, in which the hand is moved from up to down over the paper, turns the power of the air conditioner 109 on, and an up gesture, in which the hand is moved from down to up over the paper, turns the power off.
- Setting information in a setting information storing unit 203 indicates that the control target is an air conditioner; the yellow paper represents the power switch; an up gesture represents "OFF"; and a down gesture represents "ON".
- In step S202, the up gesture is recognized, and in step S203, information indicating the up gesture is notified to the command transmitting unit 106.
- In step S206, a yellow color is recognized as an attribute of the background target, and in step S207, information indicating that the background target attribute is the yellow color is notified to the command transmitting unit 106.
- The command transmitting unit 106 specifies, from the setting information, the control "OFF" corresponding to the up gesture and the yellow background target attribute (S208), generates a control command for turning off the air conditioner 109 (S209) and transmits the control command to the air conditioner 109. Accordingly, the air conditioner 109 is turned off.
- A set of a downward arrow and the letters "ON", and a set of an upward arrow and the letters "OFF", are written on the yellow paper 214, so anyone can control the air conditioner 109; furthermore, the paper serving as the power switch is reduced to a single sheet (in the first embodiment, a red region and a blue region could likewise be placed on a single sheet).
- The second embodiment may be realized on the basis of the first embodiment as follows: the setting information of the first embodiment is edited into the setting information described above, and the yellow paper 214 is prepared instead of the papers 112 and 113.
- The arrows or other markings indicating how to perform a gesture may also be handwritten by the user.
- FIG. 3 illustrates functional blocks and process flows of a control system 300 according to a third embodiment of the invention.
- In the third embodiment, a plurality of devices is controlled, for example, an air conditioner 109A and a television apparatus (TV) 109T. Further, an OCR (optical character reader) engine 302 is provided in the control system 300. The OCR engine 302 recognizes letters on a target shown in a photographed image; it is, for example, a computer program executed by a microprocessor.
- Setting information in a setting information storing unit 303 indicates that the letter "A" represents an air conditioner; the letter "T" represents a TV; a transverse gesture represents an on/off toggle; an up gesture represents "move up"; a down gesture represents "move down"; and a yellow color represents a switch.
- A first paper 315A and a second paper 315T are provided as things bearing the background target: the first paper 315A has the letter "A" and a yellow color on its surface, and the second paper 315T has the letter "T" and a yellow color on its surface.
- In step S302, the transverse gesture is recognized, and in step S303, information indicating the transverse gesture is notified to the command transmitting unit 106.
- In step S306, the yellow color is recognized as an attribute of the background target, and the attribute recognition unit 304 causes the OCR engine 302 to analyze the photographed image, so that the letter "T" is also recognized as a background target attribute.
- In step S307, information indicating that the background target attributes are the yellow color and the letter "T" is notified to the command transmitting unit 106.
- The command transmitting unit 106 specifies, from the setting information, the control target "TV" and the control "on/off toggle" corresponding to the transverse gesture and the background target attributes of the yellow color and the letter "T" (S308), generates a control command for toggling the power of the TV 109T (S309) and transmits the control command to the TV 109T. The power of the TV 109T is thereby toggled.
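The third embodiment's dispatch differs from the first in that the letter attribute selects the control target while the gesture selects the control, with the yellow color marking the region as a switch at all. A sketch under those assumptions (all identifiers are illustrative, not from the patent):

```python
# Hypothetical third-embodiment setting information: letters pick the
# device, gestures pick the control, yellow marks a valid switch region.
DEVICE_BY_LETTER = {"A": "air_conditioner", "T": "tv"}
CONTROL_BY_GESTURE = {
    "transverse": "toggle_power",
    "up": "move_up",      # e.g. temperature or volume up
    "down": "move_down",  # e.g. temperature or volume down
}

def dispatch(gesture, color, letter):
    """Return (device, control), or None when the combination has no setting."""
    if color != "yellow":  # only yellow regions act as switches
        return None
    device = DEVICE_BY_LETTER.get(letter)
    control = CONTROL_BY_GESTURE.get(gesture)
    return (device, control) if device and control else None
```

Adding one more letter would add a whole device's worth of controls without adding any new gestures, which is the scaling advantage the embodiment claims.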
- According to the third embodiment, it is possible to switch the power of the air conditioner 109A on and off and to raise or lower its set temperature, or to switch the power of the TV 109T on and off and to raise or lower its volume, using only three types of gestures: the transverse gesture, the up gesture and the down gesture. That is, by adding letters as a type of background target attribute in addition to color, more control targets can be controlled, or more controls performed, with the same gestures. Moreover, since it is easy to infer from an up or down gesture which control is performed, such gestures are easy to devise.
- The third embodiment admits the following applications. For example, simply by moving a finger in a transverse direction over a desired telephone number in a list of telephone numbers, a call to that number is placed (in this case, the control target is a telephone in the room). Further, simply by moving a finger in a certain direction over the numerals (dates) of a calendar installed on a wall of the room, information such as the user's schedule or a TV program schedule may be displayed on an output device such as a TV or PC (in this case, the control target is the output device).
- In a fourth embodiment, the command transmitting unit generates and transmits a control command that causes an output device to display an output bearing a background target attribute of a gesture.
- The command transmitting unit causes a projector to project an initial menu 123 onto a screen 122. This may be performed in response to detection of the entrance of a user, or in response to a specific gesture of the user. The projector is itself a kind of device which may be a control target of a gesture.
- The initial menu 123 is a list of letter rows indicating devices which may be control targets. Instead of a letter row, any other gesture background target, such as a mark, may be employed.
- If a predetermined gesture is performed over the initial menu 123, the gesture and a background target attribute of the gesture are recognized, and the command transmitting unit generates a control command for displaying the manipulation menu corresponding to the combination and transmits the control command to the projector.
- A transition is thus performed from the initial menu 123 to a manipulation menu. For example, if the predetermined gesture is performed over the letter row "TV" in the initial menu 123, a TV manipulation menu 124 is displayed; if it is performed over the letter row "air conditioner", an air conditioner manipulation menu 125 is displayed.
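The menu transition just described can be sketched as a small lookup keyed by the current menu and the entry over which the gesture was performed. The menu identifiers below are invented for the example and do not come from the patent.

```python
# Hypothetical transition table for the projected menus of the fourth
# embodiment: a gesture over an entry of the initial menu selects which
# manipulation menu the projector displays next.
MENU_TRANSITIONS = {
    ("initial_menu", "TV"): "tv_manipulation_menu",
    ("initial_menu", "air conditioner"): "air_conditioner_manipulation_menu",
}

def next_menu(current_menu, selected_entry):
    """Return the menu to project after a gesture over `selected_entry`.

    Unknown entries leave the current menu displayed.
    """
    return MENU_TRANSITIONS.get((current_menu, selected_entry), current_menu)
```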
- In this manner, the command transmitting unit causes the projector to display menus bearing the background targets of gestures. According to the fourth embodiment, even when the control target devices or controls increase, it is not necessary to increase the number of things (for example, papers) bearing the background targets.
- For convenience, the same gesture may be used over all of the background targets, but a user-desired gesture may instead be used; in either case, the combination of the gesture and the background target attribute is defined as setting information.
- The output device through which the menus are output is not limited to a projector, but may be a device such as a personal computer, a TV, or electronic paper.
- In a fifth embodiment, a background target notification unit notifies the user of the gesture background targets among the things present in the place (for example, a room) where the user is. The background target notification unit may be realized by a combination of, for example, the above-described command transmitting unit and an illumination device.
- The command transmitting unit shines light from the illumination device onto a background target, thereby notifying the user of it (see the example in FIG. 5A).
- The attribute recognition unit obtains a photographed image of the room from a camera, analyzes it, and specifies the background targets on the basis of the analysis result and the setting information. For example, in the case that the setting information includes only color attributes, colors found in the image are recognized as background targets, while a letter or any other target specified from the image is not.
- The attribute recognition unit notifies the command transmitting unit of information indicating the positions of the specified background targets, and the command transmitting unit controls the illumination device so as to shine light on those positions.
- The attribute recognition unit may also recognize the position of the user from the photographed image and notify the command transmitting unit of it. The command transmitting unit then controls the illumination device to shine light on a background target while the user is within a predetermined distance of it, and stops the illumination when the user moves beyond that distance, thereby saving power.
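The proximity rule above reduces to a single distance comparison. A minimal sketch, assuming positions are 2D floor coordinates and a made-up threshold of 1.5 (the patent specifies neither):

```python
def should_illuminate(user_pos, target_pos, threshold=1.5):
    """True when the user (x, y) is within `threshold` of the target (x, y).

    The illumination device is switched on for True and off for False,
    so light is spent only while the user could plausibly gesture there.
    """
    dx = user_pos[0] - target_pos[0]
    dy = user_pos[1] - target_pos[1]
    return (dx * dx + dy * dy) ** 0.5 <= threshold
```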
- The notification method for the background target is not limited to light illumination; various other methods, such as notification by voice, may be employed.
- In a sixth embodiment, an effective period is set for a background target. If a gesture is performed over the background target outside the effective period, the control corresponding to the combination of the background target attribute and the gesture is not performed. A specific example follows.
- When a user reserves a usage time for a conference room 604, a system 601 which receives the reservation (for example, a microprocessor executing a computer program) prints, through a printer, a region 610 bearing a background target (for example, a color) and a two-dimensional bar code 603. The user thereby obtains a printout 602 which includes the region 610 and the two-dimensional bar code 603.
- The two-dimensional bar code 603 carries information indicating the reserved usage time. The user brings the printout 602 to the conference room 604 and places it in a desired or specified place (for example, on a table) in the conference room.
- The user performs, over the region 610 of the printout 602, a gesture for a desired control.
- An attribute recognition unit analyzes a photographed image of the printout 602, which lies behind the gesture. It thereby specifies the information carried by the two-dimensional bar code 603 in the printout 602, and notifies a command transmitting unit of that information in addition to information indicating the background target attribute.
- The command transmitting unit reads the usage time from the information specified from the two-dimensional bar code 603 and determines whether the current date and time falls within the usage time. If it does, the command transmitting unit transmits the control command corresponding to the combination of the gesture and the background target attribute to the control target (that is, a device in the conference room); if it does not, the command transmitting unit does not transmit the control command.
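The effective-period check might look like the following. The `"start|end"` barcode payload format is a hypothetical assumption made for this sketch; the patent only says the bar code carries the reserved usage time.

```python
from datetime import datetime

def parse_usage_time(payload):
    """Parse a hypothetical 'start|end' ISO-format barcode payload."""
    start, end = payload.split("|")
    return datetime.fromisoformat(start), datetime.fromisoformat(end)

def command_allowed(payload, now):
    """True only while `now` lies inside the reserved usage time, so the
    command transmitting unit forwards the control command only then."""
    start, end = parse_usage_time(payload)
    return start <= now <= end
```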
- The sixth embodiment is not limited to reservation of a conference room, but may be applied to the usage of any other room or device.
- Alternatively, it may be determined, before the background target attribute is recognized, whether the current date and time falls within the usage time indicated by the information specified from the two-dimensional bar code 603; the background target attribute is then recognized only if it does. As a further alternative, the control command may simply not be generated when the current date and time falls outside the usage time.
- The camera 107 may also be provided integrally with the gesture sensor 108; that is, both the gesture and the background target attribute may be recognized by analyzing the photographed image from the camera 107.
Abstract
A control system includes: an input unit through which a signal for a gesture and a background of the gesture is input; a gesture recognition unit which recognizes the gesture on the basis of the input signal; an attribute recognition unit which recognizes an attribute of a background target of the recognized gesture on the basis of the input signal; and a command transmitting unit which generates a control command on the basis of a combination of the recognized gesture and the background target attribute and transmits the control command to a device.
Description
- 1. Technical Field
- The present invention relates to a system of controlling a device in response to a gesture.
- 2. Related Art
- In order for a human being to control a device in a naturally intuitive manner, there are proposed several technologies which control a device using a gesture. However, in the case that different controls are to be performed using the same gesture (for example, in the case that different controls are to be performed for the same device or in the case that a plurality of different devices is to be controlled), there exist the following problems.
- In any case, a gesture may be required for every function. However, it is difficult to analogize a corresponding gesture to a certain function (for example, a “power on” function). It is not easy for a human being to remember and correctly control all the gestures each corresponding to all the functions. In another case, a control or a device should be selected. In this case, a certain method for proposing a candidate of the control or the device is required in advance, and a gesture for selection is performed on this basis, and thus, a gesture which should be naturally intuitive becomes laborious and inefficient.
- The problems may be partly solved by providing different meanings for a specific gesture using several methods.
- As one of the methods, there is a method which uses a voice together with a gesture, for example, as disclosed in JP-A-10-31551. That is, the method uses the voice at a point of time when the gesture is recognized, or language information according to the voice or voice recognition.
- However, with voices, a problem of false recognition due to noises or individual differences is significant, and further, a user needs to remember a corresponding relation between controls or devices and plural types of words, and thus, it is not possible to substantially prevent a human being from being inconvenienced.
- An advantage of some aspects of the invention is that it provides a control system enhancing convenience of a technology that enables a different control using the same gesture.
- A system of controlling a device includes: a gesture recognition unit which recognizes a gesture; an attribute recognition unit which recognizes an attribute of a background target of the recognized gesture; and a command transmitting unit which generates a control command on the basis of a combination of the recognized gesture and the background target attribute and transmits the control command to a device.
- The gesture is not limited to motion of a human body, but may be performed with anything (for example, an indicating stick).
- Herein, the background target is, for example, at least one of a shape, letter (for example, including numeral or symbol), color and pattern. The background target may be specified on the basis of a photograph image of a background of the gesture, or may be expressed as information read from a storage medium (for example, RFID (radio frequency identification) tag) installed in a variety of targets.
- The invention will be described with reference to the accompanying drawings, wherein like numbers reference like elements.
- FIG. 1 illustrates functional blocks and process flows of a control system according to a first embodiment of the invention.
- FIG. 2 illustrates functional blocks and process flows of the control system according to a second embodiment of the invention.
- FIG. 3 illustrates functional blocks and process flows of the control system according to a third embodiment of the invention.
- FIG. 4 illustrates a process performed according to a fourth embodiment of the invention.
- FIG. 5A illustrates a process performed according to a fifth embodiment of the invention, and FIG. 5B illustrates a process performed as a modified example according to the fifth embodiment of the invention.
- FIG. 6 illustrates a process performed according to a sixth embodiment of the invention.
- Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
- FIG. 1 illustrates functional blocks and process flows of a control system 100 according to a first embodiment of the invention.
- There is provided a device 109 as a control target. The device 109 may be any of a variety of apparatuses, for example, a personal computer, a television apparatus, a printer, a projector, an electronic paper, an illumination device, or an air conditioner. In the present embodiment, the control target device 109 is provided as a single device, but it may instead be provided as a plurality of devices, as in a third embodiment described later.
- The control system 100 recognizes a color as the background target of a gesture and transmits a control command corresponding to the combination of the gesture and the attribute of the recognized color to the device 109. For this reason, a user controls the device 109 by performing a gesture over a target having a certain color. That is, a gesture to be performed over a target having a certain color is selected according to how the user desires to control the device 109. The background target of the gesture may be carried by, for example, a wall, paper, a device, a human body or a part thereof (such as a head, hand or leg), an image projected by a projector or the like, or a display image on a display screen of a mobile phone, a personal computer, an electronic paper or the like.
- The control system 100 includes a gesture sensor 108, a camera 107, a gesture recognition unit 105, an attribute recognition unit 104, a setting information storing unit 103, and a command transmitting unit 106. The gesture recognition unit 105, the attribute recognition unit 104 and the command transmitting unit 106 may each be provided as a hardware circuit, as a function realized by a microprocessor executing a computer program, or as a combination of the two. Further, the gesture recognition unit 105, the attribute recognition unit 104, the setting information storing unit 103 and the command transmitting unit 106 may be provided in a single apparatus (for example, a hardware circuit) or may be distributed among a plurality of apparatuses. - The
gesture sensor 108 detects a gesture. The gesture sensor 108 may include, for example, a camera, a laser pointer, or the like. A signal indicating the detected gesture is input from the gesture sensor 108 to the gesture recognition unit 105.
- The camera 107 is installed, for example, in a predetermined place (for example, on the ceiling of a room) to photograph the background of the gesture. A signal indicating an image photographed by the camera 107 is input from the camera 107 to the attribute recognition unit 104. The photographed image may be a moving image or a still image. A plurality of cameras 107 may be provided. A special camera which detects only light having a specific wavelength may be employed as the camera 107; in this case, a color material emitting light having the specific wavelength is applied to the surface of the target carrying the background target attribute of the gesture.
- The gesture recognition unit 105 recognizes the gesture on the basis of the signal input from the gesture sensor 108 and notifies the command transmitting unit 106 of information indicating the recognized gesture. In addition, the gesture recognition unit 105 also recognizes the position in which the gesture is performed (hereinafter, the gesture position). The gesture position is recognized, for example, on the basis of which gesture sensor 108 among a plurality of gesture sensors 108 detected the gesture. The gesture recognition unit 105 notifies the attribute recognition unit 104 of information indicating the recognized gesture position. The gesture position may be expressed in a coordinate system defined, for example, by the installation position of the camera 107. For example, in the case that the camera 107 is installed on the ceiling of a room, the gesture position may be expressed in an xyz coordinate system in which the longitudinal and transverse axes of the room are the x-axis and y-axis, respectively, and the height is the z-axis. In this case, the attribute of the background target of the gesture exists at a position lower than the z coordinate of the gesture position. - The
attribute recognition unit 104 analyzes the photographed image input from the camera 107 on the basis of the gesture position indicated by the information notified from the gesture recognition unit 105, so as to recognize the attribute of the background target of the gesture. The attribute recognition unit 104 notifies the command transmitting unit 106 of information indicating the recognized background target attribute.
- The setting information storing unit 103 is a storage resource (for example, volatile or nonvolatile memory). The setting information storing unit 103 stores setting information in advance. The setting information indicates which control is to be performed in the case that a certain gesture and a certain background target attribute are recognized. The setting information may be edited by a user from a specific console.
- The command transmitting unit 106 specifies, from the setting information in the setting information storing unit 103, the control and the control target corresponding to the combination of the gesture indicated by the information notified from the gesture recognition unit 105 and the background target attribute indicated by the information notified from the attribute recognition unit 104. The command transmitting unit 106 generates a control command for performing the specified control on the specified control target (the device 109) and transmits the generated control command to the device 109.
- Hereinafter, a process flow performed in the present embodiment will be described. Here, it is assumed that the
device 109 is an air conditioner and that the setting information includes information indicating that the control target is an air conditioner; that a red color or a blue color represents a power switch; that a red color represents "ON"; and that a blue color represents "OFF". Further, it is assumed that the color attributes serving as background targets of gestures are the two colors red and blue, and that the things carrying the background targets are a red paper 112 and a blue paper 113 (the red paper 112 and the blue paper 113 may be paper on which a red region and a blue region are printed using a color material emitting light having a special wavelength, or may be commercially available paper). In addition, it is assumed that the gestures performed over the red paper 112 and the blue paper 113 are the same (for example, a gesture in which a hand is moved horizontally from left to right, as indicated by an arrow).
- For example, in the case that a user performs a gesture in proximity to and over the blue paper 113, the gesture sensor 108 detects the gesture (S101). A signal indicating the detected gesture is input to the gesture recognition unit 105.
- The gesture recognition unit 105 recognizes the gesture and the gesture position on the basis of the signal input from the gesture sensor 108 (S102). The gesture recognition unit 105 notifies the command transmitting unit 106 of information indicating the recognized gesture (S103), and notifies the attribute recognition unit 104 of information indicating the recognized gesture position (S104).
- The attribute recognition unit 104 obtains, from the camera 107, a photographed image of the background of the gesture position indicated by the notified information, that is, a photographed image showing the blue paper 113 (S105). For example, the attribute recognition unit 104 makes the camera 107 photograph the background using a region of a predetermined size including the gesture position as its view. In this case, the photographed image of the background target is obtained after the gesture is recognized, so the possibility that the background target is hidden in the photographed image by a body part moved during the gesture may be decreased. The attribute recognition unit 104 analyzes the photographed image to recognize the blue color as the background target attribute (S106), and notifies the command transmitting unit 106 of information indicating that the recognized background target attribute is the blue color (S107).
- The command transmitting unit 106 specifies the control "power switch off" corresponding to the combination of the gesture indicated by the notified information and the blue background target attribute, from the setting information in the setting information storing unit 103 (S108). The command transmitting unit 106 generates a control command for turning off the air conditioner 109 (S109) and transmits the generated control command to the air conditioner 109 (S110). Thus, the air conditioner 109 is turned off.
- Hereinbefore, the first embodiment has been described. The
red paper 112 and the blue paper 113 may be provided in a place desired by the user, for example, on a wall of a room, and may be moved to any other room. In the case that a gesture is performed over the red paper 112 or the blue paper 113 in the other room, the power of the air conditioner installed in that room is switched on or off. That is, the power switches (the red paper 112 and the blue paper 113) of the air conditioners in a plurality of rooms may serve as common switches. Further, the control system 100 may be provided in every room, but from the viewpoint of reducing the number of components, it is preferable that the camera 107, the gesture sensor 108 and the device 109 are installed in every room while the gesture recognition unit 105, the attribute recognition unit 104, the setting information storing unit 103 and the command transmitting unit 106 are shared among the plurality of rooms.
- According to the above described first embodiment, the control is performed according to the combination of the gesture and the background target attribute. For this reason, it is possible to perform different controls with the same gesture. In addition, since the control to be performed can be intuitively inferred from the background target attribute, the burden on the user of memorizing the correspondence between each combination of gesture and background target attribute and the performed control may be reduced.
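The lookup performed by the command transmitting unit in steps S108 to S110 can be pictured as a table keyed by the (gesture, attribute) pair. The following is a minimal illustrative sketch, not the patent's implementation; the gesture name, color names, and control names are hypothetical stand-ins for whatever the setting information actually stores.

```python
# Hypothetical setting information: (gesture, background color) -> (device, control).
# Red means "ON" and blue means "OFF", mirroring the first embodiment's example.
SETTING_INFO = {
    ("horizontal_swipe", "red"): ("air_conditioner", "power_on"),
    ("horizontal_swipe", "blue"): ("air_conditioner", "power_off"),
}

def make_command(gesture, attribute):
    """Return the (device, control) configured for this combination,
    or None when the pair is not in the setting information."""
    return SETTING_INFO.get((gesture, attribute))

# The same gesture yields different controls depending on the background color.
print(make_command("horizontal_swipe", "red"))
print(make_command("horizontal_swipe", "blue"))
```

Note that an unconfigured combination simply produces no command, which matches the behavior that only gestures over recognized background targets cause control.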
- Hereinafter, a second embodiment of the invention will be described. Herein, configurations of the second embodiment that differ from the first embodiment will be mainly described, and configurations common to both embodiments will be omitted or described only briefly (the same applies to the third to sixth embodiments).
- FIG. 2 illustrates functional blocks and process flows of a control system 200 according to the second embodiment of the invention.
- A sheet of yellow paper 214 is used as the power switch (a thing carrying a background target) of an air conditioner 109. A method of performing a gesture for each of power on and power off of the air conditioner 109 is written on the yellow paper 214 (for example, as shown in FIG. 2, an arrow indicating the direction in which a hand is to be moved). According to the yellow paper 214, a down gesture in which the hand is moved downward over the yellow paper 214 turns the power of the air conditioner 109 on, and an up gesture in which the hand is moved upward over the yellow paper 214 turns the power off.
- Setting information in a setting information storing unit 203 includes information indicating that the control target is an air conditioner; that a yellow paper represents a power switch; that an up gesture represents "OFF"; and that a down gesture represents "ON".
- For example, as shown in
FIG. 2, it is assumed that a gesture (an up gesture) in which the hand is moved upward in proximity to and over the yellow paper 214 is performed. In this case, steps S201 to S210, which are approximately the same as the above described steps S101 to S110, are performed. For example, in step S202 the up gesture is recognized, and in step S203 information indicating the up gesture is notified to a command transmitting unit 106. In addition, in step S206 a yellow color is recognized as the attribute of the background target, and in step S207 information indicating that the background target attribute is the yellow color is notified to the command transmitting unit 106. The command transmitting unit 106 specifies the control "OFF" corresponding to the up gesture and the yellow background target attribute from the setting information (S208), generates a control command for turning off the air conditioner 109 (S209), and transmits the control command to the air conditioner 109 (S210). Accordingly, the air conditioner 109 is turned off.
- According to the second embodiment, a set of a downward arrow and the letters "ON" and a set of an upward arrow and the letters "OFF" are written on the yellow paper 214. For this reason, although different gestures are required for power on and power off, anyone may control the air conditioner 109, and further, the paper serving as the power switch may be reduced to a single sheet (in the first embodiment, a red region and a blue region could likewise be placed on a single sheet).
- The second embodiment may be realized on the basis of the first embodiment as follows. That is, in the second embodiment, the setting information of the first embodiment is edited into the above described setting information, and the yellow paper 214 is prepared instead of the papers 112 and 113. -
FIG. 3 illustrates functional blocks and process flows of a control system 300 according to a third embodiment of the invention.
- The plurality of devices includes, for example, an air conditioner 109A and a television apparatus (TV) 109T. Further, an OCR (optical character reader) engine 302 is provided in the control system 300. The OCR engine 302 recognizes a letter on a target shown in a photographed image. The OCR engine 302 is, for example, a computer program executed by a microprocessor.
- The present embodiment uses a letter as well as a color as the background target. For example, setting information in a setting information storing unit 303 includes information indicating that the letter "A" represents an air conditioner; that the letter "T" represents a TV; that a transverse gesture represents an on/off toggle; that an up gesture represents "move up"; that a down gesture represents "move down"; and that a yellow color represents a switch. Accordingly, a first paper 315A and a second paper 315T are provided as things carrying the background target. The first paper 315A has the letter "A" and the yellow color on its surface, and the second paper 315T has the letter "T" and the yellow color on its surface.
- For example, as shown in
FIG. 3, it is assumed that a transverse gesture in which a hand is moved in a horizontal direction is performed in proximity to and over the second paper 315T. In this case, steps S301 to S310, which are approximately the same as the above described steps S101 to S110, are performed. For example, in step S302 the transverse gesture is recognized, and in step S303 information indicating the transverse gesture is notified to a command transmitting unit 106. Further, in step S306 the yellow color is recognized as an attribute of the background target, and an attribute recognition unit 304 makes the OCR engine 302 analyze the photographed image, so that the letter "T" is also recognized as a background target attribute. In step S307, information indicating that the background target attributes are the yellow color and the letter "T" is notified to the command transmitting unit 106. The command transmitting unit 106 specifies, from the setting information, the control target "TV" and the control "on/off toggle" corresponding to the transverse gesture and the background target attributes of the yellow color and the letter "T" (S308). The command transmitting unit 106 generates a control command for toggling the power of the TV 109T (S309) and transmits the control command to the TV 109T (S310). Thus, the power of the TV 109T is toggled on or off.
- According to the third embodiment, it is possible to switch the power of the air conditioner 109A on and off and to move its setting temperature up and down, or to switch the power of the TV 109T on and off and to move its volume up and down, by means of only three types of gestures: the transverse gesture, the up gesture and the down gesture. That is, because a letter is added as a type of background target attribute in addition to color, more control targets may be controlled, or more controls may be performed, using the same gestures. Moreover, since which control an up or down gesture performs may be easily inferred, such gestures may be easily devised.
- The third embodiment may employ the following applications. For example, simply by moving a finger in a transverse direction over a desired telephone number among a list of telephone numbers, that telephone number may be dialed (in this case, the control target is a telephone in the room). Further, simply by moving a finger in a certain direction over the numerals (dates) of a calendar installed on a wall of the room, information such as the user's schedule or a TV program schedule may be displayed on an output device such as a TV or a PC (in this case, the control target is the output device).
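The third embodiment's resolution can be sketched as two lookups: the recognized letter selects the device, the gesture selects the control, and the color marks the region as a valid switch. This is an illustrative reconstruction under assumed names (the tables and the "yellow means switch" check are stand-ins for the stored setting information), not the patent's code.

```python
# Hypothetical tables following the third embodiment's setting information:
# a letter on the paper names the device, the gesture names the control.
DEVICE_BY_LETTER = {"A": "air_conditioner", "T": "tv"}
CONTROL_BY_GESTURE = {"transverse": "power_toggle", "up": "move_up", "down": "move_down"}

def resolve(gesture, letter, color):
    """Resolve a (device, control) pair from a gesture plus two background
    target attributes (a letter and a color); yellow marks a switch region."""
    if color != "yellow":          # not a switch region -> no command
        return None
    device = DEVICE_BY_LETTER.get(letter)
    control = CONTROL_BY_GESTURE.get(gesture)
    if device is None or control is None:
        return None
    return (device, control)

# Three gestures suffice for two devices, as in the third embodiment.
print(resolve("transverse", "T", "yellow"))
print(resolve("up", "A", "yellow"))
```

Separating the device-selecting attribute from the control-selecting gesture is what lets the same small gesture vocabulary scale to more devices.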
- In addition, according to the third embodiment or the above described first and second embodiments, a variety of control methods can be created simply by preparing certain setting information and a target having a certain background target attribute. That is, flexibility becomes high while the things to be memorized decrease, and thus user convenience is enhanced.
- In a fourth embodiment of the invention, a command transmitting unit generates and transmits a control command for causing an output device to perform an output that carries a background target attribute of a gesture.
- More specifically, for example, as shown in
FIG. 4, the command transmitting unit makes a projector project an initial menu 123 on a screen 122. This process may be performed in response to detection of the entrance of a user, or in response to a specific gesture of the user. The projector is itself a kind of device which may be a control target of a gesture.
- The initial menu 123 is a list of letter rows, each indicating a device which may be a control target. Instead of a letter row, any other gesture background target, such as a mark, may be employed.
- If a predetermined gesture is performed over the initial menu 123, the gesture and a background target attribute of the gesture are recognized. The command transmitting unit generates a control command for displaying the manipulation menu corresponding to the combination and transmits the control command to the projector. Thus, a transition is performed from the initial menu 123 to the manipulation menu. For example, if a predetermined gesture is performed over the letter row "TV" in the initial menu 123, a TV manipulation menu 124 is displayed. If a predetermined gesture is performed over the letter row "air conditioner" in the initial menu 123, an air conditioner manipulation menu 125 is displayed.
- As described above, in the fourth embodiment, the command transmitting unit makes the projector display a menu carrying the background target of the gesture. According to the fourth embodiment, even if the control target devices or controls increase, it is not necessary to increase the number of things (for example, sheets of paper) carrying the background target.
- In the above description, the same gesture is performed over all of the background targets for convenience, but a user-desired gesture may instead be used; each combination of gesture and background target attribute is defined in the setting information. Further, the output device through which the menu is output is not limited to a projector, but may be a device such as a personal computer, a TV, or an electronic paper.
- In a fifth embodiment of the invention, a control system is provided with a background target notification unit which notifies a user of which things, among those existing in the place (for example, a room) where the user is present, are gesture background targets. The background target notification unit may be realized by a combination of, for example, the above described command transmitting unit and an illumination device. In this case, the command transmitting unit shines light from the illumination device onto the background target, thereby notifying the user of the background target. In the example of FIG. 5A, light is shone on a letter row in a calendar 501 and on a color on a paper 503, as indicated by dotted lines, so the user may recognize that the letter row in the calendar 501 and the color on the paper 503 are background targets.
- The present embodiment is described more specifically as follows. For example, the attribute recognition unit obtains a photographed image of the room from a camera, analyzes the photographed image, and specifies the background target on the basis of the analysis result and the setting information. For example, in the case that only color attribute information is included in the setting information, the attribute recognition unit recognizes colors as background targets; even if a letter or any other target is found in the photographed image, it is not recognized as a background target. The attribute recognition unit notifies the command transmitting unit of information indicating the position of the specified background target. The command transmitting unit controls the illumination device to shine light on the position indicated by the information notified from the attribute recognition unit.
- Hereinafter, a modified example will be described. In this example, the attribute recognition unit recognizes the position of the user from the photographed image and notifies the command transmitting unit of the user's position information. The command transmitting unit controls the illumination device to shine light on the background target in the case that the user is present within a predetermined distance of the background target, and stops the illumination in the case that the user moves beyond the predetermined distance from the background target. In this case, for example, as shown in FIG. 5B, when the user moves close to a paper 119 carrying a background target, the paper 119 is illuminated, and thus the user recognizes that the background target exists on the paper 119. In addition, when the user moves away from the background target, the illumination is stopped, thereby saving power.
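The proximity rule of this modified example reduces to a distance comparison between the user position and the target position (both obtainable from the photographed image). The sketch below is illustrative only; the threshold value and the 2D coordinates are assumptions, as the text specifies neither.

```python
import math

# Assumed notification threshold; the text only says "a predetermined distance".
NOTIFY_DISTANCE = 1.5  # meters (hypothetical)

def should_illuminate(user_pos, target_pos, threshold=NOTIFY_DISTANCE):
    """Return True while the user is within `threshold` of the background
    target, i.e. while the illumination device should light the target."""
    return math.dist(user_pos, target_pos) <= threshold

# User approaches the paper: light on; user walks away: light off, saving power.
print(should_illuminate((0.0, 0.0), (1.0, 1.0)))
print(should_illuminate((0.0, 0.0), (4.0, 0.0)))
```

A real system would also debounce the decision so the lamp does not flicker when the user hovers near the threshold, but that refinement is outside what the text describes.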
- In a sixth embodiment of the invention, an effective period of a background target is set. If a gesture is performed over the background target in a period other than the effective period, a control corresponding to a combination of an attribute of the background target and the gesture is not performed. Hereinafter, a specific example thereof will be described.
- Specifically, for example, as shown in
FIG. 6 , in the case that reservation of a usage time of a conference room (or a device in the conference room) is performed, asystem 601 which receives the reservation (hereinafter, for example, a microprocessor which executes a computer program) prints aregion 610 having a background target (for example, color) and a two-dimensional bar code 603 through a printer. Accordingly, the user may obtain aprintout 602 which includes theregion 610 having the background target and the two-dimensional bar code 603. The two-dimensional bar code 603 has information indicating the reserved usage time. - The user brings the
printout 602 to theconference room 604. The user puts theprintout 602 on a desired or specific place (for example, on a table) in the conference room. The user performs a gesture for performing a desired control on aregion 610 in theprintout 602. - An attribute recognition unit analyzes a photograph image of the
printout 602 which exists to the rear of the gesture. Accordingly, the attribute recognition unit specifies information indicated by the two-dimensional bar code 603 in theprintout 602, and notifies a command transmitting unit of the information specified from the two-dimensional bar code 603 other than information indicating the background target attribute. - The command transmitting unit specifies the usage time indicated by information specified from the two-
dimensional bar code 603, and determines whether the current date and time exists in the usage time. If the current date and time exists in the usage time, the command transmitting unit transmits a control command corresponding to the combination of the gesture and the background target attribute to the control target (that is, device in the conference room). Meanwhile, if the current date and time is not in the usage time, the command transmitting unit does not transmit the control command to the control target. - As described above, it is not possible to control the device in the conference room in a time range other than the reserved usage time. The sixth embodiment is not limited to the reservation of the conference room, but may be applied to usage of any other room or device.
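The effective-period gate itself is a simple interval test on the current date and time. The sketch below assumes the usage time is decoded from the two-dimensional bar code as a start/end pair of datetimes; that representation is an assumption for illustration, since the text does not specify the bar code's encoding.

```python
from datetime import datetime

def may_transmit(now, usage_start, usage_end):
    """The control command is transmitted only while the current date and
    time fall inside the reserved usage time decoded from the bar code."""
    return usage_start <= now <= usage_end

# Hypothetical reservation: 9:00-12:00 on the priority date of the application.
start = datetime(2009, 2, 27, 9, 0)
end = datetime(2009, 2, 27, 12, 0)
print(may_transmit(datetime(2009, 2, 27, 10, 0), start, end))
print(may_transmit(datetime(2009, 2, 27, 13, 0), start, end))
```

The same predicate can equally be evaluated before attribute recognition, matching the variation described below in which recognition itself is skipped outside the usage time.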
- In the sixth embodiment, before the background target attribute is recognized, it is possible to determine whether the current date and time belongs to the usage time indicated by information specified from the two-
dimensional bar code 603. In this case, if it is determined that the current date and time belongs to the usage time, the background target attribute may be recognized. Contrarily, if it is determined that the current date and time does not belong to the usage time, the background target attribute may not be recognized. Alternatively, for example, if it is determined that the current date and time does not belong to the usage time indicated by information specified from the two-dimensional bar code 603, the control command may not be generated. - Hereinbefore, preferred embodiments of the invention have been described, but these are illustrative, and thus, the scope of the present invention is not limited to the embodiments. The present invention is applicable in various forms. For example, the
camera 107 may be provided integrally with thegesture sensor 108. That is, both the gesture and the background target attribute may be recognized by analyzing the photograph image of thecamera 107. - The entire disclosure of Japanese Patent Application No. 2009-046678, filed Feb. 27, 2009 is expressly incorporated by reference herein.
Claims (8)
1. A control system comprising:
an input unit through which a signal for a gesture and a background of the gesture is input;
a gesture recognition unit which recognizes the gesture on the basis of the input signal;
an attribute recognition unit which recognizes an attribute of a background target of the recognized gesture on the basis of the input signal; and
a command transmitting unit which generates a control command on the basis of a combination of the recognized gesture and the background target attribute and transmits the control command to a device.
2. The control system according to claim 1, further comprising:
a storing unit which stores setting information indicating that a certain control is to be performed in the case that a certain gesture and a certain background target attribute are recognized; and
an output control unit which outputs, in the case that a usage time of a specific room or a device which is provided in the room is designated, a usage restricted object which is an object for the usage time and a background target for controlling the device in the specific room, from an output device;
wherein the input unit includes a gesture sensor which detects the gesture and a camera which photographs a background of the gesture;
the input signal in the gesture recognition unit is a signal indicating the detected gesture;
the input signal in the attribute recognition unit is a background image, which is photographed by the camera, of a gesture on an output medium to which the usage restricted object and the background target are output;
the attribute recognition unit analyzes the background image and recognizes the usage restricted object on the output medium and the background target attribute of the gesture; and
the command transmitting unit transmits the control command to the device which is provided in the specific room only in the case that the usage time indicated by the recognized usage restricted object is specified and current date and time belongs to the usage time, and the control command is a command for performing, for the device, a control specified from the setting information on the basis of the combination of the recognized gesture and the background target attribute.
3. The control system according to claim 1, wherein the command transmitting unit generates and transmits the control command for performing an output having the background target of the gesture in an output device.
4. The control system according to claim 1, further comprising a background target attribute notification unit which notifies a user of the gesture background target among objects which exist in a space in which the user exists.
5. The control system according to claim 4, wherein the background target attribute notification unit illuminates light to the gesture background target to notify the user of the gesture background target in the case that the user exists at a predetermined distance from the target, and stops the light illuminating to the gesture background target in the case that the user moves to a place removed from the target by a predetermined distance.
6. The control system according to claim 1, wherein the command transmitting unit specifies a schedule, and a process performed by the command transmitting unit is varied according to a relation between the specified schedule and the current date and time.
7. A control apparatus comprising:
a gesture recognition unit which recognizes a gesture;
an attribute recognition unit which recognizes an attribute of a background target of the recognized gesture; and
a command transmitting unit which generates a control command on the basis of a combination of the recognized gesture and the background target attribute and transmits the control command to a device.
8. A computer program for causing a computer to perform steps comprising:
recognizing a gesture;
recognizing an attribute of a background target of the recognized gesture; and
generating a control command on the basis of a combination of the recognized gesture and the background target attribute and transmitting the control command.
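The three steps of claims 7 and 8 (recognize a gesture, recognize the background-target attribute, combine the two into a control command and transmit it) can be illustrated with a minimal lookup-based sketch. This is an assumption-laden example, not the claimed implementation: the gesture names, attribute names, command strings, and the `SETTING_INFO` table are all hypothetical.

```python
from typing import Optional

# Hypothetical setting information: each (gesture, background-target attribute)
# pair maps to one control command. Names are illustrative only.
SETTING_INFO = {
    ("raise_hand", "light"): "light:power_on",
    ("lower_hand", "light"): "light:power_off",
    ("raise_hand", "air_conditioner"): "air_conditioner:power_on",
}

def generate_command(gesture: str, attribute: str) -> Optional[str]:
    """Generate a control command from the gesture/attribute combination."""
    return SETTING_INFO.get((gesture, attribute))

def control_device(gesture: str, attribute: str, transmit) -> Optional[str]:
    """Combine the recognized gesture and attribute, then transmit the command."""
    command = generate_command(gesture, attribute)
    if command is not None:
        transmit(command)  # e.g. send to the target device over a network
    return command
```

The same gesture thus yields different commands depending on which object lies behind it, which is the central idea of the combination described in the claims.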
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US13/455,095 US20120213408A1 (en) | 2009-02-27 | 2012-04-24 | System of controlling device in response to gesture |
Applications Claiming Priority (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2009046678A JP2010204730A (en) | 2009-02-27 | 2009-02-27 | System of controlling device in response to gesture |
JP2009-046678 | 2009-02-27 | ||
US12/713,117 US8183977B2 (en) | 2009-02-27 | 2010-02-25 | System of controlling device in response to gesture |
US13/455,095 US20120213408A1 (en) | 2009-02-27 | 2012-04-24 | System of controlling device in response to gesture |
Related Parent Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/713,117 Continuation US8183977B2 (en) | 2009-02-27 | 2010-02-25 | System of controlling device in response to gesture |
Publications (1)
Publication Number | Publication Date |
---|---|
US20120213408A1 true US20120213408A1 (en) | 2012-08-23 |
Family
ID=42666800
Family Applications (2)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/713,117 Active 2030-11-28 US8183977B2 (en) | 2009-02-27 | 2010-02-25 | System of controlling device in response to gesture |
US13/455,095 Abandoned US20120213408A1 (en) | 2009-02-27 | 2012-04-24 | System of controlling device in response to gesture |
Family Applications Before (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US12/713,117 Active 2030-11-28 US8183977B2 (en) | 2009-02-27 | 2010-02-25 | System of controlling device in response to gesture |
Country Status (2)
Country | Link |
---|---|
US (2) | US8183977B2 (en) |
JP (1) | JP2010204730A (en) |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20130326253A1 (en) * | 2012-06-01 | 2013-12-05 | Wilfred Lam | Toggling sleep-mode of a mobile device without mechanical or electromagnetic toggling buttons |
Families Citing this family (48)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US8933876B2 (en) | 2010-12-13 | 2015-01-13 | Apple Inc. | Three dimensional user interface session control |
US9035876B2 (en) | 2008-01-14 | 2015-05-19 | Apple Inc. | Three-dimensional user interface session control |
US8305188B2 (en) * | 2009-10-07 | 2012-11-06 | Samsung Electronics Co., Ltd. | System and method for logging in multiple users to a consumer electronics device by detecting gestures with a sensory device |
US20110164032A1 (en) * | 2010-01-07 | 2011-07-07 | Prime Sense Ltd. | Three-Dimensional User Interface |
TW201135341A (en) * | 2010-04-13 | 2011-10-16 | Hon Hai Prec Ind Co Ltd | Front projection system and method |
JP5439347B2 (en) * | 2010-12-06 | 2014-03-12 | 日立コンシューマエレクトロニクス株式会社 | Operation control device |
US8872762B2 (en) | 2010-12-08 | 2014-10-28 | Primesense Ltd. | Three dimensional user interface cursor control |
EP2686254B1 (en) * | 2011-03-17 | 2018-08-15 | SSI Schäfer Automation GmbH | Controlling and monitoring a storage and order-picking system by means of movement and speech |
US8693726B2 (en) * | 2011-06-29 | 2014-04-08 | Amazon Technologies, Inc. | User identification by gesture recognition |
US8881051B2 (en) | 2011-07-05 | 2014-11-04 | Primesense Ltd | Zoom-based gesture user interface |
US9377865B2 (en) | 2011-07-05 | 2016-06-28 | Apple Inc. | Zoom-based gesture user interface |
US9459758B2 (en) | 2011-07-05 | 2016-10-04 | Apple Inc. | Gesture-based interface with enhanced features |
US9030498B2 (en) | 2011-08-15 | 2015-05-12 | Apple Inc. | Combining explicit select gestures and timeclick in a non-tactile three dimensional user interface |
US9218063B2 (en) * | 2011-08-24 | 2015-12-22 | Apple Inc. | Sessionless pointing user interface |
JP5868128B2 (en) * | 2011-11-10 | 2016-02-24 | キヤノン株式会社 | Information processing apparatus and control method thereof |
US9679215B2 (en) | 2012-01-17 | 2017-06-13 | Leap Motion, Inc. | Systems and methods for machine control |
US8638989B2 (en) | 2012-01-17 | 2014-01-28 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US8693731B2 (en) | 2012-01-17 | 2014-04-08 | Leap Motion, Inc. | Enhanced contrast for object detection and characterization by optical imaging |
US11493998B2 (en) | 2012-01-17 | 2022-11-08 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US9070019B2 (en) | 2012-01-17 | 2015-06-30 | Leap Motion, Inc. | Systems and methods for capturing motion in three-dimensional space |
US9501152B2 (en) | 2013-01-15 | 2016-11-22 | Leap Motion, Inc. | Free-space user interface and control using virtual constructs |
US10691219B2 (en) | 2012-01-17 | 2020-06-23 | Ultrahaptics IP Two Limited | Systems and methods for machine control |
US9229534B2 (en) | 2012-02-28 | 2016-01-05 | Apple Inc. | Asymmetric mapping for tactile and non-tactile user interfaces |
CA2779837C (en) | 2012-04-24 | 2021-05-04 | Comcast Cable Communications, Llc | Video presentation device and method using gesture control |
CN103375880B (en) * | 2012-04-27 | 2016-10-05 | 珠海格力电器股份有限公司 | The remote control of air-conditioner and method |
US9285893B2 (en) | 2012-11-08 | 2016-03-15 | Leap Motion, Inc. | Object detection and tracking with variable-field illumination devices |
US10609285B2 (en) | 2013-01-07 | 2020-03-31 | Ultrahaptics IP Two Limited | Power consumption in motion-capture systems |
US9626015B2 (en) | 2013-01-08 | 2017-04-18 | Leap Motion, Inc. | Power consumption in motion-capture systems with audio and optical signals |
US9632658B2 (en) | 2013-01-15 | 2017-04-25 | Leap Motion, Inc. | Dynamic user interactions for display control and scaling responsiveness of display objects |
US9459697B2 (en) | 2013-01-15 | 2016-10-04 | Leap Motion, Inc. | Dynamic, free-space user interactions for machine control |
WO2014200589A2 (en) | 2013-03-15 | 2014-12-18 | Leap Motion, Inc. | Determining positional information for an object in space |
US10620709B2 (en) | 2013-04-05 | 2020-04-14 | Ultrahaptics IP Two Limited | Customized gesture interpretation |
US9916009B2 (en) | 2013-04-26 | 2018-03-13 | Leap Motion, Inc. | Non-tactile interface systems and methods |
US9747696B2 (en) | 2013-05-17 | 2017-08-29 | Leap Motion, Inc. | Systems and methods for providing normalized parameters of motions of objects in three-dimensional space |
US10281987B1 (en) | 2013-08-09 | 2019-05-07 | Leap Motion, Inc. | Systems and methods of free-space gestural interaction |
US9721383B1 (en) | 2013-08-29 | 2017-08-01 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
US9632572B2 (en) | 2013-10-03 | 2017-04-25 | Leap Motion, Inc. | Enhanced field of view to augment three-dimensional (3D) sensory space for free-space gesture interpretation |
US9996638B1 (en) | 2013-10-31 | 2018-06-12 | Leap Motion, Inc. | Predictive information for free space gesture control and communication |
JP6222830B2 (en) * | 2013-12-27 | 2017-11-01 | マクセルホールディングス株式会社 | Image projection device |
US9613262B2 (en) | 2014-01-15 | 2017-04-04 | Leap Motion, Inc. | Object detection and tracking for providing a virtual device experience |
US9857869B1 (en) | 2014-06-17 | 2018-01-02 | Amazon Technologies, Inc. | Data optimization |
PT107791A (en) * | 2014-07-21 | 2016-01-21 | Ricardo José Carrondo Paulino | INTEGRATED MULTIMEDIA DISCLOSURE SYSTEM WITH CAPACITY OF REAL-TIME INTERACTION BY NATURAL CONTROL AND CAPACITY OF CONTROLLING AND CONTROL OF ENGINES AND ELECTRICAL AND ELECTRONIC ACTUATORS |
JP2016038889A (en) | 2014-08-08 | 2016-03-22 | リープ モーション, インコーポレーテッドLeap Motion, Inc. | Extended reality followed by motion sensing |
CN104202640B (en) * | 2014-08-28 | 2016-03-30 | 深圳市国华识别科技开发有限公司 | Based on intelligent television intersection control routine and the method for image recognition |
CN105549446A (en) * | 2016-02-24 | 2016-05-04 | 中国科学院城市环境研究所 | Intelligent control system of body sense environment-friendly stove |
JP2018181140A (en) * | 2017-04-19 | 2018-11-15 | 東芝情報システム株式会社 | Command input system and program for command input system |
JP6418618B1 (en) * | 2017-09-04 | 2018-11-07 | 三菱ロジスネクスト株式会社 | Vehicle allocation system and vehicle allocation method |
US11875012B2 (en) | 2018-05-25 | 2024-01-16 | Ultrahaptics IP Two Limited | Throwable interface for augmented reality and virtual reality environments |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6075895A (en) * | 1997-06-20 | 2000-06-13 | Holoplex | Methods and apparatus for gesture recognition based on templates |
US20040193413A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
Family Cites Families (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3251639B2 (en) * | 1992-06-08 | 2002-01-28 | 株式会社東芝 | Pointing device |
EP0622722B1 (en) * | 1993-04-30 | 2002-07-17 | Xerox Corporation | Interactive copying system |
JPH07175587A (en) | 1993-10-28 | 1995-07-14 | Hitachi Ltd | Information processor |
US5732227A (en) * | 1994-07-05 | 1998-03-24 | Hitachi, Ltd. | Interactive information processing system responsive to user manipulation of physical objects and displayed images |
JPH1031551A (en) | 1996-07-15 | 1998-02-03 | Mitsubishi Electric Corp | Human interface system and high-speed moving body position detecting device using the same |
US6175954B1 (en) * | 1997-10-30 | 2001-01-16 | Fuji Xerox Co., Ltd. | Computer programming using tangible user interface where physical icons (phicons) indicate: beginning and end of statements and program constructs; statements generated with re-programmable phicons and stored |
JPH11327753A (en) * | 1997-11-27 | 1999-11-30 | Matsushita Electric Ind Co Ltd | Control method and program recording medium |
US8035612B2 (en) * | 2002-05-28 | 2011-10-11 | Intellectual Ventures Holding 67 Llc | Self-contained interactive video display system |
US7724242B2 (en) * | 2004-08-06 | 2010-05-25 | Touchtable, Inc. | Touch driven method and apparatus to integrate and display multiple image layers forming alternate depictions of same subject matter |
JP4676303B2 (en) * | 2005-10-18 | 2011-04-27 | 株式会社日立製作所 | Terminal device |
JP2007158680A (en) * | 2005-12-05 | 2007-06-21 | Victor Co Of Japan Ltd | Tracking imaging apparatus and tracking imaging system utilizing it |
JP4968922B2 (en) * | 2007-06-19 | 2012-07-04 | キヤノン株式会社 | Device control apparatus and control method |
US9377874B2 (en) * | 2007-11-02 | 2016-06-28 | Northrop Grumman Systems Corporation | Gesture recognition light and video image projector |
US9241143B2 (en) * | 2008-01-29 | 2016-01-19 | At&T Intellectual Property I, L.P. | Output correction for visual projection devices |
- 2009-02-27: JP application JP2009046678A, published as JP2010204730A, not active (Withdrawn)
- 2010-02-25: US application US12/713,117, granted as US8183977B2, active
- 2012-04-24: US application US13/455,095, published as US20120213408A1, not active (Abandoned)
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6075895A (en) * | 1997-06-20 | 2000-06-13 | Holoplex | Methods and apparatus for gesture recognition based on templates |
US20040193413A1 (en) * | 2003-03-25 | 2004-09-30 | Wilson Andrew D. | Architecture for controlling a computer using hand gestures |
US20100151946A1 (en) * | 2003-03-25 | 2010-06-17 | Wilson Andrew D | System and method for executing a game process |
Also Published As
Publication number | Publication date |
---|---|
JP2010204730A (en) | 2010-09-16 |
US8183977B2 (en) | 2012-05-22 |
US20100219934A1 (en) | 2010-09-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US8183977B2 (en) | System of controlling device in response to gesture | |
US6979087B2 (en) | Display system with interpretable pattern detection | |
EP1441514A2 (en) | Interactive image projector | |
US20130044054A1 (en) | Method and apparatus for providing bare-hand interaction | |
CN104808821A (en) | Method and apparatus for data entry input | |
EP2133774A1 (en) | Projector system | |
US10429963B2 (en) | User notification method, handwritten data capture device, and program | |
JP2019000438A (en) | Drawing system, drawing device and terminal device | |
JP2006172439A (en) | Desktop scanning using manual operation | |
US10276133B2 (en) | Projector and display control method for displaying split images | |
JP2015014882A (en) | Information processing apparatus, operation input detection method, program, and storage medium | |
US20150261385A1 (en) | Picture signal output apparatus, picture signal output method, program, and display system | |
KR101395723B1 (en) | Electronic blackboard system | |
US8890816B2 (en) | Input system and related method for an electronic device | |
US10795467B2 (en) | Display device, electronic blackboard system, and user interface setting method | |
US9489077B2 (en) | Optical touch panel system, optical sensing module, and operation method thereof | |
US10551972B2 (en) | Interactive projector and method of controlling interactive projector | |
CN109583404B (en) | A kind of plane gestural control system and control method based on characteristic pattern identification | |
JP6207211B2 (en) | Information processing apparatus and control method thereof | |
CN106164828B (en) | Bi-directional display method and bi-directional display device | |
JP2012063974A (en) | Stroke display system and program | |
US9305210B2 (en) | Electronic apparatus and method for processing document | |
JP6057407B2 (en) | Touch position input device and touch position input method | |
KR20170039345A (en) | Electronic pen system | |
US11494136B2 (en) | Storage medium having fingernail image processing, information processing apparatus, and control method of information processing apparatus |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |