CN102279670B - Gesture-based human-machine interface - Google Patents

Gesture-based human-machine interface

Info

Publication number
CN102279670B
CN102279670B (application CN201110115071.2A)
Authority
CN
China
Prior art keywords
screen
image
distance
information
processor
Prior art date
Legal status
Active
Application number
CN201110115071.2A
Other languages
Chinese (zh)
Other versions
CN102279670A (en)
Inventor
D·L·S·吉蒙兹
N·P·奥兹
P·S·塔皮亚
D·E·卡姆皮罗
Current Assignee
Boeing Co
Original Assignee
Boeing Co
Priority date
Filing date
Publication date
Priority claimed from EP10382168.2A (EP2395413B1)
Application filed by Boeing Co
Publication of CN102279670A
Application granted
Publication of CN102279670B


Abstract

The present invention relates to a gesture-based human-machine interface, for example a graphical user interface for controlling a program executing on a computer. A user's gestures are monitored and responses are provided based on the gestures detected. An object is used to point at information displayed on a screen. The information displayed on the screen is modified in response to a determination not only of where the object is pointing, but also of the distance of the object from the screen.

Description

Gesture-based human-machine interface
Technical field
The present invention relates to a gesture-based human-machine interface, for example a graphical user interface that can be used to control a program executing on a computer. Although applicable to many types of program, it is of particular interest for programs that control the flight of one or more unmanned aerial vehicles.
Background art
Man-machine interfaces have changed greatly over past decades. Even in the relatively narrow field of computer control, interfaces have evolved from command lines to graphical user interfaces in which a mouse or similar pointing device is used to select icons displayed to the user.
More recently, touch-screen devices have become popular. Because touch-screen devices open up the possibility of gesture-based control, touch-screen devices that allow multiple simultaneous touches are particularly advantageous. Apple's iPhone (TM) is a good example: touch can be used to select items, scroll up or down, zoom in or out, and rotate items. However, touch screens tend to have slower response times, poorer accuracy and poorer reliability, and frequent use of a touch screen causes grease and dust to accumulate, which degrades performance further.
Systems that avoid some of the problems of touch-screen devices by avoiding contact with the screen have been proposed. Instead, the user's gestures are monitored and responses are provided based on the gestures detected. For example, systems that monitor the user's hands have been proposed, so that gestures made with the hands are used for selecting, scrolling, zooming, rotating and so on, similarly to existing systems that rely on touching a screen.
Summary of the invention
Against this background, the invention resides in a method of using a computer system through a gesture-based human-machine interface. The method comprises using an object to point at information displayed on a screen of the computer system, and capturing the scene in front of the screen with at least two cameras. A processor is used to analyze the scenes captured by the cameras to identify the object, to determine where on the screen the object is pointing, and to determine the distance of the object from the screen. The processor then modifies the information displayed on the screen in response to the determination of where the object is pointing and of the object's distance from the screen.
In this way the drawbacks of touch screens can be avoided. Moreover, the information about how far the object is from the screen can be exploited, and it can be used in different ways. For example, the information displayed on the screen may be changed by magnifying the portion of the screen at which the object points, the magnification depending on the determined distance from the screen to the object. Thus, pointing from close to the screen can be used to produce greater magnification than pointing from further away. Limits may be set such that pointing from beyond a certain distance produces unity magnification, with the magnification reaching a maximum at a set distance from the screen. How the magnification varies between these distances can be controlled: for example, it may vary linearly or exponentially.
The method may include tracking the movement of the object to determine where on the screen it is pointing. The method may include determining the longitudinal extension of the object to determine where on the screen it is pointing. These two optional features may be used as alternatives, or they may be used together to reinforce one another.
By determining the distance of the object from the screen over a period of time, the speed at which the object moves can be determined. This speed can be used as a further control of the information displayed on the screen. A rapid movement of the object towards the screen can be interpreted differently from a gradual movement. For example, a gradual movement may be interpreted as a click, and a rapid movement may be interpreted as a double-click.
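The click/double-click distinction above amounts to a threshold on the approach speed computed from successive distance samples. The following is only an illustrative sketch: the sampling interval, the speed threshold and the function name are assumptions, not values taken from the patent.

```python
def classify_press(distances_cm, dt_s=0.05, fast_cm_per_s=40.0):
    """Classify a press gesture from successive object-to-screen distances,
    sampled every dt_s seconds.

    A rapid approach is read as a double-click and a gradual approach as a
    click. The threshold and sample rate are illustrative assumptions.
    """
    if len(distances_cm) < 2:
        return "none"
    # Mean approach speed over the window; positive means moving toward the screen.
    speed = (distances_cm[0] - distances_cm[-1]) / (dt_s * (len(distances_cm) - 1))
    if speed >= fast_cm_per_s:
        return "double-click"
    if speed > 0:
        return "click"
    return "none"

print(classify_press([30.0, 29.0, 28.0, 27.0]))  # gradual approach -> click
print(classify_press([30.0, 20.0, 10.0, 2.0]))   # rapid approach -> double-click
```

A real implementation would smooth the samples and debounce repeated detections, but the one-threshold structure is the same.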
Alternatively, the method may include pointing at the information displayed on the screen of the computer system with two objects, and capturing the scene in front of the screen with at least two cameras. The processor may be used to analyze the scenes captured by the cameras to identify the objects, so as to determine where on the screen the objects are pointing and the distances of the objects from the screen. The processor then modifies the information displayed on the screen in response to the determination of where on the screen the objects are pointing and of their distances from the screen. This allows further functionality. For example, the objects can be used to interact independently with different controls on the screen, for instance to adjust a volume control and to zoom in on a certain area. The two objects can also be used together. An image displayed on the screen can be manipulated with the objects, for example by rotating it. For instance, moving the left-hand object towards the screen while moving the right-hand object away from the screen can cause the image to rotate clockwise about a vertical axis, while moving an upper object towards the screen and a lower object away from the screen can cause the image to rotate about a horizontal axis. Other rotations are also possible, according to the relative alignment of the objects and the relative movement between them.
Many different objects may be used to point at the screen. For example, the object may be the user's hand. Preferably, the object may be an outstretched finger of the user's hand. The fingertip of the finger may then be the point used to determine the distance from the screen, and the extension of the finger may be used to determine where on the screen the user is pointing.
The invention also resides in a computer system including a gesture-based human-machine interface. The computer system comprises: (a) a screen operable to display information; (b) at least two cameras arranged to capture the scene in front of the screen; and (c) a processor. The processor is arranged to receive the images provided by the cameras and to analyze the images to identify an object pointing at information displayed on the screen. The processor is also arranged to determine where on the screen the object is pointing and the distance of the object from the screen, and to modify the information displayed on the screen in response to the determination of where the object is pointing and of the object's distance from the screen.
Optionally, the processor is arranged to track the movement of the object to determine where on the screen it is pointing. Additionally or alternatively, the processor may be arranged to determine the longitudinal extension of the object to determine where on the screen it is pointing. The processor may be arranged to change the information displayed on the screen by magnifying the portion of the screen at which the object points, the magnification depending on the determined distance from the screen to the object. As described above in relation to the method of the invention, two objects may be used to modify the information displayed on the screen. The object may be the user's hand, for example an outstretched finger of the user's hand.
Brief description of the drawings
In order that the invention may be more readily understood, reference will now be made, by way of example only, to the accompanying drawings, in which:
Fig. 1 is a simplified perspective view of a system including a human-machine interface according to an embodiment of the invention, the system including two screens side by side and four cameras;
Fig. 2 is a perspective view of a screen seen from the user's viewpoint, showing the user selecting a button displayed on the screen by pointing at the button;
Figs. 3a to 3d are schematic plan views of a system including a screen and one or more cameras, illustrating an embodiment of the human-machine interface according to the invention and showing how the fields of view of the cameras combine;
Figs. 3e to 3h are schematic front views of the systems shown in Figs. 3a to 3d, with Figs. 3e, 3f, 3g and 3h corresponding respectively to Figs. 3a, 3b, 3c and 3d;
Fig. 4 is a schematic diagram of a system illustrating an embodiment of the human-machine interface according to the invention; and
Figs. 5a to 5c are simplified front views of a screen, illustrating the zoom facility provided by an embodiment of the human-machine interface according to the invention.
Detailed description of the invention
The figures show a computer system 10 including a gesture-based human-machine interface. The computer system 10 includes one or more screens 12 that are driven to display information. The display of information can be controlled by the user making gestures with his or her hand 14 in front of the screen 12. These gestures are recorded by four cameras 16 arranged around the screen 12. The images captured by the cameras 16 are analyzed to determine the three-dimensional position of the user's hand 14 and to track the movement of the hand 14. The movement of the hand 14 is interpreted by the computer system 10, for example to identify selection of an icon displayed on the screen 12 or to magnify a region displayed on the screen 12. The computer system 10 changes the information displayed on the screen 12 in response to these gestures. Fig. 2 shows an example in which the user moves a forefinger 18 towards a button 20 displayed on the screen 12. This movement mimics the user pressing the button 20, and the computer system 10 interprets it as the user selecting the button 20. This may cause the computer system 10 to display a new image on the screen 12.
Although any number of screens 12 may be used, Figs. 1 and 2 show a computer system 10 that uses two screens 12 side by side. The user's arm 22 is shown schematically in front of the screens 12. Movement of the arm 22 is captured by four cameras 16 arranged at the four outer corners of the screens 12 and directed towards the center of the screens 12. Thus, when the user's hand 14 moves in front of the screens 12, its movement is captured by the cameras. Using four cameras 16 allows a three-dimensional picture of the space in front of the screens 12 to be built up, so that the position of an object (such as the fingertip of the user's finger 18) can be determined in an x, y, z coordinate system. These coordinate axes are indicated in Figs. 1 and 3. Spatial information along all three axes x, y, z can be used in the man-machine interface.
Figs. 3a to 3h show how the fields of view 24 of the individual cameras 16 combine to provide a volume of space within which the computer system 10 can determine the position of an object. The cameras 16 are identical and so have identical fields of view 24. Figs. 3a and 3e are respectively a plan view and a front view of a single screen 12, showing only a single camera 16 (for clarity, the cameras 16 are depicted schematically as dots). They therefore give the clearest illustration of the field of view 24 obtained from each camera 16. Figs. 3b and 3f are respectively a plan view and a front view of the same screen 12, this time showing the two cameras 16 arranged at the right-hand edge of the screen 12, and illustrate how the fields of view of two cameras 16 combine. Figs. 3c and 3g are a plan view and a front view of the screen 12 showing all four cameras 16 and their fields of view 24. Provided an object falls within the fields of view 24 of at least two cameras 16, its position in front of the screen 12 can be determined. Thus, wherever the fields of view 24 overlap in Figs. 3c and 3g, the position of an object can be determined. A useful core volume 26, within which the position of an object can be determined, is shown in Figs. 3d and 3h.
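The requirement that an object lie within at least two fields of view follows from how a position is recovered: one camera constrains the object to a ray, and a second view fixes the depth. A minimal rectified-stereo sketch follows; the focal length, baseline and pixel coordinates are made-up illustrative numbers, not parameters from the patent.

```python
def stereo_position(x_left_px, x_right_px, y_px, focal_px=800.0, baseline_cm=50.0):
    """Recover an (x, y, z) position from a rectified pair of cameras.

    The disparity between the two views fixes the depth; outside the
    overlap of both fields of view there is no disparity to measure.
    Focal length and baseline are illustrative assumptions.
    """
    disparity = x_left_px - x_right_px
    if disparity <= 0:
        raise ValueError("point is not inside both fields of view")
    z = focal_px * baseline_cm / disparity  # depth in front of the camera plane
    x = x_left_px * z / focal_px            # lateral position
    y = y_px * z / focal_px                 # vertical position
    return x, y, z

x, y, z = stereo_position(820.0, 420.0, 100.0)
print(round(z, 1))  # -> 100.0 cm for a 400-pixel disparity
```

With four cameras, as in the figures, each pair that sees the object gives such an estimate, and the estimates can be averaged for robustness.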
Fig. 4 shows the computer system 10 in more detail. The computer system 10 has a computer 40 as its hub. The computer 40 may include many different parts, for example a main processor 42, memory including programs stored within it (such as drivers for peripherals like the screens 12), and cards for operating peripherals like the screens 12.
As can be seen, the four cameras 16 are connected by feeds 44 to an image processor 46. The image processor 46 may be part of the main processor 42, or it may be provided as a separate processor. In either form, the image processor 46 receives the images from the cameras 16 and processes them using generally available software to improve their quality: for example, brightness, contrast and sharpness can be improved to produce better images. The processed images are passed to the main processor 42. Image analysis software stored in memory is retrieved and run by the main processor 42 to analyze the processed images and so determine where on the screen 12 the user is pointing. It will be appreciated that this image analysis software is conventional.
Once the main processor 42 has determined where on the screen the user is pointing, it determines whether the image presented on the screen 12 needs to change. If it does, the main processor 42 generates the signals necessary to cause the required change to the information displayed on the screen 12. These signals are passed to a screen driver/card 48, which provides the current signals supplied to the screen 12.
As shown in Fig. 4, the computer 40 may include an input device 50 for receiving touch-screen input from the screen 12, i.e. allowing the user to select an icon displayed on the screen 12 by touching the screen 12. Providing this feature may be useful in some circumstances. For example, a critical selection may require the user to touch the screen 12 as a further step, to ensure that the user is certain they want to make that selection. This could be used, for instance, for a button that causes an emergency shutdown of the system: such an action is clearly an extreme case, and requiring the user to touch the screen 12 can reflect this. The input device 50 is provided for this purpose.
As mentioned above, the main processor 42 obtains the processed images provided by the image processor 46 and analyzes them to determine whether a user is pointing at the screen 12. This can be done with conventional image-recognition techniques, for example using software that identifies shapes corresponding to a hand 14 with a forefinger 18 extended towards the screen 12. The main processor 42 then determines where on the screen 12 the finger 18 is pointing. The main processor 42 can perform this function for one hand, or for as many hands as is considered appropriate; for example, it may make the determination for every hand pointing at the screen. The following description relates to the example of a single finger 18; as will be appreciated, the method may be repeated for as many fingers 18 as are expected, or determined, to be pointing at the screen 12.
How the main processor 42 determines where on the screen 12 the finger 18 is pointing may be implemented in different ways.
In one embodiment, the main processor 42 identifies the position of the fingertip of the forefinger 18 in the x, y, z coordinate system. This can be done by triangulating the images captured by the four cameras 16. Once the position of the fingertip of the forefinger 18 has been identified from one set of four images, the next set of four images can be processed in the same way to determine the next position of the fingertip. In this way, the fingertip of the forefinger 18 can be tracked, and if its motion continues, its movement can be extrapolated forward in time to determine where it would touch the screen 12.
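The forward extrapolation of the tracked fingertip can be sketched as follows. The straight-line motion model, the function name and the coordinate convention (z is the distance from the screen plane, as in Figs. 1 and 3) are illustrative assumptions.

```python
def extrapolate_to_screen(p_prev, p_curr):
    """Extend the fingertip's motion in a straight line to the screen
    plane z = 0 and return the predicted (x, y) contact point.

    Returns None if the finger is not moving towards the screen or has
    already reached it. Linear extrapolation between two samples is an
    illustrative assumption.
    """
    (x0, y0, z0), (x1, y1, z1) = p_prev, p_curr
    dz = z1 - z0
    if dz >= 0 or z1 <= 0:  # moving away, or already at the screen
        return None
    t = z1 / -dz            # further steps of the same motion until z reaches 0
    return (x1 + t * (x1 - x0), y1 + t * (y1 - y0))

print(extrapolate_to_screen((10.0, 5.0, 30.0), (12.0, 6.0, 20.0)))
# -> (16.0, 8.0): two more identical steps reach the screen
```

In practice several successive samples would be fitted rather than just two, but the principle is the same.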
In an alternative embodiment, the images are analyzed to determine the extension of the forefinger 18 and hence the direction in which the finger 18 is pointing. This technique can, of course, be combined with the embodiment described above, for example to identify when the finger 18 moves along the direction in which it is pointing, because this can be interpreted as the finger 18 "pressing" an object displayed on the screen 12.
Figs. 5a to 5c show an embodiment of the zoom facility provided by the present invention. A single screen 12 with four cameras 16 arranged around it is shown. The cameras 16 and the screen 12 are connected to a computer system 10 which, as previously described, operates to provide a gesture-based human-machine interface.
In the example shown in Figs. 5a to 5c, the screen 12 displays a map 80 and related information. Header information 82 appears at the top of the screen 12, and a column of four selectable buttons 84 is located at the left-hand edge of the screen 12. The buttons 84 may carry text 86 to indicate the new information screens that can be selected or the changes that can be made to the information displayed on the map 80. The map 80 occupies most of the screen 12 and is offset towards the bottom right of the screen 12. An aircraft 88 is shown on the map 80 as a dot with an arrow indicating its current direction of flight. Information identifying the aircraft 88 may also be displayed beside the dot, as shown at 90. Further information is provided in a row of boxes 92 along the bottom edge of the map 80.
The user may want to zoom in on an item of interest, such as the aircraft 88 on the map 80, for example to see the geographical information displayed on the map 80 in more detail. To do so, the user can point at one of the buttons 84 to select a zoom mode, and can then point at the aircraft 88 of interest on the map 80. As shown in Figs. 5b and 5c, this causes the region at which the user is pointing to be displayed at greater magnification within a circle 94. The circle 94 is shown overlaid on the background map 80. As is known in the art, the edge of the zoomed circle 94 and the background map 80 can be blended if desired. To adjust the magnification factor, the user merely moves his or her forefinger 18 towards or away from the screen 12 (i.e. in the z direction). Moving the forefinger 18 towards the screen produces greater magnification.
Thus, the x, y position of the user's finger 18 is used to determine the region of the map 80 to be magnified, and the z position of the finger 18 is used to determine the magnification factor. Upper and lower limits of the z position can be set to correspond to lower and upper magnification factors. For example, a magnification of 1 may be set for whenever the user's fingertip 18 is at least a certain distance (e.g. 30 centimetres) from the screen 12. Similarly, a minimum separation from the screen (e.g. 5 centimetres) may be set to give maximum magnification, so that if the user's finger 18 approaches closer to the screen 12 than 5 centimetres, the magnification increases no further. How the magnification varies between these distances can be chosen as required: for example, it may vary linearly with distance, or it may follow some other relationship, such as an exponential one.
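Under those example limits (unity magnification beyond 30 cm, saturation below 5 cm), the distance-to-magnification mapping might look like the following sketch. The maximum factor of 8 and the exact shapes of the linear and exponential laws are illustrative assumptions, not values from the patent.

```python
def magnification(z_cm, z_min=5.0, z_max=30.0, mag_max=8.0, law="exponential"):
    """Map fingertip distance z (in cm) to a magnification factor.

    Unity magnification at or beyond z_max (30 cm), saturating at mag_max
    when the finger is within z_min (5 cm), matching the limits in the
    text. mag_max = 8 and the law shapes are illustrative assumptions.
    """
    z = min(max(z_cm, z_min), z_max)      # clamp to the configured limits
    frac = (z_max - z) / (z_max - z_min)  # 0.0 at z_max, 1.0 at z_min
    if law == "linear":
        return 1.0 + (mag_max - 1.0) * frac
    return mag_max ** frac                # exponential: 1 -> mag_max

print(magnification(30.0))            # -> 1.0 (no zoom from 30 cm away)
print(magnification(5.0))             # -> 8.0 (maximum zoom at 5 cm)
print(round(magnification(17.5), 3))  # -> 2.828 (midway, exponential law)
```

The clamp implements the two limits; swapping `law` switches between the linear and exponential variations the text mentions.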
Figs. 5b and 5c reflect the situation in which the user, starting from the position of Fig. 5b, moves his forefinger 18 closer to the screen 12 while still pointing at the aircraft 88 of interest, so that the magnification increases, as shown in Fig. 5c. If the user moves their finger 18 laterally as well as towards the screen 12, the magnification will increase and the magnified region will move to follow the lateral movement of the finger 18.
Those skilled in the art will appreciate that the embodiments described above may be varied without departing from the scope of the invention defined by the appended claims.
For example, the number of screens 12 can be varied freely, from one to any number. The type of screen 12 can also vary: the screen 12 may be a flat screen such as a plasma screen, LCD screen or OLED screen, or it may be a cathode-ray tube, or merely a surface onto which an image is projected. When multiple screens 12 are used, they need not be of a common type. The type of camera 16 used may also vary, although CCD cameras are preferred. The cameras 16 may operate with visible light, but electromagnetic radiation of other wavelengths can be used: for example, infrared cameras can be used in low-light conditions.
The software can be configured to monitor any object and to determine what the object selects from the screen 12, for example the user's finger 18 as described above. Alternatively, a pointing device such as a wand or rod can be used.
The present invention can be used to navigate a menu arranged as a tree structure very efficiently. For example, the finger 18 can point at a button or menu option presented on the screen 12 to produce a new information display on the screen 12. The user can then move their finger 18 to point at another button or menu option to produce a further new information display on the screen 12, and so on. Thus, merely by moving the finger 18 to point at different parts of the screen 12, the user is allowed to browse through a tree menu structure very rapidly.
The position of the user's finger 18 can be determined continuously, for example by tracking the fingertip of the finger 18. This allows the speed of movement of the finger 18 to be determined, and that speed can then be used to control the information on the screen 12. For example, the speed of movement towards the screen 12 can be used, such that a gradual movement causes a different response from a rapid movement. Lateral movement can also be used, with different speeds producing different results. For example, a slower lateral movement can cause an object displayed on the screen 12 to move within the screen, i.e. a slow movement can move the object from a central position to a position at the right-hand edge of the screen 12. By contrast, a rapid movement can cause the object to be removed from the screen 12, i.e. a rapid movement from left to right can cause the object to fly off the right-hand edge of the screen 12.
As mentioned above, the main processor 42 can monitor more than one object similar to the user's finger 18. This allows multiple objects to be used to control the information on the screen 12. A pair of objects can be used to interact independently with different controls on the screen, for example to select a new item and to adjust or change the type of information displayed in relation to the selected item. Two objects can also be used together. An image displayed on the screen can be manipulated with two hands 14, and an object displayed on the screen 12 can be rotated. For example, the user may place their hands at the same height, with the finger 18 of each hand pointing at the left-hand and right-hand edges of an object displayed on the screen 12. By moving the left hand 14 towards the screen 12 and the right hand 14 away from the screen 12, the object can be made to rotate clockwise about a vertical axis. If one hand 14 is placed above the other, the object can be rotated about a horizontal axis. The axis of rotation can be defined as the line between the fingertips of the fingers 18.
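The final sentence, defining the rotation axis as the line between the two fingertips, reduces to simple vector geometry. A sketch follows; the function name and coordinate convention are assumed for illustration only.

```python
def rotation_axis(tip_a, tip_b):
    """Return the unit vector along the line joining two fingertip
    positions, used as the axis about which a displayed object rotates.
    Pure geometry; no parameters beyond the two tracked points."""
    d = [b - a for a, b in zip(tip_a, tip_b)]
    norm = sum(c * c for c in d) ** 0.5
    if norm == 0:
        raise ValueError("fingertips coincide; axis undefined")
    return tuple(c / norm for c in d)

# Two fingertips at the same height, on the left and right of an object:
print(rotation_axis((0.0, 10.0, 20.0), (30.0, 10.0, 20.0)))  # -> (1.0, 0.0, 0.0)
```

The sense of rotation about this axis would then be taken from the opposite z-velocities of the two hands, as described in the text.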

Claims (17)

1. A method of using a computer system through a gesture-based human-machine interface, the method comprising:
using an object for pointing to point at a position on a screen of the computer system, wherein pointing includes maintaining a distance between the object and the screen, the distance being a positive value, and wherein the screen includes outer corners and a center;
capturing images of a space in front of the screen with at least four cameras, the cameras being arranged to be connected at the outer corners of the screen and pointed towards the center of the screen, wherein capturing the images with the at least four cameras allows a three-dimensional picture of the space in front of the screen to be built up, the space including the object;
using a processor to analyze the images to identify the object, to determine the position on the screen at which the object is pointing, and to determine the distance between the object and the screen; and
modifying information displayed on the screen in response to the determination of both the position on the screen at which the object is pointing and the distance between the object and the screen;
wherein modifying the information displayed on the screen includes magnifying, by a magnification factor, the information displayed in a portion of the screen at which the object is pointing, the magnification factor depending on the determined distance between the object and the screen, wherein the determined distance is between a maximum value and a minimum value, and wherein the magnification factor varies exponentially as the determined distance varies between the maximum value and the minimum value.
2. The method according to claim 1, wherein determining the position on the screen at which the object is pointing includes tracking the movement of the object.
3. The method according to claim 1, wherein determining the position on the screen at which the object is pointing includes determining the extension of the object towards the screen.
4. The method according to claim 1, further comprising:
using the processor to analyze the images to determine a speed at which the object moves towards or away from the screen; and
modifying the information displayed on the screen in response to the determination of the speed at which the object moves towards or away from the screen.
5. The method according to claim 1, further comprising:
using the processor to analyze the images to determine a speed of movement of the object; and
modifying the information displayed on the screen differently according to different speeds of movement of the object.
6. The method according to claim 5, further comprising:
using the processor to analyze the images to determine a speed of lateral movement of the object in front of the screen; and
modifying the information displayed on the screen in response to the determination of the speed of lateral movement of the object.
7. The method according to claim 1, further comprising:
using two objects for pointing to point at positions on the screen of the computer system;
using the processor to analyze the images to identify the two objects, to determine the positions on the screen at which the two objects are pointing, and to determine the distances of the two objects from the screen; and
modifying the information displayed on the screen in response to the determination of both the positions on the screen at which the two objects are pointing and the distances of the two objects from the screen.
8. The method according to claim 1, wherein the object is an outstretched finger of a user's hand.
9. An apparatus including a gesture-based human-machine interface, the apparatus comprising:
a screen operable to display information, wherein the screen includes outer corners and a center;
at least four cameras arranged to be connected at the outer corners of the screen and pointed towards the center of the screen to capture images of a space in front of the screen, wherein capturing the images with the at least four cameras allows a three-dimensional picture of the space in front of the screen to be built up; and
a processor arranged to:
receive the images,
analyze the images to identify an object pointing at a position on the screen, to determine the positions of the object and the screen relative to the at least four cameras, and to determine a distance of the object from the screen, the screen serving as a reference plane for determining the position on the screen at which the object is pointing, the distance being a positive value, and
modify the information displayed on the screen in response to the determination of both the position on the screen at which the object is pointing and the distance of the object from the screen;
wherein modifying the information displayed on the screen includes magnifying, by a magnification factor, the information displayed in a portion of the screen at which the object is pointing, the magnification factor depending on the determined distance between the object and the screen, wherein the determined distance is between a maximum value and a minimum value, and wherein the magnification factor varies exponentially as the determined distance varies between the maximum value and the minimum value.
10. The apparatus according to claim 9, wherein the processor is arranged to analyze the images to track movement of the object so as to determine the position on the screen to which the object points.
11. The apparatus according to claim 9, wherein the processor is arranged to analyze the images to determine that the object is extended toward the screen so as to determine the position on the screen to which the object points.
12. The apparatus according to claim 9, wherein the processor is further arranged to determine a speed of movement of the object and to modify the information displayed on the screen in response to the determination of the speed of movement.
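Claim 12 modifies the display in response to a determined speed of movement. One plausible estimator, a finite difference over the two most recent tracked 3-D positions, is sketched below; the helper name and the two-sample scheme are assumptions, since the claim does not specify how the speed is obtained.

```python
def movement_speed(positions, timestamps):
    """Estimate the object's speed from its two most recent
    tracked 3-D positions.

    positions  -- list of (x, y, z) tuples, oldest first
    timestamps -- matching list of capture times in seconds
    Returns speed in position-units per second.
    """
    (x0, y0, z0), (x1, y1, z1) = positions[-2], positions[-1]
    dt = timestamps[-1] - timestamps[-2]
    # Euclidean displacement between the last two samples.
    dist = ((x1 - x0) ** 2 + (y1 - y0) ** 2 + (z1 - z0) ** 2) ** 0.5
    return dist / dt
```

A real tracker would typically smooth over more than two frames, but the two-sample difference is enough to drive a speed-dependent display modification.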
13. The apparatus according to claim 9, wherein the object is an extended finger of a hand of the user.
14. A gesture-based human machine interface of a computer system, the interface being stored on a storage medium in the computer system and further comprising:
a first module configured to enable a screen to operate to display information, wherein the screen comprises outer corners and a center;
a second module configured to send instructions to at least four cameras, the at least four cameras being arranged to be connected at the outer corners of the screen and pointed toward the center of the screen to capture images of a space in front of the screen; and
a third module configured to control a processor, the processor being arranged to
receive the images,
analyze the images to identify an object pointing to a position on the screen, to determine the position on the screen relative to the object and the at least four cameras, to determine a distance of the object from the screen, and to determine a speed of movement of the object, the screen serving as a reference plane for determining the position on the screen to which the object points, the distance being a positive value, and
modify the information displayed on the screen in response to the determination of the position on the screen to which the object points, the distance of the object from the screen, and the speed of movement;
wherein modifying the information displayed on the screen comprises magnifying, at a certain magnification, the information displayed in a portion of the screen to which the object is pointing, the magnification depending on the determined distance between the object and the screen, wherein the determined distance lies between a maximum value and a minimum value, and wherein the magnification ratio varies exponentially as the determined distance varies between the maximum value and the minimum value.
15. The interface according to claim 14, wherein the object comprises an extended finger used for determining where on the screen the user is pointing.
16. The interface according to claim 14, wherein the processor is arranged to analyze the images to determine that the object is extended toward the screen so as to determine the position on the screen to which the object points.
17. The interface according to claim 14, wherein the processor is arranged to analyze the images to track movement of the object so as to determine the position on the screen to which the object points.
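Throughout the claims, the screen serves as the reference plane both for the pointed-to position and for the (positive) object-to-screen distance. Assuming the camera images yield 3-D fingertip and knuckle positions in screen coordinates (x, y in the screen plane, z the distance in front of it), the pointed-to position is the intersection of the pointing ray with the plane z = 0. This geometry sketch is illustrative; the coordinate convention and both point names are assumptions.

```python
def pointed_position(fingertip, knuckle):
    """Intersect the pointing ray with the screen plane z = 0.

    fingertip, knuckle -- (x, y, z) tuples in screen coordinates,
    with z > 0 in front of the screen (the screen is the reference
    plane, as in the claims).
    Returns ((x, y) on the screen, fingertip distance from screen).
    """
    (fx, fy, fz), (kx, ky, kz) = fingertip, knuckle
    if kz <= fz:
        raise ValueError("knuckle must be farther from the screen than the fingertip")
    # Parameter t at which the ray knuckle -> fingertip reaches z = 0.
    t = kz / (kz - fz)
    x = kx + t * (fx - kx)
    y = ky + t * (fy - ky)
    return (x, y), fz
```

The fingertip's z coordinate doubles as the positive distance value the claims feed into the magnification rule.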
CN201110115071.2A 2010-06-09 2011-04-28 Man machine interface based on gesture Active CN102279670B (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP10382168.2 2010-06-09
EP10382168.2A EP2395413B1 (en) 2010-06-09 2010-06-09 Gesture-based human machine interface

Publications (2)

Publication Number Publication Date
CN102279670A CN102279670A (en) 2011-12-14
CN102279670B true CN102279670B (en) 2016-12-14

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101040242A (en) * 2004-10-15 2007-09-19 皇家飞利浦电子股份有限公司 System for 3D rendering applications using hands
US7348963B2 (en) * 2002-05-28 2008-03-25 Reactrix Systems, Inc. Interactive video display system
CN101636207A (en) * 2007-03-20 2010-01-27 科乐美数码娱乐株式会社 Game device, progress control method, information recording medium, and program

Similar Documents

Publication Publication Date Title
US9569010B2 (en) Gesture-based human machine interface
US11720181B2 (en) Cursor mode switching
US8643598B2 (en) Image processing apparatus and method, and program therefor
CN105980965A (en) Systems, devices, and methods for touch-free typing
US20140300542A1 (en) Portable device and method for providing non-contact interface
US20150062004A1 (en) Method and System Enabling Natural User Interface Gestures with an Electronic System
JP5264844B2 (en) Gesture recognition apparatus and method
US20130335324A1 (en) Computer vision based two hand control of content
US20140375547A1 (en) Touch free user interface
US20140240225A1 (en) Method for touchless control of a device
JP2011022984A (en) Stereoscopic video interactive system
US20110298708A1 (en) Virtual Touch Interface
CN103729054A (en) Multi display device and control method thereof
JP2006209563A (en) Interface device
CN103150020A (en) Three-dimensional finger control operation method and system
Zhang et al. A novel human-3DTV interaction system based on free hand gestures and a touch-based virtual interface
TWI486815B (en) Display device, system and method for controlling the display device
CN105373329A (en) Interactive method and system for display and booth
CN102279670B (en) Man machine interface based on gesture
US20090051652A1 (en) Control apparatus and method
CN117311486A (en) Interaction method and device for light field display and light field display system
CN116661596A (en) Man-machine virtual interaction system for exhibition hall construction
WO2017096802A1 (en) Gesture-based operating component control method and device, computer program, and storage medium
EP3776160A1 (en) Device operation control

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant