CN103294996A - 3D gesture recognition method - Google Patents


Info

Publication number
CN103294996A
CN103294996A (application) / CN103294996B (granted patent)
Authority
CN
China
Prior art keywords
gesture
finger
image
describer
specifically
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013101681231A
Other languages
Chinese (zh)
Other versions
CN103294996B (en)
Inventor
程洪
代仲君
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Electronic Science and Technology of China
Original Assignee
University of Electronic Science and Technology of China
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Electronic Science and Technology of China filed Critical University of Electronic Science and Technology of China
Priority to CN201310168123.1A priority Critical patent/CN103294996B/en
Publication of CN103294996A publication Critical patent/CN103294996A/en
Application granted granted Critical
Publication of CN103294996B publication Critical patent/CN103294996B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Abstract

The invention relates to the fields of computer vision and human-computer interaction, and in particular to a 3D gesture recognition method comprising the following steps: an RGB image and a depth image are acquired from an image input device and used as training samples; the hand is detected and segmented using an adaptive dynamic depth threshold; the arm is removed from the hand image by image morphology operations to obtain the palm center; the gesture contour is obtained by edge extraction and described with a time-series curve; the fingertip positions and finger connection points in the gesture are found by image morphology operations, the time-series curve is segmented at the corresponding positions, and the per-finger curve segments are combined to obtain a finger descriptor. The method enables natural interaction with a computer, effectively extends the traditional mode of human-computer interaction, and greatly improves the recognition rate, which can reach 99%.

Description

3D gesture recognition method
Technical field
The present invention relates to the fields of computer vision and human-computer interaction, and in particular to a 3D gesture recognition method.
Background technology
With the development of computers, human-computer interaction has become an indispensable part of daily life, yet most interaction is still based on two-dimensional techniques: the mouse, keyboard, handheld devices, and window interfaces. Making interaction more natural has become a hot research topic in recent years. Gesture is one of the main means of human communication, historically even older than spoken language; using gestures makes human-computer interaction more friendly, convenient, concise, and intuitive, so gestures are a natural extension of traditional human-computer interaction. Recognizing a gesture first requires sensing it. Existing gesture-sensing devices fall into three categories: devices based on handheld hardware, such as Microsoft's digital gesture ring and infrared gesture sensing systems; devices based on touch, such as the iPhone; and devices based on vision, such as TOF cameras and the Kinect.
Summary of the invention
The object of the present invention is to provide a 3D gesture recognition method, solving the problems that existing human-computer interaction methods are not convenient and fast enough and cannot work in complex environments.
To solve the above technical problems, the present invention adopts the following technical solution:
A 3D gesture recognition method comprises the following steps:
Step 1: acquire an RGB image and a depth image from an image input device as training samples.
Step 2: detect and segment the hand by setting an adaptive dynamic depth threshold.
Step 3: remove the arm by image morphology operations and obtain the palm center.
Step 4: obtain the gesture contour by edge extraction and describe it with a time-series curve.
Step 5: obtain the fingertip positions and finger connection points (the valleys between fingers) by image morphology operations, segment the time-series curve at the corresponding positions, and combine the per-finger curve segments into a finger descriptor feature vector.
Step 6: express each class in the gesture training samples as a finger descriptor feature matrix.
Step 7: acquire an RGB image and a depth image in real time from the image input device, and execute Steps 2-5 to express the input gesture as a finger descriptor feature vector.
Step 8: classify the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping, obtaining the gesture recognition result.
A further technical scheme is that, in Step 2, the hand is assumed to be the foremost object and an adaptive dynamic threshold is set: the hand is separated from the background by binarization with the adaptive dynamic threshold, yielding a binarized gesture image.
A further technical scheme is that, in Step 3, the arm is removed by image morphology operations to obtain the palm center: the arm contour is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the geometric center of the palm is computed.
A further technical scheme is that, in Step 4, the gesture contour is obtained by edge extraction: a circle is drawn centered at the palm's geometric center with the palm circumscribed-circle radius to cover the palm in the binary image, yielding the gesture binary image; the edge of the gesture binary image is then extracted to obtain the gesture contour map.
A further technical scheme is that, in Step 4, the gesture contour is described with a time-series curve: the shape-related information contained in the data is obtained by computing angles and distances for the contour vertices and plotting them as a time-series curve, realizing shape-feature extraction. For example, let the palm center be P_0, the contour starting point P_1, and the contour vertices P_i (i = 2, ..., n). The angle ∠P_1P_0P_i = ⟨P_0P_1, P_0P_i⟩ (i = 2, ..., n), normalized by 360°, serves as the abscissa of the time-series curve; the Euclidean distance |P_0P_i| between a contour vertex and the palm center, normalized as |P_0P_i| = |P_0P_i| / max{|P_0P_1|, |P_0P_2|, ..., |P_0P_n|}, serves as the ordinate. This yields the time-series curve describing the gesture contour.
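The angle/distance construction above can be sketched in a few lines of plain Python. This is an illustrative sketch, not the patent's implementation; the function name and the tuple-based point representation are assumptions.

```python
import math

def contour_time_series(palm, contour):
    """Sketch of the time-series contour descriptor: for each contour vertex,
    abscissa = angle at the palm center between the first contour point and the
    vertex, normalized by 360 degrees; ordinate = distance to the palm center,
    normalized by the maximum distance over all vertices."""
    p0 = palm
    p1 = contour[0]

    def angle(p):
        a1 = math.atan2(p1[1] - p0[1], p1[0] - p0[0])
        a2 = math.atan2(p[1] - p0[1], p[0] - p0[0])
        return math.degrees(a2 - a1) % 360.0  # angle ∠P1 P0 Pi in [0, 360)

    dists = [math.hypot(p[0] - p0[0], p[1] - p0[1]) for p in contour]
    dmax = max(dists)
    return [(angle(p) / 360.0, d / dmax) for p, d in zip(contour, dists)]
```

For a square of vertices around the palm center, the abscissa advances by 0.25 per vertex while the ordinate stays at 1.0, matching the normalization described above.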
A further technical scheme is that, in Step 5, the fingertip positions and finger connection points in the gesture are obtained by image morphology operations to segment the time-series curve: the gesture contour is approximated by a polygon, the convex and concave points of the polygon are detected (convex points are fingertips, concave points are finger connection points), the detected points are filtered to obtain the fingertip and connection-point positions, and these positions are mapped onto the time-series curve to segment it. Concretely, the polygon approximation can be obtained with OpenCV's approxPolyDP function and the convex and concave points with OpenCV's convexHull and convexityDefects functions; filtering discards all convex and concave points whose y coordinate lies below the palm center, realizing the segmentation into fingers.
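The filtering step above (discarding candidate points below the palm center) can be sketched without OpenCV; the function name and point representation here are illustrative assumptions.

```python
def filter_finger_points(points, palm_center):
    """Keep only candidate fingertip/valley points above the palm center.
    In image coordinates y grows downward, so 'above' means a smaller y
    value than the palm center's y value."""
    cx, cy = palm_center
    return [p for p in points if p[1] < cy]
```

With a palm center at (15, 30), a candidate at (10, 5) survives while one at (20, 40), which lies below the palm, is discarded.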
A further technical scheme is that, in Step 5, the per-finger time-series curves are combined to obtain the finger descriptor: the finger curve segments obtained from the curve segmentation are combined to obtain the finger descriptor feature vector
f = [f_1, f_2, ..., f_s, ..., f_S],
where the index s ranges over
s ∈ {1, 2, ..., k, 12, 23, ..., (k-1)k, 123, 234, ..., (k-2)(k-1)k, ..., 123...k},
with k ∈ {1, 2, ..., K}, K being the maximum number of fingers appearing in any gesture.
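The index set s enumerates all contiguous runs of finger indices (singles, adjacent pairs, adjacent triples, and so on). A minimal sketch of that enumeration, with an assumed function name:

```python
def finger_combinations(k):
    """Enumerate the contiguous finger-index runs used as descriptor indices:
    s = {1, 2, ..., k, 12, 23, ..., (k-1)k, ..., 123...k}.
    Each combination is returned as a tuple of 1-based finger indices."""
    combos = []
    for length in range(1, k + 1):              # run length: 1 .. k fingers
        for start in range(1, k - length + 2):  # starting finger index
            combos.append(tuple(range(start, start + length)))
    return combos
```

For k fingers this yields k(k+1)/2 combinations, e.g. six for k = 3: three singles, two adjacent pairs, and one triple.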
A further technical scheme is that, in Step 6, each class in the gesture training samples is expressed as a finger descriptor feature matrix composed of the finger descriptor feature vectors of all training samples of that class:
G_c = [f_{c,1}; f_{c,2}; ...; f_{c,N_c}],
where c denotes a gesture class, G_c is an N_c × M_c matrix, f_{c,n} is the nth training sample of class c, N_c is the number of training samples, and M_c is the total number of finger descriptors of class c, which varies with c.
A further technical scheme is that, in Step 7, an RGB image and a depth image are acquired in real time from the image input device, and Steps 2-5 are executed to express the input gesture as a finger descriptor feature vector f_test = [f'_1, f'_2, ..., f'_s, ..., f'_S].
A further technical scheme is that, in Step 8, the image-to-class dynamic time warping classification is performed: image-to-class dynamic time warping is computed between the test sample and the training samples to obtain the similarity between the test sample and each class, and the class with the maximum similarity, i.e., the shortest image-to-class warping path, is selected as the gesture type of the test sample.
Concretely, dynamic time warping between the test data and the training samples is computed as
I2C-DTW(G_c, f_test) = Σ_{s=1}^{S} min_{n ∈ {1,2,...,N_c}} DTW(f_{c,n,s}, f'_s),
where DTW(f_{c,n,s}, f'_s) denotes the shortest warping path between f_{c,n,s} and f'_s, and f_{c,n,s} is one finger combination of f_{c,n}. Finally, the class c with the minimum I2C-DTW(G_c, f_test) is selected as the gesture type of the test data, i.e., the gesture recognition result.
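The formula above can be sketched with a textbook DTW recurrence plus the per-segment minimum over a class's training samples. This is an illustrative sketch under the assumption that each descriptor segment is a 1-D numeric sequence; the data layout (nested lists) is an assumption, not the patent's storage format.

```python
def dtw(a, b):
    """Classic dynamic time warping distance between two 1-D sequences,
    using absolute difference as the local cost."""
    n, m = len(a), len(b)
    INF = float("inf")
    d = [[INF] * (m + 1) for _ in range(n + 1)]
    d[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            d[i][j] = cost + min(d[i - 1][j], d[i][j - 1], d[i - 1][j - 1])
    return d[n][m]

def i2c_dtw(class_segments, test_segments):
    """Image-to-class DTW as in the formula: for each test segment f'_s, take
    the minimum DTW distance over the corresponding segment of every training
    sample in the class, then sum over s.
    class_segments[n][s] is segment s of training sample n."""
    total = 0.0
    for s, test_seg in enumerate(test_segments):
        total += min(dtw(sample[s], test_seg) for sample in class_segments)
    return total
```

The predicted class is then the one minimizing `i2c_dtw` over all classes, matching the "shortest image-to-class warping path" selection rule.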
Compared with the prior art, the beneficial effects of the invention are:
The present invention carries out interaction between the user's gesture information and the computer, i.e., it uses the user's hand contour information to supplement the traditional keyboard-and-mouse interaction mode and enrich the means of interaction. It only needs a Kinect to acquire, in real time, images containing the user's hand; the hand information is analyzed in the computer and the analysis result is converted into control instructions for an application, realizing natural interaction with the computer. The method effectively extends traditional human-computer interaction and considerably improves the recognition rate, which can reach 99%.
Brief description of the drawings
Fig. 1 is the overall flow chart and example diagram of the 3D gesture recognition method of the present invention.
Fig. 2 is a schematic diagram of hand detection and segmentation, gesture contour extraction, and feature extraction in the 3D gesture recognition method of the present invention.
Fig. 3 shows the finger descriptor feature matrix of the training samples in an embodiment of the 3D gesture recognition method of the present invention.
Fig. 4 shows the finger descriptor feature vector of the test data in an embodiment of the 3D gesture recognition method of the present invention.
Embodiments
To make the purpose, technical solution, and advantages of the present invention clearer, the present invention is further elaborated below with reference to the drawings and embodiments. It should be understood that the specific embodiments described here only serve to explain the present invention and are not intended to limit it.
Fig. 1 shows an embodiment of the 3D gesture recognition method of the present invention; the middle of Fig. 1 is the block diagram and the two sides show a concrete example. The 3D gesture recognition method comprises the following steps:
Step 1: acquire an RGB image and a depth image from an image input device as training samples.
Step 2: detect and segment the hand by setting an adaptive dynamic depth threshold.
Step 3: remove the arm by image morphology operations and obtain the palm center.
Step 4: obtain the gesture contour by edge extraction and describe it with a time-series curve.
Step 5: obtain the fingertip positions and finger connection points in the gesture by image morphology operations, segment the time-series curve at the corresponding positions, and combine the per-finger curve segments into a finger descriptor feature vector.
Step 6: express each class in the gesture training samples as a finger descriptor feature matrix.
Step 7: acquire an RGB image and a depth image in real time from the image input device, and execute Steps 2-5 to express the input gesture as a finger descriptor feature vector.
Step 8: classify the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping (Image-to-Class Dynamic Time Warping, I2C-DTW), obtaining the gesture recognition result.
According to a preferred embodiment of the 3D gesture recognition method of the present invention, in Step 2, the hand is assumed to be the foremost object and an adaptive dynamic threshold is set: the hand is separated from the background by binarization with the adaptive dynamic threshold, yielding a binarized gesture image, as shown in Fig. 2(a). The dynamic threshold is obtained as follows: the face is detected in the color image to obtain the face coordinates, which are mapped to the depth image to obtain the mean depth of the face; this separates the body from the background, after which the Otsu adaptive threshold segmentation algorithm is applied.
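Otsu's method, applied above after the background is removed, picks the threshold maximizing between-class variance of the grayscale histogram. A minimal stdlib sketch, assuming 8-bit depth/gray values (the patent uses whatever implementation its pipeline provides, e.g. OpenCV's THRESH_OTSU):

```python
def otsu_threshold(pixels):
    """Otsu's method: choose the threshold t that maximizes the between-class
    variance w0*w1*(m0-m1)^2 of an 8-bit grayscale histogram."""
    hist = [0] * 256
    for v in pixels:
        hist[v] += 1
    total = len(pixels)
    total_sum = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w0 = sum0 = 0
    for t in range(256):
        w0 += hist[t]                 # background pixel count
        if w0 == 0:
            continue
        w1 = total - w0               # foreground pixel count
        if w1 == 0:
            break
        sum0 += t * hist[t]
        m0 = sum0 / w0                # background mean
        m1 = (total_sum - sum0) / w1  # foreground mean
        var = w0 * w1 * (m0 - m1) ** 2
        if var > best_var:
            best_var, best_t = var, t
    return best_t
```

On a cleanly bimodal image (e.g. half the pixels at gray 10, half at 200) the returned threshold separates the two modes, which is exactly the hand/background separation the step relies on.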
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 3, the arm is removed by image morphology operations to obtain the palm center, as shown in Fig. 2(b): the arm contour is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the geometric center of the palm is computed.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 4, the gesture contour is obtained by edge extraction: a circle is drawn centered at the palm's geometric center with the palm circumscribed-circle radius to cover the palm in the binary image, yielding the gesture binary image, as in Fig. 2(c); the edge of the gesture binary image is then extracted to obtain the gesture contour map, as in Fig. 2(d).
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 4, the gesture contour is described with a time-series curve: the shape-related information contained in the data is obtained by computing angles and distances for the contour vertices and plotting them as a time-series curve, realizing shape-feature extraction. For example, let the palm center be P_0, the contour starting point P_1, and the contour vertices P_i (i = 2, ..., n). The angle ∠P_1P_0P_i = ⟨P_0P_1, P_0P_i⟩ (i = 2, ..., n), normalized by 360°, serves as the abscissa of the time-series curve; the Euclidean distance |P_0P_i| between a contour vertex and the palm center, normalized as |P_0P_i| = |P_0P_i| / max{|P_0P_1|, |P_0P_2|, ..., |P_0P_n|}, serves as the ordinate. This yields the time-series curve describing the gesture contour, as in Fig. 2(e). The time-series curve is thus converted from the gesture contour so as to capture the shape-related information in the data, realizing shape-feature extraction.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 5, the fingertip positions and finger connection points in the gesture are obtained by image morphology operations to segment the time-series curve: the gesture contour is approximated by a polygon, the convex and concave points of the polygon are detected (convex points are fingertips, concave points are finger connection points), the detected points are filtered to obtain the fingertip and connection-point positions, and these positions are mapped onto the time-series curve to segment it. Concretely, the polygon approximation can be obtained with OpenCV's approxPolyDP function and the convex and concave points with OpenCV's convexHull and convexityDefects functions; filtering discards all convex and concave points whose y coordinate lies below the palm center, realizing the segmentation into fingers.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 5, the per-finger time-series curves are combined to obtain the finger descriptor: the finger curve segments obtained from the curve segmentation are combined to obtain the finger descriptor feature vector
f = [f_1, f_2, ..., f_s, ..., f_S],
where the index s ranges over
s ∈ {1, 2, ..., k, 12, 23, ..., (k-1)k, 123, 234, ..., (k-2)(k-1)k, ..., 123...k},
with k ∈ {1, 2, ..., K}, K being the maximum number of fingers appearing in any gesture.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 6, each class in the gesture training samples is expressed as a finger descriptor feature matrix composed of the finger descriptor feature vectors of all training samples of that class:
G_c = [f_{c,1}; f_{c,2}; ...; f_{c,N_c}],
where c denotes a gesture class, G_c is an N_c × M_c matrix, f_{c,n} is the nth training sample of class c, N_c is the number of training samples, and M_c is the total number of finger descriptors of class c, which varies with c.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 7, an RGB image and a depth image are acquired in real time from the image input device, and Steps 2-5 are executed to express the input gesture as a finger descriptor feature vector f_test = [f'_1, f'_2, ..., f'_s, ..., f'_S], as shown in Fig. 4.
According to another preferred embodiment of the 3D gesture recognition method of the present invention, in Step 8, the image-to-class dynamic time warping classification is performed: image-to-class dynamic time warping is computed between the test sample and the training samples to obtain the similarity between the test sample and each class, and the class with the maximum similarity, i.e., the shortest image-to-class warping path, is selected as the gesture type of the test sample.
Concretely, dynamic time warping between the test data and the training samples is computed as
I2C-DTW(G_c, f_test) = Σ_{s=1}^{S} min_{n ∈ {1,2,...,N_c}} DTW(f_{c,n,s}, f'_s),
where DTW(f_{c,n,s}, f'_s) denotes the shortest warping path between f_{c,n,s} and f'_s, and f_{c,n,s} is one finger combination of f_{c,n}. Finally, the class c with the minimum I2C-DTW(G_c, f_test) is selected as the gesture type of the test data, i.e., the gesture recognition result.
In addition, the dynamic threshold is computed as follows: detect the k points with the smallest gray values r_i (i = 1, 2, ..., k) in the depth image, excluding noise points with gray values in the range 0-10 (k = 100 in this patent); the dynamic threshold is then given by the formula shown in the original drawing (the formula appears only as an image in the source; from the context it combines the k smallest gray values with an offset ρ), where ρ is an experimentally estimated value that the user can change according to the specific situation.
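Since the threshold formula survives only as an image, the sketch below ASSUMES the natural reading: the mean of the k smallest valid gray values plus the experimental offset ρ. The function name, the mean interpretation, and the keyword parameters are all assumptions for illustration.

```python
def dynamic_depth_threshold(depth_values, k=100, rho=5.0, noise_max=10):
    """Hedged sketch of the patent's dynamic threshold.
    ASSUMPTION: threshold = mean of the k smallest depth gray values + rho.
    Noise points with gray value in 0..noise_max are discarded first,
    as the text specifies."""
    valid = sorted(v for v in depth_values if v > noise_max)
    nearest = valid[:k]              # the k points closest to the camera
    return sum(nearest) / len(nearest) + rho
```

With 100 foreground pixels at gray 20, background at 50, and a few zero-valued noise pixels, the threshold lands just above the hand's depth band, which is the behavior the step needs.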
In the above embodiments, the RGB image may be a color image; the depth image may be a grayscale image whose values represent the distance of objects from the image input device: the larger the gray value, the farther the distance, with the exact relation between gray value and distance depending on the specific image input device.
The present invention comprises two techniques, hand detection and hand recognition, together with a complete system that integrates them; both techniques and the integrated system achieve real-time, stable performance against natural complex backgrounds. Hand detection combines depth information with color information and exploits the particularities of the human hand to accurately detect the hand and wrist and segment the hand from the image to obtain its position. The image-to-class dynamic time warping algorithm creatively fuses an image-to-class matching algorithm with dynamic time warping, and can accurately obtain the state of the hand gesture in each frame, including the number of fingers, their positions, and the fingertip angles. The two techniques combined can form a human-computer gesture-interaction system. The present invention can be widely applied to smart homes, health care, education, computer games, and other fields.

Claims (10)

1. A 3D gesture recognition method, characterized by comprising the following steps:
Step 1: acquiring an RGB image and a depth image from an image input device as training samples;
Step 2: detecting and segmenting the hand by setting an adaptive dynamic depth threshold;
Step 3: removing the arm by image morphology operations and obtaining the palm center;
Step 4: obtaining the gesture contour by edge extraction and describing it with a time-series curve;
Step 5: obtaining the fingertip positions and finger connection points in the gesture by image morphology operations, segmenting the time-series curve at the corresponding positions, and combining the per-finger curve segments into a finger descriptor feature vector;
Step 6: expressing each class in the gesture training samples as a finger descriptor feature matrix;
Step 7: acquiring an RGB image and a depth image in real time from the image input device, and executing Steps 2-5 to express the input gesture as a finger descriptor feature vector;
Step 8: classifying the finger descriptor feature vector obtained in Step 7 against the feature matrices obtained in Step 6 by image-to-class dynamic time warping, obtaining the gesture recognition result.
2. The 3D gesture recognition method according to claim 1, characterized in that: in Step 2, the hand is assumed to be the foremost object and an adaptive dynamic threshold is set: the hand is separated from the background by binarization with the adaptive dynamic threshold, yielding a binarized gesture image.
3. The 3D gesture recognition method according to claim 1, characterized in that: in Step 3, the arm is removed by image morphology operations to obtain the palm center: the arm contour is detected, its lowest point is selected as the wrist position, the part below the wrist is removed, erosion and dilation are applied to the remaining image to remove the fingers and obtain the palm, and the geometric center of the palm is computed.
4. The 3D gesture recognition method according to claim 1, characterized in that: in Step 4, the gesture contour is obtained by edge extraction: a circle is drawn centered at the palm's geometric center with the palm circumscribed-circle radius to cover the palm in the binary image, yielding the gesture binary image; the edge of the gesture binary image is extracted to obtain the gesture contour map.
5. The 3D gesture recognition method according to any one of claims 1 to 4, characterized in that: in Step 4, the gesture contour is described with a time-series curve: the shape-related information contained in the data is obtained by computing angles and distances for the contour vertices and plotting them as a time-series curve, realizing shape-feature extraction.
6. The 3D gesture recognition method according to claim 5, characterized in that: in Step 5, the fingertip positions and finger connection points in the gesture are obtained by image morphology operations to segment the time-series curve: the gesture contour is approximated by a polygon, the convex and concave points of the polygon are detected (convex points are fingertips, concave points are finger connection points), the detected points are filtered to obtain the fingertip and connection-point positions, and these positions are mapped onto the time-series curve to segment it.
7. The 3D gesture recognition method according to claim 6, characterized in that: in Step 5, the per-finger time-series curves are combined to obtain the finger descriptor: the finger curve segments obtained from the curve segmentation are combined to obtain the finger descriptor feature vector.
8. The 3D gesture recognition method according to claim 7, characterized in that: in Step 6, each class in the gesture training samples is expressed as a finger descriptor feature matrix composed of the finger descriptor feature vectors of all training samples of that class.
9. The 3D gesture recognition method according to claim 8, characterized in that: in Step 7, an RGB image and a depth image are acquired in real time from the image input device, and Steps 2-5 are executed to express the input gesture as a finger descriptor feature vector.
10. The 3D gesture recognition method according to claim 9, characterized in that: in Step 8, the image-to-class dynamic time warping classification is performed: image-to-class dynamic time warping is computed between the test sample and the training samples to obtain the similarity between the test sample and each class; the class with the maximum similarity, i.e., the shortest image-to-class warping path, is selected as the gesture type of the test sample.
CN201310168123.1A 2013-05-09 2013-05-09 3D gesture recognition method Active CN103294996B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310168123.1A CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310168123.1A CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Publications (2)

Publication Number Publication Date
CN103294996A true CN103294996A (en) 2013-09-11
CN103294996B CN103294996B (en) 2016-04-27

Family

ID=49095827

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310168123.1A Active CN103294996B (en) 2013-05-09 2013-05-09 3D gesture recognition method

Country Status (1)

Country Link
CN (1) CN103294996B (en)

Cited By (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103679213A (en) * 2013-12-13 2014-03-26 电子科技大学 3D gesture recognition method
CN104268138A (en) * 2014-05-15 2015-01-07 西安工业大学 Method for capturing human motion by aid of fused depth images and three-dimensional models

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20050271279A1 (en) * 2004-05-14 2005-12-08 Honda Motor Co., Ltd. Sign based human-machine interaction
CN102769802A (en) * 2012-06-11 2012-11-07 西安交通大学 Man-machine interactive system and man-machine interactive method of smart television
CN102945079A (en) * 2012-11-16 2013-02-27 武汉大学 Intelligent recognition and control-based stereographic projection system and method
CN102968178A (en) * 2012-11-07 2013-03-13 电子科技大学 Gesture-based PPT (Power Point) control system


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Li Wensheng et al., "A human-computer interaction technology framework based on multi-point gesture recognition", Computer Engineering and Design *
Jia Jianjun, "Research on vision-based gesture recognition technology", China Masters' Theses Full-text Database (Electronic Journal) *

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104375631A (en) * 2013-10-22 2015-02-25 安徽寰智信息科技股份有限公司 Non-contact interaction method based on mobile terminal
CN104699238B (en) * 2013-12-10 2019-01-22 现代自动车株式会社 System and method for recognizing a user's gesture to perform a vehicle operation
CN104699238A (en) * 2013-12-10 2015-06-10 现代自动车株式会社 System and method for gesture recognition of vehicle
CN103679213A (en) * 2013-12-13 2014-03-26 电子科技大学 3D gesture recognition method
CN104714637B (en) * 2013-12-16 2017-09-01 纬创资通股份有限公司 Polygonal gesture detection and interaction method, device and computer program product
CN104714637A (en) * 2013-12-16 2015-06-17 纬创资通股份有限公司 Polygonal gesture detection and interaction method, device and computer program product
CN104750242B (en) * 2013-12-31 2019-04-16 现代自动车株式会社 Apparatus and method for recognizing a user's gesture to perform a vehicle operation
CN104750242A (en) * 2013-12-31 2015-07-01 现代自动车株式会社 Apparatus and method for recognizing user's gesture for carrying out operation of vehicle
WO2015112194A3 (en) * 2014-01-22 2015-11-05 Lsi Corporation Image processor comprising gesture recognition system with static hand pose recognition based on dynamic warping
CN104978012A (en) * 2014-04-03 2015-10-14 华为技术有限公司 Pointing interactive method, device and system
CN104978012B (en) * 2014-04-03 2018-03-16 华为技术有限公司 Pointing interaction method, apparatus and system
US10466797B2 (en) 2014-04-03 2019-11-05 Huawei Technologies Co., Ltd. Pointing interaction method, apparatus, and system
CN104268138B (en) * 2014-05-15 2017-08-15 西安工业大学 Human motion capture method fusing depth maps and three-dimensional models
CN104268138A (en) * 2014-05-15 2015-01-07 西安工业大学 Method for capturing human motion by aid of fused depth images and three-dimensional models
US9984461B2 (en) 2014-06-27 2018-05-29 Huawei Technologies Co., Ltd. Method, apparatus, and terminal for obtaining vital sign data of target object
WO2015197026A1 (en) * 2014-06-27 2015-12-30 华为技术有限公司 Method, apparatus and terminal for acquiring sign data of target object
CN105354812A (en) * 2014-07-10 2016-02-24 北京中科盘古科技发展有限公司 Method for identifying profile interaction based on multi-Kinect collaboration depth threshold segmentation algorithm
CN105354812B (en) * 2014-07-10 2020-10-16 北京中科盘古科技发展有限公司 Contour recognition interaction method based on a multi-Kinect cooperative depth-threshold segmentation algorithm
CN105654103A (en) * 2014-11-12 2016-06-08 联想(北京)有限公司 Image identification method and electronic equipment
CN105654103B (en) * 2014-11-12 2020-03-24 联想(北京)有限公司 Image identification method and electronic equipment
CN104333794A (en) * 2014-11-18 2015-02-04 电子科技大学 Channel selection method based on depth gestures
CN104766055A (en) * 2015-03-26 2015-07-08 济南大学 Method for removing wrist image in gesture recognition
CN104899600B (en) * 2015-05-28 2018-07-17 北京工业大学 Hand feature point detection method based on depth maps
CN104899600A (en) * 2015-05-28 2015-09-09 北京工业大学 Depth map based hand feature point detection method
CN105320937A (en) * 2015-09-25 2016-02-10 北京理工大学 Kinect based traffic police gesture recognition method
CN105320937B (en) * 2015-09-25 2018-08-14 北京理工大学 Kinect-based traffic police gesture recognition method
CN106610716A (en) * 2015-10-21 2017-05-03 华为技术有限公司 Gesture recognition method and device
US10732724B2 (en) 2015-10-21 2020-08-04 Huawei Technologies Co., Ltd. Gesture recognition method and apparatus
CN106610716B (en) * 2015-10-21 2019-08-27 华为技术有限公司 Gesture recognition method and device
WO2017113736A1 (en) * 2015-12-27 2017-07-06 乐视控股(北京)有限公司 Method of distinguishing finger from wrist, and device for same
CN105739702A (en) * 2016-01-29 2016-07-06 电子科技大学 Multi-posture fingertip tracking method for natural man-machine interaction
CN105739702B (en) * 2016-01-29 2019-01-22 电子科技大学 Multi-pose fingertip tracking method for natural human-computer interaction
CN107292904A (en) * 2016-03-31 2017-10-24 北京市商汤科技开发有限公司 Palm tracking method and system based on depth images
CN107292904B (en) * 2016-03-31 2018-06-15 北京市商汤科技开发有限公司 Palm tracking method and system based on depth images
CN106446911A (en) * 2016-09-13 2017-02-22 李志刚 Hand recognition method based on image edge line curvature and distance features
CN106446911B (en) * 2016-09-13 2018-09-18 李志刚 Hand recognition method based on image edge line curvature and distance features
CN106503620A (en) * 2016-09-26 2017-03-15 深圳奥比中光科技有限公司 Numerical ciphers input method and its system based on gesture
CN107977070A (en) * 2016-10-25 2018-05-01 中兴通讯股份有限公司 Method, device and system for controlling virtual reality video through gestures
CN107977070B (en) * 2016-10-25 2021-09-28 中兴通讯股份有限公司 Method, device and system for controlling virtual reality video through gestures
CN106529480A (en) * 2016-11-14 2017-03-22 江汉大学 Finger tip detection and gesture identification method and system based on depth information
DE102017210317A1 (en) * 2017-06-20 2018-12-20 Volkswagen Aktiengesellschaft Method and device for detecting a user input by means of a gesture
CN111095163A (en) * 2017-06-20 2020-05-01 大众汽车有限公司 Method and apparatus for detecting user input in dependence on gesture
US11430267B2 (en) 2017-06-20 2022-08-30 Volkswagen Aktiengesellschaft Method and device for detecting a user input on the basis of a gesture
CN107491763A (en) * 2017-08-24 2017-12-19 歌尔科技有限公司 Finger region segmentation method and device based on depth images
CN108255298A (en) * 2017-12-29 2018-07-06 安徽慧视金瞳科技有限公司 Infrared gesture recognition method and device in a projection interaction system
CN108255298B (en) * 2017-12-29 2021-02-19 安徽慧视金瞳科技有限公司 Infrared gesture recognition method and device in projection interaction system
US11315265B2 (en) 2018-09-03 2022-04-26 Boe Technology Group Co., Ltd. Fingertip detection method, fingertip detection device, and medium
WO2020048213A1 (en) * 2018-09-03 2020-03-12 京东方科技集团股份有限公司 Fingertip detection method, fingertip detection means, fingertip detection device, and medium
CN110874179A (en) * 2018-09-03 2020-03-10 京东方科技集团股份有限公司 Fingertip detection method, fingertip detection device, and medium
CN110874179B (en) * 2018-09-03 2021-09-14 京东方科技集团股份有限公司 Fingertip detection method, fingertip detection device, and medium
CN109375766A (en) * 2018-09-13 2019-02-22 何艳玲 Novel learning method based on gesture control
CN110046603A (en) * 2019-04-25 2019-07-23 合肥工业大学 Gesture action recognition method for Chinese Pule sign language coding
CN110046603B (en) * 2019-04-25 2020-11-27 合肥工业大学 Gesture action recognition method for Chinese Pule sign language coding
CN111309149B (en) * 2020-02-21 2022-08-19 河北科技大学 Gesture recognition method and gesture recognition device
CN111309149A (en) * 2020-02-21 2020-06-19 河北科技大学 Gesture recognition method and gesture recognition device
CN111466882A (en) * 2020-04-23 2020-07-31 上海祉云医疗科技有限公司 Intelligent traditional Chinese medicine hand diagnosis analysis system and method
CN111723698A (en) * 2020-06-05 2020-09-29 中南民族大学 Method and equipment for controlling lamplight based on gestures
CN111753771A (en) * 2020-06-29 2020-10-09 武汉虹信技术服务有限责任公司 Gesture event recognition method, system and medium
CN112507924A (en) * 2020-12-16 2021-03-16 深圳荆虹科技有限公司 3D gesture recognition method, device and system
CN112507924B (en) * 2020-12-16 2024-04-09 深圳荆虹科技有限公司 3D gesture recognition method, device and system

Also Published As

Publication number Publication date
CN103294996B (en) 2016-04-27

Similar Documents

Publication Publication Date Title
CN103294996B (en) 3D gesture recognition method
JP6079832B2 (en) Human-computer interaction system, hand pointing-point positioning method, and finger gesture determination method
CN104899600B (en) Hand feature point detection method based on depth maps
Feng et al. Features extraction from hand images based on new detection operators
US20120113241A1 (en) Fingertip tracking for touchless user interface
CN103971102A (en) Static gesture recognition method based on finger contours and decision trees
Zhu et al. Vision based hand gesture recognition using 3D shape context
RU2013154102A (en) FINGER RECOGNITION AND TRACKING SYSTEM
US10366281B2 (en) Gesture identification with natural images
CN102520790A (en) Character input method based on image sensing module, device and terminal
CN103226388A (en) Kinect-based handwriting method
CN111414837A (en) Gesture recognition method and device, computer equipment and storage medium
CN103092437A (en) Portable touch interactive system based on image processing technology
CN107450717B (en) Information processing method and wearable device
CN106503619B (en) Gesture recognition method based on BP neural network
CN102831408A (en) Human face recognition method
KR20150075648A (en) Method and recording medium for contactless input interface with real-time hand pose recognition
CN105046249B (en) Man-machine interaction method
CN109189219A (en) Implementation method of a contactless virtual mouse based on gesture recognition
US20160187991A1 (en) Re-anchorable virtual panel in three-dimensional space
Geetha et al. Dynamic gesture recognition of Indian sign language considering local motion of hand using spatial location of Key Maximum Curvature Points
CN102194097A (en) Multifunctional method for identifying hand gestures
Chang et al. Automatic hand-pose trajectory tracking system using video sequences
Dhamanskar et al. Human computer interaction using hand gestures and voice
Wong et al. Virtual touchpad: Hand gesture recognition for smartphone with depth camera

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant