CN103455794A - Dynamic gesture recognition method based on frame fusion technology - Google Patents

Dynamic gesture recognition method based on frame fusion technology

Info

Publication number
CN103455794A
CN103455794A CN2013103741769A CN201310374176A CN103455794B
Authority
CN
China
Prior art keywords
gesture
density distribution
dynamic
distribution feature
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2013103741769A
Other languages
Chinese (zh)
Other versions
CN103455794B (en)
Inventor
冯志全
张廷芳
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
University of Jinan
Original Assignee
University of Jinan
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by University of Jinan filed Critical University of Jinan
Priority to CN201310374176.9A priority Critical patent/CN103455794B/en
Publication of CN103455794A publication Critical patent/CN103455794A/en
Application granted granted Critical
Publication of CN103455794B publication Critical patent/CN103455794B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

The invention discloses a dynamic gesture recognition method based on frame fusion technology. The method comprises the following steps. A predefined group of dynamic gestures is divided into gesture sets according to the density distribution feature: for each gesture set, the density distribution feature parameters of the frame-fusion image of each gesture are obtained, and their mean value within the set is taken as the template feature vector H of that gesture set. For a dynamic gesture image sequence to be recognized, the mapped static combination image Q is obtained. The density distribution feature of Q is computed, and the gesture set to which the dynamic gesture belongs is identified from this feature. Finally, the individual gesture within the set is determined according to the range of the density distribution feature, which also decides whether the Hausdorff distance method or the fingertip feature point method is used for the final recognition.

Description

A dynamic gesture recognition method based on frame fusion technology
Technical field
The present invention relates to the field of dynamic gesture recognition, and in particular to a dynamic gesture recognition method based on frame fusion technology.
Background technology
With the rapid development of computer technology, human-computer interaction has gradually become one of the most active research topics, and the importance of the human hand gives it great research value in this field. As the most natural, intuitive, and easy-to-learn means of human-computer interaction, the hand has been widely studied and applied, and vision-based gesture recognition has become one of the main research directions.
According to their motion characteristics, gestures can be divided into static gestures and dynamic gestures. A static gesture conveys information through the shape and contour of the hand, whereas in a dynamic gesture the position and shape of the hand change over time, so more accurate and detailed information can be conveyed. Current gesture recognition methods include template matching (Model/Template Matching), hidden Markov models (HMM, Hidden Markov Model), and dynamic time warping (DTW).
Template matching is mostly used for static gesture recognition. In 2002, Zhang Liangguo, Wu Jiangqin, and others proposed a gesture recognition method based on the Hausdorff distance, applying the idea of Hausdorff-distance template matching to achieve relatively robust gesture recognition. Huang Guofan and Li Ying proposed an alphabet gesture recognition method that first preprocesses all alphabet gesture images and then identifies them by template matching.
In 2003, Ahmed Elgammal proposed a nonparametric HMM model that represents a dynamic gesture and performs gesture recognition from a series of learned poses within a probabilistic framework. Yan Yan et al. used an HMM gesture command model and adopted k-means for vector quantization of gesture sequences, thereby improving recognition performance.
DTW is a pattern matching algorithm with a nonlinear time normalization effect. Trevor J. et al. proposed a DTW method in 1996. The DTW method is simple and effective, allowing sufficient elasticity between the test pattern and the reference pattern so that correct classification can be achieved. Other researchers have used dynamic time warping to effectively improve the efficiency of acceleration-based dynamic gesture recognition.
Summary of the invention
The technical problem to be solved by the present invention is to provide a dynamic gesture recognition method based on frame fusion technology that increases the accuracy of dynamic gesture recognition.
The present invention adopts the following technical scheme to achieve this goal:
A dynamic gesture recognition method based on frame fusion technology, characterized by comprising the following steps:
(1) Divide a predefined group of dynamic gestures into gesture sets according to the density distribution feature. For each gesture set, obtain the density distribution feature parameters of the frame-fusion image of each gesture, then compute the mean value of these parameters within the set, and use this mean value as the template feature vector H of the gesture set;
(2) Obtain the static combination image Q onto which the dynamic gesture image sequence to be recognized is mapped;
(3) Compute the density distribution feature of the static combination image Q and identify the gesture set to which the dynamic gesture belongs according to this feature;
(4) Determine the individual gesture within the gesture set according to the range of the density distribution feature, which further decides whether the Hausdorff distance method or the fingertip feature point method is used.
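The per-set template construction in step (1) reduces to a component-wise mean over the density-distribution-feature vectors of a gesture set. A minimal Python sketch (the function name `template_vector` is an illustrative assumption, not terminology from the patent):

```python
def template_vector(ddf_vectors):
    """Template feature vector H of a gesture set: the component-wise
    mean of the density-distribution-feature vectors of the set's
    frame-fusion images (step (1))."""
    n = len(ddf_vectors)
    return [sum(component) / n for component in zip(*ddf_vectors)]
```

Each input vector would be the DDF of one gesture's frame-fusion image, computed as described in step (3).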
As a further refinement of the technical scheme, step (2) comprises the following steps:
(2.1) Acquire a continuous dynamic gesture image sequence of N frames {P_i}, i = 0, 1, 2, …, N;
(2.2) The mapping relation between a pixel (X, Y) in the static combination image Q and the pixels of {P_i} is as follows:

$$\begin{pmatrix} x_i \\ y_i \end{pmatrix} = \begin{pmatrix} \frac{a}{2} + \left(X - \gamma\sin\beta - \frac{A}{2}\right)\cos\alpha + \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\sin\alpha \\ \frac{b}{2} - \left(X - \gamma\sin\beta - \frac{A}{2}\right)\sin\alpha - \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\cos\alpha \end{pmatrix} \qquad (1)$$

where x_i, y_i are the horizontal and vertical coordinates of a pixel in the small image P_i; X, Y are the coordinates of the corresponding pixel in the combination image Q; a, b are the width and height of P_i; A, B are the width and height of Q; γ is the radius of the circle on which each small image P_i is arranged in Q; α is the rotation angle of P_i about its own centre; and β is the angle between the radius through the centre of the rotated P_i and the vertical axis of the rectangular coordinate system established in Q, with:

$$\alpha = \frac{3}{2}\pi + \frac{2\pi}{N} i \qquad (2)$$

$$\beta = \frac{2\pi}{N} i \qquad (3)$$
As a further refinement of the technical scheme, step (3) comprises:
(3.1) Let f(x, y) be the planar (binary) image formed by the static combination image Q, and compute the centroid (centre of gravity) of f(x, y);
(3.2) Compute the maximum distance D_max and the minimum distance D_min from the target pixels of f(x, y) to the centroid;
(3.3) Centred at the centroid, construct the circle of radius D_max circumscribing the target region and the inner circle of radius D_min; in the annular region between these two circles, divide the image into M sub-image regions (M > 0) using equidistant concentric partitioning;
(3.4) Count the target pixels in each sub-image region to obtain the totals S_i (i = 1, …, M), and find the maximum of S_i:

$$S_{\max} = \max_{i=1,\dots,M} S_i \qquad (6)$$

(3.5) Compute the density distribution feature d of the static combination image:

$$r_i = S_i / S_{\max} \quad (i = 1, \dots, M) \qquad (7)$$

$$dr_i = \begin{cases} |r_1 - r_2| & i = 1 \\ |2r_i - r_{i-1} - r_{i+1}| & 1 < i < M \\ |r_M - r_{M-1}| & i = M \end{cases} \qquad (8)$$

$$d = (r_1, \dots, r_M;\ dr_1, \dots, dr_M) \qquad (9)$$

(3.6) Obtain the modified feature vector according to the formula DDF′ = (r_1, …, r_10, a·r_11, …, a·r_15, b·r_16, …, b·r_20; dr_1, …, dr_10, c·dr_11, …, c·dr_M), where a, b, c are user-defined parameters;
(3.7) Compare the resulting feature vector with the template feature vector H of each gesture set by computing the Euclidean distance to each; the gesture set with the minimum Euclidean distance is the selected gesture set.
As a further refinement of the technical scheme, step (4) comprises the following steps:
(4.1) For gestures in a gesture set whose 35th density-distribution-feature parameter lies in the range 5 to 6, further identify the last frame of the dynamic gesture using the Hausdorff distance, thereby achieving accurate gesture recognition;
(4.2) For gestures in a gesture set whose 35th density-distribution-feature parameter lies in the range 3 to 4, further identify the last frame of the dynamic gesture using the fingertip feature point method, thereby achieving accurate gesture recognition.
As a further refinement of the technical scheme, the Hausdorff distance method of step (4.1) further identifies the last frame of the dynamic gesture through the following steps:
(4.1.1) First train the gesture set to obtain the boundary point set L of each gesture;
(4.1.2) Then compute the boundary point set e of the last frame of the gesture to be recognized;
(4.1.3) Compute the Hausdorff distance between e and each boundary point set L;
(4.1.4) Output the template gesture index i with the minimum Hausdorff distance; this index identifies the correct gesture.
As a further refinement of the technical scheme, the fingertip feature point method of step (4.2) comprises the following steps:
(4.2.1) First train the gesture set: detect the fingertip points of each gesture in the set, and record the vectors formed from the gesture centroid to each fingertip point as feature information in the vector template G;
(4.2.2) Then detect the fingertip vectors of the gesture to be recognized, compare them with the fingertip vectors in the template G, and output the template gesture index i with the maximum similarity; this index identifies the correct gesture.
Compared with the prior art, the advantage of the present invention is that, by adopting the combination-image method, the recognition of a dynamic gesture is converted into the recognition of a static combination image, which effectively improves the recognition rate of dynamic gestures. In addition, the time cost of the method is low, within the range allowed by the computer.
Description of the drawings
Fig. 1 is the static combination image corresponding to a captured dynamic gesture of the present invention.
Fig. 2 shows the initial state and the various final-state gesture images of the 10 dynamically captured gestures of the preferred embodiment.
Fig. 3 is the flow chart of the preferred embodiment.
Fig. 4 is a schematic diagram of classifying static combination images by the density distribution feature method in the preferred embodiment.
Fig. 5 is a schematic diagram of fingertip detection in the preferred embodiment.
Embodiment
The present invention is described in further detail below with reference to the accompanying drawings and preferred embodiments.
1. Combination image
How to process the image stream of a dynamic gesture is a major issue in dynamic gesture recognition. Herein, every frame of a dynamic gesture sequence is mapped into one static combination image, and the recognition of the dynamic gesture is completed by analyzing this combination image. In the combination image, the frames of the dynamic image stream are arranged in a circle. Recognizing the combination image completes the first step of gesture recognition, a coarse classification of the gesture. The combination image is shown in Fig. 1.
The mapping relation between the dynamic gesture image stream and the combination image is as follows.
Suppose a continuous dynamic gesture image sequence of N frames {P_i}, i = 0, 1, 2, …, N, has been acquired. The mapping relation between a pixel (X, Y) in the combination image and the pixels of {P_i} is:

$$\begin{pmatrix} x_i \\ y_i \end{pmatrix} = \begin{pmatrix} \frac{a}{2} + \left(X - \gamma\sin\beta - \frac{A}{2}\right)\cos\alpha + \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\sin\alpha \\ \frac{b}{2} - \left(X - \gamma\sin\beta - \frac{A}{2}\right)\sin\alpha - \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\cos\alpha \end{pmatrix} \qquad (1)$$

where x_i, y_i are the coordinates of a pixel in the small image P_i; X, Y are the coordinates of the corresponding pixel in the combination image Q; a, b are the width and height of P_i; A, B are the width and height of Q. γ is the radius of the circle on which each small image P_i is arranged in Q. α is the rotation angle of P_i about its own centre, and β is the angle between the radius through the centre of the rotated P_i and the vertical axis of the rectangular coordinate system established in Q. And:

$$\alpha = \frac{3}{2}\pi + \frac{2\pi}{N} i \qquad (2)$$

$$\beta = \frac{2\pi}{N} i \qquad (3)$$
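The mapping of Eqs. (1) to (3) can be sketched as follows. The function name and the sample dimensions used in the usage note (40×40 frames on a 400×400 canvas with radius γ = 120) are illustrative assumptions, not values from the patent:

```python
import math

def map_pixel(X, Y, i, N, a, b, A, B, gamma):
    """Map a pixel (X, Y) of the static combination image Q back to the
    coordinates (x_i, y_i) inside the i-th frame P_i, following Eq. (1).
    a, b: frame width/height; A, B: combination-image width/height;
    gamma: radius of the circle on which the frames are arranged."""
    alpha = 1.5 * math.pi + 2.0 * math.pi * i / N   # Eq. (2): frame's own rotation
    beta = 2.0 * math.pi * i / N                    # Eq. (3): angular position in Q
    u = X - gamma * math.sin(beta) - A / 2          # offset from frame centre in Q
    v = Y + gamma * math.cos(beta) - B / 2
    x_i = a / 2 + u * math.cos(alpha) + v * math.sin(alpha)
    y_i = b / 2 - u * math.sin(alpha) - v * math.cos(alpha)
    return x_i, y_i
```

For each frame index i, scanning the pixels (X, Y) of Q and copying the pixel of P_i at (x_i, y_i), whenever it falls inside the frame bounds, produces the circular arrangement shown in Fig. 1.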
2. Density Distribution Feature (DDF)
After frame fusion, the density distribution feature of the resulting static combination image is computed.
After binarization, the information conveyed by an image depends on the arrangement of its object pixels. The basic goal of the density distribution feature is to obtain the pixel distribution of the image by counting the object pixels in different spatial regions, thereby characterizing the binary image. Different images can then be recognized by classifying their density distribution features.
Herein, according to the way gestures vary within the combination image, a certain modification is made to the density distribution feature.
The original DDF feature is expressed as follows:

$$DDF = (r_1, \dots, r_M;\ dr_1, \dots, dr_M) \qquad (4)$$

In a static combination image, the changes between gestures are overwhelmingly concentrated in the finger regions, while the palm-centre region changes little. Increasing the DDF weight of the finger regions therefore effectively reduces the DDF similarity between different combination images and improves recognition efficiency. After extensive experiments, the new DDF feature is defined as follows:

$$DDF' = (r_1, \dots, r_{10},\ a r_{11}, \dots, a r_{15},\ b r_{16}, \dots, b r_{20};\ dr_1, \dots, dr_{10},\ c\,dr_{11}, \dots, c\,dr_M) \qquad (5)$$

where a, b, c are user-defined parameters; the values used herein are a = 3, b = 6, c = 3.
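With M = 20 ring regions, the re-weighting of Eq. (5) is a simple element-wise scaling of the base DDF vector of Eq. (4). A minimal sketch (the function name is an illustrative assumption):

```python
def weight_ddf(r, dr, a=3, b=6, c=3):
    """Apply the finger-region weighting of Eq. (5) to a base DDF vector
    (Eq. (4)) with M = 20 ring regions. Components r_11..r_15 get weight a,
    r_16..r_20 weight b, and dr_11..dr_M weight c (1-based indices as in
    the patent; Python lists are 0-based)."""
    assert len(r) == 20 and len(dr) == 20
    r_w = r[:10] + [a * x for x in r[10:15]] + [b * x for x in r[15:20]]
    dr_w = dr[:10] + [c * x for x in dr[10:]]
    return r_w + dr_w
```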
3. Gesture recognition
The present invention recognizes 10 kinds of dynamic grasping gestures. The initial state of the dynamic gesture and its various final-state gesture images are shown in Fig. 2.
The recognition process is divided into two stages. The first stage identifies the combination image by the density distribution feature method, determining the gesture set to which the dynamic gesture belongs. The second stage further identifies the last frame of the dynamic gesture within that gesture set using the Hausdorff distance or the feature point method, thereby achieving accurate recognition. The process is shown in Fig. 3.
3.1 Coarse gesture recognition
In this stage, static combination images are classified by the density distribution feature method (Fig. 4). The 10 gestures to be recognized are divided into four sets A, B, C, and D according to their DDF features, and the gesture to be recognized is assigned to one of these sets. The steps are as follows:
(3.1) Let f(x, y) be the planar (binary) image formed by the static combination image Q, and compute the centroid (centre of gravity) of f(x, y);
(3.2) Compute the maximum distance D_max and the minimum distance D_min from the target pixels of f(x, y) to the centroid;
(3.3) Centred at the centroid, construct the circle of radius D_max circumscribing the target region and the inner circle of radius D_min; in the annular region between these two circles, divide the image into M sub-image regions (M > 0) using equidistant concentric partitioning;
(3.4) Count the target pixels in each sub-image region to obtain the totals S_i (i = 1, …, M), and find the maximum of S_i:

$$S_{\max} = \max_{i=1,\dots,M} S_i \qquad (6)$$

(3.5) Compute the density distribution feature d of the static combination image:

$$r_i = S_i / S_{\max} \quad (i = 1, \dots, M) \qquad (7)$$

$$dr_i = \begin{cases} |r_1 - r_2| & i = 1 \\ |2r_i - r_{i-1} - r_{i+1}| & 1 < i < M \\ |r_M - r_{M-1}| & i = M \end{cases} \qquad (8)$$

$$d = (r_1, \dots, r_M;\ dr_1, \dots, dr_M) \qquad (9)$$

(3.6) Obtain the modified feature vector according to the formula DDF′ = (r_1, …, r_10, a·r_11, …, a·r_15, b·r_16, …, b·r_20; dr_1, …, dr_10, c·dr_11, …, c·dr_M), where a, b, c are user-defined parameters;
(3.7) Compare the resulting feature vector with the template feature vector H of each gesture set by computing the Euclidean distance to each; the gesture set with the minimum Euclidean distance is the selected gesture set.
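Steps (3.1) to (3.7) can be sketched as follows, assuming the "equidistant regional partitioning" means M equal-width concentric rings between D_min and D_max (the patent does not spell this detail out); the function names are illustrative:

```python
import math

def ddf(points, M=20):
    """Base density distribution feature d (Eqs. (6)-(9)) of a set of
    target-pixel coordinates extracted from the binary combination image."""
    n = len(points)
    cx = sum(x for x, _ in points) / n          # centroid (step 3.1)
    cy = sum(y for _, y in points) / n
    dists = [math.hypot(x - cx, y - cy) for x, y in points]
    d_min, d_max = min(dists), max(dists)       # step 3.2
    width = (d_max - d_min) / M                 # M equal-width rings (step 3.3)
    if width == 0:
        width = 1.0                             # degenerate case: one ring
    S = [0] * M                                 # target pixels per ring (step 3.4)
    for dist in dists:
        S[min(int((dist - d_min) / width), M - 1)] += 1
    s_max = max(S)                              # Eq. (6)
    r = [s / s_max for s in S]                  # Eq. (7)
    dr = ([abs(r[0] - r[1])]                    # Eq. (8)
          + [abs(2 * r[i] - r[i - 1] - r[i + 1]) for i in range(1, M - 1)]
          + [abs(r[M - 1] - r[M - 2])])
    return r + dr                               # Eq. (9)

def nearest_gesture_set(d, templates):
    """Step (3.7): pick the gesture set whose template vector H is nearest
    to the feature vector d in Euclidean distance."""
    def euclidean(u, v):
        return math.sqrt(sum((a - b) ** 2 for a, b in zip(u, v)))
    return min(templates, key=lambda name: euclidean(d, templates[name]))
```

In the full method, the output of `ddf` would additionally be re-weighted per Eq. (5) before the comparison of step (3.7).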
3.2 Accurate gesture recognition
In the second-stage recognition process, the last frame of the dynamic gesture is identified according to its distinctive characteristics, completing the accurate recognition of the gesture. Analysis shows that the 35th density-distribution-feature parameter of the gestures in set C lies in the range 5 to 6, so the Hausdorff distance formula is used for them, while the 35th parameter of the gestures in set D lies in the range 3 to 4, so the fingertip feature point method is used. The specific recognition methods are as follows.
For set C, the last frame is identified with the Hausdorff distance formula. First train the gesture set to obtain the boundary point set L of each gesture; then compute the boundary point set e of the last frame of the gesture to be recognized; compute the Hausdorff distance between e and each boundary point set L; and output the template gesture index i with the minimum Hausdorff distance, which identifies the correct gesture.
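A minimal sketch of the Hausdorff matching described above. The patent does not specify whether the directed or the symmetric Hausdorff distance is used, so the symmetric form is assumed here, and the function names are illustrative:

```python
import math

def hausdorff(P, Q):
    """Symmetric Hausdorff distance between two boundary point sets."""
    def directed(S, T):
        # For each point of S, distance to its nearest point in T; take the max.
        return max(min(math.hypot(sx - tx, sy - ty) for tx, ty in T)
                   for sx, sy in S)
    return max(directed(P, Q), directed(Q, P))

def match_last_frame(edge_points, templates):
    """Step (4.1): output the template gesture index i whose trained
    boundary set L_i is closest to the boundary set e of the last frame."""
    return min(templates, key=lambda i: hausdorff(edge_points, templates[i]))
```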
For set D, the last frame is identified with fingertip detection (Fig. 5).
First train the gesture set: detect the fingertip points of each gesture in the set, and record the vectors formed from the gesture centroid to each fingertip point as feature information in the vector template G.
Then detect the fingertip vectors of the gesture to be recognized, compare them with the fingertip vectors in the template G, and output the template gesture index i with the maximum similarity; this index identifies the correct gesture.
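The fingertip-vector matching can be sketched as below. The patent only requires choosing the template with "maximum similarity"; the mean cosine similarity of paired fingertip vectors used here is an assumed scoring, and the function names are illustrative:

```python
import math

def fingertip_vectors(tips, centroid):
    """Vectors from the gesture centroid to each detected fingertip."""
    cx, cy = centroid
    return [(x - cx, y - cy) for x, y in tips]

def similarity(V, W):
    """Mean cosine similarity of paired fingertip vectors (assumed score;
    missing fingertips simply contribute nothing to the sum)."""
    def cos(u, v):
        nu, nv = math.hypot(*u), math.hypot(*v)
        return (u[0] * v[0] + u[1] * v[1]) / (nu * nv) if nu and nv else 0.0
    if not V or not W:
        return 0.0
    return sum(cos(u, w) for u, w in zip(V, W)) / max(len(V), len(W))

def match_by_fingertips(V, template_G):
    """Step (4.2): output the template gesture index with maximum similarity."""
    return max(template_G, key=lambda i: similarity(V, template_G[i]))
```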
4. Beneficial effects
The present invention uses an ordinary camera under constant illumination. For 10 dynamic gestures, first-layer recognition is performed with the combination-image method, and second-layer recognition is then performed with the Hausdorff distance and fingertip detection methods; this layered recognition greatly improves recognition efficiency. Table 1 contrasts the recognition rate obtained with the present method against single-frame DDF recognition applied to single-frame gesture images.
The single-frame recognition method records the DDF parameters of every frame of the continuous dynamic gesture, builds a template from the DDF features of each gesture, and compares the DDF features of the gesture to be recognized against the template features to perform recognition.
Table 1: recognition rate comparison
As the table shows, for the gestures in the sets the recognition rate of the present method is significantly higher than that of single-frame DDF recognition.
Of course, the above description is not a limitation of the present invention, and the present invention is not limited to the above examples; changes, modifications, additions, or substitutions made by those skilled in the art within the essential scope of the present invention also fall within its protection scope.

Claims (6)

1. A dynamic gesture recognition method based on frame fusion technology, characterized by comprising the following steps:
(1) Divide a predefined group of dynamic gestures into gesture sets according to the density distribution feature. For each gesture set, obtain the density distribution feature parameters of the frame-fusion image of each gesture, then compute the mean value of these parameters within the set, and use this mean value as the template feature vector H of the gesture set;
(2) Obtain the static combination image Q onto which the dynamic gesture image sequence to be recognized is mapped;
(3) Compute the density distribution feature of the static combination image Q and identify the gesture set to which the dynamic gesture belongs according to this feature;
(4) Determine the individual gesture within the gesture set according to the range of the density distribution feature, which further decides whether the Hausdorff distance method or the fingertip feature point method is used.
2. The dynamic gesture recognition method based on frame fusion technology according to claim 1, characterized in that step (2) comprises the following steps:
(2.1) Acquire a continuous dynamic gesture image sequence of N frames {P_i}, i = 0, 1, 2, …, N;
(2.2) The mapping relation between a pixel (X, Y) in the static combination image Q and the pixels of {P_i} is as follows:

$$\begin{pmatrix} x_i \\ y_i \end{pmatrix} = \begin{pmatrix} \frac{a}{2} + \left(X - \gamma\sin\beta - \frac{A}{2}\right)\cos\alpha + \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\sin\alpha \\ \frac{b}{2} - \left(X - \gamma\sin\beta - \frac{A}{2}\right)\sin\alpha - \left(Y + \gamma\cos\beta - \frac{B}{2}\right)\cos\alpha \end{pmatrix} \qquad (1)$$

where x_i, y_i are the coordinates of a pixel in the small image P_i; X, Y are the coordinates of the corresponding pixel in the combination image Q; a, b are the width and height of P_i; A, B are the width and height of Q; γ is the radius of the circle on which each small image P_i is arranged in Q; α is the rotation angle of P_i about its own centre; and β is the angle between the radius through the centre of the rotated P_i and the vertical axis of the rectangular coordinate system established in Q, with:

$$\alpha = \frac{3}{2}\pi + \frac{2\pi}{N} i \qquad (2)$$

$$\beta = \frac{2\pi}{N} i \qquad (3)$$
3. The dynamic gesture recognition method based on frame fusion technology according to claim 1, characterized in that step (3) comprises:
(3.1) Let f(x, y) be the planar (binary) image formed by the static combination image Q, and compute the centroid (centre of gravity) of f(x, y);
(3.2) Compute the maximum distance D_max and the minimum distance D_min from the target pixels of f(x, y) to the centroid;
(3.3) Centred at the centroid, construct the circle of radius D_max circumscribing the target region and the inner circle of radius D_min; in the annular region between these two circles, divide the image into M sub-image regions (M > 0) using equidistant concentric partitioning;
(3.4) Count the target pixels in each sub-image region to obtain the totals S_i (i = 1, …, M), and find the maximum of S_i:

$$S_{\max} = \max_{i=1,\dots,M} S_i \qquad (6)$$

(3.5) Compute the density distribution feature d of the static combination image:

$$r_i = S_i / S_{\max} \quad (i = 1, \dots, M) \qquad (7)$$

$$dr_i = \begin{cases} |r_1 - r_2| & i = 1 \\ |2r_i - r_{i-1} - r_{i+1}| & 1 < i < M \\ |r_M - r_{M-1}| & i = M \end{cases} \qquad (8)$$

$$d = (r_1, \dots, r_M;\ dr_1, \dots, dr_M) \qquad (9)$$

(3.6) Obtain the modified feature vector according to the formula DDF′ = (r_1, …, r_10, a·r_11, …, a·r_15, b·r_16, …, b·r_20; dr_1, …, dr_10, c·dr_11, …, c·dr_M), where a, b, c are user-defined parameters;
(3.7) Compare the resulting feature vector with the template feature vector H of each gesture set by computing the Euclidean distance to each; the gesture set with the minimum Euclidean distance is the selected gesture set.
4. The dynamic gesture recognition method based on frame fusion technology according to claim 1, characterized in that step (4) comprises the following steps:
(4.1) For gestures in a gesture set whose 35th density-distribution-feature parameter lies in the range 5 to 6, further identify the last frame of the dynamic gesture using the Hausdorff distance, thereby achieving accurate gesture recognition;
(4.2) For gestures in a gesture set whose 35th density-distribution-feature parameter lies in the range 3 to 4, further identify the last frame of the dynamic gesture using the fingertip feature point method, thereby achieving accurate gesture recognition.
5. The dynamic gesture recognition method based on frame fusion technology according to claim 4, characterized in that the Hausdorff distance method of step (4.1) further identifies the last frame of the dynamic gesture through the following steps:
(4.1.1) First train the gesture set to obtain the boundary point set L of each gesture;
(4.1.2) Then compute the boundary point set e of the last frame of the gesture to be recognized;
(4.1.3) Compute the Hausdorff distance between e and each boundary point set L;
(4.1.4) Output the template gesture index i with the minimum Hausdorff distance; this index identifies the correct gesture.
6. The dynamic gesture recognition method based on frame fusion technology according to claim 4, characterized in that the fingertip feature point method of step (4.2) comprises the following steps:
(4.2.1) First train the gesture set: detect the fingertip points of each gesture in the set, and record the vectors formed from the gesture centroid to each fingertip point as feature information in the vector template G;
(4.2.2) Then detect the fingertip vectors of the gesture to be recognized, compare them with the fingertip vectors in the template G, and output the template gesture index i with the maximum similarity; this index identifies the correct gesture.
CN201310374176.9A 2013-08-23 2013-08-23 A dynamic gesture recognition method based on frame fusion technology Active CN103455794B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201310374176.9A CN103455794B (en) 2013-08-23 2013-08-23 A dynamic gesture recognition method based on frame fusion technology

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201310374176.9A CN103455794B (en) 2013-08-23 2013-08-23 A dynamic gesture recognition method based on frame fusion technology

Publications (2)

Publication Number Publication Date
CN103455794A true CN103455794A (en) 2013-12-18
CN103455794B CN103455794B (en) 2016-08-10

Family

ID=49738138

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201310374176.9A Active CN103455794B (en) 2013-08-23 2013-08-23 A dynamic gesture recognition method based on frame fusion technology

Country Status (1)

Country Link
CN (1) CN103455794B (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN104050454A (en) * 2014-06-24 2014-09-17 深圳先进技术研究院 Movement gesture track obtaining method and system
CN104102904A (en) * 2014-07-14 2014-10-15 济南大学 Static gesture identification method
WO2015112194A3 (en) * 2014-01-22 2015-11-05 Lsi Corporation Image processor comprising gesture recognition system with static hand pose recognition based on dynamic warping
CN105893959A (en) * 2016-03-30 2016-08-24 北京奇艺世纪科技有限公司 Gesture identifying method and device
CN106295464A (en) * 2015-05-15 2017-01-04 济南大学 Gesture identification method based on Shape context
CN106295463A (en) * 2015-05-15 2017-01-04 济南大学 A kind of gesture identification method of feature based value
CN106372564A (en) * 2015-07-23 2017-02-01 株式会社理光 Gesture identification method and apparatus
CN107403167A (en) * 2017-08-03 2017-11-28 华中师范大学 Gesture identification method and device
CN107766842A (en) * 2017-11-10 2018-03-06 济南大学 A kind of gesture identification method and its application
CN108197596A (en) * 2018-01-24 2018-06-22 京东方科技集团股份有限公司 A kind of gesture identification method and device
CN108520205A (en) * 2018-03-21 2018-09-11 安徽大学 A kind of human motion recognition method based on Citation-KNN
CN109271840A (en) * 2018-07-25 2019-01-25 西安电子科技大学 A kind of video gesture classification method
CN109634415A (en) * 2018-12-11 2019-04-16 哈尔滨拓博科技有限公司 It is a kind of for controlling the gesture identification control method of analog quantity
CN110717385A (en) * 2019-08-30 2020-01-21 西安文理学院 Dynamic gesture recognition method
CN114229451A (en) * 2021-12-30 2022-03-25 宁波智能成型技术创新中心有限公司 Intelligent grabbing anti-falling detection and regulation method based on multi-axial force and moment

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6002808A (en) * 1996-07-26 1999-12-14 Mitsubishi Electric Information Technology Center America, Inc. Hand gesture control system
CN102830938A (en) * 2012-09-13 2012-12-19 济南大学 3D (three-dimensional) human-computer interaction method based on gesture and animation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Zhang Xiang: "Automatic Recognition System for Fabric Weave Structure Based on Computer Vision", China Masters' Theses Full-text Database, Information Science and Technology, no. 4, 15 April 2007 (2007-04-15), pages 10 - 2 *
Xu Ting: "Moving Human Hand Tracking Method Based on Behavior Analysis", China Masters' Theses Full-text Database, Information Science and Technology, no. 10, 15 October 2011 (2011-10-15) *

Cited By (23)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015112194A3 (en) * 2014-01-22 2015-11-05 Lsi Corporation Image processor comprising gesture recognition system with static hand pose recognition based on dynamic warping
CN104050454A (en) * 2014-06-24 2014-09-17 深圳先进技术研究院 Movement gesture track obtaining method and system
CN104102904A (en) * 2014-07-14 2014-10-15 济南大学 Static gesture identification method
CN104102904B (en) * 2014-07-14 2016-03-23 济南大学 Static gesture recognition method
CN106295463B (en) * 2015-05-15 2019-05-07 济南大学 Gesture recognition method based on feature values
CN106295463A (en) * 2015-05-15 2017-01-04 济南大学 Gesture recognition method based on feature values
CN106295464A (en) * 2015-05-15 2017-01-04 济南大学 Gesture identification method based on Shape context
CN106372564A (en) * 2015-07-23 2017-02-01 株式会社理光 Gesture identification method and apparatus
CN105893959B (en) * 2016-03-30 2019-04-12 北京奇艺世纪科技有限公司 Gesture recognition method and device
CN105893959A (en) * 2016-03-30 2016-08-24 北京奇艺世纪科技有限公司 Gesture recognition method and device
CN107403167A (en) * 2017-08-03 2017-11-28 华中师范大学 Gesture identification method and device
CN107403167B (en) * 2017-08-03 2020-07-03 华中师范大学 Gesture recognition method and device
CN107766842A (en) * 2017-11-10 2018-03-06 济南大学 Gesture recognition method and application thereof
CN107766842B (en) * 2017-11-10 2020-07-28 济南大学 Gesture recognition method and application thereof
CN108197596A (en) * 2018-01-24 2018-06-22 京东方科技集团股份有限公司 Gesture recognition method and device
US10803304B2 (en) 2018-01-24 2020-10-13 Boe Technology Group Co., Ltd. Gesture recognition method, device, apparatus, and storage medium
CN108520205A (en) * 2018-03-21 2018-09-11 安徽大学 Human motion recognition method based on Citation-KNN
CN108520205B (en) * 2018-03-21 2022-04-12 安徽大学 Citation-KNN-based human motion recognition method
CN109271840A (en) * 2018-07-25 2019-01-25 西安电子科技大学 Video gesture classification method
CN109634415A (en) * 2018-12-11 2019-04-16 哈尔滨拓博科技有限公司 Gesture recognition control method for controlling analog quantities
CN109634415B (en) * 2018-12-11 2019-10-18 哈尔滨拓博科技有限公司 Gesture recognition control method for controlling analog quantities
CN110717385A (en) * 2019-08-30 2020-01-21 西安文理学院 Dynamic gesture recognition method
CN114229451A (en) * 2021-12-30 2022-03-25 宁波智能成型技术创新中心有限公司 Intelligent grabbing anti-falling detection and regulation method based on multi-axial force and moment

Also Published As

Publication number Publication date
CN103455794B (en) 2016-08-10

Similar Documents

Publication Publication Date Title
CN103455794A (en) Dynamic gesture recognition method based on frame fusion technology
CN109325454B (en) Static gesture real-time recognition method based on YOLOv3
Doumanoglou et al. Recovering 6D object pose and predicting next-best-view in the crowd
CN106682598B (en) Multi-pose face feature point detection method based on cascade regression
CN103226835B (en) Target tracking method and system based on online-initialized gradient boosting regression trees
CN104517104B (en) Face recognition method and system in surveillance scenarios
CN103488972B (en) Fingertip detection method based on depth information
CN103324938A (en) Method for training attitude classifier and object classifier and method and device for detecting objects
CN103413145B (en) Intra-articular irrigation method based on depth image
CN103971102A (en) Static gesture recognition method based on finger contour and decision-making trees
CN103294996A (en) 3D gesture recognition method
CN103150019A (en) Handwriting input system and method
CN101901350A (en) Characteristic vector-based static gesture recognition method
CN103105924B (en) Man-machine interaction method and device
Yue et al. Robust loop closure detection based on bag of superpoints and graph verification
CN103246891A (en) Chinese sign language recognition method based on Kinect
CN104182973A (en) Image copying and pasting detection method based on circular description operator CSIFT (Colored scale invariant feature transform)
CN102938065A (en) Facial feature extraction method and face recognition method based on large-scale image data
CN104392223A (en) Method for recognizing human postures in two-dimensional video images
CN105006003A (en) Random projection fern based real-time target tracking algorithm
CN103400109A (en) Free-hand sketch offline identification and reshaping method
CN103198330B (en) Real-time human face attitude estimation method based on deep video stream
CN106503626A (en) Finger gesture recognition method based on depth image and finger contour matching
CN104866824A (en) Manual alphabet identification method based on Leap Motion
CN105138990A (en) Single-camera-based gesture convex hull detection and palm positioning method

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant