WO2013162313A1 - A method and system for robust object tracking using particle filter framework - Google Patents

A method and system for robust object tracking using particle filter framework Download PDF

Info

Publication number
WO2013162313A1
Authority
WO
WIPO (PCT)
Application number
PCT/KR2013/003587
Other languages
French (fr)
Inventor
Viswanath GOPALAKRISHNAN
Hariprasad Kannan
Balasubramanian Anand
Pratibha Moogi
Sudha Velusamy
Original Assignee
Samsung Electronics Co., Ltd.
Application filed by Samsung Electronics Co., Ltd. filed Critical Samsung Electronics Co., Ltd.
Priority to KR1020147029541A priority Critical patent/KR102026651B1/en
Publication of WO2013162313A1 publication Critical patent/WO2013162313A1/en


Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/20: Analysis of motion
    • G06T 7/277: Analysis of motion involving stochastic approaches, e.g. using Kalman filters
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence


Abstract

A method and system for providing a robust object tracking mechanism using a combination of a deterministic approach and a stochastic approach in a particle filter framework is disclosed. The method includes computing the object likelihood for consecutive frames using the state parameters of the tracked object within a short time interval. The method performs a trend analysis of the tracking mechanism based on the computation of the object likelihood in the current frame and by determining the state drift values of an object likelihood function. Based on the calculation of the state drift values and the object likelihood function, the sampling function variance is updated to correct the tracking mechanism.

Description

A METHOD AND SYSTEM FOR ROBUST OBJECT TRACKING USING PARTICLE FILTER FRAMEWORK
The present invention relates to computer vision systems and more particularly to formulating and correcting an object tracking mechanism in a particle filter framework. The present application is based on, and claims priority from, Indian Application Number 1637/CHE/2012, filed on 25th April, 2012, the disclosure of which is hereby incorporated by reference herein.
Visual tracking is one of the important research areas in computer vision. Some of the applications in computer vision that require accurate tracking of an object are video surveillance, augmented reality, human-computer interaction, video analytics, and the like.
In an existing method, object tracking is generally based on either a deterministic approach or a stochastic approach. Deterministic approaches are computationally efficient, but they are sensitive to visual conditions such as background distraction, clutter, occlusion, and the like. If the target object is not tracked correctly, the tracking mechanism cannot recover from the failure. In the stochastic approach, a probabilistic distribution of the target object is used for time-series estimation and prediction. The most commonly used method under this approach is the particle filter framework, which can efficiently handle non-linearity in state-space models and can accommodate non-Gaussian input and output noise components. However, existing stochastic approaches do not provide the detailed analysis of the object likelihood that the deterministic approach offers.
In the current scenario, existing tracking algorithms fail to represent the object appearance accurately when it is influenced by appearance changes. Due to this limitation, the robustness and accuracy of the tracking algorithm cannot be corrected or improved while tracking the object.
In an existing method, a sub-space based object appearance model, also known as an incremental visual tracker, uses the particle filter framework for robust tracking based on the condensation algorithm. The method adopts the stochastic approach to track the motion of the object. This algorithm estimates the state posterior density at time tn by generating particles from the state transition density p(Xn|Xn-1). The method achieves robust tracking by propagating the set of particles to the next stage tn+1 and generating a new set of particles around the propagated particles. However, the accuracy of the appearance of the tracked object cannot be adjusted or improved, as the detailed analysis of the deterministic aspect of object motion is neglected in the particle filter framework.
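The propagation step described above can be sketched in Python. This is a generic particle filter skeleton under a Gaussian random-walk transition density; the function names, noise model, and particle count are illustrative choices, not code from the patent:

```python
import numpy as np

def propagate_particles(particles, drift, noise_std, rng):
    """Draw new particles from the state transition density p(X_n | X_{n-1}),
    modeled here as a Gaussian random walk around the drifted previous states."""
    return particles + drift + rng.normal(0.0, noise_std, size=particles.shape)

def resample(particles, weights, rng):
    """Systematic resampling: duplicate high-weight particles and discard
    low-weight ones, keeping the particle count fixed. weights must sum to 1."""
    n = len(particles)
    positions = (rng.random() + np.arange(n)) / n
    indices = np.searchsorted(np.cumsum(weights), positions)
    return particles[indices]

rng = np.random.default_rng(0)
particles = rng.normal(0.0, 1.0, size=(100, 6))  # 100 particles, 6 affine state parameters
new_particles = propagate_particles(particles, np.zeros(6), 0.1, rng)
```

In an incremental visual tracker, the likelihood of each propagated particle would then be evaluated against the appearance model before resampling.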
In an existing method, the accuracy of the object tracking mechanism is measured and modified while tracking specific objects of the human-body such as hands, finger-tips, and the like. The tracking mechanism is enabled only upon detecting parallel lines and curves of an object, and the robustness of the tracking mechanism is altered as required.
In the light of above discussion, it is evident that the existing method fails to provide a combination of the deterministic approach and the stochastic approach that provides more accurate, generalized, iterative, and corrective object tracking mechanism in the computer vision systems.
The principal object of the embodiments herein is to provide a robust visual tracking mechanism by combining a deterministic approach into a stochastic approach of a particle filter framework.
Another object of the invention is to provide a method to perform trend analysis of the particle filter performance in every frame and initiate correction action if required by redesigning or updating state drift values.
Another object of the invention is to provide a method to iteratively correct the tracker performance.
Accordingly, the invention provides a method for object tracking in a video. The method comprises formulating an object likelihood function for a set of consecutive frames in the video, wherein the object likelihood function is determined for at least one state parameter within a short time interval. Further, the method comprises determining the concavity of the object likelihood function and updating state drift values if the object likelihood function shows concavity. Furthermore, the method comprises updating the variance of a sampling function based on the updated state drift values and the direction of the principal curvature of the object likelihood function.
Accordingly, the invention provides a computer program product for object tracking in a video, wherein the product comprises an integrated circuit. The integrated circuit comprises at least one processor and at least one memory, the memory comprising computer program code within the circuit. The at least one memory and the computer program code, with the at least one processor, cause the product to formulate an object likelihood function for a set of consecutive frames in the video, wherein the object likelihood function is determined for at least one state parameter within a short time interval. The product is further configured to determine the concavity of the object likelihood function and update state drift values if the object likelihood function shows concavity, and to update the variance of a sampling function based on the updated state drift values and the direction of the principal curvature of the object likelihood function.
These and other aspects of the embodiments herein will be better appreciated and understood when considered in conjunction with the following description and the accompanying drawings. It should be understood, however, that the following descriptions, while indicating preferred embodiments and numerous specific details thereof, are given by way of illustration and not of limitation. Many changes and modifications may be made within the scope of the embodiments herein without departing from the spirit thereof, and the embodiments herein include all such modifications.
This invention is illustrated in the accompanying drawings, throughout which like reference letters indicate corresponding parts in the various figures. The embodiments herein will be better understood from the following description with reference to the drawings, in which:
FIG. 1 illustrates an exemplary object likelihood estimation for consecutive frames, according to embodiments as disclosed herein;
FIG. 2 illustrates a flow diagram explaining a process of tracking and improving accuracy of object tracking mechanism, according to embodiments as disclosed herein; and
FIG. 3 illustrates a computing environment implementing the method for robust object tracking using particle filter framework, according to embodiments as disclosed herein.
The embodiments herein and the various features and advantageous details thereof are explained more fully with reference to the non-limiting embodiments that are illustrated in the accompanying drawings and detailed in the following description. Descriptions of well-known components and processing techniques are omitted so as to not unnecessarily obscure the embodiments herein. The examples used herein are intended merely to facilitate an understanding of ways in which the embodiments herein can be practiced and to further enable those of skill in the art to practice the embodiments herein. Accordingly, the examples should not be construed as limiting the scope of the embodiments herein.
The embodiments herein achieve a method and system for providing robust object tracking in a video by combining a deterministic model into the stochastic framework of a particle filter. Based on variations in the tracking mechanism, the particle filter iteratively applies appropriate corrective measures to improve the accuracy of the tracking mechanism.
The object likelihood for consecutive frames is determined based on a multivariate function, assuming smooth variation of the state parameters within short time intervals.
Sudden occlusions of a tracked object are handled by measuring the percentage change in the object likelihood across consecutive frames of the video and mapping the change to a logistic regression function.
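One plausible realization of this occlusion test is sketched below; the gain k and the midpoint threshold are assumed values for illustration, as the patent does not specify the logistic function's parameters:

```python
import math

def occlusion_score(prev_likelihood, curr_likelihood, k=10.0, threshold=0.5):
    """Map the fractional drop in object likelihood between consecutive frames
    to (0, 1) with a logistic function; a score near 1 suggests a sudden
    occlusion. k and threshold are illustrative, not taken from the patent."""
    drop = (prev_likelihood - curr_likelihood) / max(prev_likelihood, 1e-12)
    return 1.0 / (1.0 + math.exp(-k * (drop - threshold)))

# A sharp likelihood drop scores high; a mild fluctuation scores low.
sudden = occlusion_score(0.9, 0.2)
mild = occlusion_score(0.9, 0.88)
```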
The object likelihood is a scalar value that represents the chance of an object being captured by the current state estimate.
The state parameters model the posterior distribution of the object states. The object is tracked using a set of six affine parameters. The six state parameters include, but are not limited to, an X-translation, a Y-translation, a scale, an aspect ratio, a rotation angle (θ), and a skewness angle (Φ).
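The six affine parameters above define a warp of the tracked region. A minimal sketch of composing them into a 2x3 affine matrix is given below; the decomposition order (rotation, then skew, then anisotropic scale) is one common convention and is an assumption, since the patent does not fix it:

```python
import numpy as np

def affine_matrix(tx, ty, scale, aspect, theta, phi):
    """Compose a 2x3 affine warp from the six state parameters:
    X/Y translation, scale, aspect ratio, rotation theta, and skew phi.
    The composition order is an assumed convention."""
    rot = np.array([[np.cos(theta), -np.sin(theta)],
                    [np.sin(theta),  np.cos(theta)]])
    skew = np.array([[1.0, np.tan(phi)],
                     [0.0, 1.0]])
    scl = np.array([[scale, 0.0],
                    [0.0, scale * aspect]])
    A = rot @ skew @ scl
    return np.hstack([A, [[tx], [ty]]])
```

With the identity parameters (no translation, unit scale and aspect, zero angles), the warp reduces to the identity transform.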
The term multivariate function refers to a function that evaluates the object likelihood in the successive frame using the state parameters.
Referring now to the drawings, and more particularly to FIGS. 1 through 3, where similar reference characters denote corresponding features consistently throughout the figures, there are shown preferred embodiments.
FIG. 1 illustrates exemplary object likelihood estimation for consecutive frames, according to embodiments as disclosed herein. The object likelihood is considered a time-invariant function of the state parameters for a short duration. The state parameters include, but are not limited to, an X-translation, a Y-translation, a scale, an aspect ratio, a rotation angle (θ), and a skewness angle (Φ).
A particle filter framework recursively estimates the most probable object states by tracking the object using the state parameters. As depicted in FIG. 1, consider three exemplary consecutive frames fn-2, fn-1, and fn. For each frame, the particle filter framework can provide a scalar measure of the object likelihood Oi using a multivariate function Oi = f(Xi) of the state parameters. The object likelihoods for the frames fn-2, fn-1, and fn are represented as (O1, X1), (O2, X2), and (O3, X3) respectively, where X1, X2, and X3 represent the state parameters of the object. As depicted in FIG. 1, the object likelihood of the frame fn+1 has drifted from the pattern of motion formulated and calculated by the particle filter framework (shown by the dotted line). The trend analysis of the tracking algorithm is performed by computing the second-order differential behavior of the function f(X). The particle filter framework then uses the state drift value (ΔXn), along with the second-order differential value, to correct the tracking mechanism in the frame fn+1.
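Along the time axis, the second-order behavior over the three observed likelihoods O1, O2, O3 can be approximated by a discrete second difference, a simplified one-dimensional analogue of the trend analysis described above (unit frame spacing is assumed):

```python
def likelihood_trend(o_prev2, o_prev1, o_curr):
    """Discrete second-order difference of the object likelihood over three
    consecutive frames. A negative value indicates a concave, downward-bending
    likelihood trend that may warrant corrective action."""
    return o_curr - 2.0 * o_prev1 + o_prev2
```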
FIG. 2 illustrates a flow diagram explaining a process of tracking and improving the accuracy of the object tracking mechanism, according to embodiments as disclosed herein. The deterministic approach is combined into the stochastic approach of the particle filter framework to compute the object likelihood for every frame. The object is tracked using the six state parameters of the object in the frame. As depicted in the flow diagram 200, at step 201 the object state parameters are initialized for computing the object likelihood using the particle filter framework. At step 202, the particle filter framework predicts the most probable object states for the current frame based on the object likelihood state parameters of the previous frame. Further, at step 203 the object likelihood is obtained based on the current states and observation. At step 204, it is determined whether any sudden occlusions of the tracked object are predicted. In an embodiment, occlusions are predicted by measuring the percentage change in the object likelihood and mapping the change to the logistic regression function. If the object likelihood calculation does not predict occlusion, the trend analysis is performed. Further, the object likelihood is analyzed in the current frame using a Hessian matrix.
The Hessian matrix H(f) for a function f(X) encodes the second-order differential behavior of the function with respect to its variables, as follows:
H(f) = [∂²f/∂xi∂xj] for i, j = 1, …, 6, that is, the 6x6 matrix whose (i, j) entry is the second-order partial derivative of f with respect to the i-th and j-th state parameters.
From the above matrix, the partial derivatives of the object likelihood with respect to the state parameters for the current frame are calculated over short time intervals. At step 205, the Hessian matrix of the object likelihood function is analyzed to estimate the trend based on its eigenvalues. The direction of the object likelihood function is also determined from the Hessian matrix. The maximum eigenvalue of the Hessian matrix determines the principal curvature value of the object likelihood in the current frame. Based on the direction of the principal curvature, at step 206 it is determined whether the object likelihood function displays a concave behavior (negative semi-definite Hessian) or a convex behavior (positive semi-definite Hessian). The concave behavior depicts a downward trend of the object likelihood function, and the convex behavior depicts an upward trend. If the likelihood function displays concavity, at step 207 the state drift values for the concave behavior of the object likelihood are obtained in the current frame based on the principal curvature of the object likelihood function. The direction of the principal curvature of the object likelihood is determined in the current frame using the eigenvector corresponding to the maximum eigenvalue of the Hessian matrix. Further, the state drift values are drifted in the direction opposite to the direction of the principal curvature. For example, consider the six-dimensional eigenvector in the direction of the principal curvature, represented as E = [e1, e2, e3, e4, e5, e6], and the state drift value vector, represented as ΔX = [x1, x2, x3, x4, x5, x6]. The directions of both vectors along the different dimensions of the state parameters are used to update the state drift values for the current frame.
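The Hessian-based concavity test can be sketched numerically as follows; this uses central finite differences and a symmetric eigendecomposition, which is one standard way to realize the analysis, not necessarily the patent's implementation (the step size h is an assumed value):

```python
import numpy as np

def numerical_hessian(f, x, h=1e-4):
    """Central-difference Hessian of a scalar likelihood function f at state x."""
    n = len(x)
    H = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            e_i = np.zeros(n); e_i[i] = h
            e_j = np.zeros(n); e_j[j] = h
            H[i, j] = (f(x + e_i + e_j) - f(x + e_i - e_j)
                       - f(x - e_i + e_j) + f(x - e_i - e_j)) / (4.0 * h * h)
    return H

def principal_curvature(H):
    """Eigenvalue of largest magnitude and its eigenvector: the principal
    curvature of the likelihood surface and its direction."""
    vals, vecs = np.linalg.eigh(H)
    k = np.argmax(np.abs(vals))
    return vals[k], vecs[:, k]

def is_concave(H, tol=1e-8):
    """A negative semi-definite Hessian indicates concave (downward) behavior."""
    return bool(np.all(np.linalg.eigvalsh(H) <= tol))
```

For the concave test function f(X) = -||X||², the Hessian is -2I, so the concavity check holds and the principal curvature value is -2.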
If the components e1 and x1 are in the same direction, which is the direction of the principal curvature, then the direction of the state drift value is updated by reversing its sign. The state drift value x1 is also scaled by the absolute value of e1, which represents the strength of the principal curvature along the corresponding dimension. For the state drift values that are not in the same direction as the principal curvature, the method does not reverse the sign of the state drift value, but the value is still scaled by the corresponding eigenvector component.
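A literal reading of this update rule can be expressed as follows; this is a sketch of the described behavior, not verified code from the patent:

```python
import numpy as np

def update_drift(drift, eigvec):
    """Per-dimension drift update against the principal-curvature direction:
    reverse the sign where the drift component and the eigenvector component
    agree in direction, and scale every component by the curvature strength
    |e_i| along that dimension."""
    new_drift = np.empty_like(drift)
    for i, (x, e) in enumerate(zip(drift, eigvec)):
        scaled = x * abs(e)
        new_drift[i] = -scaled if x * e > 0 else scaled
    return new_drift
```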
Further, at step 208, the sampling function variance is adapted based on the principal curvature of the object likelihood in the current frame and on the previously estimated state drift values. When the direction of the state drift value and the direction of the eigenvector are the same, the sampling is sparsely performed, as the system confidence is low. When the directions are opposite, the sampling is densely performed, as the system confidence is high. Based on the state drift values, at step 209 new particles for the current frame are generated using the new state drift values and the adapted sampling function variance. Further, at step 210 the object likelihood state is obtained for the current frame. After modification of the state drift values and the sampling function variance, the particle filter framework is iterated at least once for improved tracking performance.
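One plausible realization of this variance adaptation widens the sampling variance (spreading particles sparsely) when the drift agrees with the principal-curvature direction and narrows it when they oppose. The scale factors below are illustrative assumptions; the patent does not specify them:

```python
import numpy as np

def adapt_variance(base_var, drift, eigvec, low_conf_scale=2.0, high_conf_scale=0.5):
    """Widen the sampling variance under low confidence (drift aligned with the
    principal curvature) and narrow it under high confidence (drift opposed)."""
    aligned = float(np.dot(drift, eigvec)) > 0.0
    return base_var * (low_conf_scale if aligned else high_conf_scale)

def generate_particles(state, drift, var, n, rng):
    """Draw n new particles around the drifted state with the adapted variance."""
    return state + drift + rng.normal(0.0, np.sqrt(var), size=(n, len(state)))
```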
FIG. 3 illustrates a computing environment implementing the method for robust object tracking using the particle filter framework, according to embodiments as disclosed herein. As depicted, the computing environment 301 comprises at least one processing unit 304 that is equipped with a control unit 302 and an Arithmetic Logic Unit (ALU) 303, a memory 305, a storage unit 306, a plurality of networking devices 308, and a plurality of input/output (I/O) devices 307. The processing unit 304 is responsible for processing the instructions of the algorithm. The processing unit 304 receives commands from the control unit 302 in order to perform its processing. Further, any logical and arithmetic operations involved in the execution of the instructions are computed with the help of the ALU 303.
The overall computing environment 301 can be composed of multiple homogeneous and/or heterogeneous cores, multiple CPUs of different kinds, special media, and other accelerators. Further, the plurality of processing units 304 may be located on a single chip or over multiple chips.
The algorithm, comprising the instructions and code required for the implementation, is stored in the memory unit 305, the storage 306, or both. At the time of execution, the instructions may be fetched from the corresponding memory 305 and/or storage 306 and executed by the processing unit 304.
In the case of hardware implementations, various networking devices 308 or external I/O devices 307 may be connected to the computing environment to support the implementation through the networking unit and the I/O device unit.
The embodiments disclosed herein can be implemented through at least one software program running on at least one hardware device and performing network management functions to control the elements. The elements shown in Fig. 3 include blocks which can be at least one of a hardware device, or a combination of hardware device and software module.
The foregoing description of the specific embodiments will so fully reveal the general nature of the embodiments herein that others can, by applying current knowledge, readily modify and/or adapt for various applications such specific embodiments without departing from the generic concept, and, therefore, such adaptations and modifications should and are intended to be comprehended within the meaning and range of equivalents of the disclosed embodiments. It is to be understood that the phraseology or terminology employed herein is for the purpose of description and not of limitation. Therefore, while the embodiments herein have been described in terms of preferred embodiments, those skilled in the art will recognize that the embodiments herein can be practiced with modification within the spirit and scope of the embodiments as described herein.

Claims (17)

  1. A method for providing object tracking in a video, said method comprises:
    formulating an object likelihood function for a set of consecutive frames in said video, wherein said object likelihood function is determined for at least one state parameter within a short time interval;
    determining concavity of said object likelihood function and updating state drift values, if said object likelihood function shows said concavity; and
    updating variance of a sampling function based on said updated state drift values and direction of principal curvature of said object likelihood function.
  2. The method as in claim 1, wherein said method further comprises tracking said object based on said at least one state parameter, wherein state parameter comprises at least one of: X-translation, Y-translation, scale, aspect ratio, rotation angle, and skewness angle.
  3. The method as in claim 1, wherein said method further comprises calculating partial derivatives of said object likelihood function in a Hessian matrix, in accordance with values of said at least one state parameter over a plurality of said consecutive frames.
  4. The method as in claim 1, wherein said method further comprises computing said state drift values based on an eigenvector corresponding to a maximum eigenvalue, wherein said maximum eigenvalue represents said principal curvature of said object likelihood function.
  5. The method as in claim 1, wherein said method further comprises generating at least one new particle for current frame of said video using said updated state drift values and said updated variance.
  6. The method as in claim 1, wherein said method further comprises performing sampling sparsely, when said updated state drift values are in the same direction as said principal curvature.
  7. The method as in claim 1, wherein said method further comprises performing sampling densely, when said updated state drift values are in the direction opposite to said principal curvature.
  8. The method as in claim 5, wherein said method further comprises iterating particle filter at least once using said updated state drift values and said updated variance.
  9. A system for providing object tracking in a video, wherein said system is configured to perform at least one step as claimed in claims 1 to 8.
  10. A computer program product for providing object tracking in a video, wherein said product comprises:
    an integrated circuit further comprising at least one processor;
    at least one memory having a computer program code within said circuit;
    said at least one memory and said computer program code being configured, with said at least one processor, to cause said product to:
    formulate an object likelihood function for a set of consecutive frames in said video, wherein said object likelihood function is determined for at least one state parameter within a short time interval;
    determine concavity of said object likelihood function and update state drift values, if said object likelihood function shows said concavity; and
    update variance of a sampling function based on said updated state drift values and direction of principal curvature of said object likelihood function.
  11. The computer program product as in claim 10, wherein said product is further configured to track said object based on said at least one state parameter, wherein state parameter comprises at least one of: X-translation, Y-translation, scale, aspect ratio, rotation angle, and skewness angle.
  12. The computer program product as in claim 10, wherein said product is further configured to calculate partial derivatives of said object likelihood function in a Hessian matrix, in accordance with values of said at least one state parameter over a plurality of said consecutive frames.
  13. The computer program product as in claim 10, wherein said product is further configured to compute said state drift values based on an eigenvector corresponding to a maximum eigenvalue, wherein said maximum eigenvalue represents said principal curvature of said object likelihood function.
  14. The computer program product as in claim 10, wherein said product is further configured to generate at least one new particle for current frame of said video using said updated state drift value and said updated variance.
  15. The computer program product as in claim 10, wherein said product is further configured to perform sampling sparsely, when said updated state drift values are in the same direction as said principal curvature.
  16. The computer program product as in claim 10, wherein said product is further configured to perform said sampling densely, when said updated state drift values are in the direction opposite to said principal curvature.
  17. The computer program product as in claim 14, wherein said product is further configured to iterate particle filter at least once using said updated state drift values and said updated variance.
PCT/KR2013/003587 2012-04-25 2013-04-25 A method and system for robust object tracking using particle filter framework WO2013162313A1 (en)

Priority Applications (1)

Application Number: KR1020147029541A; Priority Date: 2012-04-25; Filing Date: 2013-04-25; Title: A method and system for robust object tracking using particle filter framework

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
IN1637CH2012 2012-04-25
IN1637/CHE/2012 2013-04-09

Publications (1)

Publication Number Publication Date
WO2013162313A1 (en) 2013-10-31

Family

ID=49483527

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/KR2013/003587 WO2013162313A1 (en) 2012-04-25 2013-04-25 A method and system for robust object tracking using particle filter framework

Country Status (2)

Country Link
KR (1) KR102026651B1 (en)
WO (1) WO2013162313A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105046717A (en) * 2015-05-25 2015-11-11 浙江师范大学 Robust video object tracking method
CN106408590A (en) * 2016-10-21 2017-02-15 西安电子科技大学 Regression analysis based particle filter target tracking method
CN107292918A (en) * 2016-10-31 2017-10-24 清华大学深圳研究生院 Tracking and device based on video on-line study
CN111159924A (en) * 2020-04-02 2020-05-15 上海彩虹鱼海洋科技股份有限公司 Method and apparatus for predicting drift trajectory

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6674877B1 (en) * 2000-02-03 2004-01-06 Microsoft Corporation System and method for visually tracking occluded objects in real time
US20060291696A1 (en) * 2005-06-27 2006-12-28 Jie Shao Subspace projection based non-rigid object tracking with particle filters
US20080063236A1 (en) * 2006-06-09 2008-03-13 Sony Computer Entertainment Inc. Object Tracker for Visually Tracking Object Motion
US20080166045A1 (en) * 2005-03-17 2008-07-10 Li-Qun Xu Method of Tracking Objects in a Video Sequence
US20110058708A1 (en) * 2008-03-14 2011-03-10 Sony Computer Entertainment Inc. Object tracking apparatus and object tracking method


Also Published As

Publication number Publication date
KR102026651B1 (en) 2019-09-30
KR20150006424A (en) 2015-01-16

Similar Documents

Publication Publication Date Title
CN110766724B (en) Target tracking network training and tracking method and device, electronic equipment and medium
CN109544592B (en) Moving object detection algorithm for camera movement
US11514607B2 (en) 3-dimensional reconstruction method, 3-dimensional reconstruction device, and storage medium
CN110909712B (en) Moving object detection method and device, electronic equipment and storage medium
CN111524112B (en) Steel chasing identification method, system, equipment and medium
CN110349212B (en) Optimization method and device for instant positioning and map construction, medium and electronic equipment
WO2013162313A1 (en) A method and system for robust object tracking using particle filter framework
CN110706262B (en) Image processing method, device, equipment and storage medium
US20200356812A1 (en) Systems and methods for automated training of deep-learning-based object detection
CN111145076A (en) Data parallelization processing method, system, equipment and storage medium
CN112652020B (en) Visual SLAM method based on AdaLAM algorithm
CN111461998A (en) Environment reconstruction method and device
CN111783626A (en) Image recognition method and device, electronic equipment and storage medium
CN110390668A (en) Bolt looseness detection method, terminal device and storage medium
Sardari et al. An object tracking method using modified galaxy-based search algorithm
JP6244886B2 (en) Image processing apparatus, image processing method, and image processing program
CN109242882B (en) Visual tracking method, device, medium and equipment
JP7221920B2 (en) Target detection method, device, electronic device, storage medium, and program
US8351653B2 (en) Distance estimation from image motion for moving obstacle detection
Ranjbar et al. Using stochastic architectures for edge detection algorithms
CN113052019A (en) Target tracking method and device, intelligent equipment and computer storage medium
KR101834084B1 (en) Method and device for tracking multiple objects
JP2012185655A (en) Image processing system, image processing method and image processing program
CN114202804A (en) Behavior action recognition method and device, processing equipment and storage medium
WO2015086076A1 (en) Method for determining a similarity value between a first image and a second image

Legal Events

121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 13781953; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 20147029541; Country of ref document: KR; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 13781953; Country of ref document: EP; Kind code of ref document: A1)