WO2014118767A1 - Classifying types of locomotion - Google Patents

Classifying types of locomotion

Info

Publication number
WO2014118767A1
Authority
WO
WIPO (PCT)
Prior art keywords
locomotion
sensor
subject
motion
signal
Application number
PCT/IL2013/051004
Other languages
French (fr)
Inventor
Tal Anker
Avishay MERON
Original Assignee
Sensogo Ltd.
Application filed by Sensogo Ltd. filed Critical Sensogo Ltd.
Publication of WO2014118767A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1123 Discriminating type of movement, e.g. walking or running
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7235 Details of waveform analysis
    • A61B5/7242 Details of waveform analysis using integration
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00 Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02 Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219 Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches

Abstract

A method for automatic detection of types of locomotion, including acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion, extracting one or more features from the motion signal, inputting the one or more features to a locomotion recognition unit, and having the locomotion recognition unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time. A system for automatic classification of different types of locomotion, including a locomotion classification module configured to accept input of a motion signal from a motion sensor and to produce output including an indication of a locomotion classification based, at least in part, on the motion signal. Related apparatus and methods are also described.

Description

CLASSIFYING TYPES OF LOCOMOTION
FIELD OF THE INVENTION
The invention relates to the field of motion analysis.
BACKGROUND
Gait analysis is often defined as the systematic study of human locomotion, using aids such as instrumentation for measuring body movements, body mechanics and muscle activity. Gait analysis is commonly used to assess, plan, and treat individuals with conditions affecting their ability to walk. It is also used in sports biomechanics to help athletes run more efficiently and to identify posture-related or movement-related problems in people with injuries.
In some cases, gait analysis requires a subject to walk in a straight line in a laboratory setting, while being monitored by a variety of instruments. In some cases, a video camera is positioned to point directly along the straight line, and a trained professional analyzes the video. Such analysis is labor-intensive and normally requires a trained professional to spend a long time.
In some cases, gait analysis involves walking or running on a treadmill. In some cases, the professional simply watches the way that the subject moves, looking in particular at the subject's feet, ankles, knees and hips. In more specialist settings, a video recorder will often be set up behind the treadmill to record a video of the subject's walking or running cycle. This is then relayed to a computing device, where slow motion and freeze frames are used to carefully assess the subject's running or walking style. This form of gait analysis usually focuses on the feet and ankles.
Gait analysis is commonly performed by a professional, such as a podiatrist or physiotherapist, although it is now becoming more widespread and readily available, with many specialist running and sports shops owning equipment and employing staff trained in gait analysis.
Background art includes, for example:
U.S. Patent No. 7,421,369 to Clarkson, which describes an activity recognition apparatus for detecting an activity of a subject. The apparatus includes: a sensor unit including a plurality of linear motion sensors configured to detect linear motions and a plurality of rotational motion sensors configured to detect rotational motions, the linear motions being orthogonal to each other, the rotational motions being orthogonal to each other; and a computational unit configured to receive and process signals from the sensors included in the sensor unit so as to detect an activity of the subject. The sensor unit is directly or indirectly supported by the subject with an arbitrary orientation with respect to the subject. The computational unit performs a calculation that uses the signals from both linear motion sensors and rotational motion sensors to determine the activity of the subject independent of the orientation of the sensor unit;
U.S. Patent No. 7,689,378 to Kolen, which describes a highly miniaturized electronic data acquisition system that includes MEMS sensors which can be embedded onto a moving device without affecting the static/dynamic motion characteristics of the device. The basic inertial magnetic motion capture (IMMCAP) module consists of a 3D printed circuit board having MEMS sensors configured to provide a tri-axial accelerometer, a tri-axial gyroscope, and a tri-axial magnetometer, all in communication with analog to digital converters to convert the analog motion data to digital data for determining classic inertial measurement and change in spatial orientation (rho, theta, phi) and linear translation (x, y, z) relative to a fixed external coordinate system, as well as the initial spatial orientation relative to the known relationship of the earth's magnetic and gravitational fields. The data stream from the IMMCAP modules will allow the reconstruction of the time series of the 6 degrees of freedom for each rigid axis associated with each independent IMMCAP module;
U.S. Patent Application Publication No. 2008/0146968 to Hanawaka et al., which describes a gait analysis system which has: a gait sensor which is to be attached to one foot or both feet of a walking person, and which wirelessly outputs detection data of at least one of an acceleration and an angular velocity; a portable terminal which receives the detection data, and which stores the data for a predetermined time period; and a gait analyzing apparatus which, based on the detection data obtained from the portable terminal, calculates two- or three-dimensional position information and status information of the foot or feet at an arbitrary time;
U.S. Patent Publication No. 2008/0045804 to Williams, which describes a method or system which can involve associating a plurality of biokinetographic comparison results with a first specific dysfunction from a group of specific dysfunctions, each of the biokinetographic comparison results obtained from a comparison of a biokinetographic value to a standard for a corresponding biokinetographic variable;
An article entitled "Scenario Test of Accelerometer-Based Biometric Gait Recognition" by Claudia Nickel, Mohammad O. Derawi, Patrick Bours, and Christoph Busch, published in the 2011 3rd International Workshop on Security and Communication Networks (IWSCN), May 2011; and
An article entitled "Human Identification via Gait Recognition Using Accelerometer Gyro Forces", by Michael Fitzgerald Nowlan, published as "CPSC-536 Networked Embedded Systems and Sensor Networks, Professor Savvides, Fall 2009", which may be found at: www.cs.yale.edu/homes/mfn3/pub/mfn_gait_id.pdf.
The disclosures of all references mentioned above and throughout the present specification, as well as the disclosures of all references mentioned in those references, are hereby incorporated herein by reference.
The foregoing examples of the related art and limitations related therewith are intended to be illustrative and not exclusive. Other limitations of the related art will become apparent to those of skill in the art upon a reading of the specification and a study of the figures.
SUMMARY OF THE INVENTION
The following embodiments and aspects thereof are described and illustrated in conjunction with systems, tools and methods which are meant to be exemplary and illustrative, not limiting in scope.
The present invention, in some embodiments thereof, teaches automatic and/or semiautomatic methods for classifying a patient's locomotion into one of several types of locomotion, such as, by way of some non-limiting examples, walking straight (WS), standing (S), turning left or right (TL/TR), running (R) or climbing up stairs (C). Example embodiments of hardware used for implementing the methods are also described.
In some embodiments, a method for automatic identification of which leg, right or left, a motion sensor is attached to, is described.
According to an aspect of some embodiments of the present invention there is provided a method for automatic detection of types of locomotion, including using at least one hardware processor for acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion, extracting one or more features from the motion signal, inputting the one or more features to a locomotion recognition unit, and having the locomotion recognition unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time.
According to some embodiments of the invention, the specific type of locomotion is walking straight (WS).
According to some embodiments of the invention, the locomotion recognition unit includes a trained machine learning unit.
According to some embodiments of the invention, the sensor is attached to the subject's leg, and the locomotion recognition unit produces an output indicating to which of the subject's legs the sensor is attached.
According to some embodiments of the invention, the locomotion recognition unit produces the output indicating to which of the subject's legs the sensor is attached based on identifying a direction of a twist of the leg following a toe-off event.
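The twist-direction idea above can be illustrated with a minimal sketch. The function name, the window length, and the sign convention (positive yaw rate taken to mean the outward twist of a right leg) are illustrative assumptions, not taken from the patent:

```python
def identify_leg(yaw_rate, toe_off_idx, window=25):
    """Guess which leg a sensor is attached to from the twist direction
    of the leg just after a detected toe-off event.

    yaw_rate: gyroscope samples about the leg's long axis.
    Assumed sign convention: positive mean yaw rate = right leg.
    """
    segment = yaw_rate[toe_off_idx:toe_off_idx + window]
    twist = sum(segment) / len(segment)  # mean angular velocity after toe-off
    return "right" if twist > 0 else "left"
```

In practice the decision would likely be averaged over many toe-off events rather than a single one, since individual steps are noisy.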
According to some embodiments of the invention, the specific type of locomotion is one of a group consisting of standing (S), turning left (TL), turning right (TR), running (R), and climbing (C).
According to some embodiments of the invention, the locomotion recognition unit produces an output indicating a likelihood that the subject was turning during the period of time.
According to some embodiments of the invention, the acquiring the motion signal includes preprocessing the motion signal according to at least one method selected from the group including analog-to-digital conversion and de-noising the signal.
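As one concrete example of the de-noising step, a simple moving-average filter can be applied to the sampled motion signal. This is a minimal sketch of one illustrative choice; the patent does not fix a particular de-noising method:

```python
def denoise(signal, k=5):
    """De-noise a sampled motion signal with a centered moving average
    of width k (edges use a truncated window)."""
    half = k // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out
```

A real implementation might instead use a low-pass filter matched to the sensor's sampling rate, but the principle, suppressing high-frequency noise before feature extraction, is the same.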
According to some embodiments of the invention, the motion signal includes a plurality of motion signals, including a linear motion signal and a rotational motion signal.
According to some embodiments of the invention, the extracting one or more features from the motion signal includes using Wavelet Packet Decomposition (WPD) to extract at least one of the one or more features.
According to some embodiments of the invention, the locomotion recognition unit includes an Artificial Neural Network (ANN).
According to some embodiments of the invention, the locomotion recognition unit includes a plurality of locomotion recognition units.
According to some embodiments of the invention, the locomotion recognition unit includes a Support Vector Machine (SVM).
According to some embodiments of the invention, output of the plurality of locomotion recognition units is provided to a decision unit, and it is the decision unit which produces the output indicating the likelihood that the subject was moving using the specific type of locomotion during the period of time.
According to some embodiments of the invention, the decision unit includes an expert system.
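A decision unit of the kind described above can be sketched as follows. Here each per-type recognition unit is assumed to report a likelihood score in [0, 1]; the threshold value and the "unknown" fallback are illustrative assumptions, not details from the patent:

```python
def decide(likelihoods, threshold=0.5):
    """Toy decision unit: given per-type likelihoods from a plurality of
    locomotion recognition units, pick the most likely locomotion type
    if its likelihood clears a (hypothetical) confidence threshold."""
    best = max(likelihoods, key=likelihoods.get)
    return best if likelihoods[best] >= threshold else "unknown"
```

An expert-system variant would replace the bare argmax with hand-written rules, for example forbidding a direct standing-to-running transition within a single window.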
According to some embodiments of the invention, the acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion includes attaching the sensor to the subject, allowing the subject to walk in an unconstrained environment for the period of time, and downloading a recording of the motion signal to a computer for performing the extracting, the inputting to a locomotion recognition unit, and the producing an output.
According to some embodiments of the invention, the extracting includes processing chunks of data from discrete windows of time.
According to some embodiments of the invention, the chunks of data from discrete windows of time partially overlap.
According to some embodiments of the invention, Wavelet Packet Decomposition (WPD) is applied to the chunks of data, WPD terminal node values are calculated for the chunks of data, filter coefficients are computed for each one of the terminal node values, energy is calculated for each filter, and a Discrete Cosine Transform (DCT) is applied to a vector of logarithms of the filter energies.
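The WPD/log-energy/DCT pipeline described above can be sketched with a Haar wavelet packet (the choice of the Haar wavelet, the tree depth, and the direct DCT-II are illustrative assumptions; the patent does not fix these details):

```python
import math

def haar_step(x):
    # One level of a Haar wavelet split: approximation and detail halves.
    s2 = math.sqrt(2.0)
    approx = [(x[i] + x[i + 1]) / s2 for i in range(0, len(x) - 1, 2)]
    detail = [(x[i] - x[i + 1]) / s2 for i in range(0, len(x) - 1, 2)]
    return approx, detail

def wpd_terminal_nodes(x, depth):
    # Full wavelet packet tree: both branches are split at every level,
    # leaving 2**depth terminal nodes.
    nodes = [x]
    for _ in range(depth):
        nodes = [part for node in nodes for part in haar_step(node)]
    return nodes

def dct(v):
    # DCT-II, computed directly from its definition (fine for short vectors).
    n = len(v)
    return [sum(v[i] * math.cos(math.pi * k * (2 * i + 1) / (2 * n))
                for i in range(n)) for k in range(n)]

def wpd_features(chunk, depth=3):
    # Energy of each terminal node -> logarithm -> DCT, analogous to the
    # MFCC-style feature pipeline the text describes.
    energies = [sum(c * c for c in node) + 1e-12  # avoid log(0)
                for node in wpd_terminal_nodes(chunk, depth)]
    return dct([math.log(e) for e in energies])
```

With depth 3, a 64-sample chunk yields 8 terminal nodes and hence an 8-dimensional feature vector per chunk.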
According to some embodiments of the invention, the specific type of locomotion includes Walking Straight (WS) locomotion.
According to some embodiments of the invention, two sensors are attached, each one of the two sensors to a different leg of the subject.
According to an aspect of some embodiments of the present invention there is provided a method for training an automatic locomotion classification system which includes a machine learning component, the method including using at least one hardware processor for obtaining a motion sensor signal from a motion sensor attached to a walking subject, extracting one or more features from the motion sensor signal, identifying a type of locomotion to which the motion sensor signal belongs, and feeding the one or more features to the machine learning component as a training example of the type of locomotion.
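The training method above, obtain a signal, extract features, attach the known locomotion type, can be sketched as assembling labeled examples. The function and parameter names are hypothetical:

```python
def build_training_examples(recordings, featurize):
    """Assemble training examples for the machine learning component.

    recordings: (motion_signal, locomotion_type) pairs, where the type
    has been identified, e.g. by an observer of the walking subject.
    featurize: the feature-extraction function applied to each signal.
    Returns (feature_vector, locomotion_type) pairs.
    """
    return [(featurize(signal), label) for signal, label in recordings]
```

Each resulting pair is then fed to the machine learning component as a training example of its locomotion type.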
According to some embodiments of the invention, the machine learning component includes an Artificial Neural Network.
According to some embodiments of the invention, the extracting one or more features from the motion sensor signal includes splitting the motion sensor signal into chunks of data from discrete windows of time, applying Wavelet Packet Decomposition (WPD) to the chunks of data, calculating WPD terminal node values for the chunks of data, computing filter coefficients for each one of the terminal node values, calculating energy for each filter coefficient, and applying DCT to a vector of logarithms of the energies.
According to some embodiments of the invention, the machine learning component includes a plurality of Feed Forward Artificial Neural Networks.
According to some embodiments of the invention, the method further includes training a Probabilistic Neural Network to accept output of the Feed Forward Artificial Neural Networks and provide output of an indication of the type of locomotion.
According to an aspect of some embodiments of the present invention there is provided a system for automatic classification of different types of locomotion, including a locomotion classification module being configured, when executed by at least one hardware processor, to accept input of a motion signal from a motion sensor and to produce output including an indication of a locomotion classification based, at least in part, on the motion signal.
According to some embodiments of the invention, the locomotion classification module includes at least one computerized machine learning component.
According to some embodiments of the invention, further including a sensor package including at least one motion sensor for producing the motion signal, and in which the locomotion classification module is configured to accept the motion signal from the sensor package.
According to some embodiments of the invention, the sensor package is included in a mobile personal computing device which includes at least one acceleration sensor, collecting the motion signal is included in an application residing on the mobile personal computing device, and the application is configured to send the motion signal to the locomotion classification module.
According to some embodiments of the invention, the locomotion classification module is included in the mobile personal computing device, collecting and classifying the motion signal is included in an application residing on the mobile personal computing device, and the application is configured to send only a portion of the motion signal including a specific classification of locomotion to another computer.
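Sending only the classified portion of the signal, as described above, amounts to filtering the windowed signal by its classification before upload. A minimal sketch, with a hypothetical `classify` callable standing in for the locomotion classification module:

```python
def segments_of_type(windows, classify, wanted="WS"):
    """Keep only the portions of a windowed motion signal whose
    classification matches the wanted locomotion type, e.g. so a phone
    application uploads just the walking-straight data."""
    return [w for w in windows if classify(w) == wanted]
```

Filtering on the device can cut the transmitted data substantially when only one locomotion type, such as walking straight, is of interest to the remote gait-analysis system.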
According to some embodiments of the invention, the sensor package and the locomotion classification module are both included in one unit. According to some embodiments of the invention, the unit includes a smart phone.
According to some embodiments of the invention, the sensor package includes a plurality of motion sensors, at least one of which is a linear motion sensor, and one of which is a rotational motion sensor.
According to some embodiments of the invention, the computerized machine learning component includes an Artificial Neural Network (ANN).
According to some embodiments of the invention, the computerized machine learning component includes a first plurality of computerized machine learning components.
According to some embodiments of the invention, the first plurality of computerized machine learning components includes Feed Forward Neural Networks.
According to some embodiments of the invention, each one of the first plurality of computerized machine learning components is configured to identify a different one from a set of types of locomotion.
According to some embodiments of the invention, further including a second, additional, machine learning component configured to accept input from the first plurality of computerized machine learning components, and to provide an output indicating which of the set of types of locomotion is most likely present in the motion signal.
According to some embodiments of the invention, the second machine learning component is a Probabilistic Neural Network.
According to some embodiments of the invention, further including a unit for indicating a beginning and an end of a segment in a motion signal belonging to a specific type of locomotion.
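A Probabilistic Neural Network in the role described above is essentially a Parzen-window classifier over the first-stage score vectors. A minimal sketch, with the kernel width `sigma` and the class-averaging scheme as illustrative assumptions:

```python
import math

def pnn_classify(train, x, sigma=0.5):
    """Minimal Probabilistic Neural Network: for each class, sum Gaussian
    kernels centered on that class's training vectors (here, the score
    vectors produced by the first-stage networks), then pick the class
    with the highest average kernel response for input x."""
    sums, counts = {}, {}
    for vec, label in train:
        d2 = sum((a - b) ** 2 for a, b in zip(vec, x))
        sums[label] = sums.get(label, 0.0) + math.exp(-d2 / (2 * sigma ** 2))
        counts[label] = counts.get(label, 0) + 1
    return max(sums, key=lambda c: sums[c] / counts[c])
```

Unlike the feed-forward first stage, this second stage needs no iterative training; it simply stores the labeled score vectors.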
Unless otherwise defined, all technical and/or scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which the invention pertains. Although methods and materials similar or equivalent to those described herein can be used in the practice or testing of embodiments of the invention, exemplary methods and/or materials are described below. In case of conflict, the patent specification, including definitions, will control. In addition, the materials, methods, and examples are illustrative only and are not intended to be necessarily limiting.
Implementation of the method and/or system of embodiments of the invention can involve performing or completing selected tasks manually, automatically, or a combination thereof. Moreover, according to actual instrumentation and equipment of embodiments of the method and/or system of the invention, several selected tasks could be implemented by hardware, by software or by firmware or by a combination thereof using an operating system.
For example, hardware for performing selected tasks according to embodiments of the invention could be implemented as a chip or a circuit. As software, selected tasks according to embodiments of the invention could be implemented as a plurality of software instructions being executed by a computer using any suitable operating system. In an exemplary embodiment of the invention, one or more tasks according to exemplary embodiments of method and/or system as described herein are performed by a data processor, such as a computing platform for executing a plurality of instructions. Optionally, the data processor includes a volatile memory for storing instructions and/or data and/or a non- volatile storage, for example, a magnetic hard-disk and/or removable media, for storing instructions and/or data. Optionally, a network connection is provided as well. A display and/or a user input device such as a keyboard or mouse are optionally provided as well.
In addition to the exemplary aspects and embodiments described above, further aspects and embodiments will become apparent by reference to the figures and by study of the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
Some embodiments of the invention are herein described, by way of example only, with reference to the accompanying drawings. With specific reference now to the drawings in detail, it is stressed that the particulars shown are by way of example and for purposes of illustrative discussion of embodiments of the invention. In this regard, the description taken with the drawings makes apparent to those skilled in the art how embodiments of the invention may be practiced.
In the drawings:
Figure 1A is an illustration of a subject wearing an example embodiment of a sensor package, constructed according to an example embodiment of the invention, attached to the subject's foot;
Figure 1B is an illustration of a subject wearing two sensor packages, one per leg, constructed according to an example embodiment of the invention, attached to the subject's feet, walking in a non-clinic environment;
Figure 1C is an illustration of a subject wearing two example embodiments of a sensor package, constructed according to an example embodiment of the invention, attached to the subject's feet, in which the sensors are transmitting data to a receptor in a clinic environment;
Figure 1D is an illustration of a reference coordinate system including a set of three perpendicular axes each of which may be used to measure linear acceleration and rotational velocity, according to an example embodiment of the invention;
Figure 1E is an illustration of a leg of a subject with a FORWARD direction indicated, and a reference coordinate system of a set of three perpendicular axes, according to Figure 1D, indicated next to the leg, according to an example embodiment of the invention;
Figure 1F is a simplified flowchart illustration of use of locomotion classification according to an example embodiment of the invention;
Figure 2A is a simplified block diagram of a locomotion classification system constructed according to an example embodiment of the invention;
Figure 2B is a simplified block diagram of a locomotion classification system constructed according to another example embodiment of the invention;
Figure 2C is a simplified block diagram of a locomotion classification system constructed according to yet another example embodiment of the invention;
Figure 3 is a graph showing two input signals, which are examples of input signals similar to the input signal of Figure 2C;
Figure 4A is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a right turn, according to an example embodiment of the invention;
Figure 4B is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a right turn, according to an example embodiment of the invention;
Figure 4C is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a left turn, according to an example embodiment of the invention;
Figure 4D is a graph showing two input signals, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a left turn, according to an example embodiment of the invention; and
Figure 5 is a simplified flowchart illustration of training of an automatic locomotion classification system which includes a machine learning component according to an example embodiment of the invention.
DETAILED DESCRIPTION
The present invention, in some embodiments thereof, relates to a method and a system for automatic classification of different types of locomotion and, more particularly, but not exclusively, to a method for machine learning for automatic differentiation between different types of locomotion and, even more particularly, but not exclusively, to a method for training a machine learning unit(s) for automatic differentiation between different types of locomotion.
As described above, gait analysis is typically performed on a subject walking straight. It is useful to automatically detect motion signals produced by motion sensors attached to the subject when the subject is walking straight, whether as a precursor to automatic gait analysis, or even as a method of pointing out a walking straight segment of walking for semi-automatic or even manual gait analysis. Automatic locomotion classification is useful in the above role.
In some cases, two motion sensors are attached to the subject, one to each leg. A bonus result of a locomotion classification system as described herein is that the same sensors and classification components can serve to identify, based on motion signals received from the motion sensors, which motion sensor was attached to which leg. Such identification, again, is useful, whether as a precursor to automatic gait analysis, or as a method of eliminating potential errors for semi-automatic or even manual gait analysis.
As will be appreciated by one skilled in the art, aspects of the present invention may be embodied as a system, method or computer program product. Accordingly, aspects of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, micro-code, etc.) or an embodiment combining software and hardware aspects that may all generally be referred to herein as a "circuit," "module" or "system." Furthermore, aspects of the present invention may take the form of a computer program product embodied in one or more computer readable medium(s) having computer readable program code embodied thereon.
Any combination of one or more computer readable medium(s) may be utilized. The computer readable medium may be a computer readable signal medium or a computer readable storage medium. A computer readable storage medium may be, for example, but not limited to, an electronic, magnetic, optical, electromagnetic, infrared, or semiconductor system, apparatus, or device, or any suitable combination of the foregoing. More specific examples (a non-exhaustive list) of the computer readable storage medium would include the following: an electrical connection having one or more wires, a portable computer diskette, a hard disk, a random access memory (RAM), a read-only memory (ROM), an erasable programmable read-only memory (EPROM or Flash memory), an optical fiber, a portable compact disc read-only memory (CD-ROM), an optical storage device, a magnetic storage device, or any suitable combination of the foregoing. In the context of this document, a computer readable storage medium may be any tangible medium that can contain, or store a program for use by or in connection with an instruction execution system, apparatus, or device. A computer readable signal medium may include a propagated data signal with computer readable program code embodied therein, for example, in baseband or as part of a carrier wave. Such a propagated signal may take any of a variety of forms, including, but not limited to, electro-magnetic, optical, or any suitable combination thereof. A computer readable signal medium may be any computer readable medium that is not a computer readable storage medium and that can communicate, propagate, or transport a program for use by or in connection with an instruction execution system, apparatus, or device.
Program code embodied on a computer readable medium may be transmitted using any appropriate medium, including but not limited to wireless, wireline, optical fiber cable, RF, etc., or any suitable combination of the foregoing.
Computer program code for carrying out operations for aspects of the present invention may be written in any combination of one or more programming languages, including an object oriented programming language such as Java, Smalltalk, C++ or the like and conventional procedural programming languages, such as the "C" programming language or similar programming languages. The program code may execute entirely on the user's computer, partly on the user's computer, as a stand-alone software package, partly on the user's computer and partly on a remote computer or entirely on the remote computer or server. In the latter scenario, the remote computer may be connected to the user's computer through any type of network, including a local area network (LAN) or a wide area network (WAN), or the connection may be made to an external computer (for example, through the Internet using an Internet Service Provider).
Aspects of the present invention are described below with reference to flowchart illustrations and/or block diagrams of methods, apparatus (systems) and computer program products according to embodiments of the invention. It will be understood that each block of the flowchart illustrations and/or block diagrams, and combinations of blocks in the flowchart illustrations and/or block diagrams, can be implemented by computer program instructions. These computer program instructions may be provided to a hardware processor of a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
These computer program instructions may also be stored in a computer readable medium that can direct a computer, other programmable data processing apparatus, or other devices to function in a particular manner, such that the instructions stored in the computer readable medium produce an article of manufacture including instructions which implement the function/act specified in the flowchart and/or block diagram block or blocks.
The computer program instructions may also be loaded onto a computer, other programmable data processing apparatus, or other devices to cause a series of operational steps to be performed on the computer, other programmable apparatus or other devices to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide processes for implementing the functions/acts specified in the flowchart and/or block diagram block or blocks.
Introduction
In some aspects of gait analysis, gait analysis professionals observe a subject walking straight in a lab, and analyze how the subject places each foot on the surface, as well as differences between how the subject places one foot on the surface and how the subject places the other foot on the surface.
As described in the Background section above, typical prior art gait analysis requires a subject to walk in a straight line in a laboratory setting, and to be monitored by a variety of instruments placed in the laboratory.
Automatic gait analysis using a computerized system can potentially provide a more rapid analysis, and/or at a lower cost, than manual gait analysis by a trained professional.
Gait analysis can potentially gain by being freed from being performed with the subject in a laboratory setting.
Some embodiments of the invention receive an input signal from motion sensors and perform a pre-analysis, detecting when a subject used a specific type of locomotion, such as, for example, walking straight or running, and can provide a gait analysis system with the signal from just that specific type of locomotion. Various embodiments of the invention, described below, teach how such freedom is achieved for classifying types of locomotion, such as, by way of example, types of walking and/or types of running.
In some embodiments, collecting data on a subject's walking or running style is optionally done using a sensor package including one or more motion sensor(s) attached to the subject's feet.
Reference is now made to Figure 1A, which is a simplified image of a subject wearing an example embodiment of a sensor package 102, constructed according to an example embodiment of the invention, attached to the subject's foot 104.
In some embodiments, data about a subject's locomotion is collected by one or more motion sensors. The data is transferred to a system for classifying the locomotion. In some embodiments, the transfer is immediate, for example if the subject is inside a laboratory.
In some embodiments, the subject is not constrained to walking or running in a laboratory setting, but rather may walk around a clinic wearing the motion sensor(s), walk in the environs of the clinic, and/or leave the clinic and walk elsewhere.
In some embodiments, the locomotion data is immediately transferred to the classifying system, for example by cellular communication, and/or by other wireless communication such as WiMax. In some embodiments, the data is collected and transferred to the classifying system later, for example when the sensor package is brought to a locomotion or a gait analysis laboratory, and/or by a form of wireless transfer, as described above, to the locomotion or gait analysis laboratory.
In some embodiments, the motion sensors are embodied within a portable computerized device such as a smart phone, which collects locomotion data using smart phone sensors, such as accelerometers, and the data is transferred from the smart phone to an analysis unit, the analysis unit possibly being in a locomotion or gait analysis center.
In some embodiments, both the motion sensors and a locomotion classification unit are embodied within a portable computerized device such as a smart phone, which collects locomotion data using smart phone sensors, such as accelerometers. The smart phone classifies the data according to locomotion classes, and optionally sends only data belonging to a specific type of locomotion from the smart phone to the analysis unit at a locomotion or gait analysis center. It is noted that the smart phone mentioned above should be taken to stand for a family of motion-sensor-equipped mobile personal computing devices which are used nowadays, such as tablets, smart phones, and possibly even a communication-enabled pedometer.
In some embodiments, output of a locomotion classification unit is sent on to an automatic locomotion analysis unit, whether embodied in the same computer/device as the locomotion classification unit or in a separate device.
Reference is now made to Figure 1B, which is a simplified image of a subject wearing two sensor packages 102, one per leg, constructed according to an example embodiment of the invention, attached to the subject's feet 104, walking in a non-clinic environment.
In some embodiments, locomotion data is collected over a period of time, only later to be uploaded into a computer for classification and/or analysis. In some embodiments, the locomotion data is transmitted and uploaded in real time to a computer for classification and/or analysis. Preferably, the uploading is done wirelessly.
Reference is now made to Figure 1C, which is a simplified image of a subject wearing two example embodiments of a sensor package 102, constructed according to an example embodiment of the invention, attached to the subject's feet 104, in which the sensors are transmitting 106 data to a receptor 108 in a clinic 110 environment.
In some embodiments, the sensors may be attached to the subject's feet by a person who is not a medical professional or a gait analysis professional. The sensors may optionally even be attached by the subject.
In some embodiments, locomotion data classification is optionally done automatically.
In some embodiments, locomotion data classification is optionally performed automatically, optionally including segmenting the duration of a subject's locomotion into segments which belong to a single continuous type of locomotion, such as, for example, segmenting into a period of Walking Straight locomotion, followed by a period of Climbing, followed by another period of Walking Straight.
In some embodiments, locomotion classification is done semi-automatically, with a technician selecting which portions of the locomotion data should be analyzed by a computer, and which should be left out of the classification process. In some embodiments, a derivation of walking straight (WS) segments of the subject's walk is performed automatically without human intervention.
It is noted that when collecting data about a subject's locomotion while the subject is walking about freely, outside of a straight walking track in a laboratory, it is especially useful to be able to identify Walking Straight segments, which may be used for further gait analysis.
It is noted that when collecting data about a subject's locomotion, in situations where sensors are attached to both of the subject's legs, it may be especially useful to be able to identify automatically, and/or verify, which sensor is attached to which of the subject's legs. Such identification may prevent a need to insist on attaching a specific sensor to a specific leg, allowing retroactive identification of which sensor is attached to which of the subject's legs. Such identification may also prevent errors in gait analysis.
Before explaining at least one embodiment of the invention in detail, it is to be understood that the invention is not necessarily limited in its application to the details of construction and the arrangement of the components and/or methods set forth in the following description and/or illustrated in the drawings and/or the Examples. The invention is capable of other embodiments or of being practiced or carried out in various ways.
Motion Sensors
The introduction section generally described a process of collecting data.
In some embodiments, the locomotion data is optionally collected from one or more linear accelerometers. A linear accelerometer can sense walking, and various types of locomotion. For example, when walking straight forward, a subject generates a first kind of motion forward, a second kind of motion in the vertical direction (a periodic up and down motion), and a third kind of motion side-to-side.
It is noted that different types of locomotion generate different signals in linear accelerometers.
In some embodiments, three linear accelerometers are used, measuring linear acceleration, and/or linear motion, in three perpendicular directions, termed x, y and z.
It is also noted that even when using three linear accelerometers in three perpendicular directions, the accelerometer directions may not be aligned with the directions of motion, and so the signals from the accelerometers, for example for "walking straight forward", may not be purely as described above, since the axes of the accelerometers may not be aligned with the forward, up and down, and sideways directions.
In some embodiments of the invention, the signals from three perpendicular accelerometers are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
In some embodiments, the direction of the axes relative to the forward, up and down, and sideways directions is learned, and optionally, after the learning, the signals from three perpendicular accelerometers are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
In some embodiments of the invention, the signals from three not-all-in-the-same-plane accelerometers are modified by a mathematical rotation of the axes so as to produce signals corresponding to the forward, up and down, and sideways directions.
In some embodiments of the invention, the signals from three accelerometers are modified by a mathematical rotation of the axes so as to produce signals corresponding to polar coordinates.
In some embodiments, the rotation of the axes is performed so as to minimize amplitude of the acceleration signal in a first direction which is optionally defined as sideways, and/or minimize amplitude of the acceleration signal in a second direction which is optionally defined as forward, and/or maximize amplitude of the acceleration signal in a third direction which is optionally defined as up and down.
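The axis-alignment rotation described above can be sketched in code. The following is a minimal numpy illustration, assuming a variance-based heuristic (principal component analysis) as one possible way to satisfy the minimize/maximize criteria; the function name `align_axes` and the choice of criterion are illustrative assumptions, not taken from the embodiments.

```python
import numpy as np

def align_axes(acc):
    """Rotate raw 3-axis accelerometer samples (N x 3) so the new axes
    roughly match the up-down, forward, and sideways directions.

    Heuristic sketch: the axis with the largest acceleration variance is
    taken as up-down, and the axis with the smallest variance as sideways,
    loosely following the minimize/maximize rotation criteria above.
    """
    centered = acc - acc.mean(axis=0)        # remove the gravity/DC offset
    cov = np.cov(centered.T)                 # 3x3 covariance of the signals
    eigvals, eigvecs = np.linalg.eigh(cov)   # eigenvectors form a rotation
    # eigh returns eigenvalues in ascending order:
    # column 0 -> least variance (sideways), column 2 -> most (up-down)
    rotation = eigvecs[:, [2, 1, 0]]         # reorder to (up-down, forward, sideways)
    return centered @ rotation
```

A real system would also have to resolve sign ambiguities and verify the labeling against known gait dynamics; this sketch only recovers the perpendicular directions up to ordering by variance.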
In some embodiments, the locomotion data is optionally collected from one or more gyroscopic measurement units, measuring angular velocity. In some embodiments, three gyroscopic measurement units are used, measuring angular velocity in three perpendicular directions, also termed x, y and z.
In some embodiments, the locomotion data is optionally collected from one or more gyroscopic measurement units, measuring angular acceleration. In some embodiments, three gyroscopic measurement units are used, measuring angular acceleration in three perpendicular directions, also termed x, y and z.
In some embodiments, the locomotion data is optionally collected from one or more gyroscopic measurement units. A gyroscopic measurement unit can sense walking, and various types of locomotion. For example, when walking straight forward, a subject may generate little rotation from the forward motion, some rotation in the vertical direction, and little rotation side-to-side.
It is noted that different types of locomotion generate different signals in gyroscopic measurement units.
In some embodiments, gyroscopic measurement units are used, measuring angular velocity, and/or angular motion, in three perpendicular directions, termed x, y and z.
In some embodiments, gyroscopic measurement units are used, measuring angular acceleration, and/or angular motion, in three perpendicular directions, termed x, y and z.
It is also noted that even using three gyroscopic measurement units in three perpendicular directions, the gyroscopic axes may not be aligned with the direction of motion, and so the signals from the gyroscopic measurement units, for example for "walking straight forward", may not be purely as described above.
In some embodiments of the invention, the signals from three perpendicular gyroscopic measurement units are modified by a mathematical rotation of the axes so as to align with the forward, up and down, and sideways directions.
In some embodiments, both linear accelerometers and gyroscopic measurement units are used to collect locomotion data.
In some embodiments, the sensor packages 102 described above with reference to Figures 1A, 1B and 1C include one or more accelerometers and/or one or more gyroscopes.
Reference is now made to Figure 1D, which is a simplified image of a reference coordinate system 140 including a set of three perpendicular axes 142 145 148, each of which may be used to measure linear acceleration and rotational velocity, according to an example embodiment of the invention.
In some embodiments sensors are used to measure linear acceleration and rotational velocity in three perpendicular directions, providing complete movement information about a sensor attached to a moving subject (not shown).
Reference is now made to Figure 1E, which is a simplified image of a leg 160 of a subject with a FORWARD direction 162 indicated, and a reference coordinate system 164 of a set of three perpendicular axes according to Figure 1D, indicated next to the leg 160, according to an example embodiment of the invention.
In some embodiments, one of the axes 142 145 148 of the coordinate system is preferably aligned in the FORWARD direction 162.
In some embodiments, one of the axes 142 145 148 of the coordinate system is preferably aligned in an up-down direction (not shown) perpendicular to a floor, and/or a sole of the subject's shoe.
Example embodiment of use of a classification system
Reference is now made to Figure 1F, which is a simplified flowchart illustration of use of locomotion classification according to an example embodiment of the invention.
The flowchart of Figure 1F illustrates a method which includes the following: acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion (122);
extracting one or more features from the motion signal (124);
inputting the one or more features to a trained machine learning unit (126); and having the trained machine learning unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time (128).
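The four steps above (122-128) can be sketched end to end in code. The feature choices below and the logistic stand-in for the trained machine learning unit are illustrative assumptions only, not the embodiments' actual features or network; the names `extract_features` and `TrainedUnit` are hypothetical.

```python
import numpy as np

def extract_features(signal, rate=100):
    """Step (124): reduce a 1-D motion signal to a compact feature vector.
    The three features below are illustrative choices."""
    spectrum = np.abs(np.fft.rfft(signal - np.mean(signal)))
    dominant_hz = np.argmax(spectrum) * rate / len(signal)
    return np.array([
        np.std(signal),                            # motion intensity
        dominant_hz,                               # dominant frequency (Hz)
        spectrum.max() / (spectrum.sum() + 1e-9),  # spectral peakiness
    ])

class TrainedUnit:
    """Steps (126) and (128): stand-in for a trained machine learning unit
    mapping a feature vector to a likelihood of a specific locomotion type.
    The weights are hypothetical; a real unit would be trained on data."""
    def __init__(self, weights, bias):
        self.w = np.asarray(weights, dtype=float)
        self.b = float(bias)

    def likelihood(self, features):
        # logistic output, interpreted as a likelihood in [0, 1]
        return 1.0 / (1.0 + np.exp(-(features @ self.w + self.b)))

# Steps (122)-(128) end to end on a simulated 2 Hz stepping signal:
t = np.arange(0, 5, 1 / 100.0)
motion = np.sin(2 * np.pi * 2 * t)        # step (122): acquired motion signal
feats = extract_features(motion, rate=100)
unit = TrainedUnit(weights=[0.5, 0.3, 1.0], bias=-1.0)
p = unit.likelihood(feats)                # step (128): likelihood output
```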
Example embodiments of classification systems
Reference is now made to Figure 2A, which is a simplified block diagram of a locomotion classification system 200 constructed according to an example embodiment of the invention.
The locomotion classification system 200 of Figure 2A accepts an input signal 202 from one or more sensors such as packaged in the sensor package 102 of Figures 1A, 1B and 1C.
The input signal 202 is provided to a machine learning component, in this example an Artificial Neural Network (ANN) 204.
In some embodiments, the input signal 202 includes several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units. The ANN 204 receives input, optionally from all the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C.
The ANN 204 produces, based on its input, a respective output 206.
In some embodiments, the output 206 is a confidence level that the input signal 202 is an input signal describing a specific type of locomotion, for example walking straight (WS).
In some embodiments the output signal 206 indicates whether or not the input signal 202 corresponds to the specific type of locomotion, such as, for example, walking straight (WS).
In some embodiments the output signal 206 indicates to which specific type of locomotion the input signal 202 corresponds, for example a value of "1" for a first specific locomotion class, and a value of "2" for a second specific locomotion class, and so on.
Reference is now made to Figure 2B, which is a simplified block diagram of a locomotion classification system 210 constructed according to another example embodiment of the invention.
The locomotion classification system 210 of Figure 2B accepts input signals 212 213 214 216 from sensors such as the sensors on the sensor package 102 of Figures 1A, 1B and 1C.
The input signals 212 213 214 216 are provided to a machine learning component, in this example an Artificial Neural Network (ANN) 224.
In some embodiments, the input signals 212 213 214 216 include several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units.
The ANN 224 produces, based on its input signals 212 213 214 216, outputs 226 227 228 230. Each one of the outputs 226 227 228 230 is optionally a confidence level that the input signals 212 213 214 216 are input signals describing a different specific type of locomotion. For example, the output signal 226 may provide a confidence level that the input signals 212 213 214 216 correspond to walking straight (WS), and the output signal 227 may provide a confidence level that the input signals 212 213 214 216 correspond to turning (T).
The outputs 226 227 228 230 are optionally collected by a decision unit 232.
The decision unit 232 optionally outputs an output signal 234. In some embodiments the output signal 234 indicates whether or not the input signals 212 213 214 216 correspond to a specific type of locomotion, such as, for example, walking straight (WS).
In some embodiments the output signal 234 indicates to which specific type of locomotion the input signals 212 213 214 216 correspond and/or whether the input signals 212 213 214 216 correspond to one of a set of specific types of locomotion.
In some embodiments the decision unit 232 is an expert system.
In some embodiments, instead of the ANN 224 acting as a machine learning unit, a Support Vector Machine (SVM) is used.
In machine learning, Support Vector Machines (SVMs, sometimes also termed support vector networks) are supervised learning models with associated learning algorithms that analyze data and recognize patterns, used for classification and regression analysis. An example basic SVM takes a set of input data and predicts, for each given input, which of two possible classes forms the output, making it a non-probabilistic binary linear classifier. Given a set of training examples, each marked as belonging to one of two categories, an SVM training algorithm builds a model that assigns new examples into one category or the other. An SVM model is a representation of the examples as points in space, mapped so that the examples of the separate categories are divided by a clear gap that is as wide as possible. New examples are then mapped into that same space and predicted to belong to a category based on which side of the gap they fall on.
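The two-class maximum-margin behavior described above can be illustrated with a minimal linear SVM trained by sub-gradient descent on the hinge loss. This is a generic textbook sketch, not the SVM configuration of any particular embodiment; the function names and hyperparameters are illustrative.

```python
import numpy as np

def train_linear_svm(X, y, epochs=200, lr=0.05, lam=0.01):
    """Minimal linear SVM: sub-gradient descent on the L2-regularized
    hinge loss. Labels y must be +1 or -1."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        for i in range(n):
            margin = y[i] * (X[i] @ w + b)
            if margin < 1:                        # point inside the margin
                w = w - lr * (lam * w - y[i] * X[i])
                b = b + lr * y[i]
            else:                                 # only apply regularization
                w = w - lr * lam * w
    return w, b

def svm_predict(w, b, X):
    """Assign each example to a category by the side of the gap it falls on."""
    return np.sign(X @ w + b)
```

In the multi-class setting described below, one such classifier would be trained per locomotion class, with a decision unit combining the per-class results.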
It is noted that in the present specification and claims, where an ANN is mentioned, an SVM should also be understood to apply. A person ordinarily skilled in the art is able to discern when an SVM may be used in place of an ANN.
In some embodiments, training the SVM is optionally done in a similar manner to training the ANN. Optionally, similar feature extraction is performed. Optionally, feature vectors that correspond to locomotion classes, such as Walking Straight, Left Turn, Right Turn, and so on, are produced by training one or more SVM instances, where the input is optionally similar to the input vectors which are fed to the ANN.
In some embodiments, each instance of the SVM is trained to detect a specific locomotion class. A detection phase is optionally similar in an SVM embodiment to that in an ANN embodiment, in that after the training phase is done, data captured by the sensor is processed and fed to a set of SVM instances, with the data processing optionally including phases similar to data processing for the ANN - optional preprocessing, followed by a feature extraction phase.
In some embodiments, each SVM instance optionally classifies signal segments, and an expert system optionally combines results from the SVM classifications into a final result.
Reference is now made to Figure 2C which is a simplified block diagram of a locomotion classification system 250 constructed according to yet another example embodiment of the invention.
The locomotion classification system 250 of Figure 2C accepts an input signal 252 from the sensors 102 of Figures 1A, 1B and 1C.
The input signal 252 is provided to several machine learning components, in this example several Artificial Neural Networks (ANNs) 256 257 258 259.
In some embodiments, the input signal 252 includes several input signals, for example input signals from several accelerometers, and/or several gyroscopic measurement units.
Each one of the ANNs 256 257 258 259 receives input, optionally from all the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C.
In some embodiments, each one of the ANNs 256 257 258 259 receives input, optionally from only some of the sensors packaged within the sensor packages 102 of Figures 1A, 1B and 1C. For example, input from only two acceleration sensors - a front-and-back acceleration sensor, and a sideways acceleration sensor.
In some embodiments the input signals are preprocessed, optionally within a preprocessing unit (not shown), transforming signals picked up along three perpendicular directions which do not necessarily correspond to front-and-back, sideways and up-and-down, into three perpendicular directions which do correspond to front-and-back, sideways and up-and-down. The transformation is optionally performed by detecting three perpendicular directions - a direction of up-and-down motion, a direction of mostly forward motion, and a direction of little sideways motion - which can be achieved by a rotation of the axes of the three perpendicular directions of the actual sensors. In some embodiments the preprocessing optionally includes a denoising of the input signals. The denoising may include a smoothing of the input signals, such as, by way of some non-limiting examples, low-pass filtering (LPF) and wavelet denoising.
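As one illustration of the low-pass-filtering option mentioned above, a simple moving-average smoother can serve as the denoising step. The window length below is an arbitrary choice, and wavelet denoising would be a more elaborate alternative; the function name is hypothetical.

```python
import numpy as np

def lowpass_smooth(signal, window=5):
    """Simple moving-average low-pass filter: each output sample is the
    mean of `window` neighboring input samples."""
    kernel = np.ones(window) / window
    # mode='same' keeps the output aligned with the input samples
    # (the first and last window/2 samples are edge-biased)
    return np.convolve(signal, kernel, mode='same')
```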
Each one of the ANNs 256 257 258 259 produces, based on its input, a respective output 266 267 268 269 of a confidence level that the input signal 252 is an input signal describing a specific type of locomotion, for example walking straight (WS).
The outputs 266 267 268 269 are optionally collected by a decision unit 262. The decision unit 262 optionally outputs an output signal 265.
In some embodiments the output signal 265 indicates whether or not the input signal 252 corresponds to a specific type of locomotion, such as, for example, walking straight (WS).
In some embodiments the output signal 265 indicates to which specific type of locomotion the input signal 252 corresponds and/or whether the input signal 252 corresponds to one of a set of specific types of locomotion.
In some embodiments the decision unit 262 is an expert system, as will be further described below.
An example of an input signal for locomotion classification
Reference is now made to Figure 3, which is a graph 300 showing two input signals 310 312, which are examples of input signals similar to the input signal 252 of Figure 2C.
The graph 300 of Figure 3 includes an x-axis 302 of time, and a y-axis 304 which is a qualitative indication of a signal's amplitude.
The graph 300 depicts a first line 310 which corresponds to a signal from a gyroscopic sensor in a direction termed "x", which in this example is a sideways direction, and a second line 312 which corresponds to a signal from a linear acceleration sensor, which in this example is the sideways direction termed "y".
In the graph 300, a section 314 of the input signals marks one of a number of sections of the input signals which are known to represent a class of locomotion of a subject walking straight (WS). Section 314 is suitable for use in training one or more ANNs to recognize the WS locomotion. The graph 300 depicts one input signal, the first line 310, from a gyroscopic sensor, and one input signal, the second line 312, from a linear acceleration sensor.
In some embodiments, input signals such as depicted in section 314 of Figure 3 are used as a positive example to train a machine learning unit to identify the Walking Straight type of locomotion.
It is noted that other sections, corresponding to other classes of locomotion, may be used as positive examples to train a machine learning unit to identify other types of locomotion.
Reference is now made to Figure 4A, which is a graph 405 showing two input signals 410 412, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a right turn, according to an example embodiment of the invention.
The graph 405 of Figure 4A includes an x-axis 407 of time, and a y-axis 409 which is a qualitative indication of a signal's amplitude.
The graph 405 depicts a first line 410 which corresponds to a signal from the gyroscopic sensor in a direction termed "x", which in this example is a sideways direction, and a second line 412 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
In the graph 405, a section 414 of the input signals marks one of a number of sections of the input signals which are known to represent a right turn of a subject. Section 414 is suitable for use in training one or more ANNs to recognize the right turn.
In some embodiments, input signals such as depicted in section 414 of Figure 4A are used as a positive example to train a machine learning unit to identify a "Turning Right" type of locomotion.
Reference is now made to Figure 4B, which is a graph 415 showing two input signals 420 422, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a right turn, according to an example embodiment of the invention.
The graph 415 of Figure 4B includes an x-axis 417 of time, and a y-axis 419 which is a qualitative indication of a signal's amplitude.
The graph 415 depicts a first line 420 which corresponds to a signal from the gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 422 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
In the graph 415 a section 424 of the input signals marks one of a number of sections of the input signals which are known to represent a right turn of a subject. Section 424 is suitable for use in training one or more ANNs to recognize the right turn.
In some embodiments, input signals such as depicted in section 424 of Figure 4B are used as a positive example to train a machine learning unit to identify a "Turning Right" type of locomotion.
In some embodiments, Y-axis gyro readings may be used for determining whether a sensor is attached to the right leg of a subject or to the left leg of the subject. Such differentiation between a left-worn sensor and a right-worn sensor may exploit a certain kinematic property associated with a TO (toe-off) event; in humans, immediately after a leg is lifted off the ground, it tends to make a slight twist in the direction of the body's sagittal plane. Namely, the right leg twists to the left and the left leg twists to the right. This twist is barely noticeable with the naked eye, but can be discerned when using a sensor with a high enough sampling rate (e.g. in the range of tens or hundreds of samples per second).
Accordingly, in present embodiments, gyro Y readings which immediately follow a TO event are analyzed, to identify the twist. Optionally, the identification is performed by observing the gyro Y readings starting at about 50 milliseconds (±50%) after the TO event, and lasting about 200 milliseconds (±50%). These timings may also be defined by the sampling rate of the sensor: the observation may start X samples after the TO event, where X is equal to 5% (±50%) of the sampling rate, and last Y samples, where Y is equal to 20% (±50%) of the sampling rate.
Negative gyro Y readings indicate a left twist - meaning that the sensor is worn on the right leg. See Fig. 4E, which shows a graph 450 of a gyro Y signal 452 and a linear motion sensor signal 454 - both as a function of time. As exhibited in a section 456, the gyro Y value becomes noticeably negative over a time window of about 100 samples, which starts about 25 samples following a TO event (marked with a triangle 458). Conversely, positive gyro Y readings indicate a right twist - meaning that the sensor is worn on the left leg. See Fig. 4F, which shows a graph 460 of a gyro Y signal 462 and a linear motion sensor signal 464 - both as a function of time. As exhibited in a section 466, the gyro Y value becomes noticeably positive over a time window of about 100 samples, which starts about 25 samples following a TO event (marked with a triangle 468).
In some embodiments, the time window after a TO event, in which the twist is identified, may be automatically and dynamically adapted as the subject walks. That is, the length of the time window may be adapted based on the walking speed of the subject and/or a step size of the subject, detected using one or more of the sensors.
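The timing rule above (observation starting at about 5% of the sampling rate in samples after the TO event, and lasting about 20% of the rate) can be sketched as a small helper. The function name and the use of the mean gyro Y reading as the twist statistic are illustrative assumptions.

```python
def leg_side_from_gyro_y(gyro_y, to_index, rate):
    """Decide which leg a sensor is worn on from gyro Y readings after a
    toe-off (TO) event, per the timing rule above.

    Observation starts ~5% of the sampling rate (in samples) after TO,
    and lasts ~20% of the rate. A net negative twist means a left twist,
    i.e. a right-worn sensor; a net positive twist means a left-worn one.
    """
    start = to_index + max(1, round(0.05 * rate))   # e.g. 25 samples at 500 Hz
    stop = start + max(1, round(0.20 * rate))       # e.g. 100-sample window
    window = gyro_y[start:stop]
    mean_twist = sum(window) / len(window)
    return 'right' if mean_twist < 0 else 'left'
```

At a 500 Hz sampling rate this reproduces the example of Figures 4E and 4F: a window of about 100 samples, starting about 25 samples after the TO event.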
Reference is now made to Figure 4C, which is a graph 425 showing two input signals 430 432, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a right leg of a subject making a left turn, according to an example embodiment of the invention.
The graph 425 of Figure 4C includes an x-axis 427 of time, and a y-axis 429 which is a qualitative indication of a signal's amplitude.
The graph 425 depicts a first line 430 which corresponds to the signal from a gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 432 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y".
In the graph 425 a section 434 of the input signals marks a section of the input signals which is known to represent a left turn of a subject, and is suitable for use in training one or more ANNs to recognize the left turn.
In some embodiments, input signals such as depicted in section 434 of Figure 4C are used as a positive example to train a machine learning unit to identify a "Turning Left" type of locomotion.
Reference is now made to Figure 4D, which is a graph 435 showing two input signals 440 442, which are examples of input signals produced by a linear motion sensor and a gyroscopic angular motion sensor attached to a left leg of a subject making a left turn, according to an example embodiment of the invention.
The graph 435 of Figure 4D includes an x-axis 437 of time, and a y-axis 439 which is a qualitative indication of a signal's amplitude.
The graph 435 depicts a first line 440 which corresponds to a signal from the gyroscopic angular motion sensor in a direction termed "x", which in this example is a sideways direction, and a second line 442 which corresponds to a signal from the linear motion sensor which in this example is the sideways direction termed "y". In the graph 435, a section 444 of the input signals marks one of a number of sections of the input signals which are known to represent a left turn of a subject. Section 444 is suitable for use in training one or more ANNs to recognize the left turn.
In some embodiments, input signals such as depicted in section 444 of Figure 4D are used as a positive example to train a machine learning unit to identify a "Turning Left" type of locomotion.
In some embodiments, input signals such as depicted in sections 414 and 424 of Figures 4A and 4B are used as examples to train a machine learning unit to classify between a sensor attached to a right leg of a subject and a sensor attached to a left leg of a subject while the subject is turning right.
In some embodiments, input signals such as depicted in sections 434 and 444 of Figures 4C and 4D are used as examples to train a machine learning unit to classify between a sensor attached to a right leg of a subject and a sensor attached to a left leg of a subject while the subject is turning left.
In some embodiments only one linear acceleration sensor is used. In some embodiments two linear acceleration sensors are used. In some embodiments three linear acceleration sensors are used. In some embodiments even more linear acceleration sensors are used.
In some embodiments the linear acceleration sensors are mounted within a sensor package in perpendicular directions. A potential benefit of mounting the linear acceleration sensors in perpendicular directions is that of capturing any movement of the subject, in any direction.
In some embodiments only one gyroscopic sensor is used. In some embodiments two gyroscopic sensors are used. In some embodiments three gyroscopic sensors are used. In some embodiments even more gyroscopic sensors are used.
In some embodiments the gyroscopic sensors are mounted within a sensor package in perpendicular directions. A potential benefit of mounting the gyroscopic sensors in perpendicular directions is that of capturing any movement of the subject, in any direction.
The number of sensors to be used is preferably as many as needed to capture the subject's movement and enable automatic locomotion classification. However, it is noted that linear acceleration sensors and gyroscopic sensors are inexpensive, so using more than one sensor of each type, and even three perpendicular sensors of each type, is not prohibitively expensive, and potentially adds to the accuracy of locomotion segmentation and classification.

The Walking Segmentation Method (WSM)
In some embodiments, the WSM optionally uses one or more Artificial Neural Networks (ANNs) which have optionally been trained in a supervised fashion, that is, the ANNs are first trained, and then used for classification.
In some embodiments, in both the training and classification, sensor data is optionally modeled as features. In such embodiments, the features are a compact representation of the data. The features are results of signal processing the sensor data, optionally using wavelet-packet-decomposition (WPD) and spectral analysis.
In an example embodiment the WSM uses four ANNs. Three of the ANNs are fully-connected Feed Forward Networks (FFNs), and the additional ANN is a Probabilistic Neural Network (PNN). All four networks are trained and used in classifying input signals. The use of 4 networks potentially increases the classification accuracy.
In some embodiments, even one ANN or PNN may be trained and used.
The ANN(s) may be implemented as software ANN(s) or as hardware ANN circuit(s) with appropriate surrounding support circuits.
In some embodiments the walking segmentation method uses signal processing and machine learning techniques which are further described with reference to an example embodiment below.
For example, six simultaneous data input channels are used: 3 linear accelerometer channels, in directions termed x, y and z axes; and 3 gyroscopic sensor channels in directions termed x, y and z axes.
In another example embodiment, the input signals of which are depicted in Figure 3, input data comprises the x channel of the gyroscopic sensor and the y channel of the linear accelerometer. The gyro x measurement is optionally used to identify and optionally extract the WS segments of the y linear accelerometer signal.
A brief note is now made regarding use of the term signal in the present patent application and claims. Wherever the term signal is used, it stands for either an analog signal or a digital signal. A person ordinarily skilled in the art is able to discern when an operation which is described as performed on a signal is inappropriate for use on either the analog signal or the digital signal, and should then understand that the operation is used on an appropriate form (analog/digital) of the signal, or that the signal is transformed into the appropriate form at a stage prior to performing the operation.
Figure 3 depicts typical measurements of both the x gyro and the y accelerometer. The input data that is used by the ANNs of the example embodiment is that of the y accelerometer. The x gyro in the example embodiment is used for segmenting the input signal from the y accelerometer into one or more WS segments. Each y accelerometer segment is a potential input data entry in the learning data set.
The analog y accelerometer segment is transformed into a vector of digital values, which is a time series of digital values of the analog input signal.
The vector of digital values is a compact representation of a WS segment. In some embodiments, a compact representation of an entry which can store most of the information in the input signal is desired. The representation is optionally invariant to possible transformations of input signal.
In the example embodiment of the WS segmentation method described above, the input signal is a y accelerometer signal recording. An example procedure for processing the input signal to extract its features is now described.
After extracting WS segments for building a training set, whether automatically or manually, one or more of the following processes are optionally applied to the segments, in order to build the training set:
• An optional pre-processing stage, where the input signal(s) may be improved, for example by de-noising, smoothing, expanding to a predetermined window size, and so on. In some embodiments the de-noising is performed by using a filter which removes, from an input signal, high frequencies that are not associated with the walking.
• A feature extraction phase including one or more of:
o Splitting the segments into Hamming windows.
o Applying Wavelet Packet Decomposition (WPD) to each Hamming window.
o Extracting WPD terminal nodes.
o Computing filter coefficients for each terminal node.
o Computing the energy of each filter.
o Applying a Discrete Cosine Transform (DCT) over a normalized log of a vector of the energies.
A vector of features is produced by optionally concatenating low frequency values of the DCT to the vector of normalized log energies. In some embodiments, each Hamming window produces its own vector of features. Using several Hamming windows results in a matrix of features, where a row represents Hamming window features. The feature matrix rows are optionally concatenated as a single vector, which represents a final feature vector, corresponding to a WS segment.
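The windowing, energy, log-normalization, DCT and concatenation steps above can be sketched as follows. This is a minimal illustration only: simple FFT band energies stand in for the WPD terminal-node filter energies, and the window length, hop size, band count and DCT coefficient count are arbitrary illustrative choices, not values from the patent.

```python
import numpy as np

def dct_ii(x):
    """Type-II DCT computed directly from its definition (no SciPy needed)."""
    N = len(x)
    n = np.arange(N)
    return np.array([np.sum(x * np.cos(np.pi * k * (2 * n + 1) / (2 * N)))
                     for k in range(N)])

def segment_features(segment, win_len=64, hop=32, n_bands=8, n_dct=4):
    """Turn one WS segment into a single feature vector.

    FFT band energies are used here as a stand-in for the WPD
    terminal-node filter energies described in the text.
    """
    rows = []
    window = np.hamming(win_len)
    for start in range(0, len(segment) - win_len + 1, hop):
        frame = segment[start:start + win_len] * window      # Hamming window
        spectrum = np.abs(np.fft.rfft(frame)) ** 2           # power spectrum
        bands = np.array_split(spectrum, n_bands)            # crude filterbank
        energies = np.array([b.sum() for b in bands])        # energy per band
        log_e = np.log(energies + 1e-12)
        log_e = (log_e - log_e.mean()) / (log_e.std() + 1e-12)  # normalized log
        cepstrum = dct_ii(log_e)[:n_dct]                     # low-frequency DCT values
        rows.append(np.concatenate([log_e, cepstrum]))       # per-window feature row
    return np.concatenate(rows)                              # final feature vector

# Example: a synthetic 1-second accelerometer segment sampled at 500 Hz
t = np.linspace(0, 1, 500, endpoint=False)
seg = np.sin(2 * np.pi * 2 * t) + 0.1 * np.random.randn(500)
fv = segment_features(seg)
```

With the illustrative defaults, the 500-sample segment yields 14 Hamming windows of 12 features each, concatenated into one 168-element vector.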
The above procedure is optionally repeated to produce several feature vectors, corresponding to several WS segments.
Once a training set is built, it is optionally used to train the three ANN and the one PNN networks of the example embodiment described above. After training, a classification system is ready to be fed with new measured data for classification using the trained ANNs.
In some embodiments, the walking segmentation method uses a signal processing technique which is further described below. This technique is aimed at discerning, from sensor readings, segments of WS. This may be beneficial, as one example, in knee osteoarthritis analysis, which usually requires the patient to walk straight. In knee osteoarthritis analysis, single limb support (SLS) and double limb support (DLS) may be used to assess the functional status of the patient. These measurements should usually be done while the patient is walking straight. Therefore, according to the technique, sensor readings are used for detecting turns, thereby classifying the walking segments between turns as WS segments - during which gait analysis can be performed.
The technique for WS segmentation may be based on observing kinematic data represented by an integrated Y angular velocity (GY). The integration serves as an estimation of the Y direction angular change. Theoretically, during WS segments, the sensor should measure zero acceleration (AY) and zero angular velocity (GY) in the Y direction (which is perpendicular to X (forward) and Z (up) directions). During turns, the sensor should measure non-zero Y values of acceleration and angular velocity. During a left turn, we would expect negative measurements of AY and GY, whereas during a right turn, we would expect positive measurements of AY and GY.
In practice, the sensor does not commonly measure zero values during WS segments. There are various reasons for the non-zero measurements, such as mechanical noise and/or miscalibration of the sensor. Still, it is possible to differentiate turn segments from WS segments by observing high amplitudes in GY and AY, whereas in WS segments these amplitudes are lower. Since the gyro measurements inherently drift over time, naive integration over gyro measurements may not suffice for amplitude classification. Gyro integration during a turn may provide an angle which may be misinterpreted as a noisy WS segment. Therefore, before integrating GY, we may weight the GY values using a function derived from AY. During turns, AY measurements take higher values. In order to eliminate high-frequency fluctuation, we integrate AY. This serves as a linear velocity estimation, VY. VY is a smooth function: it slowly fluctuates around zero in WS segments and slowly fluctuates around a non-zero value during a period of time when the patient turns. Therefore, VY is an advantageous choice for a weight function.
The integration may be performed, for example, in the time range between approximately 5 milliseconds (±50%) after a TO and until an HS (±50%), where TO is a toe-off event (indicating the point in time when the patient has fully lifted the foot off the ground) and HS is a heel-strike event (indicating the point in time when the patient's foot re-touches the ground). When using a sensor with a sampling rate of 500 Hz, for example, this time range translates to 25 samples after a TO event until the HS. The 25-sample constant factor has been experimentally introduced to make sure integration starts from the true TO, assuming the TO estimation may introduce error. Those of skill in the art will recognize that the constant factor may be different from 25, such as between 5-10, 10-15, 15-20, 20-25, 25-30, 30-35 or higher. When using a sensor with a sampling rate different from 500 Hz, these scalars will be converted to match the other sampling rate, as known in the art. The result of the definite integral is the direction feature f, a number:
(1) [equation image not reproduced: f is defined as the definite integral, taken from TO to HS, of the VY-weighted GY values]
The f feature may be thresholded; the appropriate threshold, however, depends on the integration range, or, in other words, on the length of the stride. Therefore, the threshold for each segment is a normalized factor which represents the percentage of the swing out of the entire step length:
thresh = (HS2 − TO) / (HS2 − HS1)    (2)

where HS1 represents the first heel-strike event and HS2 represents the next, consecutive, heel-strike (HS) event. First, the absolute value of f is compared to thresh. If it is lower than thresh, we classify the corresponding segment as WS. In case f is higher than thresh, we classify the corresponding segment as a turn. Negative and positive signs suggest left and right turns, respectively.
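The stride classification described by equations (1) and (2) can be sketched as below. This is a sketch under assumptions: the gait events (TO, HS1, HS2) are assumed to arrive as sample indices, and, because the original equation image is not reproduced, the weight is taken as the absolute value of VY — one plausible reading under which the sign of f follows the sign of GY, matching the expectations stated in the text.

```python
import numpy as np

def classify_stride(ay, gy, to, hs1, hs2, fs=500, skip=25):
    """Classify one stride as WS / TL / TR from Y-axis accel (ay) and gyro (gy).

    `to`, `hs1`, `hs2` are sample indices of the toe-off and the two
    consecutive heel-strike events; `skip` is the experimentally chosen
    offset past TO (25 samples at 500 Hz in the text).
    """
    dt = 1.0 / fs
    vy = np.cumsum(ay) * dt                       # linear velocity estimate VY
    lo, hi = to + skip, hs2                       # integration range: TO+skip .. HS
    # |VY|-weighted integral of GY (plausible reading of eq. (1))
    f = np.sum(np.abs(vy[lo:hi]) * gy[lo:hi]) * dt
    thresh = (hs2 - to) / (hs2 - hs1)             # swing / step-length ratio, eq. (2)
    if abs(f) < thresh:
        return "WS"
    return "TL" if f < 0 else "TR"                # negative -> left, positive -> right
```

For near-zero AY and GY the weighted integral stays below the threshold and the stride is classified WS; a sustained positive (or negative) GY during the swing pushes f past the threshold, yielding TR (or TL).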
Reference is now made to Fig. 6A, which shows a graph 635 of an integral 640 of GY and of that integral multiplied by a gyroscope reading 642 - both as a function of time. The behavior of graph 635, in accordance with the discussion above, is indicative of a right turn.
Reference is also made to Fig. 6B, which shows a graph 645 of an integral 650 of GY and of that integral multiplied by a gyroscope reading 652 - both as a function of time. The behavior of graph 645, in accordance with the above, is indicative of a left turn.
Finally, reference is made to Fig. 6C, which shows a graph 655 of an integral 660 of GY and of that integral multiplied by a gyroscope reading 662 - both as a function of time. The behavior of graph 655, in accordance with the above, is indicative of WS.
Training
In some embodiments, training the ANNs is done in a supervised fashion. A training data set is produced by extracting WS segments from one or more recordings.
In some embodiments, the WSM performs multiple classifications, that is, its training data set includes WS segments and various non-WS segments.
In some embodiments, the ANNs classify input signals, whereas a decision unit which includes an expert system accepts or denies the classification. In some embodiments, the expert system optionally classifies non-WS segments when an input example is classified as one of the non-WS segments, and/or when classification confidence is poor. This is further described below.
In some embodiments, the WSM trains a number of FFN (Feed-Forward Network) networks, and saves for classification several of the FFNs which provided best locomotion classification performance during training.
In some embodiments an additional trained PNN (Probabilistic Neural Network) is used to classify locomotion.
A description is now provided of producing three FFNs.
During training, a classification system, by way of a non-limiting example a classification system as depicted in Figure 2C, is fed with extracted segments. Each of the segments is manually labeled according to its walking nature (i.e. WS, S, TL, TR, and C). The labeled segments form a training dataset.
Optionally, for each one of the segments in the training dataset, features are extracted and used to train one or more ANNs. The trained networks are optionally tested against a test set of signals. A test set is optionally the same as, or similar to, the training dataset though the test set is optionally not used in the training stage. The test set is optionally used to assess accuracy of neural network performance.
Reference is now made to Figure 5, which is a simplified flowchart illustration of training of an automatic locomotion classification system which includes a machine learning component according to an example embodiment of the invention.
The method depicted in Figure 5 includes:
obtaining a motion sensor signal from a motion sensor attached to a walking subject (510);
extracting one or more features from the motion sensor signal (520);
identifying a type of locomotion to which the motion sensor signal belongs (530); and
feeding the one or more features to the machine learning component as a training example of said type of locomotion (540).
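The four steps of Figure 5 (510-540) can be sketched as a dataset-building loop. The function and argument names below are illustrative, not taken from the patent; any feature extractor, such as the WPD/DCT pipeline described earlier, can be plugged in.

```python
import numpy as np

def build_training_set(recordings, extract_features):
    """Assemble (features, label) pairs following the four steps of Figure 5.

    `recordings` is an iterable of (signal, label) pairs, where the label
    (e.g. "WS", "TL", "TR") has been assigned to the motion sensor signal;
    `extract_features` maps a raw signal to a fixed-length feature vector.
    """
    X, y = [], []
    for signal, label in recordings:          # step 510: obtain motion sensor signal
        feats = extract_features(signal)      # step 520: extract features
        X.append(feats)                       # step 530: label identifies locomotion type
        y.append(label)                       # step 540: feed as a training example
    return np.vstack(X), np.array(y)
```

The resulting (X, y) pair can then be handed to whichever machine learning component (FFN, PNN, or other) is being trained.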
Expert system - classification
In some embodiments, during training, 4 neural networks are optionally trained for each given type of locomotion; three neural networks of type FFN (Feed-Forward Network) and one neural network of type PNN (Probabilistic Neural Network). The neural networks are assigned with classification weights according to successful classifications produced in the learning stage. The classification weights are optionally used to compute a total expectation confidence for each output classification.
For example, after training three FFNs and one PNN with a training set which includes the WS type of locomotion, training was terminated when the following WS recognition values were obtained on the training set:
• A first FFN achieved relatively high recognition - 89%
• A second FFN achieved lower recognition - 75%
• A third FFN achieved the lowest recognition - 69%
• The PNN achieved relatively high recognition - 83%
The values correspond qualitatively, but not necessarily quantitatively, to a success rate of recognition of WS over the training data. For a general input signal, the above trained networks are considered to have the following confidence values for detection of a Walking Straight type of locomotion:
• FFN-high confidence = 0.5
• FFN-medium confidence = 0.6
• FFN-low confidence = 0.8
• PNN confidence = 0.5
In some embodiments, an overall confidence of detecting WS for an example input signal, is taken to be the maximal value provided by the neural networks.
In some embodiments, an overall confidence of detecting WS for an example input signal, is taken to be an average of the values provided by the neural networks.
In some embodiments, it is a decision unit, such as the decision units 232 262 of Figures 2B and 2C which selects the maximal value or calculates the average value.
In some embodiments, a confidence level lower than a specific value, by way of a non-limiting example lower than 0.5, optionally indicates negative classification, and may be interpreted as (1 − confidence) for negating a specific locomotion classification.
In some embodiments, an input signal is optionally fed to networks which are each trained to identify a specific class of locomotion. In some embodiments, each type of locomotion has 4 neural networks as described above. For each type of locomotion, an expected confidence is computed based on the output of all 4 networks specific to the locomotion. This produces confidence levels for the types of locomotion. A final classification is optionally based on which type of locomotion has a highest confidence.
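The per-type confidence fusion described above can be sketched as follows. The function name, the "avg" option, and the "non-" prefix for rejected classifications are illustrative choices, not terminology from the patent.

```python
def classify_by_confidence(per_type_scores, combine="max", threshold=0.5):
    """Fuse per-network confidences into a final locomotion label.

    `per_type_scores` maps each locomotion type (e.g. "WS", "TL") to the
    list of confidences returned by its networks (e.g. 3 FFNs + 1 PNN).
    """
    fused = {}
    for loco, scores in per_type_scores.items():
        if combine == "max":
            fused[loco] = max(scores)                  # maximal-value embodiment
        else:
            fused[loco] = sum(scores) / len(scores)    # average-value embodiment
    best = max(fused, key=fused.get)                   # highest-confidence type wins
    # Below the predefined threshold, the segment is rejected as non-matching.
    return best if fused[best] >= threshold else "non-" + best

# Example: confidences from four networks per locomotion type
scores = {"WS": [0.5, 0.6, 0.8, 0.5], "TL": [0.2, 0.3, 0.1, 0.4]}
```

Here `classify_by_confidence(scores)` selects WS (maximal confidence 0.8, above the 0.5 threshold); with too-low confidences across all types, the input is rejected as a non-matching segment.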
In some embodiments, if the highest confidence corresponds, for example, to WS, and if the confidence is above a predefined threshold, the input signal is classified as WS. Otherwise, the input signal is classified as a non-WS segment.

Automatic locomotion classification
In some embodiments, recording data of a subject's locomotion include various walking segments. For example, segments of walking straight (WS), standing (S), turning left or right (TL/TR), or climbing up stairs (C).
In some embodiments, locomotion classification is performed over WS segments only. In such embodiments, the WS segments are optionally extracted from, or identified in, the recorded data.
In some embodiments, one or more WS segments are optionally marked by a human operator, and locomotion is then automatically analyzed in the WS segments.
In some embodiments, one or more WS segments are automatically identified as a first process, and a second process is used to automatically analyze the subject's locomotion within the WS segments.
In some embodiments, specific portions of walking are extracted for purposes other than locomotion classification. Example purposes include: tracking how straight a patient walks - which potentially identifies some medical problems; tracking a straight-walking speed and variations in the speed; and tracking stride size and its variations - for instance to predict risk of falling in an elderly population.
Potential benefits of embodiments of the invention
Current methods for gait analysis require the person to walk in a laboratory in a straight line and to be monitored by a variety of instruments.
In some embodiments, walking patterns of a subject may be tracked, both inside and outside a laboratory setting. In some cases it is desired to track gait of a subject under natural conditions, and/or as the gait develops over a day, as the subject gets tired.
The analysis itself is labor intensive and takes many hours of a trained professional's time.
Some embodiments of the invention perform automatic locomotion classification, which potentially saves time and money. The time saving potentially includes both a professional's time, for analyzing, and a subject's time, for staying at the laboratory and/or sport facility. The money saving potentially includes a saving in employing a professional to perform the classification.
It is expected that during the life of a patent maturing from this application many relevant linear motion sensors and angular motion sensors will be developed, and the scope of the terms accelerometer and gyroscopic sensor is intended to include all such new technologies a priori.
The flowcharts and block diagrams in the Figures illustrate the architecture, functionality, and operation of possible implementations of systems, methods and computer program products according to various embodiments of the present invention. In this regard, each block in the flowchart or block diagrams may represent a module, segment, or portion of code, which comprises one or more executable instructions for implementing the specified logical function(s). It should also be noted that, in some alternative implementations, the functions noted in the block may occur out of the order noted in the figures. For example, two blocks shown in succession may, in fact, be executed substantially concurrently, or the blocks may sometimes be executed in the reverse order, depending upon the functionality involved. It will also be noted that each block of the block diagrams and/or flowchart illustration, and combinations of blocks in the block diagrams and/or flowchart illustration, can be implemented by special purpose hardware-based systems that perform the specified functions or acts, or combinations of special purpose hardware and computer instructions.
The terms "comprising", "including", "having" and their conjugates mean "including but not limited to".
The term "consisting of" is intended to mean "including and limited to". The term "consisting essentially of" means that the composition, method or structure may include additional ingredients, steps and/or parts, but only if the additional ingredients, steps and/or parts do not materially alter the basic and novel characteristics of the claimed composition, method or structure.
As used herein, the singular form "a", "an" and "the" include plural references unless the context clearly dictates otherwise. For example, the term "a unit" or "at least one unit" may include a plurality of units, including combinations thereof.
The words "example" and "exemplary" are used herein to mean "serving as an example, instance or illustration". Any embodiment described as an "example" or "exemplary" is not necessarily to be construed as preferred or advantageous over other embodiments and/or to exclude the incorporation of features from other embodiments.
The word "optionally" is used herein to mean "is provided in some embodiments and not provided in other embodiments". Any particular embodiment of the invention may include a plurality of "optional" features unless such features conflict.
It is appreciated that certain features of the invention, which are, for clarity, described in the context of separate embodiments, may also be provided in combination in a single embodiment. Conversely, various features of the invention, which are, for brevity, described in the context of a single embodiment, may also be provided separately or in any suitable sub-combination or as suitable in any other described embodiment of the invention. Certain features described in the context of various embodiments are not to be considered essential features of those embodiments, unless the embodiment is inoperative without those elements.
Although the invention has been described in conjunction with specific embodiments thereof, it is evident that many alternatives, modifications and variations will be apparent to those skilled in the art. Accordingly, it is intended to embrace all such alternatives, modifications and variations that fall within the spirit and broad scope of the appended claims.
All publications, patents and patent applications mentioned in this specification are herein incorporated in their entirety by reference into the specification, to the same extent as if each individual publication, patent or patent application was specifically and individually indicated to be incorporated herein by reference. In addition, citation or identification of any reference in this application shall not be construed as an admission that such reference is available as prior art to the present invention. To the extent that section headings are used, they should not be construed as necessarily limiting.

Claims

CLAIMS

What is claimed is:
1. A method for automatic detection of types of locomotion, comprising using at least one hardware processor for:
acquiring a motion signal from a sensor attached to a subject during a period of time when the subject is in motion;
extracting one or more features from the motion signal;
inputting the one or more features to a locomotion recognition unit; and
having the locomotion recognition unit produce an output indicating a likelihood that the subject was moving using a specific type of locomotion during the period of time.
2. The method of claim 1, in which the specific type of locomotion is walking straight (WS).
3. The method of claim 1, in which the locomotion recognition unit comprises a trained machine learning unit.
4. The method of claim 1, in which:
the sensor is attached to the subject's leg; and
the locomotion recognition produces an output indicating to which of the subject's legs the sensor is attached.
5. The method of claim 4, wherein the locomotion recognition which produces the output indicating to which of the subject's legs the sensor is attached, is based on identifying a direction of a twist of the leg following a toe-off event.
6. The method of claim 1 in which the specific type of locomotion is selected from the group consisting of:
standing (S);
turning left (TL);
turning right (TR);
running (R); and
climbing (C).
7. The method of claim 1, in which the locomotion recognition unit produces an output indicating a likelihood that the subject was turning during the period of time.
8. The method of claim 7, in which the locomotion recognition unit integrates a Y gyroscope measurement of the sensor with a Y accelerometer measurement of the sensor, to produce a velocity function fluctuating around a non-zero value during the period when the subject was turning.
9. The method of claim 1, in which the acquiring of the motion signal comprises preprocessing the motion signal according to at least one method selected from the group consisting of:
analog to digital conversion; and
de-noising the signal.
10. The method of claim 1, in which the motion signal comprises a plurality of motion signals, including a linear motion signal and a rotational motion signal.
11. The method of claim 1, in which the extracting of the one or more features from the motion signal comprises using Wavelet Packet Decomposition (WPD) to extract at least one of the one or more features.
12. The method of claim 1, in which the locomotion recognition unit comprises an Artificial Neural Network (ANN).
13. The method of claim 1, in which the locomotion recognition unit comprises a Support Vector Machine (SVM).
14. The method of claim 1, in which the locomotion recognition unit comprises a plurality of locomotion recognition units.
15. The method of claim 14, in which output of the plurality of locomotion recognition units is provided to a decision unit, and it is the decision unit which produces the output indicating the likelihood that the subject was moving using the specific type of locomotion during the period of time.
16. The method of claim 15, in which the decision unit comprises an expert system.
17. The method of claim 1, in which the acquiring of the motion signal from the sensor attached to the subject comprises:
attaching the sensor to the subject;
allowing the subject to walk in an unconstrained environment for the period of time; and
downloading a recording of the motion signal to a computer for performing the extracting, the inputting to a locomotion recognition unit, and the producing an output.
18. The method of claim 17, in which said specific type of locomotion comprises Walking Straight (WS) locomotion.
19. The method of claim 17, in which two sensors are attached, each one of said two sensors to a different leg of said subject.
20. The method of claim 1, in which the extracting comprises processing chunks of data from discrete windows of time.
21. The method of claim 20, in which said chunks of data from discrete windows of time partially overlap.
22. The method of claim 20, in which Wavelet Packet Decomposition (WPD) is applied to said chunks of data;
WPD terminal node values are calculated for said chunks of data;
filter coefficients are computed for each one of said terminal node values;
energy is calculated for each filter; and
DCT is applied to a vector of logarithms of said filter energies.
23. A method for training an automatic locomotion classification system which comprises a machine learning component, the method comprising using at least one hardware processor for:
obtaining a motion sensor signal from a motion sensor attached to a walking subject;
extracting one or more features from the motion sensor signal;
identifying a type of locomotion to which the motion sensor signal belongs; and
feeding the one or more features to the machine learning component as a training example of said type of locomotion.
24. The method of claim 23, in which the machine learning component comprises an Artificial Neural network.
25. The method of claim 23, in which the extracting one or more features from the motion sensor signal comprises:
splitting the motion sensor signal into chunks of data from discrete windows of time;
applying Wavelet Packet Decomposition (WPD) to said chunks of data;
calculating WPD terminal node values for said chunks of data;
computing filter coefficients for each one of said terminal node values;
calculating energy for each filter coefficient; and
applying DCT to a vector of logarithms of said energies.
26. The method of claim 23, in which the machine learning component comprises a plurality of Feed Forward Artificial Neural networks.
27. The method of claim 26, further comprising training a Probabilistic Neural Network to accept output of the Feed Forward Artificial Neural networks and provide output of an indication of said type of locomotion.
28. A system for automatic classification of different types of locomotion, comprising a locomotion classification module being configured, when executed by at least one hardware processor, to accept input of a motion signal from a motion sensor and to produce output comprising an indication of a locomotion classification based, at least in part, on the motion signal.
29. The system of claim 28, in which the locomotion classification module comprises at least one computerized machine learning component.
30. The system of claim 28, further comprising a sensor package comprising at least one motion sensor for producing the motion signal, and in which the locomotion classification module is configured to accept the motion signal from the sensor package.
31. The system of claim 30, in which:
the sensor package is comprised in a mobile personal computing device which comprises at least one acceleration sensor;
collecting the motion signal is comprised in an application residing on the mobile personal computing device; and
the application is configured to send the motion signal to the locomotion classification module.
32. The system of claim 31, in which:
the locomotion classification module is comprised in the mobile personal computing device;
collecting and classifying the motion signal is comprised in an application residing on the mobile personal computing device; and
the application is configured to send only a portion of the motion signal comprising a specific classification of locomotion to another computer.
33. The system of claim 30, in which the sensor package and the locomotion classification module are both comprised in one unit.
34. The system of claim 33, in which the unit comprises a smart phone.
35. The system according to claim 30, in which the sensor package comprises a plurality of motion sensors, at least one of which is a linear motion sensor, and one of which is a rotational motion sensor.
36. The system according to claim 29, in which the computerized machine learning component comprises an Artificial Neural Network (ANN).
37. The system according to claim 29, in which the computerized machine learning component comprises a first plurality of computerized machine learning components.
38. The system according to claim 37, in which the first plurality of computerized machine learning components comprises Feed Forward Neural Networks.
39. The system according to claim 37, in which each one of the first plurality of computerized machine learning components is configured to identify a different one from a set of types of locomotion.
40. The system according to claim 37, further comprising a second, additional, machine learning component configured to accept input from the first plurality of computerized machine learning components, and to provide an output indicating which of the set of types of locomotion is most likely present in the motion signal.
41. The system according to claim 40, in which the second machine learning component is a Probabilistic Neural Network.
42. The system of claim 28, further comprising a unit for indicating a beginning and an end of a segment in a motion signal belonging to a specific type of locomotion.
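The ensemble described in claims 37 to 41, with one detector per locomotion type feeding a second-stage combiner that reports the most likely type, can be sketched as follows. This is an illustrative toy only, not the patented implementation: the mean/variance features, the single logistic units, the hand-set weights, and the "walking"/"running" labels are all hypothetical stand-ins for the claimed feed-forward networks and probabilistic combiner.

```python
import math

def features(window):
    """Tiny feature vector for one window of accelerometer samples:
    mean and variance (a hypothetical choice; the claims fix no features)."""
    n = len(window)
    mean = sum(window) / n
    var = sum((x - mean) ** 2 for x in window) / n
    return [mean, var]

class BinaryDetector:
    """Stand-in for one per-type feed-forward network (claims 37-39):
    a single logistic unit scoring how likely its type is present."""
    def __init__(self, weights, bias):
        self.weights = weights
        self.bias = bias

    def score(self, feats):
        z = self.bias + sum(w * f for w, f in zip(self.weights, feats))
        return 1.0 / (1.0 + math.exp(-z))  # logistic output in (0, 1)

def combine(scores):
    """Stand-in for the second-stage combiner (claims 40-41): normalise
    the per-type scores into probabilities and pick the most likely type."""
    total = sum(scores.values())
    probs = {name: s / total for name, s in scores.items()}
    return max(probs, key=probs.get), probs

# Hypothetical hand-set detectors: "walking" prefers low-variance windows,
# "running" prefers high-variance windows. Real detectors would be trained.
detectors = {
    "walking": BinaryDetector(weights=[0.0, -2.0], bias=1.0),
    "running": BinaryDetector(weights=[0.0, 2.0], bias=-1.0),
}

window = [0.1, 0.9, 0.2, 1.1, 0.0, 1.0]  # synthetic accelerometer samples
feats = features(window)
label, probs = combine({name: d.score(feats) for name, d in detectors.items()})
print(label)  # low-variance window -> "walking" under these toy weights
```

A trained system would replace the hand-set weights with networks fitted to labelled motion signals, and the normalise-and-argmax step stands in for the Probabilistic Neural Network named in claim 41.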
PCT/IL2013/051004 2013-02-03 2013-12-05 Classifying types of locomotion WO2014118767A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201361760098P 2013-02-03 2013-02-03
US61/760,098 2013-02-03

Publications (1)

Publication Number Publication Date
WO2014118767A1 true WO2014118767A1 (en) 2014-08-07

Family

ID=51261546

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IL2013/051004 WO2014118767A1 (en) 2013-02-03 2013-12-05 Classifying types of locomotion

Country Status (1)

Country Link
WO (1) WO2014118767A1 (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080146968A1 (en) * 2006-12-14 2008-06-19 Masuo Hanawaka Gait analysis system
US20110054359A1 * 2009-02-20 2011-03-03 The Regents of the University of Colorado, a body corporate Footwear-based body weight monitor and postural allocation, physical activity classification, and energy expenditure calculator
US20110054833A1 (en) * 2009-09-02 2011-03-03 Apple Inc. Processing motion sensor data using accessible templates
WO2011033799A1 * 2009-09-18 2011-03-24 Hitachi, Ltd. Management method of computer system, computer system, and program for same

Cited By (24)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2016081946A1 (en) * 2014-11-21 2016-05-26 The Regents Of The University Of California Fast behavior and abnormality detection
US10503967B2 (en) 2014-11-21 2019-12-10 The Regents Of The University Of California Fast behavior and abnormality detection
US11215457B2 (en) 2015-12-01 2022-01-04 Amer Sports Digital Services Oy Thematic map based route optimization
US11210299B2 (en) 2015-12-01 2021-12-28 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11144107B2 (en) 2015-12-01 2021-10-12 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US11137820B2 (en) 2015-12-01 2021-10-05 Amer Sports Digital Services Oy Apparatus and method for presenting thematic maps
US10856776B2 (en) 2015-12-21 2020-12-08 Amer Sports Digital Services Oy Activity intensity level determination
US11587484B2 (en) 2015-12-21 2023-02-21 Suunto Oy Method for controlling a display
US11838990B2 (en) 2015-12-21 2023-12-05 Suunto Oy Communicating sensor data in wireless communication systems
US11607144B2 (en) 2015-12-21 2023-03-21 Suunto Oy Sensor based context management
US11541280B2 (en) 2015-12-21 2023-01-03 Suunto Oy Apparatus and exercising device
US11284807B2 (en) 2015-12-21 2022-03-29 Amer Sports Digital Services Oy Engaging exercising devices with a mobile device
EP3396319A4 (en) * 2015-12-24 2018-12-26 Fujitsu Limited Information processing system, information processing program, and information processing method
US11145272B2 (en) 2016-10-17 2021-10-12 Amer Sports Digital Services Oy Embedded computing device
US11703938B2 (en) 2016-10-17 2023-07-18 Suunto Oy Embedded computing device
WO2019173321A1 (en) * 2018-03-06 2019-09-12 Anki, Inc. Robot transportation mode classification
CN111351524A * 2018-12-21 2020-06-30 Amer Sports Digital Services Oy Sensor data management
TWI729596B * 2018-12-21 2021-06-01 Amer Sports Digital Services Oy Sensor data management
IT201900014631A1 (en) * 2019-08-12 2021-02-12 Webbdone Srl HANDLING METHOD FOR VIRTUAL REALITY
CN111694829A * 2020-06-10 2020-09-22 Beijing Calorie Information Technology Co., Ltd. Motion trail processing method and device and motion trail processing system
CN111694829B * 2020-06-10 2023-08-15 Beijing Calorie Information Technology Co., Ltd. Motion trail processing method and device and motion trail processing system
FR3118235A1 (en) * 2020-12-17 2022-06-24 Orange Movement mode recognition by motion sensor
CN113303789B * 2021-04-30 2023-01-10 Wuhan Qiwu Technology Co., Ltd. Gait event detection method and device based on acceleration
CN113303789A * 2021-04-30 2021-08-27 Wuhan Qiwu Technology Co., Ltd. Gait event detection method and device based on acceleration

Similar Documents

Publication Publication Date Title
WO2014118767A1 (en) Classifying types of locomotion
US10918312B2 (en) Wearable and connected gait analytics system
CN101394788B (en) Gait analysis
US9307932B2 (en) System and method for 3D gait assessment
US20190150793A1 (en) Method and System for Analyzing Human Gait
Mannini et al. Walking speed estimation using foot-mounted inertial sensors: Comparing machine learning and strap-down integration methods
AU2010286471B2 (en) Characterizing a physical capability by motion analysis
KR20160031246A (en) Method and apparatus for gait task recognition
Santhiranayagam et al. A machine learning approach to estimate minimum toe clearance using inertial measurement units
CN108958482B (en) Similarity action recognition device and method based on convolutional neural network
Sama et al. Analyzing human gait and posture by combining feature selection and kernel methods
Khandelwal et al. Identification of gait events using expert knowledge and continuous wavelet transform analysis
Iervolino et al. A wearable device for sport performance analysis and monitoring
WO2021028641A4 (en) Method and system for analysing biomechanical activity and exposure to a biomechanical risk factor on a human subject in a context of physical activity
Baroudi et al. Estimating walking speed in the wild
EP3808268B1 (en) System and method for shoulder proprioceptive analysis
KR102194313B1 (en) Apparatus and method for identifying individuals by performing neural network analysis for various detection information
KR20190120923A (en) Method and system for walking ability prediction using foot characteristics information
Ma et al. Toward robust and platform-agnostic gait analysis
McCalmont et al. eZiGait: toward an AI gait analysis and assistant system
KR20210046121A (en) Apparatus and method for identify patients with parkinson's disease and patients with podarthritis by performing neural network analysis by various detection information
JP2021030050A (en) Cognitive function evaluation method, cognitive function evaluation device, and cognitive function evaluation program
KR20210011097A (en) Apparatus and method for classification of gait type by performing neural network analysis for various detection information
US11497452B2 (en) Predictive knee joint loading system
Alcaraz et al. Mobile quantification and therapy course tracking for gait rehabilitation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 13874174

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 13874174

Country of ref document: EP

Kind code of ref document: A1

32PN Ep: public notification in the ep bulletin as address of the addressee cannot be established

Free format text: NOTING OF LOSS OF RIGHTS PURSUANT TO RULE 112(1) EPC (EPO FORM 1205 DATED 12.02.2016)
