CN101784230A - System and method for displaying anonymously annotated physical exercise data - Google Patents

System and method for displaying anonymously annotated physical exercise data

Info

Publication number
CN101784230A
CN101784230A (application CN200880104207A)
Authority
CN
China
Prior art keywords
people
data
physical exercise
exercise data
practises
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN200880104207A
Other languages
Chinese (zh)
Inventor
G. Lanfermann
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Koninklijke Philips NV
Original Assignee
Koninklijke Philips Electronics NV
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics NV filed Critical Koninklijke Philips Electronics NV
Publication of CN101784230A publication Critical patent/CN101784230A/en
Pending legal-status Critical Current

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/0002Remote monitoring of patients using telemetry, e.g. transmission of vital signals via a communication network
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique
    • A61B5/1127Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb using a particular sensing technique using markers
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/22Ergometry; Measuring muscular strength or the force of a muscular blow
    • A61B5/221Ergometry, e.g. by using bicycle type apparatus
    • A61B5/222Ergometry, e.g. by using bicycle type apparatus combined with detection or measurement of physiological parameters, e.g. heart rate
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • A63B69/36Training appliances or apparatus for special sports for golf
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09BEDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00Teaching not covered by other main groups of this subclass
    • G09B19/003Repetitive work cycles; Sequence of movements
    • G09B19/0038Sports
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H30/00ICT specially adapted for the handling or processing of medical images
    • G16H30/40ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2562/00Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B2562/02Details of sensors specially adapted for in-vivo measurements
    • A61B2562/0219Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114Tracking parts of the body
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/74Details of notification to user or communication with user or patient ; user input means
    • A61B5/742Details of notification to user or communication with user or patient ; user input means using visual displays
    • A61B5/744Displaying an avatar, e.g. an animated cartoon character
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0003Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63B24/0006Computerised comparison for qualitative assessment of motion sequences or the course of a movement
    • A63B2024/0012Comparing movements or motion sequences with a registered reference
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B24/00Electric or electronic controls for exercising apparatus of preceding groups; Controlling or monitoring of exercises, sportive games, training or athletic performances
    • A63B24/0087Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load
    • A63B2024/0096Electric or electronic controls for exercising apparatus of groups A63B21/00 - A63B23/00, e.g. controlling load using performance related parameters for controlling electronic or video games or avatars
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B71/0622Visual, audio or audio-visual systems for entertaining, instructing or motivating the user
    • A63B2071/0638Displaying moving images of recorded environment, e.g. virtual environment
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/0647Visualisation of executed movements
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B71/00Games or sports accessories not covered in groups A63B1/00 - A63B69/00
    • A63B71/06Indicating or scoring devices for games or players, or for other sports activities
    • A63B71/0619Displays, user interfaces and indicating devices, specially adapted for sport equipment, e.g. display mounted on treadmills
    • A63B2071/065Visualisation of specific exercise parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/05Image processing for measuring physical parameters
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/40Acceleration
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/803Motion sensors
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/806Video cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2220/00Measuring of physical parameters relating to sporting activity
    • A63B2220/80Special sensors, transducers or devices therefor
    • A63B2220/83Special sensors, transducers or devices therefor characterised by the position of the sensor
    • A63B2220/833Sensors arranged on the exercise apparatus or sports implement
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/15Miscellaneous features of sport apparatus, devices or equipment with identification means that can be read by electronic means
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/20Miscellaneous features of sport apparatus, devices or equipment with means for remote communication, e.g. internet or the like
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2225/00Miscellaneous features of sport apparatus, devices or equipment
    • A63B2225/50Wireless data transmission, e.g. by radio transmitters or telemetry
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/04Measuring physiological parameters of the user heartbeat characteristics, e.g. ECG, blood pressure modulations
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/20Measuring physiological parameters of the user blood composition characteristics
    • A63B2230/202Measuring physiological parameters of the user blood composition characteristics glucose
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B2230/00Measuring physiological parameters of the user
    • A63B2230/20Measuring physiological parameters of the user blood composition characteristics
    • A63B2230/207P-O2, i.e. partial O2 value
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63BAPPARATUS FOR PHYSICAL TRAINING, GYMNASTICS, SWIMMING, CLIMBING, OR FENCING; BALL GAMES; TRAINING EQUIPMENT
    • A63B69/00Training appliances or apparatus for special sports
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H40/00ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
    • G16H40/60ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices
    • G16H40/67ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices for the operation of medical equipment or devices for remote operation
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/50ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for simulation or modelling of medical disorders

Abstract

The present invention relates to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises. Based on physical exercise data, the physical exercise data is annotated at a physically separate annotation unit. At the location of the person, visual recordings of the person undertaking exercises together with synchronized annotation information are displayed to the person. A system for performing the method comprises a physical data processing unit (1), a display device (2), at least one posture recording device (3, 3'), a visual recording device (4), a data storage unit (5) and a physically separate annotation unit (6) in connection with the physical data processing unit (1), the connection being via an interconnected computer network (7).

Description

System and method for displaying anonymously annotated physical exercise data
Background of the invention
The present invention relates to a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Home-care exercises for people affected by a health condition such as a stroke, or home training exercises for people who wish to improve a body movement such as a golf swing, can be recorded via sensors. These exercises can also be assessed by a professional, such as a physiotherapist or a golf instructor, in order to provide direct feedback to the person.
If the reviewing professional is not present on site, video camera recordings made during the exercise session can be sent to him. Such recordings can be reviewed intuitively by the professional, and recordings annotated with comments can be understood intuitively by the exercising person. However, these recordings, particularly when they are sent to a remote professional, may invade the person's privacy. Moreover, fully automatic processing of such recorded images in order to provide meaningful feedback is a very demanding task.
Alternatively, only the data from the sensors are transmitted, so as not to invade the person's privacy. In this respect, US 6,817,979 B2 relates to a system and method for interacting with a user's virtual physiological model by using a mobile communication device. Physiological data associated with the user are obtained from the user. Preferably, the physiological data are transmitted to the mobile communication device using a wireless communication protocol. The method also involves using the mobile communication device to deliver the physiological data to a web server. The physiological data are integrated into the user's virtual physiological model. The user can access data representations derived from the physiological data.
As an example, the user can create an avatar representing the user's current condition. The user can adjust the avatar so as to change its appearance to a more desired one. For instance, the anatomical dimensions of the avatar can be changed to reflect desired waist, chest, upper-arm and thigh dimensions. Given the differences between the desired avatar features and the current avatar features, various training, diet and related fitness suggestions can be derived, so as to establish a training programme best suited to help the user reach the desired fitness goals. Physiological data are subsequently obtained, applied to the user's avatar, and compared with the data of the desired avatar, in order to determine whether the training programme is effective in reaching the desired fitness goals.
However, interpreting sensor signals at the front end commonly causes difficulties on the part of the user. It is difficult to relate to the abstract representation of an artificial on-screen figure.
Despite these efforts, there therefore still exists a need in the art for a system and method for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Brief summary of the invention
To achieve this and other objects, the present invention is directed to a method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the following steps:
a) collecting physical exercise data from the person undertaking exercises;
b) synchronously collecting visual recordings of the person undertaking exercises;
c) transmitting the physical exercise data to a physically separate annotation unit;
d) annotating the physical exercise data at the physically separate annotation unit, based on the physical exercise data;
e) transmitting the annotation information to a display and processing unit for review by the person undertaking exercises;
f) displaying the visual recordings of the person undertaking exercises together with the synchronized annotation information to the person.
Detailed description of the invention
Before describing the present invention in detail, it is to be understood that the invention is not limited to the particular component parts of the devices described or to the process steps of the methods described, as such devices and methods may vary. It is also to be understood that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting. It must be noted that, as used in this specification and the appended claims, the singular forms "a", "an" and "the" include singular and/or plural referents unless the context clearly dictates otherwise.
In the context of the present invention, the term "anonymously annotated data" denotes data for which the third person performing the annotation does not know the identity of the person whose data he is annotating. In particular, the data do not allow the person to be identified. One way of achieving anonymity is to assign an identification number to the data. Physical exercise data relate to data about a person's movements or other exercises.
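To make the anonymization step concrete, the following minimal Python sketch (not part of the patent; the field names, the `anonymize_exercise_data` helper and the use of a random token as identification number are all assumptions) removes identifying fields and attaches an assigned ID before the data leave the person's location:

```python
import secrets

def anonymize_exercise_data(record: dict) -> dict:
    """Return a copy of the exercise record with identifying fields removed
    and an assigned identification number added (illustrative sketch only)."""
    identifying_fields = {"name", "address", "date_of_birth"}  # hypothetical field names
    anonymous = {k: v for k, v in record.items() if k not in identifying_fields}
    # The identification number stands in for the person's identity; the mapping
    # from ID to person would be kept only at the person's location, never transmitted.
    anonymous["person_id"] = secrets.token_hex(8)
    return anonymous

# Example: only exercise-related values and the assigned ID reach the annotation unit.
sample = {"name": "J. Doe", "pulse_rate": 82, "joint_angles": [12.0, 45.5, 30.1]}
print(anonymize_exercise_data(sample))
```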
Two steps of the method describe how two different sets of information about the person's exercise are collected. First, the physical exercise data are collected, for example by continuously monitoring sensor signals from that person. At the same time, visual recordings are collected, for example by using a digital video camera. By collecting these data synchronously, it is ensured that a given part of the video stream can later be attributed to a given part of the sensor signal stream, and vice versa.
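A minimal sketch of how the two streams might be collected synchronously, assuming both are stamped against one shared monotonic clock; the `ExerciseSession`, `SensorSample` and `VideoFrame` names are illustrative, not taken from the patent:

```python
import time
from dataclasses import dataclass, field

@dataclass
class SensorSample:
    t: float       # seconds on the shared clock
    values: list   # e.g. accelerations from the worn sensors

@dataclass
class VideoFrame:
    t: float       # seconds on the shared clock
    frame_index: int

@dataclass
class ExerciseSession:
    """Both streams are stamped with the same monotonic clock at acquisition,
    so a sensor sample can later be matched to a video frame and vice versa."""
    start: float = field(default_factory=time.monotonic)
    sensor_stream: list = field(default_factory=list)
    video_stream: list = field(default_factory=list)

    def add_sensor(self, values):
        self.sensor_stream.append(SensorSample(time.monotonic() - self.start, values))

    def add_frame(self, frame_index):
        self.video_stream.append(VideoFrame(time.monotonic() - self.start, frame_index))
```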
Since the visual recordings and the physical exercise data are separate entities, the physical exercise data can then be transmitted to a physically separate annotation unit. The physically separate annotation unit provides the anonymity of the data. At the annotation unit, the physical exercise data can be processed into a representation of the exercise for review by a third person. The physical exercise data can then be annotated. This includes automatic processing of the data, for example by detecting deviations from a movement template. Furthermore, the third person can add comments and suggestions, so as to provide helpful feedback to the person undertaking exercises. Subsequently, the annotation information is transmitted to a display and processing unit at the location of the person undertaking exercises. Here, the annotation information is combined with the visual recordings. The visual recordings of the person undertaking exercises are then displayed to that person together with the synchronized annotation information. This synchronization ensures that the annotations are displayed at the correct time, so that the person can directly recognize what attracted the attention of the reviewer or of the automatic review system.
In summary, with the method according to the invention, a person's exercises can be reviewed anonymously and feedback can be provided to that person. The anonymity allows professional resources to be shared, making the review process more efficient. At the same time, when the person receives the feedback, the visual recordings show him most clearly which part of the exercise provoked the feedback.
In one embodiment of the invention, in step d) an avatar is computed from the physical exercise data at the physically separate annotation unit. For the purposes of the present invention, the term "avatar" denotes a computer-generated, abstract representation of the person's posture or movement. In a simple case, the avatar may be a stick figure. In more elaborate cases, the avatar may represent additional information, such as pulse rate, amount of perspiration, degree of muscle fatigue and the like. An advantage of the avatar representation is that, while representing the exercise, the avatar can be rotated on the screen of the annotation unit. This allows the reviewer to select the optimal viewing angle for assessing the exercise.
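As an illustration only — the patent leaves the avatar computation open — the following sketch assumes the posture data have already been reduced to segment angles and chains a few hypothetical segments by simple forward kinematics to obtain stick-figure joint coordinates:

```python
import math

# Hypothetical segment lengths (metres); the real avatar computation is not
# prescribed by the patent.
SEGMENTS = [("torso", 0.55), ("upper_arm", 0.30), ("forearm", 0.25)]

def stick_figure(joint_angles, origin=(0.0, 0.0)):
    """Chain the segments by forward kinematics and return the 2D joint
    coordinates of a very simple one-arm stick figure."""
    points = [origin]
    x, y = origin
    heading = math.pi / 2  # the torso starts pointing straight up
    for (_name, length), angle in zip(SEGMENTS, joint_angles):
        heading += angle
        x += length * math.cos(heading)
        y += length * math.sin(heading)
        points.append((x, y))
    return points

# Example: upright torso, arm raised by 60 degrees, elbow bent by 20 degrees.
print(stick_figure([0.0, math.radians(-60), math.radians(20)]))
```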
In another embodiment of the invention, step f) additionally comprises computing an avatar and displaying the avatar to the person synchronously with the visual recordings and the annotations. The person will then see the visual recording of his exercise, the annotations and the avatar. This is advantageous because, if the person's movements are obscured by loosely fitting clothing in the visual recording, or if they are not recorded correctly on camera, the avatar can depict the person's movements more clearly. In addition, the avatar can be rotated in order to obtain the best viewing angle. Another option is to provide one or more avatars for a plurality of viewing angles.
In another embodiment of the invention, the transmission of the physical exercise data in step c) and the transmission of the annotation information in step e) are carried out via an interconnected computer network, preferably the internet. This allows a person located remotely to review and annotate. Suitable protocols may include those of the TCP/IP protocol suite.
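For example, a minimal sketch of such a transmission, assuming the anonymized data are serialized as JSON and sent over a plain TCP connection (the host name, port and length-prefix framing are illustrative choices; a real deployment would likely add TLS):

```python
import json
import socket

def send_exercise_data(payload: dict,
                       host: str = "annotation.example.org",  # hypothetical annotation unit
                       port: int = 9000) -> None:
    """Serialize the (already anonymized) exercise data as JSON and send it
    over TCP, prefixed with a 4-byte big-endian length field."""
    data = json.dumps(payload).encode("utf-8")
    with socket.create_connection((host, port)) as sock:
        sock.sendall(len(data).to_bytes(4, "big"))
        sock.sendall(data)
```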
In another embodiment of the invention, the physical exercise data from the person are selected from the group comprising: movement data, posture data, electromyographic data, pulse rate, blood pressure, oxygen content, blood sugar content, severity of perspiration and/or breathing rate. Some of these data types relate to the exercise itself, as in the case of movement and posture data. Other data types relate to the person's overall condition or physical capacity. Knowledge in this respect can provide valuable insight into the effectiveness of rehabilitation or training measures. For example, it can be inferred whether the person is in the overcompensation phase following a training stimulus.
In another embodiment of the invention, the annotation information is selected from the group comprising: visual information, audio signals and/or voice recordings. The visual information may take the form of markers, such as arrows pointing out a particular problem, inserted into the images of the avatar. In addition, short video clips can be inserted to demonstrate the correct execution of an exercise. Other visual information may be written comments, or graphs of statistics of the recorded data, such as electromyographic data, pulse rate, blood pressure, oxygen content, blood sugar content, severity of perspiration and/or breathing rate. This makes it possible to assess the condition of the person undertaking exercises at a glance. An audio signal may be a simple buzzer sounding when a movement is not performed correctly. Voice comments recorded by the reviewer can be added, since speaking a comment is the simplest way of annotating an exercise.
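A possible data structure for one such piece of annotation information, anchored to a time position in the exercise; the class and field names below are assumptions made for illustration, not part of the patent:

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class AnnotationKind(Enum):
    MARKER = "marker"          # e.g. an arrow drawn on the avatar image
    WRITTEN_COMMENT = "text"
    AUDIO_SIGNAL = "audio"     # e.g. a buzzer
    VOICE_COMMENT = "voice"

@dataclass
class Annotation:
    """One piece of annotation information, anchored to a time in the exercise."""
    time_s: float                    # position on the shared exercise timeline
    kind: AnnotationKind
    text: Optional[str] = None       # written comment, if any
    media_ref: Optional[str] = None  # reference to a stored clip or voice recording

# Example: the reviewer flags a bent elbow at minute 4:20 with a voice comment.
flag = Annotation(time_s=260.0, kind=AnnotationKind.VOICE_COMMENT,
                  media_ref="comments/elbow_bend.ogg")
```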
The present invention is also directed to a system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
- a physical data processing unit;
- a display device in communication with the physical data processing unit;
- at least one posture recording device, assigned to the person undertaking exercises and in communication with the physical data processing unit;
- a visual recording device in communication with the physical data processing unit;
- a data storage unit for storing and retrieving data from the physical data processing unit and the visual recording device, the data storage unit being in communication with the physical data processing unit;
- a physically separate annotation unit connected to the physical data processing unit, the connection being via an interconnected computer network.
In one embodiment of the invention, the at least one posture recording device comprises motion sensors on the body of the person undertaking exercises, the sensors being selected from the group comprising: acceleration sensors, inertial sensors and/or gravity sensors. The motion sensors can be worn at selected locations on the body, such as the upper arm, forearm, thigh, lower leg or torso. They may be commercially available, highly integrated solid-state sensors. The sensor signals can be transmitted to a posture assessment unit by wire, wirelessly, or in a body area network using the electrical conductivity of the human skin. After the person's posture has been computed, the result can be provided in the form of an avatar.
In another embodiment of the invention, the at least one posture recording device comprises optical markers on the body of the person undertaking exercises. The posture recording device then tracks the at least one optical marker using an optical tracking system. From the signals of the optical tracking system, a representation of the person's posture is then computed. The optical markers can be carried at selected locations on the body, such as the upper arm, forearm, thigh, lower leg or torso. The tracking of the markers can be realized with a single camera or with several cameras. When stereo cameras are used, three-dimensional posture and exercise data are generated. After image processing and computation of the person's posture, the result can be provided in the form of an avatar.
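One conventional way to locate a coloured optical marker in a single camera frame is HSV colour thresholding, sketched below with OpenCV (assuming OpenCV 4 and brightly coloured markers; the patent does not prescribe this particular tracking technique):

```python
import cv2
import numpy as np

def track_marker(frame_bgr, lower_hsv=(35, 80, 80), upper_hsv=(85, 255, 255)):
    """Locate one brightly coloured optical marker in a camera frame by HSV
    thresholding and return its image coordinates, or None if not found."""
    hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, np.array(lower_hsv), np.array(upper_hsv))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    largest = max(contours, key=cv2.contourArea)
    m = cv2.moments(largest)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])  # marker centroid (x, y)
```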
Several posture monitoring principles may also be combined. For example, a combination of motion sensors and optical tracking can provide complementary data, so that the person's posture can be computed more accurately.
A further aspect of the present invention is the use of a system according to the claims of the present invention for displaying anonymously annotated physical exercise data to a person undertaking exercises.
Brief description of the drawings
The present invention will become more readily understood with reference to the following drawings, in which:
Fig. 1 shows a system according to the invention;
Fig. 2 shows a visual recording synchronously overlaid with an avatar representing the physical exercise data;
Fig. 3 shows a flow chart of a method according to the invention;
Fig. 4 shows modules for carrying out a method according to the invention.
Detailed description
Fig. 1 shows a system according to the invention for displaying anonymously annotated physical exercise data to a person undertaking exercises. The person carries motion sensors 3 positioned on the thigh and on the ankle as posture recording devices. In addition, optical markers 3' are positioned on the wrist and on the torso. As physical exercise data, the signals of the motion sensors 3 are transmitted wirelessly to the physical data processing unit 1, where the raw sensor signals are processed into movement and posture data. A video camera 4 records the person's movements. The physical data processing unit 1 furthermore performs an optical tracking operation on the video stream of the camera 4 in order to identify the positions and movements of the optical markers 3'. These, too, are processed into movement and posture data and complement the data obtained from the motion sensors 3.
The raw or processed sensor signals and the position information from the optical markers 3' are stored in a data storage unit 5. In addition, the video streams of the person undertaking exercises are stored there as well. The data in the data storage unit 5 are stored together with information about the time of recording. This makes it possible to correlate or synchronize the information, for example to know which position indicated by the posture recording devices 3, 3' corresponds to which frame of the video clip of the person undertaking exercises.
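Assuming a constant frame rate, the correlation between a timestamp on the shared recording clock and a video frame index can be as simple as the following sketch (the 25 fps frame rate is an assumed example, not taken from the patent):

```python
def frame_for_timestamp(t_seconds: float, frame_rate: float = 25.0) -> int:
    """Map a time on the shared recording clock to the index of the video frame
    recorded at (approximately) that moment, assuming a constant frame rate."""
    return round(t_seconds * frame_rate)

def timestamp_for_frame(frame_index: int, frame_rate: float = 25.0) -> float:
    """Inverse mapping: the recording time of a given video frame."""
    return frame_index / frame_rate

# Example: a posture sample taken at 260.0 s (minute 4:20) corresponds to
# frame 6500 of a 25 fps recording.
print(frame_for_timestamp(260.0))
```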
Using an interconnected computer network such as the internet 7, the physical data processing unit 1 transmits the processed signals of the sensors 3 and the position information from the optical markers 3' to the physically separate annotation unit 6. Time information is transmitted as well. The annotation unit then computes a visual representation, such as an avatar, from the received physical data. On his terminal 8, a physiotherapist watches the movement of this visual representation and annotates it by adding comments to individual segments. The annotations, together with the times within the exercise to which they refer, are transmitted back to the physical data processing unit 1 at the location of the person undertaking exercises. Again, this transmission is realized via an interconnected computer network such as the internet 7.
The physical data processing unit 1 then accesses the data storage unit 5 and retrieves the recorded data and video clips of the particular exercise that has been annotated. A film sequence is generated for viewing and presented to the person on the display 2. In this case, the person's video stream and the avatar computed from the recorded data are shown simultaneously. At the appropriate times, the physiotherapist's comments are also displayed or spoken to the person.
Fig. 2 shows a visual recording synchronously overlaid with an avatar representing the physical exercise data. A person is undertaking exercises. Physical data representing his movements are recorded and used to compute an avatar representation. The movement of the avatar is resolved in time and divided into a stream of individual frames 20. Likewise, the person's movements are recorded by a video camera. This video image sequence is also resolved in time and divided into a stream of individual frames 21. Since the physical exercise data and the visual recording are collected synchronously, a common timeline can be assigned to them. In Fig. 2, the timeline below the frame streams arbitrarily begins at minute 4:16 and ends at minute 4:21.
In the exercise of Fig. 2, the person starts with both arms stretched downwards. In the images, the left arm remains extended and is lifted in the coronal plane until the hand is above the person's head. The arm is held in this position while the same movement is supposed to be carried out with the right arm. At minute 4:20, the person is unable to keep his right arm extended at the horizontal level. The arm bends at the elbow, which makes it easier to lift, so that at this point the therapeutic benefit is not obtained. A physiotherapist remotely reviewing the avatar frames 20 can therefore single out the frame at minute 4:20 and add a visual or spoken comment. This comment is transmitted to the person, together with the information that it refers to minute 4:20 of the exercise, for later review. At the person's location, the annotation can be combined with the visual recording 21, so that the person can relate to the exercise more directly and observe the mistake he made while exercising.
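Continuing the illustrative sketches above, the playback side could select which annotations to show around the currently displayed frame as follows (the `Annotation` objects and the 25 fps frame rate are the assumptions introduced earlier, not elements of the patent):

```python
def annotations_for_frame(annotations, frame_index, frame_rate=25.0, window_s=2.0):
    """Return the annotations whose time position falls within a short window
    around the currently displayed video frame, so that a comment made about
    minute 4:20 appears while the corresponding frames are on screen."""
    t = frame_index / frame_rate
    return [a for a in annotations if abs(a.time_s - t) <= window_s]

# Example: while frames around index 6500 (minute 4:20 at 25 fps) are shown,
# the reviewer's elbow comment from the earlier sketch would be selected.
```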
Fig. 3 shows a flow chart of a method according to the invention. The first step 30 is to record a person's ongoing exercise, visually using a video camera and as posture data using sensors. The visual recording is stored 31, and the posture recording is transmitted to an annotation system 32. Using the annotation system, a person reviews the posture recording and adds comments and markers 33. These annotations are transmitted to the patient system 34, where "patient" denotes the person undertaking exercises. On the patient side, the stored visual recording is retrieved 35 and combined with the annotations 36, so as to provide the person with comprehensive feedback while his anonymity remains protected.
Fig. 4 shows modules for carrying out a method according to the invention, complementing the description of the system of Fig. 1. A sensor receiver 40 receives the signals from the motion sensors or the information from the tracking of the optical markers. The sensor receiver 40 passes its data to a motion transmission module 41. Synchronously with the sensor receiver 40, a video camera 42 captures video sequences of the person undertaking exercises. These video sequences are stored in a storage facility 43. The motion transmission module 41 transmits its data to a motion receiver 45 located remotely. This is indicated by the boundary line 44 separating the two groups of modules.
The motion receiver module 45 delivers the data to a motion annotator 46, where the data are converted into a comprehensible form and annotated by a reviewer. The annotations, together with information about their time positions within the exercise, are delivered to an annotation transmission module 47. The annotation transmission module 47 transmits this information to an annotation receiver 48 located in the sub-group of modules assigned to the person undertaking exercises. The annotation information reaches a processing and overlay module 49, which accesses the video sequences from the storage module 43 and combines the sequences with the annotations, so that the annotations appear at the appropriate times in the video sequence. Finally, the overlaid video sequence is displayed to the person who has exercised via a presentation module 50.
In order to provide a comprehensive disclosure without unduly lengthening the specification, the applicant hereby incorporates by reference each of the patents and patent applications referenced above.
The particular combinations of elements and features in the embodiments detailed above are exemplary only; the interchanging and substitution of these teachings with other teachings in this application and in the patents/applications incorporated by reference are also expressly contemplated. As those skilled in the art will recognize, variations, modifications and other implementations of what is described herein can occur to those of ordinary skill in the art without departing from the spirit and scope of the invention as claimed. Accordingly, the foregoing description is by way of example only and is not intended to be limiting. The scope of the invention is defined in the following claims and their equivalents. Furthermore, the reference numerals used in the description and claims do not limit the scope of the invention as claimed.

Claims (10)

1. A method for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising the following steps:
a) collecting physical exercise data from the person undertaking exercises;
b) synchronously collecting visual recordings of the person undertaking exercises;
c) transmitting the physical exercise data to a physically separate annotation unit;
d) annotating the physical exercise data at the physically separate annotation unit, based on the physical exercise data;
e) transmitting the annotation information to a display and processing unit for review by the person undertaking exercises;
f) displaying the visual recordings of the person undertaking exercises together with the synchronized annotation information to the person.
2. The method according to claim 1, wherein in step d) an avatar is computed from the physical exercise data at the physically separate annotation unit.
3. according to the method for claim 1 or 2, wherein step f) additionally comprises the calculating incarnation, and this incarnation and visual record synchronously are shown to this people with explaining.
4. according to the method for claim 1 to 3, wherein transmitting the physical exercise data and transmit annotating information in step c) in step e) is to carry out via the computer network that interconnects, and described computer network is the Internet preferably.
5. according to the method for claim 1 to 4, wherein from the group that comprises following item, select, that is: the order of severity and/or the breathing rate of exercise data, gesture data, electromyographic data, pulse rate, blood pressure, oxygen content, blood sugar content, perspiration from this people's physical exercise data.
6. according to the method for claim 1 to 5, wherein annotating information is selected from the group that comprises following item, that is: visual information, audio signal and/or voice record.
7. A system for displaying anonymously annotated physical exercise data to a person undertaking exercises, comprising:
- a physical data processing unit (1);
- a display device (2) in communication with the physical data processing unit (1);
- at least one posture recording device (3, 3'), assigned to the person undertaking exercises and in communication with the physical data processing unit (1);
- a visual recording device (4) in communication with the physical data processing unit (1);
- a data storage unit (5) for storing and retrieving data from the physical data processing unit (1) and the visual recording device (4), the data storage unit (5) being in communication with the physical data processing unit (1);
- a physically separate annotation unit (6) connected to the physical data processing unit (1), the connection being via an interconnected computer network (7).
8. The system according to claim 7, wherein the at least one posture recording device (3, 3') comprises motion sensors (3) on the body of the person undertaking exercises, the sensors being selected from the group comprising: acceleration sensors, inertial sensors and/or gravity sensors.
9. The system according to claim 7, wherein the at least one posture recording device (3, 3') comprises optical markers (3') on the body of the person undertaking exercises.
10. Use of a system according to any one of claims 7 to 9 for displaying anonymously annotated physical exercise data to a person undertaking exercises.
CN200880104207A 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data Pending CN101784230A (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
EP07114912.4 2007-08-24
EP07114912 2007-08-24
PCT/IB2008/053386 WO2009027917A1 (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Publications (1)

Publication Number Publication Date
CN101784230A true CN101784230A (en) 2010-07-21

Family

ID=40122948

Family Applications (1)

Application Number Title Priority Date Filing Date
CN200880104207A Pending CN101784230A (en) 2007-08-24 2008-08-22 System and method for displaying anonymously annotated physical exercise data

Country Status (5)

Country Link
US (1) US20110021317A1 (en)
EP (1) EP2185071A1 (en)
JP (1) JP2010536459A (en)
CN (1) CN101784230A (en)
WO (1) WO2009027917A1 (en)

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102440774A (en) * 2011-09-01 2012-05-09 东南大学 Remote measurement module for related physiological information in rehabilitation training process
CN103502987A (en) * 2011-02-17 2014-01-08 耐克国际有限公司 Selecting and correlating physical activity data with image date
US9297709B2 (en) 2013-03-15 2016-03-29 Nike, Inc. System and method for analyzing athletic activity
CN105615852A (en) * 2016-03-17 2016-06-01 北京永数网络科技有限公司 Blood pressure detection system and method
CN105641900A (en) * 2015-12-28 2016-06-08 联想(北京)有限公司 Respiration state reminding method, electronic equipment and system
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
US9389057B2 (en) 2010-11-10 2016-07-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9462844B2 (en) 2008-06-13 2016-10-11 Nike, Inc. Footwear having sensor system
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US9622537B2 (en) 2008-06-13 2017-04-18 Nike, Inc. Footwear having sensor system
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US9756895B2 (en) 2012-02-22 2017-09-12 Nike, Inc. Footwear having sensor system
US9924760B2 (en) 2011-02-17 2018-03-27 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US10568381B2 (en) 2012-02-22 2020-02-25 Nike, Inc. Motorized shoe with gesture control
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
CN112805073A (en) * 2018-08-07 2021-05-14 交互力量公司 Interactive fitness equipment system with mirror display
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control

Families Citing this family (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8500604B2 (en) * 2009-10-17 2013-08-06 Robert Bosch Gmbh Wearable system for monitoring strength training
JP5791726B2 (en) 2010-09-29 2015-10-07 ダカドー・アーゲー Automated health data acquisition, health data processing, and health data communication system
US9378336B2 (en) 2011-05-16 2016-06-28 Dacadoo Ag Optical data capture of exercise data in furtherance of a health score computation
US20130178960A1 (en) * 2012-01-10 2013-07-11 University Of Washington Through Its Center For Commercialization Systems and methods for remote monitoring of exercise performance metrics
ITGE20120011A1 (en) * 2012-01-27 2013-07-28 Paybay Networks S R L PATIENT REHABILITATION SYSTEM
US9652992B2 (en) * 2012-10-09 2017-05-16 Kc Holdings I Personalized avatar responsive to user physical state and context
US9501942B2 (en) 2012-10-09 2016-11-22 Kc Holdings I Personalized avatar responsive to user physical state and context
JP5811360B2 (en) * 2012-12-27 2015-11-11 カシオ計算機株式会社 Exercise information display system, exercise information display method, and exercise information display program
JP2014199613A (en) * 2013-03-29 2014-10-23 株式会社コナミデジタルエンタテインメント Application control program, application control method, and application control device
US20150133820A1 (en) * 2013-11-13 2015-05-14 Motorika Limited Virtual reality based rehabilitation apparatuses and methods
US20170000388A1 (en) * 2014-01-24 2017-01-05 Icura Aps System and method for mapping moving body parts
US10484437B2 (en) * 2015-01-21 2019-11-19 Logmein, Inc. Remote support service with two-way smart whiteboard
US20160346612A1 (en) * 2015-05-29 2016-12-01 Nike, Inc. Enhancing Exercise Through Augmented Reality
WO2017055080A1 (en) * 2015-09-28 2017-04-06 Koninklijke Philips N.V. System and method for supporting physical exercises
KR102511518B1 (en) * 2016-01-12 2023-03-20 삼성전자주식회사 Display apparatus and control method of the same
JP7009955B2 (en) * 2017-11-24 2022-01-26 トヨタ自動車株式会社 Medical data communication equipment, servers, medical data communication methods and medical data communication programs
US20200107750A1 (en) * 2018-10-03 2020-04-09 Surge Motion Inc. Method and system for assessing human movements

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5679004A (en) * 1995-12-07 1997-10-21 Movit, Inc. Myoelectric feedback system
EP0816986B1 (en) * 1996-07-03 2006-09-06 Hitachi, Ltd. System for recognizing motions
JP3469410B2 (en) * 1996-11-25 2003-11-25 三菱電機株式会社 Wellness system
US20060247070A1 (en) * 2001-06-11 2006-11-02 Recognition Insight, Llc Swing position recognition and reinforcement
US20030054327A1 (en) * 2001-09-20 2003-03-20 Evensen Mark H. Repetitive motion feedback system and method of practicing a repetitive motion
US6817979B2 (en) * 2002-06-28 2004-11-16 Nokia Corporation System and method for interacting with a user's virtual physiological model via a mobile terminal
US20060025229A1 (en) * 2003-12-19 2006-02-02 Satayan Mahajan Motion tracking and analysis apparatus and method and system implementations thereof
EP1846115A4 (en) * 2005-01-26 2012-04-25 Bentley Kinetics Inc Method and system for athletic motion analysis and instruction
US20060183980A1 (en) * 2005-02-14 2006-08-17 Chang-Ming Yang Mental and physical health status monitoring, analyze and automatic follow up methods and its application on clothing
WO2006103676A2 (en) * 2005-03-31 2006-10-05 Ronen Wolfson Interactive surface and display system
US20090299232A1 (en) * 2006-07-12 2009-12-03 Koninklijke Philips Electronics N.V. Health management device
WO2009024929A1 (en) * 2007-08-22 2009-02-26 Koninklijke Philips Electronics N.V. System and method for displaying selected information to a person undertaking exercises

Cited By (42)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9462844B2 (en) 2008-06-13 2016-10-11 Nike, Inc. Footwear having sensor system
US10314361B2 (en) 2008-06-13 2019-06-11 Nike, Inc. Footwear having sensor system
US10408693B2 (en) 2008-06-13 2019-09-10 Nike, Inc. System and method for analyzing athletic activity
US11707107B2 (en) 2008-06-13 2023-07-25 Nike, Inc. Footwear having sensor system
US10070680B2 (en) 2008-06-13 2018-09-11 Nike, Inc. Footwear having sensor system
US9622537B2 (en) 2008-06-13 2017-04-18 Nike, Inc. Footwear having sensor system
US9549585B2 (en) 2008-06-13 2017-01-24 Nike, Inc. Footwear having sensor system
US11026469B2 (en) 2008-06-13 2021-06-08 Nike, Inc. Footwear having sensor system
US10912490B2 (en) 2008-06-13 2021-02-09 Nike, Inc. Footwear having sensor system
US11600371B2 (en) 2010-11-10 2023-03-07 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9389057B2 (en) 2010-11-10 2016-07-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11568977B2 (en) 2010-11-10 2023-01-31 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US10632343B2 (en) 2010-11-10 2020-04-28 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11817198B2 (en) 2010-11-10 2023-11-14 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US9757619B2 (en) 2010-11-10 2017-09-12 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US11935640B2 (en) 2010-11-10 2024-03-19 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US10293209B2 (en) 2010-11-10 2019-05-21 Nike, Inc. Systems and methods for time-based athletic activity measurement and display
US10179263B2 (en) 2011-02-17 2019-01-15 Nike, Inc. Selecting and correlating physical activity data with image data
US9411940B2 (en) 2011-02-17 2016-08-09 Nike, Inc. Selecting and correlating physical activity data with image data
US9924760B2 (en) 2011-02-17 2018-03-27 Nike, Inc. Footwear having sensor system
CN103502987A (en) * 2011-02-17 2014-01-08 耐克国际有限公司 Selecting and correlating physical activity data with image date
US9381420B2 (en) 2011-02-17 2016-07-05 Nike, Inc. Workout user experience
CN103502987B (en) * 2011-02-17 2017-04-19 耐克创新有限合伙公司 Selecting and correlating physical activity data with image date
CN102440774A (en) * 2011-09-01 2012-05-09 东南大学 Remote measurement module for related physiological information in rehabilitation training process
US11793264B2 (en) 2012-02-22 2023-10-24 Nike, Inc. Footwear having sensor system
US10568381B2 (en) 2012-02-22 2020-02-25 Nike, Inc. Motorized shoe with gesture control
US10357078B2 (en) 2012-02-22 2019-07-23 Nike, Inc. Footwear having sensor system
US11684111B2 (en) 2012-02-22 2023-06-27 Nike, Inc. Motorized shoe with gesture control
US9756895B2 (en) 2012-02-22 2017-09-12 Nike, Inc. Footwear having sensor system
US11071344B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Motorized shoe with gesture control
US11071345B2 (en) 2012-02-22 2021-07-27 Nike, Inc. Footwear having sensor system
US9743861B2 (en) 2013-02-01 2017-08-29 Nike, Inc. System and method for analyzing athletic activity
US11006690B2 (en) 2013-02-01 2021-05-18 Nike, Inc. System and method for analyzing athletic activity
US10926133B2 (en) 2013-02-01 2021-02-23 Nike, Inc. System and method for analyzing athletic activity
US11918854B2 (en) 2013-02-01 2024-03-05 Nike, Inc. System and method for analyzing athletic activity
US9410857B2 (en) 2013-03-15 2016-08-09 Nike, Inc. System and method for analyzing athletic activity
US10024740B2 (en) 2013-03-15 2018-07-17 Nike, Inc. System and method for analyzing athletic activity
US9297709B2 (en) 2013-03-15 2016-03-29 Nike, Inc. System and method for analyzing athletic activity
US9810591B2 (en) 2013-03-15 2017-11-07 Nike, Inc. System and method of analyzing athletic activity
CN105641900A (en) * 2015-12-28 2016-06-08 联想(北京)有限公司 Respiration state reminding method, electronic equipment and system
CN105615852A (en) * 2016-03-17 2016-06-01 北京永数网络科技有限公司 Blood pressure detection system and method
CN112805073A (en) * 2018-08-07 2021-05-14 交互力量公司 Interactive fitness equipment system with mirror display

Also Published As

Publication number Publication date
US20110021317A1 (en) 2011-01-27
JP2010536459A (en) 2010-12-02
EP2185071A1 (en) 2010-05-19
WO2009027917A1 (en) 2009-03-05

Similar Documents

Publication Publication Date Title
CN101784230A (en) System and method for displaying anonymously annotated physical exercise data
US20220005577A1 (en) Systems, apparatus and methods for non-invasive motion tracking to augment patient administered physical rehabilitation
US10755466B2 (en) Method and apparatus for comparing two motions
US10089763B2 (en) Systems and methods for real-time data quantification, acquisition, analysis and feedback
KR100772497B1 (en) Golf clinic system and application method thereof
US20170136296A1 (en) System and method for physical rehabilitation and motion training
US8758020B2 (en) Periodic evaluation and telerehabilitation systems and methods
US20150327794A1 (en) System and method for detecting and visualizing live kinetic and kinematic data for the musculoskeletal system
US9248361B1 (en) Motion capture and analysis systems for use in training athletes
KR20180103280A (en) An exercise guidance system for the elderly that performs posture recognition based on distance similarity between joints
EP2635988A1 (en) Method and system for automated personal training
US20130280683A1 (en) Equestrian Performance Sensing System
US20170112418A1 (en) Motion capture and analysis system for assessing mammalian kinetics
KR20200059428A (en) Exercise management system based on wearable device
CA3152977A1 (en) Systems and methods for wearable devices that determine balance indices
US20210265055A1 (en) Smart Meditation and Physiological System for the Cloud
US20200371738A1 (en) Virtual and augmented reality telecommunication platforms
KR102388337B1 (en) Service provision method of the application for temporomandibular joint disease improvement service
US20240057926A1 (en) Neurofeedback rehabilitation system
JP7353605B2 (en) Inhalation motion estimation device, computer program, and inhalation motion estimation method
US20210352066A1 (en) Range of Motion Tracking System
CN117396976A (en) Patient positioning adaptive guidance system
KR20220067781A (en) Electrical muscle stimulation training system and method
Vella et al. Towards the Human Ethome: Human Kinematics Study in Daily Life Environments

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C02 Deemed withdrawal of patent application after publication (patent law 2001)
WD01 Invention patent application deemed withdrawn after publication

Open date: 20100721