WO2017088068A1 - Motion capture garment

Motion capture garment

Info

Publication number
WO2017088068A1
Authority
WO
WIPO (PCT)
Prior art keywords
sensor
segment
user
rotation matrix
motion
Prior art date
Application number
PCT/CA2016/051398
Other languages
French (fr)
Inventor
Mazen ELBAWAB
Jonathan OLESIK
Fatemeh AGHAZADEH
Francis AMANKRAH
Diana HORQQUE
Lisa ZANE
Original Assignee
9281-7428 Québec Inc.
Priority date
Filing date
Publication date
Application filed by 9281-7428 Québec Inc.
Publication of WO2017088068A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01P MEASURING LINEAR OR ANGULAR SPEED, ACCELERATION, DECELERATION, OR SHOCK; INDICATING PRESENCE, ABSENCE, OR DIRECTION, OF MOVEMENT
    • G01P13/00 Indicating or recording presence, absence, or direction, of movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6804 Garments; Clothes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6802 Sensor mounted on worn items
    • A61B5/6804 Garments; Clothes
    • A61B5/6805 Vests
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6822 Neck
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6823 Trunk, e.g., chest, back, abdomen, hip
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6801 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B5/6813 Specially adapted to be attached to a specific body part
    • A61B5/6824 Arm or wrist
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D1/00 Garments
    • A41D1/002 Garments adapted to accommodate electronic equipment
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D13/00 Professional, industrial or sporting protective garments, e.g. surgeons' gowns or garments protecting against blows or punches
    • A41D13/0015 Sports garments other than provided for in groups A41D13/0007 - A41D13/088
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D2400/00 Functions or special features of garments
    • A41D2400/80 Friction or grip reinforcement
    • A41D2400/82 Friction or grip reinforcement with the body of the user
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D2500/00 Materials for garments
    • A41D2500/10 Knitted
    • A HUMAN NECESSITIES
    • A41 WEARING APPAREL
    • A41D OUTERWEAR; PROTECTIVE GARMENTS; ACCESSORIES
    • A41D31/00 Materials specially adapted for outerwear
    • A41D31/04 Materials specially adapted for outerwear characterised by special function or use
    • A41D31/18 Elastic
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B2503/10 Athletes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B5/1114 Tracking parts of the body

Definitions

  • the technical field generally relates to wearable technologies. More particularly, it relates to a garment for three-dimensional tracking of a user's movements.
  • An improved system is therefore needed which solves at least some of the problems of the prior art.
  • Preferably, such an improved system should reduce the number of sensors required to track 3D body movement, and should be more practical for use in the context of tracking body movement during physical activity or sport.
  • a device for sensing three-dimensional (3D) motion of a user's body includes at least one form-fitting garment article having a plurality of compression areas formed of compressive fabric and a plurality of tracked areas, the compression areas being located to maintain position of the tracked areas over the user's body when the form-fitting garment article is worn by the user, a plurality of motion sensors positioned on the form-fitting garment article at the tracked areas, the motion sensors being operable to capture 3D motion.
  • According to another aspect, there is provided a kit for capturing 3-D motion of a user.
  • the kit includes a plurality of sensor capsules operable to capture 3-D motion, a form-fitting garment article having a plurality of sensor holders attached thereto, each sensor holder being configured to removably retain one of the sensor capsules, and a brain pack removably attachable to the form-fitting garment article and configured to receive captured motion data from the sensor capsules.
  • a method for sensing three- dimensional (3D) motion of a user includes wearing, over the user's body, a form-fitting garment article having a plurality of compression areas and a plurality of tracked areas; positioning the compression areas over a first set of body parts of the user to be compressed; positioning the tracked areas over a second set of body parts of the user to be tracked, and capturing, by a plurality of sensors positioned on the form-fitting garment article at the tracked areas, 3-D motion of the second set of body parts of the user.
  • a method for processing three-dimensional (3D) motion of a user includes receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user, receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment of the user, calibrating the first inertial sensor and second inertial sensor based on the orientation information captured when the user is in a calibration pose, transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose, transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose; and adjusting one of the rotation matrices based on the other of the rotation matrices.
  • a method for processing three-dimensional (3D) motion of a user includes receiving orientation information captured by an inertial sensor positioned at an upper sub-segment of a body segment of the user; receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to a lower sub-segment of the body segment; calibrating the inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose; transforming the orientation information captured by the inertial sensor into a first rotation matrix defining the orientation of the inertial sensor relative to the calibration pose; mapping the stretch signal captured by the stretch sensor to flexion angle information; and determining a corrected first rotation matrix based on the flexion angle to determine a second rotation matrix defining an orientation of the lower sub-segment.
  • a method for processing three-dimensional (3D) motion of a user includes receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user; receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment; receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to the lower sub-segment; calibrating the first inertial sensor, the second inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose; transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose; transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose; mapping the stretch signal captured by the stretch sensor to flexion angle information; and compensating the first and second rotation matrices based on the flexion angle information.
  • a smart garment for tracking three-dimensional body movement of a user includes at least one stretchable sensor spanning across a joint of the user's body, the stretchable sensor generating a flexion signal corresponding to a flexion angle of said joint; and at least one additional sensor positioned along a first segment of the user's body adjacent said joint, the additional sensor generating an orientation signal corresponding to an orientation of said first segment.
  • the flexion signal and the orientation signal are processed to estimate a three-dimensional position or orientation of a second segment of the user's body adjacent to said joint and opposite said first segment.
  • a smart garment for tracking three-dimensional body movement of a user includes at least one sensor cluster for tracking movement of a segment of the user's body.
  • the sensor cluster includes: a stretchable sensor spanning across a joint of the user's body, the stretchable sensor generating a first signal corresponding to a flexion angle of said joint; a first inertial sensor positioned adjacent the joint, the first inertial sensor generating a second signal corresponding to an orientation of a proximal segment of the user's body; and a second inertial sensor positioned adjacent the joint, opposite the first inertial sensor, the second inertial sensor generating a third signal corresponding to an orientation of a distal segment of the user's body adjacent said proximal segment.
  • the first, second and third signals are processed to estimate a three-dimensional movement of a portion of the user's body.
  • a kit for tracking three-dimensional body movement of a user includes: a garment wearable to cover a portion of a user's limb; a stretchable sensor attachable across a portion of the garment spanning a joint of the user's body, the stretchable sensor generating a first signal corresponding to a flexion angle of said joint; a first inertial sensor attachable to a portion of the garment adjacent the joint, the first inertial sensor generating a second signal corresponding to an orientation of a proximal segment of the user's body; a second inertial sensor attachable to a portion of the garment adjacent the joint and opposite the first inertial sensor, the second inertial sensor generating a third signal corresponding to an orientation of a distal segment of the user's body adjacent said proximal segment.
  • the first, second and third signals are processed to estimate a three-dimensional movement of a portion of the user's body.
  • a method for tracking three-dimensional body movement of a user includes the steps of: generating a first signal corresponding to a flexion angle of a joint on the user's body, the first signal being generated by a stretchable sensor positioned across said joint; generating a second signal corresponding to an orientation of a proximal segment of a user's body, the second signal being generated by an inertial sensor positioned on the proximal segment of the user's body adjacent the joint; generating a third signal corresponding to an orientation of a distal segment of the user's body, the third signal being generated by an inertial sensor positioned on the distal segment of the user's body opposite the proximal segment and adjacent the joint; and processing the first, second and third signals to estimate a three-dimensional movement of a portion of the user's body.
  • processing the first, second and third signals comprises processing the second and third signals to determine an estimated position of the proximal and distal segments, processing the first signal to determine the flexion angle of the joint, and correcting the estimated position of the proximal and distal segments using the determined flexion angle of the joint.
  • Figures 1A and 1B are respective front and rear views of a motion capture garment, according to an embodiment.
  • Figure 1C is an individual view of the brain pack in the motion capture garment of Figures 1A and 1B.
  • Figure 2A is a schematic showing the capsule placement on the motion capture garment of Figures 1A and 1B.
  • Figures 2B, 2C and 2D are respective individual views of the capsules and holders shown in Figure 2A.
  • Figure 2E is a rear view of the capsule.
  • Figure 3A is a schematic showing the stretch sensor placement on the motion capture garment of Figures 1A and 1B.
  • Figures 3B and 3C are respective individual views of the stretch sensors shown in Figure 3A.
  • Figure 4 is a schematic illustrating a segmentation of the human body for creating a model thereof, as used in a method for tracking body motion according to an embodiment.
  • Figure 5 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using inertial sensors only.
  • Figure 6 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using one inertial sensor and one stretch sensor per tracked segment.
  • Figure 7 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using two inertial sensors and one stretch sensor per tracked segment.
  • Figures 8A and 8B are respective block diagrams of the data and power modules in the brain pack of Figure 1C.
DETAILED DESCRIPTION
  • a motion capture garment 100 is provided according to an embodiment.
  • the garment 100 includes at least one garment article that is wearable by a user.
  • the garment 100 is further configured to track body movements in 3D.
  • the garment article of the garment 100 is a sportswear garment in that it is designed to be worn for sports, exercise or other physical activities.
  • the garment article is a seamless garment made of a stretchable and breathable material, and is sized and shaped such that it is form-fitting in that it closely conforms to a user's body.
  • the garment article can, for example, be a compression-type garment article, and can be provided with compression areas at specific locations on the user's body which can aid the user in positioning the garment 100 properly, consequently placing sensors 108 in their proper positions.
  • Compression areas herein refer to areas of the garment article of the garment 100 that are formed of compressive fabric to provide a compression fit to a corresponding area of the user's body when the garment article is worn.
  • various areas of the garment article may be formed of compressive fabric.
  • the garment article may be entirely formed of compressive fabric. Accordingly, the compression areas of the at least one garment article correspond to areas of the garment article that apply a greater compressive force than other areas of the form-fitting garment article.
  • the form-fitting garment article is formed of a continuously knitted material so as to be substantially seamless.
  • the compression areas correspond to more tightly knitted areas of the continuously knitted material.
  • the compression areas are located on the at least one garment article so that tracked areas of the garment article are maintained in position over the user's body when the form-fitting garment article is worn by the user. Some tracked areas may be located in proximity of the compression areas. Alternatively, or additionally, some tracked areas may be located at the compression area.
  • the tracked areas of the garment article correspond to locations where sensors described herein are affixed to the garment article of the garment 100.
  • the at least one garment article includes an upper- body section 102 that is designed to fit a user's upper body, and comprises sleeve portions which cover the user's arms, preferably down to the user's wrists, and a torso portion which covers the user's torso.
  • the at least one garment article includes a lower-body section 104 that is designed to fit a user's lower body, and comprises leg portions which cover the user's legs, preferably down to the user's ankles, and a waist portion covering the user's waist.
  • the garment article can serve as a frame for mounting and embedding components useful in capturing 3D motion: by providing garment sections 102, 104 which cover the entirety of the user's body as described above, components can be positioned at virtually any location on the user's body.
  • the garment sections 102, 104 are each provided with a network of electrically conductive channels 106.
  • the conducting channels 106 can comprise, for example, isolated and preferably insulated or protected channels through which conductive wiring can be fed.
  • the conducting channels 106 can additionally or alternatively comprise electrically conductive fibres embedded in or affixed to the garment.
  • the conductive channels 106 serve to electrically connect various electrical components positioned along the garment 100.
  • the conductive channels 106 are positioned and arranged such that they minimize the length of wire required to form a complete conductive network, and such that they do not interfere with the free movement of the user wearing the garment 100.
  • the channels 106 extend over a user's trapezius muscles to connect to the back of the user's arms.
  • the channels 106 in the upper-body section 102 and lower-body section 104 are configured such that they meet near a common nodal region 103, for example near the junction of the sections 102, 104 at the user's waist.
  • the channels 106 can comprise mainly straight lines, and can be provided with gradually curved portions in order to reach the nodal region 103.
  • the garment article is a one-piece suit that covers the full body of the user.
  • the at least one garment article of the garment 100 may cover less than the full body of the user.
  • only the upper body section 102 or only the lower body section 104 is provided or worn at a given time.
  • the upper body section 102 may be a short sleeve article in which the lower arm portions of the user are exposed when worn.
  • the lower body section 104 may be in the form of shorts in which lower leg portions of the user are exposed when worn.
  • a plurality of sensors 108 are strategically positioned at various locations on the garment 100 in order to capture 3D motion data of the user's full-body movements.
  • the sensors 108 are positioned along the network of conductive channels 106, and are in electric communication therewith.
  • the sensors 108 can receive power from a power source connected to the network of channels 106, and can also transmit and/or receive signals from other components through the channels 106.
  • the sensors 108 can be provided with an independent power source, such as a battery, which can be charged through the channels 106.
  • the sensors 108 can communicate wirelessly, or store data locally in order to be transferred at a later time. In such configurations, the sensors 108 need only use the channels 106 for charging purposes.
  • a first type of sensor 108 can include a biomechanical capsule 200.
  • the capsule 200 comprises an inertial measurement unit (IMU) capable of measuring movement along 9 degrees-of-freedom (DOF): orientation in the x, y and z directions (yaw, pitch, and roll), acceleration in the x, y and z directions, rate of rotation in x, y, and z directions, and magnetic field in the x, y and z directions.
  • the capsule also includes a controller and a transmitter for operating the IMU and communicating data captured therefrom to another location, such as a remote processor or a central hub.
  • the controller can be implemented using an application specific integrated circuit or could comprise, for example, a processor and memory.
  • the transmitter is wireless and comprises a Bluetooth chip, but in other embodiments, other wired or wireless transmitters can also be used.
  • the capsule can be provided with a storage module including memory which can serve to save data captured by the IMU for offline processing.
  • the capsule 200 comprises a rechargeable battery for independently powering the components housed therein. It should be understood, however, that in other embodiments a replaceable or non-rechargeable battery can be provided. In other embodiments, the capsule 200 can operate without a battery and be powered through a connection with the network of channels 106.
  • additional sensors can be provided within the capsule 200, which can be used to measure other 3D motion parameters, increase the accuracy of the IMU and/or capture physiological data of the user, such as the user's heart rate, body temperature, etc.
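For illustration only, the quantities such a capsule reports could be grouped into a single sample record, as in the following Python sketch; the field names and units are assumptions, since the patent only lists the quantities measured along the x, y and z axes.

```python
from dataclasses import dataclass
from typing import Tuple

@dataclass
class CapsuleSample:
    """One hypothetical 9-DOF reading from a biomechanical capsule 200."""
    orientation: Tuple[float, float, float]    # yaw, pitch, roll (degrees)
    acceleration: Tuple[float, float, float]   # x, y, z (assumed m/s^2)
    rotation_rate: Tuple[float, float, float]  # x, y, z (assumed deg/s)
    magnetic_field: Tuple[float, float, float] # x, y, z (assumed microtesla)

sample = CapsuleSample(orientation=(12.0, -3.5, 0.8),
                       acceleration=(0.1, 9.8, 0.0),
                       rotation_rate=(0.0, 1.2, -0.4),
                       magnetic_field=(22.0, 5.1, -43.0))
```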
  • the garment 100 can include a plurality of capsules 200, and each capsule can include different types of sensors depending on the capsule's location and intended function.
  • the capsule 200 is preferably removably attached to the garment 100, thereby allowing the capsule 200 to be removed while the garment is being washed or stored.
  • the capsule 200 is removably engageable with a corresponding holder 250.
  • the holder 250 is preferably permanently attached to the garment 100, and is preferably made from a water and heat resistant material, such as a strong plastic.
  • the holder 250 is attached to the network of channels 106 in the garment 100, and comprises an interface 254, allowing the capsule 200 to connect to the network of channels 106 when inserted into the holder 250.
  • a backside of the capsule 200 comprises one or more electrical contacts for forming a conductive path with an electrical contact of the capsule holder.
  • nine capsules 200A-200I are provided which can be inserted into nine corresponding holders 250A-250I at different locations on the garment 100.
  • the capsules 200A-200I and holders 250A-250I can be provided with identifiers 202, 252, allowing a user to match a particular one of the capsules 200A-200I with its corresponding holder 250A-250I.
  • a first holder 250A is provided on the user's chest, a second 250B on the user's right upper arm, a third 250C on the user's right lower arm, a fourth 250D on the user's left upper arm, a fifth 250E on the user's left lower arm, a sixth 250F on the user's right upper leg, a seventh 250G on the user's right lower leg, an eighth 250H on the user's left upper leg, and a ninth 250I on the user's left lower leg.
  • more or fewer capsules can be provided and their position can vary depending on the type of sensor data required to determine the user's 3D movements.
  • the upper body section 102 garment article includes a first elbow compression area 256A to be located over a right elbow joint of the user's body when the upper body section 102 is worn.
  • the first elbow compression area 256A is in proximity of a right upper arm tracked area where the second holder 250B is located. Accordingly, proper placement of the first elbow compression area 256A over the user's right elbow contributes to maintaining the second holder 250B in position over the user's right upper arm.
  • the first elbow compression area 256A is also in proximity of a right lower arm tracked area where the third holder 250C is located. Accordingly, proper placement of the first compression area 256A over the user's right elbow also contributes to maintaining the third holder 250C in position over the user's right lower arm. It will be appreciated that the second elbow compression area 256B to be located over a left elbow joint of the user's body provides a similar effect in maintaining the fourth holder 250D and/or fifth holder 250E in position over the user's upper left arm and lower left arm respectively.
  • the lower body section 104 garment article includes a first knee compression area 256C to be located over a right knee of the user's body when the lower body section 104 is worn.
  • the first knee compression area 256C is in proximity of a right upper leg tracked area where the sixth holder 250F is located. Accordingly, proper placement of the first knee compression area 256C over the user's right knee contributes to maintaining the sixth holder 250F in position over the user's right upper leg.
  • the first knee compression area 256C is also in proximity of a right lower leg tracked area where the seventh holder 250G is located. Accordingly, proper placement of the first knee compression area 256C over the user's right knee also contributes to maintaining the seventh holder 250G in position over the user's right lower leg.
  • the second knee compression area 256D to be located over a left knee joint of the user's body provides a similar effect in maintaining the eighth holder 250H and/or ninth holder 250I in position over the user's upper left leg and lower left leg respectively.
  • the lower body section 104 garment article may further include a first calf compression area 256E to be located over a right calf of the user's body when the lower body section 104 is worn.
  • the first calf compression area 256E is in proximity of the right lower leg tracked area where the seventh holder 250G is located. Accordingly, proper placement of the first calf compression area 256E over the user's right calf in combination with the first knee compression area 256C contributes to maintaining the seventh holder 250G in position over the user's right lower leg.
  • the second calf compression area 256F to be located over the left calf of the user's body provides a similar effect in maintaining the ninth holder 250I in position over the user's lower left leg.
  • distinctive visual indicators are provided on the compression areas 256A-256F so that they may be easily identified and correctly positioned over their corresponding body parts.
  • a second type of sensor 108 includes a stretchable sensor (or “stretch sensor”) such as a sensor band 300.
  • the sensor band 300 comprises a flexible fabric sensor, such as the StretchSense™ fabric sensor.
  • the output of the sensor corresponds to an amount by which the sensor is stretched, providing one DOF along the stretching axis.
  • the sensor band 300 can be positioned along a joint or a muscle, thereby sensing the movement angle of a joint or the flex of a muscle.
  • while the sensor band 300 comprises a fabric sensor in the present embodiment, other types of stretch sensors can also be used.
  • the sensor band 300 is preferably removably attached to the garment 100, thereby allowing the band 300 to be removed while the garment is being washed or stored. It should be understood, however, that in some embodiments the band 300 can be made water and/or heat resistant such that it is safe to wash the garment 100 with the band 300 still attached. In the present embodiment, the band 300 is securable at both ends to a connector 350 on the garment 100, which serves to hold the band 300 in place and allows the band to be stretched as the user moves while wearing the garment 100.
  • the connector 350 can comprise, for example, textile hook-and-loop fasteners and/or buckle-type fasteners.
  • the connector 350 also comprises an interface 352 for electrically connecting with a corresponding interface 302 of the band 300.
  • connection of these interfaces 302, 352 allows the band 300 to communicate with the network of channels 106 in the garment 100, and thus provide sensing signals thereto for processing.
  • the band 300 can further be secured to the garment 100 by insertion through a fastener loop 356 attached to or woven into the garment 100.
  • four bands 300A-300D are provided, which can be secured to connectors 350 at four different locations on the garment 100.
  • the bands 300 and connectors 350 can be provided with identifiers 302, 352, allowing a user to match a particular band 300 with its corresponding connector 350.
  • a first band 300A is securable across the user's right elbow, a second band 300B across the user's left elbow, a third 300C across the user's right knee, and a fourth 300D across the user's left knee.
  • each band 300 is associated with and positioned between two capsules 200.
  • more or fewer bands can be provided and their position can vary depending on the type of sensor data required to determine the user's 3D movements.
  • all the bands 300 can be made of the same material and have the same sensing properties, while in other embodiments, the bands 300 can be made of different materials or have different properties depending on where they are to be placed and depending on their desired sensing objective. For example, a band intended to measure high forces on large muscle groups or large joints can be made differently than a band intended for low forces; the same goes for fast vs. slow acceleration, long vs. short impulse, and other sensing considerations.
  • the size and/or shape of the bands 300 can be designed for optimal sensing at a particular location, and/or different materials can be used to form the bands 300.
  • the sensors 108 communicate with and are charged through a brain pack 110.
  • the brain pack 110 is sized, shaped and positioned such that it does not interfere with the user's movements.
  • the brain pack 110 fits around the user's waist, and is positioned near the common nodal region 103 of the conductive channels 106 in the upper- and lower-body sections 102, 104.
  • the pack 110 does not necessarily contain intelligence, and therefore does not necessarily process information.
  • the pack 110 in the present embodiment acts as a hub, serving many different functions including interfacing with the capsules 200 and external processing devices through a data printed circuit board (PCB) 800, and providing power to the garment 100 and sensors 108 through a power PCB 850.
  • the components of the brain pack 110 can be implemented on a single PCB or on several PCBs, depending on the form factor and manufacturability requirements.
  • the power PCB 850 mainly serves to manage power in the system. It comprises a power section 852, which can interface with and manage the charging and discharging of a battery which is housed in the pack, and the distribution of power to different components of the system. It can also serve to operate the stretch sensors 300, for example by powering the sensors and/or receiving stretch information therefrom through an electronics interface 854.
  • the data PCB 800 mainly serves to manage data, for example by storing it to an SD card 132, or transmitting it wirelessly via Bluetooth.
  • the data PCB 800 is provided with a controller, in this case a microcontroller unit (MCU) 804, for controlling the operation of the sensors 108, the SD card 132 and the I/O elements such as buttons 128 and LEDs 130.
  • the MCU 804 can also serve to receive and interpret data received from the sensors 108, and operate a wireless module, in this case a Bluetooth radio frequency (RF) section 806.
  • the MCU 804 can operate Bluetooth aggregator modules in the RF section 806 to gather information received from each of the sensors, and combine this information into a single frame to be transmitted to an external device for further processing.
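As a minimal sketch of such frame aggregation, assuming each sensor reports a small dictionary of values (the JSON layout, field names and sensor identifiers are illustrative assumptions, not the patent's wire format):

```python
import json
import time

def aggregate_frame(readings):
    """Combine the latest reading from each sensor into a single frame
    for transmission to an external device. `readings` maps a sensor
    identifier (e.g. "200A") to its most recent sample; JSON with a
    shared timestamp is an illustrative choice only."""
    return json.dumps({"timestamp": time.time(), "sensors": readings})

frame = aggregate_frame({"200A": {"yaw": 1.0, "pitch": 0.2, "roll": -0.1},
                         "300A": {"stretch": 0.37}})
```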
  • the brain pack 110 can wirelessly communicate with a mobile phone in order to transmit sensor information thereto, and to receive operation commands therefrom. It can also be wirelessly connected to the capsules 200, for example via a Bluetooth connection.
  • the pack 110 is provided with a memory module, in this case a removable SD memory card 132, for saving recorded data from the sensors 108 for offline processing; in alternate embodiments, however, different types of memory can be provided.
  • human interface devices are also provided such as buttons 128 and/or LEDs 130, but in alternate embodiments, different human interface or I/O devices can be provided, such as a touchscreen or touch pad.
  • the brain pack 110 is electrically connectable to the network of channels 106 in each of the upper-body 102 and lower-body 104 sections via removable connectors 112, 114.
  • the removable connectors 112, 114 fit within correspondingly marked ports 122, 124 in the brain pack 110.
  • the brain pack 110 is further provided with an interface port 126, such as a USB port for example, for receiving power from an external source and/or for communicating with an external device.
  • the brain pack 110 can provide power to the garment 100 through the channels 106, allowing any sensors 108 or other components attached thereto to charge.
  • the battery provided in the brain pack 110 is preferably rechargeable, allowing it to be charged when a power source is connected to the interface port 126.
  • the charging of the garment 100 is completely integrated: a single power connection 126 is required in order to charge the brain pack 110 and sensors 108, so long as they are all connected through the conductive channels 106.
  • the garment 100 gathers motion data from the sensors 108 and can transmit the data in real-time to an external processing device, such as a smartphone.
  • the smartphone can receive this information and process it in order to generate a model of the user's body in real-time.
  • the generated model can then be displayed to the user, or to a coach who is aiding the user, in order to provide the user with feedback, for example to improve form when performing athletic activities.
  • the motion data can be recorded by the SD card and can be processed offline, for example after being loaded to an external processing device such as a computer. The user can thereby see a playback of his recorded movements, and pause, rewind and fast-forward as necessary.
  • This motion information can further be aggregated to identify trends in the user's motions, and identify improvements or progressions in certain repeated movements, for example to aid in improving weightlifting form over time.
  • a variety of sensors can be used in order to capture 3D motion. Data from these sensors can be processed and interpreted in order to determine or estimate a user's actual body movements in 3D space.
  • the garment 100 described herein therefore attempts to reduce the number of sensors required while maintaining sufficient accuracy and reliability by optimizing sensor placement and by combining sensor types.
  • a combination of inertial sensors 200 and stretchable sensors 300 is used in order to capture 3D movement. These two different types of sensors are placed at strategic locations on the user's body, allowing them each to independently collect data relative to the user's movements. This data can then be processed and combined to obtain a result that is more accurate than any one of the sensors taken individually.
  • two inertial sensors 200 can be associated with a stretchable sensor 300.
  • a stretchable sensor 300C can be placed across the user's right knee, between an inertial sensor 200F on the user's right upper leg and inertial sensor 200G on the user's right lower leg. Using sensor fusion techniques, information from these three sensors can together form an accurate representation of the user's right leg movement.
  • the capsules 200 can be removed from their holders 250, the stretchable sensor 300, where provided, can be removed and the brain pack 110 can be disconnected so that the garment 100 can be washed.
  • the garment article having the network of conductive channels 106 affixed thereto may be fully submerged, such as being washed in a washing machine, while still providing full sensing functionality when subsequently dried.
  • the form-fitting garment article is first worn over the user's body.
  • the compression areas of the form- fitting garment article are appropriately positioned over a first set of corresponding body parts of the user.
  • tracked areas of the garment article located in proximity of or at the compression areas are also positioned over a second set of corresponding body parts of the user. The tracked areas correspond to where motion sensors are to be placed.
  • 3-D motion of the second set of body parts may be captured using the sensors.
  • the capturing generates motion data, which may be further processed as described hereinbelow.
  • the method further includes, prior to capturing motion, attaching the brain pack to the waist portion of the user and connecting the brain pack to sensors via the conductive channels formed in the garment article.
  • the method also includes, prior to capturing motion, inserting the sensor capsules into their respective sensor holders affixed to the garment article.
  • the brain pack may be detached from the garment article and the capsule sensors may be removed from the capsule holders.
  • the garment article with the capsule holders attached and the conductive channels still embedded can be fully submerged for washing, such as washing inside a washing machine.
  • Data acquired using the above-described garment can be processed in order to track the 3D body movement of a user.
  • the data acquired may be processed in real time.
  • the user's body can be modelled as being comprised of distinct segments 400.
  • the body can be divided into arm segments 402, leg segments 408 and a chest segment 414, with each of the segments preferably being connected and movable relative to another segment along a joint 450.
  • the model can further be subdivided into sub-segments where two portions of the segment meet at a joint 450, for example by dividing the arm segment 402 into an upper (or proximal) sub-segment 404 and lower (or distal) sub-segment 406, the sub-segments being connected by an elbow joint 450.
  • the leg segment 408 can be divided into an upper (or proximal) sub-segment 410 and lower (or distal) sub-segment 412, the sub-segments meeting at a knee joint 450.
  • the body can be divided into more or fewer segments and/or sub-segments, depending on the modelling requirements.
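To make the segmentation concrete, the model can be pictured as a small data structure; the following Python sketch is an illustration only, with names chosen to mirror the reference numerals above:

```python
from dataclasses import dataclass

@dataclass
class SubSegment:
    name: str  # e.g. "upper (proximal) sub-segment 404"

@dataclass
class Segment:
    """A tracked body segment 400: two sub-segments meeting at a joint 450."""
    name: str
    proximal: SubSegment
    distal: SubSegment
    joint: str

# Illustrative instance mirroring the arm segment described above.
right_arm = Segment(name="arm segment 402",
                    proximal=SubSegment("upper sub-segment 404"),
                    distal=SubSegment("lower sub-segment 406"),
                    joint="elbow joint 450")
```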
  • the user's global body position and movement can be constructed by determining the orientation of each of the joints 450 which form part of the model.
  • each segment of the user's body 400 is provided with at least one inertial sensor 200.
  • the inertial sensor 200 provides an orientation of each segment 400 (i.e. pitch, yaw and roll).
  • a stretchable sensor is provided across the joint 450 joining the sub-segments.
  • the stretchable sensor thus allows determining a flexion angle of the joint 450, and thus an angle of the first sub- segment relative to the second sub-segment.
  • the segment orientation and the flexion angle can be processed in order to estimate the shoulder and elbow orientation, and thus construct a model of the 3D position of the segment and/or sub-segment.
  • a single inertial sensor 200 can be positioned on only one of the sub-segments, providing orientation information relating to only that sub-segment.
  • when a single inertial sensor is provided on a segment, it should be positioned on the upper or proximal sub-segment, thereby allowing the sensor to be more accurate as it is subject to less acceleration.
  • a single inertial sensor 200D can be positioned on the upper sub-segment 404.
  • a second inertial sensor 200 can be provided in order to provide an orientation of the lower or distal sub-segment.
  • a second inertial sensor 200E can be positioned on the lower sub-segment 406. Providing a second inertial sensor 200 can improve accuracy and reliability of the tracking method.
  • each segment which comprises sub-segments is tracked using at least two sensors.
  • information from these sensors is fused, thereby allowing the use of information from more than one sensor in a complementary way to improve accuracy, reliability and other properties compared to using each sensor alone.
  • Sensors may reflect the information with different accuracies and characteristics.
  • the key point in sensor fusion is that the sensors have the ability to complement each other based on different characteristics, sources of information, locations, etc. The goal is to benefit from the advantages of each and mitigate the disadvantages and uncertainty of each sensor in the final model.
  • Method A: Tracking segments using inertial sensors only
  • a user's body movement can be tracked using inertial sensors 200 only.
  • Each segment of the user's body can be tracked individually, with the information from each segment allowing for the construction of a full 3D model of the user's movement.
  • Each sub-segment of a tracked segment is provided with an inertial sensor. Data from the two inertial sensors on a given segment can be combined in order to improve accuracy and compensate for uncertainties.
  • an arm segment 402 can be tracked using a first inertial sensor 200D positioned on the upper sub-segment 404, and a second inertial sensor 200E positioned on the lower sub-segment 406.
  • a first step comprises calibration.
  • the user is positioned in a T-pose (i.e. standing up straight, with legs together and arms raised to the sides at a 90 degree angle), providing an initial frame of reference for changes in orientation recorded by the inertial sensors. Any subsequent movements of the user will be recorded relative to the initial T-pose.
  • all the inertial sensors can be calibrated at once, for example by having the user stand in the T-pose for a period of time.
  • each sensor can be calibrated individually, for example by allowing the user to hold one arm out to his side at a time for a period of time.
  • the calibration can be initiated, for example, by pressing a button on the brain pack or by receiving a command from an external device, such as an application on a smartphone.
  • orientation information from each of the upper arm and lower arm inertial sensors is captured. This information is transformed into a rotation matrix, which defines the orientation of those sensors as a rotation relative to the T-pose.
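A minimal sketch of this transformation step, assuming each sensor's orientation is available as a 3x3 world-frame rotation matrix (the patent does not prescribe an implementation, so the helper below is an assumption):

```python
import numpy as np

def relative_rotation(R_tpose, R_now):
    """Express a sensor's current orientation as a rotation relative to
    its T-pose reading. For rotation matrices the inverse equals the
    transpose, so R_rel = R_now @ R_tpose^T."""
    return R_now @ R_tpose.T

# Example: the sensor reads identity during the T-pose, then pitches 30 degrees.
theta = np.radians(30.0)
R_now = np.array([[1.0, 0.0, 0.0],
                  [0.0, np.cos(theta), -np.sin(theta)],
                  [0.0, np.sin(theta),  np.cos(theta)]])
R_rel = relative_rotation(np.eye(3), R_now)
```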
  • the rotation matrices are then processed by a fusion unit, which can for example be implemented in a processor, either located on the garment or in an external device such as a smartphone.
  • in the fusion unit, information from each of the sensors is combined and known constraints are applied in order to obtain a more accurate result by adjusting the rotation matrices.
  • the inertial sensors are provided along a common segment. Therefore, at least one of their pitch, yaw, and roll axes should be aligned with one another, or should rotate within a common plane. If the axes are misaligned, the rotation matrix of the lower sensor is adjusted, such that the angle of the lower sensor is properly aligned with the upper sensor.
  • the yaw axis of the lower arm sensor should be in the same plane as the yaw axis of the upper arm sensor. Therefore, if the yaw axis of the lower sensor is misaligned, the rotation matrix of the lower sensor can be compensated such that it rotates within the same plane as the roll and yaw axes of the upper arm.
  • similarly, the pitch axis of the upper and lower leg sensors should be aligned.
  • the rotation matrix of the lower leg sensor can be corrected such that it is in alignment with the upper leg sensor. Once corrected, the aligned information from the two inertial sensors can be used to estimate the orientation of the elbow. This estimate can be improved by applying known constraints.
  • the rotation matrices of both sensors can be corrected until the estimated elbow angle falls within a permitted range.
  • Other constraints can be applied according to other range of motion limitations in human physiology, and according to user-specific range of motion limitations.
  • the same type of constraints can be applied for the leg segments, for example for the limits in range of motion of the knee and/or hip.
  • the rotation matrices correspond to final or compensated rotation matrices which are preferably a more accurate and reliable representation of the orientations of the user's joints.
  • These rotation matrices can be applied to the 3D model of the user's body in order to accurately represent the body position of the user at any given time.
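The angle-estimation and constraint steps of Method A can be sketched as follows; the relative-rotation formula is standard, while the permitted elbow range used here is an assumed placeholder, not a value taken from the patent:

```python
import numpy as np

def estimated_joint_angle(R_upper, R_lower):
    """Flexion angle (degrees) implied by the two sub-segment rotation
    matrices, taken as the magnitude of their relative rotation."""
    R_rel = R_upper.T @ R_lower
    cos_a = (np.trace(R_rel) - 1.0) / 2.0
    return float(np.degrees(np.arccos(np.clip(cos_a, -1.0, 1.0))))

def constrain_elbow(angle_deg, lo=0.0, hi=150.0):
    """Clamp the estimate to a permitted physiological range; the
    0-150 degree window is an illustrative assumption."""
    return float(np.clip(angle_deg, lo, hi))
```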
  • Method B: Tracking segments using one inertial sensor and one stretch sensor
  • a user's body movement can be tracked using inertial sensors and stretch sensors.
  • An inertial sensor and a stretch sensor are provided for each segment of the user's body: the inertial sensor being provided on an upper sub-segment of a body segment, and the stretch sensor being provided along a joint which connects the upper sub-segment to a lower sub-segment.
  • the inertial sensor provides orientation information of the upper sub-segment, while the stretch sensor provides an angle of the upper sub-segment relative to the lower sub- segment. The information from these two sensors can be combined to predict a position of the lower sub-segment.
  • an arm segment 402 can be tracked using a first inertial sensor 200D positioned on the upper sub-segment 404, and a stretch sensor 300B extending across the joint 450.
  • a first step comprises calibration.
  • the user is positioned in a T-pose, providing an initial frame of reference for the movement and orientation recorded by the inertial sensor, and for the initial stretch amount of the stretch sensor. Any subsequent movements of the user will be calculated relative to the initial T-pose.
  • the sensors can be calibrated all at once.
  • the stretch sensor can be further calibrated to identify a maximum stretch amount, for example by having the user flex his elbow as much as possible and capturing the value recorded by the stretch sensor at maximum flexion.
  • tracking information is gathered from the inertial and stretch sensors. With respect to the inertial sensor, orientation information (i.e. yaw, pitch and roll) is captured. This information is transformed into a rotation matrix, which defines the orientation of this sensor as a rotation relative to the T-pose.
  • with respect to the stretch sensor, the output signal must be mapped to a corresponding flexion angle of the elbow.
  • in order to map the signal to a flexion angle, it must first be filtered. Any known suitable filtering algorithm can be applied, according to the properties of the sensor, in order to remove undesired noise, distortion or outlier values.
  • the filtered signal can then be mapped to a flexion angle.
  • This mapping can be accomplished, for example, by determining a function which represents the flexion angle as a function of the output of the stretch sensor. This function can vary according to the sensor configuration and type, and must therefore be determined experimentally, for example by flexing the stretch sensor by known amounts and observing the results.
  • the function can be modelled as a linear or polynomial function. The model can further be adapted to account for initial stretching amounts.
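A rough sketch of the filtering and mapping just described; the moving-average filter, the calibration readings and the polynomial degree are all illustrative assumptions:

```python
import numpy as np

def smooth(signal, window=5):
    """Moving-average filter; the patent leaves the filtering algorithm
    open, so this particular choice is illustrative only."""
    kernel = np.ones(window) / window
    return np.convolve(signal, kernel, mode="same")

# Hypothetical calibration data: raw stretch readings observed at known
# flexion angles, gathered experimentally as the text describes.
raw_readings = np.array([0.10, 0.18, 0.27, 0.35, 0.44])
known_angles = np.array([0.0, 30.0, 60.0, 90.0, 120.0])  # degrees

coeffs = np.polyfit(raw_readings, known_angles, deg=2)   # polynomial model

def flexion_angle(stretch_value):
    """Map a filtered stretch reading to an estimated flexion angle."""
    return float(np.polyval(coeffs, stretch_value))
```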
  • the fusion unit serves to predict the orientation of the lower segment based on the rotation matrix from the upper segment and based on the flexion angle.
  • the rotation matrix obtained for the upper segment is transformed according to the flexion angle to obtain an estimated rotation matrix for the lower segment.
  • the rotation matrices for the upper and lower segments can be applied to the 3D model of the user's body in order to accurately represent the body position of the user at any given time.
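A minimal sketch of this prediction step, assuming the joint behaves as a hinge about the upper sub-segment's x axis (the axis choice is an assumption; the patent does not fix one):

```python
import numpy as np

def predict_lower_matrix(R_upper, flexion_deg):
    """Estimate the lower sub-segment's rotation matrix by rotating the
    upper sub-segment's matrix through the flexion angle about an
    assumed hinge axis (here x)."""
    a = np.radians(flexion_deg)
    R_hinge = np.array([[1.0, 0.0, 0.0],
                        [0.0, np.cos(a), -np.sin(a)],
                        [0.0, np.sin(a),  np.cos(a)]])
    return R_upper @ R_hinge

R_lower_est = predict_lower_matrix(np.eye(3), 45.0)  # e.g. 45 degrees of flexion
```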
  • Method C: Tracking segments using two inertial sensors and one stretch sensor
  • a user's body movement can be tracked using inertial sensors and stretch sensors.
  • Two inertial sensors and a stretch sensor are provided for each segment of the user's body: the inertial sensors being provided on the upper and lower sub-segments of a body segment, and the stretch sensor being provided along a joint which connects the upper sub-segment to a lower sub-segment.
  • the inertial sensors provide orientation information of the upper sub-segment and lower sub-segment, while the stretch sensor provides an angle of the upper sub-segment relative to the lower sub-segment. The information from these sensors can be combined to more accurately determine the actual position and orientation of the tracked segment.
  • a leg segment 408 can be tracked using a first inertial sensor 200F positioned on the upper sub-segment 410, a second inertial sensor 200G positioned on the lower sub-segment 412, and a stretch sensor 300C extending across the joint 450.
  • Method A can be performed using the two inertial sensors in order to obtain orientation information of the upper and lower sub-segments, and in order to estimate the flexion angle of the joint connecting the two sub-segments.
  • Method B can then be performed in order to obtain a better estimation of the flexion angle using the stretch sensor.
  • An optimal flexion angle can be determined based on the estimates obtained, and then the orientation information from the inertial sensors can be corrected according to the optimal flexion angle.
  • a first step comprises calibration.
  • the user is positioned in a T- pose, providing an initial frame of reference for the movement and orientation recorded by the inertial sensors, and for the initial stretch amount of the stretch sensor. Any subsequent movements of the user will be calculated relative to the initial T-pose.
  • the sensors can be calibrated all at once, or one at a time.
  • the stretch sensors can further be calibrated to identify the maximum and minimum flexion range.
  • following calibration, orientation information (i.e. yaw, pitch and roll) is captured from each inertial sensor and transformed into a rotation matrix, as described in Method A.
  • stretch information is obtained from the stretch sensor which can be mapped to a corresponding angle according to the steps described in Method B.
  • the rotation matrices and the flexion angle are fed into a fusion unit which can combine information obtained from all sensors and produce compensated rotation matrices which more accurately and reliably represent the actual motion of the tracked segment.
  • the pitch, yaw, and/or roll axes of the inertial sensors are aligned with one another, according to similar steps as described in Method A, and corrected rotation matrices are obtained.
  • the corrected matrices can then be used to estimate the orientation of the joint, in this case the knee, and estimate its flexion angle. This estimate can be improved by applying known constraints, such as the 180 degree rotation limit of a knee, as described above in Method A.
  • the orientation information can be further improved by adjusting it using information from the stretch sensor.
  • the flexion angle obtained using the stretch sensor can be compared to the flexion angle obtained using the inertial sensors in order to determine the optimal flexion angle (i.e. the flexion angle which corresponds to the most accurate representation of the actual flexion angle).
  • the optimal flexion angle can be used to further correct the rotation matrices of the inertial sensors such that they are a more accurate representation of the user's actual movement.
  • Determining the optimal flexion angle comprises determining which sensors are currently providing the most accurate flexion angle information.
  • the stretch sensor generally provides a more accurate estimate of the flexion angle of the knee joint. Therefore, if it is determined that the flexion angle estimated by the stretch sensor is within allowable ranges (i.e. within the 180 degree constraint), the stretch sensor angle will be chosen as the optimal flexion angle.
  • the rotation matrices obtained by the inertial sensors can thus be corrected such that they correspond to the optimal flexion angle.
  • the corrected rotation matrices for the upper and lower segments can be applied to the 3D model of the user's body in order to accurately represent the body position of the user at any given time.
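The selection logic of Method C can be sketched as follows; preferring the stretch estimate when it lies within the permitted range follows the description above, while the clamped inertial fallback is an assumption:

```python
import numpy as np

def optimal_flexion(angle_stretch, angle_inertial, lo=0.0, hi=180.0):
    """Choose the flexion estimate to trust: the stretch sensor's angle
    is used when it falls within the permitted range; otherwise the
    inertial estimate is used, clamped to the range (an assumed
    fallback behaviour)."""
    if lo <= angle_stretch <= hi:
        return float(angle_stretch)
    return float(np.clip(angle_inertial, lo, hi))

best = optimal_flexion(angle_stretch=62.0, angle_inertial=58.5)
# The corrected rotation matrices then follow as in the Method B sketch,
# by re-deriving the lower sub-segment matrix from the chosen angle.
```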
  • Various methods of processing the captured data allow determining, based on the rotation matrix (corrected, aligned and/or compensated) for an upper sub-segment that is an upper arm and the rotation matrix (corrected, aligned and/or compensated) for a lower sub-segment that is a lower arm, one or more of: elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
  • Various methods of processing captured data allows determining based on the rotation matrix (corrected, aligned and/or compensated) for an upper sub-segment that is an upper leg and based on the rotation matrix (corrected, aligned and/or compensated) for a lower sub-segment that is a lower leg one or more of knee flexion/extension; hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion, trunk lateral bending and trunk rotation.
  • the above-described system and method provide many advantages compared to systems and methods of the prior-art.
  • the strategic placement of sensors and the combination of sensor types allows for the full body motion of a user to be tracked in three dimensions using a reduced number of sensors.
  • the reduction of sensors is more cost effective, and allows for a wearable device which is not cumbersome to a user performing active physical activity.
  • the tracking information obtained using the system and method provided above can provide valuable information which can be used for coaching and improving athletic performance and/or ergonomic performance.

Abstract

A garment for sensing three-dimensional motion includes a form-fitting garment article having compression areas for maintaining the positions of tracked areas over the user's body. Motion sensors positioned at the tracked areas are operable to capture 3D motion. A method of processing the motion data includes receiving orientation information captured by two inertial sensors and adjusting a rotation matrix for one of the sensors based on the rotation matrix for the other sensor. Another method of processing the motion data includes receiving orientation information captured by an inertial sensor, receiving a stretch signal captured by a stretch sensor, and correcting a first rotation matrix for the inertial sensor based on a flexion angle represented by the stretch signal. Another method of processing the motion data includes receiving orientation information captured by two inertial sensors, receiving a stretch signal captured by a stretch sensor, and correcting rotation matrices for the inertial sensors based on a flexion angle represented by the stretch signal.

Description

MOTION CAPTURE GARMENT
TECHNICAL FIELD
The technical field generally relates to wearable technologies. More particularly, it relates to a garment for three-dimensional tracking of a user's movements.
BACKGROUND
A variety of systems exist to track three-dimensional (3D) body movement. Disadvantageously, many of these systems are inaccessible to the general consumer, and are not practical for certain applications, for example to track body movement during physical activity or sport. Such systems can require a large number of sensors, expensive equipment, and can be difficult to transport and setup.
An improved system is therefore needed which solves at least some of the problems of the prior art. Preferably, such an improved system should reduce the number of sensors required to track 3D body movement, and should be more practical for use in the context of tracking body movement during physical activity or sport.
SUMMARY
According to an aspect, a device for sensing three-dimensional (3D) motion of a user's body is provided. The device includes at least one form-fitting garment article having a plurality of compression areas formed of compressive fabric and a plurality of tracked areas, the compression areas being located to maintain the position of the tracked areas over the user's body when the form-fitting garment article is worn by the user, and a plurality of motion sensors positioned on the form-fitting garment article at the tracked areas, the motion sensors being operable to capture 3D motion. According to another aspect, there is provided a kit for capturing 3-D motion of a user. The kit includes a plurality of sensor capsules operable to capture 3-D motion, a form-fitting garment article having a plurality of sensor holders attached thereto, each sensor holder being configured to removably retain one of the sensor capsules, and a brain pack removably attachable to the form-fitting garment article and configured to receive captured motion data from the sensor capsules.
According to yet another aspect, there is provided a method for sensing three- dimensional (3D) motion of a user. The method includes wearing, over the user's body, a form-fitting garment article having a plurality of compression areas and a plurality of tracked areas; positioning the compression areas over a first set of body parts of the user to be compressed; positioning the tracked areas over a second set of body parts of the user to be tracked, and capturing, by a plurality of sensors positioned on the form-fitting garment article at the tracked areas, 3-D motion of the second set of body parts of the user.
According to yet another aspect, there is provided a method for processing three-dimensional (3D) motion of a user. The method includes receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user, receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment of the user, calibrating the first inertial sensor and second inertial sensor based on the orientation information captured when the user is in a calibration pose, transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose, transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose, and adjusting one of the rotation matrices based on the other of the rotation matrices.
According to another aspect, there is provided a method for processing three-dimensional (3D) motion of a user. The method includes receiving orientation information captured by an inertial sensor positioned at an upper sub-segment of a body segment of the user; receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to a lower sub-segment of the body segment; calibrating the inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose; transforming the orientation information captured by the inertial sensor into a first rotation matrix defining the orientation of the inertial sensor relative to the calibration pose; mapping the stretch signal captured by the stretch sensor to flexion angle information; and determining a corrected first rotation matrix based on the flexion angle to determine a second rotation matrix defining an orientation of the lower sub-segment.
According to another aspect, there is provided a method for processing three-dimensional (3D) motion of a user. The method includes receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user; receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment; receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to the lower sub-segment; calibrating the first inertial sensor, the second inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose; transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose; transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose; mapping the stretch signal captured by the stretch sensor to flexion angle information; and compensating the first rotation matrix and the second rotation matrix based on the flexion angle. According to an aspect, a smart garment for tracking three-dimensional body movement of a user is provided. The garment includes at least one stretchable sensor spanning across a joint of the user's body, the stretchable sensor generating a flexion signal corresponding to a flexion angle of said joint; and at least one additional sensor positioned along a first segment of the user's body adjacent said joint, the additional sensor generating an orientation signal corresponding to an orientation of said first segment. The flexion signal and the orientation signal are processed to estimate a three-dimensional position or orientation of a second segment of the user's body adjacent to said joint and opposite said first segment.
According to an aspect, a smart garment for tracking three-dimensional body movement of a user is provided. The garment includes at least one sensor cluster for tracking movement of a segment of the user's body. The sensor cluster includes: a stretchable sensor spanning across a joint of the user's body, the stretchable sensor generating a first signal corresponding to a flexion angle of said joint; a first inertial sensor positioned adjacent the joint, the first inertial sensor generating a second signal corresponding to an orientation of a proximal segment of the user's body; and a second inertial sensor positioned adjacent the joint, opposite the first inertial sensor, the second inertial sensor generating a third signal corresponding to an orientation of a distal segment of the user's body adjacent said proximal segment. The first, second and third signals are processed to estimate a three-dimensional movement of a portion of the user's body. According to an aspect, a kit for tracking three-dimensional body movement of a user is provided. The kit includes: a garment wearable to cover a portion of a user's limb; a stretchable sensor attachable across a portion of the garment spanning a joint of the user's body, the stretchable sensor generating a first signal corresponding to a flexion angle of said joint; a first inertial sensor attachable to a portion of the garment adjacent the joint, the first inertial sensor generating a second signal corresponding to an orientation of a proximal segment of the user's body; a second inertial sensor attachable to a portion of the garment adjacent the joint and opposite the first inertial sensor, the second inertial sensor generating a third signal corresponding to an orientation of a distal segment of the user's body adjacent said proximal segment. The first, second and third signals are processed to estimate a three-dimensional movement of a portion of the user's body.
According to an aspect, a method for tracking three-dimensional body movement of a user is provided. The method includes the steps of: generating a first signal corresponding to a flexion angle of a joint on the user's body, the flexion angle being generated by a stretchable sensor positioned across said joint; generating a second signal corresponding to an orientation of a proximal segment of a user's body, the second signal being generated by an inertial sensor positioned on the proximal segment of the user's body adjacent the joint; generating a third signal corresponding to an orientation of a distal segment of the user's body, the third signal being generated by an inertial sensor positioned on the distal segment of the user's body opposite the proximal segment and adjacent the joint; and processing the first, second and third signals to estimate a three dimensional movement of a portion of the user's body. In an embodiment, processing the first, second and third signals comprises processing the first and second signals to determine an estimated position of the proximal and distal segments, processing the third signal to determine the flexion angle of the joint, and correcting the estimated position of the proximal and distal segments using the determined flexion angle of the joint.
BRIEF DESCRIPTION OF THE DRAWINGS
For a better understanding of the embodiments described herein and to show more clearly how they may be carried into effect, reference will now be made, by way of example only, to the accompanying drawings which show at least one exemplary embodiment, and in which: Figures 1A and 1B are respective front and rear views of a motion capture garment, according to an embodiment. Figure 1C is an individual view of the brain pack in the motion capture garment of Figures 1A and 1B.
Figure 2A is a schematic showing the capsule placement on the motion capture garment of Figures 1A and 1B. Figures 2B, 2C and 2D are respective individual views of the capsules and holders shown in Figure 2A. Figure 2E is a rear view of the capsule.
Figure 3A is a schematic showing the stretch sensor placement on the motion capture garment of Figures 1A and 1B. Figures 3B and 3C are respective individual views of the stretch sensors shown in Figure 3A. Figure 4 is a schematic illustrating a segmentation of the human body for creating a model thereof, as used in a method for tracking body motion according to an embodiment.
Figure 5 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using inertial sensors only.
Figure 6 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using one inertial sensor and one stretch sensor per tracked segment.
Figure 7 is a schematic flowchart illustrating sensor fusion, according to an embodiment where body motion is tracked using two inertial sensors and one stretch sensor per tracked segment. Figures 8A and 8B are respective block diagrams of the data and power modules in the brain pack of Figure 1C.
DETAILED DESCRIPTION
It will be appreciated that for simplicity and clarity of illustration, elements shown in the figures have not necessarily been drawn to scale. For example, the dimensions of some of the elements may be exaggerated relative to other elements for clarity. Further, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements.
It will be appreciated that, for simplicity and clarity of illustration, where considered appropriate, reference numerals may be repeated among the figures to indicate corresponding or analogous elements or steps. In addition, numerous specific details are set forth in order to provide a thorough understanding of the exemplary embodiments described herein. However, it will be understood by those of ordinary skill in the art that the embodiments described herein may be practiced without these specific details. In other instances, well-known methods, procedures and components have not been described in detail so as not to obscure the embodiments described herein. Furthermore, this description is not to be considered as limiting the scope of the embodiments described herein in any way but rather as merely describing the implementation of the various embodiments described herein.
With reference to Figures 1A and 1B, a motion capture garment 100 is provided according to an embodiment. The garment 100 includes at least one garment article that is wearable by a user. The garment 100 is further configured to track body movements in 3D. Preferably, the garment article of the garment 100 is a sportswear garment in that it is designed to be worn for sports, exercise or other physical activities. Preferably, the garment article is a seamless garment made of a stretchable and breathable material, and is sized and shaped such that it is form-fitting in that it closely conforms to a user's body. The garment article can, for example, be a compression-type garment article, and can be provided with compression areas at specific locations on the user's body which can aid the user in positioning the garment 100 properly, consequently placing the sensors 108 in their proper positions.
Compression areas herein refers to areas of the garment article of the garment 100 that are formed of compressive fabric to provide a compression fit to a corresponding area of the user's body when the garment article is worn.
In some example embodiments, various areas of the garment article may be formed of compressive fabric. The garment article may be entirely formed of compressive fabric. Accordingly, the compression areas of the at least one garment article correspond to areas of the garment article that apply a greater compressive force than other areas of the form-fitting garment article.
In one example, the form-fitting garment article is formed of a continuously knitted material so as to be substantially seamless. The compression areas correspond to more tightly knitted areas of the continuously knitted material. The compression areas are located on the at least one garment article so that tracked areas of the garment article are maintained in position over the user's body when the form-fitting garment article is worn by the user. Some tracked areas may be located in proximity of the compression areas. Alternatively, or additionally, some tracked areas may be located at the compression area. The tracked areas of the garment article correspond to locations where sensors described herein are affixed to the garment article of the garment 100.
In the present embodiment, the at least one garment article includes an upper-body section 102 that is designed to fit a user's upper body, and comprises sleeve portions which cover the user's arms, preferably down to the user's wrists, and a torso portion which covers the user's torso. Similarly, the at least one garment article includes a lower-body section 104 that is designed to fit a user's lower body, and comprises leg portions which cover the user's legs, preferably down to the user's ankles, and a waist portion covering the user's waist. As can be appreciated, the garment article can serve as a frame for mounting and embedding components useful in capturing 3D motion: by providing garment sections 102, 104 which cover the entirety of the user's body as described above, components can be positioned at virtually any location on the user's body.
The garment sections 102, 104 are each provided with a network of electrically conductive channels 106. The conducting channels 106 can comprise, for example, isolated and preferably insulated or protected channels through which conductive wiring can be fed. The conducting channels 106 can additionally or alternatively comprise electrically conductive fibres embedded in or affixed to the garment. The conductive channels 106 serve to electrically connect various electrical components positioned along the garment 100.
Preferably, the conductive channels 106 are positioned and arranged such that they minimize the length of wire required to form a complete conductive network, and such that they do not interfere with the free movement of the user wearing the garment 100. For example, in the upper-body section 102, the channels 106 extend over a user's trapezius muscles to connect to the back of the user's arms. Preferably still, the channels 106 in the upper-body section 102 and lower-body section 104 are configured such that they meet near a common nodal region 103, for example near the junction of the section 102, 104 near the user's waist. In order to optimize manufacturability, the channels 106 can comprise mainly straight lines, and can be provided with gradually curved portions in order to reach the nodal region 103.
According to an alternative embodiment, the garment article is a one-piece suit that covers the full body of the user.
In yet other alternative embodiments, the at least one garment article of the garment 100 may cover less than the full body of the user. For example, only the upper body section 102 or only the lower body section 104 is provided or worn at a given time. The upper body section 102 may be a short sleeve article in which the lower arm portions of the user are exposed when worn. Similarly, the lower body section 104 may be in the form of shorts in which lower leg portions of the user are exposed when worn.
A plurality of sensors 108 are strategically positioned at various locations on the garment 100 in order to capture 3D motion data of the user's full-body movements. In the present embodiment, the sensors 108 are positioned along the network of conductive channels 106, and are in electric communication therewith. In this configuration, the sensors 108 can receive power from a power source connected to the network of channels 106, and can also transmit and/or receive signals from other components through the channels 106. It should be understood that although the sensors 108 can receive power from the channels 106, the sensors 108 can be provided with an independent power source, such as a battery, which can be charged through the channels 106. It should also be appreciated that in some embodiments, the sensors 108 can communicate wirelessly, or store data locally in order to be transferred at a later time. In such configurations, the sensors 108 need only use the channels 106 for charging purposes.
The sensors 108 can comprise different types of sensing components which can capture various kinds of data required to determine the 3D motion of a user's body. With reference now to Figures 2A-2E, a first type of sensor 108 can include a biomechanical capsule 200. In the present embodiment, the capsule 200 comprises an inertial measurement unit (IMU) capable of measuring movement along 9 degrees-of-freedom (DOF): orientation in the x, y and z directions (yaw, pitch, and roll), acceleration in the x, y and z directions, rate of rotation in x, y, and z directions, and magnetic field in the x, y and z directions. The capsule also includes a controller and a transmitter for operating the IMU and communicating data captured therefrom to another location, such as a remote processor or a central hub. The controller can be implemented using an application specific integrated circuit or could comprise, for example, a processor and memory. In the present embodiment, the transmitter is wireless and comprises a Bluetooth chip, but in other embodiments, other wired or wireless transmitters can also be used. In some embodiments, the capsule can be provided with a storage module including memory which can serve to save data captured by the IMU for offline processing.
In the present embodiment the capsule 200 comprises a rechargeable battery for independently powering the components housed therein. It should be understood, however, that in other embodiments a replaceable or non-rechargeable battery can be provided. In other embodiments, the capsule 200 can operate without a battery and be powered through a connection with the network of channels 106.
In some embodiments, additional sensors can be provided within the capsule 200, which can be used to measure other 3D motion parameters, increase the accuracy of the IMU and/or capture physiological data of the user, such as the user's heartrate, body temperature, etc. As can be appreciated, the garment 100 can include a plurality of capsules 200, and each capsule can include different types of sensors depending on the capsule's location and intended function.
The capsule 200 is preferably removably attached to the garment 100, thereby allowing the capsule 200 to be removed while the garment is being washed or stored. In the present embodiment, the capsule 200 is removably engageable with a corresponding holder 250. The holder 250 is preferably permanently attached to the garment 100, and is preferably made from a water and heat resistant material, such as a strong plastic. Preferably, the holder 250 is attached to the network of channels 106 in the garment 100, and comprises an interface 254, allowing the capsule 200 to connect to the network of channels 106 when inserted into the holder 250. As illustrated in Figure 2E, a backside of the capsule 200 comprises one or more electrical contacts for forming a conductive path with an electrical contact of the capsule holder. In the present embodiment, nine capsules 200A-200I are provided which can be inserted into nine corresponding holders 250A-250I at different locations on the garment 100. The capsules 200A-200I and holders 250A-250I can be provided with identifiers 202, 252, allowing a user to match a particular one of the capsules 200A-200I with its corresponding holder 250A-250I. In the present embodiment, a first holder 250A is provided on the user's chest, a second 250B on the user's right upper arm, a third 250C on the user's right lower arm, a fourth 250D on the user's left upper arm, a fifth 250E on the user's left lower arm, a sixth 250F on the user's right upper leg, a seventh 250G on the user's right lower leg, an eighth 250H on the user's left upper leg, and a ninth 250I on the user's left lower leg. In other embodiments, more or fewer capsules can be provided and their position can vary depending on the type of sensor data required to determine the user's 3D movements.
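By way of illustration only, the placement scheme above can be expressed as a simple lookup table. The following Python sketch is purely illustrative; the keys echo the reference numerals in the text and are not part of any data format described in this document.

    # Hypothetical placement table for the nine capsules described above.
    CAPSULE_PLACEMENT = {
        "200A": "chest",
        "200B": "right upper arm",
        "200C": "right lower arm",
        "200D": "left upper arm",
        "200E": "left lower arm",
        "200F": "right upper leg",
        "200G": "right lower leg",
        "200H": "left upper leg",
        "200I": "left lower leg",
    }

    def holder_for(capsule_id: str) -> str:
        # Matching holder identifier (250-series) for a given capsule.
        return capsule_id.replace("200", "250", 1)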
Referring back to Figure 1B, the upper body section 102 garment article includes a first elbow compression area 256A to be located over a right elbow joint of the user's body when the upper body section 102 is worn. As illustrated, the first elbow compression area 256A is in proximity of a right upper arm tracked area where the second holder 250B is located. Accordingly, proper placement of the first elbow compression area 256A over the user's right elbow contributes to maintaining the second holder 250B in position over the user's right upper arm.
The first elbow compression area 256A is also in proximity of a right lower arm tracked area where the third holder 250C is located. Accordingly, proper placement of the first elbow compression area 256A over the user's right elbow also contributes to maintaining the third holder 250C in position over the user's right lower arm. It will be appreciated that the second elbow compression area 256B to be located over a left elbow joint of the user's body provides a similar effect in maintaining the fourth holder 250D and/or fifth holder 250E in position over the user's upper left arm and lower left arm respectively.
Referring now to Figure 1A, the lower body section 104 garment article includes a first knee compression area 256C to be located over a right knee of the user's body when the lower body section 104 is worn. As illustrated, the first knee compression area 256C is in proximity of a right upper leg tracked area where the sixth holder 250F is located. Accordingly, proper placement of the first knee compression area 256C over the user's right knee contributes to maintaining the sixth holder 250F in position over the user's right upper leg. The first knee compression area 256C is also in proximity of a right lower leg tracked area where the seventh holder 250G is located. Accordingly, proper placement of the first knee compression area 256C over the user's right knee also contributes to maintaining the seventh holder 250G in position over the user's right lower leg. It will be appreciated that the second knee compression area 256D to be located over a left knee joint of the user's body provides a similar effect in maintaining the eighth holder 250H and/or ninth holder 250I in position over the user's upper left leg and lower left leg respectively.
Referring now to Figure 1B, the lower body section 104 garment article may further include a first calf compression area 256E to be located over a right calf of the user's body when the lower body section 104 is worn. As illustrated, the first calf compression area 256E is in proximity of the right lower leg tracked area where the seventh holder 250G is located. Accordingly, proper placement of the first calf compression area 256E over the user's right calf in combination with the first knee compression area 256C contributes to maintaining the seventh holder 250G in position over the user's right lower leg.
It will be appreciated that the second calf compression area 256F to be located over the left calf of the user's body provides a similar effect in maintaining the ninth holder 250I in position over the user's lower left leg. In the present embodiment, distinctive visual indicators are provided on the compression areas 256A-256F so that they may be easily identified and correctly positioned over their corresponding body parts.
With reference now to Figures 3A, 3B and 3C, a second type of sensor 108 includes a stretchable sensor (or "stretch sensor") such as a sensor band 300. In the present embodiment, the sensor band 300 comprises a flexible fabric sensor, such as the StretchSense™ fabric sensor. The output of the sensor corresponds to an amount by which the sensor is stretched, providing one DOF along the stretching axis. As can be appreciated, the sensor band 300 can be positioned along a joint or a muscle, thereby sensing the movement angle of a joint or the flex of a muscle. Although in the present embodiment the sensor band 300 comprises a fabric sensor, other types of stretch sensors can also be used.
The sensor band 300 is preferably removably attached to the garment 100, thereby allowing the band 300 to be removed while the garment is being washed or stored. It should be understood, however, that in some embodiments the band 300 can be made water and/or heat resistant such that it is safe to wash the garment 100 with the band 300 still attached. In the present embodiment, the band 300 is securable at both ends to a connector 350 on the garment 100 which serves to hold the band 300 in place and allows the band to stretch as the user moves while wearing the garment 100. The connector 350 can comprise, for example, textile hook-and-loop fasteners and/or buckle-type fasteners. The connector 350 also comprises an interface 352 for electrically connecting with a corresponding interface 302 of the band 300. The connection of these interfaces 302, 352 allows the band 300 to communicate with the network of channels 106 in the garment 100, and thus provide sensing signals thereto for processing. In addition to the connector 350, the band 300 can further be secured to the garment 100 by insertion through a fastener loop 356 attached to or woven into the garment 100.
In the present embodiment, four bands 300A-300D are provided which can be secured to connectors 350 at four different locations on the garment 100. The bands 300 and connectors 350 can be provided with identifiers 302, 352, allowing a user to match a particular band 300 with its corresponding connector 350. In the present embodiment, a first band 300A is securable across the user's right elbow, a second band 300B across the user's left elbow, a third 300C across the user's right knee, and a fourth 300D across the user's left knee. As can be appreciated, in the present embodiment, each band 300 is associated with and positioned between two capsules 200. In other embodiments, more or fewer bands can be provided and their position can vary depending on the type of sensor data required to determine the user's 3D movements. In some embodiments, all the bands 300 can be made of the same material and have the same sensing properties, while in other embodiments, the bands 300 can be made of different materials or have different properties depending on where they are to be placed and depending on their desired sensing objective. For example, a band intended to measure high forces on large muscle groups or large joints can be made differently than a band intended for low forces; the same goes for fast vs. slow acceleration, long vs. short impulse, and other sensing considerations. For example, the size and/or shape of the bands 300 can be designed for optimal sensing at a particular location, and/or different materials can be used to form the bands 300.
Referring back to Figures 1A and 1B, the sensors 108 communicate with and are charged through a brain pack 110. Preferably, the brain pack 110 is sized, shaped and positioned such that it does not interfere with the user's movements. In the present embodiment, the brain pack 110 fits around the user's waist, and is positioned near the common nodal region 103 of the conductive channels 106 in the upper- and lower-body sections 102, 104.
Although referred to herein as a brain pack 110, it should be understood that the pack 110 does not necessarily contain intelligence, and therefore does not necessarily process information. With further reference to the block diagrams of Figures 8A and 8B, the pack 110 in the present embodiment serves as a hub, performing many different functions including interfacing with the capsules 200 and external processing devices through a data printed circuit board (PCB) 800, and providing power to the garment 100 and sensors 108 through a power PCB 850. Although illustrated in the present embodiment as two PCBs 800 and 850 in communication with one another, it should be understood that the components of the brain pack 110 can be implemented on a single PCB or on several PCBs, depending on the form factor and manufacturability requirements. As illustrated, the power PCB 850 mainly serves to manage power in the system. It comprises a power section 852, which can interface with and manage the charging and discharging of a battery housed in the pack, and the distribution of power to different components of the system. It can also serve to operate the stretch sensors 300, for example by powering the sensors and/or receiving stretch information therefrom through an electronics interface 854.
The data PCB 800 mainly serves to manage data, for example by storing it to an SD card 132, or transmitting it wirelessly via Bluetooth. The data PCB 800 is provided with a controller, in this case a microcontroller unit (MCU) 804, for controlling the operation of the sensors 108, the SD card 132 and the I/O elements such as buttons 128 and LEDs 130. The MCU 804 can also serve to receive and interpret data received from the sensors 108, and operate a wireless module, in this case a Bluetooth radio frequency (RF) section 806. For example, the MCU 804 can operate Bluetooth aggregator modules in the RF section 806 to gather information received from each of the sensors, and combine this information into a single frame to be transmitted to an external device for further processing. The brain pack 110 can wirelessly communicate with a mobile phone in order to transmit sensor information thereto, and to receive operation commands therefrom. It can also be wirelessly connected to the capsules 200, for example via a Bluetooth connection.
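As a rough sketch of this aggregation step, the following shows one way readings from several capsules could be packed into a single frame before transmission. The wire format, field layout and sensor IDs are assumptions made for illustration; the text only states that information from each sensor is combined into a single frame.

    import struct
    import time

    def pack_frame(samples):
        # Header: number of readings plus a capture timestamp (hypothetical layout).
        frame = struct.pack("<Id", len(samples), time.time())
        # Body: one record per capsule -- sensor ID followed by yaw, pitch, roll.
        for sensor_id, (yaw, pitch, roll) in samples.items():
            frame += struct.pack("<B3f", sensor_id, yaw, pitch, roll)
        return frame

    # Example: readings from two capsules (IDs and values hypothetical).
    frame = pack_frame({1: (0.10, 0.02, -0.20), 2: (0.51, 0.13, 0.00)})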
As illustrated, the pack 110 is provided with a memory module, in this case a removable SD memory card 132, for saving recorded data from the sensors 108 for offline processing; however, in alternate embodiments, different types of memory can be provided. Moreover, in the present embodiment, human interface devices are also provided, such as buttons 128 and/or LEDs 130, but in alternate embodiments, different human interface or I/O devices can be provided, such as a touchscreen or touch pad.
In the present embodiment, and with reference to Figure 1C, the brain pack 110 is electrically connectable to the network of channels 106 in each of the upper-body 102 and lower-body 104 sections via removable connectors 112, 114. The removable connectors 112, 114 fit within correspondingly marked ports 122, 124 in the brain pack 110. The brain pack 110 is further provided with an interface port 126, such as a USB port for example, for receiving power from an external source and/or for communicating with an external device. As can be appreciated, when connected to an external power source through the interface port 126, the brain pack 110 can provide power to the garment 100 through the channels 106, allowing any sensors 108 or other components attached thereto to charge. The battery provided in the brain pack 110 is preferably rechargeable, allowing it to be charged when a power source is connected to the interface port 126. As can be appreciated, in this configuration the charging of the garment 100 is completely integrated: a single power connection 126 is required in order to charge the brain pack 110 and sensors 108, so long as they are all connected through the conductive channels 106. In operation, the garment 100 gathers motion data from the sensors 108 and can transmit the data in real-time to an external processing device, such as a smartphone. The smartphone can receive this information and process it in order to generate a model of the user's body in real-time. The generated model can then be displayed to the user, or to a coach who is aiding the user, in order to provide the user with feedback, for example to improve form when performing athletic activities. In some embodiments, the motion data can be recorded by the SD card and can be processed offline, for example after being loaded onto an external processing device such as a computer. The user can thereby see a playback of his recorded movements, and pause, rewind and fast-forward as necessary. This motion information can further be aggregated to identify trends in the user's motions, and identify improvements or progressions in certain repeated movements, for example to aid in improving weightlifting form over time.
As can be appreciated, different types of sensors can be used in order to capture 3D motion. Data from these sensors can be processed and interpreted in order to determine/estimate a user's actual body movements in 3D space. In order to obtain the most accurate representation of the user's body movements, it is preferable to provide as many sensors as possible on the user's body. Practically speaking, this is not always possible, and in many cases it is preferred to minimize the number of sensors in order to reduce the complexity of the system and make it easier to use and maintain. The garment 100 described herein therefore attempts to reduce the number of sensors required while maintaining sufficient accuracy and reliability by optimizing sensor placement and by combining sensor types.
In the present embodiment, a combination of inertial sensors 200 and stretchable sensors 300 are used in order to capture 3D movement. These two different types of sensors are placed at strategic locations on the user's body, allowing them each to independently collect data relative to the user's movements. This data can then be processed and combined to obtain a result that is more accurate than any one of the sensors taken individually. Preferably, to measure movement of a major joint such as a knee or elbow, two inertial sensors 200 can be associated with a stretchable sensor 300. By way of example, and with reference to Figure 3A, a stretchable sensor 300C can be placed across the user's right knee, between an inertial sensor 200F on the user's right upper leg and inertial sensor 200G on the user's right lower leg. Using sensor fusion techniques, information from these three sensors can together form an accurate representation of the user's right leg movement.
As described herein, the capsules 200 can be removed from their holders 250, the stretchable sensor 300, where provided, can be removed, and the brain pack 110 can be disconnected so that the garment 100 can be washed. When these components are removed, the garment article having the network of conductive channels 106 affixed thereto may be fully submerged, such as during machine washing, while still providing full sensing functionality once subsequently dried.
Method for Tracking Body Movement
According to a method for tracking body movement, the form-fitting garment article is first worn over the user's body. In doing so, the compression areas of the form-fitting garment article are appropriately positioned over a first set of corresponding body parts of the user. Similarly, the tracked areas of the garment article located in proximity of or at the compression areas are also positioned over a second set of corresponding body parts of the user. The tracked areas correspond to where the motion sensors are to be placed.
When the sensors are properly positioned, 3-D motion of the second set of body parts may be captured using the sensors. The capturing generates motion data, which may be further processed as described hereinbelow.
The method further includes, prior to capturing motion, attaching the brain pack to the waist portion of the user and connecting the brain pack to sensors via the conductive channels formed in the garment article.
Where the sensors are removable, the method also includes, prior to capturing motion, inserting the sensor capsules into their respective sensor holders affixed to the garment article.
After use, the brain pack may be detached from the garment article and the capsule sensors may be removed from the capsule holders. The garment article, with the capsule holders attached and the conductive channels still embedded, can be fully submerged for washing, such as inside a washing machine.
Data acquired using the above-described garment can be processed in order to track the 3D body movement of a user. As described herein, the data acquired may be processed in real time. With reference to Figures 2A, 3A and 4, the user's body can be modelled as being comprised of distinct segments 400. For example, the body can be divided into arm segments 402, leg segments 408 and a chest segment 414, with each of the segments preferably being connected and movable relative to another segment along a joint 450. The model can further be subdivided into sub-segments, for example where two portions of the segment meet at a joint 450, for example by dividing the arm segment 402 into an upper (or proximal) sub-segment 404 and a lower (or distal) sub-segment 406, the sub-segments being connected by an elbow joint 450. Similarly, the leg segment 408 can be divided into an upper (or proximal) sub-segment 410 and a lower (or distal) sub-segment 412, the sub-segments meeting at a knee joint 450. In other embodiments, the body can be divided into more or fewer segments and/or sub-segments, depending on the modelling requirements. As can be appreciated, by modelling the body as such, the user's global body position and movement can be constructed by determining the orientation of each of the joints 450 which form part of the model.
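A minimal sketch of this body model in code, assuming a simple two-sub-segment representation per limb (the class names and fields below are illustrative, not drawn from the description):

    from dataclasses import dataclass, field
    import numpy as np

    @dataclass
    class SubSegment:
        name: str
        # Orientation relative to the calibration pose (identity at T-pose).
        rotation: np.ndarray = field(default_factory=lambda: np.eye(3))

    @dataclass
    class Segment:
        name: str
        upper: SubSegment   # proximal sub-segment, e.g. upper arm 404
        lower: SubSegment   # distal sub-segment, e.g. lower arm 406
        joint: str          # joint connecting the two, e.g. "elbow"

    right_arm = Segment("right_arm", SubSegment("upper_arm"),
                        SubSegment("lower_arm"), "elbow")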
As can be appreciated, in the garment described above, each segment of the user's body 400 is provided with at least one inertial sensor 200. The inertial sensor 200 provides an orientation of each segment 400 (i.e. pitch, yaw and roll). Where a segment 400 is divided into sub-segments, a stretchable sensor is provided across the joint 450 joining the sub-segments. The stretchable sensor thus allows determining a flexion angle of the joint 450, and thus an angle of the first sub-segment relative to the second sub-segment. The segment orientation and the flexion angle can be processed in order to estimate the shoulder and elbow orientation, and thus construct a model of the 3D position of the segment and/or sub-segment.
Where a segment 400 is divided into sub-segments, a single inertial sensor 200 can be positioned on only one of the sub-segments, providing orientation information relating to only that sub-segment. Preferably, when a single inertial sensor is provided on a segment, it should be positioned on the upper or proximal sub-segment, thereby allowing the sensor to be more accurate as it is subject to less acceleration. For example, for an arm segment 402, a single inertial sensor 200D can be positioned on the upper sub-segment 404. In some embodiments, a second inertial sensor 200 can be provided in order to provide an orientation of the lower or distal sub-segment. In the context of the arm segment 402, a second inertial sensor 200E can be positioned on the lower sub-segment 406. Providing a second inertial sensor 200 can improve accuracy and reliability of the tracking method.
As can be appreciated, each segment which comprises sub-segments is tracked using at least two sensors. Preferably, information from these sensors is fused, allowing the information from more than one sensor to be used in a complementary way to achieve better accuracy, reliability and other properties than if those sensors were taken alone. Sensors may reflect the information with different accuracies and characteristics. The key point in sensor fusion is that the sensors have the ability to complement each other based on different characteristics, sources of information, location, etc. The goal is to benefit from the advantages of each sensor and prevent the disadvantages and uncertainty of each sensor in the final model.
Method A: Tracking segments using inertial sensors only
In an embodiment, a user's body movement can be tracked using inertial sensors 200 only. Each segment of the user's body can be tracked individually, with the information from each segment allowing for the construction of a full 3D model of the user's movement. Each sub-segment of a tracked segment is provided with an inertial sensor. Data from the two inertial sensors on a given segment can be combined in order to improve accuracy and compensate for uncertainties. In an embodiment, an arm segment 402 can be tracked using a first inertial sensor 200D positioned on the upper sub-segment 404, and a second inertial sensor 200E positioned on the lower sub-segment 406. Although the following method will be described in the context of an arm segment 402, it should be appreciated that similar methods can be applied to a leg segment 408 or other segment.
A first step comprises calibration. The user is positioned in a T-pose (i.e. standing up straight, with legs together and arms raised to the sides at a 90 degree angle), providing an initial frame of reference for changes in orientation recorded by the inertial sensors. Any subsequent movements of the user will be recorded relative to the initial T-pose. As can be appreciated, all the inertial sensors can be calibrated at once, for example by having the user stand in the T-pose for a period of time. Alternatively, each sensor can be calibrated individually, for example by allowing the user to hold one arm out to his side at a time for a period of time. The calibration can be initiated, for example, by pressing a button on the brain pack or by receiving a command from an external device, such as an application on a smartphone.
Next, and with reference to Figure 5, orientation information from each of the upper arm and lower arm inertial sensors is captured. This information is transformed into a rotation matrix, which defines the orientation of those sensors as a rotation relative to the T-pose.
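By way of a minimal sketch, the transformation from captured yaw/pitch/roll to a rotation matrix relative to the T-pose could look as follows. The intrinsic Z-Y-X (yaw-pitch-roll) convention is an assumption; the description does not fix a rotation order.

    import numpy as np

    def euler_to_matrix(yaw, pitch, roll):
        # Intrinsic Z-Y-X rotation; angles in radians.
        cy, sy = np.cos(yaw), np.sin(yaw)
        cp, sp = np.cos(pitch), np.sin(pitch)
        cr, sr = np.cos(roll), np.sin(roll)
        Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
        Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
        Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
        return Rz @ Ry @ Rx

    def relative_to_tpose(R_tpose, R_current):
        # Express the current orientation as a rotation away from the
        # orientation captured during T-pose calibration.
        return R_tpose.T @ R_current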
The rotation matrices are then processed by a fusion unit, which can for example be implemented in a processor, either located on the garment or in an external device such as a smartphone. In the fusion unit, information from each of the sensors is combined and known constraints are applied in order to obtain a more accurate result by adjusting the rotation matrices. As can be appreciated, the inertial sensors are provided along a common segment. Therefore, at least one of their pitch, yaw, and roll axes should be aligned with one another, or should rotate within a common plane. If the axes are misaligned, the rotation matrix of the lower sensor is adjusted, such that the angle of the lower sensor is properly aligned with the upper sensor.
In the context of an arm segment, the yaw axis of the lower arm sensor should be in the same plane as the yaw axis of the upper arm sensor. Therefore, if the yaw axis of the lower sensor is misaligned, the rotation matrix of the lower sensor can be compensated such that it rotates within the same plane as the roll and yaw axes of the upper arm. Similarly, in the context of a leg segment, the pitch axis of the upper and lower leg sensors should be aligned. The rotation matrix of the lower leg sensor can be corrected such that it is in alignment with the upper leg sensor. Once corrected, the aligned information from the two inertial sensors can be used to estimate the orientation of the elbow. This estimate can be improved by applying known constraints. For example, in the context of an arm segment, it is known that the elbow cannot rotate backwards (i.e. past 180 degrees). If the estimated elbow angle does not meet this constraint, the rotation matrices of both sensors can be corrected until the estimated elbow angle falls within a permitted range. Other constraints can be applied according to other range of motion limitations in human physiology, and according to user-specific range of motion limitations. Similarly, the same type of constraints can be applied for the leg segments, for example for the limits in range of motion of the knee and/or hip. Once aligned and corrected, the rotation matrices correspond to final or compensated rotation matrices which are preferably a more accurate and reliable representation of the orientations of the user's joint. These rotation matrices can be applied to the 3D model of the user's body in order to accurately represent the body position of the user at any given time.
Method B: Tracking segments using one inertial sensor and one stretch sensor
In an embodiment, a user's body movement can be tracked using inertial sensors and stretch sensors. An inertial sensor and a stretch sensor are provided for each segment of the user's body: the inertial sensor being provided on an upper sub-segment of a body segment, and the stretch sensor being provided along a joint which connects the upper sub-segment to a lower sub-segment. The inertial sensor provides orientation information of the upper sub-segment, while the stretch sensor provides an angle of the upper sub-segment relative to the lower sub-segment. The information from these two sensors can be combined to predict a position of the lower sub-segment.
In an embodiment, and with reference to Figures 2A, 3A and 4, an arm segment 402 can be tracked using a first inertial sensor 200D positioned on the upper sub-segment 404, and a stretch sensor 300B extending across the joint 450. Although the following method will be described in the context of an arm segment 402, it should be appreciated that similar methods can be applied to a leg segment 408 or other segment.
A first step comprises calibration. The user is positioned in a T-pose, providing an initial frame of reference for the movement and orientation recorded by the inertial sensor, and for the initial stretch amount of the stretch sensor. Any subsequent movements of the user will be calculated relative to the initial T-pose. As described above, the sensors can be calibrated all at once. In the present embodiment, the stretch sensor can be further calibrated to identify a maximum stretch amount, for example by having the user flex his elbow as much as possible and capturing the value recorded by the stretch sensor at maximum flexion. Next, and with reference to Figure 6, tracking information is gathered from the inertial and stretch sensors. With respect to the inertial sensor, orientation information (i.e. yaw, pitch and roll) is captured. This information is transformed into a rotation matrix, which defines the orientation of this sensor as a rotation relative to the T-pose. With respect to the stretch sensor, the output signal must be mapped to a corresponding flexion angle of the elbow.
In order to map the signal to a flexion angle, the signal must first be filtered. Any known suitable filtering algorithm can be applied, according to the properties of the sensor, in order to remove undesired noise, distortion or outlier values. The filtered signal can then be mapped to a flexion angle. This mapping can be accomplished, for example, by determining a function which represents the flexion angle as a function of the output of the stretch sensor. This function can vary according to the sensor configuration and type, and must therefore be determined experimentally, for example by flexing the stretch sensor by known amounts and observing the results. Depending on the sensor type and accuracy requirements, the function can be modelled as a linear or polynomial function. The model can further be adapted to account for initial stretching amounts.
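A sketch of the filtering and mapping steps, using a simple moving-average filter and a degree-2 polynomial fit; the calibration readings and angles below are hypothetical values of the kind that would be gathered experimentally.

    import numpy as np

    def moving_average(signal, window=5):
        # Basic low-pass filter; any suitable filtering algorithm may be used.
        kernel = np.ones(window) / window
        return np.convolve(signal, kernel, mode="same")

    # Hypothetical calibration data: raw readings at known flexion angles.
    readings = np.array([100.0, 180.0, 260.0, 335.0, 410.0])
    angles = np.array([0.0, 45.0, 90.0, 120.0, 150.0])  # degrees

    # Fit the mapping function; a linear fit (deg=1) may suffice for some sensors.
    coeffs = np.polyfit(readings, angles, deg=2)

    def stretch_to_angle(raw_value, initial_offset=0.0):
        # Map a filtered reading to a flexion angle, accounting for the
        # initial stretch amount captured during T-pose calibration.
        return float(np.polyval(coeffs, raw_value - initial_offset))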
Once the rotation matrix and the angle have been obtained, they can be processed in a fusion unit. In this embodiment, the fusion unit serves to predict the orientation of the lower segment based on the rotation matrix from the upper segment and based on the flexion angle. The rotation matrix obtained for the upper segment is transformed according to the flexion angle to obtain an estimated rotation matrix for the lower segment. The rotation matrices for the upper and lower segments can be applied to the 3D model of the user's body in order to accurately represent the body position of the user at any given time.
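A sketch of this prediction step, under the assumption that the joint acts as a hinge about the upper sensor's local z-axis (the description does not fix an axis convention):

    import numpy as np

    def estimate_lower_matrix(R_upper, flexion_deg):
        # Rotate the upper sub-segment's orientation about the assumed hinge
        # axis (local z) by the flexion angle to estimate the lower matrix.
        t = np.radians(flexion_deg)
        c, s = np.cos(t), np.sin(t)
        R_flex = np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])
        return R_upper @ R_flex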
Method C: Tracking segments using two inertial sensors and one stretch sensor
In an embodiment, a user's body movement can be tracked using inertial sensors and stretch sensors. Two inertial sensors and a stretch sensor are provided for each segment of the user's body: the inertial sensors being provided on the upper and lower sub-segments of a body segment, and the stretch sensor being provided along a joint which connects the upper sub-segment to a lower sub-segment. The inertial sensors provide orientation information of the upper sub-segment and lower sub-segment, while the stretch sensor provides an angle of the upper sub-segment relative to the lower sub-segment. The information from these three sensors can be combined to more accurately determine the actual position and orientation of the tracked segment.
In an embodiment, and with reference to Figures 2A, 3A and 4, a leg segment 408 can be tracked using a first inertial sensor 200F positioned on the upper sub-segment 410, a second inertial sensor 200G positioned on the lower sub-segment 412, and a stretch sensor 300C extending across the joint 450. Although the following method will be described in the context of a leg segment 408, it should be appreciated that similar methods can be applied to an arm segment 402 or other segment. Broadly described, the present method involves combining steps from the above-described Methods A and B in order to obtain a more accurate and/or reliable result. Method A can be performed using the two inertial sensors in order to obtain orientation information of the upper and lower sub-segments, and in order to estimate the flexion angle of the joint connecting the two sub-segments. Method B can then be performed in order to obtain a better estimation of the flexion angle using the stretch sensor. An optimal flexion angle can be determined based on the estimates obtained, and then the orientation information from the inertial sensors can be corrected according to the optimal flexion angle.
More specifically, a first step comprises calibration. The user is positioned in a T-pose, providing an initial frame of reference for the movement and orientation recorded by the inertial sensors, and for the initial stretch amount of the stretch sensor. Any subsequent movements of the user will be calculated relative to the initial T-pose. As can be appreciated, the sensors can be calibrated all at once, or one at a time. The stretch sensors can further be calibrated to identify the maximum and minimum flexion range. Next, and with reference to Figure 7, orientation information (i.e. yaw, pitch and roll) from each of the upper leg (thigh) and lower leg (tibia) inertial sensors is captured. This information is transformed into a rotation matrix, which defines the orientation of those sensors as a rotation relative to the T-pose. Similarly, stretch information is obtained from the stretch sensor, which can be mapped to a corresponding angle according to the steps described in Method B.
The rotation matrices and the flexion angle are fed into a fusion unit which can combine information obtained from all sensors and produce compensated rotation matrices which more accurately and reliably represent the actual motion of the tracked segment.
In the fusion unit, the pitch, yaw, and/or roll axes of the inertial sensors are aligned with one another, according to similar steps as described in Method A, and corrected rotation matrices are obtained. The corrected matrices can then be used to estimate the orientation of the joint, in this case the knee, and estimate its flexion angle. This estimate can be improved by applying known constraints, such as the 180 degree rotation limit of a knee, as described above in Method A.
Once the constraints have been applied and the rotation matrices have been further corrected, the orientation information can be improved again by adjusting it using information from the stretch sensor. The flexion angle obtained using the stretch sensor can be compared to the flexion angle obtained using the inertial sensors in order to determine the optimal flexion angle (i.e. the flexion angle which most accurately represents the actual flexion angle). The optimal flexion angle can then be used to further correct the rotation matrices of the inertial sensors so that they more accurately represent the user's actual movement.
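One way to realize the correction mentioned at the end of this paragraph is to re-orient the lower-segment matrix about the joint's flexion axis so that the angle between the two segment matrices equals the fused flexion angle. The sketch below assumes the flexion axis is the upper segment's local x-axis; that choice depends on sensor mounting and is an assumption, not a detail from the patent.

import numpy as np

def rotation_about_axis(axis, angle_rad):
    # Rodrigues' formula: rotation matrix about a unit axis.
    axis = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -axis[2], axis[1]],
                  [axis[2], 0.0, -axis[0]],
                  [-axis[1], axis[0], 0.0]])
    return np.eye(3) + np.sin(angle_rad) * K + (1.0 - np.cos(angle_rad)) * (K @ K)

def corrected_lower_matrix(R_upper, optimal_angle_deg, axis=np.array([1.0, 0.0, 0.0])):
    # Lower-segment orientation implied by the fused flexion angle,
    # modeled as a pure flexion away from the upper segment (assumed axis).
    R_flex = rotation_about_axis(axis, np.radians(optimal_angle_deg))
    return R_upper @ R_flex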
Determining the optimal flexion angle comprises determining which sensors are currently providing the most accurate flexion angle information. As can be appreciated, the stretch sensor generally provides a more accurate estimate of the flexion angle of the knee joint. Therefore, if the flexion angle estimated by the stretch sensor is determined to be within the allowable range (i.e. within the 180 degree constraint), the stretch sensor angle is chosen as the optimal flexion angle. The rotation matrices obtained from the inertial sensors can thus be corrected to correspond to the optimal flexion angle. Finally, the corrected rotation matrices for the upper and lower sub-segments can be applied to the 3D model of the user's body in order to accurately represent the user's body position at any given time.
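The selection rule just described can be transcribed almost directly. The sketch below is only the decision logic stated in the text; a practical fusion unit might instead blend the two estimates (e.g. with a weighted average or Kalman filter), and the function name is illustrative.

def optimal_flexion_angle(angle_stretch, angle_imu, min_deg=0.0, max_deg=180.0):
    # The stretch sensor is generally the more accurate source, so use it
    # whenever its estimate respects the joint's allowable range.
    if min_deg <= angle_stretch <= max_deg:
        return angle_stretch
    # Otherwise fall back on the inertial estimate, clamped to the range.
    return min(max(angle_imu, min_deg), max_deg)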
Various methods of processing the captured data allow determining, based on the rotation matrices (corrected, aligned and/or compensated) for an upper sub-segment that is an upper arm and for a lower sub-segment that is a lower arm, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation. Similarly, based on the rotation matrices (corrected, aligned and/or compensated) for an upper sub-segment that is an upper leg and for a lower sub-segment that is a lower leg, such methods allow determining one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
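As a hedged example of such processing, hip angles can be read off a corrected thigh rotation matrix by decomposing it back into Z-Y-X Euler angles (the inverse of the first sketch above). The mapping of each axis to an anatomical angle depends on sensor mounting and frame conventions, so the assignments below are assumptions for illustration only.

import numpy as np

def matrix_to_euler_zyx(R):
    # Inverse of euler_to_matrix: returns (yaw, pitch, roll) in radians.
    pitch = np.arcsin(np.clip(-R[2, 0], -1.0, 1.0))
    yaw = np.arctan2(R[1, 0], R[0, 0])
    roll = np.arctan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

def hip_angles_from_thigh(R_thigh):
    # Assumed axis-to-angle mapping; a real system would calibrate this.
    yaw, pitch, roll = matrix_to_euler_zyx(R_thigh)
    return {
        "hip_flexion_extension": float(np.degrees(pitch)),
        "hip_abduction_adduction": float(np.degrees(roll)),
        "hip_rotation": float(np.degrees(yaw)),
    }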
As can be appreciated, the above-described system and method provide many advantages over prior-art systems and methods. The strategic placement of sensors and the combination of sensor types allow the full-body motion of a user to be tracked in three dimensions using a reduced number of sensors. Reducing the number of sensors is more cost effective and allows for a wearable device which is not cumbersome to a user engaged in physical activity. The tracking information obtained using the system and method described above can provide valuable insight for coaching and for improving athletic and/or ergonomic performance. While the above description provides examples of the embodiments, it will be appreciated that some features and/or functions of the described embodiments are susceptible to modification without departing from the spirit and principles of operation of the described embodiments. Accordingly, what has been described above is intended to be illustrative and non-limiting, and it will be understood by persons skilled in the art that other variants and modifications may be made without departing from the scope of the invention as defined in the claims appended hereto.

Claims

1. A device for sensing three-dimensional (3D) motion of a user's body, the device comprising:
at least one form-fitting garment article having a plurality of compression areas formed of compressive fabric and a plurality of tracked areas, the compression areas being located to maintain position of the tracked areas over the user's body when the form-fitting garment article is worn by the user; and
a plurality of motion sensors positioned on the form-fitting garment article at the tracked areas, the motion sensors being operable to capture 3D motion.
2. The device of claim 1, wherein the compressive fabric of the compression areas of the form-fitting garment article applies a greater compressive force than other areas of the form-fitting garment article.
3. The device of claim 1 or 2, wherein the form-fitting garment article is formed of a continuously knitted material and wherein the compression areas correspond to more tightly knitted areas of the continuously knitted material.
4. The device of any one of claims 1 to 3, wherein the plurality of compression areas comprises an elbow compression area located over an elbow joint of the user's body when the form-fitting garment article is worn by the user.
5. The device of claim 4, wherein the plurality of tracked areas comprises an upper arm tracked area associated with the elbow compression area and the plurality of motion sensors comprises a sensor positioned at the upper arm tracked area.
6. The device of claim 5, wherein the plurality of tracked areas comprises a lower arm tracked area associated with the elbow compression area and the plurality of motion sensors comprises a sensor positioned at the lower arm tracked area.
7. The device of claim 5 or 6, wherein the sensor positioned at the upper arm tracked area is operable to sense a motion of a shoulder of the user.
8. The device of any one of claims 4 to 7, wherein the plurality of motion sensors comprises a stretch sensor extending over the elbow compression area.
9. The device of any one of claims 1 to 8, wherein the plurality of compression areas comprises a knee compression area located over a knee joint of the user's body when the form-fitting garment article is worn by the user.
10. The device of claim 9, wherein the plurality of tracked areas comprises an upper leg tracked area associated with the knee compression area and the plurality of motion sensors comprises a sensor positioned at the upper leg tracked area.
11. The device of claim 10, wherein the plurality of tracked areas comprises a lower leg tracked area associated with the knee compression area and the plurality of motion sensors comprises a sensor positioned at the lower leg tracked area.
12. The device of claim 11, wherein the plurality of compression areas further comprises a calf compression area located over a calf of the user's body when the form-fitting garment article is worn by the user, the calf compression area maintaining the lower leg tracked area in position in combination with the knee compression area.
13. The device of any one of claims 10 to 12, wherein the sensor positioned at the upper leg tracked area is operable to sense a motion of a hip of the user.
14. The device of any one of claims 9 to 13, wherein the plurality of motion sensors comprises a stretch sensor extending over the knee compression area.
15. The device of any one of claims 1 to 14, wherein the plurality of motion sensors further comprises a torso sensor positioned over a torso of the user's body.
16. The device of any one of claims 1 to 15, wherein each of the compression areas comprises a distinctive visual indicator.
17. The device of any one of claims 1 to 16, wherein the motion sensors comprise inertial measurement units.
18. The device of claim 17, wherein the inertial measurement units comprise sensor modules for measuring orientation in x, y, and z directions, acceleration in x, y, and z directions, rate of rotation in x, y, and z directions, and magnetic field in x, y, and z directions.
19. The device of any one of claims 1 to 18, wherein each motion sensor comprises a sensor capsule for performing the 3-D motion capture and a sensor holder for retaining the sensor capsule, the sensor holder being attached to the form-fitting garment article at one of the tracked areas and the sensor capsule being removable from the sensor holder.
20. The device of claim 19, wherein a backside of the sensor capsule comprises one or more electrical contacts for forming a conductive path with an electrical contact of the sensor holder.
21. The device of claim 19 or 20, wherein the sensor holders and the form-fitting garment article are machine washable when the sensor capsules are removed.
22. The device of any one of claims 1 to 21, further comprising a brain pack removably attached to the form-fitting garment article and a network of conductive channels providing a plurality of electrical connections between the brain pack and the plurality of motion sensors.
23. The device of claim 22, wherein the brain pack comprises a power supply module and wherein the motion sensors receive electrical power from the power supply module over the electrical connections.
24. The device of claim 22 or 23, wherein the conductive channels comprise conductive fibers embedded in the garment article.
25. The device of any one of claims 22 to 24, wherein the brain pack is configured to receive motion data generated by the plurality of motion sensors over the electrical connections.
26. The device of claim 25, wherein the brain pack further comprises non- volatile memory for storing the motion data.
27. The device of any one of claims 22 to 26, wherein the brain pack is attached to a waist portion of the form-fitting garment article.
28. The device of any one of claims 1 to 27, wherein the form-fitting garment article is a two-piece garment article comprising an upper body portion and a lower body portion.
29. A kit for capturing 3-D motion of a user, the kit comprising:
a plurality of sensor capsules operable to capture 3-D motion;
a form-fitting garment article having a plurality of sensor holders attached thereto, each sensor holder being configured to removably retain one of the sensor capsules; and
a brain pack removably attachable to the form-fitting garment article and configured to receive captured motion data from the sensor capsules.
30. The kit of claim 29, wherein the form-fitting garment article comprises a plurality of integrated conductive channels providing one or more electrical connections between the sensor capsules and one or more connectors adapted to mate with the brain pack.
31. The kit of claim 30, wherein a backside of each sensor capsule comprises one or more electrical contacts for forming a conductive path with an electrical contact of one of the sensor holders, the sensor capsule being connected to one of the conductive channels via the sensor holder.
32. The kit of claim 30 or 31, further comprising a plurality of stretch sensors removably attached to the form-fitting garment article, each stretch sensor having a plurality of contacts for connecting to the conductive channels.
33. The kit of any one of claims 30 to 32, wherein the brain pack comprises a power supply module and wherein the sensor capsules are operable to receive electrical power from the power supply module when the sensor capsules are received within the sensor holders and the one or more connectors are mated to the brain pack.
34. The kit of any one of claims 30 to 33, wherein the brain pack is configured to receive motion data captured by the sensor capsules when the sensor capsules are received within the sensor holders and the one or more connectors are mated to the brain pack.
35. The kit of claim 34, wherein the brain pack further comprises non-volatile memory for storing the motion data.
36. The kit of any one of claims 29 to 35, wherein the brain pack is attached to a waist portion of the form-fitting garment article.
37. The kit of any one of claims 29 to 36, wherein the form-fitting garment article and the sensor holders attached thereto are machine-washable when the sensor capsules and the brain pack are removed therefrom.
38. The kit of any one of claims 29 to 37, wherein the form-fitting garment article is a two-piece garment article comprising an upper body portion and a lower body portion.
39. A method for sensing three-dimensional (3D) motion of a user, the method comprising:
wearing, over the user's body, a form-fitting garment article having a plurality of compression areas and a plurality of tracked areas;
positioning the compression areas over a first set of body parts of the user to be compressed;
positioning the tracked areas over a second set of body parts of the user to be tracked; and
capturing, by a plurality of sensors positioned on the form-fitting garment article at the tracked areas, 3-D motion of the second set of body parts of the user.
40. The method of claim 39, wherein the compression areas are located to maintain position of the tracked areas over the second set of body parts when the compression areas are positioned over the first set of body parts.
41. The method of claim 40, wherein the compressive fabric of the compression areas of the form-fitting garment article applies a greater compressive force than other areas of the form-fitting garment article.
42. The method of any one of claims 39 to 41, wherein positioning the compression areas over the first set of body parts comprises positioning an elbow compression area over an elbow joint of the user.
43. The method of claim 42, wherein positioning the tracked areas over the second set of body parts comprises positioning an upper arm tracked area over an upper arm of the user's body.
44. The method of claim 43, wherein positioning the tracked areas over the second set of body parts further comprises positioning a lower arm tracked area over a lower arm of the user's body.
45. The method of claim 43 or 44, further comprising positioning a stretch sensor extending over the elbow compression area and measuring 3-D motion of the elbow with the stretch sensor.
46. The method of any one of claims 39 to 45, wherein positioning the compression areas over the first set of body parts comprises positioning a knee compression area over a knee joint of the user.
47. The method of claim 46, wherein positioning the tracked areas over the second set of body parts comprises positioning an upper leg tracked area over an upper leg of the user's body.
48. The method of claim 47, wherein positioning the tracked areas over the second set of body parts comprises positioning a lower leg tracked area over a lower leg of the user's body.
49. The method of claim 47 or 48, further comprising positioning a stretch sensor extending over the knee compression area and measuring 3-D motion of the knee with the stretch sensor.
50. The method of any one of claims 39 to 49, wherein measuring 3-D motion comprises measuring 3-D motion of the user's torso with a motion sensor positioned over a torso of the user's body.
51. The method of any one of claims 39 to 50, wherein each motion sensor comprises a sensor capsule for performing the 3-D motion capture and a sensor holder for retaining the sensor capsule, the sensor holder being attached to the form-fitting garment article at one of the tracked areas and the sensor capsule being removable from the sensor holder;
wherein the method further comprises, prior to measuring the 3-D motion, inserting each sensor capsule into a corresponding sensor holder.
52. The method of claim 51, further comprising, prior to measuring the 3-D motion:
attaching a brain pack to a waist portion of the form-fitting garment article;
connecting the brain pack to the sensor capsules via a plurality of conductive channels formed in the form-fitting garment article, the conductive channels providing a plurality of electrical connections between the brain pack and the plurality of motion sensors.
53. The method of claim 52, wherein the brain pack comprises a power supply module and wherein the motion sensors receive electrical power from the power supply module over the electrical connections.
54. The method of claim 52 or 53, further comprising:
detaching the brain pack from the form-fitting garment article;
removing the sensor capsules from the sensor holders; and
washing the form-fitting garment article with the sensor holders attached thereto.
55. The method of any one of claims 39 to 54, wherein the capturing by the plurality of sensors comprises:
capturing orientation information by a first inertial sensor positioned at an upper sub-segment of a body segment of the user;
capturing orientation information by a second inertial sensor positioned at a lower sub-segment of the body segment of the user; and
the method further comprising:
calibrating the first inertial sensor and second inertial sensor based on the orientation information captured when the user is in a calibration pose;
transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose;
transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose; and
adjusting one of the rotation matrices based on the other of the rotation matrices.
56. The method of claim 55, wherein the calibration pose is a T-pose.
57. The method of claim 55 or 56, wherein the second rotation matrix is adjusted based on the first rotation matrix.
58. The method of claim 57, wherein the adjusting comprises compensating the second rotation matrix such that it rotates within the same plane as the roll and yaw axes of the upper sub-segment.
59. The method of claim 58, further comprising determining an angle of a joint joining the upper sub-segment and the lower sub-segment based on the first rotation matrix and the adjusted second rotation matrix.
60. The method of claim 59, further comprising comparing the determined angle of the joint with a predetermined constraint and correcting the first rotation matrix and the adjusted second rotation matrix based on the comparison.
61. The method of claim 60, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the corrected first rotation matrix and corrected second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
62. The method of claim 60, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the corrected first rotation matrix and corrected second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
63. The method of any one of claims 39 to 54, wherein the capturing by the plurality of sensors comprises:
capturing orientation information by an inertial sensor positioned at an upper sub-segment of a body segment of the user;
capturing a stretch signal by a stretch sensor positioned at a joint connecting the upper sub-segment to a lower sub-segment of the body segment; and
the method further comprising:
calibrating the inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose;
transforming the orientation information captured by the inertial sensor into a first rotation matrix defining the orientation of the inertial sensor relative to the calibration pose;
mapping the stretch signal captured by the stretch sensor to flexion angle information; and
determining a corrected first rotation matrix based on the flexion angle in order to determine a second rotation matrix defining an orientation of the lower sub-segment.
64. The method of claim 63, wherein the calibration pose is a T-pose.
65. The method of claim 63 or 64, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
66. The method of claim 63 or 64, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
67. The method of any one of claims 39 to 54, wherein the capturing by the plurality of sensors comprises:
capturing orientation information by a first inertial sensor positioned at an upper sub-segment of a body segment of the user;
capturing orientation information by a second inertial sensor positioned at a lower sub-segment of the body segment;
capturing a stretch signal by a stretch sensor positioned at a joint connecting the upper sub-segment to the lower sub-segment; and
the method further comprising:
calibrating the first inertial sensor, the second inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose;
transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose;
transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose;
mapping the stretch signal captured by the stretch sensor to flexion angle information; and
compensating the first rotation matrix and the second rotation matrix based on the flexion angle.
68. The method of claim 67, wherein the calibration pose is a T-pose.
69. The method of claim 67 or 68, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
70. The method of claim 67 or 68, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
71 . A method for processing three-dimensional (3D) motion of a user, the method comprising:
receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user;
receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment of the user;
calibrating the first inertial sensor and second inertial sensor based on the orientation information captured when the user is in a calibration pose;
transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose;
transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose; and
adjusting one of the rotation matrices based on the other of the rotation matrices.
72. The method of claim 71, wherein the calibration pose is a T-pose.
73. The method of claim 71 or 72, wherein the second rotation matrix is adjusted based on the first rotation matrix.
74. The method of claim 73, wherein the adjusting comprises compensating the second rotation matrix such that it rotates within the same plane as the roll and yaw axes of the upper sub-segment.
75. The method of claim 74, further comprising determining an angle of a joint joining the upper sub-segment and the lower sub-segment based on the first rotation matrix and the adjusted second rotation matrix.
76. The method of claim 75, further comprising comparing the determined angle of the joint with a predetermined constraint and correcting the first rotation matrix and the adjusted second rotation matrix based on the comparison.
77. The method of claim 76, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the corrected first rotation matrix and corrected second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
78. The method of claim 76, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the corrected first rotation matrix and corrected second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
79. A method for processing three-dimensional (3D) motion of a user, the method comprising:
receiving orientation information captured by an inertial sensor positioned at an upper sub-segment of a body segment of the user;
receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to a lower sub-segment of the body segment;
calibrating the inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose;
transforming the orientation information captured by the inertial sensor into a first rotation matrix defining the orientation of the inertial sensor relative to the calibration pose;
mapping the stretch signal captured by the stretch sensor to flexion angle information; and
determining a corrected first rotation matrix based on the flexion angle in order to determine a second rotation matrix defining an orientation of the lower sub-segment.
80. The method of claim 79, wherein the calibration pose is a T-pose.
81. The method of claim 79 or 80, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
82. The method of claim 79 or 80, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
83. A method for processing three-dimensional (3D) motion of a user, the method comprising:
receiving orientation information captured by a first inertial sensor positioned at an upper sub-segment of a body segment of the user;
receiving orientation information captured by a second inertial sensor positioned at a lower sub-segment of the body segment;
receiving a stretch signal captured by a stretch sensor positioned at a joint connecting the upper sub-segment to the lower sub-segment;
calibrating the first inertial sensor, the second inertial sensor and the stretch sensor based on the orientation information and angle information captured when the user is in a calibration pose;
transforming the orientation information captured by the first inertial sensor into a first rotation matrix defining the orientation of the first inertial sensor relative to the calibration pose;
transforming the orientation information captured by the second inertial sensor into a second rotation matrix defining the orientation of the second inertial sensor relative to the calibration pose;
mapping the stretch signal captured by the stretch sensor to flexion angle information; and
compensating the first rotation matrix and the second rotation matrix based on the flexion angle.
84. The method of claim 83, wherein the calibration pose is a T-pose.
85. The method of claim 83 or 84, wherein the upper sub-segment is an upper arm and the lower sub-segment is a lower arm and the joint is an elbow joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of elbow flexion/extension, forearm pronation/supination, shoulder flexion/extension, shoulder vertical adduction/abduction, shoulder horizontal adduction/abduction, and shoulder rotation.
86. The method of claim 83 or 84, wherein the upper sub-segment is an upper leg and the lower sub-segment is a lower leg and the joint is a knee joint, the method further comprising determining, based on the first rotation matrix and the second rotation matrix, one or more of knee flexion/extension, hip flexion/extension, hip abduction/adduction, hip rotation, trunk flexion/extension, trunk lateral bending and trunk rotation.
87. A computer-implemented system comprising:
at least one data storage device; and
at least one processor operably coupled to the at least one storage device, the at least one processor being configured for performing the method of any one of claims 71 to 86.
88. A computer readable storage medium comprising computer executable instructions for performing the method of any one of claims 71 to 86.
PCT/CA2016/051398 2015-11-27 2016-11-28 Motion capture garment WO2017088068A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201562260395P 2015-11-27 2015-11-27
US62/260,395 2015-11-27

Publications (1)

Publication Number Publication Date
WO2017088068A1 (en) 2017-06-01

Family

ID=58763920

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2016/051398 WO2017088068A1 (en) 2015-11-27 2016-11-28 Motion capture garment

Country Status (1)

Country Link
WO (1) WO2017088068A1 (en)

Patent Citations (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6428490B1 (en) * 1997-04-21 2002-08-06 Virtual Technologies, Inc. Goniometer-based body-tracking device and method
US5963891A (en) * 1997-04-24 1999-10-05 Modern Cartoons, Ltd. System for tracking body movements in a virtual reality system
US6801140B2 (en) * 2001-01-02 2004-10-05 Nokia Corporation System and method for smart clothing and wearable electronic devices
US6563424B1 (en) * 2001-05-22 2003-05-13 Nokia Corporation Smart garment system, method and apparatus involved for integrating electronic devices into garments
US20040219498A1 (en) * 2002-04-09 2004-11-04 Davidson Lance Samuel Training apparatus and methods
US20080091373A1 (en) * 2006-07-31 2008-04-17 University Of New Brunswick Method for calibrating sensor positions in a human movement measurement and analysis system
US20080295216A1 (en) * 2007-05-31 2008-12-04 Nike, Inc. Articles of Apparel Providing Enhanced Body Position Feedback
US20090306485A1 (en) * 2008-06-03 2009-12-10 Jonathan Arnold Bell Wearable Electronic System
US8527228B2 (en) * 2010-06-04 2013-09-03 Apple Inc. Calibration for three dimensional motion sensor
US20110302686A1 (en) * 2010-06-14 2011-12-15 Salomon S.A.S Close-fitting sports garment
US20130274587A1 (en) * 2012-04-13 2013-10-17 Adidas Ag Wearable Athletic Activity Monitoring Systems
US20140070957A1 (en) * 2012-09-11 2014-03-13 Gianluigi LONGINOTTI-BUITONI Wearable communication platform
US20140318699A1 (en) * 2012-09-11 2014-10-30 Gianluigi LONGINOTTI-BUITONI Methods of making garments having stretchable and conductive ink
US20140172134A1 (en) * 2012-12-13 2014-06-19 Nike, Inc. Apparel Having Sensor System
WO2015138515A1 (en) * 2014-03-10 2015-09-17 L.I.F.E. Corporation S.A. Physiological monitoring garments
CN104026768A (en) * 2014-04-28 2014-09-10 金进精密泵业制品(深圳)有限公司 Intelligent suit and charging hanger thereof
CA165595S (en) * 2015-11-30 2016-11-04 9281-7428 Québec Inc. Heddoko™ Motion tracking shirt with trouser

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108294583A (en) * 2016-06-29 2018-07-20 Robert Bosch GmbH Clothing preserves element and the preservation system for clothes object
GB2569101A (en) * 2017-11-03 2019-06-12 Incus Performance Ltd Wearable exercise assessment system
GB2568503B (en) * 2017-11-17 2021-08-04 Jaguar Land Rover Ltd A motorcycle
EP3648856B1 (en) * 2017-12-21 2022-11-30 Sony Interactive Entertainment Inc. Position tracking apparatus and method
US11369866B2 (en) 2017-12-21 2022-06-28 Sony Interactive Entertainment Inc. Position tracking apparatus and method
US11849415B2 (en) 2018-07-27 2023-12-19 Mclaren Applied Technologies Limited Time synchronisation
US11510035B2 (en) 2018-11-07 2022-11-22 Kyle Craig Wearable device for measuring body kinetics
WO2020259859A1 (en) * 2019-06-28 2020-12-30 RLT IP Ltd. Motion capture system
GB2588235A (en) * 2019-10-18 2021-04-21 Mclaren Applied Tech Ltd Joint sensing
GB2588235B (en) * 2019-10-18 2023-07-12 Mclaren Applied Ltd Joint sensing
US11898874B2 (en) 2019-10-18 2024-02-13 Mclaren Applied Technologies Limited Gyroscope bias estimation
FR3102665A1 (en) * 2019-11-05 2021-05-07 Yukik Device for determining motion data
CN111994312A (en) * 2020-08-13 2020-11-27 上海精密计量测试研究所 Motion capture system training space suit based on virtual reality technology
WO2022035324A1 (en) * 2020-08-14 2022-02-17 Weta Digital Limited Wearable article for a performance capture system

Similar Documents

Publication Publication Date Title
WO2017088068A1 (en) Motion capture garment
JP7267497B2 (en) Garment with sensor system
US20210084999A1 (en) Dynamic proprioception
KR102519894B1 (en) detectable clothing
US20150182841A1 (en) Communication Module for Personal Performance Monitoring and related Arrangement and Method
US20210247841A1 (en) Motion capturing garments and system and method for motion capture using jeans and other garments
EP3142757B1 (en) Wearable device comprising one or more impact sensors
US9724040B2 (en) Garment integrating a system for collecting physiological data
CN203943102U (en) A kind of intelligent clothing and charging clothes hanger thereof
CN104026768A (en) Intelligent suit and charging hanger thereof
US20210100460A1 (en) Conformable Garment for Physiological Sensing
US11510035B2 (en) Wearable device for measuring body kinetics
CN106112997B (en) Ectoskeleton clothes
US11527109B1 (en) Form analysis system
CN209916310U (en) Appearance physiotherapy device is rectified to intelligence
CN211131576U (en) Intelligent wrist and hand orthosis based on 3D printing technology
TWI522052B (en) Smart sportswear
CN108175151A (en) Intelligent insole and monitoring system
US20230104675A1 (en) Wearable Device for Measuring Body Kinetics
CN207995977U (en) Intelligent insole and monitoring system
GB2567522A (en) Garment with sensors
CN113167576A (en) Method and system for estimating topography of at least two parts of body

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16867498

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16867498

Country of ref document: EP

Kind code of ref document: A1