WO2017055080A1 - System and method for supporting physical exercises - Google Patents

System and method for supporting physical exercises

Info

Publication number
WO2017055080A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
person
drone
projector
control unit
Prior art date
Application number
PCT/EP2016/071612
Other languages
French (fr)
Inventor
Gabriele PAPINI
Francesco SARTOR
Paul Anthony Shrubsole
Original Assignee
Koninklijke Philips N.V.
Priority date
Filing date
Publication date
Application filed by Koninklijke Philips N.V.
Publication of WO2017055080A1

Classifications

    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B19/00 - Teaching not covered by other main groups of this subclass
    • G09B19/003 - Repetitive work cycles; Sequence of movements
    • G09B19/0038 - Sports
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U10/00 - Type of UAV
    • B64U10/10 - Rotorcrafts
    • B64U10/13 - Flying platforms
    • G - PHYSICS
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B - EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B5/00 - Electrically-operated educational appliances
    • G09B5/06 - Electrically-operated educational appliances with both visual and audible presentation of the material to be studied
    • G - PHYSICS
    • G16 - INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H - HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H20/00 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H20/30 - ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/20 - UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms
    • B64U2101/24 - UAVs specially adapted for particular uses or applications for use as communications relays, e.g. high-altitude platforms, for use as flying displays, e.g. advertising or billboards
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B64 - AIRCRAFT; AVIATION; COSMONAUTICS
    • B64U - UNMANNED AERIAL VEHICLES [UAV]; EQUIPMENT THEREFOR
    • B64U2101/00 - UAVs specially adapted for particular uses or applications
    • B64U2101/30 - UAVs specially adapted for particular uses or applications for imaging, photography or videography
    • B64U2101/31 - UAVs specially adapted for particular uses or applications for imaging, photography or videography for surveillance

Definitions

  • each of the different images or image sequences stored in the memory unit is assigned at least one of the surfaces identified by the surveillance sensor as a suitable projection surface.
  • the assigned projection surfaces may be different for each image or image sequence depending on the kind of physical exercise shown in the image or image sequence; meaning that image sequence A may be assigned projection surface X, while image sequence B is assigned projection surface Y, etc.
  • control unit is configured to select said subset of the images or image sequences based on the observed spatial confinements in the environment of the person. If it is e.g. detected that the person is in a rather small room, only those training images and image sequences which relate to physical exercises that do not require much space are selected for said subset.
  • control unit may be configured to select said subset of the images or image sequences based on the characteristics of the one or more surfaces identified by the surveillance sensor. If e.g. the reflectivity, ambient lighting condition and/or surface condition of the ceiling of the room in which the person is located does not allow the ceiling to be used as projection surface, no images or image sequences are selected to the subset that have to be carried out while lying on the back.
  • the presented system does not only search for the optimal projection surface for each type of physical exercise, but already makes a kind of preselection of physical exercises that may suitably be carried out in the room the person is currently in. Other physical exercises that cannot suitably be carried out in said room may thus not even be accessible to the user; a sketch of such a preselection follows below.
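As an illustration, here is a minimal preselection sketch in Python. The Exercise record, its field names and all numbers are assumptions for the example; the patent does not specify how exercises or their space requirements are encoded:

```python
from dataclasses import dataclass

@dataclass
class Exercise:
    name: str
    required_area_m2: float   # free floor space the exercise needs
    surface: str              # required projection surface: "wall", "floor" or "ceiling"

def preselect(exercises, room_area_m2, usable_surfaces):
    """Keep only exercises that fit the room and have a usable projection surface."""
    return [e for e in exercises
            if e.required_area_m2 <= room_area_m2 and e.surface in usable_surfaces]

# Example: a small room whose ceiling was rejected as a projection surface,
# so back-lying exercises (projected onto the ceiling) drop out of the subset.
subset = preselect(
    [Exercise("squat", 1.0, "wall"), Exercise("pilates crunch", 2.0, "ceiling")],
    room_area_m2=6.0,
    usable_surfaces={"wall", "floor"},
)
print([e.name for e in subset])   # -> ['squat']
```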
  • control unit is configured to control the projector to adapt a size, a shape, an image brightness and/or an image contrast of the at least one image or image sequence projected by the projector based on characteristics of the at least one surface that has been selected for said at least one image or image sequence. Projection properties and settings of the projector may thus be individually adjusted to each different kind of physical exercise. This further increases the user comfort.
  • the settings of the projector may not only be controlled based on the characteristics of the projection surface selected for a certain kind of physical exercise, but also based on the characteristics of the projected physical exercise itself.
  • the settings of the projector may also be controlled based on the overall spatial confinements observed by the surveillance sensor. If the surveillance sensor e.g. detects that the environment is rather dark and/or the room (the environment) is rather small, the projector may automatically reduce the image brightness so as to optimize battery consumption.
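A hedged sketch of such a settings policy follows. Every threshold and value here is an invented placeholder; the text only commits to adapting brightness/contrast to the selected surface and dimming in dark or small rooms to save battery:

```python
def projector_settings(surface_reflectivity, ambient_lux, room_area_m2):
    """Map surface and room characteristics to projector settings (0..1 scales).

    All thresholds are invented for illustration; the text above only states
    that brightness/contrast may be adapted to the surface and that a dark
    and/or small room allows the brightness to be reduced to save battery.
    """
    brightness = 0.4 if (ambient_lux < 50 or room_area_m2 < 8) else 0.9
    contrast = 0.8 if surface_reflectivity < 0.3 else 0.5
    return {"brightness": brightness, "contrast": contrast}
```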
  • the surveillance sensor comprises a camera
  • the control unit is configured to determine the characteristics of the one or more surfaces identified by the surveillance sensor by controlling the projector to project a test image onto the one or more surfaces, controlling the camera to record a projection of the test image, and by comparing the test image to the projection of the test image recorded by the camera.
  • control unit is configured to project a test image onto different surfaces in the environment of the person during the initial sweep of the drone where it is moving around.
  • the projection quality of each surface may then be evaluated by means of image analysis techniques that compare the projected image with the source image (test image).
  • These evaluation results may be stored in the memory unit of the system or in any other database together with the identified location and orientation of the respective surface. In this way the drone is "aware" of the conditions in its surroundings after performing the initial sweep.
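One plausible way to implement the test-image comparison is a simple correlation score between the projected source image and the camera recording. The helper below is a sketch under the assumption that the recorded frame has already been cropped and warped to the projected region; the normalized-cross-correlation metric is one choice among many:

```python
import numpy as np

def projection_quality(test_image: np.ndarray, recorded: np.ndarray) -> float:
    """Score how faithfully a surface reproduces a projected test image.

    `recorded` is assumed to be the camera frame already cropped/warped to the
    projected region (the warping step is omitted). Returns ~1.0 for a perfect
    reproduction, lower values for dark, colored or uneven surfaces.
    """
    a = test_image.astype(np.float64).ravel()
    b = recorded.astype(np.float64).ravel()
    a = (a - a.mean()) / (a.std() + 1e-9)   # standardize to remove global
    b = (b - b.mean()) / (b.std() + 1e-9)   # brightness/contrast offsets
    return float(np.clip(np.dot(a, b) / a.size, -1.0, 1.0))   # correlation

# During the initial sweep, each candidate surface could then be stored as e.g.:
# surface_db[surface_id] = {"pose": pose, "quality": projection_quality(t, r)}
```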
  • the surveillance sensor is configured to identify one or more objects in the environment of the person, and the control unit is configured to select said subset of the images or image sequences based on said one or more identified objects.
  • the subset of images or image sequences relating to physical exercises that may be suitably carried out in the environment of the person/drone may also be selected based on an object detection technique.
  • Objects in the environment of the person may be observed and identified either as useful objects that can be of benefit to the person for certain types of physical exercises or as obstacles that interfere with certain types of physical exercises.
  • An identification of stairs, a chair, or a table could positively trigger the selection of physical exercises in which the stairs, the chair and/or the table may be beneficially used, whereas the same objects may exclude some other physical exercises for which the stairs, the chair and/or the table may not suitably be used. In this way the system makes optimal use of the space and objects in the environment of the person. Needless to say, the aforementioned object identification may also influence the preferred projection surface and/or the location of the drone during the projection.
  • system further comprises an input unit for receiving information including personal data and/or preferences of the person, wherein the control unit is configured to select said subset of the images or image sequences based on said information.
  • the control unit may alternatively or additionally also be configured to select the at least one surface onto which the images or image sequences are projected based on said personal information.
  • the personal data may e.g. include a height or weight of the person.
  • the preferences of the person may include preferences regarding certain kinds of physical exercises that the user likes or dislikes.
  • this embodiment allows a further refinement of the selection of the physical exercises/the images or image sequences that are related to them as well as a refinement of the selection of the optimal projection surface for the projection of the images or image sequences. For example, if the person is rather tall, only projection surfaces above a certain height will be assigned to images or image sequences that are related to physical exercises carried out in an upright body position.
  • the system further comprises an output unit arranged at the drone for providing audible and/or visual instructions to the person, which instructions indicate where the person shall position him-/herself and/or how the person shall carry out the physical exercise to which the at least one image or image sequence projected by the projector is related.
  • These instructions may include simple feed-forward instructions as well as feedback instructions.
  • the control unit may e.g. be configured to determine said instructions based on at least one of: the observed spatial confinements in the environment of the person, the kind of physical exercise to which the at least one image or image sequence projected by the projector is related, and the characteristics of the at least one surface that has been selected for said at least one image or image sequence.
  • the drone further comprises a tracking unit for tracking a movement of one or more body parts of the person while performing the physical exercises, wherein the control unit is configured to adapt the instructions to the person based on the tracked movement of the one or more body parts of the person.
  • According to this embodiment, the system is thus able to correct the person if the person is not carrying out the physical exercises properly.
  • the drone comprises a tracking unit for tracking a movement of one or more body parts of the person while performing the physical exercises, and a projector movement actuator for moving the projector relative to the drone, wherein the control unit is configured to control the projector movement actuator based on the tracked movement of the one or more body parts of the person.
  • This embodiment allows, e.g., the image or image sequence projected by the projector to be moved in conjunction with the movement of the head of the person. This is particularly advantageous for physical exercises that require extensive head movements, since the person may then still see the projected image or image sequence.
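A minimal sketch of the geometry such a projector movement actuator would need: pan and tilt angles pointing from the drone to the tracked head position. The coordinate frame, the gimbal zero orientation and the `actuator`/`tracker` objects in the usage comment are assumptions:

```python
import math

def gimbal_angles(drone_pos, head_pos):
    """Pan/tilt angles (degrees) pointing the projector from the drone toward
    the person's head. Positions are (x, y, z) in a shared world frame; the
    frame and the gimbal's zero orientation are assumptions."""
    dx, dy, dz = (h - d for h, d in zip(head_pos, drone_pos))
    pan = math.degrees(math.atan2(dy, dx))                    # around vertical axis
    tilt = math.degrees(math.atan2(dz, math.hypot(dx, dy)))   # elevation angle
    return pan, tilt

# e.g. re-aim the projector every frame with the tracked head position:
# actuator.move(*gimbal_angles(drone.position(), tracker.head_position()))
```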
  • the drone comprises at least one locomotion actuator
  • the control unit is configured to deactivate the at least one locomotion actuator of the drone before controlling the projector to project the at least one of the images or image sequences.
  • the control unit in this embodiment first stops the movement of the drone, e.g. by landing the drone in case of a flying drone, before the projection starts.
  • control unit and/or the memory unit of the system may be arranged at the drone, but do not necessarily have to be arranged at the drone.
  • the control unit and/or the memory unit may also be provided as separate devices that are arranged locally remote from the drone. In such embodiments it is preferred that the drone is connected to the control unit and/or the memory unit by means of a wireless data connection.
  • Fig. 1 schematically illustrates a first embodiment of a system according to the present invention
  • Fig. 2 schematically illustrates possible units/components of a control unit used in the system according to the present invention
  • Fig. 3 schematically shows a second embodiment of the system according to the present invention
  • Fig. 4A schematically illustrates a first exemplary situation during usage of the system according to the present invention
  • Fig. 4B shows a second exemplary situation during usage of the system according to the present invention.
  • Fig. 5 shows a block diagram which schematically illustrates an embodiment of the method according to the present invention.
  • Fig. 1 schematically illustrates components of a system according to the present invention, which system is configured for supporting a person performing physical exercises.
  • the system is therein denoted in its entirety by reference numeral 10.
  • the system acts as a kind of virtual coach which supports a person during the performance of a physical exercise by giving feed-forward and feedback information about the particular physical exercise.
  • a drone 12 is implemented as a flying drone.
  • the drone 12 may e.g. be realized as a helicopter, quadcopter, hexacopter, etc. However, it shall be noted that other types of unmanned vehicles may be used as drone 12 as well.
  • the drone 12 is apart from that not restricted to a flying drone, but may also be realized in the form of a ground-moving drone.
  • the drone 12 includes a control unit 14, a memory unit 16 as well as several sensors 18-22 and actuators 24-34.
  • the control unit 14 may be generally imagined as the "brain" of the system 10.
  • the control unit 14 may be realized as a CPU or other kind of processor having software stored thereon which is configured to control the actuators 24-34 of the drone 12 based on the input signals of the sensors 18-22 of the drone 12.
  • the control unit 14 is e.g. configured to coordinate the movements of the drone 12, analyze the environment around the drone 12 and based thereupon coordinate the interactions of the drone 12 with the user of the system 10.
  • the control unit 14 is according to the first embodiment shown in Fig. 1 arranged at or within the drone 12. However, it shall be noted that parts of the control unit 14 or the whole control unit 14 may also be arranged in one or more external devices so as to control the functions of the drone 12 from a remote location (see second embodiment shown in Fig. 3 and explained further below).
  • the memory unit 16 is preferably realized as a non-volatile or volatile memory, such as a flash memory or hard drive. It is used in the system 10 according to the present invention to either temporarily or permanently store electronic data generated and used during and for the operation of the drone 12 and its components.
  • the memory unit 16 is, same as the control unit 14, according to the first embodiment shown in Fig. 1 arranged at or within the drone 12, but may be also arranged in a separate device remote from the drone 12 (as in the second embodiment shown in Fig. 3).
  • the drone 12 further comprises a surveillance sensor 18 which is configured for environmental sensing, such as observing the spatial confinements in the environment of the drone 12, surface and object detection, etc.
  • the surveillance sensor 18 is generally used for investigating and exploring the space around the drone 12.
  • the surveillance sensor 18 preferably includes one or more cameras (optical, infrared, night vision, etc.).
  • the surveillance sensor 18 includes a 3D camera and/or a range-imaging camera such as a time-of-flight camera, which is able to determine relative depths and positions of objects, surfaces and persons in the space.
  • the surveillance sensor 18 may alternatively or additionally also include a gyroscope, a barometer, an accelerometer, an inclinometer, a light sensor, a magnetometer, a proximity sensor, a pressure sensor, a chemical sensor, a sound sensor, an airflow sensor, a humidity sensor and/or a microphone.
  • the drone 12 furthermore includes an input unit 20.
  • the input unit 20 may comprise any type of device that allows the user of the system 10 to input any data or information.
  • the input unit 20 may e.g. comprise a microphone, one or more buttons, or a touchscreen.
  • the input unit 20 is arranged at the drone 12.
  • the input unit 20 may also be part of a separate device that is arranged remote from the drone 12, e.g. as offered by a smartphone or tablet device located near the user.
  • the drone 12 according to the first embodiment may furthermore comprise a tracking unit 22.
  • This tracking unit 22 may include any type of device that is suitable for tracking a person or body part of a person.
  • the tracking unit 22 is e.g. used in the system 10 for observing a person and tracking the movements of one or more body parts of the person while said person is performing a physical exercise. Even though the tracking unit 22 is in Fig. 1 illustrated as a separate component, the tracking unit 22 may generally also be implemented as a part of the surveillance sensor 18, such that the tracking unit 22 e.g. makes use of the same camera as the surveillance sensor 18.
  • the actuators 24-34 of the drone 12 according to the first embodiment include a projector 24, a projector movement actuator 26, an output unit 28, and a locomotion actuator 32 which in the present case includes a motor 30 and one or more propellers 34.
  • the projector 24 may include any type of optical device that is able to project an image or image sequence onto a surface (projection surface).
  • the projector 24 is preferably configured to be free to rotate its direction of projection independent of the orientation of the drone 12. This is preferably realized by means of the projector movement actuator 26.
  • This projector movement actuator 26 preferably includes a motorized gimbal that is suitable for swiveling the projector 24 in a 360° motion. While this is a preferred design, there are of course a lot of other technical designs to accomplish a 360° motion of the projector 24.
  • the output unit 28 is used for outputting feedback to the user of the system 10.
  • the output unit 28 may comprise a display, a loudspeaker, and/or a laser pointing device.
  • the output unit 28 may, similar to several of the aforementioned sensors and actuators, be arranged either at the drone 12 (see first embodiment shown in Fig. 1) or remote therefrom and included in a separate device (see second embodiment shown in Fig. 3).
  • the control unit 14 preferably controls all of the aforementioned sensors 18-22 and actuators 24-34 in order to operate the system 10 and carry out the functions of the system 10.
  • One of the main functions of the system 10 is the identification of an optimal projection location onto which the projector 24 of the drone 12 may project one or more images or image sequences relating to training instructions for different kinds of physical exercises.
  • the selection of the optimal projection location is preferably individualized for each kind of physical exercise, meaning that the system 10 may choose different projection surfaces for different images or image sequences.
  • the system 10 may thereto take at least one of the following factors into account: (a) the projection viewing quality of different parts of the environment, (b) the type of physical exercise shown in the image or image sequence, (c) the required space needed by a person to conduct the physical exercise, (d) the viewing position of a person in the space while performing the physical exercise, (e) the availability of objects in the space that can aid or interfere with a physical exercise.
  • the control unit 14 may be logically divided into different sub-units, as this is schematically shown in Fig. 2.
  • the control unit 14 may e.g. comprise an environment analysis unit 14.1, an object detection unit 14.2, an exercise recommender unit 14.3, a user detection and tracking unit 14.4 and a projector and drone steering unit 14.5. It shall be noted that these sub-units 14.1-14.5 do not necessarily have to exist structurally, but are used in the following to explain the logical functions of the control unit 14.
  • the explanation of the logical functions of the control unit 14 is set out below in conjunction with the schematic sketches illustrated in Figs. 4A and 4B as well as in conjunction with the block diagram illustrated in Fig. 5.
  • the environment analysis unit 14.1 of the control unit 14 is preferably configured to carry out steps S10-S14.
  • in step S10 the drone 12 generally observes its environment, so as to be aware of the location and space surrounding the drone 12.
  • the locomotion actuator 32 is thereto preferably controlled to perform a sweep in which the drone 12 is automatically moved around in space.
  • the locomotion actuator 32 may be controlled to move the drone 12 during said sweep along a pre-defined trajectory.
  • This situation is schematically sketched in Fig. 4A.
  • the surveillance sensor 18 may help to prevent collisions with obstacles, walls or ceilings during said sweep.
  • the surveillance sensor 18 is during said sweep also used for observing the spatial confinements in the environment of the person 36 and to identify one or more surfaces in the environment of the person 36 that are generally suitable as projection surfaces.
  • the sweep is preferably started from the location of the person 36.
  • Locations can be determined absolutely during the sweep using an indoor/outdoor positioning technology, e.g. GPS, or they can be defined relatively by using a sensor that tracks the position relative to a reference position at the start of the sweep.
  • a 3D camera may be part of the surveillance sensor 18 so as to be able to determine relative depths and positions of objects and users in the environment of the drone 12. All observed location data are stored during or after step S10 in the memory unit 16.
  • the projection quality of the one or more surfaces identified in the environment of the drone 12 is assessed.
  • This assessment may include an evaluation of several different characteristics.
  • the environment analysis unit 14.1 may e.g. evaluate for each identified surface a size of the surface, a surface condition of the surface, an orientation in space of the surface, a reflectivity of the surface, a color of the surface, and/or an ambient lighting condition of the surface.
  • the control unit 14 preferably controls the projector 24 to project a test image onto each of the one or more surfaces identified by the surveillance sensor 18.
  • the control unit 14 at the same time controls the surveillance sensor 18 to record the projected test image e.g. by means of the above-mentioned 3D camera.
  • the recorded projection of the test image may then be compared with the source image (the test image) by means of image analysis techniques. These image analysis techniques may result in an evaluation of one or more of the above-mentioned characteristics of each of the one or more surfaces that were identified by the surveillance sensor in the first step S10.
  • the evaluation results, i.e. the determined projectability characteristics of each surface are preferably stored in the memory unit 16 together with the earlier detected location information of each surface.
  • the environment analysis is further detailed in step S14.
  • the environment analysis unit 14.1 may be configured to determine in this process step for each identified projection surface a space that may be used by the person 36 to perform a physical exercise while at the same time being able to observe an image projected onto said surface. This information may be again stored in the memory unit 16 in conjunction with each identified surface, its earlier defined characteristics and position information.
  • the object detection unit 14.2 may identify in step S16 one or more objects in the environment of the person 36 by means of the surveillance sensor 18 and may assess these objects for their utility to support physical exercises.
  • the identified objects may be stored in the memory unit 16 and tagged in the memory unit 16 according to their location, dimensions, type and utility. Some objects will be tagged with a negative utility. This means that their location may interfere with physical exercises based on an estimated position of the person 36 for certain projection locations. Other objects will be tagged with a positive utility relative to a projection surface. E.g. if a table or an edge at a suitable height is detected, it may be associated with a positive utility for physical exercises that require support when standing on one leg.
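A small sketch of how such utility tagging could be represented. The record layout and the example entries are hypothetical; the text only specifies that location, dimensions, type and a positive or negative utility are stored:

```python
from dataclasses import dataclass

@dataclass
class TaggedObject:
    kind: str          # e.g. "chair", "table", "stairs"
    location: tuple    # (x, y, z) in the sweep's reference frame
    dimensions: tuple  # (width, depth, height) in metres
    utility: int       # +1 supports certain exercises, -1 interferes with them

# Hypothetical entries matching the examples in the text above:
detected = [
    TaggedObject("table", (2.0, 1.0, 0.0), (1.2, 0.8, 0.75), +1),   # one-leg support
    TaggedObject("chair", (0.5, 0.5, 0.0), (0.5, 0.5, 0.9), -1),    # blocks floor space
]
```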
  • the person 36 himself may provide in step S18 information including personal data and/or preferences of the person 36 to the system 10 via the input unit 20.
  • the exercise recommender unit 14.3 may then recommend one or more physical exercises to the person 36. It should be clear that in fact not the physical exercises themselves are recommended to the person 36, but rather the images or image sequences that relate to the different kinds of physical exercises.
  • the exercise recommender unit 14.3 does, however, not only recommend images or image sequences to the person 36, but also determines at least one of the one or more surfaces that has been identified by the surveillance sensor 18 and that is optimal for the projection of said image or image sequence.
  • the exercise recommender unit 14.3 selects a subset of images or image sequences relating to physical exercises that may suitably be carried out in the observed environment.
  • This selection may be based on one or more of the above-mentioned parameters that have been evaluated in steps S10-S18.
  • Each of the subset of images or image sequences is then assigned at least one of the evaluated surfaces (step S22). This assignment of the potential projection surfaces to each of the subset of selected images or image sequences is again preferably stored in the memory unit 16.
  • the exercise recommender unit 14.3 selects in steps S20 and S22 the most suitable physical exercises, locations for projections and locations for the person 36. Since each image or image sequence stored in the memory unit 16 is tagged in the memory unit 16 together with the characteristics of the physical exercises shown in the image or image sequence, the exercise recommender unit 14.3 so to speak knows the physical exercise in terms of knowing what space is needed for the physical exercise, in what body position the physical exercise has to be carried out, and/or how large the projection of the image sequence should be at minimum to be observable during the physical exercise. Exercises may be stored together with their images or image sequences in the memory unit 16 as a semantic description including information regarding space, motion, physical properties and/or physical requirements of the exercise.
  • the control unit 14 is thus able to choose the optimal projection surface for each of the subset of selected physical exercises individually.
  • an image or image sequence showing a squat exercise may be tagged with information that at least 1x1 m of free space is needed and that the person's view is assumed to be always more or less parallel to the ground.
  • this may imply that a wall surface, which is at a height of 1.6 m-2.0 m and has a free space of 1x1 m in front of it, is the optimal projection surface for the squat exercise.
  • a ceiling surface may be associated with an image or image sequence which is related to a physical exercise where the user is on his back, e.g. a bench press exercise.
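The following sketch illustrates matching an exercise's semantic description to an identified surface, using the squat example from above. All field names are assumptions, since the patent does not fix a data format:

```python
def assign_surface(exercise, surfaces):
    """Return the first identified surface satisfying the exercise's semantic
    description. The field names are illustrative assumptions; the text above
    only says that space, viewing direction and minimum projection size are
    stored with each exercise."""
    for s in surfaces:
        if (s["orientation"] == exercise["view"]                  # wall/floor/ceiling
                and s["free_space_m2"] >= exercise["space_m2"]
                and s["size_m2"] >= exercise["min_projection_m2"]):
            return s
    return None

squat = {"view": "wall", "space_m2": 1.0, "min_projection_m2": 0.5}
surfaces = [{"orientation": "wall", "free_space_m2": 1.2, "size_m2": 0.8}]
print(assign_surface(squat, surfaces))   # -> the wall patch from the example above
```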
  • the memory unit 16 is constantly updated during steps S10-S22.
  • the different kinds of data mentioned above do not necessarily have to be stored in one and the same memory unit 16, but may also be stored in a plurality of separate memory units.
  • the exercise recommender unit 14.3 may finally recommend the selected physical exercises/image sequences to the person 36 in step S24.
  • the recommended exercises/image sequences may be e.g. displayed on a display that is part of the output unit 28.
  • the projector and drone steering unit 14.5 controls in step S28 the locomotion actuator 32 of the drone 12 to move to a certain location and controls the projector 24 to project the selected image sequence onto the projection surface that has been previously assigned to it (see also Fig. 4B).
  • the control unit 14 may furthermore control the projector 24 to adjust the projector settings, such as the size, shape, image brightness and/or image contrast of the projected image. This adjustment may be based on at least one of the parameters that have been previously determined by the system 10 in steps S10-S26.
  • the selected image sequence may be finally projected to the selected projection surface in step S30.
  • additional visible, and/or audible instructions may be output via the output unit 28.
  • Step S34, which is performed by the user detection and tracking unit 14.4, shows a feedback loop to steps S30 and S32.
  • the person 36 may be tracked by means of the tracking unit 22.
  • the tracked movements of at least one or more body parts of the person 36 may be evaluated, so that the control unit 14 may then accordingly control the projector movement actuator 26 or the instructions output via the output unit 28 based on the tracked movement of the one or more body parts of the person 36.
  • the tracking unit 22 in other words follows the movement of the person 36 during the physical exercise and observes whether the person 36 is performing the physical exercise in a correct way.
  • the tracking unit 22 thereto preferably includes a camera which is configured to follow the person 36 while conducting the physical exercise.
  • the analysis of the movement of the person 36 may be done by using an image analysis process combined with the biometrics of the subject to compare the execution to the ideal execution of the physical exercise stored in the memory unit 16.
  • the control unit 14 may e.g. be configured to adapt the projection and/or the user instructions in case a deviation of the tracked movement relative to the ideal movement is detected to be above a certain threshold.
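As a sketch of such a threshold test, assuming tracked and ideal trajectories are available as equally sampled 3D position arrays (the metric and the threshold are illustrative, not taken from the patent):

```python
import numpy as np

def deviates(tracked: np.ndarray, ideal: np.ndarray, threshold_m: float = 0.15) -> bool:
    """True if a tracked body-part trajectory strays too far from the ideal one.

    Both arrays hold (N, 3) positions sampled at the same instants; the mean
    Euclidean error metric and the 15 cm threshold are illustrative assumptions.
    """
    error = np.linalg.norm(tracked - ideal, axis=1)   # per-sample distance
    return float(error.mean()) > threshold_m
```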
  • the drone 12 may then e.g. show the person 36 the right execution of the physical exercise by showing a trajectory that certain body parts have to follow during the physical exercise. Said trajectory may be e.g. generated by means of a laser that is pointed to one of the previously identified projection surfaces.
  • the drone 12 may show the person 36 a live feed of the user with an overlay of feedback and feedforward information relating to the exercise.
  • the control unit may be, in other words, configured to control the projector 24 to project not only the image or image sequence relating to the currently performed exercise, but to concurrently project also an image sequence recorded by means of the camera of the tracking unit 22 (e.g. by overlaying it onto the exercise image sequence).
  • the pre-rendered images may be adapted and combined with the live camera feed prior to them being rendered through the projector 24. They may also be procedurally generated based on the exercise profile and context information from the sensors.
  • the control unit 14 can also take into account the position and/or angle of the camera of the tracking unit 22 whilst performing an exercise to give the person 36 a useful representation of themselves when projecting the live feed.
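A minimal sketch of the described overlay as a simple alpha blend, assuming both frames are already the same size and perspective-corrected (those steps are omitted here):

```python
import numpy as np

def overlay(instruction_frame: np.ndarray, live_frame: np.ndarray,
            alpha: float = 0.4) -> np.ndarray:
    """Blend the live camera feed over the projected instruction frame.

    Both frames are assumed to be equally sized HxWx3 uint8 images; alpha
    weights the live feed, as a simple stand-in for the overlay of feedback
    and feedforward information described in the text above.
    """
    mixed = (alpha * live_frame.astype(np.float32)
             + (1.0 - alpha) * instruction_frame.astype(np.float32))
    return mixed.astype(np.uint8)
```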
  • the tracking unit 22 uses the same camera as the surveillance sensor 18, such that only one camera is needed.
  • the drone 12 can first calibrate to the height of the person's head by moving to that height/position prior to the physical exercise, wherein the height may be estimated from an air pressure or ultrasound sensor. The drone 12 can then move the 3D camera (surveillance sensor 18) to see if the projected image is generally observable and can then correct for this. With (indoor) positioning technology, the calibration step could be carried out even more easily, since the positioning technology will know where it is in space at any time. Otherwise, the relative changes in position between the position of the physical exercise and the final position of the drone 12 would need to be estimated. Regarding the real-time height estimate of the person's head from the floor, a 3D/range-imaging camera is preferred (e.g. a time-of-flight camera).
  • control unit 14 may control the projector movement actuator 26 to move the projector 24 according to a forecasted head movement or according to a head movement that is tracked in real time.
  • the input of the sensors 18-22 may be used to determine the speed of the body movements of the person 36, wherein the control unit 14 is configured to adapt the speed of the projected video sequence accordingly. For example, if it is detected via the microphone that is part of the input unit 20 that the user starts to breathe more heavily or grunt from straining too much, the control unit 14 may control the projector 24 to pause the projected image sequence or to slow down the reproduction of the projected image sequence.
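A hedged sketch of such a speed adaptation. The breathing threshold, the rates and the strain flag are invented placeholders; the text only states that the projection may be paused or slowed:

```python
def playback_rate(breaths_per_min: float, strain_detected: bool) -> float:
    """Map simple strain cues to a playback speed for the projected sequence.

    Threshold and rates are invented for illustration; the text above only
    states that the sequence may be paused or slowed when the user audibly
    strains.
    """
    if strain_detected:            # e.g. grunting picked up by the microphone
        return 0.0                 # pause the image sequence
    if breaths_per_min > 30:       # heavy breathing
        return 0.5                 # slow the reproduction down
    return 1.0                     # normal speed
```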
  • the second embodiment which is schematically illustrated in Fig. 3, is generally able to perform the same processes as explained above with reference to the first embodiment shown in Fig. 1. Hence, only the differences between the two embodiments will be described in the following.
  • the same or similar reference numerals are used for the same or similar components of the system 10.
  • the memory unit 16' is according to the second embodiment realized as a separate device that is arranged locally remote from the drone 12.
  • the memory unit 16' may e.g. be implemented by an external database or cloud server.
  • a further difference is the usage of an extra computing device, such as a smartphone or tablet PC, that implements the above-mentioned functions of the control unit 14', the input unit 20' and/or the output unit 28'.
  • the drone 12 according to the second embodiment furthermore comprises a data communication unit 38 by means of which the drone 12 may communicate with the memory unit 16' and the external computing device implementing the control unit 14', input unit 20' and/or output unit 28'.
  • Figs. 1 and 3 only show two exemplary embodiments of the herein presented system 10. It shall be also noted that Fig. 5 also illustrates only an exemplary embodiment and that not all steps illustrated in Fig. 5 are necessary steps. For example, steps S16, S18, S28, S30 and S34 are optional steps.
  • a computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.

Abstract

The present invention relates to a system for supporting a person performing physical exercises, comprising: - a flying, and/or ground-moving drone; - a projector arranged at the drone; - a surveillance sensor arranged at the drone for observing spatial confinements in an environment of the person and for identifying one or more surfaces in the environment of the person; - a memory unit for storing images or image sequences relating to training instructions for different kinds of physical exercises; and - a control unit which is configured: to select, for each of at least a subset of the images or image sequences stored in the memory unit, at least one of the one or more surfaces identified by the surveillance sensor, based on at least one of: (i) the observed spatial confinements, (ii) the kind of physical exercise, and (iii) characteristics of the one or more surfaces, and to control the projector upon request to project at least one of the images or image sequences of the at least one subset onto the at least one surface that has been selected for said at least one image or image sequence.

Description

System and method for supporting physical exercises
FIELD OF THE INVENTION
The present invention generally relates to the field of personal training, fitness and coaching for persons performing any type of physical exercise. In particular, the present invention relates to a system that makes use of a drone for supporting a person performing physical exercises. The present invention also relates to a corresponding method as well as to a corresponding computer program.
BACKGROUND OF THE INVENTION
It is well known that many people significantly benefit from having a personal trainer to achieve wellness, fitness and physical performance goals, such as losing weight, improving cardiac health, or completing a marathon. On the other hand, many people have difficulties in finding an appropriate personal trainer with an adequate background. The compensation of such a personal trainer is of course another problem for many people. Lack of time and scarce flexibility are further reasons why particularly employed persons decide against working with personal trainers.
Nowadays there is an upcoming trend to replace personal trainers by automatic solutions. The number of fitness apps or computer programs which provide "virtual coaches" is constantly increasing. Even though such apps and computer programs are comparatively cheap, easy to get and flexible in use, state of the art solutions show several drawbacks.
Currently known solutions for virtual coaches are usually limited to providing set exercise programs, i.e. giving passive feedback, and, in the best-case scenario, delegate the coaching feedback part to a hybrid solution of e-coaching and human remote coaching. Typical systems of the state of the art are: an app that can track your heart rate and gives feedback on your performance, a pedometer that counts the distance that you run during a training session, etc.
A further reason why virtual coaches are currently not able to substitute for a personal trainer is that they are not physically present to provide online feedback when the exercise occurs. Basic fundamentals of training methodology are: assessment, interpretation, prescription, which consists of training modality, frequency, duration, intensity, precautions and progressions. Next to this basic process of developing an exercise program, online feedback on exercise execution and training adjustments according to individual responses to each single physical exercise or training session are required. All this is usually part of the professional knowledge of a qualified personal trainer.
While developing a static training program can be a rather easy task, based on said guidelines and recommendations, altering such program according to improvements, injuries, illnesses, holidays, or other commitments is not straightforward for virtual coaches implemented in apps, computer programs, or other automated systems.
Another aspect of physical training which is not trivial to achieve without making use of a trained professional personal trainer is the correct execution of the physical exercises.
US 2008/0269016 A1 relates to an adaptive training system with aerial mobility. This system includes a mobile unit which includes multiple sensors,
communication devices and a mobility system. The mobile unit executes one or more training paths to simulate chasing associated with various sports. The mobile unit is furthermore capable of determining its own location and the location of the athlete throughout a training session. Still further, the mobile unit is configured to adapt the training path to stress weaknesses of the athlete with respect to various types of athletic skills.
Even though the system proposed in US 2008/0269016 A1 already provides several benefits with respect to the above-mentioned problems, there is still room for improvement, especially when considering indoor-based exercise programs.
SUMMARY OF THE INVENTION
It is an object of the present invention to provide a system and method which provide an improved support for persons performing physical exercises. It is particularly an object of the present invention to provide a system and method which make use of virtual training and coaching techniques in a way that is more comfortable and more efficient for the user.
In a first aspect of the present invention a system is presented that comprises: a flying, and/or ground-moving drone;
a projector arranged at the drone; a surveillance sensor arranged at the drone for observing spatial confinements in an environment of the person and for identifying one or more surfaces in the environment of the person;
a memory unit for storing images or image sequences relating to training instructions for different kinds of physical exercises; and
a control unit which is configured:
to select for each of at least a subset of the images or image sequences stored in the memory unit at least one of the one or more surfaces identified by the surveillance sensor, said selection being based on at least one of: (i) the observed spatial confinements in the environment of the person, (ii) the kind of physical exercise to which each image or image sequence of the at least one subset is related, and (iii) characteristics of the one or more surfaces identified by the surveillance sensor, and
to control the projector upon request to project at least one of the images or image sequences of the at least one subset onto the at least one surface that has been selected for said at least one image or image sequence.
In a second aspect of the present invention a method is presented that supports a person performing physical exercises by means of a flying, and/or ground-moving drone which comprises (i) a projector and (ii) a surveillance sensor for observing spatial confinements in an environment of the person and for identifying one or more surfaces in the environment of the person, the method comprises the steps of:
selecting for each of at least a subset of images or image sequences, which relate to training instructions for different kinds of physical exercises and are stored in a memory unit, at least one of the one or more surfaces identified by the surveillance sensor, said selection being based on at least one of: (i) the observed spatial confinements in the environment of the person, (ii) the kind of physical exercise to which each image or image sequence of the at least one subset is related, and (iii) characteristics of the one or more surfaces identified by the surveillance sensor, and
controlling the projector upon request to project at least one of the images or image sequences of the at least one subset onto the at least one surface that has been selected for said at least one image or image sequence.
In a still further aspect of the present invention a computer program is presented which comprises program code means for causing a computer to carry out the steps of the aforementioned method when said computer program is carried out on a computer. Preferred embodiments of the invention are defined in the dependent claims. It shall be understood that the claimed method and the claimed computer program have similar and/or identical preferred embodiments as the claimed system and as defined in the dependent claims.
Before starting to explain the central aspects and embodiments of the present invention, some terms and definitions that are used herein shall be briefly explained:
A "drone" as used herein shall refer to any type of robot vehicle in general. In other words, it refers to an unmanned vehicle which is able to move around in space by flying, and/or moving on the ground. A drone may have any type of locomotion actuator that allows the drone to move around in space. The drone may either self-control its movements or may be remotely controlled.
A "projector" as used herein refers to any type of optical device that is suitable for projecting an image or moving images (an image sequence) onto a surface.
A "surveillance sensor" as used herein is any type of sensor that is suitable for receiving monitoring signals, preferably but not necessarily optical signals, which allow an observation of the space in the environment of the sensor. Typical components of such surveillance sensors may include a camera, an optical scanner, a proximity sensor, etc.
The term "spatial confinements" as used herein refers to the spatial conditions of the surrounding space, e.g. in an indoor situation the geometry, size and/or volume of a room, its walls, floor and/or ceiling.
A "memory unit" as used herein may refer to any type of data storage suitable for permanently or temporarily retaining digital data. A memory unit may include any type of volatile or non- volatile memory and may also refer to a database which is accessible via cloud computing.
A "control unit" as used herein may include any type of computer hardware with software stored thereon in order to carry out logical processes for controlling and/or steering a system. A control unit or controller may be implemented e.g. as a chip or as a stand-alone device that interfaces with a peripheral device. The control unit may also be implemented as an external device that manages the operation of a system or device, in the present case the operation of the drone and the devices arranged at the drone.
A "physical exercise" as used herein shall refer to any type of bodily activity that enhances or maintains physical fitness, overall health and/or wellness.
As already mentioned above, the herein presented system and method make use of a flying, and/or ground-moving drone which, inter alia, comprises a projector by means of which images or image sequences relating to training instructions for different kinds of physical exercises may be projected onto one or more surfaces used as projection locations. Said images or image sequences may e.g. include pictures and/or text explaining to the person how to perform a certain kind of physical exercise in a correct manner. Said image sequences may also include videos or video clips showing a person or an avatar
demonstrating a particular kind of physical exercise.
Due to the ability of the drone to almost freely move around in space the herein presented system and method enable a very flexible and versatile way of projecting the training images or image sequences to different parts of the environment (surroundings) of the person by means of the projector that is arranged at the drone. One of the central aspects of the herein presented system and method is the ability of the system to select a surface in the environment of the person that is suited as an optimal projection location for the training images or image sequences, as well as an optimal location for conducting the exercise.
The selection of the surfaces used for the projection of the training images or image sequences is based on at least one of: (i) the observed spatial confinements in the environment of the person, (ii) the kind of physical exercise to which each image or image sequence is related, and (iii) the characteristics of the one or more surfaces as identified by the surveillance sensor.
The presented system first analyses the environment, i.e. the space around the person, by means of the surveillance sensor in order to e.g. determine the size of the room in which the person is located, the geometry of the room, and/or other environmental parameters that could influence the performance of a physical exercise and/or the projection of an image or image sequence. This spatial analysis of the environment of the person may e.g. be carried out during an initial sweep of the drone in which the surveillance sensor observes the environment while the drone is moving around in space.
The observed spatial confinements may then not only be used to determine possible candidates for projection surfaces, but also to determine possible positions where the person may stand, sit or lie.
A further relevant factor for the selection of an optimal projection location is, as already mentioned above, the kind of physical exercise to which the image or image sequence is related. For example, if the person is lying on his back in an exemplary kind of physical exercise (such as a Pilates crunch), the control unit will select, if possible, a surface on the ceiling of the room as projection surface onto which the at least one image or image sequence showing said exercise is projected. On the other hand, in case of the projection of an image or image sequence relating to a second kind of physical exercise (such as e.g. a front bridge exercise), which makes it necessary that the person is facing downwards, the control unit will select, if possible, a surface on the floor of the room as projection surface for the projector.
This results in a very comfortable situation for the person, since the person may observe the projected image or image sequence during his physical exercise in a very natural way, without especially having to move his head or other body parts.
A still further criterion for the selection of an optimal projection location may be the characteristics of the one or more surfaces that have been identified by the surveillance sensor. Said characteristics may include at least one of a size of the surface, a shape of the surface, a surface condition of the surface, an orientation in space of the surface, a reflectivity of the surface, a color of the surface, and an ambient lighting condition of the surface.
Each different kind of physical exercise may require different characteristics of the projection location. For some kinds of physical exercises it may be advantageous to use a small or comparatively dark projection surface, whereas for other kinds of physical exercises it may be more advantageous to use a comparatively reflective and large projection surface.
It shall be noted that further optional criteria may be used in the presented system for selecting the optimal projection location for each type of training image or image sequence, as will be set out further below in detail. It shall also be noted that the control unit of the presented system preferably bases the selection on all of the above-mentioned three selection criteria (spatial confinements of the space, kind of physical exercise, and surface characteristics). However, not necessarily all of them, but also only one of these criteria may be used for the above-mentioned surface selection without leaving the scope of the present invention.
It shall furthermore be noted that preferably each of the different images or image sequences stored in the memory unit, or at least each of a subset of the images or image sequences stored in the memory unit, is assigned at least one of the surfaces identified by the surveillance sensor as a suitable projection surface. As explained above, the assigned projection surfaces may be different for each image or image sequence depending on the kind of physical exercise shown in the image or image sequence; meaning that image sequence A may be assigned projection surface X, while image sequence B is assigned projection surface Y, etc. These assignments may be saved in the memory unit upfront, before the actual training/projection of the images or image sequences starts.

A further advantage of the presented system is that the control unit may also be configured to automatically exclude certain kinds of physical exercises and their related training images or image sequences based on the input delivered by the surveillance sensor. The control unit preferably selects a subset of the images or image sequences stored in the memory unit, wherein said subset only includes images or image sequences that relate to physical exercises which are suitable to be carried out in the environment identified by the surveillance sensor of the drone.
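By way of illustration only, such an upfront assignment could be represented as a mapping that is computed once after the initial sweep and stored in the memory unit. The following Python sketch is not part of the disclosed embodiments; the class names, orientation categories and selection rule are all assumptions:

```python
from dataclasses import dataclass

@dataclass
class Surface:
    surface_id: str
    orientation: str     # "wall", "floor" or "ceiling"
    size_m2: float
    reflectivity: float  # 0..1, as evaluated during the initial sweep

@dataclass
class ExerciseClip:
    clip_id: str
    body_position: str   # "upright", "supine" (on the back) or "prone"
    min_surface_m2: float

# The body position implies the preferred orientation of the projection surface.
PREFERRED_ORIENTATION = {"upright": "wall", "supine": "ceiling", "prone": "floor"}

def assign_surfaces(clips, surfaces):
    """Map each clip to the identified surface that best suits it.

    Clips for which no suitable surface exists are left out, which also
    implements the automatic preselection of exercises described above.
    """
    assignments = {}
    for clip in clips:
        wanted = PREFERRED_ORIENTATION[clip.body_position]
        candidates = [s for s in surfaces
                      if s.orientation == wanted and s.size_m2 >= clip.min_surface_m2]
        if candidates:
            # Prefer the most reflective (i.e. best projectable) candidate.
            best = max(candidates, key=lambda s: s.reflectivity)
            assignments[clip.clip_id] = best.surface_id
    return assignments  # to be persisted in the memory unit before training
```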
According to an embodiment, the control unit is configured to select said subset of the images or image sequences based on the observed spatial confinements in the environment of the person. If it is e.g. detected that the person is in a rather small room, only training images and image sequences are selected to said subset which relate to physical exercises that do not require a lot of space.
According to a further embodiment, the control unit may be configured to select said subset of the images or image sequences based on the characteristics of the one or more surfaces identified by the surveillance sensor. If e.g. the reflectivity, ambient lighting condition and/or surface condition of the ceiling of the room in which the person is located does not allow the ceiling to be used as projection surface, no images or image sequences are selected to the subset that have to be carried out while lying on the back.
In other words, the presented system does not only search the optimal projection surface for each type of physical exercise, but already makes a kind of preselection of physical exercises that may be suitably carried out in the room the person is currently in. Other physical exercises that may not be suitably carried out in said room may thus not even be accessible for the user.
According to a further embodiment of the present invention, the control unit is configured to control the projector to adapt a size, a shape, an image brightness and/or an image contrast of the at least one image or image sequence projected by the projector based on characteristics of the at least one surface that has been selected for said at least one image or image sequence. Projection properties and settings of the projector may thus be individually adjusted to each different kind of physical exercise. This further increases the user comfort.
It shall be noted that the settings of the projector (size, shape, image brightness and/or image contrast) may not only be controlled based on the characteristics of the projection surface selected for a certain kind of physical exercise, but also based on the characteristics of the projected physical exercise itself. The settings of the projector may also be controlled based on the overall spatial confinements observed by the surveillance sensor. If the surveillance sensor e.g. detects that the environment is rather dark and/or the room (the environment) is rather small, the projector may automatically reduce the image brightness so as to optimize battery consumption.
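A minimal sketch of such a brightness adaptation, assuming a normalized brightness setting and treating all thresholds and factors as free design parameters rather than disclosed values:

```python
def projector_brightness(ambient_lux: float, room_volume_m3: float,
                         surface_reflectivity: float) -> float:
    """Return a normalized projector brightness setting (0..1).

    Dark and/or small rooms and highly reflective surfaces need less light
    output, which also reduces battery consumption.
    """
    brightness = 1.0
    if ambient_lux < 50:          # rather dark environment (assumed threshold)
        brightness *= 0.6
    if room_volume_m3 < 30:       # rather small room (assumed threshold)
        brightness *= 0.8
    brightness *= (1.5 - surface_reflectivity)  # reflective surface -> dimmer
    return max(0.1, min(1.0, brightness))
```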
According to a further embodiment, the surveillance sensor comprises a camera, and the control unit is configured to determine the characteristics of the one or more surfaces identified by the surveillance sensor by controlling the projector to project a test image onto the one or more surfaces, controlling the camera to record a projection of the test image, and by comparing the test image to the projection of the test image recorded by the camera.
In other words, the control unit is configured to project a test image onto different surfaces in the environment of the person during the initial sweep of the drone where it is moving around. The projection quality of each surface may then be evaluated by means of image analysis techniques that compare the projected image with the source image (test image). These evaluation results may be stored in the memory unit of the system or in any other database together with the identified location and orientation of the respective surface. In this way the drone is "aware" of the conditions in its surroundings after performing the initial sweep.
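One conceivable implementation of the comparison is a pixel-wise similarity score between the source test image and the camera recording, e.g. with OpenCV; the sketch below assumes the recorded projection has already been cropped and warped to the frame of the test image:

```python
import cv2
import numpy as np

def projection_quality(test_image_path: str, recorded_path: str) -> float:
    """Score a surface by comparing the test image with its recorded projection.

    Returns a value in 0..1 (1 = the recorded projection matches the source
    perfectly). Assumes the recording was rectified to the source geometry.
    """
    src = cv2.imread(test_image_path, cv2.IMREAD_GRAYSCALE)
    rec = cv2.imread(recorded_path, cv2.IMREAD_GRAYSCALE)
    rec = cv2.resize(rec, (src.shape[1], src.shape[0]))
    # Mean squared error, normalized to the 8-bit intensity range.
    mse = np.mean((src.astype(np.float32) - rec.astype(np.float32)) ** 2)
    return 1.0 - min(mse / 255.0 ** 2, 1.0)
```

The resulting score could then be stored in the memory unit together with the location and orientation of the respective surface.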
Alternative ways of determining/evaluating the characteristics of the one or more surfaces in the environment of the person/drone include optical reflectivity measurements (e.g. using a laser), optical surface topography scanning procedures, etc.
According to a further embodiment, the surveillance sensor is configured to identify one or more objects in the environment of the person, and the control unit is configured to select said subset of the images or image sequences based on said one or more identified objects.
In other words, the subset of images or image sequences relating to physical exercises that may be suitably carried out in the environment of the person/drone may also be selected based on an object detection technique. Objects in the environment of the person may be observed and identified either as useful objects that can be of benefit to the person for certain types of physical exercises or as obstacles that interfere with certain types of physical exercises. An identification of stairs, a chair, or table could positively trigger the selection of physical exercises in which the stairs, the chair and/or the table may be beneficially used, whereas the same objects may exclude some other physical exercises for which the stairs, the chair and/or the table may not suitably be used. In this way the system makes optimal use of the space and objects in the environment of the person. It is needless to say that the aforementioned object identification may also influence the preferred projection surface and/or the location of the drone during the projection.
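As an illustration of this object-based preselection, a hypothetical utility table could extend or shrink the exercise subset; the object types, exercise identifiers and their mapping below are invented for the example:

```python
# Illustrative utility tags for detected objects; the mapping between object
# types and exercise ids is an assumption, not part of the disclosure.
OBJECT_UTILITY = {
    "chair":  {"enables": ["step_up", "triceps_dip"], "blocks": ["burpee"]},
    "table":  {"enables": ["incline_pushup"],         "blocks": ["jumping_jack"]},
    "stairs": {"enables": ["stair_climb"],            "blocks": []},
}

def filter_exercises(exercises: set[str], detected_objects: list[str]) -> set[str]:
    """Extend/shrink the exercise subset based on detected objects."""
    allowed = set(exercises)
    for obj in detected_objects:
        tags = OBJECT_UTILITY.get(obj, {"enables": [], "blocks": []})
        allowed |= set(tags["enables"])   # positive utility
        allowed -= set(tags["blocks"])    # negative utility (obstacle)
    return allowed
```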
According to a further embodiment, the system further comprises an input unit for receiving information including personal data and/or preferences of the person, wherein the control unit is configured to select said subset of the images or image sequences based on said information.
The control unit may alternatively or additionally also be configured to select the at least one surface onto which the images or image sequences are projected based on said personal information.
The personal data may e.g. include a height or weight of the person. The preferences of the person may include preferences regarding certain kinds of physical exercises that the user likes or dislikes. Hence, this embodiment allows a further refinement of the selection of the physical exercises/the images or image sequences that are related to them as well as a refinement of the selection of the optimal projection surface for the projection of the images or image sequences. For example, if the person is rather tall, only projection surfaces above a certain height will be assigned to images or image sequences that are related to physical exercises carried out in an upright body position.
According to a further embodiment of the present invention, the system further comprises an output unit arranged at the drone for providing audible and/or visual instructions to the person, which instructions indicate where the person shall position him-/herself and/or how the person shall carry out the physical exercise to which the at least one image or image sequence projected by the projector is related. These instructions may include simple feed-forward instructions as well as feedback instructions. The control unit may e.g. be configured to determine said instructions based on at least one of: the observed spatial confinements in the environment of the person, the kind of physical exercise to which the at least one image or image sequence projected by the projector is related, and characteristics of the at least one surface that has been selected for the at least one image or image sequence projected by the projector.
In order to provide real feedback instructions to the person, it is according to a further embodiment preferred that the drone further comprises a tracking unit for tracking a movement of one or more body parts of the person while performing the physical exercises, wherein the control unit is configured to adapt the instructions to the person based on the tracked movement of the one or more body parts of the person. The system according to this embodiment is thus able to correct the person if the person is not carrying out the physical exercises properly. Several ways of implementing such feedback are conceivable. Some of the implementation possibilities are set out further below.
According to a further embodiment, the drone comprises a tracking unit for tracking a movement of one or more body parts of the person while performing the physical exercises, and a projector movement actuator for moving the projector relative to the drone, wherein the control unit is configured to control the projector movement actuator based on the tracked movement of the one or more body parts of the person. This embodiment makes it possible, e.g., to move the image or image sequence projected by the projector in conjunction with the movement of the head of the person. This is particularly advantageous for physical exercises that require extensive head movements, since the person may then still see the projected image or image sequence.
According to a still further embodiment of the present invention, the drone comprises at least one locomotion actuator, wherein the control unit is configured to deactivate the at least one locomotion actuator of the drone before controlling the projector to project the at least one of the images or image sequences. In other words, the control unit in this embodiment first stops the movement of the drone, e.g. by landing the drone in case of a flying drone, before the projection starts. The main advantage that is provided therewith is a reduced battery consumption.
It shall also be noted that the control unit and/or the memory unit of the system may be arranged at the drone, but do not necessarily have to be arranged at the drone. The control unit and/or the memory unit may also be provided as separate devices that are arranged locally remote from the drone. In such embodiments it is preferred that the drone is connected to the control unit and/or the memory unit by means of a wireless data connection.

BRIEF DESCRIPTION OF THE DRAWINGS
These and other aspects of the invention will be apparent from and elucidated with reference to the embodiments described hereinafter. In the following drawings:
Fig. 1 schematically illustrates a first embodiment of a system according to the present invention;
Fig. 2 schematically illustrates possible units/components of a control unit used in the system according to the present invention;
Fig. 3 schematically shows a second embodiment of the system according to the present invention;
Fig. 4A schematically illustrates a first exemplary situation during usage of the system according to the present invention;
Fig. 4B shows a second exemplary situation during usage of the system according to the present invention; and
Fig. 5 shows a block diagram which schematically illustrates an embodiment of the method according to the present invention.
DETAILED DESCRIPTION OF THE INVENTION
Fig. 1 schematically illustrates components of a system according to the present invention, which system is configured for supporting a person performing physical exercises. The system is therein denoted in its entirety by reference numeral 10.
The system acts as a kind of virtual coach which supports a person during the performance of a physical exercise by giving feed-forward and feedback information about the particular physical exercise.
One of the central components of the system 10 is a drone 12. In the embodiment shown in Fig. 1 the drone 12 is implemented as a flying drone. The drone 12 may e.g. be realized as a helicopter, quadcopter, hexacopter, etc. However, it shall be noted that other types of unmanned vehicles may be used as drone 12 as well. Apart from that, the drone 12 is not restricted to a flying drone, but may also be realized in the form of a ground-moving drone.
The drone 12 according to the first embodiment includes a control unit 14, a memory unit 16 as well as several sensors 18-22 and actuators 24-34.
The control unit 14 may be generally imagined as the "brain" of the system 10. The control unit 14 may be realized as a CPU or other kind of processor having software stored thereon which is configured to control the actuators 24-34 of the drone 12 based on the input signals of the sensors 18-22 of the drone 12. The control unit 14 is e.g. configured to coordinate the movements of the drone 12, analyze the environment around the drone 12 and based thereupon coordinate the interactions of the drone 12 with the user of the system 10.
The control unit 14 is according to the first embodiment shown in Fig. 1 arranged at or within the drone 12. However, it shall be noted that parts of the control unit 14 or the whole control unit 14 may also be arranged in one or more external devices so as to control the functions of the drone 12 from a remote location (see second embodiment shown in Fig. 3 and explained further below).

The memory unit 16 is preferably realized as a non-volatile or volatile memory, such as a flash memory or hard drive. It is used in the system 10 according to the present invention to either temporarily or permanently store electronic data generated and used during and for the operation of the drone 12 and its components. The memory unit 16 is, same as the control unit 14, according to the first embodiment shown in Fig. 1 arranged at or within the drone 12, but may also be arranged in a separate device remote from the drone 12 (as in the second embodiment shown in Fig. 3).
The drone 12 further comprises a surveillance sensor 18 which is configured for environmental sensing, such as observing the spatial confinements in the environment of the drone 12, surface and object detection, etc. The surveillance sensor 18 is generally used for investigating and exploring the space around the drone 12. The surveillance sensor 18 preferably includes one or more cameras (optical, infrared, night vision, etc.). In a particularly preferred embodiment the surveillance sensor 18 includes a 3D camera and/or a range-imaging camera such as a time-of-flight camera, which is able to determine relative depths and positions of objects, surfaces and persons in the space. The surveillance sensor 18 may alternatively or additionally also include a gyroscope, a barometer, an accelerometer, an inclinometer, a light sensor, a magnetometer, a proximity sensor, a pressure sensor, a chemical sensor, a sound sensor, an airflow sensor, a humidity sensor and/or a microphone.
The drone 12 according to the first embodiment furthermore includes an input unit 20. The input unit 20 may comprise any type of device that allows the user of the system 10 to input any data or information. The input unit 20 may e.g. comprise a microphone, one or more buttons, or a touchscreen. In the first embodiment shown in Fig. 1 the input unit 20 is arranged at the drone 12. As it will be seen from the description of the second embodiment shown in Fig. 3, the input unit 20 may also be part of a separate device that is arranged remote from the drone 12, e.g. as offered by a smartphone or tablet device located near the user.
The drone 12 according to the first embodiment may furthermore comprise a tracking unit 22. This tracking unit 22 may include any type of device that is suitable for tracking a person or body part of a person. The tracking unit 22 is e.g. used in the system 10 for observing a person and tracking the movements of one or more body parts of the person while said person is performing a physical exercise. Even though the tracking unit 22 is in Fig. 1 illustrated as a separate component, the tracking unit 22 may generally also be implemented as a part of the surveillance sensor 18, such that the tracking unit 22 e.g. makes use of the same camera as the surveillance sensor 18.

The actuators 24-34 of the drone 12 according to the first embodiment include a projector 24, a projector movement actuator 26, an output unit 28, and a locomotion actuator 32 which in the present case includes a motor 30 and one or more propellers 34.
The projector 24 may include any type of optical device that is able to project an image or image sequence onto a surface (projection surface). The projector 24 is preferably configured to be free to rotate its direction of projection independent of the orientation of the drone 12. This is preferably realized by means of the projector movement actuator 26. This projector movement actuator 26 preferably includes a motorized gimbal that is suitable for swiveling the projector 24 in a 360° motion. While this is a preferred design, there are of course a lot of other technical designs to accomplish a 360° motion of the projector 24.
The output unit 28 is used for outputting feedback to the user of the system 10. The output unit 28 may comprise a display, a loudspeaker, and/or a laser pointing device. The output unit 28 may, similar to several of the aforementioned sensors and actuators, be arranged either at the drone 12 (see first embodiment shown in Fig. 1) or remote therefrom and included in a separate device (see second embodiment shown in Fig. 3).
The control unit 14 preferably controls all of the aforementioned sensors 18-22 and actuators 24-34 in order to operate the system 10 and carry out the functions of the system 10.
One of the main functions of the system 10 is the identification of an optimal projection location onto which the projector 24 of the drone 12 may project one or more images or image sequences relating to training instructions for different kinds of physical exercises. The selection of the optimal projection location is preferably individualized for each kind of physical exercise, meaning that the system 10 may choose different projection surfaces for different images or image sequences. The system 10 may thereto take at least one of the following factors into account: (a) the projection viewing quality of different parts of the environment, (b) the type of physical exercise shown in the image or image sequence, (c) the space required by a person to conduct the physical exercise, (d) the viewing position of a person in the space while performing the physical exercise, and (e) the availability of objects in the space that can aid or interfere with a physical exercise.
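These factors could, for instance, be combined into a single weighted score per candidate pairing of exercise and surface; the weights and the 0..1 normalization below are illustrative assumptions rather than disclosed values:

```python
def location_score(viewing_quality: float, type_match: float,
                   space_fit: float, view_position_fit: float,
                   object_utility: float) -> float:
    """Weighted combination of factors (a)-(e); all inputs normalized to 0..1.

    The weights are illustrative assumptions; a real system would tune them.
    """
    weights = (0.3, 0.2, 0.2, 0.2, 0.1)
    factors = (viewing_quality, type_match, space_fit,
               view_position_fit, object_utility)
    return sum(w * f for w, f in zip(weights, factors))
```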
The control unit 14 may be logically divided into different sub-units, as schematically shown in Fig. 2. The control unit 14 may e.g. comprise an environment analysis unit 14.1, an object detection unit 14.2, an exercise recommender unit 14.3, a user detection and tracking unit 14.4 and a projector and drone steering unit 14.5. It shall be noted that these sub-units 14.1-14.5 do not necessarily have to exist structurally, but are used in the following to explain the logical functions of the control unit 14. The explanation of the logical functions of the control unit 14 is set out below in conjunction with the schematic sketches illustrated in Figs. 4A and 4B as well as in conjunction with the block diagram illustrated in Fig. 5.
The environment analysis unit 14.1 of the control unit 14 is preferably configured to carry out steps S10-S14.
In step S10 the drone 12 generally observes its environment, so as to be aware of the location and space surrounding the drone 12. The locomotion actuator 32 is thereto preferably controlled to perform a sweep in which the drone 12 is automatically moved around in space. The locomotion actuator 32 may be controlled to move the drone 12 during said sweep along a pre-defined trajectory. This situation is schematically sketched in Fig. 4A. The surveillance sensor 18 may help to prevent collisions with obstacles, walls or ceilings during said sweep. The surveillance sensor 18 is during said sweep also used for observing the spatial confinements in the environment of the person 36 and for identifying one or more surfaces in the environment of the person 36 that are generally suitable as projection surfaces. The sweep is preferably started from the location of the person 36. Locations can be determined absolutely during the sweep using an indoor/outdoor positioning technology, e.g. GPS, or they can be defined relatively by using a sensor that tracks the position relative to a reference position at the start of the sweep. As already mentioned above, a 3D camera may be part of the surveillance sensor 18 so as to be able to determine relative depths and positions of objects and users in the environment of the drone 12. All observed location data are stored during or after step S10 in the memory unit 16.
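Sketched in Python under assumed drone, sensor and memory interfaces (none of which are specified by the disclosure), the sweep of step S10 could look as follows:

```python
def initial_sweep(drone, surveillance_sensor, memory_unit, trajectory):
    """Move the drone along a pre-defined trajectory and record the environment.

    `drone`, `surveillance_sensor` and `memory_unit` are abstract interfaces
    assumed for illustration; positions are stored relative to the sweep start.
    """
    origin = drone.position()                    # reference position at sweep start
    for waypoint in trajectory:
        drone.move_to(waypoint, avoid_obstacles=True)
        observation = surveillance_sensor.scan()  # depth frame, surfaces, objects
        memory_unit.store({
            "position": drone.position() - origin,  # relative localization
            "surfaces": observation.surfaces,
            "objects": observation.objects,
        })
```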
In the next step S12 the projection quality of the one or more surfaces identified in the environment of the drone 12 is assessed. This assessment may include an evaluation of several different characteristics. The environment analysis unit 14.1 may e.g. evaluate for each identified surface a size of the surface, a surface condition of the surface, an orientation in space of the surface, a reflectivity of the surface, a color of the surface, and/or an ambient lighting condition of the surface. The control unit 14 preferably controls the projector 24 to project a test image onto each of the one or more surfaces identified by the surveillance sensor 18. The control unit 14 at the same time controls the surveillance sensor 18 to record the projected test image e.g. by means of the above-mentioned 3D camera. The test image (source image) is then compared to the recorded image of the 3D camera using image analysis techniques. This comparison may result in an evaluation of one or more of the above-mentioned characteristics of each of the one or more surfaces that were identified by the surveillance sensor 18 in the first step S10. The evaluation results, i.e. the determined projectability characteristics of each surface are preferably stored in the memory unit 16 together with the earlier detected location information of each surface.
The environment analysis is further detailed in step S14. The environment analysis unit 14.1 may be configured to determine in this process step for each identified projection surface a space that may be used by the person 36 to perform a physical exercise while at the same time being able to observe an image projected onto said surface. This information may be again stored in the memory unit 16 in conjunction with each identified surface, its earlier defined characteristics and position information.
At the same time or after the above-mentioned environment analysis performed in steps S10-S14, the object detection unit 14.2 may identify in step S16 one or more objects in the environment of the person 36 by means of the surveillance sensor 18 and may assess these objects for their utility to support physical exercises. The identified objects may be stored in the memory unit 16 and tagged in the memory unit 16 according to their location, dimensions, type and utility. Some objects will be tagged with a negative utility. This means that their location may interfere with physical exercises based on an estimated position of the person 36 for certain projection locations. Other objects will be tagged with a positive utility relative to a projection surface. E.g. if a table or an edge of suitable height is detected, it may be associated with a positive utility for physical exercises that require support when standing on one leg.
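A sketch of how step S16 might tag a detected object; the record layout and the height thresholds are invented for the example:

```python
from dataclasses import dataclass

@dataclass
class TaggedObject:
    object_type: str   # e.g. "table", "chair"
    location: tuple    # (x, y, z) relative to the sweep origin
    dimensions: tuple  # (width, depth, height) in metres
    utility: int       # +1 beneficial, -1 interfering, 0 neutral

def tag_object(object_type, location, dimensions, edge_height_m: float):
    """Assign a utility tag to a detected object (illustrative thresholds)."""
    # An edge at roughly hip height can support one-legged exercises.
    if 0.7 <= edge_height_m <= 1.1:
        return TaggedObject(object_type, location, dimensions, utility=+1)
    # Low obstacles in the exercise area interfere with floor exercises.
    if edge_height_m < 0.4:
        return TaggedObject(object_type, location, dimensions, utility=-1)
    return TaggedObject(object_type, location, dimensions, utility=0)
```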
Optionally, the person 36 himself may provide in step S18 information including personal data and/or preferences of the person 36 to the system 10 via the input unit 20.
Based upon all or at least parts of this information (surface location information, surface projectability information, spatial confinements of the environment, object information, and personal data/preferences) the exercise recommender unit 14.3 may then recommend one or more physical exercises to the person 36. It should be clear that in fact not the physical exercises themselves are recommended to the person 36, but rather the images or image sequences that relate to the different kinds of physical exercises. The exercise recommender unit 14.3 does, however, not only recommend images or image sequences to the person 36, but also determines at least one of the one or more surfaces that has been identified by the surveillance sensor 18 and that is optimal for the projection of said image or image sequence. In step S20 the exercise recommender unit 14.3 selects a subset of images or image sequences that may be suitably carried out in the observed environment. This selection may be based on one or more of the above-mentioned parameters that have been evaluated in steps S10-S18. Each of the subset of images or image sequences is then assigned at least one of the evaluated surfaces (step S22). This assignment of the potential projection surfaces to each of the subset of selected images or image sequences is again preferably stored in the memory unit 16.
The exercise recommender unit 14.3 in other words selects in steps S20 and S22 the most suitable physical exercises, locations for projections and locations for the person 36. Since each image or image sequence stored in the memory unit 16 is tagged in the memory unit 16 together with the characteristics of the physical exercise shown in the image or image sequence, the exercise recommender unit 14.3 so to say knows the physical exercise in terms of knowing what space is needed for the physical exercise, in what body position the physical exercise has to be carried out, and/or how large the projection of the image sequence should be at minimum to be observable during the physical exercise. Exercises may be stored together with their images or image sequences in the memory unit 16 as a semantic description including information regarding space, motion, physical properties and/or physical requirements of the exercise. The control unit 14 is thus able to choose the optimal projection surface for each of the subset of selected physical exercises individually. For example, an image or image sequence showing a squat exercise may be tagged with information that at least 1x1 m of free space is needed and that the person's view is assumed to be always more or less parallel to the ground. Together with the information regarding the height of the user, which is provided in step S18, this may imply that a wall surface, which is at a height of 1.6 m-2.0 m and has a free space of 1x1 m in front of it, is the optimal projection surface for the squat exercise. On the other hand, a ceiling surface may be associated with an image or image sequence which is related to a physical exercise where the user is on his back, e.g. a bench press exercise.
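Staying with the squat example, such a semantic description could be matched against the surface database roughly as follows; the field names, the surface attributes (height_low_m, height_high_m, free_space_in_front_m2) and the anthropometric eye-height factor are illustrative assumptions:

```python
SQUAT = {
    "clip_id": "squat_01",
    "floor_space_m": (1.0, 1.0),  # at least 1 m x 1 m of free space
    "gaze": "horizontal",         # view roughly parallel to the ground
}

def eye_height(person_height_m: float) -> float:
    # Rough anthropometric assumption: eyes at about 93% of body height.
    return 0.93 * person_height_m

def matches(exercise: dict, surface, person_height_m: float) -> bool:
    """Check whether a wall surface suits a standing exercise."""
    if exercise["gaze"] == "horizontal":
        h = eye_height(person_height_m)
        # The projection should sit around eye level, e.g. 1.6-2.0 m for a tall user.
        if not (surface.height_low_m <= h <= surface.height_high_m):
            return False
    width, depth = exercise["floor_space_m"]
    return surface.free_space_in_front_m2 >= width * depth
```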
It shall be noted that the memory unit 16 is constantly updated during steps S10-S22. However, it shall be noted that the different kinds of data mentioned above do not necessarily have to be stored in one and the same memory unit 16, but may also be stored in a plurality of separate memory units.
The exercise recommender unit 14.3 may finally recommend the selected physical exercises/image sequences to the person 36 in step S24. The recommended exercises/image sequences may be e.g. displayed on a display that is part of the output unit 28.
As soon as the person 36 then selects a physical exercise in step S26, the projector and drone steering unit 14.5 controls in step S28 the locomotion actuator 32 of the drone 12 to move to a certain location and controls the projector 24 to project the selected image sequence onto the projection surface that has been previously assigned to it (see also Fig. 4B). In this step S28 the control unit 14 may furthermore control the projector 24 to adjust the projector settings, such as the size, shape, image brightness and/or image contrast of the projected image. This adjustment may be based on at least one of the parameters that have been previously determined by the system 10 in steps S10-S26.
The selected image sequence may be finally projected to the selected projection surface in step S30. Optionally, additional visible, and/or audible instructions may be output via the output unit 28.
Step S34, which is performed by the user detection and tracking unit 14.4, shows a feedback loop to steps S30 and S32. In this step the person 36 may be tracked by means of the tracking unit 22. The tracked movements of at least one or more body parts of the person 36 may be evaluated, so that the control unit 14 may then accordingly control the projector movement actuator 26 or the instructions output via the output unit 28 based on the tracked movement of the one or more body parts of the person 36. The tracking unit 22 in other words follows the movement of the person 36 during the physical exercise and observes whether the person 36 is performing the physical exercise in a correct way. The tracking unit 22 thereto preferably includes a camera which is configured to follow the person 36 while conducting the physical exercise. The analysis of the movement of the person 36 may be done by using an image analysis process combined with the biometrics of the subject to compare the execution to the ideal execution of the physical exercise stored in the memory unit 16. The control unit 14 may e.g. be configured to adapt the projection and/or the user instructions in case a deviation of the tracked movement relative to the ideal movement is detected to be above a certain threshold. The drone 12 may then e.g. show the person 36 the right execution of the physical exercise by showing a trajectory that certain body parts have to follow during the physical exercise. Said trajectory may e.g. be generated by means of a laser that is pointed at one of the previously identified projection surfaces.

Alternatively, the drone 12 may show the person 36 a live feed of the user with an overlay of feedback and feedforward information relating to the exercise. The control unit 14 may be, in other words, configured to control the projector 24 to project not only the image or image sequence relating to the currently performed exercise, but to concurrently project also an image sequence recorded by means of the camera of the tracking unit 22 (e.g. by overlaying it onto the exercise image sequence). It should also be noted that the pre-rendered images may be adapted and combined with the live camera feed prior to them being rendered through the projector 24. They may also be procedurally generated based on the exercise profile and context information from the sensors. The control unit 14 can also take into account the position and/or angle of the camera of the tracking unit 22 whilst performing an exercise to give the person 36 a useful representation of themselves when projecting the live feed. It shall be noted that, according to a preferred embodiment, the tracking unit 22 uses the same camera as the surveillance sensor 18, such that only one camera is needed.
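The feedback loop of step S34 could, as one possible sketch, compare the tracked trajectory of a body part against the stored ideal execution and trigger corrective output once a deviation threshold is exceeded; the threshold value and the projector/output interfaces are assumptions:

```python
import numpy as np

DEVIATION_THRESHOLD_M = 0.15  # assumed threshold for corrective feedback

def evaluate_execution(tracked_trajectory, ideal_trajectory, output_unit, projector):
    """Compare tracked vs. ideal body-part trajectories (sequences of 3D points)."""
    tracked = np.asarray(tracked_trajectory, dtype=float)
    ideal = np.asarray(ideal_trajectory, dtype=float)
    n = min(len(tracked), len(ideal))
    # Mean Euclidean distance between corresponding trajectory points.
    deviation = np.linalg.norm(tracked[:n] - ideal[:n], axis=1).mean()
    if deviation > DEVIATION_THRESHOLD_M:
        # Show the correct trajectory, e.g. as a laser-drawn path or an overlay.
        projector.overlay_trajectory(ideal)
        output_unit.say("Watch the projected path and adjust your movement.")
```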
In the case where the position/orientation of the projection needs to be corrected, the drone 12 can first calibrate to the height of the person's head by moving to that height/position prior to the physical exercise, wherein the height may be estimated from an air pressure or ultrasound sensor. The drone 12 can then move the 3D camera (surveillance sensor 18) to see if the projected image is generally observable and can then correct for this. With (indoor) positioning technology, the calibration step could be carried out even more easily, since the positioning technology will know where the drone is in space at any time. Otherwise, the relative changes in position between the position of the physical exercise and the final position of the drone 12 would need to be estimated. Regarding the real-time height estimate of the person's head from the floor, a 3D/range imaging camera is preferred (e.g. time of flight). Otherwise, the system 10 may be calibrated by asking the person 36 to initially lie on the floor and then estimate the height whilst tracking the position of the head. Optionally, for exercises that involve wide head movements, the control unit 14 may control the projector movement actuator 26 to move the projector 24 according to a forecasted head movement or according to a head movement that is tracked in real time.
Further optionally, the input of the sensors 18-22 may be used to determine the speed of the body movements of the person 36, wherein the control unit 14 is configured to adapt the speed of the projected video sequence accordingly. For example, if it is detected via the microphone that is part of the input unit 20 that the user starts to breathe more heavily or grunt from straining too much, the control unit 14 may control the projector 24 to pause the projected image sequence or to slow down the reproduction of the projected image sequence.
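As a sketch of this adaptation, assuming a normalized loudness estimate derived from the microphone signal and a hypothetical playback interface on the projector:

```python
def adapt_playback(breath_intensity: float, projector):
    """Slow down or pause the projected sequence when the user strains.

    `breath_intensity` is a normalized 0..1 loudness estimate from the
    microphone; the thresholds and the projector API are assumptions.
    """
    if breath_intensity > 0.9:    # heavy grunting/straining
        projector.pause()
    elif breath_intensity > 0.6:  # laboured breathing
        projector.set_playback_speed(0.5)
    else:
        projector.set_playback_speed(1.0)
```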
The second embodiment, which is schematically illustrated in Fig. 3, is generally able to perform the same processes as explained above with reference to the first embodiment shown in Fig. 1. Hence, only the differences between the two embodiments will be described in the following. The same or similar reference numerals are used for the same or similar components of the system 10.
The memory unit 16' is according to the second embodiment realized as a separate device that is arranged locally remote from the drone 12. The memory unit 16' may e.g. be implemented by an external database or cloud server. A further difference is the usage of an extra computing device, such as a smartphone or tablet PC, that implements the above-mentioned functions of the control unit 14', the input unit 20' and/or the output unit 28'. Even though some parts of the control of the drone 12 may be carried out from said external device, it is nevertheless preferred to have a part of the processor 14'' still arranged at or within the drone 12. The drone 12 according to the second embodiment furthermore comprises a data communication unit 38 by means of which the drone 12 may communicate with the memory unit 16' and the external computing device implementing the control unit 14', input unit 20' and/or output unit 28'.
Lastly, it shall be noted that not all of the system components shown in the drawings are necessary system components. Figs. 1 and 3 only show two exemplary embodiments of the herein presented system 10. It shall also be noted that Fig. 5 illustrates only an exemplary embodiment and that not all steps illustrated in Fig. 5 are necessary steps. For example, steps S16, S18, S28, S30 and S34 are optional steps.
While the invention has been illustrated and described in detail in the drawings and foregoing description, such illustration and description are to be considered illustrative or exemplary and not restrictive; the invention is not limited to the disclosed embodiments.
Other variations to the disclosed embodiments can be understood and effected by those skilled in the art in practicing the claimed invention, from a study of the drawings, the disclosure, and the appended claims.
In the claims, the word "comprising" does not exclude other elements or steps, and the indefinite article "a" or "an" does not exclude a plurality. A single element or other unit may fulfill the functions of several items recited in the claims. The mere fact that certain measures are recited in mutually different dependent claims does not indicate that a combination of these measures cannot be used to advantage.
A computer program may be stored/distributed on a suitable medium, such as an optical storage medium or a solid-state medium supplied together with or as part of other hardware, but may also be distributed in other forms, such as via the Internet or other wired or wireless telecommunication systems.
Any reference signs in the claims should not be construed as limiting the scope.

CLAIMS:
1. A system (10) for supporting a person (36) performing physical exercises, comprising:
- a flying, and/or ground-moving drone (12);
- a projector (24) arranged at the drone (12);
- a surveillance sensor (18) arranged at the drone (12) for observing spatial confinements in an environment of the person (36) and for identifying one or more surfaces (40) in the environment of the person (36);
- a memory unit (16) for storing images or image sequences relating to training instructions for different kinds of physical exercises; and
- a control unit (14) which is configured:
to select for each of at least a subset of the images or image sequences stored in the memory unit (16) at least one of the one or more surfaces (40) identified by the surveillance sensor (18), said selection being based on at least one of: (i) the observed spatial confinements in the environment of the person (36), (ii) the kind of physical exercise to which each image or image sequence of the at least one subset is related, and (iii) characteristics of the one or more surfaces (40) identified by the surveillance sensor (18), and to control the projector (24) upon request to project at least one of the images or image sequences of the at least one subset onto the at least one surface (40) that has been selected for said at least one image or image sequence.
2. The system according to claim 1, wherein the control unit (14) is configured to select said subset of the images or image sequences based on the observed spatial confinements in the environment of the person (36).
3. The system according to claim 1, wherein the control unit (14) is configured to select said subset of the images or image sequences based on characteristics of the one or more surfaces identified by the surveillance sensor (18).
4. The system according to claim 1, wherein the control unit (14) is configured to control the projector (24) to adapt a size, a shape, an image brightness and/or an image contrast of the at least one image or image sequence projected by the projector (24) based on characteristics of the at least one surface (40) that has been selected for said at least one image or image sequence.
5. The system according to claim 1, wherein said characteristics of the one or more surfaces (40) identified by the surveillance sensor (18) include at least one of (i) a size, (ii) a shape, (iii) a surface condition, (iv) an orientation in space, (v) a reflectivity, (vi) a colour, and (vii) an ambient lighting condition.
6. The system according to claim 5, wherein the surveillance sensor (18) comprises a camera, and wherein the control unit (14) is configured to determine said characteristics of the one or more surfaces by controlling the projector (24) to project a test image onto the one or more surfaces, by controlling the camera to record a projection of the test image, and by comparing the test image to the projection of the test image recorded by the camera.
7. The system according to claim 1, wherein the surveillance sensor (18) is configured to identify one or more objects in the environment of the person (36), and wherein the control unit (14) is configured to select said subset of the images or image sequences based on said one or more identified objects.
8. The system according to claim 1, further comprising an input unit (20) for receiving information including personal data and/or preferences of the person (36), wherein the control unit (14) is configured to select said subset of the images or image sequences based on said information.
9. The system according to claim 1, further comprising an output unit (28) arranged at the drone (12) for providing audible and/or visual instructions to the person (36), which instructions indicate where the person (36) shall position him-/herself and/or how the person shall carry out the physical exercise to which the at least one image or image sequence projected by the projector (24) is related.
10. The system according to claim 9, wherein the control unit (14) is configured to determine said instructions based on at least one of: (i) the observed spatial confinements in the environment of the person (36), (ii) the kind of physical exercise to which the at least one image or image sequence projected by the projector (24) is related, and (iii) characteristics of the at least one surface (40) that has been selected for the at least one image or image sequence projected by the projector (24).
11. The system according to claim 10, wherein the drone (12) further comprises a tracking unit (22) for tracking a movement of one or more body parts of the person (36) while performing the physical exercises, wherein the control unit (14) is configured to adapt the instructions to the person (36) based on the tracked movement of the one or more body parts of the person (36).
12. The system according to claim 1, wherein the drone (12) further comprises a tracking unit (22) for tracking a movement of one or more body parts of the person (36) while performing the physical exercises, and a projector movement actuator (26) for moving the projector (24) relative to the drone (12), wherein the control unit (14) is configured to control the projector movement actuator (26) based on the tracked movement of the one or more body parts of the person (36).
13. The system according to claim 1, wherein the drone (12) comprises at least one locomotion actuator (32), and wherein the control unit (14) is configured to deactivate the at least one locomotion actuator (32) of the drone (12) before controlling the projector (24) to project the at least one of the images or image sequences.
14. Method for supporting a person (36) performing physical exercises by means of a flying and/or ground-moving drone (12) which comprises (i) a projector (24) and (ii) a surveillance sensor (18) for observing spatial confinements in an environment of the person (36) and for identifying one or more surfaces (40) in the environment of the person (36), the method comprising the steps of:
selecting for each of at least a subset of images or image sequences, which relate to training instructions for different kinds of physical exercises and are stored in a memory unit (16), at least one of the one or more surfaces (40) identified by the surveillance sensor (18), said selection being based on at least one of: (i) the observed spatial confinements in the environment of the person (36), (ii) the kind of physical exercise to which each image or image sequence of the at least one subset is related, and (iii) characteristics of the one or more surfaces (40) identified by the surveillance sensor (18), and controlling the projector (24) upon request to project at least one of the images or image sequences of the at least one subset onto the at least one surface (40) that has been selected for said at least one image or image sequence.
15. Computer program comprising program code means for causing a computer to carry out the steps of the method as claimed in claim 14 when said computer program is carried out on a computer.
PCT/EP2016/071612 2015-09-28 2016-09-14 System and method for supporting physical exercises WO2017055080A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
EP15187119.1 2015-09-28
EP15187119 2015-09-28

Publications (1)

Publication Number Publication Date
WO2017055080A1 (en) 2017-04-06

Family

ID=54252013

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2016/071612 WO2017055080A1 (en) 2015-09-28 2016-09-14 System and method for supporting physical exercises

Country Status (1)

Country Link
WO (1) WO2017055080A1 (en)

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109432724A (en) * 2018-12-13 2019-03-08 福州大学 Novel body building aircraft and its control method
WO2019163264A1 (en) * 2018-02-20 2019-08-29 ソニー株式会社 Flying body and flying body control method
CN114588612A (en) * 2021-11-25 2022-06-07 北京华锐视界科技有限公司 Ball game system
US11443518B2 (en) 2020-11-30 2022-09-13 At&T Intellectual Property I, L.P. Uncrewed aerial vehicle shared environment privacy and security
US11726475B2 (en) 2020-11-30 2023-08-15 At&T Intellectual Property I, L.P. Autonomous aerial vehicle airspace claiming and announcing
TWI814682B (en) * 2023-02-24 2023-09-01 國立虎尾科技大學 Intellectual analysis system for high jump skill
US11797896B2 (en) 2020-11-30 2023-10-24 At&T Intellectual Property I, L.P. Autonomous aerial vehicle assisted viewing location selection for event venue
EP4280197A1 (en) * 2022-05-20 2023-11-22 National Cheng Kung University Training methods and training systems utilizing uncrewed vehicles
WO2023247982A1 (en) * 2022-06-22 2023-12-28 Szegedi Tudományegyetem Smart drone system for simulating a moving object

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20040183775A1 (en) * 2002-12-13 2004-09-23 Reactrix Systems Interactive directed light/sound system
US20080269016A1 (en) 2007-04-30 2008-10-30 Joseph Ungari Adaptive Training System with Aerial Mobility
US20110021317A1 (en) * 2007-08-24 2011-01-27 Koninklijke Philips Electronics N.V. System and method for displaying anonymously annotated physical exercise data

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
ANONYMOUS: "Drones. Sector overview. March 2015", 31 March 2015 (2015-03-31), pages 1 - 37, XP055246116, Retrieved from the Internet <URL:http://citenpl.internal.epo.org/wf/storage/1528D977B1800079B8F/originalPdf> [retrieved on 20160129] *
JÜRGEN SCHEIBLE ET AL: "Displaydrone", PROCEEDINGS OF THE 2ND ACM INTERNATIONAL SYMPOSIUM ON PERVASIVE DISPLAYS, PERDIS '13, 5 June 2013 (2013-06-05), New York, New York, USA, pages 49, XP055246123, ISBN: 978-1-4503-2096-2, DOI: 10.1145/2491568.2491580 *
RAINMAKER, DC: "Sports, Drones, and Follow-Me Aerial Action Imagery: A State of the Industry | DC Rainmaker", 5 September 2015 (2015-09-05), XP055248129, Retrieved from the Internet <URL:https://web.archive.org/web/20150905092039/http://www.dcrainmaker.com/2015/02/drone-sports-usage.html> [retrieved on 20160205] *
SUN-WOOK CHOI ET AL: "Interactive display robot", HUMAN-ROBOT INTERACTION, IEEE PRESS, 445 HOES LANE, PO BOX 1331, PISCATAWAY, NJ 08855-1331 USA, 3 March 2013 (2013-03-03), pages 109 - 110, XP058013757, ISBN: 978-1-4673-3055-8 *

Similar Documents

Publication Publication Date Title
WO2017055080A1 (en) System and method for supporting physical exercises
US11132533B2 (en) Systems and methods for creating target motion, capturing motion, analyzing motion, and improving motion
JP7026214B2 (en) Head-mounted display tracking system
US9498720B2 (en) Sharing games using personal audio/visual apparatus
US9311742B1 (en) Navigating an avatar through a virtual world in a motion capture simulation environment
US10474793B2 (en) Systems, apparatus and methods for delivery and augmentation of behavior modification therapy and teaching
JP7059937B2 (en) Control device for movable image pickup device, control method and program for movable image pickup device
KR101981774B1 (en) Method and device for providing user interface in the virtual reality space and recordimg medium thereof
CN111201539A (en) Continuously selecting, by an autonomous personal companion, a scene for execution by an artificial intelligence model of a user based on identified tags describing the contextual environment of the user
JP2016508241A (en) Wireless wrist computing and controlling device and method for 3D imaging, mapping, networking and interfacing
CN111163906B (en) Mobile electronic device and method of operating the same
CN106104650A (en) Remote Device Control is carried out via gaze detection
JP2018163460A (en) Information processing apparatus, information processing method, and program
JP2010127779A (en) Information serving apparatus, information serving method and program
JPWO2017213070A1 (en) Information processing apparatus and method, and recording medium
CN109271028A (en) Control method, device, equipment and the storage medium of smart machine
JPWO2019171557A1 (en) Image display system
US20180077356A1 (en) System and method for remotely assisted camera orientation
KR20180058139A (en) Smart health service system and smart health service method
US20230095900A1 (en) Systems for simulating joining operations using mobile devices
Birbach et al. Rapid calibration of a multi-sensorial humanoid’s upper body: An automatic and self-contained approach
US20180082119A1 (en) System and method for remotely assisted user-orientation
EP2482935B1 (en) System for supporting a user to do exercises
WO2022091832A1 (en) Information processing device, information processing system, information processing method, and information processing terminal
JP2016189073A (en) Personality estimation device, personality estimation program, and personality estimation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16770711

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16770711

Country of ref document: EP

Kind code of ref document: A1