US20070021199A1 - Interactive games with prediction method - Google Patents


Info

Publication number
US20070021199A1
Authority
US
United States
Prior art keywords
player
image
players
offense
motions
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/189,176
Inventor
Ned Ahdoot
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to US11/189,176
Priority to US11/349,431 (US20070021207A1)
Publication of US20070021199A1
Priority to US12/798,335 (US20110256914A1)
Status: Abandoned

Classifications

    • A63B 24/0003: Analysing the course of a movement or motion sequences during an exercise or trainings sequence, e.g. swing for golf or tennis
    • A63F 13/213: Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/42: Processing input control signals of video game devices by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F 13/573: Simulating properties, behaviour or motion of objects in the game world using trajectories of game objects, e.g. of a golf ball according to the point of impact
    • A63F 13/577: Simulating properties, behaviour or motion of objects in the game world using determination of contact between game characters or objects, e.g. to avoid collision between virtual racing cars
    • A63F 13/65: Generating or modifying game content before or while executing the game program, automatically by game devices or servers from real world data, e.g. measurement in live racing competition
    • A63F 13/833: Hand-to-hand fighting, e.g. martial arts competition
    • G06F 3/011: Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • A63B 2220/806: Video cameras
    • A63B 2244/10: Combat sports
    • A63F 2300/1087: Input arrangements for converting player-generated signals into game device control signals, comprising photodetecting means, e.g. a camera
    • A63F 2300/1093: Input arrangements for converting player-generated signals into game device control signals, comprising photodetecting means using visible light
    • A63F 2300/6045: Methods for processing data by generating or executing the game program, for mapping control signals received from the input arrangement into game commands
    • A63F 2300/69: Involving elements of the real world in the game world, e.g. measurement in live races, real video
    • A63F 2300/8029: Fighting without shooting

Definitions

  • FIG. 1 is a perspective view showing a method of the instant innovation, providing video capture of the motions of a player and projection of a competitor's image onto a screen;
  • FIG. 2 is a perspective view thereof showing one embodiment of the invention with a player at left and a simulated player's image at right;
  • FIG. 3 is a perspective view thereof showing first and second players in separate locations, with video images of each projected onto a screen at the other player's location;
  • FIGS. 4-5 together form a logic diagram of the method of the invention.
  • one or two players take part in a game involving physical movements.
  • Such games may comprise simulated combat, games of chance, competition, cooperative engagement, and similar subjects.
  • the present invention is ideal for use in games of hand-to-hand combat such as karate, aikido, kick-boxing and American style boxing where the players have contact but are not physically intertwined as they are in wrestling, Judo and similar sports.
  • a combat game is described, but such is not meant to limit the range of possible uses of the present invention.
  • a player 5 engages in simulated combat with an image 5′ projected onto a screen 10 placed in front of the player 5.
  • the image 5′ is computer generated using the same technology as found in game arcades.
  • two players 5 stand in front of two separate screens 10 and engage in mutual simulated combat against recorded and projected images 5′ of each other. This avoids physical face-to-face combat where one of the players might receive injury.
  • the images projected onto the screens 10 are not computer generated.
  • a player 5 is positioned in front of a rear projection screen 10.
  • One or more video cameras 20, referred to here collectively as a camera 20, are positioned behind the screen 10.
  • the camera 20 is able to view the player 5 through the screen 10 and record the player's movements dynamically. If the screen 10 is not transparent enough for this to be done, the camera 20 is mounted on the front of the screen 10, or is mounted on or at the rear of the screen 10, viewing the player 5 through a small hole in the screen 10.
  • the screen 10 may be supported by a screen stand (not shown) or it may be mounted on a wall 25 as shown.
  • the screen 10 may also be mounted in the wall 25 with video equipment located on the side of the wall opposite the player 5 as shown in FIG. 1.
  • a video projector 30 projects a simulated image 5′ of a competitor combatant from the rear onto the screen 10, and this image 5′ is visible to the player 5 as shown in FIG. 2.
  • both the camera 20 and the projector 30 operate at identical rates (frames per second) but are set for recording and projecting respectively for only one-half of each frame, and are interlaced so that recording occurs only when the projector 30 is in an off state, and projecting occurs only when the camera 20 is in an off state.
  • the net result is that the player 5, positioned at the front of the screen 10, sees the projected image while the camera 20 sees the player 5 and not the projected image.
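  • The interlacing described above can be sketched in Python; `capture_frame` and `project_frame` are hypothetical stand-ins for the camera 20 and projector 30, since the patent does not describe a software implementation.

```python
# Sketch of the interlaced half-frame timing (illustrative only):
# the camera records only while the projector is off, and vice versa,
# both running at one shared frame rate.

def run_interlaced(num_frames, capture_frame, project_frame):
    """Alternate capture and projection within each frame period so the
    camera never sees the projected image."""
    captured = []
    for n in range(num_frames):
        captured.append(capture_frame(n))  # first half: camera on, projector off
        project_frame(n)                   # second half: projector on, camera off
    return captured

# Minimal usage with stub devices:
frames_seen = run_interlaced(3, lambda n: f"player-frame-{n}", lambda n: None)
```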
  • the screen 10 may be a two-way mirror, with objects in front of the screen 10 clearly visible from the rear of the screen 10, with visibility through the screen 10 from the front not possible, yet with images projected onto the back of the screen 10 highly visible from the front.
  • the player 5 wears colored bands as best seen in FIG. 2 .
  • the player 5 has a band 51 secured at his forehead, above each elbow 52, on each wrist 53, around the waist 54, above each knee 55 and on each ankle 56.
  • Each of these 10 bands is a different color. Further bands may be placed in additional locations on the player, but the 10 bands shown in FIG. 2, as described, are able to achieve the objectives of the instant innovation as will be shown.
  • the image 5′ of the player 5 as recorded by camera 20 is converted into a digital electronic signal. This signal is split into 10 identical signals and each of these 10 signals is filtered for only the color component related to one of the 10 bands 51-56.
  • Each of the filtered signals contains two pieces of information: the location on the plane of the recording device of its related colored band as determined by which pixels are disposed to the band, and the distance from the recording device to the band as determined by the total number of pixels disposed to the band.
  • This information, from all ten bands, is processed by a computer 60 to form a composite image 5′ of the player 5.
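  • The per-band filtering can be illustrated with a small NumPy sketch, assuming exact-color bands: a band's location on the recording plane is the centroid of its matching pixels, and its distance follows from the pixel count, since apparent area scales with the inverse square of distance. The calibration constant `k` and the test colors are hypothetical.

```python
import numpy as np

def locate_band(frame, band_color, k=100.0):
    """Return the (row, col) centroid and estimated distance of one colored
    band in an H x W x 3 uint8 frame, or None if the band is hidden.
    Distance ~ k / sqrt(pixel_count), because the band's apparent area
    scales as 1 / distance**2 (k is a hypothetical calibration constant)."""
    mask = np.all(frame == np.array(band_color, dtype=frame.dtype), axis=-1)
    count = int(mask.sum())
    if count == 0:
        return None  # band hidden behind another part of the player's body
    rows, cols = np.nonzero(mask)
    centroid = (float(rows.mean()), float(cols.mean()))
    return centroid, float(k / np.sqrt(count))

# Synthetic frame: a 4x4 red patch (one "band") on a black background.
frame = np.zeros((48, 64, 3), dtype=np.uint8)
frame[10:14, 20:24] = (255, 0, 0)
centroid, dist = locate_band(frame, (255, 0, 0))
```

In practice one such filter would run for each of the ten band colors, yielding the composite model described above.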
  • the player 5 stands facing the screen 10 with feet a comfortable distance apart, legs straight, and arms hanging at the player's sides.
  • Each of the ten colored bands 51-56 is visible to the camera 20 and, with a simple set of anatomical rules, the computer 60 is able to compose a mathematical model of the player's form that accurately represents the player's physical position and anatomical orientation at that moment.
  • the computer 60 is able to calculate the motion trajectory of each band in 3-space.
  • the computer 60 calculation takes into account that the corresponding portion of the human anatomy has moved so as to be hidden behind another portion of the anatomy of the player 5. This example is represented in FIG. 2.
  • the computer 60 produces a digital image 5′ representing a competitor combatant and projects this image 5′ onto the screen 10, initially in a starting position with body erect, feet spread apart and arms at sides.
  • the computer 60 calculates the trajectory of motion of the attacking element, i.e., hand, arm, leg, etc., of the player 5 and moves the image 5′ to defensive postures or to counter attack.
  • the computer 60 is able to calculate if the player 5 has moved successfully to overcome defensive postures or counter attacks of the image 5′ so as to award points to the player 5.
  • Two players 5 stand facing their respective screens 10, each with feet a comfortable distance apart, legs straight, and arms hanging at their sides.
  • Each of the ten colored bands 51-56 on each of the players 5 is visible to their respective cameras 20 so that the computer 60 is able to compose mathematical models of each of the players 5 in a mathematical 3-space that accurately represents each player's physical position and anatomical orientation at that moment relative to the other player 5.
  • the vertical plane represented by the screen 10 of one player 5 represents a vertical bisector of the other player 5. Therefore, when one player 5 moves a fist, elbow, knee or foot toward his screen 10, the computer 60 calculates that motion as projecting outwardly toward the other player 5 from the other player's screen 10.
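  • That mirrored geometry can be expressed as a small coordinate transform. The conventions here are assumptions, not taken from the patent: each frame's origin lies on that player's own screen, +z points from the screen toward the player, x runs to the player's right, and each player nominally stands at z = stance, so his vertical bisector plane coincides with the opponent's screen.

```python
STANCE = 1.0  # hypothetical nominal stance distance, slightly over arm's length (m)

def to_opponent_frame(point, stance=STANCE):
    """Map (x, y, z) from one player's screen coordinates to the opponent's.
    Depth maps as z' = stance - z, so a fist thrust toward one screen
    emerges outward from the other screen toward the opponent; x flips
    because the players face each other."""
    x, y, z = point
    return (-x, y, stance - z)

# A fist 0.3 m from the attacker's screen, 0.2 m to his right, 1.4 m high:
mapped = to_opponent_frame((0.2, 1.4, 0.3))
```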
  • the computer 60 calculates contacts between players 5 in offensive and defensive moves.
  • the players 5 initially and nominally stand slightly more than an arm's length away from their screen, i.e., mathematically from their opponent.
  • Points are awarded to each of the players for successful offensive and defensive moves.
  • the images are preferably projected with three-dimensional realism by use of the well-known technique of horizontal and vertical polarization of dual simultaneous projections with slight image separation, and with the players 5 wearing horizontally and vertically polarized lenses so as to see a combined image providing the illusion of depth.
  • each of the players 5 sees the illusion of the opponent player's image projecting toward him from the screen 10.
  • This example is represented in FIG. 3.
  • the present disclosure teaches an improved video frame processing method that enables the combative motions between two distant players 5 to be calculated and compared with respect to each other. This method is described as follows and is as shown in FIGS. 4-6.
  • a stream of frames from the video camera 20 is processed.
  • position, velocity (as the first differential of the position) and acceleration (as the second differential of the position) of each of the ten color elements of the player 5 are calculated.
  • Enablement of prediction is determined by comparing the number of frames comprising a particular motion against a minimum-number-of-frames set point.
  • the calculations continue until the number of frames is at least equal to the set point.
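  • The calculations above can be sketched with finite differences; the set-point value and the constant-acceleration extrapolation used as the predictor are assumptions, since the patent does not specify a particular prediction formula.

```python
import numpy as np

MIN_FRAMES = 5  # hypothetical minimum-number-of-frames set point

def predict_next(positions, dt=1.0 / 30.0):
    """Given per-frame positions of one colored element, return the
    predicted next position, or None until the set point is reached.
    Velocity is the first difference of position, acceleration the second
    difference; prediction extrapolates at constant acceleration."""
    p = np.asarray(positions, dtype=float)
    if len(p) < MIN_FRAMES:
        return None  # prediction not yet enabled
    v = (p[-1] - p[-2]) / dt                    # latest velocity
    a = (p[-1] - 2.0 * p[-2] + p[-3]) / dt**2   # latest acceleration
    return p[-1] + v * dt + 0.5 * a * dt**2

# A band moving at constant velocity: the prediction continues the line.
nxt = predict_next([0.0, 1.0, 2.0, 3.0, 4.0], dt=1.0)
```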
  • the image is modified so as to defend against an offensive move by the player 5 or to initiate a new offensive move from an inventory of such moves.
  • the final logical loops of this program are shown in FIGS. 5 and 6 and comprise the determination of incoming offense commands, calculation of the player's new coordinates, determination of whether the defense or offense is complete, calculation of the player's offensive positions as compared to the image's defense moves and vice-versa, and determination of a score for the player 5 in accordance with a stored table of score-related motion and counter-motion comparisons. For each of the motion and counter-motion determinations, for both offensive and defensive motions of players, a score is created and projected onto the screen.
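  • The stored table of motion and counter-motion comparisons might be as simple as a lookup keyed on the pairing; the move names and point values below are invented for illustration only.

```python
# Hypothetical score table: (offensive motion, counter motion) -> points
# awarded to the attacker.  Entries are illustrative, not from the patent.
SCORE_TABLE = {
    ("jab", "block"): 0,
    ("jab", "none"): 1,
    ("roundhouse_kick", "duck"): 0,
    ("roundhouse_kick", "none"): 3,
}

def score_exchange(offense, defense):
    """Compare an offensive motion with its counter motion and return the
    points to add to the attacker's projected score (0 if unlisted)."""
    return SCORE_TABLE.get((offense, defense), 0)

total = score_exchange("jab", "none") + score_exchange("roundhouse_kick", "duck")
```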

Abstract

A method for engaging a player or a pair of players in a motion related game, including the steps of attaching plural colored elements onto selected portions of the player(s)' garments and processing a video stream of each of the players to separately identify the positions, velocities and accelerations of the several colored elements. The method further comprises generation of a combatant competitor image and moving the image in a manner to overcome the player. In a further approach, two players are recorded and their video images are presented on screens, each frontal to the other of the players. The same colored elements are used to enable computer calculations of the fighting proficiency of the players.

Description

    BACKGROUND OF THE INVENTION
  • 1. Field of the Subject Matter
  • This invention relates generally to games of interactive play between two or more entities including individuals and computer simulated opponents, i.e., the invention may be used by two individuals, an individual and a simulation, and even between two simulations, as for demonstration purposes, and more particularly to a computer controlled interactive movement and contact simulation game in which a player mutually interacts with a computer generated image that responds to the player's movement in real-time.
  • 2. Description of Related Art
  • The following art defines the present state of this field:
  • Invention and use of computer generated, interactive apparatus are known to the public, in that such apparatus are currently employed for a wide variety of uses, including interactive games, exercise equipment, and astronaut training.
  • Ahdoot, U.S. Pat. No. 5,913,727 discloses an interactive contact and simulation game apparatus in which a player and a three dimensional computer generated image interact in simulated physical contact. Alternatively, two players may interact through the apparatus of the invention. The game apparatus includes a computerized control means generating a simulated image or images of the players, and displaying the images on a large display. A plurality of position sensing and impact generating means are secured to various locations on each of the player's bodies. The position sensing means relay information to the control means indicating the exact position of the player. This is accomplished by the display means generating a moving light signal, invisible to the player, but detected by the position sensing means and relayed to the control means. The control means then responds in real time to the player's position and movements by moving the image in a combat strategy. When simulated contact between the image and the player is determined by the control means, the impact generating means positioned at the point of contact is activated to apply pressure to the player, thus simulating contact. With two players, each player sees his opponent as a simulated image on his display device.
  • Lewis et al. U.S. Pat. No. 5,177,872 discloses a novel device for determining the position of a person or object. The device is responsive to head or hand movements in order to move a dampened substance contained within a confined tube past one or more sensors. Light passing through the tube is interrupted by the movement of the dampened substance. The intended use of the device, as disclosed, is changing the perspective shown on a video display.
  • Goo U.S. Pat. No. 4,817,950 teaches a video game controller for surfboarding simulation, and of particular interest is the use of a unique attitude sensing device to determine the exact position of the surfboard. The attitude sensing device employs a plurality of switch closures to determine the tilt angle of the platform and open and close a plurality of electrical contacts enabling a signal input to a computer control unit.
  • Good et al. U.S. Pat. No. 5,185,561 teaches the principles of tactile feedback through the use of a torque motor. As disclosed, the device consists of a hand held, one dimensional torque feedback device used to manipulate computer generated visual information and associated torque forces.
  • Kosugi et al. U.S. Pat. No. 5,229,756 disclose a combination of components forming an interactive image control apparatus. The main components of the device are a movement detector for detecting movement, a judging device for determining the state of the operator on the basis of the movement signal provided by the movement detector, and a controller that controls the image in accordance with the movement signal and the judgment of the judgment device. The movement detector, judging device and the controller cooperate so as to control the image in accordance with the movement of the operator. Kosugi requires that a detection means be attached adjacent to the operator's elbow and knee joints so as to measure the bending angle of the extremity and thus more accurately respond to the operator's movements.
  • The present invention employs a system in which the position of the player is continually monitored. Between the simple types of games of combat as typically found in game arcades, wherein the player's input is via a simple control joystick and punch-buttons, and the very sophisticated and complex artificial reality types of games wherein the headgear provides a full sensory input structure and a highly instrumented and wired glove allows manual contact on a limited basis with the simulation, there is a need for a fully interactive game. The present invention takes the approach to simulate a combat adversary image, while allowing the player to exercise every part of his body in combat with the image. This is the final and most important objective. The present invention fulfills these needs and provides further related advantages as described in the following summary.
  • Our prior art search with abstracts described above teaches interactive game technology, technique and know-how. However, the prior art fails to teach the instant technique featuring simulated “stand-up” combat between two individuals or between an individual and a computer simulation. The present invention fulfills these needs and provides further related advantages as described in the following summary.
  • SUMMARY
  • The present invention teaches certain benefits in construction and use which give rise to the objectives described below.
  • A best mode embodiment of the present invention provides a method for engaging a player or a pair of players in a motion-related game, including the steps of: attaching plural colored elements onto selected portions of the player(s); processing a video stream from a digital camera to separately identify the positions, velocities and accelerations of the several colored elements over time; providing the data stream of the video camera to a data processor; calculating the distance between the player and the camera as a function of time; and predicting the motions of the players and providing anticipatory motions of a virtual image in compensation therefor.
  • A primary objective of the present invention is to provide an apparatus and method of use of such apparatus that yields advantages not taught by the prior art.
  • Another objective of the invention is to provide a game for simulated combat between two individuals.
  • A further objective of the invention is to provide a game for simulated combat between an individual and a simulated second player of the game.
  • A further objective of the invention is to provide a game for simulated combat between an individual carrying a sport instrument in hand and simulated offense and defense players of the game.
  • A still further objective of the invention is to provide a virtual image that anticipates and predicts the movement of the real player and changes accordingly.
  • Other features and advantages of the embodiments of the present invention will become apparent from the following more detailed description, taken in conjunction with the accompanying drawings, which illustrate, by way of example, the principles of at least one of the possible embodiments of the invention.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings illustrate at least one of the best mode embodiments of the present invention. In such drawings:
  • FIG. 1 is a perspective view showing a method of the instant innovation providing video capture of the motions of a player and of projection of a competitor's image onto a screen;
  • FIG. 2 is a perspective view thereof showing one embodiment of the invention with a player at left and a simulated player's image at right;
  • FIG. 3 is a perspective view thereof showing first and second players in separate locations with video images of each projected onto a screen at the other player's location; and
  • FIGS. 4-5 are a logic diagram of the method of the invention.
  • DETAILED DESCRIPTION
  • The above described drawing figures illustrate the present invention in at least one of its preferred, best mode embodiments, which is further defined in detail in the following description. Those having ordinary skill in the art may be able to make alterations and modifications in the present invention without departing from its spirit and scope. Therefore, it must be understood that the illustrated embodiments have been set forth only for the purposes of example and that they should not be taken as limiting the invention as defined in the appended claims.
  • In the present apparatus and method, one or two players take part in a game involving physical movements. Such games may comprise simulated combat, games of chance, competition, cooperative engagement, and similar subjects. However, the present invention is ideal for use in games of hand-to-hand combat such as karate, aikido, kick-boxing and American style boxing where the players have contact but are not physically intertwined as they are in wrestling, Judo and similar sports. In this disclosure a combat game is described, but such is not meant to limit the range of possible uses of the present invention. In one embodiment of the instant combat game, a player 5 engages in simulated combat with an image 5′ projected onto a screen 10 placed in front of the player 5. In this embodiment, the image 5′ is computer generated using the same technology as found in game arcades. In an alternate embodiment, two players 5 stand in front of two separate screens 10 and engage in mutual simulated combat against recorded and projected images 5′ of each other. This avoids physical face-to-face combat where one of the players might receive injury. In this second approach, the images projected onto the screens 10 are not computer generated.
  • In the first approach, a player 5 is positioned in front of a rear projection screen 10. One or more video cameras 20, referred to here as a camera 20, is positioned behind the screen 10. The camera 20 is able to view the player 5 through the screen 10 and record the player's movements dynamically. If the screen 10 is not transparent enough for this to be done, the camera 20 is mounted on the front of the screen 10, or is mounted on or at the rear of the screen 10 viewing the player 5 through a small hole in the screen 10. The screen 10 may be supported by a screen stand (not shown) or it may be mounted on a wall 25 as shown. The screen 10 may also be mounted in the wall 25 with video equipment located on the side of the wall opposite the player 5 as shown in FIG. 1.
  • A video projector 30 projects a simulated image 5′ of a competitor combatant from the rear onto the screen 10 and this image 5′ is visible to the player 5 as shown in FIG. 2. In the approach where the camera 20 is located behind the screen 10, in order for the camera 20 to not record the projected image 5′, both the camera 20 and the projector 30 operate at identical rates (frames per second) but are set for recording and projecting respectively for only one-half of each frame, and are interlaced so that recording occurs only when the projector 30 is in an off state, and projecting occurs only when the camera 20 is in an off state. The net result is that the player 5, positioned at the front of the screen 10, sees the projected image while the camera 20 sees the player 5 and not the projected image.
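The interlacing of camera recording and projector output described above can be sketched as a simple frame scheduler. This is a minimal illustration, not the patent's implementation; the function name and the half-frame slot model are assumptions.

```python
def schedule(n_slots):
    """Assign alternating half-frame slots to the camera and the projector.

    Both devices run at the same frame rate but are active only on
    opposite half-frames, so the camera records the player while the
    projector is dark and never sees the projected image.
    """
    return ["camera" if i % 2 == 0 else "projector" for i in range(n_slots)]
```

For example, `schedule(4)` yields `["camera", "projector", "camera", "projector"]`: no slot ever has both devices active.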
  • The screen 10 may be a two-way mirror, with objects in front of the screen 10 clearly visible from the rear of the screen 10, with visibility through the screen 10 from the front not possible, and yet with images projected onto the back of the screen 10 highly visible from the front.
  • In both of the above described approaches, the player 5 wears colored bands as best seen in FIG. 2. Preferably, the player 5 has a band 51 secured at his forehead, above each elbow 52, on each wrist 53, around the waist 54, above each knee 55 and on each ankle 56. Each of these 10 bands is a different color. Further bands may be placed in additional locations on the player, but the 10 bands shown in FIG. 2, as described, are able to achieve the objectives of the instant innovation as will be shown. In the instant method, the image 5′ of the player 5, as recorded by camera 20, is converted into a digital electronic signal. This signal is split into 10 identical signals, and each of these 10 signals is filtered for only the color component related to one of the 10 bands 51-56. Each of the filtered signals contains two pieces of information: the location of its related colored band on the plane of the recording device, as determined by which pixels are disposed to the band, and the distance from the recording device to the band, as determined by the total number of pixels disposed to the band. This information from all ten bands is processed by a computer 60 to form a composite image 5′ of the player 5.
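The per-color filtering described above — locating a band by which pixels match its color, and gauging its distance by how many pixels it occupies — can be sketched in Python with NumPy. The tolerance value and function name are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def locate_band(frame, color, tol=30):
    """Isolate one colored band in an RGB frame.

    Returns the band's centroid on the image plane (its position) and
    its pixel count (a distance proxy: fewer pixels means farther away).
    `tol` is an assumed per-channel color tolerance.
    """
    mask = np.all(np.abs(frame.astype(int) - np.asarray(color)) <= tol, axis=-1)
    count = int(mask.sum())
    if count == 0:
        return None, 0  # band hidden behind another part of the player's body
    ys, xs = np.nonzero(mask)
    return (float(xs.mean()), float(ys.mean())), count
```

Running this once per band color on each frame yields the ten (position, size) pairs the computer 60 would combine into its composite model.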
  • EXAMPLE 1
  • The player 5 stands facing the screen 10 with feet a comfortable distance apart, legs straight, and arms hanging at the player's sides. Each of the ten colored bands 51-56 is visible to the camera 20, and with a simple set of anatomical rules the computer 60 is able to compose a mathematical model of the player's form that accurately represents the player's physical position and anatomical orientation at that moment. When a band moves, its image on the recording plane moves accordingly, so that the computer 60 is able to calculate the motion trajectory of the band. When the number of pixels related to a particular band diminishes or grows, the computer 60 is able to calculate the band's trajectory in 3-space. When a band disappears, the computer 60 calculation takes into account that the corresponding portion of the human anatomy has moved so as to be hidden behind another portion of the anatomy of the player 5. This example is represented in FIG. 2.
  • The computer 60 produces a digital image 5′ representing a competitor combatant and projects this image 5′ onto the screen 10, initially in a starting position with body erect, feet spread apart and arms at sides. As the player 5 moves to attack the competitor image 5′, the computer 60 calculates the trajectory of motion of the attacking element, i.e., hand, arm, leg, etc., of the player 5 and moves the image 5′ to defensive postures or to counter attack. The computer 60 is able to calculate if the player 5 has moved successfully to overcome defensive postures or counter attacks of the image 5′ so as to award points to the player 5.
  • EXAMPLE 2
  • Two players 5 stand facing their respective screens 10, each with feet a comfortable distance apart, legs straight, and arms hanging at their sides. Each of the ten colored bands 51-56 on each of the players 5 is visible to the respective camera 20, so that the computer 60 is able to compose mathematical models of each of the players 5 in a mathematical 3-space that accurately represents each player's physical position and anatomical orientation at that moment relative to the other player 5. The vertical plane represented by the screen 10 of one player 5 represents a vertical bisector of the other player 5. Therefore, when one player 5 moves a fist, elbow, knee or foot toward his screen 10, the computer 60 calculates that motion as projecting outwardly toward the other player 5 from the other player's screen 10. In this manner the computer 60 calculates contacts between players 5 in offensive and defensive moves. As in real face-to-face combat, the players 5 initially and nominally stand slightly more than an arm's length away from their screen, i.e., mathematically, from their opponent. Points are awarded to each of the players for successful offensive and defensive moves. The images are preferably projected with three-dimensional realism by use of the well known technique of horizontally and vertically polarized dual simultaneous projections with slight image separation, with the players 5 wearing horizontally and vertically polarized lenses so as to see a combined image providing the illusion of depth. In this manner, each of the players 5 sees the illusion of the opponent player's image projecting toward him from the screen 10. This example is represented in FIG. 3.
  • The present disclosure teaches an improved video frame processing method that enables the combative motions of two distant players 5 to be calculated and compared with respect to each other. This method is described as follows and is shown in FIGS. 4-5. Once the game is initiated, a stream of frames from the camera 20 is processed. When motion is detected by a change in the position of any of the color elements 51-56 being recorded, the position, velocity (as the differential of the position) and acceleration (as the second differential of the position) of each of the ten color elements of the player 5, as discriminated by the signal filtering process described above, are calculated. Enablement of prediction is determined by comparing the number of frames comprising a particular motion with a minimum-number-of-frames set point; the calculations continue until the number of frames is at least equal to the set point. Depending on whether the motion in any of the colored elements is defensive, i.e., lagging the opponent's movement, or offensive, i.e., independent of the opponent's movement, the image is modified so as to defend against an offensive move by the player 5 or to initiate a new offensive move from an inventory of such moves. The final logical loops of this program are shown in FIG. 5 and comprise the determination of incoming offense commands, calculation of the player's new coordinates, determination of whether the defense or offense is complete, calculation of the player's offensive positions as compared to the image's defensive moves and vice versa, and determination of a score for the player 5 in accordance with a stored table of score-related motion and counter-motion comparisons. For each of the motion and counter-motion determinations, for both offensive and defensive motions of the players, a score is created and projected onto the screen.
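The frame-processing loop above defines velocity as the first differential of position and acceleration as the second, with prediction enabled only after a minimum number of frames. A minimal numeric sketch, assuming a 30 fps frame rate and an arbitrary set point:

```python
def kinematics(positions, dt=1.0 / 30):
    """Velocity as the first difference of position, acceleration as the
    second, computed frame to frame; dt = 1/30 s is an assumed frame rate."""
    vel = [(b - a) / dt for a, b in zip(positions, positions[1:])]
    acc = [(b - a) / dt for a, b in zip(vel, vel[1:])]
    return vel, acc

def prediction_enabled(frames_in_motion, set_point=5):
    """Prediction starts only once a motion spans at least `set_point`
    frames (the minimum-number-of-frames set point; 5 is an assumption)."""
    return frames_in_motion >= set_point
```

With `dt=1.0`, positions `[0, 1, 3, 6]` give velocities `[1, 2, 3]` and accelerations `[1, 1]`, matching the differential definitions in the text.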
  • The enablements described in detail above are considered novel over the prior art of record and are considered critical to the operation of at least one aspect of one best mode embodiment of the instant invention and to the achievement of the above described objectives. The words used in this specification to describe the instant embodiments are to be understood not only in the sense of their commonly defined meanings, but to include by special definition in this specification: structure, material or acts beyond the scope of the commonly defined meanings. Thus if an element can be understood in the context of this specification as including more than one meaning, then its use must be understood as being generic to all possible meanings supported by the specification and by the word or words describing the element.
  • The definitions of the words or elements of the embodiments of the herein described invention and its related embodiments not described are, therefore, defined in this specification to include not only the combination of elements which are literally set forth, but all equivalent structure, material or acts for performing substantially the same function in substantially the same way to obtain substantially the same result. In this sense it is therefore contemplated that an equivalent substitution of two or more elements may be made for any one of the elements in the invention and its various embodiments or that a single element may be substituted for two or more elements in a claim.
  • Changes from the claimed subject matter as viewed by a person with ordinary skill in the art, now known or later devised, are expressly contemplated as being equivalents within the scope of the invention and its various embodiments. Therefore, obvious substitutions now or later known to one with ordinary skill in the art are defined to be within the scope of the defined elements. The invention and its various embodiments are thus to be understood to include what is specifically illustrated and described above, what is conceptually equivalent, what can be obviously substituted, and also what essentially incorporates the essential idea of the invention.
  • While the invention has been described with reference to at least one preferred embodiment, it is to be clearly understood by those skilled in the art that the invention is not limited thereto. Rather, the scope of the invention is to be interpreted only in conjunction with the appended claims and it is made clear, here, that the inventor(s) believe that the claimed subject matter is the invention.

Claims (9)

1. A method of playing a motion related hand-to-hand combat type game, between a player and an image of a competitor player; the method comprising the steps of:
a) identifying portions of the players with individual colored elements;
b) recording the players as video images and filtering the images into separate signals according to the colored elements for both of the players;
c) determining positions in 3-space of the portions of the players on each video frame of each of the recordings of the players, and calculating changes in position between each of the frames, and further calculating velocity, acceleration and trajectory line of each of the portions;
d) determining when one of the portions of one of the players intersects the impact plane;
e) determining if a portion of the other of the players intersects the impact plane and also intersects the one of the portions of the one of the players;
f) awarding a point to the one of the players; and
g) repeating steps (c) through (f) until one of the players has achieved a set number of points.
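Steps (d) through (f) of claim 1 reduce to two geometric tests: has a body part reached the impact plane, and does an opposing part intersect it there. A sketch under assumed conventions (plane at depth zero, an arbitrary blocking radius):

```python
def crossed_impact_plane(z, plane_z=0.0):
    """A portion 'intersects the impact plane' when its depth coordinate
    reaches the shared plane; plane_z = 0 is an assumed convention."""
    return z <= plane_z

def blocked(attacker_xy, defender_xy, reach=0.3):
    """If both portions intersect the plane within `reach` meters of each
    other (an assumed radius), the defender has intercepted the attack."""
    dx = attacker_xy[0] - defender_xy[0]
    dy = attacker_xy[1] - defender_xy[1]
    return (dx * dx + dy * dy) ** 0.5 <= reach
```

An attack that crosses the plane unblocked would score a point under step (f); a blocked one would not.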
2. The method of claim 1, utilizing a digital video camera interfaced to a distributed processor to capture real-time 3-d motions of a player, comprising the further steps of:
a) calibrating the system by initially placing the player(s) at a fixed distance from the camera and calibrating the colored elements and bodily signatures of the player for real-time 3-d motion detection;
b) continually receiving the camera's real-time electro-optical, auto-focus, and zooming control information along with the video camera signals measuring the 3-dimensional positions of the player(s) in motion;
c) while in motion, calculating the depth (z) from the ratio of the total pixel count of the colored elements worn by the player(s) to the total pixel count of the colored elements measured during initial calibration;
d) utilizing a camera that can be commanded to perform auto focus or computer-controlled focus;
e) adjusting the pixel count information of the colored elements and the player(s) bodily signature based upon the received camera's auto-focus or computer-controlled focus;
f) measuring the trajectory of motion, speed, and acceleration of the player's body parts from the differential changes between the most recent frame and the previous frame, and filtering the images to provide a sharp image and eliminate background noise;
g) measuring differential changes from frame to frame by following the periphery of each colored element and measuring pixel changes;
h) utilizing a computer-controlled camera that is commanded to focus, and stay focused, on a specific moving colored element;
i) utilizing a computer-controlled camera whose zooming is computer controlled;
j) placing the digital camera on a computer-controlled gimbal to follow the player's motions, wherein the pixel count derived from step (c) is further adjusted based upon the 2-d gimbal motions;
k) utilizing the digital camera with infrared sensors to monitor the body temperature of the player.
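The depth calculation of step (c) can be sketched under a pinhole-camera assumption, where a band's apparent pixel area falls off as 1/z². The square-root relation is our reading of that model; the claim itself states only the pixel-count ratio.

```python
import math

def depth_from_pixels(pixel_count, cal_count, cal_depth):
    """Estimate depth from the ratio of current to calibrated pixel count.

    Assuming apparent area scales as 1/z^2, z = z_cal * sqrt(N_cal / N),
    where N_cal and z_cal come from the initial fixed-distance calibration.
    """
    if pixel_count <= 0:
        raise ValueError("band not visible in this frame")
    return cal_depth * math.sqrt(cal_count / pixel_count)
```

A band calibrated to 400 pixels at 2 m that shrinks to 100 pixels is thus estimated at 4 m.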
3. The method of claim 1 wherein the computer's further actions are synchronized to the start of a player's motions or verbal commands on a frame-by-frame basis, further comprising the steps of:
a) comparing each incoming frame to the previous frame to detect the magnitude of change; changes in the incoming frames surpassing a threshold lead to further processing, while changes not surpassing the threshold are counted and discarded;
b) continuous incoming frames not surpassing the threshold for a certain period of time (“c” number of frames) are counted, discarded, and lead to an offense motion by the computer-generated image;
c) analyzing voice-activated or other commands and routing them to different processing stages depending upon the nature of the commands.
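The frame gating of steps (a) and (b) — route above-threshold changes to further processing, count and discard quiet frames, and let a run of "c" quiet frames trigger an image offense — can be sketched as a small loop. Threshold and "c" values here are arbitrary assumptions.

```python
def gate_frames(change_magnitudes, threshold, c):
    """Route each frame's change magnitude per claim 3: above-threshold
    frames go to further processing; `c` consecutive quiet frames trigger
    an offense motion by the computer-generated image."""
    quiet, decisions = 0, []
    for mag in change_magnitudes:
        if mag > threshold:
            quiet = 0
            decisions.append("process")
        else:
            quiet += 1
            decisions.append("image_offense" if quiet >= c else "discard")
    return decisions
```

For instance, magnitudes `[5, 1, 1, 1]` with threshold 2 and c = 3 give one processed frame, two discards, then an image offense on the third quiet frame.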
4. The method of claim 1 wherein an event detection and prediction distributed digital image processor continually monitors the movement of the player to detect motions that are consistent within a certain time period (“b” number of frames), an event being defined as an offense or a defense motion by the player; and wherein the event detector's algorithm comprises the steps of:
a) comparing each consecutive frame that has passed the threshold to the previous frame to detect the magnitude of change; changes are added to the previous trajectory of the player's motions;
b) if the received frame number is less than “b” number of frames, repeating the previous step; otherwise going to the next step;
c) at the end of “b” number of frames, if the player's motions indicate an offense aimed at the image's sensitive parts, going to the player's offensive play (step f); otherwise continuing;
d) at the end of “b” number of frames, if the player's motions indicate a defense, protecting against and dodging the image's offensive moves, going to the player's defense (step g); otherwise continuing;
e) at the end of “b” number of frames, if the player's motions indicate a combination of offense and defense against the image's body parts, calculating the likelihood of hit success by comparing the player's and the image's motions; if the player's offense is stronger, going to step (f); if weaker, going to step (g);
f) predicting the player's offense course of motion as the continuation of the motions detected in “b”; planning a defensive course of motion for the image in conjunction with the player's prediction; calculating the player's and the image's final trajectories and coordinates at the end of the predicted or planned period; sending them to the event follower processor (step h); then going to step (a);
g) predicting the player's defense course of motion as the continuation of the motions detected in “b”; planning an offense course of motion for the image in conjunction with the player's prediction; calculating the player's and the image's final trajectories and coordinates at the end of the planned period; sending the results to the event follower processor (step l); then going to step (a);
h) determining whether a new player's offense command has been received from the event detection and prediction processor; if not, going to the next step; otherwise continuing to display the planned image and repeating this step;
i) continually displaying the planned defense or offense motions of the image; getting the next frame, processing the frame, calculating the player's new coordinates, and adding them to the previous coordinates;
j) at the end of the player's prediction period or the image's planned defense, going to the next step; otherwise going to the previous step;
k) comparing the player's offense to the image's defense; calculating and showing the scores; going to step (h);
l) determining whether a new player's defense command has been received from the event detection processor; if not, going to the next step; otherwise continuing to display the planned image and repeating this step;
m) continually displaying the planned defense or offense motions of the image; getting the next frame, processing the frame, calculating the player's new coordinates, and adding them to the previous coordinates;
n) at the end of the player's prediction period or the image's planned offense period, going to the next step; otherwise going to the previous step;
o) comparing the image's offense to the player's defense; calculating and showing the scores; going to step (l).
5. The method of claim 1 wherein the degree of the player's speed is decided by adjustment of the “b” number of frames during initialization, and the degree of expertise is decided by classifying the predictions and plans.
6. The method of claim 1 further comprising the step of increasing the number of cameras and display monitors to assist the player's view of the image at different angles while turning and facing from one camera to another, wherein:
a) the image processor provides an image 3-d field of play for the player to use as a visual guideline for his/her movements in the field of play, while the image is moved around from one side of the field of play to the other;
b) the image processor detects the player's positions from the different cameras, decides which camera provides the best detection angle, and displays the image in the relevant field of play to be viewed by the player.
7. The method of claim 1 wherein two local players use two sets of camera(s), two sets of displays, and an image processor that examines the individual video pictures from each player and displays the video or planned image of the opponent player.
8. The method of claim 1 wherein two remote players use two sets of camera(s), two sets of displays, and two sets of image processors, further comprising the steps of:
p) each processor examining the video from its relevant local player;
q) each processor receiving the opponent's image motion information (or the actual opponent's video and other relevant information) via remote transmission facilities on a frame-by-frame basis;
r) each processor displaying an image of the opponent and controlling the image based upon the information received from the opponent's motions;
s) each processor providing scores on each display;
t) one processor determining the winning score, to be displayed on both monitors.
9. A method of playing baseball, tennis, golf, or other related games between a player having a sport instrument in hand and the images of offense and defense players, wherein a ball, the images of the players, and an image of the field are generated by a processor; the method comprising the steps of:
a. identifying the play instrument and portions of the player's body with individual colored elements;
b. planning and playing each computer-generated thrown image ball with a known trajectory, speed, acceleration, and prediction, simulating a pro player;
c. generating an image of the field of play in 3-d whereby offense and defense actions take place, by image offense and image defense players;
d. processing the incoming frames in the vicinity of the time of impact of the player's instrument with the image ball;
e. recording the movements of the player and the instrument as a video image and electronically or optically filtering the image into separate signals according to the colored elements;
f. determining positions in 3-space of the portions of the player and the instrument on each recorded video frame;
g. following the trajectory of the player's body parts and the sport instrument, utilizing the method of claim 4, to calculate changes in trajectory, velocity, and acceleration of the portions of the player's body and the trajectory of the instrument;
h. predicting the trajectory, velocity, and acceleration of the image ball being hit by the player's instrument;
i. moving the image players, as a result of the predicted trajectory of the ball and each image's physical location in the field of play, at the moment of impact of the instrument with the image ball;
j. calculating the likelihood of success of the image ball staying within the image 3-d field of play;
k. displaying the predicted trajectory of the ball hit by the player's instrument;
l. displaying the images of the players playing offense and defense and reacting to the image ball, based upon the positions of the image players and the prediction;
m. comparing the player's prediction and follow-on to the initially planned trajectory (step b), and displaying the scores.
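The ball-prediction and in-field steps (h and j) of claim 9 can be sketched with a simple ballistic model. Gravity-only flight, the frame interval, and the field dimensions are all assumptions; the claim calls only for a predicted trajectory, speed, and acceleration and a likelihood-of-success calculation.

```python
def ball_trajectory(p0, v0, steps, dt=1.0 / 30, g=9.8):
    """Step the image ball forward from position p0 with velocity v0 under
    gravity (an assumed model), one video frame (dt) at a time."""
    x, y, z = p0
    vx, vy, vz = v0
    path = []
    for _ in range(steps):
        x += vx * dt
        y += vy * dt
        vz -= g * dt
        z += vz * dt
        path.append((x, y, z))
    return path

def in_field(point, half_width=10.0, length=24.0):
    """Reduce the likelihood-of-success check to a bounds test against an
    assumed 3-d field of play."""
    x, y, z = point
    return abs(x) <= half_width and 0.0 <= y <= length and z >= 0.0
```

Each predicted point can then drive step (i), moving the image players toward where the ball will be.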
US11/189,176 2005-07-25 2005-07-25 Interactive games with prediction method Abandoned US20070021199A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US11/189,176 US20070021199A1 (en) 2005-07-25 2005-07-25 Interactive games with prediction method
US11/349,431 US20070021207A1 (en) 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method
US12/798,335 US20110256914A1 (en) 2005-07-25 2010-04-02 Interactive games with prediction and plan with assisted learning method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US11/189,176 US20070021199A1 (en) 2005-07-25 2005-07-25 Interactive games with prediction method

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US11/349,431 Continuation-In-Part US20070021207A1 (en) 2005-07-25 2006-02-06 Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method
US12/798,335 Continuation-In-Part US20110256914A1 (en) 2005-07-25 2010-04-02 Interactive games with prediction and plan with assisted learning method

Publications (1)

Publication Number Publication Date
US20070021199A1 true US20070021199A1 (en) 2007-01-25

Family

ID=37679760

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/189,176 Abandoned US20070021199A1 (en) 2005-07-25 2005-07-25 Interactive games with prediction method

Country Status (1)

Country Link
US (1) US20070021199A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20070060384A1 (en) * 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US20080082311A1 (en) * 2006-09-28 2008-04-03 Microsoft Corporation Transformations for virtual guest representation
US20080113812A1 (en) * 2005-03-17 2008-05-15 Nhn Corporation Game Scrap System, Game Scrap Method, and Computer Readable Recording Medium Recording Program for Implementing the Method
US20090233769A1 (en) * 2001-03-07 2009-09-17 Timothy Pryor Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US20090280896A1 (en) * 2006-06-19 2009-11-12 Ambx Uk Limited Game enhancer
NL2004273A (en) * 2010-02-22 2010-05-19 Valeri Mischenko Embedding humans and objects in virtual reality environments.
US20100272196A1 (en) * 2009-04-28 2010-10-28 Qualcomm Incorporated Using channel estimates associated with ofdm pilot symbols to estimate additional parameter
US20100281439A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Method to Control Perspective for a Camera-Controlled Computer
US20100289912A1 (en) * 2009-05-14 2010-11-18 Sony Ericsson Mobile Communications Ab Camera arrangement with image modification
US20110175810A1 (en) * 2010-01-15 2011-07-21 Microsoft Corporation Recognizing User Intent In Motion Capture System
WO2011105983A1 (en) * 2010-02-24 2011-09-01 Eric Carnevale Online beer pong game
CN102207771A (en) * 2010-05-12 2011-10-05 微软公司 Intention deduction of users participating in motion capture system
US8538562B2 (en) 2000-03-07 2013-09-17 Motion Games, Llc Camera based interactive exercise
US8654198B2 (en) 1999-05-11 2014-02-18 Timothy R. Pryor Camera based interaction and instruction
US20140211994A1 (en) * 2013-01-30 2014-07-31 Panasonic Corporation Human detection and tracking apparatus, human detection and tracking method, and human detection and tracking program
US20140363799A1 (en) * 2013-06-06 2014-12-11 Richard Ivan Brown Mobile Application For Martial Arts Training
US20170354866A1 (en) * 2016-06-10 2017-12-14 Nintendo Co., Ltd. Non-transitory storage medium having game program stored therein, information processing apparatus, information processing system, game processing method
US10382672B2 (en) 2015-07-14 2019-08-13 Samsung Electronics Co., Ltd. Image capturing apparatus and method
US10537815B2 (en) * 2012-07-16 2020-01-21 Shmuel Ur System and method for social dancing
US10918949B2 (en) * 2019-07-01 2021-02-16 Disney Enterprises, Inc. Systems and methods to provide a sports-based interactive experience
US11103787B1 (en) 2010-06-24 2021-08-31 Gregory S. Rabin System and method for generating a synthetic video stream
CN116688494A (en) * 2023-08-04 2023-09-05 荣耀终端有限公司 Method and electronic device for generating game prediction frame

Patent Citations (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4375674A (en) * 1980-10-17 1983-03-01 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Kinesimetric method and apparatus
US4503506A (en) * 1981-08-05 1985-03-05 Westinghouse Electric Corp. Apparatus for mapping and identifying an element within a field of elements
US4542291A (en) * 1982-09-29 1985-09-17 Vpl Research Inc. Optical flex sensor
US4563617A (en) * 1983-01-10 1986-01-07 Davidson Allen S Flat panel television/display
US4843568A (en) * 1986-04-11 1989-06-27 Krueger Myron W Real time perception of and response to the actions of an unencumbered participant/user
USRE33559E (en) * 1986-11-13 1991-03-26 James Fallacaro System for enhancing audio and/or visual presentation
US4736097A (en) * 1987-02-02 1988-04-05 Harald Philipp Optical motion sensor
US4988981A (en) * 1987-03-17 1991-01-29 Vpl Research, Inc. Computer data entry and manipulation apparatus and method
US4988981B1 (en) * 1987-03-17 1999-05-18 Vpl Newco Inc Computer data entry and manipulation apparatus and method
US4817950A (en) * 1987-05-08 1989-04-04 Goo Paul E Video game control unit and attitude sensor
US5045687A (en) * 1988-05-11 1991-09-03 Asaf Gurner Optical instrument with tone signal generating means
US5288078A (en) * 1988-10-14 1994-02-22 David G. Capper Control interface apparatus
US4925189A (en) * 1989-01-13 1990-05-15 Braeunig Thomas F Body-mounted video game exercise device
US5229756A (en) * 1989-02-07 1993-07-20 Yamaha Corporation Image control apparatus
US5130794A (en) * 1990-03-29 1992-07-14 Ritchey Kurtis J Panoramic display system
US5177872A (en) * 1990-10-05 1993-01-12 Texas Instruments Incorporated Method and apparatus for monitoring physical positioning of a user
US5185561A (en) * 1991-07-23 1993-02-09 Digital Equipment Corporation Torque motor as a tactile feedback device in a computer system
US5414256A (en) * 1991-10-15 1995-05-09 Interactive Light, Inc. Apparatus for and method of controlling a device by sensing radiation having an emission space and a sensing space
US5442168A (en) * 1991-10-15 1995-08-15 Interactive Light, Inc. Dynamically-activated optical instrument for producing control signals having a self-calibration means
US5495576A (en) * 1993-01-11 1996-02-27 Ritchey; Kurtis J. Panoramic image based virtual reality/telepresence audio-visual system and method
US5317489A (en) * 1993-09-22 1994-05-31 Sal Delli Gatti Illuminated apparatus for playing a game of horseshoes
US5490784A (en) * 1993-10-29 1996-02-13 Carmein; David E. E. Virtual reality system with enhanced sensory apparatus
US5616078A (en) * 1993-12-28 1997-04-01 Konami Co., Ltd. Motion-controlled video entertainment system
US5577981A (en) * 1994-01-19 1996-11-26 Jarvik; Robert Virtual reality exercise machine and computer controlled video system
US5412554A (en) * 1994-04-21 1995-05-02 Lee; Deng-Ran Compound lamp shade frame
US5524637A (en) * 1994-06-29 1996-06-11 Erickson; Jon W. Interactive system for measuring physiological exertion
US5583795A (en) * 1995-03-17 1996-12-10 The United States Of America As Represented By The Secretary Of The Army Apparatus for measuring eye gaze and fixation duration, and method therefor
US5913727A (en) * 1995-06-02 1999-06-22 Ahdoot; Ned Interactive movement and contact simulation game
US20020155896A1 (en) * 2001-02-14 2002-10-24 William Gobush Launch monitor system and a method for use thereof
US20030215130A1 (en) * 2002-02-12 2003-11-20 The University Of Tokyo Method of processing passive optical motion capture data

Cited By (38)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8654198B2 (en) 1999-05-11 2014-02-18 Timothy R. Pryor Camera based interaction and instruction
US8538562B2 (en) 2000-03-07 2013-09-17 Motion Games, Llc Camera based interactive exercise
US20090233769A1 (en) * 2001-03-07 2009-09-17 Timothy Pryor Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US8892219B2 (en) 2001-03-07 2014-11-18 Motion Games, Llc Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US8306635B2 (en) * 2001-03-07 2012-11-06 Motion Games, Llc Motivation and enhancement of physical and mental exercise, rehabilitation, health and social interaction
US20080113812A1 (en) * 2005-03-17 2008-05-15 Nhn Corporation Game Scrap System, Game Scrap Method, and Computer Readable Recording Medium Recording Program for Implementing the Method
US10773166B2 (en) 2005-03-17 2020-09-15 Nhn Entertainment Corporation Game scrapbook system, game scrapbook method, and computer readable recording medium recording program for implementing the method
US9242173B2 (en) * 2005-03-17 2016-01-26 Nhn Entertainment Corporation Game scrapbook system, game scrapbook method, and computer readable recording medium recording program for implementing the method
US9561441B2 (en) 2005-09-14 2017-02-07 Nintendo Co., Ltd. Storage medium storing video game program for calculating a distance between a game controller and a reference
US20080318692A1 (en) * 2005-09-14 2008-12-25 Nintendo Co., Ltd. Storage medium storing video game program for calculating a distance between a game controller and a reference
US20070060384A1 (en) * 2005-09-14 2007-03-15 Nintendo Co., Ltd. Storage medium storing video game program
US20090280896A1 (en) * 2006-06-19 2009-11-12 Ambx Uk Limited Game enhancer
US8376844B2 (en) * 2006-06-19 2013-02-19 Ambx Uk Limited Game enhancer
US9746912B2 (en) * 2006-09-28 2017-08-29 Microsoft Technology Licensing, Llc Transformations for virtual guest representation
US20080082311A1 (en) * 2006-09-28 2008-04-03 Microsoft Corporation Transformations for virtual guest representation
US20100272196A1 (en) * 2009-04-28 2010-10-28 Qualcomm Incorporated Using channel estimates associated with ofdm pilot symbols to estimate additional parameter
US8649554B2 (en) * 2009-05-01 2014-02-11 Microsoft Corporation Method to control perspective for a camera-controlled computer
US9910509B2 (en) 2009-05-01 2018-03-06 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US9524024B2 (en) 2009-05-01 2016-12-20 Microsoft Technology Licensing, Llc Method to control perspective for a camera-controlled computer
US20100281439A1 (en) * 2009-05-01 2010-11-04 Microsoft Corporation Method to Control Perspective for a Camera-Controlled Computer
US20100289912A1 (en) * 2009-05-14 2010-11-18 Sony Ericsson Mobile Communications Ab Camera arrangement with image modification
US9195305B2 (en) 2010-01-15 2015-11-24 Microsoft Technology Licensing, Llc Recognizing user intent in motion capture system
US20110175810A1 (en) * 2010-01-15 2011-07-21 Microsoft Corporation Recognizing User Intent In Motion Capture System
US8334842B2 (en) 2010-01-15 2012-12-18 Microsoft Corporation Recognizing user intent in motion capture system
NL2004273A (en) * 2010-02-22 2010-05-19 Valeri Mischenko Embedding humans and objects in virtual reality environments.
WO2011105983A1 (en) * 2010-02-24 2011-09-01 Eric Carnevale Online beer pong game
US20110279368A1 (en) * 2010-05-12 2011-11-17 Microsoft Corporation Inferring user intent to engage a motion capture system
CN102207771A (en) * 2010-05-12 2011-10-05 微软公司 Intention deduction of users participating in motion capture system
US11103787B1 (en) 2010-06-24 2021-08-31 Gregory S. Rabin System and method for generating a synthetic video stream
US10537815B2 (en) * 2012-07-16 2020-01-21 Shmuel Ur System and method for social dancing
US9349042B2 (en) * 2013-01-30 2016-05-24 Panasonic Corporation Human detection and tracking apparatus, human detection and tracking method, and human detection and tracking program
US20140211994A1 (en) * 2013-01-30 2014-07-31 Panasonic Corporation Human detection and tracking apparatus, human detection and tracking method, and human detection and tracking program
US20140363799A1 (en) * 2013-06-06 2014-12-11 Richard Ivan Brown Mobile Application For Martial Arts Training
US10382672B2 (en) 2015-07-14 2019-08-13 Samsung Electronics Co., Ltd. Image capturing apparatus and method
US20170354866A1 (en) * 2016-06-10 2017-12-14 Nintendo Co., Ltd. Non-transitory storage medium having game program stored therein, information processing apparatus, information processing system, game processing method
US10653947B2 (en) * 2016-06-10 2020-05-19 Nintendo Co., Ltd. Non-transitory storage medium having game program stored therein, information processing apparatus, information processing system, game processing method
US10918949B2 (en) * 2019-07-01 2021-02-16 Disney Enterprises, Inc. Systems and methods to provide a sports-based interactive experience
CN116688494A (en) * 2023-08-04 2023-09-05 荣耀终端有限公司 Method and electronic device for generating game prediction frame

Similar Documents

Publication Publication Date Title
US20070021199A1 (en) Interactive games with prediction method
US20070021207A1 (en) Interactive combat game between a real player and a projected image of a computer generated player or a real player with a predictive method
US10821347B2 (en) Virtual reality sports training systems and methods
Miles et al. A review of virtual environments for training in ball sports
US5913727A (en) Interactive movement and contact simulation game
US6951515B2 (en) Game apparatus for mixed reality space, image processing method thereof, and program storage medium
US20110256914A1 (en) Interactive games with prediction and plan with assisted learning method
US11826628B2 (en) Virtual reality sports training systems and methods
CN103959094B (en) For the system and method for synkinesia training
KR101007947B1 (en) System and method for cyber training of martial art on network
JP2000033184A (en) Whole body action input type game and event device
Vignais et al. Virtual thrower versus real goalkeeper: the influence of different visual conditions on performance
Sato et al. Augmented recreational volleyball court: Supporting the beginners' landing position prediction skill by providing peripheral visual feedback
KR20010008367A (en) Pitching practice apparatus, pitching analysis method with the same, and method of performing on-line/off-line based baseball game by using pitching information from the same
TWI423114B (en) Interactive device and operating method thereof
JP2002248187A (en) Goal achievement system of sports such as golf practice and golf practice device
Kulpa et al. Displacements in Virtual Reality for sports performance analysis
Dabnichki Computers in sport
KR101032813B1 (en) Apparatus and method for cyber sparring of martial art and the recording medium
CN111672089B (en) Electronic scoring system for multi-person confrontation type project and implementation method
US20210187374A1 (en) Augmented extended realm system
US20230398427A1 (en) Mixed reality simulation and training system
Katz et al. Virtual reality
JP7248353B1 (en) Hitting analysis system and hitting analysis method
US20220288457A1 (en) Alternate reality system for a ball sport

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION