US20120108305A1 - Data generation device, control method for a data generation device, and non-transitory information storage medium - Google Patents

Data generation device, control method for a data generation device, and non-transitory information storage medium

Info

Publication number
US20120108305A1
Authority
United States
Prior art keywords
data
time point
player
music track
exemplary model
Legal status: Abandoned (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Application number
US13/283,994
Inventor
Masato Akiyama
Kidai Suzuki
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Konami Digital Entertainment Co Ltd
Original Assignee
Konami Digital Entertainment Co Ltd
Application filed by Konami Digital Entertainment Co Ltd filed Critical Konami Digital Entertainment Co Ltd
Assigned to KONAMI DIGITAL ENTERTAINMENT CO., LTD. Assignment of assignors interest (see document for details). Assignors: AKIYAMA, MASATO; SUZUKI, KIDAI
Publication of US20120108305A1

Classifications

    • A - HUMAN NECESSITIES
    • A63 - SPORTS; GAMES; AMUSEMENTS
    • A63F - CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 - Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/213 - Input arrangements characterised by their sensors, purposes or types, comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A63F 13/428 - Processing input control signals by mapping them into game commands, involving motion or position input signals, e.g. signals representing the rotation of an input controller or a player's arm motions sensed by accelerometers or gyroscopes
    • A63F 13/44 - Processing input control signals involving timing of operations, e.g. performing an action within a time slot
    • A63F 13/67 - Generating or modifying game content adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use
    • A63F 13/814 - Special adaptations for a specific game genre or mode: musical performances, e.g. by evaluating the player's ability to follow a notation
    • A63F 2300/1087 - Input arrangements for converting player-generated signals into game device control signals, comprising photodetecting means, e.g. a camera
    • A63F 2300/6009 - Methods for importing or creating game content, e.g. authoring tools during game development, adapting content to different platforms, use of a scripting language to create content
    • A63F 2300/6027 - Methods using adaptive systems learning from user actions, e.g. for skill level adjustment
    • A63F 2300/638 - Methods for controlling the execution of the game in time according to the timing of operation or a time limit
    • A63F 2300/66 - Methods for rendering three dimensional images
    • A63F 2300/8047 - Features specially adapted for executing a specific type of game: music games

Definitions

  • the present invention relates to a data generation device, a control method for a data generation device, and a non-transitory information storage medium.
  • JP 3866474 B2; JP 2001-224730 A
  • JP 2001-224730 A describes a game in which the player aims to perform the same action as a character dancing in tune with a music track, while the action of the player is analyzed based on a photographed image of the player.
  • in such a game, the action of the player is evaluated based on reference data obtained by associating information indicating a time point at which a game device determines the action of the player with information for identifying the action supposed to be performed by the player at that time point.
  • the present invention has been made in view of the above-mentioned problem, and an object thereof is to provide a data generation device, a control method for a data generation device, and a non-transitory information storage medium therefor, which are capable of efficiently generating reference data used in a game for evaluating an action of a player.
  • a data generation device according to the present invention is a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data, the reference data being obtained by associating information relating to a reference time point set within the reproduction period of the music track data with information relating to a reference position in which a body part of the player is to be placed at the reference time point,
  • the data generation device including: exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; display means for displaying a setting screen for generating the reference data; specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen; display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • a control method according to the present invention is a control method for a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data, the reference data being obtained by associating information relating to a reference time point set within the reproduction period of the music track data with information relating to a reference position in which a body part of the player is to be placed at the reference time point,
  • the control method including: an exemplary model data acquiring step of acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; a display step of displaying a setting screen for generating the reference data; a specification receiving step of receiving a specification of a time point within the reproduction period of the music track data through the setting screen; a display control step of displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and a reference data generating step of generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • a program according to the present invention is a program for causing a computer to function as a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data, the reference data being obtained by associating information relating to a reference time point set within the reproduction period of the music track data with information relating to a reference position in which a body part of the player is to be placed at the reference time point,
  • the data generation device including: exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; display means for displaying a setting screen for generating the reference data; specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen; display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • the information storage medium according to the present invention is a non-transitory computer-readable information storage medium having the above-mentioned program recorded thereon.
  • the data generation device further includes: means for reproducing the music track data; and means for stopping reproduction of the music track data in a case where a predetermined reproduction stopping operation is performed, in which: the display control means changes, in a case where the music track data is being reproduced, the exemplary model posture shown by the exemplary model image on the setting screen so as to be synchronized with the reproduction of the music track data; the display control means displays, in a case where the reproduction of the music track data is stopped, the exemplary model image at a time point at which the reproduction of the music track data is stopped on the setting screen; and the specification receiving means acquires one of a time point earlier than the time point at which the reproduction of the music track data is stopped and a time point later than the time point at which the reproduction of the music track data is stopped as the specified time point based on a predetermined time point specifying operation.
  • the data generation device further includes means for reproducing the music track data in a case where a predetermined setting content confirmation operation is performed on the setting screen after the reference data is generated by the reference data generating means, in which the display control means includes means for changing the exemplary model posture shown by the exemplary model image on the setting screen in synchronization with reproduction of the music track data in a case where the predetermined setting content confirmation operation is performed after the reference data is generated by the reference data generating means, and showing the reference time point and the reference position based on the reference data in synchronization with the reproduction of the music track data.
  • the display control means displays a time axis of the reproduction period of the music track data on the setting screen
  • the specification receiving means receives the specification of the time point within the reproduction period of the music track data based on the time axis displayed on the setting screen.
  • the display control means sets a plurality of areas corresponding to a plurality of body parts of the player so as to extend in a direction of the time axis displayed on the setting screen in parallel with one another in a direction perpendicular to the time axis
  • the specification receiving means receives a specification of a position within any one of the plurality of areas set on the setting screen
  • the reference data generating means generates the reference data based on the time point corresponding to the specified position within the any one of the plurality of areas and the position of the body part corresponding to the specified position within the any one of the plurality of areas.
  • the time axis indicates the reproduction period of the music track data for each of predetermined beats of the music track data
  • the specification receiving means receives the specification of the time point within the reproduction period of the music track data for each of the predetermined beats.
  • FIG. 1 is a diagram illustrating how a player plays a game
  • FIG. 2 is a diagram illustrating an example of a game screen
  • FIG. 3 is a diagram illustrating an example of a setting screen
  • FIG. 4 is a diagram illustrating an example of a photographed image generated by a CCD camera
  • FIG. 5 is a diagram for describing a method of measuring a depth of the player, which is performed by an infrared sensor;
  • FIG. 6 is a diagram illustrating an example of a depth image acquired by the infrared sensor
  • FIG. 7 is a diagram illustrating an example of player position information generated by a position detecting device
  • FIG. 8 is a diagram illustrating a position of the player, which is identified by the player position information
  • FIG. 9 is a diagram illustrating a hardware configuration of the position detecting device
  • FIG. 10 is a diagram illustrating a hardware configuration of a game device
  • FIG. 11 is a functional block diagram illustrating functions implemented on the game device
  • FIG. 12 is a diagram illustrating a game space
  • FIG. 13 is a diagram illustrating an example of setting data
  • FIG. 14 is a diagram illustrating types of a determination method
  • FIG. 15 is a diagram illustrating a relation between the setting data and a setting indication image
  • FIG. 16 is a flowchart illustrating an example of processing executed on the game device
  • FIG. 17 is a diagram illustrating an example of a setting screen according to Modified Example (1).
  • FIG. 18 is a diagram illustrating an example of a setting screen according to Modified Example (2).
  • a data generation device is implemented by, for example, a consumer game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer.
  • a game device is used to generate reference data used in order to evaluate an action (posture) of a player. Further, the game device executes a game based on the reference data.
  • FIG. 1 is a diagram illustrating how the player plays a game.
  • a player 100 is positioned, for example, in front of a position detecting device 1 .
  • the position detecting device 1 is structured by including, for example, a camera and an infrared sensor, and generates three-dimensional coordinates corresponding to a body part of the player 100 .
  • a game device 20 executes the game based on the three-dimensional coordinates.
  • the player 100 aims, for example, to dance in time with a dance performed by a character displayed on a game screen, in tune with a music track.
  • FIG. 2 is a diagram illustrating an example of the game screen.
  • a game screen 30 includes a character 32 that dances in tune with a music track and a score 34 indicating a score of the player.
  • the character 32 is displayed based on, for example, data created by performing motion capturing processing on a picture obtained by photographing a dance performed by a dancer.
  • a mark 36 is displayed on the game screen 30 .
  • the player plays a game with the help of the mark 36 .
  • the mark 36 is an image for showing the player a position and a time at which the character 32 is to move its body. For example, when the hand of the character 32 touches the center of the mark 36 , if the player is performing the same action, the player can obtain an excellent evaluation.
  • the player plays the game while facing the character 32 displayed on the game screen 30 , and hence the game screen 30 displays the character 32 so that the character 32 performs a dance that is left-right reversed to the dance supposed to be performed by the player. For example, when the character 32 raises its hand on the right side when viewed from the player (its hand on the left side when viewed from the character 32 ), if the player raises their right hand, the player can obtain an excellent evaluation.
  • the player aims to dance in the same manner as the character 32 .
  • a person who performs setting (for example, the player or a game producer) generates the reference data on a setting screen. Hereinafter, description is given of a case where the person who performs setting is the player.
  • FIG. 3 is a diagram illustrating an example of the setting screen.
  • a setting screen 70 includes the character 32 .
  • the character 32 is displayed based on motion data.
  • the character 32 displayed on the setting screen 70 performs the same action as the character 32 displayed on the game screen 30 ( FIG. 2 ).
  • a setting indication image 72 indicating contents set in the reference data is displayed.
  • the player performs the setting and generation of the reference data with the help of display contents of the setting indication image 72 .
  • the music track and the dance of the character 32 are configured to be synchronously reproduced.
  • the position detecting device 1 generates player position information relating to a position of the player in a three-dimensional space.
  • position information on the player includes information relating to positions of a plurality of body parts of the player 100 .
  • the body parts of the player 100 include, for example, a head and both arms.
  • the position detecting device 1 includes, for example, a CCD camera 2 , an infrared sensor 3 , and a microphone 4 including a plurality of microphones.
  • the CCD camera 2 is a known camera including a CCD image sensor.
  • the CCD camera 2 generates a photographed image (for example, RGB digital image) by photographing the player 100 at predetermined time intervals (for example, every 1/60th of a second).
  • FIG. 4 is a diagram illustrating an example of the photographed image generated by the CCD camera 2 .
  • the photographed image includes, for example, the player 100 .
  • in the photographed image, there are set an Xs-axis and a Ys-axis, which are orthogonal to each other.
  • the upper left corner of the photographed image is set as an origin Os (0,0).
  • the lower right corner of the photographed image is set as coordinates Pmax (Xmax, Ymax).
  • the position of each pixel in the photographed image is identified by two-dimensional coordinates (Xs-Ys coordinates) that are assigned to each pixel.
  • the infrared sensor 3 is formed of, for example, an infrared emitting device and an infrared receiving device (for example, infrared diodes).
  • the infrared sensor 3 detects reflected light obtained by emitting infrared light.
  • the infrared sensor 3 measures the depth of a subject (for example, player 100 ) based on a detection result of the reflected light.
  • the depth of a subject is a distance between a measurement reference position and the position of the subject.
  • the measurement reference position is a position that serves as a reference in measuring the depth (perspective) of the position of the player 100 .
  • the measurement reference position may be a predetermined position associated with the position of the position detecting device 1 , such as the position of the infrared receiving device of the infrared sensor 3 .
  • the infrared sensor 3 measures the depth of the player 100 based, for example, on a time of flight (TOF), which is a time required for the infrared sensor 3 to receive reflected light after emitting infrared light.
  • FIG. 5 is a diagram for describing a method of measuring the depth of the player 100 , which is performed by the infrared sensor 3 .
  • the infrared sensor 3 emits pulsed infrared light at predetermined intervals.
  • the infrared light emitted from the infrared sensor 3 spreads spherically with an emission position of the infrared sensor 3 at the center.
  • the infrared light emitted from the infrared sensor 3 strikes, for example, surfaces of the body of the player 100 .
  • the infrared light that has struck those surfaces is reflected.
  • the reflected infrared light is detected by the infrared receiving device of the infrared sensor 3 .
  • the infrared sensor 3 detects reflected light having a phase shifted by 180° from that of the emitted infrared light.
  • the TOF of the infrared light reflected by both hands of the player 100 is shorter than the TOF of the infrared light reflected by the torso of the player 100 .
  • the value determined as follows corresponds to the distance between the measurement reference position and the player 100 (that is, depth). Specifically, the value is determined by multiplying a time required for the infrared sensor 3 to detect the reflected light after emitting the infrared light (that is, TOF) by the speed of the infrared light and then dividing the resultant value by two. In this manner, the infrared sensor 3 can measure the depth of the player 100 .
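The arithmetic described in the item above is simple enough to state concretely. The following is a minimal sketch; the function and constant names are illustrative, since the patent specifies only the formula, not an implementation:

```python
# Depth = (time of flight x speed of light) / 2, per the description above.
# Names are illustrative, not from the patent.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0

def depth_from_tof(tof_seconds: float) -> float:
    """Distance from the measurement reference position to the subject,
    given the round-trip time of flight of the infrared pulse."""
    return tof_seconds * SPEED_OF_LIGHT_M_PER_S / 2.0

# Example: a reflection detected ~13.34 nanoseconds after emission
# corresponds to a subject roughly 2 meters away.
print(depth_from_tof(13.34e-9))  # ~2.0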
  • the infrared sensor 3 also detects an outline of a subject (player 100 ) by detecting depth differences acquired from the reflected infrared light.
  • the fact that the infrared sensor 3 receives the reflected infrared light as described above means that an object is located at that place. Further, if there is no other object located behind the object in the vicinity, the depth difference between the object and the surroundings of the object is large.
  • the infrared sensor 3 detects the outline of the player 100 by joining portions having the depth differences larger than a predetermined value.
  • the method of detecting the outline of the player 100 is not limited to the above-mentioned example.
  • the outline may be detected based on the brightness of each pixel of the photographed image acquired by the CCD camera 2 .
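One plausible reading of the outline detection above (joining portions whose depth differences exceed a predetermined value) is a simple neighbour-difference test over the depth image. The sketch below assumes the depth image is a NumPy array; the patent does not prescribe this particular method:

```python
import numpy as np

def detect_outline(depth: np.ndarray, threshold: float) -> np.ndarray:
    """Mark a pixel as part of the outline when its depth differs from a
    horizontal or vertical neighbour by more than `threshold`."""
    outline = np.zeros(depth.shape, dtype=bool)
    dx = np.abs(np.diff(depth, axis=1))  # differences between horizontal neighbours
    dy = np.abs(np.diff(depth, axis=0))  # differences between vertical neighbours
    outline[:, 1:] |= dx > threshold
    outline[1:, :] |= dy > threshold
    return outline
```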
  • information relating to the depth of the player 100 (depth information), which is detected as described above, is expressed as, for example, a depth image.
  • in this embodiment, the depth information is expressed as a gray-scale depth image (for example, 256-level (8-bit) gray-scale image data).
  • FIG. 6 is a diagram illustrating an example of the depth image acquired by the infrared sensor 3 .
  • an object located close to the infrared sensor 3 is expressed as bright (brightness is high), and an object located far from the infrared sensor 3 is expressed as dark (brightness is low).
  • the depth of the player 100 corresponds to the brightness (pixel value) of the depth image.
  • for example, in a case where the depth image is expressed as 256-level gray-scale image data and the brightness of the depth image changes by one gradation for every 2 cm of depth, the infrared sensor 3 is capable of detecting the depth of the subject in units of 2 cm, as the sketch below illustrates.
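A minimal sketch of one such encoding, assuming brightness level 255 is nearest and each gradation spans 2 cm (the patent fixes only the 256 levels, the near-is-bright convention, and the 2 cm unit):

```python
def depth_to_brightness(depth_cm: float, min_depth_cm: float = 0.0,
                        step_cm: float = 2.0) -> int:
    """Encode a depth as one of 256 gray levels; nearer objects are brighter."""
    level = int((depth_cm - min_depth_cm) / step_cm)
    return max(0, min(255, 255 - level))

def brightness_to_depth(brightness: int, min_depth_cm: float = 0.0,
                        step_cm: float = 2.0) -> float:
    """Recover the depth (to 2 cm resolution) from a gray level."""
    return min_depth_cm + (255 - brightness) * step_cm
```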
  • pixels corresponding to both hands of the player 100 are expressed as brighter (brightness is higher) than pixels corresponding to the torso.
  • similarly to the CCD camera 2, the infrared sensor 3 generates the depth image at predetermined time intervals (for example, every 1/60th of a second). Based on the photographed image acquired by the CCD camera 2 and the depth image acquired by the infrared sensor 3, the player position information relating to the positions of the body parts of the player 100 is generated.
  • the player position information is generated based on RGBD data, that is, a composite image obtained by adding the depth information (D: depth) indicated by the depth image to the photographed image (RGB data) acquired by the CCD camera 2.
  • the composite image contains, for each pixel, color information (lightness of each of R, G, and B) and the depth information.
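Concretely, the composite can be pictured as a four-channel image. A sketch assuming NumPy arrays; the patent describes only the per-pixel pairing of colour and depth, not a storage format:

```python
import numpy as np

def make_rgbd(rgb: np.ndarray, depth: np.ndarray) -> np.ndarray:
    """Stack an H x W x 3 RGB image with an H x W depth image into an
    H x W x 4 RGBD array, so every pixel carries colour and depth."""
    assert rgb.shape[:2] == depth.shape, "images must share the same resolution"
    return np.dstack([rgb, depth])

# rgbd[ys, xs] -> (R, G, B, D) for the pixel at screen coordinates (xs, ys)
```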
  • the color information (lightness of R, G, and B) of pixels enclosed within the outline is referred to.
  • pixels corresponding to each part of the body of the player 100 are identified.
  • a known method is applicable, such as a pattern matching method in which the object (that is, each part of the body of the player 100 ) is extracted from the image through a comparison with a comparison image (training image).
  • the three-dimensional coordinates are generated by carrying out predetermined matrix transformation processing on those pixel values.
  • the matrix transformation processing is executed through, for example, a matrix operation similar to the transformation processing performed in 3D graphics between two coordinate systems, namely a world coordinate system and a screen coordinate system. Specifically, the RGB value indicating the color information of the pixel and the D value indicating the perspective are substituted into a predetermined transformation matrix, to thereby calculate the three-dimensional coordinates of the pixel.
  • the method of calculating the three-dimensional coordinate that corresponds to a pixel based on the pixel value a known method may be applied, and the calculation method is not limited to the above-mentioned example.
  • the coordinate transformation may be performed using a lookup table.
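The patent leaves the actual matrix (or lookup table) unspecified. One standard concrete choice for mapping a pixel plus its measured depth to three-dimensional coordinates is the pinhole-camera back-projection sketched below; this is an illustration, not the patented transformation:

```python
import numpy as np

def pixel_to_3d(xs: float, ys: float, depth: float,
                fx: float, fy: float, cx: float, cy: float) -> np.ndarray:
    """Back-project screen coordinates (xs, ys) with a measured depth into
    3D coordinates using pinhole intrinsics (focal lengths fx, fy and
    principal point cx, cy); all parameter names are assumptions."""
    x = (xs - cx) * depth / fx
    y = (ys - cy) * depth / fy
    return np.array([x, y, depth])
```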
  • FIG. 7 is a diagram illustrating an example of the player position information generated by the position detecting device 1 .
  • the player position information includes a plurality of pieces of information relating to positions of a plurality of body parts of the player 100 .
  • in the player position information, for example, each body part of the player 100 and its three-dimensional coordinates are stored in association with each other.
  • FIG. 8 is a diagram illustrating the position of the player 100 , which is identified by the player position information.
  • a predetermined position corresponding to the position detecting device 1 (for example, the measurement reference position) is set as an origin Op.
  • the origin Op represents the three-dimensional coordinates corresponding to the measurement reference position of the infrared sensor 3 .
  • the position of the origin Op may be set anywhere in the three-dimensional space in which the player 100 exists.
  • the three-dimensional coordinates corresponding to the origin Os of the photographed image may be set as the origin Op.
  • the player position information includes body part information relating to the positions of, at least, the head and the waist from among the plurality of body parts of the player 100 .
  • the player position information includes eleven sets of three-dimensional coordinates corresponding to the head P1, shoulders P2, left upper arm P3, right upper arm P4, left lower arm P5, right lower arm P6, back P7, left thigh P8, right thigh P9, left shin P10, and right shin P11 of the player 100.
  • the part of the body of the player 100 which is indicated by the player position information, may be a part that is determined in advance from the player's body (skeletal frame).
  • any part of the body may be used as long as the part is identifiable by the above-mentioned pattern matching method.
  • the player position information generated every predetermined time intervals (for example, every 1/60th of a second) is transmitted from the position detecting device 1 to the game device 20 .
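As an illustration of the data that crosses the wire every 1/60th of a second, the player position information can be modelled as a timestamped map from body-part names to three-dimensional coordinates. All field names here are assumptions:

```python
from dataclasses import dataclass
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class PlayerPositionInfo:
    """One frame of player position information (illustrative structure)."""
    elapsed_s: float        # time since tracking started, in seconds
    parts: Dict[str, Vec3]  # body part -> three-dimensional coordinates

frame = PlayerPositionInfo(
    elapsed_s=0.0,
    parts={"head": (0.0, 1.6, 2.0), "back": (0.0, 1.0, 2.0)},  # P1, P7, ...
)
```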
  • the game device 20 executes the game by receiving the player position information from the position detecting device 1 and grasping the movement of the body of the player (hereinafter, reference numeral “100” of the player is omitted).
  • FIG. 9 is a diagram illustrating the hardware configuration of the position detecting device 1 .
  • the position detecting device 1 includes a control unit 10 , a storage unit 11 , a photographing unit 12 , a depth measuring unit 13 , an audio input unit 14 , and a communication interface unit 15 .
  • the respective components of the position detecting device 1 are connected to one another by a bus 16 so as to be able to exchange data thereamong.
  • the control unit 10 controls the respective units of the position detecting device 1 according to an operating system and various kinds of programs which are stored in the storage unit 11 .
  • the storage unit 11 stores programs and various kinds of parameters which are used for operating the operating system, the photographing unit 12 , and the depth measuring unit 13 . Further, the storage unit 11 stores a program for generating the player position information based on the photographed image and the depth image.
  • the photographing unit 12 includes the CCD camera 2 and the like.
  • the photographing unit 12 generates, for example, the photographed image of the player 100 .
  • the depth measuring unit 13 includes the infrared sensor 3 and the like.
  • the depth measuring unit 13 generates the depth image based, for example, on the TOF acquired by the infrared sensor 3 .
  • the control unit 10 generates the player position information based on the photographed image generated by the photographing unit 12 and the depth image generated by the depth measuring unit 13 at predetermined time intervals (for example, every 1/60th of a second).
  • the audio input unit 14 includes, for example, the microphone 4 .
  • the communication interface unit 15 is an interface for transmitting various kinds of data, such as the player position information, to the game device 20 .
  • FIG. 10 is a diagram illustrating the hardware configuration of the game device 20 .
  • the game device 20 includes a control unit 21 , a main storage unit 22 , an auxiliary storage unit 23 , an optical disc reproducing unit 24 , a communication interface unit 25 , an operation unit 26 , a display unit 27 , and an audio output unit 28 .
  • the respective components of the game device 20 are connected to one another by a bus 29 .
  • the control unit 21 includes, for example, a CPU, a graphics processing unit (GPU), and a sound processing unit (SPU).
  • the control unit 21 executes various kinds of processing according to an operating system and other programs.
  • the main storage unit 22 includes, for example, a random access memory (RAM).
  • the auxiliary storage unit 23 includes, for example, a hard disk drive (non-transitory information storage medium).
  • the main storage unit 22 stores programs and data read from the auxiliary storage unit 23 or an optical disc (non-transitory information storage medium). Further, the main storage unit 22 is also used as a work memory for storing data to be required in the course of the processing.
  • the optical disc reproducing unit 24 reads programs and data stored on the optical disc. For example, a game program is stored on the optical disc.
  • the communication interface unit 25 is an interface for communicatively connecting the game device 20 to a communication network.
  • the operation unit 26 is used by the player to perform an operation.
  • the operation unit 26 includes, for example, a game controller including a cross button and various kinds of buttons, a touch panel, a mouse, or a keyboard.
  • the display unit 27 is, for example, a consumer television set or a liquid crystal display panel.
  • the display unit 27 displays, for example, the setting screen 70 for generating the reference data.
  • the audio output unit 28 includes, for example, a speaker or headphones.
  • FIG. 11 is a functional block diagram illustrating functions implemented on the game device 20 .
  • as illustrated in FIG. 11, on the game device 20, there are implemented a game data storage unit 40, an exemplary model data acquiring unit 42, a specification receiving unit 44, a display control unit 46, and a reference data generating unit 48.
  • Those functions are implemented by the control unit 21 operating according to programs read from the optical disc.
  • the game data storage unit 40 is mainly implemented by the main storage unit 22 and the auxiliary storage unit 23.
  • the game data storage unit 40 stores information necessary for executing the game.
  • the game data storage unit 40 stores the following data: (1) music track data (data obtained by saving general popular music or the like in a predetermined data format); (2) motion data; (3) reference data; (4) data obtained by storing the player position information in chronological order; and (5) game situation data (data indicating a situation (including score and elapsed time) of the game being executed).
  • the music track data, the motion data, and the reference data among the above-mentioned list of data are data prepared by a game creator in advance.
  • the player position information is data acquired from the position detecting device 1
  • the game situation data is data generated and updated by a game program.
  • the control unit 21 functions as means for acquiring various kinds of data stored in the game data storage unit 40 .
  • the motion data is created by a game producer, for example, based on data generated by performing motion capturing processing on the picture obtained by photographing the action of a dancer.
  • the motion data is, for example, data for identifying the position of each body part of the character 32 .
  • data indicating a posture of the character 32 is stored in the motion data in chronological order.
  • the game device 20 can perform display control so that the character 32 dances on the game screen 30 .
  • FIG. 12 is a diagram illustrating the game space. As illustrated in FIG. 12 , a character object 62 indicating the character 32 and a virtual camera 64 (viewpoint) are located in a game space 60 .
  • the character object 62 is structured by including a plurality of polygons. For example, the character object 62 is created based on the data indicating the body parts of the character 32 which is stored in the motion data.
  • the character object 62 changes so as to show the posture supposed to be adopted by the player.
  • in this embodiment, description is given of a case where the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player.
  • for example, when the player is supposed to place their right hand in a high position, the object indicating the left hand of the character 32 within the character object 62 is located in the high position.
  • the position of the body part of the character 32 is defined so that the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player.
  • Displayed on the game screen 30 is an image indicating how the game space 60 is viewed from the virtual camera 64 .
  • Information indicating the position and a line-of-sight direction of the virtual camera 64 may be a fixed value, or may vary according to the game program or the operation performed by the player.
  • the information indicating the position and the line-of-sight direction of the virtual camera 64 is stored in, for example, the game situation data. Further, the motion data and the music track data are synchronously reproduced so that the character 32 dances in tune with the music track.
  • the reference data is data obtained by associating information relating to a reference time point set within a reproduction time of the music track data with information relating to a reference position in which the body part of the player is supposed to be placed at the reference time point.
  • the reference time point represents a time point at which the game device 20 determines (evaluates) the action of the player.
  • the action of the player performed within a reproduction period of the music track data is evaluated based on the reference data.
  • the wording “within the reproduction period of the music track data” means within a period from the start point of the music track until the end point thereof.
  • in this embodiment, the reference data includes position data for determination and setting data. The position data for determination is data indicating a position in which each body part of the player is supposed to be placed at each time point after the reproduction of the music track starts. For example, the elapsed time since the reproduction of the music track started and the position (three-dimensional coordinates) in which the body part of the player is supposed to be placed at that elapsed time are stored in the position data for determination in association with each other.
  • the position data for determination is created by a game producer based on the data generated by performing the motion capturing processing on the picture obtained by photographing the action of a dancer. The position data for determination is compared with the position of the body part of the player indicated by the player position information.
  • the position data for determination is expressed by a coordinate system based on a representative point which is set in the player.
  • the three-dimensional coordinates stored in the position data for determination are expressed by the coordinate system with the representative point set as an origin thereof.
  • position coordinates of each body part of the player stored in the position data for determination indicate a positional relation between the position in which each body part of the player is supposed to be placed and the position of the representative point.
  • the representative point is set to the back P 7 .
  • the back P 7 is set as the origin
  • the three-dimensional coordinates indicating a position relative to the back P 7 are stored in the position data for determination.
  • the game device 20 expresses the player position information by the coordinate system based on the representative point.
  • the position data for determination is expressed by the coordinate system having the back P 7 set as the origin, and hence the three-dimensional coordinates included in the player position information are also expressed by the coordinate system having the back P 7 set as the origin.
  • the representative point is the back P 7
  • the representative point is not limited to the back P 7 as long as the representative point is set in the player and the character 32 .
  • the representative point may be the head P 1 or the like.
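The change of coordinate system described above amounts to subtracting the representative point from every body-part coordinate, which makes the comparison independent of where the player stands. A minimal sketch; the dictionary keys are illustrative:

```python
from typing import Dict, Tuple

Vec3 = Tuple[float, float, float]

def to_representative_frame(parts: Dict[str, Vec3],
                            representative: str = "back") -> Dict[str, Vec3]:
    """Re-express every body-part coordinate relative to the representative
    point (the back P7 in this embodiment, but any fixed part works)."""
    ox, oy, oz = parts[representative]
    return {name: (x - ox, y - oy, z - oz)
            for name, (x, y, z) in parts.items()}
```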
  • the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player, and hence a fixed relation (left-right reversed relation) is maintained between the positions of the head P1, the shoulder P2, the left upper arm P3, the right upper arm P4, the left lower arm P5, the right lower arm P6, the back P7, the left thigh P8, the right thigh P9, the left shin P10, and the right shin P11 of the player, which are indicated by the position data for determination, and the positions of a head Q1, a shoulder Q2, a left upper arm Q3, a right upper arm Q4, a left lower arm Q5, a right lower arm Q6, a back Q7, a left thigh Q8, a right thigh Q9, a left shin Q10, and a right shin Q11 of the character 32.
  • the setting data is data for identifying the reference time point and the body part of the player to be determined by the game device 20 at the reference time point.
  • the player generates the reference data by setting contents of setting data on the setting screen 70 .
  • FIG. 13 is a diagram illustrating an example of the setting data.
  • a t-axis illustrated in FIG. 13 represents a time axis.
  • the t-axis indicates the elapsed time since the reproduction of the music track started.
  • the setting data indicates the body part of the player to be determined by the game device 20 for each predetermined fraction of a bar (for example, every 1/16th of a bar).
  • FIG. 14 is a diagram illustrating types of a determination method. As illustrated in FIG. 14 , in this embodiment, the method of determining the action of the player by the game device 20 differs in type depending on the value stored in the setting data.
  • the value "0" stored in the setting data indicates that the game device 20 is not to determine the action of the player at the corresponding time point. In other words, if the values of all the body parts stored in the setting data are "0" at a given time point, that time point is not a reference time point.
  • the body part stored in the setting data whose value is any one of “1” to “7” indicates that there is a body part of the player to be determined by the game device 20 .
  • this body part is referred to as a determination subject body part.
  • a value of any one of "1" to "7" stored in the setting data indicates that the corresponding time point is a reference time point.
  • the method of determining the action of the player by the game device 20 is classified into the following seven types.
  • Ripple: the game device 20 determines whether or not the player is moving their hand in the same manner as the character 32.
  • Step: the game device 20 determines whether or not the player is moving their foot in the same manner as the character 32.
  • Pose: the game device 20 determines whether or not the player is adopting the same pose as the character 32 using their whole body.
  • Lock: the game device 20 determines whether or not the player has been maintaining their body part in the same position as the character 32 for a predetermined time period.
  • Solid: the game device 20 determines whether or not the player has been shaking their hand in the same manner as the character 32 for a predetermined time period.
  • Stream: the game device 20 determines whether or not the player has been moving their hand in the same manner as the character 32 for a predetermined time period.
  • Gesture: the game device 20 determines whether or not the player has been adopting the same pose as the character 32 using their whole body or a partial body part for a predetermined time period.
  • for example, if there is a body part whose value is any one of the above-mentioned values in the setting data, the game device 20 determines the action of that body part (head, left hand, right hand, left foot, or right foot) of the player based on the corresponding type. Further, for example, if there is a body part whose value is "3" or "7" in the setting data, the game device 20 determines the action of the whole body or a partial body part of the player based on the corresponding type. A sketch of this encoding follows the next item.
  • the type of the determination method performed by the game device 20 is not limited to the above-mentioned example as long as the type of the determination method performed by the game device 20 relates to an action based on the dance of the character 32 .
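The value scheme above can be pictured as a small grid: one row per time step (1/16th of a bar), one cell per body part, each holding 0 or a type code 1 to 7. The numeric assignment below follows the order in which the seven types are listed; the patent's FIG. 14 defines the actual mapping:

```python
from enum import IntEnum

class Determination(IntEnum):
    NONE = 0     # not a reference time point for this body part
    RIPPLE = 1
    STEP = 2
    POSE = 3
    LOCK = 4
    SOLID = 5
    STREAM = 6
    GESTURE = 7

# setting_data[step][part]: one value per body part per 1/16th of a bar.
def reference_time_steps(setting_data):
    """A step is a reference time point if any body part has a non-zero value."""
    return [i for i, row in enumerate(setting_data)
            if any(v != Determination.NONE for v in row.values())]

grid = [
    {"head": Determination.NONE, "left hand": Determination.RIPPLE},
    {"head": Determination.NONE, "left hand": Determination.NONE},
]
print(reference_time_steps(grid))  # [0]
```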
  • the exemplary model data acquiring unit 42 is implemented mainly by the control unit 21 .
  • the exemplary model data acquiring unit 42 acquires exemplary model data from means for storing the exemplary model data relating to an exemplary model posture of the player at each time point within a reproduction period of the music track data.
  • the exemplary model posture represents an action (posture) supposed to be performed (adopted) by the player, for example, the posture (dance) of the character 32 .
  • the exemplary model data acquiring unit 42 acquires the motion data stored in the game data storage unit 40 .
  • the specification receiving unit 44 is implemented mainly by the control unit 21 and the operation unit 26 .
  • the specification receiving unit 44 receives a specification of a time point within a reproduction period of the music track data through the setting screen 70 .
  • the specification receiving unit 44 receives from the player a specification of the time point (elapsed time) within the reproduction period of the music track data based on display contents of the setting screen 70 .
  • the display control unit 46 is implemented mainly by the control unit 21 .
  • the display control unit 46 displays an exemplary model image (for example, character 32 ) for showing the exemplary model posture of the player at the specified time point on the setting screen 70 based on the exemplary model data (for example, motion data).
  • the display control unit 46 changes the exemplary model posture shown by the exemplary model image (for example, character 32 ) on the setting screen 70 so as to be synchronized with the reproduction of the music track data.
  • the display control unit 46 displays the character 32 indicated by the motion data on the setting screen 70 so that the elapsed time indicated by the motion data coincides with the elapsed time indicated by the music track data.
  • the setting screen 70 displayed by the display control unit 46 includes, as illustrated in FIG. 3 , the character 32 . As described above, the character 32 is displayed based on the motion data. The character 32 displayed on the setting screen 70 performs the same action as the character 32 displayed on the game screen 30 ( FIG. 2 ).
  • the setting indication image 72 indicating contents of the setting data is displayed on the setting screen 70 .
  • the setting indication image 72 is used by the player in order to set the setting data.
  • the setting indication image 72 is displayed based on the contents of the setting data that is currently set.
  • a reference time point and the body part used for determination at the reference time point, which are stored in the setting data, are displayed on the setting indication image 72 in association with each other.
  • two axes are set in the setting indication image 72 .
  • one axis (for example, the horizontal axis) is associated with the time point within the reproduction period of the music track data, and the other axis (for example, the vertical axis) is associated with the body part of the player to be determined by the game device 20.
  • FIG. 15 is a diagram illustrating a relation between the setting data and the setting indication image 72 .
  • the display contents of the setting indication image 72 are decided based on the setting data.
  • the setting indication image 72 is displayed based on the setting data within a predetermined time including the current elapsed time.
  • the types of the determination method stored in the setting data are displayed on the setting indication image 72 so as to be distinguished by icons such as a rectangle and a circle.
  • the icons may be displayed so as to be connected to each other with regard to the determination to be performed over a predetermined period such as Stream or Solid. Further, the display control unit 46 displays a time axis t of the reproduction period of the music track data on the setting indication image 72 .
  • the horizontal axis set in the setting indication image 72 corresponds to the time axis t.
  • the display of the setting indication image 72 is updated with the lapse of time of the reproduction of the music track data. For example, with the lapse of time of the reproduction of the music track data, the setting indication image 72 is scrolled in a direction corresponding to the time axis t (for example, leftward).
  • the setting indication image 72 thus displayed allows the player to perform setting work while grasping which type of determination is to be performed at the current elapsed time in a case of playing the game in actuality.
  • the display control unit 46 sets a plurality of areas 72 a to 72 e corresponding to a plurality of body parts of the player so as to extend in the direction of the time axis t displayed on the setting screen 70 in parallel with one another in a direction perpendicular to the time axis t.
  • the long side direction of the areas 72 a to 72 e is the direction of the time axis t.
  • the plurality of areas 72 a to 72 e are arranged in parallel with one another in the short side direction of the respective areas 72 a to 72 e.
  • each of the plurality of areas 72 a to 72 e set in the setting indication image 72 corresponds to a determination subject body part.
  • the setting indication image 72 further includes a reference time point indication image 74 and a reference position indication image 76 .
  • the player sets the reference time point based on the reference time point indication image 74 , and sets a reference position based on the reference position indication image 76 .
  • the reference time point indication image 74 is used by the player in order to specify the time point within the reproduction period indicated by the time axis t displayed on the setting indication image 72 .
  • the reference time point indication image 74 moves horizontally based on the operation performed by the player. By moving the reference time point indication image 74 , the player can specify and change the time point within the reproduction period of the music track data.
  • the specification receiving unit 44 receives the specification of the time point within the reproduction period of the music track data based on the time axis t displayed on the setting screen 70 .
  • the time axis t indicates the reproduction period of the music track data for each of the predetermined beats of the music track, and hence the specification receiving unit 44 receives the specification of the time point within the reproduction period of the music track data for each of the predetermined beats of the music track.
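Receiving specifications only at predetermined beats amounts to quantising the requested time to the nearest allowed subdivision. A sketch, assuming a fixed tempo and an allowed subdivision expressed in beats (both parameters are illustrative):

```python
def snap_to_beat(elapsed_s: float, bpm: float,
                 beats_per_step: float = 0.25) -> float:
    """Quantise a specified elapsed time to the nearest allowed beat
    subdivision, so only per-beat time points can be specified."""
    step_s = 60.0 / bpm * beats_per_step  # seconds per allowed subdivision
    return round(elapsed_s / step_s) * step_s

print(snap_to_beat(12.41, bpm=120, beats_per_step=1.0))  # -> 12.5 (nearest beat)
```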
  • the display control unit 46 displays the action of the character 32 at the time point specified by the player on the setting screen 70 . For example, based on the motion data and the time point specified by the player moving the reference time point indication image 74 , the action (posture) of the character 32 at this time point is displayed on the setting screen 70 .
  • the reference position indication image 76 is used by the player in order to specify a position within any one of the plurality of areas 72 a to 72 e set on the setting screen 70 .
  • the reference position indication image 76 moves vertically based on the operation performed by the player. By moving the reference position indication image 76 , the player can specify the determination subject body part to set the reference position.
  • the specification receiving unit 44 receives a specification of the position within any one of the plurality of areas 72 a to 72 e set on the setting screen 70 .
  • the number of reference time points may be displayed in a determination method display field 78 for each of the types of the determination method performed by the game device 20 .
  • Display contents of the determination method display field 78 are decided based on the numerical values “1” to “7” that are indicated by the setting data.
  • the determination method display field 78 includes a cursor 80 indicating the type of the determination method to be set by the player. By the player performing a predetermined operation, the cursor 80 moves vertically. By moving the cursor 80 vertically, it is possible to change the type of the determination method to be set by the player.
  • the reference data generating unit 48 is implemented mainly by the control unit 21 .
  • the reference data generating unit 48 generates the reference data based on the time point specified on the setting screen 70 and the position of the body part within the exemplary model posture of the player at the specified time point. For example, the reference data generating unit 48 sets the reference time point based on the time point received by the specification receiving unit 44 , and sets the reference position based on an exemplary model action (for example, action of the character 32 ) at this time point.
  • the reference data generating unit 48 generates the reference data based on the time point corresponding to the position within the areas 72 a to 72 e which is specified on the setting screen 70 and the position of the body part corresponding to the specified position within the areas 72 a to 72 e .
  • the reference data generating unit 48 identifies the setting contents of the setting data based on the body part indicated by the reference position indication image 76 and the type of the determination method indicated by the cursor 80 .
  • For example, the setting data can be set so that the determination for Ripple is performed for the left hand of the player at the "2 and 3/4th" of a bar after the reproduction of the music track data starts.
  • In this case, the reference position becomes the three-dimensional coordinates of the left lower arm P 5 at the "2 and 3/4th" of a bar, which are stored in the position data for determination.
  • the reference data generating unit 48 generates the reference data by generating the setting data based on the time point specified based on the reference time point indication image 74 , the body part of the character 32 specified based on the reference position indication image 76 , and the type of the determination method specified by the cursor 80 .
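The following is a minimal sketch of this generation step, assuming simple illustrative data structures; the patent does not disclose a concrete format for the setting data or the position data for determination, so every name below is hypothetical:

```python
# Sketch: generating one reference-data entry from the inputs gathered on the
# setting screen 70 -- the specified time point, the specified body part, and
# the determination method type indicated by the cursor 80.

from dataclasses import dataclass

@dataclass
class ReferenceEntry:
    reference_time: float      # elapsed time within the reproduction period
    body_part: str             # determination subject body part
    method: str                # determination method type, e.g. "Ripple"
    reference_position: tuple  # 3D coordinates from the position data for determination

def generate_reference_entry(time_point, body_part, method, position_data):
    # The reference position is the position of the specified body part within
    # the exemplary model posture at the specified time point.
    return ReferenceEntry(time_point, body_part, method,
                          position_data[time_point][body_part])

# Example: Ripple determination for the left lower arm P5 at 2.75 bars elapsed.
position_data = {2.75: {"left_lower_arm_P5": (-0.35, 0.05, -0.30)}}
print(generate_reference_entry(2.75, "left_lower_arm_P5", "Ripple", position_data))
```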
  • the display contents of the setting indication image 72 and the determination method display field 78 are updated as well. Further, in a case where the player performs the setting instruction operation, the icon representing the type of the determination method which is set is displayed in the position within the setting indication image 72 which is indicated by the reference time point indication image 74 and the reference position indication image 76 .
  • FIG. 16 is a flowchart illustrating an example of processing executed on the game device 20 .
  • the processing of FIG. 16 is executed by the control unit 21 operating according to programs read from the optical disc when, for example, the setting screen 70 is displayed.
  • the control unit 21 reproduces the music track based on the music track data (S 101 ).
  • the control unit 21 starts the dance of the character 32 based on the motion data, and displays the setting screen 70 (S 102 ).
  • the character object 62 is located in the game space 60 based on the motion data.
  • an image indicating how the character object 62 looks when viewed from the virtual camera 64 is displayed on the setting screen 70 .
  • the music track data and the motion data are synchronously reproduced so that the character 32 performs a dance (dance that is left-right reversed to the dance supposed to be performed by the player) in tune with the music track.
  • the control unit 21 updates the display contents of the setting screen 70 based on the current elapsed time of the music track, and the music track data and the motion data are synchronously reproduced (S 103 ).
  • In S 103 , for example, with the passage of time, the character 32 dances in tune with the music track, and the setting indication image 72 is scrolled in the direction of the time axis t.
  • the control unit 21 determines whether or not a reproduction stopping operation has been performed (S 104 ). In S 104 , based on whether or not a predetermined button of the operation unit 26 has been depressed, it is determined whether or not the reproduction stopping operation has been performed.
  • If the reproduction stopping operation has been performed (S 104 ; Y), the control unit 21 stops the reproduction of the music track (S 105 ).
  • the control unit 21 stops the synchronous reproduction of the motion data (S 106 ).
  • the control unit 21 displays the exemplary model image (for example, character 32 ) at the time point at which the reproduction of the music track data is stopped on the setting screen 70 . Because the reproduction of the music track is stopped in S 105 , the dance of the character 32 is stopped on the setting screen 70 , and the scrolling of the setting indication image 72 is stopped as well.
  • the control unit 21 determines whether or not a movement instruction (time point specifying operation) for the reference time point indication image 74 has been performed (S 107 ). For example, it is determined whether or not a left or right button of the cross button of the operation unit 26 has been depressed.
  • If the movement instruction has been performed (S 107 ; Y), the control unit 21 moves the reference time point indication image 74 according to the movement instruction, and changes the elapsed time of the music track data (S 108 ).
  • the control unit 21 (specification receiving unit 44 ) acquires a time point earlier than the time point at which the reproduction of the music track data is stopped or a time point later than the time point at which the reproduction of the music track data is stopped as the specified time point based on a predetermined time point specifying operation.
  • the time point at which the reproduction of the music track data is stopped means the elapsed time of the music track in the case where the reproduction stopping operation is performed.
  • the current elapsed time is changed in accordance with the movement of the reference time point indication image 74 . For example, the elapsed time is moved backward by moving the reference time point indication image 74 leftward, while the elapsed time is moved forward by moving the reference time point indication image 74 rightward.
  • the control unit 21 updates the display of the character 32 based on the changed elapsed time (S 109 ).
  • In S 109 , the display of the character 32 is updated based on the motion data.
  • the time axis t indicates the elapsed time for each of the predetermined beats of the music track, and hence in S 109 , the action of the character 32 at the elapsed time specified for each of the predetermined beats is displayed on the setting screen 70 .
  • the reference time point is set based on the elapsed time specified for each of the predetermined beats.
  • the control unit 21 determines whether or not the movement instruction for the reference position indication image 76 has been performed (S 110 ). For example, it is determined whether or not an up or down button of the cross button of the operation unit 26 has been depressed.
  • If the movement instruction has been performed (S 110 ; Y), the control unit 21 moves the reference position indication image 76 (S 111 ). For example, the reference position indication image 76 moves vertically based on an operation input from the operation unit 26 .
  • the control unit 21 determines whether or not the setting instruction operation has been performed (S 112 ). If the setting instruction operation is performed (S 112 ; Y), the control unit 21 sets the reference time point based on the reference time point indication image 74 , and sets the reference position based on the body part indicated by the reference position indication image 76 (S 113 ). In other words, in S 113 , the reference data is generated by setting the contents of the setting data. Note that the position of the cursor 80 may move vertically by a predetermined operation input by the player as appropriate.
  • the control unit 21 determines whether or not a setting content confirmation operation has been input (S 114 ).
  • the setting content confirmation operation may be any operation as long as the operation is previously defined.
  • the setting content confirmation operation may be depression of a predetermined button of the operation unit 26 .
  • If the setting content confirmation operation has been input (S 114 ; Y), the processing returns to S 103 , and the synchronous reproduction of the motion data and the music track data is restarted.
  • the control unit 21 reproduces the music track data when a predetermined setting content confirmation operation is performed on the setting screen 70 after the reference data is generated.
  • the motion data is reproduced in synchronization with the reproduction of the music track data.
  • the exemplary model posture shown by the exemplary model image changes on the setting screen 70 in synchronization with the reproduction of the music track data.
  • display processing for the character 32 is performed based on the motion data with the lapse of time of the music track.
  • the reference time point and the reference position may be shown based on the reference data in synchronization with the reproduction of the music track data.
  • the reference time point and the reference position are shown by audio or an image.
  • For example, audio for informing the player of the reference position may be output from the audio output unit 28 , or the mark 36 for showing the reference position may be displayed on the setting screen 70 .
  • Those audio data and image data may be stored in the game data storage unit 40 in advance.
  • the control unit 21 determines whether or not an end condition is satisfied (S 115 ).
  • the end condition may be any condition as long as the condition is previously defined.
  • the end condition may be a condition as to whether or not the player has input an instruction to end the setting of the reference data.
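To summarize the flow of FIG. 16, the sketch below arranges the steps S101 to S115 as a single loop. Every helper name is hypothetical shorthand for the processing described above, not an API of the game device 20:

```python
# Rough sketch of the control flow of FIG. 16 (S101-S115); in the patent this
# processing runs on the control unit 21 according to the game program.

def setting_screen_loop(game):
    game.play_music()                     # S101: reproduce the music track
    game.start_motion_and_show_screen()   # S102: start the dance, show screen 70
    while True:
        game.update_screen_with_elapsed_time()   # S103: synchronous reproduction
        if game.stop_operation_performed():      # S104
            game.stop_music()                    # S105
            game.stop_motion_playback()          # S106
        if game.time_move_instruction():         # S107
            game.move_time_indicator_and_change_elapsed_time()  # S108
            game.update_character_display()      # S109
        if game.position_move_instruction():     # S110
            game.move_position_indicator()       # S111
        if game.setting_instruction():           # S112
            game.set_reference_time_and_position()  # S113: generate reference data
        if game.confirmation_operation():        # S114: restart from S103
            continue
        if game.end_condition_satisfied():       # S115
            break
```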
  • the game device 20 described above has the setting screen 70 on which the time point within the reproduction period of the music track is specified and the posture of the character 32 at the specified time point is displayed.
  • the player can generate the reference data based on the specified time point and the position of the body part of the character 32 at the specified time point. In other words, the player can generate the reference data while confirming the posture of the character 32 , and hence the reference data can be efficiently generated.
  • When the reproduction stopping operation is performed during the synchronous reproduction of the music track data and the motion data, the synchronous reproduction stops and the reference data is generated, which allows the player to easily specify the time point to be set as the reference time point.
  • the setting screen 70 is displayed based on the generated reference data, which allows the player to easily confirm the contents of the reference data set by themselves.
  • the game device 20 displays the image indicating the time axis t on the setting screen 70 , thereby allowing the player to easily specify the reference time point.
  • the player can specify the elapsed time for each of the predetermined beats based on the time axis t, and hence it becomes easy to specify the reference time point with consideration given to a tempo of the music track.
  • the game device 20 has the setting screen 70 on which the areas 72 a to 72 e are set for each of the body parts of the character 32 , thereby allowing the player to easily specify the body part for which the reference position is to be set.
  • the reference data is set based on the setting indication image 72 , but a method of setting the reference data is not limited to the example of the embodiment as long as the reference data is set based on reproduction contents of the synchronous reproduction of the motion data and the music track data.
  • the reference time point and the reference position may be specified by, for example, clicking on the body part of the character 32 displayed on the setting screen 70 with a mouse or the like.
  • In the embodiment, the reference data is generated when the reproduction stopping operation is performed to stop the synchronous reproduction, but the work of generating the reference data may be performed while the synchronous reproduction is being performed.
  • In the embodiment, the horizontal axis of the setting indication image 72 is set as the time axis t and the vertical axis is set as the areas 72 a to 72 e for specifying the body parts of the character 32 , but the time axis t and the areas for specifying the body parts of the character 32 that are displayed on the setting screen 70 are not limited to the above-mentioned example.
  • For example, the vertical axis may be set as the time axis t, and the horizontal axis may be set as the areas for specifying the body parts of the character 32 . Further, the two axes do not need to be perpendicular to each other.
  • the mark 36 may be displayed on the setting screen 70 based on the reference data, and the shape of the mark 36 may be allowed to be set.
  • the game data storage unit 40 stores image information relating to a guide image (for example, mark 36 ) for showing the player the reference time point and the reference position.
  • image data corresponding to the mark 36 , a position in which the mark 36 is to be displayed on the screen, and a period during which the mark 36 is to be displayed (hereinafter, referred to as “guide showing period”) are stored in the image information in association with one another.
  • the guide showing period represents, for example, a period from a time point earlier than the reference time point by a predetermined time until the reference time point.
  • the display control unit 46 acquires the image information from the game data storage unit 40 , and in the case where the guide showing period arrives, displays the guide image (for example, mark 36 ) on the setting screen 70 .
  • the mark 36 is displayed in, for example, the position of the determination subject body part of the character 32 .
  • the marks 36 each having a spherical shape as illustrated in FIG. 2 are displayed on the setting screen 70 so as to identify the body parts supposed to be moved by the player.
  • the mark 36 may differ in shape according to the type of the determination method.
  • the mark 36 for showing a flowing movement of the hand may be displayed on the setting screen 70 .
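A possible layout for this image information, sketched with invented values (the lead time, positions, and reference time points below are illustrative only; the patent does not specify them), might look as follows:

```python
# Sketch of the guide-image lookup: each guide image is stored together with
# its display position and its guide showing period, and is drawn while the
# current elapsed time falls inside that period.

GUIDE_LEAD_TIME = 1.0  # assumed: show the mark this long before the reference time point

guide_images = [
    # (image id, screen position, period start, period end = reference time point)
    ("mark36_sphere", (320, 240), 10.5 - GUIDE_LEAD_TIME, 10.5),
    ("mark36_arrow",  (400, 180), 14.0 - GUIDE_LEAD_TIME, 14.0),
]

def guides_to_draw(elapsed: float):
    """Return the guide images whose guide showing period has arrived."""
    return [(img, pos) for img, pos, start, end in guide_images
            if start <= elapsed <= end]

print(guides_to_draw(10.0))  # -> the spherical mark 36, shown ahead of its reference time
```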
  • FIG. 17 is a diagram illustrating an example of the setting screen 70 according to Modified Example (1).
  • In a case where the reproduction stopping operation is input within the guide showing period whose type of the determination method is Solid or Stream, the mark 36 is displayed on the setting screen 70 , as illustrated in FIG. 17 .
  • The player can specify, for example, the length, curvature, hue, thickness, and the like of the mark 36 modeled after an arrow. Further, in a case where an object indicating the mark 36 is located in the game space 60 , the orientation of a polygonal surface of this object, the coordinates of its vertices, and the like may be specified.
  • the game device 20 receives a change instruction for the shape of the guide image (for example, mark 36 ) displayed on the setting screen 70 .
  • the change instruction for the shape of the mark 36 is received based on the operation performed by the player with respect to the display contents of the setting screen 70 . Further, the game device 20 changes the shape of the mark 36 for which the change instruction has been received.
  • Information relating to the shape of the mark 36 specified by the player is stored in the game data storage unit 40 in association with the guide showing period.
  • the game device 20 according to Modified Example (1) can change the shape of the mark 36 on the setting screen 70 .
  • the three-dimensional coordinates indicated by the reference position may be changed on the setting screen 70 .
  • the display control unit 46 displays an index indicating the reference position on the setting screen 70 when the reference time point arrives during the synchronous reproduction.
  • FIG. 18 is a diagram illustrating an example of the setting screen 70 according to Modified Example (2). As illustrated in FIG. 18 , a determination area 82 for evaluating the action of the player is displayed on the setting screen 70 .
  • the determination area 82 is, for example, a sphere having a predetermined radius with the three-dimensional coordinates indicated by the reference position set as its center.
  • the determination area 82 is used in order to evaluate the action of the player. For example, in the case where the game is executed and the reference time point arrives, the position of the player indicated by the player position information and the position of the determination area 82 are compared with each other to thereby evaluate the action of the player.
  • the representative point (for example, P 7 ) indicated by the player position information is caused to coincide with the representative point (for example, back P 7 ) indicated by the position data for determination.
  • the determination area 82 is set, and the action of the player is evaluated based on whether or not the determination subject body part of the player is placed within the determination area 82 . If the three-dimensional coordinates of the determination subject body part of the player are included in the determination area 82 , the player obtains an excellent evaluation.
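As a sketch of this evaluation (the radius and coordinates are invented; the patent only states that the determination area 82 is a sphere of predetermined radius centered on the reference position):

```python
# Sphere test for the determination area 82: the player's determination subject
# body part is inside the area if its distance from the reference position is
# at most the predetermined radius.

import math

def within_determination_area(player_part_xyz, reference_xyz, radius=0.3):
    """True if the determination subject body part lies inside the sphere."""
    dx, dy, dz = (p - r for p, r in zip(player_part_xyz, reference_xyz))
    return math.sqrt(dx*dx + dy*dy + dz*dz) <= radius

# At the reference time point, the player position information (already
# re-expressed relative to the representative point, e.g. back P7) is compared
# with the reference position, and a hit earns an excellent evaluation.
if within_determination_area((0.1, 0.9, 0.2), (0.0, 1.0, 0.2)):
    print("excellent evaluation")
```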
  • the game device 20 receives the change instruction for the reference position indicated by the determination area 82 displayed on the setting screen 70 .
  • the game device 20 receives the change of the reference position when a predetermined operation for instructing to change the position of the determination area 82 is performed.
  • the game device 20 changes the reference position according to the change instruction.
  • the reference position is caused to change by performing an operation for horizontally and vertically moving a center point 82 a of the determination area 82 displayed on the setting screen 70 .
  • According to Modified Example (2), for example, it is possible to change the three-dimensional coordinates indicated by the reference position on the setting screen 70 .
  • a difficulty level may be set in the game.
  • the game data storage unit 40 stores, for example, the reference data for each difficulty level.
  • Depending on the difficulty level, the number of reference time points and the type of the determination method for the action of the player may differ even for the same dance and the same music track.
  • For example, as the difficulty level becomes higher, the number of reference time points becomes larger, a determination method requiring a more complex action (for example, Stream) is set, or the number of body parts to be determined simultaneously (for example, head, right hand, and left foot) increases.
  • the reference data may be set for each of a plurality of difficulty levels on the setting screen 70 .
  • a plurality of setting indication images 72 corresponding to the plurality of difficulty levels on a one-to-one basis are displayed on the setting screen 70 in parallel with one another in a predetermined direction.
  • the setting indication image 72 is structured in the same manner as in the embodiment.
  • the player generates the reference data at each of the difficulty levels by moving the reference time point indication image 74 and the reference position indication image 76 of the setting indication image 72 .
  • The player can set the reference data by comparing those setting indication images 72 with one another.
  • In a case where the player sets the reference data for each of the plurality of difficulty levels, it is hard to grasp how difficult or easy each level is set to be, but the player can efficiently generate the reference data by using the setting screen 70 as described above to perform the setting.
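One plausible arrangement of the reference data for a plurality of difficulty levels is a simple per-level table; the level names, time points, and entries below are invented for illustration:

```python
# Assumed layout: the same music track and dance, but more reference time
# points and more demanding determination methods at higher difficulty levels.

reference_data_by_level = {
    "easy":   [{"time": 4.0, "part": "left_hand",  "method": "Ripple"}],
    "normal": [{"time": 2.0, "part": "left_hand",  "method": "Ripple"},
               {"time": 4.0, "part": "right_hand", "method": "Solid"}],
    "hard":   [{"time": 1.0, "part": "left_hand",  "method": "Ripple"},
               {"time": 2.0, "part": "right_hand", "method": "Stream"},
               {"time": 3.0, "part": "left_foot",  "method": "Solid"}],
}

for level, entries in reference_data_by_level.items():
    print(level, "->", len(entries), "reference time points")
```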
  • the reference time point is associated with the reference position in the reference data.
  • the reference time point may be associated with the three-dimensional coordinates of the determination subject body part.
  • the position data for determination may be included in the motion data, and the motion data and the reference data may be integrally provided.
  • a method of displaying the game screen 30 and the setting screen 70 may be another method.
  • the game screen 30 and the setting screen 70 may be displayed by preparing animation data.
  • In the embodiment, the game device 20 executes a dance game, but it suffices that the game device 20 executes a game configured such that the player moves their body in time to the action of the character 32 and in tune with the music track, and that the setting of such a game is performed.
  • the game executed by the game device 20 is not limited to the dance game, and in addition, a game configured such that, for example, the player exercises in time to the action of the character 32 may be executed.
  • the data generation device according to the present invention is applied to the game device, but it suffices that the data generation device according to the present invention is applied to a device for generating the reference data used in the game for evaluating the action of the player based on the reference data.

Abstract

A data generation device includes: a display unit for displaying a setting screen for generating reference data; a specification receiving unit for receiving a specification of a time point within the reproduction period of the music track data through the setting screen; a display control unit for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on exemplary model data; and a reference data generating unit for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.

Description

    CROSS-REFERENCE TO RELATED APPLICATION
  • The present application claims priority from Japanese application JP 2010-242817 filed on Oct. 28, 2010, the content of which is hereby incorporated by reference into this application.
  • BACKGROUND OF THE INVENTION
  • 1. Field of the Invention
  • The present invention relates to a data generation device, a control method for a data generation device, and a non-transitory information storage medium.
  • 2. Description of the Related Art
  • There is known a game that uses an image (photographed image) obtained by photographing a player with a camera. For example, JP 3866474 B2 (JP 2001-224730 A) describes a game configured such that a player aims to perform the same action as a character dancing in tune with a music track while an action of the player is analyzed based on the photographed image. In the technology of JP 3866474 B2, the action of the player is evaluated based on reference data obtained by associating information indicating a time point at which a game device determines the action of the player and information for identifying an action supposed to be performed by the player at this time point.
  • SUMMARY OF THE INVENTION
  • As described above, in a case of generating the reference data used in order to evaluate the action of the player, consideration needs to be given not only to a tempo of the music track but also to a position of a body part moved by the player at each time point, which raises a problem that the generation of the reference data requires complicated work.
  • The present invention has been made in view of the above-mentioned problem, and an object thereof is to provide a data generation device, a control method for a data generation device, and a non-transitory information storage medium therefor, which are capable of efficiently generating reference data used in a game for evaluating an action of a player.
  • In order to solve the above-mentioned problem, according to the present invention, there is provided a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the data generation device including: exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; display means for displaying a setting screen for generating the reference data; specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen; display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • Further, according to the present invention, there is provided a control method for a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the control method including: an exemplary model data acquiring step of acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; a display step of displaying a setting screen for generating the reference data; a specification receiving step of receiving a specification of a time point within the reproduction period of the music track data through the setting screen; a display control step of displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and a reference data generating step of generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • Further, according to the present invention, there is provided a program for causing a computer to function as a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the data generation device including: exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data; display means for displaying a setting screen for generating the reference data; specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen; display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
  • Further, the information storage medium according to the present invention is a non-transitory computer-readable information storage medium having the above-mentioned program recorded thereon.
  • According to the present invention, it is possible to efficiently generate the reference data used in the game for evaluating the action of the player.
  • Further, according to one aspect of the present invention, the data generation device further includes: means for reproducing the music track data; and means for stopping reproduction of the music track data in a case where a predetermined reproduction stopping operation is performed, in which: the display control means changes, in a case where the music track data is being reproduced, the exemplary model posture shown by the exemplary model image on the setting screen so as to be synchronized with the reproduction of the music track data; the display control means displays, in a case where the reproduction of the music track data is stopped, the exemplary model image at a time point at which the reproduction of the music track data is stopped on the setting screen; and the specification receiving means acquires one of a time point earlier than the time point at which the reproduction of the music track data is stopped and a time point later than the time point at which the reproduction of the music track data is stopped as the specified time point based on a predetermined time point specifying operation.
  • Further, according to one aspect of the present invention, the data generation device further includes means for reproducing the music track data in a case where a predetermined setting content confirmation operation is performed on the setting screen after the reference data is generated by the reference data generating means, in which the display control means includes means for changing the exemplary model posture shown by the exemplary model image on the setting screen in synchronization with reproduction of the music track data in a case where the predetermined setting content confirmation operation is performed after the reference data is generated by the reference data generating means, and showing the reference time point and the reference position based on the reference data in synchronization with the reproduction of the music track data.
  • Further, according to one aspect of the present invention, the display control means displays a time axis of the reproduction period of the music track data on the setting screen, and the specification receiving means receives the specification of the time point within the reproduction period of the music track data based on the time axis displayed on the setting screen.
  • Further, according to one aspect of the present invention, the display control means sets a plurality of areas corresponding to a plurality of body parts of the player so as to extend in a direction of the time axis displayed on the setting screen in parallel with one another in a direction perpendicular to the time axis, the specification receiving means receives a specification of a position within any one of the plurality of areas set on the setting screen, and the reference data generating means generates the reference data based on the time point corresponding to the specified position within the any one of the plurality of areas and the position of the body part corresponding to the specified position within the any one of the plurality of areas.
  • Further, according to one aspect of the present invention, the time axis indicates the reproduction period of the music track data for each of predetermined beats of the music track data, and the specification receiving means receives the specification of the time point within the reproduction period of the music track data for each of the predetermined beats.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • In the accompanying drawings:
  • FIG. 1 is a diagram illustrating how a player plays a game;
  • FIG. 2 is a diagram illustrating an example of a game screen;
  • FIG. 3 is a diagram illustrating an example of a setting screen;
  • FIG. 4 is a diagram illustrating an example of a photographed image generated by a CCD camera;
  • FIG. 5 is a diagram for describing a method of measuring a depth of the player, which is performed by an infrared sensor;
  • FIG. 6 is a diagram illustrating an example of a depth image acquired by the infrared sensor;
  • FIG. 7 is a diagram illustrating an example of player position information generated by a position detecting device;
  • FIG. 8 is a diagram illustrating a position of the player, which is identified by the player position information;
  • FIG. 9 is a diagram illustrating a hardware configuration of the position detecting device;
  • FIG. 10 is a diagram illustrating a hardware configuration of a game device;
  • FIG. 11 is a functional block diagram illustrating functions implemented on the game device;
  • FIG. 12 is a diagram illustrating a game space;
  • FIG. 13 is a diagram illustrating an example of setting data;
  • FIG. 14 is a diagram illustrating types of a determination method;
  • FIG. 15 is a diagram illustrating a relation between the setting data and a setting indication image;
  • FIG. 16 is a flowchart illustrating an example of processing executed on the game device;
  • FIG. 17 is a diagram illustrating an example of a setting screen according to Modified Example (1); and
  • FIG. 18 is a diagram illustrating an example of a setting screen according to Modified Example (2).
  • DETAILED DESCRIPTION OF THE INVENTION 1. Embodiment
  • Hereinafter, detailed description is given of an example of an embodiment of the present invention with reference to the drawings. A data generation device according to the embodiment of the present invention is implemented by, for example, a consumer game machine (stationary game machine), a portable game machine, a mobile phone, a personal digital assistant (PDA), or a personal computer. In this specification, description is given of a case where the data generation device according to the embodiment of the present invention is implemented by a consumer game machine (game device).
  • In this embodiment, a game device is used to generate reference data used in order to evaluate an action (posture) of a player. Further, the game device executes a game based on the reference data.
  • 2. Game Executed Based on Reference Data
  • FIG. 1 is a diagram illustrating how the player plays a game. As illustrated in FIG. 1, a player 100 is positioned, for example, in front of a position detecting device 1. Although described later in detail, the position detecting device 1 is structured by including, for example, a camera and an infrared sensor, and generates three-dimensional coordinates corresponding to a body part of the player 100. A game device 20 executes the game based on the three-dimensional coordinates. The player 100 aims to, for example, dance in time to a dance performed by a character displayed on a game screen and dance in tune with a music track.
  • FIG. 2 is a diagram illustrating an example of the game screen. As illustrated in FIG. 2, a game screen 30 includes a character 32 that dances in tune with a music track and a score 34 indicating a score of the player. The character 32 is displayed based on, for example, data created by performing motion capturing processing on a picture obtained by photographing a dance performed by a dancer.
  • For example, when a time at which the game device 20 determines the action of the player is approaching, a mark 36 is displayed on the game screen 30. The player plays a game with the help of the mark 36. In other words, the mark 36 is an image for showing the player a position and a time at which the character 32 is to move its body. For example, when the hand of the character 32 touches the center of the mark 36, if the player is performing the same action, the player can obtain an excellent evaluation.
  • Note that in this embodiment, the player plays the game while facing the character 32 displayed on the game screen 30, and hence the game screen 30 displays the character 32 so that the character 32 performs a dance that is left-right reversed to the dance supposed to be performed by the player. For example, when the character 32 raises its hand on the right side when viewed from the player (its hand on the left side when viewed from the character 32), if the player raises their right hand, the player can obtain an excellent evaluation.
  • As described above, the player aims to dance in the same manner as the character 32. On the game device 20, a person who performs setting (for example, player or game producer) sets an action of the player and a time to determine the action within the dance of the character 32, and generates the reference data. The person who performs setting (hereinafter, description is given of a case where the person who performs setting is the player) generates the reference data on a setting screen.
  • FIG. 3 is a diagram illustrating an example of the setting screen. As illustrated in FIG. 3, a setting screen 70 includes the character 32. The character 32 is displayed based on motion data. The character 32 displayed on the setting screen 70 performs the same action as the character 32 displayed on the game screen 30 (FIG. 2).
  • Further, on the setting screen 70, a setting indication image 72 indicating contents set in the reference data is displayed. For example, the player performs the setting and generation of the reference data with the help of display contents of the setting indication image 72. In this embodiment, in a case where the player performs setting of the game, the music track and the dance of the character 32 are configured to be synchronously reproduced. Hereinafter, this technology is described in detail.
  • 3. Operation of Position Detecting Device
  • First, the position detecting device 1 is described. The position detecting device 1 generates player position information relating to a position of the player in a three-dimensional space. In this embodiment, description is given of a case where position information on the player includes information relating to positions of a plurality of body parts of the player 100. The body parts of the player 100 include, for example, a head and both arms.
  • As illustrated in FIG. 1, the position detecting device 1 includes, for example, a CCD camera 2, an infrared sensor 3, and a microphone 4 including a plurality of microphones.
  • The CCD camera 2 is a known camera including a CCD image sensor. For example, the CCD camera 2 generates a photographed image (for example, RGB digital image) by photographing the player 100 at predetermined time intervals (for example, every 1/60th of a second).
  • FIG. 4 is a diagram illustrating an example of the photographed image generated by the CCD camera 2. As illustrated in FIG. 4, the photographed image includes, for example, the player 100. In the photographed image, there are set an Xs-axis and a Ys-axis, which are orthogonal to each other. For example, the upper left corner of the photographed image is set as an origin Os (0,0). Further, for example, the lower right corner of the photographed image is set as coordinates Pmax (Xmax, Ymax). The position of each pixel in the photographed image is identified by two-dimensional coordinates (Xs-Ys coordinates) that are assigned to each pixel.
  • The infrared sensor 3 is formed of, for example, an infrared emitting device and an infrared receiving device (for example, infrared diodes). The infrared sensor 3 detects reflected light obtained by emitting infrared light. The infrared sensor 3 measures the depth of a subject (for example, player 100) based on a detection result of the reflected light.
  • The depth of a subject is a distance between a measurement reference position and the position of the subject. The measurement reference position is a position that serves as a reference in measuring the depth (perspective) of the position of the player 100. The measurement reference position may be a predetermined position associated with the position of the position detecting device 1, such as the position of the infrared receiving device of the infrared sensor 3. The infrared sensor 3 measures the depth of the player 100 based, for example, on a time of flight (TOF), which is a time required for the infrared sensor 3 to receive reflected light after emitting infrared light.
  • FIG. 5 is a diagram for describing a method of measuring the depth of the player 100, which is performed by the infrared sensor 3. As illustrated in FIG. 5, the infrared sensor 3 emits pulsed infrared light at predetermined intervals. The infrared light emitted from the infrared sensor 3 spreads spherically with an emission position of the infrared sensor 3 at the center.
  • The infrared light emitted from the infrared sensor 3 strikes, for example, surfaces of the body of the player 100. The infrared light that has struck those surfaces is reflected. The reflected infrared light is detected by the infrared receiving device of the infrared sensor 3. Specifically, the infrared sensor 3 detects reflected light having a phase shifted by 180° from that of the emitted infrared light.
  • For example, as illustrated in FIG. 5, in a case where the player 100 is holding out both hands forward, those held-out hands are closer to the infrared sensor 3 than the torso of the player 100. Specifically, the TOF of the infrared light reflected by both hands of the player 100 is shorter than the TOF of the infrared light reflected by the torso of the player 100.
  • The distance between the measurement reference position and the player 100 (that is, the depth) is determined by multiplying the time required for the infrared sensor 3 to detect the reflected light after emitting the infrared light (that is, the TOF) by the speed of the infrared light and then dividing the resultant value by two. In this manner, the infrared sensor 3 can measure the depth of the player 100 .
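As a worked example of this formula (the TOF value below is illustrative):

```python
# depth = (time of flight x speed of light) / 2
# Infrared light travels at the speed of light, so the round trip is halved.

SPEED_OF_LIGHT = 299_792_458.0  # m/s

def depth_from_tof(tof_seconds: float) -> float:
    """Distance from the measurement reference position to the subject."""
    return tof_seconds * SPEED_OF_LIGHT / 2.0

# A TOF of about 13.3 nanoseconds corresponds to a subject roughly 2 m away.
print(depth_from_tof(13.3e-9))  # -> ~1.99 (meters)
```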
  • Further, the infrared sensor 3 also detects an outline of a subject (player 100) by detecting depth differences acquired from the reflected infrared light. The fact that the infrared sensor 3 receives the reflected infrared light as described above means that an object is located at that place. Further, if there is no other object located behind the object in the vicinity, the depth difference between the object and the surroundings of the object is large. For example, the infrared sensor 3 detects the outline of the player 100 by joining portions having the depth differences larger than a predetermined value.
  • Note that the method of detecting the outline of the player 100 is not limited to the above-mentioned example. Alternatively, for example, the outline may be detected based on the brightness of each pixel of the photographed image acquired by the CCD camera 2. In this case, it is equally possible to detect the outline of the player 100 by, for example, joining portions having large brightness differences among the pixels.
  • Information relating to the depth of the player 100 (depth information), which is detected as described above, is expressed as, for example, a depth image. In this embodiment, description is given by taking, as an example, a case where the depth information is expressed as a gray-scale depth image (for example, 256-level gray-scale image data).
  • FIG. 6 is a diagram illustrating an example of the depth image acquired by the infrared sensor 3. As illustrated in FIG. 6, for example, an object located close to the infrared sensor 3 is expressed as bright (brightness is high), and an object located far from the infrared sensor 3 is expressed as dark (brightness is low). The depth of the player 100 corresponds to the brightness (pixel value) of the depth image.
  • For example, in a case where the depth image is expressed as the 256-level gray-scale image data, the depth image changes in brightness by one level for every 2-cm change in the depth of the player 100 . This means that the infrared sensor 3 is capable of detecting the depth of the subject in units of 2 cm. In the case where the player 100 is holding out both hands forward ( FIG. 5 ), as illustrated in FIG. 6 , the pixels corresponding to both hands of the player 100 are expressed as brighter (brightness is higher) than the pixels corresponding to the torso.
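A sketch of this quantization follows; the 2-cm step is from the description, while the minimum depth and the mapping direction (nearer objects brighter) are assumptions chosen to match FIG. 6:

```python
# Map a measured depth onto a 256-level gray-scale pixel value.

MIN_DEPTH_M = 0.8   # assumed nearest measurable depth
STEP_M = 0.02       # 2 cm per brightness level, per the description

def depth_to_brightness(depth_m: float) -> int:
    """Return a 0-255 pixel value; closer subjects get higher (brighter) values."""
    level = int((depth_m - MIN_DEPTH_M) / STEP_M)
    return max(0, min(255, 255 - level))

print(depth_to_brightness(1.0))  # hands held out at 1.0 m -> brighter (245)
print(depth_to_brightness(1.4))  # torso at 1.4 m -> darker (225)
```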
  • In this embodiment, similarly to the CCD camera 2, the infrared sensor 3 generates the depth image at predetermined time intervals (for example, every 1/60th of a second). Based on the photographed image acquired by the CCD camera 2 and the depth image acquired by the infrared sensor 3, the player position information is generated relating to the positions of body parts of the player 100.
  • For example, there is generated a composite image (RGBD data) that is obtained by adding the depth information (D: depth) indicated by the depth image to the photographed image (RGB data) acquired by the CCD camera 2. In other words, the composite image contains, for each pixel, color information (lightness of each of R, G, and B) and the depth information.
  • When player position information is generated based on the composite image, first, based on the depth image, pixels corresponding to the outline of the player 100 are identified.
  • Next, in the composite image, the color information (lightness of R, G, and B) of pixels enclosed within the outline is referred to. Based on the color information of the composite image, pixels corresponding to each part of the body of the player 100 are identified. For this identification method, for example, a known method is applicable, such as a pattern matching method in which the object (that is, each part of the body of the player 100) is extracted from the image through a comparison with a comparison image (training image).
  • Based on the pixel values (RGBD values) of the pixels identified as described above, sets of the three-dimensional coordinates of the head, shoulders, etc. of the player 100 are calculated. For example, the three-dimensional coordinates are generated by carrying out predetermined matrix transformation processing on those pixel values. The matrix transformation processing is executed through, for example, a matrix operation similar to transformation processing performed in 3D graphics between two coordinate systems of a world coordinate system and a screen coordinate system. Specifically, the RGB value indicating the color information of the pixel and the D value indicating the perspective are substituted into a predetermined determinant, to thereby calculate the three-dimensional coordinate of the pixel.
  • Note that for the method of calculating the three-dimensional coordinate that corresponds to a pixel based on the pixel value (RGBD value), a known method may be applied, and the calculation method is not limited to the above-mentioned example. Alternatively, for example, the coordinate transformation may be performed using a lookup table.
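The patent leaves the concrete determinant unspecified; one common way to realize such a transformation is pinhole back-projection from screen coordinates and depth, sketched below with assumed camera intrinsics (fx, fy, cx, cy are illustrative values, not from the patent):

```python
# Back-project a pixel (Xs, Ys) and its depth into a 3D coordinate under a
# pinhole camera model -- one possible realization of the matrix transformation
# between the screen coordinate system and the world coordinate system.

def pixel_to_3d(xs: int, ys: int, depth_m: float,
                fx=525.0, fy=525.0, cx=319.5, cy=239.5):
    """Return the 3D position corresponding to screen coordinates plus depth."""
    x = (xs - cx) * depth_m / fx
    y = (ys - cy) * depth_m / fy
    z = depth_m
    return (x, y, z)

print(pixel_to_3d(400, 200, 2.0))  # 3D position of that pixel at 2 m depth
```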
  • FIG. 7 is a diagram illustrating an example of the player position information generated by the position detecting device 1. As illustrated in FIG. 7, the player position information includes a plurality of pieces of information relating to positions of a plurality of body parts of the player 100. As the player position information, for example, each body part of the player 100 and the three-dimensional coordinates are stored in association with each other.
  • FIG. 8 is a diagram illustrating the position of the player 100, which is identified by the player position information. In this embodiment, for example, a predetermined position corresponding to the position detecting device 1 (for example, the measurement reference position) is set as an origin Op. For example, the origin Op represents the three-dimensional coordinates corresponding to the measurement reference position of the infrared sensor 3. Note that the position of the origin Op may be set anywhere in the three-dimensional space in which the player 100 exists. For example, the three-dimensional coordinates corresponding to the origin Os of the photographed image may be set as the origin Op.
  • As illustrated in FIG. 8, in this embodiment, description is given of a case where the player position information includes body part information relating to the positions of, at least, the head and the waist from among the plurality of body parts of the player 100. For example, the player position information includes eleven sets of three-dimensional coordinates corresponding to the head P1, shoulders P2, left upper arm P3, right upper arm P4, left lower arm P5, right lower arm P6, back P7, left thigh P8, right thigh P9, left shin P10, and right shin P11 of the player 100.
  • Note that the part of the body of the player 100, which is indicated by the player position information, may be a part that is determined in advance from the player's body (skeletal frame). For example, any part of the body may be used as long as the part is identifiable by the above-mentioned pattern matching method.
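For illustration only, the player position information of FIG. 7 might be held as a simple mapping from body part to three-dimensional coordinates; all coordinate values below are invented:

```python
# Eleven body parts of the player 100, each associated with 3D coordinates
# expressed in the coordinate system whose origin Op is the measurement
# reference position.

player_position_info = {
    "head_P1":            (0.00, 1.65, 2.00),
    "shoulders_P2":       (0.00, 1.45, 2.00),
    "left_upper_arm_P3":  (-0.25, 1.40, 1.95),
    "right_upper_arm_P4": ( 0.25, 1.40, 1.95),
    "left_lower_arm_P5":  (-0.35, 1.15, 1.70),
    "right_lower_arm_P6": ( 0.35, 1.15, 1.70),
    "back_P7":            (0.00, 1.10, 2.00),
    "left_thigh_P8":      (-0.10, 0.80, 2.00),
    "right_thigh_P9":     ( 0.10, 0.80, 2.00),
    "left_shin_P10":      (-0.10, 0.40, 2.00),
    "right_shin_P11":     ( 0.10, 0.40, 2.00),
}
```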
  • For example, the player position information generated every predetermined time intervals (for example, every 1/60th of a second) is transmitted from the position detecting device 1 to the game device 20. The game device 20 executes the game by receiving the player position information from the position detecting device 1 and grasping the movement of the body of the player (hereinafter, reference numeral “100” of the player is omitted).
  • Next, description is given of hardware configurations of the position detecting device 1 and the game device 20.
  • 4. Configuration of Position Detecting Device
  • FIG. 9 is a diagram illustrating the hardware configuration of the position detecting device 1. As illustrated in FIG. 9, the position detecting device 1 includes a control unit 10, a storage unit 11, a photographing unit 12, a depth measuring unit 13, an audio input unit 14, and a communication interface unit 15. The respective components of the position detecting device 1 are connected to one another by a bus 16 so as to be able to exchange data thereamong.
  • The control unit 10 controls the respective units of the position detecting device 1 according to an operating system and various kinds of programs which are stored in the storage unit 11.
  • The storage unit 11 stores programs and various kinds of parameters which are used for operating the operating system, the photographing unit 12, and the depth measuring unit 13. Further, the storage unit 11 stores a program for generating the player position information based on the photographed image and the depth image.
  • The photographing unit 12 includes the CCD camera 2 and the like. The photographing unit 12 generates, for example, the photographed image of the player 100. The depth measuring unit 13 includes the infrared sensor 3 and the like. The depth measuring unit 13 generates the depth image based, for example, on the TOF acquired by the infrared sensor 3. As described above, the control unit 10 generates the player position information based on the photographed image generated by the photographing unit 12 and the depth image generated by the depth measuring unit 13 at predetermined time intervals (for example, every 1/60th of a second).
  • The audio input unit 14 includes, for example, the microphone 4. The communication interface unit 15 is an interface for transmitting various kinds of data, such as the player position information, to the game device 20.
  • 5. Configuration of Game Device
  • FIG. 10 is a diagram illustrating the hardware configuration of the game device 20. As illustrated in FIG. 10, the game device 20 includes a control unit 21, a main storage unit 22, an auxiliary storage unit 23, an optical disc reproducing unit 24, a communication interface unit 25, an operation unit 26, a display unit 27, and an audio output unit 28. The respective components of the game device 20 are connected to one another by a bus 29.
  • The control unit 21 includes, for example, a CPU, a graphics processing unit (GPU), and a sound processing unit (SPU). The control unit 21 executes various kinds of processing according to an operating system and other programs.
  • The main storage unit 22 includes, for example, a random access memory (RAM). The auxiliary storage unit 23 includes, for example, a hard disk drive (non-transitory information storage medium). The main storage unit 22 stores programs and data read from the auxiliary storage unit 23 or an optical disc (non-transitory information storage medium). Further, the main storage unit 22 is also used as a work memory for storing data to be required in the course of the processing.
  • The optical disc reproducing unit 24 reads programs and data stored on the optical disc. For example, a game program is stored on the optical disc.
  • The communication interface unit 25 is an interface for communicatively connecting the game device 20 to a communication network.
  • The operation unit 26 is used by the player to perform an operation. The operation unit 26 includes, for example, a game controller including a cross button and various kinds of buttons, a touch panel, a mouse, or a keyboard. The display unit 27 is, for example, a consumer television set or a liquid crystal display panel. The display unit 27 displays, for example, the setting screen 70 for generating the reference data. The audio output unit 28 includes, for example, a speaker or headphones.
  • In this embodiment, description is given of a case where the programs and data necessary to execute the game are supplied to the game device 20 via the optical disc. Note that those programs and data may be supplied to the game device 20 via another non-transitory information storage medium (for example, memory card). Alternatively, the programs and data may be supplied from a remote site to the game device 20 via a communication network.
  • 6. Functions to be Implemented on Game Device
  • FIG. 11 is a functional block diagram illustrating functions implemented on the game device 20. As illustrated in FIG. 11, on the game device 20, there are implemented a game data storage unit 40, an exemplary model data acquiring unit 42, a specification receiving unit 44, a display control unit 46, and a reference data generating unit 48. Those functions are implemented by the control unit 21 operating according to programs read from the optical disc.
  • [6-1. Game Data Storage Unit]
  • The game data storage unit 40 is mainly implemented by the main storage unit 22 and the auxiliary storage unit 23 . The game data storage unit 40 stores information necessary for executing the game. For example, the game data storage unit 40 stores the following data: (1) music track data (data obtained by saving general popular music or the like in a predetermined data format); (2) motion data; (3) reference data; (4) data obtained by storing the player position information in chronological order; and (5) game situation data (data indicating a situation (including score and elapsed time) of the game being executed).
  • Note that the music track data, the motion data, and the reference data among the above-mentioned list of data are data prepared by a game creator in advance. The player position information is data acquired from the position detecting device 1, and the game situation data is data generated and updated by a game program. Further, the control unit 21 functions as means for acquiring various kinds of data stored in the game data storage unit 40.
  • [Motion Data]
  • First, the motion data is described. The motion data is created by a game producer, for example, based on data generated by performing motion capturing processing on the picture obtained by photographing the action of a dancer. The motion data is, for example, data for identifying the position of each body part of the character 32.
  • In this embodiment, description is given of a case where the motion data is stored by associating the elapsed time since the reproduction of the music track started (for example, every 1/256th of a bar) with data indicating each body part (skeletal frame) of the character 32 within a game space.
  • In other words, data indicating a posture of the character 32 is stored in the motion data in chronological order. By locating an object indicating the character 32 in the game space based on the motion data, the game device 20 can perform display control so that the character 32 dances on the game screen 30.
  • FIG. 12 is a diagram illustrating the game space. As illustrated in FIG. 12, a character object 62 indicating the character 32 and a virtual camera 64 (viewpoint) are located in a game space 60. The character object 62 is structured by including a plurality of polygons. For example, the character object 62 is created based on the data indicating the body parts of the character 32 which is stored in the motion data.
  • The character object 62 changes so as to show the posture supposed to be adopted by the player. In this embodiment, description is given of a case where the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player. For example, when the player is supposed to place their right hand in a high position, an object indicating a left hand of the character 32 within the character object 62 is located in the high position. In other words, in the motion data, the position of the body part of the character 32 is defined so that the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player.
  • Displayed on the game screen 30 is an image indicating how the game space 60 is viewed from the virtual camera 64. Information indicating the position and a line-of-sight direction of the virtual camera 64 may be a fixed value, or may vary according to the game program or the operation performed by the player. The information indicating the position and the line-of-sight direction of the virtual camera 64 is stored in, for example, the game situation data. Further, the motion data and the music track data are synchronously reproduced so that the character 32 dances in tune with the music track.
  • [Reference Data]
  • Next, the reference data is described. The reference data is data obtained by associating information relating to a reference time point set within a reproduction time of the music track data with information relating to a reference position in which the body part of the player is supposed to be placed at the reference time point. The reference time point represents a time point at which the game device 20 determines (evaluates) the action of the player. The action of the player performed within a reproduction period of the music track data is evaluated based on the reference data. The wording “within the reproduction period of the music track data” means within a period from the start point of the music track until the end point thereof.
  • In this embodiment, description is given of a case where the reference data includes position data for determination and setting data. The position data for determination is data indicating a position in which each body part of the player is supposed to be placed at each time point after the reproduction of the music track starts. For example, the elapsed time since the reproduction of the music track started and the position (three-dimensional coordinates) in which the body part of the player is supposed to be placed after the elapsed time are stored in the position data for determination in association with each other. In the same manner as the motion data, the position data for determination is created by a game producer based on the data generated by performing the motion capturing processing on the picture obtained by photographing the action of a dancer. The position data for determination is compared with the position of the body part of the player indicated by the player position information.
  • Further, in this embodiment, description is given of a case where the position data for determination is expressed by a coordinate system based on a representative point which is set in the player. For example, the three-dimensional coordinates stored in the position data for determination are expressed by the coordinate system with the representative point set as an origin thereof. In other words, position coordinates of each body part of the player stored in the position data for determination indicate a positional relation between the position in which each body part of the player is supposed to be placed and the position of the representative point.
  • Note that in this embodiment, the representative point is set to the back P7. In this case, the back P7 is set as the origin, and the three-dimensional coordinates indicating a position relative to the back P7 are stored in the position data for determination. In the case of evaluating the action of the player, the game device 20 expresses the player position information by the coordinate system based on the representative point. For example, the position data for determination is expressed by the coordinate system having the back P7 set as the origin, and hence the three-dimensional coordinates included in the player position information are also expressed by the coordinate system having the back P7 set as the origin.
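• As an illustrative sketch (not the patented implementation), expressing positions in the coordinate system based on the representative point amounts to subtracting the representative point's coordinates from each body part's coordinates, for the player position information and the position data for determination alike:

```python
# Sketch: re-express body-part coordinates relative to the representative
# point (here the back, P7), so that player position information and the
# position data for determination share the same coordinate system.
def to_representative_frame(positions: dict, representative: str = "back_p7") -> dict:
    ox, oy, oz = positions[representative]
    return {part: (x - ox, y - oy, z - oz) for part, (x, y, z) in positions.items()}

# Example: raw coordinates as they might arrive from the position detecting device 1.
raw = {"back_p7": (1.0, 1.2, 3.0), "left_lower_arm_p5": (0.6, 1.8, 3.1)}
relative = to_representative_frame(raw)
# relative["back_p7"] == (0.0, 0.0, 0.0); other parts are offsets from the back.
```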
  • Note that in this embodiment, description is given of the case where the representative point is the back P7, but the representative point is not limited to the back P7 as long as the representative point is set in the player and the character 32. For example, the representative point may be the head P1 or the like.
  • Further, in this embodiment, the character 32 performs the dance that is left-right reversed to the dance supposed to be performed by the player, and hence a fixed relation (left-right reversed relation) is maintained between the positions of the head P1, the shoulder P2, the left upper arm P3, the right upper arm P4, the left lower arm P5, the right lower arm P6, the back P7, the left thigh P8, the right thigh P9, the left shin P10, and the right shin P11 of the player, which are indicated by the position data for determination, and the positions of a head Q1, a shoulder Q2, a left upper arm Q3, a right upper arm Q4, a left lower arm Q5, a right lower arm Q6, a back Q7, a left thigh Q8, a right thigh Q9, a left shin Q10, and a right shin Q11 of the character 32, which are illustrated in FIG. 12.
  • Next, the setting data is described. The setting data is data for identifying the reference time point and the body part of the player to be determined by the game device 20 at the reference time point. In this embodiment, the player generates the reference data by setting contents of setting data on the setting screen 70.
• FIG. 13 is a diagram illustrating an example of the setting data. A t-axis illustrated in FIG. 13 represents a time axis. The t-axis indicates the elapsed time since the reproduction of the music track started. For example, the setting data indicates the body part of the player to be determined by the game device 20 at predetermined intervals (for example, every 1/16th of a bar). In this embodiment, description is given of a case where five body parts of a head, a left hand (for example, left lower arm), a right hand (for example, right lower arm), a left foot (for example, left shin), and a right foot (for example, right shin) are defined in the setting data.
• As illustrated in FIG. 13, at each time point in units of 1/16th of a bar, the numerical values “0” to “7” express whether or not the game device 20 is to determine the dance of the player. The values in “head”, “left hand”, “right hand”, “left foot”, and “right foot” which are stored in the setting data indicate whether or not the game device 20 is to determine the positions of the head, the left hand, the right hand, the left foot, and the right foot of the player, respectively. Further, the method of determining the action of the player by the game device 20 differs in type depending on the value stored in the setting data.
• FIG. 14 is a diagram illustrating types of a determination method. As illustrated in FIG. 14, in this embodiment, the method of determining the action of the player by the game device 20 differs in type depending on the value stored in the setting data. The value “0” stored in the setting data indicates that the game device 20 is not to determine the action of the player at the corresponding time point. In other words, when the values of all the body parts at a given time point are “0”, that time point is not a reference time point.
• A body part whose value in the setting data is any one of “1” to “7” is a body part of the player to be determined by the game device 20. Hereinafter, such a body part is referred to as a determination subject body part. In other words, a value of any one of “1” to “7” stored in the setting data indicates that the corresponding time point is a reference time point. In this embodiment, as illustrated in FIG. 14, the method of determining the action of the player by the game device 20 is classified into the following seven types. (1) Ripple: The game device 20 determines whether or not the player is moving their hand in the same manner as the character 32. (2) Step: The game device 20 determines whether or not the player is moving their foot in the same manner as the character 32. (3) Pose: The game device 20 determines whether or not the player is adopting the same pose as the character 32 by using their whole body. (4) Lock: The game device 20 determines whether or not the player has been maintaining their body part in the same position as the character 32 for a predetermined time period. (5) Solid: The game device 20 determines whether or not the player has been shaking their hand in the same manner as the character 32 for a predetermined time period. (6) Stream: The game device 20 determines whether or not the player has been moving their hand in the same manner as the character 32 for a predetermined time period. (7) Gesture: The game device 20 determines whether or not the player has been adopting the same pose as the character 32 by using their whole body or partial body part for a predetermined time period.
  • For example, if there is a body part whose value is “1”, “2”, “4”, “5”, or “6” in the setting data, the game device 20 determines the action of the body part (head, left hand, right hand, left foot, or right foot) of the player having the above-mentioned value based on the above-mentioned type. Further, for example, if there is a body part whose value is “3” or “7” in the setting data, the game device 20 determines the action of the whole body or partial body part of the player based on the above-mentioned type.
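• One hypothetical way to encode FIG. 13 and FIG. 14 together — a sketch, with all identifiers assumed — is a per-time-point row of values, one per body part, where 0 means “not determined” and 1 to 7 select the determination method; a time point is then a reference time point exactly when any value in its row is non-zero:

```python
from enum import IntEnum

# Assumed encoding of the FIG. 14 value-to-method mapping.
class Determination(IntEnum):
    NONE = 0
    RIPPLE = 1
    STEP = 2
    POSE = 3
    LOCK = 4
    SOLID = 5
    STREAM = 6
    GESTURE = 7

BODY_PARTS = ("head", "left_hand", "right_hand", "left_foot", "right_foot")

# One row per 1/16th-bar time point: time -> values for the five body parts.
setting_data = {
    0: (0, 0, 0, 0, 0),                      # all zero: not a reference time point
    1: (0, Determination.RIPPLE, 0, 0, 0),   # determine the left hand (Ripple)
    2: (Determination.POSE,) * 5,            # whole-body determination (Pose)
}

def reference_time_points(data: dict) -> list:
    """A time point is a reference time point if any body part is non-zero."""
    return [t for t, row in sorted(data.items()) if any(row)]

print(reference_time_points(setting_data))  # -> [1, 2]
```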
• Note that in this embodiment, description is given of the case where the player performs the actions described above, but the type of the determination method performed by the game device 20 is not limited to the above-mentioned examples as long as the determination relates to an action based on the dance of the character 32.
  • [6-2. Exemplary Model Data Acquiring Unit]
  • The exemplary model data acquiring unit 42 is implemented mainly by the control unit 21. The exemplary model data acquiring unit 42 acquires exemplary model data from means for storing the exemplary model data relating to an exemplary model posture of the player at each time point within a reproduction period of the music track data. The exemplary model posture represents an action (posture) supposed to be performed (adopted) by the player, for example, the posture (dance) of the character 32. In this embodiment, the exemplary model data acquiring unit 42 acquires the motion data stored in the game data storage unit 40.
  • [6-3. Specification Receiving Unit]
  • The specification receiving unit 44 is implemented mainly by the control unit 21 and the operation unit 26. The specification receiving unit 44 receives a specification of a time point within a reproduction period of the music track data through the setting screen 70. The specification receiving unit 44 receives from the player a specification of the time point (elapsed time) within the reproduction period of the music track data based on display contents of the setting screen 70.
  • [6-4. Display Control Unit]
  • The display control unit 46 is implemented mainly by the control unit 21. The display control unit 46 displays an exemplary model image (for example, character 32) for showing the exemplary model posture of the player at the specified time point on the setting screen 70 based on the exemplary model data (for example, motion data).
  • In this embodiment, in a case where the music track data is being reproduced, the display control unit 46 changes the exemplary model posture shown by the exemplary model image (for example, character 32) on the setting screen 70 so as to be synchronized with the reproduction of the music track data. In other words, the display control unit 46 displays the character 32 indicated by the motion data on the setting screen 70 so that the elapsed time indicated by the motion data coincides with the elapsed time indicated by the music track data.
  • The setting screen 70 displayed by the display control unit 46 includes, as illustrated in FIG. 3, the character 32. As described above, the character 32 is displayed based on the motion data. The character 32 displayed on the setting screen 70 performs the same action as the character 32 displayed on the game screen 30 (FIG. 2).
  • Further, the setting indication image 72 indicating contents of the setting data is displayed on the setting screen 70. The setting indication image 72 is used by the player in order to set the setting data. For example, the setting indication image 72 is displayed based on the contents of the setting data that is currently set.
  • As illustrated in FIG. 3, a reference time point and the body part used for determination at the reference time point, which are stored in the setting data, are displayed on the setting indication image 72 in association with each other. For example, two axes are set in the setting indication image 72. One axis (for example, vertical axis) is associated with the time point within the reproduction time of the music track data, and the other axis (for example, horizontal axis) is associated with the body part of the player to be determined by the game device 20.
  • FIG. 15 is a diagram illustrating a relation between the setting data and the setting indication image 72. As illustrated in FIG. 15, the display contents of the setting indication image 72 are decided based on the setting data. For example, the setting indication image 72 is displayed based on the setting data within a predetermined time including the current elapsed time. The types of the determination method stored in the setting data are displayed on the setting indication image 72 so as to be distinguished by icons such as a rectangle and a circle.
  • Note that, as illustrated in FIG. 15, the icons may be displayed so as to be connected to each other with regard to the determination to be performed over a predetermined period such as Stream or Solid. Further, the display control unit 46 displays a time axis t of the reproduction period of the music track data on the setting indication image 72. In the example of FIG. 15, the horizontal axis set in the setting indication image 72 corresponds to the time axis t.
• When the setting screen 70 is displayed, the display of the setting indication image 72 is updated with the lapse of time of the reproduction of the music track data. For example, with the lapse of time of the reproduction of the music track data, the setting indication image 72 is scrolled in a direction corresponding to the time axis t (for example, leftward). The setting indication image 72 thus displayed allows the player to perform setting work while grasping which type of determination is to be performed at the current elapsed time when the game is actually played.
  • Further, in this embodiment, the display control unit 46 sets a plurality of areas 72 a to 72 e corresponding to a plurality of body parts of the player so as to extend in the direction of the time axis t displayed on the setting screen 70 in parallel with one another in a direction perpendicular to the time axis t. For example, the long side direction of the areas 72 a to 72 e is the direction of the time axis t. The plurality of areas 72 a to 72 e are arranged in parallel with one another in the short side direction of the respective areas 72 a to 72 e.
  • For example, each of the plurality of areas 72 a to 72 e set in the setting indication image 72 corresponds to a determination subject body part. In this embodiment, there are five determination subject body parts of the head, the left hand, the right hand, the left foot, and the right foot, and hence, as illustrated in FIG. 15, the five areas 72 a to 72 e are set so as to correspond to those five body parts.
  • Returning to FIG. 3, the setting indication image 72 further includes a reference time point indication image 74 and a reference position indication image 76. The player sets the reference time point based on the reference time point indication image 74, and sets a reference position based on the reference position indication image 76.
  • The reference time point indication image 74 is used by the player in order to specify the time point within the reproduction period indicated by the time axis t displayed on the setting indication image 72. The reference time point indication image 74 moves horizontally based on the operation performed by the player. By moving the reference time point indication image 74, the player can specify and change the time point within the reproduction period of the music track data.
  • In other words, the specification receiving unit 44 receives the specification of the time point within the reproduction period of the music track data based on the time axis t displayed on the setting screen 70. In this embodiment, the time axis t indicates the reproduction period of the music track data for each of the predetermined beats of the music track, and hence the specification receiving unit 44 receives the specification of the time point within the reproduction period of the music track data for each of the predetermined beats of the music track.
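• A minimal sketch of this beat-grid specification, under assumed units (all constants hypothetical): any freely chosen elapsed time is snapped to the nearest predetermined beat on the time axis t.

```python
# Assumed resolution: 256 ticks per bar; one grid step = 1/16th of a bar.
TICKS_PER_BAR = 256
GRID = TICKS_PER_BAR // 16

def quantize(elapsed_ticks: int) -> int:
    """Snap a freely chosen elapsed time to the nearest predetermined beat."""
    return round(elapsed_ticks / GRID) * GRID

assert quantize(100) == 96  # 100 ticks snaps to the sixth 1/16th-bar step
```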
  • The display control unit 46 displays the action of the character 32 at the time point specified by the player on the setting screen 70. For example, based on the motion data and the time point specified by the player moving the reference time point indication image 74, the action (posture) of the character 32 at this time point is displayed on the setting screen 70.
  • Further, the reference position indication image 76 is used by the player in order to specify a position within any one of the plurality of areas 72 a to 72 e set on the setting screen 70. The reference position indication image 76 moves vertically based on the operation performed by the player. By moving the reference position indication image 76, the player can specify the determination subject body part to set the reference position. In other words, the specification receiving unit 44 receives a specification of the position within any one of the plurality of areas 72 a to 72 e set on the setting screen 70.
  • Note that, as illustrated in FIG. 3, on the setting screen 70, the number of reference time points may be displayed in a determination method display field 78 for each of the types of the determination method performed by the game device 20. Display contents of the determination method display field 78 are decided based on the numerical values “1” to “7” that are indicated by the setting data.
  • Further, the determination method display field 78 includes a cursor 80 indicating the type of the determination method to be set by the player. By the player performing a predetermined operation, the cursor 80 moves vertically. By moving the cursor 80 vertically, it is possible to change the type of the determination method to be set by the player.
  • [6-5. Reference Data Generating Unit]
  • The reference data generating unit 48 is implemented mainly by the control unit 21. The reference data generating unit 48 generates the reference data based on the time point specified on the setting screen 70 and the position of the body part within the exemplary model posture of the player at the specified time point. For example, the reference data generating unit 48 sets the reference time point based on the time point received by the specification receiving unit 44, and sets the reference position based on an exemplary model action (for example, action of the character 32) at this time point.
  • Further, in this embodiment, description is given of a case where the reference data generating unit 48 generates the reference data based on the time point corresponding to the position within the areas 72 a to 72 e which is specified on the setting screen 70 and the position of the body part corresponding to the specified position within the areas 72 a to 72 e. For example, the reference data generating unit 48 identifies the setting contents of the setting data based on the body part indicated by the reference position indication image 76 and the type of the determination method indicated by the cursor 80.
• For example, in the case of the state of FIG. 3, when the player performs a predetermined setting instruction operation, the setting data can be set so that the determination for Ripple is performed for the left hand of the player at “2 and ¾” bars after the reproduction of the music track data starts. The reference position becomes the three-dimensional coordinates of the left lower arm P5 at “2 and ¾” bars, which are stored in the position data for determination.
  • In other words, the reference data generating unit 48 generates the reference data by generating the setting data based on the time point specified based on the reference time point indication image 74, the body part of the character 32 specified based on the reference position indication image 76, and the type of the determination method specified by the cursor 80.
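• As a sketch of this combination step (self-contained and hypothetical, reusing the row encoding assumed earlier), the specified time point, body-part lane, and determination method value are merged into one setting-data entry:

```python
BODY_PARTS = ("head", "left_hand", "right_hand", "left_foot", "right_foot")
RIPPLE = 1  # determination method value from FIG. 14

def set_reference(setting_data: dict, time_point: int,
                  body_part: str, method: int) -> None:
    """Write one setting-data entry for one body part at one time point."""
    row = list(setting_data.get(time_point, (0,) * len(BODY_PARTS)))
    row[BODY_PARTS.index(body_part)] = method
    setting_data[time_point] = tuple(row)

setting_data = {}
# The FIG. 3 example: Ripple for the left hand at "2 and 3/4" bars
# (2.75 * 256 = 704 ticks at the resolution assumed above).
set_reference(setting_data, 704, "left_hand", RIPPLE)
print(setting_data)  # {704: (0, 1, 0, 0, 0)}
```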
  • In a case where the setting instruction operation is performed, the display contents of the setting indication image 72 and the determination method display field 78 are updated as well. Further, in a case where the player performs the setting instruction operation, the icon representing the type of the determination method which is set is displayed in the position within the setting indication image 72 which is indicated by the reference time point indication image 74 and the reference position indication image 76.
  • 7. Processing Executed on Game Device
  • FIG. 16 is a flowchart illustrating an example of processing executed on the game device 20. The processing of FIG. 16 is executed by the control unit 21 operating according to programs read from the optical disc when, for example, the setting screen 70 is displayed.
  • First, as illustrated in FIG. 16, the control unit 21 reproduces the music track based on the music track data (S101). The control unit 21 starts the dance of the character 32 based on the motion data, and displays the setting screen 70 (S102). In S102, the character object 62 is located in the game space 60 based on the motion data. Then, an image indicating how the character object 62 looks when viewed from the virtual camera 64 is displayed on the setting screen 70. Note that with the processing of S101 and S102, the music track data and the motion data are synchronously reproduced so that the character 32 performs a dance (dance that is left-right reversed to the dance supposed to be performed by the player) in tune with the music track.
  • The control unit 21 updates the display contents of the setting screen 70 based on the current elapsed time of the music track, and the music track data and the motion data are synchronously reproduced (S103). In S103, for example, with the passage of time, the character 32 dances in tune with the music track, and the setting indication image 72 is scrolled in the direction of the time axis t.
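• A minimal sketch of S103, assuming the music track position and the motion data share the same elapsed-time units: the current music position drives both the pose of the character 32 and the scroll of the setting indication image 72 (function and parameter names are assumptions).

```python
def update_setting_screen(music_elapsed: int, pixels_per_tick: float = 0.5) -> dict:
    return {
        "motion_time": music_elapsed,                 # pose the character 32 at this time
        "scroll_x": music_elapsed * pixels_per_tick,  # scroll image 72 along the time axis t
    }

print(update_setting_screen(704))  # {'motion_time': 704, 'scroll_x': 352.0}
```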
  • The control unit 21 determines whether or not a reproduction stopping operation has been performed (S104). In S104, based on whether or not a predetermined button of the operation unit 26 has been depressed, it is determined whether or not the reproduction stopping operation has been performed.
  • If the reproduction stopping operation is performed (S104; Y), the control unit 21 stops the reproduction of the music track (S105). The control unit 21 stops the synchronous reproduction of the motion data (S106). In other words, if the reproduction of the music track data is stopped, the control unit 21 (display control unit 46) displays the exemplary model image (for example, character 32) at the time point at which the reproduction of the music track data is stopped on the setting screen 70. Because the reproduction of the music track is stopped in S105, the dance of the character 32 is stopped on the setting screen 70, and the scrolling of the setting indication image 72 is stopped as well.
  • The control unit 21 determines whether or not a movement instruction (time point specifying operation) for the reference time point indication image 74 has been performed (S107). For example, it is determined whether or not a left or right button of the cross button of the operation unit 26 has been depressed.
  • If the movement instruction for the reference time point indication image 74 is performed (S107; Y), the control unit 21 moves the reference time point indication image 74 according to the movement instruction, and changes the elapsed time of the music track data (S108).
  • In S108, the control unit 21 (specification receiving unit 44) acquires a time point earlier than the time point at which the reproduction of the music track data is stopped or a time point later than the time point at which the reproduction of the music track data is stopped as the specified time point based on a predetermined time point specifying operation. The time point at which the reproduction of the music track data is stopped means the elapsed time of the music track in the case where the reproduction stopping operation is performed. In other words, in S108, the current elapsed time is changed in accordance with the movement of the reference time point indication image 74. For example, the elapsed time is moved backward by moving the reference time point indication image 74 leftward, while the elapsed time is moved forward by moving the reference time point indication image 74 rightward.
  • The control unit 21 updates the display of the character 32 based on the changed elapsed time (S109). When the current elapsed time is changed, in S109, the display of the character 32 is updated based on the motion data. In this embodiment, as illustrated in FIG. 3 and FIG. 15, the time axis t indicates the elapsed time for each of the predetermined beats of the music track, and hence in S109, the action of the character 32 at the elapsed time specified for each of the predetermined beats is displayed on the setting screen 70. Further, in S113 described later, the reference time point is set based on the elapsed time specified for each of the predetermined beats.
  • On the other hand, if the movement instruction for the reference time point indication image 74 is not performed (S107; N), the control unit 21 determines whether or not the movement instruction for the reference position indication image 76 has been performed (S110). For example, it is determined whether or not an up or down button of the cross button of the operation unit 26 has been depressed.
  • If the movement instruction for the reference position indication image 76 is performed (S110; Y), the control unit 21 moves the reference position indication image 76 (S111). For example, the reference position indication image 76 moves vertically based on an operation input from the operation unit 26.
• If the movement instruction for the reference position indication image 76 is not performed (S110; N), the control unit 21 determines whether or not the setting instruction operation has been performed (S112). If the setting instruction operation is performed (S112; Y), the control unit 21 sets the reference time point based on the reference time point indication image 74, and sets the reference position based on the body part indicated by the reference position indication image 76 (S113). In other words, in S113, the reference data is generated by setting the contents of the setting data. Note that the position of the cursor 80 may be moved vertically as appropriate by a predetermined operation input by the player.
  • If the setting instruction operation is not performed (S112; N), the control unit 21 determines whether or not a setting content confirmation operation has been input (S114). The setting content confirmation operation may be any operation as long as the operation is previously defined. For example, the setting content confirmation operation may be depression of a predetermined button of the operation unit 26.
  • If the setting content confirmation operation is input (S114; Y), the processing returns to S103, and the synchronous reproduction of the motion data and the music track data is restarted. In other words, the control unit 21 reproduces the music track data when a predetermined setting content confirmation operation is performed on the setting screen 70 after the reference data is generated.
  • Further, the motion data is reproduced in synchronization with the reproduction of the music track data. In other words, when the setting content confirmation operation is performed after the reference data is generated, the exemplary model posture shown by the exemplary model image (for example, character 32) changes on the setting screen 70 in synchronization with the reproduction of the music track data. For example, display processing for the character 32 is performed based on the motion data with the lapse of time of the music track.
• In addition, the reference time point and the reference position may be shown based on the reference data in synchronization with the reproduction of the music track data. For example, the reference time point and the reference position are shown by audio or an image. For example, in a case where the reference time point arrives, audio informing the player of the reference position may be output from the audio output unit 28, or a mark 36 for showing the reference position may be displayed on the setting screen 70. The audio data and image data may be stored in the game data storage unit 40 in advance.
  • If the setting content confirmation operation is not input (S114; N), the control unit 21 determines whether or not an end condition is satisfied (S115). The end condition may be any condition as long as the condition is previously defined. For example, the end condition may be a condition as to whether or not the player has input an instruction to end the setting of the reference data.
  • If the end condition is satisfied (S115; Y), the processing is brought to an end. If the end condition is not satisfied (S115; N), the processing returns to S104, and the setting of the reference data through the setting screen 70 is continued.
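• The FIG. 16 flow can be compressed into an event loop, sketched below purely for illustration; the Editor class and the operation names are assumptions, not the device's actual control program.

```python
class Editor:
    def __init__(self):
        self.playing = True  # S101/S102: music and motion data reproduced, screen 70 shown
        self.elapsed = 0

    def run(self, events):
        for ev in events:
            if self.playing:
                self.elapsed += 1                     # S103: synchronous reproduction
            if ev == "stop":                          # S104 -> S105/S106: stop both
                self.playing = False
            elif ev in ("left", "right") and not self.playing:
                # S107 -> S108/S109: move image 74, change elapsed time, redraw character
                self.elapsed += -1 if ev == "left" else 1
            elif ev in ("up", "down"):                # S110 -> S111: move image 76
                pass
            elif ev == "set":                         # S112 -> S113: write setting data
                print(f"reference time point set at {self.elapsed}")
            elif ev == "confirm":                     # S114: restart synchronous reproduction
                self.playing = True
            elif ev == "end":                         # S115: end condition satisfied
                return

Editor().run(["stop", "left", "set", "confirm", "end"])
```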
  • The game device 20 described above has the setting screen 70 on which the time point within the reproduction period of the music track is specified and the posture of the character 32 at the specified time point is displayed. The player can generate the reference data based on the specified time point and the position of the body part of the character 32 at the specified time point. In other words, the player can generate the reference data while confirming the posture of the character 32, and hence the reference data can be efficiently generated.
  • Further, in a case where the reproduction stopping operation is performed during the synchronous reproduction of the music track data and the motion data, the synchronous reproduction stops and the reference data is generated, which allows the player to easily specify the time point to be set as the reference time point. Further, in a case where the setting content confirmation operation is performed, the setting screen 70 is displayed based on the generated reference data, which allows the player to easily confirm the contents of the reference data set by themselves.
  • Further, the game device 20 displays the image indicating the time axis t on the setting screen 70, thereby allowing the player to easily specify the reference time point. The player can specify the elapsed time for each of the predetermined beats based on the time axis t, and hence it becomes easy to specify the reference time point with consideration given to a tempo of the music track. Further, the game device 20 has the setting screen 70 on which the areas 72 a to 72 e are set for each of the body parts of the character 32, thereby allowing the player to easily specify the body part for which the reference position is to be set.
  • Note that in the embodiment, description is given of the case where the reference data is set based on the setting indication image 72, but a method of setting the reference data is not limited to the example of the embodiment as long as the reference data is set based on reproduction contents of the synchronous reproduction of the motion data and the music track data. In addition, the reference time point and the reference position may be specified by, for example, clicking on the body part of the character 32 displayed on the setting screen 70 with a mouse or the like.
  • Further, description is given of the case where the reference data is generated when the reproduction stopping operation is performed to stop the synchronous reproduction, but work of generating the reference data may be performed while the synchronous reproduction is being performed.
  • Further, description is given of the example in which the horizontal axis of the setting indication image 72 is set as the time axis t and the vertical axis is set as the areas 72 a to 72 e for specifying the body parts of the character 32, but the time axis t and the areas for specifying the body parts of the character 32 that are displayed on the setting screen 70 are not limited to the above-mentioned example. For example, the vertical axis may be set as the time axis t, and the horizontal axis may be set as the areas for specifying the body parts of the character 32. Further, the two axes do not need to be perpendicular to each other.
  • Note that the present invention is not limited to the embodiment described above. Various modifications may be made as appropriate without departing from the spirit of the present invention.
  • (1) For example, the mark 36 may be displayed on the setting screen 70 based on the reference data, and the shape of the mark 36 may be allowed to be set.
  • The game data storage unit 40 according to Modified Example (1) stores image information relating to a guide image (for example, mark 36) for showing the player the reference time point and the reference position. For example, image data corresponding to the mark 36, a position in which the mark 36 is to be displayed on the screen, and a period during which the mark 36 is to be displayed (hereinafter, referred to as “guide showing period”) are stored in the image information in association with one another. The guide showing period represents, for example, a period from a time point earlier than the reference time point by a predetermined time until the reference time point.
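• Under that stated assumption, deciding whether the guide image should currently be shown reduces to a window test, sketched here with a hypothetical lead time:

```python
LEAD = 32  # hypothetical lead time before the reference time point, in ticks

def guide_visible(elapsed: int, reference_time_point: int) -> bool:
    """True while the mark 36 for this reference time point should be shown."""
    return reference_time_point - LEAD <= elapsed <= reference_time_point

assert guide_visible(700, 704) and not guide_visible(660, 704)
```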
  • Further, the display control unit 46 according to Modified Example (1) acquires the image information from the game data storage unit 40, and in the case where the guide showing period arrives, displays the guide image (for example, mark 36) on the setting screen 70. The mark 36 is displayed in, for example, the position of the determination subject body part of the character 32.
  • For example, in the case where the guide showing period whose type of the determination method is Ripple, Step, or Lock arrives, the marks 36 each having a spherical shape as illustrated in FIG. 2 are displayed on the setting screen 70 so as to identify the body parts supposed to be moved by the player. Note that the mark 36 may differ in shape according to the type of the determination method. For example, in the case where the type of the determination method is Solid or Stream, the mark 36 for showing a flowing movement of the hand may be displayed on the setting screen 70.
  • FIG. 17 is a diagram illustrating an example of the setting screen 70 according to Modified Example (1). For example, in a case where the reproduction stopping operation is input within the guide showing period whose type of the determination method is Solid or Stream, as illustrated in FIG. 17, the mark 36 is displayed on the setting screen 70.
• By performing a predetermined operation (for example, clicking on the mark 36 with the mouse of the operation unit 26), the player can specify, for example, the length, curvature, hue, thickness, and the like of the arrow-shaped mark 36. Further, in the case where an object indicating the mark 36 is located in the game space 60, the orientation of a polygonal surface of this object, the coordinates of its vertices, and the like may be specified.
  • The game device 20 receives a change instruction for the shape of the guide image (for example, mark 36) displayed on the setting screen 70. For example, the change instruction for the shape of the mark 36 is received based on the operation performed by the player with respect to the display contents of the setting screen 70. Further, the game device 20 changes the shape of the mark 36 for which the change instruction has been received. Information relating to the shape of the mark 36 specified by the player is stored in the game data storage unit 40 in association with the guide showing period.
  • The game device 20 according to Modified Example (1) can change the shape of the mark 36 on the setting screen 70.
  • (2) Further, for example, the three-dimensional coordinates indicated by the reference position may be changed on the setting screen 70. In the case of Modified Example (2), the display control unit 46 displays an index indicating the reference position on the setting screen 70 when the reference time point arrives during the synchronous reproduction.
  • FIG. 18 is a diagram illustrating an example of the setting screen 70 according to Modified Example (2). As illustrated in FIG. 18, a determination area 82 for evaluating the action of the player is displayed on the setting screen 70. The determination area 82 is, for example, a sphere having a predetermined radius with the three-dimensional coordinates indicated by the reference position set as its center.
  • The determination area 82 is used in order to evaluate the action of the player. For example, in the case where the game is executed and the reference time point arrives, the position of the player indicated by the player position information and the position of the determination area 82 are compared with each other to thereby evaluate the action of the player.
  • Specifically, in the case where the reference time point arrives, first, the representative point (for example, P7) indicated by the player position information is caused to coincide with the representative point (for example, back P7) indicated by the position data for determination. Subsequently, the determination area 82 is set, and the action of the player is evaluated based on whether or not the determination subject body part of the player is placed within the determination area 82. If the three-dimensional coordinates of the determination subject body part of the player are included in the determination area 82, the player obtains an excellent evaluation.
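• This evaluation can be sketched as a sphere-containment test, shown below with assumed coordinates and radius (not the patented implementation): align the representative points, then check the distance from the determination subject body part to the reference position.

```python
import math

# Sketch of the Modified Example (2) evaluation: express the player's body
# part relative to the representative point (back P7), then test whether it
# falls inside the spherical determination area 82 around the reference position.
def within_determination_area(player_part, player_rep, reference_pos, radius=0.25):
    rel = tuple(p - r for p, r in zip(player_part, player_rep))
    return math.dist(rel, reference_pos) <= radius  # inside -> excellent evaluation

# Example: left lower arm 0.05 game units from the reference position -> True.
print(within_determination_area((0.6, 1.8, 3.1), (1.0, 1.2, 3.0), (-0.4, 0.65, 0.1)))
```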
  • The game device 20 receives the change instruction for the reference position indicated by the determination area 82 displayed on the setting screen 70. For example, the game device 20 receives the change of the reference position when a predetermined operation for instructing to change the position of the determination area 82 is performed. The game device 20 changes the reference position according to the change instruction. For example, the reference position is caused to change by performing an operation for horizontally and vertically moving a center point 82 a of the determination area 82 displayed on the setting screen 70.
  • According to Modified Example (2), for example, it is possible to change the three-dimensional coordinates indicated by the reference position on the setting screen 70.
  • (3) Further, for example, a difficulty level may be set in the game. In the case where the difficulty level is set in the game, the game data storage unit 40 stores, for example, the reference data for each difficulty level. In other words, in this case, the number of reference time points and the type of the determination method for the action of the player may differ even for the same dance and the same music track.
  • For example, as the difficulty level becomes higher, the number of reference time points becomes larger. Further, for example, as the difficulty level becomes higher, the determination method (for example, Stream) by which it is more difficult to obtain an excellent evaluation is used. Further, for example, as the difficulty level becomes higher, the number of body parts (for example, head, right hand, and left foot) that are simultaneously used for the determination becomes larger. Therefore, as the difficulty level becomes higher, it is more difficult for the player to obtain an excellent evaluation.
  • In Modified Example (3), the reference data may be set for each of a plurality of difficulty levels on the setting screen 70. In this case, a plurality of setting indication images 72 corresponding to the plurality of difficulty levels on a one-to-one basis are displayed on the setting screen 70 in parallel with one another in a predetermined direction. The setting indication image 72 is structured in the same manner as in the embodiment. In other words, the player generates the reference data at each of the difficulty levels by moving the reference time point indication image 74 and the reference position indication image 76 of the setting indication image 72.
• According to Modified Example (3), the player can set the reference data by comparing those setting indication images 72 with one another. When the player sets the reference data for each of the plurality of difficulty levels, it can be hard to grasp how difficult or easy each setting makes the game, but the player can efficiently generate the reference data by performing the setting on the setting screen 70 as described above.
  • (4) Further, in the embodiment, description is given of the example in which the position data for determination and the setting data are included as the reference data, but it suffices that the reference time point is associated with the reference position in the reference data. For example, the reference time point may be associated with the three-dimensional coordinates of the determination subject body part. Further, the position data for determination may be included in the motion data, and the motion data and the reference data may be integrally provided.
• (5) Further, in this embodiment, description is given of the case where the game screen 30 and the setting screen 70 are displayed based on the motion data, but a method of displaying the game screen 30 and the setting screen 70 may be another method. For example, the game screen 30 and the setting screen 70 may be displayed by preparing animation data.
  • (6) Further, in the above-mentioned embodiment and modified examples, description is given of the case where the game device 20 executes a dance game, but it suffices that the game device 20 executes a game configured such that the player moves their body in time to the action of the character 32 and in tune with the music track, and performs the setting of the game. The game executed by the game device 20 is not limited to the dance game, and in addition, a game configured such that, for example, the player exercises in time to the action of the character 32 may be executed.
  • (7) Further, description is given of the case where the data generation device according to the present invention is applied to the game device, but it suffices that the data generation device according to the present invention is applied to a device for generating the reference data used in the game for evaluating the action of the player based on the reference data.
• While there have been described what are at present considered to be certain embodiments of the invention, it will be understood that various modifications may be made thereto, and it is intended that the appended claims cover all such modifications as fall within the true spirit and scope of the invention.

Claims (8)

1. A data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the data generation device comprising:
exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data;
display means for displaying a setting screen for generating the reference data;
specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen;
display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and
reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
2. The data generation device according to claim 1, further comprising:
means for reproducing the music track data; and
means for stopping reproduction of the music track data when a predetermined reproduction stopping operation is performed, wherein:
the display control means changes, in a case where the music track data is being reproduced, the exemplary model posture shown by the exemplary model image on the setting screen so as to be synchronized with the reproduction of the music track data;
the display control means displays, in a case where the reproduction of the music track data is stopped, the exemplary model image at a time point at which the reproduction of the music track data is stopped on the setting screen; and
the specification receiving means acquires one of a time point earlier than the time point at which the reproduction of the music track data is stopped and a time point later than the time point at which the reproduction of the music track data is stopped as the specified time point based on a predetermined time point specifying operation.
3. The data generation device according to claim 1, further comprising means for reproducing the music track data in a case where a predetermined setting content confirmation operation is performed on the setting screen after the reference data is generated by the reference data generating means,
wherein the display control means comprises means for changing the exemplary model posture shown by the exemplary model image on the setting screen in synchronization with reproduction of the music track data in a case where the predetermined setting content confirmation operation is performed after the reference data is generated by the reference data generating means, and showing the reference time point and the reference position based on the reference data in synchronization with the reproduction of the music track data.
4. The data generation device according to claim 1, wherein:
the display control means displays a time axis of the reproduction period of the music track data on the setting screen; and
the specification receiving means receives the specification of the time point within the reproduction period of the music track data based on the time axis displayed on the setting screen.
5. The data generation device according to claim 4, wherein:
the display control means sets a plurality of areas corresponding to a plurality of body parts of the player so as to extend in a direction of the time axis displayed on the setting screen in parallel with one another in a direction perpendicular to the time axis;
the specification receiving means receives a specification of a position within any one of the plurality of areas set on the setting screen; and
the reference data generating means generates the reference data based on the time point corresponding to the specified position within the any one of the plurality of areas and the position of the body part corresponding to the specified position within the any one of the plurality of areas.
6. The data generation device according to claim 4, wherein:
the time axis indicates the reproduction period of the music track data for each of predetermined beats of the music track data; and
the specification receiving means receives the specification of the time point within the reproduction period of the music track data for each of the predetermined beats.
7. A control method for a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the control method comprising:
an exemplary model data acquiring step of acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data;
a display step of displaying a setting screen for generating the reference data;
a specification receiving step of receiving a specification of a time point within the reproduction period of the music track data through the setting screen;
a display control step of displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and
a reference data generating step of generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
8. A non-transitory computer-readable information storage medium having recorded thereon a program for causing a computer to function as a data generation device for generating reference data used in a game for evaluating an action of a player performed within a reproduction period of music track data based on the reference data obtained by associating information relating to a reference time point set within the reproduction period of the music track data and information relating to a reference position in which a body part of the player is to be placed at the reference time point, the data generation device comprising:
exemplary model data acquiring means for acquiring exemplary model data from means for storing the exemplary model data, the exemplary model data relating to an exemplary model posture of the player at each time point within the reproduction period of the music track data;
display means for displaying a setting screen for generating the reference data;
specification receiving means for receiving a specification of a time point within the reproduction period of the music track data through the setting screen;
display control means for displaying an exemplary model image for showing the exemplary model posture of the player at the specified time point on the setting screen based on the exemplary model data; and
reference data generating means for generating the reference data based on the specified time point and a position of the body part within the exemplary model posture of the player at the specified time point.
US13/283,994 2010-10-28 2011-10-28 Data generation device, control method for a data generation device, and non-transitory information storage medium Abandoned US20120108305A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2010242817A JP5256269B2 (en) 2010-10-28 2010-10-28 Data generation apparatus, data generation apparatus control method, and program
JP2010-242817 2010-10-28

Publications (1)

Publication Number Publication Date
US20120108305A1 true US20120108305A1 (en) 2012-05-03

Family

ID=45997295

Family Applications (1)

Application Number Title Priority Date Filing Date
US13/283,994 Abandoned US20120108305A1 (en) 2010-10-28 2011-10-28 Data generation device, control method for a data generation device, and non-transitory information storage medium

Country Status (2)

Country Link
US (1) US20120108305A1 (en)
JP (1) JP5256269B2 (en)

Cited By (20)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8439733B2 (en) 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8444464B2 (en) 2010-06-11 2013-05-21 Harmonix Music Systems, Inc. Prompting a player of a dance game
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US8550908B2 (en) 2010-03-16 2013-10-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US8686269B2 (en) 2006-03-29 2014-04-01 Harmonix Music Systems, Inc. Providing realistic interaction to a player of a music-based video game
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
US20140118522A1 (en) * 2012-11-01 2014-05-01 Josh Heath Zuniga Dance learning system using a computer
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
EP2857075A4 (en) * 2012-05-29 2016-03-16 Capcom Co Computer device, recording medium, and method for controlling computer device
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
CN105641899A (en) * 2014-12-31 2016-06-08 深圳泰山在线科技有限公司 Step physical fitness test method and system
EP3015954A4 (en) * 2013-06-26 2017-02-15 Sony Interactive Entertainment Inc. Information processing device, control method for information processing device, program, and information storage medium
US20170243060A1 (en) * 2016-02-18 2017-08-24 Wistron Corporation Method for grading spatial painting, apparatus and system for grading spatial painting
CN107346172A (en) * 2016-05-05 2017-11-14 富泰华工业(深圳)有限公司 A kind of action induction method and device
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US11130058B2 (en) * 2017-06-27 2021-09-28 Konami Amusement Co., Ltd. Game machine and storage medium

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5427923B2 (en) * 2012-06-18 2014-02-26 株式会社コナミデジタルエンタテインメント Data generation system, data generation method used therefor, and computer program
JP5912940B2 (en) * 2012-07-10 2016-04-27 株式会社コナミデジタルエンタテインメント Evaluation apparatus, evaluation method, program, and system

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020019258A1 (en) * 2000-05-31 2002-02-14 Kim Gerard Jounghyun Methods and apparatus of displaying and evaluating motion data in a motion game apparatus

Family Cites Families (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2009022556A (en) * 2007-07-20 2009-02-05 Daiko:Kk Dance practice device and dance practice support method
JP4137168B2 (en) * 2007-10-10 2008-08-20 株式会社バンダイナムコゲームス GAME DEVICE AND INFORMATION STORAGE MEDIUM
JP2009218900A (en) * 2008-03-11 2009-09-24 Casio Comput Co Ltd Imaging apparatus, motion picture recording and playback method, and program
JP2009277195A (en) * 2008-04-18 2009-11-26 Panasonic Electric Works Co Ltd Information display system

Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020019258A1 (en) * 2000-05-31 2002-02-14 Kim Gerard Jounghyun Methods and apparatus of displaying and evaluating motion data in a motion game apparatus

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
"Taiko: Drum Master - Katamari on the Rocks," YouTube video, http://www.youtube.com/watch?v=earHPtRGSGE, January 27, 2009 *
Namco, "Taiko Drum Master" game manual, October 26, 2004 *

Cited By (30)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8686269B2 (en) 2006-03-29 2014-04-01 Harmonix Music Systems, Inc. Providing realistic interaction to a player of a music-based video game
US8678896B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for asynchronous band interaction in a rhythm action game
US8444486B2 (en) 2007-06-14 2013-05-21 Harmonix Music Systems, Inc. Systems and methods for indicating input actions in a rhythm-action game
US8439733B2 (en) 2007-06-14 2013-05-14 Harmonix Music Systems, Inc. Systems and methods for reinstating a player within a rhythm-action game
US8678895B2 (en) 2007-06-14 2014-03-25 Harmonix Music Systems, Inc. Systems and methods for online band matching in a rhythm action game
US8663013B2 (en) 2008-07-08 2014-03-04 Harmonix Music Systems, Inc. Systems and methods for simulating a rock band experience
US8449360B2 (en) 2009-05-29 2013-05-28 Harmonix Music Systems, Inc. Displaying song lyrics and vocal cues
US8465366B2 (en) 2009-05-29 2013-06-18 Harmonix Music Systems, Inc. Biasing a musical performance input to a part
US10357714B2 (en) 2009-10-27 2019-07-23 Harmonix Music Systems, Inc. Gesture-based user interface for navigating a menu
US9981193B2 (en) 2009-10-27 2018-05-29 Harmonix Music Systems, Inc. Movement based recognition and evaluation
US10421013B2 (en) 2009-10-27 2019-09-24 Harmonix Music Systems, Inc. Gesture-based user interface
US8550908B2 (en) 2010-03-16 2013-10-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8636572B2 (en) 2010-03-16 2014-01-28 Harmonix Music Systems, Inc. Simulating musical instruments
US8568234B2 (en) 2010-03-16 2013-10-29 Harmonix Music Systems, Inc. Simulating musical instruments
US8874243B2 (en) 2010-03-16 2014-10-28 Harmonix Music Systems, Inc. Simulating musical instruments
US9278286B2 (en) 2010-03-16 2016-03-08 Harmonix Music Systems, Inc. Simulating musical instruments
US8702485B2 (en) 2010-06-11 2014-04-22 Harmonix Music Systems, Inc. Dance game and tutorial
US9358456B1 (en) 2010-06-11 2016-06-07 Harmonix Music Systems, Inc. Dance competition game
US8562403B2 (en) 2010-06-11 2013-10-22 Harmonix Music Systems, Inc. Prompting a player of a dance game
US8444464B2 (en) 2010-06-11 2013-05-21 Harmonix Music Systems, Inc. Prompting a player of a dance game
US9024166B2 (en) 2010-09-09 2015-05-05 Harmonix Music Systems, Inc. Preventing subtractive track separation
EP2857075A4 (en) * 2012-05-29 2016-03-16 Capcom Co Computer device, recording medium, and method for controlling computer device
US20140118522A1 (en) * 2012-11-01 2014-05-01 Josh Heath Zuniga Dance learning system using a computer
EP3015954A4 (en) * 2013-06-26 2017-02-15 Sony Interactive Entertainment Inc. Information processing device, control method for information processing device, program, and information storage medium
US10376777B2 (en) 2013-06-26 2019-08-13 Sony Interactive Entertainment Inc. Information processor, control method of information processor, program, and information storage medium
CN105641899A (en) * 2014-12-31 2016-06-08 Shenzhen Taishan Online Technology Co., Ltd. Step physical fitness test method and system
US20170243060A1 (en) * 2016-02-18 2017-08-24 Wistron Corporation Method for grading spatial painting, apparatus and system for grading spatial painting
US10452149B2 (en) * 2016-02-18 2019-10-22 Wistron Corporation Method for grading spatial painting, apparatus and system for grading spatial painting
CN107346172A (en) * 2016-05-05 2017-11-14 Fu Tai Hua Industry (Shenzhen) Co., Ltd. Motion sensing method and device
US11130058B2 (en) * 2017-06-27 2021-09-28 Konami Amusement Co., Ltd. Game machine and storage medium

Also Published As

Publication number Publication date
JP2012090905A (en) 2012-05-17
JP5256269B2 (en) 2013-08-07

Similar Documents

Publication Title
US20120108305A1 (en) Data generation device, control method for a data generation device, and non-transitory information storage medium
US8740704B2 (en) Game device, control method for a game device, and a non-transitory information storage medium
TWI470534B (en) Three dimensional user interface effects on a display by using properties of motion
US8139087B2 (en) Image presentation system, image presentation method, program for causing computer to execute the method, and storage medium storing the program
EP2193825B1 (en) Mobile device for augmented reality applications
JP5238900B2 (en) Game device, game device control method, and program
US11055920B1 (en) Performing operations using a mirror in an artificial reality environment
US20110230266A1 (en) Game device, control method for a game device, and non-transitory information storage medium
US9573052B2 (en) Game device, control method for a game device, and non-transitory information storage medium
US8823647B2 (en) Movement control device, control method for a movement control device, and non-transitory information storage medium
US8684837B2 (en) Information processing program, information processing system, information processing apparatus, and information processing method
US9724613B2 (en) Game device, control method of game device, program, and information storage medium
US20210255328A1 (en) Methods and systems of a handheld spatially aware mixed-reality projection platform
JP2002247602A (en) Image generator and control method therefor, and its computer program
US11759701B2 (en) System and method for generating user inputs for a video game
JP2011258158A (en) Program, information storage medium and image generation system
US11145126B1 (en) Movement instruction using a mirror in an artificial reality environment
JP5373744B2 (en) Game device, game device control method, and program
JP2012196286A (en) Game device, control method for game device, and program
US11036987B1 (en) Presenting artificial reality content using a mirror
JP5925828B2 (en) Game device, game device control method, and program
US20230226460A1 (en) Information processing device, information processing method, and recording medium
WO2022124135A1 (en) Game program, game processing method, and game device
JP2022090964A (en) Game program, game processing method, and game device
JP2022090965A (en) Game program, game processing method, and game device

Legal Events

Date Code Title Description
AS Assignment

Owner name: KONAMI DIGITAL ENTERTAINMENT CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:AKIYAMA, MASATO;SUZUKI, KIDAI;REEL/FRAME:027141/0357

Effective date: 20111025

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION