US20080058101A1 - Game process control method, information storage medium, and game device - Google Patents
- Publication number
- US20080058101A1 (application US11/892,789)
- Authority
- US
- United States
- Prior art keywords
- note
- character
- game
- input
- section
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/424—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving acoustic input signals, e.g. by using the results of pitch or rhythm extraction or voice recognition
- A63F13/26—Output arrangements for video game devices having at least one additional display device, e.g. on the game controller or outside a game booth
- A63F13/2145—Input arrangements for video game devices characterised by their sensors, purposes or types for locating contacts on a surface, e.g. floor mats or touch pads the surface being also a display device, e.g. touch screens
- A63F13/215—Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
- A63F13/44—Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment involving timing of operations, e.g. performing an action within a time slot
- A63F13/69—Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor by enabling or updating specific game elements, e.g. unlocking hidden features, items, levels or versions
- A63F13/825—Fostering virtual characters
- A63F13/92—Video game devices specially adapted to be hand-held while playing
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/1075—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals being specially adapted to detect the point of contact of the player on a surface, e.g. floor mat, touch pad using a touch screen
- A63F2300/1081—Input via voice recognition
- A63F2300/204—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
- A63F2300/301—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by output arrangements for receiving control signals generated by the game device using an additional display connected to the game console, e.g. on the controller
- A63F2300/6072—Methods for processing data by generating or executing the game program for sound processing of an input signal, e.g. pitch and rhythm extraction, voice recognition
- A63F2300/609—Methods for processing data by generating or executing the game program for unlocking hidden game elements, e.g. features, items, levels
- A63F2300/638—Methods for processing data by generating or executing the game program for controlling the execution of the game in time according to the timing of operation or a time limit
- A63F2300/69—Involving elements of the real world in the game world, e.g. measurement in live races, real video
- A63F2300/8058—Virtual breeding, e.g. tamagotchi
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06F—ELECTRIC DIGITAL DATA PROCESSING
- G06F3/00—Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
- G06F3/16—Sound input; Sound output
Definitions
- the present invention relates to a game device and the like which cause a new character to appear based on input sound.
- a game device which includes a sound input means such as a microphone and utilizes sound input from the sound input means for a game process.
- technology has been known which determines the parameter of a character caused to appear based on the input sound.
- the input sound (analog signal) is converted into a digital signal, and the digital signal is converted into a numerical value in frequency band units to create sequence data. Whether or not an arbitrary value in the sequence data coincides with predetermined reference data is determined, and the parameter of the character is determined based on the determination results (e.g. Japanese Patent No. 2860097).
- the parameter of the character caused to appear is determined based on the input sound.
- a parameter irrelevant to the meaning of the input sound is generated.
- the parameter of the character is determined virtually at random. Therefore, it may be troublesome for the player to input sound to generate a new character, and the player may lose interest in the game.
- FIG. 1 shows an example of the outward appearance of a portable game device.
- FIG. 2A shows an example of an egg selection screen
- FIG. 2B shows an example of a character list screen.
- FIG. 3 shows an example of a character creation production screen.
- FIG. 4 shows an example of a character creation screen.
- FIG. 5 shows a functional configuration example of a portable game device.
- FIG. 6 shows a data configuration example of possessed item data.
- FIG. 7 shows a data configuration example of a character setting table.
- FIG. 8 shows a data configuration example of detected note data.
- FIG. 9 shows a data configuration example of level condition data.
- FIG. 10 is a view illustrative of causing silence to occur at a detection time t at which five or more consecutive notes are detected.
- FIG. 11 shows a data configuration example of note detection total count data.
- FIG. 12 shows a data configuration example of start note data.
- FIG. 13 is a view illustrative of generation of note-name-unit detection data from detected note data.
- FIG. 14 is a view illustrative of causing silence to occur at a detection time at which seven or more consecutive notes are detected.
- FIG. 15 is a view illustrative of determination of formation of a chord in note-name-unit detection data.
- FIG. 16 shows a data configuration example of chord formation count data.
- FIG. 17 shows a data configuration example of a chord classification table.
- FIG. 18 shows a data configuration example of score data.
- FIG. 19 shows a data configuration example of a group/attribute correspondence table.
- FIG. 20 shows a data configuration example of determined creation probability data.
- FIG. 21 shows a data configuration example of a score setting table.
- FIG. 22 shows a data configuration example of a note classification table.
- FIG. 23 shows a data configuration example of note classification data.
- FIG. 24 shows a data configuration example of a creation probability setting table.
- FIG. 25 shows a data configuration example of an attribute/color correspondence table.
- FIG. 26 shows a data configuration example of particle data.
- FIG. 27 shows an example of generation percentage control data.
- FIG. 28 shows an example of total generation count control data.
- FIG. 29 is a flowchart of a game process.
- FIG. 30 is a flowchart of a character creation process executed during the game process.
- FIG. 31 is a flowchart of a creation probability determination process executed during the character creation process.
- FIG. 32 is a flowchart of a filtering process executed during the creation probability determination process.
- FIG. 33 is a flowchart of a chord formation determination process executed during the creation probability determination process.
- FIG. 34 is a flowchart of a group count shortage process executed during the creation probability determination process.
- FIG. 35 is a flowchart of a character creation production process executed during the character creation process.
- the invention has been achieved in view of the above-described situation, and may allow a player to estimate the relationship between a game character caused to appear and input sound to a certain extent.
- a game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:
- the note set being one of different types of note sets formed by combining predetermined notes
- a game device comprising:
- a note set detection section which detects a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
- a character selection section which selects a game character caused to appear based on the detection result of the note set detection section
- a character appearance control section which causes the game character selected by the character selection section to appear.
- the note set included in the input sound is detected which is one of different types of note sets formed by combining predetermined notes, and the game character selected based on the detection result is caused to appear in the game.
- the game character caused to appear in the game is determined based on the note set included in the input sound. For example, when detecting a chord such as a major chord or a minor chord as the note set, the game character caused to appear is determined based on the chord included in the input sound. Since the chord is an important element which determines the tone of the input sound, the player can enjoy estimating the game character caused to appear from the tone of the input sound, whereby the player's interest in the game character can be increased.
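The chord-based determination described above can be illustrated as interval-pattern matching over pitch classes. This is a minimal sketch, not the patent's implementation: the pitch-class numbering (0 = C … 11 = B) and the standard triad patterns are assumptions, and the actual embodiment uses a chord classification table (FIG. 17) whose contents are not reproduced here.

```python
# Hedged sketch: classify a note set as a major or minor chord from its
# pitch classes (0 = C ... 11 = B). Triad interval patterns are standard
# music theory, assumed for illustration.
MAJOR = {0, 4, 7}  # root, major third, perfect fifth
MINOR = {0, 3, 7}  # root, minor third, perfect fifth

def classify_chord(pitch_classes):
    """Return ('major'|'minor', root) if the pitch classes form a triad."""
    pcs = set(pitch_classes)
    for root in pcs:
        # Measure every note's interval above the candidate root.
        intervals = {(pc - root) % 12 for pc in pcs}
        if intervals == MAJOR:
            return ("major", root)
        if intervals == MINOR:
            return ("minor", root)
    return None  # not one of the recognized note sets
```

For example, the pitch classes {9, 0, 4} (A, C, E in any octave) are recognized as a minor triad rooted on A.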
- the note sets may be associated in advance with the game characters
- the method may further comprise determining selection candidate characters including at least the game character corresponding to the detected note set;
- the game character caused to appear may be selected from the determined selection candidate characters.
- the note sets are associated in advance with the game characters, and the game character caused to appear is selected from the selection candidate characters including at least the game character corresponding to the detected note set. Specifically, the game character selected from the selection candidate characters including the game character corresponding to the detected note set is caused to appear as a new game character. Therefore, since the game character corresponding to the detected note set appears only with a certain probability, the game character caused to appear may differ even when the same sound is input, whereby the player can enjoy the game.
- the note set included in the input sound may be detected at given time intervals.
- the selection candidate characters may be determined based on a detection total count of each of the note sets detected.
- the note set included in the input sound is detected at given time intervals, and the selection candidate characters are determined based on the detection total count of each note set. Therefore, by determining, for example, each game character corresponding to a note set with a detection total count equal to or greater than a specific number to be a selection candidate character, the game character determined to correspond to the tone of the input sound appears as a new game character with a higher probability.
- the selection candidate character corresponding to the note set with a larger detection total count may be selected as the game character caused to appear with a higher probability.
- the game character caused to appear is selected so that the selection candidate character corresponding to the note set with a larger detection total count is selected with a higher probability. Specifically, since the game character corresponding to the note set with the largest detection total count appears as a new game character with the highest probability, the game character determined to correspond to the tone of the input sound appears as a new game character with the highest probability.
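The probability-weighted selection above can be sketched as a weighted random draw. The character names and the direct use of detection counts as weights are illustrative assumptions; the patent's embodiment derives creation probabilities through its own tables.

```python
import random

# Hedged sketch: each selection candidate character is weighted by the
# detection total count of its note set, so the most-detected note set's
# character is the most likely, but not guaranteed, to appear.
def select_character(detection_counts, rng=random):
    """detection_counts: {character: detection total count of its note set}."""
    candidates = list(detection_counts)
    weights = [detection_counts[c] for c in candidates]
    # random.choices draws with probability proportional to the weights.
    return rng.choices(candidates, weights=weights, k=1)[0]
```

With hypothetical counts such as {"salamander": 8, "undine": 2}, the first character would appear roughly four times as often as the second.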
- the game character determined to be the selection candidate character may be associated in advance with a set note content, which is a percentage of a predetermined note in the input sound;
- the method may further comprise determining the set note content of the input sound input to the sound input section;
- the game character corresponding to the determined set note content may be determined to be included in the selection candidate characters.
- the game character corresponding to the set note content of the input sound is further included in the selection candidate characters.
- a special game character exists which may appear according to the percentage of a specific note in the input sound. This makes it possible to further increase the player's interest in the game character caused to appear.
- the game process control method may further comprise:
- a special character may be determined to be included in the selection candidate characters when a detection total count of the detected set note has reached a specific number.
- the game character may be associated in advance with each of a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time;
- the game character corresponding to the time condition satisfied by an input time of the input sound from the sound input section may be selected as the game character caused to appear.
- the game character is caused to appear which is associated in advance with the time condition satisfied by the input time of the input sound, among a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time.
- the game character caused to appear differs depending on the input time, even if the same sound is input, whereby the game playability can be increased.
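The time-condition lookup can be sketched as a small table keyed on the input time. The conditions and character labels below are invented for illustration; the patent does not specify any particular division of the day.

```python
from datetime import datetime

# Hedged sketch: the period in which sound may be input is divided by time
# of day, and the character associated with the first satisfied condition
# is selected. All entries here are hypothetical.
TIME_CONDITIONS = [
    (lambda t: t.hour < 6, "darkness-type character"),
    (lambda t: 6 <= t.hour < 18, "light-type character"),
    (lambda t: True, "wind-type character"),  # fallback: evening hours
]

def character_for_input_time(input_time):
    """Return the character whose time condition the input time satisfies."""
    for condition, character in TIME_CONDITIONS:
        if condition(input_time):
            return character
```

Under this hypothetical table, the same melody input at 3 a.m. and at noon would yield different characters, matching the behavior described above.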
- the game process control method may further comprise:
- the note set may be detected which includes the notes input at the same input timing.
- the input timing of each note included in the input sound is detected, and the note set including the notes input at the same input timing is detected.
- the note set included in the input sound input to the sound input section may be detected in note name units.
- the note set included in the input sound is detected in note name units (i.e., notes having the same note name are regarded as the same note). Specifically, since the note set is detected in note name units irrespective of the octave, the note set is more easily detected.
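Detection in note name units amounts to a reduction modulo 12, so that the same note name in any octave counts as one note. Representing detected notes as MIDI note numbers is an assumption made for this sketch.

```python
# Hedged sketch: collapse octaves so C4 (MIDI 60) and C5 (MIDI 72) both
# count as the note name "C", making a note set easier to match.
NOTE_NAMES = ["C", "C#", "D", "D#", "E", "F",
              "F#", "G", "G#", "A", "A#", "B"]

def to_note_name_units(midi_notes):
    """Map MIDI note numbers to the set of note names they represent."""
    return {NOTE_NAMES[n % 12] for n in midi_notes}
```

For instance, the notes C4, C5, E4, and G4 reduce to the three note names {C, E, G}, which can then be matched against a note set regardless of octave.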
- the game process control method may further comprise:
- the note set may be detected using the input sound subjected to the filtering process as the input sound input to the sound input section.
- the note set is detected based on only the notes included in the input sound and having a specific intensity. Specifically, a weak note which is included in the input sound and does not have a specific intensity is not detected as a note.
- the filtering process may include causing a portion of the input sound input to the sound input section in which a specific number or more of notes are input at the same time to be silent.
- a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent.
- when a number of notes are input at the same time, some note set is almost necessarily detected.
- according to the invention, since a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent, an unfair input operation of inputting a number of notes at the same time can be prevented.
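Both filtering rules (dropping weak notes and silencing portions with too many simultaneous notes) can be sketched per detection frame. The frame representation, the intensity floor, and the threshold of five are assumptions; the embodiment's figures mention thresholds of both five and seven.

```python
# Hedged sketch of the filtering process over per-frame note detections.
MIN_INTENSITY = 0.2   # assumed "specific intensity" floor
MAX_SIMULTANEOUS = 5  # assumed cut-off for simultaneous notes

def filter_frames(frames):
    """frames: list of {note: intensity} dicts, one per detection time."""
    filtered = []
    for frame in frames:
        # Rule 1: drop weak notes that do not have the specific intensity.
        strong = {n: v for n, v in frame.items() if v >= MIN_INTENSITY}
        # Rule 2: silence the frame when too many notes sound at once,
        # preventing the unfair input of a number of notes simultaneously.
        filtered.append({} if len(strong) >= MAX_SIMULTANEOUS else strong)
    return filtered
```

Note set detection would then run on the filtered frames rather than on the raw input sound.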
- a computer-readable information recording medium storing a program for causing a computer to execute the game process control method.
- the term “information storage medium” used herein refers to a storage medium, such as a hard disk, an MO, a CD-ROM, a DVD, a memory card, or an IC memory, from which the stored information can be read by a computer.
- FIG. 1 is a view showing an example of the outward appearance of a portable game device 1 according to this embodiment.
- the portable game device 1 is a folding-type game device in which an upper housing 10A and a lower housing 10B are connected through a hinge 11 so that the portable game device 1 can be opened and shut.
- FIG. 1 illustrates the portable game device 1 in an open state (during use).
- the inner sides of the housings 10A and 10B are provided with two displays 12A and 12B disposed on either side of the hinge 11 during use, a speaker 13, a microphone 14, various operation buttons 15, and the like.
- a touch panel is integrally formed in the display 12B over the entire display region.
- the touch panel detects a touch position in units of dots forming the display 12B according to a detection principle such as a pressure-sensitive method, an optical method, an electrostatic method, or an electromagnetic induction method, for example.
- the player can input various operations by utilizing a stylus pen 30 provided as an accessory, or by touching the display 12B.
- Game information including a program and data necessary for the portable game device 1 to execute a game process and the like is stored in a cartridge 20 removable from a slot 16 formed in the side surface of the housing 10B.
- the portable game device 1 may connect to a wireless communication channel through a built-in wireless communication device 18 and acquire the game information from an external instrument.
- the portable game device 1 includes a control device 17 including a CPU and an IC memory, the wireless communication device 18 for performing wireless communication conforming to a wireless LAN standard, a reading device for the cartridge 20, and the like.
- the CPU provided in the control device 17 executes various game processes based on a program and data read from the IC memory and the cartridge 20, a touch position detected by the touch panel, a sound signal input from the microphone 14, an operation signal input from the operation buttons 15, data received by the wireless communication device 18, and the like, and generates an image signal of a game screen and a sound signal of game sound.
- the CPU outputs the generated image signal to the displays 12A and 12B to cause the displays 12A and 12B to display a game screen, and outputs the generated sound signal to the speaker 13 to cause the speaker 13 to output game sound.
- the player enjoys the breeding game by operating the operation buttons 15 or touching the display 12B while watching the game screens displayed on the displays 12A and 12B.
- the player acquires an egg as one type of item during the game.
- the player causes a game character (hereinafter simply called “character”) to be created (appear) by playing a melody for the egg, and rears the created character.
- the player can play a melody by inputting melody sound (music) through the microphone 14.
- Different types of eggs are provided which differ in outward appearance.
- Different types of characters are set for each type of egg as characters created from the egg.
- One of the characters corresponding to the melody played by the player is created.
- the character created from a single egg differs depending on the melody played by the player.
- the character to be created differs depending on the type of egg, even if the player plays the same melody.
- An attribute is set for each character.
- the attribute is a parameter by which each character is classified.
- the attribute affects the rearing of the character, the game process, and the like.
- the attribute is classified as “fire”, “wind”, “earth”, “water”, “light”, and “darkness” (six types in total).
- FIGS. 2A and 2B are views showing an example of a game screen when creating a character.
- FIG. 2A shows a game screen displayed on the display 12 B
- FIG. 2B shows a game screen displayed on the display 12 A.
- an egg selection screen for selecting an egg from which a character is created is displayed on the display 12B.
- Different types of eggs OB provided in advance are listed on the egg selection screen together with the number of eggs currently possessed by the player.
- One of the displayed eggs OB is in a selected state.
- three of all types of eggs OB are displayed.
- the remaining eggs OB are displayed by scrolling the screen.
- the egg OB at the center of the screen is in a selected state and enclosed by a frame M indicating the selected state.
- a character list screen which is a list of the characters set for the egg is displayed on the display 12 A.
- the characters set for the egg which is in a selected state on the egg selection screen (i.e., characters which may be created from that egg) are displayed on the character list screen.
- the name of the character which has been created and is possessed by the player is displayed, and the name of the character which is not possessed by the player is not displayed (indicated by “???” in FIG. 2B ). This allows the player to easily determine whether or not the player possesses each character.
- the player selects the desired egg on the egg selection screen.
- the player inputs melody sound through the microphone 14 by producing a sound or playing a musical instrument according to a countdown instruction displayed on the display 12 A, for example.
- the portable game device 1 then performs a specific analysis process for the input sound, and causes the character corresponding to the processing results to be created from the selected egg. Note that the character is not necessarily created depending on the input sound.
- FIG. 3 is a view showing an example of the character creation production screen. As shown in FIG. 3 , the selected egg OB and a number of spherical particles P are displayed on the character creation production screen.
- the particle P indicates the character to be created. Different types (three types in FIG. 3 ) of particles P (P 1 to P 3 ) are displayed in combination. Each particle P has an identical shape, but differs in color depending on the type. Each particle P is displayed using animation techniques so that the particle P generated from the egg OB moves around the egg OB and is diffused. Each particle P disappears after a specific period of time (about a few seconds) has elapsed.
- the numbers of respective particles P displayed change with the passage of time. Specifically, the numbers of respective particles P displayed are almost the same when the display of the character creation production screen starts, and gradually change (increase/decrease) with the passage of time.
- One type of particles P (i.e., the particles P of a color corresponding to the attribute of the character to be created) gradually increases in number relative to the other types.
- a change in the number of particles P displayed is controlled by changing the number of particles P generated.
- FIG. 4 is a view showing an example of the character creation screen. As shown in FIG. 4 , a state in which a new character CH is created from the selected egg OB is displayed on the character creation screen. The created character CH is added to the possessed characters, and the number of eggs of the selected type is decremented (reduced) by one.
- FIG. 5 is a block diagram showing a functional configuration of the portable game device 1 .
- the portable game device 1 is functionally configured to include an operation input section 100 , a sound input section 200 , a processing section 300 , an image display section 400 , a sound output section 500 , a communication section 600 , and a storage section 700 .
- the operation input section 100 receives an operation instruction input from the player, and outputs an operation signal corresponding to the operation to the processing section 300 .
- the function of the operation input section 100 is implemented by a button switch, a lever, a dial, a mouse, a keyboard, various sensors, and the like.
- the operation button 15 and the touch panel integrally formed in the display 12 B correspond to the operation input section 100 .
- the sound input section 200 collects sound such as voice input by the player, and outputs a sound signal corresponding to the collected sound to the processing section 300 .
- the function of the sound input section 200 is implemented by a microphone or the like. In FIG. 1 , the microphone 14 corresponds to the sound input section 200 .
- the processing section 300 controls the entire portable game device 1 and performs various calculations such as proceeding with the game and generating an image.
- the function of the processing section 300 is implemented by a calculation device such as a CPU (CISC or RISC) or an ASIC (e.g. gate array) and its control program, for example.
- the CPU provided in the control device 17 corresponds to the processing section 300 .
- the processing section 300 includes a game calculation section 310 which mainly performs game calculations, an image generation section 330 which generates a game image based on various types of data calculated by the game calculation section 310 , and a sound generation section 340 which generates game sound such as effect sound and background music (BGM).
- the game calculation section 310 performs various game processes based on the operation signal input from the operation input section 100 , the sound signal input from the sound input section 200 , a program and data read from the storage section 700 , and the like.
- the game calculation section 310 includes a character creation control section 320 , and realizes the breeding game by performing a game process based on a game program 710 .
- the character creation control section 320 includes a creation probability determination section 321 and a creation production section 322 , and performs a process relating to the creation of a character. Specifically, the character creation control section 320 refers to possessed item data 731 , and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs currently possessed by the player, as shown in FIG. 2A , for example.
- the possessed item data 731 is data relating to the currently possessed items.
- FIG. 6 shows an example of the data configuration of the possessed item data 731 .
- the possessed item data 731 includes possessed egg data 731 a and possessed character data 731 d .
- the possessed egg data 731 a is data relating to the possession of eggs, in which an egg type 731 b and a possession count 731 c are stored while being associated with each other.
- the possessed character data 731 d is data relating to the possession of characters, in which a character type 731 f and a possession count 731 g are stored while being associated with each other in units of attributes 731 e of characters.
- the character creation control section 320 refers to a character setting table 732 , and causes the image display section 400 to display the character list screen in which a list of the characters corresponding to the egg selected on the egg selection screen is displayed, as shown in FIG. 2B , for example.
- the character setting table 732 is a data table relating to the characters set for each egg.
- FIG. 7 shows an example of the data configuration of the character setting table 732 .
- the character setting table 732 is provided for each egg type.
- the character setting table 732 stores a corresponding egg type 732 a , and stores a plurality of character types 732 c associated with each attribute 732 b.
- When the egg has been selected on the egg selection screen, the character creation control section 320 performs a specific countdown display and the like, and starts to record the input sound. Specifically, the character creation control section 320 converts the sound input from the sound input section 200 into a digital signal, and stores the digital signal in the storage section 700 as input sound data 721. After completion of recording, the creation probability determination section 321 performs a specific analysis process for the input sound data 721, and determines the creation probability of each of three candidate attributes as the attributes of creation candidate characters based on the processing results.
- the creation probability determination section 321 detects notes within a specific octave range (e.g. three octaves) from the input sound data 721 at specific time intervals (e.g. intervals of 1 ⁇ 8 seconds).
- the notes are within a specific octave range (e.g. three octaves) of which one octave includes “do”, “do#”, “re”, “re#”, “mi”, “fa”, “fa#”, “sol”, “sol#”, “la”, “la#”, “ti” (12 notes in total). These 12 notes are also called note names.
- the note detection results are stored as detected note data 722 .
- FIG. 8 shows an example of the detected note data 722 . As shown in FIG. 8 , the presence or absence of detection of each note 722 b is stored as the detected note data 722 in units of detection time 722 a . In FIG. 8 , “O” indicates that the note is detected, and “x” indicates that the note is not detected.
- the creation probability determination section 321 determines the maximum level (sound intensity) of the detected notes.
- the creation probability determination section 321 excludes the note which is included in the detected note data 722 and does not satisfy a specific level condition from the detected notes.
- the level condition is stored as level condition data 733 .
- FIG. 9 shows an example of the data configuration of the level condition data 733 . As shown in FIG. 9 , the level condition of the note with respect to the maximum level of the detected note is stored as the level condition data 733 .
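The level-condition filter described above can be sketched as follows. The actual threshold held in the level condition data 733 (FIG. 9) is not given in the text, so a ratio relative to the maximum detected level is used here as an assumed placeholder.

```python
def filter_by_level(levels, ratio=0.5):
    """Drop notes failing the level condition at one detection time.

    levels: dict mapping note name -> detected level (sound intensity).
    ratio:  assumed placeholder for the condition stored as level
            condition data 733 (a fraction of the maximum level).
    """
    if not levels:
        return {}
    max_level = max(levels.values())
    return {n: lv for n, lv in levels.items() if lv >= max_level * ratio}
```

For example, `filter_by_level({"do": 10, "re": 4, "mi": 9})` keeps "do" and "mi" but excludes the weak "re".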
- the creation probability determination section 321 determines the detected notes in the detected note data 722 in units of detection time t. When five or more adjacent notes have been detected, the creation probability determination section 321 excludes all notes at the time t from the detected notes (silent). In FIG. 10 ( 1 ), six adjacent notes from “mi” to “la” are detected at the time t n , for example. The creation probability determination section 321 excludes all notes including these six notes from the detected notes so that silence occurs at the time t n , as shown in FIG. 10 ( 2 ).
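The adjacent-note rule can be sketched over chromatic indices (0 to 35 for three octaves); the five-note threshold comes from the text, while the index representation is an assumption.

```python
def drop_adjacent_clusters(indices, min_run=5):
    """Return [] (silence) when min_run or more chromatically adjacent
    notes are detected at one time t, else the indices unchanged.

    indices: sorted chromatic indices of detected notes
             (e.g. 0..35 over three octaves).
    """
    run = 1
    for a, b in zip(indices, indices[1:]):
        run = run + 1 if b == a + 1 else 1
        if run >= min_run:
            return []
    return indices
```

Six adjacent notes such as "mi" through "la" (indices 4 to 9) are silenced, while a spread-out chord survives.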
- the creation probability determination section 321 calculates the total count (detection total count) of each note detected based on the detected note data 722 .
- the creation probability determination section 321 counts notes having the same note name as the same note, irrespective of the octave.
- the creation probability determination section 321 sums up the detection total count of each note to calculate the detection total count of all the notes.
- the creation probability determination section 321 sums up the detection total count of each note provided with sharp “#” (black-key note) to calculate the detection total count of all the black-key notes.
- the black-key notes include “do#”, “re#”, “fa#”, “sol#”, and “la#” (five notes in total).
- FIG. 11 shows an example of the data configuration of the note detection total count data 741 .
- a note 741 a and a detection total count 741 b are stored as the note detection total count data 741 while being associated with each other.
- a detection total count 741 c of all the notes and a detection total count 741 d of all the black-key notes are also stored as the note detection total count data 741 .
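The totals described above can be sketched as a simple counting pass, assuming the detected note data is represented as one set of note names per detection time:

```python
from collections import Counter

BLACK_KEYS = {"do#", "re#", "fa#", "sol#", "la#"}

def note_totals(frames):
    """Build the note detection total count data 741.

    frames: one set of detected note names per detection time (octaves
    already ignored, since same-named notes count as one note).
    Returns (per-note totals, total of all notes, total of black keys).
    """
    totals = Counter()
    for notes in frames:
        totals.update(notes)
    all_total = sum(totals.values())
    black_total = sum(c for n, c in totals.items() if n in BLACK_KEYS)
    return totals, all_total, black_total
```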
- the creation probability determination section 321 determines the start (input timing) of the detected note based on the detected note data 722 . Specifically, when each note in the detected note data 722 satisfies one of the following conditions A 1 to A 3 , the creation probability determination section 321 determines that note to be the start.
- Condition A1: the note has not been detected at the preceding detection time t-1, and is not detected at the subsequent detection time t+1.
- Condition A2: the note has not been detected at the preceding detection time t-1, but is detected at the subsequent detection time t+1, and the level of the note detected at the detection time t+1 is higher than the level of the note detected at the present detection time t.
- Condition A3: the note has not been detected at the preceding detection time t-1, but is detected at the subsequent detection time t+1, and the level of the note detected at the detection time t+1 is lower than the level of the note detected at the present detection time t.
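Conditions A1 to A3 can be expressed per note, with None standing for "not detected" at a neighboring detection time. Taken together, a note at time t is a start whenever it was absent at t-1, except in the unstated case where the levels at t and t+1 are equal:

```python
def is_start(prev_level, cur_level, next_level):
    """Conditions A1-A3 for one note detected at time t.

    prev_level / next_level: level at t-1 / t+1, or None when the note
    was not detected there.  cur_level: level at the present time t.
    """
    if prev_level is not None:
        return False          # every condition requires no note at t-1
    if next_level is None:
        return True           # condition A1
    # conditions A2 (higher at t+1) and A3 (lower at t+1)
    return next_level != cur_level
```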
- the start determination results are stored as start note data 723 .
- FIG. 12 shows an example of the start note data 723 .
- the start note data 723 indicates whether or not each note 723 b is the start note in units of detection time 723 a in the same manner as the detected note data 722 .
- “O” indicates that the note is the start note
- “x” indicates that the note is not the start note.
- the creation probability determination section 321 folds the detected note data 722 of three octaves into one octave in note name units to obtain note-name-unit detection data 724 . As shown in FIG. 13 , the creation probability determination section 321 creates the note-name-unit detection data 724 of one octave by treating notes having the same note name as the same note in units of detection time t, irrespective of the octave.
- the creation probability determination section 321 calculates the number of detected notes in the note-name-unit detection data 724 in units of detection time t. When the calculated number of notes is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (silent). In FIG. 14 ( 1 ), eight notes “do”, “do#”, “mi”, “fa”, “fa#”, “la”, “la#”, and “ti” are detected at the time t n , for example. The creation probability determination section 321 excludes all notes at the time t n including these eight notes from the detected notes, as shown in FIG. 14 ( 2 ).
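The folding and the seven-or-more silence rule can be sketched together, assuming each detection time is represented as a set of (octave, note name) pairs:

```python
def fold_octaves(detected, max_names=6):
    """Fold three-octave detections into one octave in note name units,
    then silence the time t when seven or more note names remain.

    detected: set of (octave, note_name) pairs at one detection time.
    """
    names = {name for _octave, name in detected}
    return set() if len(names) > max_names else names
```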
- the creation probability determination section 321 similarly folds the start note data 723 into one octave in note name units to generate note-name-unit start data 725 .
- the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 .
- A chord refers to a predetermined combination (note set) of notes, such as a major chord or a minor chord.
- the creation probability determination section 321 determines formation of different types of chords. As shown in FIG. 15 , the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 in units of detection time t, and calculates the formation count in chord units. In this case, the creation probability determination section 321 determines formation of one chord at each detection time t.
- the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 in units of detection time t, and calculates the formation count in chord units.
- the creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count.
- the calculated formation count is stored as chord formation count data 742 .
- FIG. 16 shows an example of the data configuration of the chord formation count data 742 .
- a chord determination order 742 a , a chord 742 b , and a formation count 742 c are stored as the chord formation count data 742 while being associated with one another.
- the determination order 742 a is set so that the order of a four-note chord made up of four notes is higher than the order of a three-note chord made up of three notes.
- the formation count 742 c includes the formation count of each of the note-name-unit detection data 724 and the note-name-unit start data 725 and the total value.
- the creation probability determination section 321 determines formation of each chord according to the determination order specified by the chord formation count data 742 . Specifically, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit detection data 724 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. Likewise, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit start data 725 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to obtain the total formation count.
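The per-time chord determination can be sketched with a hypothetical determination order. The actual chords held in the chord formation count data 742 are not listed in the text, so the chord names and note sets below are illustrative only; four-note chords precede three-note chords, as the text specifies.

```python
# Hypothetical determination order: four-note chords are checked before
# three-note chords.  The names and note sets are illustrative, not from
# the patent.
CHORD_ORDER = [
    ("C7", {"do", "mi", "sol", "la#"}),
    ("C",  {"do", "mi", "sol"}),
    ("Am", {"la", "do", "mi"}),
]

def chord_at(names):
    """Return the first chord formed by the detected note names, or None.
    At most one chord is determined at each detection time t."""
    for chord, members in CHORD_ORDER:
        if members <= names:
            return chord
    return None

def formation_counts(frames):
    """Count chord formations over all detection times."""
    counts = {chord: 0 for chord, _ in CHORD_ORDER}
    for names in frames:
        chord = chord_at(names)
        if chord is not None:
            counts[chord] += 1
    return counts
```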
- the creation probability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group according to a chord classification table 734 .
- the chord classification table 734 is a data table which defines the classification of chords.
- FIG. 17 shows an example of the data configuration of the chord classification table 734 .
- a group 734 a and a chord 734 b are stored in the chord classification table 734 while being associated with each other.
- the group 734 a is classified into four groups (A) to (D).
- the scores of the groups (A) to (D) are stored as score data 743 .
- FIG. 18 shows an example of the data configuration of the score data 743 .
- a group 743 a and a score 743 b are stored as the score data 743 while being associated with each other.
- the group 743 a is classified into six groups (A) to (F). In this example, the scores of only the groups (A) to (D) are determined, and the scores of the groups (E) and (F) are set at “0”.
- the creation probability determination section 321 determines whether or not the scores of the groups (A) to (D) satisfy the following condition B.
- Condition B the score of at least one of the groups (A) to (D) is “5” or more, and the scores of three or more groups are “1” or more.
- When the condition B is satisfied, the creation probability determination section 321 selects three groups with higher scores from the groups (A) to (D).
- the creation probability determination section 321 refers to a group/attribute correspondence table 735 , and sets the attributes corresponding to the selected groups to be first to third attributes which are attributes of creation candidate characters in the order from the attribute with the highest score.
- the group/attribute correspondence table 735 is a data table which defines the correspondence between the groups (A) to (D) and the attributes of the characters.
- FIG. 19 shows an example of the data configuration of the group/attribute correspondence table 735 . As shown in FIG. 19 , a group 735 a and an attribute 735 b of a character are stored in the group/attribute correspondence table 735 while being associated with each other.
- the first attribute is “earth” corresponding to the group (C) with the highest score
- the second attribute is “water” corresponding to the group (D) with the second highest score
- the third attribute is “fire” corresponding to the group (A) with the third highest score.
- the creation probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each of the selected groups. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group. In the example shown in FIG. 18 , the score of the group (C) corresponding to the first attribute “earth” is “27”, the score of the group (D) corresponding to the second attribute “water” is “23”, and the score of the group (A) corresponding to the third attribute “fire” is “10”.
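Condition B, the top-three selection, and the ratio calculation can be put together as a sketch, using the group-to-attribute mapping of FIG. 19 and the scores of the FIG. 18 example (fire/group (A) = 10, earth/group (C) = 27, water/group (D) = 23); the wind assignment to group (B) is inferred from the remaining attribute.

```python
GROUP_ATTR = {"A": "fire", "B": "wind", "C": "earth", "D": "water"}

def candidate_attributes(scores):
    """Return [(attribute, creation probability), ...] for the first to
    third attributes, or None when condition B is not satisfied.

    scores: score of each of the groups (A) to (D).
    """
    # Condition B: at least one score is 5 or more, and three or more
    # scores are 1 or more.
    if max(scores.values()) < 5 or sum(s >= 1 for s in scores.values()) < 3:
        return None
    top3 = sorted(scores, key=scores.get, reverse=True)[:3]
    total = sum(scores[g] for g in top3)
    return [(GROUP_ATTR[g], scores[g] / total) for g in top3]
```

With the FIG. 18 scores, "earth" receives 27/60 = 45% as the first attribute.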
- the determined creation probability of each attribute is stored as determined creation probability data 744 .
- FIG. 20 shows an example of the data configuration of the determined creation probability data 744 .
- an attribute 744 a and a creation probability 744 b are stored as the determined creation probability data 744 while being associated with each other.
- the attribute 744 a includes the first to third attributes.
- the creation probability 744 b is set so that the total value is 100%.
- characters with the attributes corresponding to the groups (A) to (D) (i.e., “fire”, “wind”, “earth”, and “water”) are set to be creation candidate characters (selected candidate characters), and the character to be created is selected from these characters.
- When the condition B is not satisfied, the creation probability determination section 321 determines the first to third attributes and the creation probabilities as follows. When the scores of all of the groups (A) to (D) are “0” (i.e., no chord is formed), the creation probability determination section 321 determines that the creation of the character has failed, and does not determine the creation probability.
- the creation probability determination section 321 determines the scores of the groups (E) and (F) by referring to a score setting table 736 , based on the detection total count of all the black-key notes calculated from the detected note data 722 .
- FIG. 21 shows an example of the data configuration of the score setting table 736 .
- a ratio 736 a of the detection total count of all the black-key notes to the detection total count of all the notes and a score 736 b of each of the groups (E) and (F) are stored in the score setting table 736 while being associated with each other.
- the creation probability determination section 321 refers to the note detection total count data 741 , and calculates the ratio of the detection total count of all the black-key notes to the detection total count of all the notes (i.e., the black-key note content).
- the creation probability determination section 321 sets the scores associated with the calculated detection total count ratios in the score setting table 736 to be the scores of the groups (E) and (F).
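The lookup can be sketched as follows; the bands of the score setting table 736 (FIG. 21) are not given in the text, so the thresholds and scores below are assumed placeholders only.

```python
# Assumed placeholder for the score setting table 736:
# (minimum black-key ratio, group (E) score, group (F) score).
ASSUMED_SCORE_TABLE = [
    (0.50, 8, 2),
    (0.25, 4, 1),
    (0.00, 0, 0),
]

def scores_ef(black_total, all_total):
    """Set the scores of the groups (E) and (F) from the ratio of the
    black-key note detection total count to that of all notes."""
    ratio = black_total / all_total if all_total else 0.0
    for threshold, e, f in ASSUMED_SCORE_TABLE:
        if ratio >= threshold:
            return {"E": e, "F": f}
```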
- the creation probability determination section 321 selects three groups with higher scores from the groups (A) to (F), and sets the attributes corresponding to the three selected groups to be the first to third attributes in the order from the attribute with the highest score referring to the group/attribute correspondence table 735 .
- the creation probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each selected group. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group.
- the creation probability determination section 321 determines the first to third attributes and the creation probabilities based on the detection total count of each note calculated from the detected note data 722 .
- When both of the groups (E) and (F) have a score, the creation probability determination section 321 determines the group with the higher score to be a first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735 .
- the creation probability determination section 321 determines the attribute corresponding to the other group to be the second attribute.
- When only one of the groups (E) and (F) has a score, the creation probability determination section 321 determines that group to be the first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735 .
- the creation probability determination section 321 calculates the sum of the detection total counts of the notes belonging to each of the note classification groups (a) to (f) according to a note classification table 737 , and stores the sums as note classification data 745 .
- the note classification table 737 is a data table which defines the classification of notes.
- FIG. 22 shows an example of the data configuration of the note classification table 737 .
- a group 737 a , a note 737 b , and an attribute 737 c of a character are stored in the note classification table 737 while being associated with one another.
- the group 737 a is classified into six groups (a) to (f). Each of the groups (a) to (f) is associated with two notes.
- FIG. 23 shows an example of the data configuration of the note classification data 745 . As shown in FIG. 23 , a group 745 a and a detection total count 745 b are stored as the note classification data 745 while being associated with each other.
- the creation probability determination section 321 refers to the note detection total count data 741 , and calculates the sum of the detection total count of each note belonging to each of the groups (a) to (f).
- the creation probability determination section 321 determines one of the groups (a) to (f) having the largest sum of the detection total counts, and determines the attribute corresponding to the determined group to be the second attribute. When the attribute corresponding to the group having the largest note count coincides with the first attribute, the creation probability determination section 321 determines the attribute corresponding to the group having the second largest note count to be the second attribute.
- the creation probability determination section 321 determines the third attribute based on the possessed characters. Specifically, the creation probability determination section 321 refers to the character setting data 732 corresponding to the type of egg selected for causing the character to be created and the possessed character data 731 d , and calculates the possession ratio of the number of characters possessed by the player and having each attribute to the total number of characters in character's attribute units. The creation probability determination section 321 determines the attribute of which the calculated possession ratio is the smallest to be the third attribute.
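The possession-ratio rule for the third attribute can be sketched as below. The `exclude` parameter is an assumption: the text does not state how a clash with the first or second attribute is resolved, so it is left to the caller.

```python
def third_attribute(char_table, possessed, exclude=()):
    """Determine the third attribute as the one with the smallest
    possession ratio among the characters set for the selected egg.

    char_table: attribute -> character types (character setting data 732).
    possessed:  set of character types owned by the player.
    exclude:    attributes to skip (assumed handling of clashes with the
                first and second attributes).
    """
    def ratio(attr):
        chars = char_table[attr]
        return sum(c in possessed for c in chars) / len(chars)
    candidates = [a for a in char_table if a not in exclude]
    return min(candidates, key=ratio)
```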
- the creation probability determination section 321 determines the creation probability of each of the first to third attributes referring to a creation probability setting table 746 .
- FIG. 24 shows an example of the data configuration of the creation probability setting table 746 .
- a score 746 a of the first group and a creation probability 746 b of each of the first to third attributes are stored in the creation probability setting table 746 while being associated with each other.
- the creation probability determination section 321 determines the creation probability associated with the score of the first group (i.e., one of the groups (a) to (f) corresponding to the first attribute) in the creation probability setting table 746 to be the creation probability of each of the first to third attributes.
- characters with the attributes corresponding to the groups (A) to (F) (i.e., “fire”, “wind”, “earth”, “water”, “light”, and “darkness”) are set to be creation candidate characters, and the character to be created is selected from these characters.
- the character creation control section 320 determines the character to be created according to the creation probability of each attribute determined by the creation probability determination section 321 . Specifically, the character creation control section 320 determines the attribute of the character to be created from the first to third attributes according to the creation probability.
- the character creation control section 320 refers to the possessed character data 731 d and the character setting data 732 corresponding to the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created.
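The two-stage selection, attribute by creation probability, then a random not-yet-possessed character of that attribute, can be sketched as:

```python
import random

def create_character(candidates, char_table, possessed, rng=None):
    """Pick the attribute according to the determined creation
    probabilities, then a random unpossessed character of that attribute.

    candidates: [(attribute, probability), ...] summing to 1.
    char_table: attribute -> character types for the selected egg.
    """
    rng = rng or random.Random()
    attrs, probs = zip(*candidates)
    attr = rng.choices(attrs, weights=probs, k=1)[0]
    pool = [c for c in char_table[attr] if c not in possessed]
    return attr, rng.choice(pool)
```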
- the creation production section 322 then performs a creation production process which presents the creation of the determined character. Specifically, the creation production section 322 determines the color and the generation percentage of each of the first to third particles, i.e., the three types of particles P to be displayed. The first to third particles respectively correspond to the first to third attributes.
- the generation percentage refers to the percentage of the number of respective particles generated in the total generation count which is the total number of first to third particles generated. Specifically, since the respective particles P have the same life (about a few seconds), the number of respective particles P displayed is proportional to the percentage of the respective particles P generated.
- the creation production section 322 refers to the determined creation probability data 744 and an attribute/color correspondence table 751 , and determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively.
- the attribute/color correspondence table 751 is a data table which defines the correspondence between the attribute of a character and a color.
- FIG. 25 shows an example of the data configuration of the attribute/color correspondence table 751 . As shown in FIG. 25 , an attribute 751 a of a character and a color 751 b are stored in the attribute/color correspondence table 751 while being associated with each other.
- the creation production section 322 determines an initial generation percentage which is the generation percentage when the character creation production starts, an intermediate generation percentage which is the generation percentage during the character creation production, and a final generation percentage which is the generation percentage when the character creation production ends as the generation percentage of the respective particles. Specifically, the initial generation percentages of the first to third particles are set at 33%. The creation probabilities of the first to third attributes are respectively set as the intermediate generation percentages of the first to third particles. The final generation percentage of the particle corresponding to the attribute of the character to be created is set at 90%, and the final generation percentages of the remaining particles are set at 5%.
- the determined colors and generation percentages of the respective particles P are stored as particle data 752 .
- FIG. 26 shows an example of the data configuration of the particle data 752 .
- a particle 752 a , a color 752 b , and a generation percentage 752 c are stored as the particle data 752 while being associated with one another.
- the particle 752 a is classified as the first to third particles.
- the generation percentage 752 c includes the initial generation percentage, the intermediate generation percentage, and the final generation percentage.
- the creation production section 322 then generates generation percentage control data 754 for controlling generation of the respective particles based on the determined generation percentages of the respective particles.
- FIG. 27 shows an example of the generation percentage control data 754 .
- FIG. 27 shows the generation percentages of the respective particles P with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation percentage).
- the generation percentages of the first to third particles are 33% (initial generation percentage) at the character creation production start time t 0 .
- the generation percentage is gradually changed (increased/decreased) so that the generation percentage is set at the intermediate generation percentage at the time t 1 during the character creation production and is set at the final generation percentage at the finish time t 2 .
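The schedule just described (initial value at the start time t0, intermediate value at t1, final value at t2, with gradual change in between) amounts to piecewise-linear interpolation. A minimal sketch in Python; the function name and all argument values are illustrative, not from the embodiment:

```python
def generation_percentage(t, t0, t1, t2, initial, intermediate, final):
    """Piecewise-linear schedule: `initial` at t0, `intermediate` at t1,
    `final` at t2 (the assumed form of the generation percentage control data)."""
    if t <= t0:
        return initial
    if t <= t1:
        frac = (t - t0) / (t1 - t0)
        return initial + (intermediate - initial) * frac
    if t <= t2:
        frac = (t - t1) / (t2 - t1)
        return intermediate + (final - intermediate) * frac
    return final

# A particle whose attribute matches the character to be created:
# 33% at the start, 50% (its creation probability) midway, 90% at the finish.
print(generation_percentage(0.5, 0.0, 1.0, 2.0, 33.0, 50.0, 90.0))  # 41.5
```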
- the creation production section 322 causes the image display section 400 to display the character creation production screen in which the selected egg and the respective particles are displayed, as shown in FIG. 3 , and causes the sound output section 500 to output specific production sound, for example.
- the creation production section 322 starts controlling the respective particles in the character creation production screen according to the total generation count control data 753 and the generation percentage control data 754 .
- the total generation count control data 753 is data for controlling the total generation count which is the sum of the numbers of respective particles generated.
- FIG. 28 shows an example of the total generation count control data 753 .
- FIG. 28 shows the total generation count N with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation count N).
- the total generation count N is constant at a total generation count N 1 from the character creation production start time to the time t 1 , and is gradually increased so that the total generation count N reaches a predetermined total generation count N 2 at the finish time t 2 .
- the creation production section 322 determines the total generation count N at the present time from the total generation count control data 753 in units of a specific period of time, and determines the generation percentage of each of the first to third particles from the generation percentage control data 754 .
- the creation production section 322 determines the generation count of each of the first to third particles by multiplying the total generation count N by the generation percentage of the respective particles, and generates the particles P in the determined generation count.
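A sketch of the per-tick computation: the total generation count N read from the control data is multiplied by each particle's generation percentage. Rounding to whole particles is an assumption; the embodiment does not specify it:

```python
def particle_counts(total_n, percentages):
    """Number of each particle type to generate this tick: the total
    generation count N times each generation percentage (percentages
    are expected to sum to roughly 100)."""
    return [round(total_n * p / 100.0) for p in percentages]

# Near the finish time: 90% / 5% / 5% split of a total count of 200.
print(particle_counts(200, [90.0, 5.0, 5.0]))  # [180, 10, 10]
```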
- the creation production section 322 causes the particles P to disappear (to be deleted) when the specific life has expired.
- the creation production section 322 controls the movement of each particle P currently displayed. Specifically, the creation production section 322 sets a moving force field acting on the particle P with the position of the egg being the generation base point (center).
- the moving force field is set as a positive (+) force field which acts to draw the particle P toward the generation base point or a negative (−) force field which acts to move the particle P away from the generation base point.
- the creation production section 322 moves each particle P according to the external force corresponding to the distance from the force field base point applied by the moving force field and the initial speed applied to each particle P. For example, when a velocity vector in the direction which rotates around the generation base point of the moving force field is specified as the initial speed, each particle P is diffused or drawn in while rotating around the egg OB.
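As a rough sketch of this movement model, one can apply an external force proportional to the particle's displacement from the egg's position (one plausible reading of "corresponding to the distance"), with the sign of the strength selecting attraction or repulsion. All names, constants, and the exact force law are illustrative assumptions:

```python
import math

def step_particle(pos, vel, base, strength, dt=1.0 / 60.0):
    # Positive strength draws the particle toward the base point; negative
    # strength pushes it away (assumed force law: F = strength * displacement).
    dx, dy = base[0] - pos[0], base[1] - pos[1]
    ax, ay = strength * dx, strength * dy
    vx, vy = vel[0] + ax * dt, vel[1] + ay * dt
    return (pos[0] + vx * dt, pos[1] + vy * dt), (vx, vy)

# A tangential initial velocity makes the particle circle the egg while
# being drawn in or diffused, as described for the production screen.
pos, vel = (1.0, 0.0), (0.0, 1.0)
for _ in range(60):  # one second at 60 frames per second
    pos, vel = step_particle(pos, vel, (0.0, 0.0), strength=2.0)
print(round(math.hypot(*pos), 3))  # distance from the egg after one second
```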
- the creation production section 322 finishes displaying the character creation production screen, and displays the character creation screen in which the character to be created is displayed, as shown in FIG. 4 , for example.
- the creation production section 322 adds the created character to the possessed characters to update the possessed character data 731 d , and decrements the eggs of the selected type by one to update the possessed egg data 731 a.
- the image generation section 330 generates a game image for displaying a game screen based on the calculation results from the game calculation section 310 , and outputs an image signal of the generated image to the image display section 400 .
- the image display section 400 displays the game screen based on the image signal from the image generation section 330 while redrawing the screen of one frame every 1/60 second, for example.
- the function of the image display section 400 is implemented by hardware such as a CRT, an LCD, an ELD, a PDP, or an HMD.
- the displays 12 A and 12 B correspond to the image display section 400 .
- the sound generation section 340 generates game sound such as effect sound and BGM used during the game, and outputs a sound signal of the generated game sound to the sound output section 500 .
- the sound output section 500 outputs the game sound such as effect sound and BGM based on the sound signal from the sound generation section 340 .
- the function of the sound output section 500 is implemented by a speaker or the like. In FIG. 1 , the speaker 13 corresponds to the sound output section 500 .
- the communication section 600 communicates data with an external device such as another portable game device 1 according to the control signal from the processing section 300 .
- the function of the communication section 600 is implemented by a wireless communication module, a jack for a communication cable, a control circuit, or the like.
- the wireless communication device 18 corresponds to the communication section 600 .
- the storage section 700 stores a system program for implementing the function for causing the processing section 300 to integrally control the portable game device 1 , a program and data necessary for causing the processing section 300 to execute the game, and the like.
- the storage section 700 is used as a work area for the processing section 300 , and temporarily stores the results of calculations performed by the processing section 300 according to various programs, data input from the operation input section 100 , and the like.
- the function of the storage section 700 is implemented by an IC memory, a hard disk, a CD-ROM, a DVD, an MO, a RAM, a VRAM, or the like.
- the ROM, the RAM, and the like provided in the control device 17 correspond to the storage section 700 .
- the storage section 700 also stores the game program 710 for causing the processing section 300 to function as the game calculation section 310 , and game data.
- the game program 710 includes a character creation program 711 for causing the processing section 300 to function as the character creation control section 320 .
- the game data includes the input sound data 721 , the detected note data 722 , the start note data 723 , the note-name-unit detection data 724 , the note-name-unit start data 725 , the possessed item data 731 , the character setting table 732 , the level condition data 733 , the chord classification table 734 , the group/attribute correspondence table 735 , the score setting table 736 , the note classification table 737 , the note detection total count data 741 , the chord formation count data 742 , the score data 743 , the determined creation probability data 744 , the note classification data 745 , the creation probability setting table 746 , the attribute/color correspondence table 751 , the particle data 752 , the total generation count control data 753 , and the generation percentage control data 754 .
- FIG. 29 is a flowchart illustrative of the flow of the game process according to this embodiment. This process is implemented by causing the game calculation section 310 to execute the process based on the game program 710 .
- the game calculation section 310 controls the process of a known breeding game according to the operation input from the operation input section 100 and the like (step A 1 ).
- When an egg has been acquired (step A 3 : YES), the game calculation section 310 adds the acquired egg to the possessed eggs, and updates the possessed egg data 731 a (step A 5 ).
- When character creation has been instructed (step A 7 : YES), the character creation control section 320 performs a character creation process (step A 9 ).
- FIG. 30 is a flowchart illustrative of the flow of the character creation process.
- the character creation control section 320 refers to the possessed egg data 731 a , and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs possessed by the player.
- the character creation control section 320 refers to the character setting table 732 corresponding to the egg selected on the egg selection screen, and causes the image display section 400 to display the character list screen which is a list of the characters set for the selected egg.
- the character creation control section 320 selects one egg from the eggs possessed by the player according to the operation input from the operation input section 100 (step B 1 ).
- When the egg has been selected, the character creation control section 320 performs a sound input process of allowing the player to input melody sound by performing a specific countdown display and the like, and stores sound data input from the sound input section 200 as the input sound data 721 (step B 3 ). The creation probability determination section 321 then performs a creation probability determination process based on the input sound data 721 (step B 5 ).
- FIG. 31 is a flowchart illustrative of the flow of the creation probability determination process.
- the creation probability determination section 321 detects each note within a specific octave from the input sound data 721 , and generates the detected note data 722 (step C 1 ).
- the creation probability determination section 321 performs a filtering process for the detected note data 722 (step C 3 ).
- FIG. 32 is a flowchart illustrative of the flow of the filtering process.
- the creation probability determination section 321 determines the maximum level of the detected notes based on the detected note data 722 (step D 1 ).
- the creation probability determination section 321 refers to the level condition data 733 , and excludes any note in the detected note data 722 which does not satisfy the specific level condition from the detected notes (step D 3 ).
- the creation probability determination section 321 determines the detected notes in the detected note data 722 in units of detection time t. When five or more adjacent notes have been detected, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D 5 ).
- the creation probability determination section 321 calculates the detection total count of each note in the detected note data 722 , and sums up the calculated detection total count of each note to calculate the detection total count of all the notes (step D 7 ).
- the creation probability determination section 321 sums up the detection total count of each black-key note in the detected note data 722 to calculate the detection total count of all the black-key notes (step D 9 ).
- the creation probability determination section 321 combines the detected note data 722 within one octave in note name units to generate the note-name-unit detection data 724 (step D 11 ).
- the creation probability determination section 321 calculates the number of types of detected notes in the note-name-unit detection data 724 in units of detection time t. When the calculated number of types of notes is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D 13 ).
- the creation probability determination section 321 determines the start of each detected note in the detected note data 722 to generate the start note data 723 (step D 15 ).
- the creation probability determination section 321 combines the generated start note data 723 within one octave in note name units to generate the note-name-unit start data 725 (step D 17 ).
- the creation probability determination section 321 thus completes the filtering process.
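As an illustrative sketch of the filtering above, the fragment below applies a hypothetical level condition, silences any detection frame containing five or more adjacent notes, and folds the surviving detections into note-name units within one octave. The actual level condition data 733 is table-driven, and the MIDI-style note numbering is an assumption:

```python
def filter_frame(notes, max_level, level_floor=0.1, dense_limit=5):
    """One detection frame. `notes` maps a MIDI-style note number to its
    detected level. Notes below level_floor * max_level are dropped
    (hypothetical level condition), and the frame becomes silence when
    `dense_limit` or more adjacent notes survive."""
    kept = {n: v for n, v in notes.items() if v >= level_floor * max_level}
    run, prev = 1, None
    for n in sorted(kept):
        run = run + 1 if prev is not None and n == prev + 1 else 1
        if run >= dense_limit:
            return {}  # treat the whole frame as silence
        prev = n
    return kept

def fold_to_note_names(kept):
    """Combine detections within one octave into note-name units (mod 12)."""
    return sorted({n % 12 for n in kept})

# C4, C#4, D4, E4 at healthy levels plus a faint C5 that fails the level test.
frame = {60: 1.0, 61: 0.9, 62: 0.8, 64: 0.7, 72: 0.01}
print(fold_to_note_names(filter_frame(frame, max_level=1.0)))  # [0, 1, 2, 4]
```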
- the creation probability determination section 321 refers to the note detection total count data 741 , and determines the detection total count of all the notes in the detected note data 722 .
- the creation probability determination section 321 performs a chord formation determination process to determine the scores of the groups (A) to (D) (step C 7 ).
- FIG. 33 is a flowchart illustrative of the flow of the chord formation determination process.
- the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 at each detection time t, and calculates the formation count in chord units (step E 1 ). Likewise, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 at each detection time t, and calculates the formation count in chord units (step E 3 ).
- the creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count (step E 5 ).
- the creation probability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group (step E 7 ).
- the creation probability determination section 321 thus completes the chord formation determination process.
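A minimal sketch of the chord formation determination above, using hypothetical chord templates expressed as pitch-class sets and a made-up group assignment; the real chord classification table 734 defines both:

```python
# Hypothetical chord templates (pitch-class sets) and group assignments.
CHORDS = {"C": {0, 4, 7}, "Cm": {0, 3, 7}, "G": {7, 11, 2}}
CHORD_GROUP = {"C": "A", "Cm": "B", "G": "A"}

def chord_counts(frames):
    """Count, per chord, the detection frames whose note-name set
    contains that chord's pitch classes."""
    counts = {name: 0 for name in CHORDS}
    for frame in frames:
        for name, pcs in CHORDS.items():
            if pcs <= frame:
                counts[name] += 1
    return counts

def group_scores(counts):
    """Score of each group = sum of the formation counts of its chords."""
    scores = {}
    for name, c in counts.items():
        g = CHORD_GROUP[name]
        scores[g] = scores.get(g, 0) + c
    return scores

# Note-name sets at successive detection times t.
frames = [{0, 4, 7}, {0, 3, 7}, {2, 7, 11}, {0, 4, 7, 9}]
print(group_scores(chord_counts(frames)))  # {'A': 3, 'B': 1}
```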
- the creation probability determination section 321 determines the score of each of the groups (A) to (D).
- the creation probability determination section 321 selects three groups with higher scores from the groups (A) to (D) (step C 13 ).
- the creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in the order from the attribute with the highest score (step C 15 ), and determines the creation probability of each of the first to third attributes based on the score of each of the selected groups (step C 17 ).
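One plausible way to turn the three selected group scores into creation probabilities is proportional normalization. The embodiment may instead read fixed values from the score setting table 736, so treat this as an assumption:

```python
def creation_probabilities(scores):
    """Scores of the three selected groups -> creation probabilities (%).
    Proportional normalization is an assumed rule, not confirmed by the text."""
    total = sum(scores)
    return [round(100.0 * s / total, 1) for s in scores]

# First to third attributes in descending score order.
print(creation_probabilities([6, 3, 1]))  # [60.0, 30.0, 10.0]
```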
- the creation probability determination section 321 determines that the character is successfully created (step C 29 ).
- the creation probability determination section 321 performs a group count shortage process, and determines the first to third attributes and the creation probabilities (step C 19 ).
- FIG. 34 is a flowchart illustrative of the flow of the group count shortage process.
- the creation probability determination section 321 determines one of the groups (A) to (F) with the highest score to be the first group, and determines the attribute corresponding to the first group to be the first attribute (step F 1 ).
- the creation probability determination section 321 determines the scores of the groups (A) to (F). When the number of groups included in the groups (A) to (F) and having a score of 1 or more is one (step F 3 : YES), the creation probability determination section 321 sums up the detection total count of each note in the detected note data 722 belonging to each of the note classification groups (a) to (f) referring to the note detection total count data 741 (step F 5 ).
- the creation probability determination section 321 selects one of the groups (a) to (f) having the largest detection total count, and determines whether or not the attribute corresponding to the selected group coincides with the first attribute. When the attribute corresponding to the selected group does not coincide with the first attribute (step F 7 : YES), the creation probability determination section 321 sets the attribute corresponding to the selected group to be the second attribute (step F 9 ). When the attribute corresponding to the selected group coincides with the first attribute (step F 7 : NO), the creation probability determination section 321 selects one of the groups (a) to (f) with the second largest detection total count, and sets the attribute corresponding to the selected group to be the second attribute (step F 11 ).
- the creation probability determination section 321 sets the attribute corresponding to one of the groups (A) to (F) with the second highest score to be the second attribute (step F 13 ).
- the creation probability determination section 321 refers to the character setting table 732 corresponding to the type of the selected egg and the possessed item data 731 , and determines the attribute with the minimum possession rate from the attribute of each character set for the egg of the selected type (step F 15 ).
- the creation probability determination section 321 determines whether or not the determined attribute coincides with the first or second attribute (step F 17 ). When the determined attribute does not coincide with the first or second attribute (step F 17 : YES), the creation probability determination section 321 sets the determined attribute to be the third attribute (step F 21 ). When the determined attribute coincides with the first or second attribute (step F 17 : NO), the creation probability determination section 321 determines the attribute of the character with the second smallest possession rate (step F 19 ), and returns to the step F 17 to determine whether or not the newly determined attribute coincides with the first or second attribute.
- the creation probability determination section 321 refers to the score setting table 736 , and determines the creation probability of each of the first to third attributes according to the score of the first group (step F 23 ).
- the creation probability determination section 321 thus completes the group count shortage process.
- the creation probability determination section 321 determines that the character is successfully created (step C 29 ).
- the creation probability determination section 321 determines the number of types of detected notes in the detected note data 722 referring to the note detection total count data 741 .
- the creation probability determination section 321 determines that the character is not successfully created (step C 31 ).
- the creation probability determination section 321 refers to the note detection total count data 741 , and determines the scores of the groups (E) and (F) referring to the score setting table 736 based on the detection total count of all the black-key notes in the detected note data 722 (step C 23 ). The creation probability determination section 321 then determines the score of each of the groups (A) to (F). When the scores of three or more of the groups (A) to (F) are 1 or more (step C 25 : YES), the creation probability determination section 321 selects three groups with higher scores from the groups (A) to (F) (step C 27 ).
- the creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in the order from the attribute with the highest score (step C 15 ), and determines the creation probability of each of the first to third attributes based on the score of each of the selected groups (step C 17 ). The creation probability determination section 321 determines that the character is successfully created (step C 29 ).
- When the number of the groups (A) to (F) with a score of 1 or more is less than three (step C 25 : NO), the creation probability determination section 321 performs the group count shortage process, and determines the first to third attributes and the creation probabilities (step C 19 ). The creation probability determination section 321 then determines that the character is successfully created (step C 29 ).
- the creation probability determination section 321 thus completes the creation probability determination process.
- the character creation control section 320 determines whether or not the character is successfully created.
- the character creation control section 320 determines the character to be created according to the type of the selected egg and the creation probability of each attribute determined. Specifically, the character creation control section 320 determines the attribute of the character to be created according to the creation probability of each attribute determined.
- the character creation control section 320 refers to the character setting table 732 corresponding to the type of the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created (step B 9 ).
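The two-stage selection (an attribute drawn according to its creation probability, then a random not-yet-possessed character of that attribute) can be sketched as follows; the character and attribute names are invented for illustration:

```python
import random

def choose_character(prob_by_attr, candidates_by_attr, possessed, rng=random):
    """Pick an attribute weighted by its creation probability, then pick a
    random character of that attribute that the player does not yet possess."""
    attrs = list(prob_by_attr)
    weights = [prob_by_attr[a] for a in attrs]
    attr = rng.choices(attrs, weights=weights, k=1)[0]
    pool = [c for c in candidates_by_attr[attr] if c not in possessed]
    return attr, rng.choice(pool)

rng = random.Random(0)  # seeded for reproducibility
attr, name = choose_character(
    {"fire": 60, "water": 30, "wind": 10},  # creation probabilities (%)
    {"fire": ["Salamander", "Ifrit"], "water": ["Undine"], "wind": ["Sylph"]},
    possessed={"Ifrit"},
    rng=rng,
)
print(attr, name)
```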
- the character creation control section 320 decrements (reduces) the eggs of the selected type by one to update the possessed egg data 731 a (step B 11 ).
- the creation production section 322 then performs a character creation production process (step B 13 ).
- FIG. 35 is a flowchart illustrative of the flow of the character creation production process.
- the creation production section 322 determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively (step G 1 ).
- the creation production section 322 determines the generation percentage of each of the first to third particles. Specifically, the creation production section 322 sets the initial generation percentage of each of the first to third particles at the same value (33%) (step G 3 ). The creation production section 322 sets the creation probabilities of the first to third attributes as the intermediate generation percentages of the first to third particles, respectively (step G 5 ). The creation production section 322 sets the final generation percentage of the particle corresponding to the attribute of the character to be created at 90%, and sets the final generation percentages of the remaining particles at 5% (step G 7 ). The creation production section 322 refers to the particle data 752 , and generates the generation percentage control data 754 according to the generation percentage of each of the first to third particles (step G 9 ).
- the creation production section 322 causes the image display section 400 to display the character creation production screen in which the egg of the selected type and the respective particles are disposed, and starts controlling each particle in the character creation screen according to the total generation count control data 753 and the generated generation percentage control data 754 (step G 11 ).
- the creation production section 322 finishes displaying the character creation production screen, and causes the image display section 400 to display the character creation screen displaying a state in which the character is created (step G 15 ).
- the creation production section 322 adds the created character to the possessed characters to update the possessed character data 731 d (step G 17 ).
- the creation production section 322 thus completes the character creation production process.
- When the character is not successfully created in the step B 7 in FIG. 30 (step B 7 : NO), the character creation control section 320 performs a character creation failure production process, such as causing the image display section 400 to display the creation failure screen showing that the character is not successfully created, or causing the sound output section 500 to output sound (step B 15 ).
- the character creation control section 320 thus completes the character creation process.
- the game calculation section 310 then determines whether or not to finish the game (step A 11 ). When the game calculation section 310 does not finish the game (step A 11 : NO), the game calculation section 310 transitions to the step A 1 . When the game calculation section 310 finishes the game (step A 11 : YES), the game calculation section 310 finishes the game process to finish the game.
- the first to third attributes are determined as the attributes of the creation candidate characters from the attributes of the characters based on the determined formation count in chord units, and the creation probability of each of the first to third attributes is determined.
- the character with one of the first to third attributes determined according to the creation probability is created and added to the possessed characters.
- since the character corresponding to the type of chord included in the input sound is created depending on the probability, the character to be created may differ even if the same melody is input, whereby the player can enjoy the game.
- the creation probability of each of the first to third attributes is determined by the formation count of the corresponding chord. Specifically, when the number of specific chords is large, the character with the attribute associated with the specific chord is created with a high probability.
- the chord is an important element which determines the tone of the melody. Therefore, the player can enjoy estimating the character to be created from the tone of the input melody.
- the characters with the attributes “light” and “darkness” respectively corresponding to the groups (E) and (F) are included in the attributes of the creation candidate characters by determining the scores of the groups (E) and (F) according to the ratio of the detection total count of all the black-key notes to the detection total count of all the notes.
- the characters with the attributes “light” and “darkness” may be included in the creation candidate characters according to the detection total count of all the black-key notes, for example. Specifically, when the detection total count of all the black-key notes is equal to or greater than a first specific number and less than a second specific number, the character with the attribute “light” or “darkness” is included in the creation candidate characters. When the detection total count of all the black-key notes is equal to or greater than the second specific number, the characters with the attributes “light” and “darkness” are included in the creation candidate characters, for example. Note that the second specific number is greater than the first specific number.
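A sketch of the threshold scheme described above, with hypothetical values for the first and second specific numbers (the text only requires the second to be greater than the first):

```python
def included_extra_attributes(black_count, first=8, second=16):
    """Decide which of the attributes "light" / "darkness" join the
    creation candidates, based on the black-key detection total count.
    The threshold values 8 and 16 are illustrative assumptions."""
    if black_count >= second:
        return {"light", "darkness"}
    if black_count >= first:
        return {"light"}  # which single attribute is chosen is unspecified
    return set()

for count in (5, 10, 20):
    print(count, sorted(included_extra_attributes(count)))
```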
- the particle P is spherical.
- the particle P may have another shape such as a triangle, a quadrangle, or a line.
- a state in which the spherical particles P are mixed may be displayed as a cloud or smoke.
- the display state such as the size, shape, or brightness of each particle P may be changed with the passage of time. In this case, it is desirable that the color of the particle P not be changed because the color of the particle P indicates the corresponding attribute.
- the number of respective particles P is changed by generating the particles P or causing the particles P to disappear.
- the total number of particles P may be constant without generating the particles P or causing the particles P to disappear, and the ratio of the numbers of respective particles P may be changed by changing the color of each particle P.
- each particle P has the same life. Note that the ratio of the numbers of respective particles P may be changed by changing the life of each particle P depending on the type.
- a character randomly selected from the characters corresponding to the attribute determined based on the input sound is created.
- the character to be created may be selected based on the date (date and time). Specifically, the selection probability of each time zone obtained by dividing one day (24 hours) into a plurality of time zones (e.g. morning, daytime, and night) is set for each character. The character to be created is selected according to the selection probability corresponding to the time zone corresponding to the time at which the melody sound is input among the selection probabilities in time zone units set for each character corresponding to the determined attribute.
- likewise, the selection probability of each season obtained by dividing one year (365 days) into a plurality of seasons (e.g. spring, summer, autumn, and winter) may be set for each character, and the character to be created may be selected according to the selection probability corresponding to the date at which the melody sound is input. This allows the character to be created to be changed corresponding to the date at which the melody sound is input.
- the time zone may be associated with the character instead of the selection probability, and the character may be created which corresponds to the time zone corresponding to the time at which the melody sound is input.
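The time-zone variation can be sketched as below; the zone boundaries and the per-character selection probabilities are illustrative only:

```python
def time_zone(hour):
    """Divide the day into hypothetical zones: morning, daytime, night."""
    if 5 <= hour < 11:
        return "morning"
    if 11 <= hour < 18:
        return "daytime"
    return "night"

# Selection probabilities in time-zone units, set per character (made up).
SELECTION = {
    "Salamander": {"morning": 10, "daytime": 60, "night": 30},
    "Ifrit":      {"morning": 50, "daytime": 20, "night": 30},
}

def zone_weights(characters, hour):
    """Weights to use when selecting the character to be created at `hour`."""
    z = time_zone(hour)
    return {c: SELECTION[c][z] for c in characters}

# Melody input at 13:00 favors Salamander under these example settings.
print(zone_weights(["Salamander", "Ifrit"], 13))  # {'Salamander': 60, 'Ifrit': 20}
```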
- the character is created after performing creation production of displaying the particle P corresponding to the attribute of each creation candidate character.
- the character may be created during creation production, the character may be created at the same time as creation production, or creation production may be performed after (immediately after) creating the character.
- the attribute is set in advance for each character as the parameter by which each character is classified.
- the capability parameter of each character may be employed such as offensive power, defensive power, or witchcraft.
- the above-described embodiments illustrate the case of applying the invention to the portable game device.
- the invention can also be applied to other devices which can execute a game, such as a consumer game device, an arcade game device, and a portable telephone.
Abstract
A chord type is associated in advance with an attribute of a character. Whether or not a chord is formed is determined from input sound in units of detection time t at specific time intervals, first to third attributes are determined as the attributes of creation candidate characters from the attributes of the characters based on the formation count in chord units, and the creation probability of each of the first to third attributes is determined based on the formation count of the chord of the corresponding type. A character with one of the first to third attributes determined according to the creation probability is created.
Description
- Japanese Patent Application No. 2006-234164 filed on Aug. 30, 2006, is hereby incorporated by reference in its entirety.
- The present invention relates to a game device which causes a new character based on input sound to appear and the like.
- A game device has been known which includes a sound input means such as a microphone and utilizes sound input from the sound input means for a game process. For example, technology has been known which determines the parameter of a character caused to appear based on the input sound. According to this technology, the input sound (analog signal) is converted into a digital signal, and the digital signal is converted into a numerical value in frequency band units to create sequence data. Whether or not an arbitrary value in the sequence data coincides with predetermined reference data is determined, and the parameter of the character is determined based on the determination results (e.g. Japanese Patent No. 2860097).
- According to the technology disclosed in Japanese Patent No. 2860097, the parameter of the character caused to appear is determined based on the input sound. However, a parameter irrelevant to the meaning of the input sound is generated. Specifically, since the player cannot expect the parameter generated based on the input sound, the parameter of the character is virtually determined at random. Therefore, it may be troublesome for the player to input sound for generating a new character, whereby the player may lose interest in the game.
- According to one aspect of the invention, there is provided a game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:
- detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
- selecting a game character caused to appear based on the detection result;
- causing the selected game character to appear; and
- controlling display of each game character including the new game character.
- FIG. 1 shows an example of the outward appearance of a portable game device.
- FIG. 2A shows an example of an egg selection screen, and FIG. 2B shows an example of a character list screen.
- FIG. 3 shows an example of a character creation production screen.
- FIG. 4 shows an example of a character creation screen.
- FIG. 5 shows a functional configuration example of a portable game device.
- FIG. 6 shows a data configuration example of possessed item data.
- FIG. 7 shows a data configuration example of a character setting table.
- FIG. 8 shows a data configuration example of detected note data.
- FIG. 9 shows a data configuration example of level condition data.
- FIG. 10 is a view illustrative of causing silence to occur at a detection time t at which five or more consecutive notes are detected.
- FIG. 11 shows a data configuration example of note detection total count data.
- FIG. 12 shows a data configuration example of start note data.
- FIG. 13 is a view illustrative of generation of note-name-unit detection data from detected note data.
- FIG. 14 is a view illustrative of causing silence to occur at a detection time at which seven or more consecutive notes are detected.
- FIG. 15 is a view illustrative of determination of formation of a chord in note-name-unit detection data.
- FIG. 16 shows a data configuration example of chord formation count data.
- FIG. 17 shows a data configuration example of a chord classification table.
- FIG. 18 shows a data configuration example of score data.
- FIG. 19 shows a data configuration example of a group/attribute correspondence table.
- FIG. 20 shows a data configuration example of determined creation probability data.
- FIG. 21 shows a data configuration example of a score setting table.
- FIG. 22 shows a data configuration example of a note classification table.
- FIG. 23 shows a data configuration example of note classification data.
- FIG. 24 shows a data configuration example of a creation probability setting table.
- FIG. 25 shows a data configuration example of an attribute/color correspondence table.
- FIG. 26 shows a data configuration example of particle data.
- FIG. 27 shows an example of generation percentage control data.
- FIG. 28 shows an example of total generation count control data.
- FIG. 29 is a flowchart of a game process.
- FIG. 30 is a flowchart of a character creation process executed during the game process.
- FIG. 31 is a flowchart of a creation probability determination process executed during the character creation process.
- FIG. 32 is a flowchart of a filtering process executed during the creation probability determination process.
- FIG. 33 is a flowchart of a chord formation determination process executed during the creation probability determination process.
- FIG. 34 is a flowchart of a group count shortage process executed during the creation probability determination process.
- FIG. 35 is a flowchart of a character creation production process executed during the character creation process.
- The invention has been achieved in view of the above-described situation, and may allow a player to estimate the relationship between a game character caused to appear and input sound to a certain extent.
- According to one embodiment of the invention, there is provided a game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:
- detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
- selecting a game character caused to appear based on the detection result;
- causing the selected game character to appear; and
- controlling display of each game character including the new game character.
- According to another embodiment of the invention, there is provided a game device comprising:
- a sound input section;
- a note set detection section which detects a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
- a character selection section which selects a game character caused to appear based on the detection result of the note set detection section; and
- a character appearance control section which causes the game character selected by the character selection section to appear.
- According to the above embodiment, the note set included in the input sound is detected which is one of different types of note sets formed by combining predetermined notes, and the game character selected based on the detection result is caused to appear in the game. Specifically, the game character caused to appear in the game is determined based on the note set included in the input sound. For example, when detecting a chord such as a major chord or a minor chord as the note set, the game character caused to appear is determined based on the chord included in the input sound. Since the chord is an important element which determines the tone of the input sound, the player can enjoy estimating the game character caused to appear from the tone of the input sound, whereby the player's interest in the game character can be increased.
- In the game process control method, the note sets may be associated in advance with the game characters;
- the method may further comprise determining selection candidate characters including at least the game character corresponding to the detected note set; and
- the game character caused to appear may be selected from the determined selection candidate characters.
- According to this feature, the note sets are associated in advance with the game characters, and the game character caused to appear is selected from the selection candidate characters including at least the game character corresponding to the detected note set. Specifically, the game character selected from the selection candidate characters including the game character corresponding to the detected note set is caused to appear as a new game character. Therefore, since the game character corresponding to the detected note set appears only with a certain probability, the game character caused to appear may differ even if the same sound is input, whereby the player can enjoy the game.
- In the game process control method, the note set included in the input sound may be detected at given time intervals; and
- the selection candidate characters may be determined based on a detection total count of each of the note sets detected.
- According to this feature, the note set included in the input sound is detected in units of time at given time intervals, and the selection candidate characters are determined based on the detection total count of each note set. Therefore, when, for example, the game character corresponding to a note set with a detection total count equal to or greater than a specific number is determined to be a selection candidate character, the game character determined to correspond to the tone of the input sound appears as a new game character with a higher probability.
- In the game process control method, the selection candidate character corresponding to the note set with a larger detection total count may be selected as the game character caused to appear with a higher probability.
- According to this feature, the game character caused to appear is selected so that the selection candidate character corresponding to the note set with a larger detection total count is selected with a higher probability. Specifically, since the game character corresponding to the note set with the largest detection total count appears as a new game character with the highest probability, the game character determined to correspond to the tone of the input sound appears as a new game character with the highest probability.
- In the game process control method, the game character determined to be the selection candidate character may be associated in advance corresponding to a set note content which is a percentage of a predetermined note in the input sound;
- the method may further comprise determining the set note content of the input sound input to the sound input section; and
- the game character corresponding to the determined set note content may be determined to be included in the selection candidate characters.
- According to this feature, the game character corresponding to the set note content of the input sound is further included in the selection candidate characters. Specifically, a special game character exists which may appear according to the percentage of a specific note in the input sound. This makes it possible to further increase the player's interest in the game character caused to appear.
- The game process control method may further comprise:
- detecting whether or not a set note which is a note set in advance is included in the input sound input to the sound input section at given time intervals;
- wherein a special character may be determined to be included in the selection candidate characters when a detection total count of the detected set note has reached a specific number.
- According to this feature, whether or not the set note is included in the input sound is detected in units of time at given time intervals, and the special character is included in the selection candidate characters when the detection total count has reached a specific number. Specifically, a special character exists which may appear only when the set notes are included in the input sound in a number equal to or greater than a specific number.
- In the game process control method, the game character may be associated in advance with each of a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time; and
- the game character corresponding to the time condition satisfied by an input time of the input sound from the sound input section may be selected as the game character caused to appear.
- According to this feature, the game character is caused to appear which is associated in advance with one of the time conditions obtained by dividing a period in which the input sound may be input by date and/or time and satisfied by the input time of the input sound. Specifically, the game character caused to appear differs depending on the input time, even if the same sound is input, whereby the game playability can be increased.
- The game process control method may further comprise:
- detecting an input timing of each note included in the input sound input to the sound input section;
- wherein the note set may be detected which includes the notes input at the same input timing.
- According to this feature, the input timing of each note included in the input sound is detected, and the note set including the notes input at the same input timing is detected.
- In the game process control method, the note set included in the input sound input to the sound input section may be detected in note name units.
- According to this feature, the note set included in the input sound is detected in note name units (i.e., notes having the same note name are regarded as the same note). Specifically, since the note set is detected in note name units irrespective of the octave, the note set is more easily detected.
- The game process control method may further comprise:
- subjecting the input sound to a filtering process by detecting only the notes included in the input sound input to the sound input section and having a specific intensity;
- wherein the note set may be detected using the input sound subjected to the filtering process as the input sound input to the sound input section.
- According to this feature, the note set is detected based on only the notes included in the input sound and having a specific intensity. Specifically, a weak note which is included in the input sound and does not have a specific intensity is not detected as a note.
- In the game process control method, the filtering process may include causing a portion of the input sound input to the sound input section in which a specific number or more of notes are input at the same time to be silent.
- According to this feature, a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent. For example, when inputting a number of notes at the same time by simultaneously pressing piano keys over one octave, a note set is almost necessarily detected. According to the invention, since a portion of the input sound in which a specific number or more of notes are input at the same time is caused to be silent, an unfair input operation of inputting a number of notes at the same time can be prevented.
- According to a further embodiment of the invention, there is provided a computer-readable information recording medium storing a program for causing a computer to execute the game process control method.
- The term “information storage medium” used herein refers to a storage medium, such as a hard disk, an MO, a CD-ROM, a DVD, a memory card, or an IC memory, from which the stored information can be read by a computer. According to the invention, the note set included in the input sound is detected which is one of different types of note sets formed by combining predetermined notes, and the game character selected based on the detection result is caused to appear in the game. Specifically, the game character caused to appear in the game is determined based on the note set included in the input sound. For example, when detecting a chord as the note set, the game character caused to appear is determined based on the chord included in the input sound. Since the chord is an important element which determines the tone of the input sound, the player can enjoy estimating the game character caused to appear from the tone of the input sound, whereby the player's interest in the game character can be increased.
- Preferred embodiments of the invention are described below with reference to the drawings. The following description illustrates an example of causing a portable game device to execute a breeding game. Note that the embodiment to which the invention can be applied is not limited thereto.
-
FIG. 1 is a view showing an example of the outward appearance of a portable game device 1 according to this embodiment. As shown in FIG. 1, the portable game device 1 is a folding-type game device in which an upper housing 10A and a lower housing 10B are connected through a hinge 11 so that the portable game device 1 can be opened and shut. FIG. 1 illustrates the portable game device 1 in an open state (during use).
- The inner sides of the housings 10A and 10B are provided with displays 12A and 12B, which are positioned on either side of the hinge 11 during use, a speaker 13, a microphone 14, various operation buttons 15, and the like. A touch panel is integrally formed in the display 12B over the entire display region. The touch panel detects a touch position in units of dots forming the display 12B according to a detection principle such as a pressure-sensitive method, an optical method, an electrostatic method, or an electromagnetic induction method, for example. The player can input various operations by utilizing a stylus pen 30 provided as an accessory, or by touching the display 12B.
- Game information, including a program and data necessary for the portable game device 1 to execute a game process and the like, is stored in a cartridge 20 removable from a slot 16 formed in the side surface of the housing 10B. The portable game device 1 may also establish a wireless communication channel through a built-in wireless communication device 18 and acquire the game information from an external instrument.
- The portable game device 1 includes a control device 17 including a CPU and an IC memory, the wireless communication device 18 for performing wireless communication conforming to a wireless LAN standard, a reading device for the cartridge 20, and the like. The CPU provided in the control device 17 executes various game processes based on a program and data read from the IC memory and the cartridge 20, a touch position detected by the touch panel, a sound signal input from the microphone 14, an operation signal input from the operation buttons 15, data received by the wireless communication device 18, and the like, and generates an image signal of a game screen and a sound signal of game sound. The CPU outputs the generated image signal to the displays 12A and 12B to cause the displays 12A and 12B to display game screens, and outputs the generated sound signal to the speaker 13 to cause the speaker 13 to output game sound. The player enjoys the breeding game by operating the operation buttons 15 or touching the display 12B while watching the game screens displayed on the displays 12A and 12B. - In the breeding game according to this embodiment, the player acquires an egg as one type of item during the game. The player causes a game character (hereinafter simply called "character") to be created (appear) by playing a melody for the egg, and rears the created character. The player can play a melody by inputting melody sound (music) through the
microphone 14. - Different types of eggs are provided which differ in outward appearance. Different types of characters are set for each type of egg as characters created from the egg. One of the characters corresponding to the melody played by the player is created. Specifically, the character created from a single egg differs depending on the melody played by the player. The character to be created differs depending on the type of egg, even if the player plays the same melody.
- An attribute is set for each character. The attribute is a parameter by which each character is classified. The attribute affects the rearing of the character, the game process, and the like. In this example, the attribute is classified as “fire”, “wind”, “earth”, “water”, “light”, and “darkness” (six types in total).
-
FIGS. 2A and 2B are views showing an example of a game screen when creating a character. FIG. 2A shows a game screen displayed on the display 12B, and FIG. 2B shows a game screen displayed on the display 12A.
- As shown in FIG. 2A, an egg selection screen for selecting an egg from which a character is created is displayed on the display 12B. Different types of eggs OB provided in advance are listed on the egg selection screen together with the number of eggs currently possessed by the player. One of the displayed eggs OB is in a selected state. In FIG. 2A, three of all types of eggs OB are displayed. The remaining eggs OB are displayed by scrolling the screen. Among the three eggs OB displayed, the egg OB at the center of the screen is in a selected state and enclosed by a frame M indicating the selected state.
- As shown in FIG. 2B, a character list screen which is a list of the characters set for the egg is displayed on the display 12A. The characters set for the egg which is in a selected state on the egg selection screen (i.e., the characters which may be created from that egg) are listed in attribute units. The name of a character which has been created and is possessed by the player is displayed, whereas the name of a character which is not possessed by the player is not displayed (indicated by "???" in FIG. 2B). This allows the player to easily determine whether or not the player possesses each character.
- The player selects the desired egg on the egg selection screen. The player then inputs melody sound through the microphone 14 by producing a sound or playing a musical instrument according to a countdown instruction displayed on the display 12A, for example. The portable game device 1 then performs a specific analysis process on the input sound, and causes the character corresponding to the processing results to be created from the selected egg. Note that a character is not necessarily created, depending on the input sound.
- When the player succeeds in creating the character, a character creation production screen is displayed which produces creation of the character for a specific period of time prior to creation of the character. FIG. 3 is a view showing an example of the character creation production screen. As shown in FIG. 3, the selected egg OB and a number of spherical particles P are displayed on the character creation production screen. - The particle P indicates the character to be created. Different types (three types in
FIG. 3 ) of particles P (P1 to P3) are displayed in combination. Each particle P has an identical shape, but differs in color depending on the type. Each particle P is displayed using animation techniques so that the particle P generated from the egg OB moves around the egg OB and is diffused. Each particle P disappears after a specific period of time (about a few seconds) has expired. - The numbers of respective particles P displayed change with the passage of time. Specifically, the numbers of respective particles P displayed are almost the same when the display of the character creation production screen starts, and gradually change (increase/decrease) with the passage of time. One type of particles P (i.e., particles P of a color corresponding to the attribute of the character to be created) are mainly displayed just before the display of the character creation production screen ends, and the numbers of the remaining two types of particles P displayed are reduced to a large extent. A change in the number of particles P displayed is controlled by changing the number of particles P generated.
- When a specific period of time has expired after the display of the character creation production screen has started and the character creation production screen has disappeared, a character creation screen of the character to be created is displayed.
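The change in the numbers of particles P generated during the production described above can be sketched as follows. The linear schedule, the function name, and the placeholder type names P1 to P3 are assumptions for illustration; the actual values would come from the generation percentage control data (FIG. 27).

```python
def particle_percentages(progress, winner, types=("P1", "P2", "P3")):
    """Generation percentage of each particle type as the production
    advances (progress runs from 0.0 to 1.0). Starts from an even
    split across the types and shifts linearly toward the winner's
    colour, i.e. the colour of the attribute of the character to be
    created. The percentages always sum to 1.0."""
    even = 1.0 / len(types)
    return {t: (1.0 - progress) * even
               + progress * (1.0 if t == winner else 0.0)
            for t in types}
```

Because each percentage is a convex mix of an even split and the winner-takes-all distribution, the three displayed counts start almost equal and end dominated by one colour, matching the on-screen behaviour described above.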
FIG. 4 is a view showing an example of the character creation screen. As shown inFIG. 4 , a state in which a new character CH is created from the selected egg OB is displayed on the character creation screen. The created character CH is added to the possessed characters, and the number of eggs of the selected type is decremented (reduced) by one. -
FIG. 5 is a block diagram showing a functional configuration of the portable game device 1. In FIG. 5, the portable game device 1 is functionally configured to include an operation input section 100, a sound input section 200, a processing section 300, an image display section 400, a sound output section 500, a communication section 600, and a storage section 700.
- The operation input section 100 receives an operation instruction input from the player, and outputs an operation signal corresponding to the operation to the processing section 300. The function of the operation input section 100 is implemented by a button switch, a lever, a dial, a mouse, a keyboard, various sensors, and the like. In FIG. 1, the operation buttons 15 and the touch panel integrally formed in the display 12B correspond to the operation input section 100.
- The sound input section 200 collects sound such as voice input by the player, and outputs a sound signal corresponding to the collected sound to the processing section 300. The function of the sound input section 200 is implemented by a microphone or the like. In FIG. 1, the microphone 14 corresponds to the sound input section 200.
- The processing section 300 controls the entire portable game device 1 and performs various calculations such as proceeding with the game and generating an image. The function of the processing section 300 is implemented by a calculation device such as a CPU (CISC or RISC) or an ASIC (e.g. a gate array) and its control program, for example. In FIG. 1, the CPU provided in the control device 17 corresponds to the processing section 300.
- The processing section 300 includes a game calculation section 310 which mainly performs game calculations, an image generation section 330 which generates a game image based on various types of data calculated by the game calculation section 310, and a sound generation section 340 which generates game sound such as effect sound and background music (BGM).
- The game calculation section 310 performs various game processes based on the operation signal input from the operation input section 100, the sound signal input from the sound input section 200, a program and data read from the storage section 700, and the like. In this embodiment, the game calculation section 310 includes a character creation control section 320, and realizes the breeding game by performing a game process based on a game program 710.
- The character creation control section 320 includes a creation probability determination section 321 and a creation production section 322, and performs a process relating to the creation of a character. Specifically, the character creation control section 320 refers to possessed item data 731, and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs currently possessed by the player, as shown in FIG. 2A, for example.
- The possessed item data 731 is data relating to the currently possessed items. FIG. 6 shows an example of the data configuration of the possessed item data 731. As shown in FIG. 6, the possessed item data 731 includes possessed egg data 731 a and possessed character data 731 d. The possessed egg data 731 a is data relating to the possession of eggs, in which an egg type 731 b and a possession count 731 c are stored while being associated with each other. The possessed character data 731 d is data relating to the possession of characters, in which a character type 731 f and a possession count 731 g are stored while being associated with each other in units of attributes 731 e of characters.
- The character creation control section 320 refers to a character setting table 732, and causes the image display section 400 to display the character list screen in which a list of the characters corresponding to the egg selected on the egg selection screen is displayed, as shown in FIG. 2B, for example.
- The character setting table 732 is a data table relating to the characters set for each egg. FIG. 7 shows an example of the data configuration of the character setting table 732. As shown in FIG. 7, the character setting table 732 is provided for each egg type. The character setting table 732 stores a corresponding egg type 732 a, and stores a plurality of character types 732 c associated with each attribute 732 b.
- When the egg has been selected on the egg selection screen, the character creation control section 320 performs a specific countdown display and the like, and starts to record the input sound. Specifically, the character creation control section 320 converts the sound input from the sound input section 200 into a digital signal, and stores the digital signal in the storage section 700 as input sound data 721. After completion of recording, the creation probability determination section 321 performs a specific analysis process on the input sound data 721, and determines the creation probability of each of three candidate attributes as the attributes of creation candidate characters based on the processing results.
- Specifically, the creation probability determination section 321 detects notes within a specific octave range (e.g. three octaves) from the input sound data 721 at specific time intervals (e.g. at intervals of ⅛ second). One octave includes "do", "do#", "re", "re#", "mi", "fa", "fa#", "sol", "sol#", "la", "la#", and "ti" (12 notes in total); these 12 notes are also called note names.
- The note detection results are stored as detected note data 722. FIG. 8 shows an example of the detected note data 722. As shown in FIG. 8, the presence or absence of detection of each note 722 b is stored as the detected note data 722 in units of detection time 722 a. In FIG. 8, "O" indicates that the note is detected, and "x" indicates that the note is not detected.
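The excerpt does not specify how the creation probability determination section 321 extracts notes from the recorded signal. As a rough sketch, assuming 12-tone equal temperament with "la" in octave 4 (A4) fixed at 440 Hz, a detected frequency can be mapped to one of the 12 note names and an octave as follows; the function name is hypothetical:

```python
import math

# Solfege note names used in the text, in chromatic order from "do" (C).
NOTE_NAMES = ["do", "do#", "re", "re#", "mi", "fa",
              "fa#", "sol", "sol#", "la", "la#", "ti"]

def freq_to_note(freq_hz):
    """Map a frequency to (note_name, octave) in 12-tone equal
    temperament, taking "la" in octave 4 (A4) as 440 Hz."""
    # MIDI note number 69 corresponds to A4 = 440 Hz; each semitone
    # is a factor of 2**(1/12) in frequency.
    midi = round(69 + 12 * math.log2(freq_hz / 440.0))
    return NOTE_NAMES[midi % 12], midi // 12 - 1
```

Running this over the strongest spectral peaks of each ⅛-second frame would yield per-frame note detections of the kind stored in the detected note data 722.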
- The creation probability determination section 321 determines the maximum level (sound intensity) among the detected notes. The creation probability determination section 321 then excludes, from the detected notes, any note in the detected note data 722 that does not satisfy a specific level condition.
- The level condition is stored as level condition data 733. FIG. 9 shows an example of the data configuration of the level condition data 733. As shown in FIG. 9, the level condition of a note with respect to the maximum level of the detected notes is stored as the level condition data 733.
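The exclusion step above can be sketched as follows. The concrete level condition of FIG. 9 is not reproduced in this excerpt, so a note is assumed, purely for illustration, to survive only if it reaches half the maximum detected level:

```python
def filter_by_level(detected, threshold_ratio=0.5):
    """Drop notes whose level falls below a fraction of the loudest
    detected note at one detection time. `detected` maps note name
    to level. The 0.5 ratio is an assumed placeholder; the actual
    condition would be read from the level condition data 733."""
    if not detected:
        return {}
    max_level = max(detected.values())
    return {note: lvl for note, lvl in detected.items()
            if lvl >= max_level * threshold_ratio}
```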
- The creation probability determination section 321 examines the detected notes in the detected note data 722 in units of detection time t. When five or more adjacent notes have been detected at a detection time t, the creation probability determination section 321 excludes all notes at the time t from the detected notes (i.e., treats the time t as silent). In FIG. 10(1), six adjacent notes from "mi" to "la" are detected at the time tn, for example. The creation probability determination section 321 excludes all notes at the time tn, including these six notes, from the detected notes so that silence occurs at the time tn, as shown in FIG. 10(2).
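The silencing rule above can be sketched as follows; representing one detection time t as a list of detection flags over chromatically adjacent notes is an assumption for illustration:

```python
def silence_if_cluster(detected_flags, run_limit=5):
    """detected_flags: booleans over chromatically adjacent notes
    at one detection time t. If `run_limit` or more consecutive
    notes are detected, the whole time slice is treated as silent
    (all flags cleared), as in FIG. 10."""
    run = 0
    for flag in detected_flags:
        run = run + 1 if flag else 0
        if run >= run_limit:
            return [False] * len(detected_flags)
    return detected_flags
```

Slices with only isolated notes or short clusters pass through unchanged; only a run of five or more adjacent detections wipes the slice.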
- The creation probability determination section 321 calculates the total count (detection total count) of each note detected, based on the detected note data 722. The creation probability determination section 321 counts notes having the same note name as the same note, irrespective of the octave. The creation probability determination section 321 sums up the detection total count of each note to calculate the detection total count of all the notes. The creation probability determination section 321 also sums up the detection total count of each note provided with a sharp "#" (black-key note) to calculate the detection total count of all the black-key notes. The black-key notes are "do#", "re#", "fa#", "sol#", and "la#" (five notes in total).
- The calculated detection total count is stored as note detection total count data 741. FIG. 11 shows an example of the data configuration of the note detection total count data 741. As shown in FIG. 11, a note 741 a and a detection total count 741 b are stored as the note detection total count data 741 while being associated with each other. A detection total count 741 c of all the notes and a detection total count 741 d of all the black-key notes are also stored as the note detection total count data 741.
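The tallying described above can be sketched as follows, with the detections of each detection time represented as (note name, octave) pairs:

```python
from collections import Counter

BLACK_KEYS = {"do#", "re#", "fa#", "sol#", "la#"}

def tally_detections(slices):
    """slices: one list of (note_name, octave) detections per
    detection time t. Counts are taken in note-name units, i.e.
    the octave is ignored, as described for the note detection
    total count data 741."""
    per_note = Counter(name for slice_ in slices
                       for name, _octave in slice_)
    total = sum(per_note.values())          # all notes (741 c)
    black = sum(n for name, n in per_note.items()
                if name in BLACK_KEYS)      # black-key notes (741 d)
    return per_note, total, black
```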
- The creation probability determination section 321 determines the start (input timing) of each detected note based on the detected note data 722. Specifically, when a note in the detected note data 722 satisfies one of the following conditions A1 to A3, the creation probability determination section 321 determines that note to be a start.
- Condition A2: the note has not been detected at the preceding detection time t−1 but is detected at the subsequent detection time t+1, and the level of the note detected at the detection time t+1 is higher than the level of the note detected at the present detection time t. Condition A3: the note has not been detected at the preceding detection time t−1 but is detected at the subsequent detection time t+1, and the level of the note detected at the subsequent detection time t+1 is lower than the level of the note detected at the present detection time t.
- The start determination results are stored as start note data 723. FIG. 12 shows an example of the start note data 723. As shown in FIG. 12, the start note data 723 indicates whether or not each note 723 b is a start note in units of detection time 723 a, in the same manner as the detected note data 722. In FIG. 12, "O" indicates that the note is a start note, and "x" indicates that the note is not a start note.
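Conditions A2 and A3 as listed can be sketched as follows. Condition A1 is not reproduced in this excerpt, so only the two listed conditions are implemented; representing one note's history as a per-time-step level series, with None meaning "not detected", is an assumed representation:

```python
def is_start(levels, t):
    """Onset test at detection time t following conditions A2/A3
    as stated: the note was absent at t-1 but is present at t and
    t+1. `levels` holds the note's detected level per time step,
    with None meaning "not detected"."""
    prev = levels[t - 1] if t > 0 else None
    nxt = levels[t + 1] if t + 1 < len(levels) else None
    if prev is not None or levels[t] is None or nxt is None:
        return False
    # A2: level rises into t+1; A3: level falls after t. Either
    # way the note newly appears at t, so t is treated as a start.
    # (Exactly equal levels match neither listed condition.)
    return nxt > levels[t] or nxt < levels[t]
```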
- The creation probability determination section 321 combines the detected note data 722 of three octaves into one octave in note name units to obtain note-name-unit detection data 724. As shown in FIG. 13, the creation probability determination section 321 creates the note-name-unit detection data 724 of one octave by regarding notes having the same note name as the same note in units of detection time t, irrespective of the octave.
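The octave-collapsing step above amounts to a projection onto note names, which can be sketched for one detection time t as:

```python
def to_note_name_units(detected):
    """Collapse three octaves of detections at one detection time t
    into a single octave: a note name counts as detected if it was
    detected in any octave (cf. FIG. 13). `detected` is a set of
    (note_name, octave) pairs."""
    return {name for name, _octave in detected}
```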
probability determination section 321 calculates the number of detected notes in the note-name-unit detection data 724 in units of detection time t. When the calculated number of notes is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (silent). In FIG. 14(1), eight notes "do", "do#", "mi", "fa", "fa#", "la", "la#", and "ti" are detected at the time tn, for example. The creation probability determination section 321 excludes all notes at the time tn including these eight notes from the detected notes, as shown in FIG. 14(2). - Likewise, the creation
probability determination section 321 combines the start note data 723 within one octave in note name units to generate note-name-unit start data 725. - The creation
probability determination section 321 then determines whether or not a chord is formed in the note-name-unit detection data 724. The term "chord" used herein refers to a combination (note set) of predetermined notes, such as a major chord and a minor chord. The creation probability determination section 321 determines formation of different types of chords. As shown in FIG. 15, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 in units of detection time t, and calculates the formation count in chord units. In this case, the creation probability determination section 321 determines formation of one chord at each detection time t. Likewise, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 in units of detection time t, and calculates the formation count in chord units. The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count. - The calculated formation count is stored as chord
formation count data 742. FIG. 16 shows an example of the data configuration of the chord formation count data 742. As shown in FIG. 16, a chord determination order 742 a, a chord 742 b, and a formation count 742 c are stored as the chord formation count data 742 while being associated with one another. The determination order 742 a is set so that the order of a four-note chord made up of four notes is higher than the order of a three-note chord made up of three notes. The formation count 742 c includes the formation count of each of the note-name-unit detection data 724 and the note-name-unit start data 725, and the total value. - The creation
probability determination section 321 determines formation of each chord according to the determination order specified by the chord formation count data 742. Specifically, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit detection data 724 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. Likewise, the creation probability determination section 321 determines whether or not each chord is formed in the note-name-unit start data 725 according to the specified determination order in units of detection time t, and determines the chord of which the formation has been determined first to be a chord formed at the time t. The creation probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to obtain the total formation count. - The creation
probability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group according to a chord classification table 734. - The chord classification table 734 is a data table which defines the classification of chords.
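The per-time chord determination described above can be sketched as follows. At each detection time, chords are tried in the determination order (four-note chords before three-note chords) and only the first chord found to be formed is counted. The chord names and pitch-class sets used in the example are illustrative stand-ins, since the actual table contents are given only in the figures.

```python
def count_chord_formations(note_name_frames, chords_in_order):
    """Count chord formations per chord over a sequence of frames.

    `note_name_frames` is an iterable of sets of pitch classes (one per
    detection time t); `chords_in_order` is a list of (name, note_set)
    pairs sorted by the determination order (four-note chords first).
    At each detection time only the first chord whose notes are all
    present is counted.
    """
    counts = {name: 0 for name, _ in chords_in_order}
    for frame in note_name_frames:
        for name, notes in chords_in_order:
            if notes <= frame:          # every chord note is detected
                counts[name] += 1
                break                   # one chord per detection time
    return counts
```

Running this once over the note-name-unit detection data and once over the note-name-unit start data, then summing the two results per chord, would give the total formation count described in the text.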
FIG. 17 shows an example of the data configuration of the chord classification table 734. As shown in FIG. 17, a group 734 a and a chord 734 b are stored in the chord classification table 734 while being associated with each other. The group 734 a is classified into four groups (A) to (D). - The scores of the groups (A) to (D) are stored as
score data 743. FIG. 18 shows an example of the data configuration of the score data 743. As shown in FIG. 18, a group 743 a and a score 743 b are stored as the score data 743 while being associated with each other. The group 743 a is classified into six groups (A) to (F). In this example, the scores of only the groups (A) to (D) are determined, and the scores of the groups (E) and (F) are set at "0". - The creation
probability determination section 321 determines whether or not the scores of the groups (A) to (D) satisfy the following condition B. - When the condition B is satisfied, the creation
probability determination section 321 selects three groups with higher scores from the groups (A) to (D). The creation probability determination section 321 refers to a group/attribute correspondence table 735, and sets the attributes corresponding to the selected groups to be first to third attributes, which are the attributes of creation candidate characters, in the order from the attribute with the highest score. - The group/attribute correspondence table 735 is a data table which defines the correspondence between the groups (A) to (D) and the attributes of the characters.
FIG. 19 shows an example of the data configuration of the group/attribute correspondence table 735. As shown in FIG. 19, a group 735 a and an attribute 735 b of a character are stored in the group/attribute correspondence table 735 while being associated with each other. - In the example shown in
FIG. 18, the first attribute is "earth" corresponding to the group (C) with the highest score, the second attribute is "water" corresponding to the group (D) with the second highest score, and the third attribute is "fire" corresponding to the group (A) with the third highest score. - The creation
probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each of the selected groups. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group. In the example shown in FIG. 18, the score of the group (C) corresponding to the first attribute "earth" is "27", the score of the group (D) corresponding to the second attribute "water" is "23", and the score of the group (A) corresponding to the third attribute "fire" is "10". The creation probability of the first attribute "earth" is 45% (=27/60 (=27+23+10)), the creation probability of the second attribute "water" is 38% (=23/60), and the creation probability of the third attribute "fire" is 17% (=10/60). - The determined creation probability of each attribute is stored as determined
creation probability data 744. FIG. 20 shows an example of the data configuration of the determined creation probability data 744. As shown in FIG. 20, an attribute 744 a and a creation probability 744 b are stored as the determined creation probability data 744 while being associated with each other. The attribute 744 a includes the first to third attributes. The creation probability 744 b is set so that the total value is 100%. - Specifically, when the condition B is satisfied, characters with the attributes corresponding to the groups (A) to (D) (i.e., "fire", "wind", "earth", and "water") are set to be creation candidate characters (selected candidate characters), and the character to be created is selected from these characters.
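When condition B is satisfied, the probability computation reduces to taking each selected group's share of the three highest scores. A minimal sketch, assuming rounding to whole percentages, which is consistent with the FIG. 18 example (scores 27, 23, and 10 giving 45%, 38%, and 17%):

```python
def creation_probabilities(scores):
    """Turn group scores into per-attribute creation probabilities.

    `scores` maps a group label to its score.  The three highest-scoring
    groups are kept, and each probability is that group's share of their
    summed score, expressed as a whole percentage.  Rounding to the
    nearest whole percent is an assumption of this sketch.
    """
    top3 = sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:3]
    total = sum(score for _, score in top3)
    return {group: round(100 * score / total) for group, score in top3}
```

With the scores from FIG. 18, `creation_probabilities({"A": 10, "B": 3, "C": 27, "D": 23})` reproduces the 45/38/17 split given in the text.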
- When the scores of the groups (A) to (D) do not satisfy the condition B, the creation
probability determination section 321 determines the first to third attributes and the creation probabilities as follows. When the scores of all of the groups (A) to (D) are "0" (i.e., no chord is formed), the creation probability determination section 321 determines that the creation of the character has failed, and does not determine the creation probability. - Specifically, when the condition B is not satisfied since the scores of all of the groups (A) to (D) are less than "5", the creation
probability determination section 321 determines the scores of the groups (E) and (F) referring to a score setting table 736, based on the detection total count of all the black-key notes in the detected note data 722. -
FIG. 21 shows an example of the data configuration of the score setting table 736. As shown in FIG. 21, a ratio 736 a of the detection total count of all the black-key notes to the detection total count of all the notes and a score 736 b of each of the groups (E) and (F) are stored in the score setting table 736 while being associated with each other. - The creation
probability determination section 321 refers to the note detection total count data 741, and calculates the ratio of the detection total count of all the black-key notes to the detection total count of all the notes (set note content). The creation probability determination section 321 sets the scores associated with the calculated detection total count ratio in the score setting table 736 to be the scores of the groups (E) and (F). The creation probability determination section 321 selects three groups with higher scores from the groups (A) to (F), and sets the attributes corresponding to the three selected groups to be the first to third attributes in the order from the attribute with the highest score referring to the group/attribute correspondence table 735. The creation probability determination section 321 determines the creation probability of each of the first to third attributes based on the score of each selected group. Specifically, the creation probability determination section 321 calculates the ratio of the score of each group to the sum of the scores of the three selected groups as the creation probability of the attribute corresponding to each group. - When the condition B is not satisfied since the number of groups included in the groups (A) to (D) and having a score of "1" or more is less than three, the creation
probability determination section 321 determines the first to third attributes and the creation probabilities based on the detection total count of each note in the detected note data 722. - Specifically, when the number of groups included in the groups (A) to (D) and having a score of "1" or more is two, the creation
probability determination section 321 determines the group with the higher score to be a first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735. The creation probability determination section 321 determines the attribute corresponding to the other group to be the second attribute. - When the number of groups included in the groups (A) to (F) and having a score of "1" or more is one, the creation
probability determination section 321 determines that group to be the first group, and determines the attribute corresponding to the first group to be the first attribute referring to the group/attribute correspondence table 735. The creation probability determination section 321 calculates the sum of the detection total count of each note belonging to each of note classification groups (a) to (f) according to a note classification table 737. - The note classification table 737 is a data table which defines the classification of notes.
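The per-group tally this step performs, summing the detection total counts of the notes assigned to each of the groups (a) to (f), can be sketched as follows. The group labels and note pairs used in the test are illustrative only, since the actual note classification table 737 is given only in FIG. 22.

```python
def group_detection_totals(note_totals, note_groups):
    """Sum the per-note detection totals for each note classification group.

    `note_totals` maps a note name to its detection total count, and
    `note_groups` maps a group label ("a" to "f") to the notes assigned
    to it, as in the note classification table.  Notes that were never
    detected simply contribute zero.
    """
    return {
        group: sum(note_totals.get(note, 0) for note in notes)
        for group, notes in note_groups.items()
    }
```

The group with the largest resulting sum then supplies the second attribute, as described below for FIG. 23.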
FIG. 22 shows an example of the data configuration of the note classification table 737. As shown in FIG. 22, a group 737 a, a note 737 b, and an attribute 737 c of a character are stored in the note classification table 737 while being associated with one another. The group 737 a is classified into six groups (a) to (f). Each of the groups (a) to (f) is associated with two notes. - The calculated sum of the detection total count of each note of each group is stored as
note classification data 745. FIG. 23 shows an example of the data configuration of the note classification data 745. As shown in FIG. 23, a group 745 a and a detection total count 745 b are stored as the note classification data 745 while being associated with each other. - The creation
probability determination section 321 refers to the note detection total count data 741, and calculates the sum of the detection total count of each note belonging to each of the groups (a) to (f). The creation probability determination section 321 determines one of the groups (a) to (f) having the largest sum of the detection total counts, and determines the attribute corresponding to the determined group to be the second attribute. When the attribute corresponding to the group having the largest note count coincides with the first attribute, the creation probability determination section 321 determines the attribute corresponding to the group having the second largest note count to be the second attribute. - After determining the first and the second attributes, the creation
probability determination section 321 determines the third attribute based on the possessed characters. Specifically, the creation probability determination section 321 refers to the character setting data 732 corresponding to the type of egg selected for causing the character to be created and the possessed character data 731 d, and calculates the possession ratio of the number of characters possessed by the player and having each attribute to the total number of characters, in character's attribute units. The creation probability determination section 321 determines the attribute of which the calculated possession ratio is the smallest to be the third attribute. - After determining the first to third attributes, the creation
probability determination section 321 determines the creation probability of each of the first to third attributes referring to a creation probability setting table 746. -
FIG. 24 shows an example of the data configuration of the creation probability setting table 746. As shown in FIG. 24, a score 746 a of the first group and a creation probability 746 b of each of the first to third attributes are stored in the creation probability setting table 746 while being associated with each other. - The creation
probability determination section 321 determines the creation probability associated with the score of the first group (i.e., the group corresponding to the first attribute) in the creation probability setting table 746 to be the creation probability of each of the first to third attributes. - Specifically, when the condition B is not satisfied, characters with the attributes corresponding to the groups (A) to (F) (i.e., "fire", "wind", "earth", "water", "light", and "darkness") are set to be creation candidate characters, and the character to be created is selected from these characters.
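The lookup against the creation probability setting table 746 can be sketched as follows. The thresholds and percentages below are invented for illustration only, since the actual table contents appear only in FIG. 24; the table is assumed to map ranges of the first group's score to fixed first-to-third attribute probabilities.

```python
# Hypothetical stand-in for the creation probability setting table 746:
# each row maps an upper bound on the first group's score to the
# creation probabilities (in %) of the first to third attributes.
CREATION_PROBABILITY_TABLE = [
    (1, (50, 30, 20)),
    (3, (60, 25, 15)),
    (float("inf"), (70, 20, 10)),
]

def probabilities_for_first_group_score(score):
    """Look up the first-to-third attribute creation probabilities
    associated with the first group's score."""
    for upper_bound, probs in CREATION_PROBABILITY_TABLE:
        if score <= upper_bound:
            return probs
    return CREATION_PROBABILITY_TABLE[-1][1]
```

The design point is that, unlike the condition-B path, the probabilities here come from a fixed table keyed only by the first group's score rather than from score ratios.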
- When the creation
probability determination section 321 has determined the creation probability of the character in attribute units based on the chord detected from the input sound, the character creation control section 320 determines the character to be created according to the creation probability of each attribute determined by the creation probability determination section 321. Specifically, the character creation control section 320 determines the attribute of the character to be created from the first to third attributes according to the creation probability. The character creation control section 320 refers to the possessed character data 731 d and the character setting data 732 corresponding to the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created. - The
creation production section 322 then performs a creation production process of producing the creation of the determined character. Specifically, the creation production section 322 determines the color and the generation percentage of each of the first to third particles as three types of particles P to be displayed. The first to third particles respectively correspond to the first to third attributes. The generation percentage refers to the percentage of the number of respective particles generated in the total generation count, which is the total number of first to third particles generated. Since the respective particles P have the same life (a few seconds), the number of respective particles P displayed is proportional to the percentage of the respective particles P generated. - The
creation production section 322 refers to the determined creation probability data 744 and an attribute/color correspondence table 751, and determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively. - The attribute/color correspondence table 751 is a data table which defines the correspondence between the attribute of a character and a color.
FIG. 25 shows an example of the data configuration of the attribute/color correspondence table 751. As shown in FIG. 25, an attribute 751 a of a character and a color 751 b are stored in the attribute/color correspondence table 751 while being associated with each other. - The
creation production section 322 determines an initial generation percentage, which is the generation percentage when the character creation production starts, an intermediate generation percentage, which is the generation percentage during the character creation production, and a final generation percentage, which is the generation percentage when the character creation production ends, as the generation percentages of the respective particles. Specifically, the initial generation percentages of the first to third particles are set at 33%. The creation probabilities of the first to third attributes are respectively set as the intermediate generation percentages of the first to third particles. The final generation percentage of the particle corresponding to the attribute of the character to be created is set at 90%, and the final generation percentages of the remaining particles are set at 5%. - The determined colors and generation percentages of the respective particles P are stored as
particle data 752. FIG. 26 shows an example of the data configuration of the particle data 752. As shown in FIG. 26, a particle 752 a, a color 752 b, and a generation percentage 752 c are stored as the particle data 752 while being associated with one another. The particle 752 a is classified as the first to third particles. The generation percentage 752 c includes the initial generation percentage, the intermediate generation percentage, and the final generation percentage. - The
creation production section 322 then generates generation percentage control data 754 for controlling generation of the respective particles based on the determined generation percentages of the respective particles. -
FIG. 27 shows an example of the generation percentage control data 754. FIG. 27 shows the generation percentages of the respective particles P with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation percentage). As shown in FIG. 27, the generation percentages of the first to third particles are 33% (initial generation percentage) at the character creation production start time t0. The generation percentage is gradually changed (increased/decreased) so that the generation percentage is set at the intermediate generation percentage at the time t1 during the character creation production and is set at the final generation percentage at the finish time t2. - The
creation production section 322 causes the image display section 400 to display the character creation production screen in which the selected egg and the respective particles are displayed, as shown in FIG. 3, and causes the sound output section 500 to output specific production sound, for example. The creation production section 322 starts controlling the respective particles in the character creation production screen according to the total generation count control data 753 and the generation percentage control data 754. - The total generation
count control data 753 is data for controlling the total generation count, which is the sum of the numbers of respective particles generated. FIG. 28 shows an example of the total generation count control data 753. FIG. 28 shows the total generation count N with respect to the time t (the horizontal axis indicates the time t, and the vertical axis indicates the generation count N). As shown in FIG. 28, the total generation count N is constant at a total generation count N1 from the character creation production start time to the time t1, and is then gradually increased so that the total generation count N reaches a predetermined total generation count N2 at the finish time t2. - Specifically, the
creation production section 322 determines the total generation count N at the present time from the total generation count control data 753 in units of a specific period of time, and determines the generation percentage of each of the first to third particles from the generation percentage control data 754. The creation production section 322 determines the generation count of each of the first to third particles by multiplying the total generation count N by the generation percentage of the respective particles, and generates the particles P in the determined generation count. The creation production section 322 causes the particles P to disappear (to be deleted) when the specific life has expired. - The
creation production section 322 controls the movement of each particle P currently displayed. Specifically, the creation production section 322 sets a moving force field acting on the particle P with the position of the egg being the generation base point (center). The moving force field is set as a positive (+) force field which acts to draw the particle P toward the center of the generation base point or a negative (−) force field which acts to move the particle P away from the generation base point. The creation production section 322 moves each particle P according to the external force corresponding to the distance from the force field base point applied by the moving force field and the initial speed applied to each particle P. For example, when a velocity vector in the direction which rotates around the generation base point of the moving force field is specified as the initial speed, each particle P moves so that it is diffused or drawn while rotating around the egg OB. - When a specific period of time predetermined for the character creation production has expired, the
creation production section 322 finishes displaying the character creation production screen, and displays the character creation screen in which the character to be created is displayed, as shown in FIG. 4, for example. The creation production section 322 adds the created character to the possessed characters to update the possessed character data 731 d, and decrements the eggs of the selected type by one to update the possessed egg data 731 a. - In
FIG. 5, the image generation section 330 generates a game image for displaying a game screen based on the calculation results from the game calculation section 310, and outputs an image signal of the generated image to the image display section 400. The image display section 400 displays the game screen based on the image signal from the image generation section 330 while redrawing the screen of one frame every 1/60 second, for example. The function of the image display section 400 is implemented by hardware such as a CRT, an LCD, an ELD, a PDP, or an HMD. In FIG. 1, the display corresponds to the image display section 400. - The
sound generation section 340 generates game sound such as effect sound and BGM used during the game, and outputs a sound signal of the generated game sound to the sound output section 500. The sound output section 500 outputs the game sound such as effect sound and BGM based on the sound signal from the sound generation section 340. The function of the sound output section 500 is implemented by a speaker or the like. In FIG. 1, the speaker 13 corresponds to the sound output section 500. - The
communication section 600 communicates data with an external device such as another portable game device 1 according to the control signal from the processing section 300. The function of the communication section 600 is implemented by a wireless communication module, a jack for a communication cable, a control circuit, or the like. In FIG. 1, the wireless communication device 18 corresponds to the communication section 600. - The
storage section 700 stores a system program for implementing the function for causing the processing section 300 to integrally control the portable game device 1, a program and data necessary for causing the processing section 300 to execute the game, and the like. The storage section 700 is used as a work area for the processing section 300, and temporarily stores the results of calculations performed by the processing section 300 according to various programs, data input from the operation input section 100, and the like. The function of the storage section 700 is implemented by an IC memory, a hard disk, a CD-ROM, a DVD, an MO, a RAM, a VRAM, or the like. In FIG. 1, the ROM, the RAM, and the like provided in the control device 17 correspond to the storage section 700. - The
storage section 700 also stores the game program 710 for causing the processing section 300 to function as the game calculation section 310, and game data. The game program 710 includes a character creation program 711 for causing the processing section 300 to function as the character creation control section 320. The game data includes the input sound data 721, the detected note data 722, the start note data 723, the note-name-unit detection data 724, the note-name-unit start data 725, the possessed item data 731, the character setting table 732, the level condition data 733, the chord classification table 734, the group/attribute correspondence table 735, the score setting table 736, the note classification table 737, the note detection total count data 741, the chord formation count data 742, the score data 743, the determined creation probability data 744, the note classification data 745, the creation probability setting table 746, the attribute/color correspondence table 751, the particle data 752, the total generation count control data 753, and the generation percentage control data 754. -
FIG. 29 is a flowchart illustrative of the flow of the game process according to this embodiment. This process is implemented by causing the game calculation section 310 to execute the process based on the game program 710. - As shown in
FIG. 29, the game calculation section 310 controls the process of a known breeding game according to the operation input from the operation input section 100 and the like (step A1). When the player has acquired a new egg (step A3: YES), the game calculation section 310 adds the acquired egg to the possessed eggs, and updates the possessed egg data 731 a (step A5). When causing a character to be created (step A7: YES), the character creation control section 320 performs a character creation process (step A9). -
FIG. 30 is a flowchart illustrative of the flow of the character creation process. - As shown in
FIG. 30, the character creation control section 320 refers to the possessed egg data 731 a, and causes the image display section 400 to display the egg selection screen in which different types of eggs provided in advance are displayed together with the number of eggs possessed by the player. The character creation control section 320 refers to the character setting table 732 corresponding to the egg selected on the egg selection screen, and causes the image display section 400 to display the character list screen, which is a list of the characters set for the selected egg. The character creation control section 320 selects one egg from the eggs possessed by the player according to the operation input from the operation input section 100 (step B1). - When the egg has been selected, the character
creation control section 320 performs a sound input process of allowing the player to input melody sound by performing a specific countdown display and the like, and storing sound data input from the sound input section 200 as the input sound data 721 (step B3). The creation probability determination section 321 then performs a creation probability determination process based on the input sound data 721 (step B5). -
FIG. 31 is a flowchart illustrative of the flow of the creation probability determination process. - As shown in
FIG. 31, the creation probability determination section 321 detects each note within a specific octave from the input sound data 721, and generates the detected note data 722 (step C1). The creation probability determination section 321 performs a filtering process for the detected note data 722 (step C3). -
FIG. 32 is a flowchart illustrative of the flow of the filtering process. - As shown in
FIG. 32, the creation probability determination section 321 determines the maximum level of the detected notes based on the detected note data 722 (step D1). The creation probability determination section 321 refers to the level condition data 733, and excludes any note in the detected note data 722 which does not satisfy the specific level condition from the detected notes (step D3). The creation probability determination section 321 determines the detected notes in the detected note data 722 in units of detection time t. When five or more adjacent notes have been detected, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D5). - The creation
probability determination section 321 calculates the detection total count of each note in the detected note data 722, and sums up the calculated detection total count of each note to calculate the detection total count of all the notes (step D7). The creation probability determination section 321 sums up the detection total count of each black-key note in the detected note data 722 to calculate the detection total count of all the black-key notes (step D9). - The creation
probability determination section 321 combines the detected note data 722 within one octave in note name units to generate the note-name-unit detection data 724 (step D11). The creation probability determination section 321 calculates the number of types of detected notes in the note-name-unit detection data 724 in units of detection time t. When the calculated number of types of notes is seven or more, the creation probability determination section 321 excludes all notes at the time t from the detected notes (step D13). - The creation
probability determination section 321 determines the start of each detected note in the detected note data 722 to generate the start note data 723 (step D15). The creation probability determination section 321 combines the generated start note data 723 within one octave in note name units to generate the note-name-unit start data 725 (step D17). - The creation
probability determination section 321 thus completes the filtering process. - As shown in
FIG. 31, after the completion of the filtering process, the creation probability determination section 321 refers to the note detection total count data 741, and determines the detection total count of all the notes in the detected note data 722. When the detection total count of all the notes is 10 or more (step C5: YES), the creation probability determination section 321 performs a chord formation determination process to determine the scores of the groups (A) to (D) (step C7). -
FIG. 33 is a flowchart illustrative of the flow of the chord formation determination process. - As shown in
FIG. 33, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit detection data 724 at each detection time t, and calculates the formation count in chord units (step E1). Likewise, the creation probability determination section 321 determines whether or not a chord is formed in the note-name-unit start data 725 at each detection time t, and calculates the formation count in chord units (step E3). - The creation
probability determination section 321 sums up the formation counts of the note-name-unit detection data 724 and the note-name-unit start data 725 in chord units to calculate the total formation count (step E5). The creationprobability determination section 321 determines the sum of the formation count of each chord belonging to each of the chord classification groups (A) to (D) to be the score of each group (step E7). - The creation
probability determination section 321 thus completes the chord formation determination process. - As shown in
FIG. 31, after the completion of the chord formation determination process, the creation probability determination section 321 determines the score of each of the groups (A) to (D). When the score of at least one of the groups (A) to (D) is 5 or more (step C9: YES), and the scores of three or more groups are 1 or more (step C11: YES), the creation probability determination section 321 selects the three groups with the highest scores from the groups (A) to (D) (step C13). The creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in the order from the attribute with the highest score (step C15), and determines the creation probability of each of the first to third attributes based on the score of each of the selected groups (step C17). The creation probability determination section 321 determines that the character is successfully created (step C29). - When the score of at least one of the groups (A) to (D) is 5 or more (step C9: YES), and the number of groups with a score of 1 or more is less than three (step C11: NO), the creation
probability determination section 321 performs a group count shortage process, and determines the first to third attributes and the creation probabilities (step C19). -
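- The chord formation determination (steps E1 to E7) can be illustrated with a short Python sketch. The chord table and classification groups below are hypothetical two-entry stand-ins for the tables referenced in the specification; pitch classes are numbered with 0 = C:

```python
# Hypothetical chord table: pitch classes present in each chord.
CHORDS = {
    "C":  {0, 4, 7},   # C major: C-E-G
    "Cm": {0, 3, 7},   # C minor: C-Eb-G
}
# Hypothetical chord classification groups.
GROUPS = {"A": ["C"], "B": ["Cm"]}

def chord_formation_counts(per_time_pitch_classes):
    """Steps E1/E3: count, per chord, the detection times at which it is formed."""
    counts = {name: 0 for name in CHORDS}
    for pcs in per_time_pitch_classes.values():
        for chord, members in CHORDS.items():
            if members <= pcs:  # all chord tones present at this time t
                counts[chord] += 1
    return counts

def group_scores(detection_data, start_data):
    det = chord_formation_counts(detection_data)
    sta = chord_formation_counts(start_data)
    # Step E5: total formation count per chord over both data sets.
    total = {c: det[c] + sta[c] for c in CHORDS}
    # Step E7: a group's score is the sum over its member chords.
    return {g: sum(total[c] for c in cs) for g, cs in GROUPS.items()}
```

A chord is considered "formed" here whenever its pitch classes are a subset of the note names detected at a time t, which matches the note-name-unit data the specification feeds into this determination.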
FIG. 34 is a flowchart illustrative of the flow of the group count shortage process. - As shown in
FIG. 34, the creation probability determination section 321 determines one of the groups (A) to (F) with the highest score to be the first group, and determines the attribute corresponding to the first group to be the first attribute (step F1). - The creation
probability determination section 321 determines the scores of the groups (A) to (F). When the number of groups included in the groups (A) to (F) and having a score of 1 or more is one (step F3: YES), the creation probability determination section 321 sums up the detection total count of each note in the detected note data 722 belonging to each of the note classification groups (a) to (f), referring to the note detection total count data 741 (step F5). - The creation
probability determination section 321 selects the one of the groups (a) to (f) having the largest detection total count, and determines whether or not the attribute corresponding to the selected group coincides with the first attribute. When the corresponding attribute does not coincide with the first attribute (step F7: YES), the creation probability determination section 321 sets the attribute corresponding to the selected group to be the second attribute (step F9). When the corresponding attribute coincides with the first attribute (step F7: NO), the creation probability determination section 321 selects the one of the groups (a) to (f) with the second largest detection total count, and sets the attribute corresponding to that group to be the second attribute (step F11). - When the number of groups included in the groups (A) to (F) and having a score of 1 or more is two in the step F3 (step F3: NO), the creation
probability determination section 321 sets the attribute corresponding to one of the groups (A) to (F) with the second highest score to be the second attribute (step F13). - The creation
probability determination section 321 refers to the character setting table 732 corresponding to the type of the selected egg and to the possessed item data 731, and determines the attribute with the minimum possession rate from among the attributes of the characters set for the egg of the selected type (step F15). The creation probability determination section 321 determines whether or not the determined attribute coincides with the first or second attribute. When the determined attribute does not coincide with the first or second attribute (step F17: YES), the creation probability determination section 321 sets the determined attribute to be the third attribute (step F21). When the determined attribute coincides with the first or second attribute (step F17: NO), the creation probability determination section 321 determines the attribute of the character with the second smallest possession rate (step F19), and returns to the step F17 to determine whether or not that attribute coincides with the first or second attribute. - The creation
probability determination section 321 refers to the score setting table 736, and determines the creation probability of each of the first to third attributes according to the score of the first group (step F23). - The creation
probability determination section 321 thus completes the group count shortage process. - In
FIG. 31, after the completion of the group count shortage process, the creation probability determination section 321 determines that the character is successfully created (step C29). - When the scores of the groups (A) to (D) are all less than 5 (step C9: NO), the creation
probability determination section 321 determines the number of types of detected notes in the detected note data 722, referring to the note detection total count data 741. When the number of types of detected notes is less than two (step C21: NO), the creation probability determination section 321 determines that the character is not successfully created (step C31). - When the number of types of detected notes is two or more (step C21: YES), the creation
probability determination section 321 refers to the note detection total count data 741, and determines the scores of the groups (E) and (F), referring to the score setting table 736, based on the detection total count of all the black-key notes in the detected note data 722 (step C23). The creation probability determination section 321 then determines the score of each of the groups (A) to (F). When the scores of three or more of the groups (A) to (F) are 1 or more (step C25: YES), the creation probability determination section 321 selects the three groups with the highest scores from the groups (A) to (F) (step C27). The creation probability determination section 321 sets the attributes corresponding to the selected groups to be the first to third attributes in the order from the attribute with the highest score (step C15), and determines the creation probability of each of the first to third attributes based on the score of each of the selected groups (step C17). The creation probability determination section 321 determines that the character is successfully created (step C29). - When the number of groups (A) to (F) with a score of 1 or more is less than three (step C25: NO), the creation
probability determination section 321 performs the group count shortage process, and determines the first to third attributes and the creation probabilities (step C19). The creation probability determination section 321 determines that the character is successfully created (step C29). - The creation
probability determination section 321 thus completes the creation probability determination process. - In
FIG. 30, after the completion of the creation probability determination process, the character creation control section 320 determines whether or not the character is successfully created. When the character is successfully created (step B7: YES), the character creation control section 320 determines the character to be created according to the type of the selected egg and the creation probability of each attribute determined. Specifically, the character creation control section 320 determines the attribute of the character to be created according to the creation probability of each attribute determined. The character creation control section 320 refers to the character setting table 732 corresponding to the type of the selected egg, and determines a character randomly selected from the characters which have the determined attribute and are not possessed by the player to be the character to be created (step B9). The character creation control section 320 decrements (reduces) the eggs of the selected type by one to update the possessed egg data 731a (step B11). The creation production section 322 then performs a character creation production process (step B13). -
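- The selection in the step B9 (drawing an attribute according to the determined creation probabilities, then picking a random unpossessed character having that attribute) might be sketched as follows. All names are illustrative, and returning None when every candidate is already possessed is an assumption, since the specification does not describe that case here:

```python
import random

def choose_character(attr_probs, character_table, possessed):
    """attr_probs: {attribute: creation probability (weights)};
    character_table: {attribute: characters set for the selected egg type};
    possessed: set of characters the player already owns."""
    # Draw the attribute of the character to be created according to
    # the creation probabilities (step B9, first half).
    attrs = list(attr_probs)
    attr = random.choices(attrs, weights=[attr_probs[a] for a in attrs])[0]
    # Randomly select an unpossessed character with that attribute
    # (step B9, second half).
    candidates = [c for c in character_table.get(attr, []) if c not in possessed]
    return random.choice(candidates) if candidates else None
```

For example, with a single attribute of weight 1 and one unpossessed candidate, the function deterministically returns that candidate.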
FIG. 35 is a flowchart illustrative of the flow of the character creation production process. - As shown in
FIG. 35, the creation production section 322 determines the colors corresponding to the first to third attributes to be the colors of the first to third particles, respectively (step G1). - The
creation production section 322 determines the generation percentage of each of the first to third particles. Specifically, the creation production section 322 sets the initial generation percentage of each of the first to third particles at the same value (33%) (step G3). The creation production section 322 sets the creation probabilities of the first to third attributes as the intermediate generation percentages of the first to third particles, respectively (step G5). The creation production section 322 sets the final generation percentage of the particle corresponding to the attribute of the character to be created at 90%, and sets the final generation percentages of the remaining particles at 5% (step G7). The creation production section 322 refers to the particle data 752, and generates the generation percentage control data 754 according to the generation percentage of each of the first to third particles (step G9). - The
creation production section 322 causes the image display section 400 to display the character creation production screen in which the egg of the selected type and the respective particles are disposed, and starts controlling each particle in the character creation production screen according to the total generation count control data 753 and the generated generation percentage control data 754 (step G11). When a specific period of time has elapsed after displaying the character creation production screen (step G13: YES), the creation production section 322 finishes displaying the character creation production screen, and causes the image display section 400 to display the character creation screen displaying a state in which the character is created (step G15). The creation production section 322 adds the created character to the possessed characters to update the possessed character data 731d (step G17). - The
creation production section 322 thus completes the character creation production process. - When the character is not successfully created in the step B7 in
FIG. 30 (step B7: NO), the character creation control section 320 performs a character creation failure production process, such as causing the image display section 400 to display the creation failure screen showing that the character is not successfully created, or causing the sound output section 500 to output sound (step B15). - The character
creation control section 320 thus completes the character creation process. - In
FIG. 29, after the completion of the character creation process, the game calculation section 310 determines whether or not to finish the game. When the game calculation section 310 does not finish the game (step A11: NO), the game calculation section 310 transitions to the step A1. When the game calculation section 310 has determined to finish the game (step A11: YES), the game calculation section 310 finishes the game process to finish the game. - According to this embodiment, whether or not a chord is formed is determined from the input sound in units of detection time t at specific time intervals, the first to third attributes are determined as the attributes of the creation candidate characters from the attributes of the characters based on the determined formation count in chord units, and the creation probability of each of the first to third attributes is determined. The character with one of the first to third attributes determined according to the creation probability is created and added to the possessed characters.
- Specifically, since the character corresponding to the type of chord included in the input sound is created according to a probability, the character to be created may differ even when the same melody is input, whereby the player can enjoy the game. The creation probability of each of the first to third attributes is determined by the formation count of the corresponding chord. Specifically, when the number of specific chords is large, the character with the attribute associated with the specific chord is created with a high probability. The chord is an important element which determines the tone of the melody. Therefore, the player can enjoy estimating the character to be created from the tone of the input melody.
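- The three-stage particle generation percentages used in the creation production (steps G3 to G7 above) can be sketched as follows. Only the 33%, 90%, and 5% values come from the description; the function name and data layout are assumptions made for illustration:

```python
def particle_generation_stages(creation_probs, created_attr):
    """creation_probs: {attribute: creation probability in %} for the
    first to third attributes; created_attr: the attribute of the
    character actually created. Returns the three stages in order."""
    attrs = list(creation_probs)
    # Step G3: initial generation percentages are all equal (33%).
    initial = {a: 33 for a in attrs}
    # Step G5: intermediate percentages are the creation probabilities.
    intermediate = dict(creation_probs)
    # Step G7: the created attribute's particle ends at 90%, the rest at 5%.
    final = {a: 90 if a == created_attr else 5 for a in attrs}
    return [initial, intermediate, final]
```

Staging the percentages this way lets the mix of particle colors drift from an even spread, through the actual creation probabilities, toward the color of the character being created.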
- The embodiments to which the invention can be applied are not limited to the above-described embodiments. Various modifications and variations may be made without departing from the spirit and scope of the invention.
- In the above-described embodiments, the characters with the attributes “light” and “darkness” respectively corresponding to the groups (E) and (F) are included in the attributes of the creation candidate characters by determining the scores of the groups (E) and (F) according to the ratio of the detection total count of all the black-key notes to the detection total count of all the notes. The characters with the attributes “light” and “darkness” may be included in the creation candidate characters according to the detection total count of all the black-key notes, for example. Specifically, when the detection total count of all the black-key notes is equal to or greater than a first specific number and less than a second specific number, the character with the attribute “light” or “darkness” is included in the creation candidate characters. When the detection total count of all the black-key notes is equal to or greater than the second specific number, the characters with the attributes “light” and “darkness” are included in the creation candidate characters, for example. Note that the second specific number is greater than the first specific number.
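- The black-key-count variant above might be sketched as follows. The threshold parameter names are assumptions, and which of "light" or "darkness" is chosen between the first and second thresholds is left unspecified by the description, so the sketch picks one arbitrarily:

```python
def light_dark_candidates(black_key_total, first_n, second_n):
    """second_n must be greater than first_n."""
    if black_key_total >= second_n:
        # Both attributes become creation candidates.
        return ["light", "darkness"]
    if black_key_total >= first_n:
        # One of the two; which one is not specified in the description.
        return ["light"]
    return []
```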
- In the above-described embodiments, the particle P is spherical. Note that the particle P may have another shape such as a triangle, a quadrangle, or a line. A state in which the spherical particles P are mixed may be displayed as a cloud or smoke.
- The display state such as the size, shape, or brightness of each particle P may be changed with the passage of time. In this case, it is desirable that the color of the particle P not be changed because the color of the particle P indicates the corresponding attribute.
- In the above-described embodiments, the number of respective particles P is changed by generating the particles P or causing the particles P to disappear. Note that the total number of particles P may be constant without generating the particles P or causing the particles P to disappear, and the ratio of the numbers of respective particles P may be changed by changing the color of each particle P. In the above-described embodiments, each particle P has the same life. Note that the ratio of the numbers of respective particles P may be changed by changing the life of each particle P depending on the type.
- The above-described embodiments have been described taking an example of a Western music scale (e.g. do, re, mi, fa, sol, la, ti, and do). Note that the invention can also be applied to other scales.
- In the above-described embodiments, a character randomly selected from the characters corresponding to the attribute determined based on the input sound is created. Note that the character to be created may be selected based on the date (date and time). Specifically, the selection probability of each time zone obtained by dividing one day (24 hours) into a plurality of time zones (e.g. morning, daytime, and night) is set for each character. The character to be created is selected according to the selection probability corresponding to the time zone corresponding to the time at which the melody sound is input among the selection probabilities in time zone units set for each character corresponding to the determined attribute. The selection probability of each season obtained by dividing one year (365 days) into a plurality of seasons (e.g. spring, summer, autumn, and winter) may be set instead of the time zone, and the character to be created may be selected according to the selection probability corresponding to the date at which the melody sound is input. This allows the character to be created to be changed corresponding to the date at which the melody sound is input.
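- The time-zone-based selection described above can be sketched as follows. The zone boundaries and all names are illustrative assumptions; the description only specifies that one day is divided into zones (e.g. morning, daytime, and night) with a selection probability per character:

```python
import random

def select_by_time_zone(hour, zone_probs_per_char):
    """zone_probs_per_char: {character: {"morning": w, "daytime": w, "night": w}}.
    hour is the hour (0-23) at which the melody sound was input."""
    # Illustrative zone boundaries (not from the specification).
    zone = "morning" if 5 <= hour < 11 else "daytime" if 11 <= hour < 18 else "night"
    # Select a character according to the probabilities for that zone.
    chars = list(zone_probs_per_char)
    weights = [zone_probs_per_char[c][zone] for c in chars]
    return random.choices(chars, weights=weights)[0]
```

The seasonal variant works the same way, keyed by the date instead of the hour.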
- Alternatively, the time zone may be associated with the character instead of the selection probability, and the character may be created which corresponds to the time zone corresponding to the time at which the melody sound is input.
- In the above-described embodiments, the character is created after performing creation production of displaying the particle P corresponding to the attribute of each creation candidate character. Note that the character may be created during creation production, the character may be created at the same time as creation production, or creation production may be performed after (immediately after) creating the character.
- In the above-described embodiments, the attribute is set in advance for each character as the parameter by which each character is classified. Note that the capability parameter of each character may be employed such as offensive power, defensive power, or witchcraft.
- The above-described embodiments illustrate the case of applying the invention to the portable game device. Note that the invention can also be applied to other devices which can execute a game, such as a consumer game device, an arcade game device, and a portable telephone.
- The above-described embodiments illustrate the case of applying the invention to the breeding game. Note that the invention can also be applied to other games in which a character appears, such as a role-playing game.
- Although only some embodiments of the invention have been described above in detail, those skilled in the art would readily appreciate that many modifications are possible in the embodiments without materially departing from the novel teachings and advantages of the invention. Accordingly, such modifications are intended to be included within the scope of the invention.
Claims (13)
1. A game process control method which causes a computer including a sound input section to execute a game in which a game character appears, the method comprising:
detecting a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
selecting a game character caused to appear based on the detection result;
causing the selected game character to appear; and
controlling display of each game character including the game character caused to appear.
2. The game process control method as defined in claim 1 ,
wherein the note sets are associated in advance with the game characters;
wherein the method further comprises determining selection candidate characters including at least the game character corresponding to the detected note set; and
wherein the game character caused to appear is selected from the determined selection candidate characters.
3. The game process control method as defined in claim 2 ,
wherein the note set included in the input sound is detected at given time intervals; and
wherein the selection candidate characters are determined based on a detection total count of each of the note sets detected.
4. The game process control method as defined in claim 3 , wherein the selection candidate character corresponding to the note set with a larger detection total count is selected as the game character caused to appear with a higher probability.
5. The game process control method as defined in claim 2 ,
wherein the game character determined to be the selection candidate character is associated in advance corresponding to a set note content which is a percentage of a predetermined note in the input sound;
wherein the method further comprises determining the set note content of the input sound input to the sound input section; and
wherein the game character corresponding to the determined set note content is determined to be included in the selection candidate characters.
6. The game process control method as defined in claim 2 , further comprising:
detecting whether or not a set note which is a note set in advance is included in the input sound input to the sound input section at given time intervals;
wherein a special character is determined to be included in the selection candidate characters when a detection total count of the detected set note has reached a specific number.
7. The game process control method as defined in claim 1 ,
wherein the game character is associated in advance with each of a plurality of time conditions obtained by dividing a period in which the input sound may be input by date and/or time; and
wherein the game character corresponding to the time condition satisfied by an input time of the input sound from the sound input section is selected as the game character caused to appear.
8. The game process control method as defined in claim 1 , further comprising:
detecting an input timing of each note included in the input sound input to the sound input section;
wherein the note set is detected which includes the notes input at the same input timing.
9. The game process control method as defined in claim 1 , wherein the note set included in the input sound input to the sound input section is detected in note name units.
10. The game process control method as defined in claim 1 , further comprising:
subjecting the input sound to a filtering process by detecting only the notes included in the input sound input to the sound input section and having a specific intensity;
wherein the note set is detected using the input sound subjected to the filtering process as the input sound input to the sound input section.
11. The game process control method as defined in claim 10 , wherein the filtering process includes causing a portion of the input sound input to the sound input section in which a specific number or more of notes are input at the same time to be silent.
12. A computer-readable information recording medium storing a program for causing a computer to execute the game process control method as defined in claim 1 .
13. A game device comprising:
a sound input section;
a note set detection section which detects a note set included in input sound input to the sound input section, the note set being one of different types of note sets formed by combining predetermined notes;
a character selection section which selects a game character caused to appear based on the detection result of the note set detection section; and
a character appearance control section which causes the game character selected by the character selection section to appear.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2006234164A JP4108719B2 (en) | 2006-08-30 | 2006-08-30 | PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE |
JP2006-234164 | 2006-08-30 |
Publications (1)
Publication Number | Publication Date |
---|---|
US20080058101A1 true US20080058101A1 (en) | 2008-03-06 |
Family
ID=38616989
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US11/892,789 Abandoned US20080058101A1 (en) | 2006-08-30 | 2007-08-27 | Game process control method, information storage medium, and game device |
Country Status (3)
Country | Link |
---|---|
US (1) | US20080058101A1 (en) |
JP (1) | JP4108719B2 (en) |
GB (1) | GB2446677B (en) |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090163282A1 (en) * | 2007-12-25 | 2009-06-25 | Takumi Masuda | Computer-readable storage medium storing game program, and game apparatus |
CN102247695A (en) * | 2010-05-19 | 2011-11-23 | 万代股份有限公司 | Gaming device and computer program |
US20120059490A1 (en) * | 2010-09-07 | 2012-03-08 | Hoya Corporation | Photographing device having game function |
US20120057008A1 (en) * | 2010-09-07 | 2012-03-08 | Hoya Corporation | Photographing device having game function, and method of executing game |
US20140238220A1 (en) * | 2013-02-27 | 2014-08-28 | Yamaha Corporation | Apparatus and method for detecting chord |
US20150228202A1 (en) * | 2014-02-10 | 2015-08-13 | Samsung Electronics Co., Ltd. | Method of playing music based on chords and electronic device implementing the same |
Families Citing this family (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5399831B2 (en) * | 2009-09-11 | 2014-01-29 | 株式会社コナミデジタルエンタテインメント | Music game system, computer program thereof, and method of generating sound effect data |
JP6978486B2 (en) * | 2016-09-05 | 2021-12-08 | グリー株式会社 | Game control methods, computers and control programs |
JP7036542B2 (en) * | 2017-06-15 | 2022-03-15 | 株式会社スクウェア・エニックス | Video game processor and video game processor |
Citations (29)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3894186A (en) * | 1972-10-20 | 1975-07-08 | Sound Sciences Inc | Tone analysis system with visual display |
US4024789A (en) * | 1973-08-30 | 1977-05-24 | Murli Advani | Tone analysis system with visual display |
US4081829A (en) * | 1976-08-23 | 1978-03-28 | Atari, Inc. | Audio activated video display |
US4257062A (en) * | 1978-12-29 | 1981-03-17 | Meredith Russell W | Personalized audio-visual system |
US4267561A (en) * | 1977-11-02 | 1981-05-12 | Karpinsky John R | Color video display for audio signals |
US4301703A (en) * | 1980-04-14 | 1981-11-24 | Kimball International, Inc. | High note data generator |
US4331062A (en) * | 1980-06-02 | 1982-05-25 | Rogers Allen E | Visual note display apparatus |
US4363482A (en) * | 1981-02-11 | 1982-12-14 | Goldfarb Adolph E | Sound-responsive electronic game |
US4366741A (en) * | 1980-09-08 | 1983-01-04 | Musitronic, Inc. | Method and apparatus for displaying musical notations |
US4392409A (en) * | 1979-12-07 | 1983-07-12 | The Way International | System for transcribing analog signals, particularly musical notes, having characteristic frequencies and durations into corresponding visible indicia |
US4768086A (en) * | 1985-03-20 | 1988-08-30 | Paist Roger M | Color display apparatus for displaying a multi-color visual pattern derived from two audio signals |
US5513129A (en) * | 1993-07-14 | 1996-04-30 | Fakespace, Inc. | Method and system for controlling computer-generated virtual environment in response to audio signals |
US5990405A (en) * | 1998-07-08 | 1999-11-23 | Gibson Guitar Corp. | System and method for generating and controlling a simulated musical concert experience |
US6066790A (en) * | 1995-07-14 | 2000-05-23 | Freeland; Stephen J. | Multiple frequency display for musical sounds |
US6313843B1 (en) * | 1997-08-27 | 2001-11-06 | Casio Computer Co., Ltd. | Apparatus and method for controlling image display, and recording medium storing program for controlling image display |
US6319130B1 (en) * | 1998-01-30 | 2001-11-20 | Konami Co., Ltd. | Character display controlling device, display controlling method, and recording medium |
US6379244B1 (en) * | 1997-09-17 | 2002-04-30 | Konami Co., Ltd. | Music action game machine, performance operation instructing system for music action game and storage device readable by computer |
US6464585B1 (en) * | 1997-11-20 | 2002-10-15 | Nintendo Co., Ltd. | Sound generating device and video game device using the same |
US6485369B2 (en) * | 1999-05-26 | 2002-11-26 | Nintendo Co., Ltd. | Video game apparatus outputting image and music and storage medium used therefor |
US6541692B2 (en) * | 2000-07-07 | 2003-04-01 | Allan Miller | Dynamically adjustable network enabled method for playing along with music |
US6645067B1 (en) * | 1999-02-16 | 2003-11-11 | Konami Co., Ltd. | Music staging device apparatus, music staging game method, and readable storage medium |
US20050101364A1 (en) * | 2003-09-12 | 2005-05-12 | Namco Ltd. | Program, information storage medium, game system, and control method of the game system |
US6898759B1 (en) * | 1997-12-02 | 2005-05-24 | Yamaha Corporation | System of generating motion picture responsive to music |
US6905414B2 (en) * | 2002-05-16 | 2005-06-14 | Microsoft Corporation | Banning verbal communication to and from a selected party in a game playing system |
US7096186B2 (en) * | 1998-09-01 | 2006-08-22 | Yamaha Corporation | Device and method for analyzing and representing sound signals in the musical notation |
US7208669B2 (en) * | 2003-08-25 | 2007-04-24 | Blue Street Studios, Inc. | Video game system and method |
US20070163427A1 (en) * | 2005-12-19 | 2007-07-19 | Alex Rigopulos | Systems and methods for generating video game content |
US7601904B2 (en) * | 2005-08-03 | 2009-10-13 | Richard Dreyfuss | Interactive tool and appertaining method for creating a graphical music display |
US7799984B2 (en) * | 2002-10-18 | 2010-09-21 | Allegro Multimedia, Inc | Game for playing and reading musical notation |
Family Cites Families (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2860097B1 (en) * | 1998-01-08 | 1999-02-24 | テクモ株式会社 | Medium recording game program and game device |
JP3704001B2 (en) * | 1999-08-09 | 2005-10-05 | 株式会社トミー | Game device |
JP2006102270A (en) * | 2004-10-06 | 2006-04-20 | Sony Computer Entertainment Inc | Information processing method, and information processing terminal |
- 2006-08-30 JP JP2006234164A patent/JP4108719B2/en active Active
- 2007-08-27 US US11/892,789 patent/US20080058101A1/en not_active Abandoned
- 2007-08-30 GB GB0716859A patent/GB2446677B/en not_active Expired - Fee Related
US6645067B1 (en) * | 1999-02-16 | 2003-11-11 | Konami Co., Ltd. | Music staging device apparatus, music staging game method, and readable storage medium |
US6485369B2 (en) * | 1999-05-26 | 2002-11-26 | Nintendo Co., Ltd. | Video game apparatus outputting image and music and storage medium used therefor |
US6541692B2 (en) * | 2000-07-07 | 2003-04-01 | Allan Miller | Dynamically adjustable network enabled method for playing along with music |
US6905414B2 (en) * | 2002-05-16 | 2005-06-14 | Microsoft Corporation | Banning verbal communication to and from a selected party in a game playing system |
US7799984B2 (en) * | 2002-10-18 | 2010-09-21 | Allegro Multimedia, Inc | Game for playing and reading musical notation |
US7208669B2 (en) * | 2003-08-25 | 2007-04-24 | Blue Street Studios, Inc. | Video game system and method |
US20050101364A1 (en) * | 2003-09-12 | 2005-05-12 | Namco Ltd. | Program, information storage medium, game system, and control method of the game system |
US7628699B2 (en) * | 2003-09-12 | 2009-12-08 | Namco Bandai Games Inc. | Program, information storage medium, game system, and control method of the game system |
US7601904B2 (en) * | 2005-08-03 | 2009-10-13 | Richard Dreyfuss | Interactive tool and appertaining method for creating a graphical music display |
US20070163427A1 (en) * | 2005-12-19 | 2007-07-19 | Alex Rigopulos | Systems and methods for generating video game content |
Cited By (11)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20090163282A1 (en) * | 2007-12-25 | 2009-06-25 | Takumi Masuda | Computer-readable storage medium storing game program, and game apparatus |
US9498708B2 (en) * | 2007-12-25 | 2016-11-22 | Nintendo Co., Ltd. | Systems and methods for processing positional and sound input of user input to a touch panel |
CN102247695A (en) * | 2010-05-19 | 2011-11-23 | 万代股份有限公司 | Gaming device and computer program |
US20120059490A1 (en) * | 2010-09-07 | 2012-03-08 | Hoya Corporation | Photographing device having game function |
US20120057008A1 (en) * | 2010-09-07 | 2012-03-08 | Hoya Corporation | Photographing device having game function, and method of executing game |
US8684804B2 (en) * | 2010-09-07 | 2014-04-01 | Pentax Ricoh Imaging Company, Ltd. | Photographing device having game function, and method of executing game |
US8740680B2 (en) * | 2010-09-07 | 2014-06-03 | Pentax Ricoh Imaging Company, Ltd. | Photographing device having game function |
US20140238220A1 (en) * | 2013-02-27 | 2014-08-28 | Yamaha Corporation | Apparatus and method for detecting chord |
US9117432B2 (en) * | 2013-02-27 | 2015-08-25 | Yamaha Corporation | Apparatus and method for detecting chord |
US20150228202A1 (en) * | 2014-02-10 | 2015-08-13 | Samsung Electronics Co., Ltd. | Method of playing music based on chords and electronic device implementing the same |
US9424757B2 (en) * | 2014-02-10 | 2016-08-23 | Samsung Electronics Co., Ltd. | Method of playing music based on chords and electronic device implementing the same |
Also Published As
Publication number | Publication date |
---|---|
GB0716859D0 (en) | 2007-10-10 |
GB2446677B (en) | 2008-12-31 |
JP4108719B2 (en) | 2008-06-25 |
JP2008054851A (en) | 2008-03-13 |
GB2446677A (en) | 2008-08-20 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20080058101A1 (en) | Game process control method, information storage medium, and game device | |
US11173399B2 (en) | Music video game with user directed sound generation | |
EP0978301B1 (en) | Character display controlling device, display controlling method, and recording medium | |
US9339725B2 (en) | Game program and game apparatus | |
KR100874176B1 (en) | Audio signal output method and background music generation method | |
US10013963B1 (en) | Method for providing a melody recording based on user humming melody and apparatus for the same | |
US8221236B2 (en) | Game process control method, information storage medium, and game device | |
CA2652364C (en) | Data computation unit for music game, data computation program for music game, and data computation method for music game | |
JP2006145851A (en) | Blow air discriminating program, blow air discriminator, game program and gaming device | |
CN106205571A (en) | A kind for the treatment of method and apparatus of singing voice | |
JP4404820B2 (en) | Singing order display system | |
US20110201425A1 (en) | Game machine and game program | |
WO2011046933A1 (en) | Music game system and method of providing same | |
JP4928195B2 (en) | PROGRAM, INFORMATION STORAGE MEDIUM, AND GAME DEVICE | |
JP2010060631A (en) | Karaoke device | |
US11302296B2 (en) | Method implemented by processor, electronic device, and performance data display system | |
KR20050117808A (en) | Method for music producing game and its program storing recorded medium | |
JP2004298426A (en) | Pinball game machine | |
JP2016194622A (en) | Karaoke device and karaoke program | |
JP6114611B2 (en) | Drawing singing scoring system | |
JP2019136246A (en) | Computer system and program | |
JP2012205754A (en) | Game device, method for controlling the same, and program | |
JP4429244B2 (en) | Karaoke equipment | |
JPH11316587A (en) | Sound generator | |
JP2003108161A (en) | Karaoke device outputting game contents between karaoke performances |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| AS | Assignment | Owner name: NAMCO BANDAI GAMES INC., JAPAN; Free format text: ASSIGNMENT OF ASSIGNORS INTEREST; ASSIGNOR: HATO, YOSHIKAZU; REEL/FRAME: 019785/0914; Effective date: 20070803 |
| STCB | Information on status: application discontinuation | Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |