US20070015575A1 - Game apparatus and its control method - Google Patents

Game apparatus and its control method Download PDF

Info

Publication number
US20070015575A1
US20070015575A1 US11/485,482 US48548206A US2007015575A1 US 20070015575 A1 US20070015575 A1 US 20070015575A1 US 48548206 A US48548206 A US 48548206A US 2007015575 A1 US2007015575 A1 US 2007015575A1
Authority
US
United States
Prior art keywords
azimuth
character
display
operation mode
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US11/485,482
Inventor
Shotaro Matsuno
Kengo Nakanishi
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Bandai Co Ltd
Original Assignee
Bandai Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Bandai Co Ltd filed Critical Bandai Co Ltd
Assigned to BANDAI CO., LTD. reassignment BANDAI CO., LTD. ASSIGNMENT OF ASSIGNORS INTEREST (SEE DOCUMENT FOR DETAILS). Assignors: MATSUNO, SHOTARO, NAKANISHI, KENGO
Publication of US20070015575A1 publication Critical patent/US20070015575A1/en
Abandoned legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F 13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F 13/20 Input arrangements for video game devices
    • A63F 13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F 13/211 Input arrangements using inertial sensors, e.g. accelerometers or gyroscopes
    • A63F 13/50 Controlling the output signals based on the game progress
    • A63F 13/53 Controlling the output signals involving additional visual information provided to the game scene, e.g. by overlay to simulate a head-up display [HUD] or displaying a laser sight in a shooting game
    • A63F 13/80 Special adaptations for executing a specific game genre or game mode
    • A63F 13/818 Fishing
    • A63F 2300/00 Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F 2300/10 Features characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F 2300/105 Input arrangements using inertial sensors, e.g. accelerometers, gyroscopes
    • A63F 2300/30 Features characterized by output arrangements for receiving control signals generated by the game device
    • A63F 2300/303 Output arrangements for displaying additional data, e.g. simulating a Head Up Display
    • A63F 2300/305 Output arrangements for providing a graphical or textual hint to the player
    • A63F 2300/307 Output arrangements for displaying an additional window with a view from the top of the game field, e.g. radar screen
    • A63F 2300/40 Features characterised by details of platform network
    • A63F 2300/404 Features characterized by a local network connection
    • A63F 2300/80 Features specially adapted for executing a specific type of game
    • A63F 2300/8035 Virtual fishing
    • A63F 2300/8076 Shooting

Definitions

  • the present invention relates to a game apparatus and, more particularly, to a technique associated with a portable game apparatus.
  • This game apparatus is a handgun type apparatus.
  • a player holds a grip unit of this apparatus, opens an actuation flap sideways, opens a display upward, and depresses a trigger with a muzzle unit pointed to a sound source.
  • the frequency or waveform pattern of a sampled sonic wave is analyzed by a computer in the apparatus, and a specific monster which is set in advance is selected based on the analysis result.
  • Whether or not the selected monster appears is determined based on parameters such as time, temperature, and the like. If appearance of the monster is determined, an image of this monster is displayed on a liquid crystal panel. The player operates an arrow button and the trigger to capture this monster. If the monster is successfully captured, the number of captured monsters is updated. Various monsters can be collected by making capturing operations toward various sound sources.
  • the inventors of the present invention have focused attention on a technique for providing a game according to an azimuth which the game apparatus main body faces by newly adding an azimuth sensor to the aforementioned conventional portable game apparatus.
  • an object of the present invention to provide a game apparatus which searches for and captures characters which exist on a virtual space based on the azimuth which the game apparatus main body faces.
  • a game apparatus of the present invention comprises the following components.
  • a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode
  • processing means for performing game processing corresponding to the operation mode detected by the detection means
  • wherein, when the input of the first operation mode is detected by the detection means, the processing means controls the display unit to display a location distribution of at least one character, with reference to the location information of the at least one character held in the memory, and
  • when the input of the second operation mode is detected by the detection means, the processing means controls the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • the processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • a game apparatus of the present invention comprises the following components.
  • a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between the characters and the azimuth sensor;
  • processing means for performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • a game apparatus of the present invention comprises the following components.
  • a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode
  • processing means for performing game processing corresponding to the operation mode detected by the detection means
  • wherein, when the input of the first operation mode is detected by the detection means, the processing means controls the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
  • when the input of the second operation mode is detected by the detection means, the processing means controls the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • the processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • a game apparatus of the present invention comprises the following components.
  • a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between the characters and the azimuth sensor;
  • a game apparatus of the present invention comprises the following components.
  • a game apparatus which comprises an operation unit and a display unit, comprising:
  • an azimuth sensor for detecting an azimuth
  • a memory which holds, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode
  • processing means for performing game processing corresponding to the operation mode detected by the detection means
  • wherein, when the input of the first operation mode is detected by the detection means, the processing means controls the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in the memory, and
  • when the input of the second operation mode is detected by the detection means, the processing means makes notification according to a difference between an azimuth of a character located in an azimuth closest to the azimuth detected by the azimuth sensor and the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • the processing means performs game processing in accordance with a display state on the display unit or an operation result to the operation unit based on the notification result.
  • a method of controlling a game apparatus of the present invention comprises the following steps.
  • a method of controlling a game apparatus which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • the processing step includes the steps of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
  • controlling, when the input of the second operation mode is detected in the detection step, the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • a method of controlling a game apparatus of the present invention comprises the following steps.
  • a method of controlling a game apparatus which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory comprising:
  • a method of controlling a game apparatus of the present invention comprises the following arrangement.
  • a method of controlling a game apparatus which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • the processing step includes the steps of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
  • controlling, when the input of the second operation mode is detected in the detection step, the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • a method of controlling a game apparatus of the present invention comprises the following steps.
  • a method of controlling a game apparatus which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a method of controlling a game apparatus of the present invention comprises the following steps.
  • a method of controlling a game apparatus which comprises an operation unit, a display unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • the processing step includes a step of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in the memory, and
  • the processing step also includes steps of:
  • FIG. 1 is a perspective view showing the outer appearance of a portable game apparatus according to the first embodiment of the present invention
  • FIG. 2 is a block diagram showing the hardware arrangement of a portable game apparatus 100 according to the first embodiment of the present invention
  • FIG. 3 is a flowchart of the main processing to be executed by the portable game apparatus 100 (CPU 201 );
  • FIG. 4 is a view for explaining a display position of a marker
  • FIG. 5 shows a display example of the location distribution of characters on a display unit 103 .
  • FIG. 6 is a flowchart of the game processing in a second operation mode.
  • FIG. 7 is a flowchart showing the game processing in the second operation mode according to the fourth embodiment of the present invention.
  • FIG. 8A shows a display example of a display window to be displayed on the display section 103 in step S 703 ;
  • FIG. 8B shows a display example of a battle window
  • FIG. 9 is a perspective view showing the outer appearance of a portable game apparatus 900 according to the fifth embodiment of the present invention.
  • FIG. 10 is a block diagram showing the hardware arrangement of the portable game apparatus 900 according to the fifth embodiment of the present invention.
  • FIG. 11 is a side view of the portable game apparatus 900 when a second main body 900 b is rotated 90° around an axis 950 ;
  • FIG. 12 is a flowchart of the game processing in the second operation mode.
  • This embodiment will explain a portable game apparatus which provides a game for “fishing up” a character.
  • FIG. 1 is a perspective view showing the outer appearance of a portable game apparatus according to this embodiment.
  • a portable game apparatus 100 is formed by a first main body 101 a and second main body 101 b.
  • the second main body 101 b is attached to the first main body 101 a to be slidable in a direction substantially parallel to the latter, as indicated by an arrow A in FIG. 1 .
  • the first main body 101 a is comprised of a display unit 103 , input units (for example, keys in FIG. 1 ) 104 a to 104 e , and operation unit (for example, a handle unit in FIG. 1 ) 102 .
  • the display unit 103 comprises, e.g., a liquid crystal display, and displays various game screens to be described later.
  • the keys 104 a to 104 e are used by the player to input various operation instructions.
  • the keys 104 a to 104 e will sometimes be referred to as a “key group 104 ” together.
  • the handle unit 102 is operated by the player to fish a character.
  • the handle 102 is attached to the first main body 101 a to be rotatable in a direction indicated by an arrow B in FIG. 1 around an axis (indicated by the dotted line in FIG. 1 ) of an attachment unit to the first main body 101 a.
  • the portable game apparatus has the outer appearance arrangement shown in FIG. 1 .
  • the positions, shapes, and the like of the display unit 103 , keys 104 a to 104 e , and handle unit 102 are not limited to those in FIG. 1 , and various modifications may be made. As will be apparent from the following description, the description below applies equally to each such modification.
  • FIG. 1 and subsequent figures mainly show parts used in the following description, and do not show any parts such as a power switch, and the like, which do not constitute the gist of this embodiment.
  • FIG. 2 is a block diagram showing the hardware arrangement of the portable game apparatus 100 according to this embodiment. Note that the same reference numerals in FIG. 2 denote the same parts as in FIG. 1 , and a description thereof will be omitted.
  • Reference numeral 201 denotes a CPU which controls the entire portable game apparatus 100 using programs and data stored in a ROM 203 , data temporarily held in a RAM 202 , and the like, and executes game processing to be described later.
  • the CPU 201 has an internal timer (not shown), which can measure time.
  • Reference numeral 202 denotes a RAM which can provide various areas as needed such as an area for temporarily storing data which is being processed, an area for temporarily storing the measurement result of an encoder 204 as data, an area for temporarily storing the detection result of an azimuth sensor 206 as data, a work area used by the CPU 201 upon executing various kinds of processing, and the like.
  • Reference numeral 203 denotes a ROM which stores programs and data required to make the CPU 201 control the overall portable game apparatus 100 and execute game processing to be described later.
  • the stored data include parameters associated with characters, and the parameters include initial vital forces (or health powers), weights, and the like of characters. These parameters are loaded onto the RAM 202 and are used in processing, as needed.
  • Reference numeral 204 denotes an encoder which measures the rotation angle of the handle unit 102 .
  • the measurement result of the encoder 204 is held in the RAM 202 as data.
  • Reference numeral 205 denotes a slide detector.
  • the second main body 101 b can slide across the first main body 101 a , as described above, and has a state slid to the fullest extent to one side, and a state slid to the fullest extent to the other side.
  • the slide detector 205 detects one of the former state (a state in which substantially the entire body of the second main body 101 b overlaps the first main body 101 a ) and the latter state (a state in which substantially the entire body of the second main body 101 b does not overlap the first main body 101 a ), and notifies the CPU 201 of the detection result.
  • when the detection result indicates the former state, the CPU 201 sets the operation mode of the portable game apparatus 100 in a first operation mode; when the detection result indicates the latter state, the CPU 201 sets the operation mode of the portable game apparatus 100 in a second operation mode.
  • the first and second operation modes will be described later.
  • Reference numeral 206 denotes an azimuth sensor which detects the azimuth of the main body. The detection result is held in the RAM 202 as data. Note that the detection precision of the azimuth sensor 206 is not particularly limited.
  • Reference numeral 207 denotes a bus which interconnects the aforementioned units.
  • FIG. 3 shows the flowchart of this processing.
  • the program and data for making the CPU 201 execute the processing according to the flowchart shown in FIG. 3 are stored in the ROM 203 , and the portable game apparatus 100 according to this embodiment executes respective processes to be described below when the CPU 201 executes the processing using the program and data. Execution of the processing according to the flowchart in FIG. 3 is started when the player turns on the power switch of the portable game apparatus and inputs a game start instruction using the key group 104 .
  • the CPU 201 accepts the detection result sent from the slide detector 205 , and sets one of the first operation mode and the second operation mode according to the detection result (step S 300 ), as described above.
  • if the first operation mode is set, the flow advances to step S 302 , and the CPU 201 executes game processing in the first operation mode (step S 302 ).
  • if the second operation mode is set, the flow advances to step S 303 , and the CPU 201 executes game processing in the second operation mode (step S 303 ). Details of the processes in steps S 302 and S 303 will be described later.
  • If the player inputs a game end instruction using the key group 104 or if a game end condition is met, the processing according to this flowchart ends. However, if the player does not input any game end instruction or if the game end condition is not met, the flow returns to step S 300 to execute the subsequent processes.
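The flow of FIG. 3 can be pictured as a simple loop. The following Python sketch is illustrative only and not taken from the patent; the callback names (poll_slide_detector, run_first_mode, run_second_mode, game_should_end) are hypothetical stand-ins for steps S 300, S 302, S 303 and the end check.

```python
from enum import Enum

class Mode(Enum):
    FIRST = 1   # second main body overlaps the first main body
    SECOND = 2  # second main body is slid out

def main_loop(poll_slide_detector, run_first_mode, run_second_mode, game_should_end):
    """Hypothetical rendering of FIG. 3: set the mode (S 300), run S 302 or S 303, check the end condition."""
    while True:
        # Step S 300: choose the operation mode from the slide detector's result.
        mode = Mode.FIRST if poll_slide_detector() == "overlapped" else Mode.SECOND
        if mode is Mode.FIRST:
            run_first_mode()    # step S 302: game processing in the first operation mode
        else:
            run_second_mode()   # step S 303: game processing in the second operation mode
        # Quit when the player requests it or a game end condition is met.
        if game_should_end():
            break
```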
  • the game processing in the first operation mode will be described first. Assuming that characters to be fished up are virtually located within a surrounding area having the position of the azimuth sensor 206 as the center, appropriate azimuth angles and (virtual) distances are given to the characters to be located.
  • the azimuth angle indicates an azimuth angle from the position of the azimuth sensor 206 to the location of each character.
  • the distance indicates a distance from the position of the azimuth sensor 206 to the location of each character.
  • to “locate” a character means to register set data of the azimuth angle and distance (location information indicating a relative positional relationship between the character to be located and the azimuth sensor 206 ) in the RAM 202 .
  • a method of generating this set data is not particularly limited, and the distance and azimuth angle can be randomly given, or they may be determined according to predetermined rules for each character.
  • various methods may be used to locate characters.
  • the following method may be used. That is, set data of an azimuth of appearance, distance, and appearance date and time is stored in the ROM 203 for each character. As for a character whose appearance date and time matches the current date and time measured by the CPU 201 , this character is located at the azimuth angle and distance included in the set data together with this appearance date and time.
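As a way to picture the location information described above, the following sketch registers an azimuth angle and a virtual distance for each character. It is a minimal illustration, not the patent's implementation; the CharacterLocation record and the random placement rule are assumptions (the patent also allows rule-based and schedule-based placement).

```python
import random
from dataclasses import dataclass

@dataclass
class CharacterLocation:
    name: str
    azimuth_deg: float   # azimuth angle from the azimuth sensor 206 to the character
    distance: float      # virtual distance from the azimuth sensor 206 to the character

def locate_characters(names, max_distance=100.0):
    """Randomly place each character around the azimuth sensor (one possible rule)."""
    return {
        name: CharacterLocation(
            name=name,
            azimuth_deg=random.uniform(0.0, 360.0),
            distance=random.uniform(1.0, max_distance),
        )
        for name in names
    }

# The RAM would hold one such record per located character.
locations = locate_characters(["carp", "tuna", "eel"])
```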
  • after the characters are located, the CPU 201 moves the individual characters.
  • the azimuth angles and distances of the characters are changed according to their movements.
  • the CPU 201 executes processing for updating the azimuth angles and distances of the respective characters managed in the RAM 202 in accordance with their movements.
  • the current azimuth angles and distances of the respective characters are managed in the RAM 202 .
  • the CPU 201 displays the location distribution of the characters on the display screen of the display unit 103 . This display will be described below.
  • FIG. 5 shows a display example of the location distribution of the characters on the display unit 103 .
  • reference numeral 501 denotes a circular area which has the position of the azimuth sensor 206 as the center; and 502 to 504 , markers indicating the positional relationships of the respective characters relative to the position of the azimuth sensor 206 .
  • FIG. 4 is a view for explaining the display position of a marker.
  • let (cx, cy) be the coordinates of the position of the azimuth sensor 206 on the display screen.
  • the display unit 103 displays the position of each character based on the direction and distance viewed from the position of the azimuth sensor 206 on its display screen.
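FIG. 4 is not reproduced here, but the mapping it explains can be sketched as follows. This is an assumption consistent with the description above (each marker drawn at a direction and scaled distance around the center (cx, cy)); the north-up convention and the scale factor are illustrative choices, not taken from the patent.

```python
import math

def marker_position(cx, cy, azimuth_deg, distance, radius_px, max_distance):
    """Map a character's azimuth angle and virtual distance to screen coordinates.

    (cx, cy) is the screen position of the azimuth sensor (center of the circular
    area 501); 0 degrees is assumed to point straight up the screen.
    """
    r = radius_px * min(distance / max_distance, 1.0)  # scale the distance into the circle
    theta = math.radians(azimuth_deg)
    x = cx + r * math.sin(theta)
    y = cy - r * math.cos(theta)                       # screen y grows downward
    return x, y

# Example: a character due east (90 degrees) at half the maximum distance.
print(marker_position(cx=64, cy=64, azimuth_deg=90, distance=50, radius_px=60, max_distance=100))
```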
  • the present invention is not limited to the display mode shown in FIG. 5 as long as display according to such purport is made, and various display modes may be used.
  • when the character corresponding to a marker comes within a predetermined distance range with respect to the position of the azimuth sensor 206 , the CPU 201 flickers the circular area or displays a specific marker image to give a warning to that effect.
  • if the portable game apparatus of this embodiment comprises a sound generator and a sound output unit, a warning by means of a sound may be generated using them.
  • the flickering speed may be increased, or the sound volume may be turned up.
  • as described above, in the first operation mode, the set data of the location of each character and the distance between the location of the character and the azimuth sensor 206 is managed in the RAM 202 , and the distribution of the locations of the characters relative to the position of the azimuth sensor 206 is displayed on the display screen of the display unit 103 .
  • when a character comes close to the position of the azimuth sensor 206 , a warning that advises accordingly is given.
  • FIG. 6 is a flowchart of the game processing in the second operation mode.
  • the CPU 201 activates the azimuth sensor 206 to detect the azimuth of the azimuth sensor 206 itself, and stores the detected azimuth in the RAM 202 as data (step S 601 ).
  • the CPU 201 refers to the azimuth angles of the respective characters managed in the RAM 202 to specify characters whose azimuth angles differ from the azimuth acquired in step S 601 by a predetermined amount or less, and displays markers at the corresponding positions on the display screen of the display unit 103 (step S 602 ).
  • the display position of each marker is as has been described above using FIG. 4 .
  • the CPU 201 can display the location distribution of characters which are located in azimuths near the direction in which the portable game apparatus currently faces (the azimuth detected by the azimuth sensor 206 ) on the display screen of the display unit 103 .
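A minimal sketch of the selection performed in step S 602, assuming azimuths are kept in degrees and that differences wrap around at 360 degrees; it operates on records like the CharacterLocation entries sketched earlier, and the helper names are hypothetical.

```python
def azimuth_difference(a_deg, b_deg):
    """Smallest absolute difference between two azimuths, in degrees (0 to 180)."""
    d = abs(a_deg - b_deg) % 360.0
    return min(d, 360.0 - d)

def characters_near_azimuth(locations, sensor_azimuth_deg, tolerance_deg=30.0):
    """Pick the characters whose azimuth angles are within tolerance_deg of the
    azimuth currently detected by the azimuth sensor (the selection of step S 602)."""
    return [loc for loc in locations.values()
            if azimuth_difference(loc.azimuth_deg, sensor_azimuth_deg) <= tolerance_deg]
```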
  • the CPU 201 counts the number of markers currently displayed on the display screen of the display unit 103 , and checks if the count value is equal to or larger than a predetermined value M (step S 603 ). If it is determined as a result of checking in step S 603 that the count value is smaller than the predetermined value M, no character can be fished up, and the flow returns to step S 304 .
  • if the count value is equal to or larger than the predetermined value M, the flow advances to step S 604 , and the CPU 201 checks if a standby mode to be described later is set (step S 604 ). If it is determined as a result of checking that the standby mode is currently set, the flow advances to step S 606 .
  • the processing in step S 606 and subsequent steps will be described later.
  • if the standby mode is not set, the CPU 201 makes an indication on the display screen of the display unit 103 to indicate that the count value is equal to or larger than the predetermined value M, and starts its internal timer (step S 605 ).
  • as this indication, for example, a predetermined image may flicker, or, when a light-emitting element such as an LED or the like is provided on this portable game apparatus, it may flicker.
  • the CPU 201 checks in step S 605 if the predetermined key is pressed within a predetermined period of time after the start of the timer (step S 605 ). If the predetermined key is not pressed, the flow returns to step S 304 .
  • in step S 606 , the CPU 201 counts the number of markers currently displayed on the display screen of the display unit 103 , and checks if the count value is equal to or larger than a predetermined value N (>M) (step S 606 ).
  • if it is determined as a result of checking in step S 606 that the count value is smaller than the predetermined value N, the flow returns to step S 304 .
  • on the other hand, if it is determined as a result of checking that the count value is equal to or larger than the predetermined value N, the flow advances to step S 607 , and the CPU 201 flickers a character image on the display screen of the display unit 103 (step S 607 ). Note that various display modes may be used as that in step S 607 .
  • If the number of markers currently displayed on the display screen of the display unit 103 is equal to or larger than the predetermined value N (>M), the player cannot fish up a character unless he or she presses a predetermined key of the key group 104 a predetermined number of times within a predetermined period of time. Therefore, the player presses the predetermined key of the key group 104 the predetermined number of times within the predetermined period of time.
  • upon detection of the first pressing of the predetermined key after the flow advances to step S 608 , the CPU 201 starts the internal timer from the detection timing, and checks if the predetermined key is pressed a predetermined number of times or more (e.g., 10 times or more) within the predetermined period of time (e.g., 3 sec) (step S 608 ).
  • if the predetermined key is not pressed the predetermined number of times, the flow advances to step S 609 , and the CPU 201 displays a message indicating that the player has failed to fish up the character on the display screen of the display unit 103 . The flow then returns to step S 304 .
  • After the display processing in step S 609 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed. Hence, in step S 609 the CPU 201 also displays, on the display screen of the display unit 103 , a message that prompts the player to slide the second main body 101 b so that substantially the entire body of the second main body 101 b overlaps the first main body 101 a .
  • on the other hand, if the predetermined key is pressed the predetermined number of times or more, the flow advances to step S 610 to enter a battle mode (second step) with the character, and the CPU 201 displays a message indicating that the battle with the character starts on the display screen of the display unit 103 (step S 610 ). For example, the CPU 201 displays a message “HIT! & BATTLE!” on the display screen of the display unit 103 .
  • the image of the character as an opponent is displayed on the display screen of the display unit 103 .
  • the player turns the handle unit 102 (in the direction indicated by the arrow B in FIG. 1 ) so as to fish up the opponent character.
  • its rotation angle is measured by the encoder 204 .
  • since the CPU 201 has started its internal timer, it calculates a rotational velocity (rotation angle/sec) of the handle unit 102 based on these variables (step S 611 ).
  • the CPU 201 calculates the tension of the fishing line using the rotational velocity and the parameter indicating the weight of the opponent character, and compares the calculated tension value with a predetermined threshold to check if the tension is equal to or higher than the predetermined threshold (step S 611 ).
  • If the tension is equal to or higher than the predetermined threshold, the flow advances to step S 609 to display a message indicating that the player has failed to fish up the opponent character on the display screen of the display unit 103 . The flow then returns to step S 304 . That is, the opponent character cannot be fished up unless the tension is kept lower than the predetermined threshold.
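The patent does not give the tension formula. The sketch below simply assumes that tension grows with both the reeling speed and the character's weight, to illustrate the threshold check described above; the coefficient k and the function names are placeholders.

```python
def line_tension(rotation_deg_per_sec, weight, k=0.01):
    """Assumed model: tension grows with reeling speed times character weight."""
    return k * rotation_deg_per_sec * weight

def line_breaks(rotation_deg_per_sec, weight, threshold):
    """True means the tension reached the threshold, i.e. the player loses the fish."""
    return line_tension(rotation_deg_per_sec, weight) >= threshold

# Example: reeling at 720 deg/s against a heavy character exceeds a threshold of 50.
print(line_breaks(720.0, weight=8.0, threshold=50.0))
```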
  • if the tension is lower than the predetermined threshold, the flow advances to step S 614 to execute processing for reducing the HP (health power) of the opponent character by an amount corresponding to the rotational velocity calculated in step S 611 (step S 614 ). For example, if the player has turned the handle unit 102 more times, HP is reduced by a larger amount.
  • the opponent character moves so as not to be easily fished up.
  • the azimuth and distance of this character change.
  • the player variously changes the azimuth of this portable game apparatus 100 to turn the direction (the azimuth measured by the azimuth sensor 206 ) of the portable game apparatus 100 to the azimuth where this character exists.
  • when the direction of the portable game apparatus 100 matches the azimuth where the character exists, the image of that character is displayed on the display screen of the display unit 103 .
  • in step S 614 , the HP of this character is reduced only when the image of the character is displayed on the display screen of the display unit 103 and the player turns the handle unit 102 while keeping the tension lower than the predetermined threshold.
  • in other words, the HP of the character is not reduced simply by turning the handle unit 102 ; the player must search for the azimuth where the character exists, and the HP is reduced only after the image of this character is displayed on the display screen of the display unit 103 and the handle unit 102 is turned. The operation required of the player to fish up the character therefore becomes harder, which increases the game difficulty. Note that the moving speeds of the characters may differ depending on the characters.
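Under the same assumptions, the HP update of step S 614 might look like the following sketch: HP drops in proportion to the rotational velocity, but only while the apparatus faces the character and the tension stays below the threshold. The rate constant k is illustrative.

```python
def update_hp(hp, facing_character, rotation_deg_per_sec, tension, tension_threshold, k=0.05):
    """Reduce HP only while the character is on screen (azimuth matched) and the
    line tension stays below the threshold; the reduction rate k is illustrative."""
    if facing_character and tension < tension_threshold:
        hp -= k * rotation_deg_per_sec
    return max(hp, 0.0)
```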
  • the CPU 201 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S 615 ). That is, since the battle with the opponent character must be done within the predetermined period of time (this “predetermined period of time” may vary depending on the types of opponent characters), if the predetermined period of time has elapsed, the flow advances to step S 609 to display a message indicating that the player has failed to fish up the character on the display screen of the display unit 103 , and the flow then returns to step S 304 .
  • if the predetermined period of time has not elapsed, the flow returns to step S 611 to repeat the subsequent processes.
  • the operation mode is switched by sliding the second main body 101 b.
  • the present invention is not limited to this.
  • the operation mode may be switched based on, e.g., a predetermined key input of the key group 104 .
  • as described above, according to this embodiment, a game can be provided which virtually locates characters around the azimuth sensor 206 , i.e., around the player of this portable game apparatus 100 , and allows the player to fish up any of the characters.
  • Unlike known virtual reality techniques, this game can be implemented without any complicated three-dimensional coordinate calculations.
  • in the above description, the battle mode is started when the predetermined key is pressed a predetermined number of times or more within the predetermined period of time in the state in step S 607 .
  • the condition, operation, and the like to start the battle mode are not particularly limited.
  • a condition that “the handle unit 102 is turned a predetermined number of times within a predetermined period of time in the state in step S 607 ” may be used.
  • when a character is successfully fished up, data associated with the fished character is held in the RAM 202 .
  • if the portable game apparatus 100 incorporates a storage device such as a hard disk which can hold information after the power is turned off, such data can be stored in this storage device.
  • the CPU 201 reads out the data associated with the fished characters from such storage device, and displays a list of information associated with the fished characters such as the images, names, weights, fished dates and times, and the like of the fished characters on the display screen of the display unit 103 like a picture book.
  • a mode for bringing up the fished character may be provided.
  • a mode for converting the fished character into coins that can be used in this game and purchasing items by the converted coins may be provided.
  • Such items may include those which allow the player to fish up a character more efficiently. For example, such an item may reduce the threshold to be compared with the tension.
  • the portable game apparatus 100 may comprise a communication unit which can make data communications with other portable game apparatuses.
  • the following game may be carried out. That is, each player may play the game for fishing up characters using his or her own game apparatus, and the players may exchange information associated with the fished characters via the communication unit after an elapse of a predetermined period of time. The game apparatuses then check which player fished up the bigger character to determine a winner.
  • in the first embodiment, characters whose azimuth angles differ from the azimuth acquired in step S 601 by a predetermined amount or less are specified, and markers are displayed at corresponding positions on the display screen of the display unit 103 in step S 602 .
  • in this embodiment, instead, a character having the azimuth angle closest to the azimuth acquired in step S 601 is specified, and the probability of occurrence of the battle with the specified character is displayed on the display screen of the display unit 103 .
  • This probability is indicated by the number of markers displayed on the display screen of the display unit 103 . That is, the probability of the battle becomes higher with increasing number of markers.
  • the display method of the probability is not particularly limited.
  • this probability (corresponding to the number of displayed markers) changes according to predetermined rules or randomly.
  • the CPU 201 performs change control of the probability.
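One hedged way to realize this display is to map a battle probability, updated by rule or at random, onto a number of markers. The linear mapping and the function names below are assumptions, not the patent's method.

```python
import random

def markers_for_probability(probability, max_markers=6):
    """Show more markers as the probability of a battle rises (linear mapping assumed)."""
    probability = min(max(probability, 0.0), 1.0)
    return round(probability * max_markers)

def next_probability(current, step=0.1):
    """One example of a probability that changes according to predetermined rules or randomly."""
    return min(max(current + random.uniform(-step, step), 0.0), 1.0)
```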
  • the processes in step S 603 and subsequent steps are the same as those in the first embodiment.
  • that is, when the number of displayed markers becomes equal to or larger than the predetermined value N, the processes in step S 603 and subsequent steps are executed to battle with this character.
  • FIG. 7 is a flowchart showing the game processing in the second operation mode according to this embodiment.
  • the CPU 201 activates the azimuth sensor 206 to make it detect its own azimuth, and stores the detected azimuth in the RAM 202 as data (step S 701 ).
  • the CPU 201 refers to the azimuth angles of the respective characters managed in the RAM 202 to specify a character having an azimuth angle closest to the azimuth acquired in step S 701 , and displays a gauge window having a length corresponding to the difference (azimuth difference) between the azimuth angle of the specified character and the azimuth acquired in step S 701 on the display screen of the display section 103 (step S 702 ).
  • FIG. 8A shows a display example of the display screen displayed on the display section 103 in step S 702 .
  • reference numeral 801 denotes a gauge which is formed by a plurality of gauge blocks. The length of the gauge changes depending on the number of gauge blocks to be displayed.
  • the length of this gauge becomes smaller (the number of gauge blocks to be displayed becomes smaller) with increasing azimuth difference, or it becomes larger (the number of gauge blocks to be displayed becomes larger) with decreasing azimuth difference.
  • when the azimuth difference is 180°, one gauge block is displayed; when the azimuth difference is 0°, a maximum number of gauge blocks ( 6 in FIG. 8A ) are displayed.
  • by observing the gauge, the player can confirm the azimuth difference from the character located at the azimuth angle closest to the direction in which the portable game apparatus currently faces (the azimuth detected by the azimuth sensor 206 ).
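A small sketch of the gauge mapping described above: one block at an azimuth difference of 180 degrees, the maximum number of blocks (six in FIG. 8A) at 0 degrees, with a linear interpolation assumed in between.

```python
def gauge_blocks(azimuth_difference_deg, max_blocks=6):
    """Longer gauge (more blocks) as the azimuth difference shrinks."""
    d = min(max(azimuth_difference_deg, 0.0), 180.0)
    # 180 degrees -> 1 block, 0 degrees -> max_blocks, linear in between (assumption).
    return 1 + round((1.0 - d / 180.0) * (max_blocks - 1))

print(gauge_blocks(180.0), gauge_blocks(90.0), gauge_blocks(0.0))  # -> 1 3 6
```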
  • the image of the character located at the azimuth angle closest to the azimuth acquired in step S 701 is displayed on an area 802 on the window of FIG. 8A .
  • the player may be audibly notified of that fact, or if the portable game apparatus 100 comprises an LED, he or she may be notified of that fact by flickering the LED.
  • the notification mode is not particularly limited as long as the purpose of notifying that the azimuth difference becomes equal to or smaller than the predetermined value remains the same.
  • in step S 704 , the CPU 201 checks if the handle section 102 has been turned a predetermined amount.
  • the method of acquiring the turning amount of the handle section 102 is the same as that in the first embodiment. As a result, if the gauge length becomes maximum and the handle section 102 has been turned the predetermined amount, the flow advances to step S 705 ; otherwise, the flow returns to step S 702 .
  • in step S 705 , the CPU 201 displays, on the display screen of the display section 103 , a message indicating that the battle starts with the character displayed on the area 802 , when the gauge length becomes maximum (i.e., the azimuth difference becomes sufficiently small) and the handle section 102 has been turned the predetermined amount (step S 705 ).
  • the CPU 201 displays a message “HIT!” on the display screen of the display section 103 .
  • the CPU 201 displays a battle window shown in FIG. 8B on the display screen of the display section 103 (step S 706 ).
  • FIG. 8B shows a display example of the battle window.
  • in step S 706 , the CPU 201 further displays the direction in which the portable game apparatus 100 is to face on the display screen of the display section 103 so that the azimuth of the portable game apparatus 100 matches the azimuth angle of the opponent character.
  • in step S 706 , the CPU 201 also executes the processing for sequentially acquiring the azimuth of the portable game apparatus 100 as in step S 701 .
  • the image of the opponent character is displayed on the area 802 on the window of FIG. 8B .
  • the CPU 201 checks if the azimuth difference is equal to or smaller than a sufficiently small predetermined value (step S 707 ). If the azimuth difference is larger than the predetermined value, the flow returns to step S 706 . On the other hand, if the azimuth difference is equal to or smaller than the predetermined value, the flow advances to step S 709 to execute the same processing as in step S 611 above. That is, the CPU 201 calculates a rotational velocity (rotation angle/sec) of the handle section 102 , and reads out a parameter indicating the weight of the opponent character from the ROM 203 onto the RAM 202 and refers to it.
  • the CPU 201 calculates the tension of a fishing line using the rotational velocity and the parameter of weight by the above method. The CPU 201 then compares the calculated tension value with a predetermined threshold to check if the tension is equal to or higher than the predetermined threshold (step S 709 ).
  • if the tension is equal to or higher than the predetermined threshold, the flow advances to step S 710 to display a message indicating that the player has failed to fish up the opponent character on the display screen of the display section 103 (step S 710 ).
  • the flow then returns to step S 304 .
  • After the display processing in step S 710 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed.
  • Hence, the CPU 201 also displays, on the display screen of the display section 103 , a message that prompts the player to slide the second main body 101 b so that substantially the entire body of the second main body 101 b overlaps the first main body 101 a .
  • on the other hand, if the tension is lower than the predetermined threshold, the flow advances to step S 713 to execute processing for reducing HP by an amount corresponding to the rotational velocity calculated in step S 709 (step S 713 ).
  • for example, if the player has turned the handle section 102 more times, HP is reduced by a larger amount.
  • reference numeral 803 denotes a gauge which indicates the HP value by means of its length, and the hatched portion indicates the remaining HP value. Therefore, when the HP is reduced, the length of the hatched portion (the length of the gauge) decreases.
  • the CPU 201 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S 714 ). That is, since the battle with the opponent character must be done within the predetermined period of time (this “predetermined period of time” may vary depending on the types of opponent characters), if the predetermined period of time has elapsed, the flow advances to step S 710 to display a message indicating that the player has failed to fish up the opponent character on the display screen of the display section 103 , and the flow then returns to step S 304 .
  • if the predetermined period of time has not elapsed, the flow returns to step S 706 to repeat the subsequent processes.
  • This embodiment will explain a portable game apparatus which provides a game for “fighting off” characters by shooting them.
  • FIG. 9 shows the outer appearance of a portable game apparatus according to this embodiment.
  • a portable game apparatus 900 comprises a first main body 900 a , second main body 900 b , and member 905 .
  • the first main body 900 a is attached to the member 905 via a hinge (not shown) to be rotatable around an axis 960 , which passes through a point 959 on a side surface 909 of the member 905 and is substantially perpendicular to the side surface 909 , as indicated by arrow A.
  • the second main body 900 b is attached to the member 905 to be rotatable around an axis 950 , which passes through a central point 911 on a bottom surface 910 of the second main body 900 b and is substantially perpendicular to the bottom surface 910 , as indicated by arrow B.
  • FIG. 11 is a side view of the portable game apparatus 900 when the second main body 900 b is rotated 90° around the axis 950 . In this embodiment, processing associated with a second operation mode starts after the apparatus is set in the state shown in FIG. 11 .
  • the first main body 900 a comprises a display section 901 .
  • the display section 901 comprises, e.g., a liquid crystal display, and displays various game windows to be described later.
  • the second main body 900 b comprises a trigger 902 and keys 904 a and 904 b.
  • the trigger 902 is attached to the second main body 900 b to be able to be pressed into it, as indicated by arrow C in FIG. 11 .
  • the keys 904 a and 904 b are used by the player to input various operation instructions. In the following description, the keys 904 a and 904 b will sometimes be referred to as a “key group 904 ” together.
  • the portable game apparatus has the outer appearance arrangement shown in FIGS. 9 and 11 .
  • the layout, shapes, and the like of the display section 901 , the trigger 902 , and the key group 904 are not limited to those shown in FIGS. 9 and 11 , and various modifications can be made. As will be apparent from the following description, the description below applies equally to each such modification.
  • FIG. 9 and subsequent figures mainly show parts used in the following description, and do not show any parts such as a power switch, and the like, which do not constitute the gist of this embodiment.
  • FIG. 10 is a block diagram showing the hardware arrangement of the portable game apparatus 900 according to this embodiment. Note that the same reference numerals in FIG. 10 denote the same parts as in FIG. 9 , and a description thereof will be omitted.
  • Reference numeral 1001 denotes a CPU which controls the entire portable game apparatus 900 using programs and data stored in a ROM 1003 , data temporarily held in a RAM 1002 , and the like, and executes game processing to be described later.
  • the CPU 1001 has an internal timer (not shown), which can measure time.
  • Reference numeral 1002 denotes a RAM which can provide various areas as needed such as an area for temporarily storing data which is being processed, an area for temporarily storing the pressing state of the trigger 902 detected by a detector 1004 as data, an area for temporarily storing the detection result of an azimuth sensor 1008 as data, a work area used by the CPU 1001 upon executing various kinds of processing, and the like.
  • Reference numeral 1003 denotes a ROM which stores programs and data required to make the CPU 1001 control the overall portable game apparatus 900 and execute game processing to be described later.
  • the stored data include parameters associated with characters, and the parameters include initial vital forces (or health powers), weights, and the like of characters.
  • the data also include parameters that specify the initial number of bullets fired by pulling the trigger 902 and their teeth. These parameters are loaded onto the RAM 1002 and are used in processing, as needed.
  • Reference numeral 1004 denotes a detector which detects whether or not the trigger 902 is pressed. The detection result is held in the RAM 1002 as data.
  • Reference numeral 1005 denotes an open/close detector.
  • the first main body 900 a is rotatable around the axis 960 with respect to the member 905 , and has a state in which the first main body 900 a is fully rotated to one side, and a state in which it is fully rotated to the other side.
  • the open/close detector 1005 detects one of the former state (a state in which substantially the entire body of the first main body 900 a overlaps the second main body 900 b ) and the latter state (a state in which the first and second main bodies 900 a and 900 b do not overlap each other), and notifies the CPU 1001 of that detection result.
  • when the detection result indicates the former state, the CPU 1001 sets the operation mode of the portable game apparatus 900 in the first operation mode; when the detection result indicates the latter state, the CPU 1001 sets the operation mode of the portable game apparatus 900 in the second operation mode.
  • the first and second operation modes will be described later.
  • Reference numeral 1008 denotes an azimuth sensor which detects the azimuth of the main body. The detection result is held in the RAM 1002 as data. Note that the detection precision of the azimuth sensor 1008 is not particularly limited.
  • Reference numeral 1009 denotes a bus which interconnects the aforementioned sections.
  • the game contents of this embodiment differ from those of the first embodiment, but the processing according to the first and second operation modes is executed in the same way as in the first embodiment.
  • the program and data for making the CPU 1001 execute this main processing are stored in the ROM 1003 .
  • the portable game apparatus 900 according to this embodiment executes this main processing. Execution of this main processing is started when the player turns on the power switch of the portable game apparatus and inputs a game start instruction using the key group 904 .
  • the processing in the first operation mode is basically the same as that in the first embodiment. That is, characters to be fought off are virtually located within a surrounding region having the position of the azimuth sensor 1008 as the center, and appropriate azimuth angles and (virtual) distances are given to characters to be located. After the characters are located, individual characters are moved, and the location distribution of the characters is displayed on the display screen of the display section 901 . When characters corresponding to markers are located within a predetermined distance range with respect to the position of the azimuth sensor 1008 , processing for notifying a message that advises accordingly is executed.
  • FIG. 12 is a flowchart showing the game processing in the second operation mode.
  • When the control enters the second operation mode, the CPU 1001 activates the azimuth sensor 1008 to detect the azimuth of the azimuth sensor 1008 itself, and stores the detected azimuth in the RAM 1002 as data (step S1201).
  • Next, the CPU 1001 refers to the azimuth angles of the respective characters managed in the RAM 1002 to specify a character having an azimuth angle closest to the azimuth acquired in step S1201, and displays a gauge window having a length according to the difference (azimuth difference) between the azimuth angle of the specified character and the azimuth acquired in step S1201 on the display screen of the display section 901 (step S1202).
  • The gauge window is basically the same as that shown in FIG. 8A. Note that bullet images may be used in place of the gauge blocks since the game contents of this embodiment lie in "fighting off characters by shooting bullets".
  • When the azimuth difference becomes equal to or smaller than a predetermined value, the image of the character located at the azimuth angle closest to the azimuth acquired in step S1201 is displayed on the area 802 on the window of FIG. 8A.
  • In this case, the player may be audibly notified of that fact, or if the portable game apparatus 900 comprises an LED, he or she may be notified of that fact by flickering the LED.
  • The notification mode is not particularly limited as long as the purpose of notifying that the azimuth difference becomes equal to or smaller than the predetermined value remains the same.
  • After the gauge length becomes maximum (i.e., when the azimuth difference becomes sufficiently small), the player must pull the trigger 902 a predetermined number of times or more within a predetermined period of time (e.g., within 1 sec) to fight off the character.
  • Processing for specifying the number of times of pulling the trigger after the gauge length becomes maximum is implemented in such a manner that, for example, the CPU 1001 starts time measurement when the gauge length becomes maximum, and it counts the number of times of detection of pulling of the trigger 902 by the detector 1004 after the start of time measurement. When the count value becomes equal to or larger than a predetermined value before an elapse of the predetermined period of time, it is determined that the “trigger 902 has been pulled a predetermined number of times or more within the predetermined period of time”.
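  • As an illustration only, the count-within-a-time-window check described above might be structured as in the following Python sketch; the function name rapid_pull_detected, the trigger_pulled callback, and the concrete numbers are assumptions for this sketch and are not taken from the description.

    import time

    def rapid_pull_detected(trigger_pulled, required_pulls=5, window_sec=1.0):
        # Time measurement starts when the gauge length becomes maximum; trigger
        # pulls reported by the detector are then counted within the window.
        start = time.monotonic()
        pulls = 0
        while time.monotonic() - start < window_sec:
            if trigger_pulled():  # e.g., a pull reported by the detector 1004
                pulls += 1
                if pulls >= required_pulls:
                    return True   # pulled the required number of times in time
        return False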
  • The CPU 1001 checks by the aforementioned method if the trigger 902 has been pulled a predetermined number of times or more within the predetermined period of time after the azimuth difference becomes sufficiently small (step S1203). If the trigger 902 has not been pulled the required number of times, the flow returns to step S1202; otherwise, the flow advances to step S1204.
  • In step S1204, the CPU 1001 displays, on the display screen of the display section 901, a message indicating that the battle starts with the character, which is displayed on the area 802 (step S1204). For example, the CPU 1001 displays a message "LOCK ON!" on the display screen of the display section 901. Then, the CPU 1001 displays the battle window shown in FIG. 8B on the display screen of the display section 901 (step S1205). In case of this embodiment, the window shown in FIG. 8B displays the gauge 801 using bullet images as gauge blocks. When the battle has started, since the character as an opponent begins to move for escape, the azimuth angle of this character changes sequentially.
  • Therefore, in step S1205 the CPU 1001 further displays the direction in which the portable game apparatus 900 is to face on the display screen of the display section 901 so that the azimuth of the portable game apparatus 900 matches that of the opponent character.
  • When the azimuth difference between the azimuth of the portable game apparatus 900 and that of the opponent character becomes equal to or smaller than the predetermined value, the image of the opponent character is displayed on the area 802 on the window of FIG. 8B.
  • In step S1206, the CPU 1001 checks if the azimuth difference is equal to or smaller than a sufficiently small predetermined value. If the azimuth difference is not smaller than the predetermined value, the flow returns to step S1205. On the other hand, if the azimuth difference is equal to or smaller than the predetermined value, the flow advances to step S1207 to check if the number of currently remaining bullets is zero (step S1207). As a result of checking, if the number of remaining bullets is zero, the flow advances to step S1208 to display a message indicating that the opponent character has escaped on the display screen of the display section 901 (step S1208). After that, the flow returns to step S304.
  • After the display processing in step S1208 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed. Hence, in step S1208 the CPU 1001 also displays a message that prompts the player to rotate the first main body 900 a so that the first main body 900 a overlaps the second main body 900 b, on the display screen of the display section 901.
  • On the other hand, if the number of remaining bullets is not zero, the flow advances to step S1209 to check if the detector 1004 detects that the trigger 902 has been pulled since step S1206 (step S1209). As a result of checking, if the detector 1004 does not detect that the trigger 902 has been pulled, the flow jumps to step S1213; otherwise, the flow advances to step S1210.
  • In step S1212, the CPU 1001 executes processing for reducing HP (step S1212).
  • As the amount to be reduced, a predetermined value may be used, or an amount may be determined according to the teeth of the bullets which are currently used.
  • Since the window shown in FIG. 8B is used as the battle window, the length of the hatched portion (gauge length) is shortened when the HP is reduced.
  • Next, the CPU 1001 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S1213). That is, since the battle with the opponent character must be done within the predetermined period of time (this "predetermined period of time" may vary depending on the types of opponent characters), if the predetermined period of time has elapsed, the flow advances to step S1208 to display a message indicating that the opponent character has escaped on the display screen of the display section 901, and the flow then returns to step S304.
  • On the other hand, if the predetermined period of time has not elapsed yet, the flow returns to step S1205 to repeat the subsequent processes.
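  • To summarize the flow of steps S1205 to S1213 described above, a rough and purely illustrative Python sketch is given below; the callback names, the threshold and time-limit values, and the assumption that reducing HP to zero ends the battle (the corresponding step is not shown in the excerpt above) are all hypothetical.

    import time

    def battle_loop(get_azimuth_diff, remaining_bullets, trigger_pulled, reduce_hp,
                    aim_threshold_deg=10.0, time_limit_sec=30.0):
        # The battle must finish within a time limit that may vary with the
        # type of opponent character (step S1213).
        start = time.monotonic()
        while time.monotonic() - start < time_limit_sec:
            if get_azimuth_diff() > aim_threshold_deg:   # steps S1205/S1206: keep guiding the player
                continue
            if remaining_bullets() == 0:                 # step S1207
                return "escaped"                         # step S1208
            if trigger_pulled():                         # step S1209
                if reduce_hp() == 0:                     # step S1212; zero HP assumed to end the battle
                    return "defeated"
        return "escaped"                                 # time limit elapsed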

Abstract

In a first operation mode, the location distribution of one or more characters is displayed on a display screen of a display unit (103). In a second operation mode, the location distribution of characters located in azimuths near the azimuth detected by an azimuth sensor (206) is displayed on the display screen of the display unit (103), and game processing is performed in accordance with the operation result to a key group (104) and handle unit (102) in correspondence with this display state.

Description

    FIELD OF THE INVENTION
  • The present invention relates to a game apparatus and, more particularly, to a technique associated with a portable game apparatus.
  • BACKGROUND OF THE INVENTION
  • Conventionally, various portable game apparatuses are known. As one of these portable game apparatuses, a game apparatus disclosed in Japanese Registered Utility Model No. 3046095 is known.
  • This game apparatus is a handgun type apparatus. A player holds a grip unit of this apparatus, opens an actuation flap sideways, opens a display upward, and depresses a trigger with a muzzle unit pointed to a sound source. As a result, the frequency or waveform pattern of a sampled sonic wave is analyzed by a computer in the apparatus, and a specific monster which is set in advance is selected based on the analysis result.
  • Whether or not the selected monster appears is determined based on parameters such as time, temperature, and the like. If appearance of the monster is determined, an image of this monster is displayed on a liquid crystal panel. The player operates an arrow button and the trigger to capture this monster. If the monster is successfully captured, the number of captured monsters is updated. Various monsters can be collected by making capturing operations toward various sound sources.
  • SUMMARY OF THE INVENTION
  • The inventors of the present invention have focused attention on a technique for providing a game according to an azimuth which the game apparatus main body faces by newly adding an azimuth sensor to the aforementioned conventional portable game apparatus.
  • It is, therefore, an object of the present invention to provide a game apparatus which searches for and captures characters which exist on a virtual space based on the azimuth which the game apparatus main body faces.
  • In order to achieve an object of the present invention, for example, a game apparatus of the present invention comprises the following components.
  • That is, a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth;
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • processing means for performing game processing corresponding to the operation mode detected by the detection means, and
  • in that when the detection means detects the input of the first operation mode,
  • the processing means controls the display unit to display a location distribution of at least one character, with reference to the location information of the at least one character held in the memory, and
  • when the detection means detects the input of the second operation mode,
  • the processing means controls the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • the processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a game apparatus of the present invention comprises the following components.
  • That is, a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth;
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between the characters and the azimuth sensor;
  • means for controlling the display unit to display a location distribution of characters located in azimuths near the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
  • processing means for performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a game apparatus of the present invention comprises the following components.
  • That is, a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth;
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • processing means for performing game processing corresponding to the operation mode detected by the detection means, and
  • in that when the detection means detects the input of the first operation mode,
  • the processing means controls the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory,
  • when the detection means detects the input of the second operation mode,
  • the processing means controls the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to the location information of each character held in the memory, and
  • when the battle has occurred, the processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a game apparatus of the present invention comprises the following components.
  • That is, a game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
  • an azimuth sensor for detecting an azimuth;
  • a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between the characters and the azimuth sensor;
  • means for controlling the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
  • means for, when the battle has occurred, performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a game apparatus of the present invention comprises the following components.
  • That is, a game apparatus which comprises an operation unit and a display unit, comprising:
  • an azimuth sensor for detecting an azimuth;
  • a memory which holds, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor;
  • detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • processing means for performing game processing corresponding to the operation mode detected by the detection means, and
  • in that when the detection means detects the input of the first operation mode,
  • the processing means controls the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in the memory, and
  • when the detection means detects the input of the second operation mode,
  • the processing means makes notification according to a difference between an azimuth of a character located in an azimuth closest to the azimuth detected by the azimuth sensor, and the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • the processing means performs game processing in accordance with a display state on the display unit or an operation result to the operation unit based on the notification result.
  • In order to achieve an object of the present invention, for example, a method of controlling a game apparatus of the present invention comprises the following steps.
  • That is, a method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
  • a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
  • in that the processing step includes steps of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
  • controlling, when the input of the second operation mode is detected in the detection step, the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
  • performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a method of controlling a game apparatus of the present invention comprises the following steps.
  • That is, a method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
  • a step of controlling the display unit to display a location distribution of characters located in azimuths near the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
  • a step of performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a method of controlling a game apparatus of the present invention comprises the following arrangement.
  • That is, a method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
  • a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
  • in that the processing step includes steps of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
  • controlling, when the input of the second operation mode is detected in the detection step, the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to the location information of each character held in the memory, and
  • performing, when the battle has occurred, game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a method of controlling a game apparatus of the present invention comprises the following steps.
  • That is, a method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
  • a step of controlling the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
  • a step of performing, when the battle has occurred, game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
  • In order to achieve an object of the present invention, for example, a method of controlling a game apparatus of the present invention comprises the following steps.
  • That is, a method of controlling a game apparatus, which comprises an operation unit, a display unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
  • a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
  • a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
  • a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
  • in that the processing step includes a step of:
  • controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in the memory, and
  • the processing step also includes steps of:
  • making, when the input of the second operation mode is detected in the detection step, notification according to a difference between an azimuth of a character located in an azimuth closest to the azimuth detected by the azimuth sensor, and the azimuth detected by the azimuth sensor with reference to the location information of each character held in the memory; and
  • performing game processing in accordance with a display state on the display unit or an operation result to the operation unit based on the notification result.
  • Other features and advantages of the present invention will be apparent from the following description taken in conjunction with the accompanying drawings, in which like reference characters designate the same or similar parts throughout the figures thereof.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • The accompanying drawings, which are incorporated in and constitute a part of the specification, illustrate embodiments of the invention and, together with the description, serve to explain the principles of the invention.
  • FIG. 1 is a perspective view showing the outer appearance of a portable game apparatus according to the first embodiment of the present invention;
  • FIG. 2 is a block diagram showing the hardware arrangement of a portable game apparatus 100 according to the first embodiment of the present invention;
  • FIG. 3 is a flowchart of the main processing to be executed by the portable game apparatus 100 (CPU 201);
  • FIG. 4 is a view for explaining a display position of a marker;
  • FIG. 5 shows a display example of the location distribution of characters on a display unit 103;
  • FIG. 6 is a flowchart of the game processing in a second operation mode;
  • FIG. 7 is a flowchart showing the game processing in the second operation mode according to the fourth embodiment of the present invention;
  • FIG. 8A shows a display example of a display window to be displayed on the display section 103 in step S703;
  • FIG. 8B shows a display example of a battle window;
  • FIG. 9 is a perspective view showing the outer appearance of a portable game apparatus 900 according to the fifth embodiment of the present invention;
  • FIG. 10 is a block diagram showing the hardware arrangement of the portable game apparatus 900 according to the fifth embodiment of the present invention;
  • FIG. 11 is a side view of the portable game apparatus 900 when a second main body 900 b is rotated 90° around an axis 950; and
  • FIG. 12 is a flowchart of the game processing in the second operation mode.
  • DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
  • Preferred embodiments of the present invention will now be described in detail in accordance with the accompanying drawings.
  • First Embodiment
  • This embodiment will explain a portable game apparatus which provides a game for “fishing up” a character.
  • FIG. 1 is a perspective view showing the outer appearance of a portable game apparatus according to this embodiment. As shown in FIG. 1, a portable game apparatus 100 is formed by a first main body 101 a and second main body 101 b. The second main body 101 b is attached to the first main body 101 a to be slidable in a direction substantially parallel to the latter, as indicated by an arrow A in FIG. 1.
  • The first main body 101 a is comprised of a display unit 103, input units (for example, keys in FIG. 1) 104 a to 104 e, and operation unit (for example, a handle unit in FIG. 1) 102. The display unit 103 comprises, e.g., a liquid crystal display, and displays various game screens to be described later.
  • The keys 104 a to 104 e are used by the player to input various operation instructions. In the following description, the keys 104 a to 104 e will sometimes be referred to as a “key group 104” together.
  • The handle unit 102 is operated by the player to fish a character. The handle 102 is attached to the first main body 101 a to be rotatable in a direction indicated by an arrow B in FIG. 1 around an axis (indicated by the dotted line in FIG. 1) of an attachment unit to the first main body 101 a.
  • In the following description of this embodiment, assume that the portable game apparatus has the outer appearance arrangement shown in FIG. 1. However, the positions, shapes, and the like of the display unit 103, keys 104 a to 104 e, and handle unit 102 are not limited to those in FIG. 1, and various modifications may be made. As will be apparent from the following description, the following description can be applied to each individual modification.
  • FIG. 1 and subsequent figures mainly show parts used in the following description, and do not show any parts such as a power switch, and the like, which do not constitute the gist of this embodiment.
  • FIG. 2 is a block diagram showing the hardware arrangement of the portable game apparatus 100 according to this embodiment. Note that the same reference numerals in FIG. 2 denote the same parts as in FIG. 1, and a description thereof will be omitted.
  • Reference numeral 201 denotes a CPU which controls the entire portable game apparatus 100 using programs and data stored in a ROM 203, data temporarily held in a RAM 202, and the like, and executes game processing to be described later. The CPU 201 has an internal timer (not shown), which can measure time.
  • Reference numeral 202 denotes a RAM which can provide various areas as needed such as an area for temporarily storing data which is being processed, an area for temporarily storing the measurement result of an encoder 204 as data, an area for temporarily storing the detection result of an azimuth sensor 206 as data, a work area used by the CPU 201 upon executing various kinds of processing, and the like.
  • Reference numeral 203 denotes a ROM which stores programs and data required to make the CPU 201 control the overall portable game apparatus 100 and execute game processing to be described later. The stored data include parameters associated with characters, and the parameters include initial vital forces (or health powers), weights, and the like of characters. These parameters are loaded onto the RAM 202 and are used in processing, as needed.
  • Reference numeral 204 denotes an encoder which measures the rotation angle of the handle unit 102. The measurement result of the encoder 204 is held in the RAM 202 as data.
  • Reference numeral 205 denotes a slide detector. The second main body 101 b can slide across the first main body 101 a, as described above, and has a state slid to the fullest extent to one side, and a state slid to the fullest extent to the other side. The slide detector 205 detects one of the former state (a state in which substantially the entire body of the second main body 101 b overlaps the first main body 101 a) and the latter state (a state in which substantially the entire body of the second main body 101 b does not overlap the first main body 101 a), and notifies the CPU 201 of the detection result.
  • When the detection state indicates the former state, the CPU 201 sets the operation mode of the portable game apparatus 100 in a first operation mode; when the detection state indicates the latter state, the CPU 201 sets the operation mode of the portable game apparatus 100 in a second operation mode. The first and second operation modes will be described later.
  • Reference numeral 206 denotes an azimuth sensor which detects the azimuth of the main body. The detection result is held in the RAM 202 as data. Note that the detection precision of the azimuth sensor 206 is not particularly limited.
  • Reference numeral 207 denotes a bus which interconnects the aforementioned units.
  • The main processing to be executed by the portable game apparatus 100 (CPU 201) will be described below using FIG. 3 which shows the flowchart of this processing. Note that the program and data for making the CPU 201 execute the processing according to the flowchart shown in FIG. 3 are stored in the ROM 203, and the portable game apparatus 100 according to this embodiment executes respective processes to be described below when the CPU 201 executes the processing using the program and data. Execution of the processing according to the flowchart in FIG. 3 is started when the player turns on the power switch of the portable game apparatus and inputs a game start instruction using the key group 104.
  • The CPU 201 accepts the detection result sent from the slide detector 205, and sets one of the first operation mode and the second operation mode according to the detection result (step S300), as described above.
  • As a result, if the CPU 201 sets the first operation mode, the flow advances to step S302, and the CPU 201 executes game processing in the first operation mode (step S302). On the other hand, if the CPU 201 sets the second operation mode, the flow advances to step S303, and the CPU 201 executes game processing in the second operation mode (step S303). Details of the processes in steps S302 and S303 will be described later.
  • If the player inputs a game end instruction using the key group 104 or if a game end condition is met, the processing according to this flowchart ends. However, if the player does not input any game end instruction or if the game end condition is not met, the flow returns to step S300 to execute the subsequent processes.
  • The game processing in the first operation mode and that in the second operation mode will be described below.
  • <Game Processing in First Operation Mode>
  • The game processing in the first operation mode will be described first. Assuming that characters to be fished up are virtually located within a surrounding area having the position of the azimuth sensor 206 as the center, appropriate azimuth angles and (virtual) distances are given to the characters to be located.
  • The azimuth angle indicates an azimuth angle from the position of the azimuth sensor 206 to the location of each character. The distance indicates a distance from the position of the azimuth sensor 206 to the location of each character. Hence, “locate a character” means registration of set data of the azimuth angle and distance (location information indicating a relative positional relationship between the character to be located and the azimuth sensor 206) in the RAM 202.
  • A method of generating this set data is not particularly limited, and the distance and azimuth angle can be randomly given, or they may be determined according to predetermined rules for each character.
  • In this manner, various methods may be used to locate characters. For example, the following method may be used. That is, set data of an azimuth of appearance, distance, and appearance date and time is stored in the ROM 203 for each character. As for a character whose appearance date and time matches the current date and time measured by the CPU 201, this character is located at the azimuth angle and distance included in the set data together with this appearance date and time.
  • After the characters are located, the CPU 201 moves the individual characters. When the characters are moved, the azimuth angles and distances of the characters are changed according to their movements. Hence, the CPU 201 executes processing for updating the azimuth angles and distances of the respective characters managed in the RAM 202 in accordance with their movements. As a result, the current azimuth angles and distances of the respective characters are managed in the RAM 202.
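  • Purely as an illustrative data model (the random placement and the simple movement rule below are assumptions, not the disclosed implementation), the set data managed in the RAM 202 for each located character could be sketched in Python as follows.

    import random
    from dataclasses import dataclass

    @dataclass
    class Location:
        azimuth_deg: float  # azimuth angle from the azimuth sensor 206 to the character
        distance: float     # (virtual) distance from the azimuth sensor 206

    def locate_characters(names, max_distance=100.0):
        # "Locating a character" means registering its azimuth angle and distance.
        return {name: Location(random.uniform(0.0, 360.0),
                               random.uniform(0.0, max_distance))
                for name in names}

    def move_character(loc, d_azimuth, d_distance):
        # Movement updates the managed azimuth angle and distance of the character.
        loc.azimuth_deg = (loc.azimuth_deg + d_azimuth) % 360.0
        loc.distance = max(0.0, loc.distance + d_distance)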
  • The CPU 201 displays the location distribution of the characters on the display screen of the display unit 103. This display will be described below. FIG. 5 shows a display example of the location distribution of the characters on the display unit 103.
  • Referring to FIG. 5, reference numeral 501 denotes a circular area which has the position of the azimuth sensor 206 as the center; and 502 to 504, markers indicating the positional relationships of the respective characters relative to the position of the azimuth sensor 206.
  • Note that the display positions of the markers 502 to 504 will be described below using FIG. 4. FIG. 4 is a view for explaining the display position of a marker. Let (cx, cy) be the coordinates of the position of the azimuth sensor 206, and θ and d be the azimuth angle and distance (managed in the RAM 202) of a character (the up direction of the screen is an azimuth angle=0°). Then, the coordinates (ex, ey) of the display position of the marker corresponding to this character are given by:
    ex=cx−sin(θ)×d′
    ey=cy+cos(θ)×d′
    where d′ is the distance obtained by normalizing (reducing) the distance d in accordance with the display screen size of the display unit 103. At any rate, the display unit 103 displays the position of each character based on the direction and distance viewed from the position of the azimuth sensor 206 on its display screen. Hence, the present invention is not limited to the display mode shown in FIG. 5 as long as display according to such purport is made, and various display modes may be used.
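  • For reference, the marker position equations above translate directly into code; the following Python sketch assumes angles in degrees and a simple scale factor standing in for the normalization of d into d′.

    import math

    def marker_position(cx, cy, azimuth_deg, distance, scale=1.0):
        # d' is the distance d normalized (reduced) to fit the display screen.
        d_norm = distance * scale
        theta = math.radians(azimuth_deg)  # azimuth angle 0 deg = up direction of the screen
        ex = cx - math.sin(theta) * d_norm
        ey = cy + math.cos(theta) * d_norm
        return ex, ey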
  • When the marker displayed on the display screen of the display unit 103 is located within the circular area 501, the CPU 201 flickers the circular area or displays a specific marker image to give a warning indicating that the character corresponding to this marker is located within a predetermined distance range with respect to the position of the azimuth sensor 206. If the portable game apparatus of this embodiment comprises a sound generator and sound output unit, a warning by means of a sound may be generated using them.
  • Also, as the marker becomes closer to the circular area 501, the flickering speed may be increased, or the sound volume may be turned up.
  • As described above, in the game processing in the first operation mode, the set data of the location of the character and the distance between the location of the character and the azimuth sensor 206 is managed in the RAM 202, and the distribution of the locations of the characters relative to the position of the azimuth sensor 206 is displayed on the display screen of the display unit 103. When a character approaches to fall within a predetermined distance range from the position of the azimuth sensor 206, a warning that advises accordingly is given.
  • <Game Processing in Second Operation Mode>
  • The game processing in the second operation mode will be described below. FIG. 6 is a flowchart of the game processing in the second operation mode.
  • When the control enters the second operation mode, the CPU 201 activates the azimuth sensor 206 to detect the azimuth of the azimuth sensor 206 itself, and stores the detected azimuth in the RAM 202 as data (step S601).
  • Next, the CPU 201 refers to the azimuth angles of the respective characters managed in the RAM 202 to specify characters having azimuth angles, which have differences from the azimuth acquired in step S601 to be equal to or smaller than a predetermined amount, and displays markers at the corresponding positions on the display screen of the display unit 103 (step S602). The display position of each marker is as has been described above using FIG. 4. In this manner, the CPU 201 can display the location distribution of characters which are located in azimuths near the direction in which the portable game apparatus currently faces (the azimuth detected by the azimuth sensor 206) on the display screen of the display unit 103.
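  • A minimal sketch of the selection in step S602, assuming azimuths in degrees, a helper that wraps the difference into the range 0 to 180°, and an illustrative threshold value:

    def azimuth_difference(a_deg, b_deg):
        # Smallest absolute difference between two azimuths, in the range 0..180 degrees.
        diff = abs(a_deg - b_deg) % 360.0
        return min(diff, 360.0 - diff)

    def characters_near_azimuth(azimuths, current_azimuth_deg, max_diff_deg=30.0):
        # azimuths: mapping of character name -> managed azimuth angle (degrees).
        # Step S602 keeps the characters whose azimuth angles differ from the
        # detected azimuth by no more than a predetermined amount.
        return [name for name, az in azimuths.items()
                if azimuth_difference(az, current_azimuth_deg) <= max_diff_deg]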
  • The CPU 201 counts the number of markers currently displayed on the display screen of the display unit 103, and checks if the count value is equal to or larger than a predetermined value M (step S603). If it is determined as a result of checking in step S603 that the count value is smaller than the predetermined value M, no character can be fished up, and the flow returns to step S304.
  • On the other hand, if it is determined as a result of checking that the count value is equal to or larger than the predetermined value M, the flow advances to step S604, and the CPU 201 checks if a standby mode to be described later is set (step S604). If it is determined as a result of checking that the standby mode is currently set, the flow advances to step S606. The processing in step S606 and subsequent steps will be described later.
  • On the other hand, if the standby mode is not set, the flow advances to step S605. In step S605, the CPU 201 makes an indication used to indicate that the count value is equal to or larger than the predetermined value M on the display screen of the display unit 103, and starts its internal timer (step S605). As the indication used to indicate that the count value is equal to or larger than the predetermined value M, for example, a predetermined image may flicker, or when a light-emitting element such as an LED or the like is provided to this portable game apparatus, it may flicker.
  • Note that as the location distribution of the characters is changing all the time, the number of markers displayed on the display screen of the display unit 103 and their display positions change from moment to moment.
  • If the number of markers currently displayed on the display screen of the display unit 103 is equal to or larger than the predetermined value M, the player cannot fish up a character unless he or she presses a predetermined key of the key group 104 within a predetermined period of time (e.g., 1 sec). Hence, the CPU 201 checks in step S605 if the predetermined key is pressed within a predetermined period of time after the start of the timer (step S605). If the predetermined key is not pressed, the flow returns to step S304.
  • On the other hand, if it is detected that the predetermined key of the key group 104 is pressed within the predetermined period of time after the start of the timer, the control enters the standby mode (first step), and the flow advances to step S606. In step S606, the CPU 201 counts the number of markers currently displayed on the display screen of the display unit 103, and checks if the count value is equal to or larger than a predetermined value N (>M) (step S606).
  • If it is determined as a result of checking in step S606 that the count value is smaller than the predetermined value N, the flow returns to step S304.
  • On the other hand, if it is determined as a result of checking that the count value is equal to or larger than the predetermined value N, the flow advances to step S607, and the CPU 201 flickers a character image on the display screen of the display unit 103 (step S607). Note that various display modes may be used as that in step S607.
  • If the number of markers currently displayed on the display screen of the display unit 103 is equal to or larger than the predetermined value N (>M), the player cannot fish up a character unless he or she presses a predetermined key of the key group 104 a predetermined number of times within a predetermined period of time. Therefore, the player presses the predetermined key of the key group 104 a predetermined number of times within the predetermined period of time.
  • Hence, upon detection of the first pressing of the predetermined key after the flow advances to step S608, the CPU 201 starts the internal timer from the detection timing, and checks if the predetermined key is pressed a predetermined number of times or more (e.g., 10 times or more) within the predetermined period of time (e.g., 3 sec) (step S608).
  • If it is determined as a result of checking that 10 or more key inputs cannot be detected within 3 sec, the flow advances to step S609, and the CPU 201 displays a message indicating that the player has failed to fish up the character on the display screen of the display unit 103, and the flow then returns to step S304.
  • After the display processing in step S609 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed. Hence, in step S609 the CPU 201 also displays a message that prompts the player to slide the second main body 101 b so that substantially the entire body of the second main body 101 b overlaps the first main body 101 a, on the display screen of the display unit 103.
  • On the other hand, if 10 or more key inputs are detected within 3 sec, the flow advances to step S610 to enter a battle mode (second step) with the character, and the CPU 201 displays a message indicating that the battle with the character starts on the display screen of the display unit 103 (step S610). For example, the CPU 201 displays a message "HIT! & BATTLE!" on the display screen of the display unit 103.
  • After that, the image of the character as an opponent is displayed on the display screen of the display unit 103. The player turns the handle unit 102 (in the direction indicated by the arrow B in FIG. 1) so as to fish up the opponent character. When the player turns the handle unit 102, its rotation angle is measured by the encoder 204. Also, since the CPU 201 measures time with its internal timer, it calculates a rotational velocity (rotation angle/sec) of the handle unit 102 based on these values (step S611). The CPU 201 reads out a parameter indicating the weight of the opponent character from the ROM 203 or RAM 202 and refers to it (step S611). Then, the CPU 201 calculates using the rotational velocity and weight:
    Tension of fishing line = weight + rotational velocity
  • As can be seen from this equation, the tension increases with increasing weight of the opponent character. Also, the tension increases with increasing rotational velocity.
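  • As a worked illustration of the tension rule above (the concrete numbers are assumptions chosen only to show the arithmetic):

    def fishing_line_tension(weight, rotation_angle_deg, elapsed_sec):
        # Tension of fishing line = weight + rotational velocity (rotation angle/sec).
        rotational_velocity = rotation_angle_deg / elapsed_sec
        return weight + rotational_velocity

    # Example: a heavier opponent character or faster turning of the handle unit 102
    # yields a higher tension value.
    print(fishing_line_tension(weight=40.0, rotation_angle_deg=720.0, elapsed_sec=2.0))  # 400.0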
  • The CPU 201 compares the calculated tension value with a predetermined threshold to check if the tension is equal to or higher than the predetermined threshold (step S611).
  • If the tension is equal to or higher than the predetermined threshold, the flow advances to step S609 to display a message indicating that the player has failed to fish up the opponent character on the display screen of the display unit 103. The flow then returns to step S304. That is, the opponent character cannot be fished up unless the tension is kept lower than the predetermined threshold.
  • On the other hand, if the tension is lower than the predetermined threshold, the flow advances to step S612 to check with reference to data indicating the health power (to be abbreviated as HP hereinafter) of the opponent character held in the RAM 202 if an HP value=0 (step S612).
  • If it is determined as a result of checking in step S612 that HP=0, the flow advances to step S613 to display a message indicating that the player has succeeded in fishing up the opponent character on the display screen of the display unit 103 (step S613), and the flow returns to step S304.
  • On the other hand, if HP≠0, the flow advances to step S614 to execute processing for reducing HP by an amount corresponding to the rotational velocity calculated in step S611 (step S614). For example, if the player turns the handle unit 102 faster, HP is reduced by a larger amount.
  • After the control enters the battle mode, the opponent character moves not to be easily fished up. As a result, the azimuth and distance of this character change. Hence, the player variously changes the azimuth of this portable game apparatus 100 to turn the direction (the azimuth measured by the azimuth sensor 206) of the portable game apparatus 100 to the azimuth where this character exists. When the direction of the portable game apparatus 100 agrees with the azimuth where the character exists, the image of that character is displayed on the display screen of the display unit 103.
  • Hence, in step S614 the HP of this character is reduced only when the image of the character is displayed on the display screen of the display unit 103 and the player turns the handle unit 102 while keeping the tension lower than the predetermined threshold.
  • In this manner, the HP of the character is reduced not by simply turning the handle unit 102. The HP of the character is reduced only after the player searches for the azimuth where the character exists and turns the handle unit 102 once the image of this character is displayed on the display screen of the display unit 103. Therefore, the operation to be done by the player to fish up the character becomes harder, thus raising the game difficulty. Note that the moving speeds of the characters may be different depending on the characters.
  • Next, the CPU 201 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S615). That is, since the battle with the opponent character must be done within the predetermined period of time (this “predetermined period of time” may vary depending on the types of opponent characters), if the predetermined period of time has elapsed, the flow advances to step S609 to display a message indicating that the player has failed to fish up the character on the display screen of the display unit 103, and the flow then returns to step S304.
  • On the other hand, if the predetermined period of time has not elapsed yet, the flow returns to step S611 to repeat the subsequent processes.
  • In this embodiment, the operation mode is switched by sliding the second main body 101 b. However, the present invention is not limited to this. For example, the operation mode may be switched based on, e.g., a predetermined key input of the key group 104.
  • As for the operation method and operation timing for the key group 104 to fish up the character, and the display mode on the display screen of the display unit 103, various modifications may be made, and the present invention is not limited to the above specific embodiment.
  • According to the arrangement of this embodiment, the game which virtually locates characters around the azimuth sensor 206, i.e., the player of this portable game apparatus 100, and allows the player to fish up any of the characters can be provided. Unlike known virtual reality systems, this game can be implemented without any complicated three-dimensional coordinate calculations.
  • In this embodiment, in order to start the battle mode, the predetermined key is pressed a predetermined number of times or more within the predetermined period of time in the state in step S607. However, the condition, operation, and the like to start the battle mode are not particularly limited. For example, a condition that “the handle unit 102 is turned a predetermined number of times within a predetermined period of time in the state in step S607” may be used.
  • Second Embodiment
  • When the character is fished up, as has been described in the first embodiment, data associated with the fished character is held in the RAM 202. If the portable game apparatus 100 incorporates a storage device such as a hard disk which can hold information after power OFF, such data can be stored in this storage device.
  • Therefore, when the player instructs a mode for displaying a list of information associated with the characters which were previously fished up by him or her on the display screen of the display unit 103 using the key group 104, the CPU 201 reads out the data associated with the fished characters from such storage device, and displays a list of information associated with the fished characters such as the images, names, weights, fished dates and times, and the like of the fished characters on the display screen of the display unit 103 like a picture book.
  • Also, a mode for bringing up the fished character may be provided. Furthermore, a mode for converting the fished character into coins that can be used in this game and purchasing items by the converted coins may be provided.
  • Such items may include those which allow the player to fish up a character more efficiently. For example, such an item may reduce the threshold to be compared with the tension.
  • The portable game apparatus 100 may comprise a communication unit which can make data communications with other portable game apparatuses. In this case, the following game may be carried out. That is, the player may play the game for fishing up characters using his or her game apparatus, and may exchange information associated with fished characters with another player via the communication unit after an elapse of a predetermined period of time. The game apparatuses then check with each other which player fished up the bigger character to determine a winner.
  • Third Embodiment
  • In the first embodiment, characters having the azimuth angles, which have differences from the azimuth acquired in step S601 to be equal to or smaller than a predetermined amount are specified, and markers are displayed at corresponding positions on the display screen of the display unit 103 in step S602. However, in this embodiment, in step S602 a character having the azimuth angle closest to the azimuth acquired in step S601 is specified, and the probability of occurrence of the battle with the specified character is displayed on the display screen of the display unit 103. This probability is indicated by the number of markers displayed on the display screen of the display unit 103. That is, the probability of the battle becomes higher with increasing number of markers. Note that the display method of the probability is not particularly limited.
  • Then, this probability (corresponding to the number of displayed markers) changes according to predetermined rules or randomly. The CPU 201 performs change control of the probability.
  • The processes to be executed in step S603 and subsequent steps are the same as those in the first embodiment. When the standby mode is set and the number of markers currently displayed on the display screen of the display unit 103 is equal to or larger than the predetermined value N (>M), since the probability of occurrence of the battle with the character having the azimuth angle closest to the azimuth acquired in step S601 is equal to or higher than a predetermined value, the processes in step S603 and subsequent steps are executed to battle with this character.
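  • One conceivable way to tie such a battle probability to the number of displayed markers is sketched below; the maximum marker count and the random drift step are assumptions for illustration only.

    import random

    def markers_for_probability(probability, max_markers=6):
        # The probability of a battle is indicated by the number of displayed markers:
        # the more markers, the higher the probability.
        return max(1, round(probability * max_markers))

    def drift_probability(probability, step=0.1):
        # The probability changes randomly (or according to predetermined rules).
        return min(1.0, max(0.0, probability + random.uniform(-step, step)))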
  • Fourth Embodiment
  • This embodiment will explain another processing in the second operation mode. Only the processing in the second operation mode is different in this embodiment, and other points are the same as those in the first embodiment. FIG. 7 is a flowchart showing the game processing in the second operation mode according to this embodiment.
  • When the control enters the second operation mode, the CPU 201 activates the azimuth sensor 206 to make it detect its own azimuth, and stores the detected azimuth in the RAM 202 as data (step S701).
  • Next, the CPU 201 refers to the azimuth angles of the respective characters managed in the RAM 202 to specify a character having an azimuth angle closest to the azimuth acquired in step S701, and displays a gauge window having a length corresponding to the difference (azimuth difference) between the azimuth angle of the specified character and the azimuth acquired in step S701 on the display screen of the display section 103 (step S702). FIG. 8A shows a display example of the display window displayed on the display section 103 in step S702. Referring to FIG. 8A, reference numeral 801 denotes a gauge which is formed by a plurality of gauge blocks. The length of the gauge changes depending on the number of gauge blocks to be displayed. The length of this gauge (i.e., the number of gauge blocks to be displayed) becomes smaller with increasing azimuth difference, and becomes larger with decreasing azimuth difference. For example, when the azimuth difference is 180°, one gauge block is displayed; when the azimuth difference is 0°, a maximum number of (6 in FIG. 8A) gauge blocks are displayed. In this way, the player can confirm, by observing the gauge, the azimuth difference from the character located at the azimuth angle closest to the direction in which the portable game apparatus currently faces (the azimuth detected by the azimuth sensor 206).
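  • The mapping from azimuth difference to gauge length described above could, for example, be realized as in the following sketch; the linear interpolation between the two stated endpoints (180° → 1 block, 0° → 6 blocks) is an assumption.

    def gauge_blocks(azimuth_diff_deg, max_blocks=6):
        # 180 degrees of difference -> 1 block; 0 degrees -> the maximum number of blocks.
        diff = min(abs(azimuth_diff_deg), 180.0)
        return int(round(max_blocks - (diff / 180.0) * (max_blocks - 1)))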
  • When the azimuth difference becomes equal to or smaller than a predetermined value, the image of the character located at the azimuth angle closest to the azimuth acquired in step S701 is displayed on an area 802 on the window of FIG. 8A. In this case, the player may be audibly notified of that fact, or if the portable game apparatus 100 comprises an LED, he or she may be notified of that fact by flickering the LED. The notification mode is not particularly limited as long as the purpose of notifying that the azimuth difference becomes equal to or smaller than the predetermined value remains the same.
  • If the gauge length becomes maximum (i.e., when the azimuth difference becomes sufficiently small), the CPU 201 checks if the handle section 102 has been turned a predetermined amount (step S704). The method of acquiring the turning amount of the handle section 102 is the same as that in the first embodiment. As a result, if the gauge length becomes maximum and the handle section 102 has been turned the predetermined amount, the flow advances to step S705; otherwise, the flow returns to step S702.
  • In step S705, the CPU 201 displays, on the display screen of the display section 103, a message indicating that the battle starts with the character, which is displayed on the area 802 when the gauge length becomes maximum (i.e., the azimuth difference becomes sufficiently small) and the handle section 102 has been turned the predetermined amount (step S705). For example, the CPU 201 displays a message “HIT!” on the display screen of the display section 103. Then, the CPU 201 displays a battle window shown in FIG. 8B on the display screen of the display section 103 (step S706). FIG. 8B shows a display example of the battle window. When the battle has started, since the character as an opponent begins to move for escape, the azimuth angle of this character changes sequentially. Therefore, in step S706 the CPU 201 further displays the direction in which the portable game apparatus 100 is to face on the display screen of the display section 103 so that the azimuth of the portable game apparatus 100 matches the azimuth angle of the opponent character.
  • For example, let α be the azimuth angle of the opponent character and β be the current azimuth of the portable game apparatus 100, and assume that the azimuth angle increases clockwise with the player as the center. If α>β, an image of a right-pointing arrow is displayed on the display screen of the display section 103; if α<β, an image of a left-pointing arrow is displayed. Of course, in step S706 the CPU 201 executes the processing for sequentially acquiring the azimuth of the portable game apparatus 100 as in step S701. When the azimuth difference between the azimuth of the portable game apparatus 100 and that of the opponent character becomes equal to or smaller than the predetermined value, the image of the opponent character is displayed on the area 802 on the window of FIG. 8B.
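  • The arrow selection stated above can be summarized by the following minimal sketch, assuming hypothetical names. The plain comparison of α and β follows the description literally; handling of the 0°/360° wrap-around is not addressed in the description and is deliberately left out here.

```python
# Minimal sketch of the direction-indicator rule stated above.
# guide_arrow and the arrow identifiers are hypothetical names; the simple
# comparison mirrors the text and ignores the 0/360 deg boundary.

def guide_arrow(opponent_azimuth: float, apparatus_azimuth: float) -> str:
    """Return which arrow image to show so the player turns toward the opponent."""
    if opponent_azimuth > apparatus_azimuth:
        return "RIGHT_ARROW"   # turn clockwise
    if opponent_azimuth < apparatus_azimuth:
        return "LEFT_ARROW"    # turn counter-clockwise
    return "NO_ARROW"          # already facing the opponent character
```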
  • Next, the CPU 201 checks if the azimuth difference is equal to or smaller than a sufficiently small predetermined value (step S707). If the azimuth difference is larger than the predetermined value, the flow returns to step S706. On the other hand, if the azimuth difference is equal to or smaller than the predetermined value, the flow advances to step S709 to execute the same processing as in step S611 above. That is, the CPU 201 calculates the rotational velocity (rotation angle/sec) of the handle section 102, and reads out a parameter indicating the weight of the opponent character from the ROM 203 onto the RAM 202 and refers to it. Then, the CPU 201 calculates the tension of the fishing line from the rotational velocity and the weight parameter by the above method. The CPU 201 then compares the calculated tension value with a predetermined threshold to check if the tension is equal to or higher than the predetermined threshold (step S709).
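  • Purely as an illustrative sketch: the actual tension formula is the one introduced for step S611 earlier in this specification and is not reproduced here, so the product of rotational velocity and weight below, the scaling constant, and the threshold value are placeholder assumptions.

```python
# Placeholder sketch of the tension check in step S709.
# The real formula is defined earlier in the specification (step S611); the
# model below (tension proportional to reeling speed times weight), K, and the
# threshold are assumptions for illustration only.

K = 0.05                      # hypothetical scaling constant
TENSION_THRESHOLD = 10.0      # hypothetical line-break threshold

def line_tension(rotational_velocity: float, weight: float) -> float:
    """Placeholder model: faster reeling and heavier characters mean more tension."""
    return K * rotational_velocity * weight

def line_breaks(rotational_velocity: float, weight: float) -> bool:
    """True when the calculated tension reaches the threshold (branch to step S710)."""
    return line_tension(rotational_velocity, weight) >= TENSION_THRESHOLD
```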
  • If the tension is equal to or higher than the predetermined threshold, the flow advances to step S710 to display, on the display screen of the display section 103, a message indicating that the player has failed to fish up the opponent character (step S710). The flow then returns to step S304. After the display processing in step S710 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed. Hence, in step S710 the CPU 201 also displays, on the display screen of the display section 103, a message that prompts the player to slide the second main body 101 b so that substantially the entire body of the second main body 101 b overlaps the first main body 101 a.
  • On the other hand, if the tension is lower than the predetermined threshold, the flow advances to step S711 to check, with reference to data indicating the health power (to be abbreviated as HP hereinafter) of the opponent character held in the RAM 202, whether the HP value is 0 (step S711).
  • If it is determined as a result of the checking in step S711 that HP=0, the flow advances to step S712 to display a message indicating that the player has succeeded in fishing up the opponent character on the display screen of the display section 103 (step S712), and the flow returns to step S304.
  • On the other hand, if HP≠0, the flow advances to step S713 to execute processing for reducing HP by an amount corresponding to the rotational velocity calculated in step S709 (step S713). For example, the faster the player turns the handle section 102, the larger the amount by which HP is reduced. On the window of FIG. 8B, reference numeral 803 denotes a gauge which indicates the HP value by means of its length, and the hatched portion indicates the remaining HP value. Therefore, when HP is reduced, the length of the hatched portion (the length of the gauge) decreases.
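  • A hedged sketch of the HP reduction of step S713 might look as follows; the damage coefficient and the clamping of HP to zero are assumptions, since the description only states that faster turning of the handle section reduces HP by a larger amount.

```python
# Hypothetical sketch of the HP reduction (step S713) and of gauge 803.
# DAMAGE_PER_DEG_PER_SEC and the clamp to zero are assumptions.

DAMAGE_PER_DEG_PER_SEC = 0.1   # hypothetical damage coefficient

def reduce_hp(hp: float, rotational_velocity: float) -> float:
    """Subtract damage proportional to the handle's rotational velocity (deg/sec)."""
    damage = DAMAGE_PER_DEG_PER_SEC * rotational_velocity
    return max(0.0, hp - damage)

def hp_gauge_blocks(hp: float, initial_hp: float, gauge_width: int = 6) -> int:
    """Length of the hatched portion of gauge 803, shrinking as HP decreases."""
    return round(gauge_width * hp / initial_hp) if initial_hp > 0 else 0
```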
  • Next, the CPU 201 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S714). That is, since the battle with the opponent character must be completed within the predetermined period of time (this “predetermined period of time” may vary depending on the type of opponent character), if the predetermined period of time has elapsed, the flow advances to step S710 to display a message indicating that the player has failed to fish up the opponent character on the display screen of the display section 103, and the flow then returns to step S304.
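  • The elapsed-time check of step S714 could be sketched as follows, with the per-character time limits given purely as hypothetical example values.

```python
# Sketch of the per-battle time limit (step S714).  The limit table and the
# default value are assumed; the text only says the limit may differ per
# opponent character type.

import time

TIME_LIMITS_SEC = {"small_fish": 20.0, "big_fish": 45.0}   # hypothetical values

def battle_timed_out(battle_start: float, character_type: str) -> bool:
    """True when the predetermined period for this opponent has elapsed."""
    limit = TIME_LIMITS_SEC.get(character_type, 30.0)       # assumed default
    return (time.monotonic() - battle_start) >= limit
```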
  • On the other hand, if the predetermined period of time has not elapsed yet, the flow returns to step S706 to repeat the subsequent processes.
  • Fifth Embodiment
  • This embodiment will explain a portable game apparatus which provides a game for “fighting off” characters by shooting them.
  • FIG. 9 shows the outer appearance of a portable game apparatus according to this embodiment. As shown in FIG. 9, a portable game apparatus 900 comprises a first main body 900 a, second main body 900 b, and member 905.
  • The first main body 900 a is attached to the member 905 via a hinge (not shown) to be rotatable around an axis 960, which passes through a point 959 on a side surface 909 of the member 905 and is substantially perpendicular to the side surface 909, as indicated by arrow A.
  • The second main body 900 b is attached to the member 905 to be rotatable around an axis 950, which passes through a central point 911 on a bottom surface 910 of the second main body 900 b and is substantially perpendicular to the bottom surface 910, as indicated by arrow B. FIG. 11 is a side view of the portable game apparatus 900 when the second main body 900 b is rotated 90° around the axis 950. In this embodiment, processing associated with a second operation mode starts after the apparatus is set in the state shown in FIG. 11.
  • The first main body 900 a comprises a display section 901. The display section 901 comprises, e.g., a liquid crystal display, and displays various game windows to be described later.
  • The second main body 900 b comprises a trigger 902 and keys 904 a and 904 b. The trigger 902 is attached to the second main body 900 b so that it can be pressed into it, as indicated by arrow C in FIG. 11. The keys 904 a and 904 b are used by the player to input various operation instructions. In the following description, the keys 904 a and 904 b will be referred to collectively as a “key group 904” in some cases.
  • In the description of this embodiment, the portable game apparatus has the outer appearance arrangement shown in FIGS. 9 and 11. However, the layout, shapes, and the like of the display section 901, the trigger 902, and the key group 904 are not limited to those shown in FIGS. 9 and 11, and various modifications can be made. As will be apparent from the following description, the description applies equally to each of such modifications.
  • FIG. 9 and subsequent figures mainly show parts used in the following description, and do not show any parts such as a power switch, and the like, which do not constitute the gist of this embodiment.
  • FIG. 10 is a block diagram showing the hardware arrangement of the portable game apparatus 900 according to this embodiment. Note that the same reference numerals in FIG. 10 denote the same parts as in FIG. 9, and a description thereof will be omitted.
  • Reference numeral 1001 denotes a CPU which controls the entire portable game apparatus 900 using programs and data stored in a ROM 1003, data temporarily held in a RAM 1002, and the like, and executes game processing to be described later. The CPU 1001 has an internal timer (not shown), which can measure time.
  • Reference numeral 1002 denotes a RAM which can provide various areas as needed such as an area for temporarily storing data which is being processed, an area for temporarily storing the pressing state of the trigger 902 detected by a detector 1004 as data, an area for temporarily storing the detection result of an azimuth sensor 1008 as data, a work area used by the CPU 1001 upon executing various kinds of processing, and the like.
  • Reference numeral 1003 denotes a ROM which stores programs and data required to make the CPU 1001 control the overall portable game apparatus 900 and execute game processing to be described later. The stored data include parameters associated with characters, and the parameters include initial vital forces (or health powers), weights, and the like of the characters. The data also include parameters that specify the initial number of bullets fired by pulling the trigger 902 and their power. These parameters are loaded onto the RAM 1002 and are used in processing as needed.
  • Reference numeral 1004 denotes a detector which detects whether or not the trigger 902 is pressed. The detection result is held in the RAM 1002 as data.
  • Reference numeral 1005 denotes an open/close detector. As described above, the first main body 900 a is rotatable around the axis 960 with respect to the member 905, and has a state in which the first main body 900 a is fully rotated to one side, and a state in which it is fully rotated to the other side. The open/close detector 1005 detects one of the former state (a state in which substantially the entire body of the first main body 900 a overlaps the second main body 900 b) and the latter state (a state in which the first and second main bodies 900 a and 900 b do not overlap each other), and notifies the CPU 1001 of that detection result.
  • When the detection result indicates the former state, the CPU 1001 sets the operation mode of the portable game apparatus 900 in the first operation mode; when the detection result indicates the latter state, the CPU 1001 sets the operation mode of the portable game apparatus 900 in the second operation mode. The first and second operation modes will be described later.
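  • As a minimal sketch of this mode selection, assuming hypothetical names for the mode values and the detector result:

```python
# Hypothetical sketch of the mode selection driven by the open/close detector 1005.
# The enum values and the boolean detector result are assumed names.

from enum import Enum

class OperationMode(Enum):
    FIRST = 1    # first main body 900a overlaps second main body 900b
    SECOND = 2   # the two main bodies do not overlap

def select_operation_mode(bodies_overlap: bool) -> OperationMode:
    """Map the open/close detection result to the operation mode described above."""
    return OperationMode.FIRST if bodies_overlap else OperationMode.SECOND
```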
  • Reference numeral 1008 denotes an azimuth sensor which detects the azimuth of the main body. The detection result is held in the RAM 1002 as data. Note that the detection precision of the azimuth sensor 1008 is not particularly limited.
  • Reference numeral 1009 denotes a bus which interconnects the aforementioned sections.
  • As for the main processing to be executed by the portable game apparatus 900 (CPU 1001), the game contents of this embodiment differ from those of the first embodiment, but the processing is the same as in the first embodiment in that it is executed according to the first and second operation modes. In the case of this embodiment, the program and data for making the CPU 1001 execute this main processing are stored in the ROM 1003. When the CPU 1001 executes processing using this program and data, the portable game apparatus 900 according to this embodiment executes this main processing. Execution of this main processing is started when the player turns on the power switch of the portable game apparatus and inputs a game start instruction using the key group 904.
  • The game processing in the first operation mode and that in the second operation mode will be respectively described below.
  • <Game processing in First Operation Mode>
  • The processing in the first operation mode is basically the same as that in the first embodiment. That is, characters to be fought off are virtually located within a surrounding region having the position of the azimuth sensor 1008 as the center, and appropriate azimuth angles and (virtual) distances are given to the characters to be located. After the characters are located, individual characters are moved, and the location distribution of the characters is displayed on the display screen of the display section 901. When characters corresponding to markers are located within a predetermined distance range with respect to the position of the azimuth sensor 1008, processing for notifying the player of that fact by a message is executed.
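  • The bookkeeping described above could be sketched as follows; the character fields, the random placement, and the notification threshold are illustrative assumptions, since the actual placement processing is that of the first embodiment.

```python
# Illustrative sketch of the first-operation-mode bookkeeping: each character
# gets an azimuth angle and a virtual distance around the azimuth sensor, and a
# notification fires when a character comes within a predetermined distance.
# All names, ranges, and the threshold are assumptions.

import random
from dataclasses import dataclass

NOTIFY_DISTANCE = 5.0          # hypothetical "nearby" threshold (virtual units)

@dataclass
class Character:
    name: str
    azimuth_deg: float         # azimuth angle seen from the sensor position
    distance: float            # virtual distance from the sensor position

def place_characters(names: list[str]) -> list[Character]:
    """Virtually locate characters at random azimuths/distances around the sensor."""
    return [Character(n, random.uniform(0, 360), random.uniform(1, 50)) for n in names]

def characters_to_announce(characters: list[Character]) -> list[Character]:
    """Characters close enough that the player should be notified."""
    return [c for c in characters if c.distance <= NOTIFY_DISTANCE]
```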
  • <Game processing in Second Operation Mode>
  • The game processing in the second operation mode will be described below. FIG. 12 is a flowchart showing the game processing in the second operation mode.
  • When the control enters the second operation mode, the CPU 1001 activates the azimuth sensor 1008 to detect the azimuth of the azimuth sensor 1008 itself, and stores the detected azimuth in the RAM 1002 as data (step S1201).
  • Next, the CPU 1001 refers to the azimuth angles of the respective characters managed in the RAM 1002 to specify the character whose azimuth angle is closest to the azimuth acquired in step S1201, and displays, on the display screen of the display section 901, a gauge window whose length corresponds to the difference (azimuth difference) between the azimuth angle of the specified character and the azimuth acquired in step S1201 (step S1202). The gauge window is basically the same as that shown in FIG. 8A. Note that bullet images may be used in place of the gauge blocks, since the game contents of this embodiment lie in “fighting off characters by shooting bullets”. When the azimuth difference becomes equal to or smaller than a predetermined value, the image of the character located at the azimuth angle closest to the azimuth acquired in step S1201 is displayed on the area 802 on the window of FIG. 8A. In this case, the player may be audibly notified of that fact, or, if the portable game apparatus 900 comprises an LED, he or she may be notified of that fact by flickering the LED. In either case, the notification mode is not particularly limited as long as it serves the purpose of notifying the player that the azimuth difference has become equal to or smaller than the predetermined value.
  • After the gauge length becomes maximum (i.e., when the azimuth difference becomes sufficiently small), the player must pull the trigger 902 a predetermined number of times or more within a predetermined period of time (e.g., within 1 sec) to fight off the character. Processing for counting the number of trigger pulls after the gauge length becomes maximum is implemented in such a manner that, for example, the CPU 1001 starts time measurement when the gauge length becomes maximum, and counts the number of times the detector 1004 detects pulling of the trigger 902 after the start of time measurement. When the count value becomes equal to or larger than a predetermined value before the predetermined period of time elapses, it is determined that the “trigger 902 has been pulled a predetermined number of times or more within the predetermined period of time”.
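  • The trigger-counting logic described in this step could be sketched as follows; the required number of pulls, the one-second window, and the event-list interface are assumptions used only for illustration.

```python
# Sketch of the trigger-counting check: once the gauge becomes full, open a
# time window and count detector events; succeed if the count reaches the
# required number before the window expires.  Parameter values are assumed.

import time

def trigger_pulled_enough(pull_events: list[float],
                          window_start: float,
                          window_sec: float = 1.0,
                          required_pulls: int = 3) -> bool:
    """
    pull_events: timestamps at which detector 1004 reported a trigger pull.
    Returns True if at least `required_pulls` pulls fall inside the window that
    opened when the gauge length became maximum.
    """
    in_window = [t for t in pull_events if window_start <= t <= window_start + window_sec]
    return len(in_window) >= required_pulls

# Example: three pulls within one second of the window opening clears the check.
t0 = time.monotonic()
print(trigger_pulled_enough([t0 + 0.2, t0 + 0.5, t0 + 0.8], t0))  # True
```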
  • The CPU 1001 checks by the aforementioned method if the trigger 902 has been pulled a predetermined number of times or more within the predetermined period of time after the azimuth difference becomes sufficiently small (step S1203). If the trigger 902 has not been pulled the predetermined number of times or more within the predetermined period of time, the flow returns to step S1202; otherwise, the flow advances to step S1204.
  • In step S1204, the CPU 1001 displays, on the display screen of the display section 901, a message indicating that a battle starts with the character displayed on the area 802 (step S1204). For example, the CPU 1001 displays a message “LOCK ON!” on the display screen of the display section 901. Then, the CPU 1001 displays the battle window shown in FIG. 8B on the display screen of the display section 901 (step S1205). In the case of this embodiment, the window shown in FIG. 8B displays the gauge 801 using bullet images as gauge blocks. When the battle has started, the opponent character begins to move to escape, so its azimuth angle changes sequentially. Therefore, in step S1205 the CPU 1001 further displays, on the display screen of the display section 901, the direction in which the portable game apparatus 900 is to face so that the azimuth of the portable game apparatus 900 matches that of the opponent character. When the azimuth difference between the azimuth of the portable game apparatus 900 and that of the opponent character becomes equal to or smaller than the predetermined value, the image of the opponent character is displayed on the area 802 on the window of FIG. 8B. These processes are the same as those in the fourth embodiment.
  • Next, the CPU 1001 checks if the azimuth difference is equal to or smaller than a sufficiently small predetermined value (step S1206). If the azimuth difference is larger than the predetermined value, the flow returns to step S1205. On the other hand, if the azimuth difference is equal to or smaller than the predetermined value, the flow advances to step S1207 to check if the number of currently remaining bullets is zero (step S1207). As a result of checking, if the number of remaining bullets is zero, the flow advances to step S1208 to display, on the display screen of the display section 901, a message indicating that the opponent character has escaped (step S1208). After that, the flow returns to step S304. After the display processing in step S1208 is executed, the game processing in the second operation mode is not continued, and the game processing in the first operation mode is performed. Hence, in step S1208 the CPU 1001 also displays, on the display screen of the display section 901, a message that prompts the player to rotate the first main body 900 a so that the first main body 900 a overlaps the second main body 900 b.
  • On the other hand, if the number of remaining bullets is not zero, the flow advances to step S1209 to check if the detector 1004 has detected that the trigger 902 has been pulled since step S1206 (step S1209). As a result of checking, if the detector 1004 does not detect that the trigger 902 has been pulled, the flow jumps to step S1213; otherwise, the flow advances to step S1210.
  • The CPU 1001 checks in step S1210, with reference to data indicating the health power (to be abbreviated as HP hereinafter) of the opponent character held in the RAM 1002, whether the HP value is 0 (step S1210).
  • If it is determined as a result of checking in step S1210 that HP=0, the flow advances to step S1211 to display a message indicating that the player has fought off the opponent character on the display screen of the display section 901 (step S1211), and the flow returns to step S304.
  • On the other hand, if HP≠0, the flow advances to step S1212 to execute processing for reducing HP (step S1212). As for the amount to be reduced, a predetermined value may be used, or the amount may be determined according to the power of the bullets currently in use. When the window shown in FIG. 8B is used as the battle window, the length of the hatched portion (gauge length) is shortened when HP is reduced.
  • Next, the CPU 1001 checks if a predetermined period of time has elapsed after the control enters the battle mode (step S1213). That is, since the battle with the opponent character must be completed within the predetermined period of time (this “predetermined period of time” may vary depending on the type of opponent character), if the predetermined period of time has elapsed, the flow advances to step S1208 to display a message indicating that the opponent character has escaped on the display screen of the display section 901, and the flow then returns to step S304.
  • On the other hand, if the predetermined period of time has not elapsed yet, the flow returns to step S1205 to repeat the subsequent processes.
  • As many apparently widely different embodiments of the present invention can be made without departing from the spirit and scope thereof, it is to be understood that the invention is not limited to the specific embodiments thereof except as defined in the claims.
  • This application claims the benefit of Japanese Application No. 2005-207461, filed on Jul. 15, 2005 and No. 2006-138223 filed on May 17, 2006, which are hereby incorporated by reference herein in their entirety.

Claims (15)

1. A game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
an azimuth sensor for detecting an azimuth;
a memory for, when characters are virtually located around a position of the azimuth sensor, holding location information indicating a relative positional relationship between each character and the azimuth sensor;
detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
processing means for performing game processing corresponding to the operation mode detected by the detection means, and
in that when the detection means detects the input of the first operation mode,
the processing means controls the display unit to display a location distribution of at least one character, with reference to the location information of the at least one character held in the memory, and
when the detection means detects the input of the second operation mode,
the processing means controls the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in said memory, and
said processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
2. The apparatus according to claim 1, wherein said memory holds, as the location information, a set of an azimuth angle from the position of said azimuth sensor to the location of each character, and a distance from the position of said azimuth sensor to the location of each character.
3. The apparatus according to claim 1, wherein when said detection means detects the input of the second operation mode, and when an operation input from the input unit is detected within a predetermined period of time after a display timing of the location distribution of characters, the number of which is not less than a predetermined value M, on the display unit, said processing means controls to start a first step,
when an operation input from one of the operation unit and the input unit is detected within a predetermined period of time after a display timing of the location distribution of characters, the number of which is not less than a predetermined value N (N>M) after the start of the first step, said processing means controls to start a second step, and
said processing means controls the display unit to display a message indicating battle with a character, and to make a display according to an operation result to the operation unit after the start of the second step.
4. The apparatus according to claim 1, wherein the location distribution of the characters is a relative location relationship between individual characters.
5. A game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
an azimuth sensor for detecting an azimuth;
a memory for, when characters are virtually located around a position of said azimuth sensor, holding location information indicating a relative positional relationship between the characters and said azimuth sensor;
means for controlling the display unit to display a location distribution of characters located in azimuths near the azimuth detected by said azimuth sensor with reference to location information of each character held in said memory; and
processing means for performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
6. A game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
an azimuth sensor for detecting an azimuth;
a memory for, when characters are virtually located around a position of said azimuth sensor, holding location information indicating a relative positional relationship between each character and said azimuth sensor;
detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
processing means for performing game processing corresponding to the operation mode detected by said detection means, and
in that when said detection means detects the input of the first operation mode,
said processing means controls the display unit to display a location distribution of at least one character based on the location information of the at least one character held in said memory,
when said detection means detects the input of the second operation mode,
said processing means controls the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by said azimuth sensor with reference to the location information of each character held in said memory, and
when the battle has occurred, said processing means performs game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
7. A game apparatus which comprises an operation unit that can be turned by a player, a display unit, and an input unit, comprising:
an azimuth sensor for detecting an azimuth;
a memory for, when characters are virtually located around a position of said azimuth sensor, holding location information indicating a relative positional relationship between the characters and said azimuth sensor;
means for controlling the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by said azimuth sensor with reference to location information of each character held in said memory; and
means for, when the battle has occurred, performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
8. A game apparatus which comprises an operation unit and a display unit, comprising:
an azimuth sensor for detecting an azimuth;
a memory which holds, when characters are virtually located around a position of said azimuth sensor, location information indicating a relative positional relationship between each character and said azimuth sensor;
detection means for detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
processing means for performing game processing corresponding to the operation mode detected by said detection means, and
in that when said detection means detects the input of the first operation mode,
said processing means controls the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in said memory, and
when said detection means detects the input of the second operation mode,
said processing means makes notification according to a difference between an azimuth of a character located in an azimuth closest to the azimuth detected by said azimuth sensor, and the azimuth detected by said azimuth sensor, with reference to the location information of each character held in said memory, and
said processing means performs game processing in accordance with a display state on the display unit or an operation result to the operation unit based on the notification result.
9. A method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
in that the processing step includes steps of:
controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
controlling, when the input of the second operation mode is detected in the detection step, the display unit to display a location distribution of the characters located in azimuths near the azimuth detected by the azimuth sensor, with reference to the location information of each character held in the memory, and
performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
10. A method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory comprising:
a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
a step of controlling the display unit to display a location distribution of characters located in azimuths near the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
a step of performing game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
11. A method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
in that the processing step includes steps of:
controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character based on the location information of the at least one character held in the memory, and
controlling, when the input of the second operation mode is detected in the detection step, the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to the location information of each character held in the memory, and
performing, when the battle has occurred, game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
12. A method of controlling a game apparatus, which comprises an operation unit that can be turned by a player, a display unit, an input unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
a step of controlling the display unit to make a display indicating a probability of occurrence of battle with a character located in an azimuth closest to the azimuth detected by the azimuth sensor with reference to location information of each character held in the memory; and
a step of performing, when the battle has occurred, game processing in accordance with an operation result to the input unit and the operation unit in correspondence with a display state on the display unit.
13. A method of controlling a game apparatus, which comprises an operation unit, a display unit, an azimuth sensor for detecting an azimuth, and a memory, comprising:
a storage control step of storing, when characters are virtually located around a position of the azimuth sensor, location information indicating a relative positional relationship between each character and the azimuth sensor in the memory;
a detection step of detecting an input of an operation mode of one of a first operation mode and a second operation mode; and
a processing step of performing game processing corresponding to the operation mode detected in the detection step, and
in that the processing step includes a step of:
controlling, when the input of the first operation mode is detected in the detection step, the display unit to display a location distribution of at least one character with reference to the location information of the at least one character held in the memory, and
the processing step also includes steps of:
making, when the input of the second operation mode is detected in the detection step, notification according to a difference between an azimuth of a character located in an azimuth closest to the azimuth detected by said azimuth sensor, and the azimuth detected by the azimuth sensor with reference to the location information of each character held in the memory; and
performing game processing in accordance with a display state on the display unit or an operation result to the operation unit based on the notification result.
14. A program making a computer execute a control method of claim 9.
15. A computer readable storage medium storing a program of claim 14.
US11/485,482 2005-07-15 2006-07-13 Game apparatus and its control method Abandoned US20070015575A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2005207461 2005-07-15
JP2005-207461 2005-07-15
JP2006-138223 2006-05-17
JP2006138223A JP3964921B2 (en) 2005-07-15 2006-05-17 GAME DEVICE AND ITS CONTROL METHOD

Publications (1)

Publication Number Publication Date
US20070015575A1 true US20070015575A1 (en) 2007-01-18

Family

ID=37036953

Family Applications (1)

Application Number Title Priority Date Filing Date
US11/485,482 Abandoned US20070015575A1 (en) 2005-07-15 2006-07-13 Game apparatus and its control method

Country Status (5)

Country Link
US (1) US20070015575A1 (en)
EP (1) EP1743683A1 (en)
JP (1) JP3964921B2 (en)
KR (1) KR100793004B1 (en)
TW (1) TW200714328A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20080194330A1 (en) * 2007-02-09 2008-08-14 Pixart Imaging Incorporation Interactive game method and interactive game system with alarm function

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP4440962B2 (en) * 2007-12-05 2010-03-24 株式会社バンダイ Game device
JP6779939B2 (en) * 2018-04-19 2020-11-04 グリー株式会社 Game device, control method and control program

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5713792A (en) * 1995-01-30 1998-02-03 Sega Enterprises, Ltd. Fishing game device and a simulated fishing reel
US6217446B1 (en) * 1996-12-06 2001-04-17 Kabushi Kaisha Sega Enterprises Game device and picture processing device
US20010007830A1 (en) * 2000-01-06 2001-07-12 Konami Corporation Game system and computer-readable storage medium
US20010021665A1 (en) * 1997-03-07 2001-09-13 Kazuhiro Gouji Fishing game device
US6422942B1 (en) * 1999-01-29 2002-07-23 Robert W. Jeffway, Jr. Virtual game board and tracking device therefor

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5339095A (en) * 1991-12-05 1994-08-16 Tv Interactive Data Corporation Multi-media pointing device
JPH07211196A (en) * 1994-01-25 1995-08-11 Sega Enterp Ltd Operation equipment for game machine
US5796387A (en) * 1994-08-16 1998-08-18 Smith Engineering Positioning system using infrared radiation
JP3422383B2 (en) * 1994-09-05 2003-06-30 株式会社タイトー Method and apparatus for detecting relative position between video screen and gun in shooting game machine
JP2000005443A (en) 1998-06-25 2000-01-11 Jatco Corp Gps receiver
JP3431522B2 (en) 1998-11-17 2003-07-28 株式会社ナムコ GAME DEVICE AND INFORMATION STORAGE MEDIUM
JP2001087551A (en) * 1999-09-24 2001-04-03 Konami Co Ltd Shooting video game device and image displaying method in shooting video game

Also Published As

Publication number Publication date
KR100793004B1 (en) 2008-01-08
JP3964921B2 (en) 2007-08-22
JP2007044487A (en) 2007-02-22
KR20070009463A (en) 2007-01-18
TW200714328A (en) 2007-04-16
EP1743683A1 (en) 2007-01-17

Legal Events

Date Code Title Description
AS Assignment

Owner name: BANDAI CO., LTD., JAPAN

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATSUNO, SHOTARO;NAKANISHI, KENGO;REEL/FRAME:018104/0541;SIGNING DATES FROM 20060705 TO 20060706

AS Assignment

Owner name: ANTHROGENESIS CORPORATION, NEW JERSEY

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LIU, QING;RAY, CYNTHIA;REEL/FRAME:018848/0382

Effective date: 20070115

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION