WO1996036945A1 - Image processing device, image processing method, game device using the same, and storage medium - Google Patents
- Publication number
- WO1996036945A1 (PCT application PCT/JP1996/001331)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- viewpoint
- image processing
- moving
- image
- processing apparatus
- Prior art date
Classifications
- A—HUMAN NECESSITIES
- A63—SPORTS; GAMES; AMUSEMENTS
- A63F—CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
- A63F13/00—Video games, i.e. games using an electronically generated display having two or more dimensions
- A63F13/10
- A63F13/45—Controlling the progress of the video game
- A63F13/50—Controlling the output signals based on the game progress
- A63F13/52—Controlling the output signals based on the game progress involving aspects of the displayed game scene
- A63F13/525—Changing parameters of virtual cameras
- A63F13/5258—Changing parameters of virtual cameras by dynamically adapting the position of the virtual camera to keep a game object or game character in its viewing frustum, e.g. for tracking a character or a ball
- A63F13/80—Special adaptations for executing a specific game genre or game mode
- A63F13/803—Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
- A63F2300/00—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
- A63F2300/60—Methods for processing data by generating or executing the game program
- A63F2300/64—Methods for processing data by generating or executing the game program for computing dynamical parameters of game objects, e.g. motion determination or computation of frictional forces for a virtual car
- A63F2300/66—Methods for processing data by generating or executing the game program for rendering three dimensional images
- A63F2300/663—Methods for processing data by generating or executing the game program for rendering three dimensional images for simulating liquid objects, e.g. water, gas, fog, snow, clouds
- A63F2300/6661—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
- A63F2300/6684—Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera by dynamically adapting its position to keep a game object in its viewing frustrum, e.g. for tracking a character or a ball
- A63F2300/80—Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
- A63F2300/8017—Driving on land or water; Flying
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
Definitions
- Image processing apparatus, image processing method, game apparatus using the same, and storage medium
- the present invention relates to an image processing device, an image processing method, and a game device using the same. More specifically, the present invention relates to an image processing device such as a game machine that uses computer graphics, and to a storage medium storing these processing procedures.
- the game device includes peripherals such as joysticks (operation levers), buttons, and a monitor, a game device main body that executes data communication with the peripherals, image processing, sound processing, and the like, and a display that displays the video signal obtained from the main body.
- Image processing in such a game device carries great weight in raising the product's value, and in recent years the technology for rendering moving images has also been refined.
- in recent years, devices have appeared in which a three-dimensional shape is composed of a plurality of polygons and textures (patterns) are applied to these polygons, so that a character can be displayed from an arbitrary viewpoint.
- one such device is a TV game device in which a three-dimensional character is drawn with texture-mapped polygon data, and the background portions that must move with the movement of the character and the change of the viewpoint are also drawn with textured polygon data (for example, "Rayleche" manufactured by SEGA ENTERPRISES, Ltd.) (first conventional example).
- the position (viewpoint) of the camera is denoted by E, the direction of the camera by F, and the field of view by G.
- at one point the direction of the camera is F1 and the field of view is G1; at passing point q11 the direction becomes F2; and at a further point the camera's direction becomes F3 with field of view G3. Because the direction switches discretely at each passing point, the field of view G changes greatly at a single point (second conventional example).
- this device displays a river as a background or the like.
- the flow of water in this river is expressed using texture-mapped polygons: a texture that looks like flowing water is attached to the polygons along the course of the river, and the texture coordinates are shifted over time in the direction in which the water flows. For example, as shown in Fig. 25, when the texture coordinates are projected onto the polygons 150, 150, ... along an arbitrary axis, the entire screen is displayed as moving in the same direction. As shown in Fig. 26, a texture can instead be repeated n times on each of the polygons 151, 152, 153, 154, 155, and 156, making it possible to represent a meandering river flow that follows the shape of each of these polygons (third conventional example).
- when the viewpoint looks at a background 220 at infinity or at an object 221 at a great distance, the display range of the screen is 230 (fourth conventional example).
- since the maximum number of polygons that can be displayed simultaneously is limited, the number of displayed polygons is controlled so that it does not exceed that maximum for the entire screen.
- in these conventional devices, a map created separately from the game environment is inserted into the game environment, so the flow of the game is interrupted.
- because the viewpoint always remains inside the game, if content of a completely different kind is suddenly displayed, the flow of the game is interrupted and the player is confused.
- a Z-buffer method or a Z-sort method is often used to display the image.
- in the Z-buffer method, the depth information of each object is held, and the coordinate values may be calculated as integers (handled as fixed-point values).
- special processing is therefore needed to represent the depth of an object at infinity.
- if the display range in the depth direction is limited in order to ensure accuracy in the Z-buffer method, objects must be placed at distances within the display range 230. That is, as shown in FIG. 27(a), the object 221 must be arranged so as to fall within the display range 230.
- when the viewpoint 210 moves to the left side of the figure, the background 220 that represents infinity appears to move, although a background at infinity (or an object at a very great distance) should not appear to move.
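The fixed-point depth handling mentioned above can be illustrated with a small sketch. The patent gives no code; this Python function, with an assumed 16-bit buffer width, merely shows why a finite display range 230 must be chosen and why an object at infinity needs special processing: depths beyond the far limit simply have no integer representation.

```python
def to_fixed_point_depth(z, z_max, bits=16):
    """Quantise a depth value into an integer Z-buffer entry, as when
    coordinates are 'handled as fixed point'. Depths outside [0, z_max]
    cannot be represented, so a finite display range must be imposed and
    infinity needs special treatment. bits=16 is an illustrative assumption.
    """
    if not 0.0 <= z <= z_max:
        raise ValueError("depth outside the representable display range")
    scale = (1 << bits) - 1          # largest integer depth value
    return round(z / z_max * scale)  # map [0, z_max] onto [0, scale]
```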
- in addition, the total number of polygons on the entire screen is controlled.
- the game screen is composed of polygons representing the background, polygons representing enemies, and so on, and depending on the progress of the game, many enemies may be displayed at once. In that case the number of background polygons is reduced in order to display the enemies, and part of the background image may be missing (so-called polygon dropout).
- such loss of the background image significantly impaired the quality of the game image.
- a first object of the present invention is to provide an image processing device that does not interrupt the flow of a game.
- a second object of the present invention is to provide an image processing device capable of moving a viewpoint in a natural state.
- a third object of the present invention is to provide an image processing device capable of expressing natural motion.
- a fourth object of the present invention is to provide an image processing apparatus that makes distant scenery look natural even when the viewpoint moves.
- a fifth object of the present invention is to provide an image processing device capable of preventing a background image from being lost.
- the present invention is directed to an image processing apparatus that moves a moving object configured on three-dimensional space coordinates and generates a viewpoint image in which the moving object is viewed from a predetermined viewpoint, the apparatus being provided with information-representation generating means for constructing, on the three-dimensional space coordinates, an information representation body in which information on the moving object is expressed.
- the information-representation generating means constructs, as the information representation body, a planar body representing the position of the moving object.
- the information representation body generating means changes the planar body from a folded state to an open state.
- the present invention includes viewpoint moving means that moves the viewpoint so that both the moving object and the information representation body are displayed when the information representation body is constructed.
- the viewpoint moving means further moves the viewpoint toward the information representation body.
- the present invention further includes display means for displaying the generated viewpoint image.
- a game apparatus for playing a game while moving on a three-dimensional space coordinate system by the moving object, and includes any one of the image processing apparatuses described above.
- the present invention relates to an image processing method for generating a viewpoint image in which a moving object configured on three-dimensional spatial coordinates and an information representing body in which information on the moving object is expressed are viewed from a predetermined viewpoint.
- the method includes a third step of moving the viewpoint toward the information representation body so that the information representation body is displayed at a large size.
- when image processing is performed so that an information representation body such as a planar body is displayed on the three-dimensional space coordinates in relation to the moving display body, a planar body on which the necessary information is written can be shown. This allows the viewer to watch the scene without discomfort and to connect smoothly to the subsequent images. Further, according to the present invention, effective image processing relating to game development can be performed.
- since the planar body (for example, a map) can be gradually enlarged and displayed, the player can be given the impression of actually looking at a map displayed on the game screen.
- since the image processing displays the planar body changing from a folded state to an open state, a device that gives a sense close to reality can be provided.
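As a hedged sketch of the map presentation described above (the patent specifies no algorithm), the following Python function interpolates, in one step, both the plane's open fraction (0.0 folded to 1.0 open) and the viewpoint's approach toward the map; the linear easing and all names are illustrative assumptions.

```python
def map_presentation_step(frame, total_frames, cam_start, cam_end):
    """Return (open_fraction, camera_position) for one animation frame.

    open_fraction runs from 0.0 (map plane fully folded) to 1.0 (fully
    open) while the camera moves from cam_start toward cam_end, so the
    map grows larger on screen as it unfolds. Linear easing is an
    assumption; the patent only says the plane goes from folded to open.
    """
    t = min(max(frame / total_frames, 0.0), 1.0)  # clamp progress to [0, 1]
    cam_pos = tuple(s + (e - s) * t for s, e in zip(cam_start, cam_end))
    return t, cam_pos
```

A game loop would call this once per frame, scaling the map quad by the returned open fraction and placing the camera at the returned position.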
- the present invention also comprises a moving object configured on three-dimensional space coordinates, a point of interest provided on the trajectory through which the moving object will pass, a passing point provided on the trajectory through which the moving object has passed, and visual-axis determining means for determining the visual-axis direction based on the position information of the point of interest and the position information of the passing point.
- the visual axis determining means determines a visual axis direction based on positional information of the point of interest and the passing point at an equal distance before and after the moving body.
- when the moving body is moving on a curve, the visual-axis determining means changes the distances from the moving body to the point of interest and to the passing point, respectively, according to the properties of the curve.
- the present invention includes display means for displaying the generated viewpoint image.
- the present invention is a game device for playing a game while moving on three-dimensional space coordinates with the moving object, and includes the image processing device according to any one of the above.
- the present invention relates to an image processing apparatus that moves a moving object configured on three-dimensional space coordinates and generates a viewpoint image viewed from the viewpoint of the moving object.
- the apparatus further includes viewpoint determining means for smoothly changing a line of sight centered on the viewpoint in accordance with the trajectory of the moving body.
- smoothly changing the line of sight according to the trajectory of the moving body means that the line of sight changes continuously, as a whole or at least in some sections, based on a predetermined trajectory or on properties of the trajectory resulting from the movement of the moving body (for example, its curvature, tangent vector, or derivative).
- the viewpoint determining means determines a direction of a line of sight centered on the viewpoint based on coordinates on trajectories before and after the moving body.
- the viewpoint determination means calculates two coordinates at an equal distance in front of and behind the moving body, and sets a straight line connecting these coordinates as the direction of the line of sight.
- when the moving object is moving on a curve, the viewpoint determining means changes the distances from the moving object to the coordinates before and after it according to the properties of the curve.
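The line-of-sight determination described above (two coordinates at equal distance before and behind the moving body, connected by a straight line) can be sketched as follows; sampling the trajectory by index rather than by arc length is an illustrative simplification.

```python
import math

def line_of_sight(trajectory, i, d):
    """Unit vector of the line of sight for the moving body at sample i.

    The passing point is taken d samples behind and the point of interest
    d samples ahead (equal distances before and after the moving body);
    the straight line connecting them gives the direction of the line of
    sight, which therefore changes smoothly as the body moves.
    """
    behind = trajectory[max(i - d, 0)]                    # passing point
    ahead = trajectory[min(i + d, len(trajectory) - 1)]   # point of interest
    dx, dy, dz = (a - b for a, b in zip(ahead, behind))
    norm = math.sqrt(dx * dx + dy * dy + dz * dz)
    return (dx / norm, dy / norm, dz / norm)
```

For the curvature-dependent variant in the text, a caller could simply pass a smaller `d` on tight curves and a larger `d` on straight sections.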
- the present invention further includes display means for displaying the generated viewpoint image.
- the present invention is a game device for playing a game while moving on three-dimensional space coordinates with the moving object, and includes any one of the image processing devices.
- the present invention relates to an image processing method for moving a moving object formed on three-dimensional space coordinates and generating a viewpoint image viewed from the viewpoint of the moving object.
- in this method, when the moving display body is moving in a predetermined direction, at least one coordinate before and one coordinate after the current coordinates of the moving display body are captured, and the direction of the viewpoint is determined based on these coordinates.
- as a result, the viewpoint movement resembles actual viewpoint movement, and a device capable of more natural expression can be obtained. In addition, effective image processing for game development becomes possible.
- when the moving display body is moving on a curve, the distances of the coordinates captured before and after its current coordinates are changed in accordance with the curvature of the curve, so that a viewpoint movement close to the actual one can be obtained.
- the present invention also provides an image processing apparatus that attaches a texture to each of a plurality of connected polygons and generates a predetermined image.
- the apparatus is provided with coordinate processing means that determines a reference vector for each of the plurality of connected polygons and moves, in the direction of the reference vector, the texture attached to each polygon.
- the present invention relates to an image processing apparatus that attaches a texture to each of a plurality of connected polygons and generates a predetermined image.
- a reference vector is determined for each row of vertically or horizontally connected polygons, and coordinate processing means is provided that moves the texture in the direction of the reference vector without deforming it.
- the coordinate processing means determines the reference vector based on a predetermined curve.
- the reference vector for each of the plurality of polygons is continuous, and the texture corresponds to a flow along the curve.
- the present invention further comprises display means for displaying the generated image.
- the present invention relates to an image processing method for attaching a texture to each of a plurality of connected polygons and generating a predetermined image.
- in this method, for a plurality of polygons connected vertically and horizontally, a reference vector is provided for each vertical or horizontal polygon row, and the texture is moved in the direction of the reference vector without being deformed. For this reason, a screen with a flow, such as the flow of river water, can be represented more realistically.
- since the reference line segment is given based on a predetermined curve (for example, the path along which a river flows), there is no change in the density of the texture, and a natural flow can be expressed.
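The texture movement described above can be sketched as follows. This illustrative Python function shifts the texture coordinates of one polygon along its row's reference vector each frame without deforming the texture; the wrap into [0, 1) assumes a repeating texture, which the patent does not state explicitly.

```python
def scroll_texture(uv_coords, reference_vector, speed, dt):
    """Shift every (u, v) texture coordinate of a polygon along the
    reference vector of its polygon row, animating a flow such as river
    water. The texture itself is not deformed; only its coordinates move.
    Wrapping into [0, 1) assumes a repeating texture (an assumption).
    """
    du = reference_vector[0] * speed * dt  # per-frame shift along u
    dv = reference_vector[1] * speed * dt  # per-frame shift along v
    return [((u + du) % 1.0, (v + dv) % 1.0) for (u, v) in uv_coords]
```

Because each polygon row along the river's curve gets its own reference vector, neighboring rows scroll in locally different directions, which is what lets the flow follow a meandering course.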
- the present invention is directed to an image processing apparatus that moves a moving object configured on three-dimensional space coordinates and generates a viewpoint image viewed from the viewpoint of the position of the moving object.
- the apparatus is provided with movement processing means for moving a background image representing a distant background together with the movement of the viewpoint.
- the background image has a cylindrical shape or a spherical shape.
- the present invention further includes display means for displaying the generated viewpoint image.
- the present invention is a game device for playing a game while moving on three-dimensional space coordinates with the moving object, and includes any one of the image processing devices.
- the present invention relates to an image processing method for moving a moving object configured on three-dimensional space coordinates and generating a viewpoint image viewed from the viewpoint of the position of the moving object.
- the method includes a second step of moving a background image representing a distant background based on the movement amount.
- in this method, the predetermined screen is moved in accordance with the movement of the viewpoint, so that even when the viewpoint moves, objects at infinity or at a great distance can be displayed correctly, and a more realistic expression can be obtained.
- effective image processing for game development becomes possible.
- since the predetermined screen can be arranged on a cylinder or a sphere with the viewpoint at its center, and that center can be moved together with the viewpoint, objects at infinity or at a great distance can be displayed correctly.
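A minimal sketch of the movement processing means described above, assuming the background is simply a model whose vertices are translated by the viewpoint's movement vector so that the cylindrical or spherical screen stays centered on the viewpoint:

```python
def translate_background(vertices, move_amount):
    """Translate every vertex of the distant-background model (the
    cylindrical or spherical screen) by the viewpoint's movement vector.

    Because the screen moves with the viewpoint, anything drawn on it
    keeps a fixed apparent direction, exactly as an object at infinity
    should, even though it sits at a finite Z-buffer depth.
    """
    dx, dy, dz = move_amount
    return [(x + dx, y + dy, z + dz) for (x, y, z) in vertices]
```

Rotations of the viewpoint are deliberately not applied to the background here: turning the camera should sweep the view across the distant scenery, while translating it should not.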
- the present invention relates to an image processing apparatus that moves a moving object configured on three-dimensional space coordinates and generates a viewpoint image viewed from the viewpoint of the position of the moving object using a polygon.
- the apparatus is equipped with polygon number control means that classifies the polygons composing the viewpoint image into a plurality of categories and controls the number of displayed polygons so that, for each category, it does not exceed a predetermined maximum display number.
- the polygon number control means classifies the polygons constituting the viewpoint image into at least polygons representing the background and other polygons.
- the present invention further includes display means for displaying the generated viewpoint image.
- the present invention is a game device for playing a game while moving on three-dimensional space coordinates by the moving object, and includes any one of the image processing devices.
- the present invention also relates to an image processing method for moving a moving object configured on three-dimensional space coordinates and generating, using polygons, a viewpoint image viewed from the viewpoint of the position of the moving object.
- in this method, the screens other than the moving display body are divided into a plurality of types, the number of polygons assigned to each screen is individually limited, and each screen is composed within that limit. Therefore, even when a large number of enemies appear, the polygons of the background image are not lost.
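The per-category polygon limit described above can be sketched as follows; the category names and budget values are illustrative assumptions, not taken from the patent.

```python
def limit_polygons(requested, budgets):
    """Clamp the polygon count requested by each screen category to that
    category's own maximum display number.

    Giving the background and the enemies separate budgets means a surge
    of enemy polygons can never push background polygons out of the
    frame, avoiding so-called polygon dropout.
    """
    return {cat: min(count, budgets.get(cat, 0))
            for cat, count in requested.items()}
```

With a single global cap, 900 requested enemy polygons could crowd out the background; with per-category budgets each screen degrades only within its own allotment.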
- the present invention also relates to an image processing apparatus that moves a moving object configured on three-dimensional space coordinates and generates a viewpoint image viewed from the viewpoint of the position of the moving object, the apparatus having visual-field changing means that changes the visual field according to the situation in front of the moving object.
- the visual-field changing means widens the field of view when there is no object in front of the moving object, and narrows the field of view when there is an object.
- the present invention further includes display means for displaying the generated viewpoint image.
- the present invention is a game device for playing a game while moving on three-dimensional space coordinates by the moving object, and includes any one of the image processing devices.
- the present invention provides an image processing method for moving a moving object formed on three-dimensional space coordinates and generating a viewpoint image viewed from the viewpoint of the position of the moving object.
- the second step widens the field of view when there is no object ahead, and narrows the field of view when there is an object.
- the visual field is changed according to the situation in front of the moving body, so that a natural image can be obtained.
- for example, when a truck carrying a character is running through a wilderness, a sense of openness can be produced by widening the field of view, and a feeling of obstruction can be produced by narrowing it.
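A hedged sketch of the visual-field changing means: nothing ahead gives a wide viewing angle, a close object gives a narrow one. The specific angles, the distance thresholds, and the linear blend between them are illustrative assumptions.

```python
def choose_field_of_view(obstacle_distance, near, far,
                         narrow_deg=40.0, wide_deg=70.0):
    """Pick a viewing angle from what lies in front of the moving body.

    None or anything beyond `far` counts as a clear view (wide angle,
    sense of openness); anything closer than `near` counts as an object
    directly ahead (narrow angle, feeling of obstruction); in between,
    blend linearly so the field of view changes smoothly.
    """
    if obstacle_distance is None or obstacle_distance > far:
        return wide_deg            # open wilderness: widen the view
    if obstacle_distance < near:
        return narrow_deg          # object close ahead: narrow the view
    t = (obstacle_distance - near) / (far - near)
    return narrow_deg + (wide_deg - narrow_deg) * t
```

The blend band keeps the camera from popping between the two angles when an object sits right at a threshold.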
- the present invention is a storage medium storing a procedure for causing a processing device to execute any one of the above methods.
- Storage media include, for example, floppy disks, magnetic tapes, magneto-optical disks, CD-ROMs, DVDs, memory cartridges with battery-backed RAM or flash memory, nonvolatile RAM cartridges, and the like.
- a storage medium is a medium on which information (mainly digital data and programs) is recorded by some physical means, and which can cause a processing device such as a computer or a dedicated processor to perform a predetermined function.
BRIEF DESCRIPTION OF THE FIGURES
- FIG. 1 is a perspective view showing the appearance of the image processing apparatus of the present invention.
- FIG. 2 is a block diagram of the embodiment.
- FIG. 3 is a flowchart showing the operation of the camera work of the embodiment.
- FIG. 4 is an explanatory diagram of the operation of the embodiment.
- FIG. 5 is an explanatory diagram of the operation of the embodiment.
- FIG. 6 is a flowchart of an operation for obtaining a smooth viewing angle according to another embodiment.
- FIG. 7 is an explanatory diagram of the operation.
- FIG. 8 is an explanatory diagram of the operation.
- FIG. 9 is an explanatory diagram of the same operation.
- FIG. 10 is an explanatory diagram of the same operation.
- FIG. 11 is a flowchart of the operation of creating a river flow in the other embodiment.
- FIG. 12 is an explanatory diagram of the operation.
- FIG. 13 is an explanatory diagram of the operation.
- FIG. 14 is an explanatory diagram of the operation.
- FIG. 15 is a flowchart of the movement operation of the background and the like in the other embodiment.
- FIG. 16 is an explanatory diagram of the same operation.
- FIG. 17 is an explanatory diagram of the same operation.
- FIG. 18 is a flowchart of the operation related to the polygon restriction in the other embodiment.
- FIG. 19 is an explanatory diagram of the same operation.
- FIG. 20 is an explanatory diagram of this operation.
- FIG. 21 is a flowchart of the operation relating to the viewing angle in the other embodiment.
- FIG. 22 is an explanatory diagram of conventional viewing angle processing.
- Figure 23 is an illustration of camera orientation and field of view.
- FIG. 24 is an explanatory diagram of a visual field obtained by conventional viewing angle processing.
- Figure 25 is a diagram for explaining the flow of a conventional river.
- Figure 26 is a diagram for explaining the flow of a conventional river.
- FIG. 27 is an explanatory diagram of a conventional background movement.
BEST MODE FOR CARRYING OUT THE INVENTION
- an embodiment of the present invention will be described below with reference to the figures. Figure 1 shows the appearance of this game device.
- reference numeral 1 indicates a game device main body.
- the game device body 1 has a box shape.
- the game device main body 1 includes, as display means, a display 1a such as a CRT, a projector, a liquid crystal display, or a plasma display.
- An operation panel 2 is provided on the lower front surface of the display 1a.
- speaker mounting holes (not shown) are provided on the sides of the display 1a, and speakers 14 are provided inside these holes.
- a game processing board 10 is provided inside the game apparatus main body 1.
- the display 1a, the operation device 11 of the operation panel 2, and the speaker 14 are connected to the game processing board 10. With this structure, the player can enjoy the game using the display 1a and the operation device 11 of the operation panel 2.
- the operation device 11 provided on the operation panel 2 includes a joystick 2a, a push button 2b, and the like. The player can operate the character with the joystick 2a and the button 2b.
- FIG. 2 is a block diagram showing a game device to which the data processing device of the embodiment is applied.
- This game device includes a display 1 a, an operation device 11 disposed on an operation panel 2, a game processing board 10, and a speaker 14.
- the display 1a displays, for example, the game image; a projector may be used instead of the display 1a.
- the game processing board 10 includes a CPU (central processing unit) 101, a ROM 102, a RAM 103, a sound device 104, an AMP (amplifier) 105, an I/O interface 106, a scroll data processing device 107, a co-processor (auxiliary processing device) 108, a graphic data ROM 109, a geometrizer 110, a motion data ROM 111, a drawing device 112, a texture data ROM 113, a texture map RAM 114, a frame buffer 115, an image synthesizing device 116, and a D/A converter 117.
- the CPU 101 is connected via lines to the ROM 102 storing predetermined programs, image processing programs and the like, the RAM 103 storing data, the sound device 104, the I/O interface 106, the scroll data calculation device 107, the co-processor 108, and the geometrizer 110.
- the RAM 103 functions as a buffer and is used for writing various commands to the geometrizer (such as display of objects) and for writing data required for various calculations.
- the input/output interface 106 is connected to the operation device 11, so that operation signals from the joystick 2a and the like of the operation device 11 are converted into digital quantities and taken in by the CPU 101.
- the sound device 104 is connected to the speaker 14 via the power amplifier 105, so that the acoustic signal generated by the sound device 104 is power-amplified and then supplied to the speaker 14.
- based on the program built into the ROM 102, the CPU 101 reads the operation signals from the operation device 11, the graphic data from the graphic data ROM 109, and the motion data from the motion data ROM 111 ("characters such as the player's own character, enemies, etc." and "backgrounds such as the moving path, terrain, sky, and various structures") (three-dimensional data), and at least performs behavior calculations (simulation) and special-effect calculations.
- the behavior calculation simulates the movement of the character in virtual space based on the player's operation signal from the operation device 11; after the coordinate values in three-dimensional space are determined, a transformation matrix for converting these coordinate values into the visual field coordinate system is specified to the geometrizer 110.
- a graphic data ROM 109 is connected to the co-processor 108, so that predetermined graphic data is passed to the co-processor 108 (and to the CPU 101).
- the co-processor 108 is primarily intended to undertake floating-point operations. Various judgments are made by the co-processor 108 and the results are given to the CPU 101, so that the calculation load on the CPU can be reduced.
- the geometrizer 110 is connected to the motion data ROM 111 and the drawing device 112.
- the motion data ROM 111 stores shape data composed of a plurality of polygons (three-dimensional data such as characters, terrain, and backgrounds, each consisting of vertices), and this shape data is passed to the geometrizer 110.
- the geometrizer 110 performs perspective transformation of the shape data specified by the transformation matrix sent from the CPU 101, obtaining data converted from the coordinate system of the three-dimensional virtual space into the visual field coordinate system.
- the drawing device 112 attaches textures to the transformed shape data in the visual field coordinate system and outputs the result to the frame buffer 115.
- for this purpose, the drawing device 112 is connected to the texture data ROM 113 and the texture map RAM 114, and also to the frame buffer 115.
- the polygon data refers to a data set of relative or absolute coordinates of the vertices of polygons (mainly triangles or quadrangles), each consisting of a collection of vertices.
- the graphic data ROM 109 stores polygon data set relatively coarsely, sufficient for executing predetermined judgments.
- the motion data ROM 111 stores polygon data set more precisely, concerning the shapes that compose the screen, such as the characters, the truck, and the background.
- the scroll data computing device 107 computes data for scroll screens of characters such as text (stored in the ROM 102).
- the output signal of this computing device 107 and the output signal of the frame buffer 115 are combined by the image synthesizing device 116, and the combined signal is converted from a digital signal into an analog signal by the D/A converter 117 and input to the display 1a.
- the polygon screens (simulation results) of the characters, the truck, the terrain (background), and the like are temporarily stored in the frame buffer 115.
- the scroll screens of text information are combined with these according to a specified priority to generate the final frame image data.
- This image data is converted into an analog signal by the D / A converter 117, sent to the display 1a, and the game image is displayed in real time.
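As a rough illustration of the geometrizer's role in this pipeline, the perspective transformation that maps three-dimensional coordinates into the visual field can be sketched as follows. This is a minimal model, not the patent's implementation; the function name, pinhole-camera model, and `focal` parameter are illustrative assumptions.

```python
def perspective_project(point, camera_pos, focal=1.0):
    """Project a 3-D world point into 2-D screen coordinates.

    A simplified stand-in for the geometrizer's perspective
    transformation: translate the point into camera space, then
    divide the x/y components by the depth z.
    """
    x = point[0] - camera_pos[0]
    y = point[1] - camera_pos[1]
    z = point[2] - camera_pos[2]
    if z <= 0:
        return None  # behind the camera: not drawn
    return (focal * x / z, focal * y / z)

# A point twice as far from the camera lands half as far from the screen centre.
near = perspective_project((1.0, 1.0, 2.0), (0.0, 0.0, 0.0))
far = perspective_project((1.0, 1.0, 4.0), (0.0, 0.0, 0.0))
```

The depth division is what makes distant parts of the track and background shrink toward the horizon on the display 1a.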
- FIG. 3 is a flowchart illustrating the operation, and FIGS. 4 and 5 are explanatory diagrams taking this scene as an example.
- the truck 20 runs on the track 21 in the direction of the background 23.
- alternatively, the truck 20 may be stopped while the background 23 and the like approach it, as indicated by the arrow in the drawing.
- the truck 20 has a character 24 (25) on it.
- the character 24 (25) spreads out a map 26.
- the curve circling the character is the trajectory along which the coordinates of the camera viewpoint (described later) move. Six points on this curve are labeled "1", "2", "3", "4", "5", and "6".
- the camera's viewpoint moves in the order of these numbers. That is, it rises from below, centering on the character 24 (25), and then descends again.
- the direction of movement of the camera's viewpoint is counterclockwise.
- in FIG. 5, six screens are shown: FIGS. 5(a), (b), (c), (d), (e), and (f). The screen changes in this order.
- FIGS. 5A to 5F correspond to images photographed from the viewpoints 1 to 6 of the camera in FIG. 4, respectively.
- FIG. 5(a) is drawn from the camera viewpoint 1 at the tip of the truck 20; the track 21 in the direction the truck 20 travels, the ground 22, and the background 23 of mountains and the like are shown. From this camera viewpoint 1, the character 24 (25) is not visible.
- FIG. 5(b) is drawn from the camera viewpoint 2 behind the character 24 (25); in addition to the track 21 and the like, the character 24 (25) is also shown. FIG. 5(c) is drawn from the camera viewpoint 3 above the character 24 (25), showing the whole view of the truck 20 running on the track 21.
- FIG. 5(d) is drawn based on viewpoint 4, which has moved slightly onward from viewpoint 3, and shows the character 24 (25) spreading out the map 26.
- FIG. 5(f) is drawn based on viewpoint 6, which is still closer to the map 26, and the map 26 fills the entire screen.
- the CPU 101 runs the game according to the program built into the ROM 102 (step 301; step 302; NO). When the CPU 101 determines that the game processing of one stage has been completed (step 302; YES), the processing shifts to the viewpoint movement control step.
- the viewpoint movement control step is entered, for example, when a certain stage is cleared and the character has advanced to a position defined in the game. In narrative terms, the character has overcome various crises and broken through difficulties; in other words, it is a scene change.
- the CPU 101 retrieves the coordinates of the first camera viewpoint from the ROM 102 and stores them in a predetermined area of the RAM 103 (step 303).
- the CPU 101 processes the display data such as the character and the background based on the coordinates of the first viewpoint stored in the RAM 103 (step 304).
- when the coordinates of the viewpoint stored in the ROM 102 are within a predetermined value (step 305; YES), the CPU 101 performs image processing for displaying the map in a folded state (step 306).
- as shown in FIG. 4, the coordinates of the camera viewpoint become "1", located in front of the character 24 (25).
- as shown in FIG. 5(a), the display screen seen from this viewpoint "1" is composed of the tip of the truck 20, the track 21, the ground 22, and the background 23.
- These are displayed on the display 1a.
- since the coordinates of the camera viewpoint stored in the predetermined area of the RAM 103 have not reached the target value (step 307; NO), they are updated (step 308).
- the processing then returns to step 304.
- by repeating steps 304 to 308, the coordinates of the camera viewpoint reach viewpoint "2", located behind the character 24 (25) (see FIG. 4).
- as shown in FIG. 5(b), the display screen viewed from this viewpoint "2" is composed of the truck 20, the character 24 (25) riding on it, the track 21, the ground 22, and the background 23. These are displayed on the display 1a.
- by further repeating steps 304 to 308, the coordinates of the camera viewpoint reach viewpoint "3", located far above and behind the character 24 (25), as shown in FIG. 4.
- as shown in FIG. 5(c), the display screen viewed from this viewpoint "3" is composed of the truck 20, the character 24 (25) riding on it, the track 21, and the ground 22. These are displayed on the display 1a as if viewed from the sky.
- suppose that, during steps 304 to 308, before the coordinates of the camera viewpoint reach "4" in FIG. 4, the CPU 101 determines that the viewpoint coordinates stored in the predetermined area have exceeded the predetermined value (step 305; NO). The CPU 101 then executes image processing that displays the map 26 gradually unfolding from its folded state (step 309).
- in this image processing, two vertices of the polygons constituting the map are kept common, while each of the other two vertices is updated each time this step is passed.
- the book-like map is thus displayed opening gradually from its closed state. Each time this step is passed, the map opens further, and once it is completely open, the polygon coordinates are no longer updated.
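The gradual unfolding described above can be sketched as a per-step vertex update: two vertices stay fixed on the spine while the other two swing outward with the opening angle. The hinge layout, function name, and dimensions below are illustrative assumptions, not taken from the patent.

```python
import math

def map_half_vertices(opening_deg, width=1.0):
    """Vertices of one half of a book-like map hinged at the spine (x = 0).

    The two spine vertices are fixed ("made common"); the other two
    swing outward as the opening angle grows, which is the effect of
    updating them on each pass through the display step.
    """
    a = math.radians(opening_deg)
    x = width * math.sin(a)   # horizontal reach of the free edge
    y = width * math.cos(a)   # height of the free edge (folded: straight up)
    # hypothetical vertex order: spine bottom, spine top, free top, free bottom
    return [(0.0, 0.0), (0.0, 1.0), (x, y), (x, y - 1.0)]

closed = map_half_vertices(0)    # folded flat against the spine
open_ = map_half_vertices(90)    # fully opened, lying flat
```

Stepping `opening_deg` from 0 to 90 over successive frames reproduces the book-opening effect; once fully open, the vertices are simply left unchanged.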
- by repeating steps 304, 305, 309, 307, and 308, the coordinates of the camera viewpoint reach viewpoint "4", located diagonally above and behind the character 24 (25), as shown in FIG. 4.
- as shown in FIG. 5(d), the display screen viewed from this viewpoint "4" is composed of the enlarged truck 20 and character 24 (25), the opened map 26 visible between their upper bodies, the track 21, the ground 22, and the background 23. These are displayed on the display 1a.
- the CPU 101 repeats the processing of steps 304, 305, 309, 307, and 308, and the coordinates of the camera viewpoint reach viewpoint "5", located directly above the character 24 (25), as shown in FIG. 4. As shown in FIG. 5(e), the display screen viewed from this viewpoint "5" is composed of part of the truck 20, the character 24 (25) riding on it, and the enlarged map 26 seen between the enlarged upper bodies. These are displayed on the display 1a. As the CPU 101 further repeats this processing, the coordinates of the camera viewpoint reach viewpoint "6", located in front of the character 24 (25), as shown in FIG. 4.
- as shown in FIG. 5(f), the display screen viewed from this viewpoint "6" is composed of the fully opened map 26, which occupies the entire screen of the display 1a.
- the CPU 101 determines that the viewpoint coordinates have reached the final value (step 307; YES), executes the pre-processing of the next stage while the map 26 is being viewed, and then returns to the game processing (step 310).
- conventionally, a necessary map is displayed, on command or automatically, as a separate screen in place of the game screen.
- in this embodiment, by contrast, the map 26 is spread out by the character on the game screen and displayed on the display 1a as if "looked into" by the camera (by moving the viewpoint), so that screen switching is avoided and the flow of the game is not interrupted.
- information required by the player, such as the player's status and position within the entire game, is thus displayed in relation to the character 24 (25) by means of camera work within the game.
- the position of the camera viewpoint does not need to go around the character.
- the display of the map 26 may be stopped at viewpoint "5", for example.
- the position of the camera viewpoint may follow points 1 to 6 in FIG. 4 in order, may follow the reverse order, or may follow the random order.
- the position of the camera viewpoint may move continuously between these points, or may move only on points 1 to 6.
- the position of the camera viewpoint may move in a vertical plane as shown in FIG. 4 or may move in a horizontal plane.
- the camera may be swung left and right or up and down slowly to pan or zoom as in the case of shooting a movie.
- the display may include the surrounding scenery and characters.
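The continuous movement between viewpoints 1 to 6 can be sketched as interpolation along a polyline of camera waypoints. The coordinates below and the choice of linear interpolation are illustrative assumptions; a spline would serve equally well.

```python
def interpolate_path(waypoints, t):
    """Camera position along a polyline of viewpoints for t in [0, 1].

    The viewpoints "1" to "6" of FIG. 4 are treated as waypoints;
    advancing t each frame slides the camera continuously between them.
    """
    if t <= 0.0:
        return waypoints[0]
    if t >= 1.0:
        return waypoints[-1]
    segments = len(waypoints) - 1
    s = t * segments          # which segment, and how far along it
    i = int(s)
    f = s - i
    (x0, y0), (x1, y1) = waypoints[i], waypoints[i + 1]
    return (x0 + f * (x1 - x0), y0 + f * (y1 - y0))

# Six viewpoints circling the character counter-clockwise (coordinates invented).
path = [(0, 0), (2, 0), (2, 2), (1, 3), (0, 3), (-1, 2)]
start = interpolate_path(path, 0.0)
mid = interpolate_path(path, 0.1)   # half-way along the first segment
end = interpolate_path(path, 1.0)
```

Reversing or randomizing the order, as the text allows, only requires reordering the waypoint list.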
- the traveling scene of the truck has been described as an example, but it goes without saying that the present invention is not limited to such a scene.
- it also applies when clearing a stage, taking a break, or selecting equipment.
- it can be used in any scene that requires a screen change or temporary interruption. The operation of line-of-sight determination in this game device will now be described with reference to FIGS. 6 to 10.
- FIG. 6 is a flowchart showing the same operation.
- FIGS. 7 to 10 are diagrams for explaining how the line of sight is determined according to the curvature of the curve.
- in FIG. 6, it is assumed that, while the CPU 101 is processing the game program, the truck 20 on which the main character rides approaches and enters a curve.
- the CPU 101 captures the coordinates of the current point q20 of the truck 20 (step 401). That is, as shown in FIG. 7(a), the coordinates (X, Y, Z) of the current point q20 of the truck 20 are taken into the CPU 101.
- next, the CPU 101 captures the coordinates of the forward point q21 on the track, separated from the current coordinates of the truck 20 by a predetermined distance setting value Δ (step 402).
- the symbol Δ is a distance setting value for obtaining the coordinates of points a predetermined distance ahead of and behind the current point of the truck 20; it is stored, for example, in a predetermined area of the RAM 103. The CPU 101 captures the coordinates ahead of and behind the truck 20 based on this distance setting value Δ.
- the CPU 101 then fetches the coordinates of the point q22 separated by the distance setting value Δ behind the current coordinates of the truck 20 (step 403).
- in this way, the coordinates of the points (positions) q21 and q22, equidistant ahead of and behind the current coordinates of the truck 20, are taken in.
- the CPU 101 performs a calculation connecting the coordinates obtained in steps 402 and 403 with a straight line (step 404). This is equivalent to connecting the point q21 and the point q22 with a straight line N, as shown in FIG. 7(b).
- the CPU 101 can calculate the length of the straight line N by the calculation in step 404.
- the angle of the curve can be determined from the length of the straight line N.
- the straight line Na between the points q21 and q22 becomes longer in the case where the angle of the curve is obtuse.
- on a straight track, the length of the straight line Na takes the maximum value 2Δ.
- the closer the curve angle is to an acute angle, the shorter the straight line Nb connecting the points q21 and q22 becomes.
- the CPU 101 judges the length of the straight line obtained in step 404 (step 405).
- if the line is long, the CPU 101 sets the distance setting value Δ to the standard value, judging that the angle of the curve is obtuse, and stores it in the RAM 103 (step 407).
- if the line is short, the CPU 101 sets the distance setting value Δ to a small value, judging that the angle of the curve is acute, and stores it in the RAM 103 (step 406). Switching the distance setting value according to whether the curve angle is acute or obtuse in this way makes the camera direction change naturally even when the track 21 of the truck 20 curves sharply.
- as shown in FIG. 7(b), the CPU 101 translates the straight line N obtained in step 404 to the current point q20 and stores it as the camera direction F, for example, in a predetermined area of the RAM 103 (step 408).
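Steps 401 to 408 can be sketched as follows: the camera direction is the unit vector along the chord N between the points ahead of and behind the truck, and Δ is shrunk when the chord is short (a sharp curve). The threshold and the halving rule are illustrative assumptions; the patent only specifies "standard" and "small" values.

```python
import math

def camera_direction(q_back, q_front):
    """Unit vector along the straight line N joining the point behind
    and the point ahead of the truck; translating N to the current
    point q20 gives the camera direction F."""
    dx, dy = q_front[0] - q_back[0], q_front[1] - q_back[1]
    n = math.hypot(dx, dy)
    return (dx / n, dy / n)

def adjust_delta(chord_len, delta, threshold):
    """Keep the standard distance setting value on gentle (obtuse)
    curves, where the chord |N| stays long; shrink it on sharp
    (acute) curves, where |N| becomes short."""
    return delta / 2 if chord_len < threshold else delta

# On a straight track the chord reaches its maximum length, 2 * delta.
f = camera_direction((0.0, -1.0), (0.0, 1.0))
standard = adjust_delta(2.0, 1.0, 1.5)   # long chord: keep delta
reduced = adjust_delta(0.5, 1.0, 1.5)    # short chord: shrink delta
```

Because the direction is computed symmetrically from both points, the same code works unchanged when the truck's direction of travel is reversed.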
- the field of view is G 10 at the camera orientation F 10 at point q 10.
- the camera direction is F 20 from points q 21 and q 22, and the field of view is G 20.
- the direction of the camera from points q31 and q32 is F30, and the field of view is G30.
- the fields of view thus change while overlapping one another, so the view shifts smoothly.
- when the distance setting value Δ is varied according to the angle, the interval between sampling points can be made large on a gentle curve, as shown in FIG. 10(a), and small on a sharp curve, as shown in FIG. 10(b). The number of times the straight line N is calculated is then relatively small in the case of FIG. 10(a) and relatively large in the case of FIG. 10(b). Therefore, by changing the distance setting value Δ according to the angle, optimal processing can be performed while reducing the load on the CPU 101.
- in this way, two points are taken on the track ahead of and behind the truck, and the camera line of sight is determined based on these two points.
- as a result, the viewpoint (the camera direction F) does not swing abruptly, producing natural movement.
- since the direction of the viewpoint is determined from the points ahead of and behind the current coordinate point, the method also copes with the case where the direction of travel is reversed.
- the setting of the distance between the two points can be adjusted, so that the swing width of the viewpoint can be adjusted.
- the angle of the curve can be determined by the straight line N connecting the two points, it can be directly reflected in the motion data.
- the application of the above processing is not limited to the case where the truck moves on the track.
- it can also be applied to the case where a car curves or an airplane climbs or descends.
- the curve does not necessarily have to be smooth.
- the present invention can be applied to a line in which a plurality of straight lines are connected.
- the above processing can be applied to any case where the reference direction of the screen display is determined according to a certain rule.
- the direction of the camera may be determined based on the angle information of the straight line without moving the straight line in parallel.
- the processing may be performed without obtaining a straight line based on direction information (angle information) obtained in advance corresponding to the curve.
- the points from which the straight line N is calculated need not be two points sandwiching the moving object on the track; the current position of the moving object and the positions of points it passes through (for example, one slightly ahead and one further ahead) may be used.
- FIG. 11 is a flowchart for explaining the flow of the river according to this embodiment.
- FIG. 12 is an explanatory diagram for explaining the relationship between a polygon and each coordinate.
- FIG. 13 is an explanatory diagram of the texture coordinate system, with u on the horizontal axis and v on the vertical axis. The image representing the river is represented by multiple polygons connected in the u and v directions.
- FIG. 14 is an explanatory diagram showing the flow of the river represented by this processing.
- each polygon PG1, PG2, ... has a curve K1, K2, ... (one per polygon row) serving as the reference for its texture coordinates.
- the polygons PG1, PG2, ... shown in this figure are part of a plurality of polygons connected vertically and horizontally. Polygons (not shown) are therefore also connected in the lateral direction of the polygons PG1 and PG2.
- the CPU 101 reads the reference vector K1 serving as the reference for the texture coordinates, prepared for the polygon PG1 (step 501).
- the reference vector is determined for each polygon row in the horizontal direction (the direction orthogonal to the direction in which the river flows) based on a predetermined river path (curve).
- for example, the reference vector K1 of the polygon row including the polygon PG1 and the reference vector K2 of the polygon row including the polygon PG2 are each determined in this way.
- the reference vector indicates the river flow.
- next, the CPU 101 calculates the texture coordinates using the reference vector K1 (step 502). This calculation is performed, for example, as follows.
- line segments parallel to the reference vector K1 are denoted L1, L3, and L4, and line segments orthogonal to the reference vector K1 are denoted L2, L5, and L6.
- the texture coordinates of the vertices p1, p2, p3, and p4 of the polygon PG1 are obtained from these line segments.
- the CPU 101 determines whether all polygons have been processed (step 503).
- since the processing has not yet been completed (step 503; NO), the subscript of the polygon PG is updated so that the next polygon can be calculated (step 504), and the process returns to step 501.
- mapping data expressed in texture coordinates is mapped to a polygon indicating the flow of the river to express the flow of the river.
- the texture can be deformed appropriately according to the shape of the river flow. Therefore, the flow of the river can be naturally expressed.
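One plausible reading of the line-segment construction above is that each vertex's texture coordinates are its projections onto the reference vector K (giving v, along the flow) and onto K's perpendicular (giving u, across the flow). This sketch is an assumption about the patent's construction, not a quotation of it; the names and the projection convention are illustrative.

```python
def texture_coords(vertex, origin, ref_vec):
    """Texture (u, v) of a polygon vertex: project the vertex onto the
    unit reference vector K (the local flow direction) for v, and onto
    K's perpendicular for u."""
    px, py = vertex[0] - origin[0], vertex[1] - origin[1]
    kx, ky = ref_vec
    v = px * kx + py * ky    # distance along the flow
    u = px * ky - py * kx    # distance across the flow
    return (u, v)

# Flow pointing up the +y axis: v grows downstream, u grows across the river.
uv = texture_coords((1.0, 2.0), (0.0, 0.0), (0.0, 1.0))
```

Because each polygon row uses its own reference vector, the texture bends with the river's path, which is what lets the flow look natural on a curved channel.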
- the reference vectors K1, K2, ... correspond to the polygons PG1, PG2, ..., respectively.
- the reference vector is a measure of the river flow in that polygon.
- the reference vector may be determined from the shape of the polygon, for example, a line perpendicular to the side or a line connecting the midpoints of the sides.
- the reference vector may be determined for each polygon based on the curve indicating the flow.
- the conversion formula between the texture coordinates and the polygon is not limited to the above example, and another conversion formula may be used.
- the conversion may be performed based on the angle between the polygon side and the reference vector.
- FIG. 15 is a flowchart for explaining the screen movement processing of the embodiment.
- FIG. 16 is an explanatory diagram of the screen moving process.
- FIG. 17 is an explanatory diagram of the screen movement processing.
- the display is performed using interactive computer graphics.
- when the Z-buffer method is used for hidden surface removal in this display, the largest object within the display range in the Z direction is prepared. When the Z-sort method is used, an object sufficiently large compared with the other objects is prepared.
- in FIG. 16, the largest object within the display range G of the viewpoint 69 is the object 71, and this is the object to be moved.
- the object 71 is a background such as a mountain, a celestial body, or the sky.
- the symbols 72, 73, 74 in the figure represent various objects in front of the background.
- the CPU 101 executes the game development processing (step 601); if there is no viewpoint movement (step 602; NO), it returns to the processing of step 601. At this time, the state is, for example, as shown in FIG. 17(a).
- if the viewpoint has moved, the CPU 101 retrieves the previously captured viewpoint coordinates from the predetermined area of the RAM 103, writes the current coordinates to that area, and subtracts the previous coordinates from those taken in this time (step 604).
- if the moving distance is small, the process returns to step 601, since the appearance of the distant object 71 hardly changes.
- the threshold value for this determination is set, based on the distance to the object 71 and the like, within a range that does not give the player an uncomfortable feeling.
- if the moving distance exceeds the threshold, the coordinates of the predetermined object 71 are read (step 606).
- the CPU 101 changes the coordinates of the predetermined object 71 based on the calculation result (step 607).
- that is, the object 71 is moved away by the same distance that the viewpoint 69 has approached it.
- the object 71 and the object 72 may separate from each other and become discontinuous, but since the object 71 is a very distant view, hardly any sense of discomfort arises.
- the CPU 101 stores the coordinates again in a predetermined area of the RAM 103 (step 608). After that, the CPU 101 shifts to the processing of step 601 again.
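Steps 604 to 608 amount to shifting the distant object by the viewpoint's own displacement, so that the viewpoint never gains on it. A minimal sketch, with the function name and coordinate layout as illustrative assumptions:

```python
def move_background(viewpoint_prev, viewpoint_now, background_pos):
    """Shift the distant object by the viewpoint's displacement so the
    distance between them never shrinks: the object then appears to
    sit at infinity, like a far mountain range or the sky."""
    dx = viewpoint_now[0] - viewpoint_prev[0]
    dz = viewpoint_now[2] - viewpoint_prev[2]
    return (background_pos[0] + dx, background_pos[1], background_pos[2] + dz)

# The viewpoint advances 5 units toward the mountain; the mountain retreats 5.
new_bg = move_background((0, 0, 0), (0, 0, 5), (0, 10, 100))
```

Only the horizontal components are compensated here; whether the vertical axis should also be compensated depends on the scene, and the patent's threshold check would sit in front of this call.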
- a sphere 76 of the largest size in the display range, expressing the starry sky and the like, is formed, and the viewpoint 69 is positioned at the center of the sphere 76.
- the center may be moved with the movement of the viewpoint.
- the position of the sphere 76 changes with the movement of the viewpoint 69.
- the radius of the sphere 76 is the largest in the display range in the Z direction when the Z buffer method is used.
- when the Z-sort method is used, an object sufficiently large compared with the other objects is prepared, and its position is moved by the same distance as the movement of the viewpoint, so that a celestial sphere at infinity can be represented.
- a cylindrical infinity background image may be used instead of the sphere 76.
- although the position of the viewpoint 69 relative to the sphere does not change, looking up or looking back always shows the appropriate background, such as the starry sky, corresponding to that direction. Conventionally, a background screen was often simply attached to the display screen to simplify processing, but in that case the display did not change regardless of the viewing direction and looked unnatural.
- FIG. 18 is a flowchart illustrating the operation of limiting the number of polygons according to the embodiment.
- FIG. 19 is an explanatory diagram of the same embodiment.
- FIG. 20 is an explanatory diagram of the same embodiment.
- limit values are set respectively for the polygons constituting the characters 24, 25, etc., the polygons constituting the enemies, etc., and the polygons constituting the background, etc.
- a buffer 81 holds the limit number R1 of the polygons constituting the characters 24, 25, etc.; a buffer 82 holds the limit number R2 of the polygons constituting the enemies, etc.; and a buffer 83 holds the limit number R3 of the polygons constituting the background, etc.
- the buffers 81 to 83 may be provided in the RAM 103, for example.
- These limit numbers R1 to R3 are stored in the ROM 102 and are transferred to the buffers 81 to 83 when operation starts.
- the CPU 101 executes the game development processing (step 701). The CPU 101 then reads the polygon limit number R1 from the buffer 81 (step 702), allocates it to the required characters (step 703), and creates the characters so as not to exceed the allocated number of polygons (step 704).
- next, the CPU 101 reads the polygon limit number R3 from the buffer 83 (step 705), allocates it to the required background (step 706), and creates the background so as not to exceed the allocated number of polygons (step 707).
- the CPU 101 then determines whether or not an enemy appears (step 708). If no enemy appears (step 708; NO), the CPU 101 returns to the processing of step 701.
- When the processing of steps 701 to 708 has been executed, the display screen becomes, for example, as shown in FIG. 20(a).
- FIG. 20(a) shows the truck 20, the characters 24 and 25 riding on it, the track 21, the ground 22, and the background 23.
- if an enemy appears (step 708; YES), the CPU 101 proceeds to the processing of step 709.
- the CPU 101 reads the polygon limit number R2 from the buffer 82 (step 709) and allocates it to the required enemies (step 710).
- the CPU 101 then creates the enemies so as not to exceed the allocated number of polygons (step 711).
- After step 711, the display screen becomes, for example, as shown in FIG. 20(b).
- FIG. 20(b) shows the truck 20 and the characters 24 and 25 on it, the track 21, the ground 22, the background 23, and so on, together with the attacking enemy 27.
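The per-class limits R1 to R3 can be sketched as a simple budget divided among the objects of each class, so the scene total never exceeds R1 + R2 + R3. The even-division policy and the numeric values below are illustrative assumptions; the patent only specifies that each class has its own limit.

```python
def allocate_polygons(limits, n_characters, n_enemies, n_backgrounds):
    """Divide each class's polygon limit (R1, R2, R3) evenly among the
    objects of that class.  Dividing evenly is one plausible policy;
    weighting by on-screen importance would be another."""
    r1, r2, r3 = limits
    return {
        "per_character": r1 // max(n_characters, 1),
        "per_enemy": r2 // max(n_enemies, 1),
        "per_background": r3 // max(n_backgrounds, 1),
    }

# Hypothetical limits R1=600, R2=900, R3=1500 with 2 characters,
# 3 enemies, and 1 background object on screen.
budget = allocate_polygons((600, 900, 1500), 2, 3, 1)
```

Because R2 is only drawn on when an enemy actually appears (step 708), the enemy budget costs nothing in scenes like FIG. 20(a).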
- FIG. 21 is a flowchart for explaining the viewing angle changing operation according to the embodiment.
- the CPU 101 executes the game development processing based on the program from the ROM 102 (step 801).
- the viewing angle changes with the development of the game and is predetermined; the CPU 101 therefore determines the viewing angle from the state of the game (step 802).
- if the viewing angle is large, for example when the truck 20 is running through a wilderness during game processing (step 802; large), the CPU 101 sets the viewing angle to a large value (step 803).
- if the viewing angle is medium, for example when the truck is running between mountains or buildings (step 802; medium), the CPU 101 sets the viewing angle to a medium value (step 804).
- if the viewing angle is small, for example when the truck is running through a tunnel (step 802; small), the CPU 101 sets the viewing angle to a small value (step 805). In the next step, the CPU 101 reads the set viewing angle and executes the viewing-angle processing (step 806).
- in a tunnel, the viewing angle is significantly reduced and the view far ahead is displayed.
- between mountains or buildings, the viewing angle is set to a medium level.
- the viewing angle increases when running on a plain or the like. Since such changes in viewing angle match the visual characteristics of humans, a realistic game screen can be provided.
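The three-way selection in steps 802 to 805 can be sketched as a lookup from environment to field of view. The degree values and environment names below are illustrative assumptions; the patent only distinguishes large, medium, and small viewing angles.

```python
def viewing_angle(environment):
    """Pick the field-of-view class from the surroundings: wide in
    open terrain, medium among mountains and buildings, narrow
    inside a tunnel."""
    table = {"wilderness": 90, "mountains": 60, "tunnel": 30}
    return table.get(environment, 60)  # default to medium

wide = viewing_angle("wilderness")
narrow = viewing_angle("tunnel")
```

The chosen angle would then feed the perspective processing of step 806, narrowing or widening what the camera takes in each frame.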
Description
Claims
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/765,813 US5830066A (en) | 1995-05-19 | 1996-05-20 | Image processing device, image processing method, and game device and storage medium using the same |
EP96915197A EP0782104B1 (en) | 1995-05-19 | 1996-05-20 | Image processing device, image processing method, storage medium and computer program |
KR1019970700347A KR100276598B1 (ko) | 1995-05-19 | 1996-05-20 | 화상처리장치, 화상처리방법 및 이것을 이용한 게임장치 |
JP53470996A JP3859084B2 (ja) | 1995-05-19 | 1996-05-20 | 画像処理装置、画像処理方法及びこれを用いたゲーム装置並びに記憶媒体 |
DE69631949T DE69631949D1 (de) | 1995-05-19 | 1996-05-20 | Bildverarbeitungsgerät, bildverarbeitungsverfahren, speichermedium, und computerprogramm |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP7/145597 | 1995-05-19 | ||
JP14559795 | 1995-05-19 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO1996036945A1 true WO1996036945A1 (fr) | 1996-11-21 |
Family
ID=15388749
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP1996/001331 WO1996036945A1 (fr) | 1995-05-19 | 1996-05-20 | Dispositif de traitement d'une image, procede de traitement d'une image dispositif de jeu utilisant ces derniers et support de memoire |
Country Status (7)
Country | Link |
---|---|
US (4) | US5830066A (ja) |
EP (1) | EP0782104B1 (ja) |
JP (1) | JP3859084B2 (ja) |
KR (1) | KR100276598B1 (ja) |
CN (3) | CN100501768C (ja) |
DE (1) | DE69631949D1 (ja) |
WO (1) | WO1996036945A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH10211360A (ja) * | 1997-01-30 | 1998-08-11 | Sega Enterp Ltd | ゲーム装置及びゲーム装置における画面表示方法 |
JPH1186031A (ja) * | 1997-09-11 | 1999-03-30 | Sega Enterp Ltd | 画像処理装置及び画像処理方法並びに媒体 |
JP2016033736A (ja) * | 2014-07-31 | 2016-03-10 | 株式会社コロプラ | 複数の画像間の画面表示をスムーズに移行させるコンピュータ・プログラム |
Families Citing this family (80)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE69625523T2 (de) | 1995-05-10 | 2003-07-10 | Nintendo Co Ltd | Steuergerät mit analogem Joystick |
WO1996036945A1 (fr) * | 1995-05-19 | 1996-11-21 | Sega Enterprises, Ltd. | Dispositif de traitement d'une image, procede de traitement d'une image dispositif de jeu utilisant ces derniers et support de memoire |
JPH09153146A (ja) * | 1995-09-28 | 1997-06-10 | Toshiba Corp | 仮想空間表示方法 |
US6007428A (en) | 1995-10-09 | 1999-12-28 | Nintendo Co., Ltd. | Operation controlling device and video processing system used therewith |
JP3524247B2 (ja) | 1995-10-09 | 2004-05-10 | 任天堂株式会社 | ゲーム機およびそれを用いたゲーム機システム |
JP3544268B2 (ja) | 1995-10-09 | 2004-07-21 | 任天堂株式会社 | 三次元画像処理装置およびそれを用いた画像処理方法 |
CN1149465C (zh) | 1995-10-09 | 2004-05-12 | 任天堂株式会社 | 三维图像显示游戏机系统和三维图像处理方法 |
GB2313432B (en) | 1995-11-10 | 2000-06-21 | Nintendo Co Ltd | Joystick device |
US6022274A (en) | 1995-11-22 | 2000-02-08 | Nintendo Co., Ltd. | Video game system using memory module |
US6155926A (en) | 1995-11-22 | 2000-12-05 | Nintendo Co., Ltd. | Video game system and method with enhanced three-dimensional character and background control |
US6267673B1 (en) | 1996-09-20 | 2001-07-31 | Nintendo Co., Ltd. | Video game system with state of next world dependent upon manner of entry from previous world via a portal |
ES2332271T3 (es) * | 1996-06-05 | 2010-02-01 | Kabushiki Kaisha Sega Doing Business As Sega Corporation | Dispositivo de procesamiento de graficos, metodo de procesamiento de graficos, maquina de juegos y medio de almacenamiento. |
US6244959B1 (en) | 1996-09-24 | 2001-06-12 | Nintendo Co., Ltd. | Three-dimensional image processing system with enhanced character control |
US6139434A (en) | 1996-09-24 | 2000-10-31 | Nintendo Co., Ltd. | Three-dimensional image processing apparatus with enhanced automatic and user point of view control |
JP3709509B2 (ja) * | 1996-12-04 | 2005-10-26 | 株式会社セガ | ゲーム装置 |
JPH1186038A (ja) * | 1997-03-03 | 1999-03-30 | Sega Enterp Ltd | 画像処理装置、画像処理方法及び媒体並びにゲーム機 |
JP3767094B2 (ja) * | 1997-06-17 | 2006-04-19 | 株式会社セガ | ビデオゲーム装置における遊戯者により操作される複数キャラクタの表示制御方法 |
JP3655438B2 (ja) | 1997-07-17 | 2005-06-02 | 任天堂株式会社 | ビデオゲームシステム |
US6424353B2 (en) * | 1997-09-11 | 2002-07-23 | Sega Enterprises, Ltd. | Computer game apparatus |
JPH11128533A (ja) | 1997-10-30 | 1999-05-18 | Nintendo Co Ltd | ビデオゲーム装置およびその記憶媒体 |
DE69838734T2 (de) * | 1997-11-25 | 2008-10-30 | Kabushiki Kaisha Sega Doing Business As Sega Corp. | Bilderzeugungsgerät |
JP2992499B2 (ja) * | 1998-01-05 | 1999-12-20 | コナミ株式会社 | 画像処理方法及び装置、記録媒体 |
JPH11207029A (ja) * | 1998-01-28 | 1999-08-03 | Konami Co Ltd | ビデオゲーム装置、ビデオゲームにおける画面表示方法及び画面表示プログラムが格納された可読記録媒体 |
JP3342393B2 (ja) * | 1998-03-19 | 2002-11-05 | 株式会社コナミコンピュータエンタテインメントジャパン | ビデオゲーム装置、コンピュータ読み取り可能な記録媒体 |
JP3824788B2 (ja) * | 1998-09-28 | 2006-09-20 | 株式会社コナミデジタルエンタテインメント | ビデオゲーム装置、ビデオゲームにおけるゲーム画面の視点切替方法及びビデオゲームにおけるゲーム画面の視点切替プログラムが記録されたコンピュータ読み取り可能な記録媒体 |
JP3005581B1 (ja) * | 1999-03-16 | 2000-01-31 | コナミ株式会社 | 画像作成装置、画像作成方法、画像作成プログラムが記録された可読記録媒体およびビデオゲ―ム装置 |
JP3695217B2 (ja) * | 1999-04-30 | 2005-09-14 | オムロン株式会社 | 画像処理装置及び画像入力装置 |
JP4387511B2 (ja) * | 1999-09-09 | 2009-12-16 | 株式会社バンダイナムコゲームス | ゲーム装置および情報記憶媒体 |
JP2001118049A (ja) * | 1999-10-14 | 2001-04-27 | Sega Corp | マトリクス演算器を有する画像処理装置 |
US6556206B1 (en) * | 1999-12-09 | 2003-04-29 | Siemens Corporate Research, Inc. | Automated viewpoint selection for 3D scenes |
US6989832B2 (en) | 2000-01-21 | 2006-01-24 | Sony Computer Entertainment Inc. | Entertainment apparatus, storage medium and object display method |
JP3350655B2 (ja) * | 2000-01-25 | 2002-11-25 | 株式会社ナムコ | ゲームシステム及び情報記憶媒体 |
US6724385B2 (en) | 2000-03-08 | 2004-04-20 | Sony Computer Entertainment Inc. | Method of replaying game, recording medium, program, and entertainment system |
JP3310257B2 (ja) * | 2000-03-24 | 2002-08-05 | 株式会社コナミコンピュータエンタテインメントジャパン | ゲームシステム及びゲーム用プログラムを記録したコンピュータ読み取り可能な記録媒体 |
JP3625172B2 (ja) | 2000-04-26 | 2005-03-02 | コナミ株式会社 | 画像作成装置、画像作成方法、画像作成プログラムが記録されたコンピュータ読み取り可能な記録媒体およびビデオゲーム装置 |
JP2002095863A (ja) * | 2000-07-03 | 2002-04-02 | Sony Computer Entertainment Inc | プログラム実行システム、プログラム実行装置、記録媒体及びプログラム、並びに視点を切り換える方法及び照準を切り換える方法 |
US7244181B2 (en) * | 2000-11-14 | 2007-07-17 | Netamin Communication Corp. | Multi-player game employing dynamic re-sequencing |
US6908389B1 (en) * | 2001-03-07 | 2005-06-21 | Nokia Corporation | Predefined messages for wireless multiplayer gaming |
JP4099434B2 (ja) * | 2003-07-08 | 2008-06-11 | 任天堂株式会社 | 画像生成プログラム及びゲーム装置 |
JP4245433B2 (ja) * | 2003-07-23 | 2009-03-25 | パナソニック株式会社 | 動画作成装置および動画作成方法 |
US7210323B2 (en) * | 2003-12-16 | 2007-05-01 | General Motors Corporation | Binder apparatus for sheet forming |
US7364091B2 (en) | 2003-12-19 | 2008-04-29 | Scientific Games International, Inc. | Embedded optical signatures in documents |
AU2005292264B2 (en) * | 2004-10-01 | 2009-06-11 | Wms Gaming Inc. | System and method for 3D image manipulation in gaming machines |
CA2585964A1 (en) | 2004-10-28 | 2006-05-11 | Scientific Games Royalty Corp. | Lottery game played on a geometric figure using indicia with variable point values |
JP4515221B2 (ja) * | 2004-10-29 | 2010-07-28 | 任天堂株式会社 | ゲームプログラム |
JP3880008B2 (ja) * | 2004-12-21 | 2007-02-14 | 株式会社光栄 | キャラクタ集団移動制御プログラム、記憶媒体及びゲーム装置 |
US7662038B2 (en) | 2005-01-07 | 2010-02-16 | Scientific Games International, Inc. | Multi-matrix lottery |
KR20070108171A (ko) | 2005-01-07 | 2007-11-08 | 사이언티픽 게임스 인터내셔널, 아이엔씨. | 추억의 게임 테마를 사용하는 추첨 게임 |
US7824257B2 (en) | 2005-01-11 | 2010-11-02 | Scientific Games International, Inc. | On-line lottery game in which supplemental lottery-selected indicia are available for purchase |
US8262453B2 (en) | 2005-02-09 | 2012-09-11 | Scientific Games International, Inc. | Combination lottery and raffle game |
US7874902B2 (en) | 2005-03-23 | 2011-01-25 | Scientific Games International. Inc. | Computer-implemented simulated card game |
EP1874418A1 (en) | 2005-04-27 | 2008-01-09 | Scientific Games International, Inc. | Game apparatus |
JP4312737B2 (ja) * | 2005-05-13 | 2009-08-12 | 任天堂株式会社 | ゲームプログラムおよびゲーム装置 |
US7654529B2 (en) | 2005-05-17 | 2010-02-02 | Scientific Games International, Inc. | Combination scratch ticket and on-line game ticket |
JP4711223B2 (ja) * | 2005-08-02 | 2011-06-29 | 株式会社セガ | 画像生成プログラム、記憶媒体、画像処理方法及び画像処理装置 |
US20070033773A1 (en) * | 2005-08-11 | 2007-02-15 | Red Lan | Releasable fastening device for fastening an article on a support frame |
CN100375429C (zh) * | 2005-11-29 | 2008-03-12 | 珠海市西山居软件有限公司 | 通过浏览器实时观看玩家游戏的方法 |
CN100375430C (zh) * | 2005-11-29 | 2008-03-12 | 珠海市西山居软件有限公司 | 一种游戏录像回放方法及系统 |
US20070265043A1 (en) * | 2006-04-12 | 2007-11-15 | Wang Andy Y | Team-based networked video gaming and automatic event management |
US9327191B2 (en) * | 2006-05-08 | 2016-05-03 | Nintendo Co., Ltd. | Method and apparatus for enhanced virtual camera control within 3D video games or other computer graphics presentations providing intelligent automatic 3D-assist for third person viewpoints |
JP5330640B2 (ja) * | 2006-05-09 | 2013-10-30 | 任天堂株式会社 | ゲームプログラム、ゲーム装置、ゲームシステム、およびゲーム処理方法 |
US9666031B2 (en) * | 2006-06-12 | 2017-05-30 | Bally Gaming, Inc. | Wagering machines having three dimensional game segments |
US8277316B2 (en) | 2006-09-14 | 2012-10-02 | Nintendo Co., Ltd. | Method and apparatus for using a common pointing input to control 3D viewpoint and object targeting |
US8201096B2 (en) * | 2007-06-09 | 2012-06-12 | Apple Inc. | Browsing or searching user interfaces and other aspects |
US8834245B2 (en) * | 2007-08-17 | 2014-09-16 | Nintendo Co., Ltd. | System and method for lock on target tracking with free targeting capability |
JP4489800B2 (ja) * | 2007-08-30 | 2010-06-23 | 株式会社スクウェア・エニックス | 画像生成装置及び方法、並びにプログラム及び記録媒体 |
US8081186B2 (en) * | 2007-11-16 | 2011-12-20 | Microsoft Corporation | Spatial exploration field of view preview mechanism |
US8584044B2 (en) * | 2007-11-16 | 2013-11-12 | Microsoft Corporation | Localized thumbnail preview of related content during spatial browsing |
US20090132967A1 (en) * | 2007-11-16 | 2009-05-21 | Microsoft Corporation | Linked-media narrative learning system |
US9098647B2 (en) | 2008-03-10 | 2015-08-04 | Apple Inc. | Dynamic viewing of a three dimensional space |
US8089479B2 (en) * | 2008-04-11 | 2012-01-03 | Apple Inc. | Directing camera behavior in 3-D imaging system |
US9619917B2 (en) | 2008-10-03 | 2017-04-11 | Apple Inc. | Depth of field for a camera in a media-editing application |
US8577518B2 (en) * | 2009-05-27 | 2013-11-05 | American Aerospace Advisors, Inc. | Airborne right of way autonomous imager |
US8460081B2 (en) | 2010-05-14 | 2013-06-11 | Scientific Games International, Inc. | Grid-based multi-lottery game and associated method |
US8808080B2 (en) | 2010-05-14 | 2014-08-19 | Scientific Games International, Inc. | Grid-based lottery game and associated method |
JP5689953B2 (ja) * | 2010-05-25 | 2015-03-25 | ジョン、ジェ ウンJEON, Jae Woong | アニメーション著作システムおよびアニメーション著作方法 |
JP5887160B2 (ja) * | 2012-02-15 | 2016-03-16 | 任天堂株式会社 | 画像処理システム、ゲームシステム、画像処理方法、画像処理装置及びコンピュータプログラム |
CA3122091A1 (en) | 2018-12-05 | 2020-06-11 | Caesars Enterprise Services, Llc | Video slot gaming screen capture and analysis |
JP7233399B2 (ja) * | 2020-06-23 | 2023-03-06 | 任天堂株式会社 | ゲームプログラム、ゲーム装置、ゲームシステム、およびゲーム処理方法 |
US11511190B2 (en) * | 2021-05-03 | 2022-11-29 | Sony Interactive Entertainment Inc. | Merge computer simulation sky box with game world |
Citations (7)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH02216184A (ja) * | 1989-02-17 | 1990-08-29 | Mitsubishi Precision Co Ltd | 模擬視界発生装置 |
JPH04127279A (ja) * | 1990-06-11 | 1992-04-28 | Hitachi Ltd | 物体の運動経路生成装置および方法 |
JPH05282428A (ja) * | 1992-04-03 | 1993-10-29 | Fujitsu Ltd | 3次元コンピュータグラフィクス用図形データ作成方法 |
JPH078632A (ja) * | 1993-03-26 | 1995-01-13 | Namco Ltd | 3次元ゲーム装置 |
JPH0724142A (ja) * | 1993-07-13 | 1995-01-27 | Sega Enterp Ltd | ゲーム装置 |
JPH07110873A (ja) * | 1993-10-13 | 1995-04-25 | Canon Inc | 画像処理装置 |
JPH07116343A (ja) * | 1992-06-12 | 1995-05-09 | Sega Enterp Ltd | 電子遊戯機器 |
Family Cites Families (30)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US557960A (en) * | 1896-04-07 | brintnell | ||
US467541A (en) * | 1892-01-26 | Counter balance for stove or range doors | ||
GB2133257B (en) * | 1982-12-22 | 1987-07-29 | Ricoh Kk | T v game system |
AU556546B2 (en) * | 1984-03-28 | 1986-11-06 | Bela Bogar | Spacers for concrete reinforcing elements |
US4672541A (en) * | 1984-05-31 | 1987-06-09 | Coleco Industries, Inc. | Video game with interactive enlarged play action inserts |
US5191642A (en) * | 1987-04-09 | 1993-03-02 | General Electric Company | Method for efficiently allocating computer resource for real time image generation |
JP2725062B2 (ja) * | 1989-08-01 | 1998-03-09 | 株式会社リコー | 画像処理装置 |
US5175616A (en) * | 1989-08-04 | 1992-12-29 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of National Defence Of Canada | Stereoscopic video-graphic coordinate specification system |
WO1992009965A1 (en) | 1990-11-30 | 1992-06-11 | Cambridge Animation Systems Limited | Animation |
US5261820A (en) * | 1990-12-21 | 1993-11-16 | Dynamix, Inc. | Computer simulation playback method and simulation |
US5341468A (en) * | 1991-01-28 | 1994-08-23 | Ricoh Company, Ltd. | Image processor |
JP2983728B2 (ja) * | 1991-01-30 | 1999-11-29 | 株式会社リコー | クリッピング処理装置 |
FR2676573B1 (fr) * | 1991-05-14 | 1997-01-24 | Thomson Csf | Procede de visualisation en temps reel d'ecoulements de fluides. |
US5384908A (en) * | 1991-12-30 | 1995-01-24 | Xerox Corporation | Avoiding oscillation in interactive animation |
US5366376A (en) * | 1992-05-22 | 1994-11-22 | Atari Games Corporation | Driver training system and method with performance data feedback |
US5830065A (en) | 1992-05-22 | 1998-11-03 | Sitrick; David H. | User image integration into audiovisual presentation system and methodology |
JP2760253B2 (ja) * | 1992-07-14 | 1998-05-28 | 住友電気工業株式会社 | 道路の動画像作成方法及びこの方法を適用した車載ナビゲーション装置 |
JP2807608B2 (ja) * | 1992-12-29 | 1998-10-08 | 株式会社ナムコ | ソーティング処理装置、これを用いた画像合成装置及びソーティング処理方法 |
JPH0817853B2 (ja) * | 1993-02-19 | 1996-02-28 | 日本電気株式会社 | 対戦シミュレーションゲームの画面表示方式 |
US5555354A (en) * | 1993-03-23 | 1996-09-10 | Silicon Graphics Inc. | Method and apparatus for navigation within three-dimensional information landscape |
US5598187A (en) * | 1993-05-13 | 1997-01-28 | Kabushiki Kaisha Toshiba | Spatial motion pattern input system and input method |
US5577960A (en) * | 1993-06-10 | 1996-11-26 | Namco, Ltd. | Image synthesizing system and game playing apparatus using the same |
US5616079A (en) * | 1993-06-16 | 1997-04-01 | Namco Ltd. | Three-dimensional games machine |
JPH078623A (ja) | 1993-06-23 | 1995-01-13 | Sophia Co Ltd | 遊技機の集中管理装置 |
JP3311830B2 (ja) * | 1993-09-20 | 2002-08-05 | 株式会社東芝 | 3次元動画作成装置 |
JP3704734B2 (ja) | 1994-01-26 | 2005-10-12 | 株式会社日立製作所 | テクスチャマッピング方法及び装置 |
TW284870B (ja) * | 1994-01-26 | 1996-09-01 | Hitachi Ltd | |
US6010405A (en) * | 1994-12-30 | 2000-01-04 | Sega Enterprises, Ltd. | Videogame system for creating simulated comic book game |
JP3442183B2 (ja) * | 1995-02-28 | 2003-09-02 | 株式会社ナムコ | 3次元ゲーム装置及び画像合成方法 |
WO1996036945A1 (fr) * | 1995-05-19 | 1996-11-21 | Sega Enterprises, Ltd. | Dispositif de traitement d'une image, procede de traitement d'une image dispositif de jeu utilisant ces derniers et support de memoire |
- 1996
- 1996-05-20 WO PCT/JP1996/001331 patent/WO1996036945A1/ja active IP Right Grant
- 1996-05-20 CN CNB2006100827128A patent/CN100501768C/zh not_active Expired - Fee Related
- 1996-05-20 US US08/765,813 patent/US5830066A/en not_active Expired - Fee Related
- 1996-05-20 JP JP53470996A patent/JP3859084B2/ja not_active Expired - Fee Related
- 1996-05-20 KR KR1019970700347A patent/KR100276598B1/ko not_active IP Right Cessation
- 1996-05-20 DE DE69631949T patent/DE69631949D1/de not_active Expired - Lifetime
- 1996-05-20 CN CN96190524A patent/CN1114891C/zh not_active Expired - Fee Related
- 1996-05-20 EP EP96915197A patent/EP0782104B1/en not_active Expired - Lifetime
- 1998
- 1998-08-14 US US09/134,446 patent/US6419582B1/en not_active Expired - Fee Related
- 2001
- 2001-12-28 CN CNB011386959A patent/CN1264122C/zh not_active Expired - Fee Related
- 2002
- 2002-05-31 US US10/157,997 patent/US7207884B2/en not_active Expired - Fee Related
- 2007
- 2007-02-16 US US11/707,059 patent/US20070155492A1/en not_active Abandoned
Non-Patent Citations (1)
Title |
---|
See also references of EP0782104A4 * |
Also Published As
Publication number | Publication date |
---|---|
CN1912918A (zh) | 2007-02-14 |
CN1264122C (zh) | 2006-07-12 |
CN1114891C (zh) | 2003-07-16 |
CN1154750A (zh) | 1997-07-16 |
US7207884B2 (en) | 2007-04-24 |
KR100276598B1 (ko) | 2000-12-15 |
CN100501768C (zh) | 2009-06-17 |
US5830066A (en) | 1998-11-03 |
US20020151361A1 (en) | 2002-10-17 |
EP0782104A4 (en) | 1998-12-02 |
CN1368708A (zh) | 2002-09-11 |
DE69631949D1 (de) | 2004-04-29 |
US6419582B1 (en) | 2002-07-16 |
EP0782104B1 (en) | 2004-03-24 |
JP3859084B2 (ja) | 2006-12-20 |
US20070155492A1 (en) | 2007-07-05 |
EP0782104A1 (en) | 1997-07-02 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
WO1996036945A1 (fr) | Dispositif de traitement d'une image, procede de traitement d'une image dispositif de jeu utilisant ces derniers et support de memoire | |
EP0844587B1 (en) | Image processor, image processing method, game machine and recording medium | |
JP3786132B2 (ja) | ゲーム画像処理プログラム及び記憶媒体 | |
US6952205B2 (en) | Recording medium storing 3D image processing program, the program, 3D image processing method and video game apparatus | |
US7277571B2 (en) | Effective image processing, apparatus and method in virtual three-dimensional space | |
JP2007180935A (ja) | 音声処理装置、音声処理方法、ならびに、プログラム | |
WO1999024937A9 (fr) | Dispositif et procede de generation d'images | |
JP3005581B1 (ja) | 画像作成装置、画像作成方法、画像作成プログラムが記録された可読記録媒体およびビデオゲ―ム装置 | |
JP3956318B2 (ja) | 画像処理装置、画像処理方法及びこれを用いたゲーム装置並びに記憶媒体 | |
JP2006061717A (ja) | ゲーム画像の表示制御プログラム及びゲーム装置並びに記憶媒体 | |
JP3583995B2 (ja) | エンタテインメント装置、記憶媒体およびオブジェクト表示方法 | |
US7129945B2 (en) | Image generation method, program and information storage medium | |
JP4577968B2 (ja) | ゲームシステム及び情報記憶媒体 | |
JP3937180B2 (ja) | 画像処理装置、画像処理方法及びこれを用いたゲーム装置並びに記憶媒体 | |
JP2001286675A (ja) | ゲーム装置、情報記憶媒体およびゲームシステム | |
JPH10230075A (ja) | ゲーム装置 | |
JP2004220626A (ja) | 画像処理装置 | |
JP2002042154A (ja) | ゲームシステム及び情報記憶媒体 | |
JP2000148986A (ja) | シミュレーション装置および方法 | |
JP2002260012A (ja) | 画像の生成方法及びそれに用いるプログラム |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase | Ref document number: 96190524.7 Country of ref document: CN |
AK | Designated states | Kind code of ref document: A1 Designated state(s): CN JP KR US |
AL | Designated countries for regional patents | Kind code of ref document: A1 Designated state(s): DE ES FR GB IT |
WWE | Wipo information: entry into national phase | Ref document number: 1019970700347 Country of ref document: KR |
WWE | Wipo information: entry into national phase | Ref document number: 1996915197 Country of ref document: EP |
121 | Ep: the epo has been informed by wipo that ep was designated in this application | ||
WWP | Wipo information: published in national office | Ref document number: 1996915197 Country of ref document: EP |
WWE | Wipo information: entry into national phase | Ref document number: 08765813 Country of ref document: US |
WWP | Wipo information: published in national office | Ref document number: 1019970700347 Country of ref document: KR |
WWG | Wipo information: grant in national office | Ref document number: 1019970700347 Country of ref document: KR |
WWG | Wipo information: grant in national office | Ref document number: 1996915197 Country of ref document: EP |