CN102884490A - Maintaining multiple views on a shared stable virtual space - Google Patents


Info

Publication number
CN102884490A
CN102884490A
Authority
CN
China
Prior art keywords
device
portable device
view
virtual
virtual scene
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN2011800226110A
Other languages
Chinese (zh)
Other versions
CN102884490B (en)
Inventor
G.维辛
T.米勒
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US12/947,290 external-priority patent/US8730156B2/en
Application filed by Sony Computer Entertainment America LLC filed Critical Sony Computer Entertainment America LLC
Priority to CN201610220654.4A priority Critical patent/CN105843396B/en
Publication of CN102884490A publication Critical patent/CN102884490A/en
Application granted granted Critical
Publication of CN102884490B publication Critical patent/CN102884490B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827Network arrangements for conference optimisation or adaptation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046Interoperability with other network applications or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/18Commands or executable codes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Networks & Wireless Communication (AREA)
  • Signal Processing (AREA)
  • Computer Hardware Design (AREA)
  • Processing Or Creating Images (AREA)
  • User Interface Of Digital Computer (AREA)

Abstract

Methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device are presented. In one method, a signal is received and the portable device is synchronized to make the location of the portable device a reference point in a three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position of the portable device in the 3D space with respect to the reference point and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device, with a viewing angle based on the current position of the portable device. Additionally, the created view is displayed on the portable device, and the view of the virtual scene changes as the portable device is moved by the user within the 3D space. In another method, multiple players share the virtual reality and interact with each other by viewing the objects in the virtual reality.

Description

Maintaining multiple views on a shared stable virtual space
Technical field
The present invention relates to methods, devices, and computer programs for controlling a view of a virtual scene with a portable device, and more specifically to methods, devices, and computer programs for enabling multi-player interaction within a virtual or augmented reality.
Background technology
Virtual reality (VR) is a computer-simulated environment, whether that environment is a simulation of the real world or of an imaginary world, in which users can interact with the virtual environment or with virtual artifacts (VA) through the use of standard input devices or specialized multidirectional input devices. The simulated environment can be similar to the real world, for example in simulations for pilot or combat training, or it can differ significantly from reality, as in VR games. Virtual reality is often used to describe a wide variety of applications commonly associated with immersive, highly visual, 3D environments. The development of Computer Aided Design (CAD) software, graphics hardware acceleration, head-mounted displays, database gloves, and miniaturization has helped popularize the concept.
Augmented reality (AR) provides a live view of a physical real-world environment whose elements are merged with (or augmented by) virtual computer-generated imagery to create a mixed reality. The augmentation is conventionally in real time and in semantic context with environmental elements, such as sports scores shown on television during a match. With the help of advanced AR technology (e.g., adding computer vision and object recognition), the information about the surrounding real world of the user becomes interactive and digitally usable.
The term Augmented Virtuality (AV) is also used in the virtual reality world and is similar to AR. Augmented Virtuality also refers to the merging of real-world objects into virtual worlds. As an intermediate state in the Virtuality Continuum, AV refers to predominantly virtual spaces, where physical elements (e.g., physical objects or people) are dynamically integrated into, and can interact with, the virtual world in real time. Unless otherwise specified, the term VR is used in this application as a generic term that also encompasses AR and AV.
VR games typically require a large amount of computing resources. Implementation of VR games in handheld devices is rare, and the existing games are rather simplistic with rudimentary VR effects. Additionally, multi-player AR games allow for the interaction of players in a virtual world, but the interactions are limited to objects manipulated by the players in the virtual world (e.g., vehicles, rackets, balls, etc.). The virtual world is computer generated and independent of the location of the players and the portable devices. The interactions among the players, and the relative positioning of the players with respect to their environment, are not taken into account when creating a "true" virtual reality experience.
It is in this context that embodiments of the invention arise.
Summary of the invention
Embodiments of the present invention provide methods, devices, and computer programs for controlling a view of a virtual scene with a portable device. It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device, or a method on a computer-readable medium. Several inventive embodiments of the present invention are described below.
In one embodiment of a method, a signal is received and the portable device is synchronized to make the location of the portable device a reference point in a three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position of the portable device in the 3D space with respect to the reference point, and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device, with a viewing angle based on the current position of the portable device. Additionally, the created view is displayed on the portable device, and the view of the virtual scene changes as the portable device is moved by the user within the 3D space. In another method, multiple players share the virtual reality and interact with each other by viewing the objects in the virtual reality.
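The core of this embodiment — synchronizing to a reference point and deriving a position-dependent view — can be sketched in a few lines. This is an illustrative sketch only: the function names, coordinate conventions, and the use of a yaw angle as the "viewing angle" are assumptions for illustration, not details taken from the patent.

```python
import math

def synchronize(device_position):
    """On the synchronization signal, the device's current location
    becomes the reference point (origin) of the 3D space."""
    return tuple(device_position)

def relative_position(device_position, reference_point):
    """Current position of the device expressed relative to the reference point."""
    return tuple(d - r for d, r in zip(device_position, reference_point))

def viewing_angle(rel_pos):
    """Yaw angle (degrees) of the device as seen from the reference point,
    a stand-in for the position-based viewing angle described above."""
    x, _, z = rel_pos
    return math.degrees(math.atan2(x, z))

# Synchronize: the device starts at (2, 1, 5) in room coordinates.
ref = synchronize((2.0, 1.0, 5.0))
# The user then walks the device to (3, 1, 6); the view is recomputed.
rel = relative_position((3.0, 1.0, 6.0), ref)
angle = viewing_angle(rel)  # 45 degrees off the reference axis
```

As the device moves, `relative_position` and `viewing_angle` are re-evaluated each frame, which is what makes the displayed view track the device's motion through the room.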
In another embodiment, a method is presented for sharing a virtual scene among devices. The method includes operations for synchronizing a first device to a reference point in a three-dimensional (3D) space, and for calculating the location of a second device relative to the location of the first device. Further, an operation of the method includes exchanging information between the first device and the second device so that the second device is synchronized to the reference point in the 3D space. The information includes the reference point and the locations of the first and second devices. Additionally, a method operation generates a virtual scene in the 3D space around the reference point. The virtual scene is common to both devices and changes simultaneously in both devices as the devices interact with the virtual scene. A view of the virtual scene is created as seen from the current location of the first device, with a viewing angle based on the current location of the portable device, and the created view is displayed on the first device. The method continues by changing the displayed view of the virtual scene as the portable device moves within the 3D space.
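The information exchange that synchronizes the second device can be sketched as a simple message carrying the reference point and both device locations, after which both devices operate on the same shared scene state. The message layout and field names below are hypothetical, chosen only to illustrate the claimed exchange.

```python
def make_sync_message(reference_point, first_device_pos, second_device_pos):
    """Information exchanged so the second device can adopt the same
    reference point: the reference point plus both device locations."""
    return {
        "reference_point": reference_point,
        "device_positions": {"first": first_device_pos, "second": second_device_pos},
    }

def apply_sync(message, scene):
    """The second device adopts the shared reference point; from here on,
    the scene dictionary is common state for both devices."""
    scene["origin"] = message["reference_point"]
    return scene

# Shared scene: an interaction on either device mutates the same state,
# so the change appears in both views simultaneously.
scene = {"origin": None, "elements": {"chess_board": {"pos": (0, 0, 1)}}}
msg = make_sync_message((0.0, 0.0, 0.0), (1.0, 0.0, 2.0), (-1.0, 0.0, 2.0))
scene = apply_sync(msg, scene)
scene["elements"]["chess_board"]["pos"] = (0, 0, 2)  # device one moves a piece
```

In a real system the shared state would be replicated over a network link rather than held in one dictionary, but the invariant is the same: one scene, two position-dependent views of it.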
In yet another embodiment, a method is executed for controlling a view of a virtual scene with a first device. The method includes an operation for synchronizing the first device to a first reference point in a first three-dimensional (3D) space. In another operation, a communication link is established between the first device and a second device. The second device is in a second 3D space outside the first 3D space, and is synchronized to a second reference point in the second 3D space. Additionally, a method operation generates a common virtual scene that includes virtual reality elements, where the common virtual scene is observable by both the first and second devices. The first device builds the common virtual scene around the first reference point, and the second device builds the common virtual scene around the second reference point. Both devices can interact with the virtual reality elements. Further, the method includes an operation for determining the current position of the first device in the first 3D space with respect to the first reference point. A view of the common virtual scene is created as seen from the current position of the first device, with a viewing angle based on the current position of the first device. The created view is displayed on the first device, and the displayed view of the common virtual scene changes as the first device moves within the first 3D space.
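Because each device anchors the common scene at its own local reference point, expressing every position relative to that local reference point yields a shared coordinate frame, even for devices in different rooms or cities. The following sketch, with illustrative names of my own choosing, shows how two devices in unrelated local spaces can land on the same point of the common scene.

```python
def to_common(local_pos, local_reference):
    """Express a device's local position relative to its own reference point;
    these reference-relative coordinates serve as the common scene frame."""
    return tuple(p - r for p, r in zip(local_pos, local_reference))

# Device one in a living room, device two in another city: each synchronized
# to its own reference point, yet both map to the same spot in the common scene.
pos_one = to_common((5.0, 1.0, 3.0), (4.0, 1.0, 2.0))
pos_two = to_common((-2.0, 0.0, 7.0), (-3.0, 0.0, 6.0))
```

Any virtual reality element placed at a given reference-relative coordinate therefore appears in the corresponding physical spot of each player's own room.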
In a further embodiment, method operations control a view of a virtual scene with a portable device. In one operation, the portable device is synchronized to a reference point in the three-dimensional (3D) space where the portable device is located. The portable device includes a front camera facing the front of the portable device and a rear camera facing the back of the portable device. Further, an operation is performed for generating a virtual scene in the 3D space around the reference point. The virtual scene includes virtual reality elements. The current position of the portable device in the 3D space with respect to the reference point is determined. In another method operation, a view of the virtual scene is created. The view captures a representation of the virtual scene as seen from the current eye position, in the 3D space, of the player holding the portable device, and the capture corresponds to what the player would see through a window into the virtual scene. The position of the window in the 3D space is equivalent to the position in the 3D space of the display of the portable device. The method also includes operations for showing the created view on the display, and for changing the displayed view of the virtual scene as the portable device or the player moves within the 3D space.
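The "display as a window" idea amounts to intersecting the ray from the player's eye through each virtual point with the plane of the display. The sketch below illustrates that geometry for a single point; the coordinate setup (display plane at z = 0, positive z behind the display) is an assumption for illustration, not the patent's convention.

```python
def project_through_window(eye, point, window_z=0.0):
    """Intersect the ray from the player's eye through a virtual point with
    the display plane (the 'window' at z = window_z), giving the on-screen
    location at which that point should be drawn."""
    ex, ey, ez = eye
    px, py, pz = point
    t = (window_z - ez) / (pz - ez)  # ray parameter where it crosses the window
    return (ex + t * (px - ex), ey + t * (py - ey))

# Eye 0.4 m in front of the display, virtual object 1 m behind it.
screen_xy = project_through_window(eye=(0.0, 0.0, -0.4), point=(0.5, 0.0, 1.0))
```

Tracking the eye (e.g., with the front camera) and the display (with the device's motion sensors) and re-running this projection each frame is what produces the window-like parallax as either the player or the device moves.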
In yet another embodiment, a portable device is used for interacting with an augmented reality. The portable device includes a position module, a virtual reality generator, a view generator, and a display. The position module determines the position of the portable device in the 3D space where the portable device is located, where the position of the portable device is set as the reference point in the 3D space when the portable device receives a signal for synchronization. The virtual reality generator creates a virtual scene in the 3D space around the reference point. The virtual scene includes virtual reality elements. Further, the view generator creates a view of the virtual scene, where the view represents the virtual scene as seen from the position of the portable device, with a viewing angle based on the position of the portable device. Additionally, the display is used for showing the view of the virtual scene. The scene shown on the display changes as the portable device moves within the 3D space.
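The division of labor among the modules named in this embodiment can be sketched as a small class, with sensor fusion and rendering abstracted away. The class and attribute names are my own illustrative choices, not identifiers from the patent.

```python
class PortableDevice:
    """Sketch of the modules named above: position module, virtual reality
    generator, view generator (display rendering is omitted)."""

    def __init__(self):
        self.reference_point = None
        self.position = (0.0, 0.0, 0.0)  # maintained by the position module
        self.scene = None

    def on_sync_signal(self):
        # Position module: the current location becomes the reference point.
        self.reference_point = self.position
        # Virtual reality generator: build the scene around that point.
        self.scene = {"origin": self.reference_point, "elements": []}

    def generate_view(self):
        # View generator: the scene as seen from the device's current
        # position relative to the reference point.
        rel = tuple(p - r for p, r in zip(self.position, self.reference_point))
        return {"from": rel, "scene": self.scene}

device = PortableDevice()
device.position = (1.0, 1.0, 1.0)
device.on_sync_signal()          # synchronize here
device.position = (2.0, 1.0, 1.0)  # device moves one meter along x
view = device.generate_view()
```

Each movement reported by the position module yields a new `generate_view` result, which the display module would then render, matching the scene-changes-with-motion behavior described above.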
In other embodiments, computer programs embedded in a computer-readable storage medium, when executed by one or more computers, perform the methods of the invention.
Other aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example the principles of the invention.
Brief Description of the Drawings
The invention may best be understood by reference to the following description taken in conjunction with the accompanying drawings, in which:
Figure 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment.
Figure 2 illustrates a virtual reality scene observed with the portable device.
Figure 3 illustrates an augmented reality chess game with a virtual board and the blending-in of a player's hand, according to one embodiment.
Figure 4 depicts a multi-player virtual reality game, according to one embodiment.
Figure 5 illustrates one embodiment of a calibration method for a multi-player environment.
Figure 6 illustrates how to play an interactive game over a network connection, according to one embodiment.
Figure 7 shows an interactive game that does not depend on the position of the portable device.
Figure 8 shows an interactive game where the view in the display depends on the position of the portable device, according to one embodiment.
Figure 9 illustrates how movement of the portable device has an effect on the display similar to moving a camera in the virtual space, according to one embodiment.
Figure 10 shows a two-dimensional representation of the change in the image shown in the display when the portable device is turned, according to one embodiment.
Figure 11 shows a portable device used to play a VR game, according to one embodiment.
Figures 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment.
Figures 13A-13B illustrate an augmented reality game played between users in remote locations, according to one embodiment.
Figures 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment.
Figure 15 illustrates an embodiment for implementing a viewing frustum on a portable device using front and rear cameras.
Figures 16A-16B illustrate the effect of changing the viewing frustum as the player moves, according to one embodiment.
Figure 17 illustrates how to use a virtual camera to span a view of a virtual scene, according to one embodiment.
Figures 18A-18H show a sequence of views illustrating the viewing frustum effect, according to one embodiment.
Figures 19A-19B illustrate embodiments for combining the viewing frustum effect with a camera effect.
Figure 20 shows the flow of an algorithm for controlling a view of a virtual scene with a portable device, in accordance with one embodiment of the invention.
Figure 21 illustrates the architecture of a device that may be used to implement embodiments of the invention.
Figure 22 is an exemplary illustration of scenes A through E, in which respective users A through E interact with game clients 1102 connected to server processing via the Internet, according to one embodiment of the invention.
Figure 23 illustrates an embodiment of an Information Service Provider architecture.
Detailed Description
The following embodiments describe methods, apparatus, and computer programs for controlling a view of a virtual scene in a virtual or augmented reality. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Figure 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment. Portable device 104 sits on a table in preparation for being synchronized to a reference point. User 102 has placed the portable device on a point that will serve as the reference point, or anchor, for building a virtual reality around the point. In the case shown in Figure 1, the portable device sits near the center of the table, and a virtual world is built around the center of the table once the portable device is synchronized. The portable device can be synchronized in a variety of ways, such as pushing a button on portable device 104, touching the touch-sensitive screen of the portable device, letting the device rest still for a period of time (e.g., five seconds), entering a voice command, etc.
Once the portable device receives the input to be synchronized, the position-tracking modules in the portable device are reset. The portable device can include a variety of position-tracking modules, such as an accelerometer, a magnetometer, a Global Positioning System (GPS) device, a camera, a depth camera, a compass, a gyroscope, etc., as discussed below with reference to Figure 21.
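The synchronization step above can be sketched in a few lines of Python. This is a minimal illustration, not the patent's implementation: it assumes a tracker object that accumulates displacement deltas from the sensor fusion, and treats the device's position at synchronization time as the origin of the 3D space.

```python
# Minimal sketch (hypothetical, not from the patent): synchronizing a
# portable device to a reference point resets its location tracking so
# that all later positions are expressed relative to that point.

class LocationTracker:
    def __init__(self):
        self.reference = None          # reference point in the 3D space
        self.offset = [0.0, 0.0, 0.0]  # displacement since synchronization

    def synchronize(self):
        """Reset tracking; the device's current spot becomes the
        reference point (the origin used by the virtual scene)."""
        self.reference = (0.0, 0.0, 0.0)
        self.offset = [0.0, 0.0, 0.0]

    def on_sensor_delta(self, dx, dy, dz):
        """Accumulate movement reported by accelerometer/camera fusion."""
        self.offset[0] += dx
        self.offset[1] += dy
        self.offset[2] += dz

    def position(self):
        """Current device position relative to the reference point."""
        return tuple(self.offset)


tracker = LocationTracker()
tracker.synchronize()                  # e.g., after a button press
tracker.on_sensor_delta(0.25, 0.0, 0.3)
tracker.on_sensor_delta(0.25, 0.0, 0.0)
print(tracker.position())              # (0.5, 0.0, 0.3)
```

In a real device the deltas would come from fusing the inertial and camera data listed above; here they are fed in by hand purely to show the bookkeeping.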
The portable device can be one of many types, such as a handheld portable gaming device, a cell phone, a tablet, a notebook computer, a netbook, a Personal Digital Assistant (PDA), etc. Embodiments of the invention are described with reference to a portable gaming device, but the principles can be applied to any portable electronic device with a display. The principles of the invention can also be applied to game controllers or other input devices connected to a computing device with a display.
Figure 2 illustrates a virtual reality scene observed with the portable device. After synchronizing device 104 with respect to reference point 106, the portable device starts displaying a view of virtual reality 108. The view in the display is created by simulating that a camera in the back of the portable device moves within the 3D space around reference point 106. Figure 2 depicts a virtual reality that includes a chess board. Portable device 104 is capable of detecting motion and determining its relative position with respect to reference point 106 as the device moves around. Position and location determination can be done with different methods and different levels of accuracy. For example, location can be detected by analyzing images captured with a camera, by data obtained from inertial systems, GPS, ultrasonic triangulation, WiFi communications, dead reckoning, etc., or by a combination thereof.
In one embodiment, the device keeps track of both the location in space of the portable device with respect to reference point 106 and the orientation in space of the portable device. The orientation is used to determine the viewing angle of the camera; that is, the portable device acts as a camera into the virtual scene. If the portable device is aimed towards the right, the view will turn to the right. In other words, the viewing angle is defined as a vector with origin at the center of the display (or another part of the device), and with a direction perpendicular to, and away from, the display. In another embodiment, only the location in space is tracked, and the view in the display is computed as if the camera were aimed from the location in space where the portable device is situated towards the reference point.
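The second embodiment above, in which only the device's location is tracked, reduces to a simple geometric computation. The following sketch (illustrative only; the function name and coordinate convention are assumptions, not from the patent) derives the camera's unit view vector by aiming from the device's position toward the reference point at the origin:

```python
# Illustrative sketch: with only the device location tracked, the camera
# is assumed to aim from that location toward the reference point.
import math


def aim_at_reference(device_pos, reference=(0.0, 0.0, 0.0)):
    """Return the unit view vector from the device toward the reference
    point (the anchor of the virtual scene)."""
    d = [r - p for p, r in zip(device_pos, reference)]
    norm = math.sqrt(sum(c * c for c in d))
    return tuple(c / norm for c in d)


# A device held one meter to the right of the reference point looks left:
print(aim_at_reference((1.0, 0.0, 0.0)))  # (-1.0, 0.0, 0.0)
```

In the first embodiment, by contrast, the view vector would come directly from the tracked device orientation rather than from this position-only heuristic.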
In some existing implementations, an augmented reality (AR) tag is placed on a table and used as a fiducial marker for generating the augmented reality. The AR tag may be an object or figure that is recognized when present in the captured image stream of the real environment. The AR tag serves as a fiducial marker that enables the determination of a location within the real environment. Embodiments of the invention eliminate the need for AR tags, because of the synchronization into the 3D space and the tracking of the location of the portable device. Additionally, the location information allows games in the portable device to deliver a realistic 3D virtual experience. Further, an array of networked portable devices can be used to create a shared virtual world, as described below with reference to Figure 4.
Figure 3 illustrates an augmented reality chess game with a virtual board and the blending-in of a player's hand, according to one embodiment. Images of the 3D space are used to create an augmented reality by combining real and virtual elements with respect to the calibrated point, and to provide optical motion-capture-like functionality. With a calibrated multi-camera technique, it is possible to determine the position of a hand or arm so that players can "reach" into the augmented reality scene and interact with game objects (chess pieces).
In one embodiment, two cameras in the back of a single device are used to determine the location of objects within the 3D space. A depth camera can also be used to obtain three-dimensional information. In other embodiments, cameras from multiple devices are used to determine the location of hand 306, as discussed below with reference to Figure 4. Holding portable device 302 in one hand, a player gazes through screen 304 and reaches into the game area generated in front of him, in order to touch 3D game objects and environments. Game play is completely tactile. It is possible for multiple players to reach into the game area simultaneously and to interact with game objects in intricate ways. For example, a player's hand 306 can interact with a virtual object by holding, pushing, pulling, grabbing, moving, smashing, squeezing, hitting, throwing, fighting, opening, closing, turning on or off, pressing a button, firing, eating, etc.
Each portable device synchronized to the game area adds another potential camera, along with relative motion tracking and ping data, making it possible to see a player's hands and fingers from multiple viewpoints in order to create an effective 3D-camera-based motion-capture field. Hands and the virtual space are blended together, where the virtual elements in the virtual space appear in the displayed view as if they were part of the 3D space. From a geometric perspective, the view of a virtual element changes in the same way that the view of a real element changes when the portable device moves within the 3D space.
Figure 4 depicts a multi-player virtual reality game, according to one embodiment. When calibrated position information is combined with high-speed connectivity and image-analysis data, position and game information can be exchanged between each of the devices that choose to participate in a shared-space game experience. This allows each player's system access to the camera images and position information of all the other players, in order to synchronize their calibrated positions together and share a virtual space (also referred to as a shared space).
After players 402A-402C have synchronized, or calibrated, their portable devices in reference to a point in the common 3D space (such as a point on a table), common virtual scene 404 is created. Each player has a view of virtual scene 404 as if the virtual space (a battle-board game in this case) were real, on the table in front of the players. The portable devices act as cameras, so that when a player moves the device around, the view changes. As a result, the actual view on each display is independent of the views in the other displays; each view is based only on the relative position of the portable device with respect to the virtual scene, which is anchored to an actual physical location in the 3D space.
By utilizing multiple cameras, accelerometers, and other mechanical devices to determine position, together with high-speed communication between the portable devices, it is possible to create a 3D motion-capture-like experience that allows players to see, and possibly touch, virtual game characters and environments in believable ways.
Shared-space 404 games utilize the devices' high-speed connectivity to exchange information among the devices participating in the shared-space game experience. The shared-space 404 play area is viewed through the devices by turning each device into a stable "magic window" that persists in the space between the devices. By using a combination of motion tracking, image analysis, and high persistence of information between the devices, the play area appears in a stable position even as the devices move around.
Figure 5 illustrates one embodiment of a calibration method for a multi-player environment. As previously described, the positional information gained from the device sensors (accelerometer, GPS, compass, depth camera, etc.) is transmitted to the other linked devices to enhance the data cooperatively maintained in the virtual space. To create a common shared space synchronized to common reference point 502, a first player 504A synchronizes her device with respect to reference point 502. The other players in the shared space then establish communication links with the first player to exchange position and game information. The relative positions can be obtained in different ways, such as using WiFi triangulation and ping tests to determine relative positions. In addition, visual information can be used to determine other locations, such as detecting the faces of other players and, from their faces, the possible locations of their gaming devices.
In one embodiment, audio triangulation is used to determine relative position, by means of ultrasonic communications and directional microphones. Multiple frequencies can be used to perform the audio triangulation. Once the devices have exchanged position information, wireless communication (such as ultrasonic, WiFi, or Bluetooth) is used to synchronize the remaining devices to reference point 502. After all the devices are calibrated, the devices have knowledge of reference point 502 and of their relative positions with respect to reference point 502. It should be appreciated that other methods can be used to calibrate multiple devices to a shared reference point. For example, all the devices may be calibrated to the same reference point by placing each device on the reference point in turn.
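The audio-triangulation idea can be illustrated numerically. The sketch below is not the patent's algorithm; it assumes the ultrasonic ping times have already been converted into distances from three devices at known positions, and solves the resulting circle equations for a device's 2D position by plain trilateration:

```python
# Hypothetical sketch: given distances d1, d2, d3 from three devices at
# known 2D positions p1, p2, p3 (e.g., derived from ultrasonic ping
# round-trip times), recover a device's position.


def trilaterate(p1, d1, p2, d2, p3, d3):
    """Subtract the circle equations pairwise to obtain a linear system
    A*[x, y] = b, then solve it by Cramer's rule."""
    ax, ay = p1
    bx, by = p2
    cx, cy = p3
    a11, a12 = 2 * (bx - ax), 2 * (by - ay)
    a21, a22 = 2 * (cx - bx), 2 * (cy - by)
    b1 = d1**2 - d2**2 + bx**2 - ax**2 + by**2 - ay**2
    b2 = d2**2 - d3**2 + cx**2 - bx**2 + cy**2 - by**2
    det = a11 * a22 - a12 * a21
    x = (b1 * a22 - b2 * a12) / det
    y = (a11 * b2 - a21 * b1) / det
    return x, y


# Device at (1, 1): sqrt(2) m from (0, 0), sqrt(10) m from both (4, 0)
# and (0, 4).
print(trilaterate((0, 0), 2**0.5, (4, 0), 10**0.5, (0, 4), 10**0.5))
```

With directional microphones and multiple frequencies the distances would be measured rather than given, but the position recovery step is the same in spirit.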
The virtual scene can be made even more realistic by using shadows and lighting determined by the light sources in the room. Through camera feedback, game environments and characters have scene lighting and shadows influenced by the real world. This means that a player's hand will cast a shadow over virtual characters or objects as the hand reaches into the virtual world to interact with virtual objects. Game-world shadows and lighting are adjusted by the real-world shadows and lighting to achieve the best possible effect.
Figure 6 illustrates how to play an interactive game over a network connection, according to one embodiment. Many types of games are possible within a shared space. For example, the portable device can be used as a paddle to play a game of ping-pong. The device is moved around as if it were a paddle hitting the ball. Players see the ball float between their screen and the opponent's screen. In a war game, the player looks through the portable device and aims a catapult at the enemy's ramparts. The player pulls the device backwards to load the catapult, and then presses a button to fire the catapult toward the enemy's castle.
Shared spaces can also be created when players are in different locations, as shown in Figure 6. The players have established a network connection in order to play the game. Each player synchronizes his device to a reference point in his own space, and a virtual reality, such as a ping-pong table, is created. The opponent is shown behind his end of the table, where the movement of the opponent's device is matched to the motions of the opponent's paddle. The game may also add an avatar holding the paddle, for an even more realistic game experience. During play, each device tracks its motion and position within its space. This information is shared with the other device, enabling the other device to place a virtual paddle that matches the device's motion. Other game information, such as the location and movement of the ball, is also shared.
Figure 7 shows an interactive game that does not depend on the position of the portable device. The game illustrated in Figure 7 shows the limitations of playing a game that is not synchronized with respect to reference point 706. A multi-player table ice hockey game is played simultaneously on two separate devices 704C and 702A. The game includes hockey rink 708, puck 714, and mallets 710 and 712. Each player controls a mallet by moving a finger on the display. The displays show the location of the puck and the mallets. However, the view on the display does not change when the portable device moves around, because there is no geographic synchronization with a reference point. For example, when player 702A moves to location 702B, the view remains the same, regardless of where the device is located.
To play the game, each portable device exchanges only information regarding the movement of the mallets and the location of the puck. There is no virtual experience tied to the 3D space.
Figure 8 shows an interactive game where the view in the display depends on the position of the portable device, according to one embodiment. Devices 802A and 802B have been calibrated to a common space, and a hockey rink has been created as a virtual element. The devices act as cameras into the space, and they do not need to show the complete playing surface. For example, when a device is pulled away from the reference point, a zoom-out effect takes place and a larger view of the rink becomes available. Further, if the device is tilted upward, the view shows the top of the rink, and if the device is tilted downward, the view in the device gets closer to the player's own goal. As seen in Figure 8, the view in each display is independent of the other and is based on the current perspective of the playing surface from each portable device.
Figure 9 illustrates how movement of the portable device has an effect on the display similar to moving a camera in the virtual space, according to one embodiment. Assuming that the portable device is aimed at car 902 from a point on a sphere, multiple views of the car can be obtained as the portable device moves along the sphere. For example, a view from the "north pole" shows the top of the car, and a view from the "south pole" shows the bottom of the car. Views of the sides, front, and rear of the car are also shown in Figure 9.
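The sphere analogy above is equivalent to an orbital camera. As an illustrative sketch (names and coordinate conventions assumed, not from the patent), the camera's position on a sphere of radius r around the car at the origin can be computed from a polar angle theta (measured from the "north pole") and an azimuth phi; the view direction is then simply the vector from this position toward the origin:

```python
# Illustrative orbital-camera sketch: the portable device's position maps
# to a point on a sphere around car 902 at the origin.
import math


def camera_on_sphere(r, theta, phi):
    """Camera position on the sphere; theta is the polar angle from the
    'north pole', phi the azimuth, y is 'up'."""
    x = r * math.sin(theta) * math.cos(phi)
    y = r * math.cos(theta)
    z = r * math.sin(theta) * math.sin(phi)
    return (x, y, z)


# "North pole" (theta = 0): camera directly above the car, seeing its top.
print(camera_on_sphere(5.0, 0.0, 0.0))  # (0.0, 5.0, 0.0)
```

Moving the device around the car sweeps theta and phi continuously, which is why every side of the car becomes visible in Figure 9.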
In one embodiment, the player can enter a command to change or flip the view of the virtual world. For example, in the case of the car, the player goes from seeing the front of the car to seeing the back of the car, as if the scene had rotated 180° about a vertical axis running through the reference point. This way, the player does not have to move around the room to obtain different viewing angles. Other inputs can produce different effects, such as a 90° turn, a scaling of the view (making the virtual world seem smaller or larger), a rotation with respect to the x, y, or z axis, etc. In another embodiment, a flip of the portable device (i.e., a 180° turn in the player's hands) causes the view of the virtual world to turn upside down.
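The "flip" command described above amounts to a 180° rotation of scene points about the vertical axis through the reference point. A minimal sketch (hypothetical; the function and axis convention are assumptions for illustration):

```python
# Illustrative sketch: a "flip" command rotates the scene 180 degrees
# about the vertical (y) axis through the reference point, so the player
# sees the back of the car without walking around it.


def flip_about_reference(point, ref=(0.0, 0.0, 0.0)):
    """Rotate a scene point 180 deg about the vertical axis through ref:
    x and z are mirrored across the reference point, y is unchanged."""
    x, y, z = point
    rx, ry, rz = ref
    return (2 * rx - x, y, 2 * rz - z)


# A point on the front bumper ends up where the rear bumper was:
print(flip_about_reference((0.0, 0.5, 2.0)))  # (0.0, 0.5, -2.0)
```

A 90° turn or an axis-specific rotation would use the corresponding rotation matrix instead of this mirror shortcut.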
Figure 10 shows a two-dimensional representation of the change in the image shown in the display when the portable device is turned, according to one embodiment. Portable device 152 is aimed at a wall with viewing angle α, resulting in projection 160 on the wall. Thus, the view on portable device 152 corresponds to projection 160. When device 152 is turned an angle β, the portable device ends in position 154. The view also turns an angle β while maintaining camera viewing angle α. As a result, the view on the portable device corresponds to projection 162. It should be noted that the view on the screen is independent of the eye position (such as positions 158 and 156), and the view is independent of where the player is located. Additionally, the view on the display depends on the position of the portable device, which acts as a virtual camera. Other embodiments described below include views on the display that change according to the position of the eyes.
Figure 11 shows a portable device used to play a VR game, according to one embodiment. Figures 11 through 12F illustrate a racing game where the portable device can be used as a camera or to control the driving of the vehicle. The portable device shows a view of the race, where the track is seen in the center, with other race cars and people sitting in the stands on one side of the track.
Figures 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment. In this sequence, the portable device is used as a camera, and not to drive the race car. Figure 12A shows a player holding the portable device to play the racing game. The device is held in front of the player at approximately arm's length. When the player is in the position shown in Figure 12A, the view of the game is the one illustrated in Figure 12B, where the view on the display shows the race as seen by the driver. The driver can see the track ahead and part of the interior of the car, including the steering wheel.
Figure 12C shows the player turning about 45° while still holding the portable device in front of him. As a result, the portable device moves in space together with the player. The result of the player's movement is seen in Figure 12D, where the view of the race track has also turned about 45°. It can be seen that the portable device acts as a camera, and the view on the display changes as if the camera's position had changed in the 3D world.
Figure 12E shows the player turning another 45° to the left. As a result, the heading and view of the portable device have changed about 90° with respect to the original position. The result on the display is depicted in Figure 12F, where the driver in the game now has a side view, which includes another race car and the stands.
Figures 13A-13B illustrate an augmented reality game played between users in remote locations, according to one embodiment. Figure 13A shows a portable device with camera 1302 facing the player holding the device. The player-facing camera has many uses, such as teleconferencing, viewing frustum applications (see Figures 15-19B), incorporating the player's face into the game, etc.
Figure 13B shows one embodiment of an augmented reality game that produces a realistic effect. Player 1308 is in a remote location, and game and environment information is exchanged via a network connection. A camera at the remote location takes a picture of the player and his vicinity (such as background 1310). The image is sent to the opponent's device, where the image is blended with virtual chess board 1306. Similarly, camera 1304 takes a picture of the player holding the device, and the image is sent to the remote player. This way, the players can share a space.
Each player sees his view as an augmented reality that fades into a virtual reality fog as the view spans beyond the other player's screen. All the movements of each player are still tracked in relation to the synchronized calibration of the two devices. The game inserts the virtual chess board on top of a table, providing a 3D experience. As previously described, the portable device can be moved around to change the view, and to see the chess board from different perspectives, such as from the top, from the side, from the opponent's viewpoint, etc.
In one embodiment, the required communication and processing bandwidth is reduced by periodically updating the opponent's face and background, instead of using a live feed. Additionally, it is possible to send only a portion of the remote image, such as the image of the player, because the background may be static and less relevant. For example, the face of the remote player can be updated every five seconds, every time the player changes expression, when the player talks, etc.
In another embodiment, sound can also be exchanged between the players to make the 3D experience more realistic. In another embodiment, the players have options to change the view, such as switching between the blended 3D image and a board-only view for an improved view of the board. In yet another embodiment, image stabilization can be used to smooth out small image variations caused by slight shaking of the player's hands. In one embodiment, the face of the player holding the device can also be added to the display, in order to show that user how he appears to the opponent.
Figures 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment. In the sequence of Figures 14A-14H, the portable device uses a viewing frustum effect to determine how the augmented reality world is presented to the user.
In current 3D computer graphics, the viewing frustum or view frustum is the region of space in the modeled world that may appear on the screen. The viewing frustum is the field of view of a notional camera. The exact shape of this region varies depending on what kind of camera lens is being simulated, but typically it is a frustum of a rectangular pyramid (hence the name). The planes that cut the frustum perpendicular to the viewing direction are called the near plane and the far plane. In one embodiment, the near plane corresponds to the surface of the display in the portable device. Objects closer to the camera than the near plane, or beyond the far plane, are not drawn.
In one embodiment, the viewing frustum is anchored (the apex of the pyramid) in the eyes of the player holding the portable device, or between the eyes. The display acts as a window into the virtual reality. Therefore, the closer the "window" is to the eyes, the larger the area of the virtual reality that is displayed. Conversely, the farther the "window" is from the eyes, the smaller (and more detailed) the view of the virtual reality. The effect is similar to getting closer to a rectangular, old-fashioned peephole with no distortion optics: the closer the eye is to the peephole, the more of the outside can be observed.
Figure 14A shows the player holding the augmented reality portable device inside a room. After the device is synchronized to the room, the virtual reality generator has added a virtual triangle "painted" on the wall faced by the player, and a square "painted" on the wall to the player's left. In Figure 14A, the player holds the device slightly below eye level, with the arms almost completely extended. The view shown in the display is presented in Figure 14B, showing a portion of the triangle in front of the player.
In Figure 14C, the player is in the same position and has bent the elbows to bring the portable device closer to the face. Due to the viewing frustum effect discussed above, the player sees a larger section of the wall. Figure 14D shows the view displayed on the device of Figure 14C. Because of the frustum effect, a larger section of the wall is observed as compared to the previous view of Figure 14B. The complete triangle is now seen on the display.
Figure 14E shows the player moving the device downward to see the bottom part of the wall, as shown in Figure 14F. The bottom section of the triangle is shown on the display. In Figure 14G, the player has turned to the left and uses the "window" into the augmented world to look at a corner of the room, as shown in Figure 14H.
Figure 15 illustrates an embodiment for implementing the viewing frustum on a portable device using front and rear cameras. Figure 15 shows a 2D projection of the viewing frustum; since it is a 2D projection, the viewing frustum pyramid appears as a triangle. Portable device 1506 includes front and rear cameras 1514 and 1512, respectively. Camera 1512 is used to capture images of the space in which the player is located. Camera 1514 is used to capture images of the player holding device 1506. Face-recognition software allows the device software to determine the location of the player's eyes in order to simulate the viewing frustum effect.
In one embodiment, the viewing frustum has its apex at the eyes, with the edges of the rectangular pyramid extending from the eyes through the corners of the display in the handheld device. When the eyes are in position 1502, the player "sees" area 1510 of the wall facing the device. Lines originating at the eyes and touching the corners of the display intersect the wall to define area 1510. When the eyes move to position 1504, the lines originating at the eyes change as a result. The new lines define area 1508. In summary, if portable device 1506 is kept stationary, a change in the position of the eyes causes a change in what is shown on the display. Of course, the view also changes if the portable device moves, because the frustum changes as the pyramid's edges intersect the corners of the display.
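The geometry just described can be sketched numerically. In this hedged illustration (coordinate conventions and names are assumptions, not from the patent), the display lies in the z = 0 plane, the tracked eye sits at z > 0 behind it, and the wall is a plane a distance `wall_dist` beyond the display; projecting the display's corners from the eye onto the wall bounds the visible area, analogous to areas 1510 and 1508:

```python
# Hedged sketch: lines from the eye through the display corners intersect
# the wall, bounding the visible region of the augmented scene.


def visible_wall_rect(eye, display_w, display_h, wall_dist):
    """Project the display's corners from the eye onto the wall; return
    the visible rectangle as (x_min, x_max, y_min, y_max)."""
    ex, ey, ez = eye
    t = wall_dist / ez                      # ray parameter at the wall
    xs, ys = [], []
    for cx in (-display_w / 2, display_w / 2):
        for cy in (-display_h / 2, display_h / 2):
            xs.append(cx + t * (cx - ex))   # extend the eye->corner ray
            ys.append(cy + t * (cy - ey))
    return (min(xs), max(xs), min(ys), max(ys))


# Eye centered 0.5 m behind a 0.5 m x 0.25 m display, wall 1 m past it:
print(visible_wall_rect((0.0, 0.0, 0.5), 0.5, 0.25, 1.0))
# -> (-0.75, 0.75, -0.375, 0.375): three times the display's size
```

Shifting the eye sideways (as from position 1502 to 1504) shifts the visible rectangle the opposite way while the display stays put, which is exactly the parallax the front camera's eye tracking provides.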
It should be appreciated that the embodiment illustrated in Figure 15 is an exemplary implementation of a viewing frustum. Other embodiments may utilize different shapes for the viewing frustum, may scale the viewing frustum effect, or may add boundaries to the viewing frustum. The embodiment illustrated in Figure 15 should therefore not be interpreted as exclusive or limiting, but rather as exemplary or illustrative.
Figures 16A-16B illustrate the effect of changing the viewing frustum as the player moves, according to one embodiment. Figure 16A includes display 1606 in a portable device, where the surface of the display is parallel to the surface of a wall. When the player looks through the display with the viewing frustum effect, a rectangular pyramid is created with its apex somewhere on the player's face (such as between the eyes), with its base on the wall, and with edges extending from the eyes and touching the corners of display 1606.
When the player is in position 1602, the viewing frustum creates rectangular base 1610, which is what the player sees on display 1606. When the player moves to position 1604, without moving the display, the viewing frustum changes as a result. The new base for the frustum is rectangle 1608, which is seen on display 1606. The result is that a change in the position of the player causes a change in the view of the virtual reality.
Figure 16B illustrates the zoom effect created as the face moves away from, or closer to, the display when the viewing frustum effect is used. When the player is in position 1632, the player sees rectangle 1638, as previously described. If the player moves away from display 1636, without moving the display, a new view corresponding to rectangle 1640 is seen. Thus, when the player moves away, the observed area of the virtual world shrinks, causing a zoom-in effect: because the observed area in the display is smaller, the objects in it appear larger on the display. The opposite motion, the player moving closer to display 1636, causes the opposite zoom-out effect.
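The zoom effect has a simple closed form under the same flat-wall assumptions used above (an illustration, not the patent's formula): with the eye centered a distance d behind a display of width w, and the wall a further distance D beyond the display, similar triangles give a visible wall width of w * (d + D) / d. A quick numeric check:

```python
# Illustrative check of the Figure 16B zoom effect: visible wall width as
# a function of eye-to-display distance d, for display width w and
# display-to-wall distance D.


def visible_width(w, d, D):
    """Width of wall seen through a display of width w, eye at distance d
    behind the display, wall at distance D beyond it."""
    return w * (d + D) / d


near = visible_width(0.25, 0.25, 1.0)  # face close to the display
far = visible_width(0.25, 1.0, 1.0)    # face moved away from the display
print(near, far)  # 1.25 0.5 -- moving away shrinks the viewed area (zoom in)
```

As d grows, the visible width falls toward w itself: a smaller slice of the world fills the same screen, so its contents appear magnified, matching the rectangle 1638 to rectangle 1640 transition described above.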
Figure 17 illustrates how to use a virtual camera to span a view of a virtual scene, according to one embodiment. A virtual or augmented reality need not be confined within the limits of the room where the player is located, as seen before in Figure 11 for the racing game. A virtual world that extends beyond the player's physical boundaries can also be simulated. Figure 17 illustrates a player viewing a virtual concert. The actual stage is located beyond the walls of the room, and can be simulated to be hundreds of feet away from the portable device, which acts in this case as a virtual camera. A viewing frustum can also be simulated in the same manner.
As observed at the bottom of the figure, different camera positions and viewing angles result in different views on the display. For example, the first location focuses on the backup singers, the second location focuses on the main artist, and the third location is aimed at the crowd. The virtual camera can also accept a zoom input, in order to zoom in or out like a real camera.
In one embodiment, scaling is used to navigate the virtual reality. For example, if the player moves forward one foot, the portable device creates a virtual view as if the player had advanced ten feet. In this way, the player can navigate a virtual world that is larger than the room the player is in.
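The one-foot-to-ten-feet navigation above amounts to multiplying the device's real displacement from the reference point by a gain before applying it to the virtual camera. A sketch under that assumption (the scale factor of 10 is the example value from the text; the function and coordinate convention are hypothetical):

```python
def virtual_camera_position(reference_point, device_position, scale=10.0):
    """Map the device's real-world displacement from the reference point
    into a scaled displacement in the virtual world: one real foot of
    motion traverses `scale` virtual feet."""
    return tuple(r + scale * (p - r)
                 for r, p in zip(reference_point, device_position))
```

With the reference point at the origin, moving the device one unit forward places the virtual camera ten units forward, letting the player roam a space far larger than the physical room.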
In another embodiment, the player can enter commands that move the camera within the virtual reality without actually moving the portable device. Because the portable device is synchronized with respect to a reference point, this camera movement, unaccompanied by player movement, has the effect of changing the reference point to a new location. This new reference point can be referred to as a virtual reference point, and it need not be located within the actual physical space where the player is. For example, in the scene illustrated in Figure 17, the player could use a "move forward" command to move the camera backstage. Once the player is "backstage," the player can begin moving the portable device around in order to view the backstage, as previously discussed.
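One way to realize the "move forward" command is to translate the reference point itself, after which ordinary device motion is measured from the new, virtual reference point. This is a sketch of that idea only; the function signature and vector convention are assumptions for illustration.

```python
def apply_move_command(reference_point, direction, distance):
    """Shift the reference point along a unit direction vector,
    producing a virtual reference point that need not lie inside the
    player's physical room. Subsequent device displacements are then
    computed relative to the returned point."""
    return tuple(r + distance * d
                 for r, d in zip(reference_point, direction))
```

For example, a "move forward" of five units along -z (a common camera-forward convention) relocates the reference point backstage while the device itself stays put.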
Figures 18A-18H show a series of views illustrating the viewing-frustum effect, according to one embodiment. Figure 18A shows the player holding the portable device. The view on the display corresponds to the image of a forest shown in Figure 18B. In Figure 18C, the player moves his head to his right while keeping the portable device in approximately the same position as in Figure 18A. Figure 18D corresponds to the view for the player of Figure 18C, and shows how the panorama of the forest changes due to the viewing-frustum effect.
In Figure 18E, the player keeps his head turned to his right while moving the portable device to his left in order to accentuate the viewing-frustum effect, because the player wants to know whether something is hiding behind the tree. Figure 18F shows the display corresponding to the player of Figure 18E: the panorama of the forest has changed again. There is an elf hidden behind one of the trees in Figure 18B, but as the player has changed the viewing angle into the forest, part of the elf is visible in Figure 18F. Figure 18G shows the player tilting his head further to his right and moving the portable device further to his left. As seen in Figure 18H, the effect is that the player can now see what is behind the tree: the elf is now fully visible.
Figure 19 A-19B diagram is used for the embodiment of combination viewing frustum effect and camera effect.Can see and make up viewing frustum with the camera effect because different and impossible for the behavior of setting up virtual view.Yet when exist to be used for limiting when when using an effect or another regular, combination is possible.In one embodiment, when player's mobile portable equipment, use the camera effect, and when user during with respect to the portable set moving-head, use viewing frustum effect.In the situation that two events occur simultaneously, select an effect, such as viewing frustum.
This combination means that, given a position of the eye and the portable device, different views may appear on the display depending on how the eye and the camera arrived at that position. For example, when eye 1902 looks through device 1906, different views of the virtual reality are shown in Figures 19A and 19B, as discussed below.
Referring to Figure 19A, eye 1902 initially looks through device 1904. The device, using the viewing-frustum effect, is "aimed" straight ahead into the virtual reality. This produces an angle α originating at the apex of the viewing-frustum cone, and results in a camera angle of β. Using the same 2D representation previously described with reference to Figures 10 and 15, the player sees segment 1908 on the wall from this first position. The player then rotates the device by angle γ, placing it at position 1906. Because the player has moved the device, the portable device responds with the camera effect, so the virtual camera also rotates by angle γ. The result is that the display now shows area 1910 of the wall.
Figure 19B shows the player looking through portable device 1906 from initial eye position 1912. The viewing-frustum effect is used, and the result is the representation of area 1918 on the display. The player then moves to eye position 1902 without moving the portable device. Because the device has not moved, the viewing-frustum effect takes place, and the player then sees area 1916 on the display. It should be noted that although eye 1902 and display 1906 are in the same positions in Figures 19A and 19B, the actual views are different, because of the sequence of events that brought the eye and the display to those positions.
Figure 20 shows the flow of an algorithm for controlling the view of a virtual scene with a portable device, in accordance with one embodiment of the invention. In operation 2002, a signal to synchronize the portable device is received, such as a button press or a screen touch. In operation 2004, the method synchronizes the portable device so that a reference point is located in the three-dimensional (3D) space where the portable device is positioned. In one embodiment, the 3D space is the room where the player is located. In another embodiment, the virtual reality includes the room together with a virtual space that extends beyond the walls of the room.
During operation 2006, a virtual scene is generated around the reference point in the 3D space. The virtual scene includes virtual-reality elements, such as the chess board of Figure 2. In operation 2008, the portable device determines its current position in the 3D space with respect to the reference point. In operation 2010, a view of the virtual scene is created. The view represents the virtual scene as seen from the current position of the portable device, with a perspective based on that current position. Further, during operation 2012, the created view is shown on the display of the portable device. In operation 2014, the portable device checks whether it has been moved by the user, that is, whether the current position has changed. If the portable device has moved, the method flows back to operation 2008 in order to recompute the current position. If the portable device has not moved, flow returns to operation 2012, and the portable device continues showing the previously created view.
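The Figure 20 flow can be sketched as a loop that re-renders only when the device has moved. The `device` object below is a hypothetical stand-in exposing one method per flowchart operation; it is not an API from the patent.

```python
def view_update_loop(device, max_iterations=100):
    """Sketch of the Figure 20 flow. After synchronization fixes the
    reference point, the loop recomputes the device position and
    recreates the view only when the device has moved; otherwise the
    previously created view keeps being shown."""
    device.wait_for_sync_signal()                      # operation 2002
    reference = device.set_reference_point()           # operation 2004
    scene = device.generate_virtual_scene(reference)   # operation 2006
    position = device.current_position(reference)      # operation 2008
    view = device.create_view(scene, position)         # operation 2010
    for _ in range(max_iterations):                    # bounded for the sketch
        device.display(view)                           # operation 2012
        if device.has_moved():                         # operation 2014
            position = device.current_position(reference)
            view = device.create_view(scene, position)
```

A real device would run this until the application exits; the iteration bound here only keeps the sketch terminating.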
Figure 21 illustrates the architecture of a device that may be used to implement embodiments of the invention. The portable device is a computing device and includes the typical modules present in a portable device, such as a processor, memory (RAM, ROM, etc.), a battery or other power source, and permanent storage (such as a hard disk). Communication modules allow the portable device to exchange information with other portable devices, other computers, servers, etc. The communication modules include a Universal Serial Bus (USB) connector, communication links (such as Ethernet), ultrasonic communication, Bluetooth, and WiFi.
Input modules include input buttons and sensors, a microphone, a touch-sensitive screen, cameras (front-facing, rear-facing, depth camera), and a card reader. Other input/output devices, such as a keyboard or a mouse, can also be connected to the portable device via a communication link such as USB or Bluetooth. Output modules include a display (with touch-sensitive screen), Light-Emitting Diodes (LEDs), haptic vibration feedback, and speakers. Other output devices can also be connected to the portable device via the communication modules.
Information from different devices can be used by a position module to calculate the position of the portable device. These modules include a magnetometer, an accelerometer, a gyroscope, a GPS, and a compass. Additionally, the position module can analyze sound or image data captured with the cameras and microphone in order to calculate the position. Further, the position module can perform tests, such as a WiFi ping test or an ultrasound test, to determine the position of the portable device or the position of other nearby devices.
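Between the absolute fixes that the cameras, microphone, GPS, or WiFi/ultrasound tests provide, a position module of this kind typically dead-reckons from inertial samples. The step below shows only that integration; a real locating module would fuse several sensors and correct drift, and all names here are illustrative assumptions.

```python
def integrate_position(position, velocity, acceleration, dt):
    """One naive dead-reckoning step: fold an accelerometer sample
    (already gravity-compensated, in device-world coordinates) into the
    velocity, then advance the position by the updated velocity over dt."""
    new_velocity = tuple(v + a * dt for v, a in zip(velocity, acceleration))
    new_position = tuple(p + v * dt for p, v in zip(position, new_velocity))
    return new_position, new_velocity
```

Because accelerometer noise is integrated twice, such an estimate drifts quickly, which is precisely why the text pairs the inertial sensors with camera, WiFi, and ultrasound measurements.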
A virtual-reality generator creates the virtual or augmented reality, as previously described, using the position calculated by the position module. A view generator creates the view that is shown on the display, based on the virtual reality and the position. The view generator can also produce sounds originating from the virtual-reality generator, using directional effects applied to a multi-speaker system.
It should be appreciated that the embodiment illustrated in Figure 21 is an exemplary implementation of a portable device. Other embodiments may utilize different modules, a subset of the modules, or may assign related tasks to different modules. The embodiment illustrated in Figure 21 should therefore not be interpreted as exclusive or limiting, but rather as exemplary or illustrative.
Figure 22 is an exemplary illustration of scenes A through E, where each of users A through E interacts with a game client 1102 that is connected to a server processing module via the Internet, in accordance with one embodiment of the invention. A game client is a device that allows users to connect to server applications and processing via the Internet. The game client allows users to access and play online entertainment content such as, but not limited to, games, movies, music, and photos. Additionally, the game client can provide access to online communications applications such as VOIP, text chat protocols, and email.
The user interacts with the game client via a controller. In some embodiments, the controller is a game-client-specific controller, while in other embodiments, the controller can be a keyboard and mouse combination. In one embodiment, the game client is a standalone device capable of outputting audio and video signals to create a multimedia environment through a monitor/TV and associated audio equipment. For example, the game client can be, but is not limited to, a thin client, an internal PCI-Express card, an external PCI-Express device, an ExpressCard device, an internal, external, or wireless USB device, or a Firewire device, etc. In other embodiments, the game client is integrated with a TV or other multimedia device, such as a DVR, a Blu-ray player, a DVD player, or a multi-channel receiver.
Within scene A of Figure 22, user A interacts with a client application displayed on monitor 106 using controller 100 paired with game client 1102A. Similarly, within scene B, user B interacts with another client application displayed on monitor 106 using controller 100 paired with game client 1102B. Scene C illustrates a view from behind user C as he looks at a monitor displaying a game and a buddy list from game client 1102C. While Figure 22 shows a single server processing module, in one embodiment there are multiple server processing modules located throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load-balancing processing services. Furthermore, a server processing module includes network processing and distributed storage.
When a game client 1102 connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Example items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos, and music. Additionally, distributed storage can be used to save game states for multiple games, customized settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of a user and their respective game client. The user's geographic location can be used by the sharing/communication logic and the load-balancing processing service to optimize performance based on geographic location and the processing demands of the multiple server processing modules. Virtualizing either or both the network processing and the network storage allows processing tasks from game clients to be dynamically shifted to underutilized server processing module(s). Thus, load balancing can be used to minimize latency associated with both recall from storage and data transmission between server processing modules and game clients.
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows the server processing to be scaled in order to accommodate a larger number of game clients executing multiple client applications and their corresponding server applications. Alternatively, server processing can be scaled to accommodate the increased computing demands required by more demanding graphics processing, game or video compression, or application complexity. In one embodiment, the server processing module performs the majority of the processing via the server applications. This allows relatively expensive components, such as graphics processors, RAM, and general-purpose processors, to be centrally located, and reduces the cost of the game client. Processed server application data is sent back to the corresponding game client via the Internet to be displayed on a monitor.
Scene C illustrates an exemplary application that can be executed by the game client and the server processing module. For example, in one embodiment, game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D, and user E. As shown, in scene C, user C can see either real-time images or avatars of the respective users on monitor 106C. Server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of users A, B, D, and E. Because the server processing is aware of the applications being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by merely sending the processed server application data for user B to game client A, in addition to game client B.
In addition to viewing video from buddies, a communication application can allow real-time communication between buddies. As applied to the previous example, this allows user A to provide encouragement or hints while watching the real-time video of user B. In one embodiment, two-way real-time voice communication is established through a client/server application. In another embodiment, a client/server application enables text chat. In still another embodiment, a client/server application converts speech to text for display on a buddy's screen.
Scenes D and E illustrate respective users D and E interacting with game consoles 1110D and 1110E, respectively. Each of game consoles 1110D and 1110E is connected to the server processing module, illustrating a network in which the server processing module coordinates gameplay for both game consoles and game clients.
Figure 23 illustrates an embodiment of an Information Service Provider architecture. Information Service Provider (ISP) 250 delivers a multitude of information services to users 262 who are geographically dispersed and connected via network 266. An ISP can deliver just one type of service, such as stock-price updates, or a variety of services such as broadcast media, news, sports, gaming, etc. Additionally, the services offered by each ISP are dynamic; that is, services can be added or taken away at any point in time. Thus, the ISP providing a particular type of service to a particular individual can change over time. For example, a user may be served by an ISP in near proximity while in her home town, and by a different ISP when she travels to a different city. The home-town ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city, making the data closer to the user and easier to access. In another embodiment, a master-server relationship may be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under the control of the master ISP. In yet another embodiment, the data is transferred from one ISP to another as the client moves around the world, so that the ISP in the better position to serve the user becomes the one that delivers these services.
ISP 250 includes Application Service Provider (ASP) 252, which provides computer-based services to customers over a network. Software offered using the ASP model is also sometimes called on-demand software or software as a service (SaaS). A simple form of providing access to a particular application program (such as customer relationship management) is by using a standard protocol such as HTTP. The application software resides on the vendor's system and is accessed by users through a web browser using HTML, through special-purpose client software provided by the vendor, or via other remote interfaces such as a thin client.
Services delivered over a wide geographical area often use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the Internet. Users do not need to be experts in the technology infrastructure of the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online, accessed from a web browser, while the software and data are stored on servers. The term cloud is used as a metaphor for the Internet, based on how the Internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.
Further, ISP 250 includes a Game Processing Server (GPS) 254, which is used by game clients to play single- and multiplayer video games. Most video games played over the Internet operate via a connection to a game server. Typically, games use a dedicated server application that collects data from players and distributes it to other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application. In another embodiment, the GPS establishes communication between the players and their respective game-playing devices so that they can exchange information without relying on the centralized GPS.
Dedicated GPSs are servers that run independently of the client. Such servers are usually run on dedicated hardware located in data centers, offering more bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games typically run on dedicated servers hosted by the software company that owns the game title, allowing it to control and update content.
A Broadcast Processing Server (BPS) 256 distributes audio or video signals to an audience. Broadcasting to a very narrow range of audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal reaches the listener or viewer: it may come over the air, as with a radio station or TV station, to an antenna and receiver, or it may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network. The Internet may also bring either radio or TV to the recipient, especially with multicasting, which allows the signal and bandwidth to be shared. Historically, broadcasts have been delimited by a geographic region, such as national or regional broadcasts. However, with the proliferation of the fast Internet, broadcasts are no longer bound by geography, as content can reach almost any country in the world.
A Storage Service Provider (SSP) 258 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. By offering storage as a service, users can order more storage as needed. Another major advantage is that SSPs include backup services, so users will not lose all their data if the hard drive of their computer fails. Further, a plurality of SSPs can have total or incremental backups of the user data, allowing users to access data in an efficient way, independently of where the user is located or which device is being used to access the data. For example, a user can access personal files on a home computer, as well as on a mobile phone while the user is on the move.
A Communications Provider 260 provides connectivity to the users. One kind of communications provider is an Internet Service Provider (ISP) that offers access to the Internet. The ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless, or dedicated high-speed interconnects. The communications provider can also offer messaging services, such as email, instant messaging, and SMS texting. Another type of communications provider is a Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the Internet. Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, Internet service providers, cable television operators offering high-speed Internet access, etc.
Data Exchange 268 interconnects the several modules inside ISP 250 and connects these modules to users 262 via network 266. Data Exchange 268 can cover a small area where all the modules of ISP 250 are in close proximity, or it can cover a large geographic area when the different modules are geographically dispersed. For example, Data Exchange 268 can include a fast Gigabit (or faster) Ethernet within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
Users 262 access the remote services with a client device 264, which includes at least a CPU, a display, and I/O. The client device can be a PC, a mobile phone, a netbook, a PDA, etc. In one embodiment, ISP 250 recognizes the type of device used by the client and adjusts the communication method employed. In other cases, client devices use a standard communication method, such as HTML, to access ISP 250.
Embodiments of the present invention may be practiced with various computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments, where tasks are performed by remote processing devices that are linked through a network.
With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus may be specially constructed for the required purpose, such as a special-purpose computer. When defined as a special-purpose computer, the computer can also perform other processing, program execution, or routines that are not part of the special purpose, while still being capable of operating for the special purpose. Alternatively, the operations may be processed by a general-purpose computer selectively activated or configured by one or more computer programs stored in the computer memory or cache, or obtained over a network. When data is obtained over a network, the data may be processed by other computers on the network, e.g., a cloud of computing resources.
One or more embodiments of the present invention can also be defined as a machine that transforms data from one state to another state. The transformed data can be saved to storage and then manipulated by a processor. The processor thus transforms the data from one thing to another. Still further, the methods can be processed by one or more machines or processors that can be connected over a network. Each machine can transform data from one state or thing to another, and can also process data, save data to storage, transmit data over a network, display the result, or communicate the result to another machine.
One or more embodiments of the present invention can also be fabricated as computer readable code on a computer readable medium. The computer readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tapes, and other optical and non-optical data storage devices. The computer readable medium can include computer readable tangible media distributed over a network-coupled computer system so that the computer readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or operations may be adjusted so that they occur at slightly different times, or may be distributed in a system which allows the processing operations to occur at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (38)

1. A method for controlling a view of a virtual scene with a portable device, the method comprising:
receiving a signal to synchronize the portable device;
synchronizing the portable device to make the location of the portable device a reference point in a three-dimensional (3D) space where the portable device is located;
generating a virtual scene in the 3D space around the reference point, the virtual scene including virtual reality elements;
determining a current position of the portable device in the 3D space with respect to the reference point;
creating a view of the virtual scene, the view representing the virtual scene as seen from the current position of the portable device and with a perspective based on the current position of the portable device;
displaying the created view in the portable device; and
changing the displayed view of the virtual scene as the portable device is moved by a user within the 3D space.
2. The method as recited in claim 1, wherein creating the view of the virtual scene further includes:
capturing an image of the 3D space with a camera in the portable device, the image including real elements in the 3D space; and
blending the real elements and the virtual reality elements, wherein the virtual elements appear in the displayed view as if the virtual elements were part of the 3D space, and wherein, from a geometrical perspective, the view of the virtual elements changes, as the portable device moves in the 3D space, in the same way that the view of the real elements changes.
3. The method as recited in claim 2, wherein the real elements include a hand of a player, the method further including:
detecting that the hand occupies a position in the 3D space where a first virtual element is located; and
after the detecting, simulating an interaction between the hand and the first virtual element as if the hand were touching the virtual element, wherein the hand can manipulate the first virtual element to change a position or a characteristic of the first virtual element, as if the first virtual element were a real object.
4. The method as recited in claim 3, wherein the interaction of the hand is an action on the first virtual element selected from a group consisting of interfacing, holding, pushing, pulling, grabbing, moving, striking, pressing, hitting, throwing, fighting, opening, closing, turning on and off, pressing a button, firing, and eating.
5. The method as recited in claim 2, wherein creating the view further includes:
adding a shadow of the hand on a virtual element according to lighting conditions in the 3D space and in the virtual scene.
6. The method as recited in claim 2, wherein the real elements include a table, wherein blending further includes:
placing virtual elements on top of the table such that the placed virtual elements appear to rest on the table.
7. The method as recited in claim 2, wherein the real elements include a wall of a room, wherein blending further includes:
adding a graphic to the wall, the graphic being one of the virtual elements.
8. The method as recited in claim 1, wherein the current position of the portable device includes a geometric coordinate of the portable device and a geometric coordinate of a viewing surface of a display in the portable device.
9. The method as recited in claim 8, wherein the geometric coordinate of the portable device corresponds to the geometric coordinate of a camera in the portable device.
10. The method as recited in claim 9, wherein the perspective is defined with respect to a vector with an origin at the center of the viewing surface of the display and a direction perpendicular to the viewing surface of the display.
11. The method as recited in claim 1, wherein creating the view of the virtual scene further includes:
zooming in on a second virtual reality element as the portable device moves closer to the second virtual reality element; and
zooming out from the second virtual reality element as the portable device moves away from the second virtual reality element.
12. The method as recited in claim 1, wherein changing the displayed view further includes:
adding image stabilization to the displayed images as the portable device moves.
13. The method as recited in claim 1, further including:
receiving an input to change the view; and
changing the creation of the view of the virtual scene such that the view of the virtual scene is calculated from a point different from the current position of the portable device.
14. The method as recited in claim 13, wherein the view of the virtual scene is rotated 180 degrees about a vertical line that intersects the reference point.
15. The method as recited in claim 1, wherein the received signal is generated by pressing a button on the portable device or by touching a touch-sensitive display of the portable device.
16. The method as recited in claim 1, wherein walls of the room where the portable device is located when the signal is received define boundaries of the virtual scene.
17. The method as recited in claim 1, wherein synchronizing the portable device further includes:
resetting a position tracking module in the portable device, the position tracking module being selected from a group consisting of an accelerometer, a magnetometer, a GPS device, a camera, a depth camera, a compass, or a gyroscope,
wherein determining the current position is performed using information from the position tracking module.
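Outside the claim language, the view computation of claims 1-10 can be sketched in code (an illustrative sketch, not part of the patent; all names are hypothetical). The device's tracked position relative to the reference point becomes the camera position, and the perspective of claim 10 is defined by the display normal pointing into the scene:

```python
import numpy as np

def look_at(eye, target, up=np.array([0.0, 1.0, 0.0])):
    """Build a right-handed view matrix for a camera at `eye` looking at `target`."""
    f = target - eye
    f = f / np.linalg.norm(f)                        # forward direction
    s = np.cross(f, up); s = s / np.linalg.norm(s)   # right direction
    u = np.cross(s, f)                               # true up
    m = np.eye(4)
    m[0, :3], m[1, :3], m[2, :3] = s, u, -f
    m[:3, 3] = -m[:3, :3] @ eye                      # translate eye to origin
    return m

# The reference point is the origin of the 3D space (claim 1).
reference_point = np.zeros(3)
# Current device position, as reported by the position tracking module (claim 17).
device_position = np.array([0.5, 1.2, 2.0])

# The view of the virtual scene is rendered with the camera at the device's
# current position, looking along the display normal toward the scene.
view_matrix = look_at(device_position, reference_point)
```

Moving the device updates `device_position`, and recomputing `view_matrix` each frame produces the changing view recited in the last step of claim 1.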
18. A method for sharing a virtual scene among devices, the method comprising:
synchronizing a first device to a reference point in a three-dimensional (3D) space;
calculating a location of a second device relative to the location of the first device;
exchanging information between the first device and the second device to have the second device synchronized to the reference point in the 3D space, the information including the reference point and the locations of the first and second devices;
generating a virtual scene in the 3D space around the reference point, the virtual scene being common to both devices, wherein the virtual scene changes simultaneously in both devices as the two devices interact with the virtual scene;
creating a view of the virtual scene as seen from the current location of the first device with a perspective based on the current location of the first device;
displaying the created view in the first device; and
changing the displayed view of the virtual scene as the first device moves within the 3D space.
19. The method as recited in claim 18, wherein calculating the location of the second device further includes:
collecting first relative position information between the first device and the second device, the collecting including one or more of WiFi positional tracking, audio triangulation, or image analysis from a camera in the first device;
determining a relative position of the second device with respect to the first device based on the first relative position information; and
sending coordinates of the relative position and the reference point to the second device.
20. The method as recited in claim 18, further including:
collecting first relative position information between the first device and the second device, the collecting including one or more of WiFi positional tracking, audio triangulation, or image analysis from a camera in the first device;
receiving second relative position information from the second device;
determining a relative position of the second device with respect to the first device based on the first and second relative position information; and
sending coordinates of the relative position and the reference point to the second device.
21. The method as recited in claim 18, wherein the virtual scene includes a virtual board game, and wherein players holding the first device and the second device, respectively, play the virtual board game.
22. The method as recited in claim 18, wherein a first player holding a first portable device controls a movement of a first virtual element in the virtual scene by moving the first portable device, the first portable device being synchronized to the first virtual element.
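The synchronization exchange of claims 18-20 can be illustrated as follows (a minimal sketch under assumed data structures; the patent does not specify a message format). The first device fuses several relative-position estimates and then sends the result, together with the reference point, so both devices anchor the virtual scene at the same spot:

```python
from dataclasses import dataclass

@dataclass
class SyncInfo:
    reference_point: tuple    # reference point in the first device's coordinates
    relative_position: tuple  # second device's position relative to the first

def average_estimates(estimates):
    """Fuse several relative-position estimates (e.g., WiFi positional tracking,
    audio triangulation, camera image analysis) by simple averaging."""
    n = len(estimates)
    return tuple(sum(e[i] for e in estimates) / n for i in range(3))

# The first device is at the reference point after synchronizing (claim 18).
reference_point = (0.0, 0.0, 0.0)

# Estimates of the second device's position from different trackers (claim 19).
estimates = [(1.9, 0.0, 1.1), (2.1, 0.0, 0.9)]
relative = average_estimates(estimates)

# Coordinates sent to the second device so it can adopt the same reference point.
message = SyncInfo(reference_point, relative)
```

With `message` delivered, the second device can express its own position in the shared frame, which is what lets the common scene change simultaneously on both devices.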
23. A method for controlling a view of a virtual scene with a first device, the method comprising:
synchronizing the first device to a first reference point in a first three-dimensional (3D) space;
establishing a communications link between the first device and a second device, the second device being in a second 3D space outside the first 3D space, the second device being synchronized to a second reference point in the second 3D space;
generating a common virtual scene that includes virtual reality elements, the common virtual scene being observable by both the first and second devices, wherein the first device builds the common virtual scene around the first reference point and the second device builds the common virtual scene around the second reference point, both devices being able to interact with the virtual reality elements;
determining a current position of the first device in the first 3D space with respect to the first reference point;
creating a view of the common virtual scene, the view representing the common virtual scene as seen from the current position of the first device and with a perspective based on the current position of the first device;
displaying the created view in the first device; and
changing the displayed view of the common virtual scene as the first device moves within the first 3D space.
24. The method as recited in claim 23, wherein the communications link includes a network connection between the first device and the second device.
25. The method as recited in claim 23, further including:
assigning a virtual position in the first 3D space to the second device;
receiving, from the second device, second-device interaction information corresponding to an interaction between the second device and the common virtual scene; and
changing the view of the common virtual scene according to the received second-device interaction information and the virtual position, wherein the second device appears in the first 3D space and interacts with the first device as if the second device were physically located in the first 3D space.
26. The method as recited in claim 23, wherein the view of the common virtual scene further includes an image of a second player located near the second device, the second device having a camera for capturing the image of the second player.
27. The method as recited in claim 26, further including:
periodically updating the image of the second player.
28. The method as recited in claim 26, further including:
updating the image of the second player as the second player moves.
29. The method as recited in claim 26, wherein the virtual elements include a chess board and chess pieces, and wherein the first device and the second device are used to play a game of chess by manipulating the chess pieces.
30. The method as recited in claim 26, wherein the first device and the second device appear in the view of the common virtual scene as a first object and a second object, respectively, wherein a movement of the first object matches the movement of the first device in the first 3D space, and a movement of the second object matches the movement of the second device in the second 3D space.
31. A method for controlling a view of a virtual scene with a portable device, the method comprising:
synchronizing the portable device to a reference point in a three-dimensional (3D) space where the portable device is located, the portable device including a front camera facing the front of the portable device and a rear camera facing the rear of the portable device;
generating a virtual scene in the 3D space around the reference point, the virtual scene including virtual reality elements;
determining a current position of the portable device in the 3D space with respect to the reference point;
creating a view of the virtual scene, the view capturing a representation of the virtual scene as seen from a current position of the eyes of a player holding the portable device in the 3D space, the capturing corresponding to what the player would see through a window into the virtual scene, the position of the window in the 3D space being equivalent to the position of a display in the portable device in the 3D space;
displaying the created view in the display; and
changing the displayed view of the virtual scene as the portable device or the player moves within the 3D space.
32. The method as recited in claim 31, wherein images from the front camera are used to determine the current position of the eyes, and images from the rear camera are used to obtain a view of the 3D space.
33. The method as recited in claim 31, wherein pulling the display away from the eyes of the player causes the view to zoom in on the virtual scene, and pulling the display toward the eyes of the player causes the view to zoom out of the virtual scene.
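The window metaphor of claims 31-33 admits a simple geometric sketch (hypothetical names; not part of the patent). The front camera supplies the eye-to-display distance, and the field of view through the "window" formed by the display narrows as the display moves away from the eyes, which is why pulling the display away zooms the view in:

```python
import math

def window_fov(eye_to_display_mm, display_width_mm):
    """Horizontal field of view (degrees) of the window formed by the display,
    as seen from the player's eyes (claim 31's window metaphor)."""
    half_angle = math.atan((display_width_mm / 2) / eye_to_display_mm)
    return math.degrees(2 * half_angle)

# Pulling the display away from the eyes narrows the window -> zoom in (claim 33).
near_fov = window_fov(eye_to_display_mm=300, display_width_mm=150)
far_fov = window_fov(eye_to_display_mm=600, display_width_mm=150)
assert far_fov < near_fov  # a narrower field of view magnifies the scene
```

Rendering the scene with `window_fov` as the camera's field of view reproduces the zoom behavior of claim 33 without any explicit zoom control.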
34. A method for controlling a view of a scene with a portable device, the method comprising:
receiving a signal to synchronize the portable device;
synchronizing the portable device to make the location of the portable device a reference point in a three-dimensional (3D) space where the portable device is located;
generating a virtual scene in the 3D space around the reference point, the virtual scene including virtual reality elements;
creating a view of the virtual scene as the portable device moves away from the reference point, the view representing the virtual scene as seen from the current position of the portable device;
displaying the created view in the portable device; and
changing the displayed view of the virtual scene as the portable device moves within the 3D space.
35. The method as recited in claim 34, further including:
determining a virtual element that generates a sound; and
emitting, from the portable device, a sound corresponding to the sound generated by the virtual element, wherein the emitted sound becomes louder as the portable device moves closer to the position of the virtual element generating the sound.
36. The method as recited in claim 35, wherein the portable device has stereo speakers, and wherein the emitted sound is adjusted according to the relative position between the portable device and the virtual element generating the sound, in order to provide a stereo effect.
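The audio behavior of claims 35-36 can be sketched as a distance-based gain with a stereo pan (an illustrative sketch under an assumed inverse-distance falloff; the patent does not specify a falloff law):

```python
import math

def sound_params(device_pos, device_right, source_pos, ref_gain=1.0):
    """Gain and stereo pan for a sound-emitting virtual element.
    Gain falls off with distance (claim 35: closer -> louder); pan follows
    the element's position relative to the device (claim 36: stereo effect)."""
    dx = [s - d for s, d in zip(source_pos, device_pos)]
    dist = math.sqrt(sum(c * c for c in dx))
    gain = ref_gain / (1.0 + dist)  # simple inverse falloff (assumed)
    # Project the direction onto the device's right-pointing axis: -1 left .. +1 right.
    pan = sum(a * b for a, b in zip(dx, device_right)) / dist if dist else 0.0
    return gain, pan

near_gain, pan = sound_params((0, 0, 0), (1, 0, 0), (1, 0, 0))
far_gain, _ = sound_params((0, 0, 0), (1, 0, 0), (4, 0, 0))
assert near_gain > far_gain  # moving closer makes the sound louder
```

Feeding `gain` and `pan` to the device's stereo speakers each frame yields the position-dependent loudness and stereo effect recited in the claims.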
37. A portable device for interacting with an augmented reality, the portable device comprising:
a position module for determining a position of the portable device in a 3D space where the portable device is located, wherein the position of the portable device is set as a reference point in the 3D space when the portable device receives a signal to synchronize;
a virtual reality generator that creates a virtual scene around the reference point in the 3D space, the virtual scene including virtual reality elements;
a view generator that creates a view of the virtual scene, the view representing the virtual scene as seen from the position of the portable device and with a perspective based on the position of the portable device; and
a display for showing the view of the virtual scene, wherein the scene shown in the display changes as the portable device moves within the 3D space.
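The module structure of device claim 37 can be wired together as follows (a hypothetical sketch; class and method names are illustrative, not from the patent):

```python
class PositionModule:
    """Tracks the device position; set_reference() records the current
    position as the reference point (claim 37's position module)."""
    def __init__(self):
        self.position = (0.0, 0.0, 0.0)
        self.reference = None
    def set_reference(self):
        self.reference = self.position

class VirtualRealityGenerator:
    def build_scene(self, reference):
        # Place virtual reality elements around the reference point.
        return {"reference": reference, "elements": ["board", "pieces"]}

class ViewGenerator:
    def render(self, scene, position):
        # The view is the scene as seen from the device's current position.
        return {"scene": scene, "camera": position}

# Wiring the modules together, mirroring the structure of claim 37:
tracker = PositionModule()
tracker.set_reference()  # synchronization signal received
scene = VirtualRealityGenerator().build_scene(tracker.reference)
view = ViewGenerator().render(scene, tracker.position)
```

The display would then present `view`, refreshing it whenever `tracker.position` changes.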
38. A computer program embedded in a computer-readable storage medium, when executed by one or more processors, for sharing a virtual scene among devices, the computer program comprising:
program instructions for synchronizing a first device to a reference point in a three-dimensional (3D) space;
program instructions for calculating a location of a second device relative to the location of the first device;
program instructions for exchanging information between the first device and the second device to have the second device synchronized to the reference point in the 3D space, the information including the reference point and the locations of the first and second devices;
program instructions for generating a virtual scene in the 3D space around the reference point, the virtual scene being common to both devices, wherein the virtual scene changes simultaneously in both devices as the two devices interact with the virtual scene;
program instructions for creating a view of the virtual scene as seen from the current location of the first device with a perspective based on the current location of the first device;
program instructions for displaying the created view in the first device; and
program instructions for changing the displayed view of the virtual scene as the first device moves within the 3D space.
CN201180022611.0A 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space Active CN102884490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610220654.4A CN105843396B (en) 2010-03-05 2011-01-24 Method of maintaining multiple views on a shared stable virtual space

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US31125110P 2010-03-05 2010-03-05
US61/311,251 2010-03-05
US12/947,290 2010-11-16
US12/947,290 US8730156B2 (en) 2010-03-05 2010-11-16 Maintaining multiple views on a shared stable virtual space
PCT/US2011/022288 WO2011109126A1 (en) 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610220654.4A Division CN105843396B (en) 2010-03-05 2011-01-24 Method of maintaining multiple views on a shared stable virtual space

Publications (2)

Publication Number Publication Date
CN102884490A true CN102884490A (en) 2013-01-16
CN102884490B CN102884490B (en) 2016-05-04

Family

ID=43923591

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201180022611.0A Active CN102884490B (en) 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space
CN201610220654.4A Active CN105843396B (en) 2010-03-05 2011-01-24 Method of maintaining multiple views on a shared stable virtual space

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201610220654.4A Active CN105843396B (en) 2010-03-05 2011-01-24 Method of maintaining multiple views on a shared stable virtual space

Country Status (4)

Country Link
CN (2) CN102884490B (en)
MX (1) MX2012010238A (en)
TW (1) TWI468734B (en)
WO (1) WO2011109126A1 (en)

Cited By (36)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103105993A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
CN103997443A (en) * 2013-02-20 2014-08-20 仁宝电脑工业股份有限公司 Method for controlling electronic equipment and electronic device
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects
CN104657568A (en) * 2013-11-21 2015-05-27 深圳先进技术研究院 Multiplayer mobile game system and multiplayer mobile game method based on intelligent glasses
CN105683868A (en) * 2013-11-08 2016-06-15 高通股份有限公司 Face tracking for additional modalities in spatial interaction
CN105938629A (en) * 2016-03-31 2016-09-14 联想(北京)有限公司 Information processing method and electronic equipment
CN106200956A (en) * 2016-07-07 2016-12-07 北京时代拓灵科技有限公司 A kind of field of virtual reality multimedia presents and mutual method
CN106447786A (en) * 2016-09-14 2017-02-22 同济大学 Parallel space establishing and sharing system based on virtual reality technologies
CN106621306A (en) * 2016-12-23 2017-05-10 浙江海洋大学 Double-layer three-dimensional type army flag chessboard
CN107103645A (en) * 2017-04-27 2017-08-29 腾讯科技(深圳)有限公司 virtual reality media file generation method and device
CN107211180A (en) * 2015-01-21 2017-09-26 微软技术许可有限责任公司 Spatial audio signal for the object with associated audio content is handled
CN107469343A (en) * 2017-07-28 2017-12-15 深圳市瑞立视多媒体科技有限公司 Virtual reality exchange method, apparatus and system
CN107632700A (en) * 2017-08-01 2018-01-26 中国农业大学 A kind of farm implements museum experiencing system and method based on virtual reality
CN107657589A (en) * 2017-11-16 2018-02-02 上海麦界信息技术有限公司 Mobile phone A R elements of a fix axle synchronous method based on the demarcation of three datum marks
CN107861682A (en) * 2017-11-03 2018-03-30 网易(杭州)网络有限公司 The control method for movement and device of virtual objects
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 The immersion three-dimensional electronic sand table that a kind of virtual reality is coupled with augmented reality
CN108269307A (en) * 2018-01-15 2018-07-10 歌尔科技有限公司 A kind of augmented reality exchange method and equipment
CN108474950A (en) * 2016-01-20 2018-08-31 三星电子株式会社 HMD device and its control method
CN108919945A (en) * 2018-06-07 2018-11-30 佛山市长郡科技有限公司 A kind of method of virtual reality device work
CN108932051A (en) * 2017-05-24 2018-12-04 腾讯科技(北京)有限公司 augmented reality image processing method, device and storage medium
CN109219789A (en) * 2016-05-04 2019-01-15 深圳脑穿越科技有限公司 Display methods, device and the terminal of virtual reality
CN109284000A (en) * 2018-08-10 2019-01-29 西交利物浦大学 Three-dimensional geometry object visualization method and system under a kind of reality environment
CN109313652A (en) * 2016-06-24 2019-02-05 微软技术许可有限责任公司 The relationship of holographic object is drawn
WO2019080902A1 (en) * 2017-10-27 2019-05-02 Zyetric Inventions Limited Interactive intelligent virtual object
CN110168476A (en) * 2017-01-09 2019-08-23 斯纳普公司 Augmented reality object manipulation
CN110286768A (en) * 2019-06-27 2019-09-27 Oppo广东移动通信有限公司 Dummy object display methods, terminal device and computer readable storage medium
CN110337681A (en) * 2017-03-01 2019-10-15 三菱电机株式会社 Information processing system
CN110349270A (en) * 2019-07-02 2019-10-18 石家庄中扬网络科技股份有限公司 Virtual sand table rendering method based on realistic space positioning
US10665019B2 (en) 2016-03-24 2020-05-26 Qualcomm Incorporated Spatial relationships for integration of visual images of physical environment into virtual reality
CN111263956A (en) * 2017-11-01 2020-06-09 索尼公司 Information processing apparatus, information processing method, and program
CN111372098A (en) * 2015-01-21 2020-07-03 微软技术许可有限责任公司 User equipment, system, method and readable medium for shared scene grid data synchronization
CN111915736A (en) * 2020-08-06 2020-11-10 黄得锋 AR interaction control system, device and application
CN113941138A (en) * 2020-08-06 2022-01-18 黄得锋 AR interaction control system, device and application
CN115068932A (en) * 2016-06-13 2022-09-20 索尼互动娱乐股份有限公司 Audience management at view locations in virtual reality environments
US11460912B2 (en) 2019-11-24 2022-10-04 XRSpace CO., LTD. System and method related to data fusing
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3654147A1 (en) 2011-03-29 2020-05-20 QUALCOMM Incorporated System for the rendering of shared digital interfaces relative to each user's point of view
JP5718197B2 (en) * 2011-09-14 2015-05-13 株式会社バンダイナムコゲームス Program and game device
CN102495959A (en) * 2011-12-05 2012-06-13 无锡智感星际科技有限公司 Augmented reality (AR) platform system based on position mapping and application method
CN102542165B (en) * 2011-12-23 2015-04-08 三星半导体(中国)研究开发有限公司 Operating device and operating method for three-dimensional virtual chessboard
US20130234925A1 (en) * 2012-03-09 2013-09-12 Nokia Corporation Method and apparatus for performing an operation at least partially based upon the relative positions of at least two devices
US8630458B2 (en) 2012-03-21 2014-01-14 Google Inc. Using camera input to determine axis of rotation and navigation
JP5966510B2 (en) * 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
EP3005195A4 (en) * 2013-05-24 2017-05-24 Awe Company Limited Systems and methods for a shared mixed reality experience
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US10015370B2 (en) 2015-08-27 2018-07-03 Htc Corporation Method for synchronizing video and audio in virtual reality system
US10115234B2 (en) * 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
CN106528285A (en) * 2016-11-11 2017-03-22 上海远鉴信息科技有限公司 Method and system for multi-terminal cooperative scheduling in virtual reality
US10242503B2 (en) 2017-01-09 2019-03-26 Snap Inc. Surface aware lens
CN107087152B (en) * 2017-05-09 2018-08-14 成都陌云科技有限公司 Three-dimensional imaging information communication system
CN107320955B (en) * 2017-06-23 2021-01-29 武汉秀宝软件有限公司 AR venue interface interaction method and system based on multiple clients
CN109298776B (en) * 2017-07-25 2021-02-19 阿里巴巴(中国)有限公司 Augmented reality interaction system, method and device
CN107390875B (en) * 2017-07-28 2020-01-31 腾讯科技(上海)有限公司 Information processing method, device, terminal equipment and computer readable storage medium
CN107492183A (en) * 2017-07-31 2017-12-19 程昊 One kind has paper instant lottery AR methods of exhibiting and system
CN109426333B (en) * 2017-08-23 2022-11-04 腾讯科技(深圳)有限公司 Information interaction method and device based on virtual space scene
CN107995481B (en) * 2017-11-30 2019-11-15 贵州颐爱科技有限公司 A kind of display methods and device of mixed reality
US20210038975A1 (en) * 2018-01-22 2021-02-11 The Goosebumps Factory Bvba Calibration to be used in an augmented reality method and system
US11880540B2 (en) * 2018-03-22 2024-01-23 Hewlett-Packard Development Company, L.P. Digital mark-up in a three dimensional environment
CN108519817A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Exchange method, device, storage medium based on augmented reality and electronic equipment
CN108667798A (en) * 2018-03-27 2018-10-16 上海临奇智能科技有限公司 A kind of method and system of virtual viewing
CN108479065B (en) * 2018-03-29 2021-12-28 京东方科技集团股份有限公司 Virtual image interaction method and related device
US11173398B2 (en) * 2018-05-21 2021-11-16 Microsoft Technology Licensing, Llc Virtual camera placement system
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
CN113330484A (en) 2018-12-20 2021-08-31 斯纳普公司 Virtual surface modification
US10866658B2 (en) 2018-12-20 2020-12-15 Industrial Technology Research Institute Indicator device, mixed reality device and operation method thereof
US10948978B2 (en) 2019-04-23 2021-03-16 XRSpace CO., LTD. Virtual object operating system and virtual object operating method
CN113508361A (en) 2019-05-06 2021-10-15 苹果公司 Apparatus, method and computer-readable medium for presenting computer-generated reality files
US10499044B1 (en) 2019-05-13 2019-12-03 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
US11232646B2 (en) 2019-09-06 2022-01-25 Snap Inc. Context-based virtual object rendering
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
US11832015B2 (en) 2020-08-13 2023-11-28 Snap Inc. User interface for pose driven virtual effects
CN115705116A (en) * 2021-08-04 2023-02-17 北京字跳网络技术有限公司 Interactive method, electronic device, storage medium, and program product
US20230078578A1 (en) * 2021-09-14 2023-03-16 Meta Platforms Technologies, Llc Creating shared virtual spaces
TWI803134B (en) * 2021-09-24 2023-05-21 宏達國際電子股份有限公司 Virtual image display device and setting method for input interface thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
US20030156144A1 (en) * 2002-02-18 2003-08-21 Canon Kabushiki Kaisha Information processing apparatus and method
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
US20060258420A1 (en) * 2003-09-02 2006-11-16 Mullen Jeffrey D Systems and methods for location based games and employment of the same on location enabled devices
CN101452339A (en) * 2007-12-06 2009-06-10 国际商业机器公司 Rendering of real world objects and interactions into a virtual universe

Family Cites Families (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588139A (en) * 1990-06-07 1996-12-24 Vpl Research, Inc. Method and system for generating objects for a multi-person virtual world using data flow networks
US7149691B2 (en) * 2001-07-27 2006-12-12 Siemens Corporate Research, Inc. System and method for remotely experiencing a virtual environment
US20060257420A1 (en) * 2002-04-26 2006-11-16 Cel-Sci Corporation Methods of preparation and composition of peptide constructs useful for treatment of autoimmune and transplant related host versus graft conditions
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
TWI278772B (en) * 2005-02-23 2007-04-11 Nat Applied Res Lab Nat Ce Augmented reality system and method with mobile and interactive function for multiple users
JP4738870B2 (en) * 2005-04-08 2011-08-03 キヤノン株式会社 Information processing method, information processing apparatus, and remote mixed reality sharing apparatus
JP5230114B2 (en) * 2007-03-13 2013-07-10 キヤノン株式会社 Information processing apparatus and information processing method
CN101174332B (en) * 2007-10-29 2010-11-03 张建中 Method, device and system for interactively combining real-time scene in real world with virtual reality scene
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays

Cited By (57)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103105993A (en) * 2013-01-25 2013-05-15 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
CN103105993B (en) * 2013-01-25 2015-05-20 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
US10049494B2 (en) 2013-01-25 2018-08-14 Tencent Technology (Shenzhen) Company Limited Method and system for performing interaction based on augmented reality
CN103997443A (en) * 2013-02-20 2014-08-20 仁宝电脑工业股份有限公司 Method for controlling electronic equipment and electronic device
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects
CN105393158A (en) * 2013-06-18 2016-03-09 微软技术许可有限责任公司 Shared and private holographic objects
CN105683868B (en) * 2013-11-08 2019-07-30 高通股份有限公司 Face tracking for additional modalities in spatial interaction
CN110488972B (en) * 2013-11-08 2023-06-09 高通股份有限公司 Face tracking for additional modalities in spatial interaction
US10146299B2 (en) 2013-11-08 2018-12-04 Qualcomm Technologies, Inc. Face tracking for additional modalities in spatial interaction
CN110488972A (en) * 2013-11-08 2019-11-22 高通股份有限公司 Face tracking for additional modalities in spatial interaction
CN105683868A (en) * 2013-11-08 2016-06-15 高通股份有限公司 Face tracking for additional modalities in spatial interaction
CN104657568A (en) * 2013-11-21 2015-05-27 深圳先进技术研究院 Multiplayer mobile game system and multiplayer mobile game method based on intelligent glasses
CN104657568B (en) * 2013-11-21 2017-10-03 深圳先进技术研究院 Multiplayer mobile game system and method based on smart glasses
CN111372098A (en) * 2015-01-21 2020-07-03 微软技术许可有限责任公司 User equipment, system, method and readable medium for shared scene mesh data synchronization
CN107211180A (en) * 2015-01-21 2017-09-26 微软技术许可有限责任公司 Spatial audio signal processing for objects with associated audio content
CN108474950A (en) * 2016-01-20 2018-08-31 三星电子株式会社 HMD device and its control method
CN108474950B (en) * 2016-01-20 2021-09-28 三星电子株式会社 HMD device and control method thereof
US11164546B2 (en) 2016-01-20 2021-11-02 Samsung Electronics Co., Ltd. HMD device and method for controlling same
US10665019B2 (en) 2016-03-24 2020-05-26 Qualcomm Incorporated Spatial relationships for integration of visual images of physical environment into virtual reality
CN105938629A (en) * 2016-03-31 2016-09-14 联想(北京)有限公司 Information processing method and electronic equipment
CN105938629B (en) * 2016-03-31 2022-01-18 联想(北京)有限公司 Information processing method and electronic equipment
CN109219789A (en) * 2016-05-04 2019-01-15 深圳脑穿越科技有限公司 Display method, device and terminal for virtual reality
CN115068932B (en) * 2016-06-13 2023-10-20 索尼互动娱乐股份有限公司 View rendering method at view position in virtual reality environment
CN115068932A (en) * 2016-06-13 2022-09-20 索尼互动娱乐股份有限公司 Audience management at view locations in virtual reality environments
CN109313652B (en) * 2016-06-24 2021-09-17 微软技术许可有限责任公司 Relational rendering of holographic objects
CN109313652A (en) * 2016-06-24 2019-02-05 微软技术许可有限责任公司 Relational rendering of holographic objects
CN106200956A (en) * 2016-07-07 2016-12-07 北京时代拓灵科技有限公司 Multimedia presentation and interaction method in the field of virtual reality
CN106447786A (en) * 2016-09-14 2017-02-22 同济大学 Parallel space establishing and sharing system based on virtual reality technologies
US11580700B2 (en) 2016-10-24 2023-02-14 Snap Inc. Augmented reality object manipulation
CN106621306A (en) * 2016-12-23 2017-05-10 浙江海洋大学 Double-layer three-dimensional military chess board
CN110168476B (en) * 2017-01-09 2022-11-04 斯纳普公司 Augmented reality object manipulation
CN110168476A (en) * 2017-01-09 2019-08-23 斯纳普公司 Augmented reality object manipulation
CN110337681A (en) * 2017-03-01 2019-10-15 三菱电机株式会社 Information processing system
WO2018196658A1 (en) * 2017-04-27 2018-11-01 腾讯科技(深圳)有限公司 Virtual reality media file generation method and device, storage medium, and electronic device
CN107103645B (en) * 2017-04-27 2018-07-20 腾讯科技(深圳)有限公司 Virtual reality media file generation method and device
CN107103645A (en) * 2017-04-27 2017-08-29 腾讯科技(深圳)有限公司 Virtual reality media file generation method and device
CN108932051A (en) * 2017-05-24 2018-12-04 腾讯科技(北京)有限公司 Augmented reality image processing method, device and storage medium
CN107469343A (en) * 2017-07-28 2017-12-15 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN107469343B (en) * 2017-07-28 2021-01-26 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN107632700A (en) * 2017-08-01 2018-01-26 中国农业大学 Virtual-reality-based farm implement museum experience system and method
WO2019080902A1 (en) * 2017-10-27 2019-05-02 Zyetric Inventions Limited Interactive intelligent virtual object
CN111263956A (en) * 2017-11-01 2020-06-09 索尼公司 Information processing apparatus, information processing method, and program
CN107861682A (en) * 2017-11-03 2018-03-30 网易(杭州)网络有限公司 Movement control method and device for virtual objects
CN107657589A (en) * 2017-11-16 2018-02-02 上海麦界信息技术有限公司 Mobile phone AR positioning coordinate axis synchronization method based on three-datum-point calibration
CN107967054B (en) * 2017-11-16 2020-11-27 中国人民解放军陆军装甲兵学院 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled
CN107657589B (en) * 2017-11-16 2021-05-14 上海麦界信息技术有限公司 Mobile phone AR positioning coordinate axis synchronization method based on three-datum-point calibration
CN107967054A (en) * 2017-11-16 2018-04-27 中国人民解放军陆军装甲兵学院 Immersive three-dimensional electronic sand table coupling virtual reality and augmented reality
CN108269307A (en) * 2018-01-15 2018-07-10 歌尔科技有限公司 Augmented reality interaction method and device
CN108919945A (en) * 2018-06-07 2018-11-30 佛山市长郡科技有限公司 Method for operating a virtual reality device
CN109284000A (en) * 2018-08-10 2019-01-29 西交利物浦大学 Method and system for visualizing three-dimensional geometric objects in a virtual reality environment
CN109284000B (en) * 2018-08-10 2022-04-01 西交利物浦大学 Method and system for visualizing three-dimensional geometric object in virtual reality environment
CN110286768B (en) * 2019-06-27 2022-05-17 Oppo广东移动通信有限公司 Virtual object display method, terminal device and computer-readable storage medium
CN110286768A (en) * 2019-06-27 2019-09-27 Oppo广东移动通信有限公司 Virtual object display method, terminal device and computer-readable storage medium
CN110349270A (en) * 2019-07-02 2019-10-18 石家庄中扬网络科技股份有限公司 Virtual sand table rendering method based on real-world spatial positioning
US11460912B2 (en) 2019-11-24 2022-10-04 XRSpace CO., LTD. System and method related to data fusing
CN113941138A (en) * 2020-08-06 2022-01-18 黄得锋 AR interaction control system, device and application
CN111915736A (en) * 2020-08-06 2020-11-10 黄得锋 AR interaction control system, device and application

Also Published As

Publication number Publication date
WO2011109126A1 (en) 2011-09-09
CN105843396A (en) 2016-08-10
MX2012010238A (en) 2013-01-18
CN102884490B (en) 2016-05-04
TW201205121A (en) 2012-02-01
TWI468734B (en) 2015-01-11
CN105843396B (en) 2019-01-01

Similar Documents

Publication Publication Date Title
CN102884490B (en) Maintaining multiple views on a shared stable virtual space
US11244469B2 (en) Tracking position of device inside-out for augmented reality interactivity
TWI449953B (en) Methods for generating an interactive space viewable through at least a first and a second device, and portable device for sharing a virtual reality among portable devices
US9947139B2 (en) Method and apparatus for providing hybrid reality environment
US9041739B2 (en) Matching physical locations for shared virtual experience
TWI786700B (en) Scanning of 3d objects with a second screen device for insertion into a virtual environment
JP2021516087A (en) Scaled VR Engagement and View at esports events
TW201250577A (en) Computer peripheral display and communication device providing an adjunct 3D user interface
CN104010706A (en) Directional input for a video game
Kostov Fostering player collaboration within a multimodal co-located game
CN113599810B (en) Virtual object-based display control method, device, equipment and medium
TWI807732B (en) Non-transitory computer-readable storage medium for interactable augmented and virtual reality experience
CN112843682B (en) Data synchronization method, device, equipment and storage medium
CN116407838A (en) Display control method, display control system and storage medium in game
TW202111480A (en) Virtual reality and augmented reality interaction system and method in which an augmented reality user and a virtual reality user each play roles suited to their respective interaction technology

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant