CN102884490B - Maintaining multiple views on a shared stable virtual space - Google Patents

Maintaining multiple views on a shared stable virtual space

Info

Publication number
CN102884490B
CN102884490B (application CN201180022611.0A)
Authority
CN
China
Prior art keywords
equipment
portable set
virtual
view
space
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201180022611.0A
Other languages
Chinese (zh)
Other versions
CN102884490A
Inventor
G. Weising
T. Miller
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Interactive Entertainment LLC
Original Assignee
Sony Computer Entertainment America LLC
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US 12/947,290 (external priority: US8730156B2)
Application filed by Sony Computer Entertainment America LLC
Priority to CN201610220654.4A (CN105843396B)
Publication of CN102884490A
Application granted
Publication of CN102884490B
Legal status: Active (current)
Anticipated expiration


Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/017Gesture based interaction, e.g. based on a set of recognized hand gestures
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/50Controlling the output signals based on the game progress
    • A63F13/52Controlling the output signals based on the game progress involving aspects of the displayed game scene
    • A63F13/525Changing parameters of virtual cameras
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/90Constructional details or arrangements of video game devices not provided for in groups A63F13/20 or A63F13/25, e.g. housing, wiring, connections or cabinets
    • A63F13/92Video game devices specially adapted to be hand-held while playing
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1626Constructional details or arrangements for portable computers with a single-body enclosure integrating a flat display, e.g. Personal Digital Assistants [PDAs]
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F1/00Details not covered by groups G06F3/00 - G06F13/00 and G06F21/00
    • G06F1/16Constructional details or arrangements
    • G06F1/1613Constructional details or arrangements for portable computers
    • G06F1/1633Constructional details or arrangements of portable computers not specific to the type of enclosures covered by groups G06F1/1615 - G06F1/1626
    • G06F1/1684Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675
    • G06F1/1694Constructional details or arrangements related to integrated I/O peripherals not covered by groups G06F1/1635 - G06F1/1675 the I/O peripheral being a single or a set of motion sensors for pointer control or gesture input obtained by sensing movements of the portable computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/013Eye tracking input arrangements
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L12/00Data switching networks
    • H04L12/02Details
    • H04L12/16Arrangements for providing special services to substations
    • H04L12/18Arrangements for providing special services to substations for broadcast or conference, e.g. multicast
    • H04L12/1813Arrangements for providing special services to substations for broadcast or conference, e.g. multicast for computer conferences, e.g. chat rooms
    • H04L12/1827Network arrangements for conference optimisation or adaptation
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/04Real-time or near real-time messaging, e.g. instant messaging [IM]
    • H04L51/046Interoperability with other network applications or services
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/10Multimedia information
    • HELECTRICITY
    • H04ELECTRIC COMMUNICATION TECHNIQUE
    • H04LTRANSMISSION OF DIGITAL INFORMATION, e.g. TELEGRAPHIC COMMUNICATION
    • H04L51/00User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail
    • H04L51/07User-to-user messaging in packet-switching networks, transmitted according to store-and-forward or real-time protocols, e.g. e-mail characterised by the inclusion of specific contents
    • H04L51/18Commands or executable codes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/215Input arrangements for video game devices characterised by their sensors, purposes or types comprising means for detecting acoustic signals, e.g. using a microphone
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20Input arrangements for video game devices
    • A63F13/21Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/216Input arrangements for video game devices characterised by their sensors, purposes or types using geographical information, e.g. location of the game device or player using GPS
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/40Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment
    • A63F13/42Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle
    • A63F13/426Processing input control signals of video game devices, e.g. signals generated by the player or derived from the environment by mapping the input signals into game commands, e.g. mapping the displacement of a stylus on a touch screen to the steering angle of a virtual vehicle involving on-screen location information, e.g. screen coordinates of an area at which the player is aiming with a light gun
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/803Driving vehicles or craft, e.g. cars, airplanes, ships, robots or tanks
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/80Special adaptations for executing a specific game genre or game mode
    • A63F13/812Ball games, e.g. soccer or baseball
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1006Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals having additional degrees of freedom
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/10Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals
    • A63F2300/1087Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera
    • A63F2300/1093Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterized by input arrangements for converting player-generated signals into game device control signals comprising photodetecting means, e.g. a camera using visible light
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/204Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform the platform being a handheld device
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/20Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform
    • A63F2300/205Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game characterised by details of the game platform for detecting the geographical location of the game platform
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/60Methods for processing data by generating or executing the game program
    • A63F2300/66Methods for processing data by generating or executing the game program for rendering three dimensional images
    • A63F2300/6661Methods for processing data by generating or executing the game program for rendering three dimensional images for changing the position of the virtual camera
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F2300/00Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game
    • A63F2300/80Features of games using an electronically generated display having two or more dimensions, e.g. on a television screen, showing representations related to the game specially adapted for executing a specific type of game
    • A63F2300/8082Virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2200/00Indexing scheme relating to G06F1/04 - G06F1/32
    • G06F2200/16Indexing scheme relating to G06F1/16 - G06F1/18
    • G06F2200/163Indexing scheme relating to constructional details of the computer
    • G06F2200/1637Sensing arrangement for detection of housing movement or orientation, e.g. for controlling scrolling or cursor movement on the display of an handheld computer
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/01Indexing scheme relating to G06F3/01
    • G06F2203/012Walk-in-place systems for allowing a user to walk in a virtual environment while constraining him to a given position in the physical environment

Abstract

Methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device are presented. In one method, a signal is received and the portable device is synchronized so that the position of the portable device becomes a reference point in three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position of the portable device in the 3D space with respect to the reference point and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device, with a perspective based on that position. Additionally, the created view is displayed on the portable device, and the view of the virtual scene is changed as the user moves the portable device in the 3D space. In another method, multiple players sharing the virtual reality interact with one another while observing objects in the virtual reality.

Description

Maintaining multiple views on a shared stable virtual space
Technical field
The present invention relates to methods, devices, and computer programs for controlling a view of a virtual scene with a portable device, and more particularly to methods, devices, and computer programs for enabling multiplayer interaction in a virtual or augmented reality.
Background
Virtual reality (VR) is a computer-simulated environment, whether a simulation of the real world or an imaginary world, in which users can interact with a virtual environment or virtual artifacts (VA) by using standard input devices or specialized multidirectional input devices. The simulated environment can be similar to the real world, for example a simulation for pilot or combat training, or it can differ significantly from reality, as in VR games. Virtual reality is often used to describe a wide variety of applications commonly associated with immersive, highly visual 3D environments. The development of computer-aided design (CAD) software, graphics hardware acceleration, head-mounted displays, data gloves, and miniaturization have helped popularize the concept.
Augmented reality (AR) provides a live view of a physical, real-world environment whose elements are merged with (or augmented by) virtual, computer-generated imagery to create a mixed reality. The augmentation is traditionally in real time and in semantic context with environmental elements, such as sports scores on television during a match. With the help of advanced AR technology (e.g., adding computer vision and object recognition), the information about the user's surrounding real environment becomes interactive and digitally usable.
The term augmented virtuality (AV) is also used in the virtual reality world and is similar to AR. Augmented virtuality also refers to the merging of real-world objects into virtual worlds. As an intermediate state in the virtuality continuum, AV refers to predominantly virtual spaces in which physical elements (e.g., physical objects or people) are dynamically integrated and can interact with the virtual world in real time. Unless otherwise specified, the term VR is used in this application as a generic term that also encompasses AR and AV.
VR games typically require a large amount of computer resources. Implementation of VR games in handheld devices is rare, and the existing games are rather simplistic, with rudimentary VR effects. In addition, multiplayer AR games allow players to interact in a virtual world, but the interactions are limited to objects manipulated by the players in the virtual world (e.g., cars, rackets, balls, etc.). The virtual world is computer generated and independent of the location of the players and the portable devices. The relative positioning of the players with respect to each other and with respect to their environment is not taken into account when creating a "realistic" virtual reality experience.
It is in this context that embodiments of the invention arise.
Summary of the invention
Embodiments of the present invention provide methods, apparatus, and computer programs for controlling a view of a virtual scene with a portable device. It should be appreciated that the present invention can be implemented in numerous ways, such as a process, an apparatus, a system, a device, or a method on a computer-readable medium. Several inventive embodiments of the present invention are described below.
In one embodiment of a method, a signal is received and the portable device is synchronized so that the position of the portable device becomes a reference point in a three-dimensional (3D) space. A virtual scene, which includes virtual reality elements, is generated in the 3D space around the reference point. Further, the method determines the current position of the portable device in the 3D space with respect to the reference point and creates a view of the virtual scene. The view represents the virtual scene as seen from the current position of the portable device, with a perspective based on that position. Additionally, the created view is displayed on the portable device, and the view of the virtual scene is changed as the user moves the portable device in the 3D space. In another method, multiple players sharing the virtual reality interact with one another while observing objects in the virtual reality.
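The core of this embodiment — rendering the scene from the device's current position relative to the reference point — corresponds to a standard look-at camera transform. The sketch below is an illustrative reconstruction, not code from the patent: the reference point is taken as the origin of the 3D space, positions are plain 3-vectors, and the tracked device position is a hypothetical value.

```python
import math

def cross(a, b):
    return (a[1]*b[2] - a[2]*b[1], a[2]*b[0] - a[0]*b[2], a[0]*b[1] - a[1]*b[0])

def normalize(a):
    n = math.sqrt(sum(x * x for x in a))
    return tuple(x / n for x in a)

def to_camera_space(point, eye, target, up=(0.0, 1.0, 0.0)):
    """Express `point` in the frame of a camera at `eye` looking at `target`."""
    f = normalize(tuple(t - e for t, e in zip(target, eye)))  # forward
    s = normalize(cross(f, up))                               # right
    u = cross(s, f)                                           # true up
    d = tuple(p - e for p, e in zip(point, eye))
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(s, d), dot(u, d), -dot(f, d))  # right-handed: camera looks down -z

# The device position at synchronization time becomes the reference point
# (the origin of the virtual scene); later positions are relative to it.
reference_point = (0.0, 0.0, 0.0)
device_position = (0.3, 1.2, 2.0)   # hypothetical tracked position

# A scene element at the reference point, as seen from the device:
element_in_view = to_camera_space(reference_point, device_position, reference_point)
```

Moving `device_position` and recomputing the transform is exactly how the displayed view "changes as the user moves the portable device."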
In another embodiment, a method is presented for sharing a virtual scene among devices. The method includes operations for synchronizing a first device to a reference point in a three-dimensional (3D) space, and for calculating the location of a second device relative to the location of the first device. Further, the method includes an operation for exchanging information between the first device and the second device in order to synchronize the second device to the reference point in the 3D space; the information includes the reference point and the locations of the first and second devices. In addition, a method operation generates a virtual scene in the 3D space around the reference point. The virtual scene is shared by both devices and changes simultaneously on both devices as the devices interact with it. A view of the virtual scene is created as observed from the current location of the first device, with a perspective based on that location, and the created view is displayed on the first device. The method continues by changing the displayed view of the virtual scene as the portable device moves in the 3D space.
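The information exchange described here — sending the reference point and device locations so the second device can express its coordinates in the shared frame — amounts to a rigid transform between the devices' local frames. The sketch below assumes a yaw-only (vertical-axis) misalignment between the two frames; the function and variable names are illustrative assumptions, not the patent's protocol.

```python
import math

def to_shared_frame(p_local, ref_in_local, yaw_offset):
    """Map a point from a device's local frame into the shared frame anchored
    at the reference point: translate so the reference point becomes the
    origin, then rotate about the vertical (y) axis by the heading offset."""
    x, y, z = (p - r for p, r in zip(p_local, ref_in_local))
    c, s = math.cos(yaw_offset), math.sin(yaw_offset)
    return (c * x + s * z, y, -s * x + c * z)

# Hypothetical synchronization data exchanged between the two devices:
ref_in_local_b = (1.0, 0.0, -2.0)   # the reference point, as device B measures it
yaw_b = math.pi / 2                 # device B's frame is rotated 90° from the shared frame

# Device B's own position, expressed in the shared frame:
b_shared = to_shared_frame((1.0, 0.5, 0.0), ref_in_local_b, yaw_b)
```

Once both devices express positions this way, each can render the same scene around the same reference point, so changes appear on both simultaneously.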
In another embodiment, manner of execution is for being used the view of the first equipment control virtual scene. InstituteThe method of stating comprises the first reference point to the first three-dimensional (3D) space for synchronous described the first equipment.In another operation, set up the communication link between described the first equipment and the second equipment. Described second establishesStandby the second 3d space in described the first 3d space outside, described the second device synchronization is to described secondThe second reference point in 3d space. In addition, manner of execution operation, comprises virtual reality element for generatingPublic virtual scene, wherein said public virtual scene can by the first and second equipment both observe. InstituteFirst equipment of stating is set up described public virtual scene around described the first reference point, described the second equipment aroundDescribed the second reference point is set up described public virtual scene. Two equipment can with described virtual reality elementAlternately. In addition, described method comprise for determine described the first equipment with respect to described reference point describedThe operation of the current location in the first 3d space. The performance of described view as from as described in the first equipment currentThe public virtual scene at the visual angle of the current location based on described the first equipment is watched and had in position. ?In described the first equipment, show the view creating, and when described the first equipment is at described the first 3d spaceMiddle when mobile, the view of the demonstration of described public virtual scene changes.
In a further embodiment, a method operates to control a view of a virtual scene with a portable device. In one operation, the portable device is synchronized to a reference point in a three-dimensional (3D) space in which the portable device is located. The portable device includes a front camera facing the front of the portable device and a rear camera facing the rear of the portable device. Further, an operation is executed to generate a virtual scene, including virtual reality elements, in the 3D space around the reference point. The current position of the portable device in the 3D space with respect to the reference point is determined. In another method operation, a view of the virtual scene is created. The view captures a representation of the virtual scene as seen from the current eye position, in the 3D space, of the player holding the portable device — corresponding to what the player would see through a window into the virtual scene, where the position of the window in the 3D space is equivalent to the position, in the 3D space, of the display of the portable device. The method also includes operations for showing the created view on the display and for changing the displayed view of the virtual scene as the portable device or the player moves in the 3D space.
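The "window" behavior in this embodiment — the display showing what the tracked eye would see through it — is what graphics programmers implement as an asymmetric (off-axis) viewing frustum. A minimal sketch, assuming the screen lies in the z = 0 plane centered at the origin and that the eye position comes from the front-camera tracker; the names and units are illustrative.

```python
def offaxis_frustum(eye, half_w, half_h, near):
    """Near-plane bounds (left, right, bottom, top) for a glFrustum-style
    projection, with the screen as a 2*half_w x 2*half_h window at z = 0
    and the eye at (ex, ey, ez), ez > 0, in front of it."""
    ex, ey, ez = eye
    k = near / ez                      # similar triangles: screen edge -> near plane
    return ((-half_w - ex) * k, (half_w - ex) * k,
            (-half_h - ey) * k, (half_h - ey) * k)

# Eye centered in front of the screen: a symmetric frustum.
centered = offaxis_frustum((0.0, 0.0, 0.5), half_w=0.1, half_h=0.07, near=0.1)

# Eye moved to the right: the frustum skews, revealing more of the scene on
# the left side of the window, as when peering sideways through a window.
shifted = offaxis_frustum((0.05, 0.0, 0.5), half_w=0.1, half_h=0.07, near=0.1)
```

Re-deriving the bounds every frame from the latest eye estimate is what makes the display behave like a window rather than a fixed camera.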
In yet more embodiments, a portable device is used for interacting with an augmented reality. The portable device includes a position module, a virtual reality generator, a view generator, and a display. The position module determines the position of the portable device in the 3D space in which the portable device is located, and the position of the portable device is set as a reference point in the 3D space when the portable device receives a signal for synchronization. The virtual reality generator creates a virtual scene, which includes virtual reality elements, in the 3D space around the reference point. Further, the view generator creates a view of the virtual scene, where the view represents the virtual scene as seen from the position of the portable device, with a perspective based on that position. Additionally, the display shows the view of the virtual scene. The scene shown in the display changes as the portable device moves in the 3D space.
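The modules this embodiment enumerates (position module, virtual reality generator, view generator, display) can be sketched as a simple composition. All class, method, and parameter names below are illustrative assumptions for exposition, not an actual device API.

```python
class PortableDevice:
    """Sketch of the described modules: a position module reports the device's
    3D position; the sync signal fixes the reference point; the view generator
    derives the view from the device's offset to that reference point."""

    def __init__(self, position_module):
        self.position_module = position_module  # callable returning (x, y, z)
        self.reference_point = None

    def on_sync_signal(self):
        # The device position at synchronization becomes the reference point.
        self.reference_point = self.position_module()

    def generate_view(self):
        pos = self.position_module()
        # View generator: perspective based on the offset from the reference point.
        return tuple(p - r for p, r in zip(pos, self.reference_point))

# Usage with a stubbed position module standing in for real sensors:
current = {"pos": (0.0, 1.0, 0.0)}
device = PortableDevice(lambda: current["pos"])
device.on_sync_signal()             # reference point = (0.0, 1.0, 0.0)
current["pos"] = (0.5, 1.0, -0.3)   # device moves; the displayed view changes
view = device.generate_view()       # offset from the reference point
```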
In other embodiments, computer programs embedded in a computer-readable storage medium, when executed by one or more computers, perform the methods of the present invention.
Other aspects of the invention will become apparent from the following detailed description, taken in conjunction with the accompanying drawings, which illustrate by way of example the principles of the invention.
Brief description of the drawings
Can understand best the present invention with reference to following description by reference to the accompanying drawings, in accompanying drawing:
Fig. 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment.

Fig. 2 illustrates a virtual reality scene observed with the portable device.

Fig. 3 illustrates an augmented reality chess game with a virtual board and blending of the player's hand, according to one embodiment.

Fig. 4 depicts a multi-player virtual reality game, according to one embodiment.

Fig. 5 illustrates one embodiment of a calibration method for a multi-player environment.

Fig. 6 illustrates how to play an interactive game over a network connection, according to one embodiment.

Fig. 7 shows an interactive game that does not depend on the location of the portable device.

Fig. 8 shows an interactive game where the view in the display depends on the position of the portable device, according to one embodiment.

Fig. 9 illustrates how movement of the portable device has an effect on the display similar to moving a camera in the virtual space, according to one embodiment.

Fig. 10 shows a two-dimensional representation of the change in the image shown in the display when the portable device is rotated, according to one embodiment.

Fig. 11 shows a portable device for playing a VR game, according to one embodiment.

Figs. 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment.

Figs. 13A-13B illustrate an augmented reality game played between users at remote locations, according to one embodiment.

Figs. 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment.

Fig. 15 illustrates an embodiment for implementing a viewing frustum on a portable device using the front and rear cameras.

Figs. 16A-16B illustrate the effect of changing the viewing frustum as the player moves, according to one embodiment.

Fig. 17 illustrates how a virtual camera is used to span views of a virtual scene, according to one embodiment.

Figs. 18A-18H show a sequence of views for illustrating the viewing-frustum effect, according to one embodiment.

Figs. 19A-19B illustrate embodiments for combining the viewing-frustum effect with a camera effect.

Fig. 20 shows the flow of an algorithm for controlling the view of a virtual scene with a portable device, in accordance with one embodiment of the invention.

Fig. 21 illustrates the architecture of a device that may be used to implement embodiments of the invention.

Fig. 22 is an exemplary illustration of scenario A through scenario E, according to one embodiment of the invention, where each of users A through E interacts with a game client 1102 connected to a server process via the Internet.

Fig. 23 illustrates an embodiment of an Information Service Provider architecture.
Detailed Description of the Invention

The following embodiments describe methods, apparatus, and computer programs for controlling the view of a virtual scene in a virtual or augmented reality. It will be apparent, however, to one skilled in the art, that the present invention may be practiced without some or all of these specific details. In other instances, well-known process operations have not been described in detail in order not to unnecessarily obscure the present invention.
Fig. 1 depicts a user before synchronizing a portable device to a reference point in space, according to one embodiment. Portable device 104 is sitting on a table in preparation for synchronizing the portable device to a reference point. User 102 has placed the portable device on a point that will serve as the reference point or anchor around which a virtual reality is built. In the case shown in Fig. 1, the portable device is sitting near the approximate center of the table, and once the portable device is synchronized, a virtual world is built around the center of the table. The portable device can be synchronized in a variety of ways, such as pushing a button on portable device 104, touching the touch-sensitive screen in the portable device, letting the device stand still for a period of time (e.g., five seconds), entering a voice command, etc.
Once the portable device receives the input to be synchronized, the position tracking modules in the portable device are reset. The portable device can include a variety of position tracking modules, as discussed below in reference to Fig. 21, such as an accelerometer, a magnetometer, a Global Positioning System (GPS) device, a camera, a depth camera, a compass, a gyroscope, etc.
The portable device can be one of many types, such as a handheld portable gaming device, a cell phone, a tablet, a notebook, a netbook, a Personal Digital Assistant (PDA), etc. Embodiments of the invention are described with reference to a portable gaming device, but the principles can be applied to any portable electronic device with a display. The principles of the invention can also be applied to game controllers or other input devices connected to a computing device with a display.
Fig. 2 illustrates a virtual reality scene observed with the portable device. After device 104 is synchronized with respect to reference point 106, the portable device starts displaying a view of the virtual reality 108. The view in the display is created by simulating that a camera at the back of the portable device moves within the 3D space around reference point 106. Fig. 2 depicts a virtual reality that includes a chess board. Portable device 104 can detect motion and determine its relative position with respect to reference point 106 as the device moves around. Position and location determination can be done with different methods and different levels of accuracy. For example, location can be detected by analyzing images captured with a camera, or by analyzing data obtained from inertial systems, GPS, ultrasonic triangulation, WiFi communications, dead reckoning, etc., or a combination thereof.
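The patent lists dead reckoning as one of the position sources without giving an algorithm. The sketch below shows only that piece — integrating accelerometer samples from the reference point — in Python; the class and function names, the sample rate, and the units are my own assumptions, not the patent's.

```python
# Minimal dead-reckoning sketch: position relative to the synchronized
# reference point, assuming accelerometer readings already in world frame.
from dataclasses import dataclass


@dataclass
class TrackedPose:
    """Device position (m) and velocity (m/s) relative to the reference point."""
    x: float = 0.0
    y: float = 0.0
    z: float = 0.0
    vx: float = 0.0
    vy: float = 0.0
    vz: float = 0.0


def dead_reckon(pose: TrackedPose, accel: tuple, dt: float) -> TrackedPose:
    """Advance the pose by integrating one accelerometer sample over dt seconds."""
    ax, ay, az = accel
    # Acceleration -> velocity, then velocity -> position.
    nvx, nvy, nvz = pose.vx + ax * dt, pose.vy + ay * dt, pose.vz + az * dt
    return TrackedPose(
        x=pose.x + nvx * dt, y=pose.y + nvy * dt, z=pose.z + nvz * dt,
        vx=nvx, vy=nvy, vz=nvz,
    )


# Synchronizing resets the pose: device sits at the reference point, at rest.
pose = TrackedPose()
# Constant 1 m/s^2 acceleration along x for one second, sampled at 100 Hz.
for _ in range(100):
    pose = dead_reckon(pose, (1.0, 0.0, 0.0), 0.01)
```

In practice dead reckoning drifts quickly, which is why the paragraph above describes combining it with cameras, GPS, and other sources.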
In one embodiment, the device tracks both the location in space of the portable device with respect to reference point 106, and the orientation in space of the portable device. The orientation is used to determine the viewing angle of the camera; that is, the portable device acts as a camera into the virtual scene. If the portable device is aimed towards the right, then the view will turn to the right. In other words, the viewing angle is defined as a vector with origin in the center of the display (or another part of the device), and with a direction perpendicular to and away from the display. In another embodiment, only the position in space is tracked, and the view in the display is calculated as if the camera were aimed from the position in space where the portable device is located towards the reference point.
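The two tracking modes just described can be sketched as a single function that returns the camera's unit view vector. This is an illustrative Python sketch under my own naming; the patent specifies no code.

```python
import math


def view_direction(display_center, display_normal, reference_point,
                   track_orientation=True):
    """Return the unit view vector for the virtual camera.

    With orientation tracking, the view is the display's outward normal
    (the device aims into the scene like a camera).  Without it, the
    camera sits at the device and is aimed at the reference point.
    """
    if track_orientation:
        v = display_normal
    else:
        v = tuple(r - c for r, c in zip(reference_point, display_center))
    norm = math.sqrt(sum(a * a for a in v))
    return tuple(a / norm for a in v)
```

For a device at the origin whose display faces +z, orientation tracking yields the view (0, 0, 1), while position-only tracking aims the camera at the reference point instead.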
In some existing implementations, an augmented reality (AR) tag is placed on a table and used as a fiduciary marker for generating an augmented reality. The AR tag may be an object or figure that is recognized when present in the captured image stream of the real environment. The AR tag serves as a fiduciary marker which enables the determination of a location within the real environment. Embodiments of the invention eliminate the need for AR tags, because of the synchronizing into the 3D space and the tracking of the location of the portable device. Additionally, the location information allows games in the portable device to deliver a realistic 3D virtual experience. Further, an array of networked portable devices can be used to create a shared virtual world, as described below in reference to Fig. 4.
Fig. 3 illustrates an augmented reality chess game with a virtual board and blending of the player's hand, according to one embodiment. Images of the 3D space are used to create an augmented reality by combining real and virtual elements with respect to the calibration point, and provide optical motion-capture-like functionality. With a calibrated multi-camera technique, it is possible to determine the position of a hand or an arm, so players can "reach" into the augmented reality scene and interact with game objects (chess pieces).
In one embodiment, two cameras at the back of a single device are used to determine the location of objects in the 3D space. A depth camera can also be used to obtain three-dimensional information. In other embodiments, cameras from multiple devices are used to determine the location of hand 306, as discussed below in reference to Fig. 4. While holding portable device 302 in one hand, the player gazes through screen 304 and reaches into the game area generated in front of them to touch 3D game objects and environments. Game play is completely tactile. It is possible for multiple players to reach into the game area simultaneously and interact with game objects in intricate ways. For example, the hand 306 of a player can interact with a virtual object by interfacing with, holding, pushing, pulling, grabbing, moving, smashing, squeezing, hitting, throwing, fighting, opening, closing, turning on or off, pressing a button, firing at, eating, etc., the virtual object.
Each portable device that is synchronized to the game area adds another potential camera, as well as relative motion tracking and ping data, making it possible to see the player's hands and fingers from multiple viewpoints to create an effective 3D-camera-based motion-capture field. The hand and the virtual space are blended together, where the virtual elements in the virtual space appear in the displayed view as if the virtual elements were part of the 3D space. The view of the virtual elements changes, from a geometric perspective, in the same way that the view of real elements changes when the portable device moves within the 3D space.
Fig. 4 depicts a multi-player virtual reality game, according to one embodiment. When calibrated position and image-analysis data are combined with high-speed connectivity, position and game information can be exchanged between each of the devices that choose to participate in a shared-space game experience. This allows each player's system access to the camera images and position information of all other players, in order to synchronize their calibrated positions together and share a virtual space (also referred to as a shared space).
After players 402A-402C have synchronized or calibrated their portable devices with reference to a point in the common 3D space (such as a point on a table), a common virtual scene 404 is created. Each player has a view of virtual scene 404 as if the virtual space — a battle board game in this case — were real, on the table in front of the players. The portable devices act as cameras, so that as a player moves the device around, the view changes. As a result, the actual view on each display is independent of the views in the other displays, and the view is based only on the relative position of the portable device with respect to the actual scene, which is anchored to an actual physical location in the 3D space.
By utilizing multiple cameras, accelerometers, and other mechanical devices to determine position, together with high-speed communication between the portable devices, it is possible to create a 3D motion-capture-like experience that allows players to see, and possibly touch, virtual game characters and environments in a believable way.

Shared-space 404 games utilize the high-speed connectivity of the devices to exchange information among the devices participating in the shared-space game experience. The shared-space 404 play area is viewed through the devices by turning each device into a stable "magic window" that persists in the space between the devices. By using a combination of motion tracking, image analysis, and high persistence of information between the devices, the play area appears in a stable position even as the devices move around.
Fig. 5 illustrates one embodiment of a calibration method for a multi-player environment. As previously described, the position information gained from the device sensors (accelerometer, GPS, compass, depth camera, etc.) is transmitted to the other linked devices to enhance the data collaboratively maintained in the virtual space. To create a common shared space synchronized to a common reference point 502, a first player 504A synchronizes her device into the 3D space with respect to reference point 502. The other players in the shared space then establish communication links with the first player to exchange position and game information. The relative positions can be obtained in different ways, such as using WiFi triangulation and ping tests to determine relative positions. In addition, visual information can be used to determine other locations, such as detecting the faces of other players and, from their faces, the possible locations of their gaming devices.
In one embodiment, audio triangulation is used to determine relative position, by means of ultrasonic communications and directional microphones. Multiple frequencies can be used to perform the audio triangulation. Once the devices have exchanged position information, wireless communication, such as ultrasonic, WiFi, or Bluetooth, is used to synchronize the remaining devices to reference point 502. After all the devices are calibrated, the devices have knowledge of reference point 502 and of their relative position with respect to reference point 502. It should be appreciated that other methods can be used to calibrate multiple devices to a shared reference point. For example, all devices may be calibrated to the same reference point by placing each device on the reference point in turn.
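The patent does not detail the triangulation math. As an illustration only, the sketch below locates a device on a plane from ultrasonic time-of-flight to three anchor devices at known positions; the function name, the closed-form three-anchor method, and the assumption of a shared clock are mine, not the patent's.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s, approximate value at room temperature


def trilaterate_2d(anchors, times_of_flight):
    """Locate a device on a plane from time-of-flight to three anchors.

    Each time-of-flight gives a range circle around an anchor; subtracting
    the circle equations pairwise yields a 2x2 linear system in (x, y).
    """
    (x1, y1), (x2, y2), (x3, y3) = anchors
    r1, r2, r3 = (SPEED_OF_SOUND * t for t in times_of_flight)
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x2), 2 * (y3 - y2)
    c2 = r2**2 - r3**2 + x3**2 - x2**2 + y3**2 - y2**2
    det = a1 * b2 - a2 * b1  # zero if the anchors are collinear
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)
```

A real system would also have to estimate clock offsets between devices and average over multiple frequencies, as the paragraph above suggests.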
The virtual scene can be made even more realistic by using shadows and lighting determined by the lighting sources in the room. By using camera feeds, game environments and characters have scene lighting and shadows influenced by the real world. This means that a player's hand will cast a shadow over virtual characters or objects as the hand reaches into the virtual world to interact with the virtual objects. Game-world shadows and lighting are adjusted by real-world shadows and lighting to achieve the best effect possible.
Fig. 6 illustrates how to play an interactive game over a network connection, according to one embodiment. Many types of games are possible within a shared space. For example, the portable device can be used as a paddle to play a game of ping-pong. The device is moved around as if it were a paddle hitting the ball. Players see the ball float between their screen and the opponent's screen. In a war game, the player looks through the portable device and aims a catapult at the enemy's ramparts. The player pulls the device backwards to load the catapult, and then presses a button to fire the catapult toward the enemy's castle.
Shared spaces can also be created when players are in different locations, as shown in Fig. 6. The players have established a network connection to play the game. Each player synchronizes his device to a reference point in his own space, and a virtual reality, such as a ping-pong table, is created. The opponent is shown behind his end of the table, where the movement of the opponent's device is matched to the motions of the opponent's paddle. The game may also add an avatar to hold the paddle, for an even more realistic game experience. During play, each device tracks the motion and position of the device in its space. This information is shared with the other device so that the other device can place a virtual paddle matching the device's motion. Other game information is also shared, such as the location and movement of the ball.
Fig. 7 shows an interactive game that does not depend on the location of the portable device. The game illustrated in Fig. 7 shows the limitation of not synchronizing the game with respect to a reference point 706. A multi-player air hockey game is played simultaneously on two separate devices 704C and 702A. The game includes a hockey rink 708, a puck 714, and mallets 710 and 712. Each player controls a mallet by moving a finger on the display. The displays show the location of the puck and the mallets. However, the view on the display does not change as the portable device moves around, because there is no geographic synchronization to a reference point. For example, when player 702A moves to location 702B, the view is the same, irrespective of where the device is located.

In order to play the game, each portable device exchanges only information regarding the movement of the mallets and the location of the puck. There is no virtual experience tied to the 3D space.
Fig. 8 shows an interactive game where the view in the display depends on the position of the portable device, according to one embodiment. Devices 802A and 802B have been calibrated to a common space, and a hockey rink has been created as a virtual element. The devices act as cameras into the space, and a device does not need to show the complete playing surface. For example, when the device is pulled away from the reference point, a zoom-out effect takes place and a larger view of the rink becomes available. Further, if the device is tilted upward, the view shows the top of the rink, and if the device is tilted downward, the view in the device gets closer to the player's own goal. As seen in Fig. 8, the views in the displays are independent of each other and are based on the current view of the playing surface from each portable device.
Fig. 9 illustrates how movement of the portable device has an effect on the display similar to moving a camera in the virtual space, according to one embodiment. Assuming the portable device is aimed from a point on a sphere toward car 902, multiple views of the car can be obtained as the portable device moves around the sphere. For example, a view from the "north pole" will show the roof of the car, and a view from the "south pole" will show the bottom of the car. Views of the sides, front, and rear of the car are also shown in Fig. 9.
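The sphere-of-viewpoints idea above is the standard orbit-camera construction. A minimal Python sketch, with my own parameter names and a y-up convention the patent does not specify:

```python
import math


def orbit_camera(radius, polar_deg, azimuth_deg, target=(0.0, 0.0, 0.0)):
    """Camera position on a sphere around `target`, plus the unit direction
    from the camera toward the target (y is 'up').

    polar 0 deg   = 'north pole' (looking down at the car roof),
    polar 180 deg = 'south pole' (looking up at the underside).
    """
    polar, azimuth = math.radians(polar_deg), math.radians(azimuth_deg)
    tx, ty, tz = target
    pos = (tx + radius * math.sin(polar) * math.cos(azimuth),
           ty + radius * math.cos(polar),
           tz + radius * math.sin(polar) * math.sin(azimuth))
    # The camera sits exactly `radius` away, so dividing by radius normalizes.
    look = tuple((t - p) / radius for p, t in zip(pos, (tx, ty, tz)))
    return pos, look


# North pole: camera directly above the car, looking straight down.
pos, look = orbit_camera(10.0, 0.0, 0.0)
```

Sweeping `polar_deg` and `azimuth_deg` as the device moves reproduces the side, front, and rear views described for Fig. 9.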
In one embodiment, the player can enter a command to change or flip the view of the virtual world. For example, in the case of the car, the player goes from seeing the front of the car to seeing the back of the car, as if the scene had rotated 180° about an axis running vertically through the reference point. This way, the player does not have to move around the room to get a different viewing angle. Other inputs can produce different effects, such as a 90° rotation, a scaling of the view (to make the virtual world seem smaller or bigger), a rotation with respect to the x, y, or z axis, etc. In another embodiment, a flip of the portable device, i.e., a 180° spin in the player's hand, causes the view of the virtual world to flip upside down.
Fig. 10 shows a two-dimensional representation of the change in the image shown in the display when the portable device is rotated, according to one embodiment. Portable device 152 is aimed at a wall with a viewing angle α, resulting in a projection 160 on the wall. Thus, the view on portable device 152 corresponds to projection 160. When device 152 is turned an angle β, the portable device ends at position 154. The view also turns by the angle β, while the camera viewing angle α is maintained. As a result, the view on the portable device corresponds to projection 162. It should be noted that the view on the screen is independent of the eye position, such as positions 158 and 156, and the view is independent of where the player is located. Additionally, the view on the display depends on the position of the portable device, which is acting as a virtual camera. Other embodiments described below include views on the display that change according to the position of the eyes.
Fig. 11 shows a portable device for playing a VR game, according to one embodiment. Figs. 11 through 12F illustrate a racing game where the portable device can be used as a camera or to control the driving of the vehicle. The portable device shows a view of the race, where the track is seen in the center, with other racing cars, and people sitting in the stands on one side of the track.
Figs. 12A-12F illustrate how the position of the portable device affects the view in the display, according to one embodiment. In this sequence, the portable device is used as a camera, not to drive the car. Fig. 12A shows the player holding the portable device while playing the racing game. The device is held in front of the player at approximately arm's length. When the player is in the position shown in Fig. 12A, the view of the game is the one illustrated in Fig. 12B, where the view on the display shows the race as seen by the driver. The driver can see the track ahead and part of the inside of the car, including the steering wheel.
Fig. 12C shows the player turning about 45° while still holding the portable device in front of him. As a result, the portable device moves in space together with the player. The result of the player's motion is seen in Fig. 12D, where the view of the racing track has also turned approximately 45°. It can be seen that the portable device acts as a camera, and the view on the display changes as if the camera were changing position in the 3D world.

Fig. 12E shows the player turning another 45° to the left. As a result, the orientation of the portable device and the view have changed about 90° relative to the original position. The result on the display is depicted in Fig. 12F, where the driver in the game now has a side view that includes another racing car and the stands.
Figs. 13A-13B illustrate an augmented reality game played between users at remote locations, according to one embodiment. Fig. 13A shows a portable device with a camera 1302 facing the player holding the portable device. The player-facing camera has many uses, such as teleconferencing, viewing-frustum applications (see Figs. 15-19B), incorporating the player's face into a game, etc.
Fig. 13B illustrates an embodiment of an augmented reality game that produces a realistic effect. Player 1308 is at a remote location, and game and environment information is exchanged via a network connection. A camera at the remote location takes a picture of the player and his vicinity, such as background 1310. The image is sent to the opponent's device, where the image is blended with a virtual chess board 1306. Similarly, camera 1304 takes a picture of the player holding the device and sends the image to the remote player. In this way, the players can share a space.
Each player sees his own view as an augmented reality that fades into a virtual-reality fog where the view crosses over into the other player's screen. All the movements of each player are still tracked with respect to the synchronized calibrated positions for both devices. The game inserts a virtual chess board on top of the table, providing a 3D experience. As previously described, the portable device can be moved around to change the view and to see the board from different perspectives, such as from the top, from the side, from the opponent's view, etc.
In one embodiment, the required communication and processing bandwidth is reduced by periodically updating the opponent's face and background, instead of using a live feed. Additionally, it is possible to send only a portion of the remote image, such as the image of the player, since the background may be static and less relevant. For example, the face of the remote player can be updated every five seconds, every time the player changes expression, when the player talks, etc.

In another embodiment, sound can also be exchanged between the players, to make the 3D experience even more realistic. In another embodiment, the players have the option of changing the view, such as toggling between the blended 3D image and displaying only the chess board to improve the view of the board. In yet another embodiment, image stabilization can be used to stabilize small image variations caused by slight shaking of the player's hands. In one embodiment, the face of the player holding the device can also be added to the display, to show how this user appears to the opponent.
Figs. 14A-14H depict the changes in the display as the portable device changes position, according to one embodiment. In the sequence of Figs. 14A-14H, the portable device uses a viewing-frustum effect to determine how the augmented-reality world is presented to the user.
In current 3D computer graphics, the viewing frustum or view frustum is the region of space in the modeled world that may appear on the screen. The viewing frustum is the field of view of a notional camera. The exact shape of this region depends on what kind of camera lens is being simulated, but typically it is a frustum of a rectangular pyramid (hence the name). The planes that cut the frustum perpendicular to the viewing direction are called the near plane and the far plane. In one embodiment, the near plane corresponds to the surface of the display in the portable device. Objects closer to the camera than the near plane, or beyond the far plane, are not drawn.
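The near/far-plane culling just described can be shown as a containment test for a symmetric rectangular frustum. This is a standard graphics construction rather than anything from the patent; the parameter names and the camera-space convention are my own.

```python
import math


def in_view_frustum(point, near, far, fov_deg, aspect):
    """Test whether a camera-space point (x right, y up, z forward) lies
    inside a symmetric rectangular view frustum.

    Points nearer than the near plane or beyond the far plane are culled,
    as are points outside the pyramid's four side planes.
    """
    x, y, z = point
    if not (near <= z <= far):
        return False
    # The frustum cross-section grows linearly with depth z.
    half_h = z * math.tan(math.radians(fov_deg) / 2.0)
    half_w = half_h * aspect
    return abs(x) <= half_w and abs(y) <= half_h
```

With a 90° vertical field of view, the cross-section half-height equals the depth, so a point 10 units to the side at depth 5 is outside the frustum while a centered point at the same depth is inside.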
In one embodiment, the viewing frustum has its apex (the top of the pyramid) in the eye (or between the eyes) of the player holding the portable device. The display acts as a window into the virtual reality. Therefore, the closer the "window" is to the eye, the larger the area of the virtual reality that is displayed. Conversely, the farther the "window" is from the eye, the smaller (and more detailed) the view of the virtual reality. The effect is similar to looking through a rectangular old-style peephole without distorting optics: the closer the eye is to the peephole, the more of the outside can be observed.
Fig. 14A shows the player holding the augmented-reality portable device inside a room. After the device has been synchronized to the room, the virtual reality generator has added a virtual triangle "painted" on the wall facing the player, and a square "painted" on the wall to the player's left. In Fig. 14A, the player is holding the device slightly below eye level with the arms almost fully extended. The view shown in the display is presented in Fig. 14B, showing a portion of the triangle in front of the player.
In Fig. 14C, the player is in the same position and has bent the elbows to bring the portable device closer to the face. Due to the viewing-frustum effect discussed above, the player sees a larger section of the wall. Fig. 14D shows the view displayed in the device of Fig. 14C. Because of the frustum effect, a larger section of the wall is observed, as compared to the previous view of Fig. 14B. The complete triangle is now seen on the display.

Fig. 14E shows the player moving the device downward to see the bottom part of the wall, as shown in Fig. 14F. The bottom section of the triangle is shown on the display. In Fig. 14G, the player has turned to the left and is using the "window" into the augmented world to look at a corner of the room, as shown in Fig. 14H.
Fig. 15 illustrates an embodiment for implementing a viewing frustum on a portable device using the front and rear cameras. Fig. 15 shows a 2D projection of the viewing frustum; since it is a 2D projection, the viewing-frustum pyramid is seen as a triangle. Portable device 1506 includes front and rear cameras 1514 and 1512, respectively. Camera 1512 is used to capture images of the space where the player is located. Camera 1514 is used to capture images of the player holding device 1506. Face-recognition software allows the device's software to determine the location of the player's eyes in order to simulate the viewing-frustum effect.
In one embodiment, the viewing frustum has its apex at the eye, with the edges of the rectangular frustum pyramid extending from the eye and through the corners of the display in the handheld device. When the eye is at position 1502, the player "sees" area 1510 of the wall facing the device. Lines originating at the eye and touching the corners of the display intersect the wall to define area 1510. When the eye moves to position 1504, the lines originating at the eye change as a result. The new lines define area 1508. In summary, if portable device 1506 is kept stationary, a change in eye position causes a change in what is shown in the display. Of course, if the portable device moves, the view also changes, because the viewing frustum changes as the edges of the pyramid intersect the corners of the display.
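The wall regions 1508 and 1510 follow from similar triangles: each ray from the eye through a display edge is extended to the wall. The 1D Python sketch below illustrates this geometry for one lateral axis; the function name, argument layout, and the assumption that the wall is parallel to the display are mine, not the patent's.

```python
def wall_window(eye_offset, display_edges, display_distance, wall_distance):
    """Return the lateral interval of a wall seen through a display 'window'.

    `eye_offset` is the eye's lateral position; `display_edges` are the
    lateral positions of the two display edges.  The display sits at
    `display_distance` from the eye and the wall at `wall_distance`
    (both measured along the viewing axis, wall parallel to display).
    Each ray eye->edge is scaled out to the wall by similar triangles.
    """
    scale = wall_distance / display_distance
    return tuple(eye_offset + (edge - eye_offset) * scale
                 for edge in display_edges)
```

Holding the display fixed and shifting the eye to the right slides the visible wall interval to the left (while keeping its width), which is exactly the behavior of areas 1510 and 1508 in Fig. 15.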
It should be appreciated that the embodiment illustrated in Fig. 15 is an exemplary implementation of a viewing frustum. Other embodiments may utilize different shapes for the viewing frustum, may scale the viewing-frustum effect, or may add boundaries to the viewing frustum. The embodiment illustrated in Fig. 15 should therefore not be interpreted as exclusive or limiting, but rather as exemplary or illustrative.
Figs. 16A-16B illustrate the effect of changing the viewing frustum as the player moves, according to one embodiment. Fig. 16A includes display 1606 in a portable device, where the surface of the display is parallel to the surface of the wall. When the player looks through the display with the viewing-frustum effect, a rectangular pyramid is created with its apex somewhere on the player's face (such as between the eyes), with edges extending from the eye, touching the corners of display 1606, and continuing to the wall.
When the player is at position 1602, the viewing frustum creates a rectangular base 1610, which is what the player sees on display 1606. When the player moves to position 1604 without moving the display, the viewing frustum changes as a result. The new base of the frustum is rectangle 1608, which is seen in display 1606. The result is that a change in the player's position causes a change of the view of the virtual reality.
Fig. 16B illustrates the zoom effect created when the face moves away from, or closer to, the display while the viewing-frustum effect is being used. When the player is at position 1632, the player sees rectangle 1638, as previously described. If the player moves away from display 1636, without moving the display, a new view corresponding to rectangle 1640 is seen. Thus, when the player moves away, the observed region of the virtual world shrinks, causing a zoom-in effect: because the viewing region in the display is smaller, the objects in this viewing region appear larger on the display. The opposite motion, the player moving closer to display 1636, causes the opposite, zoom-out effect.
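The zoom relation in Fig. 16B reduces to one similar-triangles formula: the width of the visible wall region shrinks as the eye backs away from the display. A hedged Python sketch, with my own parameter names and the same parallel-wall assumption as the figure:

```python
def viewed_wall_width(display_width, eye_to_display, display_to_wall):
    """Width of the wall region visible through the display 'window'
    (Fig. 16B geometry: display parallel to the wall, eye centered).

    The frustum through the display edges reaches the wall at distance
    eye_to_display + display_to_wall, so the visible width scales by
    (d + g) / d.  A larger eye_to_display shrinks this width, which
    magnifies what is drawn -- the zoom-in effect described above.
    """
    d, g = eye_to_display, display_to_wall
    return display_width * (d + g) / d
```

For a 0.2 m wide display with the wall 2.7 m beyond it, moving the eye from 0.3 m back to 0.6 m back cuts the visible wall width nearly in half, which is why the content on screen appears magnified.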
Fig. 17 illustrates how a virtual camera is used to span views of a virtual scene, according to one embodiment. A virtual or augmented reality does not need to be confined within the limits of the room where the player is located, as seen previously in Fig. 11 with the racing game. A virtual world that extends beyond the player's physical boundaries can also be simulated. Fig. 17 illustrates a player viewing a virtual concert. The actual stage is located beyond the walls of the room, and can be simulated to be hundreds of feet away from the portable device, which acts in this case as a virtual camera. A viewing frustum could also be simulated in the same manner.
As observed at the bottom of the figure, different camera positions and viewing angles produce different views on the display. For example, the first location focuses on the backup singers, the second on the main artist, and the third location is aimed at the crowd. The virtual camera can also accept zoom inputs, to zoom in or out like a real camera would.
In one embodiment, scaling is used to navigate through the virtual reality. For example, if the player moves forward one foot, the portable device creates a virtual view as if the player had advanced ten feet. In this way, the player can navigate a virtual world that is larger than the room the player is in.
In another embodiment, the player can enter commands to make the camera move within the virtual reality without actually moving the portable device. Since the portable device is synchronized with respect to a reference point, this motion of the camera without movement by the player has the effect of changing the reference point to a new location. This new reference point can be referred to as a virtual reference point, and it does not have to be located within the actual physical space where the player is. For example, in the scene illustrated in Fig. 17, the player could use a "move forward" command to move the camera backstage. Once the player is "in" the backstage, the player can start moving the portable device around to see the backstage views, as previously discussed.
Figure 18 A-18H illustrates according to an embodiment, for illustrating a series of of viewing frustum effectView. Figure 18 A illustrates the player who holds portable set. View on display is corresponding to Figure 18 BThe image of shown forest. In Figure 18 C, player moves his head to his the right, will simultaneouslyPortable set remains on the approximately uniform position with Figure 18 A. Figure 18 D is corresponding to for Figure 18 C middle reachesPlay person's view, and how the panorama that forest is shown is because viewing frustum effect changes.
In Figure 18E, the player keeps his head turned to his right while moving the portable device to his left in order to emphasize the view-frustum effect, because the player wants to know whether there is something behind the tree. Figure 18F shows the display corresponding to the player of Figure 18E; the panorama of the forest has changed again. An elf is hidden behind one of the trees and is not visible in Figure 18B, but as the player changes his viewing angle on the forest, part of the elf becomes visible in Figure 18F. Figure 18G shows the player angling his head even further to his right and moving the portable device further to his left. As seen in Figure 18H, the effect is that the player can now see what is behind the tree: the elf is now fully visible.
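Geometrically, the view-frustum effect in Figures 18A-18H amounts to projecting the display's edges from the eye onto the scene, so moving the head changes what is visible "through" the display. The following 2D sketch is a hypothetical illustration only, with assumed coordinates: the display lies at depth zero and the eye is behind it.

```python
# Sketch of the view-frustum effect: the display acts as a window, and the
# eye position determines which part of the virtual scene is visible.
def visible_interval(eye_x, eye_z, win_left, win_right, wall_z):
    """Project the window edges from the eye onto a wall at depth wall_z.

    Coordinates are 1-D (x) plus depth (z): the eye sits at (eye_x, eye_z)
    with eye_z < 0, and the display lies at z = 0 from win_left to win_right.
    """
    def project(edge_x):
        # Similar triangles: parameter t where the eye->edge ray hits the wall.
        t = (wall_z - eye_z) / (0.0 - eye_z)
        return eye_x + t * (edge_x - eye_x)
    return project(win_left), project(win_right)

# Head centered: a symmetric slice of the wall is visible.
visible_interval(0.0, -1.0, -1.0, 1.0, 2.0)   # (-3.0, 3.0)
# Head moved right: more of the wall to the left is revealed,
# like peeking behind a tree through a window.
visible_interval(1.0, -1.0, -1.0, 1.0, 2.0)   # (-5.0, 1.0)
```

Moving the eye to the right shifts the visible interval to the left, which is exactly the behavior that reveals the elf behind the tree.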
Figure 19 A-19B diagram is for combining the embodiment of viewing frustum effect and camera effect. CanSee that combination viewing frustum and camera effect are because the behavior for setting up virtual view is different and can notEnergy. But when existing for limiting while when using an effect or another regular, combination isPossible. In one embodiment, in the time of player's mobile portable equipment, use camera effect, andAnd when user is during with respect to portable set moving-head, use viewing frustum effect. Two thingsIn the situation that part occurs simultaneously, select an effect, such as viewing frustum.
This combination means that, given a position of the eyes and the portable device, the display may show different views depending on how the eyes and the camera arrived at that position. For example, when eyes 1902 look through device 1906, different views of the virtual reality are shown in Figures 19A and 19B, as discussed below.
With reference to Figure 19 A, the initial equipment 1904 that sees through of eyes 1902 is watched. Use viewing frustum effectEquipment " aiming " virtual reality directly forward of fruit. This causes being derived from viewing frustum cone topα angle, and the camera angle that causes β. The identical 2D table that uses as describe with reference to Figure 10 and 15 beforeExisting, player sees the fragment 1908 on wall in this primary importance. Player is then by equipment angle of rotationDegree γ, to be placed in equipment position 1906. Because player is mobile this equipment, establish so portableStandby corresponding to the movement with camera effect, make also rotational angle γ of virtual camera. Result is displayThe region 1910 of wall is shown now.
Figure 19 B illustrates that player sees through portable set 1906 at initial eye position 1912 and watches. MakeUse viewing frustum effect, and result is the performance on the display in region 1918. Player thenMove to eye position 1902 and there is no mobile portable equipment. Because equipment does not move, so occurViewing frustum effect, and then player sees region 1916 on display. Should be noted thatAlthough eyes 1902 and display 1906 in Figure 19 A and 19B in same position, actual lookingFigure is different, because cause eyes and the sequence of events of display in this position.
Figure 20 shows a flowchart of an algorithm for controlling a view of a virtual scene with a portable device, in accordance with one embodiment of the invention. In operation 2002, a signal is received to synchronize the portable device, such as a button press or a screen touch. In operation 2004, the method synchronizes the portable device, so that a reference point is set at the location in three-dimensional (3D) space where the portable device is located. In one embodiment, the 3D space is the room where the player is located. In another embodiment, the virtual reality includes the room as well as a virtual space that extends beyond the walls of the room.
During operation 2006, a virtual scene is generated in the 3D space around the reference point. The virtual scene includes virtual reality elements, such as the chess board of Figure 2. In operation 2008, the portable device determines its current position in the 3D space with respect to the reference point. In operation 2010, a view of the virtual scene is created. The view represents the virtual scene as seen from the current position of the portable device, with a viewing angle based on that current position. Further, during operation 2012, the created view is shown on a display of the portable device. In operation 2014, the portable device checks whether it has been moved by the user, that is, whether the current position has changed. If the portable device has moved, the method flow returns to operation 2008 to recalculate the current position. If the portable device has not moved, the flow proceeds to operation 2012, where the portable device continues to display the previously created view.
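The flow of Figure 20 can be sketched as a loop. The `Device` class below is a hypothetical stand-in for the portable device's services, used only to make the control flow concrete; it is not part of the patent disclosure.

```python
# Sketch of the Figure-20 control loop (operations 2002-2014).
class Device:
    def __init__(self, positions):
        self.positions = list(positions)  # simulated positions over time
        self.shown = []                   # views sent to the display
        self.renders = 0                  # how often the view was rebuilt

    def current_position(self):
        # Return the next simulated position; hold the last one forever.
        if len(self.positions) > 1:
            return self.positions.pop(0)
        return self.positions[0]

    def render_from(self, position):
        self.renders += 1
        return f"view@{position}"

    def show(self, view):
        self.shown.append(view)

def run_view_loop(device, steps):
    reference = device.current_position()          # operations 2002-2004: sync,
    position = None                                # scene built around reference (2006)
    view = None
    for _ in range(steps):
        new_position = device.current_position()   # operation 2008
        if new_position != position:               # operation 2014: did it move?
            position = new_position
            view = device.render_from(position)    # operation 2010
        device.show(view)                          # operation 2012
    return device

d = run_view_loop(Device([(0, 0, 0), (1, 0, 0), (1, 0, 0)]), steps=2)
```

Note how the loop only re-renders when the position changes and otherwise keeps showing the previously created view, mirroring the branch back from operation 2014 to operation 2012.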
Figure 21 illustrates an architecture of a device that may be used to implement embodiments of the invention. The portable device is a computing device and includes the typical modules present in a portable device, such as a processor, memory (RAM, ROM, etc.), a battery or other power source, and permanent storage (such as a hard disk). A communication module allows the portable device to exchange information with other portable devices, other computers, servers, etc. The communication module includes a Universal Serial Bus (USB) connector, a communication link (such as Ethernet), ultrasonic communication, Bluetooth, and WiFi.
The input module includes input buttons and sensors, a microphone, a touch-sensitive screen, cameras (front-facing, rear-facing, depth camera), and a card reader. Other input/output devices, such as a keyboard or a mouse, can also be connected to the portable device via a communication link, such as USB or Bluetooth. The output module includes a display (with touch-sensitive screen), Light-Emitting Diodes (LEDs), haptic vibration feedback, and speakers. Other output devices can also be connected to the portable device via the communication module.
Information from different sensors can be used by a position module to calculate the position of the portable device. These sensors include a magnetometer, an accelerometer, a gyroscope, a GPS, and a compass. Additionally, the position module can analyze sound or image data captured with the cameras and the microphone to calculate the position. Further yet, the position module can perform tests, such as a WiFi ping test or an ultrasound test, to determine the position of the portable device or the position of other devices in the vicinity.
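One generic way a position module might combine inertial sensors is a complementary filter, which blends a gyroscope's integrated angle (accurate over short intervals) with an accelerometer's tilt estimate (stable over long intervals). This is a common illustration of sensor fusion, not a method disclosed in the patent, and the blending factor `ALPHA` is an assumption.

```python
# Sketch of inertial sensor fusion with a complementary filter.
ALPHA = 0.98  # weight for the gyroscope path; assumed value

def complementary_filter(angle, gyro_rate, accel_angle, dt):
    """Blend the gyro-integrated angle with the accelerometer tilt estimate.

    angle:       previous fused angle estimate (degrees)
    gyro_rate:   angular rate from the gyroscope (degrees/second)
    accel_angle: tilt angle derived from the accelerometer (degrees)
    dt:          time step in seconds
    """
    return ALPHA * (angle + gyro_rate * dt) + (1 - ALPHA) * accel_angle
```

Calling this at each sensor update keeps the estimate responsive to fast rotation while slowly correcting the gyroscope's drift toward the accelerometer reading.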
A virtual reality generator creates the virtual or augmented reality, as previously described, using the position calculated by the position module. A view generator creates the view that is shown on the display, based on the virtual reality and the position. The view generator can also produce sounds originating from the virtual reality generator, using directional effects applied to a multi-speaker system.
It should be appreciated that the embodiment illustrated in Figure 21 is an exemplary implementation of a portable device. Other embodiments may utilize different modules, a subset of the modules, or assign related tasks to different modules. The embodiment illustrated in Figure 21 should therefore not be interpreted as exclusive or limiting, but rather as exemplary or illustrative.
Figure 22 is an exemplary illustration of scenarios A through E, in which each of users A through E interacts with a game client 1102 that is connected to a server processing module via the internet, in accordance with one embodiment of the invention. A game client is a device that allows users to connect to server applications and processing via the internet. The game client allows users to access and play online entertainment content such as, but not limited to, games, movies, music, and photos. Additionally, the game client can provide access to online communications applications such as VOIP, text chat protocols, and email.
A user interacts with the game client via a controller. In some embodiments, the controller is a game-client-specific controller, while in other embodiments the controller can be a keyboard and mouse combination. In one embodiment, the game client is a standalone device capable of outputting audio and video signals to create a multimedia environment through a monitor/television and associated audio equipment. For example, the game client can be, but is not limited to, a thin client, an internal PCI-express card, an external PCI-express device, an ExpressCard device, an internal, external, or wireless USB device, or a Firewire device, etc. In other embodiments, the game client is integrated with a television or other multimedia device such as a DVR, a Blu-ray player, a DVD player, or a multi-channel receiver.
In scenario A of Figure 22, user A interacts with a client application displayed on a monitor 106 using a controller 100 paired with game client 1102A. Similarly, in scenario B, user B interacts with another client application displayed on monitor 106 using a controller 100 paired with game client 1102B. Scenario C illustrates a view from behind user C as he watches a monitor displaying a game and a buddy list from game client 1102C. While Figure 22 shows a single server processing module, in one embodiment there are multiple server processing modules located throughout the world. Each server processing module includes sub-modules for user session control, sharing/communication logic, user geo-location, and load-balance processing services. Furthermore, a server processing module includes network processing and distributed storage.
When a game client 1102 connects to a server processing module, user session control may be used to authenticate the user. An authenticated user can have associated virtualized distributed storage and virtualized network processing. Examples of items that can be stored as part of a user's virtualized distributed storage include purchased media such as, but not limited to, games, videos, and music, etc. Additionally, the distributed storage can be used to save game states for multiple games, customized settings for individual games, and general settings for the game client. In one embodiment, the user geo-location module of the server processing is used to determine the geographic location of the users and their respective game clients. A user's geographic location can be used by both the sharing/communication logic and the load-balance processing service to optimize performance based on geographic location and the processing demands of multiple server processing modules. Virtualizing either or both network processing and network storage allows processing tasks from game clients to be dynamically shifted to underutilized server processing module(s). Thus, load balancing can be used to minimize latency associated both with recall from storage and with data transmission between server processing modules and game clients.
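In its simplest form, the load-balancing idea described above, shifting tasks to underutilized server processing modules, reduces to picking the least-loaded module. The sketch below is a hypothetical illustration; the function name and the utilization representation are assumptions.

```python
# Sketch of load balancing across server processing modules: a game
# client's processing task is assigned to the least-utilized module.
def pick_module(modules):
    """modules: mapping of module name -> current utilization in [0.0, 1.0]."""
    return min(modules, key=modules.get)

pick_module({"us-west": 0.9, "us-east": 0.4, "eu": 0.7})  # "us-east"
```

A fuller version would also weight geo-location, as the patent's geo-location sub-module suggests, so that a nearby but slightly busier module can win over a distant idle one.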
The server processing module has instances of server application A and server application B. The server processing module is able to support multiple server applications, as indicated by server application X1 and server application X2. In one embodiment, server processing is based on a cluster computing architecture that allows multiple processors within a cluster to process server applications. In another embodiment, a different type of multi-computer processing scheme is applied to process the server applications. This allows the server processing to be scaled to accommodate a larger number of game clients executing multiple client applications and corresponding server applications. Alternatively, server processing can be scaled to accommodate increased computing demands required by more demanding graphics processing, game or video compression, or application complexity. In one embodiment, the server processing module performs the majority of the processing via the server applications. This allows relatively expensive components, such as graphics processors, RAM, and general processors, to be centrally located, and reduces the cost of the game client. The processed server application data is sent back to the corresponding game client via the internet to be displayed on a monitor.
Scenario C illustrates an exemplary application that can be executed by the game client and the server processing module. For example, in one embodiment, game client 1102C allows user C to create and view a buddy list 1120 that includes user A, user B, user D, and user E. As shown in scenario C, user C can see either real-time images or avatars of the respective users on monitor 106C. The server processing executes the respective applications of game client 1102C and of the respective game clients 1102 of users A, B, D, and E. Because the server processing is aware of the applications being executed by game client B, the buddy list for user A can indicate which game user B is playing. Further, in one embodiment, user A can view actual in-game video directly from user B. This is enabled by sending the processed server application data for user B to game client A in addition to game client B.
In addition to viewing video from buddies, a communications application can allow real-time communication between buddies. As applied to the previous example, this allows user A to provide encouragement or hints while watching the real-time video of user B. In one embodiment, two-way real-time voice communication is established through a client/server application. In another embodiment, a client/server application enables text chat. In still another embodiment, a client/server application converts speech to text for display on a buddy's screen.
Scenarios D and E illustrate users D and E interacting with game consoles 1110D and 1110E, respectively. Each game console 1110D and 1110E is connected to the server processing module, illustrating a network in which the server processing module coordinates game play for both game consoles and game clients.
Figure 23 illustrates an embodiment of an Information Service Provider architecture. An Information Service Provider (ISP) 250 delivers a multitude of information services to users 262, who are geographically dispersed and connected via a network 266. An ISP can deliver just one type of service, such as stock price updates, or a variety of services such as broadcast media, news, sports, gaming, etc. Additionally, the services offered by each ISP are dynamic; that is, services can be added or taken away at any point in time. Thus, the ISP providing a particular type of service to a particular individual can change over time. For example, a user may be served by an ISP in near proximity to her while the user is in her home town, and may be served by a different ISP when she travels to a different city. The home-town ISP will transfer the required information and data to the new ISP, such that the user information "follows" the user to the new city, making the data closer to the user and easier to access. In another embodiment, a master-server relationship may be established between a master ISP, which manages the information for the user, and a server ISP that interfaces directly with the user under control of the master ISP. In another embodiment, the data is transferred from one ISP to another ISP as the client moves around the world, so that the ISP in a better position to serve the user is the one that delivers these services.
ISP 250 includes an Application Service Provider (ASP) 252, which provides computer-based services to customers over a network. Software offered using an ASP model is sometimes also called on-demand software or software as a service (SaaS). A simple form of providing access to a particular application program (such as customer relationship management) is by using a standard protocol such as HTTP. The application software resides on the vendor's system and is accessed by users through a web browser using HTML, through special-purpose client software provided by the vendor, or via other remote interfaces such as a thin client.
Services delivered over a wide geographical area often use cloud computing. Cloud computing is a style of computing in which dynamically scalable and often virtualized resources are provided as a service over the internet. Users do not need to be experts in the technology infrastructure of the "cloud" that supports them. Cloud computing can be divided into different services, such as Infrastructure as a Service (IaaS), Platform as a Service (PaaS), and Software as a Service (SaaS). Cloud computing services often provide common business applications online that are accessed from a web browser, while the software and data are stored on the servers. The term cloud is used as a metaphor for the internet, based on how the internet is depicted in computer network diagrams, and is an abstraction for the complex infrastructure it conceals.
Further, ISP 250 includes a Game Processing Server (GPS) 254, which is used by game clients to play single- and multi-player video games. Most video games played over the internet operate via a connection to a game server. Typically, games use a dedicated server application that collects data from the players and distributes it to other players. This is more efficient and effective than a peer-to-peer arrangement, but it requires a separate server to host the server application. In another embodiment, the GPS establishes communication between the players, whose respective game-playing devices then exchange information without relying on the centralized GPS.
Dedicated GPSes are servers that run independently of the client. Such servers are usually run on dedicated hardware located in data centers, providing greater bandwidth and dedicated processing power. Dedicated servers are the preferred method of hosting game servers for most PC-based multiplayer games. Massively multiplayer online games typically run on dedicated servers hosted by the software company that owns the game title, allowing the company to control and update content.
A Broadcast Processing Server (BPS) 256 distributes audio or video signals to an audience. Broadcasting to a very narrow audience is sometimes called narrowcasting. The final leg of broadcast distribution is how the signal gets to the listener or viewer: it may come over the air, as with a radio or TV station, to an antenna and receiver, or it may come through cable TV or cable radio (or "wireless cable") via the station or directly from a network. The internet may also bring either radio or TV to the recipient, especially with multicasting, which allows the signal and bandwidth to be shared. Historically, broadcasts have been delimited by geographic region, as with national or regional broadcasts. However, with the proliferation of fast internet, broadcasts are no longer defined by geographies, as the content can reach almost any country in the world.
A Storage Service Provider (SSP) 258 provides computer storage space and related management services. SSPs also offer periodic backup and archiving. By offering storage as a service, users can order more storage as required. Another major advantage is that SSPs include backup services, so users will not lose all their data if their computers' hard drives fail. Further, a plurality of SSPs can hold total or incremental backups of the user data, allowing users to access data in an efficient way regardless of where they are located or which device they are using to access the data. For example, a user can access personal files on a home computer, as well as on a mobile phone while on the move.
A Communications Provider 260 provides connectivity to the users. One kind of communications provider is an Internet Service Provider (ISP), which offers access to the internet. The ISP connects its customers using a data transmission technology appropriate for delivering Internet Protocol datagrams, such as dial-up, DSL, cable modem, wireless, or dedicated high-speed interconnects. The communications provider can also provide messaging services, such as e-mail, instant messaging, and SMS texting. Another kind of communications provider is a Network Service Provider (NSP), which sells bandwidth or network access by providing direct backbone access to the internet. Network service providers may consist of telecommunications companies, data carriers, wireless communications providers, internet service providers, cable television operators offering high-speed internet access, etc.
A Data Exchange 268 interconnects the several modules inside ISP 250 and connects these modules to users 262 via network 266. The data exchange 268 can cover a small area where all the modules of ISP 250 are in close proximity, or can cover a large geographic area when the different modules are geographically dispersed. For example, the data exchange 268 can include a fast Gigabit Ethernet (or faster) within a cabinet of a data center, or an intercontinental virtual local area network (VLAN).
Users 262 access the remote services with a client device 264, which includes at least a CPU, a display, and I/O. The client device can be a PC, a mobile phone, a netbook, a PDA, etc. In one embodiment, ISP 250 recognizes the type of device used by the client and adjusts the communication method employed. In other cases, client devices use a standard communications method, such as HTML, to access ISP 250.
Embodiments of the present invention may be practiced with various computer system configurations, including hand-held devices, microprocessor systems, microprocessor-based or programmable consumer electronics, minicomputers, mainframe computers, and the like. The invention can also be practiced in distributed computing environments, where tasks are performed by remote processing devices that are linked through a network.
With the above embodiments in mind, it should be understood that the invention can employ various computer-implemented operations involving data stored in computer systems. These operations are those requiring physical manipulation of physical quantities. Any of the operations described herein that form part of the invention are useful machine operations. The invention also relates to a device or an apparatus for performing these operations. The apparatus can be specially constructed for the required purpose, such as a special-purpose computer. When defined as a special-purpose computer, the computer can also perform other processing, program execution, or routines that are not part of the special purpose, while still being capable of operating for the special purpose. Alternatively, the operations can be processed by a general-purpose computer selectively activated or configured by one or more computer programs stored in the computer memory or cache, or obtained over a network. When data is obtained over a network, the data can be processed by other computers on the network, e.g., a cloud of computing resources.
One or more embodiments of the present invention can also be defined as a machine that transforms data from one state to another state. The transformed data can be saved to storage and then manipulated by a processor. The processor thus transforms the data from one thing to another. Still further, the methods can be processed by one or more machines or processors that can be connected over a network. Each machine can transform data from one state or thing to another, and can also process data, save data to storage, transmit data over a network, display the result, or communicate the result to another machine.
One or more embodiments of the present invention can also be fabricated as computer-readable code on a computer-readable medium. The computer-readable medium is any data storage device that can store data, which can thereafter be read by a computer system. Examples of the computer-readable medium include hard drives, network attached storage (NAS), read-only memory, random-access memory, CD-ROMs, CD-Rs, CD-RWs, magnetic tape, and other optical and non-optical data storage devices. The computer-readable medium can include computer-readable tangible media distributed over a network-coupled computer system, so that the computer-readable code is stored and executed in a distributed fashion.
Although the method operations were described in a specific order, it should be understood that other housekeeping operations may be performed in between operations, or the operations may be adjusted so that they occur at slightly different times, or may be distributed in a system that allows the processing operations to occur at various intervals associated with the processing, as long as the processing of the overlay operations is performed in the desired way.
Although the foregoing invention has been described in some detail for purposes of clarity of understanding, it will be apparent that certain changes and modifications can be practiced within the scope of the appended claims. Accordingly, the present embodiments are to be considered as illustrative and not restrictive, and the invention is not to be limited to the details given herein, but may be modified within the scope and equivalents of the appended claims.

Claims (35)

1. A method for controlling a view of a virtual scene with a portable device, the method comprising:
receiving a signal to synchronize the portable device;
synchronizing the portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in the space occupied by the portable device when the portable device receives the signal to synchronize;
capturing images of the physical 3D space with a camera of the portable device in response to the synchronizing;
tracking a current position of the portable device in the physical 3D space with respect to the reference point, the tracking utilizing image recognition of the captured images and inertial information obtained by inertial sensors of the portable device;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements; and
displaying, in a display of the portable device, a view of the virtual scene based on the current position.
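As an illustrative annotation only, not part of the patented disclosure: the tracking step of claim 1 — dead reckoning from inertial sensors corrected by drift-free image-recognition fixes — can be sketched as a simple complementary filter. The class, parameter names, and the filter itself are assumptions of this sketch, not the patent's implementation.

```python
class PositionTracker:
    """Tracks a portable device's position relative to a reference point,
    fusing image-recognition fixes with inertial dead reckoning.
    Illustrative sketch only; the complementary-filter weighting is an
    assumption, not taken from the patent text."""

    def __init__(self, reference_point, alpha=0.9):
        self.ref = tuple(reference_point)
        self.pos = list(self.ref)    # at synchronization the device occupies the reference point
        self.vel = [0.0, 0.0, 0.0]
        self.alpha = alpha           # weight given to the drift-free camera fix

    def update_inertial(self, accel, dt):
        """Dead-reckon from accelerometer readings (accumulates drift)."""
        for i in range(3):
            self.vel[i] += accel[i] * dt
            self.pos[i] += self.vel[i] * dt

    def update_camera(self, camera_fix):
        """Blend in an absolute position recovered by image recognition."""
        a = self.alpha
        self.pos = [a * c + (1.0 - a) * p for c, p in zip(camera_fix, self.pos)]

    def offset_from_reference(self):
        """Offset from the reference point; this drives the displayed view."""
        return tuple(p - r for p, r in zip(self.pos, self.ref))
```

The camera update pulls the estimate back toward an absolute fix, which is what keeps the virtual scene stable around the reference point despite inertial drift.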
2. The method as recited in claim 1, wherein creating the view of the virtual scene further includes:
blending real elements of the physical 3D space with the virtual reality elements, wherein the virtual reality elements appear in the displayed view as if the virtual reality elements were part of the 3D space, and wherein, from a geometrical perspective, the views of the virtual reality elements change as the portable device moves in the 3D space in the same way that the views of the real elements change.
3. The method as recited in claim 2, wherein the real elements include a table, and wherein blending further includes:
placing a virtual reality element on top of the table such that the placed virtual reality element rests on the table.
4. The method as recited in claim 2, wherein the real elements include a wall of a room, and wherein blending further includes:
adding a picture to the wall, the picture being one of the virtual reality elements.
5. The method as recited in claim 1, wherein the current position of the portable device includes geometric coordinates of the portable device and geometric coordinates of a viewing surface of the display of the portable device.
6. The method as recited in claim 5, wherein the geometric coordinates of the portable device are equivalent to the geometric coordinates of the camera of the portable device.
7. The method as recited in claim 6, wherein a viewing direction is defined with respect to a vector having an origin at the center of the viewing surface of the display and a direction perpendicular to the viewing surface of the display.
8. The method as recited in claim 1, wherein creating the view of the virtual scene further includes:
zooming in on a second virtual reality element as the portable device moves closer to the second virtual reality element; and
zooming out from the second virtual reality element as the portable device moves away from the second virtual reality element.
9. The method as recited in claim 1, wherein changing the displayed view further includes:
adding image stabilization to the displayed images as the portable device moves.
10. The method as recited in claim 1, further including:
receiving an input to change the view; and
changing the creation of the view of the virtual scene such that the view of the virtual scene is calculated from a point other than the current position of the portable device.
11. The method as recited in claim 10, wherein the view of the virtual scene is rotated 180 degrees about a vertical line passing through the reference point.
12. The method as recited in claim 1, wherein the received signal is generated by pressing a button on the portable device or by touching a touch-sensitive display of the portable device.
13. The method as recited in claim 1, wherein boundaries of the virtual scene are defined by the walls of the room where the portable device is located when the signal is received.
14. The method as recited in claim 1, wherein synchronizing the portable device further includes:
resetting a position tracking module in the portable device, the position tracking module being selected from a group consisting of an accelerometer, a magnetometer, a camera, a depth camera, a compass, or a gyroscope,
wherein the tracking of the current position is performed using information from the position tracking module.
15. A method for controlling a view of a virtual scene with a portable device, comprising:
synchronizing the portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in the space occupied by the portable device when the portable device receives a signal to synchronize;
capturing images of the physical 3D space with a camera of the portable device in response to the synchronizing;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements;
determining a current position of the portable device in the physical 3D space with respect to the reference point;
displaying, in a display of the portable device, a view of the virtual scene based on the current position;
tracking a position of a player's hand in the physical 3D space based on image recognition of the hand in the captured images;
detecting when the position of the hand coincides with the position of a first virtual element; and
enabling, after the detecting, an interaction between the hand and the first virtual element that simulates the hand touching the first virtual element, wherein the hand is able to manipulate the first virtual element to change a position or a characteristic of the first virtual element as if the first virtual element were a real object.
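Purely as an illustrative sketch, not the patented method: the detecting step of claim 15 can be modeled as a point-in-box test between the tracked hand position and a virtual element's bounds. Representing elements by a center and size is an assumption of this sketch.

```python
def hand_touches(hand_pos, elem_center, elem_size):
    """Return True when the tracked hand position falls inside a virtual
    element's axis-aligned bounding box, enabling the touch interaction
    of claim 15. Center/size representation is an assumption."""
    return all(abs(h - c) <= s / 2.0
               for h, c, s in zip(hand_pos, elem_center, elem_size))
```

Once the test succeeds, the application would switch the element into a "manipulated" state so subsequent hand motion updates its position or characteristics.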
16. The method as recited in claim 15, wherein the interaction of the hand is an action on the first virtual element selected from a group consisting of interfacing with, holding, pushing, pulling, grabbing, moving, striking, pressing, hitting, throwing, fighting with, opening, closing, turning on or off, pressing a button on, shooting, or eating the first virtual element.
17. The method as recited in claim 15, wherein creating the view further includes:
adding a shadow of the hand on the virtual reality elements according to lighting conditions in the 3D space and the virtual scene.
18. A method for sharing a virtual scene among devices, the method comprising:
calculating, by a first device, a first location of the first device with respect to a reference point in a physical three-dimensional (3D) space;
calculating, by the first device, a second location of a second device in the physical 3D space with respect to the first location of the first device, wherein the first device and the second device are handheld devices;
transmitting information identifying the reference point in the physical 3D space from the first device to the second device, the information including the reference point, the first location, and the second location;
generating a virtual scene augmenting the physical 3D space and situated around the reference point, the virtual scene being shared by both devices and rendered on respective displays of the first device and the second device, the virtual scene changing simultaneously in both devices in response to interactions from the first device or from the second device;
creating a first view of the virtual scene as seen from a current location of the first device;
displaying the first view in a display of the first device; and
changing the displayed first view of the virtual scene as the first device moves in the physical 3D space.
19. The method as recited in claim 18, wherein calculating the location of the second device further includes:
collecting, by the first device, first relative positioning information between the first device and the second device, the collecting including one or more of WiFi position tracking, audio triangulation, or image analysis obtained from a camera of the first device;
determining, by the first device, the second location of the second device relative to the first location based on the first relative positioning information; and
transmitting coordinates of the second location and of the reference point from the first device to the second device.
20. The method as recited in claim 18, further including:
collecting, by the first device, first relative positioning information between the first device and the second device, the collecting including one or more of WiFi position tracking, audio triangulation, or image analysis obtained from a camera of the first device;
receiving, by the first device, second relative positioning information from the second device;
determining, by the first device, the second location of the second device with respect to the first device based on the first and second relative positioning information; and
transmitting coordinates of the second location and of the reference point from the first device to the second device.
21. The method as recited in claim 18, wherein the virtual scene includes a virtual board game, and wherein players holding the first device and the second device, respectively, play the virtual board game.
22. The method as recited in claim 18, wherein a first player holding the first device controls movement of a first virtual element in the virtual scene by moving the first device, the first device being synchronized with the first virtual element.
23. A method for controlling a view of a virtual scene with a first device, the method comprising:
calculating a first location of the first device with respect to a first reference point in a first physical three-dimensional (3D) space;
establishing a communications link between the first device and a second device, the second device being in a second physical 3D space outside the first physical 3D space, the second device having a second location with respect to a second reference point in the second physical 3D space, wherein the first device and the second device are handheld devices;
transmitting, from the first device, a first image of a first user associated with the first device, and receiving, by the first device, a second image of a second user associated with the second device, the first user and the second user being in different locations;
generating a common virtual scene including virtual reality elements, the common virtual scene being rendered on a first display of the first device and on a second display of the second device, the first device building the common virtual scene around the first reference point and the second device building the common virtual scene around the second reference point, both devices being able to interact with the virtual reality elements;
determining a current position of the first device in the first physical 3D space with respect to the first reference point;
creating a view of the common virtual scene, wherein the view represents the common virtual scene as seen from the current position of the first device;
blending the second image of the second user into the view of the common virtual scene to create an effect simulating that the second user is near the first user;
displaying the view of the common virtual scene in a display of the first device; and
changing the displayed view of the common virtual scene as the first device moves in the first physical 3D space.
24. The method as recited in claim 23, wherein the communications link includes a direct network connection between the first device and the second device.
25. The method as recited in claim 23, further including:
assigning a virtual location in the first physical 3D space to the second device;
receiving, from the second device, second-device interaction information corresponding to interactions between the second device and the common virtual scene; and
changing the view of the common virtual scene according to the received second-device interaction information and the virtual location, wherein the second device appears in the first physical 3D space and interacts with the first device as if the second device were physically located in the first physical 3D space.
26. The method as recited in claim 23, further including:
periodically updating the second image of the second user.
27. The method as recited in claim 23, further including:
updating the second image of the second user when the second user moves.
28. The method as recited in claim 23, wherein the virtual elements include a chess board and chess pieces, wherein the first device and the second device are used to play a game of chess by manipulating the chess pieces, and wherein the second user appears in the view of the virtual scene as sitting across from the first user.
29. The method as recited in claim 23, wherein the first device and the second device appear in the view of the common virtual scene as a first object and a second object, respectively, wherein movement of the first object matches movement of the first device in the first physical 3D space, and movement of the second object matches movement of the second device in the second physical 3D space.
30. A method for controlling a view of a virtual scene with a portable device, the method comprising:
synchronizing the portable device to a reference point in a physical three-dimensional (3D) space, the reference point being a point in the space occupied by the portable device when the portable device receives a signal to synchronize, the portable device including a front camera facing the front of the portable device and a rear camera facing the rear of the portable device, the portable device being a handheld device;
capturing images of the physical 3D space with the cameras of the portable device in response to the synchronizing;
tracking a current position of the portable device in the physical 3D space with respect to the reference point, the tracking utilizing image recognition of the captured images and inertial information obtained by inertial sensors of the portable device;
generating a virtual scene defined around the reference point, the virtual scene including virtual reality elements;
creating a view of the virtual scene based on the current position of the portable device, the view capturing a representation of the virtual scene as seen from a current eye position, in the physical 3D space, of a player holding the portable device, the capturing corresponding to what the player would see through a window into the virtual scene, the position of the window in the 3D space being equivalent to the position of the display of the portable device in the physical 3D space;
displaying the created view in the display; and
changing the displayed view of the virtual scene as the portable device or the player moves in the 3D space, wherein a change in the position of the eyes of the player holding the portable device causes a change in the displayed view while the portable device is kept stationary.
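The window metaphor of claim 30 — the display acting as a window whose visible content depends on the player's eye position — corresponds to the standard off-axis (asymmetric-frustum) projection. The sketch below assumes an axis-aligned window facing the eye along the z axis and is an illustration only, not the patent's rendering pipeline.

```python
def window_frustum(eye, win_center, win_width, win_height, near):
    """Compute off-axis frustum bounds at the near plane for an eye at
    `eye` looking through a window (the device display) centered at
    `win_center`. Moving the eye shifts the visible region, producing
    the look-through-a-window parallax of claim 30."""
    ex, ey, ez = eye
    cx, cy, cz = win_center
    d = ez - cz                      # eye-to-window distance along the view axis
    scale = near / d                 # project window edges onto the near plane
    left   = (cx - win_width / 2.0 - ex) * scale
    right  = (cx + win_width / 2.0 - ex) * scale
    bottom = (cy - win_height / 2.0 - ey) * scale
    top    = (cy + win_height / 2.0 - ey) * scale
    return left, right, bottom, top
```

The four bounds feed directly into an asymmetric perspective matrix (a `glFrustum`-style call); as the front camera reports the eyes moving right, the frustum shifts left, exactly the behavior the claim describes for a stationary device.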
31. The method as recited in claim 30, wherein images from the front camera are used to determine the current eye position, and images from the rear camera are used to obtain the view of the physical 3D space.
32. The method as recited in claim 30, wherein pulling the display away from the eyes of the player causes the view to zoom in on the virtual scene, and pulling the display toward the eyes of the player causes the view to zoom out of the virtual scene.
33. A method for controlling a view of a scene with a portable device, the method comprising:
receiving a signal to synchronize the portable device;
synchronizing the portable device to make the location of the portable device a reference point in the physical three-dimensional (3D) space in which the portable device is located, the reference point being a point in the space occupied by the portable device when the portable device receives the signal to synchronize;
generating a virtual scene in a display of the portable device as the physical 3D space is viewed through the portable device, the virtual scene including virtual reality elements;
creating a view of the virtual scene as the portable device moves away from the reference point, wherein the view represents the virtual scene as seen from a current position of the portable device, and wherein the created view is independent of the position of the eyes of a user holding the portable device;
displaying the created view in the portable device; and
changing the displayed view of the virtual scene as the portable device moves in the physical 3D space.
34. The method as recited in claim 33, further including:
determining a virtual element that generates sound; and
emitting, from the portable device, a sound corresponding to the sound generated by the virtual element, wherein the emitted sound becomes louder as the portable device moves closer to the position of the virtual element generating the sound.
35. The method as recited in claim 34, wherein the portable device has stereo speakers, and wherein the emitted sound is adjusted to provide a stereo effect according to the relative positioning between the portable device and the virtual element generating the sound.
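Claims 34 and 35 describe distance-dependent loudness and stereo positioning of a virtual sound source. A hedged sketch, using inverse-distance attenuation and a constant-power pan law (both assumptions of this illustration, not taken from the patent), might compute per-speaker gains as follows:

```python
import math

def stereo_gains(device_pos, device_right, source_pos, ref_dist=1.0):
    """Per-speaker gains for a virtual sound source: louder as the
    device nears the source (claim 34) and panned across the stereo
    speakers by the source's bearing (claim 35)."""
    dx = [s - d for s, d in zip(source_pos, device_pos)]
    dist = math.sqrt(sum(c * c for c in dx)) or 1e-9
    loudness = ref_dist / max(dist, ref_dist)        # clamped inverse-distance gain
    # Bearing: projection of source direction onto the device's right axis,
    # in [-1 (fully left), +1 (fully right)].
    pan = sum(a * b for a, b in zip(dx, device_right)) / dist
    angle = (pan + 1.0) * math.pi / 4.0              # constant-power pan law
    return loudness * math.cos(angle), loudness * math.sin(angle)  # (left, right)
```

A source directly ahead yields equal left/right gains; a source off to the device's right drives the right-speaker gain up and the left down, while the overall level rises as the device approaches.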
CN201180022611.0A 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space Active CN102884490B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201610220654.4A priority Critical patent/CN105843396B/en 2010-03-05 2011-01-24 Method for maintaining multiple views on a shared stable virtual space

Applications Claiming Priority (5)

Application Number Priority Date Filing Date Title
US31125110P 2010-03-05 2010-03-05
US61/311,251 2010-03-05
US12/947,290 2010-11-16
US12/947,290 US8730156B2 (en) 2010-03-05 2010-11-16 Maintaining multiple views on a shared stable virtual space
PCT/US2011/022288 WO2011109126A1 (en) 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space

Related Child Applications (1)

Application Number Title Priority Date Filing Date
CN201610220654.4A Division CN105843396B (en) 2010-03-05 2011-01-24 Method for maintaining multiple views on a shared stable virtual space

Publications (2)

Publication Number Publication Date
CN102884490A CN102884490A (en) 2013-01-16
CN102884490B true CN102884490B (en) 2016-05-04

Family

ID=43923591

Family Applications (2)

Application Number Title Priority Date Filing Date
CN201180022611.0A Active CN102884490B (en) 2010-03-05 2011-01-24 Maintaining multiple views on a shared stable virtual space
CN201610220654.4A Active CN105843396B (en) 2010-03-05 2011-01-24 Method for maintaining multiple views on a shared stable virtual space

Family Applications After (1)

Application Number Title Priority Date Filing Date
CN201610220654.4A Active CN105843396B (en) 2010-03-05 2011-01-24 Method for maintaining multiple views on a shared stable virtual space

Country Status (4)

Country Link
CN (2) CN102884490B (en)
MX (1) MX2012010238A (en)
TW (1) TWI468734B (en)
WO (1) WO2011109126A1 (en)

Families Citing this family (77)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP3654147A1 (en) 2011-03-29 2020-05-20 QUALCOMM Incorporated System for the rendering of shared digital interfaces relative to each user's point of view
JP5718197B2 (en) * 2011-09-14 2015-05-13 株式会社バンダイナムコゲームス Program and game device
CN102495959A (en) * 2011-12-05 2012-06-13 无锡智感星际科技有限公司 Augmented reality (AR) platform system based on position mapping and application method
CN102542165B (en) * 2011-12-23 2015-04-08 三星半导体(中国)研究开发有限公司 Operating device and operating method for three-dimensional virtual chessboard
US20130234925A1 (en) * 2012-03-09 2013-09-12 Nokia Corporation Method and apparatus for performing an operation at least partially based upon the relative positions of at least two devices
US8630458B2 (en) 2012-03-21 2014-01-14 Google Inc. Using camera input to determine axis of rotation and navigation
JP5966510B2 (en) * 2012-03-29 2016-08-10 ソニー株式会社 Information processing system
CN103105993B (en) * 2013-01-25 2015-05-20 腾讯科技(深圳)有限公司 Method and system for realizing interaction based on augmented reality technology
TWI555390B (en) * 2013-02-20 2016-10-21 仁寶電腦工業股份有限公司 Method for controlling electronic device and electronic apparatus using the same
EP3005195A4 (en) * 2013-05-24 2017-05-24 Awe Company Limited Systems and methods for a shared mixed reality experience
US20140368537A1 (en) * 2013-06-18 2014-12-18 Tom G. Salter Shared and private holographic objects
US10146299B2 (en) 2013-11-08 2018-12-04 Qualcomm Technologies, Inc. Face tracking for additional modalities in spatial interaction
CN104657568B (en) * 2013-11-21 2017-10-03 深圳先进技术研究院 Many people's moving game system and methods based on intelligent glasses
EP2886172A1 (en) * 2013-12-18 2015-06-24 Microsoft Technology Licensing, LLC Mixed-reality arena
US9787846B2 (en) * 2015-01-21 2017-10-10 Microsoft Technology Licensing, Llc Spatial audio signal processing for objects with associated audio content
US9407865B1 (en) * 2015-01-21 2016-08-02 Microsoft Technology Licensing, Llc Shared scene mesh data synchronization
US10015370B2 (en) 2015-08-27 2018-07-03 Htc Corporation Method for synchronizing video and audio in virtual reality system
KR102610120B1 (en) 2016-01-20 2023-12-06 삼성전자주식회사 Head mounted display and control method thereof
US10115234B2 (en) * 2016-03-21 2018-10-30 Accenture Global Solutions Limited Multiplatform based experience generation
US10665019B2 (en) 2016-03-24 2020-05-26 Qualcomm Incorporated Spatial relationships for integration of visual images of physical environment into virtual reality
CN105938629B (en) * 2016-03-31 2022-01-18 联想(北京)有限公司 Information processing method and electronic equipment
CN109219789A (en) * 2016-05-04 2019-01-15 深圳脑穿越科技有限公司 Display methods, device and the terminal of virtual reality
US10245507B2 (en) * 2016-06-13 2019-04-02 Sony Interactive Entertainment Inc. Spectator management at view locations in virtual reality environments
US10169918B2 (en) * 2016-06-24 2019-01-01 Microsoft Technology Licensing, Llc Relational rendering of holographic objects
CN106200956A (en) * 2016-07-07 2016-12-07 北京时代拓灵科技有限公司 A kind of field of virtual reality multimedia presents and mutual method
CN106447786A (en) * 2016-09-14 2017-02-22 同济大学 Parallel space establishing and sharing system based on virtual reality technologies
US10593116B2 (en) 2016-10-24 2020-03-17 Snap Inc. Augmented reality object manipulation
CN106528285A (en) * 2016-11-11 2017-03-22 上海远鉴信息科技有限公司 Method and system for multi-terminal cooperative scheduling in virtual reality
CN106621306A (en) * 2016-12-23 2017-05-10 浙江海洋大学 Double-layer three-dimensional type army flag chessboard
US10242503B2 (en) 2017-01-09 2019-03-26 Snap Inc. Surface aware lens
EP3566111B1 (en) * 2017-01-09 2023-09-20 Snap Inc. Augmented reality object manipulation
WO2018158896A1 (en) * 2017-03-01 2018-09-07 三菱電機株式会社 Information processing system
CN107103645B (en) * 2017-04-27 2018-07-20 腾讯科技(深圳)有限公司 virtual reality media file generation method and device
CN107087152B (en) * 2017-05-09 2018-08-14 成都陌云科技有限公司 Three-dimensional imaging information communication system
CN108932051B (en) * 2017-05-24 2022-12-16 腾讯科技(北京)有限公司 Augmented reality image processing method, apparatus and storage medium
CN107320955B (en) * 2017-06-23 2021-01-29 武汉秀宝软件有限公司 AR venue interface interaction method and system based on multiple clients
CN109298776B (en) * 2017-07-25 2021-02-19 阿里巴巴(中国)有限公司 Augmented reality interaction system, method and device
CN107469343B (en) * 2017-07-28 2021-01-26 深圳市瑞立视多媒体科技有限公司 Virtual reality interaction method, device and system
CN107390875B (en) * 2017-07-28 2020-01-31 腾讯科技(上海)有限公司 Information processing method, device, terminal equipment and computer readable storage medium
CN107492183A (en) * 2017-07-31 2017-12-19 程昊 One kind has paper instant lottery AR methods of exhibiting and system
CN107632700A (en) * 2017-08-01 2018-01-26 中国农业大学 A kind of farm implements museum experiencing system and method based on virtual reality
CN109426333B (en) * 2017-08-23 2022-11-04 腾讯科技(深圳)有限公司 Information interaction method and device based on virtual space scene
WO2019080902A1 (en) * 2017-10-27 2019-05-02 Zyetric Inventions Limited Interactive intelligent virtual object
CN111263956A (en) * 2017-11-01 2020-06-09 索尼公司 Information processing apparatus, information processing method, and program
CN107861682A (en) * 2017-11-03 2018-03-30 网易(杭州)网络有限公司 The control method for movement and device of virtual objects
CN107657589B (en) * 2017-11-16 2021-05-14 上海麦界信息技术有限公司 Mobile phone AR positioning coordinate axis synchronization method based on three-datum-point calibration
CN107967054B (en) * 2017-11-16 2020-11-27 中国人民解放军陆军装甲兵学院 Immersive three-dimensional electronic sand table with virtual reality and augmented reality coupled
CN107995481B (en) * 2017-11-30 2019-11-15 贵州颐爱科技有限公司 A kind of display methods and device of mixed reality
CN108269307B (en) * 2018-01-15 2023-04-07 歌尔科技有限公司 Augmented reality interaction method and equipment
US20210038975A1 (en) * 2018-01-22 2021-02-11 The Goosebumps Factory Bvba Calibration to be used in an augmented reality method and system
US11880540B2 (en) * 2018-03-22 2024-01-23 Hewlett-Packard Development Company, L.P. Digital mark-up in a three dimensional environment
CN108519817A (en) * 2018-03-26 2018-09-11 广东欧珀移动通信有限公司 Exchange method, device, storage medium based on augmented reality and electronic equipment
CN108667798A (en) * 2018-03-27 2018-10-16 上海临奇智能科技有限公司 A kind of method and system of virtual viewing
CN108479065B (en) * 2018-03-29 2021-12-28 京东方科技集团股份有限公司 Virtual image interaction method and related device
US11173398B2 (en) * 2018-05-21 2021-11-16 Microsoft Technology Licensing, Llc Virtual camera placement system
CN108919945A (en) * 2018-06-07 2018-11-30 佛山市长郡科技有限公司 A kind of method of virtual reality device work
CN109284000B (en) * 2018-08-10 2022-04-01 西交利物浦大学 Method and system for visualizing three-dimensional geometric object in virtual reality environment
US11030813B2 (en) 2018-08-30 2021-06-08 Snap Inc. Video clip object tracking
US11176737B2 (en) 2018-11-27 2021-11-16 Snap Inc. Textured mesh building
CN113330484A (en) 2018-12-20 2021-08-31 斯纳普公司 Virtual surface modification
US10866658B2 (en) 2018-12-20 2020-12-15 Industrial Technology Research Institute Indicator device, mixed reality device and operation method thereof
US10948978B2 (en) 2019-04-23 2021-03-16 XRSpace CO., LTD. Virtual object operating system and virtual object operating method
CN113508361A (en) 2019-05-06 2021-10-15 苹果公司 Apparatus, method and computer-readable medium for presenting computer-generated reality files
US10499044B1 (en) 2019-05-13 2019-12-03 Athanos, Inc. Movable display for viewing and interacting with computer generated environments
CN110286768B (en) * 2019-06-27 2022-05-17 Oppo广东移动通信有限公司 Virtual object display method, terminal device and computer-readable storage medium
US11189098B2 (en) 2019-06-28 2021-11-30 Snap Inc. 3D object camera customization system
CN110349270B (en) * 2019-07-02 2023-07-28 上海迪沪景观设计有限公司 Virtual sand table presenting method based on real space positioning
US11232646B2 (en) 2019-09-06 2022-01-25 Snap Inc. Context-based virtual object rendering
US20210157394A1 (en) 2019-11-24 2021-05-27 XRSpace CO., LTD. Motion tracking system and method
US11227442B1 (en) 2019-12-19 2022-01-18 Snap Inc. 3D captions with semantic graphical elements
US11263817B1 (en) 2019-12-19 2022-03-01 Snap Inc. 3D captions with face tracking
CN111915736A (en) * 2020-08-06 2020-11-10 黄得锋 AR interaction control system, device and application
CN113941138A (en) * 2020-08-06 2022-01-18 黄得锋 AR interaction control system, device and application
US11832015B2 (en) 2020-08-13 2023-11-28 Snap Inc. User interface for pose driven virtual effects
CN115705116A (en) * 2021-08-04 2023-02-17 北京字跳网络技术有限公司 Interactive method, electronic device, storage medium, and program product
US20230078578A1 (en) * 2021-09-14 2023-03-16 Meta Platforms Technologies, Llc Creating shared virtual spaces
TWI803134B (en) * 2021-09-24 2023-05-21 宏達國際電子股份有限公司 Virtual image display device and setting method for input interface thereof

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6522312B2 (en) * 1997-09-01 2003-02-18 Canon Kabushiki Kaisha Apparatus for presenting mixed reality shared among operators
US6972734B1 (en) * 1999-06-11 2005-12-06 Canon Kabushiki Kaisha Mixed reality apparatus and mixed reality presentation method
CN101452339A (en) * 2007-12-06 2009-06-10 国际商业机器公司 Rendering of real world objects and interactions into a virtual universe

Family Cites Families (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5588139A (en) * 1990-06-07 1996-12-24 Vpl Research, Inc. Method and system for generating objects for a multi-person virtual world using data flow networks
US7149691B2 (en) * 2001-07-27 2006-12-12 Siemens Corporate Research, Inc. System and method for remotely experiencing a virtual environment
JP4054585B2 (en) * 2002-02-18 2008-02-27 キヤノン株式会社 Information processing apparatus and method
US20060257420A1 (en) * 2002-04-26 2006-11-16 Cel-Sci Corporation Methods of preparation and composition of peptide constructs useful for treatment of autoimmune and transplant related host versus graft conditions
US11033821B2 (en) * 2003-09-02 2021-06-15 Jeffrey D. Mullen Systems and methods for location based games and employment of the same on location enabled devices
US8323106B2 (en) * 2008-05-30 2012-12-04 Sony Computer Entertainment America Llc Determination of controller three-dimensional location using image analysis and ultrasonic communication
US8585476B2 (en) * 2004-11-16 2013-11-19 Jeffrey D Mullen Location-based games and augmented reality systems
TWI278772B (en) * 2005-02-23 2007-04-11 Nat Applied Res Lab Nat Ce Augmented reality system and method with mobile and interactive function for multiple users
JP4738870B2 (en) * 2005-04-08 2011-08-03 キヤノン株式会社 Information processing method, information processing apparatus, and remote mixed reality sharing apparatus
JP5230114B2 (en) * 2007-03-13 2013-07-10 キヤノン株式会社 Information processing apparatus and information processing method
CN101174332B (en) * 2007-10-29 2010-11-03 张建中 Method, device and system for interactively combining real-time scene in real world with virtual reality scene
US8786675B2 (en) * 2008-01-23 2014-07-22 Michael F. Deering Systems using eye mounted displays

Also Published As

Publication number Publication date
WO2011109126A1 (en) 2011-09-09
CN102884490A (en) 2013-01-16
CN105843396A (en) 2016-08-10
MX2012010238A (en) 2013-01-18
TW201205121A (en) 2012-02-01
TWI468734B (en) 2015-01-11
CN105843396B (en) 2019-01-01

Similar Documents

Publication Publication Date Title
CN102884490B (en) Maintaining multiple views on a shared stable virtual space
US10424077B2 (en) Maintaining multiple views on a shared stable virtual space
CN102939139B (en) Calibration of portable devices in shared virtual space
US11050977B2 (en) Immersive interactive remote participation in live entertainment
US9947139B2 (en) Method and apparatus for providing hybrid reality environment
US10573060B1 (en) Controller binding in virtual domes
TWI786700B (en) Scanning of 3d objects with a second screen device for insertion into a virtual environment
US20130196772A1 (en) Matching physical locations for shared virtual experience
US20100103196A1 (en) System and method for generating a mixed reality environment
CN106716306A (en) Synchronizing multiple head-mounted displays to a unified space and correlating movement of objects in the unified space
CN104010706A (en) Directional input for a video game
WO2018167563A1 (en) Virtual reality system using an actor and director model
CN111744180A (en) Method and device for loading virtual game, storage medium and electronic device
US10740957B1 (en) Dynamic split screen
Gobira et al. Expansion of uses and applications of virtual reality
TWI807732B (en) Non-transitory computer-readable storage medium for interactable augmented and virtual reality experience
TW202111480A (en) Virtual reality and augmented reality interaction system and method respectively playing roles suitable for an interaction technology by an augmented reality user and a virtual reality user
CN117861200A (en) Information processing method and device in game, electronic equipment and storage medium

Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant