WO2004113991A2 - Calibrating real and virtual views - Google Patents

Calibrating real and virtual views

Info

Publication number
WO2004113991A2
WO2004113991A2 (PCT/US2004/018346)
Authority
WO
WIPO (PCT)
Prior art keywords
real
virtual
display
optical
view
Prior art date
Application number
PCT/US2004/018346
Other languages
French (fr)
Other versions
WO2004113991A3 (en)
Inventor
Frank Sauer
Yakup Genc
Nassir Navab
Original Assignee
Siemens Corporate Research, Inc.
Priority date
Filing date
Publication date
Application filed by Siemens Corporate Research, Inc.
Priority to DE112004000902T5 (DE)
Publication of WO2004113991A2 (en)
Publication of WO2004113991A3 (en)

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/012 Head tracking input arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0101 Head-up displays characterised by optical features
    • G02B 2027/0138 Head-up displays characterised by optical features comprising image capture systems, e.g. camera
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0179 Display position adjusting means not related to the information to be displayed
    • G02B 2027/0187 Display position adjusting means not related to the information to be displayed slaved to motion of at least a part of the body of the user, e.g. head, eye
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B 27/01 Head-up displays
    • G02B 27/0189 Sight systems

Definitions

  • the real reference 102 originates from an illumination system 200 that is rigidly attached to the HMD 201 and observed through a screen 103.
  • the real reference 102 moves along on the calibration screen 104, and for small head movements the real reference 102 appears fixed with respect to the virtual reference 101 as seen through the HMD's semi- transparent screen 103.
  • An alignment process is now easier from a user's 100 vantage, as jitter between the real reference 102 and the virtual reference 101 is substantially reduced.
  • the real reference 102 is observed on a flat screen 104.
  • the user can hold the screen 104 in one hand at arm's length, place it on a table, etc.
  • the screen 104 is tracked with respect to the user's head or HMD 201.
  • a tracking arrangement includes an external (see Figure 2A) or head-mounted (see Figure 2B) tracking camera 202.
  • the screen 104 and, in the case of Figure 2A's external tracking camera, the HMD 201 include optical markers 203.
  • the optical markers 203 may be, for example, retroreflective flat discs or retroreflective spheres.
  • the illuminator 200 projects a light pattern 102 that includes one, or preferably several, points.
  • the illuminator 200 can be constructed with a single light source and an optical system that generates the light pattern 102.
  • a laser could be used together with a lens system and a mask, or with diffractive optics to generate an array of light spots.
  • an array of light sources such as an LED array may be used with the advantage that the light pattern can be made switchable.
  • the LEDs can be switched on and off individually.
  • the LEDs can be combined with a lens system or with micro-optics, e.g. a lens array.
  • the illuminator 200 can also include a scanning means or beam deflection means to switch between different beam directions.
  • the screen 103 may be part of, for example, a monocular or binocular arrangement. In the binocular arrangement, both screens are preferably calibrated individually, one after the other. Appropriate optics 204, in combination with the semitransparent display 103, generate an image of the virtual reference 101.
  • the virtual reference 101 is visible to a user as the user looks through the semi-transparent display 103.
  • the see-through display can be embodied with an image projector and an optical system, which includes a beam splitter.
  • the user 100 moves the virtual reference(s) 101 displayed on the semi-transparent screen 103 into alignment with the reference light pattern 102 as seen from the user's perspective, e.g., 205.
  • the user 100 controls an interface 206 (e.g., processor 207 and input device 208) to move the virtual reference(s) 101 on the screen 103.
  • the input device 208 may be, for example, a trackball or a mouse.
  • the processor 207 may be a virtual reference generator comprising a processor and graphics card to render the virtual reference for display on the semi-transparent screen.
  • the user aligns the virtual reference 101 to several different real reference light points, e.g., 102.
  • the user may assume different poses (e.g., distances and/or orientations) with regard to the calibration screen 104.
  • a processor determines a spatial relationship between the calibration screen 104 and HMD 201 according to the positions of markers 203 and the user determined alignment of the virtual reference 101 and the real reference 102.
  • Figure 2B is an example of a HMD including a tracking camera 202. As shown, where the tracking camera is fixed to the HMD, the spatial relationship between the calibration screen 104 and HMD 201 may be determined using optical markers 203 fixed to the calibration screen 104 only. Further, the pose of the calibration screen 104 may be determined according to the relationship of different optical markers 203 fixed to the screen 104.
  • Figure 2C is a video-see-through augmented reality system according to an embodiment of the present disclosure, wherein a camera, e.g., 202, captures a view of the real scene.
  • the tracking and video functions of camera 202 may be performed by separate cameras.
  • the image of the real scene is displayed to the user 100.
  • a virtual view is superimposed on the real view, e.g., 209, and the user 100 perceives real and virtual views, e.g., 204.
  • the user may align the virtual reference with the view of the real reference in the real scene .
  • The tracking camera and illuminator are mechanically fixed to each other, and the location and orientation of the light beams, which generate the reference points, are determined with respect to a coordinate system of the tracking camera.
  • By tracking the calibration screen 301, one can determine the 3D coordinates of the reference points in the tracking camera coordinate system as the intersection of the corresponding light beams with the plane of the screen.
  • a user aligns the real and virtual references 302 and the system records a set of 3D-2D point correspondences 303.
  • Each correspondence consists of the 3D coordinates of a reference light point and the 2D coordinates of the virtual marker that the user has aligned to the reference.
  • This set of point correspondences allows one to determine one or more parameters for rendering the virtual objects in correct alignment with the real scene 304.
  • the camera parameters that determine the user's view of the virtual world as displayed on the semitransparent screen, are matched to the camera parameters that determine the user's view of the real world as seen through the screen.
  • Such determinations are known in the art, for example, as described in U.S. Patent Application No.
  • calibration may include instantiating parameter values for mathematical models that map the physical environment to internal representations, so that the computer's internal model matches the physical environment.
  • parameters include, for example optical characteristics of a physical camera and position and orientation (pose) information of various entities such as the camera, the markers for tracking, and the various objects.
  • the system can render 3D graphical objects in a way that they appear rigidly anchored in the real scene.
  • the user's viewpoint changes are tracked with a tracking system and accounted for with corresponding changes of the graphics objects' virtual view.
  • external tracking means can be used in conjunction with head-mounted markers or sensors that are rigidly fixed with respect to the illuminator.
  • the tracking system tracks both the HMD and the calibration screen 401.
  • the real and virtual references are aligned 402, and the 3D coordinates of the calibration light points are determined as the intersection of the light beams with the screen plane 403.
  • the virtual reference points are brought into alignment with the real reference points displayed as light on the screen and the correspondence is recorded 404.
  • The system includes a head-mounted display, tracking means, computing and graphics rendering means, light projection means, and a trackable screen.
  • Calibration alignments between the real and virtual reference structures may be averaged over several measurements for each point correspondence. Note that the virtual marker and the real marker appear jitter-free relative to each other. Averaging may reduce error in the calibration, and is user-friendly compared to calibration procedures that use external features: here, the user can hold the alignment for one or several seconds because of the reduced jitter between the real and virtual markers.
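The beam-screen intersection used in the tracking steps above can be sketched in a few lines. This is an illustrative Python sketch, not part of the patent disclosure; all function and variable names are assumptions:

```python
import numpy as np

def intersect_ray_plane(origin, direction, plane_point, plane_normal):
    """Intersect a light beam (a ray from the illuminator) with the tracked
    calibration-screen plane. All vectors are 3D, expressed in the
    tracking-camera coordinate system. Returns the 3D reference point,
    or None if the beam is parallel to the screen."""
    origin = np.asarray(origin, dtype=float)
    direction = np.asarray(direction, dtype=float)
    plane_point = np.asarray(plane_point, dtype=float)
    plane_normal = np.asarray(plane_normal, dtype=float)
    denom = plane_normal @ direction
    if abs(denom) < 1e-9:        # beam parallel to the screen plane
        return None
    t = plane_normal @ (plane_point - origin) / denom
    if t < 0:                    # screen is behind the illuminator
        return None
    return origin + t * direction

# Beam from the illuminator at the origin along +z; screen plane at z = 2.
reference_point = intersect_ray_plane([0, 0, 0], [0, 0, 1],
                                      [0, 0, 2], [0, 0, -1])
```

Each tracked screen pose gives one such plane, so reference points collected over several poses are generally non-coplanar, which helps the subsequent parameter estimation.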

Abstract

A method for calibrating real and virtual views includes tracking a calibration screen (104), wherein a real reference point (102), generated by a real reference point generator (200), is projected on the calibration screen (104), aligning a view of a virtual reference point (101) to a view of the real reference point (102) in a display, wherein the real reference point generator (200) and the display have a fixed relative position, determining a point correspondence between the virtual reference point (101) and the real reference point (102), and determining one or more parameters for rendering a virtual object in the real scene.

Description

CALIBRATING REAL AND VIRTUAL VIEWS BACKGROUND OF THE INVENTION
1. Field of the Invention
The present invention relates to augmented reality, and more particularly to a system and method for augmented reality calibration of see-through head-mounted displays.
2. Discussion of Related Art
Augmented vision, also referred to as augmented reality or augmented reality vision, augments a user's view of the real world with superimposed, computer-generated graphical information. This information may include, for example, a text label attached to some object in the scene, or a three-dimensional (3D) model of a patient's brain, derived from an MRI scan, and aligned to the real view of the person's head.
The user may observe the real world directly with his or her eyes, and the additional graphics information is blended in via a semi-transparent display located between the observer and the real scene. Such a display device can, for example, be an optical see-through head mounted display.
The display can also be opaque, like a computer screen or a non-see-through head mounted display. It then presents to the user the complete augmented view, a combination of the real-world view and the graphics overlay. A video camera takes the place of the real-world observer to capture the real world-view. Two cameras may be implemented for stereo vision. A computer may be used to combine the live video with the graphics augmentation. A display device of this kind is, for example, a video-see-through head-mounted display.
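Combining the live video with the graphics augmentation amounts to per-pixel alpha compositing. The following Python sketch is illustrative rather than taken from the patent; the function and array names are assumptions:

```python
import numpy as np

def composite(video_frame, overlay_rgb, overlay_alpha):
    """Blend a rendered graphics overlay onto a captured video frame.
    video_frame:   (H, W, 3) uint8 camera image of the real scene.
    overlay_rgb:   (H, W, 3) uint8 rendered graphics.
    overlay_alpha: (H, W) floats in [0, 1]; 0 shows video, 1 shows graphics."""
    a = overlay_alpha[..., None]
    blended = (1.0 - a) * video_frame.astype(float) + a * overlay_rgb.astype(float)
    return blended.astype(np.uint8)

# Tiny example: white graphics over a black frame with varying opacity.
frame = np.zeros((2, 2, 3), dtype=np.uint8)
overlay = np.full((2, 2, 3), 255, dtype=np.uint8)
alpha = np.array([[1.0, 0.0], [0.5, 0.0]])
augmented = composite(frame, overlay, alpha)
```

For stereo vision the same compositing runs once per camera; the per-pixel alpha lets opaque graphics coexist with fully transparent regions of the overlay.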
The graphics are positioned, oriented, and scaled, or even rendered in a perspective fashion, for correct alignment with the real-world view. To achieve precise alignment of the real and virtual views, the graphics may be anchored to a real-world object. For this, knowledge of the position and orientation of the user's viewpoint with respect to this object is needed. Thus, the relationship between two coordinate systems needs to be defined, one attached to the user's head, the other attached to the object.
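The relationship between the two coordinate systems can be expressed with 4x4 homogeneous transforms. This Python sketch is illustrative only; the poses shown are assumed example values, not data from the patent:

```python
import numpy as np

def make_pose(rotation, translation):
    """Build a 4x4 homogeneous transform from a 3x3 rotation and a translation."""
    T = np.eye(4)
    T[:3, :3] = rotation
    T[:3, 3] = translation
    return T

# Poses reported by a tracker, both expressed in the tracker's world frame
# (illustrative values):
T_world_head = make_pose(np.eye(3), [0.0, 1.6, 0.0])    # user's head
T_world_object = make_pose(np.eye(3), [0.5, 1.0, 2.0])  # anchored object

# The relationship the text requires: the object's pose in head coordinates.
T_head_object = np.linalg.inv(T_world_head) @ T_world_object
```

Tracking, in this picture, is simply re-evaluating T_head_object every frame as the tracker reports new poses for the head and the object.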
Tracking denotes the process of keeping track of this relationship. Commercial tracking systems are available based on optical, magnetic, ultrasound, and mechanical means.
Calibration is needed to achieve correct alignment between virtual graphics objects and real objects in the scene. Calibrating a video-see-through HMD can be done in an objective way, independent of a user, as real and virtual images are combined in the computer. In contrast, with an optical-see-through HMD the combination of the real and virtual images takes place finally in the user's eye, and the position of the user's eye behind the semi-transparent screen has critical influence on the alignment.
Different methods for calibrating an optical see-through HMD are known in the prior art. All known calibration methods require the user to align virtual structures with real reference structures. For example, in the SPAAM method the user is shown a sequence of fixed graphical markers on the display and moves the head to bring them into alignment with a reference marker in the real scene. This alignment is hampered by the user's head jitter: due to head jitter the location of the real marker jitters, and it is not possible to precisely align virtual and real markers.
For augmented reality applications needing both precise measurements and comfortable use, such as in an operating room, no known system currently exists. Therefore, a need exists for a system and method for augmented reality calibration of see-through head-mounted displays.
SUMMARY OF THE INVENTION
An augmented reality system comprises a real reference generator for displaying a real reference on a calibration screen, an optical see-through display having a fixed position with respect to the real reference generator and a virtual reference generator for displaying a virtual reference on the optical see-through display. The augmented reality system further comprises an input device for aligning a view of the virtual reference with a view of the real reference through the optical see-through display, wherein the virtual reference is moved on the optical see-through display, and a processor for determining one or more parameters for rendering a virtual object as part of a real scene seen through the optical see- through display.
The augmented reality system comprises a tracking camera for tracking a pose of the calibration screen with respect to the real reference.
The augmented reality system comprises a tracking camera having a fixed position with respect to the real reference generator for capturing a view of the calibration screen. The augmented reality system further comprises a processor, wherein an optical marker configuration is fixed to the calibration screen and imaged by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to a position of the optical marker configuration in an image captured by the tracking camera, the head-mounted display comprising the real reference generator and optical see- through display.
The augmented reality system comprises at least one tracking camera for capturing a view of the calibration screen and a head-mounted display comprising the real reference generator and optical see-through display. The augmented reality system further comprises a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the at least one tracking camera, wherein the processor determines a positional relationship between the calibration screen and the head-mounted display according to the positions of respective optical marker configurations in the view captured by the at least one tracking camera.
A system for calibrating real and virtual views comprises a real reference generator for displaying a real reference on a calibration screen, an optical display having a fixed position with respect to the real reference point generator and a virtual reference generator for generating a virtual reference in the optical display. The system further comprises an input device for aligning a view of the virtual reference with a view of the real reference, wherein the virtual reference is moved on the optical display with respect to the view of the real reference and a processor for determining one or more parameters for rendering a virtual object in a real scene seen in the optical display.
The system further comprising a camera capturing the view of the real reference, wherein the real reference is displayed in the optical display with the virtual reference superimposed thereon. The system comprises a tracking camera having a fixed position with respect to the real reference generator for capturing a view of the calibration screen and a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to the position of the optical marker configuration in the view captured by the tracking camera, the head-mounted display comprising the real reference generator and optical display.
The system comprises a tracking camera coupled to the real reference generator for capturing a view of the calibration screen. The system further comprises a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to the position of the optical marker configuration in the view captured by the tracking camera, the head-mounted display comprising the real reference generator and optical display.
The system comprises at least one tracking camera for capturing a view of the calibration screen and a head-mounted display comprising the real reference generator and optical display. The system further comprises a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and the head-mounted display according to the positions of respective optical marker configurations in the view captured by the at least one tracking camera.
A method for calibrating real and virtual views comprises tracking a calibration screen, wherein a real reference, generated by a real reference generator, is projected on the calibration screen, aligning a virtual reference to a view of the real reference in a display, wherein the real reference generator and the display have a fixed relative position, determining a point correspondence between the virtual reference and the real reference, and determining one or more parameters for rendering a virtual object in the real scene.
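One common way to turn the recorded 3D-2D point correspondences into rendering parameters is a direct linear transform (DLT) estimate of a 3x4 projection matrix. The patent does not prescribe a particular algorithm; the following Python sketch is an illustrative assumption:

```python
import numpy as np

def estimate_projection(points_3d, points_2d):
    """DLT estimate of a 3x4 projection matrix P with x ~ P X,
    from N >= 6 correspondences (3D reference points, 2D display points)."""
    A = []
    for (X, Y, Z), (u, v) in zip(points_3d, points_2d):
        A.append([X, Y, Z, 1, 0, 0, 0, 0, -u * X, -u * Y, -u * Z, -u])
        A.append([0, 0, 0, 0, X, Y, Z, 1, -v * X, -v * Y, -v * Z, -v])
    # The solution is the right singular vector of the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 4)

def project(P, X):
    """Project a 3D point with P, returning 2D display coordinates."""
    x = P @ np.append(np.asarray(X, dtype=float), 1.0)
    return x[:2] / x[2]

# Synthetic check with an assumed camera and non-coplanar reference points
# (e.g., gathered over several calibration-screen poses):
P_true = np.array([[800.0, 0.0, 320.0, 0.0],
                   [0.0, 800.0, 240.0, 0.0],
                   [0.0, 0.0, 1.0, 0.0]])
pts_3d = np.array([[0, 0, 2], [1, 0, 3], [0, 1, 4], [1, 1, 2],
                   [-1, 0.5, 3], [0.5, -1, 5], [-0.5, -0.5, 2.5],
                   [0.3, 0.8, 3.5]], dtype=float)
pts_2d = np.array([project(P_true, X) for X in pts_3d])
P_est = estimate_projection(pts_3d, pts_2d)
```

The estimated matrix can be decomposed into intrinsic and extrinsic parameters, or used directly as the virtual camera when rendering the overlay; averaging several alignments per correspondence reduces noise in points_2d.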
The method comprises displaying the virtual reference on an optical see-through display, through which the real reference is visible.
The method comprises capturing a view of a real scene including the real reference and displaying the view of the real scene augmented with the virtual reference.
BRIEF DESCRIPTION OF THE DRAWINGS
Preferred embodiments of the present invention will be described below in more detail, with reference to the accompanying drawings:
Figure 1 is an illustration of a calibration system according to an embodiment of the present disclosure; Figure 2A is an illustration of an augmented reality calibration system according to an embodiment of the present disclosure;
Figure 2B is an illustration of an augmented reality calibration system according to an embodiment of the present disclosure;
Figure 2C is an illustration of a video-see-through augmented reality calibration system according to an embodiment of the present disclosure;
Figure 3 is a flow chart of a method according to an embodiment of the present disclosure; and
Figure 4 is a flow chart of a method according to an embodiment of the present disclosure.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
A system and method for calibration of an optical see-through head-mounted display (HMD) implements a real reference as a light spot originating from an illuminator attached to the HMD. The light spots "jitter along" with the HMD, and the user does not perceive any jitter between these reference markers and virtual markers that are displayed at a fixed location on a semi-transparent screen of the HMD.
Referring to Figure 1, to calibrate an optical see-through system, a user 100 aligns a virtual reference 101, displayed as graphics on the HMD's semi-transparent screen 103, with a real reference structure 102, observed through the screen 103. The real reference structure 102 is implemented as a projected light point and/or pattern on a calibration screen 104 or other substrate. The real reference 102 and the virtual reference 101 may be, for example, one or more points or shapes.
It is to be understood that the present invention may be implemented in various forms of hardware, software, firmware, special purpose processors, or a combination thereof. In one embodiment, the present invention may be implemented in software as an application program tangibly embodied on a program storage device. The application program may be uploaded to, and executed by, a machine comprising any suitable architecture.
It is to be further understood that, because some of the constituent system components and method steps depicted in the accompanying figures may be implemented in software, the actual connections between the system components (or the process steps) may differ depending upon the manner in which the present invention is programmed. Given the teachings of the present invention provided herein, one of ordinary skill in the related art will be able to contemplate these and similar implementations or configurations of the present invention.
Referring to Figures 2A and 2B, the real reference 102 originates from an illumination system 200 that is rigidly attached to the HMD 201 and observed through a screen 103. When the user moves the HMD 201, the real reference 102 moves along on the calibration screen 104, and for small head movements the real reference 102 appears fixed with respect to the virtual reference 101 as seen through the HMD's semi-transparent screen 103. The alignment process is now easier from the user's 100 vantage, as jitter between the real reference 102 and the virtual reference 101 is substantially reduced.
The real reference 102 is observed on a flat screen 104. The user can hold the screen 104 in one hand at arm's length, place it on a table, etc.
The screen 104 is tracked with respect to the user's head or HMD 201. A tracking arrangement includes an external (see Figure 2A) or head-mounted (see Figure 2B) tracking camera 202. In the case of optical tracking, the screen 104 and, in the case of Figure 2A's external tracking camera, the HMD 201 include optical markers 203. The optical markers 203 may be, for example, retroreflective flat discs or retroreflective spheres.
The illuminator 200 projects a light pattern 102 that includes one or, preferably, several points. The illuminator 200 can be constructed with a single light source and an optical system that generates the light pattern 102. For example, a laser could be used together with a lens system and a mask, or with diffractive optics, to generate an array of light spots.
Alternatively, an array of light sources such as an LED array may be used with the advantage that the light pattern can be made switchable. The LEDs can be switched on and off individually. The LEDs can be combined with a lens system or with micro-optics, e.g. a lens array.
The illuminator 200 can also include a scanning means or beam deflection means to switch between different beam directions.
The screen 103 may be, for example, a monocular or binocular arrangement. In the binocular arrangement, both screens are preferably calibrated individually, one after the other. Appropriate optics 204, in combination with the semi-transparent display 103, generate an image of the virtual reference 101. The virtual reference 101 is visible to a user as the user looks through the semi-transparent display 103. Alternatively, the see-through display can be embodied with an image projector and an optical system, which includes a beam splitter.
To perform the calibration, the user 100 moves the virtual reference(s) 101 displayed on the semi-transparent screen 103 into alignment with the reference light pattern 102 as seen from the user's perspective, e.g., 205. The user 100 controls an interface 206 (e.g., processor 207 and input device 208) to move the virtual reference(s) 101 on the screen 103. The input device 208 may be, for example, a trackball or a mouse. The processor 207 may be a virtual reference generator comprising a processor and graphics card to render the virtual reference for display on the semi-transparent screen.
To complete the calibration process, the user aligns the virtual reference 101 to several different real reference light points, e.g., 102. For better calibration accuracy, the user may assume different poses (e.g., distances and/or orientations) with regard to the calibration screen 104.
A processor, e.g., 207, determines a spatial relationship between the calibration screen 104 and HMD 201 according to the positions of markers 203 and the user-determined alignment of the virtual reference 101 and the real reference 102. Figure 2B is an example of an HMD including a tracking camera 202. As shown, where the tracking camera is fixed to the HMD, the spatial relationship between the calibration screen 104 and HMD 201 may be determined using optical markers 203 fixed to the calibration screen 104 only. Further, the pose of the calibration screen 104 may be determined according to the relationship of different optical markers 203 fixed to the screen 104.
Figure 2C is a video-see-through augmented reality system according to an embodiment of the present disclosure, wherein
a camera, e.g., 202, captures an image of a real scene including a real reference 102. The tracking and video functions of camera 202 may be performed by separate cameras. The image of the real scene is displayed to the user 100. A virtual view is superimposed on the real view, e.g., 209, and the user 100 perceives combined real and virtual views, e.g., 204. The user may align the virtual reference with the view of the real reference in the real scene.
Referring to Figure 3, consider the case of a head-mounted tracking camera. The tracking camera and illuminator are mechanically fixed to each other, and the location and orientation of the light beams, which generate the reference points, are determined with respect to a coordinate system of the tracking camera. By tracking the calibration screen 301, one can determine the 3D coordinates of the reference points in the tracking-camera coordinate system as the intersection of the corresponding light beams with the plane of the screen.
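The beam-plane intersection described above can be sketched as follows. This is an illustrative computation only; the function name and the sample beam and screen values are hypothetical, not taken from the disclosure, and all quantities are assumed to be expressed in the tracking-camera coordinate system.

```python
def intersect_beam_with_plane(origin, direction, plane_point, plane_normal):
    """Return the 3D point where a light beam meets the screen plane.

    The beam is the ray origin + t * direction; the plane is given by a
    point on the screen and the screen's normal vector.
    """
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))

    denom = dot(plane_normal, direction)
    if abs(denom) < 1e-9:
        raise ValueError("beam is parallel to the screen plane")
    # Solve dot(n, (o + t*d) - p0) = 0 for the ray parameter t.
    t = dot(plane_normal, [p - o for p, o in zip(plane_point, origin)]) / denom
    return [o + t * d for o, d in zip(origin, direction)]

# Hypothetical example: a beam leaving the illuminator along +z meets a
# screen whose plane is z = 0.5 (meters) in camera coordinates.
point = intersect_beam_with_plane([0.0, 0.0, 0.0], [0.0, 0.0, 1.0],
                                  [0.0, 0.0, 0.5], [0.0, 0.0, 1.0])
# point is [0.0, 0.0, 0.5]
```

Tracking the screen supplies `plane_point` and `plane_normal` per frame, so each reference light point's 3D position follows directly from the fixed beam geometry.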
During the calibration process a user aligns the real and virtual references 302 and the system records a set of 3D-2D point correspondences 303. Each correspondence consists of the 3D coordinates of a reference light point and the 2D coordinates of the virtual marker that the user has aligned to the reference. This set of point correspondences allows one to determine one or more parameters for rendering the virtual objects in correct alignment with the real scene 304. For example, the camera parameters that determine the user's view of the virtual world, as displayed on the semi-transparent screen, are matched to the camera parameters that determine the user's view of the real world as seen through the screen. Such determinations are known in the art, for example, as described in U.S. Patent Application No. 20020105484, filed 09/25/01, entitled "System and Method for Calibrating a Monocular Optical See-Through Head-Mounted Display System for Augmented Reality", wherein calibration may include instantiating parameter values for mathematical models that map the physical environment to internal representations, so that the computer's internal model matches the physical environment. These parameters include, for example, optical characteristics of a physical camera and position and orientation (pose) information of various entities such as the camera, the markers for tracking, and the various objects.
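For illustration only, the kind of pinhole-camera parameters such a calibration instantiates can be sketched as follows; the function names and the focal-length and principal-point values are hypothetical, and this simplified model omits rotation, translation, and distortion terms a full calibration would also estimate.

```python
def project(point_3d, f, cx, cy):
    """Project a 3D point (camera coordinates, z > 0) to 2D display
    coordinates with a simple pinhole model, parameterized by focal
    length f and principal point (cx, cy)."""
    x, y, z = point_3d
    return (f * x / z + cx, f * y / z + cy)

def reprojection_error(correspondences, f, cx, cy):
    """Mean distance between the user-aligned 2D marker positions and
    the projections of their 3D reference points; calibration seeks
    parameter values that make this error small."""
    total = 0.0
    for p3d, p2d in correspondences:
        u, v = project(p3d, f, cx, cy)
        total += ((u - p2d[0]) ** 2 + (v - p2d[1]) ** 2) ** 0.5
    return total / len(correspondences)

# Hypothetical 3D-2D correspondences that are exactly consistent with
# f=500 pixels, principal point (320, 240).
pairs = [((0.0, 0.0, 1.0), (320.0, 240.0)),
         ((0.1, 0.0, 1.0), (370.0, 240.0))]
# reprojection_error(pairs, 500, 320, 240) is 0.0 for these exact pairs
```

The recorded correspondences play the role of `pairs` here: once enough are collected, the parameter values minimizing the reprojection error define the virtual camera used for rendering.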
After successful calibration of the optical see-through augmented reality system for the individual user, the system can render 3D graphical objects such that they appear rigidly anchored in the real scene. The user's viewpoint changes are tracked with the tracking system and accounted for with corresponding changes of the graphics objects' virtual view.
Alternatively to the case of a head-mounted tracking camera, external tracking means can be used in conjunction with head-mounted markers or sensors that are rigidly fixed with respect to the illuminator. The tracking system tracks both the HMD and the calibration screen 401. Again, the real and virtual references are aligned 402, and the 3D coordinates of the calibration light points are determined as the intersection of the light beams with the screen plane 403. The virtual reference points are brought into alignment with the real reference points displayed as light on the screen, and the correspondence is recorded 404.
The system includes a head-mounted display, tracking means, computing and graphics rendering means, light projection means, and a trackable screen.
Calibration alignments between the real and virtual reference structures may be averaged over several measurements for each point correspondence. Note that the virtual marker and real marker appear jitter-free relative to each other. Averaging may reduce error in the calibration, and is user-friendly compared to calibration procedures that use external features: here, the user can hold the alignment for one or several seconds because of the reduced jitter between the real and virtual markers.
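The per-correspondence averaging can be sketched as below, assuming each recorded sample is a 2D display coordinate of the virtual marker while the user holds the alignment; the function name and sample values are hypothetical.

```python
def average_alignment(samples):
    """Average repeated 2D alignments of the virtual marker to one
    reference light point. Because the real and virtual markers appear
    jitter-free relative to each other, the user can hold the alignment
    while several samples are recorded, and the mean reduces noise."""
    n = len(samples)
    return (sum(u for u, _ in samples) / n,
            sum(v for _, v in samples) / n)

# Hypothetical repeated measurements of one alignment, in pixels.
samples = [(321.0, 240.5), (320.0, 239.5), (319.0, 240.0)]
# average_alignment(samples) is (320.0, 240.0)
```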
Having described embodiments for a system and method for calibrating real and virtual views, it is noted that modifications and variations can be made by persons skilled in the art in light of the above teachings. It is therefore to be understood that changes may be made in the particular
embodiments of the invention disclosed which are within the scope and spirit of the invention as defined by the appended claims. Having thus described the invention with the details and particularity required by the patent laws, what is claimed and desired protected by Letters Patent is set forth in the appended claims.

Claims

What is claimed is:
1. An augmented reality system comprising: a real reference generator for displaying a real reference on a calibration screen; an optical see-through display having a fixed position with respect to the real reference generator; a virtual reference generator for displaying a virtual reference on the optical see-through display; an input device for aligning a view of the virtual reference with a view of the real reference through the optical see-through display, wherein the virtual reference is moved on the optical see-through display; and a processor for determining one or more parameters for rendering a virtual object as part of a real scene seen through the optical see-through display.
2. The augmented reality system of claim 1, further comprising a tracking camera for tracking a pose of the calibration screen with respect to the real reference.
3. The augmented reality system of claim 1, further comprising a tracking camera having a fixed position with respect to the real reference generator for capturing a view of the calibration screen.
4. The augmented reality system of claim 3, further comprising a processor, wherein an optical marker configuration is fixed to the calibration screen and imaged by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to a position of the optical marker configuration in an image captured by the tracking camera, the head-mounted display comprising the real reference generator and optical see-through display.
5. The augmented reality system of claim 1, further comprising at least one tracking camera for capturing a view of the calibration screen and a head-mounted display comprising the real reference generator and optical see-through display.
6. The augmented reality system of claim 5, further comprising a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the at least one tracking camera, wherein the processor determines a positional relationship between the calibration screen and the head-mounted display according to the positions of respective optical marker configurations in the view captured by the at least one tracking camera.
7. A system for calibrating real and virtual views comprising: a real reference generator for displaying a real reference on a calibration screen; an optical display having a fixed position with respect to the real reference generator; a virtual reference generator for generating a virtual reference in the optical display; an input device for aligning a view of the virtual reference with a view of the real reference, wherein the virtual reference is moved on the optical display with respect to the view of the real reference; and a processor for determining one or more parameters for rendering a virtual object in a real scene seen in the optical display.
8. The system for calibrating real and virtual views of claim 7, further comprising a camera capturing the view of the real reference, wherein the real reference is displayed in the optical display with the virtual reference superimposed thereon.
9. The system for calibrating real and virtual views of claim 8, further comprising: a tracking camera having a fixed position with respect to the real reference generator for capturing a view of the calibration screen; and a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to the position of the optical marker configuration in the view captured by the tracking camera, the head-mounted display comprising the real reference generator and optical display.
10. The system for calibrating real and virtual views of claim 7, further comprising a tracking camera coupled to the real reference generator for capturing a view of the calibration screen.
11. The system for calibrating real and virtual views of claim 10, further comprising a processor, wherein an optical marker configuration is fixed to the calibration screen and tracked by the tracking camera, wherein the processor determines a positional relationship between the calibration screen and a head-mounted display according to the position of the optical marker configuration in the view captured by the tracking camera, the head-mounted display comprising the real reference generator and optical display.
12. The system for calibrating real and virtual views of claim 7, further comprising at least one tracking camera for capturing a view of the calibration screen and a head-mounted display comprising the real reference generator and optical display.
13. The system for calibrating real and virtual views of claim 12, further comprising a processor, wherein an optical marker configuration is fixed to each of the calibration screen and the head-mounted display and tracked by the at least one tracking camera, wherein the processor determines a positional relationship between the calibration screen and the head-mounted display according to the positions of respective optical marker configurations in the view captured by the at least one tracking camera.
14. A method for calibrating real and virtual views comprising: tracking a calibration screen, wherein a real reference, generated by a real reference generator, is projected on the calibration screen; aligning a virtual reference to a view of the real reference in a display, wherein the real reference generator and the display have a fixed relative position; determining a point correspondence between the virtual reference and the real reference; and determining one or more parameters for rendering a virtual object in the real scene.
15. The method for calibrating real and virtual views of claim 14, further comprising displaying the virtual reference on an optical see-through display, through which the real reference is visible.
16. The method for calibrating real and virtual views of claim 14, further comprising: capturing a view of a real scene including the real reference; and displaying the view of the real scene augmented with the virtual reference.
PCT/US2004/018346 2003-06-12 2004-06-09 Calibrating real and virtual views WO2004113991A2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE112004000902T DE112004000902T5 (en) 2003-06-12 2004-06-09 Calibration of actual and virtual views

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US47786103P 2003-06-12 2003-06-12
US60/477,861 2003-06-12
US10/863,414 2004-06-08
US10/863,414 US7369101B2 (en) 2003-06-12 2004-06-08 Calibrating real and virtual views

Publications (2)

Publication Number Publication Date
WO2004113991A2 true WO2004113991A2 (en) 2004-12-29
WO2004113991A3 WO2004113991A3 (en) 2005-02-24

Family

ID=33544358

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/018346 WO2004113991A2 (en) 2003-06-12 2004-06-09 Calibrating real and virtual views

Country Status (3)

Country Link
US (1) US7369101B2 (en)
DE (1) DE112004000902T5 (en)
WO (1) WO2004113991A2 (en)

US11086216B2 (en) 2015-02-09 2021-08-10 Microsoft Technology Licensing, Llc Generating electronic components
US9429692B1 (en) 2015-02-09 2016-08-30 Microsoft Technology Licensing, Llc Optical components
US10018844B2 (en) 2015-02-09 2018-07-10 Microsoft Technology Licensing, Llc Wearable image display system
US9827209B2 (en) 2015-02-09 2017-11-28 Microsoft Technology Licensing, Llc Display system
US10317677B2 (en) 2015-02-09 2019-06-11 Microsoft Technology Licensing, Llc Display system
US9535253B2 (en) 2015-02-09 2017-01-03 Microsoft Technology Licensing, Llc Display system
US9423360B1 (en) 2015-02-09 2016-08-23 Microsoft Technology Licensing, Llc Optical components
US9372347B1 (en) 2015-02-09 2016-06-21 Microsoft Technology Licensing, Llc Display system
US9513480B2 (en) 2015-02-09 2016-12-06 Microsoft Technology Licensing, Llc Waveguide
WO2016172167A1 (en) * 2015-04-20 2016-10-27 Washington University Camera calibration with lenticular arrays
JP6892213B2 (en) * 2015-04-30 2021-06-23 ソニーグループ株式会社 Display device and initial setting method of display device
EP3286598B1 (en) * 2015-06-15 2023-11-15 Essilor International Method for calibrating a binocular displaying device
EP3296791A1 (en) * 2016-09-16 2018-03-21 Nokia Technologies Oy Method, apparatus and computer program for near eye display
FR3060774A1 (en) * 2016-12-16 2018-06-22 Peugeot Citroen Automobiles Sa Method for adjusting an augmented reality head-up display device
CA3045780A1 (en) * 2017-01-30 2018-08-02 Novartis Ag Systems and method for augmented reality ophthalmic surgical microscope projection
KR102436730B1 (en) 2017-12-06 2022-08-26 삼성전자주식회사 Method and apparatus for estimating parameter of virtual screen
US11861062B2 (en) 2018-02-03 2024-01-02 The Johns Hopkins University Blink-based calibration of an optical see-through head-mounted display
US11386572B2 (en) 2018-02-03 2022-07-12 The Johns Hopkins University Calibration system and method to align a 3D virtual scene and a 3D real world for a stereoscopic head-mounted display
CN110569006B (en) * 2018-06-05 2023-12-19 广东虚拟现实科技有限公司 Display method, display device, terminal equipment and storage medium
WO2020152585A1 (en) 2019-01-21 2020-07-30 Insightness Ag Transparent smartphone
US11106044B2 (en) * 2019-07-02 2021-08-31 GM Global Technology Operations LLC Eye height based virtual image alignment for head-up display
US10885819B1 (en) * 2019-08-02 2021-01-05 Harman International Industries, Incorporated In-vehicle augmented reality system
US11920922B2 (en) * 2021-04-02 2024-03-05 Rockwell Collins, Inc. Combiner alignment detector
US11948265B2 (en) 2021-11-27 2024-04-02 Novarad Corporation Image data set alignment for an AR headset using anatomic structures and data fitting
US20230169696A1 (en) * 2021-11-27 2023-06-01 Novarad Corporation Transfer of Alignment Accuracy Between Visible Markers Used with Augmented Reality Displays

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4281241A (en) * 1977-02-21 1981-07-28 Australasian Training Aids (Pty.) Ltd. Firing range
US4439755A (en) * 1981-06-04 1984-03-27 Farrand Optical Co., Inc. Head-up infinity display and pilot's sight
GB2259213A (en) * 1991-08-29 1993-03-03 British Aerospace Variable resolution view-tracking display
EP0827337A1 (en) * 1996-02-26 1998-03-04 Seiko Epson Corporation Wearable information displaying device and information displaying method using the same
WO2001078015A2 (en) * 2000-04-07 2001-10-18 Carnegie Mellon University Computer-aided bone distraction
US20020105484A1 (en) * 2000-09-25 2002-08-08 Nassir Navab System and method for calibrating a monocular optical see-through head-mounted display system for augmented reality

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
GB2259231B (en) 1991-09-05 1995-04-26 Anthony Godfrey Bunbury Renewable energy plant propagator unit
EP0710387B1 (en) 1993-07-22 1997-12-03 Minnesota Mining And Manufacturing Company Method and apparatus for calibrating three-dimensional space for machine vision applications
US5610678A (en) * 1993-12-30 1997-03-11 Canon Kabushiki Kaisha Camera including camera body and independent optical viewfinder
CA2318252A1 (en) 1998-01-28 1999-08-05 Eric R. Cosman Optical object tracking system
JP3631151B2 (en) * 2000-11-30 2005-03-23 キヤノン株式会社 Information processing apparatus, mixed reality presentation apparatus and method, and storage medium

Cited By (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006072527A1 (en) * 2005-01-05 2006-07-13 Siemens Aktiengesellschaft Head-up display for a motor vehicle
EP1708139A2 (en) 2005-04-01 2006-10-04 Canon Kabushiki Kaisha Calibration method and apparatus
EP1708139A3 (en) * 2005-04-01 2008-11-19 Canon Kabushiki Kaisha Calibration method and apparatus
US7542051B2 (en) 2005-04-01 2009-06-02 Canon Kabushiki Kaisha Calibration method and apparatus
EP1847963A1 (en) * 2006-04-20 2007-10-24 Koninklijke KPN N.V. Method and system for displaying visual information on a display
WO2007121880A1 (en) * 2006-04-20 2007-11-01 Koninklijke Kpn N.V. Method and system for displaying visual information on a display
US9846304B2 (en) 2006-10-16 2017-12-19 Sony Corporation Display method and display apparatus in which a part of a screen area is in a through-state
EP2081182A4 (en) * 2006-10-16 2011-08-24 Sony Corp Display device and display method
US9182598B2 (en) 2006-10-16 2015-11-10 Sony Corporation Display method and display apparatus in which a part of a screen area is in a through-state
EP2081182A1 (en) * 2006-10-16 2009-07-22 Sony Corporation Display device and display method
US8681256B2 (en) 2006-10-16 2014-03-25 Sony Corporation Display method and display apparatus in which a part of a screen area is in a through-state
FR2934057A1 (en) * 2008-07-16 2010-01-22 Xavier Arthur Carriou Method for calibrating the direction of a user's head, e.g. for positioning imagery in a real environment, involving aligning virtual images with physical elements in the non-stereoscopic peripheral visual field of the eyes
WO2011073682A1 (en) * 2009-12-17 2011-06-23 Bae Systems Plc A method of aligning a helmet mounted display
EP2341386A1 (en) * 2009-12-17 2011-07-06 BAE Systems PLC A method of aligning a helmet mounted display
US9323056B2 (en) 2009-12-17 2016-04-26 Bae Systems Plc Method of aligning a helmet mounted display
WO2012034767A1 (en) * 2010-09-14 2012-03-22 Robert Bosch Gmbh Head-up display
US9039720B2 (en) 2010-11-05 2015-05-26 Ethicon Endo-Surgery, Inc. Surgical instrument with ratcheting rotatable shaft
US10973563B2 (en) 2010-11-05 2021-04-13 Ethicon Llc Surgical instrument with charging devices
US10376304B2 (en) 2010-11-05 2019-08-13 Ethicon Llc Surgical instrument with modular shaft and end effector
US9072523B2 (en) 2010-11-05 2015-07-07 Ethicon Endo-Surgery, Inc. Medical device with feature for sterile acceptance of non-sterile reusable component
US9089338B2 (en) 2010-11-05 2015-07-28 Ethicon Endo-Surgery, Inc. Medical device packaging with window for insertion of reusable component
US11925335B2 (en) 2010-11-05 2024-03-12 Cilag Gmbh International Surgical instrument with slip ring assembly to power ultrasonic transducer
US11744635B2 (en) 2010-11-05 2023-09-05 Cilag Gmbh International Sterile medical instrument charging device
US11690605B2 (en) 2010-11-05 2023-07-04 Cilag Gmbh International Surgical instrument with charging station and wireless communication
WO2012061727A3 (en) * 2010-11-05 2012-11-01 Ethicon Endo-Surgery, Inc. Surgical instrument safety glasses or surgical monitor with visual feed back
US11389228B2 (en) 2010-11-05 2022-07-19 Cilag Gmbh International Surgical instrument with sensor and powered control
US10537380B2 (en) 2010-11-05 2020-01-21 Ethicon Llc Surgical instrument with charging station and wireless communication
US9000720B2 (en) 2010-11-05 2015-04-07 Ethicon Endo-Surgery, Inc. Medical device packaging with charging interface
US9649150B2 (en) 2010-11-05 2017-05-16 Ethicon Endo-Surgery, Llc Selective activation of electronic components in medical device
US10959769B2 (en) 2010-11-05 2021-03-30 Ethicon Llc Surgical instrument with slip ring assembly to power ultrasonic transducer
US9782215B2 (en) 2010-11-05 2017-10-10 Ethicon Endo-Surgery, Llc Surgical instrument with ultrasonic transducer having integral switches
US9782214B2 (en) 2010-11-05 2017-10-10 Ethicon Llc Surgical instrument with sensor and powered control
US10143513B2 (en) 2010-11-05 2018-12-04 Ethicon Llc Gear driven coupling between ultrasonic transducer and waveguide in surgical instrument
US10945783B2 (en) 2010-11-05 2021-03-16 Ethicon Llc Surgical instrument with modular shaft and end effector
US10881448B2 (en) 2010-11-05 2021-01-05 Ethicon Llc Cam driven coupling between ultrasonic transducer and waveguide in surgical instrument
US10085792B2 (en) 2010-11-05 2018-10-02 Ethicon Llc Surgical instrument with motorized attachment feature
US10660695B2 (en) 2010-11-05 2020-05-26 Ethicon Llc Sterile medical instrument charging device
WO2012116059A1 (en) * 2011-02-22 2012-08-30 Qualcomm Incorporated Providing a corrected view based on the position of a user with respect to a mobile platform
US9507416B2 (en) 2011-02-22 2016-11-29 Robert Howard Kimball Providing a corrected view based on the position of a user with respect to a mobile platform
WO2013114066A1 (en) * 2012-01-30 2013-08-08 Bae Systems Plc Improvements in or relating to image display systems
US9448406B2 (en) 2012-01-30 2016-09-20 Bae Systems Plc Image display systems
EP2747034A3 (en) * 2012-12-21 2018-01-24 Dassault Systemes Delmia Corp. Location correction of virtual objects
JP2014123376A (en) * 2012-12-21 2014-07-03 Dassault Systemes Delmia Corp Location correction of virtual objects
CN105320271B (en) * 2014-07-10 2018-09-14 精工爱普生株式会社 It is calibrated using the head-mounted display of direct Geometric Modeling
EP2966863A1 (en) * 2014-07-10 2016-01-13 Seiko Epson Corporation Hmd calibration with direct geometric modeling
US10198865B2 (en) 2014-07-10 2019-02-05 Seiko Epson Corporation HMD calibration with direct geometric modeling
CN105320271A (en) * 2014-07-10 2016-02-10 精工爱普生株式会社 HMD calibration with direct geometric modeling
US10136938B2 (en) 2014-10-29 2018-11-27 Ethicon Llc Electrosurgical instrument with sensor
US11252399B2 (en) 2015-05-28 2022-02-15 Microsoft Technology Licensing, Llc Determining inter-pupillary distance
US9746675B2 (en) 2015-05-28 2017-08-29 Microsoft Technology Licensing, Llc Alignment based view matrix tuning
WO2016191043A1 (en) * 2015-05-28 2016-12-01 Microsoft Technology Licensing, Llc Calibration of an optical see-through head mounted display
US10271042B2 (en) 2015-05-29 2019-04-23 Seeing Machines Limited Calibration of a head mounted eye tracking system
US10192133B2 (en) 2015-06-22 2019-01-29 Seiko Epson Corporation Marker, method of detecting position and pose of marker, and computer program
US10296805B2 (en) 2015-06-22 2019-05-21 Seiko Epson Corporation Marker, method of detecting position and pose of marker, and computer program
US10192361B2 (en) 2015-07-06 2019-01-29 Seiko Epson Corporation Head-mounted display device and computer program
US10242504B2 (en) 2015-07-06 2019-03-26 Seiko Epson Corporation Head-mounted display device and computer program
US10424117B2 (en) 2015-12-02 2019-09-24 Seiko Epson Corporation Controlling a display of a head-mounted display device
US10347048B2 (en) 2015-12-02 2019-07-09 Seiko Epson Corporation Controlling a display of a head-mounted display device
EP3948774A4 (en) * 2019-03-29 2022-06-01 Nec Corporation System and method for adaptively constructing a three-dimensional facial model based on two or more inputs of a two-dimensional facial image
US11227441B2 (en) 2019-07-04 2022-01-18 Scopis Gmbh Technique for calibrating a registration of an augmented reality device

Also Published As

Publication number Publication date
DE112004000902T5 (en) 2006-03-09
US20060152434A1 (en) 2006-07-13
WO2004113991A3 (en) 2005-02-24
US7369101B2 (en) 2008-05-06

Similar Documents

Publication Publication Date Title
US7369101B2 (en) Calibrating real and virtual views
CN100416336C (en) Calibrating real and virtual views
US20240000295A1 (en) Light field capture and rendering for head-mounted displays
US6891518B2 (en) Augmented reality visualization device
US20240080433A1 (en) Systems and methods for mediated-reality surgical visualization
Azuma A survey of augmented reality
US20060176242A1 (en) Augmented reality device and method
Azuma Augmented reality: Approaches and technical challenges
JP2022530012A (en) Head-mounted display with pass-through image processing
KR100542370B1 (en) Vision-based augmented reality system using invisible marker
KR20180101496A (en) Head-mounted display for virtual and mixed reality with inside-out location, user body and environment tracking
JP2001211403A (en) Head mount display device and head mount display system
JP3372926B2 (en) Head mounted display device and head mounted display system
Vogt et al. Reality augmentation for medical procedures: System architecture, single camera marker tracking, and system evaluation
JP2015060071A (en) Image display device, image display method, and image display program
Hua et al. A testbed for precise registration, natural occlusion and interaction in an augmented environment using a head-mounted projective display (HMPD)
Livingston Vision-based tracking with dynamic structured light for video see-through augmented reality
JPH1066678A (en) Non-contact line-of-sight measurement device
CN113589533A (en) Head mounted display and method for determining line of sight of user wearing the same
US20210208402A1 (en) A System and Method for Alignment Between Real and Virtual Objects in a Head-Mounted Optical See-Through Display
US11071453B2 (en) Systems and methods for reflection-based positioning relative to an eye
Gao et al. Easy calibration of a head-mounted projective display for augmented reality systems
CN214376323U (en) Entertainment helmet
EP4322114A1 (en) Projective bisector mirror
EP4329662A1 (en) Optical see through (ost) head mounted display (hmd) system and method for precise alignment of virtual objects with outwardly viewed objects

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 20048161027

Country of ref document: CN

RET De translation (de og part 6b)

Ref document number: 112004000902

Country of ref document: DE

Date of ref document: 20060309

Kind code of ref document: P

WWE Wipo information: entry into national phase

Ref document number: 112004000902

Country of ref document: DE

DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
122 Ep: pct application non-entry in european phase
REG Reference to national code

Ref country code: DE

Ref legal event code: 8607