US20020041259A1 - Personal display with vision tracking - Google Patents

Personal display with vision tracking

Info

Publication number: US20020041259A1 (granted as US6396461B1)
Authority: US (United States)
Application number: US09/128,954
Inventors: John R. Lewis, Nenad Nestorovic
Assignee (original and current): Microvision, Inc.
Legal status: Granted; Expired - Lifetime
Prior art keywords: eye, light, image, user, source
Family litigation: first worldwide family litigation filed

Application US09/128,954 filed by Microvision, Inc.
Assigned to Microvision, Inc. Assignors: Lewis, John R.; Nestorovic, Nenad
Publication of US20020041259A1
Related application US10/150,309 (published as US20020167462A1)
Application granted; publication of US6396461B1

Classifications

    • G06F 3/013: Eye tracking input arrangements (under G06F 3/01, input arrangements for interaction between user and computer)
    • G02B 27/0093: Optical systems or apparatus with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • G09G 2320/0261: Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • G09G 3/001: Control arrangements or circuits for visual indicators other than cathode-ray tubes, using specific devices, e.g. projection systems

Definitions

  • As shown in FIG. 13, a bi-axial microelectromechanical (MEMS) scanner 400, suitable for use in the display, is formed in a silicon substrate 402.
  • The bi-axial scanner 400 includes a mirror 404 supported by opposed flexures 406 that link the mirror 404 to a pivotable support 408. The flexures 406 are dimensioned to twist torsionally, thereby allowing the mirror 404 to pivot, relative to the support 408, about an axis defined by the flexures 406. Pivoting of the mirror 404 defines the horizontal scans of the scanner 400.
  • A second pair of opposed flexures 412 couples the support 408 to the substrate 402. The flexures 412 are dimensioned to flex torsionally, thereby allowing the support 408 to pivot relative to the substrate 402.
  • The mass and dimensions of the mirror 404, support 408 and flexures 406, 412 are selected such that the mirror 404 resonates horizontally at 10-40 kHz with a high Q and such that the support 408 pivots at frequencies that are preferably higher than 60 Hz, although in some applications a lower frequency may be desirable. For example, where a plurality of beams are used, vertical frequencies of 10 Hz or lower may be acceptable.
  • The mirror 404 is pivoted by applying an electric field between a plate 414 on the mirror 404 and a conductor on a base (not shown). This approach is termed capacitive drive, because the plate 414 acts as one plate of a capacitor and the conductor in the base acts as the second plate.
  • The electric field exerts a force on the mirror 404, causing the mirror 404 to pivot about the flexures 406. By varying the applied voltage, the mirror 404 can be made to scan periodically. The voltage is varied at the mechanically resonant frequency of the mirror 404 so that the mirror 404 will oscillate with little power consumption.
  • The support 408 may be pivoted magnetically or capacitively depending upon the requirements of a particular application. The support 408 and flexures 412 are dimensioned so that the support 408 can respond at frequencies well above a desired refresh rate, such as 60 Hz.
  • An alternative embodiment according to the invention, shown in FIG. 14, includes a diffractive exit pupil expander 450 positioned between the scanning assembly 58 and the eye 52.
  • The exit pupil expander 450 redirects the scanned beam to a plurality of common locations to define a plurality of exit pupils 456. For example, as shown in FIG. 15A, the exit pupil expander 450 may produce nine separate exit pupils 456.
  • So long as the user's pupil 65 receives one or more of the defined exit pupils 456, the user can view the desired image. Even if the eye 52 shifts, the pupil 65 still may receive light from one or more of the exit pupils 456, and the user thus continues to perceive the image even when the pupil 65 shifts relative to the exit pupils 456.
  • As the pupil 65 moves, the scanning assembly 58 (FIGS. 12A-12C) shifts, as indicated by the arrows 458 in FIG. 14 and the arrows 460 in FIG. 15B, to center the array of exit pupils 456 on the user's pupil 65. Because the array of exit pupils 456 tracks the pupil 65 in this way, the number of exit pupils 456 can be reduced while preserving coupling to the pupil 65. A minimal sketch of this recentering decision appears after this list.
  • In other alternative embodiments, the detector 98 and infrared source 92 may be mounted separately from the light source 74, either in a fixed location or driven by a separate set of positioners. In another variant, the detector 98 monitors reflected visible light originating from the light source 74. The infrared beam and the scanned light beam may be made collinear through the use of conventional beam splitting techniques.
  • Alternatively, the piezoelectric positioners 116, 118 may be coupled to the mirror 64 or to an intermediate lens 121 to produce a "virtual" movement of the light source 74; translation of the mirror 64 or lens 121 will produce a shift in the apparent position of the light source 74 relative to the eye.
  • The lens 121 also allows the display to vary the apparent distance from the scanner 200, 400 to the eye 52. For example, the lens 121 may be formed from or include an electro-optic material, such as quartz; the effective focal length can then be varied by varying the voltage across the electro-optic material for each position of the scanner 200, 400.
  • the horizontal scanners 200 , 400 are described herein as preferably being mechanically resonant at the scanning frequency, in some applications the scanner 200 may be non-resonant. For example, where the scanner 200 is used for “stroke” or “calligraphic” scanning, a non-resonant scanner would be preferred.
  • a single light source is described herein, the principles and structures described herein are applicable to displays having a plurality of light sources. In fact, the exit pupil expander 450 of FIG. 14 effectively approximates the use of several light sources.
  • the exemplary embodiment herein utilizes the pupil shadow to track gaze
  • a variety of other approaches may be within the scope of the invention, for example, reflective techniques, such known “glint” techniques as may be adapted for use with the described embodiments according to the invention may image the fundus or features of the iris to track gaze. Accordingly, the invention is not limited except as by the appended claims.
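A minimal sketch of the exit-pupil-array behaviour described in the FIG. 14 and FIGS. 15A-B items above: a 3x3 grid of exit pupils is checked against the tracked pupil position, and the grid is shifted to follow the pupil when no exit pupil is captured any more. The grid pitch, pupil radius, and shift rule are assumptions for illustration, not values from the patent.

```python
import math

EXIT_PUPIL_PITCH_MM = 1.0   # assumed spacing of the 3x3 exit pupil grid
PUPIL_RADIUS_MM = 1.5       # assumed eye pupil radius

def exit_pupil_centers(grid_center):
    """Centers of the nine exit pupils for a 3x3 grid centered at grid_center."""
    cx, cy = grid_center
    return [(cx + i * EXIT_PUPIL_PITCH_MM, cy + j * EXIT_PUPIL_PITCH_MM)
            for i in (-1, 0, 1) for j in (-1, 0, 1)]

def pupil_sees_image(eye_pupil, grid_center) -> bool:
    """True if at least one exit pupil falls inside the eye pupil."""
    ex, ey = eye_pupil
    return any(math.hypot(ex - px, ey - py) <= PUPIL_RADIUS_MM
               for px, py in exit_pupil_centers(grid_center))

def recentered_grid(eye_pupil, grid_center):
    """Shift the exit pupil grid onto the tracked eye pupil once coupling is lost."""
    return eye_pupil if not pupil_sees_image(eye_pupil, grid_center) else grid_center

# The eye pupil drifts 3 mm to the right: no exit pupil is captured any more,
# so the scanning assembly shifts the whole grid to follow it.
print(recentered_grid((3.0, 0.0), (0.0, 0.0)))
```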

Abstract

A display apparatus includes an image source, an eye position detector, and a combiner that are aligned to a user's eye. The eye position detector monitors light reflected from the user's eye to identify the pupil position. If light from the image source becomes misaligned with respect to the pupil, a physical positioning mechanism adjusts the relative positions of the image source and the beam combiner so that light from the image source is translated relative to the pupil, thereby realigning the display to the pupil. In one embodiment, the positioner is a piezoelectric positioner; in other embodiments, the positioner is a servomechanism or a shape memory alloy actuator.

Description

    TECHNICAL FIELD
  • The present invention relates to displays and, more particularly, to displays that produce images responsive to a viewer's eye orientation. [0001]
  • BACKGROUND OF THE INVENTION
  • A variety of techniques are available for providing visual displays of graphical or video images to a user. For example, cathode ray tube type displays (CRTs), such as televisions and computer monitors, are very common. Such devices suffer from several limitations. For example, CRTs are bulky and consume substantial amounts of power, making them undesirable for portable or head-mounted applications. [0002]
  • Flat panel displays, such as liquid crystal displays and field emission displays, may be less bulky and consume less power. However, typical flat panel displays utilize screens that are several inches across. Such screens have limited use in head mounted applications or in applications where the display is intended to occupy only a small portion of a user's field of view. [0003]
  • More recently, very small displays have been developed for partial or augmented view applications. In such applications, a portion of the display is positioned in the user's field of view and presents an image that occupies a region 42 of the user's field of view 44, as shown in FIG. 1. The user can thus see both a displayed image 46 and background information 48. [0004]
  • One difficulty with such displays is that, as the user's eye moves to view various regions of the background information, the user's field of view shifts. As the field of view shifts, the position of the region 42 changes relative to the field of view 44. This shifting may be desirable where the region 42 is intended to be fixed relative to the background information 48. However, this shifting can be undesirable in applications where the image is intended to be at a fixed location in the user's field of view. Even if the image is intended to move within the field of view, the optics of the displaying apparatus may not provide an adequate image at all locations or orientations of the user's pupil relative to the optics. [0005]
  • One example of a small display is a scanned display such as that described in U.S. Pat. No. 5,467,104 of Furness et al., entitled VIRTUAL RETINAL DISPLAY, which is incorporated herein by reference. In scanned displays, a scanner, such as a scanning mirror or acousto-optic scanner, scans a modulated light beam onto a viewer's retina. The scanned light enters the eye through the viewer's pupil and is imaged onto the retina by the cornea and eye lens. As will now be described with reference to FIG. 2, such displays may have difficulty when the viewer's eye moves. [0006]
  • As shown in FIG. 2, a scanned display 50 is positioned for viewing by a viewer's eye 52. The display 50 includes four principal portions, each of which will be described in greater detail below. First, control electronics 54 provide electrical signals that control operation of the display 50 in response to an image signal VIM from an image source 56, such as a computer, television receiver, videocassette player, or similar device. [0007]
  • The second portion of the display 50 is a light source 57 that outputs a modulated light beam 53 having a modulation corresponding to information in the image signal VIM. The light source may be a directly modulated light emitter such as a light emitting diode (LED) or may include a continuous light emitter indirectly modulated by an external modulator, such as an acousto-optic modulator. [0008]
  • The third portion of the display 50 is a scanning assembly 58 that scans the modulated beam 53 of the light source 57 through a two-dimensional scanning pattern, such as a raster pattern. One example of such a scanning assembly is a mechanically resonant scanner, such as that described in U.S. Pat. No. 5,557,444 to Melville et al., entitled MINIATURE OPTICAL SCANNER FOR A TWO-AXIS SCANNING SYSTEM, which is incorporated herein by reference. However, other scanning assemblies, such as acousto-optic scanners, may be used in such displays. [0009]
  • Optics 60 form the fourth portion of the display 50. The imaging optics 60 in the embodiment of FIG. 2 include a pair of lenses 62 and 64 that shape and focus the scanned beam 53 appropriately for viewing by the eye 52. The scanned beam 53 enters the eye 52 through a pupil 65 and strikes the retina 59. When scanned modulated light strikes the retina 59, the viewer perceives the image. [0010]
  • As shown in FIG. 3, the display 50 may have difficulty when the viewer looks off-axis. When the viewer's eye 52 rotates, the viewer's pupil 65 moves from its central position. In the rotated position, all or a portion of the scanned beam 53 from the imaging optics 60 may not enter the pupil 65. Consequently, the viewer's retina 59 does not receive all of the scanned light. The viewer thus does not perceive the entire image. [0011]
  • One approach to this problem employs optics that expand the cross-sectional area of the scanned beam. A portion of the expanded beam strikes the pupil 65 and is visible to the viewer. While such an approach can improve the effective viewing angle and help to ensure that the viewer perceives the scanned image, the intensity of light received by the viewer is reduced as the square of the beam radius. [0012]
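For a rough sense of the trade-off noted above, the sketch below (illustrative numbers, not from the patent) shows how the fraction of an expanded beam that actually enters a fixed-size pupil falls off as the square of the beam radius.

```python
def pupil_power_fraction(beam_radius_mm: float, pupil_radius_mm: float = 1.5) -> float:
    """Fraction of a uniformly expanded beam's power that falls inside the pupil.

    Assumes the expanded beam uniformly fills a disc of radius beam_radius_mm at
    the eye and that the pupil is a concentric disc; both are simplifications.
    """
    if beam_radius_mm <= pupil_radius_mm:
        return 1.0  # beam no wider than the pupil: essentially all light can enter
    return (pupil_radius_mm / beam_radius_mm) ** 2

# Doubling the beam radius from 3 mm to 6 mm cuts the collected light to a quarter.
print(pupil_power_fraction(3.0))  # 0.25
print(pupil_power_fraction(6.0))  # 0.0625
```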
  • SUMMARY OF THE INVENTION
  • A display apparatus tracks the orientation or position of a user's eye and actively adjusts the position or orientation of an image source or manipulates an intermediate component to ensure that light enters the user's pupil or to control the perceived location of a virtual image in the user's field of view. In one embodiment, the display includes a beam combiner that receives light from a background and light from the image source. The combined light from the combiner is received through the user's pupil and strikes the retina. The user perceives an image that is a combination of the virtual image and the background. [0013]
  • In addition to the light from the background and light from the image source, additional light strikes the user's eye. The additional light may be a portion of the light provided by the image source or may be provided by a separate light source. The additional light is preferably aligned with light from the beam combiner. Where the additional light comes from a source other than the image source, the additional light is preferably at a wavelength that is not visible. [0014]
  • A portion of the additional light is reflected or scattered by the user's eye and the reflected or scattered portion depends in part upon whether the additional light enters the eye through the pupil or whether the additional light strikes the remaining area of the eye. The reflected or scattered light is then indicative of alignment of the additional light to the user's pupil. [0015]
  • In one embodiment, an image field of a detector is aligned with the light exiting the beam combiner. The detector receives the reflected portion of the additional light and provides an electrical signal indicative of the amount of reflected light to a position controller. [0016]
  • In one embodiment, the detector is a low-resolution CCD array and the position controller includes an electronic controller and a look up table in a memory that provides adjustment data in response to the signals from the detector. Data from the look up table drives a piezoelectric positioning mechanism that is physically coupled to a substrate carrying both the detector and the image source. [0017]
  • When the detector indicates a shift in location of the reflected additional light, the controller accesses the look up table to retrieve positioning data. In response to the retrieved data, the piezoelectric positioning mechanism shifts the substrate to realign the image source and the detector to the pupil. [0018]
  • In another embodiment, the CCD array is replaced by a quadrant-type detector, including a plurality of spaced-apart detectors. The outputs of the detectors drive a control circuit that implements a search function to align the scanned beam to the pupil. [0019]
  • In one embodiment, imaging optics having a magnification greater than one help to direct light from the image source and additional light to the user's eye. Physical movement of the image source and detector causes an even greater movement of the location at which light from the image source strikes the eye. Thus, small movements induced by the piezoelectric positioning mechanism can track larger movements of the pupil position. [0020]
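The leverage described in this embodiment can be expressed as a simple linear relation; the magnification value below is an assumption chosen only to illustrate the scaling, not a figure from the patent.

```python
def required_source_shift_mm(pupil_shift_mm: float, optics_magnification: float) -> float:
    """Substrate travel needed to move the spot at the eye by pupil_shift_mm.

    Assumes a simple linear model in which eye-side translation equals the
    optics magnification times the source-side translation.
    """
    return pupil_shift_mm / optics_magnification

# With an assumed magnification of 5x, tracking a 2 mm pupil shift needs only
# 0.4 mm of piezoelectric travel at the source/detector substrate.
print(required_source_shift_mm(2.0, 5.0))  # 0.4
```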
  • BRIEF DESCRIPTION OF THE FIGURES
  • FIG. 1 is a diagrammatic representation of a combined image perceived by a user resulting from the combination of light from an image source and light from a background. [0021]
  • FIG. 2 is a diagrammatic representation of a scanner and a user's eye showing alignment of a scanned beam with the user's pupil. [0022]
  • FIG. 3 is a diagrammatic representation of a scanner and a user's eye showing misalignment of the scanned beam with the user's pupil. [0023]
  • FIG. 4 is a diagrammatic representation of a display according to one embodiment of the invention including a positioning beam and detector. [0024]
  • FIG. 5 is an isometric view of a head-mounted scanner including a tether. [0025]
  • FIG. 6 is a diagrammatic representation of the display of FIG. 4 showing displacement of the eye relative to the beam position and corresponding reflection of the positioning beam. [0026]
  • FIG. 7A is a diagrammatic representation of reflected light striking the detector in the position of FIG. 4. [0027]
  • FIG. 7B is a diagrammatic representation of reflected light striking the detector in the position of FIG. 6. [0028]
  • FIG. 8 is a diagrammatic representation of the display of FIG. 2 showing the image source and positioning beam source adjusted to correct the misalignment of FIG. 6. [0029]
  • FIG. 9 is a detail view of a portion of a display showing shape memory alloy-based positioners coupled to the substrate. [0030]
  • FIG. 10 is a schematic of a scanning system suitable for use as the image source in the display of FIG. 4. [0031]
  • FIG. 11 is a top plan view of a position detector including four separate optical detectors. [0032]
  • FIGS. 12A-C are diagrammatic representations of a display utilizing a single reflective optic and a moving optical source. [0033]
  • FIG. 13 is a top plan view of a bi-axial MEMS scanner for use in the display of FIG. 2. [0034]
  • FIG. 14 is a diagram of an alternative embodiment of a display including an exit pupil expander and a moving light emitter. [0035]
  • FIG. 15A is a diagrammatic representation of nine exit pupils centered over an eye pupil. [0036]
  • FIG. 15B is a diagrammatic representation of shifting of the eye pupil of FIG. 15A and corresponding shifting of the exit pupil array.[0037]
  • DETAILED DESCRIPTION OF THE INVENTION
  • As shown in FIG. 4, a virtual retinal display 70 according to the invention includes control electronics 72, a light source 74, a scanning assembly 58, and imaging optics 78. As with the embodiment of FIG. 2, the light source may be directly or indirectly modulated and the imaging optics 78 are formed from curved, partially transmissive mirrors 62, 64 that combine light received from a background 80 with light from the scanning assembly 58 to produce a combined input to the viewer's eye 52. The light source 74 emits light modulated according to image signals VIM from the image signal source 56, such as a television receiver, computer, CD-ROM player, videocassette player, or any similar device. The light source 74 may utilize coherent light emitters, such as laser diodes or microlasers, or may use noncoherent sources such as light emitting diodes. Also, the light source 74 may be directly modulated or an external modulator, such as an acousto-optic modulator, may be used. One skilled in the art will recognize that a variety of other image sources, such as LCD panels and field emission displays, may also be used. However, such image sources are usually not preferred because they typically are larger and bulkier than the image source described in the preferred embodiment. Their large mass makes them more difficult to reposition quickly as described below with reference to FIGS. 6-8. Moreover, although the background 80 is presented herein as a “real-world” background, the background light may be occluded or may be produced by another light source of the same or different type. [0038]
  • Although the elements here are presented diagrammatically, one skilled in the art will recognize that the components are typically sized and configured for mounting to a helmet or similar frame as a head-mounted display 67, as shown in FIG. 5. In this embodiment, a first portion 71 of the display 67 is mounted to a head-borne frame 73 and a second portion 75 is carried separately, for example in a hip belt. The portions 71, 75 are linked by a fiber optic and electronic tether 77 that carries optical and electronic signals from the second portion to the first portion. An example of a fiber coupled scanner display is found in U.S. Pat. No. 5,596,339 of Furness et al., entitled VIRTUAL RETINAL DISPLAY WITH FIBER OPTIC POINT SOURCE, which is incorporated herein by reference. One skilled in the art will recognize that, in many applications, the light source may be coupled directly to the scanning assembly 58 so that the fiber can be eliminated. [0039]
  • Returning to the display 70 of FIG. 4, the user's eye 52 is typically in a substantially fixed location relative to the imaging optics 78 because the display 70 is typically head mounted. For clarity, this description therefore does not discuss head movement in describing operation of the display 70. One skilled in the art will recognize that the display 70 may be used in other than head-mounted applications, such as where the display 70 forms a fixed viewing apparatus having an eyecup against which the user's eye socket is pressed. Also, the user's head may be free for relative movement in some applications. In such applications, a known head tracking system may track the user's head position for coarse positioning. [0040]
  • Imaging optics 78 redirect and magnify scanned light from the scanning assembly 58 toward the user's eye 52, where the light passes through the pupil 65 and strikes the retina 59 to produce a virtual image. At the same time, light from the background 80 passes through the mirrors 62, 64 and pupil 65 to the user's retina 59 to produce a "real" image. Because the user's retina 59 receives light from both the scanned beam and the background 80, the user perceives a combined image with the virtual image appearing transparent, as shown in FIG. 1. To ease the user's acquisition of light from partially or fully reflective mirrors 62, 64, the imaging optics 78 may also include an exit pupil expander that increases the effective numerical aperture of the beam of scanned light. The exit pupil expander is omitted from the Figures for clarity of presentation of the beam 53. [0041]
  • In addition to light from the light source 74, the imaging optics 78 also receive a locator beam 90 from an infrared light source 92 carried by a common substrate 85 with the light source 74. Though the locator beam 90 is shown as following a different optical path for clarity of presentation, the infrared light source 92 is actually positioned adjacent to the light source 74 so that light from the light source 74 and light from the infrared light source 92 are substantially collinear. Thus, the output of the imaging optics 78 includes light from the infrared light source 92. One skilled in the art will recognize that, although the infrared light source 92 and the light source 74 are shown as being physically adjacent, other implementations are easily realizable. For example, the infrared light source 92 may be physically separated from the light source 74 by superimposing the locator beam 90 onto the light from the light source 74 with a beam splitter and steering optics. [0042]
  • Tracking of the eye position will now be described with reference to FIGS. 6-9. As shown in FIG. 6, when the user's eye 52 moves, the pupil 65 may become misaligned with light from the light source 74 and infrared light source 92. All or a portion of the light from the light source 74 and infrared source 92 may no longer enter the pupil 65 or may enter the pupil 65 at an orientation where the pupil 65 does not direct the light to the center of the retina 59. Instead, some of the light from the sources 74, 92 strikes a non-pupil portion 96 of the eye. As is known, the non-pupil portion 96 of the eye has a reflectance different from, and typically higher than, that of the pupil 65. Consequently, the non-pupil portion 96 reflects light from the sources 74, 92 back toward the imaging optics 78. The imaging optics 78 redirect the reflected light toward an optical detector 98 positioned on the substrate 85 adjacent to the sources 74, 92. In this embodiment, the detector 98 is a commercially available CCD array that is sensitive to infrared light. As will be described below, in some applications, other types of detectors may be desirable. [0043]
  • As shown in FIG. 7A, when the user's eye is positioned so that light from the sources 74, 92 enters the pupil (i.e., when the eye is positioned as shown in FIG. 4), a central region 100 of the detector 98 receives a low level of light from the imaging optics 78. The area of low light resulting from the user's pupil will be referred to herein as the pupil shadow 106. When the eye 52 shifts to the position shown in FIG. 6, the pupil shadow shifts relative to the detector 98, as shown in FIG. 7B. [0044]
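The patent does not specify how the detector data are reduced to a pupil-shadow position; one plausible sketch, assuming a NumPy image array and a simple threshold-and-centroid rule (both assumptions, not the patent's method), is:

```python
import numpy as np

def pupil_shadow_centroid(frame: np.ndarray, shadow_fraction: float = 0.2):
    """Estimate the (row, col) centroid of the dark pupil shadow in a CCD frame.

    Pixels darker than shadow_fraction * frame maximum are treated as shadow.
    Returns None if no shadow is found (e.g. the eye is out of the tracked range).
    """
    threshold = shadow_fraction * frame.max()
    shadow = frame < threshold
    if not shadow.any():
        return None
    rows, cols = np.nonzero(shadow)
    return rows.mean(), cols.mean()

def shadow_offset(frame: np.ndarray):
    """Offset of the pupil shadow from the detector center, in pixels."""
    centroid = pupil_shadow_centroid(frame)
    if centroid is None:
        return None
    center = (np.array(frame.shape) - 1) / 2.0
    return centroid[0] - center[0], centroid[1] - center[1]
```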
  • The detector data, which are indicative of the position of the pupil shadow 106, are input to an electronic controller 108, such as a microprocessor or application specific integrated circuit (ASIC). Responsive to the data, the controller 108 accesses a look up table in a memory device 110 to retrieve positioning data indicating an appropriate positioning correction for the light source 74. The positioning data may be determined empirically or may be calculated based upon known geometry of the eye 52 and the scanning assembly 58. [0045]
  • In response to the retrieved positioning data, the controller 108 activates X and Y drivers 112, 114 to provide voltages to respective piezoelectric positioners 116, 118 coupled to the substrate 85. As is known, piezoelectric materials deform in the presence of electrical fields, thereby converting voltages to physical movement. Therefore, the applied voltages from the respective drivers 112, 114 cause the piezoelectric positioners 116, 118 to move the sources 74, 92, as indicated by the arrow 120 and arrowhead 122 in FIG. 8. [0046]
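A minimal sketch of the look-up-table step described above, assuming the table is indexed by a quantized shadow offset and stores X and Y drive voltages. The quantization step, table size, and placeholder voltages are assumptions; real entries would be determined empirically or from the eye and scanner geometry, as the patent states.

```python
# Hypothetical look-up table: quantized shadow offset (in detector pixels) ->
# (Vx, Vy) drive voltages for the X and Y piezoelectric positioners. The linear
# placeholder mapping below stands in for empirically determined data.
POSITION_LUT = {
    (dx, dy): (0.5 * dx, 0.5 * dy)   # placeholder mapping, volts
    for dx in range(-8, 9)
    for dy in range(-8, 9)
}

def correction_voltages(offset_px, step_px: int = 4):
    """Quantize the measured shadow offset and look up positioner voltages."""
    dx = max(-8, min(8, int(round(offset_px[0] / step_px))))
    dy = max(-8, min(8, int(round(offset_px[1] / step_px))))
    return POSITION_LUT[(dx, dy)]

# Example: shadow displaced 12 px in one axis and 6 px in the other.
vx, vy = correction_voltages((12.0, 6.0))
print(vx, vy)
```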
  • As shown in FIG. 8, shifting the positions of the sources 74, 92 shifts the locations at which light from the sources 74, 92 strikes the user's eye, so that the light once again enters the pupil. The pupil shadow 106 once again returns to the position shown in FIG. 7A. One skilled in the art will recognize that the deformation of the piezoelectric positioner 116 is exaggerated in FIG. 8 for demonstrative purposes. However, because the mirrors 62, 64 have a magnification greater than one, small shifts in the position of the substrate 85 can produce larger shifts in the location at which the light from the light source 74 arrives at the eye. Thus, the piezoelectric positioners 116, 118 can produce sufficient beam translation for many positions of the eye. Where even larger beam translations are desirable, a variety of other types of positioners, such as electronic servomechanisms, may be used in place of the piezoelectric positioners 116, 118. Alternatively, shape memory alloy-based positioners 113, such as equiatomic nickel-titanium alloys, can be used to reposition the substrate as shown in FIG. 9. The positioners 113 may be spirally located, as shown in FIG. 9, or may be in any other appropriate configuration. One skilled in the art will also recognize that the imaging optics 78 do not always require magnification, particularly where the positioners 116, 118 are formed from a mechanism that provides relatively large translation of the scanner 70. [0047]
  • FIG. 10 shows one embodiment of a mechanically resonant scanner 200 suitable for use as the scanning assembly 58. The resonant scanner 200 includes, as the principal horizontal scanning element, a horizontal scanner 201 that includes a moving mirror 202 mounted to a spring plate 204. The dimensions of the mirror 202 and spring plate 204 and the material properties of the spring plate 204 are selected so that the mirror 202 and spring plate 204 have a natural oscillatory frequency on the order of 1-100 kHz. A ferromagnetic material mounted with the mirror 202 is driven by a pair of electromagnetic coils 206, 208 to provide motive force to the mirror 202, thereby initiating and sustaining oscillation. Drive electronics 218 provide electrical signals to activate the coils 206, 208. [0048]
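The kHz-range natural frequency quoted above is consistent with a simple torsional spring-mass estimate. The sketch below treats the mirror and spring plate as an ideal torsional oscillator with assumed (illustrative) stiffness and mirror dimensions; none of the numbers come from the patent.

```python
import math

def torsional_resonance_hz(k_n_m_per_rad: float, inertia_kg_m2: float) -> float:
    """Natural frequency of a mirror on a torsional spring: f = sqrt(k / J) / (2 * pi)."""
    return math.sqrt(k_n_m_per_rad / inertia_kg_m2) / (2.0 * math.pi)

# Illustrative values only: a 1.5 mm square, 100 um thick silicon mirror, with a
# torsional spring constant chosen to land in the quoted 1-100 kHz band.
side, thickness, density = 1.5e-3, 100e-6, 2330.0   # m, m, kg/m^3 (silicon)
mass = density * side * side * thickness            # kg
inertia = mass * side ** 2 / 12.0                   # kg*m^2, rotation about a central axis
print(torsional_resonance_hz(k_n_m_per_rad=1e-2, inertia_kg_m2=inertia))  # roughly 5e4 Hz
```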
  • Vertical scanning is provided by a vertical scanner 220 structured very similarly to the horizontal scanner 201. Like the horizontal scanner 201, the vertical scanner 220 includes a mirror 222 driven by a pair of coils 224, 226 in response to electrical signals from the drive electronics 218. However, because the rate of oscillation is much lower for vertical scanning, the vertical scanner 220 is typically not resonant. The mirror 222 receives light from the horizontal scanner 201 and produces vertical deflection at about 30-100 Hz. Advantageously, the lower frequency allows the mirror 222 to be significantly larger than the mirror 202, thereby reducing constraints on the positioning of the vertical scanner 220. [0049]
  • In operation, the light source 74, driven by the image source 56 (FIG. 8), outputs a beam of light that is modulated according to the image signal. At the same time, the drive electronics 218 activate the coils 206, 208, 224, 226 to oscillate the mirrors 202, 222. The modulated beam of light strikes the oscillating horizontal mirror 202, and is deflected horizontally by an angle corresponding to the instantaneous angle of the mirror 202. The deflected light then strikes the vertical mirror 222 and is deflected at a vertical angle corresponding to the instantaneous angle of the vertical mirror 222. The modulation of the optical beam is synchronized with the horizontal and vertical scans so that at each position of the mirrors, the beam color and intensity correspond to a desired virtual image. The beam therefore "draws" the virtual image directly upon the user's retina. One skilled in the art will recognize that several components of the scanner 200 have been omitted for clarity of presentation. For example, the vertical and horizontal scanners 201, 220 are typically mounted in fixed relative positions to a frame. Additionally, the scanner 200 typically includes one or more turning mirrors that direct the beam such that the beam strikes each of the mirrors a plurality of times to increase the angular range of scanning. [0050]
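A minimal sketch of the synchronization described above: the horizontal deflection follows a resonant sinusoid, the vertical deflection follows a slow ramp, and the beam is modulated with the image pixel corresponding to the instantaneous angles. The frequencies, resolution, and function names are assumptions for illustration, not parameters from the patent.

```python
import math

H_FREQ_HZ = 15_000      # assumed resonant horizontal scan rate (within the 1-100 kHz range above)
V_FREQ_HZ = 60          # assumed non-resonant vertical scan / refresh rate
COLS, ROWS = 640, 480   # assumed virtual-image resolution

def scan_angles(t: float):
    """Normalized horizontal/vertical deflection in [-1, 1] at time t (seconds)."""
    h = math.sin(2.0 * math.pi * H_FREQ_HZ * t)   # resonant sinusoid
    v = 2.0 * ((t * V_FREQ_HZ) % 1.0) - 1.0       # sawtooth ramp
    return h, v

def pixel_for_time(t: float, image):
    """Pick the image pixel whose position matches the instantaneous beam angles.

    `image` is any 2-D indexable of shape (ROWS, COLS); the returned value would
    set the light source intensity (and color, for multi-channel sources).
    """
    h, v = scan_angles(t)
    col = min(COLS - 1, int((h + 1.0) / 2.0 * COLS))
    row = min(ROWS - 1, int((v + 1.0) / 2.0 * ROWS))
    return image[row][col]
```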
  • FIG. 11 shows one realization of the position detector 88 in which the CCD array is replaced with four detectors 88A-88D, each aligned to a respective quadrant of the virtual image. When the user's eye 52 becomes misaligned with the virtual image, the pupil shadow 106 shifts, as represented by the broken lines in FIG. 11. In this position, the intensity of light received by one or more of the detectors 88A-88D falls. The voltage on the positioners 116, 118 can then be varied to realign the scanned light to the user's eye 52. Advantageously, in this embodiment, the outputs of the four-quadrant detector can form error signals that, when amplified appropriately, may drive the respective positioners 116, 118 to reposition the light emitter 74. [0051]
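A minimal sketch of forming error signals from the four quadrant outputs; the quadrant layout, sign conventions, and gain are assumptions, since the patent only states that appropriately amplified outputs may drive the positioners.

```python
def quadrant_errors(q_a: float, q_b: float, q_c: float, q_d: float, gain: float = 1.0):
    """X/Y error signals from a four-quadrant detector.

    Quadrant layout assumed:  A | B   (A, B top; C, D bottom; A, C left; B, D right)
                              C | D
    When the pupil shadow is centered, all four intensities match and both
    errors are zero; a shift darkens some quadrants and yields signed errors.
    """
    total = q_a + q_b + q_c + q_d
    if total == 0.0:
        return 0.0, 0.0
    err_x = gain * ((q_b + q_d) - (q_a + q_c)) / total   # positive: shadow toward the left
    err_y = gain * ((q_a + q_b) - (q_c + q_d)) / total   # positive: shadow toward the bottom
    return err_x, err_y

# Example: the shadow has moved into quadrants A and C (left side), so those
# quadrants receive less reflected light than B and D.
print(quadrant_errors(0.2, 0.8, 0.2, 0.8))  # err_x > 0, err_y == 0
```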
[0052] A further aspect of the embodiment of the display 70 of FIG. 8 is a z-axis adjustment provided by a third positioner 128 that controls the position of the light source 74 and scanner 76 along a third axis. The third positioner 128, like the X and Y positioners 116, 118, is a piezoelectric positioner controlled by the electronic controller 108 through a corresponding driver 130.
[0053] As can be seen from FIG. 8, when the user's eye 52 rotates to view an object off-axis and the X and Y positioners 116, 118 adjust the position of the light source 74, the distance between the scanner 76 and the first mirror 64 changes slightly, as does the distance between the first mirror 64 and the eye 52. Consequently, the image plane defined by the scanned beam may shift away from the desired location and the perceived image may become distorted. Such shifting may also produce an effective astigmatism in biocular or binocular systems due to differences in the variations between the left and right eye subsystems. To compensate for the shift in relative positions, the controller 108, responsive to positioning data from the memory 110, activates the third positioner 128, thereby adjusting the z-axis position of the light source 74. The appropriate positioning data can be determined empirically or may be developed analytically through optical modeling.
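Such positioning data could be organized, for example, as a small calibration table keyed on the detected lateral offsets; the sketch below is purely illustrative, with placeholder offsets and corrections standing in for values the patent says would be found empirically or through optical modeling.

    # Hypothetical calibration table: detected pupil offset (mm in X, Y) -> z-axis
    # correction (micrometers) for the third positioner 128. Values are placeholders.
    Z_CORRECTION_UM = {
        (-1.0, 0.0): 12.0,
        ( 0.0, 0.0):  0.0,
        (+1.0, 0.0): 12.0,
        ( 0.0, -1.0): 8.0,
        ( 0.0, +1.0): 8.0,
    }

    def z_correction(x_mm, y_mm):
        # Return the correction stored for the nearest calibrated (x, y) offset.
        key = min(Z_CORRECTION_UM, key=lambda p: (p[0] - x_mm) ** 2 + (p[1] - y_mm) ** 2)
        return Z_CORRECTION_UM[key]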
[0054] One skilled in the art will recognize that the controller 108 can also adjust the focus of the scanned beam 53 through the third positioner 128. Adjustment of the focus allows the controller to compensate for shifts in the relative positions of the scanning assembly 76, the mirrors 62, 64, and the eye 52 that may result from movement of the eye, temperature changes, pressure changes, or other effects. Also, the controller 108 can adjust the z-axis position to adapt a head-mounted display to different users.
[0055] Although the embodiments herein are described as having positioning along three orthogonal axes, the invention is not so limited. First, physical positioning may be applied to other degrees of motion. For example, rotational positioners may rotate the mirrors 62, 64, the light source 74 or the substitute 85 about various axes to provide rotational positioning control. Such an embodiment allows the controller 108 to establish the position of the virtual image (e.g., the region 42 of FIG. 1). By controlling the position of the virtual image, the controller 108 can move the region 42 to track changes in the user's field of view. The region 42 can thus remain in a substantially fixed position in the user's field of view. In addition to rotational freedom, one skilled in the art will recognize that the three axes are not limited to orthogonal axes.
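As one simplified illustration of rotational positioning, the sketch below steers a flat turning mirror by half the detected change in gaze angle, relying on the fact that rotating a flat mirror by an angle deflects the reflected beam by twice that angle; the flat-mirror geometry and the direct gaze-angle input are assumptions, and a practical system would rely on stored calibration data as described above.

    def mirror_rotation_deg(gaze_change_deg):
        # Rotation to apply to a turning mirror (e.g. mirror 64) so that the
        # virtual-image region 42 follows the gaze and stays fixed in the user's
        # field of view, under a simple flat-mirror geometry where a mirror
        # rotation of theta steers the reflected beam by 2*theta.
        return gaze_change_deg / 2.0

    # Example: a detected 10-degree gaze shift calls for a 5-degree mirror rotation
    # in the same scan plane.
    print(mirror_rotation_deg(10.0))  # -> 5.0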
[0056] While the embodiments described herein have included two mirrors 62, 64, one skilled in the art will recognize that more complex or less complex optical structures may be desirable for some applications. For example, as shown in FIGS. 12A-C, a single reflective optical element 300 can be used to reflect light toward the viewer's eye 52. By tracing the optical paths 302 from the scanning assembly 58 to the pupil 65, the corresponding position and angular orientation of the scanning assembly 58 can be determined for each eye position, as shown in FIGS. 12A-C.
[0057] The determined position and orientation are then stored digitally and retrieved in response to the detected eye position. The scanning assembly 58 is then moved to the retrieved position and orientation. For example, as shown in FIG. 12B, when the field of view of the eyes is centered, the scanning assembly 58 is centered. When the field of view is shifted left, as shown in FIG. 12A, the scanning assembly 58 is shifted right to compensate.
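A stored mapping of that kind might look like the following sketch, with one pose per detected gaze direction as in FIGS. 12A-12C; the offsets and tilts are placeholders rather than values from the patent.

    # Hypothetical stored calibration: gaze left -> assembly shifted right, and so on.
    SCAN_ASSEMBLY_POSE = {
        "left":   {"offset_mm": +3.0, "tilt_deg": -6.0},
        "center": {"offset_mm":  0.0, "tilt_deg":  0.0},
        "right":  {"offset_mm": -3.0, "tilt_deg": +6.0},
    }

    def reposition_scanning_assembly(gaze):
        # Retrieve the stored pose for the detected gaze direction, falling back
        # to the centered pose if the gaze is not recognized.
        return SCAN_ASSEMBLY_POSE.get(gaze, SCAN_ASSEMBLY_POSE["center"])

    print(reposition_scanning_assembly("left"))  # assembly moves right of center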
[0058] To reduce the size and weight to be moved in response to the detected eye position, it is desirable to reduce the size and weight of the scanning assembly 58. One approach to reducing the size and weight is to replace the mechanically resonant scanners 200, 220 with a microelectromechanical systems (MEMS) scanner, such as that described in U.S. Pat. No. 5,629,790, entitled MICROMACHINED TORSIONAL SCANNER, to Neukermans et al., and U.S. Pat. No. 5,648,618, entitled MICROMACHINED HINGE HAVING AN INTEGRAL TORSION SENSOR, to Neukermans et al., each of which is incorporated herein by reference. As described therein and shown in FIG. 13, a bi-axial scanner 400 is formed in a silicon substrate 402. The bi-axial scanner 400 includes a mirror 404 supported by opposed flexures 406 that link the mirror 404 to a pivotable support 408. The flexures 406 are dimensioned to twist torsionally, thereby allowing the mirror 404 to pivot, relative to the support 408, about an axis defined by the flexures 406. In one embodiment, pivoting of the mirror 404 defines horizontal scans of the scanner 400.
[0059] A second pair of opposed flexures 412 couples the support 408 to the substrate 402. The flexures 412 are dimensioned to flex torsionally, thereby allowing the support 408 to pivot relative to the substrate 402. Preferably, the mass and dimensions of the mirror 404, support 408 and flexures 406, 412 are selected such that the mirror 404 resonates horizontally at 10-40 kHz with a high Q, and such that the support 408 pivots at frequencies that are preferably higher than 60 Hz, although in some applications a lower frequency may be desirable. For example, where a plurality of beams are used, vertical frequencies of 10 Hz or lower may be acceptable.
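To give a feel for that dimensioning, the sketch below works backward from a target resonance inside the cited 10-40 kHz band to the flexure stiffness a hypothetical 1 mm x 1 mm x 30 um silicon mirror would require; the mirror dimensions and target frequency are assumptions, not values from the patent.

    import math

    SILICON_DENSITY = 2330.0  # kg/m^3

    def plate_inertia(width_m, length_m, thickness_m):
        # Moment of inertia of a thin rectangular mirror about an in-plane axis
        # through its center (the torsion axis of flexures 406); width_m is the
        # dimension perpendicular to that axis.
        mass = SILICON_DENSITY * width_m * length_m * thickness_m
        return mass * width_m ** 2 / 12.0

    def required_stiffness(inertia, target_hz):
        # Torsional stiffness (N*m/rad) giving resonance at target_hz: k = I * (2*pi*f)^2
        return inertia * (2.0 * math.pi * target_hz) ** 2

    I404 = plate_inertia(1e-3, 1e-3, 30e-6)
    print(f"required flexure stiffness ~ {required_stiffness(I404, 20e3):.2e} N*m/rad")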
[0060] In a preferred embodiment, the mirror 404 is pivoted by applying an electric field between a plate 414 on the mirror 404 and a conductor on a base (not shown). This approach is termed capacitive drive because the plate 414 acts as one plate of a capacitor and the conductor in the base acts as the second plate. As the voltage between the plates increases, the electric field exerts a force on the mirror 404, causing the mirror 404 to pivot about the flexures 406. By periodically varying the voltage applied to the plates, the mirror 404 can be made to scan periodically. Preferably, the voltage is varied at the mechanically resonant frequency of the mirror 404 so that the mirror 404 will oscillate with little power consumption.
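The sketch below illustrates why driving at resonance is economical: a periodic drive voltage at the mirror's resonant frequency lets a lightly damped resonator build up roughly Q times the deflection that the same peak torque would produce statically. The waveform, peak voltage, and Q value are assumptions.

    def drive_voltage(t, f_res, v_peak=40.0, duty=0.5):
        # Unipolar square-wave drive applied between plate 414 and the base
        # conductor, pulsed at the mirror's resonant frequency so the resonance
        # builds up the scan angle.
        phase = (t * f_res) % 1.0
        return v_peak if phase < duty else 0.0

    def resonant_amplification(static_deflection_deg, q_factor):
        # For a lightly damped torsional resonator, steady-state deflection at
        # resonance is about Q times the static deflection for the same peak torque.
        return q_factor * static_deflection_deg

    # Example: a drive that statically tilts the mirror by only 0.02 degrees can
    # reach several degrees of mechanical scan when Q is a few hundred.
    print(resonant_amplification(0.02, 300))  # -> 6.0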
[0061] The support 408 may be pivoted magnetically or capacitively, depending upon the requirements of a particular application. Preferably, the support 408 and flexures 412 are dimensioned so that the support 408 can respond at frequencies well above a desired refresh rate, such as 60 Hz.
[0062] An alternative embodiment according to the invention, shown in FIG. 14, includes a diffractive exit pupil expander 450 positioned between the scanning assembly 58 and the eye 52. As described in U.S. Pat. No. 5,701,132, entitled VIRTUAL RETINAL DISPLAY WITH EXPANDED EXIT PUPIL, to Kollin et al., which is incorporated herein by reference, at each scan position 452, 454 the exit pupil expander 450 redirects the scanned beam to a plurality of common locations to define a plurality of exit pupils 456. For example, as shown in FIG. 15A, the exit pupil expander 450 may produce nine separate exit pupils 456. When the user's pupil 65 receives light from one or more of the defined exit pupils 456, the user can view the desired image.
[0063] If the user's eye moves, as shown in FIG. 15B, the pupil 65 may still receive light from one or more of the exit pupils 456. The user thus continues to perceive the image, even when the pupil 65 shifts relative to the exit pupils 456. Nevertheless, the scanning assembly 58 (FIGS. 12A-12C) shifts, as indicated by the arrows 458 in FIG. 14 and the arrows 460 in FIG. 15B, to re-center the array of exit pupils 456 on the user's pupil 65. By re-centering the array relative to the pupil 65, the number of exit pupils 456 can be reduced while preserving coupling to the pupil 65.
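One way to picture the re-centering decision is the sketch below, which checks whether the nearest exit pupil of an assumed 3x3, 1.5 mm-pitch array still falls within the eye's pupil and reports the shift that would re-center the array; the pitch, the pupil radius, and the one-for-one shift mapping are assumptions, not parameters from the patent.

    # Exit pupil centers for a 3x3 array as in FIG. 15A, on an assumed 1.5 mm pitch.
    PITCH_MM = 1.5
    EXIT_PUPILS = [(i * PITCH_MM, j * PITCH_MM) for i in (-1, 0, 1) for j in (-1, 0, 1)]

    def recenter_command(pupil_x, pupil_y, eye_pupil_radius_mm=1.5):
        # Report whether some exit pupil still falls inside the eye's pupil 65 and
        # the lateral shift that would re-center the array on the detected pupil
        # position (assumed to translate one-for-one with the scanning assembly 58).
        nearest = min(EXIT_PUPILS, key=lambda p: (p[0] - pupil_x) ** 2 + (p[1] - pupil_y) ** 2)
        distance = ((nearest[0] - pupil_x) ** 2 + (nearest[1] - pupil_y) ** 2) ** 0.5
        return {"visible": distance <= eye_pupil_radius_mm,
                "shift_x_mm": pupil_x, "shift_y_mm": pupil_y}

Re-centering keeps the margin between the pupil and the nearest exit pupil large, which is why a sparse array can suffice.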
[0064] Although the invention has been described herein by way of exemplary embodiments, variations in the structures and methods described herein may be made without departing from the spirit and scope of the invention. For example, the positioning of the various components may also be varied. In one example of repositioning, the detector 88 and infrared source 92 may be mounted separately from the light source 74. In such an embodiment, the detector 88 and infrared source 92 may be mounted in a fixed location or may be driven by a separate set of positioners. Also, in some applications, it may be desirable to eliminate the infrared source 92. In such an embodiment, the detector 88 would monitor reflected visible light originating from the light source 74. Also, the infrared beam and scanned light beam may be made collinear through the use of conventional beam splitting techniques. In still another embodiment, the piezoelectric positioners 116, 118 may be coupled to the mirror 64 or to an intermediate lens 121 to produce a “virtual” movement of the light source 74. In this embodiment, translation of the mirror 64 or lens 121 will produce a shift in the apparent position of the light source 74 relative to the eye. By shifting the position or effective focal length of the lens 121, the lens 121 also allows the display to vary the apparent distance from the scanner 200, 400 to the eye 52. For example, the lens 121 may be formed from or include an electro-optic material, such as quartz. The effective focal length can then be varied by varying the voltage across the electro-optic material for each position of the scanner 200, 400. Moreover, although the horizontal scanners 200, 400 are described herein as preferably being mechanically resonant at the scanning frequency, in some applications the scanner 200 may be non-resonant. For example, where the scanner 200 is used for “stroke” or “calligraphic” scanning, a non-resonant scanner would be preferred. One skilled in the art will recognize that, although a single light source is described herein, the principles and structures described herein are applicable to displays having a plurality of light sources. In fact, the exit pupil expander 450 of FIG. 14 effectively approximates the use of several light sources. Further, although the exemplary embodiment herein utilizes the pupil shadow to track gaze, a variety of other approaches fall within the scope of the invention; for example, reflective techniques, such as known “glint” techniques, may be adapted for use with the described embodiments, or the display may image the fundus or features of the iris to track gaze. Accordingly, the invention is not limited except as by the appended claims.

Claims (47)

What is claimed is:
1. A method of producing an image for viewing by an eye, comprising the steps of:
emitting light from a first location;
modulating the light in a pattern corresponding to the image;
producing a positioning beam;
directing the positioning beam along a first path toward the eye;
receiving a portion of light reflected from the eye with an optical detector;
producing an electrical signal responsive to the received reflected light;
identifying a pupil position responsive to the electrical signal; and
physically repositioning the first location in response to the electrical signal.
2. The method of claim 1 wherein an image source produces the light and wherein the step of physically repositioning the first location in response to the electrical signal includes physically repositioning the image source relative to the user's eye.
3. The method of claim 2 wherein the step of physically repositioning the image source includes activating a piezoelectric positioner coupled to the image source.
4. The method of claim 3 wherein the step of physically repositioning the image source includes activating a shape memory alloy coupled to the image source.
5. The method of claim 1 wherein the optical detector includes a detector array and wherein the step of producing an electrical signal responsive to the received reflected light includes outputting data from the detector array.
6. The method of claim 1 wherein the positioning beam is an infrared beam.
7. The method of claim 1 wherein the step of producing an electrical signal includes the steps of:
outputting data from the detector array;
retrieving data stored in a memory; and
producing the electrical signal in response to the retrieved data.
8. The method of claim 1 wherein a portion of the emitted light forms the positioning beam.
9. The method of claim 1 wherein the step of emitting light includes producing the light with an image source and guiding the light with guiding optics and wherein the step of physically repositioning the first location in response to the electrical signal includes physically varying the relative positioning of the guiding optics and the image source.
10. The method of claim 9 wherein the guiding optics include a lens.
11. The method of claim 10 wherein the guiding optics further include a turning reflector.
12. A method of producing an image in response to an image signal for perception by a user, comprising the steps of:
emitting, from a first position, light corresponding to the image responsive to the image signal;
directing the emitted light corresponding to the image toward the user's eye;
determining an eye position while directing the emitted light corresponding to the image toward the user's eye; and
responsive to the determined eye position adjusting the first position to direct the emitted light toward the user's pupil.
13. The method of claim 12 wherein the step of determining the eye position includes the steps of:
emitting a tracking beam of light;
directing the tracking beam of light toward the user's eye; and
monitoring light reflected from the user's eye.
14. The method of claim 13 wherein the step of emitting a tracking beam of light includes the steps of emitting the tracking beam from substantially the first position.
15. The method of claim 12 wherein the step of monitoring light reflected from the user's eye includes:
positioning an optical detector adjacent to the first position; and
receiving a portion of the reflected light with the detector.
16. The method of claim 12 wherein the step of directing the emitted light corresponding to the image toward the user's eye includes scanning the emitted light with a scanner.
17. The method of claim 16 wherein the step of directing the tracking beam of light toward the user's eye includes scanning the tracking beam with the scanner.
18. A method in a display apparatus of identifying alignment of an optical source with an eye, comprising the steps of:
projecting light from a tracking source onto the eye;
receiving light reflected from a plurality of locations on the eye;
generating electrical signals corresponding to the received reflected light;
responsive to the electrical signals, identifying a region of the eye having a reduced reflectance relative to other regions of the eye; and
comparing the identified region of reduced reflectance with a reference region corresponding to centering of the optical source relative to the reduced reflectance region.
19. The method of claim 18 further including the step of aligning the tracking source in a substantially fixed position relative to the optical source.
20. The method of claim 18 wherein the step of receiving light reflected from a plurality of locations on the eye includes receiving light reflected from a plurality of locations on the eye with a photodetector.
21. The method of claim 20 wherein the photodetector is a two-dimensional detector array.
22. The method of claim 21 wherein the two-dimensional detector array is a CCD array.
23. The method of claim 20 wherein the photodetector includes a plurality of integrated detectors.
24. A method of aligning a virtual image to an eye, comprising the steps of:
directing image light from a first location along a first set of optical paths to the eye to produce the virtual image;
directing a tracking beam of light toward the eye such that a portion of the tracking beam is reflected from the eye;
receiving a reflected portion of the tracking beam with a photodetector;
producing an electrical signal in response to the reception of the reflected portion;
responsive to the electrical signal, identifying a region of the reflected portion corresponding to a pupil;
determining an adjustment of the first location that increases the amount of image light entering the pupil; and
adjusting the first location responsive to the determined adjustment.
25. The method of claim 24 wherein the display includes an image source that produces the image light and a detector that produces the electrical signal, and wherein the image source and detector are mounted to a common supporting body.
26. The method of claim 25 wherein the step of adjusting the first set of optical paths responsive to the determined adjustment includes moving the supporting body.
27. The method of claim 26 wherein the step of moving the supporting body includes activating a piezoelectric positioner.
28. The method of claim 27 wherein the step of moving the supporting body includes activating a shape memory alloy.
29. A virtual display for producing an image for viewing by a user's eye, comprising:
an image source operative to emit light in a pattern corresponding to the image along a path toward the user's eye;
an optical detector aligned to the user's eye and operative to detect a location of a region of the user's eye having a reflectance corresponding to a selected eye feature having a predetermined position relative to a pupil of the eye, the optical detector producing a signal indicative of the detected location; and
a positioning mechanism having a control input coupled to the optical detector and a positioning output coupled to the image source, the positioning mechanism being responsive to the signal indicative of the detected location to physically reposition the image source in a direction that shifts the optical path to the pupil.
30. The display of claim 29 wherein the positioning mechanism is an electrically actuated positioner and wherein the signal indicative of the detected location is an electrical signal.
31. The display of claim 29 wherein the image source includes a light emitter and imaging optics configured for relative repositioning by the positioning mechanism.
32. The display of claim 29 wherein the image source and detector are mounted to a common supporting body.
33. The display of claim 29 wherein the positioning mechanism is coupled to the common body to physically displace the common body.
34. The display of claim 29 wherein the image source is a retinal scanner.
35. The display of claim 29 further comprising a beam combiner having a first input aligned to the image source and a second input, the beam combiner being operative to direct light from the first and second inputs and to provide the combined light to a user's retina.
36. A display apparatus including eye position tracking, comprising:
a first scanner;
beam-turning optics aligned to the eye;
an image source mounted to a base and aligned to the beam-turning optics at an angle selected to direct light from the image source to the eye;
an optical source aligned to the eye;
a detector aligned to the eye and responsive to output an electrical signal indicative of alignment of the optical source relative to a selected region of the eye; and
a positioning mechanism coupled to the base and responsive to the electrical signal from the detector to physically adjust the relative positions of the base and the beam-turning optics.
37. The display apparatus of claim 36 wherein the image source is a retinal scanner.
38. The display apparatus of claim 36 wherein the positioning mechanism is a piezoelectric positioner.
39. The display apparatus of claim 36 wherein the positioning mechanism is a servomechanism.
40. The display apparatus of claim 36 wherein the positioning mechanism includes a shape memory alloy.
41. The display apparatus of claim 36 wherein the beam-turning optics includes a beam combiner.
42. The display apparatus of claim 41 wherein the beam combiner includes an optical magnifier.
43. The display apparatus of claim 42 wherein the optical magnifier is a mirror.
44. The display apparatus of claim 41 wherein the beam combiner includes a beam splitter.
45. The display apparatus of claim 36 further including a head mounting structure carrying the optical source, the beam-turning optics, and the positioning mechanism.
46. A display apparatus, comprising a movable light source operative to emit a beam of light modulated according to a derived image, the movable light source being responsive to a position input to vary the effective position of the beam of light; an exit pupil expander positioned to receive the emitted beam of light, the exit pupil expander being responsive to emit a plurality of exit beams in response to the received beam of light; an eye tracker oriented to detect a user's eye position and configured to output an electrical signal corresponding to the detected eye position; and a positioner having an electrical input coupled to the eye tracker to receive the electrical signal, the positioner further being coupled to the light source, the positioner being operative to provide the position input in response to the electrical signal.
47. The display apparatus of claim 46 wherein the exit pupil expander is a diffractive element.
US09/128,954 1998-08-05 1998-08-05 Personal display with vision tracking Expired - Lifetime US6396461B1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
US09/128,954 US6396461B1 (en) 1998-08-05 1998-08-05 Personal display with vision tracking
US10/150,309 US20020167462A1 (en) 1998-08-05 2002-05-17 Personal display with vision tracking

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
US09/128,954 US6396461B1 (en) 1998-08-05 1998-08-05 Personal display with vision tracking

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US10/150,309 Continuation US20020167462A1 (en) 1998-08-05 2002-05-17 Personal display with vision tracking

Publications (2)

Publication Number Publication Date
US20020041259A1 true US20020041259A1 (en) 2002-04-11
US6396461B1 US6396461B1 (en) 2002-05-28

Family

ID=22437786

Family Applications (2)

Application Number Title Priority Date Filing Date
US09/128,954 Expired - Lifetime US6396461B1 (en) 1998-08-05 1998-08-05 Personal display with vision tracking
US10/150,309 Abandoned US20020167462A1 (en) 1998-08-05 2002-05-17 Personal display with vision tracking

Family Applications After (1)

Application Number Title Priority Date Filing Date
US10/150,309 Abandoned US20020167462A1 (en) 1998-08-05 2002-05-17 Personal display with vision tracking

Country Status (1)

Country Link
US (2) US6396461B1 (en)

Families Citing this family (89)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AU4598399A (en) 1999-07-06 2001-01-22 Swisscom Mobile Ag Method for checking user authorization
JP2002157607A (en) * 2000-11-17 2002-05-31 Canon Inc System and method for image generation, and storage medium
US7111939B2 (en) * 2001-01-22 2006-09-26 Eastman Kodak Company Image display system with body position compensation
DE10311306A1 (en) * 2003-03-14 2004-09-23 Carl Zeiss Image display device, e.g. head-mounted display device, has pupil optics that spatially magnify outlet pupil of imaging optics and/or move outlet pupil of imaging optics
US20040196399A1 (en) * 2003-04-01 2004-10-07 Stavely Donald J. Device incorporating retina tracking
US7401920B1 (en) 2003-05-20 2008-07-22 Elbit Systems Ltd. Head mounted eye tracking and display system
JP4298455B2 (en) * 2003-09-30 2009-07-22 キヤノン株式会社 Scanning image display device
US7362738B2 (en) * 2005-08-09 2008-04-22 Deere & Company Method and system for delivering information to a user
DE102005046130A1 (en) * 2005-09-27 2007-03-29 Bausch & Lomb Inc. Excimer laser-eye surgical system, has eye tracing device sending instruction signal to laser device via bidirectional bus to fire shot, when preset position data is same as initial position data of scanning device for shot
US8956396B1 (en) * 2005-10-24 2015-02-17 Lockheed Martin Corporation Eye-tracking visual prosthetic and method
US8709078B1 (en) 2011-08-03 2014-04-29 Lockheed Martin Corporation Ocular implant with substantially constant retinal spacing for transmission of nerve-stimulation light
US10524656B2 (en) 2005-10-28 2020-01-07 Topcon Medical Laser Systems Inc. Photomedical treatment system and method with a virtual aiming device
US7542210B2 (en) * 2006-06-29 2009-06-02 Chirieleison Sr Anthony Eye tracking head mounted display
US7511684B2 (en) * 2006-07-31 2009-03-31 Motorola, Inc. Image alignment method for binocular eyewear displays
US9079762B2 (en) 2006-09-22 2015-07-14 Ethicon Endo-Surgery, Inc. Micro-electromechanical device
JP2008119197A (en) * 2006-11-10 2008-05-29 Tokai Rika Co Ltd Main body of situation monitoring apparatus and situation monitoring apparatus
US7713265B2 (en) 2006-12-22 2010-05-11 Ethicon Endo-Surgery, Inc. Apparatus and method for medically treating a tattoo
US8273015B2 (en) 2007-01-09 2012-09-25 Ethicon Endo-Surgery, Inc. Methods for imaging the anatomy with an anatomically secured scanner assembly
US8801606B2 (en) 2007-01-09 2014-08-12 Ethicon Endo-Surgery, Inc. Method of in vivo monitoring using an imaging system including scanned beam imaging unit
US8216214B2 (en) 2007-03-12 2012-07-10 Ethicon Endo-Surgery, Inc. Power modulation of a scanning beam for imaging, therapy, and/or diagnosis
US8626271B2 (en) 2007-04-13 2014-01-07 Ethicon Endo-Surgery, Inc. System and method using fluorescence to examine within a patient's anatomy
US7995045B2 (en) 2007-04-13 2011-08-09 Ethicon Endo-Surgery, Inc. Combined SBI and conventional image processor
US8160678B2 (en) 2007-06-18 2012-04-17 Ethicon Endo-Surgery, Inc. Methods and devices for repairing damaged or diseased tissue using a scanning beam assembly
US7982776B2 (en) 2007-07-13 2011-07-19 Ethicon Endo-Surgery, Inc. SBI motion artifact removal apparatus and method
US9125552B2 (en) 2007-07-31 2015-09-08 Ethicon Endo-Surgery, Inc. Optical scanning module and means for attaching the module to medical instruments for introducing the module into the anatomy
US7983739B2 (en) 2007-08-27 2011-07-19 Ethicon Endo-Surgery, Inc. Position tracking and control for a scanning assembly
US7925333B2 (en) 2007-08-28 2011-04-12 Ethicon Endo-Surgery, Inc. Medical device including scanned beam unit with operational control features
US20090161705A1 (en) * 2007-12-20 2009-06-25 Etienne Almoric Laser projection utilizing beam misalignment
US8050520B2 (en) 2008-03-27 2011-11-01 Ethicon Endo-Surgery, Inc. Method for creating a pixel image from sampled data of a scanned beam imager
WO2009131626A2 (en) * 2008-04-06 2009-10-29 David Chaum Proximal image projection systems
US20100149073A1 (en) * 2008-11-02 2010-06-17 David Chaum Near to Eye Display System and Appliance
US8332014B2 (en) 2008-04-25 2012-12-11 Ethicon Endo-Surgery, Inc. Scanned beam device and method using same which measures the reflectance of patient tissue
WO2010062481A1 (en) * 2008-11-02 2010-06-03 David Chaum Near to eye display system and appliance
US9965681B2 (en) 2008-12-16 2018-05-08 Osterhout Group, Inc. Eye imaging in head worn computing
US9952664B2 (en) 2014-01-21 2018-04-24 Osterhout Group, Inc. Eye imaging in head worn computing
US9298007B2 (en) * 2014-01-21 2016-03-29 Osterhout Group, Inc. Eye imaging in head worn computing
US9495589B2 (en) * 2009-01-26 2016-11-15 Tobii Ab Detection of gaze point assisted by optical reference signal
US11726332B2 (en) 2009-04-27 2023-08-15 Digilens Inc. Diffractive projection apparatus
US9292973B2 (en) 2010-11-08 2016-03-22 Microsoft Technology Licensing, Llc Automatic variable virtual focus for augmented reality displays
US9304319B2 (en) 2010-11-18 2016-04-05 Microsoft Technology Licensing, Llc Automatic focus improvement for augmented reality displays
US20120176474A1 (en) * 2011-01-10 2012-07-12 John Norvold Border Rotational adjustment for stereo viewing
WO2016020630A2 (en) 2014-08-08 2016-02-11 Milan Momcilo Popovich Waveguide laser illuminator incorporating a despeckler
CA2750287C (en) 2011-08-29 2012-07-03 Microsoft Corporation Gaze detection in a see-through, near-eye, mixed reality display
WO2013033195A2 (en) 2011-08-30 2013-03-07 Microsoft Corporation Head mounted display with iris scan profiling
US9213163B2 (en) 2011-08-30 2015-12-15 Microsoft Technology Licensing, Llc Aligning inter-pupillary distance in a near-eye display system
US9025252B2 (en) 2011-08-30 2015-05-05 Microsoft Technology Licensing, Llc Adjustment of a mixed reality display for inter-pupillary distance alignment
US9323325B2 (en) 2011-08-30 2016-04-26 Microsoft Technology Licensing, Llc Enhancing an object of interest in a see-through, mixed reality display device
US8998414B2 (en) 2011-09-26 2015-04-07 Microsoft Technology Licensing, Llc Integrated eye tracking and display system
US8681426B2 (en) 2011-11-04 2014-03-25 Honeywell International Inc. Steerable near-to-eye display and steerable near-to-eye display system
WO2013167864A1 (en) 2012-05-11 2013-11-14 Milan Momcilo Popovich Apparatus for eye tracking
WO2014030158A1 (en) * 2012-08-24 2014-02-27 Ic Inside Ltd Visual aid projector
US9933684B2 (en) * 2012-11-16 2018-04-03 Rockwell Collins, Inc. Transparent waveguide display providing upper and lower fields of view having a specific light output aperture configuration
KR20150136601A (en) 2013-03-25 2015-12-07 에꼴 뽈리떼끄닉 뻬데랄 드 로잔느 (으뻬에프엘) Method for displaying an image projected from a head-worn display with multiple exit pupils
US10209517B2 (en) 2013-05-20 2019-02-19 Digilens, Inc. Holographic waveguide eye tracker
WO2014209244A1 (en) * 2013-06-27 2014-12-31 Koc Universitesi Image display device in the form of a pair of eye glasses
CN103630116B (en) * 2013-10-10 2016-03-23 北京智谷睿拓技术服务有限公司 Image acquisition localization method and image acquisition locating device
US9557553B2 (en) * 2013-10-10 2017-01-31 Raytheon Canada Limited Electronic eyebox
CN103557859B (en) * 2013-10-10 2015-12-23 北京智谷睿拓技术服务有限公司 Image acquisition localization method and image acquisition positioning system
US10191279B2 (en) 2014-03-17 2019-01-29 Osterhout Group, Inc. Eye imaging in head worn computing
US11227294B2 (en) 2014-04-03 2022-01-18 Mentor Acquisition One, Llc Sight information collection in head worn computing
US11669163B2 (en) 2014-01-21 2023-06-06 Mentor Acquisition One, Llc Eye glint imaging in see-through computer display systems
US9811159B2 (en) 2014-01-21 2017-11-07 Osterhout Group, Inc. Eye imaging in head worn computing
US9532714B2 (en) 2014-01-21 2017-01-03 Osterhout Group, Inc. Eye imaging in head worn computing
US11737666B2 (en) 2014-01-21 2023-08-29 Mentor Acquisition One, Llc Eye imaging in head worn computing
US9836122B2 (en) 2014-01-21 2017-12-05 Osterhout Group, Inc. Eye glint imaging in see-through computer display systems
US11487110B2 (en) 2014-01-21 2022-11-01 Mentor Acquisition One, Llc Eye imaging in head worn computing
US20150241963A1 (en) 2014-02-11 2015-08-27 Osterhout Group, Inc. Eye imaging in head worn computing
GB2526092A (en) * 2014-05-13 2015-11-18 Nokia Technologies Oy An apparatus and method for providing an image
US10241330B2 (en) 2014-09-19 2019-03-26 Digilens, Inc. Method and apparatus for generating input images for holographic waveguide displays
EP3198192A1 (en) 2014-09-26 2017-08-02 Milan Momcilo Popovich Holographic waveguide opticaltracker
US20180275402A1 (en) 2015-01-12 2018-09-27 Digilens, Inc. Holographic waveguide light field displays
CN107873086B (en) 2015-01-12 2020-03-20 迪吉伦斯公司 Environmentally isolated waveguide display
US10330777B2 (en) 2015-01-20 2019-06-25 Digilens Inc. Holographic waveguide lidar
US9632226B2 (en) 2015-02-12 2017-04-25 Digilens Inc. Waveguide grating device
JP6617945B2 (en) * 2015-03-11 2019-12-11 株式会社リコー Image display device
US10996660B2 (en) 2015-04-17 2021-05-04 Tulip Interfaces, Ine. Augmented manufacturing system
AU2016267275B2 (en) * 2015-05-28 2021-07-01 Google Llc Systems, devices, and methods that integrate eye tracking and scanning laser projection in wearable heads-up displays
EP3359999A1 (en) 2015-10-05 2018-08-15 Popovich, Milan Momcilo Waveguide display
WO2017134412A1 (en) 2016-02-04 2017-08-10 Milan Momcilo Popovich Holographic waveguide optical tracker
WO2018129398A1 (en) 2017-01-05 2018-07-12 Digilens, Inc. Wearable heads up displays
US10176375B2 (en) 2017-03-29 2019-01-08 Raytheon Canada Limited High speed pupil detection system and method
WO2020146546A1 (en) * 2019-01-08 2020-07-16 Avegant Corp. Sensor-based eye-tracking using a holographic optical element
US11237389B1 (en) * 2019-02-11 2022-02-01 Facebook Technologies, Llc Wedge combiner for eye-tracking
EP3924759A4 (en) 2019-02-15 2022-12-28 Digilens Inc. Methods and apparatuses for providing a holographic waveguide display using integrated gratings
EP3980825A4 (en) 2019-06-07 2023-05-03 Digilens Inc. Waveguides incorporating transmissive and reflective gratings and related methods of manufacturing
EP4022370A4 (en) 2019-08-29 2023-08-30 Digilens Inc. Evacuating bragg gratings and methods of manufacturing
US11885965B1 (en) 2019-09-23 2024-01-30 Apple Inc. Head-mounted display and display modules thereof
US11793787B2 (en) 2019-10-07 2023-10-24 The Broad Institute, Inc. Methods and compositions for enhancing anti-tumor immunity by targeting steroidogenesis
WO2022170287A2 (en) 2021-06-07 2022-08-11 Panamorph, Inc. Near-eye display system

Family Cites Families (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE69114790T2 (en) 1990-08-20 1996-04-18 Sony Corp Direct view picture display device.
US5596339A (en) 1992-10-22 1997-01-21 University Of Washington Virtual retinal display with fiber optic point source
US6008781A (en) * 1992-10-22 1999-12-28 Board Of Regents Of The University Of Washington Virtual retinal display
US5467104A (en) 1992-10-22 1995-11-14 Board Of Regents Of The University Of Washington Virtual retinal display
US5539422A (en) 1993-04-12 1996-07-23 Virtual Vision, Inc. Head mounted display system
JPH0824358B2 (en) * 1993-08-16 1996-03-06 工業技術院長 Image display device
JP3396062B2 (en) * 1993-08-26 2003-04-14 オリンパス光学工業株式会社 Image display device
US5659430A (en) 1993-12-21 1997-08-19 Olympus Optical Co., Ltd. Visual display apparatus
JP3240362B2 (en) * 1994-04-13 2001-12-17 独立行政法人産業技術総合研究所 Wide-field image presentation device
US5727098A (en) 1994-09-07 1998-03-10 Jacobson; Joseph M. Oscillating fiber optic display and imager
US5557444A (en) 1994-10-26 1996-09-17 University Of Washington Miniature optical scanner for a two axis scanning system
US5701132A (en) 1996-03-29 1997-12-23 University Of Washington Virtual retinal display with expanded exit pupil
US5935948A (en) 1997-04-09 1999-08-10 Krstulovic; Veljko J. Method of treating and preventing gallstones
US6097353A (en) 1998-01-20 2000-08-01 University Of Washington Augmented retinal display with view tracking and data positioning
US6043799A (en) * 1998-02-20 2000-03-28 University Of Washington Virtual retinal display with scanner array for generating multiple exit pupils
US5903397A (en) * 1998-05-04 1999-05-11 University Of Washington Display with multi-surface eyepiece

Cited By (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7193584B2 (en) * 2001-02-19 2007-03-20 Samsung Electronics Co., Ltd. Wearable display apparatus
US20020113755A1 (en) * 2001-02-19 2002-08-22 Samsung Electronics Co., Ltd. Wearable display apparatus
EP2083670A1 (en) * 2006-11-29 2009-08-05 Tobii Technology AB Eye tracking illumination
EP2083670A4 (en) * 2006-11-29 2010-02-03 Tobii Technology Ab Eye tracking illumination
EP2371271A1 (en) * 2006-11-29 2011-10-05 Tobii Technology AB Eye tracking illumination
WO2011155878A1 (en) * 2010-06-10 2011-12-15 Volvo Lastavagnar Ab A vehicle based display system and a method for operating the same
US10627623B2 (en) 2012-05-03 2020-04-21 Nokia Technologies Oy Image providing apparatus, method and computer program
US20140375540A1 (en) * 2013-06-24 2014-12-25 Nathan Ackerman System for optimal eye fit of headset display device
US9652034B2 (en) 2013-09-11 2017-05-16 Shenzhen Huiding Technology Co., Ltd. User interface based on optical sensing and tracking of user's eye movement and position
WO2015070182A3 (en) * 2013-11-09 2015-11-05 Firima Inc. Optical eye tracking
US11740692B2 (en) 2013-11-09 2023-08-29 Shenzhen GOODIX Technology Co., Ltd. Optical eye tracking
US9552064B2 (en) 2013-11-27 2017-01-24 Shenzhen Huiding Technology Co., Ltd. Eye tracking and user reaction detection
US10416763B2 (en) 2013-11-27 2019-09-17 Shenzhen GOODIX Technology Co., Ltd. Eye tracking and user reaction detection
JP2015176130A (en) * 2014-03-18 2015-10-05 パイオニア株式会社 virtual image display device
US10213105B2 (en) * 2014-12-11 2019-02-26 AdHawk Microsystems Eye-tracking system and method therefor
US10317672B2 (en) 2014-12-11 2019-06-11 AdHawk Microsystems Eye-tracking system and method therefor
US20160166146A1 (en) * 2014-12-11 2016-06-16 Icspi Corp. Eye-Tracking System and Method Therefor
US11682121B2 (en) 2015-08-27 2023-06-20 Carl Zeiss Meditec, Inc. Methods and systems to detect and classify retinal structures in interferometric imaging data
US10169864B1 (en) * 2015-08-27 2019-01-01 Carl Zeiss Meditec, Inc. Methods and systems to detect and classify retinal structures in interferometric imaging data
US10896511B2 (en) 2015-08-27 2021-01-19 Carl Zeiss Meditec, Inc. Methods and systems to detect and classify retinal structures in interferometric imaging data
US11747624B2 (en) 2015-09-23 2023-09-05 Magic Leap, Inc. Eye imaging with an off-axis imager
CN111033354A (en) * 2017-08-11 2020-04-17 微软技术许可有限责任公司 Eye tracking using MEMS scanning and reflected light
CN111971609A (en) * 2018-04-06 2020-11-20 依视路国际公司 Method for customizing a head mounted device adapted for generating a virtual image
US11243401B2 (en) 2018-04-06 2022-02-08 Essilor International Method for customizing a head mounted device adapted to generate a virtual image
JP2018124575A (en) * 2018-04-24 2018-08-09 パイオニア株式会社 Virtual image display device
CN113661431A (en) * 2019-08-29 2021-11-16 苹果公司 Optical module of head-mounted device
US11681362B2 (en) 2019-11-26 2023-06-20 Magic Leap, Inc. Enhanced eye tracking for augmented or virtual reality display systems
WO2021108327A1 (en) * 2019-11-26 2021-06-03 Magic Leap, Inc. Enhanced eye tracking for augmented or virtual reality display systems

Also Published As

Publication number Publication date
US20020167462A1 (en) 2002-11-14
US6396461B1 (en) 2002-05-28

Similar Documents

Publication Publication Date Title
CA2388015C (en) Personal display with vision tracking
US6396461B1 (en) Personal display with vision tracking
US6151167A (en) Scanned display with dual signal fiber transmission
US7209271B2 (en) Multiple beam scanning imager
US6285489B1 (en) Frequency tunable resonant scanner with auxiliary arms
US7002716B2 (en) Method and apparatus for blending regions scanned by a beam scanner
US6661393B2 (en) Scanned display with variation compensation
US7190329B2 (en) Apparatus for remotely imaging a region
US6803561B2 (en) Frequency tunable resonant scanner
US6256131B1 (en) Active tuning of a torsional resonant structure
US6654158B2 (en) Frequency tunable resonant scanner with auxiliary arms
US6882462B2 (en) Resonant scanner with asymmetric mass distribution
US7310174B2 (en) Method and apparatus for scanning regions
US7982765B2 (en) Apparatus, system, and method for capturing an image with a scanned beam of light
US7516896B2 (en) Frequency tunable resonant scanner with auxiliary arms
JPH11160650A (en) Picture display device
EP1352286B1 (en) Scanned display with variation compensation
KR20040020864A (en) Frequency tunable resonant scanner and method of making
CN112543886A (en) Device arrangement for projecting a laser beam for generating an image on the retina of an eye
JPH11109278A (en) Video display device
EP1226569B1 (en) Low light viewer with image simulation
EP1330673B1 (en) Frequency tunable resonant scanner with auxiliary arms
JPH11133346A (en) Video display device
EP1655629A2 (en) Point source scanning apparatus and method

Legal Events

Date Code Title Description
AS Assignment

Owner name: MICROVISION, INC., WASHINGTON

Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:LEWIS, JOHN R.;NESTOROVIC, NENAD;REEL/FRAME:009370/0501

Effective date: 19980805

STCF Information on status: patent grant

Free format text: PATENTED CASE

FPAY Fee payment

Year of fee payment: 4

FEPP Fee payment procedure

Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 8

FEPP Fee payment procedure

Free format text: PAT HOLDER NO LONGER CLAIMS SMALL ENTITY STATUS, ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: STOL); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY

FPAY Fee payment

Year of fee payment: 12