WO1997049081A1 - Method and apparatus for orientation sensing and directional sound generation - Google Patents

Method and apparatus for orientation sensing and directional sound generation

Info

Publication number
WO1997049081A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
visual display
axis
boom
viewing apparatus
Application number
PCT/US1997/010107
Other languages
French (fr)
Inventor
Richard Roell
James Rogers
Original Assignee
Immersive Technologies, Inc.
Priority claimed from US08/667,815 external-priority patent/US6057810A/en
Application filed by Immersive Technologies, Inc. filed Critical Immersive Technologies, Inc.
Priority to AU34828/97A priority Critical patent/AU3482897A/en
Publication of WO1997049081A1 publication Critical patent/WO1997049081A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/012 Head tracking input arrangements
    • G PHYSICS
    • G09 EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09B EDUCATIONAL OR DEMONSTRATION APPLIANCES; APPLIANCES FOR TEACHING, OR COMMUNICATING WITH, THE BLIND, DEAF OR MUTE; MODELS; PLANETARIA; GLOBES; MAPS; DIAGRAMS
    • G09B9/00 Simulators for teaching or training purposes
    • G09B9/02 Simulators for teaching or training purposes for teaching control of vehicles or other craft
    • G09B9/08 Simulators for teaching or training purposes for teaching control of vehicles or other craft for teaching control of aircraft, e.g. Link trainer
    • G09B9/30 Simulation of view from aircraft
    • G09B9/307 Simulation of view from aircraft by helmet-mounted projector or display


Abstract

A viewing apparatus (50) comprises a visual display (104) which transmits image data to a user and is rotatably coupled, in three dimensions, to a boom (102) supporting it. The operator inserts his head into the visual display (104) and grasps the control grips (144) with both hands to interact in three dimensions with a virtual world, and with virtual objects in that world, via graphics displayed to each eye. The visual display (104) is equipped with sensing devices for sensing both location and directional coordinates in three dimensions relative to the user and to the virtual world and its objects. The visual display (104) houses two LCD displays which direct separate images to the user's eyes, so that the images are perceived stereoscopically, as well as speakers positioned forward of the user's ears (115, 117) and speakers positioned rearward of the user's ears (116, 118), so that sound can be perceived to originate from a particular direction or object.

Description

METHOD AND APPARATUS FOR ORIENTATION SENSING AND DIRECTIONAL SOUND GENERATION
BACKGROUND
The present invention relates generally to a method and apparatus for sensing an orientation, and more particularly to a viewing method and apparatus for sensing an orientation of a user's head and for viewing and maneuvering in and around a computer generated virtual world.
The use of video displays for recreational, educational, and scientific purposes has increased dramatically in recent years. Video displays are commonly used in medical operations, for example, and in many other analytical fields to visually present a simulated environment to a user. Another particular application is in the video game industry where video displays transmit real time image data to a user. The video display is programmed to respond to actions taken by the user through a control mechanism such as a joy stick and is updated in real time.
Recently, video displays have been adapted to be secured a fixed distance in front of the user's eyes and to provide real time image data based on, for example, the movement of the user's head. Commonly known as "virtual reality", a motion sensor senses movement of the user's head and provides a signal representing the motion to a microprocessor which calculates the real time image data based on the signal.
In many known systems, the video display is fixed inside a helmet which the user wears while using the apparatus. U.S. Patent No. 4,884,219, for example, discloses an apparatus for the perception of computer-generated imagery comprising a helmet which includes two visual display units and a first module containing three small coils. A second module also having three small coils is fixed at a location remote from the helmet. The second module acts as a transmitter to generate a low frequency electric field. The first module acts as a moveable sensor which samples the field generated by the second module. An electronics decode circuit decodes a signal from the first module and computes the sensor's position and orientation in angular coordinates.
Although this helmet-type apparatus may be used to sense an orientation of the user's head and to calculate image data, it has several disadvantages. For example, people who use the apparatus will have different sized heads. The helmet, therefore, will typically have to include a relatively complex mechanism for allowing a size adjustment of the helmet before each person uses the apparatus. It is usually necessary, therefore, to hire an attendant to assist users in fitting the helmet to the user's head, which has obvious cost disadvantages. Each fitting may also take an undesirably long period of time, which significantly curtails receipts, and which may deter potential users from waiting in line to use the apparatus. In addition, the use of the same helmet by all users may cause problems with respect to hygiene, which may deter a significant number of potential users from using the apparatus. Moreover, because the helmet is not supported other than by resting on the user's head, its mass must be kept reasonably low, which may significantly affect the quality and size of the visual display units which can be used.
In other known systems, a visual display is suspended from a boom which may be counterbalanced such that the visual display does not weigh down on the user's head. U.S. Patent No. 5,253,832, for example, discloses a spring counterbalanced boom suspension system in which the visual display is suspended from a boom arm which is rotatably connected to a cantilever arm. In this system, however, the freedom of motion of the visual display is substantially more limited than in the helmet-type apparatus of the above-cited U.S. Patent No. 4,884,219. For example, the visual display can rotate only about two axes with respect to the boom arm which supports it. Also, the apparatus is designed to be lightweight, which may significantly limit the size and quality of the LCD units which may be used in the visual display. In addition, the use of springs for counterbalancing cannot provide the precision of using counterweights, since the spring constant is not easily adjusted and may vary somewhat as the spring is expanded or contracted.
Another disadvantage of many known systems is that the audio output is delivered to the user by means of two speakers, one speaker for each ear. The speakers are generally attached to the helmet in one of three ways. First, the speakers may be fixed to the helmet. Second, the speakers may be attached to the helmet with adjustable slides. Third, the speakers may be incorporated into headphones which are separate from the helmet. With these systems, the user's ability to detect a sound origination direction is limited to right and left only.
It would be desirable, therefore, to have a viewing apparatus which provides image data to a user based on the movement of the user's head, which can be used by any user without a significant setup time, which supports relatively large LCD displays, which allows a high degree of rotational freedom, and which can be left unattended without risk of damage to the apparatus. It would also be desirable to have a viewing device which provides audio data which can be interpreted by a user as originating from a particular direction in two or three dimensions.
SUMMARY
Exemplary embodiments of the present invention generally take the form of a visual display or "visor" supported by a mechanism which allows rotation about three axes to accommodate freedom of motion and to sense an orientation of the user's head. Sensors may be provided to sense the angular position of the viewing apparatus with respect to the three axes of rotation and to generate control signals indicative of the angular position. The control signals generated by the sensors may be transmitted to a processor which calculates individual images to be transmitted to each eye of the user, as well as sound signals to be transmitted through at least one of a plurality of speakers located forward and rearward of each of the user's ears. The images are calculated in accordance with the sensor data together with data relating to the apparent spatial relation of the user to a virtual model. The apparatus preferably comprises a pair of high resolution visual display units such as LCD displays which present stereoscopic computer-generated images to respective mirrors in front of the user's eyes. The provision of speakers both forward and rearward of each of the user's ears allows the virtual world to extend beyond the field of view provided by the visual display units. For example, signals may be transmitted through the speakers rearward of the user's ear which are perceived by the user to originate from a location in the virtual world behind the user. Thus, an object in the virtual world can be sensed by sound even when it is not in the user's field of view.
An exemplary viewing apparatus may thus comprise a visual display which transmits images to a user, a first member rotatably connected to the visual display about a first axis, a second member rotatably connected to the first member about a second axis perpendicular the first axis, and a third member rotatably connected to the second member about a third axis perpendicular to the first and second axes.
An exemplary method for sensing an orientation comprises the steps of rotatably coupling a visual display to a first member about a first axis, rotatably coupling the first member to a second member about a second axis, and rotatably coupling the second member to a third member about a third axis. The second member may be coupled to the third member such that the second member rotates 360 degrees about the third axis which is perpendicular to the first and second axes.
According to a preferred embodiment of the invention, the apparatus comprises a housing having a recess which receives a user's head, first and second speakers, housed in the housing, positioned forward of the user's left ear and right ear, respectively, third and fourth speakers, housed in the housing, positioned rearward of the user's left ear and right ear, respectively, means for sensing an orientation of the user's head, and means for generating an audio signal for at least one of the first, second, third, and fourth speakers based on the sensed orientation of the user's head. A preferred method comprises the steps of sensing an orientation of a user's head about an axis of rotation, transmitting a first audio signal through a first speaker positioned forward of the user's ear based on the sensed orientation of the user's head, and transmitting a second audio signal, different from the first audio signal, through a second speaker positioned rearward of the user's ear based on the sensed orientation.
This system allows for the orientation of a user in a three-dimensional virtual world coordinate system. In addition, the apparatus may provide the user with the ability to manually control forward, backward, up, down, left, and right translational motion within the virtual world, and to control other software responses.
BRIEF DESCRIPTION OF THE DRAWINGS
The foregoing and other objects, features and advantages of the present invention will be more readily understood upon reading the following detailed description in conjunction with the drawings in which:
Figure 1 is a perspective view of a viewing apparatus according to an exemplary embodiment of the invention;
Figure 2 is an enlarged view of the visual display of Figure 1;
Figures 3A-3C illustrate rotation of the visual display about the roll axis according to an exemplary embodiment of the invention;
Figures 4A-4C illustrate rotation of the visual display about the pitch axis according to an exemplary embodiment of the invention;
Figure 5 illustrates rotation of the visual display about the yaw axis according to an exemplary embodiment of the invention; and
Figure 6 illustrates an exemplary visual display having speakers positioned forward and rearward of the user's ears.
DETAILED DESCRIPTION
In general, the apparatus according to exemplary embodiments of the present invention includes a mechanism for sensing an orientation of the user's head in three dimensions, and one or more processors to generate image data for each eye and audio data for each ear based on the sensed orientation. The processor can create a "virtual world" with the image data it generates. The virtual world may include stationary and moving objects, for example, with which the user interacts. The spatial coordinates of the virtual model can be recalculated in real time so that the objects in the virtual world may be perceived to move relative to the user.
The user's position in the virtual world may be a hypothetical position which the user, through the use of suitable controls, instructs the processor to adopt. Inside the apparatus, the user will be blind to the real world, but the images presented to the user may include a control menu, for example at the periphery of the user's field of view. Through manual controls or sensors, the user can access the control menu to perform various functions. For example, with the control menu, the user may select from different viewpoints of the virtual world. In a first mode, the images presented to the user may be computed such that the user sees a representation of his person in the virtual world. According to a second mode, the images are computed such that the user views the virtual world through a virtual helmet, as if inside the virtual helmet. In a third mode, the images are computed so that the user views the virtual world from outside of the virtual helmet, with an unobstructed field of view. The control menu may also allow the user to change his position in the virtual world so that the perceived images are calculated from a different standpoint.
Referring now to Figure 1, an exemplary viewing apparatus includes a cabinet or housing 100, a boom 102 rotatably mounted on the cabinet 100 at an axis 1, and a visual display 104 which is mechanically coupled to the boom 102. Inside the cabinet 100 is housed at least one and preferably two processors 150 which receive signals indicating the orientation of the visual display 104 and which calculate image and sound data for the visual display 104 based on the received signals. Although exemplary embodiments of the present invention will be described in which a visual display is provided which includes LCD displays, it is also contemplated that applications may arise in which only sound and not visual images are provided to the user based on the orientation of the user's head. Thus, references to the visual display 104 may be generalized to encompass a housing which receives the user's head and which provides only audio signals to the user.
The rotatable boom 102 allows the visual display 104 to be lowered and raised to accommodate users of different heights. As described in commonly owned U.S. Application Serial No. 08/668,259, entitled "Method and Apparatus for Counterbalancing", filed on June 20, 1996, which is hereby incorporated by reference, the boom 102 which supports the visual display 104 is preferably counterbalanced by a counterweight boom 122. The counterweight boom 122 is rotatably mounted on the cabinet 100 at an axis 10 and is coupled to the boom 102 with a linking member 124. The linking member is rotatably connected to the boom at an axis 8 and rotatably connected to the counterweight boom at an axis 9. Because the axes 1 and 10 are fixed, the counterweight boom 122 follows the motion of the boom 102. By locating the counterweight boom 122 below the boom 102, the center of mass of the apparatus is lowered, which significantly improves the stability of the apparatus.
The counterweight boom 122 may be weighted at an axis 11 with a counterweight 126. The counterweight 126 is preferably suspended a predetermined distance below the counterweight boom 122 with a shaft 128 which further lowers the center of mass of the apparatus 50. To limit the velocity at which the boom 102 and counterweight boom 122 may travel, a stabilizing cylinder 134 can be provided. The stabilizing cylinder 134 is preferably a conventional unbiased gas cylinder which has an equal resistance to being expanded or contracted in length. By limiting the velocity of the boom 102 and counterweight boom 122, safety of the apparatus is enhanced and the risk of damaging the apparatus is reduced.
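The counterbalancing arrangement described above reduces to a static torque balance about the boom pivots. The following sketch illustrates how a counterweight might be sized; the masses and lever arms are illustrative assumptions, not values taken from the patent:

```python
def counterweight_mass(display_mass_kg, display_arm_m, counterweight_arm_m):
    """Static torque balance: the counterweight's moment about its pivot
    must equal the visual display's moment about the boom pivot for the
    boom to float freely.  Lever arms are horizontal distances from the
    respective pivot axes (illustrative values only).
    """
    return display_mass_kg * display_arm_m / counterweight_arm_m

# A hypothetical 6 kg display on a 0.8 m arm, balanced by a counterweight
# on a 0.4 m arm, calls for a 12 kg counterweight.
required = counterweight_mass(6.0, 0.8, 0.4)
```

Shortening the counterweight arm or lowering the weight on a shaft, as the patent does with shaft 128, trades a heavier counterweight for a lower center of mass.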
According to a preferred embodiment of the invention, the visual display 104 is connected to the boom 102 such that it may be rotated about three mutually perpendicular axes. As shown in Figure 2, the visual display 104 is supported by a bearing assembly 140 to allow free rotation about a roll axis 2. The bearing assembly 140 may be fixed to a pitch yoke 108. The pitch yoke 108 may be rotatably connected to a yaw yoke 112 by bearing assemblies 110 allowing rotation of the pitch yoke 108 about a pitch axis 3.
The pitch yoke 108 may be generally U-shaped and may include counterweights 142 located opposite the connection between the pitch yoke 108 and the visual display 104. The counterweights 142 balance the torque imparted about the pitch axis 3 due to the weight of the visual display 104 on the pitch yoke 108. The counterweights 142 may be connected to a shaft into which the pitch yoke 108 is slidably inserted to a distance which counterbalances the weight of the visual display 104 about the pitch axis 3, as best shown in Figure 5. The counterweights 142 may be secured at the distance which balances the visual display 104 by any suitable means, such as a removable screw. The position of the counterweights 142 may be adjusted at any time, for example if a visual display of different mass is used, to counterbalance the visual display 104. The counterbalancing of the visual display 104 with the counterweights 142 provides the advantage that the visual display 104 does not need to be limited by size or weight constraints. This allows large LCD displays to be installed in the visual display 104 which occupy a large portion of the user's field of vision.
The yaw yoke 112, which may also be generally U-shaped, may be rotatably connected to the boom 102 through a bearing assembly 114 which allows rotation about a yaw axis 4. As described in the above-cited U.S. Application No. 08/668,259, the boom 102 may include a linking member inside the boom 102 which is rotatably fixed to the housing 100 and which connects to the bearing assembly 114 to allow the bearing assembly 114 to always maintain a vertical orientation. Thus, when the boom 102 is raised or lowered to accommodate users of different heights, the linking member inside the boom 102 causes the angle between the boom 102 and the bearing assembly 114 to change so that the bearing assembly 114 remains vertically oriented.
Rotation of the visual display 104 about the roll axis 2 may be limited by mechanical stops, as shown in Figure 3, for example to a roll angle of ±45 degrees. Inside the visual display 104 may be housed a potentiometer connected to a shaft of the bearing assembly 140 by a coupling. The potentiometer may be an analog device which outputs a voltage indicative of (e.g., proportional to) the roll angle between the visual display 104 and the pitch yoke 108. For example, the potentiometer may output voltages between +5V and -5V as the roll angle changes from +45° to -45°. The output voltage is transmitted to the processor 150 which calculates image and sound data based on the output voltage indicative of the roll angle.
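The linear voltage-to-angle mapping described above can be sketched as follows. The ±5 V and ±45° figures come from the text; clamping readings at the mechanical stops is an added assumption:

```python
def roll_angle_from_voltage(volts, v_full_scale=5.0, max_roll_deg=45.0):
    """Convert the roll potentiometer's output voltage to a roll angle.

    The patent describes a linear mapping (+5 V at +45 degrees of roll,
    -5 V at -45 degrees); readings beyond full scale are clamped to the
    mechanical stops, an illustrative assumption.
    """
    angle = (volts / v_full_scale) * max_roll_deg
    return max(-max_roll_deg, min(max_roll_deg, angle))
```

The pitch potentiometer mounted in the bearing assembly 110 could be read the same way.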
Rotation about the pitch axis 3 may be limited by mechanical stops which allow rotation to a maximum positive angle such as +45° as shown in Fig. 4A, and to a maximum negative angle such as -45° as shown in Fig. 4C. To sense the pitch angle of the visual display 104, a potentiometer may be mounted in a housing of one of the bearing assemblies 110 and may be connected by a coupling to a pitch shaft which connects the pitch yoke 108 to the yaw yoke 112. The potentiometer outputs a voltage indicative of the pitch angle between the pitch yoke 108 and the yaw yoke 112. The output voltage is used by the processor or processors 150 to calculate image and sound data based on the pitch angle indicated by the output voltage.
According to a preferred embodiment, the yaw yoke 112 is rotatably connected to the boom 102 via a bearing assembly 114 which allows a full 360° of rotation without restriction at any angle about the yaw axis 4 as shown in Fig. 5. To sense the yaw angle, a conventional optical encoder or quadrature pulse generator may be installed in the yaw bearing assembly 114 and attached to a shaft of the yaw yoke by a coupling. The optical encoder may include a pair of LEDs which emit pulses responsive to the amount of rotation of the yaw yoke 112 about the yaw axis 4. The optical encoder also preferably includes a light sensor which senses the pulses output by the LEDs and which transmits a signal indicative of the yaw angle to the processor 150. The signal indicative of the yaw angle is used by the processor 150 to generate image and sound data. A yaw angle of zero is initialized at the orientation of the yaw yoke when the processor 150 is booted. The two potentiometers and the optical encoder may thus be mounted in the mechanical rotation points shown in Fig. 2 on axes 2, 3, and 4, respectively, which allow for easy movement which correlates to the body's own natural abilities.
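The quadrature decoding performed on the optical encoder's two channels can be sketched as below. The transition table and counts-per-revolution are illustrative, since the patent specifies only that the yaw reading is zeroed at the orientation the system boots in:

```python
class YawEncoder:
    """Minimal quadrature decoder for the yaw axis (illustrative sketch).

    The two encoder channels A and B produce a Gray-code sequence; valid
    transitions step the count up or down, and the count is zeroed at
    boot, as the patent describes for the yaw angle.
    """
    # Valid Gray-code transitions: (previous AB state, current AB state) -> step
    _STEP = {(0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
             (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1}

    def __init__(self, counts_per_rev=1024):
        self.counts_per_rev = counts_per_rev
        self.count = 0          # zeroed at boot
        self.prev = 0b00

    def update(self, a, b):
        """Feed one sample of the A and B channels (0 or 1 each)."""
        state = (a << 1) | b
        self.count += self._STEP.get((self.prev, state), 0)
        self.prev = state

    def yaw_degrees(self):
        """Yaw angle relative to the boot orientation, wrapped to [0, 360)."""
        return (self.count % self.counts_per_rev) * 360.0 / self.counts_per_rev
```

Because the yaw axis rotates without restriction, the angle is wrapped rather than clamped, unlike the roll and pitch readings.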
The apparatus is preferably constructed to rotate about axes which coincide with the natural axes of rotation of the user's head. For example, the visual display 104, pitch yoke 108, and yaw yoke 112 may be configured such that the roll axis 2 passes just above the user's nose, the pitch axis 3 passes through the user's head just below and behind the ears, and the yaw axis 4 passes through the user's head at the back of the head near the spinal cord. In this way, the apparatus cooperates with the natural motion of the user's head.
According to preferred embodiments of the invention, the visual display 104 is designed to allow a user to quickly use the apparatus with minimal setup time. The visual display or housing 104 preferably includes a recess 105 (see Figure 2) which receives the user's head. As shown in Figures 3 and 5, the recess 105 is formed so that it can receive a range of head sizes. As shown in Figures 4A-4C, a handle or handles 144 may be attached to the visual display which a user grasps to pull the front face of the visual display 104 into contact with the user's face and to control the orientation of the visual display 104. These features, together with the boom 102 which supports the visual display 104, allow a user of any size to quickly walk up to the apparatus 50, grasp the handles 144, bring the user's face into contact with the front face of the visual display 104, and begin to use the machine, which significantly reduces setup time.
The handles 144 can be equipped with at least one trigger or other suitable signalling device to indicate a translational motion which the user desires in a virtual world. For example, triggers can be provided on the handles 144 to allow the user to travel in three degrees of translational freedom within the virtual world, in addition to the three degrees of rotation freedom provided by the yaw yoke 112, pitch yoke 108, and bearing assemblies 110, 114 and 140. The interactions in the virtual world may be governed by both sound and vision. The objects in the virtual world may produce distinct sounds that will aid in their location in the virtual world even if not perceived in the visual displays. Sounds the objects emit can be easily identified by the user as originating from a particular direction in a planar field that is horizontal to the user's eyes, i.e., forward, backward, left, right or any combination thereof.
According to a preferred embodiment of the invention, two speakers are provided for each ear. One of the two speakers is preferably positioned forward of the ear and the other of the two speakers is preferably positioned behind the ear. This configuration may be adopted for each ear. In this way, the audio program can be provided to the user from a selected one or more of the four speakers such that the audio program has a spatial dimension. The user thus perceives that a sound of the audio program is emanating from a specific direction which may correspond, for example, to the location of a stationary or moving object in the virtual world. Through the proper combination of speakers, the sound can be made to emanate from any direction. Of course, additional speakers may be provided above and below each of the user's ears to add a third dimension to the audio program.
As shown in Figures 2 and 6, the visual display or housing 104 houses at least four speakers which provide audio to a user's ears. Two speakers 115 and 116 can be positioned forward and rearward of the left ear, respectively, and two speakers 117 and 118 can be positioned forward and rearward of the right ear, respectively. The four speakers receive audio signals from the two processors 150 so that the user perceives sound as originating from a particular direction.
The provision of four speakers allows the processor to generate audio signals which can be sensed by the user to originate from any point over 360° in a horizontal plane. Thus, sounds can be generated which are perceived by the user to originate from a particular point in the virtual world, which may correspond to a moving or stationary object in the virtual world. In addition, as shown in Figure 6, speakers 160-163 can be provided above and below each ear so that a sound can be generated which is perceived by the user to originate from any direction in three dimensions.
The processors 150 calculate the signals which are sent to the four speakers 115 through 118 based on the sensed orientation of the user's head. Because the two speakers 116 and 118 are positioned rearward of the user's ear, it is possible to provide audio signals which are perceived by the user to originate from an object in the virtual world which is behind the user and thus out of the field of view of the user. The provision of speakers behind the user's ears thus allows the virtual world to expand beyond the field of view of the visual displays. Further, because the user is free to rotate 360° about the yaw axis without restriction at any angle, the user may bring into his field of view the object from which a sound originates by rotating the yaw yoke 112 about the yaw axis 4. As the user rotates, the audio signals are modified by the processors 150 to be transmitted primarily through the forward speakers 115 and 117, rather than through the rearward speakers 116 and 118.
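One way to realize the four-speaker directional effect described above is a constant-power pan across speakers spaced 90° apart. The speaker azimuths and the cosine gain law below are illustrative assumptions; the patent specifies only the forward/rearward speaker placement:

```python
import math

# Assumed speaker azimuths in degrees, measured clockwise from straight
# ahead: 115/117 forward of the left/right ear, 116/118 rearward.
SPEAKERS = {"115_front_left": -45.0, "117_front_right": 45.0,
            "116_rear_left": -135.0, "118_rear_right": 135.0}

def speaker_gains(source_azimuth_deg):
    """Constant-power pan of a virtual source over the four speakers.

    Adjacent speakers sit 90 degrees apart, so each speaker contributes
    cos(angular distance) when the distance is under 90 degrees and is
    silent otherwise; the squared gains then always sum to one.
    """
    gains = {}
    for name, az in SPEAKERS.items():
        # Smallest absolute angular distance, wrapped to [0, 180].
        d = abs((source_azimuth_deg - az + 180.0) % 360.0 - 180.0)
        gains[name] = math.cos(math.radians(d)) if d < 90.0 else 0.0
    return gains
```

As the sensed yaw angle changes, the source azimuth relative to the head changes with it, so a sound behind the user shifts smoothly into the forward speakers as the user turns toward it, matching the behavior described in the text.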
The audio signals provided to the four speakers may be calculated based also on the position of the user in the virtual world with respect to an object or location in the virtual world. Thus, as the user moves through the virtual world and changes his position with respect to objects in the virtual world, the audio signals provided to the speakers are modified accordingly. For example, the audio signals may be calculated so that as the user passes a stationary object in the virtual world which emits a sound, the sound is perceived by the user to always originate from the stationary object as the user changes position with respect to the stationary object. Moreover, the sound signals from a moving object in the virtual world may be modified so that the user perceives the object to move even if the object is out of the user's field of view.
To simulate a sound wave emanating from a distant object in the virtual world, the audio signals provided to different speakers can be calculated so as to have a difference in arrival time, frequency, and/or amplitude. According to a preferred embodiment, a first audio signal which is provided to one of the speakers has a time delay with respect to a second audio signal provided to another one of the speakers. The time delay represents the difference in propagation time of a sound wave originating from a distant object when one of the user's ears is closer to the object than the other of the user's ears. The time delay may also arise when two sound waves reach the user's ears by different paths, for example, when one of the waves is reflected off a wall. In addition, due to Doppler shift, a first audio signal provided to one of the speakers preferably has a frequency difference with respect to a second audio signal provided to another one of the speakers. According to principles well known in the art, the frequency of a sound wave changes as it bends around a physical object.
The frequency of a sound wave may also change to a certain degree as it is reflected off of an object. The degree of frequency change depends on the acoustic properties of the object. Two signals provided to two different speakers also preferably have an amplitude difference. The amplitude difference represents the difference in energy of the sound wave as it arrives at each ear based on the different distances from each ear to the object. The amplitude difference may also arise due to absorption of energy of one of the waves as it is reflected off an object in the path to the user's ear. Thus, in a virtual world in which physical and acoustic properties of objects are modeled, such arrival time, frequency, and amplitude changes can be derived for sound waves which reach the user's ears by different paths.
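The arrival-time and amplitude differences described above follow from simple geometry. The sketch below assumes straight-line propagation, an illustrative ear spacing, a 1/r amplitude law, and a coordinate convention of the author's choosing; none of these specifics are stated in the patent:

```python
import math

SPEED_OF_SOUND = 343.0   # m/s in air at roughly 20 C
EAR_SEPARATION = 0.18    # m between the ears (assumed)

def arrival_differences(source_x, source_y):
    """Time-of-arrival and amplitude differences for a virtual source.

    Coordinates are metres in the head's horizontal plane, with the
    listener at the origin facing +y and the ears on the x axis.
    Amplitude is assumed to fall off as 1/r from the source.
    """
    half = EAR_SEPARATION / 2.0
    r_left = math.hypot(source_x + half, source_y)
    r_right = math.hypot(source_x - half, source_y)
    delay_s = (r_left - r_right) / SPEED_OF_SOUND   # > 0: left ear is farther
    amp_ratio = r_right / r_left                    # left amplitude / right amplitude
    return delay_s, amp_ratio
```

A source straight ahead yields zero delay and equal amplitudes; a source off to one side yields a delay on the order of half a millisecond and a quieter far ear, which is the cue the processors 150 would encode into the per-speaker signals.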
One advantage provided by exemplary embodiments of the present invention relates to the balancing of the visual display 104 by the boom 102 and counterweight boom 122. Because the visual display 104 is supported by the boom 102 and not by the user, the mass of the visual display 104 does not have to be kept to a minimum, as in conventional helmet-type designs. This allows for the utilization of relatively large LCD displays. For example, rather than being limited to lightweight LCD displays having a diagonal of less than an inch, exemplary embodiments of the present invention allow much larger LCD displays to be used, for example LCD displays having a diagonal of 2, 3, 4, 5 or more inches. The use of larger LCD displays greatly increases the satisfaction of the user because the large LCD displays occupy a much larger portion of the user's field of vision. Those skilled in the art will also appreciate that other types of displays, for example plasma LCDs or flat screen CRTs, may be used in conjunction with the present invention.
The support of the visual display 104 by the boom 102 provides the additional advantage that the apparatus 50 can be left unattended without risk of damage to the apparatus. The visual display 104 is preferably securely attached to the boom 102 via the pitch yoke 108, yaw yoke 112, and bearing assemblies 110, 114 and 140. In addition, the velocity of the boom 102 may be limited by the stabilizer cylinder 134 shown in Figure 1. These features can prevent the visual display 104 from being moved in a manner which would damage the visual display 104. Consequently, the apparatus 50 can be installed at a variety of locations without the need for an attendant. According to a preferred embodiment of the invention, two LCD displays, each having a diagonal of approximately 5 inches, are provided in the visual display 104 to transmit images to a user. The LCD displays may substantially fill the field of vision of the user with computer-generated images. The processor 150 is preferably programmed to generate individual image data of the virtual model for each eye of the user in accordance with sensed variations in the orientation and position of the user. The virtual model may therefore be perceived stereoscopically, which enhances the realism of the virtual world. If two processors 150 are implemented, each processor can be dedicated to computing image data for one eye and one ear. The use of a processor 150 for each eye/ear pair may significantly improve the quality of the image and sound data transmitted to the user.
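Stereoscopic perception of the kind described above requires a distinct viewpoint for each eye. The following sketch, with an assumed interpupillary distance and a hypothetical `eye_positions` helper, illustrates how two per-eye viewpoints could be derived from a sensed head position and yaw; it is an illustration of the principle, not the implementation disclosed in the patent.

```python
import math

IPD = 0.065  # m, assumed interpupillary distance

def eye_positions(head_pos, yaw):
    """Illustrative left/right eye positions for stereoscopic rendering,
    given the head centre (x, y, z) in metres and yaw in radians about
    the vertical axis (0 = facing +x)."""
    # Lateral unit vector pointing toward the user's left.
    lx, ly = math.cos(yaw + math.pi / 2), math.sin(yaw + math.pi / 2)
    x, y, z = head_pos
    left = (x + lx * IPD / 2, y + ly * IPD / 2, z)
    right = (x - lx * IPD / 2, y - ly * IPD / 2, z)
    return left, right
```

Rendering the virtual model once from each of these two positions yields the slightly different left- and right-eye images whose disparity produces the stereoscopic depth effect; with two processors, each could render from one of the two viewpoints.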
To calculate the image and sound data, the processors 150 receive signals representing the roll, pitch, and yaw of the visual display 104. The processors 150 process this data to compute a graphical representation for each eye and three-dimensional audio. The data represents the orientation of the user's head within the 3-dimensional world. Additional signals can be transmitted from the handle 144 of the apparatus 50, enabling the user to move in translation in the virtual world, e.g., forward, backward, up, down, left, and right. These translational movement controls can be independent of the fixed location of the visual display 104, and may be limited only by the definition of the virtual world.
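As an illustration of how the sensed roll, pitch, and yaw angles and the handle's translation commands might be combined, the following sketch builds a head-orientation rotation matrix and advances the user's virtual position. The yaw-pitch-roll angle convention, the `speed` parameter, and the world-axis translation are assumptions introduced for the example, not the patent's method.

```python
import math

def rotation_matrix(roll, pitch, yaw):
    """Illustrative 3x3 rotation matrix from sensed roll, pitch, and yaw
    (radians), composed as R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    return [
        [cy * cp, cy * sp * sr - sy * cr, cy * sp * cr + sy * sr],
        [sy * cp, sy * sp * sr + cy * cr, sy * sp * cr - cy * sr],
        [-sp, cp * sr, cp * cr],
    ]

def update_position(pos, handle_input, speed, dt):
    """Advance the user's virtual-world position (x, y, z) from handle
    commands. handle_input is (forward, right, up), each in [-1, 1];
    translation here is along fixed world axes for simplicity."""
    fwd, rgt, up = handle_input
    x, y, z = pos
    return (x + fwd * speed * dt, y + rgt * speed * dt, z + up * speed * dt)
```

Each frame, the renderer would apply the rotation matrix to orient the virtual camera to match the user's head, while the handle inputs move the camera's position through the virtual world independently of the apparatus's fixed physical location.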
As will be readily appreciated by those skilled in the art, the present invention provides many significant advantages for sensing an orientation of a user in conjunction with a virtual world. For example, because the visual display may be supported by a boom and mechanically coupled to the boom, the risk of damage to the visual display is minimal. This allows the apparatus to be left unattended and greatly increases the number of potential sites at which the apparatus may be installed, as compared with a helmet-type apparatus. In addition, the supporting boom 102 and the counterweighted pitch axis 108 allow for the use of relatively large LCD displays in the visual display so that a large portion of the user's field of vision is occupied. The use of a counterweight boom located below the supporting boom increases the stability of the apparatus by lowering its center of mass. The configuration of the visual display can reduce the setup time for each user, because the user may simply walk up to the apparatus, bring his face into contact with the visual display, and immediately begin to use the apparatus. The use of a processor dedicated for each eye/ear pair can significantly improve the quality of the image and sound data transmitted to the user. The above-described exemplary embodiments are intended to be illustrative in all respects, rather than restrictive, of the present invention. Thus the present invention is capable of many variations in detailed implementation that can be derived from the description contained herein by a person skilled in the art. All such variations and modifications are considered to be within the scope and spirit of the present invention as defined by the following claims.

Claims

WHAT IS CLAIMED IS:
1. An apparatus comprising: a housing having a recess which receives a user's head; first and second speakers, housed in the housing, positioned forward of the user's left ear and right ear, respectively; third and fourth speakers, housed in the housing, positioned rearward of the user's left ear and right ear, respectively; means for sensing an orientation of the user's head; and means for generating an audio signal for at least one of the first, second, third, and fourth speakers based on the sensed orientation of the user's head.
2. The apparatus of claim 1, further comprising: fifth and sixth speakers, housed in the housing, positioned above the user's left ear and right ear, respectively; seventh and eighth speakers, housed in the housing, positioned below the user's left ear and right ear, respectively; and wherein the means for generating generates an audio signal for at least one of the first through eighth speakers based on the sensed orientation of the user's head.
3. The apparatus of claim 1, further comprising a fixed frame, wherein the housing is mechanically coupled to the fixed frame such that the housing is free to rotate 360° about a vertical axis without restriction at any angle.
4. The apparatus of claim 1, wherein the means for generating is adapted to generate a first audio signal and a second audio signal, wherein the first audio signal has at least one of: a time delay, a frequency difference, and an amplitude difference with respect to the second audio signal.
5. The apparatus of claim 1, further comprising first and second video displays for providing stereoscopic images to the user.
6. The apparatus of claim 1, further comprising: a fixed frame; a first boom mechanically coupled to the housing and rotatably coupled to the fixed frame; and a second boom mechanically coupled to the first boom, and disposed below the first boom.
7. The apparatus of claim 1, further comprising: a fixed frame; and means for mechanically coupling the housing to the fixed frame such that the housing is free to rotate about three axes of rotation.
8. The apparatus of claim 1, further comprising: an indicator for allowing the user to indicate a translational motion; and wherein the means for generating generates the audio signal based on the translational motion.
9. A method comprising the steps of: sensing an orientation of a user's head about an axis of rotation; transmitting a first audio signal through a first speaker positioned forward of the user's ear based on the sensed orientation of the user's head; and transmitting a second audio signal, different from the first audio signal, through a second speaker positioned rearward of the user's ear based on the sensed orientation.
10. The method of claim 9, further comprising the steps of: sensing an orientation of the user's head about a second axis of rotation; and transmitting a third audio signal through a third speaker positioned above the user's ear based on the sensed orientation of the user's head.
11. The method of claim 9, further comprising the steps of: indicating a translational motion of the user; and transmitting the first and second audio signals based on the indicated translational motion.
12. The method of claim 9, wherein the orientation of the user's head about the first axis of rotation is sensed over 360° without restriction at any angle.
13. The method of claim 9, further comprising the step of providing a first image and a second image to the user's eyes, based on the sensed orientation of the user's head.
14. A viewing apparatus comprising: a visual display which transmits images to a user; a first member rotatably connected to the visual display about a first horizontal axis; a second member rotatably connected to the first member about a second horizontal axis perpendicular to the first horizontal axis; and a third member rotatably connected to the second member about a third vertical axis.
15. The viewing apparatus of claim 14, further comprising an indicator disposed on the visual display for indicating a translational motion.
16. The viewing apparatus of claim 15, wherein the indicator is a manual control.
17. The viewing apparatus of claim 14, further comprising: a boom connected to the third member; and a counterweight boom mechanically coupled to the boom such that the boom and the counterweight boom move in unison, wherein the counterweight boom is disposed below the boom.
18. The viewing apparatus of claim 17, wherein the counterweight boom is enclosed within a housing.
19. The viewing apparatus of claim 14, wherein: the first member comprises a first U-shaped yoke which is rotatably connected to the visual display along a centerline of the first U-shaped yoke; and the second member comprises a second U-shaped yoke which is rotatably connected to sides of the first U-shaped yoke.
20. The viewing apparatus of claim 14, wherein the first member comprises a counterweight which counterbalances a torque imparted on the first member by the visual display about the second horizontal axis.
21. The viewing apparatus of claim 14, wherein the visual display comprises two LCD displays which each have a diagonal of not less than 3 inches.
22. The viewing apparatus of claim 14, wherein the second member is free to rotate 360 degrees about the third vertical axis without restriction at any angle.
23. The viewing apparatus of claim 14, further comprising: a first potentiometer for sensing a first angular position of the visual display with respect to the first member about the first horizontal axis; a second potentiometer for sensing a second angular position of the second member with respect to the first member about the second horizontal axis; and an optical encoder for sensing a third angular position of the third member with respect to the second member about the third vertical axis, wherein the optical encoder senses the third angular position over 360 degrees without restriction at any angle.
24. A viewing apparatus comprising: a visual display which transmits images to a user; a boom which supports the visual display; and means for mechanically coupling the visual display to the boom such that the visual display rotates 360 degrees about a vertical axis without restriction at any angle.
25. The viewing apparatus of claim 24, wherein the visual display comprises two LCD displays which each have diagonals of at least 3 inches.
26. A viewing apparatus comprising: a visual display; a first member connected to the visual display; a second member rotatably connected to the first member about a first axis; and a counterweight, disposed on the first member, which counterbalances a torque imparted by the visual display on the first member about the first axis.
27. The viewing apparatus of claim 26, wherein a position of the counterweight with respect to the first member is adjustable so as to vary a counterbalancing torque provided by the counterweight.
28. A viewing apparatus comprising: a visual display which transmits images to a user; a first member rotatably connected to the visual display at a first axis; a first sensor on the first axis for sensing a first angular position of the visual display relative to the first member; a second member rotatably connected to the first member at a second axis perpendicular to the first axis; a second sensor on the second axis for sensing a second angular position of the first member relative to the second member; a third member rotatably connected to the second member at a third axis perpendicular to the first and second axes; a third sensor on the third axis for sensing a third angular position of the second member relative to the third member; and a control unit which receives signals from the first, second and third sensors representing the first, second and third angular positions and which transmits image data to the visual display based on the first, second and third angular positions.
29. The viewing apparatus of claim 28, wherein the visual display comprises a first speaker positioned forward of a first ear of the user and a second speaker positioned rearward of the first ear of the user.
30. The viewing apparatus of claim 28, wherein the visual display comprises two LCD displays which each have diagonals of at least 3 inches.
31. The viewing apparatus of claim 30, wherein the control unit generates different image data for each of the two LCD displays.
32. A method comprising the steps of: rotatably coupling a visual display to a first member about a first axis; rotatably coupling the first member to a second member about a second axis and rotatably coupling the second member to a third member about a third axis such that the second member rotates 360 degrees about the third axis.
33. The method of claim 32, further comprising the step of counterbalancing with a counterweight a torque imparted by the visual display on the first member about the second axis.
PCT/US1997/010107 1996-06-20 1997-06-20 Method and apparatus for orientation sensing and directional sound generation WO1997049081A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU34828/97A AU3482897A (en) 1996-06-20 1997-06-20 Method and apparatus for orientation sensing and directional sound generation

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US08/667,815 US6057810A (en) 1996-06-20 1996-06-20 Method and apparatus for orientation sensing
US08/667,815 1996-06-20
US69478996A 1996-08-09 1996-08-09
US08/694,789 1996-08-09

Publications (1)

Publication Number Publication Date
WO1997049081A1 true WO1997049081A1 (en) 1997-12-24

Family

ID=27099776

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1997/010107 WO1997049081A1 (en) 1996-06-20 1997-06-20 Method and apparatus for orientation sensing and directional sound generation

Country Status (2)

Country Link
AU (1) AU3482897A (en)
WO (1) WO1997049081A1 (en)


Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4257062A (en) * 1978-12-29 1981-03-17 Meredith Russell W Personalized audio-visual system
US4884219A (en) * 1987-01-21 1989-11-28 W. Industries Limited Method and apparatus for the perception of computer-generated imagery
US5253832A (en) * 1991-07-26 1993-10-19 Bolas Mark T Spring counterbalanced boom suspension system
US5320538A (en) * 1992-09-23 1994-06-14 Hughes Training, Inc. Interactive aircraft training system and method


Cited By (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1694063A1 (en) * 2003-11-21 2006-08-23 Nishi, Kenji Image display device and simulation apparatus
EP1694063A4 (en) * 2003-11-21 2008-01-30 Kenji Nishi Image display device and simulation apparatus
EP2128842A2 (en) * 2008-05-30 2009-12-02 Honeywell International Inc. Operator assistance methods and systems
EP2128842A3 (en) * 2008-05-30 2009-12-16 Honeywell International Inc. Operator assistance methods and systems
US9207758B2 (en) 2008-05-30 2015-12-08 Honeywell International Inc. Operator assistance methods and systems
US9574326B2 (en) 2012-08-02 2017-02-21 Harnischfeger Technologies, Inc. Depth-related help functions for a shovel training simulator
US9666095B2 (en) 2012-08-02 2017-05-30 Harnischfeger Technologies, Inc. Depth-related help functions for a wheel loader training simulator
EP3246897A1 (en) * 2016-05-18 2017-11-22 Christian K. Keul Device and method for simulating acceleration forces

Also Published As

Publication number Publication date
AU3482897A (en) 1998-01-07

Similar Documents

Publication Publication Date Title
US6057810A (en) Method and apparatus for orientation sensing
US4984179A (en) Method and apparatus for the perception of computer-generated imagery
US7812815B2 (en) Compact haptic and augmented virtual reality system
EP3612143B1 (en) Emulating spatial perception using virtual echolocation
JP3247126B2 (en) Method and apparatus for providing a portable visual display
US5742331A (en) Three-dimensional image display apparatus
JP4871270B2 (en) System and method for operating in a virtual three-dimensional space and system for selecting operations via a visualization system
EP0575332B1 (en) Compact head position tracking device for low cost virtual reality system
JP4251673B2 (en) Image presentation device
US9354718B2 (en) Tightly coupled interactive stereo display
EP0607000B1 (en) Method and apparatus for generating high resolution 3D images in a head tracked stereo display system
US5253832A (en) Spring counterbalanced boom suspension system
JP3862348B2 (en) Motion capture system
CA2524879A1 (en) Windowed immersive environment for virtual reality simulators
WO2006101942A2 (en) Systems and methods for eye-operated three-dimensional object location
WO2019067650A1 (en) Range finding and accessory tracking for head-mounted display systems
JPH07253773A (en) Three dimentional display device
US5855344A (en) Method and apparatus for counterbalancing
WO1997049081A1 (en) Method and apparatus for orientation sensing and directional sound generation
US20040041788A1 (en) Interactive virtual portal
JPH0869449A (en) Simulation device for bodily feeling three-dimensional body
JP2001175883A (en) Virtual reality device
Rebo et al. A helmet-mounted virtual environment display system
JP3697763B2 (en) Image display system
KR20130117627A (en) Simulator system for micro-nano robot using real-time characteristic data

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE HU IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TR TT UA UG UZ VN AM AZ BY KG KZ MD RU TJ TM

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH KE LS MW SD SZ UG ZW AT BE CH DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: JP

Ref document number: 98503113

Format of ref document f/p: F

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: CA