WO2002037466A1 - Electronic user worn interface device - Google Patents

Electronic user worn interface device

Info

Publication number
WO2002037466A1
WO2002037466A1 (application PCT/US2001/048224)
Authority
WO
WIPO (PCT)
Prior art keywords
interface
per
computer
based system
data input
Prior art date
Application number
PCT/US2001/048224
Other languages
French (fr)
Inventor
David Devor
Av Utukuri
Kumar Utukuri
Jonathan Clarke
John J. Gentile
Anthony R. Gentile
Original Assignee
Essential Reality, Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Essential Reality, Inc filed Critical Essential Reality, Inc
Priority to AU2002230814A priority Critical patent/AU2002230814A1/en
Priority to JP2002540133A priority patent/JP2004513443A/en
Priority to EP01991060A priority patent/EP1340218A1/en
Priority to US10/091,558 priority patent/US20020153488A1/en
Publication of WO2002037466A1 publication Critical patent/WO2002037466A1/en

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F3/014 Hand-worn input/output arrangements, e.g. data gloves
    • G06F3/017 Gesture based interaction, e.g. based on a set of recognized hand gestures
    • G06F3/03 Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304 Detection arrangements using opto-electronic means

Definitions

  • circuitry 420, power supply 422, and connectors 424 and USB 426 connection to a
  • the glove circuitry 406 includes an LED array (16) and associated decoders
  • the LEDs are driven to output light signals and the bend sensors
  • the routing of signals is controlled through 434
  • Connector 436 provides I/O of signals to the controller 404.
  • the first step is to determine the angle of entry
  • the next step involves triangulation in order to determine positional information of each LED in space.
  • the final step is to calculate the orientation of
  • Step 1: Calculating angle of entry
  • k is a proportionality factor, and includes detector sensitivity, system gain, etc. It is assumed to be the same for both channels.
  • Step 2: Triangulating position information
  • O1 and O2 represent the two detector heads, located on a common X-axis.
  • Y1 and Y2 are the two Y-axes and Z1, Z2 the two Z-axes.
  • D is the
  • angles from the second detector O2 are θ2 and φ2. Since the detectors are displaced
  • Step 3: Calculating orientation information
  • the first frame is the world co-ordinate frame (Frame 0), which in our geometry can be considered to
  • θ is the pitch and φ is the yaw of the system.
  • R1->2 can be calculated by measuring the physical geometry of the same
  • the actual angles can be calculated by setting the R0->1 matrix to equal the general
  • AEC Architecture, Engineering and Construction
  • ME Mechanical Engineering
  • MCAD Mechanical Computer-aided Design
  • the glove device of the present invention will be described in its simplest iteration.
  • a current computer user interface for existing and future applications.
  • the glove interface device of the present invention is a peripheral device adapted for use with the computer and gaming console market.
  • the device of the invention is designed for multiple uses including but not limited to: gaming, scientific visualization, animation, Computer Aided Design (CAD), virtual reality, industrial design, training and education, and web browsing.
  • CAD Computer Aided Design
  • Such glove device serves as an interface for a personal computer, stand-alone video gaming console or other USB devices, and is provided with a multiplicity of sensors to accurately determine and track the position of the user's body parts, such as hand, wrist and five fingers in space.
  • embodiments of the glove device incorporating enhancements and feature options include a wireless product, the ability to monitor sweat and pulse, the ability to provide and utilize tactile feedback and the ability to interchange the glove fabrics
  • the present invention may be implemented in various computing environments.
  • the present invention may be implemented on a conventional IBM PC or equivalent, multi- nodal system (e.g., LAN) or networking system (e.g., Internet, WWW, wireless

Abstract

A user worn interface (102), such as a hand worn glove, necklace, ankle bracelet, sock, or shirt, includes embedded electronics (104), radiation sources (105), and sensor technologies (110a, 110b). Starting position and movement of the interface (102) are tracked on a computer monitor with electronics & firmware/software supporting X, Y and Z axis positioning. Electro-resistive sensors embedded within a surface of the interface (102) cooperate with an array of LEDs (105) located on a top surface of the interface (102) and a line-of-sight detection system to detect all motion, position, and gestures from specific movement of the user's body (e.g., thumb, finger digits and hand). The user's motions and positions are detected and processed to provide x, y, z and yaw, pitch, and roll data to a computer-based device.

Description

ELECTRONIC USER WORN INTERFACE DEVICE
RELATED APPLICATIONS
The present application claims the benefit of provisional patent application
"Glove Interface Device", serial number 60/276,292, filed March 16, 2001 and a
second provisional of the same title "Glove Interface Device", serial number
60/245,088, filed November 2, 2000. In addition, co-pending application entitled
"Shadow Based Range and Direction Finder" is hereby incorporated by reference.
BACKGROUND OF THE INVENTION
Field of Invention
The present invention relates generally to the field of computer I/O devices.
More specifically, the present invention is related to a multiple axis input device in
the form of a body worn element.
Discussion of Prior Art
In the early 1980s, the Macintosh® computer was the first to use a Graphic
User Interface (GUI) in widely accepted computer software. A GUI is a
graphically based, mouse-centric user interface that makes use of icons, windows, buttons, menus, and dialog boxes which allow the user to select commands and
manage programs, files, and folders by manipulating icons. The Macintosh system also popularized the mouse, which made manipulation of the on-screen environment simpler.
Computer applications, particularly in the gaming and design markets, have
migrated from a two-dimensional visual interface to intuitive, real-world- simulating three-dimensional visual interfaces. Since the advent of the mouse,
however, little has changed in the way of input devices. Input devices have been
dominated by two-dimensional products such as mice, keyboards, joysticks and proprietary console controllers because of prohibitively high prices for three-
dimensional input devices.
What is needed is an inexpensive three-dimensional input device that serves users' needs for two-dimensional and three-dimensional software manipulation by
providing a more intuitive and precise means of entering information - direct input
that emulates the way humans interact with their offline environment.
The interface device of the present invention provides precisely such an input device, serving users' needs for both 2D and 3D software manipulation.
Video games are just one area that will benefit from the enhanced productivity of such an input device. Other areas include, but are not limited to,
music composition and conduction, orchestration of lighting and sound control,
material shaping and manipulation, web surfing and browsing, and the like. The
desire of consumers to literally grab hold of the Internet will become a reality.
SUMMARY OF THE INVENTION
The input system of the present invention generally comprises a user worn interface, such as a hand worn glove, necklace, ankle bracelet, sock, or shirt, with
electronics, radiation sources, and sensor technologies that provide for detecting a user's natural body motions (e.g., hands) and position in three-dimensional space. The user's motions and positions are detected and processed to provide x,y,z and
yaw, pitch, and roll data to a computer-based device.
The interface device, in the preferred glove embodiment, is engineered with up to five-finger bend sensitivity that provides full finger and wrist recognition and
preferably includes web-imbedded software to provide Internet navigation. Up to
five-finger bend sensitivity is achieved through the use of sensors (in the preferred
embodiment 15) embedded in the housing or "fabric" of the interface device. In
addition, a pattern of radiation sources (in the preferred embodiment an array of
16), such as LEDs, provide signals to a remote tracking device, able to track the
position of the radiation sources in three dimensions. The device is lightweight
and comfortable, incorporating customized and specialized materials.
A separate tracking station (working details of sensors provided in co-
pending application "Shadow Based Range and Direction Finder") provides for a plurality of three-dimensional direction detectors used to determine the three- dimensional position of a radiating object (detected radiation sources from gloves'
LEDs). The object's three-dimensional position is provided by a first two-
dimensional direction of the radiating object in a plane defined by the first axis and the vertical that is determined from a ratio of the detected intensities of the incident
radiation on each of a first pair of detectors. In addition, a second two-dimensional direction of the radiating object in a plane defined by the second axis and the
vertical is determined from a ratio of the detected intensities of the incident
radiation on each of the second pair of detectors. As shown in figure 7, the object's (704) distance from either of the three-dimensional direction detectors 700 and 702 is determined using triangulation with the known distance between
detectors; combined with the measured directions, this yields the object's complete three-dimensional position.
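The intensity-ratio step described above can be sketched as follows. This is a minimal illustration under an assumed linear shadow model; the function name and the exact placement of the proportionality factor k are illustrative assumptions, since the precise mapping is specified in the co-pending "Shadow Based Range and Direction Finder" application.

```python
import math

def entry_angle(i_shadowed, i_full, k=1.0):
    """Estimate one 2D entry angle (radians) of incident radiation
    from the intensity ratio of a single axis pair of planar detectors.

    i_shadowed -- signal from the partially shadowed detector of the pair
    i_full     -- signal from the fully illuminated detector (normalizer)
    k          -- proportionality factor lumping detector sensitivity,
                  system gain, etc.; assumed equal for both channels

    Assumes the shadowed fraction varies linearly with tan(angle),
    which is an illustrative simplification.
    """
    ratio = i_shadowed / i_full          # gain and sensitivity cancel
    return math.atan(k * (1.0 - ratio))  # head-on (ratio = 1) -> 0 rad
```

Because only the ratio of the two signals enters the calculation, common factors such as source brightness and detector gain cancel, which is the property the patent relies on.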
BRIEF DESCRIPTION OF THE DRAWINGS
Figure 1 illustrates a system diagram of a preferred embodiment of the
present invention.
Figure 2 illustrates a system diagram of a second embodiment of the present
invention.
Figure 3 illustrates a diagram of pitch, yaw and roll for a hand.
Figure 4 illustrates an electronics diagram of the system used in figure 1.
Figure 5 illustrates the triangulation technique used by the tracking system
of the present invention.
Figure 6 illustrates a schematic of the detector geometry.
Figure 7 illustrates the shadow cast when the detector of figure 6 is spaced apart
from the filter.
Figure 8 illustrates detector head geometry.
Figure 9 illustrates reference frames.
Figure 10 illustrates creating a basis set given an arbitrary set of 3D
positions.
DESCRIPTION OF THE PREFERRED EMBODIMENTS
There is depicted in the drawings, and will herein be described in detail, one or more preferred embodiments of the invention, with the understanding that
the present disclosure is to be considered as an exemplification of the principles of
the invention and the associated functional specifications for its practice and is not intended to limit the invention to the embodiments illustrated. Those skilled in the
art will envision many other possible variations within the scope of the present invention.
A first embodiment of the input device of the present invention generally
comprises a system including a hand worn glove with electronics, light emitters
and detectors, position tracker and associated circuitry, and a controller providing input to a computer based device such as a PC or gaming console. The system
provides for detecting a user's natural hand motions, gestures and positioning in
three-dimensional space. The relative movements of the user's hand motions and
position to previously detected values provide the input to the computer-based device.
The preferred embodiment of the device consists of a hand worn glove with
embedded electronics, radiation sources, and sensor technologies. Starting position and movement of the device will be tracked on a computer monitor with electronics & firmware/software supporting X, Y and Z axis positioning. The
device of the invention is also capable of gesture recognition from specific movement of thumb and finger digits. Positioning and movement recognition of
the glove-like device is achieved through sensors positioned in strategic locations within the device. Such sensors track finger bends and hand movements
within conventional three-dimensional space to include pitch, yaw, and roll.
Figure 1 illustrates an embodiment of the input system 100 of the present
invention. The system comprises glove 102 with embedded electronics 104,
sensors (not shown), sources of radiation 105 (LEDs), tracking head 106 for tracking position of sources of radiation, and computer interface electronics, which, in this embodiment, are housed in a separate control box 108. The electronics in
control box 108 provide the data indicative of motion and position of a user's hand
to a computer (not shown) through cable 109 (or equivalently through wireless
transmission) to act as an input. The glove uses embedded electronics 104 to
communicate specific movement of thumb and finger digits to control box 108. This allows for the use of hand motion and gesture recognition as input. Tracker
head 106 provides the three-dimensional position information, as recognized by the
detection of position of radiation sources 105, to control box 108. The combined
position and motion information allow a user's natural hand movements to be used as input.
Figure 2 illustrates the system of figure 1 with the control box 108 and
tracking unit 106 (including tracking heads 110a and 110b) combined to form a
single integrated station 107. Monitor 114 from a connected PC or other
equivalent computer system displays the data input by user control of glove 102. While the illustration shows a single glove, alternative embodiments including multiple gloves are within the scope of the present invention.
The system tracks finger bends, hand movements, and position, all detected
within conventional three-dimensional space. This enables the measurement of yaw, pitch and roll movements. Yaw, pitch and roll movements are depicted in figure 3. Yaw is defined as movement along a plane parallel to the ground. Yaw
motion is achieved by holding your hand with your palm facing and parallel to the
ground while rotating side to side on this horizontal plane. Pitch is defined as movement along an axis that is parallel to the top of your hand and perpendicular to the wrist, such that it would enter your wrist below your thumb and exit below your little finger. Pitch is accomplished by moving your hand up and down while holding your forearm parallel to the ground. Roll is defined as rotation of your hand about an axis that is parallel to the ground and enters your hand at the tip of your middle finger and runs through your wrist parallel to your forearm. Roll is accomplished by holding your hand flat with your palm facing the ground and turning your arm such that your thumb rises and your little finger falls or vice versa.
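The yaw, pitch and roll conventions above correspond to rotations about the vertical, lateral and forearm axes. A minimal sketch of composing them into a single orientation matrix follows; the axis assignments, rotation order and function names are illustrative choices, not specified by the patent.

```python
import math

def rot_x(a):  # roll: rotation about the forearm (X) axis
    c, s = math.cos(a), math.sin(a)
    return [[1, 0, 0], [0, c, -s], [0, s, c]]

def rot_y(a):  # pitch: rotation about the lateral (Y) axis
    c, s = math.cos(a), math.sin(a)
    return [[c, 0, s], [0, 1, 0], [-s, 0, c]]

def rot_z(a):  # yaw: rotation about the vertical (Z) axis
    c, s = math.cos(a), math.sin(a)
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]

def matmul(a, b):
    """3x3 matrix product."""
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

def hand_orientation(yaw, pitch, roll):
    """Compose one orientation matrix in yaw-pitch-roll order (radians)."""
    return matmul(rot_z(yaw), matmul(rot_y(pitch), rot_x(roll)))
```

Applying the resulting matrix to a vector fixed in the hand frame gives its direction in the world frame, which is the form of orientation data the system reports to the host.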
Recognition of thumb and finger movement is achieved through sensors positioned in strategic locations within glove 102, e.g., two sensors in each finger and an abduction sensor for the thumb. The sensing of finger bend or glove fabric stretch from finger movement will preferably occur via strip sensors or conductive inks with variable electrical resistance (such as, but not limited to, strain gauges, carbon-based inks on a flexible substrate). The position changes of hand digits are detected above a baseline bias voltage & resistance. Other types of sensors which sense finger positions (both knuckles per finger preferred) can be substituted without departing from the scope of the present invention.
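A sketch of how one such resistive sensor reading might be converted into a bend value measured above the baseline bias follows. All calibration constants here are hypothetical, since the patent specifies only that position changes are detected above a baseline bias voltage and resistance.

```python
def bend_fraction(adc_counts, baseline_counts, full_bend_counts):
    """Map a raw ADC reading from one resistive bend sensor to a 0..1
    bend value above the calibrated baseline (flat finger).

    baseline_counts and full_bend_counts are hypothetical per-sensor
    calibration values captured with the finger flat and fully bent.
    """
    span = full_bend_counts - baseline_counts
    frac = (adc_counts - baseline_counts) / span
    return min(1.0, max(0.0, frac))   # clamp to the valid range
```

In a glove with two sensors per finger plus a thumb abduction sensor, one such conversion would run per channel each sampling cycle, with the clamped values forming the finger-position portion of the input report.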
Position of the user's hand is tracked by tracking head 106 using electronics and firmware/software supporting X, Y and Z axis positioning in conjunction with LEDs mounted on glove 102. Preferably, 10-16 infrared LEDs 105 are mounted in three to four groups of four LEDs. However, variations of number and patterns of LEDs are within the scope of the present invention. LEDs radiating in other ranges of the electromagnetic spectrum, including visible light, are also considered within the scope of the present invention. Each group of LEDs is preferably placed in a different area around the hand to ensure at least one set of 3 LEDs has line-of-sight with tracking head 106, regardless of hand orientation. Within each group,
LEDs are preferably arranged in a pattern so as to provide a non-coplanar configuration, with the ideal configuration being a triangular-based pyramid
configuration. These LEDs make glove 102 a point source of radiation that is
tracked by tracking head 106.
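The non-coplanarity of a four-LED group can be checked with a scalar triple product of the edge vectors; a small sketch (the function name and tolerance are illustrative):

```python
def non_coplanar(p0, p1, p2, p3, tol=1e-9):
    """True if four LED positions (x, y, z tuples) do not lie in a
    single plane, tested via the scalar triple product u . (v x w)
    of the edges from p0. A triangular-based pyramid (tetrahedron)
    yields a nonzero value; any flat arrangement yields zero.
    """
    def sub(a, b):
        return (a[0] - b[0], a[1] - b[1], a[2] - b[2])
    u, v, w = sub(p1, p0), sub(p2, p0), sub(p3, p0)
    det = (u[0] * (v[1] * w[2] - v[2] * w[1])
           - u[1] * (v[0] * w[2] - v[2] * w[0])
           + u[2] * (v[0] * w[1] - v[1] * w[0]))
    return abs(det) > tol
```

The magnitude of the triple product is six times the tetrahedron's volume, so the same test could rank candidate LED layouts: larger values indicate groups further from the degenerate coplanar case.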
The complete three-dimensional position of glove 102 is determined using
tracking head 106. Tracker head 106 employs at least two three-dimensional
direction detectors 110a and 110b placed a known distance away from each other
for determining the glove's three-dimensional position. This is conceptually
illustrated in figure 5. A three-dimensional direction detector 500 is placed at a
point A. A second similar detector 502 is placed at a point B. Points A and B are
separated by a known distance, d, along what is normally called a baseline. Detector 500 measures the three-dimensional direction of a radiating object 504
from point A. Likewise, detector 502 measures the three-dimensional direction of
radiating object 504 from point B. Once both the directions are determined, triangulation is used to calculate the distance of the object from either point A or B.
As can be seen, this arrangement allows the complete three-dimensional position of radiating object 504 to be measured as the distance and the three-dimensional
direction of the radiating object from either detector.
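The triangulation step above can be sketched as follows, with detector A at the origin and detector B a baseline d away along the X-axis. Taking the midpoint of closest approach of the two measured rays is an assumption made here for robustness to small direction errors; the patent states only that triangulation over the known baseline is used.

```python
import math

def _dot(u, v):
    return u[0] * v[0] + u[1] * v[1] + u[2] * v[2]

def _normalize(u):
    n = math.sqrt(_dot(u, u))
    return (u[0] / n, u[1] / n, u[2] / n)

def triangulate(d, dir_a, dir_b):
    """Estimate the 3D position of a radiating object from the direction
    vectors measured by detector A at (0, 0, 0) and detector B at (d, 0, 0).

    Returns the midpoint of the closest approach of the two rays.
    """
    u, v = _normalize(dir_a), _normalize(dir_b)
    w0 = (-d, 0.0, 0.0)            # vector from B to A
    b = _dot(u, v)
    du, e = _dot(u, w0), _dot(v, w0)
    denom = 1.0 - b * b            # nonzero unless the rays are parallel
    t = (b * e - du) / denom       # range along the ray from A
    s = (e - b * du) / denom       # range along the ray from B
    p = (t * u[0], t * u[1], t * u[2])
    q = (d + s * v[0], s * v[1], s * v[2])
    return tuple((pi + qi) / 2.0 for pi, qi in zip(p, q))
```

With noiseless directions the two rays intersect exactly and the midpoint coincides with the object's position; with real measurements it degrades gracefully instead of failing.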
These three-dimensional direction detectors have a number of advantages:
• The incident radiation is not restricted by an aperture and one of the planar detectors is always fully illuminated. On a relative scale, the individual signals from each planar detector "range" from 1 (at head on) to zero, and the total signal resulting from the four planar detectors ranges from 4 (at head on) to 2. This is a significant improvement from the Marquet method where the signal ranges
[equation shown as an image in the original]
• A large fraction of the detector area is always illuminated providing
much larger usable ranges than previous systems.
• One detector for each direction is fully illuminated at all angles and its signal provides normalization.
• The detectors are arranged on the same plane, which minimizes
mounting difficulties and expenses. In addition, there is no need for accurately aligned angular positioning of the detectors.
• Any detector element including inexpensive solar cells could be utilized.
• Expensive optical elements, such as mirrors or lenses, are not needed.
• The planar detectors do not have to be arranged in very close proximity to each other as is required with a central aperture.
Therefore, inexpensive, single planar detectors can be used rather
than expensive, segmented planar detectors.
• Light strikes both detectors of a single axis pair at the same angle. Thus, any deviation from Lambertian behavior affects both of them
in the same way. Because of this, and the fact that only intensity ratios are used for the calculations, the accuracy of the measurement is not altered by the incident angle.
In an additional embodiment, the radiation from the source is modulated
and detection is done at the modulation frequency only. This provides for discrimination from ambient radiation. In addition, this technique provides the
ability to measure position information for several sources. The signals from
different sources are made distinguishable using any modulation technique such as FDM, TDM, PCM, etc. Through suitable decoding at the detector end the
positions of all the sources are then found at the same time. Thus, the system can
be easily extended for simultaneous detection of several radiating sources, the
number of sources being limited only by data processing speeds and capacities.
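As one example of the modulation schemes named above, a round-robin TDM drive makes the per-source signals separable by simple de-interleaving at the detector end. This is a sketch under the assumption of negligible ambient light; a real system would also subtract an ambient reading, as the modulated-detection scheme implies.

```python
def demodulate_tdm(samples, n_sources):
    """Separate detector samples taken while n_sources LEDs are pulsed
    one at a time in a repeating round-robin (time-division) schedule.

    samples[i] is assumed captured while source (i % n_sources) was lit,
    so each source's signal is recovered by de-interleaving the stream.
    Returns one list of samples per source.
    """
    return [samples[k::n_sources] for k in range(n_sources)]
```

Swapping this decoder for an FDM or PCM one changes only this stage; the downstream direction and triangulation calculations then run once per recovered source, which is how the system extends to several simultaneous radiating sources.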
Each group of LEDs is preferably placed in a different area around the hand, in order to ensure line of sight with the optical tracker head that is preferably
located at the back corner of a mounting bar which may have adjustable length. This bar will be attached to a "docking station" which can be placed on the user's desk or tabletop. The "docking station" may contain a provision to assist in
installation & removal of the glove-like device from the user's hand. The "docking
station" may be used as a storage location for the glove-like device. Within the "docking station" there will be housing for various electronic components.
The control box 108 (FIG. 1) houses the USB controller and preferably has one USB cable (host PC connection) and two cables that attach to the glove and the optical tracker head. The control box comprises a USB controller, an oscillator, filter capacitors and an analog multiplexer chip. A representative USB-based microcontroller that meets the needs of the present system might comprise a 6 MIPS RISC core with built-in 8-channel, 8-bit analog-to-digital converters. The USB controller preferably has enough code space for 8K instructions and is an in-system programmable chip. Such controller also preferably incorporates FLASH memory, which allows new program updates, bug fixes, etc. to be downloaded directly into the USB controller from the host computer.
The interface device is preferably a USB product to allow for direct "plug
and play" in PCs and gaming consoles. A block diagram of the interface device is shown in FIG. 4. PCs on the market today are fully USB-ready, as are the newest gaming consoles by Sony, Sega, Nintendo and Microsoft. The device of the
invention is designed as a natural interface for the personal computer, stand-alone
video gaming consoles and all other USB-compatible 3D software platforms, and
acts as a peripheral to a PC, Mac/Apple and gaming consoles. Its operational performance is ultimately dependent on the motherboard, RAM, graphics card and monitor capability of the user's computer, with which the device of the invention communicates via a USB port.
As shown, the system comprises the tracker circuitry 402, control box
circuitry 404, and glove circuitry 406. The tracker circuitry includes a main
section 412 including an optical tracking multiplexing board, differential amplifiers, DiffLV, DiffRV, DiffLH, DiffRH, SumL, SumR, DiffB, and power
distribution, left and right tracking detector heads 408, output amps for the heads
410, associated signal transferring MUX 414, and connectors for cables 416.
The controller box 404 includes a micro controller and supporting components such as flash memory, crystal for clock timing, various analog control
circuitry 420, power supply 422, and connectors 424 and USB 426 connection to a
computer or electronic based device such as a PC or gaming console.
The glove circuitry 406 includes an LED array (16) and associated decoders and drivers 429, analog bend sensors 430 (8) and 432 (7), and associated interface MUXs 431. The LEDs are driven to output light signals, and the bend sensors return signals of movement (typically based on a strain-induced change of resistance in the electro-resistive strip sensors). The routing of signals is controlled through 434 and may be hardwired, programmable or preprogrammed by conventional means. Connector 436 provides I/O of signals to the controller 404.
Tracker Algorithm
An algorithm to determine angles and positions from the LEDs can be broken
down into three major steps. The first step is to determine the angle of entry
(horizontal and vertical) of the light from an individual LED into the detector heads. The next step involves triangulation in order to determine positional information of each LED in space. The final step is to calculate the orientation of
the entire hand, given the positional information of all the LEDs in space.
Step 1 - Calculating angle of entry
Referring to Figure 6: Height of wall, OA = h
Length of detector, BD = L
Length of shadow, OC = x
Dist from foot of the wall to edge of detector, OB = d
Width of detector = w
tan θ = x/h
OC = x = h tan θ
Shadowed portion of the detector, BC = OC − OB = h tan θ − d
So, illuminated portion, CD = BD − BC = L − (h tan θ − d) = L − h tan θ + d
Signal from det #2, I2 = kLw
Signal from det #1, I1 = kw(L − h tan θ + d)
where k is a proportionality factor that includes detector sensitivity, system gain, etc. It is assumed to be the same for both channels.
So the ratio, R = (I2 − I1)/I2 = [kLw − kw(L − h tan θ + d)]/kLw
= (h tan θ − d)/L
So, tan θ = (LR + d)/h
θ = arc tan [(LR + d)/h]
This assumes that the detector is right at the surface of the package. Refraction in the filter requires a correction. Looking at the figure, we see that the shadow is lengthened when there is a filter. If the filter is bonded to the detector, then the extra shadow length EF is calculated as:
EF = t tan φ
Snell's law gives sin θ = n sin φ. Taking sin θ ≈ tan θ and sin φ ≈ tan φ, we have
EF = t tan θ/n
So, the illuminated area = w[L − (h tan θ + t tan θ/n − d)]
= w[L − tan θ (h + t/n) + d]
i.e., h is replaced by (h + t/n).
So, the solution becomes:
θ = arc tan [(LR + d)/(h + t/n)]
For values of h = 4 mm, t = 0.5 mm and n = 1.5, the correction amounts to about 0.3 in 4.0.
Referring to Figure 7, which shows the shadow when the detector is spaced a distance s from the filter: if the filter is not bonded to the detector and there is an air/vacuum spacing s between the two, the formula becomes:
θ = arc tan [(LR + d)/(h + t/n + s)]
Assuming L = w = 3 mm, the cathode edge of the detector is at (2.15 − 1.5) = 0.65 mm from the package edge, so d is quite close to 0.65 mm.
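Collecting the derivation above into a single routine gives the following sketch of the angle-of-entry calculation, including the refraction correction. The function name and default parameters (taken from the numeric values above) are illustrative assumptions; k is assumed equal for both channels, so it cancels in the ratio R.

```python
import math

def angle_of_entry(i1, i2, L=3.0, d=0.65, h=4.0, t=0.5, n=1.5, s=0.0):
    """Recover the entry angle from a shadowed/unshadowed detector pair.

    i1 -- signal from the partially shadowed detector (#1)
    i2 -- signal from the fully illuminated detector (#2)
    L  -- detector length (mm);    d -- wall-to-detector distance (mm)
    h  -- wall height (mm);        t -- filter thickness (mm)
    n  -- filter refractive index; s -- filter/detector air gap (mm)
    """
    r = (i2 - i1) / i2       # intensity ratio R = (I2 - I1) / I2
    h_eff = h + t / n + s    # refraction correction: h -> h + t/n + s
    return math.atan((L * r + d) / h_eff)
```

Only the ratio of the two signals enters, so detector sensitivity and system gain drop out, as noted above.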
Step 2 - Triangulating position information
Referring to Figure 8:
Once the horizontal and vertical angle of entries to both detector heads has been
determined, we can triangulate the x,y,z position of the LED.
From the diagram above, O1 and O2 represent the two detector heads, located on a common X-axis. Y1 and Y2 are the two Y-axes and Z1, Z2 the two Z-axes. D is the LED position.
Its projection in the X1Z1 plane is O1C, subtending an angle θ1 with Z1. The projection in the Y1Z1 plane is O1F, subtending an angle φ1. The corresponding angles from the second detector O2 are θ2 and φ2. Since the detectors are displaced only along the X axis, φ1 = φ2, Y1 = Y2 = Y and Z1 = Z2 = Z.
• Caution: These are the angles measured by the X and Y sets of detectors in a direction finder, and are different from the angles θ and φ that are normally defined and used for calculating direction cosines.
Z1 = Z2 = O1A = O2B = Z, say, common to both
X1 = O1K = AC = Z tan θ1
X2 = Z tan θ2
X1 = X2 + d
Therefore, Z tan θ1 = Z tan θ2 + d
Z (tan θ1 − tan θ2) = d
Z = d/(tan θ1 − tan θ2)
So, X1 = d/(tan θ1 − tan θ2) · tan θ1 and
X2 = d/(tan θ1 − tan θ2) · tan θ2
Y1 = O1G = AF = Z tan φ1 = Y2 = O2H = BE = Z tan φ2
Y1 = Y2 = Y = d/(tan θ1 − tan θ2) · tan φ2
The formulas are exact and provide the correct sign, + or −.
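The triangulation formulas above can be collected into a short sketch (the function name and argument order are assumptions; angles are in radians). As the text notes, tan θ1, tan θ2 and tan φ can equally be taken directly from the Step 1 ratios rather than recomputed from the angles.

```python
import math

def triangulate(theta1, theta2, phi, d):
    """Position of one LED from the entry angles at the two heads.

    theta1, theta2 -- horizontal entry angles at heads O1 and O2
    phi            -- vertical entry angle (phi1 == phi2)
    d              -- separation of the heads along the common X axis
    Returns (x1, y, z) in the first head's frame; x2 = x1 - d.
    """
    z = d / (math.tan(theta1) - math.tan(theta2))
    x1 = z * math.tan(theta1)
    y = z * math.tan(phi)
    return x1, y, z
```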
For the above formulas, it is not necessary even to calculate the tangent of the angle, as this value can be obtained directly from the detector head calculation performed in the previous step.
Step 3 - Calculating orientation information
Referring to Figure 9:
At this stage, we have the x, y, z position information of all visible LEDs in the
camera frame of reference.
There are three main frames of reference that must be described. The first frame is the world co-ordinate frame (Frame 0), which in our geometry can be considered to
be the camera frame of reference. Then, there is another frame of reference, which
is the hand's or wrist frame of reference, in which all the LEDs are placed (Frame
1). Finally, we can also create another frame of reference using a specific group of 3 LEDs (Frame 2). It should be noted that Frame 2 is constant with respect to Frame 1, and Frame 1 is moving with respect to Frame 0 as the user moves the
system in space.
The following is the general rotation matrix R = Rz,φ · Ry,Θ · Rx,ψ, where φ is the roll, Θ is the pitch and ψ is the yaw of the system:
| cosφcosΘ    cosφsinΘsinψ − sinφcosψ    cosφsinΘcosψ + sinφsinψ |
| sinφcosΘ    sinφsinΘsinψ + cosφcosψ    sinφsinΘcosψ − cosφsinψ |
| −sinΘ       cosΘsinψ                   cosΘcosψ                |
Given this general matrix, the columns of the matrix form the basis vectors for the
transformed space. Inversely, given a set of basis vectors (i, j, k) for a general
space, one can easily create a rotation matrix, which transforms points from the
absolute world frame to the relative world frame.
For the problem, we need to obtain the transformation matrix R0->1.
R0->2 = R0->1 R1->2
R0->2 [R1->2]^t = R0->1 R1->2 [R1->2]^t
R0->2 [R1->2]^t = R0->1 (the inverse of a rotation is its transpose)
Hence, R0->1 = R0->2 [R1->2]^t
From this equation, we can calculate the transformation matrix that takes points from the camera frame to the wrist frame (frame 0 to frame 1). To obtain R0->2, we just create a basis set from the LEDs visible to the camera and create the rotation matrix. R1->2 can be calculated by measuring the physical geometry of the same LEDs with respect to an origin on the hand, in the same way.
In order to create a basis set given an arbitrary set of 3D positions, the following algorithm can be used (referring to Figure 10). By creating two vectors from the known points (A, B), we can create a set of basis vectors: A, A×B, A×(A×B). After normalizing these vectors as the i, j and k column vectors, we obtain a rotation matrix R0->2. By using the exact same points and the same calculations, we can also create the matrix [R1->2]^t. Using these two matrices, we can now calculate R0->1.
The actual angles can be calculated by setting the R0->1 matrix equal to the general yaw/pitch/roll matrix described above, and solving for the angles:
Θ = arc sin(−R0->1[3,1])
ψ = arc tan(R0->1[3,2] / R0->1[3,3])
φ = arc tan(R0->1[2,1] / R0->1[1,1])
These are the angles that we need. This will work in the first quadrant; the other quadrants can be found using a combination of the signs of the matrix elements.
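The three pieces of this step (building a basis from two LED difference vectors, composing R0->1 = R0->2 · [R1->2]^t, and solving for the angles) can be sketched as follows. This is an illustrative reconstruction, not the patented code; the helper names are assumptions, and the angle extraction follows the document's convention in which φ (roll) is the z-rotation and ψ (yaw) the x-rotation.

```python
import math

def _cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def _normalize(v):
    m = math.sqrt(sum(c * c for c in v))
    return tuple(c / m for c in v)

def basis_matrix(a, b):
    """Rotation matrix whose columns are the normalized basis vectors
    A, AxB, Ax(AxB) built from two LED difference vectors."""
    i = _normalize(a)
    j = _normalize(_cross(a, b))
    k = _normalize(_cross(a, _cross(a, b)))
    return [[i[0], j[0], k[0]],
            [i[1], j[1], k[1]],
            [i[2], j[2], k[2]]]

def compose(r02, r12):
    """R0->1 = R0->2 * [R1->2]^t (a rotation's inverse is its transpose)."""
    return [[sum(r02[r][m] * r12[c][m] for m in range(3))
             for c in range(3)] for r in range(3)]

def yaw_pitch_roll(r):
    """Solve the general matrix for its angles (first quadrant).
    Document convention: phi (roll) about z, theta (pitch) about y,
    psi (yaw) about x."""
    pitch = -math.asin(r[2][0])          # -sin(theta) = R[3,1]
    yaw = math.atan2(r[2][1], r[2][2])   # psi = arc tan(R[3,2]/R[3,3])
    roll = math.atan2(r[1][0], r[0][0])  # phi = arc tan(R[2,1]/R[1,1])
    return yaw, pitch, roll
```

Using atan2 rather than a plain arc tangent already resolves the quadrant ambiguity mentioned above from the signs of the matrix elements.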
Professionals in a wide range of computer-aided design (CAD) applications
use 3D visualization tools. Architects, game designers, product designers,
mechanical designers, construction engineers and cartoon movie makers all use PC-based design software to create, manipulate and animate in 3D. This market's
needs are distinct from those of the gaming market in that the users seek to create
3D objects (rather than simply manipulate them), which traditionally requires a time-consuming, multi-action process.
Currently, all software is designed for four primary input devices: the keyboard, mouse, light pen and digitizing tablet. While the tasks enabled by certain devices
(namely drawing with digitizing tablets) are useful, the software in general has proven consistently frustrating to its user base, primarily because it is so difficult to
learn.
Applications for this market can be distilled into four segments based upon
the software used and the functionality requirements of the group. These segments
are the Architecture, Engineering and Construction (AEC) market, the Mechanical Engineering (ME, or MCAD for Mechanical Computer-aided Design) market, the
Game Developer market and the Film/Video market.
In its simplest iteration, the glove device of the present invention will
replicate the existing mouse or controller commands built into current applications. However, with awareness of the extended capabilities of the device of the invention, a developer will be able to create entirely new commands and user
interfaces for existing and future applications. The current computer user interface will evolve into a 3D interface as programming languages and the bandwidth enabling larger file sizes arrive.
The primary users are Internet browsing households. Research shows that
all members of the family in PC-owning households are using the PC, and for different reasons. Online, they are all using the Internet, and again for different
reasons. The advent of broadband will be the primary incentive for 3D designers to
produce content for the average Internet user. This will enable web developers to incorporate real-looking objects into e-Commerce sites, create desktops and browsers which look like real rooms and require manipulation through a 3D space, and ultimately change the entire Internet experience for the PC user. The glove, or other body worn device, of the present invention will enable all users to work intuitively within this environment.
The glove interface device of the present invention is a peripheral device adapted for use with the computer and gaming console market. The device of the invention is designed for multiple uses including but not limited to: gaming, scientific visualization, animation, Computer Aided Design (CAD), virtual reality, industrial design, training and education, and web browsing. Such glove device serves as an interface for a personal computer, stand-alone video gaming console or other USB devices, and is provided with a multiplicity of sensors to accurately determine and track the position of the user's body parts, such as hand, wrist and five fingers in space.
CONCLUSION
A system and method has been shown in the above embodiments for the effective implementation of a user worn interface device. While various preferred embodiments have been shown and described, it will be understood that there is no intent to limit the invention by such disclosure, but rather, it is intended to cover all modifications and alternate constructions falling within the spirit and scope of the invention, as defined in the appended claims. For example, the present invention should not be limited by software/program, computing environment, specific computing hardware, or specific LED or sensor numbers, patterns or placement.
In addition, as 2-way tactile feedback technology progresses, opportunities will result: remote robotic operation, "Braille" websites for the blind, online shopping where you can feel the fabric, education on a whole new level, surgery conducted with doctor and patient located on opposite coasts. Alternative
embodiments of the glove device incorporating enhancements and feature options include a wireless product, the ability to monitor sweat and pulse, the ability to provide and utilize tactile feedback and the ability to interchange the glove fabrics
through the use of removable electronics.
The above enhancements and their described functional elements are
implemented in various computing environments. For example, the present invention may be implemented on a conventional IBM PC or equivalent, multi- nodal system (e.g., LAN) or networking system (e.g., Internet, WWW, wireless
web). All programming and data related thereto are stored in computer memory,
static or dynamic, and may be retrieved by the user in any of: conventional computer storage, display (e.g., CRT) and/or hardcopy (e.g., printed) formats. The programming of the present invention may be implemented by one of skill in the
art of GUI and I/O device programming.

Claims

What is claimed is:
1. A computer interface providing data input to a computer based system, said computer interface comprising:
an electronic user worn device, said device operatively connected to said computer based system and comprising:
a plurality of embedded electro-resistive sensors;
an array of radiation emitters, said emitters located on an exterior surface of
said device; a position tracker device operative with said array of radiation emitters to
track position of said emitters,
a processing controller, said controller operatively connected to outputs
from said embedded electro-resistive sensors and an output of said position tracker, and
wherein said processing controller receives data from said sensors and said
position tracker and transfers a computer interface input representing
motion and position of said device to said computer interface.
2. A computer interface as per claim 1, wherein said electronic user worn
device comprises any of: a flexible glove, necklace, ankle bracelet, sock, or shirt.
3. A computer interface as per claim 1, wherein said electro-resistive sensors comprise ink deposited on a flexible substrate.
4. A computer interface as per claim 1, wherein said radiation emitters
comprise LEDs.
5. A computer interface as per claim 1, wherein said computer based device includes any of: a personal computer, stand-alone video gaming console, or USB
devices.
6. A computer interface as per claim 1, wherein said interface accurately
tracks the position of any, or a combination of: a user's hand, wrist, individual
fingers, legs, feet, head, torso.
7. A computer interface as per claim 1, wherein said interface includes web-
imbedded code to assist in Internet navigation.
8. A computer interface as per claim 1, wherein said operative connection to said computer based device comprises any of: standard cable connection, USB, or wireless connection.
9. A computer interface as per claim 8, wherein said USB connection provides
plug and play capability.
10. A computer interface as per claim 2, wherein said interface recognizes gestures performed by detection of coordinated movements of said flexible glove.
11. A computer interface as per claim 2, wherein said interface measures yaw,
pitch and roll movements of said flexible glove.
12. A computer interface as per claim 1, wherein said array of radiation emitters provide inputs to horizontal and vertical detector clusters mounted on said position tracker device.
13. A computer interface as per claim 1, wherein said computer interface
further comprises a docking station for said device.
14. A computer interface as per claim 1, wherein said interface tracks six
degrees of motion, including x, y, z, yaw, pitch, and roll.
15. A computer interface as per claim 1, wherein said interface is applied to any
of: music composition and conduction, orchestration of lighting and sound control,
material shaping and manipulation, and web surfing and browsing.
16. A method to provide positional input to a computer based system using an electronic interface, said interface comprising: a user wearable device, a plurality of embedded sensors, an array of embedded radiation emitters, a position tracker
device operative with said array of radiation emitters, and a processing controller, said method comprising: detecting movement signals from said sensors, said signals
indicating specific user body specific movements;
detecting at said position tracker device position signals from said
array of radiation emitters;
processing said movement and position signals to output a data input signal to said computer based system, and wherein said data input signal representing motion and position of said user
wearable device is provided to an application interface of said computer
based system.
17. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said step of detecting movement signals from said sensors further comprises detecting a change of resistance
produced in said sensors.
18. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said sensors comprise ink deposited
on a flexible substrate.
19. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said array of radiation emitters
comprise LEDs.
20. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said computer based system comprises any of: a personal computer, stand-alone video gaming console, or USB device.
21. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said method accurately tracks the position of any, or a combination of: a user's hand, wrist, individual fingers, legs, feet, head, torso.
22. A method to provide data input to a computer based system using an electronic interface as per claim 16, wherein said method assists a user in Internet
navigation.
23. A method to provide data input to a computer based system using an electronic interface as per claim 16, wherein a USB connection provides plug and
play capability.
24. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said method recognizes gestures performed by detection of coordinated movements of said interface.
25. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said interface tracks six degrees of
motion, including x, y, z, yaw, pitch, and roll.
26. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said interface is applied to any of: music composition and conduction, orchestration of lighting and sound control,
material shaping and manipulation, and web surfing and browsing.
27. A method to provide data input to a computer based system using an
electronic interface as per claim 16, wherein said user wearable device comprises any of: a flexible glove, necklace, ankle bracelet, sock, or shirt.
PCT/US2001/048224 2000-11-02 2001-10-30 Electronic user worn interface device WO2002037466A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
AU2002230814A AU2002230814A1 (en) 2000-11-02 2001-10-30 Electronic user worn interface device
JP2002540133A JP2004513443A (en) 2000-11-02 2001-10-30 Electronic user mounting interface device and method using the same
EP01991060A EP1340218A1 (en) 2000-11-02 2001-10-30 Electronic user worn interface device
US10/091,558 US20020153488A1 (en) 2001-03-08 2002-03-07 Shadow based range and direction finder

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US24508800P 2000-11-02 2000-11-02
US60/245,088 2000-11-02
US27629201P 2001-03-16 2001-03-16
US60/276,292 2001-03-16

Publications (1)

Publication Number Publication Date
WO2002037466A1 true WO2002037466A1 (en) 2002-05-10

Family

ID=26936984

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2001/048224 WO2002037466A1 (en) 2000-11-02 2001-10-30 Electronic user worn interface device

Country Status (5)

Country Link
EP (1) EP1340218A1 (en)
JP (1) JP2004513443A (en)
AU (1) AU2002230814A1 (en)
TW (1) TW543028B (en)
WO (1) WO2002037466A1 (en)

Cited By (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005043087A1 (en) * 2003-11-03 2005-05-12 Intelligent Devices Inc. Method of producing medium-to thin-film pressure and humidity sensors by flexographic printing
WO2005064299A3 (en) * 2003-12-26 2006-04-06 Toyota Motor Co Ltd Device and method for sensing and displaying convexo concave
WO2008042219A2 (en) * 2006-09-29 2008-04-10 Nellcor Puritan Bennett Llc User interface and identification in a medical device systems and methods
GB2458583A (en) * 2005-01-18 2009-09-30 Rallypoint Inc Wearable article sensing a force and a direction associated the force
WO2011045786A2 (en) 2009-10-13 2011-04-21 Rami Parham Wearable device for generating input for computerized systems
WO2011104709A2 (en) 2010-02-23 2011-09-01 Rami Parham A system for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
EP2418562A1 (en) * 2010-08-13 2012-02-15 Deutsches Primatenzentrum GmbH Modelling of hand and arm position and orientation
US8467071B2 (en) 2010-04-21 2013-06-18 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US8467072B2 (en) 2011-02-14 2013-06-18 Faro Technologies, Inc. Target apparatus and method of making a measurement with the target apparatus
WO2013101542A1 (en) * 2011-12-30 2013-07-04 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US8537375B2 (en) 2010-04-21 2013-09-17 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US8724119B2 (en) 2010-04-21 2014-05-13 Faro Technologies, Inc. Method for using a handheld appliance to select, lock onto, and track a retroreflector with a laser tracker
WO2014142807A1 (en) * 2013-03-12 2014-09-18 Intel Corporation Menu system and interactions with an electronic device
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9164173B2 (en) 2011-04-15 2015-10-20 Faro Technologies, Inc. Laser tracker that uses a fiber-optic coupler and an achromatic launch to align and collimate two wavelengths of light
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
WO2016134295A1 (en) * 2015-02-20 2016-08-25 Sony Coumputer Entertainment Inc. Magnetic tracking of glove fingertips
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US9638507B2 (en) 2012-01-27 2017-05-02 Faro Technologies, Inc. Measurement machine utilizing a barcode to identify an inspection plan for an object
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
CN107209582A (en) * 2014-12-16 2017-09-26 肖泉 The method and apparatus of high intuitive man-machine interface
CN107533369A (en) * 2015-02-20 2018-01-02 索尼互动娱乐股份有限公司 The magnetic tracking of glove fingertip with peripheral unit
US9880619B2 (en) 2010-02-23 2018-01-30 Muy Interactive Ltd. Virtual reality system with a finger-wearable control
DE102005011432B4 (en) 2005-03-12 2019-03-21 Volkswagen Ag Data glove
US10296085B2 (en) 2014-03-05 2019-05-21 Markantus Ag Relatively simple and inexpensive finger operated control device including piezoelectric sensors for gesture input, and method thereof
US10302413B2 (en) 2011-04-15 2019-05-28 Faro Technologies, Inc. Six degree-of-freedom laser tracker that cooperates with a remote sensor
CN112632407B (en) * 2020-12-18 2022-10-14 湖南科技大学 Spatial sampling method considering geographic environment heterogeneity
US11696704B1 (en) 2020-08-31 2023-07-11 Barron Associates, Inc. System, device and method for tracking the human hand for upper extremity therapy

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
TWI412392B (en) * 2005-08-12 2013-10-21 Koninkl Philips Electronics Nv Interactive entertainment system and method of operation thereof
KR102343657B1 (en) * 2014-08-28 2021-12-24 삼성전자주식회사 Application processor for processing user input corresponding to movements of wrist muscles and devices including same
JP6447150B2 (en) * 2015-01-14 2019-01-09 東洋紡株式会社 Glove-type input device
CN104732983B (en) * 2015-03-11 2018-03-16 浙江大学 A kind of interactive music method for visualizing and device

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4988981A (en) * 1987-03-17 1991-01-29 Vpl Research, Inc. Computer data entry and manipulation apparatus and method
US6049327A (en) * 1997-04-23 2000-04-11 Modern Cartoons, Ltd System for data management based onhand gestures
US6069594A (en) * 1991-07-29 2000-05-30 Logitech, Inc. Computer input device with multiple switches using single line


Cited By (67)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005043087A1 (en) * 2003-11-03 2005-05-12 Intelligent Devices Inc. Method of producing medium-to thin-film pressure and humidity sensors by flexographic printing
WO2005064299A3 (en) * 2003-12-26 2006-04-06 Toyota Motor Co Ltd Device and method for sensing and displaying convexo concave
US7685886B2 (en) 2003-12-26 2010-03-30 Toyota Jidosha Kabushiki Kaisha Convexo concave amplifying device and convexo concave detecting method by use thereof, deformation sensing device and convexo concave detecting method by use thereof, and convexo concave position exhibiting device and convexo concave position exhibiting method
GB2458583A (en) * 2005-01-18 2009-09-30 Rallypoint Inc Wearable article sensing a force and a direction associated the force
GB2458583B (en) * 2005-01-18 2009-12-09 Rallypoint Inc Sensing input actions
DE102005011432B4 (en) 2005-03-12 2019-03-21 Volkswagen Ag Data glove
WO2008042219A2 (en) * 2006-09-29 2008-04-10 Nellcor Puritan Bennett Llc User interface and identification in a medical device systems and methods
WO2008042219A3 (en) * 2006-09-29 2008-08-07 Nellcor Puritan Bennett Llc User interface and identification in a medical device systems and methods
US9482755B2 (en) 2008-11-17 2016-11-01 Faro Technologies, Inc. Measurement system having air temperature compensation between a target and a laser tracker
US9453913B2 (en) 2008-11-17 2016-09-27 Faro Technologies, Inc. Target apparatus for three-dimensional measurement system
WO2011045786A2 (en) 2009-10-13 2011-04-21 Rami Parham Wearable device for generating input for computerized systems
WO2011104709A2 (en) 2010-02-23 2011-09-01 Rami Parham A system for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9535516B2 (en) 2010-02-23 2017-01-03 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9880619B2 (en) 2010-02-23 2018-01-30 Muy Interactive Ltd. Virtual reality system with a finger-wearable control
US10528154B2 (en) 2010-02-23 2020-01-07 Touchjet Israel Ltd System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9329716B2 (en) 2010-02-23 2016-05-03 Muv Interactive Ltd. System for projecting content to a display surface having user-controlled size, shape and location/direction and apparatus and methods useful in conjunction therewith
US9377885B2 (en) 2010-04-21 2016-06-28 Faro Technologies, Inc. Method and apparatus for locking onto a retroreflector with a laser tracker
US9007601B2 (en) 2010-04-21 2015-04-14 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US8654354B2 (en) 2010-04-21 2014-02-18 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US8654355B2 (en) 2010-04-21 2014-02-18 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US8724119B2 (en) 2010-04-21 2014-05-13 Faro Technologies, Inc. Method for using a handheld appliance to select, lock onto, and track a retroreflector with a laser tracker
US8724120B2 (en) 2010-04-21 2014-05-13 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US8467071B2 (en) 2010-04-21 2013-06-18 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US10480929B2 (en) 2010-04-21 2019-11-19 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US8537375B2 (en) 2010-04-21 2013-09-17 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US10209059B2 (en) 2010-04-21 2019-02-19 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US8896848B2 (en) 2010-04-21 2014-11-25 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US8537371B2 (en) 2010-04-21 2013-09-17 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
US9885559B2 (en) 2010-04-21 2018-02-06 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US9146094B2 (en) 2010-04-21 2015-09-29 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US9400170B2 (en) 2010-04-21 2016-07-26 Faro Technologies, Inc. Automatic measurement of dimensional data within an acceptance region by a laser tracker
US9772394B2 (en) 2010-04-21 2017-09-26 Faro Technologies, Inc. Method and apparatus for following an operator and locking onto a retroreflector with a laser tracker
US8576380B2 (en) 2010-04-21 2013-11-05 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
EP2418562A1 (en) * 2010-08-13 2012-02-15 Deutsches Primatenzentrum GmbH Modelling of hand and arm position and orientation
WO2012020054A1 (en) * 2010-08-13 2012-02-16 Deutsches Primatenzentrum Gmbh (Dpz) Modelling of hand and arm position and orientation
US8593648B2 (en) 2011-02-14 2013-11-26 Faro Technologies, Inc. Target method using identifier element to obtain sphere radius
US8467072B2 (en) 2011-02-14 2013-06-18 Faro Technologies, Inc. Target apparatus and method of making a measurement with the target apparatus
US8619265B2 (en) 2011-03-14 2013-12-31 Faro Technologies, Inc. Automatic measurement of dimensional data with a laser tracker
US9494412B2 (en) 2011-04-15 2016-11-15 Faro Technologies, Inc. Diagnosing multipath interference and eliminating multipath interference in 3D scanners using automated repositioning
US10302413B2 (en) 2011-04-15 2019-05-28 Faro Technologies, Inc. Six degree-of-freedom laser tracker that cooperates with a remote sensor
US9453717B2 (en) 2011-04-15 2016-09-27 Faro Technologies, Inc. Diagnosing multipath interference and eliminating multipath interference in 3D scanners using projection patterns
US10578423B2 (en) 2011-04-15 2020-03-03 Faro Technologies, Inc. Diagnosing multipath interference and eliminating multipath interference in 3D scanners using projection patterns
US9448059B2 (en) 2011-04-15 2016-09-20 Faro Technologies, Inc. Three-dimensional scanner with external tactical probe and illuminated guidance
US9482529B2 (en) 2011-04-15 2016-11-01 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10267619B2 (en) 2011-04-15 2019-04-23 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10119805B2 (en) 2011-04-15 2018-11-06 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US9686532B2 (en) 2011-04-15 2017-06-20 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US9967545B2 (en) 2011-04-15 2018-05-08 Faro Technologies, Inc. System and method of acquiring three-dimensional coordinates using multiple coordinate measurement devices
US9164173B2 (en) 2011-04-15 2015-10-20 Faro Technologies, Inc. Laser tracker that uses a fiber-optic coupler and an achromatic launch to align and collimate two wavelengths of light
WO2013101542A1 (en) * 2011-12-30 2013-07-04 Faro Technologies, Inc. Method and apparatus for using gestures to control a laser tracker
CN104094080A (en) * 2011-12-30 2014-10-08 法罗技术股份有限公司 Method and apparatus for using gestures to control a laser tracker
GB2512554A (en) * 2011-12-30 2014-10-01 Faro Tech Inc Method and apparatus for using gestures to control a laser tracker
US9638507B2 (en) 2012-01-27 2017-05-02 Faro Technologies, Inc. Measurement machine utilizing a barcode to identify an inspection plan for an object
WO2014142807A1 (en) * 2013-03-12 2014-09-18 Intel Corporation Menu system and interactions with an electronic device
US9482514B2 (en) 2013-03-15 2016-11-01 Faro Technologies, Inc. Diagnosing multipath interference and eliminating multipath interference in 3D scanners by directed probing
US9041914B2 (en) 2013-03-15 2015-05-26 Faro Technologies, Inc. Three-dimensional coordinate scanner and method of operation
US10296085B2 (en) 2014-03-05 2019-05-21 Markantus Ag Relatively simple and inexpensive finger operated control device including piezoelectric sensors for gesture input, and method thereof
US9395174B2 (en) 2014-06-27 2016-07-19 Faro Technologies, Inc. Determining retroreflector orientation by optimizing spatial fit
EP3234742A4 (en) * 2014-12-16 2018-08-08 Quan Xiao Methods and apparatus for high intuitive human-computer interface
CN107209582A (en) * 2014-12-16 2017-09-26 Quan Xiao Method and apparatus for a highly intuitive human-computer interface
US10254833B2 (en) 2015-02-20 2019-04-09 Sony Interactive Entertainment Inc. Magnetic tracking of glove interface object
WO2016134295A1 (en) * 2015-02-20 2016-08-25 Sony Computer Entertainment Inc. Magnetic tracking of glove fingertips
US9652038B2 (en) 2015-02-20 2017-05-16 Sony Interactive Entertainment Inc. Magnetic tracking of glove fingertips
CN107533369A (en) * 2015-02-20 2018-01-02 索尼互动娱乐股份有限公司 The magnetic tracking of glove fingertip with peripheral unit
CN107533369B (en) * 2015-02-20 2021-05-07 索尼互动娱乐股份有限公司 Magnetic tracking of glove fingertips with peripheral devices
US11696704B1 (en) 2020-08-31 2023-07-11 Barron Associates, Inc. System, device and method for tracking the human hand for upper extremity therapy
CN112632407B (en) * 2020-12-18 2022-10-14 湖南科技大学 Spatial sampling method considering geographic environment heterogeneity

Also Published As

Publication number Publication date
JP2004513443A (en) 2004-04-30
EP1340218A1 (en) 2003-09-03
AU2002230814A1 (en) 2002-05-15
TW543028B (en) 2003-07-21

Similar Documents

Publication Publication Date Title
EP1340218A1 (en) Electronic user worn interface device
US8878807B2 (en) Gesture-based user interface employing video camera
US7886621B2 (en) Digital foam
US9939987B2 (en) Method and apparatus for user interface of input devices
CN107430437B (en) System and method for creating a real grabbing experience in a virtual reality/augmented reality environment
US9195301B2 (en) Three dimensional volumetric display input and output configurations
KR101546654B1 (en) Method and apparatus for providing augmented reality service in wearable computing environment
US20120068927A1 (en) Computer input device enabling three degrees of freedom and related input and feedback methods
US20120192119A1 (en) Usb hid device abstraction for hdtp user interfaces
US11209916B1 (en) Dominant hand usage for an augmented/virtual reality device
JPH067371B2 (en) 3D computer input device
US10481742B2 (en) Multi-phase touch-sensing electronic device
Nguyen et al. 3DTouch: A wearable 3D input device for 3D applications
US7804486B2 (en) Trackball systems and methods for rotating a three-dimensional image on a computer display
KR101530340B1 (en) Motion sensing system for implementing hand position-posture information of user in a three-dimensional virtual space based on a combined motion tracker and ahrs system
JPH0269798A (en) Method of turning displayed object
CN110209270A (en) A kind of data glove, data glove system, bearing calibration and storage medium
KR101686585B1 (en) A hand motion tracking system for a operating of rotary knob in virtual reality flighting simulator
JP6932267B2 (en) Controller device
Bogue Sensors for interfacing with consumer electronics
US11874955B2 (en) Electronic device
US10768721B2 (en) Model controller
Millan et al. Gesture-based control
Nguyen 3DTouch: Towards a Wearable 3D Input Device for 3D Applications
CN114327042A (en) Detection glove, gesture tracking method, AR device and key pressing method

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase
Ref document number: 10091558
Country of ref document: US

121 Ep: the epo has been informed by wipo that ep was designated in this application

WWE Wipo information: entry into national phase
Ref document number: 2002540133
Country of ref document: JP

WWE Wipo information: entry into national phase
Ref document number: 2001991060
Country of ref document: EP

WWP Wipo information: published in national office
Ref document number: 2001991060
Country of ref document: EP

REG Reference to national code
Ref country code: DE
Ref legal event code: 8642

WWW Wipo information: withdrawn in national office
Ref document number: 2001991060
Country of ref document: EP

DPE2 Request for preliminary examination filed before expiration of 19th month from priority date (pct application filed from 20040101)