WO1996014633A1 - Multi-dimensional electrical control device - Google Patents

Multi-dimensional electrical control device

Info

Publication number
WO1996014633A1
WO1996014633A1 · PCT/US1995/014316
Authority
WO
WIPO (PCT)
Prior art keywords
input device
electrical control
control input
hand manipulable
force sensor
Prior art date
Application number
PCT/US1995/014316
Other languages
French (fr)
Inventor
Ehud Baron
Omry Genossar
Original Assignee
Baron Motion Communication Inc.
Priority date
Filing date
Publication date
Application filed by Baron Motion Communication Inc. filed Critical Baron Motion Communication Inc.
Priority to AU41447/96A priority Critical patent/AU4144796A/en
Publication of WO1996014633A1 publication Critical patent/WO1996014633A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03543Mice or pucks

Definitions

  • Fig. 1A is a simplified pictorial illustration of a computer system including a hand manipulable computer input device constructed and operative in accordance with a preferred embodiment of the present invention;
  • Fig. 1B is a sectional illustration of a portion of the device of Fig. 1A at section lines 1B-1B;
  • Figs. 2A, 2B, 3A, and 3B are simplified pictorial illustrations showing the operation of the device of Fig. 1A;
  • Figs. 4A - 4D are simplified pictorial illustrations showing the operation of the device of Fig. 1A;
  • Fig. 5 is an enlarged view of a portion of the device of Fig. 4A at section lines V-V;
  • Figs. 6A - 6D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
  • Fig. 7 is an enlarged view of a portion of the device of Fig. 6A at section lines VII-VII;
  • Figs. 8A - 8D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
  • Fig. 9 is an enlarged view of a portion of the device of Fig. 8A at section lines IX-IX;
  • Figs. 10A - 10D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
  • Fig. 11 is an enlarged view of a portion of the device of Fig. 10A at section lines XI-XI;
  • Figs. 12A and 12B are simplified pictorial illustrations showing the operation of the device of Fig. 1;
  • Fig. 13 is an enlarged view of a portion of the device of Fig. 12A at section lines XIII-XIII;
  • Figs. 14A and 14B are simplified pictorial illustrations showing the operation of the device of Fig. 1;
  • Fig. 15 is an enlarged view of a portion of the device of Fig. 14A at section lines XV-XV;
  • Figs. 16A and 16B are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • Reference is now made to Fig. 1A, which illustrates a computer system including a hand manipulable computer input device constructed and operative in accordance with a preferred embodiment of the present invention.
  • The apparatus of Fig. 1A comprises a computer 100 including a CPU 105 and a display screen 110.
  • Computer 100 may be any appropriate computer system such as an IBM-compatible PC system, a workstation system, or another appropriate system.
  • It is appreciated that the system of Fig. 1A includes the computer 100 merely as an example, and that the scope of the present invention is not limited to use with computers, but rather may include use with other devices such as video games, interactive television, or any other electronically controlled device.
  • A screen object 115 is shown in Fig. 1A displayed on the screen 110.
  • The screen object 115 is depicted in Fig. 1A as a screen representation of a three dimensional object. It is appreciated that the screen object 115 may be an object such as a representation of an object which is part of a computer game, a cursor representing position in three dimensional space, or another object.
  • The system of Fig. 1A also includes a hand manipulable computer input device 120.
  • The hand manipulable computer input device 120 is depicted in Fig. 1A as a mouse. It is appreciated that the input device 120 is not necessarily a mouse, but may in fact include any two or three-dimensional hand manipulable computer input device, for example a trackball, a digitizer, or a light pen.
  • The input device 120 is shown connected to the CPU 105 by a cable.
  • The input device 120 may alternatively be connected by a plurality of cables, by a wireless connection, or by other appropriate means.
  • The input device 120 comprises a select button 125. Except as described below, the select button 125 and the remaining components of the input device 120 may be conventional components such as are found in commercially available hand manipulable computer input devices, for example, a mouse such as a Microsoft mouse available from Microsoft Corporation, USA.
  • The input device 120 also comprises a plurality of force sensors 130.
  • Each of the force sensors 130 is operative to sense the force applied to the force sensor 130 by the user, typically by the user's finger.
  • A first force sensor 131 and a second force sensor 132 are depicted. It will be appreciated that the plurality of force sensors 130 may comprise more than two force sensors.
  • Each of the force sensors 130 is individually finger actuable; that is, each of the force sensors 130 may, if so desired by the user, be actuated by a single finger. Additionally, each of the force sensors 130 is preferably separately actuable; that is, each of the force sensors 130 may, if so desired by the user, be actuated without actuating any of the other force sensors. Preferably, each of the force sensors 130 may be actuated by the user by applying force either in the forward or the backward direction; alternatively, actuation may occur by applying force in other directions.
  • The force sensors 130 produce continuous signals representing the force applied to each force sensor 130.
  • The continuous signals are converted to digital signals by conventional means, such as by analog-to-digital conversion or within an embedded microprocessor, as typically employed in hand manipulable computer input devices.
  • The processed digitized signals are then transmitted to the computer 100 via the cable or other connecting means.
  • Alternatively, the analog signals may be sent via the cable or other connecting means to the computer 100 through an analog connection such as a game port.
  • In this case the computer 100 is operative to convert the analog signals to digital signals.
  • The signals representing force applied to the force sensors 130 are received by the computer 100 in synchronization with conventional signals sent by the input device 120.
  • Alternatively, the computer 100 is operative to synchronize the force signals with the conventional signals.
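The signal path described above (analog force readings digitized and delivered alongside conventional mouse data) can be sketched as follows. This is an illustrative sketch only; the names (`InputReport`, `make_report`), the 8-bit ADC resolution, and the 5 V reference are assumptions, not details from the patent.

```python
# Sketch: digitize two force-sensor voltages and pack them into the same
# report as the conventional displacement and button signals, so the host
# receives force data synchronized with the mouse data.
from dataclasses import dataclass

ADC_MAX = 255  # assume an 8-bit analog-to-digital converter


@dataclass
class InputReport:
    dx: int        # conventional X displacement
    dy: int        # conventional Y displacement
    button: bool   # state of the select button
    force1: int    # digitized force on sensor 131 (0..ADC_MAX)
    force2: int    # digitized force on sensor 132 (0..ADC_MAX)


def make_report(dx, dy, button, v1, v2, vref=5.0):
    """Convert two analog voltages (0..vref) to digital counts and emit
    them together with the conventional signals in one report."""
    def to_counts(v):
        return max(0, min(ADC_MAX, round(v / vref * ADC_MAX)))
    return InputReport(dx, dy, button, to_counts(v1), to_counts(v2))
```

A host-side driver would then unpack each report and route `dx`/`dy` to the pointer logic and `force1`/`force2` to the extra-dimension handlers.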
  • Fig. 1B is a sectional illustration of a portion of the device of Fig. 1A at section lines 1B-1B.
  • The apparatus of Fig. 1B comprises one of the force sensors 130.
  • The force sensor 130 shown in Fig. 1B comprises an actuator 135.
  • The actuator 135 comprises a bidirectional actuator such as a rocker switch. It is appreciated that the actuator 135 may be any appropriate actuator such as an on-off switch or another appropriate switch.
  • The actuator 135 is operative to provide an indication of the direction in which force is applied to the force sensor 130.
  • The actuator 135 may be actuated by the user without necessitating movement of the user's hand or finger from its point of contact with the actuator 135.
  • Force sensor 130 also comprises a force sensitive element 140.
  • The actuator 135 is positioned relative to the force sensitive element 140 such that, when the user presses the actuator 135, the actuator 135 presses on the force sensitive element 140.
  • The force sensitive element 140 is operative to produce an analog signal indicating the degree of force applied thereto.
  • The force sensitive element 140 may be any appropriate force sensitive element or transducer as, for example, a force sensitive resistor such as the FSR commercially available from Interlink Electronics Europe, B.P. 8, Zone Industrielle, L-6401 Echternach, Luxembourg.
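One plausible reading of the rocker-plus-FSR arrangement above is a single signed value: the rocker position supplies the sign (front or back end pressed) and the force sensitive element supplies the magnitude. The sketch below assumes the common FSR-in-a-voltage-divider circuit; the function names and the 10 kΩ fixed resistor are illustrative assumptions, not from the patent.

```python
# Sketch: combine rocker direction and FSR magnitude into one signed reading.
def fsr_force(v_out, v_in=5.0, r_fixed=10_000.0):
    """Estimate relative force from an FSR in a voltage divider: as force
    rises the FSR's resistance falls and the divider output voltage rises.
    Returns a unitless value in [0, 1] (0 = no press)."""
    if v_out <= 0.0:
        return 0.0
    # infer the FSR's resistance from the divider equation
    r_fsr = r_fixed * (v_in - v_out) / v_out
    return r_fixed / (r_fixed + r_fsr)  # ~0 at rest, approaches 1 when pressed hard


def signed_force(rocker_front: bool, v_out: float) -> float:
    """Positive for a press on the front end, negative for the back end."""
    magnitude = fsr_force(v_out)
    return magnitude if rocker_front else -magnitude
```

The rest of the system then only needs one number per sensor: its sign selects the direction of the action and its magnitude the degree.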
  • The operation of the apparatus of Fig. 1A is now briefly described.
  • The depiction of the screen object 115 on the screen 110 is controlled by the CPU 105, which may be executing a CAD/CAM package, a simulation, a computer game, or another appropriate computer task.
  • The computer 100 is operative to alter the depiction of the screen object 115 on the screen 110.
  • The user of the system of Fig. 1A may use the input device 120 to direct the CPU 105 to manipulate the screen object 115 in more than three mutually orthogonal dimensions.
  • The user may, utilizing the conventional input capabilities of the input device 120, direct the CPU 105 to move the screen object 115 in two or three orthogonal directions.
  • The user may actuate one or more of the force sensors 130 to direct the CPU 105 to perform other manipulations, such as one or more of the following: to zoom the screen object 115 in or out; to rotate the screen object 115 about one or more axes; to change the speed of an action occurring to the screen object 115; to change attributes of the screen object 115 such as the color, texture, size, or other visible attributes of the screen object 115; to move the screen object 115 in two or three orthogonal directions, instead of or in addition to the movement indicated using the conventional input capabilities of the input device 120 as described above; to change the amplitude or frequency of one or more characteristics of motion of the screen object 115; or to perform another manipulation.
  • The user may actuate one or more of the force sensors 130 to direct the CPU 105 to change the speed of some action, such as paging through text or scrolling, in addition to manipulations performed utilizing the conventional input capabilities of the input device 120.
  • When the actuator 135 comprises a bidirectional actuator such as a rocker switch, the position of the actuator 135 may be used to control the direction of the action indicated by use of the force sensors 130, as, for example, whether a zoom is to take place in the inward or the outward direction.
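The zoom behavior just described (rocker direction selects zoom in vs. zoom out, degree of force selects the rate) could be mapped as below. The gain constant and dead zone are illustrative assumptions; the patent specifies only the qualitative behavior.

```python
# Sketch: map a signed force reading in [-1, 1] to a zoom action.
DEAD_ZONE = 0.05   # ignore tiny accidental pressure (assumption)
ZOOM_GAIN = 2.0    # zoom-rate units per unit of force (assumption)


def zoom_rate(signed_force: float) -> float:
    """Signed force -> zoom rate; positive means zoom in, negative zoom out."""
    if abs(signed_force) < DEAD_ZONE:
        return 0.0
    return ZOOM_GAIN * signed_force


def apply_zoom(scale: float, signed_force: float, dt: float) -> float:
    """Advance the screen object's scale over a time step dt seconds."""
    return scale * (1.0 + zoom_rate(signed_force) * dt)
```

Pressing harder yields a larger `signed_force` and hence a faster zoom, matching the rate behavior illustrated in Figs. 8A - 8D below.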
  • Figs. 2A, 2B, 3A, and 3B are simplified pictorial illustrations showing the operation of the device of Fig. 1A.
  • In Figs. 2A and 2B the user is shown manipulating the input device 120 by moving the input device 120 in a given direction, thus controlling motion of the screen object 115 in a single dimension, motion in the vertical direction.
  • In Figs. 3A and 3B the user is shown manipulating the input device 120 by moving the input device 120 in another given direction, thus controlling motion of the screen object 115 in a second dimension orthogonal to the first, motion in the horizontal direction.
  • Figs. 4A - 4D are simplified pictorial illustrations showing the operation of the device of Fig. 1A.
  • In Figs. 4A - 4D the user is shown pressing on the first force sensor 131.
  • Fig. 5 is an enlarged view of a portion of the device of Fig. 4A at section lines V-V.
  • The user presses down with one finger on the first force sensor 131 at one end thereof, shown in Fig. 5 as the front end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 inward.
  • The manipulation of the first force sensor 131 of the input device 120 by the user controls motion of the screen object 115 in a third dimension orthogonal to the first two dimensions.
  • Figs. 6A - 6D are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • In Figs. 6A - 6D the user is shown pressing on the first force sensor 131.
  • Fig. 7 is an enlarged view of a portion of the device of Fig. 6A at section lines VII-VII.
  • The user presses down with one finger on the first force sensor 131 at an end thereof different from the end shown in Fig. 5, which different end is shown in Fig. 7 as the back end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 outward.
  • The manipulation of the first force sensor 131 of the input device 120 by the user controls motion of the screen object 115 in two different directions in a third dimension orthogonal to the first two dimensions.
  • Figs. 8A - 8D are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • In Figs. 8A - 8D the user is shown pressing on the first force sensor 131.
  • Fig. 9 is an enlarged view of a portion of the device of Fig. 8A at section lines IX-IX.
  • The user presses down with one finger on the first force sensor 131 at one end thereof, shown in Fig. 9 as the front end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The force depicted in Figs. 8A - 8D and in Fig. 9 is greater than the force depicted above in Figs. 4A - 4D, 5, 6A - 6D, and 7.
  • The computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 inward.
  • The zoom is seen in Figs. 8A - 8D to occur at a higher rate than the zoom depicted in Figs. 4A - 4D, in response to the greater force applied by the user to the first force sensor 131.
  • The manipulation of the first force sensor 131 of the input device 120 by the user controls motion as well as rate of motion of the screen object 115 in a third dimension orthogonal to the first two dimensions.
  • Figs. 10A - 10D are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • In Figs. 10A - 10D the user is shown pressing on the first force sensor 131.
  • Fig. 11 is an enlarged view of a portion of the device of Fig. 10A at section lines XI-XI.
  • The user presses down with one finger on the first force sensor 131 at an end thereof different from the end shown in Fig. 9, the different end being shown in Fig. 11 as the back end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The force depicted in Figs. 10A - 10D and in Fig. 11 is greater than the force depicted above in Figs. 4A - 4D, 5, 6A - 6D, and 7.
  • The computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 outward.
  • The zoom is seen in Figs. 10A - 10D to occur at a higher rate than the zoom depicted in Figs. 6A - 6D, in response to the greater force applied by the user to the first force sensor 131.
  • The manipulation of the first force sensor 131 of the input device 120 by the user controls motion as well as rate of motion of the screen object 115 in two different directions in a third dimension orthogonal to the first two dimensions.
  • Figs. 12A and 12B are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • In Figs. 12A and 12B the user is shown pressing on the second force sensor 132.
  • Fig. 13 is an enlarged view of a portion of the device of Fig. 12A at section lines XIII-XIII.
  • The user presses down with one finger on the second force sensor 132 at one end thereof, shown in Fig. 13 as the front end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The computer 100 responds to the force exerted by the user on the second force sensor 132 by rotating the depiction of the screen object 115 clockwise.
  • The manipulation of the second force sensor 132 of the input device 120 by the user controls motion of the screen object 115 in a fourth dimension orthogonal to the first three dimensions.
  • Figs. 14A and 14B are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • In Figs. 14A and 14B the user is shown pressing on the second force sensor 132.
  • Fig. 15 is an enlarged view of a portion of the device of Fig. 14A at section lines XV-XV.
  • The user presses down with one finger on the second force sensor 132 at an end thereof different from the end shown in Fig. 13, which different end is shown in Fig. 15 as the back end.
  • The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
  • The computer 100 responds to the force exerted by the user on the second force sensor 132 by rotating the depiction of the screen object 115 counterclockwise.
  • The manipulation of the second force sensor 132 of the input device 120 by the user controls motion of the screen object 115 in two different directions in a fourth dimension orthogonal to the first three dimensions.
  • The rate of rotation of the screen object 115 may be controlled by the degree of force applied by the user to the second force sensor 132.
  • The manipulation of the second force sensor 132 of the input device 120 by the user thus controls motion as well as rate of motion of the screen object 115 in two different directions in a fourth dimension orthogonal to the first three dimensions.
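The rotation behavior above mirrors the zoom behavior: the second sensor's direction picks clockwise vs. counterclockwise and its force picks the rate. A minimal sketch, with an assumed gain constant and an assumed sign convention (positive force = clockwise = decreasing angle in the usual math convention):

```python
# Sketch: advance the screen object's rotation from a signed force reading.
ROT_GAIN_DEG_PER_S = 90.0  # degrees per second at full force (assumption)


def rotation_step(angle_deg: float, signed_force: float, dt: float) -> float:
    """signed_force in [-1, 1]: positive rotates clockwise (angle decreases),
    negative rotates counterclockwise. Result is wrapped to [0, 360)."""
    return (angle_deg - ROT_GAIN_DEG_PER_S * signed_force * dt) % 360.0
```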
  • Figs. 16A and 16B are simplified pictorial illustrations showing the operation of the device of Fig. 1.
  • Figs. 16A and 16B depict the user moving the input device 120 at a diagonal, or in two orthogonal directions, and also applying force to both of the force sensors 130 at the same time.
  • The computer 100 responds by moving the depiction of the screen object 115 in two orthogonal directions, zooming the depiction of the screen object 115, and rotating the depiction of the screen object 115, all at substantially the same time.
  • The user thus controls motion of the screen object 115 in four mutually orthogonal dimensions at the same time.
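The simultaneous four-parameter control described above can be sketched as a single per-tick update: mouse deltas drive X and Y while the two force sensors drive zoom and rotation. The structure, field names, and gain constants below are illustrative assumptions.

```python
# Sketch: apply all four mutually independent parameter inputs in one tick.
from dataclasses import dataclass


@dataclass
class ScreenObject:
    x: float = 0.0
    y: float = 0.0
    scale: float = 1.0
    angle: float = 0.0  # degrees


def update(obj: ScreenObject, dx: float, dy: float,
           zoom_force: float, rot_force: float, dt: float) -> ScreenObject:
    """dx/dy: conventional displacement; zoom_force/rot_force: signed
    readings in [-1, 1] from the two force sensors; dt: seconds."""
    obj.x += dx                                   # dimension 1
    obj.y += dy                                   # dimension 2
    obj.scale *= (1.0 + 2.0 * zoom_force * dt)    # dimension 3: rate ∝ force
    obj.angle = (obj.angle + 90.0 * rot_force * dt) % 360.0  # dimension 4
    return obj
```

Because each term reads a different input, moving the mouse diagonally while pressing both sensors changes all four parameters in the same tick, as in Figs. 16A and 16B.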
  • The software components of the present invention may, if desired, be implemented in ROM (read-only memory) form.
  • The software components may, generally, be implemented in hardware, if desired, using conventional techniques.

Abstract

A hand manipulable electrical control input device (120) including a finger engageable control element (125, 130) providing at least four mutually orthogonal parameter inputs and including at least one force sensor (130) providing at least one of said at least four mutually orthogonal parameter inputs.

Description

MULTI-DIMENSIONAL ELECTRICAL CONTROL DEVICE
FIELD OF THE INVENTION
The present invention relates to hand manipulable input and control devices generally, and more specifically to such devices for use with computers, games, televisions, and other electronically controlled devices.
BACKGROUND OF THE INVENTION
Hand-manipulated input devices are well-known in the art. Hand-manipulated input devices include mice, joysticks, light pens, trackballs and digitizers. Unlike keyboard based input such as computer keyboards, telephones, and conventional television remote control devices, hand-manipulated input devices generally provide a continuous scale of input.
Various hand-manipulated input devices are used in different applications. For computer input, the most widely used of these devices is the mouse.
Conventional mice are hand manipulable devices designed to control displacement in two dimensions, as the hand manipulating the mouse moves about a generally flat surface. Displacement may be effected by the rotation of a ball located on the underside of the mouse in contact with the flat surface, or alternatively by optical means or otherwise. Conventional mice also include a switch having a pressed position and a released position. The user of the mouse typically presses and/or releases the switch to indicate selection of an object, of an option, or of some desired action.
Some mice are additionally capable of controlling displacement in a third spatial dimension. Of these mice, some use transmitters and receivers located in the mouse and in the environment around the mouse, as for example ultrasonic or infrared transmitters and receivers. Other mice determine three-dimensional displacement with the aid of accelerometers, such as the mouse described in Applicant's copending Israel patent application 108565.
Some mice control the conventional two dimensions along a flat surface and additionally control another variable indicating user input via hand motion. These mice may be considered three dimensional in the sense that they control three independent variables indicating user input via hand motions.
US Patent 4,961,138 to Gorniak describes a mouse whose third dimension is the pressure of the entire mouse against the flat surface.
US Patent 5,063,376 to Chang describes a mouse with a numeric keyboard, one of whose buttons is an analog switch. The analog switch described by Chang controls the position of the switch, that is the degree to which the switch is depressed, from undepressed through fully depressed.
A non-mouse device using pressure sensors for input of two-dimensional information from the user's hand is described in US Patent 4,313,113 to Thornburg.
SUMMARY OF THE INVENTION
The present invention seeks to provide an improved hand manipulable input and control device.
Existing hand manipulable input and control devices, such as mice, are sometimes inadequate to input the necessary information to a computer because existing devices can input at most three dimensions of information plus on/off information. Input of four or more dimensions of information with existing devices requires the use of a second input device such as a keyboard in addition to the hand manipulable input device.
In CAD/CAM applications, for example, it may be necessary to position an object in three dimensions and also to zoom at the same time; to position an object in three dimensions and also to rotate the object about an arbitrary axis at the same time; or to position an object in three dimensions and to specify object attributes at the same time.
In another example, computer games may additionally require the input of three-dimensional position information and also of information on the speed of action of the game at the same time.
There are many other examples requiring more than three dimensions of information. Typically, it is preferred for the multiple dimensions to be input simultaneously.
There is thus provided in accordance with a preferred embodiment of the present invention a hand manipulable electrical control input device including a finger engageable control element providing at least four mutually orthogonal parameter inputs and including at least one force sensor providing at least one of the at least four mutually orthogonal parameter inputs. Further in accordance with a preferred embodiment of the present invention the control element includes a mouse.
Still further in accordance with a preferred embodiment of the present invention the control element includes a computer input device.
Additionally in accordance with a preferred embodiment of the present invention the control element is displaceable in at least two dimensions.
Further in accordance with a preferred embodiment of the present invention the control element is displaceable in at least three dimensions.
Still further in accordance with a preferred embodiment of the present invention the mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
Additionally in accordance with a preferred embodiment of the present invention each force sensor is associated with a displaceable switch.
Further in accordance with a preferred embodiment of the present invention the at least one force sensor includes a plurality of force sensors.
Still further in accordance with a preferred embodiment of the present invention each force sensor is individual finger actuable.
Additionally in accordance with a preferred embodiment of the present invention each force sensor is separately actuable.
Further in accordance with a preferred embodiment of the present invention the control element includes a mouse.
Still further in accordance with a preferred embodiment of the present invention the control element includes a computer input device. Additionally in accordance with a preferred embodiment of the present invention the control element is displaceable in at least two dimensions.
Further in accordance with a preferred embodiment of the present invention the control element is displaceable in at least three dimensions.
Still further in accordance with a preferred embodiment of the present invention the mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
Additionally in accordance with a preferred embodiment of the present invention each force sensor is associated with a displaceable switch.
There is also provided in accordance with another preferred embodiment of the present invention a hand manipulable electrical control input device including a finger engageable control element providing at least three mutually orthogonal parameter inputs and including at least two force sensors providing at least two of the at least three mutually orthogonal parameter inputs.
Further in accordance with a preferred embodiment of the present invention each force sensor is individual finger actuable.
Still further in accordance with a preferred embodiment of the present invention each force sensor is separately actuable.
Additionally in accordance with a preferred embodiment of the present invention the control element includes a mouse.
Further in accordance with a preferred embodiment of the present invention the control element includes a computer input device.
Still further in accordance with a preferred embodiment of the present invention the control element is displaceable in at least two dimensions.
Additionally in accordance with a preferred embodiment of the present invention the control element is displaceable in at least three dimensions.
Further in accordance with a preferred embodiment of the present invention the mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
Still further in accordance with a preferred embodiment of the present invention each force sensor is associated with a displaceable switch.
BRIEF DESCRIPTION OF THE DRAWINGS
The present invention will be understood and appreciated from the following detailed description, taken in conjunction with the drawings in which:
Fig. 1A is a simplified pictorial illustration of a computer system including a hand manipulable computer input device constructed and operative in accordance with a preferred embodiment of the present invention;
Fig. 1B is a sectional illustration of a portion of the device of Fig. 1A at section lines 1B-1B;
Figs. 2A, 2B, 3A, and 3B are simplified pictorial illustrations showing the operation of the device of Fig. 1A;
Figs. 4A - 4D are simplified pictorial illustrations showing the operation of the device of Fig. 1A;
Fig. 5 is an enlarged view of a portion of the device of Fig. 4A at section lines V-V;
Figs. 6A - 6D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
Fig. 7 is an enlarged view of a portion of the device of Fig. 6A at section lines VII-VII;
Figs. 8A - 8D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
Fig. 9 is an enlarged view of a portion of the device of Fig. 8A at section lines IX-IX;
Figs. 10A - 10D are simplified pictorial illustrations showing the operation of the device of Fig. 1;
Fig. 11 is an enlarged view of a portion of the device of Fig. 6A at section lines XI-XI;
Figs. 12A and 12B are simplified pictorial illustrations showing the operation of the device of Fig. 1;
Fig. 13 is an enlarged view of a portion of the device of Fig. 10A at section lines XIII-XIII;
Figs. 14A and 14B are simplified pictorial illustrations showing the operation of the device of Fig. 1;
Fig. 15 is an enlarged view of a portion of the device of Fig. 6A at section lines XV-XV; and
Figs. 16A and 16B are simplified pictorial illustrations showing the operation of the device of Fig. 1.
DETAILED DESCRIPTION OF PREFERRED EMBODIMENTS
Reference is now made to Fig. 1A which illustrates a computer system including a hand manipulable computer input device which is constructed and operative in accordance with a preferred embodiment of the present invention. The apparatus of Fig. 1A comprises a computer 100 including a CPU 105 and a display screen 110. Computer 100 may be any appropriate computer system such as an IBM-compatible PC system, a workstation system, or another appropriate system.
It is appreciated that the system of Fig. 1A includes computer 100 merely as an example, and that the scope of the present invention is not limited to use with computers, but rather may include use with other devices such as video games, interactive television, or any other electronically controlled device.
A screen object 115 is shown in Fig. 1A displayed on the screen 110. The screen object 115 is depicted in Fig. 1A as a screen representation of a three-dimensional object. It is appreciated that the screen object 115 may be an object such as a representation of an object which is part of a computer game, a cursor representing position in three-dimensional space, or another object.
The system of Fig. 1A also includes a hand manipulable computer input device 120. The hand manipulable computer input device 120 is depicted in Fig. 1A as a mouse. It is appreciated that the input device 120 is not necessarily a mouse, but may in fact include any two or three-dimensional hand manipulable computer input device, for example a trackball, a digitizer, or a light pen.
In Fig. 1A the input device 120 is shown connected to the CPU 105 by a cable. Alternatively, the input device 120 may be connected by a plurality of cables, by a wireless connection, or by other appropriate means.
The input device 120 comprises a select button 125. Except as described below, the select button 125 and the remaining components of the input device 120 may be conventional components such as are found in commercially available hand manipulable computer input devices, for example, a mouse such as a Microsoft mouse available from Microsoft Corporation, USA.
The input device 120 also comprises a plurality of force sensors 130. Each of the force sensors 130 is operative to sense the force applied to the force sensor 130 by the user, typically by the user's finger. In Fig. 1A, a first force sensor 131 and a second force sensor 132 are depicted. It will be appreciated that the plurality of force sensors 130 may comprise more than two force sensors.
Preferably, each of the force sensors 130 is individually finger actuable; that is, each of the force sensors 130 may, if so desired by the user, be actuated by a single finger. Additionally, each of the force sensors 130 is preferably separately actuable; that is, each of the force sensors 130 may, if so desired by the user, be actuated without actuating any of the other force sensors. Preferably, each of the force sensors 130 may be actuated by the user by applying force either in the forward or the backward direction; alternatively, actuation may occur by applying force in other directions.
The force sensors 130 produce continuous signals representing the force applied to each force sensor 130. The continuous signals are converted to digital signals by conventional means, such as by analog-to-digital conversion or within an embedded microprocessor, as typically employed in hand manipulable computer input devices. The processed digitized signals are then transmitted to the computer 100 via the cable or other connecting means. Alternatively, the analog signals may be sent via the cable or other connecting means to the computer 100 through an analog connection such as a game port. In this case, the computer 100 is operative to convert the analog signals to digital signals.
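As a rough illustration of the conversion step just described, the following sketch quantizes a continuous force-sensor voltage to a digital code. The 8-bit resolution and 5 V reference are assumptions made for illustration, not values given in this description.

```python
def digitize_force(voltage, v_ref=5.0, bits=8):
    """Quantize an analog force-sensor voltage to an integer code,
    as an ADC inside the input device might (illustrative values:
    8-bit resolution, 5 V full scale)."""
    voltage = min(max(voltage, 0.0), v_ref)  # clamp to the ADC input range
    levels = (1 << bits) - 1                 # 255 codes above zero for 8 bits
    return round(voltage / v_ref * levels)
```

A harder press produces a higher voltage from the force sensitive element and thus a larger code, which the host can then synchronize with the conventional position reports.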
In the case where analog-to-digital conversion occurs within the input device 120, the signals representing force applied to the force sensors 130 are received by the computer 100 in synchronization with conventional signals sent by the input device 120. In the case where analog-to-digital conversion occurs within the computer 100, the computer 100 is also operative to synchronize the force signals with the conventional signals.
Reference is now made to Fig. 1B which is a sectional illustration of a portion of the device of Fig. 1A at section lines 1B-1B. The apparatus of Fig. 1B comprises one of the force sensors 130.
The force sensor 130 shown in Fig. 1B comprises an actuator 135. Preferably, the actuator 135 comprises a bidirectional actuator such as a rocker switch. It is appreciated that the actuator 135 may be any appropriate actuator such as an on-off switch or another appropriate switch. The actuator 135 is operative to provide an indication of the direction in which force is applied to the force sensor 130. Preferably, the actuator 135 may be actuated by the user without necessitating movement of the user's hand or finger from its point of contact with the actuator 135.
Force sensor 130 also comprises a force sensitive element 140. The actuator 135 is positioned relative to the force sensitive element 140 such that, when the user presses the actuator 135, the actuator 135 presses on the force sensitive element 140. The force sensitive element 140 is operative to produce an analog signal indicating the degree of force applied thereto. The force sensitive element 140 may be any appropriate force sensitive element or transducer as, for example, a force sensitive resistor such as the FSR commercially available from Interlink Electronics Europe, B.P. 8, Zone Industrielle, L-6401 Echternach, Luxembourg.
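A force sensitive resistor of this kind is commonly read through a simple voltage divider. The sketch below assumes hypothetical component values (a 10 kΩ fixed resistor and a 5 V supply, neither taken from this description) to show how falling FSR resistance under increasing force yields a rising analog voltage.

```python
def fsr_divider_voltage(r_fsr_ohms, r_fixed_ohms=10_000.0, v_supply=5.0):
    """Output of a voltage divider with the FSR on the supply side and
    a fixed resistor to ground: as applied force grows, the FSR's
    resistance falls and the voltage across the fixed resistor rises.
    Component values are illustrative assumptions."""
    return v_supply * r_fixed_ohms / (r_fsr_ohms + r_fixed_ohms)
```

A light touch (high FSR resistance) thus reads near 0 V and a firm press reads near the supply rail, giving the continuous degree-of-force signal described above.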
The operation of the apparatus of Fig. 1A is now briefly described. The depiction of the screen object 115 on the screen 110 is controlled by the CPU 105, which may be executing a CAD/CAM package, a simulation, a computer game, or another appropriate computer task. In response to signals from the input device 120, the computer 100 is operative to alter the depiction of the screen object 115 on the screen 110.
Preferably, the user of the system of Fig. 1A may use the input device 120 to direct the CPU 105 to manipulate the screen object 115 in more than three mutually orthogonal dimensions. For example, the user may, utilizing the conventional input capabilities of the input device 120, direct the CPU 105 to move the screen object 115 in two or three orthogonal directions. Typically at the same time, the user may actuate one or more of the force sensors 130 to direct the CPU 105 to perform other manipulations, such as one or more of the following: to zoom the screen object 115 in or out; to rotate the screen object 115 about one or more axes; to change the speed of an action occurring to the screen object 115; to change attributes of the screen object 115 such as the color, texture, size, or other visible attributes of the screen object 115; to move the screen object 115 in two or three orthogonal directions, instead of or in addition to the movement indicated using the conventional input capabilities of the input device 120 as described above; to change the amplitude or frequency of one or more characteristics of motion of the screen object 115; or to perform another manipulation.
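The simultaneous manipulations described above can be sketched as one update step applied per input-device report. The state layout, the gain constant, and the choice of zoom and rotation as the two force-driven parameters are illustrative assumptions, not a definitive implementation.

```python
def update_object_state(state, dx, dy, zoom_force, rotate_force, gain=0.01):
    """Apply one input-device report to a screen-object state.
    dx/dy come from the conventional mouse mechanism; the two signed
    force readings drive zoom and rotation, their magnitudes setting
    the rates. Gain and state layout are illustrative."""
    state = dict(state)
    state["x"] += dx
    state["y"] += dy
    state["zoom"] *= 1.0 + gain * zoom_force  # sign selects zoom in/out
    state["angle"] += gain * rotate_force     # sign selects cw/ccw rotation
    return state
```

Because the four inputs are applied independently within a single step, displacement, zoom, and rotation can all change at substantially the same time, as in the four-dimensional manipulation described above.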
It is appreciated that the manipulations described above may alternatively or additionally affect some other attribute such as the frequency or volume of a sound, which other attribute is typically associated in some way with the screen object 115.
In the case where the screen object 115 comprises text, such as a menu or a representation of a printed document, the user may actuate one or more of the force sensors 130 to direct the CPU 105 to change the speed of some action, such as paging through text or scrolling, in addition to manipulations performed utilizing the conventional input capabilities of the input device 120.
In the case where the actuator 135 comprises a bidirectional actuator such as a rocker switch, the position of the actuator 135 may be used to control the direction of the action indicated by use of the force sensors 130, as, for example, whether a zoom is to take place in the inward or the outward direction.
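One way to realize this pairing of a bidirectional actuator with a single force reading is to fold the rocker's direction indication into the sign of the command. The sign convention below (forward press = positive = zoom in) is an assumption for illustration.

```python
def signed_command(direction_forward, force_magnitude):
    """Combine a rocker switch's direction indication with the force
    sensor's magnitude into one signed control value. Forward presses
    are taken as positive (e.g. zoom in) and backward presses as
    negative (zoom out); the convention is illustrative."""
    return force_magnitude if direction_forward else -force_magnitude
```

The host then needs only one signed channel per force sensor to know both which way to act and how strongly.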
Reference is now made to Figs. 2A, 2B, 3A, and 3B which are simplified pictorial illustrations showing the operation of the device of Fig. 1A. In Figs. 2A and 2B, the user is shown manipulating the input device 120 by moving the input device 120 in a given direction, thus controlling motion of the screen object 115 in a single dimension, motion in the vertical direction.
In Figs. 3A and 3B, the user is shown manipulating the input device 120 by moving the input device 120 in another given direction, thus controlling motion of the screen object 115 in a second dimension orthogonal to the first, motion in the horizontal direction.
Reference is now made to Figs. 4A - 4D which are simplified pictorial illustrations showing the operation of the device of Fig. 1A. In Figs. 4A - 4D, the user is shown pressing on the first force sensor 131.
Reference is now additionally made to Fig. 5 which is an enlarged view of a portion of the device of Fig. 4A at section lines V-V. As is shown in Fig. 5, the user presses down with one finger on the first force sensor 131 at one end thereof, shown in Fig. 5 as the front end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
As is shown in Figs. 4A - 4D, the computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 inward. Thus, the manipulation of the first force sensor 131 of the input device 120 by the user controls motion of the screen object 115 in a third dimension orthogonal to the first two dimensions.
Reference is now made to Figs. 6A - 6D which are simplified pictorial illustrations showing the operation of the device of Fig. 1. In Figs. 6A - 6D, the user is shown pressing on the first force sensor 131.
Reference is now additionally made to Fig. 7 which is an enlarged view of a portion of the device of Fig. 6A at section lines VII-VII. As is shown in Fig. 7, the user presses down with one finger on the first force sensor 131 at an end thereof different from the end shown in Fig. 5, which different end is shown in Fig. 7 as the back end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
As is shown in Figs. 6A - 6D, the computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 outward. Thus, taking Figs. 4A - 4D, 5, 6A - 6D and 7 together, it will be appreciated that the manipulation of the first force sensor 131 of the input device 120 by the user controls motion of the screen object 115 in two different directions in a third dimension orthogonal to the first two dimensions.
Reference is now made to Figs. 8A - 8D which are simplified pictorial illustrations showing the operation of the device of Fig. 1. In Figs. 8A - 8D, the user is shown pressing on the first force sensor 131.
Reference is now additionally made to Fig. 9 which is an enlarged view of a portion of the device of Fig. 8A at section lines IX-IX. As is shown in Fig. 9, the user presses down with one finger on the first force sensor 131 at one end thereof, shown in Fig. 9 as the front end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied. The force depicted in Figs. 8A - 8D and in Fig. 9 is greater than the force depicted above in Figs. 4A - 4D, 5, 6A - 6D, and 7.
As is shown in Figs. 8A - 8D, the computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 inward. The zoom is seen in Figs. 8A - 8D to occur at a higher rate than the zoom depicted in Figs. 4A - 4D, in response to the greater force applied by the user to the first force sensor 131. Thus, the manipulation of the first force sensor 131 of the input device 120 by the user controls motion as well as rate of motion of the screen object 115 in a third dimension orthogonal to the first two dimensions.
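The force-proportional rate behavior shown in Figs. 4A - 4D versus Figs. 8A - 8D can be sketched as a simple linear mapping from force reading to zoom rate. The maximum force code and the rate ceiling are illustrative assumptions; the description does not specify the mapping.

```python
def zoom_rate(force_code, max_code=255, max_rate=4.0):
    """Map an applied-force reading to a zoom rate so that a harder
    press zooms faster. A linear mapping with illustrative constants;
    nonlinear mappings would work equally well."""
    force_code = min(max(force_code, 0), max_code)  # clamp to sensor range
    return max_rate * force_code / max_code
```

A light press thus produces a slow zoom and a firm press a fast one, giving the user continuous rate control from a single sensor.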
Reference is now made to Figs. 10A - 10D which are simplified pictorial illustrations showing the operation of the device of Fig. 1. In Figs. 10A - 10D, the user is shown pressing on the first force sensor 131.
Reference is now additionally made to Fig. 11 which is an enlarged view of a portion of the device of Fig. 8A at section lines XI-XI. As is shown in Fig. 11, the user presses down with one finger on the first force sensor 131 at an end thereof different from the end shown in Fig. 9, the different end being shown in Fig. 11 as the back end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied. The force depicted in Figs. 10A - 10D and in Fig. 11 is greater than the force depicted above in Figs. 4A - 4D, 5, 6A - 6D, and 7.
As is shown in Figs. 10A - 10D, the computer 100 responds to the force exerted by the user on the first force sensor 131 by zooming the depiction of the screen object 115 outward. The zoom is seen in Figs. 10A - 10D to occur at a higher rate than the zoom depicted in Figs. 6A - 6D, in response to the greater force applied by the user to the first force sensor 131. Thus, the manipulation of the first force sensor 131 of the input device 120 by the user controls motion as well as rate of motion of the screen object 115 in two different directions in a third dimension orthogonal to the first two dimensions.
Reference is now made to Figs. 12A and 12B which are simplified pictorial illustrations showing the operation of the device of Fig. 1. In Figs. 12A and 12B, the user is shown pressing on the second force sensor 132.
Reference is now additionally made to Fig. 13 which is an enlarged view of a portion of the device of Fig. 12A at section lines XIII-XIII. As is shown in Fig. 13, the user presses down with one finger on the second force sensor 132 at one end thereof, shown in Fig. 13 as the front end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
As is shown in Figs. 12A and 12B, the computer 100 responds to the force exerted by the user on the second force sensor 132 by rotating the depiction of the screen object 115 clockwise. Thus, the manipulation of the second force sensor 132 of the input device 120 by the user controls motion of the screen object 115 in a fourth dimension orthogonal to the first three dimensions.
Reference is now made to Figs. 14A and 14B which are simplified pictorial illustrations showing the operation of the device of Fig. 1. In Figs. 14A and 14B, the user is shown pressing on the second force sensor 132.
Reference is now additionally made to Fig. 15 which is an enlarged view of a portion of the device of Fig. 14A at section lines XV-XV. As is shown in Fig. 15, the user presses down with one finger on the second force sensor 132 at an end thereof different from the end shown in Fig. 13, which different end is shown in Fig. 15 as the back end. The user thus signals, as explained above with reference to Fig. 1B, both the direction and the degree of force applied.
As is shown in Figs. 14A and 14B, the computer 100 responds to the force exerted by the user on the second force sensor 132 by rotating the depiction of the screen object 115 counterclockwise. Thus, taking Figs. 12A, 12B, 13, 14A, 14B, and 15 together, it will be appreciated that the manipulation of the second force sensor 132 of the input device 120 by the user controls motion of the screen object 115 in two different directions in a fourth dimension orthogonal to the first three dimensions.
It will be appreciated that, in a manner similar to that explained above with reference to Figs. 8A - 8D, 9, 10A - 10D, and 11, the rate of rotation of the screen object 115 may be controlled by the degree of force applied by the user to the second force sensor 132. Thus, the manipulation of the second force sensor 132 of the input device 120 by the user controls motion as well as rate of motion of the screen object 115 in two different directions in a fourth dimension orthogonal to the first three dimensions.
Reference is now made to Figs. 16A and 16B which are simplified pictorial illustrations showing the operation of the device of Fig. 1. Figs. 16A and 16B depict the user moving the input device 120 at a diagonal, or in two orthogonal directions, and also applying force to both of the force sensors 130 at the same time. The computer 100 responds by moving the depiction of the screen object 115 in two orthogonal directions, zooming the depiction of the screen object 115 and rotating the depiction of the screen object 115, all at substantially the same time. The user thus controls motion of the screen object 115 in four mutually orthogonal dimensions at the same time.
It is appreciated that the scope of the present invention is not limited to computer input devices as shown in the embodiment described above, but rather is applicable to all types of electrical control input devices including input devices for computer games, remote manipulation devices, and other electrical control input devices.
It is appreciated that the software components of the present invention may, if desired, be implemented in ROM (read-only memory) form. The software components may, generally, be implemented in hardware, if desired, using conventional techniques.
It is appreciated that various features of the invention which are, for clarity, described in the contexts of separate embodiments may also be provided in combination in a single embodiment. Conversely, various features of the invention which are, for brevity, described in the context of a single embodiment may also be provided separately or in any suitable subcombination.
It will be appreciated by persons skilled in the art that the present invention is not limited to what has been particularly shown and described hereinabove. Rather, the scope of the present invention is defined only by the claims that follow:

Claims

1. A hand manipulable electrical control input device comprising a finger engageable control element providing at least four mutually orthogonal parameter inputs and including at least one force sensor providing at least one of said at least four mutually orthogonal parameter inputs.
2. A hand manipulable electrical control input device according to claim 1 and wherein said control element comprises a mouse.
3. A hand manipulable electrical control input device according to claim 1 and wherein said control element comprises a computer input device.
4. A hand manipulable electrical control input device according to claim 1 and wherein said control element is displaceable in at least two dimensions.
5. A hand manipulable electrical control input device according to claim 1 and wherein said control element is displaceable in at least three dimensions.
6. A hand manipulable electrical control input device according to claim 1 and wherein said mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
7. A hand manipulable electrical control input device according to claim 1 and wherein each force sensor is associated with a displaceable switch.
8. A hand manipulable electrical control input device according to claim 1 and wherein said at least one force sensor comprises a plurality of force sensors.
9. A hand manipulable electrical control input device according to claim 8 and wherein each said force sensor is individual finger actuable.
10. A hand manipulable electrical control input device according to claim 8 and wherein each said force sensor is separately actuable.
11. A hand manipulable electrical control input device according to claim 8 and wherein said control element comprises a mouse.
12. A hand manipulable electrical control input device according to claim 8 and wherein said control element comprises a computer input device.
13. A hand manipulable electrical control input device according to claim 8 and wherein said control element is displaceable in at least two dimensions.
14. A hand manipulable electrical control input device according to claim 8 and wherein said control element is displaceable in at least three dimensions.
15. A hand manipulable electrical control input device according to claim 8 and wherein said mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
16. A hand manipulable electrical control input device according to claim 8 and wherein each force sensor is associated with a displaceable switch.
17. A hand manipulable electrical control input device comprising a finger engageable control element providing at least three mutually orthogonal parameter inputs and including at least two force sensors providing at least two of said at least three mutually orthogonal parameter inputs.
18. A hand manipulable electrical control input device according to claim 17 and wherein each said force sensor is individual finger actuable.
19. A hand manipulable electrical control input device according to claim 17 and wherein each said force sensor is separately actuable.
20. A hand manipulable electrical control input device according to claim 17 and wherein said control element comprises a mouse.
21. A hand manipulable electrical control input device according to claim 17 and wherein said control element comprises a computer input device.
22. A hand manipulable electrical control input device according to claim 17 and wherein said control element is displaceable in at least two dimensions.
23. A hand manipulable electrical control input device according to claim 17 and wherein said control element is displaceable in at least three dimensions.
24. A hand manipulable electrical control input device according to claim 17 and wherein said mutually orthogonal parameter inputs include at least three of the following parameter inputs: X-displacement, Y-displacement, size, zoom, rotation, color, intensity, speed, amplitude, frequency and texture.
25. A hand manipulable electrical control input device according to claim 17 and wherein each force sensor is associated with a displaceable switch.
PCT/US1995/014316 1994-11-07 1995-11-07 Multi-dimensional electrical control device WO1996014633A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU41447/96A AU4144796A (en) 1994-11-07 1995-11-07 Multi-dimensional electrical control device

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US33504794A 1994-11-07 1994-11-07
US335,047 1994-11-07

Publications (1)

Publication Number Publication Date
WO1996014633A1 true WO1996014633A1 (en) 1996-05-17

Family

ID=23310032

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US1995/014316 WO1996014633A1 (en) 1994-11-07 1995-11-07 Multi-dimensional electrical control device

Country Status (2)

Country Link
AU (1) AU4144796A (en)
WO (1) WO1996014633A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19648487C1 (en) * 1996-11-12 1998-06-10 Primax Electronics Ltd Computer mouse with additional display window controls
EP0883055A2 (en) * 1997-06-02 1998-12-09 Sony Corporation Digital map display zooming method and device
DE19837510A1 (en) * 1998-08-19 2000-02-24 Bayerische Motoren Werke Ag Device for controlling the reproduction of an image displayed on a vehicle screen
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
US6104380A (en) * 1997-04-14 2000-08-15 Ricoh Company, Ltd. Direct pointing apparatus for digital displays
US6181329B1 (en) 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US6201903B1 (en) 1997-09-30 2001-03-13 Ricoh Company, Ltd. Method and apparatus for pen-based faxing
EP1247157A1 (en) * 2000-01-14 2002-10-09 Sony Computer Entertainment Inc. Equipment, method and recording medium for enlargement, reduction and modification of shape of images depending on output from pressure-sensitive means
EP1484666A2 (en) * 2003-06-04 2004-12-08 3Dconnexion GmbH Multidimensional input device for navigating and selecting virtual objects
JP2007172633A (en) * 2006-12-28 2007-07-05 Sony Corp Method, device for magnification and reduction display of digital map, and storage medium which stores magnification and reduction display program of digital map
WO2008131544A1 (en) * 2007-04-26 2008-11-06 University Of Manitoba Pressure augmented mouse
US9235934B2 (en) 2004-01-30 2016-01-12 Electronic Scripting Products, Inc. Computer interface employing a wearable article with an absolute pose detection component
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794388A (en) * 1980-02-20 1988-12-27 Summagraphics Corporation Method of and apparatus for controlling a display
US5095303A (en) * 1990-03-27 1992-03-10 Apple Computer, Inc. Six degree of freedom graphic object controller
US5122785A (en) * 1988-11-14 1992-06-16 Wang Laboratories, Inc. Squeezable control device for computer display system
US5162781A (en) * 1987-10-02 1992-11-10 Automated Decisions, Inc. Orientational mouse computer input system
US5313229A (en) * 1993-02-05 1994-05-17 Gilligan Federico G Mouse and method for concurrent cursor position and scrolling control

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4794388A (en) * 1980-02-20 1988-12-27 Summagraphics Corporation Method of and apparatus for controlling a display
US5162781A (en) * 1987-10-02 1992-11-10 Automated Decisions, Inc. Orientational mouse computer input system
US5122785A (en) * 1988-11-14 1992-06-16 Wang Laboratories, Inc. Squeezable control device for computer display system
US5095303A (en) * 1990-03-27 1992-03-10 Apple Computer, Inc. Six degree of freedom graphic object controller
US5313229A (en) * 1993-02-05 1994-05-17 Gilligan Federico G Mouse and method for concurrent cursor position and scrolling control

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
DE19648487C1 (en) * 1996-11-12 1998-06-10 Primax Electronics Ltd Computer mouse with additional display window controls
US6104380A (en) * 1997-04-14 2000-08-15 Ricoh Company, Ltd. Direct pointing apparatus for digital displays
EP2124135A1 (en) * 1997-06-02 2009-11-25 Sony Corporation Digital map display zooming method and device
EP0883055A2 (en) * 1997-06-02 1998-12-09 Sony Corporation Digital map display zooming method and device
EP0883055A3 (en) * 1997-06-02 2000-05-03 Sony Corporation Digital map display zooming method and device
US6424355B2 (en) 1997-06-02 2002-07-23 Sony Corporation Digital map display zooming method, digital map display zooming device, and storage medium for storing digital map display zooming program
US6201903B1 (en) 1997-09-30 2001-03-13 Ricoh Company, Ltd. Method and apparatus for pen-based faxing
US6181329B1 (en) 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US6492981B1 (en) 1997-12-23 2002-12-10 Ricoh Company, Ltd. Calibration of a system for tracking a writing instrument with multiple sensors
DE19837510A1 (en) * 1998-08-19 2000-02-24 Bayerische Motoren Werke Ag Device for controlling the reproduction of an image displayed on a vehicle screen
US6452570B1 (en) 1998-08-19 2002-09-17 Bayerische Motoren Werke Aktiengesellschaft Device for controlling the reproduction of an image displayed on a vehicle screen
EP1247157A1 (en) * 2000-01-14 2002-10-09 Sony Computer Entertainment Inc. Equipment, method and recording medium for enlargement, reduction and modification of shape of images depending on output from pressure-sensitive means
DE10325284A1 (en) * 2003-06-04 2005-01-13 3Dconnexion Gmbh Multidimensional input device for navigation and selection of visual objects
EP1484666A3 (en) * 2003-06-04 2007-09-05 3Dconnexion GmbH Multidimensional input device for navigating and selecting virtual objects
EP1484666A2 (en) * 2003-06-04 2004-12-08 3Dconnexion GmbH Multidimensional input device for navigating and selecting virtual objects
US9235934B2 (en) 2004-01-30 2016-01-12 Electronic Scripting Products, Inc. Computer interface employing a wearable article with an absolute pose detection component
US9939911B2 (en) 2004-01-30 2018-04-10 Electronic Scripting Products, Inc. Computer interface for remotely controlled objects and wearable articles with absolute pose detection component
US10191559B2 (en) 2004-01-30 2019-01-29 Electronic Scripting Products, Inc. Computer interface for manipulated objects with an absolute pose detection component
JP2007172633A (en) * 2006-12-28 2007-07-05 Sony Corp Method, device for magnification and reduction display of digital map, and storage medium which stores magnification and reduction display program of digital map
WO2008131544A1 (en) * 2007-04-26 2008-11-06 University Of Manitoba Pressure augmented mouse
US11577159B2 (en) 2016-05-26 2023-02-14 Electronic Scripting Products Inc. Realistic virtual/augmented/mixed reality viewing and interactions

Also Published As

Publication number Publication date
AU4144796A (en) 1996-05-31

Similar Documents

Publication Publication Date Title
EP0403782B1 (en) Three dimensional mouse with cavity
US5805144A (en) Mouse pointing device having integrated touchpad
US6906697B2 (en) Haptic sensations for tactile feedback interface devices
US7969418B2 (en) 3-D computer input device and method
US5335557A (en) Touch sensitive input control device
Hinckley et al. The videomouse: a camera-based multi-degree-of-freedom input device
US6115028A (en) Three dimensional input system using tilt
EP0662669B1 (en) Cursor positioning device
US5936612A (en) Computer input device and method for 3-D direct manipulation of graphic objects
EP0653725B1 (en) Co-ordinate input device
US5095303A (en) Six degree of freedom graphic object controller
JP3247630B2 (en) Pointing device, portable information processing apparatus, and method of operating information processing apparatus
US5748185A (en) Touchpad with scroll and pan regions
US6259382B1 (en) Isotonic-isometric force feedback interface
US8199107B2 (en) Input interface device with transformable form factor
US20040041787A1 (en) Method and apparatus for a hybrid pointing device used with a data processing system
US5563628A (en) Hand held computer cursor controller and command input device
US6480184B1 (en) Apparatus for entering data into a computer
US5111005A (en) Graphics tablet with n-dimensional capability
WO1996014633A1 (en) Multi-dimensional electrical control device
US20090109173A1 (en) Multi-function computer pointing device
WO2009008872A1 (en) Multi-dimensional input device with center push button
JPS59127143A (en) Manual computer controller
EP0782093A1 (en) Data input means
WO1997000713A1 (en) Method and apparatus for controlling images with a centrally located displacement control device

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU BB BG BR BY CA CH CN CZ DE DK EE ES FI GB GE HU IS JP KE KG KP KR KZ LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK TJ TM TT UA UG UZ VN

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): KE LS MW SD SZ UG AT BE CH DE DK ES FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase