WO1999060468A1 - Input unit, method for using the same and input system - Google Patents

Input unit, method for using the same and input system

Info

Publication number
WO1999060468A1
Authority
WO
WIPO (PCT)
Prior art keywords
input unit
image
function
images
inputting
Prior art date
Application number
PCT/SE1999/000718
Other languages
French (fr)
Inventor
Christer FÅHRAEUS
Ola Hugosson
Petter Ericson
Original Assignee
C Technologies Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from SE9801535A external-priority patent/SE511855C2/en
Priority claimed from SE9803455A external-priority patent/SE513940C2/en
Application filed by C Technologies Ab filed Critical C Technologies Ab
Priority to JP2000550019A priority Critical patent/JP2002516428A/en
Priority to EP99925538A priority patent/EP1073944A1/en
Priority to AU41794/99A priority patent/AU758036B2/en
Priority to CA002331073A priority patent/CA2331073A1/en
Priority to IL13910499A priority patent/IL139104A0/en
Priority to KR1020007012070A priority patent/KR20010052282A/en
Priority to BR9910083-5A priority patent/BR9910083A/en
Publication of WO1999060468A1 publication Critical patent/WO1999060468A1/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/038Control and interface arrangements therefor, e.g. drivers or device-embedded control circuitry
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0346Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of the device orientation or free movement in a 3D space, e.g. 3D mice, 6-DOF [six degrees of freedom] pointers using gyroscopes, accelerometers or tilt-sensors
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/14Image acquisition
    • G06V30/142Image acquisition using hand-held instruments; Constructional details of the instruments
    • G06V30/1423Image acquisition using hand-held instruments; Constructional details of the instruments the instrument generating sequences of position coordinates corresponding to handwriting
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V30/00Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V30/10Character recognition
    • G06V30/32Digital ink

Definitions

  • The inputted and identified character is preferably transferred to a computer in character-coded format by the intermediary of the radio transceiver 26 and is shown directly on the computer screen. If the input unit is used as a stand-alone unit, the character can be shown on the display 25 instead.
  • Fig. 5 schematically shows how images with overlapping contents are recorded when the input unit is moved in a path of movement forming the letter "R".
  • The contents of the images are not shown in Fig. 5.
  • Fig. 6 shows how an inputted letter R can be reproduced on the display of the input unit or the computer on the basis of the relative positions of the images in Fig. 5 determined by the input unit when the drawing function is used. In this case, a "drawn image" of the recorded character is shown with the aid of the movement vectors, not an interpreted character. Obviously, arbitrary drawn figures and characters can be input to the input unit or a computer in this manner.
  • Now suppose that the user wishes to use the input unit for recording predefined text on an information carrier, e.g. a sheet of paper, a newspaper, or a book.
  • In this case, he sets the input unit to the scanner function with the aid of the buttons 27, whereupon the input unit starts operating in the scanner mode.
  • When the input unit is activated, the processor 20 commands the LED 6 to begin generating strobe pulses, whereupon images are recorded in the same way as described above with respect to the mouse function.
  • When the inputting is complete, the user raises the unit off the sheet of paper and releases the activating button, whereupon the processor 20 turns off the LED 6.
  • In step 700, a starting image is recorded.
  • In step 701, a new image is recorded whose contents overlap those of the previous image.
  • In step 702, the best overlap position for the current image and the previous image is determined in the same way as described above with respect to the mouse function. In this position, the images are put together into a whole composite image, step 703 (a sketch of this stitching step appears after this list).
  • In step 704, the input unit detects whether the inputting of characters is complete. If not, the flow returns to step 701. If the user has released the activating button, indicating that the inputting is complete, the whole composite image is fed as an input signal to an OCR software which identifies and interprets the characters in the image, step 705.
  • The identified and interpreted characters are obtained in a predetermined character-coded format, e.g. ASCII code, as output signals from the OCR software. They are stored in the read/write memory in a memory area for interpreted characters.
  • The processor activates the indicator 29 to inform the user that it is ready to record a new character sequence, step 706.
  • The interpreted characters can be transferred to a computer or other receiver in character-coded format with the aid of the radio transceiver 26.
  • Figs 8a-8c illustrate how the input unit operates when the character sequence "Flygande bäckasiner" is recorded.
  • Fig. 8a shows the text on a sheet of paper.
  • Fig. 8b shows the images which are being recorded with the aid of the sensor.
  • The contents of the images partially overlap.
  • The letter "l" appears completely in image No. 1 and partially in image No. 2.
  • The degree of overlapping depends on the traction speed, i.e. the speed with which the user passes the input unit over the text, in relation to the frequency with which the contents of the sensor 8 are read out.
  • Fig. 8c shows what the whole composite image looks like. It should be noted that the image is still stored in the form of pixels.
  • The text "Flygande bäckasiner" is stored in the read/write memory 23 of the input unit as ASCII code.
  • Now suppose that the user wishes to record an image of an object located at a distance from the input unit.
  • The "object" could, for example, be three-dimensional or it could be an image in a book.
  • The user sets the input unit to the camera function with the aid of the buttons 27, whereupon the input unit begins to operate in the camera mode and the position of the lens system 7 changes to a position suitable for recording images of objects located at a distance from the input unit.
  • The user activates the input unit, whereupon the processor begins to read images from the sensor 8.
  • The read images can be shown either on the display 25 of the input unit or on a computer to which the input unit is connected and to which the images are transferred, as they are recorded, by the intermediary of the radio transceiver 26.
  • The flowchart in Fig. 9 illustrates how the input unit is adapted to operate in the camera mode.
  • In step 901, the extent of the image is indicated on the display 25 of the input unit.
  • The image is recorded with the aid of a plurality of pixels, which can either have grey scale values from white to black or have colour values.
  • In step 903, the image is stored in the memory 23.
  • Subsequently, the unit indicates, in step 904, that it is ready to record a new image. If the user does not wish to keep the image, the process continues, from step 902, along the dashed line back to step 901 in order for a new image to be recorded.
  • An input unit according to the invention need not comprise all of the functions listed above. It is possible to combine the mouse function with one or more of the scanner function, the camera function, the handwriting function, or other inputting functions.
  • In the embodiment described above, all processing of recorded information takes place in the input unit. This is not essential. All of the above measures except for the actual image-recording can be carried out with the aid of image-processing means in an external computer or in some other external unit to which the images are transferred.
  • In the embodiment described above, the recording of images is carried out by means of a single light-sensitive sensor.
  • It is also possible to use a second light-sensitive sensor, for instance in the other end of the casing. In this case, it is possible to use one end with the first sensor for the mouse function and the other end with the second sensor for one of the inputting functions.
  • It is not completely essential for the images to be recorded with overlapping contents in the mouse mode, the handwriting mode, and the drawing mode.
  • Instead, a special substrate with position indications can be used.
  • The positions can be written as coordinates, which are read and interpreted for providing positioning signals for a cursor or movement indications enabling the reproduction of a drawn image or a drawn character.
  • However, this has the drawback of requiring a special substrate as well as software for interpreting the position indications.
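The stitching of steps 702-703 referenced in the scanner bullets above can be sketched as follows: each frame is pasted into a composite canvas at its accumulated offset. The greyscale convention (0 = white, with darker pixels having larger values, so the darkest pixel wins in overlaps) and the fixed frame size are assumptions, not details given in the text.

```python
import numpy as np

WHITE = 0  # assumption: 0 = white, larger values = darker

def stitch(frames, offsets, frame_h, frame_w):
    """Combine overlapping frames into one composite image (steps 702-703).

    offsets[i] is the position of frame i+1 relative to frame i,
    as found by the overlap matching of step 702.
    """
    # Accumulate the absolute position of every frame.
    xs, ys = [0], [0]
    for dx, dy in offsets:
        xs.append(xs[-1] + dx)
        ys.append(ys[-1] + dy)
    # Allocate a canvas big enough to hold all frames.
    x0, y0 = min(xs), min(ys)
    canvas = np.full((max(ys) - y0 + frame_h, max(xs) - x0 + frame_w),
                     WHITE, dtype=np.uint8)
    # Paste each frame; the darkest pixel wins where frames overlap.
    for img, x, y in zip(frames, xs, ys):
        r, c = y - y0, x - x0
        region = canvas[r:r + frame_h, c:c + frame_w]
        np.maximum(region, img, out=region)
    return canvas  # the whole composite image, later fed to OCR (step 705)
```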

Abstract

An input unit has a mouse function and at least one inputting function. The input unit comprises image-recording means (8) which record images. These images are used for providing both the mouse function and the inputting function. The images are processed in the input unit or in some other unit.

Description

INPUT UNIT, METHOD FOR USING THE SAME AND INPUT SYSTEM
Field of the Invention
The present invention relates to an input unit having a mouse function and at least one inputting function, which input unit comprises image-recording means for providing the inputting function. The invention also relates to a method for providing a mouse function and at least one inputting function with the aid of an input unit, as well as an input system having a mouse function and at least one inputting function.
Background of the Invention
Today, personal computers are usually equipped with a computer mouse, which is used for positioning a cursor on the computer screen. The positioning is carried out by the user passing the mouse over a surface, the hand movement thus indicating how the mouse should be positioned. The mouse generates positioning signals indicating how the mouse is being moved and thus how the cursor should be moved. For this purpose, the mouse usually has a track ball, which turns as a result of friction against the surface when the mouse is passed over the same and which in this connection drives position sensors which in turn generate the positioning signals. Normally, the mouse can also be used for providing instructions to the computer by the intermediary of one or more buttons on which the user clicks. However, when the term "mouse function" is used below, it only refers to the function of positioning a cursor or the like.
To input text and images into a computer, a hand-held scanner is sometimes used, which images the text or image which is to be input with the aid of a light-sensitive sensor. The scanner can only image a very limited text/image area at one time. Consequently, in order to record one or several words or a whole image, the scanner must be passed over the text/image and several sub-images must be recorded. Usually, the scanner has some kind of position sensor which determines how these sub-images should be stored in the computer to enable the creation of a composite image therefrom. It is known to combine a mouse function and an inputting function in a casing which is connected with a single flex to a computer.
US 4,906,843, for example, shows a combined mouse, optical scanner, and digitising pad. In the mouse mode, a track ball is used, which drives two position sensors, which generate the positioning signals. In the scanner mode, a CCD line sensor as well as the position sensors are used for inputting characters or graphical information to the computer. US 5,355,146 shows a similar input unit with a combined mouse function and scanner function, which also utilises a track ball and a CCD line sensor.
EP 0 782 321 shows yet another input unit having a mouse function and scanner function. In this case, too, a track ball is used for the mouse function, but instead of the line sensor, use is made of an area sensor which is capable of imaging a document in a single step and which thus need not be moved across the document. This is said to have the advantage that no software is required for correlating image data with position data.
US 5,633,489 shows a combined mouse and barcode reader, where the mouse function is provided by means of a track ball and the barcode reader comprises a laser diode which generates a laser beam emitted from the underside of the mouse and a photo detector which detects the varying intensity of the reflected light.
All these known input units have a rather complicated mechanical design with moving parts and many sensors. Moreover, they only provide limited synergies between the functions combined in one and the same casing.
Summary of the Invention
It is thus an object of the present invention to provide an input unit having a mouse function and at least one inputting function as well as a method for providing the mouse function and the inputting function with the aid of the input unit, which input unit and which method reduce the above-mentioned deficiencies.
This object is achieved by an input unit according to claim 1, a method according to claim 19, and an input system according to claim 24. Preferred embodiments are stated in claims 2-18 and claims 20-23 respectively. An input unit according to the invention thus comprises image-recording means for providing said inputting function, with the image-recording means also being used to provide the mouse function. Instead of using the image-recording means for only one of these functions, according to the present invention they are used to provide both the mouse function and the inputting function, whereby improved integration of these functions is achieved. Furthermore, the track ball can be omitted and both functions can be based on the same sensor technology so that the signals for both functions can be handled by the same hardware and/or software. Thereby, real integration of the two functions is obtained rather than the two simply being brought together in the same casing as in the prior art. In addition, it becomes possible to provide an input unit without moving parts, which is advantageous from the point of view of durability and manufacturing. As will be evident below, it is also possible to provide an input unit which requires neither special position-determination means nor additional aids in the form of a raster or the like.
In this context, it should be noted that in this patent specification the term inputting function refers to a function whereby the user can input information to a receiver for storing and processing in the same, unlike the mouse function which is used for positioning purposes.
Furthermore, it should be noted that the mouse function can be used for positioning a cursor or the like in a plane or in space.
More specifically, the input unit is advantageously adapted to emit positioning signals for providing the mouse function, as well as inputting signals for providing said inputting function, the positioning signals as well as the inputting signals being based on images recorded by means of the image-recording means. The positioning signals can be used for controlling a cursor on a computer screen, while the inputting signals can contain information which is to be input to the computer. The positioning signals and the inputting signals can be emitted as electrical signals on leads, as IR signals, as radio signals, or in some other suitable way. Naturally, the input unit can also emit signals other than the positioning signals and the inputting signals, e.g. instruction signals based on clickings. The receiver of the signals can be a computer or some other input unit to which positioning information and/or other information is to be input. The input unit is especially suitable for use with small portable computers where it is desirable to have few, but versatile, accessories.
The image-recording means may comprise a first image-recording unit for providing the mouse function and a second image-recording unit for providing the inputting function. This may be particularly advantageous if different image-recording characteristics are desired for the two functions, e.g. if different foci are desired for the image-recording. For example, when using the mouse function one may wish to be able to move the input unit across a surface in the same way as one would move a traditional mouse, and when using the inputting function one may wish to be able to use the input unit as a camera for imaging objects located at a distance from the unit. In this case, the different image-recording units can be provided with different lens means with different foci. The image-recording units can, for example, be located on different sides of the input unit, but have shared hardware and software.
Alternatively, the image-recording means may comprise an image-recording unit which is used for providing both the mouse function and the inputting function. This embodiment is advantageous because it requires fewer components in the input unit and only one beam path.
The image-recording units may comprise any type of sensor which can be used for recording an image, but should preferably be a light-sensitive sensor with a two-dimensional sensor surface, a so-called area sensor. In a less complex embodiment of the input unit, both the positioning signals and the inputting signals may essentially consist of the actual images recorded by the image-recording means. In this case, essentially all processing of the images takes place in the receiver of the signals, e.g. in a computer. If so, the latter must have software for processing the signals in a suitable manner. Such software may already be stored in the computer or may, for example, be included in the input unit according to the invention and be transferred to the receiver when the input unit is in use.
The receiver of the signals from the input unit must be capable of determining whether the signals are intended as positioning signals or as inputting signals so that it will know how to process the signals. For this purpose, the input unit is adapted to output the positioning signals and the inputting signals in such a way that the receiver can identify whether it is receiving positioning signals or inputting signals. For example, the input unit may use different protocols for the different signals. Suitably, the input unit should also know whether the user wishes to use the mouse function or the inputting function so that it will know how the images recorded by the image recorder should be processed. For this purpose, the input unit preferably comprises switching means, e.g. a button, which are adapted to switch the input unit between its different functions when acted upon by the user.
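Where the text leaves the distinction mechanism open ("the input unit may use different protocols"), a minimal sketch of one way a receiver could tell the two signal kinds apart is shown below. The one-byte type tag, the packet layout, and all names here are assumptions for illustration, not the patent's own protocol.

```python
import struct

POSITIONING = 0x01  # hypothetical tag values
INPUTTING = 0x02

def encode_positioning(dx: int, dy: int) -> bytes:
    # Pack a movement vector as a positioning signal.
    return struct.pack(">Bhh", POSITIONING, dx, dy)

def encode_inputting(payload: bytes) -> bytes:
    # Pack inputted data (e.g. character-coded text) as an inputting signal.
    return struct.pack(">BH", INPUTTING, len(payload)) + payload

def dispatch(packet: bytes) -> str:
    # Receiver side: route a packet on its type tag.
    if packet[0] == POSITIONING:
        dx, dy = struct.unpack(">hh", packet[1:5])
        return f"move cursor by ({dx}, {dy})"
    (length,) = struct.unpack(">H", packet[1:3])
    return f"store {length} bytes of inputted data"

# Usage:
print(dispatch(encode_positioning(3, -1)))
print(dispatch(encode_inputting(b"R")))
```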
Preferably, at least for the mouse function, the image-recording means are adapted to record a plurality of images in such a way that the contents of each image overlap the contents of the previous image, if any. This can be achieved by recording the images with sufficiently high frequency in relation to the expected speed of movement. By virtue of the fact that the images overlap, their relative positions are determined and there is no need to use special position-determination means. Once the images have been recorded, the subsequent processing can take place either in image-processing means in the input unit or in the receiver of the signals from the input unit. The advantages of processing at least the inputting signals in the input unit are that, in this way, the input unit can be used as a stand-alone unit without being connected to an adjacent receiver, that information which has been input can be shown directly on a display on the input unit so that the user can check that the information recorded really is the information he intended to record, and that the information can be transferred in a more compressed format to the receiver. Furthermore, the input unit can be connected to any receiver that supports a mouse with no special software being required in the receiver for processing the images.
Thus, the input unit advantageously comprises image-processing means used for both the mouse function and the inputting function. These image-processing means may comprise a processing unit operating according to different program modules depending upon which function of the input unit is being used. Moreover, for processing the images in the input unit, the input unit advantageously comprises means for determining the relative position of the images with the aid of the partially overlapping contents. The means for determining the relative position of the images may be included in the shared image-processing means and be implemented by means of software.
If the mouse function is used for linear positioning only, it is sufficient to determine the relative position of the images horizontally. However, if it is to be used for two-dimensional positioning, the relative position must be determined both horizontally and vertically.
In an advantageous embodiment, the input unit comprises means for generating the positioning signals on the basis of the relative position of the images. The positioning signals can, for example, be composed of one or more vectors indicating how the input unit has been moved between the recording of two images, or of one or more positioning coordinates. The means for generating the positioning signals can also be included in the shared image-processing means and be implemented by means of software.
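As an illustration of the two signal forms just mentioned, the sketch below turns a best-overlap offset into a movement vector and then accumulates vectors into positioning coordinates. The sign convention (the unit moves opposite to the offset at which the new image's content is found) is an assumption.

```python
def movement_vector(dx: int, dy: int) -> tuple:
    # If the new image's content is found at offset (dx, dy) relative
    # to the previous image, the unit itself moved the opposite way.
    return (-dx, -dy)

# Accumulating the vectors yields positioning coordinates instead:
x = y = 0
for dx, dy in [(3, 0), (2, -1), (4, 1)]:   # made-up overlap offsets
    mx, my = movement_vector(dx, dy)
    x, y = x + mx, y + my
print(x, y)
```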
The input unit is advantageously hand-held so that it can be carried everywhere. This technology thus enables the user to have a personal mouse and input unit with stored personal settings and personal information.
In a particularly advantageous embodiment, the input unit also comprises a transmitter for wireless connection of the input unit to a receiver, which further facilitates the use of the input unit. The Bluetooth standard can advantageously be used for this purpose.
Advantageously, the inputting function comprises a scanner function so that the input unit can be used for recording text and/or images. The inputting function can also comprise a camera function, wherein the image-recording means are utilised for imaging objects located at a distance from the input unit.
The inputting function can also comprise a function for inputting handwritten/drawn, i.e. hand-generated, information. Each of the scanner function, the camera function, and the handwriting/drawing function can be the only inputting function or one of several inputting functions.
The input unit can thus have a plurality of functions, all of which are based on images which are recorded by the image-recording means and which are processed efficiently by means of shared hardware and software.
According to a second aspect of the invention, it relates to a method for providing a mouse function and at least one inputting function with the aid of an input unit, comprising the steps of detecting which of said functions is desired; recording at least one image with the aid of the input unit; and processing said at least one image in different ways depending upon which of said functions is desired.
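A minimal sketch of this method, with one recording path and processing branched on the selected function, might look as follows; the mode names and the returned descriptions are purely illustrative, not terms from the claims.

```python
def process_image(mode: str, image) -> str:
    # Detect which function is desired, then process the same
    # recorded image differently depending on that function.
    if mode == "mouse":
        return "match against previous image, emit positioning signal"
    if mode in ("scanner", "handwriting"):
        return "match against previous image, accumulate for inputting"
    if mode == "camera":
        return "store or transmit the image as recorded"
    raise ValueError(f"unknown function: {mode}")
```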
The advantages of the method according to the invention are evident from the above description of the input unit according to the invention.
According to a third aspect of the invention, it relates to an input system having a mouse function and at least one inputting function, comprising image-recording means for recording images and image-processing means for processing the images recorded by the image-recording means for providing the mouse function and said at least one inputting function, the image-recording means being located in a first casing and the image-processing means being located in a second casing.
Accordingly, the input system comprises the case where the image-recording means are located in an input unit and the image-processing means are located in a computer or other receiver to which the input unit is connected and to which it transmits recorded images. Everything that has been stated above with respect to the image-recording means and the processing of the images recorded by the image-recording means also applies to the input system.
Brief Description of the Drawings
The present invention will now be described in more detail by way of an embodiment with reference to the accompanying drawings, in which
Fig. 1 schematically shows an embodiment of an input unit according to the invention;
Fig. 2 is a block diagram of the electronic circuitry in an embodiment of an input unit according to the invention;
Fig. 3 is a flowchart of the mouse function;
Fig. 4 is a flowchart of the handwriting/drawing function;
Fig. 5 schematically shows how a surface is imaged in connection with the inputting of handwritten information;
Fig. 6 shows how the handwritten input can be shown on a display;
Fig. 7 is a flowchart of the scanner function;
Figs 8a-8c schematically show how text is recorded in the scanner mode; and
Fig. 9 is a flowchart of the camera function.
Description of a Preferred Embodiment
The following is a description of an embodiment of an input unit according to the invention having a mouse function, a scanner function, a camera function, as well as a function for inputting handwritten/drawn information.
Design of the Unit
Fig. 1 shows the design of the input unit according to this embodiment. The unit has a casing 1 having approximately the same shape as a conventional highlighter pen. One short side of the casing has a window 2, by the intermediary of which images are recorded for the various image-based functions of the input unit.
The casing 1 essentially contains an optics part 3, an electronic circuitry part 4, and a power supply 5. The optics part 3 comprises a light-emitting diode 6, a lens system 7, and an image-recording means in the form of a light-sensitive sensor 8, which constitutes the interface with the electronic circuitry part 4.
The task of the LED 6 is to illuminate a surface which is currently located under the window. A diffuser 9 is mounted in front of the LED 6 for diffusing the light.
The lens system 7 has the task of projecting an image of the surface located under the window 2 on the light-sensitive sensor 8 as accurately as possible. The lens system is displaceable between two positions, the second of which is indicated by dashed lines. The first position is used when images are to be recorded of a surface located directly below the window of the input unit and is primarily intended for the mouse function, the scanner function, and the handwriting/drawing function. The second position is used when images are to be recorded of objects located at a distance from the input unit and is primarily intended for the camera function, but can also be used for the other functions. In this example, the light-sensitive sensor 8 comprises a two-dimensional, square CCD unit (CCD = charge coupled device) with a built-in A/D converter. Such sensors are commercially available. The sensor 8 is mounted at a small angle to the window 2 and on its own circuit board 11.
The power supply to the input unit is obtained from a battery 12 which is mounted in a separate compartment 13 in the casing.
The block diagram in Fig. 2 schematically shows the electronic circuitry part 4. This comprises a processor 20, which by the intermediary of a bus 21 is connected to a ROM 22, in which the programs of the processor are stored, to a read/write memory 23, which constitutes the working memory of the processor and in which the images from the sensor are stored, to a control logic unit 24, as well as to the sensor 8 and the LED 6. The processor 20, the bus 21, the memories 22 and 23, the control logic unit 24, as well as associated software together constitute image-processing means.
The control logic unit 24 is in turn connected to a number of peripheral units, comprising a display 25, which is mounted in the casing, a radio transceiver 26 for transferring information to/from an external computer, buttons 27, by means of which the user can control the input unit and specifically switch the input unit between the mouse function, the scanner function, the camera function, and the handwriting/drawing function, buttons 27' corresponding to the clicking buttons on a traditional mouse, a tracer LED 28 which emits a light beam, making it easier for the user to know which information he is inputting, as well as an indicator 29, e.g. an LED, indicating when the pen is ready to be used. Control signals to the memories, the sensor 8, and the peripheral units are generated in the control logic unit 24. The control logic also handles generation and prioritisation of interrupts to the processor. The buttons 27 and 27', the radio transceiver 26, the display 25, the tracer LED 28, and the LED 6 are accessed by the processor writing and reading in a register in the control logic unit 24. The buttons 27 and 27' generate interrupts to the processor 20 when they are activated.
Operation of the Unit
The various functions of the input unit, viz. the mouse function, the scanner function, the handwriting/drawing function, and the camera function, will now be described. All of these functions are based on images which are recorded with the aid of the sensor 8. When the first three functions are used, a plurality of images are recorded in such a way that the contents of each image partially overlap the contents of the previous image, if any. As the images are being recorded, the relative position of the images is determined, i.e. the position which affords the best possible correspondence between their contents. Subsequently, the processing is carried out depending upon the function selected by the user. When recording the images, the input unit can be passed over a surface with the window 2 in contact with the same, or be held at a small or at a larger distance from the surface depending upon the setting of the lens system. The surface need not be plane. For example, it could be a sheet of paper with text on it, a wall covered with patterned wallpaper, or a bowl of sweets. What is important is that images with varying contents can be recorded so that the relative positions of the images can be determined with the aid of the contents of the images.
The Mouse Function
First, suppose that the user wishes to use the input unit as a mouse. In this case, he sets the unit to the mouse function with the aid of the buttons 27, whereupon the input unit starts operating m the mouse mode, and logs into the computer for which the input unit is to operate as a mouse. Subsequently, the user directs the window 2 of the input unit at a patterned surface, e.g. a mouse pad. He presses one of the buttons 27 to activate the input unit, whereupon the processor 20 commands the
LED 6 to begin generating strobe pulses at a predetermined frequency, suitably at least 50 Hz. Subsequently, the user passes the input unit over the surface in the same way as if it were a traditional mouse, whereupon images with partially overlapping contents are recorded by the sensor 8 and are stored in the read/write memory 23. The images are stored as images, i.e. with the aid of a plurality of pixels, each having a grey scale value in a range from white to black. The flowchart in Fig. 3 shows in more detail how the input unit operates when the mouse function is used. In step 300, a starting image is recorded. In step 301, the next image is recorded. The contents of this image partially overlap the contents of the previous image.
As soon as an image has been recorded in step 301, the process begins of determining how it overlaps the previous image both vertically and horizontally, step 302, i.e. in which relative position the best match is obtained between the contents of the images. For this purpose, every possible overlap position between the images is examined at the pixel level, and an overlap measurement is determined as follows:
1) For each overlapping pixel position, the grey scale values of the two relevant pixels are added up if the latter are not white. Such a pixel position, in which none of the pixels are white, is designated a plus position.
2) The grey scale sums for all the plus positions are added up.
3) The neighbours of each pixel position are examined. If an overlapping pixel position is not a neighbour of a plus position and consists of one pixel which is white and one pixel which is not white, the grey scale value of the non-white pixel is subtracted, possibly multiplied by a constant, from the sum in point 2).

4) The overlap position providing the highest overlap measurement as stated above is selected.
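By way of illustration, the overlap measurement can be sketched in a few lines of Python. This is a minimal sketch, not the patented implementation: the encoding of white as grey scale value 0, the value of the penalty constant MISMATCH_WEIGHT, and the restriction of the exhaustive search to shifts of at most max_shift pixels are all assumptions made here for brevity.

```python
import numpy as np

WHITE = 0              # assumption: white encodes as 0, black as 255
MISMATCH_WEIGHT = 0.5  # the "constant" of point 3); its value is an assumption

def overlap_score(prev, curr, dy, dx):
    """Overlap measurement for one candidate position (dy, dx) of curr over prev."""
    h, w = prev.shape
    # Sub-areas of the two images that actually overlap at this shift.
    p = prev[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)]
    c = curr[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    plus = (p != WHITE) & (c != WHITE)                    # point 1): plus positions
    score = float((p[plus].astype(int) + c[plus]).sum())  # point 2): sum them up
    # Point 3): penalise white/non-white pairs with no plus position next to them.
    near_plus = np.zeros_like(plus)
    near_plus[1:, :] |= plus[:-1, :]; near_plus[:-1, :] |= plus[1:, :]
    near_plus[:, 1:] |= plus[:, :-1]; near_plus[:, :-1] |= plus[:, 1:]
    mismatch = ((p == WHITE) ^ (c == WHITE)) & ~near_plus
    # One of the two pixels is white (0), so the pair sum is the non-white value.
    score -= MISMATCH_WEIGHT * float((p[mismatch].astype(int) + c[mismatch]).sum())
    return score

def best_overlap(prev, curr, max_shift=8):
    """Point 4): pick the shift with the highest overlap measurement."""
    shifts = ((dy, dx) for dy in range(-max_shift, max_shift + 1)
                       for dx in range(-max_shift, max_shift + 1))
    return max(shifts, key=lambda s: overlap_score(prev, curr, *s))
```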
Our Swedish patent application No. 9704924-1 and the corresponding U.S. application No. 024,641 describe an alternative way of matching the images in order to find the best overlap position. The content of these applications is herewith incorporated by reference.
As soon as the best overlap position between the current image and the previous image has been determined, the previous image is discarded, whereupon the current image becomes the previous image in relation to the next image recorded. By determining the relative position of the two images, a movement vector is obtained, which indicates how far and in which direction the input unit has been moved between the recording of the two images. Subsequently, a positioning signal, which includes this movement vector, is transmitted, step 303, by the radio transceiver 26 to the computer for which the input unit is operating as a mouse. The computer uses the movement vector for positioning the cursor on its screen. Subsequently, the flow returns to step 301. In order to increase the speed, the steps can partly be carried out in parallel, e.g. by starting the recording of the next image while the relative position of the current and the previous image is being determined. In the mouse mode, the buttons 27' can be used as clicking buttons for inputting instructions to the computer.
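Put together, the mouse mode amounts to a small loop around the matching step. The sketch below reuses best_overlap from the sketch above; the sensor and transceiver objects are hypothetical stand-ins for the sensor 8 and the radio transceiver 26.

```python
def mouse_loop(sensor, transceiver):
    """Steps 300-303: record images, match each against the previous one,
    and emit the resulting movement vector as a positioning signal."""
    prev = sensor.capture()                 # step 300: starting image
    while True:
        curr = sensor.capture()             # step 301: overlapping image
        dy, dx = best_overlap(prev, curr)   # step 302: best overlap position
        transceiver.send(("MOVE", dx, dy))  # step 303: positioning signal
        prev = curr                         # current image becomes "previous"
```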
The Handwriting and Drawing Function

Next, suppose that the user wishes to input handwritten text to his computer. In this case, with the aid of the buttons 27, he sets the input unit to the handwriting function, whereupon the input unit starts to operate in the handwriting mode. When the user subsequently activates the input unit, the processor 20 commands the LED 6 to begin generating strobe pulses at the predetermined frequency. Subsequently, the user "writes" the text he wishes to input with the input unit directed at the selected surface, whereupon the sensor 8 records images with partially overlapping contents and stores them in the read/write memory 23. The tracer LED 28 successively indicates the path of movement on the surface by means of a luminous spot to give the user an idea of the movement. The text is input one character at a time. Between each character, the user indicates that an information unit has been input, for example by releasing the activating button 27 for a short time or by not moving the input unit for a short time. Fig. 4 illustrates in more detail how the input unit operates in the handwriting and drawing mode. The first three steps correspond to those carried out in the mouse mode. When the input unit is activated, a starting image is recorded, step 400. The next image, whose contents overlap the previous image, is recorded, step 401, and their relative position is determined, step 402, with the aid of the overlapping contents, whereby a movement vector is obtained. Subsequently, the processor 20 determines whether the inputting of an information unit is complete or not, step 403. If not, the flow returns to step 401 and the next image is recorded. If the inputting is complete, the processor 20 passes the movement vectors determined for the information unit in question to an ICR module, which identifies which character the movement vectors represent, step 404. Subsequently, the identified character is stored in character-coded format in the memory, step 405, and the input unit indicates that it is ready to input a new information unit, step 406.
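In outline, the handwriting mode segments the stream of movement vectors into information units and hands each unit to the character recogniser. A minimal sketch, in which sensor, button, icr, and store are hypothetical stand-ins for the corresponding parts of the unit:

```python
def handwriting_loop(sensor, button, icr, store):
    """Steps 400-406: collect movement vectors for one information unit,
    then let the ICR module interpret them as a character."""
    while True:
        button.wait_until_pressed()
        prev = sensor.capture()                      # step 400: starting image
        vectors = []
        while button.pressed():                      # release ends the unit (step 403)
            curr = sensor.capture()                  # step 401
            vectors.append(best_overlap(prev, curr)) # step 402: movement vector
            prev = curr
        char = icr.classify(vectors)                 # step 404: identify the character
        store.append(char)                           # step 405: character-coded format
        # step 406: the unit signals it is ready for the next information unit
```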
The inputted and identified character is preferably transferred to a computer in character-coded format by the intermediary of the radio transceiver 26 and is shown directly on the computer screen. If the input unit is used as a stand-alone unit, the character can be shown on the display 25 instead.
Fig. 5 schematically shows how images with overlapping contents are recorded when the input unit is moved in a path of movement forming the letter "R". For the sake of simplicity, the contents of the images are not shown in Fig. 5.
If, instead, the user indicates that the drawing function is to be used, only steps 400-403 are implemented. Fig. 6 shows how an inputted letter R can be reproduced on the display of the input unit or the computer on the basis of the relative positions of the images in Fig. 5 determined by the input unit when the drawing function is used. In this case, a "drawn image" of the recorded character is shown with the aid of the movement vectors, not an interpreted character. Obviously, arbitrary drawn figures and characters can be input to the input unit or a computer in this manner.
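Reproducing the drawn image amounts to accumulating the movement vectors into a polyline; a minimal sketch:

```python
def trace_points(vectors, start=(0, 0)):
    """Accumulate successive movement vectors into the polyline that
    reproduces the drawn figure (drawing mode, steps 400-403 only)."""
    y, x = start
    points = [(y, x)]
    for dy, dx in vectors:
        y, x = y + dy, x + dx
        points.append((y, x))
    return points
```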
The Scanner Function

Now, suppose that the user wishes to use the input unit for recording predefined text on an information carrier, e.g. a sheet of paper, a newspaper, or a book. In this case, he sets the input unit to the scanner function with the aid of the buttons 27, whereupon the input unit starts operating in the scanner mode.
Subsequently, he directs the input unit at the sheet of paper with the text, in the location where he wishes to begin recording text, activates the input unit with the aid of the buttons 27, and passes it over the text which is to be recorded, following the text in the same manner as when one reads it. The tracer LED 28 emits a light beam which makes it easier to follow the lines. When the user activates the input unit, the processor 20 commands the LED 6 to begin generating strobe pulses, and images are recorded in the same way as described above with respect to the mouse function. When the user has passed the input unit over the selected text or has come to the end of a line of characters, he lifts the unit off the sheet of paper and releases the activating button, whereupon the processor 20 turns off the LED 6.
The flowchart in Fig. 7 illustrates in more detail how the input unit operates in this mode. In step 700, a starting image is recorded. In step 701, a new image is recorded whose contents overlap those of the previous image. In step 702, the best overlap position for the current image and the previous image is determined in the same way as described above with respect to the mouse function. In this position, the images are put together into a whole composite image, step 703. In step 704, the input unit detects whether the inputting of characters is complete. If not, the flow returns to step 701. If the user has released the activating button, indicating that the inputting is complete, the whole composite image is fed as an input signal to OCR software which identifies and interprets the characters in the image, step 705. The identified and interpreted characters are obtained in a predetermined character-coded format, e.g. ASCII code, as output signals from the OCR software. They are stored in the read/write memory in a memory area for interpreted characters. When the character identification and storing in character-coded format are finished, the processor activates the indicator 29 to inform the user that it is ready to record a new character sequence, step 706. The interpreted characters can be transferred to a computer or other receiver in character-coded format with the aid of the radio transceiver 26.
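The compositing of step 703 can be sketched as pasting each new image into a growing canvas at its best overlap position. The sketch below again assumes that white encodes as 0 (so the darker pixel wins where the images overlap) and, for brevity, that offsets into the canvas are non-negative:

```python
import numpy as np

def composite(whole, curr, dy, dx, white=0):
    """Step 703: paste curr into the composite image at offset (dy, dx),
    growing the canvas as needed; overlapping pixels keep the darker value."""
    h, w = curr.shape
    grown = np.full((max(whole.shape[0], dy + h),
                     max(whole.shape[1], dx + w)), white, dtype=whole.dtype)
    grown[:whole.shape[0], :whole.shape[1]] = whole
    grown[dy:dy + h, dx:dx + w] = np.maximum(grown[dy:dy + h, dx:dx + w], curr)
    return grown
```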
The above steps are thus carried out by the processor 20 with the aid of the associated units and suitable software. Such software can be created by the skilled person with the aid of the above instructions if it is not commercially available.
Figs 8a-8c illustrate how the input unit operates when the character sequence "Flygande bäckasiner" is recorded. Fig. 8a shows the text on a sheet of paper. Fig. 8b shows the images which are being recorded with the aid of the sensor. As can be seen from this Figure, the contents of the images partially overlap. For example, the letter l appears completely in image No. 1 and partially in image No. 2. The degree of overlapping depends on the traction speed, i.e. the speed with which the user passes the input unit over the text, in relation to the frequency with which the contents of the sensor 8 are read out. Fig. 8c shows what the whole composite image looks like. It should be noted that the image is still stored in the form of pixels. When the method has been carried out, the text "Flygande bäckasiner" is stored in the read/write memory 23 of the input unit as ASCII code.
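The relation between traction speed and read-out frequency fixes the degree of overlap. A worked example: the 50 Hz rate is the strobe frequency mentioned earlier, while the 5 mm field of view (and the coupling of read-out rate to strobe rate) are assumptions made only for this illustration.

```python
def overlap_fraction(speed_mm_s, fov_mm, frame_rate_hz):
    """One-dimensional estimate of the overlap between consecutive images."""
    advance_mm = speed_mm_s / frame_rate_hz  # distance moved between read-outs
    return max(0.0, 1.0 - advance_mm / fov_mm)

# At a traction speed of 100 mm/s over an assumed 5 mm field of view read
# out at 50 Hz, consecutive images still share 60 % of their contents:
print(overlap_fraction(100, 5, 50))  # 0.6
```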
The Camera Function
Next, assume that the user wishes to record an image of an object located at a distance from the input unit. The "object" could, for example, be three-dimensional, or it could be an image in a book. In this case, the user sets the input unit to the camera function with the aid of the buttons 27, whereupon the input unit begins to operate in the camera mode and the position of the lens system 7 changes to a position suitable for recording images located at a distance from the input unit. Subsequently, the user activates the input unit, whereupon the processor begins to read images from the sensor 8. The read images can be shown either on the display 25 of the input unit or on a computer to which the input unit is connected and to which the images are transferred, as they are recorded, by the intermediary of the radio transceiver 26. When the user is satisfied with the appearance of the image, he presses one of the buttons 27, which then records an image of the object. When the image of the object has been recorded, the user can command the input unit to show the recorded image on the display 25 or to transfer the image to the computer by the intermediary of the radio transceiver 26. The flowchart in Fig. 9 illustrates how the input unit is adapted to operate in the camera mode. In step 901, the extent of the image is indicated on the display 25 of the input unit. When the user is satisfied with the appearance of the image, he presses the button 27, whereupon the image is frozen and recorded in a buffer memory in step 902. The image is recorded with the aid of a plurality of pixels, which can either have grey scale values from white to black or have colour values. The user can then choose whether or not he wishes to keep the current image. If the user decides to keep the image, the process continues along the solid line to step 903, in which the image is stored in the memory 23. When the image has been stored, the unit indicates, in step 904, that it is ready to record a new image. If the user does not wish to keep the image, the process continues, from step 902, along the dashed line back to step 901 in order for a new image to be recorded.
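As a control-flow sketch of Fig. 9 (the sensor, display, button, memory, and confirm objects are hypothetical stand-ins for the corresponding parts of the unit):

```python
def camera_loop(sensor, display, button, memory, confirm):
    """Steps 901-904: live preview, freeze on button press, keep or discard."""
    while True:
        frame = sensor.capture()
        display.show(frame)              # step 901: indicate the image extent
        if button.pressed():
            frozen = frame               # step 902: freeze into a buffer memory
            if confirm.keep():           # solid line: the user keeps the image
                memory.store(frozen)     # step 903: store in the memory
                display.indicate_ready() # step 904: ready for a new image
            # dashed line: otherwise fall through and return to step 901
```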
Alternative Embodiments

The above embodiment is described by way of example only. An input unit according to the invention need not comprise all of the functions listed above. It is possible to combine the mouse function with one or more of the scanner function, the camera function, the handwriting function, or other inputting functions.
In the embodiment described above, all processing of recorded information takes place in the input unit. This is not essential. All of the above measures except for the actual image-recording can be carried out with the aid of image-processing means in an external computer or in some other external unit to which the images are transferred. In the embodiment described above, the recording of images is carried out by means of a single light-sensitive sensor. However, it is also possible to arrange a second light-sensitive sensor, for instance, in the other end of the casing. In this case, it is possible to use one end with the first sensor for the mouse function and the other end with the second sensor for one of the inputting functions.
It is not completely essential for the images to be recorded with overlapping contents in the mouse mode, the handwriting mode and the drawing mode. In these modes, a special substrate with position indications can be used instead. For example, the positions can be written as coordinates, which are read and interpreted for providing positioning signals for a cursor or movement indications enabling the reproduction of a drawn image or a drawn character. However, this has the drawback of requiring a special substrate as well as software for interpreting the position indications.
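With such a coded substrate, each image yields an absolute position directly, and the movement indications follow by differencing. A sketch, assuming the positions have already been decoded from the substrate:

```python
def vectors_from_positions(positions):
    """Derive movement vectors from absolute positions read off a
    position-coded substrate, in place of the overlap matching above."""
    return [(y1 - y0, x1 - x0)
            for (y0, x0), (y1, x1) in zip(positions, positions[1:])]
```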

Claims

1. An input unit having a mouse function and at least one inputting function, comprising image-recording means (8) for providing said inputting function, c h a r a c t e r i s e d in that the image-recording means (8) are also used for providing the mouse function.
2. An input unit according to claim 1, wherein the input unit is adapted to emit positioning signals for providing the mouse function, and inputting signals for providing said inputting function, both the positioning signals and the inputting signals being based upon images recorded with the aid of said image-recording means (8).
3. An input unit according to claim 1 or 2, wherein the image-recording means (8) comprise a first image-recording unit for providing the mouse function and a second image-recording unit for providing the inputting function.
4. An input unit according to claim 1 or 2, wherein the image-recording means (8) comprise an image-recording unit which is used for providing both the mouse function and the inputting function.
5. An input unit according to any one of the preceding claims, wherein the image-recording means comprise at least one area sensor.
6. An input unit according to any one of claims 2-5, wherein both the positioning signals and the inputting signals are essentially composed of said images.
7. An input unit according to any one of claims 2-6, wherein the input unit is adapted to output the positioning signals and the inputting signals in a way that enables a receiver to identify whether it is receiving the positioning signals or the inputting signals.
8. An input unit according to any one of the preceding claims, further comprising switching means (27), which, when acted upon by a user, are adapted to switch the input unit between its different functions.
9. An input unit according to any one of the preceding claims, wherein the image-recording means (8), at least with respect to the mouse function, are adapted to record a plurality of images in such a way that the content of each image overlaps the content of the previous image.
10. An input unit according to any one of the preceding claims, further comprising image-processing means (20-24) which are used for both the mouse function and the inputting function.
11. An input unit according to claim 9 or 10, further comprising means (20-24) for determining the relative position of the images with the aid of the partially overlapping contents.
12. An input unit according to claim 11, wherein the means (20-24) for determining the relative position of the images are adapted to determine the relative position of the images horizontally as well as vertically.
13. An input unit according to claim 11 or 12, further comprising means (20-24) for generating the positioning signals on the basis of the relative position of the images.
14. An input unit according to any one of the preceding claims, which input unit is hand-held.
15. An input unit according to any one of the preceding claims, further comprising a transmitter (26) for wireless connection of the input unit to a receiver.
16. An input unit according to any one of the preceding claims, wherein said at least one inputting function comprises a scanner function.
17. An input unit according to any one of the preceding claims, wherein said inputting function comprises a camera function for imaging objects located at a distance from the input unit.
18. An input unit according to any one of the preceding claims, wherein said inputting function comprises a function for inputting handwritten/drawn information.
19. A method for providing a mouse function and at least one inputting function with the aid of an input unit, comprising the steps of detecting which of said functions is desired; recording at least one image with the aid of the input unit; and processing said at least one image in different ways depending upon which of said functions is desired.
20. A method according to claim 19, which, when the mouse function is desired, further comprises the steps of recording at least two images with partially overlapping contents; determining the relative position of the images both horizontally and vertically; determining a positioning signal on the basis of the relative position of the images; and using the positioning signal for controlling a cursor on a computer screen.
21. A method according to claim 19 or 20, in which the inputting function is a character-inputting function and which, when the character-inputting function is desired, further comprises the steps of recording a plurality of images with partially overlapping contents, which images together image the characters which are to be input; putting the images together both horizontally and vertically into a composite image; identifying characters in the composite image; interpreting the identified characters; and storing the characters in character-coded format.
22. A method according to any one of claims 19-21, in which the inputting function is a function for inputting handwritten/drawn information and which, when the handwriting/drawing function is desired, further comprises the steps of recording a plurality of images with partially overlapping contents; determining the relative position of the images both horizontally and vertically; and determining the movement of the input unit on the basis of the relative position of the images.
23. A method according to claim 22, further comprising the step of interpreting which character the movement of the input unit represents and storing this character in character-coded format.
24. An input system having a mouse function and at least one inputting function, comprising image-recording means (8) for recording images and image-processing means (20-24) for processing the images recorded by the image-recording means for providing the mouse function and said at least one inputting function, the image-recording means being located in a first casing and the image-processing means being located in a second casing.
PCT/SE1999/000718 1998-04-30 1999-04-30 Input unit, method for using the same and input system WO1999060468A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2000550019A JP2002516428A (en) 1998-04-30 1999-04-30 Input device, method of using the same, and input system
EP99925538A EP1073944A1 (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system
AU41794/99A AU758036B2 (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system
CA002331073A CA2331073A1 (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system
IL13910499A IL139104A0 (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system
KR1020007012070A KR20010052282A (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system
BR9910083-5A BR9910083A (en) 1998-04-30 1999-04-30 Input unit, method for using it and input system

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
SE9801535A SE511855C2 (en) 1998-04-30 1998-04-30 Handwritten character recording device for characters, symbols, graphs, calligraphy
SE9801535-7 1998-04-30
US9132398P 1998-06-30 1998-06-30
US60/091,323 1998-06-30
SE9803455-6 1998-10-09
SE9803455A SE513940C2 (en) 1998-04-30 1998-10-09 Unit and input system with mouse function and input function and ways to use the unit
US10578098P 1998-10-27 1998-10-27
US60/105,780 1998-10-27

Publications (1)

Publication Number Publication Date
WO1999060468A1 true WO1999060468A1 (en) 1999-11-25

Family

ID=27484809

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/SE1999/000718 WO1999060468A1 (en) 1998-04-30 1999-04-30 Input unit, method for using the same and input system

Country Status (9)

Country Link
EP (1) EP1073944A1 (en)
JP (1) JP2002516428A (en)
KR (1) KR20010052282A (en)
CN (1) CN1152296C (en)
AU (1) AU758036B2 (en)
BR (1) BR9910083A (en)
CA (1) CA2331073A1 (en)
IL (1) IL139104A0 (en)
WO (1) WO1999060468A1 (en)

Cited By (22)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2001061452A2 (en) * 2000-02-16 2001-08-23 Telefonaktiebolaget Lm Ericsson (Publ) Method for sharing information between electronic reading devices
WO2001061454A1 (en) 2000-02-18 2001-08-23 Anoto Ab Controlling an electronic device
WO2001061449A2 (en) * 2000-02-16 2001-08-23 Telefonaktiebolaget Lm Ericsson (Publ) Specially formatted paper based applications of a mobile phone
JP2001265457A (en) * 2000-02-04 2001-09-28 Robert Bosch Gmbh Device for manually operating unit of automobile and method for using the same
WO2001061450A3 (en) * 2000-02-16 2002-01-10 Ericsson Telefon Ab L M A system and method for operating an electronic reading device user interface
WO2002007424A2 (en) * 2000-07-14 2002-01-24 Siemens Aktiengesellschaft Input device for carrying out handwriting recognition for an electronic information and communications system, and method for detecting input data
KR20030017924A (en) * 2001-08-25 2003-03-04 윤덕기 A pan type wireless beem mouse
US6593908B1 (en) 2000-02-16 2003-07-15 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for using an electronic reading device on non-paper devices
US6611259B1 (en) 2000-02-16 2003-08-26 Telefonaktiebolaget Lm Ericsson (Publ) System and method for operating an electronic reading device user interface
US6693623B1 (en) 2000-02-16 2004-02-17 Telefonaktiebolaget Lm Ericsson (Publ) Measuring applications for an electronic reading device
US6832116B1 (en) 2000-02-16 2004-12-14 Telefonaktiebolaget L M Ericsson (Publ) Method and system for controlling an electronic utility device using an electronic reading device
KR100461769B1 (en) * 2001-10-31 2004-12-14 삼성전자주식회사 Stylus with exterior-type camera lens module and portable telephone therewith
US6839453B1 (en) 2000-05-16 2005-01-04 The Upper Deck Company, Llc Method and apparatus for authenticating unique items such as sports memorabilia
US6839623B1 (en) 2000-02-16 2005-01-04 Telefonaktiebolaget Lm Ericsson (Publ) Positioning applications for an electronic reading device
US6885878B1 (en) 2000-02-16 2005-04-26 Telefonaktiebolaget L M Ericsson (Publ) Method and system for using an electronic reading device as a general application input and navigation interface
US6952497B1 (en) 2000-02-16 2005-10-04 Telefonaktiebolaget L M Ericsson (Publ) Method and system for electronically recording transactions and performing security function
US6992655B2 (en) 2000-02-18 2006-01-31 Anoto Ab Input unit arrangement
US7054487B2 (en) 2000-02-18 2006-05-30 Anoto Ip Lic Handelsbolag Controlling and electronic device
WO2006107245A1 (en) * 2005-04-05 2006-10-12 Lk Innovatronic Method to control a display
US7196825B2 (en) 2000-02-16 2007-03-27 Telefonaktiebolaget Lm Ericsson (Publ) Printer pen
US7996589B2 (en) 2005-04-22 2011-08-09 Microsoft Corporation Auto-suggest lists and handwritten input
US8054512B2 (en) 2007-07-30 2011-11-08 Palo Alto Research Center Incorporated System and method for maintaining paper and electronic calendars

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004086462A (en) * 2002-08-26 2004-03-18 Taizo Saito Pen type computer input device
US9024880B2 (en) 2004-08-11 2015-05-05 Pixart Imaging Inc. Interactive system capable of improving image processing
TWI236289B (en) 2004-08-11 2005-07-11 Pixart Imaging Inc Interactive device capable of improving image processing
CN2738325Y (en) * 2004-10-15 2005-11-02 段西京 Photoelectrical pen type mouse
KR100854650B1 (en) * 2007-02-08 2008-08-27 (주) 아이.에스.브이. Optical pen mouse capable of magnifying displayed object and method of magnifying displayed object using the same
CN107037891A (en) * 2015-08-19 2017-08-11 原建桥 A kind of combined type computer mouse

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797544A (en) * 1986-07-23 1989-01-10 Montgomery James R Optical scanner including position sensors
US4804949A (en) * 1987-03-20 1989-02-14 Everex Ti Corporation Hand-held optical scanner and computer mouse

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797544A (en) * 1986-07-23 1989-01-10 Montgomery James R Optical scanner including position sensors
US4804949A (en) * 1987-03-20 1989-02-14 Everex Ti Corporation Hand-held optical scanner and computer mouse

Cited By (29)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2001265457A (en) * 2000-02-04 2001-09-28 Robert Bosch Gmbh Device for manually operating unit of automobile and method for using the same
US6813396B1 (en) 2000-02-16 2004-11-02 Telefonatiebolaget L.M. Ericsson (Publ) Method for sharing information between electronic reading devices
US6885878B1 (en) 2000-02-16 2005-04-26 Telefonaktiebolaget L M Ericsson (Publ) Method and system for using an electronic reading device as a general application input and navigation interface
US6693623B1 (en) 2000-02-16 2004-02-17 Telefonaktiebolaget Lm Ericsson (Publ) Measuring applications for an electronic reading device
WO2001061452A3 (en) * 2000-02-16 2001-12-20 Ericsson Telefon Ab L M Method for sharing information between electronic reading devices
WO2001061450A3 (en) * 2000-02-16 2002-01-10 Ericsson Telefon Ab L M A system and method for operating an electronic reading device user interface
WO2001061452A2 (en) * 2000-02-16 2001-08-23 Telefonaktiebolaget Lm Ericsson (Publ) Method for sharing information between electronic reading devices
WO2001061449A3 (en) * 2000-02-16 2002-03-14 Ericsson Telefon Ab L M Specially formatted paper based applications of a mobile phone
US7196825B2 (en) 2000-02-16 2007-03-27 Telefonaktiebolaget Lm Ericsson (Publ) Printer pen
US6952497B1 (en) 2000-02-16 2005-10-04 Telefonaktiebolaget L M Ericsson (Publ) Method and system for electronically recording transactions and performing security function
US6593908B1 (en) 2000-02-16 2003-07-15 Telefonaktiebolaget Lm Ericsson (Publ) Method and system for using an electronic reading device on non-paper devices
US6832116B1 (en) 2000-02-16 2004-12-14 Telefonaktiebolaget L M Ericsson (Publ) Method and system for controlling an electronic utility device using an electronic reading device
US6611259B1 (en) 2000-02-16 2003-08-26 Telefonaktiebolaget Lm Ericsson (Publ) System and method for operating an electronic reading device user interface
US6839623B1 (en) 2000-02-16 2005-01-04 Telefonaktiebolaget Lm Ericsson (Publ) Positioning applications for an electronic reading device
WO2001061449A2 (en) * 2000-02-16 2001-08-23 Telefonaktiebolaget Lm Ericsson (Publ) Specially formatted paper based applications of a mobile phone
US7345673B2 (en) 2000-02-18 2008-03-18 Anoto Ab Input unit arrangement
US6992655B2 (en) 2000-02-18 2006-01-31 Anoto Ab Input unit arrangement
WO2001061454A1 (en) 2000-02-18 2001-08-23 Anoto Ab Controlling an electronic device
US7054487B2 (en) 2000-02-18 2006-05-30 Anoto Ip Lic Handelsbolag Controlling and electronic device
JP2003523572A (en) * 2000-02-18 2003-08-05 アノト・アクティエボラーク Configuration of input unit
US7027623B2 (en) 2000-05-16 2006-04-11 The Upper Deck Company, Llc Apparatus for capturing an image
US6839453B1 (en) 2000-05-16 2005-01-04 The Upper Deck Company, Llc Method and apparatus for authenticating unique items such as sports memorabilia
WO2002007424A3 (en) * 2000-07-14 2002-06-27 Siemens Ag Input device for carrying out handwriting recognition for an electronic information and communications system, and method for detecting input data
WO2002007424A2 (en) * 2000-07-14 2002-01-24 Siemens Aktiengesellschaft Input device for carrying out handwriting recognition for an electronic information and communications system, and method for detecting input data
KR20030017924A (en) * 2001-08-25 2003-03-04 윤덕기 A pan type wireless beem mouse
KR100461769B1 (en) * 2001-10-31 2004-12-14 삼성전자주식회사 Stylus with exterior-type camera lens module and portable telephone therewith
WO2006107245A1 (en) * 2005-04-05 2006-10-12 Lk Innovatronic Method to control a display
US7996589B2 (en) 2005-04-22 2011-08-09 Microsoft Corporation Auto-suggest lists and handwritten input
US8054512B2 (en) 2007-07-30 2011-11-08 Palo Alto Research Center Incorporated System and method for maintaining paper and electronic calendars

Also Published As

Publication number Publication date
CA2331073A1 (en) 1999-11-25
BR9910083A (en) 2000-12-26
KR20010052282A (en) 2001-06-25
EP1073944A1 (en) 2001-02-07
CN1303495A (en) 2001-07-11
AU4179499A (en) 1999-12-06
JP2002516428A (en) 2002-06-04
AU758036B2 (en) 2003-03-13
IL139104A0 (en) 2001-11-25
CN1152296C (en) 2004-06-02

Similar Documents

Publication Publication Date Title
US6906699B1 (en) Input unit, method for using the same and input system
AU758036B2 (en) Input unit, method for using the same and input system
US6992655B2 (en) Input unit arrangement
US6985643B1 (en) Device and method for recording hand-written information
US6151015A (en) Pen like computer pointing device
US6618038B1 (en) Pointing device having rotational sensing mechanisms
JP3641485B2 (en) Recording method and apparatus
US6243503B1 (en) Data acquisition device for optical detection and storage of visually marked and projected alphanumerical characters, graphics and photographic picture and/or three dimensional topographies
CN101751570B (en) Image reading apparatus, and reading method
JP2004164609A (en) Universal input device
US20050024690A1 (en) Pen with tag reader and navigation system
AU758514B2 (en) Control device and method of controlling an object
EP1073945B1 (en) Device and method for recording hand-written information
AU758236B2 (en) Device for recording information in different modes
US6715686B1 (en) Device for recording information in different modes
MXPA00010541A (en) Input unit, method for using the same and input system
SE513940C2 (en) Unit and input system with mouse function and input function and ways to use the unit
JP2004272310A (en) Ultrasonic light coordinate input device
WO1999060515A1 (en) Device for recording information in different modes
MXPA00010548A (en) Device and method for recording hand-written information
SE511855C2 (en) Handwritten character recording device for characters, symbols, graphs, calligraphy
MXPA00010533A (en) Control device and method of controlling an object

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 99806674.5

Country of ref document: CN

AK Designated states

Kind code of ref document: A1

Designated state(s): AE AL AM AT AT AU AZ BA BB BG BR BY CA CH CN CU CZ CZ DE DE DK DK EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SL SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
WWE Wipo information: entry into national phase

Ref document number: 139104

Country of ref document: IL

Ref document number: 41794/99

Country of ref document: AU

WWE Wipo information: entry into national phase

Ref document number: 09673704

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: PA/a/2000/010541

Country of ref document: MX

ENP Entry into the national phase

Ref document number: 2331073

Country of ref document: CA

WWE Wipo information: entry into national phase

Ref document number: 1020007012070

Country of ref document: KR

WWE Wipo information: entry into national phase

Ref document number: 1999925538

Country of ref document: EP

WWP Wipo information: published in national office

Ref document number: 1999925538

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 1020007012070

Country of ref document: KR

WWG Wipo information: grant in national office

Ref document number: 41794/99

Country of ref document: AU

WWW Wipo information: withdrawn in national office

Ref document number: 1020007012070

Country of ref document: KR

WWW Wipo information: withdrawn in national office

Ref document number: 1999925538

Country of ref document: EP