WO2001061454A1 - Controlling an electronic device - Google Patents

Controlling an electronic device

Info

Publication number
WO2001061454A1
WO2001061454A1 (PCT/SE2001/000360)
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
input unit
function mode
text string
arrangement according
Prior art date
Application number
PCT/SE2001/000360
Other languages
French (fr)
Other versions
WO2001061454A8 (en)
Inventor
Petter Ericson
Henrik HÖGLIND
Original Assignee
Anoto Ab
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from SE0000541A external-priority patent/SE0000541L/en
Application filed by Anoto Ab filed Critical Anoto Ab
Priority to EP01906485A priority Critical patent/EP1285329A1/en
Priority to AU2001234308A priority patent/AU2001234308A1/en
Publication of WO2001061454A1 publication Critical patent/WO2001061454A1/en
Publication of WO2001061454A8 publication Critical patent/WO2001061454A8/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/0304Detection arrangements using opto-electronic means
    • G06F3/0317Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F3/0321Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03542Light pens for emitting or receiving light
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03545Pens or stylus
    • GPHYSICS
    • G09EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09FDISPLAYING; ADVERTISING; SIGNS; LABELS OR NAME-PLATES; SEALS
    • G09F23/00Advertising on or in specific articles, e.g. ashtrays, letter-boxes
    • G09F23/06Advertising on or in specific articles, e.g. ashtrays, letter-boxes the advertising matter being combined with articles for restaurants, shops or offices

Abstract

An electronic device (200), such as a computer, a mobile phone or a PDA, is controlled by an arrangement comprising a handheld input unit (300), which records images and converts these to text strings comprising character sequences. The text strings recorded by the input unit (300) are used when controlling the electronic device (200). A signal-processing unit (210) receives the text strings, matches the format of a current text string to a format database comprising predetermined formats, each of which is associated with a command, and generates the command associated with the current text string, so as to control the electronic device (200). The signal-processing unit (210) can search for an address in the current text string. If an address is found, the signal-processing unit (210) causes the electronic device (200) to connect to the address. The input unit, which is capable of operating in different function modes, such as a mouse function mode and a text inputting mode, is arranged to automatically change between the different function modes on the basis of the contents of the recorded images, typically upon identification of a predetermined pattern.

Description

CONTROLLING AN ELECTRONIC DEVICE
Field of the Invention
This invention concerns an arrangement and a method for controlling an electronic device starting from images of a base. The invention also concerns an input unit for inputting images from a base.
Background of the Invention
The Applicant's Patent Publication No. WO98/20446 discloses a reading pen which can be used for recording text by imaging this from a base. The reading pen has an optical sensor which records a number of images, with partially overlapping contents, of the text which is to be recorded. In addition, the reading pen has a processor which puts together the images, identifies the text in the images and stores it in character-coded format. The text can then be transferred to a computer with which the reading pen communicates. In addition, the Applicant's Patent Publication No. WO99/60469 discloses an optical mouse which is arranged to position a cursor on a display of a computer.
The reading pen and the optical mouse are two examples of input units which can be used to input information into a computer and to control the function of a computer. Another example of an input unit is a keyboard, which can be used both to enter information and to control the computer by means of various keyboard commands. A user does not want to have to change between different input units in order to be able to carry out different functions. Therefore it is a general requirement that each input unit is to be able to be used for as many different functions as possible and in as flexible and simple a way as possible for the user. US Patent No. 5,932,864 discloses an arrangement for accessing electronic information via printed matter. More specifically, the arrangement comprises an input unit which via a keypad is switchable between a scanner mode and a mouse mode. In the scanner mode, a user can input an image of a dedicated object on the base, more specifically a bar code, a character or a symbol. The image is transferred from the input unit to a computer, which decodes the image and which, as a result, executes a preprogrammed command, for instance to obtain interactive software from an external database and execute this software on the computer. During execution, the user can, by operating the keypad, change the input unit to the mouse mode in order to interact with the software on a display.
A problem with inputting this type of dedicated object is that the object must in many cases be accompanied by some sort of explanatory text, so that the user can understand which command is initiated by the object. In spacious order catalogues this can usually be provided, but in many other cases the space is more restricted. Therefore the user may have to learn to what each object relates, which, however, reduces the usability of the arrangement if it is to be generalised beyond a certain application, such as programming a television set.
Summary of the Invention
An object of this invention is to additionally improve prior-art input units. More specifically, it is an object to provide a general and flexible technique for controlling an electronic device. The provided technique should be intuitive to the user.
This object is achieved by an arrangement according to claim 1 and a method according to claim 22. The dependent claims define preferred embodiments.
According to a first aspect, this invention concerns an arrangement for controlling an electronic device, said arrangement comprising an input unit with an optical sensor for recording images, and a signal-processing unit for identifying predetermined information in at least one of said images and for controlling the electronic device dependent upon the predetermined information. The input unit is arranged, in a first function mode, to convert at least one image to a current text string containing a sequence of characters, and the signal-processing unit is arranged to control the electronic device on the basis of the current text string.
According to the invention, the arrangement for controlling an electronic device is based on an input unit function known per se, that is recording of text, which, however, is modified by making the signal-processing unit control the electronic device on the basis of the current text string that is recorded. A user can thus control the electronic device on the basis of text which is read using the input unit. This is a great advantage since the command indication can be printed or written in more or less clear text on a base, whereby the control of the electronic device takes place in a way that is transparent and intuitive to the user. Moreover, such text-based command indications can be incorporated into a running text, without affecting the layout of the text. In order to facilitate the user's identification of commands, these may, however, be marked somehow, for instance by using a divergent font, a divergent colour, underlining, etc. The inputting of text into the inventive arrangement can advantageously be accomplished in the same way as in the Applicant's above-mentioned patent publication, although other accomplishments within the scope of prior art are also feasible. The optical sensor can be a line sensor, but is advantageously an area sensor.
The signal-processing unit is preferably accomplished as software. However, it may consist of a specially adapted hardware circuit, for instance an ASIC, an FPGA, etc., or a suitable combination of software and hardware. The electronic device controlled by means of the arrangement can be a computer, a mobile phone, a PDA, or a similar electronic device.
The command indication can correspond to a command to the electronic device at system or application level. The command can be, for instance, to emulate a keyboard command. Alternatively, it can also be a special command, for example a user-defined command.
Besides command indications in clear text, the electronic device can also be controlled by text which is used in specific applications in the electronic device, such as addresses. The content of these addresses varies, but the arrangement is suitably arranged to identify the format of the current text string and, starting from this, interpret to what the text string relates. By "format" is meant the overall layout of the text string, such as presence and location of one or more given characters or character combinations in a text string.
The arrangement conveniently comprises a format database containing predetermined formats, to which the inputted text string is matched to determine whether a command is to be executed, and in that case which one. The arrangement advantageously also comprises a database editor which allows the user to introduce formats and associated commands into the format database. It is particularly preferred for these commands to initiate execution of software on the electronic device. Of course, such an editable database may also comprise a set of regular text strings (command indications) which are matched in their entirety to the recorded text and which are each associated with a command.
The use of a format database means that the arrangement, which with the input unit in its first function mode is arranged to input text, comprises an intelligent filter which interprets the inputted text in order to identify commands therein. The filter is intelligent to such an extent that it does not necessarily require identical correspondence between the inputted text string and a text string in said database in order to identify a command. The use of such an intelligent filter gives several advantages. When necessary, the arrangement, with the input unit in one and the same function mode, can be used both to input text into the electronic device and to control the same. Moreover, the electronic device can be controlled, using at least part of the current text string, as will be described in more detail below. In a particularly preferred embodiment, the text string comprises an address. The format database thus comprises predetermined formats for different types of addresses, and the signal-processing unit is arranged to identify an address in the current text string and cause the electronic device to directly or indirectly connect to the address.
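By way of illustration only, the format database described above can be thought of as a mapping from predetermined formats, here expressed as regular expressions, to commands. The following minimal Python sketch is not part of the disclosed embodiment; the format patterns and command names are hypothetical, and a database editor would simply add or remove entries from the mapping.

```python
import re

# Hypothetical format database: each predetermined format (a regular
# expression) is associated with a command name.
FORMAT_DATABASE = {
    r"[\w.+-]+@[\w.-]+\.\w+": "open_email_template",   # e-mail address format
    r"(?:https?://|www\.)\S+": "open_web_browser",     # web address format
    r"\+?\d[\d\s-]{5,}\d": "dial_phone_number",        # phone number format
}

def match_format(text_string: str):
    """Match a recorded text string against the format database.

    Returns (command, matched_part) for the first format found, or None if
    the text contains no recognised format and is to be treated as plain input.
    """
    for pattern, command in FORMAT_DATABASE.items():
        match = re.search(pattern, text_string)
        if match:
            return command, match.group(0)
    return None

# Example: a text string recorded by the input unit in its first function mode
print(match_format("Questions? Write to support@example.com"))
# -> ('open_email_template', 'support@example.com')
```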
In a particularly preferred embodiment, the signal-processing unit is arranged to cause, when identifying an address for electronic mail in the recorded text, the electronic device to open a program for electronic mail. Preferably, the arrangement causes the device not only to open the program but also to open a template for electronic mail. It is still more preferred that the template is opened with the recorded e-mail address entered in the address field. It is also conceivable that the input unit is connected to the message field of the template, so that the input unit can be used to input message text, for instance by scanning of text or inputting of text, if the input unit has such a function mode. The identification of an e-mail address format can take place, for example, by recognition of the at sign (@), the signal-processing unit interpreting all characters which are associated with the at sign as part of the address. Furthermore, the signal-processing unit can advantageously be arranged to cause, when identifying a web address in the recorded text, the electronic device to open a web browser. Preferably, the device is caused not only to open the browser, but also the web page corresponding to the web address. The signal-processing unit's identification of a web address format can, for example, be based on recognition of the character combination "http://" or "www", the signal-processing unit interpreting all characters which are associated with said character combination as part of the web address. In this way, the user can easily and quickly open a web page by using the same input unit as he or she uses for other input functions.
The signal-processing unit can also advantageously be arranged to cause, when identifying a phone number in the recorded text, the electronic device to call the phone number.
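On a personal computer, causing the electronic device to connect to a recognised address could, purely as a sketch, be delegated to the operating system's default handlers; the patent does not prescribe any particular mechanism, and Python's standard webbrowser module is used here only for illustration.

```python
import webbrowser

def connect_to_address(kind: str, address: str) -> None:
    """Hand a recognised address over to the appropriate application."""
    if kind == "email":
        # Opens the default e-mail client with the address entered in the "To" field.
        webbrowser.open(f"mailto:{address}")
    elif kind == "web":
        url = address if address.startswith("http") else f"http://{address}"
        webbrowser.open(url)               # open the web page in the default browser
    elif kind == "phone":
        webbrowser.open(f"tel:{address}")  # left to a softphone application, if present

connect_to_address("web", "www.anoto.com")
```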
The arrangement according to the invention is easy to use for the user, who need only move the input unit over text or place it on text in order to control the electronic device. The text can be self-explanatory, which means that the arrangement will be intuitive to the user. The arrangement is also general and flexible, in that the electronic device can be controlled on the basis of the format of an inputted text string, and by using at least part of the text string. The arrangement can also easily be combined with other input unit functions which are based on recording of images by means of an optical sensor.
In an advantageous embodiment, the arrangement also comprises an optical mouse function for controlling a cursor on a display of the electronic device. The user can then carry out mouse functions using the arrangement and also control the electronic device by recording text. The mouse function is advantageously integrated into the input unit and realised in the same way as in the Applicant's above-mentioned patent publication. Other realisations within the scope of prior-art technique are, however, also possible. As mentioned above, the signal-processing unit can generate a predetermined command when recognising one or more predetermined characters or words in the recorded text. The predetermined words can simply be designations of the commands that are generated. To this end, the arrangement may advantageously comprise a product on which a plurality of command indications are stated. The command indications can advantageously be indicated by character combinations which are easy for the user to understand. The product can be, for example, a mouse pad. In another advantageous embodiment, the arrangement comprises, alternatively or additionally, a handwriting/hand-drawing function for inputting of handwritten information into the electronic device. The handwriting/hand-drawing function is advantageously integrated into the input unit and realised in the same way as in the Applicant's Patent Publication WO99/60467. Other realisations within the scope of prior-art technique are, however, also possible. In one embodiment, the signal-processing unit is at least partially located in the same casing as the electronic device. As a result, the input unit can be made simpler and cheaper. In addition, the processor capacity which is already available in the electronic device can be used to carry out the functions of the signal-processing unit. Certain processing of the recorded text can, however, advantageously be carried out in the input unit, for example localisation of the text in the image or images and conversion of the text to character-coded format, for example ASCII code, so that a smaller amount of information needs to be transmitted from the input unit to the electronic device.
In another embodiment, the signal-processing unit can be completely integrated with the input unit so that the electronic device receives one or more commands directly from the input unit. In another embodiment, the input unit can just record images and transfer these to the signal-processing unit which carries out all the processing of the images.
The input unit can be arranged to communicate with the electronic device by wireless means, so that the use of the input unit is as flexible as possible and so that certain functions can be used stand-alone. Alternatively, communication could, for example, take place via a cable.
According to one more embodiment, the input unit functions in the first function mode as a handheld text scanner and the signal-processing unit is arranged to continually interpret the text content of the images recorded by the optical sensor. To prevent words which are only intended to be inputted and stored in the electronic device, but which consist of the same character combination as a command indication, from being interpreted as a command, it is possible to place additional demands on what the signal-processing unit is to interpret as commands. For instance, it may be required that command indications consist of characters in a special font or special size, or are printed in capital letters, underlined, printed in extra bold type or in italics, etc.
Alternatively, the arrangement is designed in such a manner that it can selectively be caused to operate in a control function mode, in which the signal-processing unit is arranged to control the electronic device on the basis of the current text string. This embodiment has the advantage that it simplifies the arrangement since it just has to interpret text in the control function mode. The risk that text which is only intended to be inputted and stored in the electronic device is erroneously interpreted as a command is also eliminated.
An input unit with a plurality of function modes is in one embodiment arranged to automatically select the function mode on the basis of the content of one or more images, i.e. by identifying some kind of predetermined information. The predetermined information can in principle be any information which makes it possible for the input unit to interpret that it is to change from a current function mode to a new function mode. The information can, for example, consist of one or more predetermined characters, symbols, words, text in a special font or line thickness or the like. When the input unit identifies the predetermined information, it automatically changes to the required function mode. In this way the user does not need to press any buttons. In a preferred embodiment, the predetermined information is a predetermined pattern. If the input unit, for example, has a mouse function mode and a control function mode, it can be programmed to be able to identify the pattern on a mouse pad. When the user places the input unit on the mouse pad, it records an image of the pattern on the mouse pad. The input unit identifies the pattern as predetermined information which indicates mouse function and it changes automatically to the mouse function mode and processes the images to achieve the mouse function. It is, of course, convenient if the input unit is also arranged to change from the mouse function mode to the control function mode when it detects other predetermined information. If the input unit has only a mouse function mode and a control function mode, it can, for example, change back from the mouse function mode to the control function mode when it detects that the predetermined pattern for the mouse function mode is no longer present in the captured images. Alternatively, the change can be carried out on the basis of positive identification of a certain piece of predetermined information. This automatic change between different function modes can, of course, be used independently of how many and which function modes the input unit comprises. As mentioned above, the input unit may, for example, comprise a control function mode, a text inputting mode (scanner function mode), a mouse function mode and a handwriting recording mode. The input unit may also comprise a photographing mode, in which it can be caused, like a camera, to record and store single images.
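A minimal sketch of this automatic mode selection is given below. It assumes a hypothetical pattern detector (contains_mouse_pad_pattern) standing in for the image analysis performed by the input unit's processor; with only two modes, absence of the mouse-pad pattern is taken to mean the text-recording mode.

```python
from enum import Enum, auto

class Mode(Enum):
    MOUSE = auto()      # cursor control based on the mouse-pad pattern
    SCANNER = auto()    # text recording (scanner/control function)

def contains_mouse_pad_pattern(image) -> bool:
    """Hypothetical placeholder for the real pattern-recognition step."""
    return getattr(image, "has_pattern", False)

def select_mode(image) -> Mode:
    """Choose the function mode from the content of a captured image."""
    return Mode.MOUSE if contains_mouse_pad_pattern(image) else Mode.SCANNER

def process_image(image, current_mode: Mode) -> Mode:
    """Switch modes automatically, without the user pressing any buttons."""
    new_mode = select_mode(image)
    if new_mode is not current_mode:
        print(f"changing from {current_mode.name} to {new_mode.name}")
    return new_mode

class _Image:            # stand-in for a captured image
    has_pattern = True

mode = process_image(_Image(), Mode.SCANNER)   # -> changing from SCANNER to MOUSE
```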
In an advantageous embodiment, the predetermined information consists of a position-coding pattern, preferably an absolute position-coding pattern.
The advantage of a position-coding pattern is that the predetermined information can consist of one or more specific positions. This makes it easier for the device to identify when it is to change, as it does not need to carry out any character recognition (OCR) .
Absolute position-coding patterns are known, for example, from US 5,852,434 and the Applicant's Patent Publication WO00/73983, which was not publicly available at the time of filing of the Swedish patent application from which the present application claims priority.
According to a second aspect, the present invention relates to a method for controlling an electronic device, comprising the steps of operating a handheld input unit to record at least one image, identifying predetermined information in said at least one image, and controlling the electronic device dependent upon said predetermined information. In this method, said at least one image is converted to a current text string which comprises a sequence of characters, and the electronic device is controlled on the basis of the current text string.
The advantages of this method are evident from the above discussion of the arrangement. The features of the arrangement are, where appropriate, applicable also to the method.
The Applicant's Patent Publication WO99/60468 discloses an input unit which has an image-based mouse function mode and an image-based input function mode. The input unit is changed by the user pressing buttons. A further object is to generally simplify the use of an input unit which has two function modes, so that the change between different function modes can be carried out in a way that is smooth for the user. This object is achieved by an input unit according to claim 28. The dependent claims define preferred embodiments.
More specifically, the invention comprises according to a third aspect an input unit which has at least a first and a second function mode. The input unit comprises a detector for capturing of images, for instance an optical sensor, and an image processing device, for instance a processor, for processing the images to achieve said two function modes. The input unit is arranged to change from the first to the second function mode when the image processor detects a first piece of predetermined information in one of said images.
As in the arrangement for controlling an electronic device, the predetermined information can in principle be any information which makes it possible for the input unit to interpret that it is to change from a current function mode to a new function mode. The information can, for example, consist of one or more predetermined characters, symbols, words, text in a special font or line thickness or the like. When the input unit identifies the predetermined information, it automatically changes to the required function mode. In this way, the user does not need to press any buttons. In a preferred embodiment, the predetermined information is a predetermined pattern. If the input unit, for example, has a mouse function mode and another function mode, it can be programmed to be able to identify the pattern on a mouse pad. When the user places the input unit on the mouse pad, the detector records an image of the pattern on the mouse pad. The image processor identifies the pattern as predetermined information which indicates mouse function and it then changes automatically to the mouse function mode and processes the images to achieve the mouse function.
Of course, it is convenient for the input unit also to be arranged to change from the second function mode to the first function mode when it detects a second piece of predetermined information. If the input unit just has a mouse function mode and an input function mode, it can, for example, change back from the mouse function mode to the input function mode when the image processor detects that the predetermined pattern for the mouse function mode is no longer present in the recorded images. Alternatively, the change can take place on the basis of positive identification of another predetermined pattern. In an advantageous embodiment, the predetermined information consists of a position-coding pattern, preferably an absolute position-coding pattern. The advantage of this is evident from the above discussion of the arrangement for controlling an electronic device. The functions between which the change takes place can, for example, be a mouse function mode, a scanner function mode, a handwriting recording mode, a photographing mode or some similar function mode which can be achieved on the basis of captured images.
Brief Description of the Drawings
This invention will now be described in greater detail by means of an embodiment with reference to the accompanying drawings, in which
Fig. 1 schematically illustrates the composition and use of an arrangement according to the invention, and Fig. 2 is a simplified flow chart showing operations which are carried out in an arrangement similar to the one in Fig. 1.
Description of a Preferred Embodiment
An embodiment of an arrangement is described below which comprises a mouse function mode, a scanner or reading pen function mode and a control function mode which is based on text recording.
Fig. 1 shows a mouse pad 100, an electronic device 200 in the form of a computer and an input unit 300 for the computer. The mouse pad 100 has a working field 110 with an irregular pattern (not shown) which makes it possible to determine the relative positions of two images which have partially overlapping contents by means of the contents of the images, and a command field 120, in which a number of predetermined command indications are stated.
The input unit 300 has a casing 1 in the shape of a pen. One short side of the casing 1 has a window 2 through which images are captured for the different image-based function modes of the input unit 300.
The casing 1 contains principally an optics part, an electronics part and a power supply.
The optics part comprises a number of light emitting diodes 6, a lens system 7 and an optical sensor 8 which constitutes the interface with the electronics part. The light emitting diodes 6 are intended to illuminate a surface of the base which is at the moment below the window. The lens system 7 is intended to project an image of the surface which is below the window 2 onto the light-sensitive sensor 8 in as correct a way as possible. The optical sensor 8 can consist of an area sensor, such as a CMOS sensor or a CCD sensor with a built-in A/D transducer. Such sensors are commercially available.
In this example, the power supply for the input unit is obtained from a battery 12 but can alternatively be obtained from a mains connection (not shown).
The electronics part comprises a processor 20 with conventional associated circuits, such as various types of memory, and associated programs for carrying out the functions described here. The electronics part also comprises a transceiver 26 for transmitting information to/from the computer 200. The transceiver 26 can be based on infrared technology, ultrasonics or radio technology for transmission over short distances, for example in accordance with the Bluetooth standard. The electronics part further comprises buttons 27, by means of which the user can control the input unit 300 and in particular change the input unit between the mouse function mode, the scanner function mode and the control function mode. When the input unit 300 operates in the mouse function mode, the buttons 27 can also have functions which correspond to the click buttons on a traditional mouse.
The computer 200 is an ordinary personal computer with circuits and programs which make possible communication with the input unit 300. However, in this embodiment it also contains a signal-processing unit which constitutes part of the arrangement for controlling its function. The signal-processing unit consists in this example of a program which is installed in the computer 200. This is shown symbolically by broken lines and reference numeral 210. As mentioned, the input unit 300 has a scanner function mode, a mouse function mode and a control function mode.
The scanner function mode is used to record text. The user passes the input unit 300 across the text which he wants to record. The optical sensor 8 records images with partially overlapping contents. The images are put together by the processor 20. Each character in the put-together image is localised and, using for example neural network software in the processor 20, its corresponding ASCII characters are determined. The text converted in this way to character-coded format can be stored, in the form of a text string, in the input unit 300 or transferred to the computer 200. The scanner function is described in greater detail in the Applicant's Patent Publication No. WO98/20446, which is incorporated herein by reference.
The mouse function mode is used to control a cursor on the display 201 of the computer 200. The mouse function mode is also image-based in this embodiment. When the input unit 300 is moved across the working field 110, the optical sensor 8 records a plurality of images with partially overlapping contents. The processor 20 determines positioning signals for the cursor of the computer 200 on the basis of the relative positions of the recorded images, which are determined by means of the contents of the images. The mouse function is described in greater detail in the Applicant's Patent Publication No. WO99/60469, which is incorporated herein by reference.
The control function mode is based on the scanner function. The user records text in the same way as in the scanner function mode. The text is sent in character-coded format from the transceiver 26 of the input unit 300 to the signal-processing unit 210 in the computer 200, together with an indication that this is control information which is to be interpreted. The signal-processing unit 210 examines the received text and searches for predetermined information in this in the form of predetermined characters and character combinations. When such predetermined information is found, the signal-processing unit 210 creates predetermined commands to the computer 200 as a function of the predetermined information. The arrangement described above is used in the following way. First assume that the user wants to use the input unit 300 as a mouse. He selects the mouse function mode by means of the buttons 27. By moving the input unit 300 on the working field 110 he controls the cursor on the display 201 of the computer 200. Assume next that the user edits a document in the computer 200. He can then mark text by "clicking" with the buttons 27 and positioning the cursor. Assume that the user first wants to replace a first piece of text with a second piece of text which is situated elsewhere in the text. The user presses one of the buttons 27 and passes the input unit 300 across the second piece of text to mark the same. Then he changes the input unit 300 to the control function mode and records the command indication "cut" by passing the input unit 300 across this command indication on the command field 120 of the mouse pad 100. The input unit 300 then sends the character-coded text "cut" to the signal-processing unit 210 in the computer 200, which identifies the text as a command indication and creates a corresponding command for the word processing application concerned, which cuts out the marked piece of text. The user next changes the input unit 300 to the mouse function mode and marks the first piece of text using the input unit 300 and then causes the computer 200 to paste the cut-out piece of text in place of the marked text by changing the input unit 300 to the control function mode and recording the command indication "paste" using the input unit 300.
Now assume that the user wants to enter text from a newspaper into his document. He first positions the cursor in the required place using the input unit 300 changed to the mouse function mode. Then he changes the input unit 300 to the scanner function mode and scans the text from the newspaper. The text is converted into character-coded format and transmitted to the signal-processing unit 210 which causes the computer 200 to insert the text in the position marked by the cursor.
Now assume that the user sees an interesting web address in the newspaper he is reading and wants to look at this web page. He then changes the input unit 300 to the control function mode and reads off the web address from the newspaper. The recorded text is transferred to the signal-processing unit 210 which identifies the character combination "http://" and causes the computer 200 to open the web page with the recorded address.
Finally, assume that the user wants to send an e-mail to a friend. He uses the input unit 300 changed to the control function mode to record the command indication "e-mail" on the mouse pad 100. The recognition of this command indication by the signal-processing unit 210 results in the unit generating a command to the computer 200 which causes it to open the e-mail program. The user can then record the required e-mail address and even the content of the message using the scanner function. As shown above, the user can conveniently carry out a number of different functions which comprise inputting information and controlling the computer 200 by means of just one input unit 300. Of course, other functions can be integrated into the input unit 300 in order to further increase its usability. An example is a function to record handwritten text, which is described in the Applicant's Patent Publication No. WO99/60467, which is incorporated herein by reference.
Another example is a photographing mode, in which the input unit 300 via the buttons 27 can be caused to record single images and store these and/or transfer these to the computer 200. The lens system must be changed so that a sharp image on the sensor 8 is obtained at an infinite distance, or normally at a distance of about two meters. In the scanner function mode and the mouse function mode, however, the lens system 7 is adjusted so that a sharp image is obtained of an object which is positioned at the window 2, i.e. normally about two centimetres from the sensor 8.
Moreover, other commands than those stated above can be generated. A user may also himself define how recorded text is to be interpreted by the signal-processing unit 210 and which control of the computer 200 a certain recorded text is to result in.
It has been described above that the change between the different function modes takes place by the user pressing the buttons 27 on the input unit 300. As an alternative, the input unit 300 can itself detect that it is to change between different function modes.
Fig. 2 illustrates schematically a flow chart of the operations in such an alternative arrangement. The input unit 300 is arranged to search for predetermined information in each image recorded with the optical sensor 8 (step 401). The predetermined information can be, for example, the pattern on the working field 110 of the mouse pad 100. If the processor 20 detects this pattern, it changes to the mouse function mode (step 402) and processes the images in the manner described above to generate positioning signals for the cursor on the display 201 of the computer 200. If the user then places the input unit 300 on a newspaper in order to scan text, the processor 20 no longer detects the mouse pad pattern and then knows that it is to change to the scanner function mode (step 403) and process the images in the manner described above for identification of text and conversion thereof to a character-coded text string. The text string is then transmitted to the computer 200 (step 404), in which the signal-processing unit 210 in this embodiment continually matches the received text strings to the contents of a database (step 405). In case of correspondence, the signal-processing unit 210 generates a command, corresponding to the text string, to the computer 200 (step 406), and otherwise the text is inputted, if possible, to an application in the computer 200 (step 407). In this case, an automatic change between the scanner function mode and the control function mode thus takes place (step 405) by means of an intelligent filter. The intelligent filter in the signal-processing unit 210 analyses the format of the received text string for identification of control information. An e-mail address can, for example, be identified by the presence of an at sign, optionally in combination with its relationship with other given characters, and is interpreted as a command to open an e-mail program with the e-mail address entered in the address field. A web address can be identified by the presence of the character combination "http://" or "www", optionally in combination with its relationship with other given characters, and is interpreted as a command to open a web browser with the indicated address. A file address can, for example, be identified on the basis of the character combination ":\" in combination with a given finalising file type indicator (".doc", ".exe", ".mp3", etc.) and interpreted as a command to open a file by means of the program indicated by the file type indicator.
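The intelligent filter of steps 405-407 can be summarised in the following sketch. It is only an illustration of the control logic; the ComputerStub class and its method names are invented stand-ins for the signal-processing unit's interface to the computer 200.

```python
import re

class ComputerStub:
    """Hypothetical stand-in for the computer controlled by the signal-processing unit."""
    def open_email_template(self, to):  print("open e-mail template, To:", to)
    def open_browser(self, url):        print("open browser at", url)
    def open_file(self, path):          print("open file", path)
    def insert_text(self, text):        print("insert into application:", text)

def handle_text_string(text: str, computer: ComputerStub) -> None:
    """Steps 405-407: match the received text string and either command or input it."""
    email = re.search(r"[\w.+-]+@[\w.-]+\.\w+", text)
    web = re.search(r"(?:https?://|www\.)\S+", text)
    file_ref = re.search(r"\S:\\\S+\.(?:doc|exe|mp3)\b", text)

    if email:                                        # step 406: generate a command
        computer.open_email_template(to=email.group(0))
    elif web:
        computer.open_browser(web.group(0))
    elif file_ref:
        computer.open_file(file_ref.group(0))
    else:                                            # step 407: plain text input
        computer.insert_text(text)

handle_text_string("Read more at www.anoto.com", ComputerStub())
# -> open browser at www.anoto.com
```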
According to an alternative, the input unit 300 is actively controlled to the control function mode by the user, however without using buttons. The command indications in the command field 120 of the mouse pad 100 in Fig. 1 can be written in a given manner, so that the processor 20 can detect that the inputted characters are not characters that are to be inputted into the computer according to the scanner function mode, but characters that represent a command and are to be sent to the signal-processing unit 210 to be processed as such. The command indications can, for example, be written in a special size, a special font or a special line thickness. As another example, the change to the scanner function mode can be carried out on the basis of change commands which are written in the command field 120 of the mouse pad 100, as is evident from Fig. 1. When the user, for instance, wants to change to the scanner function mode, he records by means of the input unit 300 the word "scanner" from the command field 120. The processor 20 identifies this as predetermined information which indicates that it is now to carry out a scanner function. Correspondingly, recording of the word "camera" from the command field 120 results in a change to the photographing mode.
The above description has been given with reference to a mouse pad and a computer. However, it will be appreciated that the arrangement is also usable in other cases, such as for controlling a mobile phone or a PDA. For instance, the input unit can be used to read, from a business card, a fax number, a phone number, an e-mail address or a web address and, on the basis thereof, cause the mobile phone or the PDA to connect to one of these addresses. In this context, it will be appreciated that the input unit and the signal-processing unit can be an integrated part of the mobile phone or the PDA.
Reverting to the mouse pad in Fig. 1, it was stated above that the pattern on the same is an irregular pattern and that the mouse function is achieved by determination of the relative positions of the recorded images. In another embodiment, the pattern on the mouse pad 100 can be a position-coding pattern, which systematically codes positions over the whole of the mouse pad 100. In this case the mouse function can be based on reading off positions using the position-coding pattern. In addition, the change to the mouse function mode can be based on recognition of the position-coding pattern. In addition, particular positions or position areas (also called domains or regions), for example those corresponding to the different command indications in the command field 120, can be dedicated to particular function modes. When the processor detects a particular position, it determines which function corresponds to this position. In this way the input unit 300 can be caused to change from one function mode to another function mode by placing it in a particular position on the mouse pad. Different regions of the position-coding pattern can also be dedicated to commands for controlling the computer 200. Instead of the signal-processing unit in the computer 200 detecting predetermined information in the text which is entered, it can thus detect positions in the form of coordinates and identify which command is to be created to control the computer 200. For example, if the user wants to open the e-mail program in the computer 200, he can place the input unit 300 on the mouse pad 100 in a position where it says "e-mail". The optical sensor 8 records an image of the position-coding pattern in this position. The processor 20 identifies which position, that is which coordinates, corresponds to the position-coding pattern in the image. It sends the coordinates to the signal-processing unit 210 in the computer 200. The signal-processing unit 210 identifies that these coordinates mean that it is to create a command to the computer 200 which causes it to open the e-mail program. In this case, the message "e-mail" is only an indication to the user of which command the field corresponds to, while the actual operation triggered by the input unit 300 is determined by the position-coding pattern. The advantage of this arrangement is that the input unit need not subject a text to character recognition (OCR), with the ensuing risk of misinterpretation.
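The coordinate-based variant lends itself to a simple lookup, sketched below with invented coordinates and command names: regions of the absolute position-coding pattern are mapped directly to commands, so no character recognition is needed.

```python
# Hypothetical command regions on the mouse pad, given as rectangles
# (x_min, y_min, x_max, y_max) in the coordinate system encoded by the
# position-coding pattern.
COMMAND_REGIONS = [
    ((0, 0, 50, 20), "open_email_program"),
    ((0, 20, 50, 40), "cut"),
    ((0, 40, 50, 60), "paste"),
]

def command_for_position(x: float, y: float):
    """Return the command dedicated to the region containing (x, y), if any."""
    for (x_min, y_min, x_max, y_max), command in COMMAND_REGIONS:
        if x_min <= x < x_max and y_min <= y < y_max:
            return command
    return None   # the position lies in the ordinary working area (mouse function)

print(command_for_position(10, 5))   # -> 'open_email_program'
```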
The mouse pad can thus be divided into position areas or regions which are associated with different functions or commands. An additional example of this is that one region can be dedicated to a relative mouse function mode (the cursor is moved in the same way as the input unit) and another region to an absolute mouse function mode (the cursor is placed in the position which corresponds to the position of the input unit on the mouse pad). The input unit itself understands which function it is to use on the basis of whether the position-coding pattern (and hence the identified coordinates) belongs to one or the other region.
Of course, alternatively the same surface can be used for the relative and the absolute function mode and the change can be carried out by means of change commands in the way described above.
A further example is that the mouse pad can have an area which is dedicated to a scrolling function. The input unit can thus be a mouse with various mouse functions. It can also be a mouse which, in addition to controlling a cursor on a display, can control other functions of a computer or other electronic devices, such as a mobile telephone or a PDA.
The above description is just one example of how the arrangement according to the invention can be designed. Based on the summary of the invention, experts in the field can achieve a number of variants of this example within the scope of the appended claims.

Claims

1. An arrangement for controlling an electronic device (200), said arrangement comprising an input unit (300) with an optical sensor for recording images, and a signal-processing unit (210) for identifying predetermined information in at least one of said images and for controlling the electronic device (200) dependent upon said predetermined information, c h a r a c t e r i s e d in that the input unit (300) in a first function mode is arranged to convert said at least one image to a current text string containing a sequence of characters, and that the signal-processing unit (210) is arranged to control the electronic device (200) on the basis of the current text string.
2. An arrangement according to claim 1, wherein the signal-processing unit (210) is arranged to control the electronic device (200) on the basis of the format of the current text string.
3. An arrangement according to claim 2, wherein the signal-processing unit (210) is arranged to match the format of the current text string to a format database comprising predetermined formats, each of which is associated with at least one command, and to generate the command associated with the current text string, so as to control the electronic device (200).
4. An arrangement according to claim 3, wherein said command initiates execution of software on the electronic device (200).
5. An arrangement according to claim 3 or 4, wherein said format database comprises predetermined formats of different types of addresses.
6. An arrangement according to any one of claims 3-5, which further comprises a database editor which allows a user to add formats and associated commands to the format database.
7. An arrangement according to any one of the preceding claims, wherein the signal-processing unit (210) is arranged to identify an address in the current text string and to cause the electronic device (200) to connect to said address.
8. An arrangement according to any one of the preceding claims, wherein the signal-processing unit (210) is arranged to cause, when identifying an address for electronic mail in said text string, the electronic device (200) to open a program for electronic mail.
9. An arrangement according to any one of the preceding claims, wherein the signal-processing unit (210) is arranged to cause, when identifying a web address in said text string, the electronic device (200) to open a web browser.
10. An arrangement according to any one of the preceding claims, wherein the signal-processing unit (210) is arranged to cause, when identifying a phone or fax number in said text string, the electronic device (200) to connect to the phone or fax number.
11. An arrangement according to any one of the preceding claims, wherein the input unit (300) in the first function mode is a handheld text scanner.
12. An arrangement according to any one of claims 1-10, which is selectively operable in a control function mode, in which the signal-processing unit (210) is arranged to control the electronic device (200) on the basis of the current text string, preferably the format thereof.
13. An arrangement according to any one of the preceding claims, wherein the signal-processing unit (210) is at least partly placed in the same casing as the electronic device (200).
14. An arrangement according to any one of the preceding claims, wherein the input unit (300) is arranged for wireless communication with the electronic device (200).
15. An arrangement according to any one of the preceding claims, wherein the input unit (300) in a further function mode is controllable to record single images.
16. An arrangement according to any one of the preceding claims, wherein the input unit (300) in a second function mode is arranged to control a cursor on a display (201) of the electronic device (200).
17. An arrangement according to claim 16, wherein the input unit (300) is arranged to automatically select function mode on the basis of the contents of said at least one image.
18. An arrangement according to claim 16 or 17, wherein the input unit (300) is arranged to operate in the second function mode when said at least one image contains a predetermined pattern.
19. An arrangement according to any one of claims 15-18, wherein the input unit (300) is arranged to automatically select a predefined function mode, preferably the first function mode, in the absence of a predetermined pattern in said at least one image.
20. An arrangement according to claim 18 or 19, wherein the predetermined pattern consists of a position-coding pattern, preferably an absolute position-coding pattern.
21. An arrangement according to any one of the preceding claims, which further comprises a product (100), on which a plurality of command words are indicated.
22. A method for controlling an electronic device (200), comprising the steps of operating a handheld input unit (300) to record at least one image, identifying predetermined information in said at least one image, and controlling the electronic device (200) dependent upon said predetermined information, c h a r a c t e r i s e d by the further steps of converting said at least one image to a current text string including a sequence of characters, and controlling the electronic device (200) on the basis of the current text string.
23. A method according to claim 22, comprising the further step of controlling the electronic device (200) on the basis of the format of the current text string.
24. A method according to claim 23, comprising the step of matching the format of the current text string to a format database comprising predetermined formats, each of which is associated with a command, and generating the command associated with the current text string, so as to control the electronic device (200) .
25. A method according to claim 24, wherein said command initiates execution of software on the electronic device (200) .
26. A method according to claim 24 or 25, wherein the format database comprises predetermined formats for different types of addresses.
27. A method according to any one of claims 22-26, comprising the steps of searching for an address in the current text string and, when an address is found, causing the electronic device (200) to connect to said address.
28. An input unit with at least a first and a second function mode, comprising a detector (8) for capturing images and an image processor (20) for processing the images to achieve said two function modes, c h a r a c t e r i s e d in that the input unit is arranged to change from the first to the second function mode when the image processor (20) detects a first piece of predetermined information in one of said images.
29. An input unit according to claim 28, wherein said first piece of predetermined information is a predetermined pattern.
30. An input unit according to claim 28 or 29, which is arranged to change from the second function mode to the first function mode when it detects a second piece of predetermined information in one of said images.
31. An input unit according to any one of claims 28-30, wherein said predetermined information consists of a position-coding pattern, preferably an absolute position-coding pattern.
32. An input unit according to any one of claims 28-31, wherein the first function mode is a mouse function, and the second function mode is an input function, preferably a scanner function.
PCT/SE2001/000360 2000-02-18 2001-02-19 Controlling an electronic device WO2001061454A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP01906485A EP1285329A1 (en) 2000-02-18 2001-02-19 Controlling an electronic device
AU2001234308A AU2001234308A1 (en) 2000-02-18 2001-02-19 Controlling an electronic device

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
SE0000541-3 2000-02-18
SE0000541A SE0000541L (en) 2000-02-18 2000-02-18 Electronic device control
SE0000939A SE0000939L (en) Input unit arrangement
SE0000939-9 2000-03-21

Publications (2)

Publication Number Publication Date
WO2001061454A1 true WO2001061454A1 (en) 2001-08-23
WO2001061454A8 WO2001061454A8 (en) 2002-03-28

Family

ID=26654994

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/SE2001/000361 WO2001061455A1 (en) 2000-02-18 2001-02-19 Input unit arrangement
PCT/SE2001/000360 WO2001061454A1 (en) 2000-02-18 2001-02-19 Controlling an electronic device

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/SE2001/000361 WO2001061455A1 (en) 2000-02-18 2001-02-19 Input unit arrangement

Country Status (5)

Country Link
EP (2) EP1259874A1 (en)
JP (1) JP2003523572A (en)
AU (2) AU2001234309A1 (en)
SE (1) SE0000939L (en)
WO (2) WO2001061455A1 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002039377A1 (en) * 2000-11-10 2002-05-16 Anoto Ab Device and system for information management
GB2377592A (en) * 2001-06-21 2003-01-15 Nec Corp A mobile phone provided with a mouse type input device such that the mobile phone may be used as a mouse for an associated device
WO2005010634A2 (en) * 2003-07-30 2005-02-03 Logitech Europe S.A. Digital pen function control
GB2444969A (en) * 2006-05-16 2008-06-25 Uniwill Comp Corp Transmitting handwritten scripts and converting handwritten patterns
US11269431B2 (en) * 2013-06-19 2022-03-08 Nokia Technologies Oy Electronic-scribed input

Families Citing this family (16)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7916124B1 (en) 2001-06-20 2011-03-29 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US7385595B2 (en) 2001-11-30 2008-06-10 Anoto Ab Electronic pen and method for recording of handwritten information
AU2012202678B2 (en) * 2002-09-26 2015-07-16 Kenji Yoshida Information reproduction i/o method using dot pattern, information reproduction device, mobile information i/o device, and electronic toy
EP1548635B1 (en) * 2002-09-26 2009-06-24 Kenji Yoshida Information reproduction/i/o method using dot pattern and information reproduction device
US7831933B2 (en) 2004-03-17 2010-11-09 Leapfrog Enterprises, Inc. Method and system for implementing a user interface for a device employing written graphical elements
US7853193B2 (en) 2004-03-17 2010-12-14 Leapfrog Enterprises, Inc. Method and device for audibly instructing a user to interact with a function
JP4550460B2 (en) * 2004-03-30 2010-09-22 シャープ株式会社 Content expression control device and content expression control program
US8094139B2 (en) 2005-02-23 2012-01-10 Anoto Ab Method in electronic pen, computer program product, and electronic pen
EP2511853A3 (en) 2005-04-28 2013-09-11 YOSHIDA, Kenji Dot pattern
JP3771252B1 (en) 2005-07-01 2006-04-26 健治 吉田 Dot pattern
RU2457532C2 (en) * 2006-03-10 2012-07-27 Кенджи Йошида Input processing system for information processing apparatus
JP4973310B2 (en) 2007-05-15 2012-07-11 富士ゼロックス株式会社 Electronic writing instrument, computer system
US10620754B2 (en) 2010-11-22 2020-04-14 3M Innovative Properties Company Touch-sensitive device with electrodes having location pattern included therein
JP5664300B2 (en) * 2011-02-07 2015-02-04 大日本印刷株式会社 Computer apparatus, input system, and program
JP5664301B2 (en) * 2011-02-08 2015-02-04 大日本印刷株式会社 Computer device, electronic pen input system, and program
JP5664303B2 (en) * 2011-02-09 2015-02-04 大日本印刷株式会社 Computer apparatus, input system, and program

Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797544A (en) * 1986-07-23 1989-01-10 Montgomery James R Optical scanner including position sensors
US4804949A (en) * 1987-03-20 1989-02-14 Everex Ti Corporation Hand-held optical scanner and computer mouse
US5477012A (en) * 1992-04-03 1995-12-19 Sekendur; Oral F. Optical position determination
US5852434A (en) 1992-04-03 1998-12-22 Sekendur; Oral F. Absolute optical position determination
US5932863A (en) * 1994-05-25 1999-08-03 Rathus; Spencer A. Method and apparatus for accessing electric data via a familiar printed medium
WO1999050787A1 (en) 1998-04-01 1999-10-07 Xerox Corporation Cross-network functions via linked hardcopy and electronic documents
WO1999060457A1 (en) 1998-05-20 1999-11-25 Autoliv Development Ab A foot pedal arrangement
WO1999060468A1 (en) 1998-04-30 1999-11-25 C Technologies Ab Input unit, method for using the same and input system

Family Cites Families (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
IT1164865B (en) * 1982-02-10 1987-04-15 Price Stern Sloan Publishers EQUIPMENT FOR TEACHING OR FUN
GB8701206D0 (en) * 1987-01-20 1987-02-25 Hilton C S Apparatus for capturing information in drawing/writing
US5051736A (en) * 1989-06-28 1991-09-24 International Business Machines Corporation Optical stylus and passive digitizing tablet data input system
JPH0428339U (en) * 1990-06-22 1992-03-06
US5442147A (en) * 1991-04-03 1995-08-15 Hewlett-Packard Company Position-sensing apparatus
JP3262297B2 (en) * 1993-04-27 2002-03-04 株式会社ワコム Optical coordinate input device
JP3277052B2 (en) * 1993-11-19 2002-04-22 シャープ株式会社 Coordinate input device and coordinate input method
US5661506A (en) 1994-11-10 1997-08-26 Sia Technology Corporation Pen and paper information recording system using an imaging pen
JPH0934633A (en) * 1995-07-17 1997-02-07 Sanyo Electric Co Ltd Space mouse and space mouse system
JPH0944591A (en) * 1995-08-03 1997-02-14 Olympus Optical Co Ltd Code sheet and information reproducing device
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
KR19980033584A (en) * 1998-04-28 1998-07-25 이종우 Data input device and method, computer system using same and method for executing the program
DE19835809A1 (en) * 1998-08-07 2000-02-10 Thomas Teufel Combo mouse

Patent Citations (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4797544A (en) * 1986-07-23 1989-01-10 Montgomery James R Optical scanner including position sensors
US4804949A (en) * 1987-03-20 1989-02-14 Everex Ti Corporation Hand-held optical scanner and computer mouse
US5477012A (en) * 1992-04-03 1995-12-19 Sekendur; Oral F. Optical position determination
US5852434A (en) 1992-04-03 1998-12-22 Sekendur; Oral F. Absolute optical position determination
US5932863A (en) * 1994-05-25 1999-08-03 Rathus; Spencer A. Method and apparatus for accessing electric data via a familiar printed medium
WO1999050787A1 (en) 1998-04-01 1999-10-07 Xerox Corporation Cross-network functions via linked hardcopy and electronic documents
WO1999060468A1 (en) 1998-04-30 1999-11-25 C Technologies Ab Input unit, method for using the same and input system
WO1999060457A1 (en) 1998-05-20 1999-11-25 Autoliv Development Ab A foot pedal arrangement

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP1285329A1 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2002039377A1 (en) * 2000-11-10 2002-05-16 Anoto Ab Device and system for information management
GB2377592A (en) * 2001-06-21 2003-01-15 Nec Corp A mobile phone provided with a mouse type input device such that the mobile phone may be used as a mouse for an associated device
GB2377592B (en) * 2001-06-21 2003-09-17 Nec Corp Portable telephone set
WO2005010634A2 (en) * 2003-07-30 2005-02-03 Logitech Europe S.A. Digital pen function control
WO2005010634A3 (en) * 2003-07-30 2005-04-21 Logitech Europ Sa Digital pen function control
GB2444969A (en) * 2006-05-16 2008-06-25 Uniwill Comp Corp Transmitting handwritten scripts and converting handwritten patterns
US11269431B2 (en) * 2013-06-19 2022-03-08 Nokia Technologies Oy Electronic-scribed input

Also Published As

Publication number Publication date
SE0000939D0 (en) 2000-03-21
AU2001234309A1 (en) 2001-08-27
EP1259874A1 (en) 2002-11-27
AU2001234308A1 (en) 2001-08-27
JP2003523572A (en) 2003-08-05
WO2001061455A8 (en) 2002-03-28
EP1285329A1 (en) 2003-02-26
WO2001061454A8 (en) 2002-03-28
WO2001061455A1 (en) 2001-08-23
SE0000939L (en) 2001-08-19

Similar Documents

Publication Publication Date Title
US7054487B2 (en) Controlling and electronic device
EP1285329A1 (en) Controlling an electronic device
US6992655B2 (en) Input unit arrangement
JP4119004B2 (en) Data input system
US6040825A (en) Input/display integrated information processing device
KR101026630B1 (en) Universal computing device
KR100918535B1 (en) Notepad
US6115513A (en) Information input method and apparatus using a target pattern and an access indication pattern
EP0354703B1 (en) Information processing apparatus
JP2003529985A (en) Method and system for associating information
WO2003023595A1 (en) Method, computer program product and device for arranging coordinate areas relative to each other
EP1256091B1 (en) Method and system for configuring and unlocking an electronic reading device
US20040203411A1 (en) Mobile communications device
CA2221669C (en) Information input method, information input sheet, and information input apparatus
WO2001048590A1 (en) Written command
JP6888410B2 (en) Information processing equipment and information processing programs
JP2004508632A (en) Electronic recording and communication of information
US20030085872A1 (en) Recording of information
JP3296858B2 (en) Image filing method
KR20090124091A (en) Input device for pad-mouse type
JP4479195B2 (en) Optical mouse and information device system
KR200215091Y1 (en) Mouse device having a function of input characters
JPH0465766A (en) Card information managing device
SE516739C2 (en) Notepad for information management system, has activation icon that enables a position code detector to initiate a predetermined operation that utilizes the recorded information
KR20070029415A (en) Notebook computer installed scanner

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
AK Designated states

Kind code of ref document: C1

Designated state(s): AE AG AL AM AT AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ CZ DE DE DK DK DM DZ EE EE ES FI FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: C1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

WR Later publication of a revised version of an international search report
WWE Wipo information: entry into national phase

Ref document number: 2001906485

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWP Wipo information: published in national office

Ref document number: 2001906485

Country of ref document: EP

NENP Non-entry into the national phase

Ref country code: JP