CA2277963A1 - Means for inputting characters or commands into a computer - Google Patents

Means for inputting characters or commands into a computer

Info

Publication number
CA2277963A1
Authority
CA
Canada
Prior art keywords
character
template
movement
pen
sequence
Prior art date
Legal status
Abandoned
Application number
CA002277963A
Other languages
French (fr)
Inventor
Geoffrey Norman Walter Gay
Current Assignee
Co Operwrite Ltd
Original Assignee
Individual
Priority date
Filing date
Publication date
Application filed by Individual
Publication of CA2277963A1

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/048 - Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883 - Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser using a touch-screen or digitiser, e.g. input of commands through traced gestures for inputting data by handwriting, e.g. gesture or text
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 - Arrangements for image or video recognition or understanding
    • G06V 10/10 - Image acquisition
    • G06V 10/17 - Image acquisition using hand-held instruments
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 - Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03 - Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/033 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354 - Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545 - Pens or stylus
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00 - Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10 - Character recognition
    • G06V 30/32 - Digital ink
    • G06V 30/36 - Matching; Classification
    • G06V 30/373 - Matching; Classification using a special pattern or subpattern alphabet

Abstract

Means for inputting a hand generated character into a computer comprises means (10) for drawing a character, means for abstracting a sequence of signals as the character is drawn corresponding to components of the character to produce a code representative of that character and means (12) for recognising that code, whereby the character is inputted to the computer (14).

Description

TITLE: Means for inputting characters or commands into a computer.
DESCRIPTION
This invention concerns means for inputting characters or commands into a computer or other information receiving device without a keyboard or the like using the automatic skills of handwriting.
The present day computer keyboard was initially designed to operate a typewriter. The keys were operated as levers to stamp a die onto paper to print each character. Each key carried two characters one above the other, the lower case character being reproduced by normal depression of a key onto paper with an ink ribbon therebetween and the upper case character being obtained by shifting the entire paper carriage or die set so that the impact occurs with the upper character die impression rather than the lower. Punctuation and special characters were obtained by shifting the numbers or with extra keys.
The printing method is fundamentally the same as in a printing press but the purpose of a typewriter is very different from the purpose of a press.
Printing, of course, allows publication of a manuscript and the reproduction of many identical copies of the original manuscript without the effort of handwriting each copy.
The typewriter came into being with the growth of modern commerce and the need for legible business letters. At that time (and indeed presently), handwriting was highly personal and showed great variation from one person to another. This made handwritten letters, agreements, contracts and other legal documents potentially ambiguous or unclear in meaning. It is this complexity of handwriting which militates against current approaches to computer analysis of handwriting.
Variations in handwriting represent simple information embedded in a mass of redundant detail. In modern information and communications, the approach to redundancy in a pattern is to throw large computing power into analysis and recognition. Computer equipment for analysing handwriting is available but does require considerable computing power and hence is relatively expensive and often cannot recognise the handwriting quickly enough, in real time, causing delays to the inputting process.
The analysis employed in such methods depends upon the extraction of salient features from the pattern of handwriting presented to the device and its software. It should be noted that the salient features chosen are often complex and any one may be specific to one character or letter. This implies that the set of such features is large and complex. In addition there exist a number of different ways in which a particular character can be drawn, each of which may contain different salient features. Add to this the difficulty that even with a single way of drawing a particular character, the actual pattern drawn will vary greatly from one person to another. The result is that such approaches to the computer recognition of handwriting have so far been limited in their success and often require a learning process in which the software adjusts to the handwriting of the user or the user learns a way of writing which allows the system to work. The overhead in terms of programme size and computing power required is often expensive and impractical in the application to hand-held computers or personal digital assistants, particularly at the smaller end of the scale of size, power and cost (the high volume market of pocket databanks, diaries, organisers and the like).
Another approach to data input to a computer from finger movements is embodied in systems that require the user to draw each character in a particular way, devoid of ambiguity. This results in a sort of short-hand code which has to be learned by the user. The short-hand forms are often not familiar or readily recognisable as the characters they represent. The result is a commercially successful system, but one some way removed from natural writing which needs to be learned and practised.
Another difficulty associated with the current approaches to handwritten input to a computer is the complexity and expense of the hardware required for the sensing of the finger movements. In both the approaches described above, the moment-by-moment and point-by-point form of the motion of the fingers must be sensed, digitised and transmitted to the processor carrying out the analysis and recognition.
In many devices currently available this function is performed by a pen or stylus moved by the fingers across a touch sensitive screen. The finger motions are detected by this device and transmitted to the processor, which causes an image of the movement to be displayed on the same screen. Such a complex input device is expensive and can represent a significant proportion of the cost of, for example, a hand-held computer.
Thus, it is not easy to input hand generated information into a computer in a direct manner.
The printed word, on the other hand, is clear and unambiguous. Every character can be standard in form and scale and easy to read. The printing press sets up its text as a block of lead type which is impressed onto one or more paper pages at a time. This allows the rapid production of many copies of a page. The typewriter, however, needed to be flexible at the level of each character, not at the level of each page. Hence, one key (one print operation) per character. Therefore, the present day keyboard has 60 to 70 keys.
Keyboards which deliver the component parts of each character (one part to one key) have been proposed. Because the form of printed numbers and letters can be simplified (they can be displayed with 7 and 14 segment displays), such a keyboard would only need a relatively small number of keys compared to the standard keyboard. However, such keyboards have not been successful, possibly due to the barrier of having to learn a new way of typing, which overrides the advantages of such a simple keyboard. It is to be noted that during conventional touch-typing, although the fingers of both hands cover the keys, only one finger is working at a time. With character constructing keyboards as mentioned above, a number of fingers must be employed simultaneously to print a character and so co-ordination skills must be learned by the user. This means that the typing skill called for is less natural than the one-key one-character scheme used by conventional keyboards.
An object of this invention is to provide means for inputting hand generated information into a computer.
According to one aspect of the invention there is provided means for inputting a hand generated character into a computer comprising means for drawing a character, means for abstracting a sequence of signals as the character is drawn corresponding to components of the character to produce a code representative of that character and means for recognising that code, whereby the character is inputted to the computer.
The signal abstracted preferably corresponds to a quantization of motion as the character is drawn. The signal abstracted may correspond to a change in direction as the character is drawn and/or may correspond to movement beyond one or more defined thresholds in a particular direction as the character is drawn and/or a signal abstracted may correspond to a change in position of the drawing means from one defined area to another defined area on a drawing surface.
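By way of illustration only, the following sketch (not taken from the patent text) shows the last of those options: a drawing surface divided into an assumed 2 x 2 grid of defined areas, with a signal abstracted whenever successive pen positions fall in different areas. The grid, the surface dimensions and the sample stroke are all assumptions.

    # Minimal sketch (an assumption, not from the patent): abstract a signal whenever
    # the pen position crosses from one defined area of the drawing surface to another.
    # The 2 x 2 grid, area labels and sample positions are illustrative only.

    def area_of(x, y, width=100, height=100):
        """Return a label for the quadrant of a width x height surface containing (x, y)."""
        col = 0 if x < width / 2 else 1
        row = 0 if y < height / 2 else 1
        return (row, col)

    def area_transitions(positions):
        """Yield a signal each time successive pen positions fall in different areas."""
        previous = None
        for x, y in positions:
            current = area_of(x, y)
            if previous is not None and current != previous:
                yield (previous, current)          # one element of the abstracted code
            previous = current

    # Example: a stroke that wanders from the top-left quadrant to the bottom-right.
    stroke = [(10, 10), (30, 20), (60, 30), (70, 70)]
    print(list(area_transitions(stroke)))          # [((0, 0), (0, 1)), ((0, 1), (1, 1))]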
According to a second aspect of the invention there is provided means for converting movement or force generated in reproducing a character into a coded signal corresponding to one or more elements of said movement or force that are indicative of the character, whereby the character is recognisable from said coded signal.
According to a third aspect of the invention there is provided a device for converting movement of or force applied to at least a part of said device, said movement or force being imparted by reproduction of a character, into a coded signal corresponding to one or more elements of said movement or force that are indicative of the character, whereby the character is recognisable from said coded signal.
According to a fourth aspect of the invention there is provided means for inputting a hand generated character into a computer having a monitor, comprising means for drawing a character to produce a sequence of signals corresponding to that character, means for converting signals produced for one character into a code representative of that character, means for recognising that code and means for providing visual feedback corresponding to the character being inputted as the character is being drawn.
The means according to this aspect of the invention may be used with any handwriting recognition/input system whether involving quantisation recognition or any other system of handwriting analysis.
According to a fifth aspect of the invention there is provided a visual feedback to the writer on a display screen. Feedback may take the form of a sequential build-up or animation of a character form which itself is produced from the above mentioned coded signal. Feedback may be generated by the processor which is connected to the above mentioned input means or input device or any other suitable input means.
Thus the display screen can show the results of the handwriting recognition process as a feedback of information to guide the writer. It preferably operates step by step as the elements of movement are coded by the input device and, unlike the prior art, includes the aspect of computer recognition in the visual feedback process. It does not indicate the moment by moment movement of the fingers or the point by point form of the character as drawn, as is the case with current approaches to handwriting input to a computer. The user is guided by the interpretation of the finger movements by the system, so as to be able easily and naturally to produce just the correct finger movements that will code as the correct sequence of elements for unambiguous recognition of the writing.
Preferably the visual feedback means comprises means for producing on a monitor a graphic simulation of a character component in response to an abstracted signal. The graphic simulation is preferably modifiable in response to a subsequent signal of a sequence for a character.
The graphic simulation preferably further includes an indicator as to position of the drawing means on a drawing surface. The indicator may comprise an icon displayed at or near an end of the latest graphic simulation component.
Alternatively, the indicator may comprise an icon that moves around the graphic simulation of a character in response to movement of the drawing means.
The feedback can be a smoothly produced animation of a cursive character form that responds during its formation to the incoming flow of recognised elements or signal codes.
The computer or input device appears to the user to be cooperating in the process of writing and to be producing the characters on the screen from the prompting provided by the finger movements.
Of course, the characters shown on the screen are not representative of the actual locus or form of the movement of the fingers, but are synthetic representations of the intent of the user, and merely guide the user in the inputting process.
From the user's point of view the characters seem to appear as if written by the user, with the cooperation of the computer.
Such characters can build up to display a completed word, for example, in a standard, clear, joined-up cursive writing, each character of which has been produced from the sequence of simple elements produced by or abstracted from the operation of the input device.
When the user lifts the pen or signals the end of a word in an appropriate manner, the processor can immediately replace the cursive characters with the same word displayed in a selected font appropriate to the application or application programme.
Prior art handwriting analysis systems input information describing the character as drawn, extract salient features (necessarily scale and speed independent), and compare them with a stored library of possible shapes, strokes and their inter-relationships, both spatial and temporal, to give the best fit to one character of a complete character set and thence the recognised code for that character. In contradistinction, the system of the present invention is a direct encoding system in which the movements generating the character as drawn are compared with a single template in such a way that compliant movements directly produce the elements of a code that identifies the character completely by the time the character is completed. At the instant the character is completed, the recognisable code has been completely built and no further analysis or processing is required for recognition.
Preferably recognition occurs character by character in real time. The one or more elements of movement or force are preferably unit vectors.
Preferably analysis of movements or forces into elements is by means of quantizing said movements or forces into one or a sequence of unit vectors.
These elements are preferably speed independent, are preferably scale independent and are preferably substantially independent of distortions or variations in the character as reproduced.
Preferably the elements form a set common to all the characters to be reproduced, which set does not contain elements specific to only one or a few characters.
The signal is preferably recognisable by a computer or any other information handling device to which the device is connected, whereby the character can be displayed on a visual display unit operated by the computer or can be processed in the same manner as a character input from a keyboard.
If an input device were activated by movements similar to those employed in writing, then this could provide a method of inputting characters and text into a computer without the need to learn a completely new skill.
What is here described is a device providing a method of analysis which is mechanical or automatic and does not require an indirect process of analysis and comparison to produce a unique code for a character, in contrast to prior art.
This automatic generation of a unique character code may be facilitated by means of a visual feedback from a display of the recognised elements of a character as synthesised from the signal from the input device.
The automatic switch-like method of extracting the coded signal from the finger movements gives rise to relatively simple and inexpensive input devices, recognition contemporaneous with the completion of a handwritten character, low computing power requirement, natural character forms and ease of learning and use, in contrast to prior art.

Thus the invention herein described allows data input to a computer or other system by means of the natural finger movements employed in writing, utilising simple and low cost input devices with high speed recognition and visual feedback.
There is an advantage to detecting motion as it is happening as opposed to analysing the space pattern of completed handwriting. The motion of a pen when writing the circle of the letter "a" is different from the motion when writing the circle of the letter "p", although the resulting shapes are very similar. The "a" circle is normally produced by an anti-clockwise motion whilst the "p" circle is normally made with a clockwise motion. This distinction is lost if the resultant handwritten character is considered after it has been written. However, if the handwriting is analysed dynamically, as it is being written, then the information gained is far more useful. It will be appreciated that references to detection of movement include detection of applied forces in generating said movements.
In a preferred embodiment the drawing means will be a hand held pen or the like, whereby the pen or a part thereof can be moved to reproduce characters.
It is envisaged that the drawing means of the invention will have a part that may be moved relative to a real or notional template when a character is being reproduced and that the drawing means will include means for detecting said movement relative to the template. The template may be incorporated in the drawing means itself or may be separate therefrom. There are various ways in which the movement of said part of the drawing means may be detected.
For example it may be possible to have a template around which said part of the drawing means can be moved, whereby contact of that part of the drawing means on a sensor in a particular part of the template will indicate a direction of movement and again one movement or a sequence of movements will generate a signal corresponding to the character being reproduced by those movements.
Put another way, assuming a pen having a body, a writing tip and a real template, the template may be separate from the pen, such as on a surface, may be fixed to the pen body or may be fixed to the tip. On the other hand, for a pen having a body and a writing tip, movement of either or both may be relative to a notional template associated with the body, the tip or a separate surface.
The means for detecting a movement of the drawing means or that part thereof may include contact switches, magnetic or capacitive sensors, optical encoders, light detectors, voltage changes, piezo-electric crystal activation or any other suitable means.
The system of the invention preferably includes means for signalling completion of a drawn character. Completion may be signalled by lifting the drawing means from a drawing surface. Alternatively, completion of a character may be indicated by a unique movement of the drawing means relative to that character.
Another alternative may be to indicate completion of a character by movement of one of the drawing means and an icon indicative of the drawing means to a defined position, possibly on the drawing surface or an area defined on a monitor.
The mode of analysis envisaged by the invention is actually concerned with the time patterns of muscle action, in contrast to the space patterns of completed handwriting. It is relevant to note that all communication occurs through the medium of muscle action, whether speech, body language, touch, action, handwriting or typing. The first outward expression of thought is always through muscular action.
This invention is aimed at allowing the communication with a computer to take place at the level of the neuromuscular skill of writing.
It will be appreciated, however, that there is considerable redundancy present in handwriting. Although handwriting may be taught in a uniform fashion, variations and embellishments are added as a person develops his handwriting skill, so that whilst letters and words can be recognised, it is extremely difficult for, for example, a computer scanning device to extract the essential characters because of personal variations and embellishments.
Accordingly, a preferred aim of the device of the invention is to enable characters to be reproduced as unit vectors. In other words, each character as it is drawn using the device of the invention preferably produces a signal for that character as one or a sequence of steps. This may be achieved by limiting or restricting registration of the movement to one or a series of quantized steps or unit vectors.
It is important to realise that signals which solely describe the position, movement or locus of a pen or moving part of the device simply provide a copy of that movement etc. in electronic, electrical etc form. They do not of themselves facilitate logical recognition of the inputted letter form or character form.
What this invention allows is an automatic reduction of the movement etc into a quantized form. This means that the movement is divided into steps which indicate the time sequence of unit vectors which characterise the movement etc. The steps themselves do not describe the point by point and moment by moment movement which results from drawing the character form. They are rather the result of an analysis of the movement etc which indicates a series of unit vectors. This series of unit vectors cannot be used to reconstruct the original finger movements, because all redundant space and time information is discarded in the process of detecting the unit vector sequence. All that remains is the sequence of the unit vectors and the character of the unit vectors.
The character of the unit vectors will be dependent on the design of the device. In the case of a physical square template the unit vectors could be characterised for example as being up, down, left or right.
The time delay between one unit vector and the next is not of importance and is discarded information. All that matters for recognition is the sequence, eg. left then down then right then up then down for the handwritten letter form "a".
Also the process of deriving the unit vectors disregards the scale or size of the movement or letter form. The same sequence of the same unit vectors results from a large "a" as from a small "a". In addition, provided the physical movements which activate the movement or position detectors are smaller than the smallest character to be drawn, the sequence of unit vectors will be the same for wide variations or distortions in the form of the original character, letter or resulting motion.
It should be noted that such a family of unit vectors (one simple case being UP, DOWN, LEFT, RIGHT) can represent all the characters to be input to a computer etc through finger movements.
In other words, each and every number, letter etc can be analysed into a sequence of the same set or family of unit vectors. The uniqueness of character resides in the sequence of the unit vectors which represents a unique code for the character. The different characters do not require analysis into unique individual features as in the prior art.
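A minimal sketch of this idea follows. The sequence for "a" (l, d, r, u, d) is taken from the description (see Figure 6); the other entries in the table are invented placeholders, not the stylisations actually proposed, and the function name is an assumption.

    # Sketch: one small family of unit vectors codes every character by sequence alone.
    CODEBOOK = {
        ("l", "d", "r", "u", "d"): "a",   # sequence given in the description (Figure 6)
        ("d", "u"): "i",                  # hypothetical placeholder
        ("u", "d", "r"): "t",             # hypothetical placeholder
    }

    def recognise(unit_vectors):
        """Look up the completed unit-vector sequence; no further analysis is required."""
        return CODEBOOK.get(tuple(unit_vectors))

    print(recognise(["l", "d", "r", "u", "d"]))   # -> 'a'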
The analysis of original motion into unit vectors is according to a scheme which compares the movement to an arrangement of detectors placed in a fixed relation to a real or notional template. This allows the motion to be compared with the geometry of a template in such a way that a compliant movement will result in a single signal or part of a signal which indicates the characteristic direction or movement at that stage of the drawing of the letter or character etc.
For example, once the moving part has gone beyond the upper limit of detection, the unit vector will indicate simply "up" until the moving part has once again returned within the scope of detection in that direction, when it could be followed by "down". Similarly with horizontal movement. This approach leads naturally to a description of operation of the device in terms of a template.
The template is simply the geometry which determines the signalling of the unit vectors, and may be a physical form eg. a square aperture within which the pen tip etc. moves, or it may be notional, and is simply the space pattern of detector switching limits in two dimensions or it may be embodied in the movement analysing processor which is connected to the input device moved by the fingers.
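A minimal sketch of such a notional template follows, assuming the template is nothing more than an excursion limit in each of two dimensions: when the pen position moves beyond the limit relative to the current reference point, one unit vector is signalled and the reference point is re-centred on the pen. The class name, the 0.5 unit limit and the convention that y increases upwards are assumptions.

    # Sketch of a notional template as a pattern of switching limits in two dimensions.
    class NotionalTemplate:
        def __init__(self, limit=0.5):
            self.limit = limit                    # assumed excursion limit ("template" half-width)
            self.ref = None                       # current reference point (template centre)

        def feed(self, x, y):
            """Return a unit vector ('U', 'D', 'L', 'R') if a limit is crossed, else None."""
            if self.ref is None:
                self.ref = (x, y)
                return None
            dx, dy = x - self.ref[0], y - self.ref[1]
            vector = None
            if abs(dx) >= abs(dy):                # dominant horizontal excursion
                if dx > self.limit:
                    vector = "R"
                elif dx < -self.limit:
                    vector = "L"
            else:                                 # dominant vertical excursion (y up assumed)
                if dy > self.limit:
                    vector = "U"
                elif dy < -self.limit:
                    vector = "D"
            if vector is not None:
                self.ref = (x, y)                 # re-centre the notional template on the pen
            return vector

    template = NotionalTemplate(limit=0.5)
    points = [(0, 0), (0.3, 0.1), (0.8, 0.1), (0.9, -0.6)]
    print([template.feed(x, y) for x, y in points])   # [None, None, 'R', 'D']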
Either scheme will result in practical devices which convert the finger and hand movement familiar to us as handwriting into a coded signal which is logically recognisable as corresponding to the character drawn.
For accuracy of coding, and in order to remove the inaccuracies introduced by personal embellishments, the writer may be guided by visual feedback from an image on a display screen, and can choose natural character shapes which can be learned quickly and easily.
Thus the device allows "typing" or inputting of textual information into a computer or other automatic text handler (eg. typewriter, portable databank or diary etc.) at handwriting speeds or faster, without the need to learn the far more complicated skills of touch typing using a conventional keyboard.
The principle of operation is based on the quantization of motion, and is not to be confused with handwriting analysis which causes automatic recognition of the form of normal personal handwriting (or even the recognition of a limited or defined or stylised set of character forms) by an analysis of its complex actual shape.
The aim of the template either real or notional is to register the movement of the device as unit vectors but not necessarily to restrict the movement of the device to unit vector form, whereby a recognisable signal corresponding to that character can be produced.
In preferred forms of the invention the relation between the template and the part or parts of the device will be flexible, thereby freeing the device from performing forced angular, rectangular or linear movements. In other words, by introducing a flexible linkage between relatively movable parts of the device or between a movable part of the device and the template, the device can follow both straight and curved lines whilst those movements will be detected as straight line movements or forces producing unit vectors.
Thus, the preferred device of the invention has the ability to detect movements of at least a part thereof in producing a character as one or a sequence of unit vectors to produce a signal corresponding to the character, even when the character is not reproduced in a format constrained by the geometry of the template.
The flexible linkage may take any suitable form. For example, when the tip of a pen device is to be movable relative to the body of the device, the flexible linkage may be provided by one or more elastic members linking the tip to the body.
Various considerations may be taken into account in deciding the nature of the real or notional template.
In one preferred embodiment, the template may be in the form of an enclosure having at spaced positions around its periphery means for detecting movement of said device part from one point to another around the periphery of the enclosure.
The enclosure may be of any suitable shape but will preferably be a square or a circle.
Preferably four detection positions will be provided at equidistant spacings.
The movable part of the device may be a rod or the like and its movement from one detection point to another may be detected by any suitable sensor means, such as those already suggested above.
In another preferred embodiment, the template may be in the form of a confined track around which the movable part of the device can travel, again with spaced detection points as in the first preferred embodiment.
In a yet further preferred embodiment, the template is notional rather than real and may be embodied in the processor running the requisite software and the movable part of the device may be detectable as being in accordance with a template.
Thus, the device of this preferred embodiment of the invention will include means for registering the movement of said movable part as though it were following a template. Thus, the device may be arranged to produce output signals when movement of at least a part thereof exceeds a notional boundary of the notional template.
It will be appreciated that these signals indicate major changes in direction as compared to a template or set of directions or axes. It is possible to derive the signals indicating the unit vectors as changes in velocity or other time derivatives as well as direction or position. Such a derivation is suited to the application of this invention to conventional computer pointing equipment.
For example, the data stream from a computer pointing device such as a mouse, trackball, pen and tablet etc indicates the relative position of the fingers moment by moment. If this data stream is analysed by a computer or dedicated processor in such a manner that excursions of the finger position are compared with a notional template, encoded in an algorithm stored within the computer or processor or its associated memory as a pattern of excursion limits in two dimensions, movements beyond these limits or compliant with the template boundaries can trigger the generation of a sequence of signals, indicative of the unit vectors, which codes uniquely for the character drawn by the fingers which are moving the mouse, trackball, pen and tablet or other pointing device.
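The sketch below illustrates this kind of analysis on an assumed stream of relative (dx, dy) reports from a pointing device: excursions are accumulated and, when one exceeds an assumed limit, a unit vector is emitted and the accumulator reset. The limit value, the function name and the screen convention that y increases downwards are assumptions, not details from the patent.

    # Sketch: quantise a pointing-device stream against a notional pattern of excursion limits.
    def quantise_stream(deltas, limit=8):
        """Convert a stream of relative (dx, dy) reports into unit vectors."""
        vectors = []
        ax = ay = 0                                # accumulated excursion since the last vector
        for dx, dy in deltas:
            ax += dx
            ay += dy
            if abs(ax) > limit and abs(ax) >= abs(ay):
                vectors.append("R" if ax > 0 else "L")
                ax = ay = 0
            elif abs(ay) > limit:
                vectors.append("D" if ay > 0 else "U")   # screen coordinates: y grows downwards
                ax = ay = 0
        return vectors

    # A crude clockwise loop drawn with a mouse (compare the "p" circle): right, down, left, up.
    reports = [(5, 0), (6, 1), (1, 6), (0, 5), (-6, 1), (-5, -1), (-1, -6), (1, -5)]
    print(quantise_stream(reports))                # ['R', 'D', 'L', 'U']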
This invention will now be further described, by way of example only, with reference to the accompanying drawings, in which:
Figure 1 shows schematically a system for writing into a computer;
Figures 2A and 2B show a possible arrangement for a pen device of the invention;

Figures 3A and 3B show possible movement of the pen body of Figure 2 and the resulting sequence of unit vectors around the template;
Figure 4 shows alternative forms of a letter, each of which can be represented by the same sequence of constrained movements;
Figure 5 shows another possible arrangement for a pen device of the invention;
Figure 6 shows a unit vector sequence resulting from forming a letter;
Figure 7 shows a variety of forms of the same letter that may all produce the unit vector sequence illustrated in Figure 6;
Figures 8A to D show schematically operation of a pen device utilising frictional forces between its tip and a surface;
Figures 9A and 9B show the correspondence of intended character, unit vector sequence and the animated cursive character form used in the visual feedback;
Figures 10, 11 and 12 show yet another form of pen device according to the invention;
Figures 13A and 14A are sectional views through a yet further form of pen device according to the invention; and Figures 13B and 14B are sections on lines AA and BB respectively of Figures 13A and 14A.
Figure 15 illustrates the principle of using a virtual template in relation to a pen device according to the invention;
Figure 16 shows a pocket databank with conventional keyboard;
Figure 17 shows a pocket databank with a pen device of the invention;


Figure 18 shows a flow chart illustrating the procedure of synthesising an animated image to be displayed on a screen to provide visual feedback to the writer;
and Figure 19 shows the flow of information in such a system employing an input device of the invention and a method of visual feedback described herein;
Figure 20 shows a letter "a" reproduced with an additional movement to indicate completion and start of the next letter;
Figure 21 illustrates detection of double unit vectors;
Figure 22 shows detection of double unit vectors in drawing a letter "g";
Figure 23 illustrates provision of an actual pen position icon as a letter is drawn;
Figure 24 illustrates provision of a synthetic pen position icon as a letter is drawn;
Figure 25 illustrates how letters may be drawn starting from the same point;
Figure 26 shows use of guide lines to aid character input;
Figure 27 illustrates visual feedback compared to actual movement of a drawing device;
Figure 28 illustrates a display screen with special areas for signalling completion of a character; and Figure 29 illustrates visual feedback with modification as new unit vectors are detected.
Referring to Figure 1 of the accompanying drawings, there is shown schematically an embodiment of the invention.

A pen device 10 contains a template which constrains the movements performed automatically by the fingers during handwriting and abstracts from these movements the elements that allow computer recognition. The result will be a "pen" which senses the sequence of movement elements in each character while allowing the user to feel as if he is writing in a near-normal way. The sequence of movements can be registered electronically via mechanical switches or optical, electric or magnetic sensors or other means and the sequences decoded by a microprocessor 12 and the characters transmitted to a computer as if from a keyboard and displayed on a visual display unit of computer 14 as they are recognised. Alternatively, the sequence can be transmitted directly for simple logical recognition therein.
Taking this concept a step nearer to a practical form, one of the simplest forms of template is a square and the template could be constrained to move around the pen tip with the pen tip held stationary. Such a pen would feel like being forced to write in a squared handwriting. Add to this a "soft" or flexible linkage, integral with the pen, to allow for writing the circle of, for example, an "a" or a "p".
Such an arrangement as shown in section in Figures 2A and B of the accompanying drawings allows the pen to describe a circle while the template moves around the pen tip in four segmented movements. As the pen body 18 is moved in a circle by the fingers, the flexible linkage 20 will stretch to drag the template 24 around the pen tip 22. The forces involved can be quite small - giving a slight tactile feedback to guide the user. As the template is within the pen body, and is smaller than the smallest circle drawn by the user, the template will be pulled against the pen tip sides by the slight force of the stretched flexible linkage. The relative movement of the pen tip and template is, therefore, constrained to the four possible segments of the square template.
Figure 2B shows the pen at rest and Figure 2A shows the pen moved in the direction of arrow F.
These segments can be thought of as "unit vectors" which can be one of the following: up, down, left, right or u, d, l or r. Thus the sequence of movements for the "a" circle would be detected as:
l, d, r, u
and the sequence for the "p" circle will be:
r, d, l, u
Figures 3A and B show respectively how a letter might be drawn with the pen of Figure 2 and the resultant sequence of unit vectors. This sequence of unit vectors will be the same with a wide variation of circle shapes such as shown in Figure 4 of the accompanying drawings.
In Figure 4, if all the circles were drawn clockwise starting with the pen tip in the top right of the template then they would all produce the same sequence of unit vectors:
d, l, u, r, and yet the user would feel that he was drawing a free form circle.
In a practical form of this pen the body would be moved by the fingers while the tip would be pressed onto a surface and held still. The template could then be integral with, and inside, the pen body (a typical template equivalent size is 0.5mm per side) and the tip would simply be the lower end of a spine rod that extended up the central hollow of the pen, and connected to the pen body through the flexible linkage and thus be constrained to move around the sides of the square template. The user would feel that he was writing in a near normal way while the finger movements would be converted into a sequence of unit vectors.
It turns out that a square template, for example, can code uniquely for all the lower case letters of the English alphabet and for the numerals 0 - 9.
In order for this device to be useful in producing movement sequences recognisable by a computer as characters, it is necessary to explore the unit vector conversion of each character in the character set a-z and 0-9. The character forms are desirably intuitive and simple. It is proposed to write in lower case and shift to upper case (for example with a simultaneous modifier key mounted on the pen body).
A shift key could allow the input of capital letters and the special characters ! ~ @ $ & etc. as with the standard keyboard. Thus, writing the character "a" while the shift key is down could give "A".
Further modifier keys, for example "option", could be employed to generate commands to the computer.
It will be noted that many redundant codes of unit vectors are available for the special characters, punctuation and commands.
For example a single "left" movement giving the L unit vector could delete the last character input, with the same result as pressing the "delete" key on a computer keyboard.
To determine the start and end of each character a signal could be generated by a switch inside the pen body activated by the pressure of the pen tip on the surface or by a third key. This key would be pressed while "writing" a character and released at the end of the character sequence. The action becomes swift and automatic with a little practice. The end-signal would initiate the unit vector sequence analysis, a Look-up algorithm lasting a few microseconds, and the character would then appear on the computer screen.
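A sketch of such an end-of-character look-up with a shift modifier is given below. The single "left" code acting as delete follows the example above; the letter table and everything else in the sketch are illustrative assumptions.

    # Sketch of the end-signal look-up: a direct table look-up, not an analysis step.
    LETTER_CODES = {("L", "D", "R", "U", "D"): "a"}       # "a" code from the description
    COMMAND_CODES = {("L",): "DELETE"}                    # a single "left" deletes the last character

    def decode(sequence, shift=False):
        """Run at the character end-signal; returns a character, a command, or None."""
        key = tuple(sequence)
        if key in COMMAND_CODES:                          # redundant codes reserved for commands
            return COMMAND_CODES[key]
        char = LETTER_CODES.get(key)
        if char is None:
            return None                                   # unrecognised sequence
        return char.upper() if shift else char            # shift modifier gives the capital

    print(decode(["L", "D", "R", "U", "D"]))              # a
    print(decode(["L", "D", "R", "U", "D"], shift=True))  # A
    print(decode(["L"]))                                  # DELETE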
In another embodiment of this invention, the character end can be signalled by a slight pause (for example while the visual feedback device completes the animation of the intended cursive character form on the display screen) and the end of a word is signalled by the writer lifting the pen from the "writing" surface.
An arrangement for a template is shown in Figure 5 of the accompanying drawings. A square template 50 has sensor switches 52 (1, 2, 3 and 4) to detect the position of the pen tip 54 (more accurately the spine rod) within the square.
These switches 52 are located at the centre of each template side and each switch operates whenever the spine rod is pressing against a particular side. It is the time sequence of these switch transitions that signals the motion of the pen relative to the spine rod and pen tip.
This leads to reduction in the redundancy of the information contained in the motion. Just as in the space domain the variation of form is removed by reducing the motion into notional unit vectors ("unit" implying the transparency of the absolute vector length - only the direction component is abstracted; this being effected by the design of the hardware switching), so in the time domain the variation in timing is removed by abstracting only the order of the switch transitions and disregarding the absolute time intervals involved; this being effected by the design of the software sequencing.
(Note that the spine rod and the template dimensions can be many times larger than the effective template size. The effective size is equal to the possible movement of the spine rod or pen tip within the template. This can be typically 0.5mm x 0.5mm. Compare this with the movement producing a written "a" having a diameter of about 3mm).
The sequence of transitions generated by drawing an "a" with the arrangement of Figure 5 will be:
2- 4+ 1- 3+ 4- 2+ 3- 1+ 1- 3+
(where + signifies a switch turning on and - signifies it turning off, the number preceding the sign indicating the switch number). This is because the unit vector sequence for "a" is: l, d, r, u, d starting at the top right of the template (see Figure 6).
Thus the same sequence of transitions will be generated if the user draws the first curve of the "a" slowly and then speeds up or when he begins quickly and then slows down. All that matters is the relative order of the unit vectors.
Also, provided that the miniature square template inside the pen is smaller than the smallest "a" drawn, all the "a"'s shown in Figure 7 will also encode as:
2- 4+ 1- 3+ 4- 2+ 3- 1+ 1- 3+
irrespective of variations of form or scale.
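The sketch below decodes such a transition list into unit vectors on the assumption that a "+" transition (the spine rod newly pressing against a side) signals motion towards that side. The side numbering used (1 = top, 2 = right, 3 = bottom, 4 = left) is inferred from the worked "a" example and is an assumption about the layout, not a statement of the actual hardware.

    # Sketch: keep only the order of switch-on events; absolute timing is discarded.
    SIDE_TO_VECTOR = {1: "u", 2: "r", 3: "d", 4: "l"}     # assumed side numbering

    def transitions_to_vectors(transitions):
        """Map each switch-on ('+') transition to the unit vector towards that side."""
        return [SIDE_TO_VECTOR[switch] for switch, state in transitions if state == "+"]

    # The transition sequence given in the description for a handwritten "a".
    a_transitions = [(2, "-"), (4, "+"), (1, "-"), (3, "+"), (4, "-"),
                     (2, "+"), (3, "-"), (1, "+"), (1, "-"), (3, "+")]
    print(transitions_to_vectors(a_transitions))   # ['l', 'd', 'r', 'u', 'd']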
Remember that the fingers move the pen body freely and the relative movement of the tip and the template is effected through a flexible linkage.
This means that the character drawn can contain curves yet the template moves around the pen tip in a series of linear steps.
Turning to the question of stylising character forms to facilitate recognition of movement sequences, it is to be remembered that the upper case forms can be generated automatically by the look-up algorithm in response to the lower case unit vector sequence plus a shift key or the like. It is important to realise that the locus of the pen body is invisible. The pen movements are felt not seen. The pen does not "write", it simply signals codes to the computer. The stylised characters which may be used are virtual characters. The mind's eye constructs its own fond image of the character it thinks it is drawing.
Instead of the rigid finger positioning over the conventional keyboard during touch typing, the pen allows a relaxed operation. As the pen does not need to move across the "page" and as the movements may be guided automatically by tactile and/or visual feedback there is absolutely no need to look down at the pen.
One further embodiment of the invention is a pen device shown schematically in Figures 8A to D, wherein its tip 200 is held in contact with a "writing" surface and is moved in relation to a real or virtual template 202 by means of the frictional force between the tip and the surface. This will signal the direction of movement of the pen body as it is moved by the fingers and hand. Figures 8A to D show respectively the pen moving downwards, upwards, to the left and to the right. As the tip moves under frictional forces, it touches contacts 211, 212, 213 and 214 respectively and thus signals a unit vector sequence. Such a pen is free to move over a surface in the same manner as a conventional pen.
Referring to Figures 9A and 9B, these tables show character stylisations which form a character set which is only one example of many possible sets. The optimum set in any particular embodiment of the invention will depend on the template design and the arrangement and logic of the switching and the relationship to the animation sequences chosen to optimise the visual feedback, as well as personal preferences.
This set relies on a flexible linkage to give a realistic feel to the drawing of the letters. Obviously the simple square template will not allow excursions (tails) up or down. However the fingers carry these out automatically, the pen body following the fingers, but the spine rod stays within the template square. Happily each character still generates a unique unit vector sequence and codes unambiguously into the target computer.
Obviously the writer will have to adapt the writing of each character to produce just the unit vectors required for error free recognition. However the abundance of codes derivable from sequences of unit vectors allows for multiple ways of drawing particular letters. (See the example of the letters "b" and "q" in the set of Figures 9A and 9B).
Most importantly the visual feedback will guide the writer effortlessly if the elements of the animation building the cursive character forms are designed to confirm the completed movements at any point in time and prompt for the required subsequent movements.
Because of the flexible linkage and the mind's own image of what it is telling its fingers to do, these letter forms seem quite natural.
After a little practice, far less than is needed to become skilled at using a conventional keyboard with all these characters, the component movements are not created individually but in a fast automatic flow, as the mind goes through the act of writing each character. The speed can be typically 20 unit vectors per second.
In Figures 10, 11 and 12 is shown a form of pen according to the invention in which the pen has a body 60 which is movable relative to a template 62 in the pen tip 64 which is held stationary upon a surface. The pen tip 64 may include a suitably shaped rubber or the like pad which is relatively non-slip upon say a table.
The advantage of this embodiment is that the actual movement of the pen around the template and the imagined movement of the pen tip are equivalent.
With the pen described earlier, these movements are opposite in sense and the mental link between the two has to be unlearned. The template may be of any desired shape with movement sensors also of any desired type as described hereinbefore or later.
Another refinement, which may be applicable to four-switch templates and more complex templates, is to generate the character start and end signals from the template switches. The start signal may be turned on whenever at least one of the template switches is on, and may be turned off whenever all four template switches are off. This defines a starting point for the pen tip at the centre of the template. If in addition, the pen tip is centre-sprung, ie. automatically returns to centre after each excursion, either by slightly lifting the pen or simply by relaxing pressure, then the process of sending a character becomes easier and automatic. The logic of the start signal may be handled electronically.
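A sketch of that start/end logic follows, treating the character as in progress while any of the four template switches is on and ended when all four are off. The sampled switch states and the event names are assumptions.

    # Sketch: derive character start/end signals from the template switches themselves.
    def character_boundaries(switch_states):
        """switch_states: an iterable of 4-tuples of booleans, one sample per step.
        Yields 'start' and 'end' events as the any-switch-on signal changes."""
        active = False
        for states in switch_states:
            any_on = any(states)
            if any_on and not active:
                yield "start"                       # at least one switch has just come on
            elif not any_on and active:
                yield "end"                         # rod back at the centre-sprung rest position
            active = any_on

    samples = [(False, False, False, False),
               (False, True, False, False),         # rod pressed against one side
               (True, True, False, False),
               (False, False, False, False)]        # rod returned to centre
    print(list(character_boundaries(samples)))      # ['start', 'end']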
More complicated templates can be constructed, where the freedom of movement of the pen tip is greater. An analogy would be the increasing complexity of car gearshift gates as the number of gears increases.

When a physical or real template is being used, the effective size of the square template may be reduced until the relative movement of the pen body and the spine rod or pen tip is arbitrarily small. The unit vectors may then be sensed using pressure transducers or strain gauges on each of the four template sides.
Character start/stop signals can be derived logically from the template signals.
A degree of flexible linkage is desirable to allow a very slight movement of the pen under the pressure of the writing fingers. This can be achieved by moulding the pen tip from say rubber or like material, and/or building in a slight compression movement into the pressure transducers or some other convenient position.
The movement of the pen in this arrangement is not constrained so obviously to a square template; however, the signals from the transducers will conform to the same coding sequences for the same characters.
Writing control can be effected by means of an audible feedback generated from the vector recognition circuits. For example, as the fingers go through the movements of a particular stylisation, an audible signal can be generated as each vector is completed, the frequency of the sound being arranged to be unique to each vector. After a little practice this feedback could be muted or disabled. The occurrence of a mistake (unrecognised sequence) for a particular character could switch this feature back on for a predetermined number of characters following, thus reinforcing the learning process, just as, when dialling a familiar number on a touch-tone telephone, a mistake immediately "sounds" wrong and familiar groups of numbers sound right.
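A sketch of such a vector-to-tone mapping is given below. The frequencies are arbitrary assumptions and actual tone generation is omitted; only the selection of a unique frequency per unit vector is illustrated.

    # Sketch: each unit vector is given its own tone so a familiar character "sounds right".
    TONE_HZ = {"U": 440, "D": 330, "L": 262, "R": 523}    # assumed frequencies, one per vector

    def feedback_tones(unit_vectors, enabled=True):
        """Return the tone frequencies to sound for a sequence of unit vectors."""
        if not enabled:                                    # feedback can be muted once learned
            return []
        return [TONE_HZ[v] for v in unit_vectors]

    print(feedback_tones(["L", "D", "R", "U", "D"]))       # [262, 330, 523, 440, 330]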
A further feedback to facilitate both learning and the normal operation of the device could be a visual indication of the vectors themselves as they build up to describe a character. Most computer displays operating in word processing mode employ a cursor shape on screen to indicate the insertion point. This could be replaced with say a square representation of a virtual template showing the vectors as emboldened sides of the square (or whichever alternative template shape is used). At character end-signal this graphic would be replaced with the coded character and would itself move on to the next text position, ready to display the next pattern of vectors.
More sophisticated techniques of visual feedback and confirmation may be employed, in which the vector sequence information is used to synthesize a graphic image on screen which reflects the growing character as intended by the operator, using a stored programme to determine the available possibilities at each stage in order to guide the formation of the inputted character.
Such a system of visual feedback is illustrated in Figure 18, which is to be read as a flowchart. It illustrates the way in which characters that all begin with an "UP" unit vector (chosen as an example) may be reproduced on a display screen as a progressively developing image of the intended character in synthesised, clear, standard, cursive form (represented in the square boxes).
In the flowchart of Figure 18, the sequence of unit vectors is indicated by the symbols in the circles. Thus 1U indicates that the first unit vector is "UP". Similarly, for example, 6L indicates that the sixth unit vector is "LEFT".
At the point of recognition, when the system decodes the finger movement into a unique unit vector sequence for a specific character, then at the corresponding point in the flowchart of Figure 18 the recognised character is indicated by a square box containing the corresponding font character.
The progressive animation develops each character as the fingers move in drawing the character while holding the input device which converts these movements into a sequence of unit vectors. It is this stream of unit vectors which determines the animation process. Thus the feedback loop is closed allowing a completely novel method of inputting handwritten information into a computer or the like.
In other words the eye sees the character form on the screen as the fingers move in such a way as to produce the unit vector sequence. The computer etc appears to cooperate with the user in the process of writing the characters.
In the example illustrated in Figure 18 the letters "l", "h", "b" and "t" are reproduced and recognised. It can be seen from this example that all the basic forms of the characters "a" to "z" and "0" to "9" can be similarly analysed into unit vectors and animated on a display screen.
It is important to note that the definitions of the letter forms in terms of the unit vectors bear a functional relationship to the sequence of metamorphosis of the animation of the synthetic on-screen cursive character forms. As the unit vector sequence is generated automatically, the animation responds by developing the letter through the forms possible at each stage. Thus, referring to Figure 18, the letter form for a cursive l transforms into the cursive form for the letter h with the further input of unit vectors U R D. Similarly the h transforms to the form b after an L unit vector. Thus, the design of the cursive font employed in the visual feedback animation contains the structure of the basic handwriting movements as defined by the unit vector sequences (ie simple changes of average direction) as can be easily and automatically detected.
Thus the design of the visual feedback font and the process of its animation are very important. It is envisaged that different such fonts can be designed for different applications, languages, countries, scripts and users.
This gives rise to a device which allows the writing of natural character forms to be elegantly guided by visual feedback, thus placing the brain, fingers, input pen or input device, computer processor, display screen and eye, all in the same feedback loop.
Figure 19 shows this feedback loop. The flow of information is indicated by the arrows 406 (1 to 5). The fingers 400 of the writer perform the movements of writing a character and these movements are detected by the input device 401 which automatically produces signals indicative of the unit vectors characterising the character drawn. These signals are fed to a processor 402 which synthesises an animated image in response to the sequence of these unit vectors. The animated character is displayed on a display screen 403 and viewed by the eye 404 of the writer. Thus the brain 405 of the writer receives feedback according to the development of the unit vector sequence in terms of the development of the synthesised image indicative of the writer's intention, and is able instinctively to correct the movement of the fingers to cause correct computer recognition of the character drawn.
The process of computer recognition is thus included in the total feedback loop involving the user. This is in complete contradistinction to prior art, where the feedback is merely from the reproduction of the actual finger movements on the display screen and does not include the recognition process itself.
The end of each character is signalled in this example by a slight pause in pen movement, shown in Figure 18 as a letter P in a circle. However, the on-screen animation can produce joined-up cursive handwriting by a simple process of stored instructions responding to the unit vector sequence, and animating the connecting links between letters.
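As a rough sketch of the pause-based end-of-character signal (the timeout value and all names are assumptions, not taken from the specification):

import time

PAUSE_SECONDS = 0.25    # assumed value; the text only specifies "a slight pause"

class PauseDetector:
    """Signals end of character when no unit vector arrives for a short interval."""

    def __init__(self, on_character_end):
        self.on_character_end = on_character_end
        self.last_vector_time = None

    def vector_received(self):
        # Called whenever the input device reports a new unit vector.
        self.last_vector_time = time.monotonic()

    def poll(self):
        # Called periodically by the host processor.
        if (self.last_vector_time is not None
                and time.monotonic() - self.last_vector_time > PAUSE_SECONDS):
            self.on_character_end()
            self.last_vector_time = None

detector = PauseDetector(lambda: print("end of character"))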
It should be noted that the process of animation can present the user with a continuously moving cursive line on the display screen, in response to the signals from the input device, which may themselves be discontinuous in time. The eye sees what the mind intends, rather than what the fingers are doing. After a very short period of use, the process can become virtually automatic and natural.
At the end of each word the pen or input device may be lifted up (just as in normal writing onto paper) to activate a signal (produced automatically from a switch or other sensing means) to the system processor to initiate the transformation of the completed cursive image of the written word on the screen into the corresponding font characters of the application programme etc which is the object of the data input.
It should be noted that each character is recognised at the pause after the last unit vector has been input. In other words the user will pause momentarily after completing each character, while the processor completes the animation of the cursive character form on the display screen. This image of a cursive character form is already a product of the recognition process and has been derived from a unique code of unit vectors already input to the system, and should not be confused with the cursive forms indicative of the actual unrecognised finger movements displayed in inventions of prior art.
In this example the cursive form is displayed on screen until the whole word is completed to facilitate useful feedback to the writer.
It should be understood that the cursive letter form so synthesised and displayed bears a functional relationship to the finger movements employed in writing the character. It would not be so useful to display the "printed" font characters at this point.
The structure of the synthesised character forms is based on the unit vectors that characterise the corresponding written characters. This relationship can be seen in the example of the flowchart of Figure 18.
The feedback thus guides the writer in a most natural way to input the correct sequence of unit vectors, without consciously having to pay attention to that level of analysis.
Once the whole word is completed the system has all the information required to display the recognised characters in the final form of "printed" font characters to make up the complete printed word.
It is easy to conceive computer learning programmes to take a new user through the structure of the character set stylisations, using graphics and feedbacks similar to those described above.
It is possible to use a virtual template as opposed to a physical template.
The character recognition in the physical template systems is facilitated by the simplification of the movement by means of the physical boundary of the template and by the resultant reduction of that movement to scale-independent and speed-independent unit vector sequences.
However, a further refinement is still possible, in which the restriction of the movement by a physical barrier is replaced by a notional limit to the registration of that movement. If movements are only recognised by sensors in directions parallel to the sides of a notional, non-physical template, and if these movements are quantized by the sensors and/or their associated electronics and algorithms up to a specific limit of excursion, and if this limit is smaller than the smallest character drawn, then the end result will be the same for the same character stylisations as with a physical template.
This would lead to the design of physically simpler, faster pens or touch screen sensing of stylus or finger movements and allow the invention to work utilising the input devices now available for computers such as the mouse, tracker ball, finger pad, touch sensitive screen, pressure sensitive screen, pen and digitising tablet and the like.
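A minimal sketch of such a notional template, assuming only that the input device reports incremental (dx, dy) movements, as a mouse or finger pad typically does; the excursion limit and all names are illustrative.

# Sketch: movement is quantised into unit vectors once the excursion in one
# of the four notional template directions exceeds a small limit.

LIMIT = 10.0   # excursion limit in device units; assumed to be smaller than
               # the smallest character the user will draw

class VirtualTemplate:
    def __init__(self, limit=LIMIT):
        self.limit = limit
        self.ax = 0.0           # excursion accumulated since the last unit vector
        self.ay = 0.0

    def move(self, dx, dy):
        """Feed an incremental movement; return a unit vector or None."""
        self.ax += dx
        self.ay += dy
        if abs(self.ax) >= abs(self.ay):        # horizontal excursion dominates
            if self.ax >= self.limit:
                return self._emit("r")
            if self.ax <= -self.limit:
                return self._emit("l")
        else:                                   # vertical excursion dominates
            if self.ay >= self.limit:           # assuming y grows downwards
                return self._emit("d")
            if self.ay <= -self.limit:
                return self._emit("u")
        return None

    def _emit(self, vector):
        self.ax = self.ay = 0.0
        return vector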
Further refinements of the invention are described below with reference to Figures 20 to 29 of the accompanying drawings.
Characters to be input are defined in terms of the movements required to produce the appropriate unit vector sequence. Therefore, predetermined styles of character are pre-supposed. These characters can be very close and in most cases identical to natural character forms. Characters may be defined in terms of unit vectors in such a way that each character is represented by a unit vector sequence that is not a truncation of any longer unit vector sequence for another character.
That can allow continuous input (eg within a word) without necessarily signalling completion of a character in some way. Thus, completion of a character may be signalled by the last unit vector of the defined sequence for that character.
An example of such a unit vector set follows:
a = rldrud then r for start
b = uddurdl then r for start
c = rldr then r for start
d = rldruudd then r for start
e = ruldr then r for start, or ruld then r for start
f = uddu then rr for start
g = rldruddl then r for start
h = uddurd then r for start
i = d then r for start
j = dl then r for start
k = uddrl then r for start
l = udd then r for start
m = dudud then r for start
n = dud then r for start
o = rldru then r for start
p = dduurdl then r for start
q = rldudd then r for start
r = duudr then r for start
s = rudl then r for start
t = udrld then r for start
u = drud then r for start, or dru then r
v = du then r for start
w = dudu then r for start
x = rl then r for start
y = druddl then r for start
z = rlrdl then r for start
Figure 20 shows an animated screen image corresponding to movement of a drawing device in drawing a letter "a" according to the above unit vector set.
The last RIGHT movement signals the completion of a unique unembedded code for "a"
and therefore the end of the character. That can be used to cause the visual animation on the display screen of a line extending to a standard start position for the next character.
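To illustrate the property described above, the sketch below checks that a code set is "unembedded" (no code is a truncation of a longer code) and decodes a continuous unit vector stream; the two-entry table is only a placeholder, and appending the return-to-start movement to each code is an assumption made so that the example decodes unambiguously.

def is_unembedded(codes):
    """True if no code is a truncation (prefix) of another code in the set."""
    return not any(a != b and b.startswith(a) for a in codes for b in codes)

def decode_stream(stream, codes):
    """Greedily decode a continuous unit vector stream into characters."""
    out, buf = [], ""
    for v in stream:
        buf += v
        if buf in codes:            # completion is signalled by the last unit
            out.append(codes[buf])  # vector of a defined, unembedded sequence
            buf = ""
    return "".join(out)

# Placeholder table: the return movement "r" to the standard start position
# is appended to each character's basic code (an assumption for this sketch).
CODES = {"rldrudr": "a", "dr": "i"}
assert is_unembedded(CODES)
print(decode_stream("rldrudrdr", CODES))    # -> "ai"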
The signalling of the end of a word may be achieved by a pen lift activating a switch or sensor, or by other means, eg a button press, a special unit vector sequence or a special movement sequence.
Unit vectors may be derived in the following ways:
from switches detecting motion in a pen device as described above;
from exceeding a threshold of motion in a direction;
from exceeding a threshold of any combination of time derivatives of motion in a direction;
from movement from one defined area of writing surface to another;
from substantial compliance with a direction or axis or template side;

from combinations of the above.
Here substantial compliance means that the resolved vector components of the motion parallel to the direction, axis or template side are greater than those parallel to all other defined directions, axes or template sides in the system.
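As a sketch of that test (the direction set and all names are illustrative), each movement can be resolved against the defined directions and assigned to the one with the largest parallel component:

# The defined directions: here the four sides of a square template.
DIRECTIONS = {
    "r": (1.0, 0.0),
    "l": (-1.0, 0.0),
    "d": (0.0, 1.0),     # assuming y grows downwards on the writing surface
    "u": (0.0, -1.0),
}

def compliant_direction(dx, dy):
    """Return the defined direction whose parallel component of (dx, dy) is greatest."""
    return max(DIRECTIONS,
               key=lambda k: dx * DIRECTIONS[k][0] + dy * DIRECTIONS[k][1])

print(compliant_direction(5.0, 1.0))    # -> "r"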
To facilitate drawing and recognition of some characters, it may be useful to be able to detect doubling of unit vectors. In other words, in drawing some characters unit vectors may repeat one after the other. Two consecutive unit vectors in the same direction may be detected by arranging two detectors with different thresholds of detection, or two templates (real or virtual) one after the other, so that the movement produces the detection of first one and then the second unit vector in the same direction. This is illustrated in Figures 21 and 22 of the drawings. In Figure 21 the arrow indicates the direction of movement of the drawing device or pointer.
Figure 22 shows how this can be used, for example, for the letter "g" .
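A rough sketch of the two-threshold arrangement along one direction follows (threshold values and names are assumed):

# Two detectors with different thresholds along the same direction: a long
# enough excursion trips first the inner and then the outer detector,
# producing two unit vectors in the same direction. Values illustrative.

INNER, OUTER = 10.0, 20.0

def vectors_for_excursion(excursion, direction):
    """Unit vectors produced by a single excursion along one direction."""
    vectors = []
    if excursion >= INNER:
        vectors.append(direction)      # first detector (or first template)
    if excursion >= OUTER:
        vectors.append(direction)      # second detector (or second template)
    return vectors

print(vectors_for_excursion(25.0, "d"))    # -> ['d', 'd'], eg the doubled down in "g"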
Pen and pointer devices used in conjunction with computers and associated display screens or monitors often employ the reproduction on the screen of a line of pixels that represents the track or locus of the drawing device. This is sometimes termed "screen ink". Such a display can be used in conjunction with unit vector detection to guide the user in forming the correct letter shapes.
Referring to Figure 23 of the drawings, it is possible to cause an icon on a monitor screen to move in response to the actual movement of the drawing device.
The icon 500 can appear adjacent to the animated font, providing visual feedback as described above. This allows the user to judge more accurately the movements required to cause correct unit vector recognition, as confirmed by the display of the corresponding animated font elements 501, 502, 503, 504, for example, corresponding to the input of a drawn letter "o".
As the pointing device is moved to produce the display of animated font elements on the monitor screen, it is advantageous to indicate the direction of pen movement and to give a simulacrum of the pen position by causing the processor controlling the monitor to display an icon at the end of each consecutive animated font element. This icon is not to be confused with the icon which responds to and represents the actual drawing device movement. Figure 24 of the drawings illustrates the sequence of images that result from the input of the letter "o". Icon 520 appears at the end of each animated font element 521, 522, 523 and 524 as the letter "o" is input.
It is advantageous to arrange the drawing of characters so that they all start from the same point. This allows the writer to memorise one set of character forms which do not need mental re-adjustment of the pen position before the input of the next character. This leads to increased speed of writing. Figure 25 of the drawings shows examples of letters that can be drawn from a common start.
At the end of each character it is advantageous to arrange the visual feedback to move the position of the pen position icon (whether actual or synthetic) from the end position of the character to the standard start position. This immediately re-adjusts the writer's assumption of pen position to facilitate the speedy input of the following character.
The same result may be obtained by advancing the screen ink to the standard start position, or by causing the animation of a font element on the monitor to bridge the gap between the end position and the following standard start position.
That is shown, for example, in Figure 20 of the drawings, where the final right unit vector signals completion of the character "a" and the visual feedback automatically produces a line extending to the common start position.
Figure 26 illustrates provision of guide lines on a monitor display to aid correct input by providing indications of appropriate relative scale and necessary movement in conjunction with screen ink or actual pen position icon. This ensures a more regular drawing of characters and a scale which is consistent with the scale of the unit vector detection thresholds.
The use of extending vector images to provide visual feedback is an alternative way of guiding the user in the input of characters to produce correct unit vector sequences. The detected unit vector causes the displayed image of the pointing device movement to be locked to the corresponding direction and allows the input of a line, reproduced on the screen, that represents the extension of the movement. When the direction of movement changes sufficiently to trigger the recognition of a new unit vector, the displayed line is locked in the new direction. This visual feedback allows simulacrum images of the intended character shape to be displayed as straight line segments corresponding to the degree of movement in each direction.
Figure 27 illustrates the method.
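The sketch below illustrates that locking in code: each detected unit vector starts a new straight segment, and the segment then grows by the component of subsequent movement parallel to the locked direction (all names are illustrative).

AXES = {"r": (1.0, 0.0), "l": (-1.0, 0.0),
        "d": (0.0, 1.0), "u": (0.0, -1.0)}   # assuming y grows downwards

class LockedSegments:
    """Polyline whose latest segment is locked to the last detected unit vector."""

    def __init__(self, start=(0.0, 0.0)):
        self.points = [start]        # vertices of the displayed line
        self.axis = None

    def unit_vector(self, v):
        # A new unit vector was detected: lock a fresh segment to its direction.
        self.axis = AXES[v]
        self.points.append(self.points[-1])

    def move(self, dx, dy):
        # Extend the current segment by the movement component along the lock.
        if self.axis is None:
            return
        ux, uy = self.axis
        step = max(0.0, dx * ux + dy * uy)    # ignore backtracking, a simplification
        x, y = self.points[-1]
        self.points[-1] = (x + step * ux, y + step * uy)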
It is advantageous to use special areas or special guidelines on the display screen, used in conjunction with screen ink and/or a pointer icon, in order to signal character end and therefore allow continuous input (eg within a word) without lifting the pen device or otherwise needing to signal character end, and/or in order to signal control or modifier characters or signals. In this method, when the pen position icon and/or screen ink moves into an area of the monitor display surface corresponding to a defined area of the writing surface, or when the pen enters the defined area of the writing surface, or when the pen crosses a defined line on either surface, a signal is produced by the processor which indicates the end of a character or other control event or command.
This allows the rapid input of joined-up cursive characters without the need to lift the pen or otherwise signal the end of each character. This is shown in Figure 28 of the drawings, in which movement of screen ink or pen icon into shaded areas 550 signals the end of a character.
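A minimal sketch, treating the shaded areas as rectangles in display (or writing surface) coordinates; the coordinates and names are purely illustrative.

# Shaded end-of-character areas treated as axis-aligned rectangles
# (x0, y0, x1, y1); the pen icon or screen ink entering one signals character end.

END_AREAS = [(0.0, 90.0, 200.0, 100.0)]     # illustrative coordinates only

def entered_end_area(x, y):
    """True if the given position lies inside a defined end-of-character area."""
    return any(x0 <= x <= x1 and y0 <= y <= y1 for x0, y0, x1, y1 in END_AREAS)

if entered_end_area(50.0, 95.0):
    print("end of character")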
Visual feedback may include the modification of displayed character elements as new unit vectors are detected. Figure 29 of the drawings illustrates this method.
The seat of the "h" is modified into the circle of the "b" upon detection of the L (left) unit vector. Subsequently, the circle of the "b" is modified into the curl of the "k" on detection of the final R (right) unit vector.
A practical drawing device for use in the invention, which has been built to prove the efficacy of quantisation of motion to produce unit vectors from the finger movements of handwriting, is now described with reference to Figures 13A and 13B and 14A and 14B of the accompanying drawings. It will be appreciated that many forms of pen can be produced for use in this invention and that, in addition, existing computer input devices can be adapted to embody the invention herein described.
These drawings show a pen 100 having a tubular body 102. Extending through the lower end of the body is a rod 104 which is pivotally mounted in the body at 106, so that when the tip of the rod is held stationary on a surface, the pen body can move relative to the tip in directions normal to each other. Within the pen body are four light sources 108, each being at the mid-point of a side of a notional square template. Opposite each light source is an optical fibre 110 for detecting an on or off situation for its own light source, whereby signals can be generated for microprocessor recognition. The rod 104 has a square shutter plate 112 on its upper end such that, in a rest position, ie when the rod is centrally aligned with the axis of the pen, all of the light sources are detectable by their corresponding optical fibres 110, but when the pen body is moved relative to the rod, the shutter plate is moved to obscure two of the light sources corresponding to the direction in which the pen is moved. Figures 13B and 14B respectively show the shutter in the neutral position and in a position where the pen has been pushed to the top right. The pen tip movement is constrained by a square template 114 in the form of an aperture at the end of the pen body through which the pen tip extends. Thus, the pen includes the means for detecting direction of movement of the pen in forming characters in order to generate a signal that can be recognised by a microprocessor or computer to produce the character on a computer screen.
If the pen tip has a built-in flexibility, the fingers can perform circular and curved movements while the signals are generated with reference to the square templates.
Figure 15 of the drawings shows schematically a pen device operating with a virtual template. The position of the pen tip 150 relative to the centre of the virtual template 152 is sensed in terms of its x, y coordinates as shown. As the pen body is moved around the pen tip by the fingers, the notional template moves with the pen body and causes a relative movement between the pen tip and the template. The track or locus of the pen tip relative to the virtual template is indicated by line 154.
The movement is referred to the template sides, ie it is registered as a mapping of the pen tip position onto the template, resulting for example in the unit vector sequence L D R, which could decode as the character "c".
Provided the pen tip travels around the outside of the template and the template is always smaller than the smallest character drawn, then the sequence of unit vectors will always decode for the stylised character shapes, irrespective of the scale or speed at which they are drawn.
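One way to picture the Figure 15 arrangement in code (a sketch with assumed names and an assumed template half-width): the notional template travels with the pen body, so only the tip position relative to the template centre matters, and a unit vector is registered for whichever side that relative position reaches.

HALF = 5.0    # half the side of the notional square template; assumed to be
              # smaller than half the smallest character drawn

def side_reached(rel_x, rel_y):
    """Map the tip position relative to the template centre onto a side, if any."""
    if rel_x <= -HALF:
        return "l"
    if rel_x >= HALF:
        return "r"
    if rel_y >= HALF:                 # assuming y grows downwards
        return "d"
    if rel_y <= -HALF:
        return "u"
    return None

# A full implementation would suppress repeats while the tip stays against the
# same side; a locus passing the left, down and right sides in turn would then
# register L D R, which could decode as the character "c" in this stylisation.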
Another embodiment of the invention (see Figure 17) consists of a template built into a portable databank 300 or portable computer or other product requiring the input of information, such as a video recorder, pocket calculator, telephone, central heating controller, washing machine, etc. The template sensors are activated by the movement of a small stylus 302 held by the fingers.
The stylus may be attached or hinged to the product or may be removable or separate. This application will allow the space taken up by data input to be greatly reduced, as the stylus template 304 will replace the much larger keyboard or keypad 310 of a conventional pocket databank 312 (see Figure 16) having a screen 314.
The stylus may fold down as shown to conserve space when not in use. The advantages of this embodiment of the invention are that the product can be made considerably smaller, the stylus can be used with the eyes on the screen 314 and can be used more easily than the usually cramped keyboard keys, and data can be input more quickly.

The input device can be fabricated at considerably less expense than a keyboard or touch sensitive screen. Also, a data link cable between the pocket databank etc could connect with a computer to allow text input from the built-in pen device to be input to the computer.

Claims (41)

1. Means for inputting a hand generated character into a computer comprising means for drawing a character, means for abstracting a sequence of signals as the character is drawn corresponding to components of the character to produce a code representative of that character and means for recognising the code, whereby the character is inputted to the computer, characterised in that each signal corresponds to a relative change in direction of the drawing means independent of extent of movement of the drawing means in that direction.
2. Means as claimed in claim 1, characterised in that movement of the drawing means is abstracted as unit vectors.
3. Means as claimed in claim 1 or 2, wherein abstraction of a direction change is speed independent.
4. Means as claimed in claim 1, 2 or 3, wherein abstraction of a direction change is scale independent.
5. Means as claimed in any one of claims 1 to 4, wherein abstraction of a direction change is substantially independent of distortions or variations in the character as drawn.
6. Means as claimed in any one of claims 1 to 5, wherein recognition occurs character by character in real time.
7. Means as claimed in any one of claims 1 to 6, further comprising means for displaying the recognised character.
8. Means as claimed in any one of claims 1 to 7, further comprising means for providing visual feedback corresponding to the character being inputted as each signal is abstracted.
9. Means as claimed in claim 8, wherein the visual feedback means comprises means for producing on a monitor a graphic simulation of a character component in response to an abstracted signal.
10. Means as claimed in claim 9, wherein said graphic simulation is modifiable in response to a subsequent signal of a sequence for a character.
11. Means as claimed in claim 9 or 10, wherein said graphic simulation further comprises an indicator as to position of the drawing means on a drawing surface.
12. Means as claimed in claim 11, wherein said indicator comprises an icon displayed at or near the end of the latest graphic simulation component.
13. Means as claimed in claim 11, wherein said indicator comprises an icon that moves around the graphic simulation of a character in response to movement of the drawing means.
14. Means as claimed in any one of claims 9 to 13, further comprising means for displaying on the monitor the character as a reproduction thereof.
15. Means as claimed in any one of claims 1 to 14, including means for signalling completion of a character.
16. Means as claimed in claim 15, wherein the drawing means is arranged to signal completion of a character by lifting the drawing means from a drawing surface.
17. Means as claimed in claim 15, wherein completion of a character is indicated by a unique movement of the drawing means relative to that character.
18. Means as claimed in claim 15, wherein completion of a character is indicated by movement of one of the drawing means and an icon indicative of the drawing means to a defined position.
19. Means as claimed in claim 18, wherein said defined position is an area of a drawing surface.
20. Means as claimed in claim 18, wherein said defined position is an area defined on a monitor.
21. Means as claimed in any one of claims 1 to 20, wherein the drawing means comprises a hand-held pen-like device.
22. Means as claimed in claim 21, wherein the device has a part which is movable about a template during reproduction of a character.
23. Means as claimed in claim 22, wherein the part is movable relative to a notional template.
24. Means as claimed in claim 22 or 23, wherein the drawing means comprises a hollow body part movable about a real or notional template within the hollow body part.
25. Means as claimed in any one of claims 22, 23 or 24, wherein at least one movable part of the device and the remainder of the device and/or template are flexibly linked.
26. Means as claimed in claim 25, wherein at least one movable part of the device is a tip movable relative to a body of the device and one or more flexible linkages affect movement of the tip relative to the body.
27. Means as claimed in any one of claims 22 to 26, including means for sensing direction of movement of said device or part thereof relative to a real or notional template in reproducing a character.
28. Means as claimed in claim 27, wherein sensing means are spaced about said real or notional template.
29. Means as claimed in claim 26 or 27, wherein the sensing means are selected from electrical, photoelectric and magnetic sensing means.
30. Means as claimed in any one of claims 22 to 29, wherein the template is a generally square enclosure.
31. Means as claimed in any one of claims 22 to 29, wherein the template is a generally circular enclosure.
32. Means as claimed in any one of claims 22 to 29, wherein the template defines a track.
33. Means as claimed in any one of claims 22 to 29, wherein the template has a plurality of zones and said part moves from zone to zone in reproducing a character.
34. Means as claimed in any one of claims 1 to 33, including means for converting a signal for a lower case character into a signal for an upper case character.
35. Means for inputting a hand generated character into a computer having a monitor, comprising means for drawing a character to produce a sequence of signals corresponding to that character, means for converting signals produced for one character into a code representative of that character, means for recognising that code and means for providing visual feedback corresponding to each signal of the sequence for the character being inputted as each signal is produced.
36. Means as claimed in claim 35, wherein visual feedback means comprises means for producing on the monitor a graphic simulation of a character component in response to each signal of a sequence of signals.
37. Means as claimed in claim 35, wherein said graphic simulation is modifiable in response to a subsequent signal of a sequence.
38. Means as claimed in claim 35, 36 or 37, wherein said graphic simulation further comprises an indicator as to position of the drawing means on a drawing surface.
39. Means as claimed in claim 38, wherein said indicator comprises an icon displayed at or near an end of the latest graphic simulation component.
40. Means as claimed in claim 38, wherein said indicator comprises an icon that moves around the graphic simulation of a character in response to movement of the drawing means.
41. Means as claimed in any one of claims 30 to 40, further comprising means for displaying on the monitor the character as a reproduction thereof.
CA002277963A 1997-01-29 1998-01-27 Means for inputting characters or commands into a computer Abandoned CA2277963A1 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
GBGB9701793.3A GB9701793D0 (en) 1997-01-29 1997-01-29 Means for inputting characters or commands into a computer
GB9701793.3 1997-01-29
PCT/GB1998/000245 WO1998033141A1 (en) 1997-01-29 1998-01-27 Means for inputting characters or commands into a computer

Publications (1)

Publication Number Publication Date
CA2277963A1 true CA2277963A1 (en) 1998-07-30

Family

ID=10806747

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002277963A Abandoned CA2277963A1 (en) 1997-01-29 1998-01-27 Means for inputting characters or commands into a computer

Country Status (14)

Country Link
US (1) US6647145B1 (en)
EP (1) EP1012780B1 (en)
JP (1) JP2001509288A (en)
KR (1) KR100438653B1 (en)
CN (1) CN1161707C (en)
AT (1) ATE284559T1 (en)
AU (1) AU748968B2 (en)
CA (1) CA2277963A1 (en)
DE (1) DE69828065T2 (en)
GB (1) GB9701793D0 (en)
ID (1) ID22678A (en)
RU (1) RU2236036C2 (en)
TW (1) TW469400B (en)
WO (1) WO1998033141A1 (en)

Families Citing this family (74)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6088019A (en) * 1998-06-23 2000-07-11 Immersion Corporation Low cost force feedback device with actuator for non-primary axis
US6429846B2 (en) 1998-06-23 2002-08-06 Immersion Corporation Haptic feedback for touchpads and other touch controls
US7268774B2 (en) * 1998-08-18 2007-09-11 Candledragon, Inc. Tracking motion of a writing instrument
US20100008551A9 (en) * 1998-08-18 2010-01-14 Ilya Schiller Using handwritten information
AUPQ056099A0 (en) * 1999-05-25 1999-06-17 Silverbrook Research Pty Ltd A method and apparatus (pprint01)
AUPQ363299A0 (en) * 1999-10-25 1999-11-18 Silverbrook Research Pty Ltd Paper based information inter face
US7170499B1 (en) * 1999-05-25 2007-01-30 Silverbrook Research Pty Ltd Handwritten text capture via interface surface
US6816274B1 (en) * 1999-05-25 2004-11-09 Silverbrook Research Pty Ltd Method and system for composition and delivery of electronic mail
EP2056233B1 (en) * 1999-12-23 2011-10-19 Anoto AB Information management system
US6822635B2 (en) 2000-01-19 2004-11-23 Immersion Corporation Haptic interface for laptop computers and other portable devices
CN1520574A (en) * 2001-06-20 2004-08-11 Presentation of recognition of handwritten pattern
JP4261145B2 (en) * 2001-09-19 2009-04-30 株式会社リコー Information processing apparatus, information processing apparatus control method, and program for causing computer to execute the method
UA46628A (en) * 2001-10-12 2002-05-15 Віталій Іванович Гнатенко METHOD OF INTRODUCING SYMBOLS AND DEVICES FOR ITS IMPLEMENTATION
US7257255B2 (en) * 2001-11-21 2007-08-14 Candledragon, Inc. Capturing hand motion
CN100350364C (en) * 2001-12-21 2007-11-21 西门子公司 Device for detecting and displaying movements
US7096432B2 (en) * 2002-05-14 2006-08-22 Microsoft Corporation Write anywhere tool
SE0202446D0 (en) 2002-08-16 2002-08-16 Decuma Ab Ideon Res Park Presenting recognized handwritten symbols
US20040036711A1 (en) * 2002-08-23 2004-02-26 Anderson Thomas G. Force frames in animation
AU2003285886A1 (en) 2002-10-15 2004-05-04 Immersion Corporation Products and processes for providing force sensations in a user interface
US8125453B2 (en) 2002-10-20 2012-02-28 Immersion Corporation System and method for providing rotational haptic feedback
US7262764B2 (en) * 2002-10-31 2007-08-28 Microsoft Corporation Universal computing device for surface applications
US8059088B2 (en) 2002-12-08 2011-11-15 Immersion Corporation Methods and systems for providing haptic messaging to handheld communication devices
GB2413416B8 (en) 2002-12-08 2006-09-07 Immersion Corp Haptic massaging in handheld communication devices
US8830161B2 (en) 2002-12-08 2014-09-09 Immersion Corporation Methods and systems for providing a virtual touch haptic effect to handheld communication devices
US7729542B2 (en) * 2003-04-04 2010-06-01 Carnegie Mellon University Using edges and corners for character input
US8164573B2 (en) 2003-11-26 2012-04-24 Immersion Corporation Systems and methods for adaptive interpretation of input from a touch-sensitive input device
CN100421058C (en) * 2003-12-23 2008-09-24 奥森泰克公司 Electronic device with finger sensor for character entry and associated methods
CA2567751C (en) * 2004-06-01 2013-08-27 Mattel, Inc. An electronic learning device with a graphic user interface for interactive writing
US20060017702A1 (en) * 2004-07-23 2006-01-26 Chung-Yi Shen Touch control type character input method and control module thereof
JP4860625B2 (en) 2004-10-08 2012-01-25 イマージョン コーポレーション Haptic feedback for simulating buttons and scrolling motion on touch input devices
US7443386B2 (en) * 2004-11-01 2008-10-28 Nokia Corporation Mobile phone and method
US8849034B2 (en) * 2004-12-09 2014-09-30 Hewlett-Packard Development Company, L.P. System, method, and apparatus for triggering recognition of a handwritten shape
US7587087B2 (en) * 2004-12-10 2009-09-08 Nokia Corporation On-line handwriting recognition
KR100718126B1 (en) * 2005-02-05 2007-05-15 삼성전자주식회사 User interface method and apparatus for gesture-recognition based input device
US8250493B2 (en) 2005-02-05 2012-08-21 Samsung Electronics Co., Ltd. User interface method, medium, and apparatus with gesture-recognition
US7561145B2 (en) * 2005-03-18 2009-07-14 Microsoft Corporation Systems, methods, and computer-readable media for invoking an electronic ink or handwriting interface
US7825903B2 (en) 2005-05-12 2010-11-02 Immersion Corporation Method and apparatus for providing haptic effects to a touch panel
US20070008301A1 (en) * 2005-06-21 2007-01-11 Stewart Duncan H Training system and method
US7755026B2 (en) * 2006-05-04 2010-07-13 CandleDragon Inc. Generating signals representative of sensed light that is associated with writing being done by a user
KR100765264B1 (en) * 2006-08-04 2007-10-09 삼성전자주식회사 Display apparatus and control method thereof
KR100720335B1 (en) * 2006-12-20 2007-05-23 최경순 Apparatus for inputting a text corresponding to relative coordinates values generated by movement of a touch position and method thereof
US20080297491A1 (en) * 2007-05-29 2008-12-04 Adkins Gordon K Stylus for a touch-screen device
WO2009051065A1 (en) 2007-10-15 2009-04-23 Nippon Telegraph And Telephone Corporation Image generation method, device, its program and recording medium with program recorded therein
LV13941B (en) 2007-11-01 2012-09-20 Klaviatūra 21, Sia The method and device for inputting the information by description of the allowable trajectories, the device of the sensors of the characteristic points (variants)
US20090138800A1 (en) * 2007-11-23 2009-05-28 Mckesson Financial Holdings Limited Apparatus, method and computer-readable storage medium for directing operation of a software application via a touch-sensitive surface
US8707215B2 (en) 2007-12-31 2014-04-22 Motorola Mobility Llc Hand-held device and method for operating a single pointer touch sensitive user interface
US8237665B2 (en) * 2008-03-11 2012-08-07 Microsoft Corporation Interpreting ambiguous inputs on a touch-screen
RU2468417C2 (en) * 2008-05-28 2012-11-27 Шарп Кабусики Кайся Input detection device, input detection method, programme and data medium
JP2010015238A (en) * 2008-07-01 2010-01-21 Sony Corp Information processor and display method for auxiliary information
CN101650520A (en) * 2008-08-15 2010-02-17 索尼爱立信移动通讯有限公司 Visual laser touchpad of mobile telephone and method thereof
KR101019335B1 (en) * 2008-11-11 2011-03-07 주식회사 팬택 Method and system for controlling application of mobile terminal using gesture
JP5326802B2 (en) * 2009-05-19 2013-10-30 ソニー株式会社 Information processing apparatus, image enlargement / reduction method, and program thereof
EP2477096A1 (en) * 2009-09-09 2012-07-18 Sharp Kabushiki Kaisha Gesture determination device and method of same
WO2011047618A1 (en) * 2009-10-20 2011-04-28 Tuan Hsi-Ching Mouse pen and photoelectric control switch thereof
TWI402722B (en) * 2009-12-24 2013-07-21 Benq Corp Optical pen and operating method of the same
US20110254765A1 (en) * 2010-04-18 2011-10-20 Primesense Ltd. Remote text input using handwriting
US20110291964A1 (en) * 2010-06-01 2011-12-01 Kno, Inc. Apparatus and Method for Gesture Control of a Dual Panel Electronic Device
US9043732B2 (en) * 2010-10-21 2015-05-26 Nokia Corporation Apparatus and method for user input for controlling displayed information
CN102088486A (en) * 2010-12-31 2011-06-08 汉王科技股份有限公司 Handwriting recognition server and handling method thereof as well as handwriting recognition server cluster system
US9582178B2 (en) 2011-11-07 2017-02-28 Immersion Corporation Systems and methods for multi-pressure interaction on touch-sensitive surfaces
GB2497572A (en) * 2011-12-15 2013-06-19 St Microelectronics Ltd Function performance in response to pattern recognition on an optical navigation device such as a finger mouse
US9403399B2 (en) 2012-06-06 2016-08-02 Milwaukee Electric Tool Corporation Marking pen
US9886088B2 (en) * 2012-08-08 2018-02-06 Microsoft Technology Licensing, Llc Physically modulating friction in a stylus
JP5284523B1 (en) * 2012-09-05 2013-09-11 株式会社東芝 Information processing system, program, and processing method of information processing system
US20140089865A1 (en) * 2012-09-24 2014-03-27 Co-Operwrite Limited Handwriting recognition server
US8743072B2 (en) * 2012-09-28 2014-06-03 Lg Electronics Inc. Display device and control method thereof
KR20150104808A (en) * 2014-03-06 2015-09-16 삼성전자주식회사 Electronic device and method for outputing feedback
US10643067B2 (en) * 2015-10-19 2020-05-05 Myscript System and method of handwriting recognition in diagrams
KR20200078932A (en) * 2018-12-24 2020-07-02 삼성전자주식회사 Electronic device and controlling method of electronic device
EP3736677A1 (en) 2019-05-10 2020-11-11 MyScript A method and corresponding device for selecting and editing handwriting input elements
EP3754537A1 (en) 2019-06-20 2020-12-23 MyScript Processing text handwriting input in a free handwriting mode
EP3772015B1 (en) 2019-07-31 2023-11-08 MyScript Text line extraction
EP3796145A1 (en) 2019-09-19 2021-03-24 MyScript A method and correspond device for selecting graphical objects
US20220122477A1 (en) * 2020-10-20 2022-04-21 Holistic Language Solutions LLC Computerized method and apparatus for determining accuracy of written characters and stroke order and compliance with rules and providing visual and audio feedback

Family Cites Families (60)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US3930229A (en) 1974-01-31 1975-12-30 Stanford Research Inst Handwriting system
GB1494901A (en) * 1974-04-30 1977-12-14 Suisse Horlogerie Data entry and decoding system
US4477797A (en) 1980-12-12 1984-10-16 Citizen Watch Company Limited Data input device for electronic device
US4423286A (en) 1982-07-21 1983-12-27 Talos Systems, Inc. Apparatus and method for determining the position of a driven coil within a grid of spaced conductors
US4532376A (en) 1983-05-26 1985-07-30 Sanders Associates, Inc. Electronic pen with switching mechanism for selectively providing tactile or non-tactile feel
JPS60205686A (en) 1984-03-30 1985-10-17 Hitachi Ltd Handwritten character and graphic recognizing system
US4751741A (en) 1984-07-19 1988-06-14 Casio Computer Co., Ltd. Pen-type character recognition apparatus
WO1986004704A1 (en) 1985-01-31 1986-08-14 Tybar Engineering Pty. Ltd. Input device for handwritten characters
JPS61275990A (en) * 1985-05-31 1986-12-06 Canon Inc Electronic equipment with handwritten pattern recognizing function
US4972496A (en) 1986-07-25 1990-11-20 Grid Systems Corporation Handwritten keyboardless entry computer system
US4820886A (en) 1987-03-16 1989-04-11 Sanders Associates, Inc. Low-cost, high-accuracy digitizer signal acquisition apparatus and method
US4905007A (en) 1987-05-29 1990-02-27 Samson Rohm Character input/output device
US4794208A (en) 1988-02-08 1988-12-27 Calcomp Inc. Frequency shifting digitizer for reducing AC fields interference
US4806707A (en) * 1988-02-12 1989-02-21 Calcomp Inc. 3-Dimensional digitizer pen
US4835347A (en) 1988-02-12 1989-05-30 Calcomp, Inc. Shifting wire sequence digitizer system
US4831216A (en) 1988-02-12 1989-05-16 Calcomp, Inc. Digitizer system with intertwined loopback conductor grid
US4853499A (en) 1988-12-12 1989-08-01 Calcomp Inc. Ground switching technique for silkscreened digitizer grids
FR2648255B1 (en) * 1989-06-08 1992-11-27 Gazale Midhat ELECTROMECHANICAL DEVICE FOR RECOGNIZING MANUALLY EXECUTED CHARACTERS
US5155813A (en) 1990-01-08 1992-10-13 Wang Laboratories, Inc. Computer apparatus for brush styled writing
US5029223A (en) 1990-02-02 1991-07-02 International Business Machines Corporation Constraint driven-on line recognition of handwritten characters and symbols
DK0522035T3 (en) * 1990-03-30 1994-11-07 Ferdinand Lutz Propeller with rotatable blades
JP2669575B2 (en) * 1991-04-19 1997-10-29 インターナショナル・ビジネス・マシーンズ・コーポレイション Data input method and device
US5544262A (en) 1992-04-07 1996-08-06 Apple Computer, Inc. Method and apparatus for processing graphically input equations
US5903668A (en) 1992-05-27 1999-05-11 Apple Computer, Inc. Method and apparatus for recognizing handwritten words
US5544265A (en) 1992-05-27 1996-08-06 Apple Computer, Inc. Shape recognizer for graphical computer systems
US5452371A (en) 1992-05-27 1995-09-19 Apple Computer, Inc. Method of aligning shapes on a display of a computer system
US5434777A (en) 1992-05-27 1995-07-18 Apple Computer, Inc. Method and apparatus for processing natural language
US5536930A (en) * 1992-06-03 1996-07-16 Symbol Technologies, Inc. Apparatus and method for sensing positional orientations of a portable terminal
US5465325A (en) 1992-11-16 1995-11-07 Apple Computer, Inc. Method and apparatus for manipulating inked objects
US5677710A (en) 1993-05-10 1997-10-14 Apple Computer, Inc. Recognition keypad
US5559942A (en) 1993-05-10 1996-09-24 Apple Computer, Inc. Method and apparatus for providing a note for an application program
US5566248A (en) 1993-05-10 1996-10-15 Apple Computer, Inc. Method and apparatus for a recognition editor and routine interface for a computer system
US6011865A (en) 1993-05-12 2000-01-04 International Business Machines Corporation Hybrid on-line handwriting recognition and optical character recognition system
US5596350A (en) 1993-08-02 1997-01-21 Apple Computer, Inc. System and method of reflowing ink objects
US5555363A (en) * 1993-09-30 1996-09-10 Apple Computer, Inc. Resetting the case of text on a computer display
US5583946A (en) 1993-09-30 1996-12-10 Apple Computer, Inc. Method and apparatus for recognizing gestures on a computer system
IL108319A0 (en) 1994-01-12 1994-04-12 Art Advanced Recognition Tech Method and system for visual and sound pattern recognition
EP0710384A4 (en) 1994-05-10 1997-05-02 Motorola Inc Method of stroke segmentation for handwritten input
IT1272259B (en) * 1994-05-30 1997-06-16 Texas Instruments Italia Spa PROCEDURE AND APPARATUS FOR THE RECOGNITION OF CHARACTERS
IL110137A (en) 1994-06-27 2000-06-29 Advanced Recognition Tech Handwriting recognition system
DE69533479T2 (en) 1994-07-01 2005-09-22 Palm Computing, Inc., Los Altos CHARACTER SET WITH CHARACTERS FROM MULTIPLE BARS AND HANDWRITING IDENTIFICATION SYSTEM
US5666438A (en) 1994-07-29 1997-09-09 Apple Computer, Inc. Method and apparatus for recognizing handwriting of different users of a pen-based computer system
US5768417A (en) 1994-09-09 1998-06-16 Motorola, Inc. Method and system for velocity-based handwriting recognition
US5854855A (en) 1994-09-09 1998-12-29 Motorola, Inc. Method and system using meta-classes and polynomial discriminant functions for handwriting recognition
AU3590795A (en) 1994-09-14 1996-03-29 Apple Computer, Inc. System and method for automatic subcharacter unit and lexicon generation for handwriting recognition
IL111039A (en) 1994-09-22 1998-08-16 Advanced Recognition Tech Handwritten pattern recognizer
US5675665A (en) 1994-09-30 1997-10-07 Apple Computer, Inc. System and method for word recognition using size and placement models
US5737443A (en) 1994-11-14 1998-04-07 Motorola, Inc. Method of joining handwritten input
AU690781B2 (en) 1994-11-14 1998-04-30 Motorola, Inc. Method of splitting handwritten input
US5521986A (en) * 1994-11-30 1996-05-28 American Tel-A-Systems, Inc. Compact data input device
AU4904396A (en) 1995-01-23 1996-08-14 Advanced Recognition Technologies, Inc. Handwriting recognizer with estimation of reference lines
US5623345A (en) 1995-03-06 1997-04-22 Motorola, Inc. Facsimile communication with a selective call system and method thereof
TW397951B (en) 1995-06-05 2000-07-11 Motorola Inc Method and microprocessor for preprocessing handwriting having characters composed of a preponderance of straight line segments
TW338815B (en) 1995-06-05 1998-08-21 Motorola Inc Method and apparatus for character recognition of handwritten input
WO1997003411A1 (en) 1995-07-07 1997-01-30 Motorola Inc. Method for entering handwritten messages in selective call receivers
AU5951696A (en) 1995-07-20 1997-02-18 Motorola, Inc. Method for entering handwritten information in cellular telephones
US5959260A (en) 1995-07-20 1999-09-28 Motorola, Inc. Method for entering handwritten information in cellular telephones
US5682439A (en) 1995-08-07 1997-10-28 Apple Computer, Inc. Boxed input correction system and method for pen based computer systems
TW388016B (en) 1995-11-13 2000-04-21 Motorola Inc Method and apparatus for character recognition interface
US6556712B1 (en) 1996-05-23 2003-04-29 Apple Computer, Inc. Methods and apparatus for handwriting recognition

Also Published As

Publication number Publication date
KR20000070619A (en) 2000-11-25
KR100438653B1 (en) 2004-07-12
DE69828065T2 (en) 2005-12-08
GB9701793D0 (en) 1997-03-19
JP2001509288A (en) 2001-07-10
RU2236036C2 (en) 2004-09-10
CN1249831A (en) 2000-04-05
EP1012780B1 (en) 2004-12-08
CN1161707C (en) 2004-08-11
DE69828065D1 (en) 2005-01-13
US6647145B1 (en) 2003-11-11
AU5773398A (en) 1998-08-18
AU748968B2 (en) 2002-06-13
ATE284559T1 (en) 2004-12-15
TW469400B (en) 2001-12-21
WO1998033141A1 (en) 1998-07-30
EP1012780A1 (en) 2000-06-28
ID22678A (en) 1999-12-09

Similar Documents

Publication Publication Date Title
US6647145B1 (en) Means for inputting characters or commands into a computer
Kölsch et al. Keyboards without keyboards: A survey of virtual keyboards
US8164570B2 (en) Condensed keyboard for electronic devices
US5303312A (en) Handwriting recognition by character template
EP0557284B1 (en) Computer with separate display plane and user interface processor
JP4567817B2 (en) Information processing apparatus and control method thereof
US20040047505A1 (en) Stylus computer
JPH10510639A (en) Multi pen stroke character set and handwritten document recognition system
WO2006076079A2 (en) System and method for identifying termination of data entry
CA1312300C (en) Keyboard for a word typewriter
US8174409B2 (en) Lineographic alphanumeric data input system
US11249558B1 (en) Two-handed keyset, system, and methods of making and using the keyset and system
MXPA99007094A (en) Means for inputting characters or commands into a computer
EP1780625A1 (en) Data input device and method and computer program product
JP2500283B2 (en) Virtual space keyboard device
Ravindhar et al. Virtual board
Seppanen Soft display key for Kanji input
Habib A reading system for the blind based on geometrical shapes
Noyes et al. What Are the Main Components of the System?
WO2005088522A1 (en) System and method for text entry
CN1409205A (en) Three-dimensional group spelling input method
Becker Enhancing the user-friendliness of Macintosh foreign character fonts
Rockwell Interrupting digitization and thinking about text or digitization and the form of digital text
Zyda et al. Non-Roman font generation via interactive computer graphics
CZ298195B6 (en) Coding method of text characters by making use of vectors

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued