WO2006076079A2 - System and method for identifying termination of data entry

Info

Publication number: WO2006076079A2
Authority: WIPO (PCT)
Prior art keywords: user, active region, written data, writing instrument, data
Application number: PCT/US2005/041880
Other languages: French (fr)
Other versions: WO2006076079A3 (en)
Inventors: James Marggraff, Alexander Chisholm, Tracy L. Edgecomb
Original Assignee: Leapfrog Enterprises, Inc.
Application filed by Leapfrog Enterprises, Inc.
Priority filings: CA2532447A1, EP1684160A1
Publications: WO2006076079A2, WO2006076079A3

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00: Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01: Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/03: Arrangements for converting the position or the displacement of a member into a coded form
    • G06F 3/0304: Detection arrangements using opto-electronic means
    • G06F 3/0317: Detection arrangements using opto-electronic means in co-operation with a patterned surface, e.g. absolute position or relative movement detection for an optical mouse or pen positioned with respect to a coded surface
    • G06F 3/0321: Detection arrangements using opto-electronic means in co-operation with a patterned surface by optically sensing the absolute position with respect to a regularly patterned surface forming a passive digitiser, e.g. pen optically detecting position indicative tags printed on a paper sheet
    • G06F 3/033: Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F 3/0354: Pointing devices with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F 3/03545: Pens or stylus
    • G06F 3/048: Interaction techniques based on graphical user interfaces [GUI]
    • G06F 3/0487: Interaction techniques based on graphical user interfaces [GUI] using specific features provided by the input device, e.g. functions controlled by the rotation of a mouse with dual sensing arrangements, or of the nature of the input device, e.g. tap gestures based on pressure sensed by a digitiser
    • G06F 3/0488: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser, e.g. input of commands through traced gestures
    • G06F 3/04883: Interaction techniques based on graphical user interfaces [GUI] using a touch-screen or digitiser for inputting data by handwriting, e.g. gesture or text
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 30/00: Character recognition; Recognising digital ink; Document-oriented image-based pattern recognition
    • G06V 30/10: Character recognition
    • G06V 30/32: Digital ink
    • G06V 30/36: Matching; Classification
    • G06V 30/387: Matching; Classification using human interaction, e.g. selection of the best displayed recognition candidate


Abstract

Computer implemented methods of and systems for inputting data are described. A method for inputting data includes receiving information representing user-written data, the user-written data made with a writing instrument upon a surface. The method further includes defining an active region on the surface surrounding the user written data and recognizing a user performing a prescribed action with the writing instrument indicating completion of the user-written data. In response to recognizing, the method includes terminating the receiving and in response to terminating, the method further includes processing the information to automatically recognize the user-written data.

Description

SYSTEM AND METHOD FOR IDENTIFYING TERMINATION OF DATA ENTRY
CROSS REFERENCES TO RELATED APPLICATIONS
This Application is a Continuation-in-Part of the co-pending, commonly-
owned U.S. Patent Application, Attorney Docket No. 020824-004610US, Serial
No. 10/803,806, filed March 17, 2004, by James Marggraff et al., entitled
"Scanning Apparatus," and hereby incorporated by reference in its entirety.
This Application is a Continuation-in-Part of the co-pending, commonly-
owned U.S. Patent Application, Attorney Docket No. 020824-009500US, Serial
No. 10/861,243, filed June 3, 2004, by James Marggraff et al., entitled "User
Created Interactive Interface," and hereby incorporated by reference in its
entirety.
BACKGROUND OF THE INVENTION
FIELD OF THE INVENTION
The present invention is related to the field of computer user interfaces.
More specifically, embodiments of the present invention relate to identifying
termination of data entry in a user created interactive interface.

RELATED ART
Devices such as optical readers or optical pens conventionally emit light
that reflects off a surface to a detector or imager. As the device is moved relative
to the surface (or vice versa), successive images are rapidly captured. By
analyzing the images, movement of the optical device relative to the surface can
be tracked.
One type of optical pen is used with a sheet of paper on which very small
dots are printed. The dots are printed on the page in a pattern with a nominal
spacing of about 0.3 millimeters (0.01 inches). The pattern of dots within any
region on the page is unique to that region. The optical pen essentially takes a
snapshot of the surface, perhaps 100 times a second or more. By interpreting
the dot positions captured in each snapshot, the optical pen can precisely
determine its position relative to the page.
Applications that utilize information about the position of an optical pen
relative to a surface have been or are being devised. An optical pen with
Bluetooth or other wireless capability can be linked to other devices and used for
sending electronic mail (e-mail) or faxes.
An optical pen may be used to input data to an application via a printable
surface. For example, the device may perform real-time character recognition on handwritten symbols. However, it can be difficult to determine when a data input
is completed. For example, if a user inputs the number one and then inputs the
number two, it is difficult to determine if the user intended to input the number
twelve or the individual numbers one and two. The same is true when a user is
writing a word. The device needs to know when the word is complete. Thus,
determining termination of data entry can be problematic.
SUMMARY OF THE INVENTION
Accordingly, an optical pen that can determine termination of data entry
would be valuable. Embodiments in accordance with the present invention
provide this and other advantages.
Embodiments of the present invention include a method for inputting data
including receiving information representing user-written data, the user-written
data made with a writing instrument upon a surface. The method further includes
defining an active region on the surface surrounding the user written data and
recognizing a user performing a prescribed action with the writing instrument
indicating completion of the user-written data. In response to recognizing, the
method includes terminating the receiving and in response to terminating, the
method further includes processing the information to automatically recognize the
user-written data.
In one embodiment of the invention, a prescribed action comprises
determining a writing instrument being tapped within an active region on the
surface. In this embodiment of the invention, a tap adjacent to the user-written
data indicates termination of data entry in that region of the surface.
Furthermore, a double tap in the active region indicates termination of data entry
in that region of the surface. In another embodiment of the invention, a prescribed action comprises
determining that a writing instrument is idle for a predetermined period of time. In
this embodiment of the invention, a writing time out threshold is used to
determine termination of data entry in that region of the surface. In one
embodiment of the invention, the threshold time begins once a writing instrument
is lifted from the surface.
In another embodiment of the invention, a prescribed action comprises
determining the writing instrument being tapped in a predetermined location on
the surface. In one embodiment of the invention, the predetermined location
comprises a pre-printed image. In other embodiments, the prescribed action may
be a combination of two or more of the above.
In another embodiment of the invention, the prescribed action is
application dependent. For example, a first application may allow a time-out
termination of data entry and a second application may allow tapping in the active
region to terminate data entry. In another embodiment of the invention, an
application may allow more than one termination event. These and other objects
and advantages of the present invention will be recognized by one skilled in the
art after having read the following detailed description, which are illustrated in the
various drawing figures.

BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings, which are incorporated in and form a part of
this specification, illustrate embodiments of the invention and, together with the
description, serve to explain the principles of the invention:
Figure 1 is a block diagram of a device upon which embodiments of the
present invention can be implemented.
Figure 2 is a block diagram of another device upon which embodiments of
the present invention can be implemented.
Figure 3 shows an exemplary sheet of paper provided with a pattern of
marks according to one embodiment of the present invention.
Figure 4 shows an enlargement of a pattern of marks on an exemplary
sheet of paper according to one embodiment of the present invention.
Figure 5 is an illustration of an exemplary tree menu in accordance with an
embodiment of the present invention.

Figure 6A is an illustration of an exemplary surface comprising user-
written data associated with a dictionary application in accordance with
embodiments of the present invention.
Figure 6B is an illustration of an exemplary surface comprising user-
written data associated with a calculator application wherein tapping of an active
region terminates data entry in accordance with embodiments of the present
invention.
Figure 6C is an illustration of an exemplary surface comprising user-
written data associated with a calculator application wherein tapping of
predetermined area terminates data entry in accordance with embodiments of the
present invention.
Figure 7 is a flow diagram of an exemplary computer implemented method
of inputting data in accordance with embodiments of the present invention.
Figure 8 is a flow diagram of an exemplary computer implemented method
of determining termination of data entry in accordance with embodiments of the
present invention.

DETAILED DESCRIPTION
In the following detailed description of the present invention, numerous
specific details are set forth in order to provide a thorough understanding of the
present invention. However, it will be recognized by one skilled in the art that the
present invention may be practiced without these specific details or with
equivalents thereof. In other instances, well-known methods, procedures,
components, and circuits have not been described in detail as not to
unnecessarily obscure aspects of the present invention.
Some portions of the detailed descriptions, which follow, are presented in
terms of procedures, steps, logic blocks, processing, and other symbolic
representations of operations on data bits that can be performed on computer
memory. These descriptions and representations are the means used by those
skilled in the data processing arts to most effectively convey the substance of
their work to others skilled in the art. A procedure, computer executed step, logic
block, process, etc., is here, and generally, conceived to be a self-consistent
sequence of steps or instructions leading to a desired result. The steps are those
requiring physical manipulations of physical quantities. Usually, though not
necessarily, these quantities take the form of electrical or magnetic signals
capable of being stored, transferred, combined, compared, and otherwise
manipulated in a computer system. It has proven convenient at times, principally for reasons of common usage, to refer to these signals as bits, values, elements,
symbols, characters, terms, numbers, or the like.
It should be borne in mind, however, that all of these and similar terms are
to be associated with the appropriate physical quantities and are merely
convenient labels applied to these quantities. Unless specifically stated
otherwise as apparent from the following discussions, it is appreciated that
throughout the present invention, discussions utilizing terms such as "encoding"
or "determining" or "identifying" or "accessing" or "rendering" or "reading" or
"receiving" or "identifying" or "terminating" or "executing" or the like, refer to the
actions and processes of a computer system (e.g., flowcharts 700 and 800 of
Figures 7 and 8), or similar electronic computing device, that manipulates and
transforms data represented as physical (electronic) quantities within the
computer system's registers and memories into other data similarly represented
as physical quantities within the computer system memories or registers or other
such information storage, transmission or display devices.
Figure 1 is a block diagram of a device 100 upon which embodiments of
the present invention can be implemented. In general, device 100 may be
referred to as an optical device, more specifically as an optical reader, optical
pen or digital pen.

In the embodiment of Figure 1, device 100 includes a processor 32 inside
a housing 62. In one embodiment, housing 62 has the form of a pen or other
writing utensil (e.g., writing instrument). Processor 32 is operable for processing
information and instructions used to implement the functions of device 100, which
are described below.
In the present embodiment, the device 100 may include an audio output
device 36 and a display device 40 coupled to the processor 32. In other
embodiments, the audio output device and/or the display device are physically
separated from device 100, but in communication with device 100 through either
a wired or wireless connection. For wireless communication, device 100 can
include a transceiver or transmitter (not shown in Figure 1). The audio output
device 36 may include a speaker or an audio jack (e.g., for an earphone or
headphone). The display device 40 may be a liquid crystal display (LCD) or
some other suitable type of display.
In the embodiment of Figure 1, device 100 includes input buttons 38
coupled to the processor 32 for activating and controlling the device 100. For
example, the input buttons 38 allow a user to input information and commands to
device 100 or to turn device 100 on or off. Device 100 also includes a power
source 34 such as a battery.

Device 100 also includes a light source or optical emitter 44 and a light
sensor or optical detector 42 coupled to the processor 32. The optical emitter 44
may be a light emitting diode (LED), for example, and the optical detector 42 may
be a charge coupled device (CCD) or complementary metal-oxide semiconductor
(CMOS) imager array, for example. The optical emitter 44 illuminates surface 70
or a portion thereof. Light reflected from the surface 70 is received at and
recorded by optical detector 42.
The surface 70 may be any surface suitable to be written on, e.g., a sheet
of paper, although the present invention is not so limited. In one embodiment, a
pattern of markings is printed on surface 70. In another embodiment of the
invention, the surface is a material with electronic ink, a flat panel LCD
display, or any other surface or display. The end of device 100 that holds optical
emitter 44 and optical detector 42 is placed against or near surface 70. As
device 100 is moved relative to the surface 70, the pattern of markings is read
and recorded by optical emitter 44 and optical detector 42. As discussed in more
detail further below, in one embodiment, the markings on surface 70 are used to
determine the position of device 100 relative to surface 70 (see Figures 3 and 4). In
another embodiment, the markings on surface 70 are used to encode information
(see Figures 5 and 6). The captured images of surface 70 can be analyzed
(processed) by device 100 to decode the markings and recover the encoded
information. Additional descriptions regarding surface markings for encoding
information and the reading/recording of such markings by electronic devices can
be found in the following patents and patent applications that are assigned to
Anoto and that are all herein incorporated by reference in their entirety: U.S.
Patent No. 6,502,756, U.S. Application No. 10/179,966, filed on June 26, 2002,
WO 01/95559, WO 01/71473, WO 01/75723, WO 01/26032, WO 01/75780, WO
01/01670, WO 01/75773, WO 01/71475, WO 00/73983, and WO 01/16691.
Device 100 of Figure 1 also includes a memory unit 48 coupled to the
processor 32. In one embodiment, memory unit 48 is a removable memory unit
embodied as a memory cartridge or a memory card. In another embodiment,
memory unit 48 includes random access (volatile) memory (RAM) and read-only
(non-volatile) memory (ROM) for storing information and instructions for
processor 32. The memory unit may be used to store information representing
user written data, e.g., a symbol, a number, a word or words.
In the embodiment of Figure 1 , device 100 includes a writing element 52
situated at the same end of device 100 as the optical detector 42 and the optical
emitter 44. Writing element 52 can be, for example, a pen, pencil, marker or the
like, and may or may not be retractable. In certain applications, writing element
52 is not needed. In other applications, a user can use writing element 52 to make marks on surface 70, including characters such as letters, numbers,
mathematical symbols and the like. These marks can be scanned (imaged) and
interpreted by device 100 according to their position on the surface 70. The
position of the user-produced marks can be determined using a pattern of marks
that are printed on surface 70; refer to the discussion of Figures 3 and 4, below.
In one embodiment, the user-produced markings can be interpreted by device
100 using optical character recognition (OCR) techniques that recognize
handwritten characters.
As mentioned above, surface 70 may be a sheet of paper, although
surfaces consisting of materials other than paper may be used. Also, surface 70
may or may not be flat. For example, surface 70 may be embodied as the
surface of a globe. Furthermore, surface 70 may be smaller or larger than a
conventional (e.g., 8.5x11 inch) page of paper.
Figure 2 is a block diagram of another device 200 upon which
embodiments of the present invention can be implemented. Device 200 includes
processor 32, power source 34, audio output device 36, input buttons 38,
memory unit 48, optical detector 42, optical emitter 44 and writing element 52,
previously described herein. However, in the embodiment of Figure 2, optical
detector 42, optical emitter 44 and writing element 52 are embodied as optical
device 201 in housing 62, and processor 32, power source 34, audio output device 36, input buttons 38 and memory unit 48 are embodied as platform 202 in
housing 74. In the present embodiment, optical device 201 is coupled to platform
202 by a cable 102; however, a wireless connection can be used instead. The
elements illustrated by Figure 2 can be distributed between optical device 201
and platform 202 in combinations other than those described above.
Figure 3 shows a sheet of paper 15 provided with a pattern of marks
according to one embodiment of the present invention. In the embodiment of
Figure 3, sheet of paper 15 is provided with a coding pattern in the form of
optically readable position code 17 that consists of a pattern of marks 18. The
marks 18 in Figure 3 are greatly enlarged for the sake of clarity. In actuality, the
marks 18 may not be easily discernible by the human visual system, and may
appear as grayscale on sheet of paper 15. In one embodiment, the marks 18 are
embodied as dots; however, the present invention is not so limited.
Figure 4 shows an enlarged portion 19 of the position code 17 of Figure 3.
An optical device such as devices 100 and 200 (Figures 1 and 2) is positioned to
record an image of a region of the position code 17. In one embodiment, the
optical device fits the marks 18 to a reference system in the form of a raster with
raster lines 21 that intersect at raster points 22. Each of the marks 18 is
associated with a raster point 22. For example, mark 23 is associated with raster
point 24. For the marks in an image/raster, the displacement of a mark from the raster point associated with the mark is determined. Using these displacements,
the pattern in the image/raster is compared to patterns in the reference system.
Each pattern in the reference system is associated with a particular location on
the surface 70. Thus, by matching the pattern in the image/raster with a pattern
in the reference system, the position of the pattern on the surface 70, and hence
the position of the optical device relative to the surface 70, can be determined.
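The raster-fitting step just described can be illustrated with a short sketch. This is a minimal illustration only, assuming a toy reference system in which each mark's displacement from its raster point is quantized to one of four directions and a window of such displacements is looked up in a table of known surface locations; the actual position code is not specified at this level of detail.

    # Illustrative sketch of the raster-fitting idea described above.
    # Assumptions: raster points lie on a square grid of known spacing, each
    # mark is displaced toward one of four directions, and a window of
    # displacements forms a key into a table of known surface locations.

    def nearest_raster_point(x, y, spacing):
        """Snap a mark to the closest raster point (intersection of raster lines)."""
        return (round(x / spacing) * spacing, round(y / spacing) * spacing)

    def quantize_displacement(x, y, spacing):
        """Classify a mark's offset from its raster point as one of four directions."""
        rx, ry = nearest_raster_point(x, y, spacing)
        dx, dy = x - rx, y - ry
        if abs(dx) >= abs(dy):
            return "E" if dx >= 0 else "W"
        return "N" if dy >= 0 else "S"

    def decode_position(marks, spacing, reference_patterns):
        """Map the pattern of displacements in one captured image to a surface
        location. reference_patterns: dict from displacement tuple to (x, y)."""
        pattern = tuple(quantize_displacement(mx, my, spacing) for mx, my in marks)
        return reference_patterns.get(pattern)  # None if the window is unknown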
With reference back to Figure 1 , four positions or regions on surface 70
are indicated by the letters A, B, C and D (these characters are not printed on
surface 70, but are used herein to indicate positions on surface 70). There may
be many such regions on the surface 70. Associated with each region on surface
70 is a unique pattern of marks. The regions on surface 70 may overlap because
even if some marks are shared between overlapping regions, the pattern of
marks in a region is still unique to that region.
In the example of Figure 1, using device 100 (specifically, using writing
element 52), a user may create a character consisting, for example, of a circled
letter "M" at position A on surface 70 (generally, the user may create the
character at any position on surface 70). The user may create such a character
in response to a prompt (e.g., an audible prompt) from device 100. When the
user creates the character, device 100 records the pattern of markings that are
uniquely present at the position where the character is created. The device 100 associates that pattern of markings with the character just created. When device
100 is subsequently positioned over the circled "M," device 100 recognizes the
pattern of marks associated therewith and recognizes the position as being
associated with a circled "M." In effect, device 100 recognizes the character
using the pattern of markings at the position where the character is located,
rather than by recognizing the character itself.
In one embodiment, the character is associated with a particular
command. In the example just described, a user can create (write) a character
that identifies a particular command, and can invoke that command repeatedly by
simply positioning device 100 over the written character. In other words, the user
does not have to write the character for a command each time the command is to
be invoked; instead, the user can write the character for a command one time
and invoke the command repeatedly using the same written character.
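The write-once, invoke-repeatedly behavior amounts to a lookup from the decoded surface position (or its unique mark pattern) to the command recorded there. A minimal sketch, with hypothetical names, might look like this:

    # Hypothetical registry tying a decoded surface location to a command.
    class CommandRegistry:
        def __init__(self):
            self._commands = {}  # surface position -> callable

        def record(self, position, command):
            """Called when the user first writes a command character."""
            self._commands[position] = command

        def invoke(self, position):
            """Called when the pen later touches a previously written character."""
            command = self._commands.get(position)
            if command is not None:
                command()

    registry = CommandRegistry()
    registry.record("A", lambda: print("menu"))  # user writes circled "M" at A
    registry.invoke("A")                         # later taps invoke it again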
Figure 5 shows a menu item tree directory according to an embodiment of
the invention. The menu item tree directory can embody an audio menu starting
from the menu M symbol. The menu tree comprises menu options associated
with applications.
Starting from the top of Figure 5, a first audio subdirectory could be a tools
T subdirectory. Under the tools T subdirectory, there could be a translator TR subdirectory, a calculator C subdirectory, a spell checker SC subdirectory, a
personal assistant PA subdirectory, an alarm clock AL subdirectory, and a tutor
TV function. Under the translator TR subdirectory, there would be Spanish SP,
French FR, and German GE translator functions (e.g., applications). Under the
personal assistant PA subdirectory, there would be calendar C, phone list PL,
and to do list TD functions or subdirectories.
Under the reference R subdirectory, there could be a thesaurus TH function,
a dictionary D subdirectory, and a help H function. Under the dictionary D
subdirectory, there can be an English E function, a Spanish SP function, and a
French FR function.
Under the games G subdirectory, there can be games such as word
scramble WS, funky potatoes FP, and doodler DO. Other games could also be
present in other embodiments of the invention.
Under the system S subdirectory, there can be a security SE function, and
a personalization P function. As illustrated by the menu item tree-directory, a
user may proceed down any desired path by listening to recitations of the various
menu items and then selecting the menu item desired. The subsequent selection
of the desired menu item may occur in any suitable manner. For example, in some embodiments, a user can cause the interactive
apparatus to scroll through the audio menu by "down touching" on a created
graphic element with a writing instrument. The "down touching" may be
recognized by the electronics in the interactive apparatus using any suitable
mechanism. For instance, the interactive apparatus may be programmed to
recognize the image change associated with the downward movement of the apparatus
towards the selected graphic element. In another example, a pressure sensitive
switch may be provided in the interactive apparatus so that when the end of the
interactive apparatus applies pressure to the paper, the pressure switch
activates. This informs the interactive apparatus to scroll through the audio
menu.
For instance, after selecting the circled letter "M" with the interactive
apparatus (to thereby cause the pressure switch in the interactive apparatus to
activate), the audio output device in the interactive apparatus may recite "tools"
and nothing more. The user may select the circled letter "M" a second time to
cause the audio output device to recite the menu item "reference". This can be
repeated as often as desired to scroll through the audio menu. To select a
particular menu item, the user can create a distinctive mark on the paper or
provide a specific gesture (e.g., prescribed action) with the writing instrument. For instance, the user may draw a "checkmark" (or other graphic element)
next to the circled letter "M" after hearing the word "tools" to select the
subdirectory "tools". Using a method such as this, a user may navigate towards
the intended directory, subdirectory, or function in the menu item tree. A different
prescribed action may be used to cause the interactive apparatus to perform
other operations. For example, embodiments of the present invention comprise
methods for recognizing when a user is finished inputting data for a particular
application based on prescribed actions.
In one embodiment of the invention, a data input operation is terminated in
response to detecting the prescribed action of tapping the last letter of a word, for
example. In another embodiment of the invention, a data input operation is
terminated in response to detecting the prescribed action of passing a threshold
time-out, wherein no user input is detected. In this embodiment of the invention,
the prescribed action is no action. In another embodiment of the invention, a
data input operation is terminated in response to detecting the prescribed action
of tapping a predetermined area on the paper. In this embodiment of the
invention, the predetermined area may comprise user generated or pre-printed
graphics.
In other embodiments, after creating the letter "M" with a circle, the user
may select the circled letter "M". Software in the scanning apparatus recognizes the circled letter "M" as being the menu symbol and causes the scanning
apparatus to recite the menu items "tools", "reference", "games", and "system"
sequentially and at spaced timing intervals, without down touching by the user.
In one embodiment of the invention, selecting a circled letter makes the
corresponding application the active application. In one embodiment of the
invention, a user created mark defines an active region associated with the active
application.
Figure 6A shows a printable surface 601 with written or printed elements
associated with a dictionary application. A user may first start with a blank piece
of paper and may draw the circled letter "D" 602 as shown. Then, the user may
"select" the circled letter "D" 602 by, for example, tapping the circled letter "D"
602 or selecting a check mark drawn adjacent to the letter. In one embodiment
of the invention, in response to a user selection, the interactive apparatus
generates an audible tone. For example, the word "dictionary" is recited.
In an embodiment of the invention, a user may start with a pre-printed
image on the printable surface 601. For example, a dictionary specific printable
surface 601 may be used with a pre-printed circled "D" 602 and a pre-printed
checkmark 604. In this embodiment of the invention, a user may select the
dictionary application by, for example, tapping the pre-printed circled "D" 602. After selection of an active application (e.g., dictionary), the interactive
apparatus may then prompt the user to input data (e.g., write on the printable
surface 601 ). For example, in the dictionary application, the user may then write
the word "magic" 607 as shown in Figure 6A. While writing the word "magic"
607, the interactive apparatus determines the area surrounding the characters of
the word "magic" 607 to be the active region 620 on the printable surface 601. In
one embodiment of the invention, prescribed actions are identified to terminate
data input in the active region 620. The active region 620 defines a location on
the surface.
In one embodiment of the invention, tapping in the active region 620 (e.g.,
at the end of the word) indicates to the interactive apparatus that the user is done
writing the intended word and that the interactive apparatus should recognize the
word and then produce the dictionary definition. In one embodiment of the
invention, double tapping in the active region 620 indicates that the user is done
writing the intended word. Dots 650 are user written marks on the printable
surface 601 and in the active region 620 resulting from a double tapping in the
active region with a writing instrument.
Alternatively, waiting a threshold time-out period indicates to the
interactive apparatus that the user is done writing the intended word and that the
interactive apparatus should produce the dictionary definition. In another embodiment of the invention, selection of a predetermined area 610 of the
printable surface 601 indicates to the interactive apparatus that the user is done
writing the intended word and that the interactive apparatus should produce the
dictionary definition.
In one embodiment of the invention, the active region 620 is a virtual box
around any or all of the characters of the user written data. If the user selects
any region within this virtual box, this may indicate to the interactive apparatus
that the user is done writing the intended word. In one embodiment of the
invention, a single or double tap in the active region 620 indicates termination of
data entry in the active region. The processor on the device may be
programmed to recognize any or all of the above examples as user termination
events.
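A sketch of the virtual-box test follows, under the assumption that the active region is tracked as an axis-aligned bounding box and that a tap is a decoded pen-down position; the required tap count would be set per application, and the value here is an assumption.

    # Minimal sketch: active region as an axis-aligned bounding box, with a
    # tap-count rule for recognizing the termination event.
    class ActiveRegion:
        def __init__(self, x0, y0, x1, y1):
            self.x0, self.y0, self.x1, self.y1 = x0, y0, x1, y1

        def contains(self, x, y):
            return self.x0 <= x <= self.x1 and self.y0 <= y <= self.y1

    def is_termination_tap(region, taps, required_taps=2):
        """taps: chronological (x, y) pen-down positions that produced no
        stroke; True when the most recent required_taps all hit the region."""
        if len(taps) < required_taps:
            return False
        return all(region.contains(x, y) for x, y in taps[-required_taps:])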
Figure 6B is an illustration of an exemplary printable surface comprising
user written data associated with a calculator application wherein tapping of an
active region terminates data entry in accordance with embodiments of the
present invention. In this embodiment of the invention, the printable surface 601
comprises a circled letter "C" corresponding to a calculator application. In this
embodiment of the invention, a user may be prompted to create a calculator by
writing the numbers zero through nine, the plus operand symbol, the minus operand symbol, the multiplication operand symbol and the division operand
symbol.
Suppose the user wrote the numbers without any action to indicate
termination of an individual number. The string one-two-three could be
interpreted as the number one-hundred-twenty-three instead of the intended
separate numbers one, two and three. To solve this issue, embodiments of the
present invention recognize prescribed user-performed actions that indicate user
intended termination of data entry. As stated above, one such action is tapping
in the active region of the user created data. Figure 6B illustrates user created
marks 650 next to each number and operand symbol resulting from a double
tapping in the active region of each character. In Figure 6B, the active region of
each character is not illustrated, however, the active region can be defined as an
area of the printable surface that surrounds each character or string of characters.
Figure 6C is an illustration of an exemplary printable surface comprising
user written data associated with a calculator application wherein tapping of a
predetermined area 680 of the printable surface terminates data entry in
accordance with embodiments of the present invention. In this example, the
printable surface 601 comprises a predetermined area 680 suited to terminate
data entry when selected. In this embodiment of the invention, the user taps the predetermined area
680 to terminate data entry. For example, after writing the number one, a user
may tap the predetermined area 680 to terminate data entry, as opposed to the
termination action illustrated in Figure 6B wherein termination required tapping
the active region of the character. As a result, the numbers of Figure 6C do not
have user created marks in the active regions of the characters as shown in
Figure 6B.
In one embodiment of the invention, the predetermined area 680 can be
user selectable. In this embodiment of the invention, a user may graphically bind
the predetermined area 680 by drawing a border around it.
In another embodiment of the invention, the predetermined area 680
comprises pre-printed images. For example, in Figure 6C, the predetermined
area 680 comprises the word "done." In one embodiment of the invention, pre-
printed surfaces can be application specific.
Figures 6B and 6C illustrate examples of user actions that can terminate
data entry. It is appreciated that an application may allow more than one
prescribed action to terminate data entry. For example, with reference to Figure
6C, a user may have tapped the predetermined location 680 to terminate entry of
the numbers zero through nine and may have double tapped in the active region of each mathematical operand symbol to terminate entry of each operand
symbol.
In another embodiment of the invention, a user action may include ceasing
to write for a predetermined period of time. In this embodiment of the invention, a
user may pause between writing the characters to differentiate each intended
character.
Figure 7 is a flow diagram of an exemplary computer implemented method
700 of inputting data including prescribed user data entry termination events in
accordance with embodiments of the present invention.
At step 702, process 700 includes receiving information from the optical
sensor representing user-written data, the user-written data made with a writing
instrument (e.g., device 100 or 200 of Figures 1 and 2) upon a surface. This
information may include encoded information regarding the image of the written
information, e.g., stroke data, that may include location information gathered from
encoded paper.
In one embodiment of the invention, the writing surface comprises
encoded position information that can be used to determine a specific location on
the surface. In one embodiment of the invention, the surface can be defined as a plurality of regions wherein each of the plurality of regions is associated with a
unique printed image. In this instance, the data is representative of the real-time
location of the writing instrument on the surface as the user writes.
In one embodiment of the invention, the unique printed images are dot
patterns. In one embodiment of the invention, the information representing user-
written data may be received wirelessly (e.g., via a Bluetooth wireless connection
or any other wireless connections known in the art).
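The information received in step 702 can be pictured as a stream of decoded samples. The following representation is hypothetical; the text does not fix a data format:

    from dataclasses import dataclass

    # Hypothetical sample format for the stroke data described above: each
    # sample carries the decoded surface position, a timestamp, and whether
    # the pen tip is touching the surface.
    @dataclass
    class StrokeSample:
        x: float
        y: float
        timestamp: float  # seconds since some epoch
        pen_down: bool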
At step 704, process 700 includes automatically defining an active region
on the surface surrounding the user written data. In one embodiment of the
invention, an area encompassing the user written data defines the active region.
As the user is writing, the processor automatically defines a surface region to
encompass the user written data.
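Step 704 can be sketched as a bounding box grown incrementally as pen-down samples arrive; the margin value below is an assumed padding, not a value taken from this description.

    # Minimal sketch of step 704: grow a bounding box around the user
    # written data as each pen-down sample arrives.
    def expand_active_region(region, x, y, margin=2.0):
        """region: [x0, y0, x1, y1] or None before any writing is seen."""
        if region is None:
            return [x - margin, y - margin, x + margin, y + margin]
        region[0] = min(region[0], x - margin)
        region[1] = min(region[1], y - margin)
        region[2] = max(region[2], x + margin)
        region[3] = max(region[3], y + margin)
        return region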
At step 706, process 700 includes recognizing a user performing a
prescribed action or event with the writing instrument indicating completion of the
user-written data. In one embodiment of the invention, the prescribed action
includes the writing instrument being tapped within the active region. In this
embodiment, the writing instrument may be tapped a predetermined number of
times within the active region. Also in this embodiment of the invention, the
writing instrument may be tapped on a letter or number of the user written data. The tap may be made on or near the last character written of the user written
data.
In another embodiment of the invention, the prescribed action includes the
writing instrument ceasing to be used to write user written data for a
predetermined period of time (e.g., threshold time). In this embodiment of the
invention, receiving no information that represents user written data for the
predetermined period of time indicates termination of the user written data. In
one embodiment of the invention, the period of time begins once the writing
instrument is lifted from the printable surface. In another embodiment of the
invention, the period of time begins once receiving information representing user
written data ends.
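The time-out variant reduces to comparing the idle interval against a threshold; whether the clock starts at pen lift or at the last received sample is exactly the choice the two embodiments above describe. A sketch, with an assumed threshold value:

    # Sketch of the write time-out test. threshold_s is an assumed value;
    # last_event is the time the pen was lifted (one embodiment) or the time
    # the last user-written sample arrived (the other embodiment).
    def writing_timed_out(last_event, now, threshold_s=2.5):
        return last_event is not None and (now - last_event) >= threshold_s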
In another embodiment of the invention, the prescribed action includes the
writing instrument being tapped in a predetermined location of the surface. In
one embodiment of the invention, the predetermined location on the surface
comprises a pre-printed image that may indicate a termination word. For
example, the pre-printed image may be the word "done" printed on the surface.
In this embodiment of the invention, selecting the pre-printed word "done"
terminates the receiving of information representing the user written data.
Applications may be programmed to respond to one, two, or all of the above
described termination events.

In one embodiment of the invention, the prescribed action is application
specific. For example, a first application may allow different prescribed actions
than a second application. In another embodiment of the invention, an
application may allow multiple prescribed actions to terminate an event.
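One way to read this application-specific behavior is a per-application table of permitted termination events; the event identifiers and application names below are illustrative only.

```python
# Hypothetical termination-event identifiers (names are illustrative only).
TAP_IN_REGION = "tap_in_region"
DOUBLE_TAP = "double_tap"
WRITE_TIMEOUT = "write_timeout"
DONE_WORD = "tap_preprinted_done"

# Assumed per-application policy: each application enables its own subset of events.
ALLOWED_TERMINATIONS = {
    "dictionary": {TAP_IN_REGION, DOUBLE_TAP, WRITE_TIMEOUT},
    "calculator": {WRITE_TIMEOUT, DONE_WORD},
}

def event_terminates(app: str, event: str) -> bool:
    """True if `event` is a prescribed termination action for `app`."""
    return event in ALLOWED_TERMINATIONS.get(app, set())

# e.g. event_terminates("dictionary", DOUBLE_TAP) -> True
# e.g. event_terminates("calculator", DOUBLE_TAP) -> False
```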
At step 708, process 700 includes, in response to the recognizing of the termination event, terminating the receiving of the user written data. In one embodiment of the invention, identification of a prescribed action terminates the receiving.
At step 710, process 700 includes, in response to the terminating of the receiving, processing the information to automatically recognize the user-written data. In one embodiment of the invention, the user-written data can be recognized after termination of the receiving. This step may include automatic recognition of the data; after the data is recognized, the processor may implement an action related to the recognized word, for example, defining the word, translating the word, etc.
For example, in the dictionary mode, a user may write a plurality of words. By performing a termination event after each word, an action related to the dictionary application is taken for that word. A user may then go back to a previously defined word and select it; the word will still be recognized, and the definition will be presented in response to selecting the word. In another example, after the user writes a word and then taps the last character thereof, the processor performs an identification of the word and a definition is then rendered. In one embodiment of the invention, processing the information includes generating an audio response.
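For the dictionary example, a hedged sketch of the post-termination flow; recognize_word, look_up_definition, and speak are hypothetical placeholders for the unspecified recognition, lookup, and audio components.

```python
def recognize_word(strokes) -> str:
    """Placeholder for handwriting recognition over the active region (unspecified)."""
    return "<recognized word>"

def look_up_definition(word: str) -> str:
    """Placeholder dictionary lookup (unspecified)."""
    return f"definition of {word}"

def speak(text: str) -> None:
    """Placeholder for the audio response described in the text."""
    print(f"[audio] {text}")

def on_termination(strokes, app: str) -> None:
    """After a termination event, recognize the buffered writing and act on it."""
    word = recognize_word(strokes)
    if app == "dictionary":
        speak(look_up_definition(word))  # e.g., render the definition audibly
    strokes.clear()                      # ready to receive the next word
```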
Figure 8 is a flow diagram of an exemplary computer implemented process 800 of determining termination of data entry in accordance with embodiments of the present invention, in which an audible tone is generated as feedback to the user that the user written word has been completed and acknowledged.
At step 802, process 800 includes determining an active region associated
with an active application, the active region associated with an area on a
printable surface comprising user written data. In one embodiment of the
invention, an area encompassing user-written data determines an active region.
At step 804, process 800 includes receiving information from the optical
sensor representing user written data associated with the active region.

At step 806, process 800 includes detecting a user input indicating a
termination event of the user written data. In one embodiment of the invention, a user input indicating a termination event of the user written data is application specific. The user input can be any one or all of the prescribed actions described in conjunction with Figures 6A-6C.
At step 808, process 800 includes terminating data entry of the user
written data in the active region associated with the application. In one
embodiment of the invention, termination of data entry allows differentiation of
user-written characters or words. For example, by performing one of the
prescribed actions described above, the number twelve can be distinguished
from the numbers one and two.
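A small worked example of the differentiation this enables: digits are buffered until a termination event, so writing "1" then "2" before terminating is read as twelve, while terminating after each digit yields one and two. The buffering logic is an illustrative assumption.

```python
def segment_digits(events):
    """Group written digits into numbers, splitting at termination events.

    ["1", "2", "TERMINATE"] yields [12], whereas
    ["1", "TERMINATE", "2", "TERMINATE"] yields [1, 2].
    """
    numbers, buffer = [], ""
    for e in events:
        if e == "TERMINATE":
            if buffer:
                numbers.append(int(buffer))
                buffer = ""
        else:
            buffer += e
    if buffer:  # trailing digits with no explicit termination event
        numbers.append(int(buffer))
    return numbers

assert segment_digits(["1", "2", "TERMINATE"]) == [12]
assert segment_digits(["1", "TERMINATE", "2", "TERMINATE"]) == [1, 2]
```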
At step 810, process 800 includes generating a tone indicating termination of data entry in the active region. In one embodiment, multiple tones can be generated to distinguish between different data entry termination actions. Subsequent steps may then process the user written data, e.g., optical character recognition (OCR).
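A sketch of the tone feedback, with distinct frequencies per termination action; the frequencies are assumed values and the audio call is a placeholder, since no audio API is specified.

```python
# Assumed frequencies (Hz); the disclosure specifies tones but no particular values.
TERMINATION_TONES = {
    "tap_in_region": 660,
    "double_tap": 880,
    "write_timeout": 440,
}

def play_termination_tone(event: str, duration_ms: int = 120) -> None:
    """Acknowledge termination with a tone keyed to the action (placeholder audio)."""
    freq = TERMINATION_TONES.get(event, 440)
    print(f"[audio] {freq} Hz for {duration_ms} ms")  # stand-in for a real tone generator
```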
Broadly, this writing has described computer implemented methods of and systems for inputting data. A method for inputting data includes receiving information representing user-written data, the user-written data made with a writing instrument upon a surface. The method further includes defining an active region on the surface surrounding the user-written data and recognizing a user performing a prescribed action with the writing instrument indicating completion of the user-written data. In response to the recognizing, the method includes terminating the receiving and, in response to the terminating, processing the information to automatically recognize the user-written data.
Embodiments of the present invention are thus described. While the present invention has been described in particular embodiments, it should be appreciated that the present invention should not be construed as limited by such embodiments, but rather construed according to the claims below.

Claims

1. A computer implemented method of inputting data comprising:
receiving information representing user-written data, said user-written data
made with a writing instrument upon a surface;
defining an active region on said surface surrounding said user written
data;
recognizing a user performing a prescribed action with said writing
instrument indicating completion of said user-written data;
in response to said recognizing, terminating said receiving; and
in response to said terminating, processing said information to
automatically recognize said user-written data.
2. A method as described in Claim 1 wherein said prescribed action is said
writing instrument being tapped within said active region.
3. A method as described in Claim 2 wherein said writing instrument is
tapped near the end of said user-written data.
4. A method as described in Claim 1 wherein said prescribed action is said
writing instrument being double tapped within said active region.
5. A method as described in Claim 1 wherein said prescribed action is said
writing instrument ceasing to be used to write user-written data for a
predetermined period of time.
6. A method as described in Claim 1 wherein said prescribed action is said
writing instrument being tapped in a prescribed location of said surface and
wherein said prescribed location bears a pre-printed image thereon.
7. A method as described in Claim 1 wherein said writing instrument
comprises a processor operable to execute a plurality of applications and
wherein said prescribed action is application specific.
8. A method as described in Claim 1 wherein said user-written data is a user-
written symbol, character, word or number.
9. A method as described in Claim 8 wherein said user written data is circled.
10. A method as described in Claim 1 wherein said user-written data is a
plurality of user-written words or numbers.
11. A computer system comprising:
a processor coupled to a bus;
a sensor for sensing images on a surface;
a writing instrument;
a memory coupled to said bus and comprising instructions that when
executed implement a method of inputting data, said method comprising:
receiving information from said sensor representing user-written
data, said user-written data made with said writing instrument upon said
surface;
defining an active region on said surface surrounding said user
written data;
recognizing a user performing one of a plurality of prescribed
actions with said writing instrument indicating completion of said user-
written data;
in response to said recognizing, terminating said receiving; and
in response to said terminating, processing said information to
automatically recognize said user-written data.
12. A computer system as described in Claim 11 wherein one of said plurality
of prescribed actions is said writing instrument being tapped within said active
region.
13. A computer system as described in Claim 12 wherein said writing
instrument is tapped near the end of said user-written data.
14. A computer system as described in Claim 12 wherein one of said plurality
of prescribed actions is said writing instrument being double tapped within said
active region.
15. A computer system as described in Claim 12 wherein one of said plurality
of prescribed actions is said writing instrument ceasing to be used to write user-
written data for a predetermined period of time.
16. A computer system as described in Claim 15 wherein one of said plurality
of prescribed actions is said writing instrument being tapped in a prescribed
location of said surface and wherein said prescribed location bears a pre-printed
image thereon.
17. A computer system as described in Claim 11 wherein said processor is
operable to execute a plurality of applications and wherein said plurality of
prescribed actions are application specific.
18. A computer system as described in Claim 11 wherein said user-written
data is a user-written word.
19. A computer system as described in Claim 11 wherein said user-written
data is a plurality of user-written words.
20. A method for determining termination of data entry comprising:
determining an active region associated with an active application, said
active region associated with an area on a printable surface comprising user
written data;
receiving information representing user written data associated with said
active region;
detecting a user input indicating a termination event of said user written
data; and
terminating data entry of said user written data in said active region
associated with said application.
21. The method as described in Claim 20 wherein said active region is an
area surrounding said user written data.
22. The method as described in Claim 20 wherein said detecting comprises:
detecting a writing instrument being tapped in said active region.
23. The method as described in Claim 22 wherein said detecting tapping said
writing instrument in said active region comprises:
detecting said writing instrument being double tapped in said active region.
24. The method as described in Claim 22 wherein said detecting tapping said
writing instrument in said active region further comprises:
detecting said writing instrument being tapped in a predetermined area of
said printable surface outside said active region.
25. The method as described in Claim 22 wherein said detecting tapping said
writing instrument in said active region further comprises:
detecting a writing time-out greater than a predetermined threshold of
time.
26. The method as described in Claim 23 further comprising:
generating an audible signal indicating termination of data entry in said
active region.
27. The method as described in Claim 26 further comprising:
generating a first audible signal in response to a first tap of said writing
instrument in said active region; and
generating a second audible signal in response to a second tap of said
writing instrument in said active region.
28. The method as described in Claim 20 wherein said printable surface
comprises location encoded information.
29. The method as described in Claim 22 wherein said writing instrument
being tapped in said active region generates user-written marks on said printable
surface.
30. The method as described in Claim 22 wherein said writing instrument
being tapped in said active region comprises:
detecting tapping of said writing instrument adjacent to said user written
data.
31. The method as described in Claim 20 wherein said termination event is
application specific.
32. A method for determining termination of data entry comprising:
determining an active region associated with an active application, said
active region associated with an area on a printable surface comprising user
written data;
receiving information representing user written data associated with said
active region; and
terminating data entry of said user written data in said active region
associated with said active application in response to passage of a
predetermined threshold of time in which no user written data is received.
33. The method as described in Claim 32 wherein said predetermined
threshold of time depends on said active application.
34. The method as described in Claim 32 wherein said predetermined
threshold of time begins after a writing instrument is lifted from said printable
surface.
35. The method as described in Claim 32 wherein said predetermined
threshold of time begins after a writing instrument contacts said printable surface.
36. A method for determining termination of data entry comprising:
determining an active region associated with an active application, said
active region associated with an area on a printable surface comprising user
written data;
receiving information representing user written data associated with said
active region;
detecting a user selection of a predetermined area of said printable
surface; and
terminating data entry of said user written data in said active region
associated with said application in response to said user selection.
37. The method as described in Claim 36 wherein said predetermined area of
said printable surface comprises a pre-printed image.
38. The method as described in Claim 37 wherein said pre-printed image is
associated with a termination instruction understood by an interactive computer
system.
39. A device comprising:
a writing instrument;
a sensor for optically sensing images from a surface;
a processor coupled to said sensor; and
a memory coupled to said processor, said memory unit containing
instructions that when executed implement a method for recognizing data entry
termination, said method comprising:
determining an active region associated with an active application,
said active region associated with an area on a printable surface
comprising user written data;
receiving information representing user written data associated with
said active region;
detecting a user input indicating a termination event of said user
written data; and
terminating data entry of said user written data in said active region
associated with said application.
40. The device as described in Claim 39 wherein said detecting comprises:
detecting tapping said writing instrument in said active region.
41. The device as described in Claim 40 wherein said detecting tapping said
writing instrument in said active region comprises:
detecting a writing time-out that is greater than a predetermined threshold
of time.
42. The device as described in Claim 39 wherein said method further
comprises:
generating a tone indicating termination of data entry in said active region.
43. The device as described in Claim 39 wherein said termination event is
application specific.
PCT/US2005/041880 2005-01-12 2005-11-18 System and method for identifying termination of data entry WO2006076079A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
CA002532447A CA2532447A1 (en) 2005-01-12 2006-01-10 System and method for identifying termination of data entry
EP06000514A EP1684160A1 (en) 2005-01-12 2006-01-11 System and method for identifying termination of data entry

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US11/035,003 2005-01-12
US11/035,003 US20060078866A1 (en) 2004-03-17 2005-01-12 System and method for identifying termination of data entry

Publications (2)

Publication Number Publication Date
WO2006076079A2 true WO2006076079A2 (en) 2006-07-20
WO2006076079A3 WO2006076079A3 (en) 2007-02-01

Family

ID=36678063

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/041880 WO2006076079A2 (en) 2005-01-12 2005-11-18 System and method for identifying termination of data entry

Country Status (7)

Country Link
US (1) US20060078866A1 (en)
EP (1) EP1684160A1 (en)
JP (1) JP2006195995A (en)
KR (2) KR100806240B1 (en)
CN (1) CN1855013A (en)
CA (1) CA2532447A1 (en)
WO (1) WO2006076079A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2854046A3 (en) * 2013-07-25 2015-04-29 Brother Kogyo Kabushiki Kaisha Paper medium, input device, and a non-transitory computer-readable medium for input device
EP2854011A3 (en) * 2013-09-17 2015-04-29 Brother Kogyo Kabushiki Kaisha Paper medium, input device, and computer-readable medium storing computer-readable instructions for input device

Families Citing this family (41)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7809215B2 (en) * 2006-10-11 2010-10-05 The Invention Science Fund I, Llc Contextual information encoded in a formed expression
US8290313B2 (en) * 2005-03-18 2012-10-16 The Invention Science Fund I, Llc Electronic acquisition of a hand formed expression and a context of the expression
US8640959B2 (en) 2005-03-18 2014-02-04 The Invention Science Fund I, Llc Acquisition of a user expression and a context of the expression
US8232979B2 (en) 2005-05-25 2012-07-31 The Invention Science Fund I, Llc Performing an action with respect to hand-formed expression
US8229252B2 (en) * 2005-03-18 2012-07-24 The Invention Science Fund I, Llc Electronic association of a user expression and a context of the expression
US20070273674A1 (en) * 2005-03-18 2007-11-29 Searete Llc, A Limited Liability Corporation Machine-differentiatable identifiers having a commonly accepted meaning
US8340476B2 (en) * 2005-03-18 2012-12-25 The Invention Science Fund I, Llc Electronic acquisition of a hand formed expression and a context of the expression
US8599174B2 (en) * 2005-03-18 2013-12-03 The Invention Science Fund I, Llc Verifying a written expression
US20060212430A1 (en) 2005-03-18 2006-09-21 Searete Llc, A Limited Liability Corporation Of The State Of Delaware Outputting a saved hand-formed expression
US7826687B2 (en) 2005-03-18 2010-11-02 The Invention Science Fund I, Llc Including contextual information with a formed expression
US7672512B2 (en) 2005-03-18 2010-03-02 Searete Llc Forms for completion with an electronic writing device
WO2008150919A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Electronic annotation of documents with preexisting content
US8254605B2 (en) * 2007-05-29 2012-08-28 Livescribe, Inc. Binaural recording for smart pen computing systems
WO2008150916A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Enhanced audio recording for smart pen computing systems
WO2008150887A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Self-addressing paper
US8416218B2 (en) * 2007-05-29 2013-04-09 Livescribe, Inc. Cyclical creation, transfer and enhancement of multi-modal information between paper and digital domains
US8374992B2 (en) * 2007-05-29 2013-02-12 Livescribe, Inc. Organization of user generated content captured by a smart pen computing system
CA2688634A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Multi-modal smartpen computing system
WO2008150923A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Customer authoring tools for creating user-generated content for smart pen applications
WO2008150921A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Communicating audio and writing using a smart pen computing system
WO2008150924A1 (en) * 2007-05-29 2008-12-11 Livescribe, Inc. Animation of audio ink
US20090251441A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Multi-Modal Controller
US9058067B2 (en) * 2008-04-03 2015-06-16 Livescribe Digital bookclip
US8149227B2 (en) * 2008-04-03 2012-04-03 Livescribe, Inc. Removing click and friction noise in a writing device
US8944824B2 (en) * 2008-04-03 2015-02-03 Livescribe, Inc. Multi-modal learning system
US8446297B2 (en) 2008-04-03 2013-05-21 Livescribe, Inc. Grouping variable media inputs to reflect a user session
US20090251338A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Ink Tags In A Smart Pen Computing System
US7810730B2 (en) 2008-04-03 2010-10-12 Livescribe, Inc. Decoupled applications for printed materials
US8446298B2 (en) * 2008-04-03 2013-05-21 Livescribe, Inc. Quick record function in a smart pen computing system
US20090251440A1 (en) * 2008-04-03 2009-10-08 Livescribe, Inc. Audio Bookmarking
US8300252B2 (en) * 2008-06-18 2012-10-30 Livescribe, Inc. Managing objects with varying and repeated printed positioning information
US8490157B2 (en) * 2008-10-29 2013-07-16 Microsoft Corporation Authentication—circles of trust
US20110041052A1 (en) * 2009-07-14 2011-02-17 Zoomii, Inc. Markup language-based authoring and runtime environment for interactive content platform
CN101963846B (en) * 2009-07-24 2013-04-24 精工爱普生株式会社 Optical pen
US20110148757A1 (en) * 2009-07-24 2011-06-23 Seiko Epson Corporation Optical input pen device with a trigger-style switch
CN102737015A (en) * 2011-04-07 2012-10-17 英业达股份有限公司 Writing system with real-time translation and writing method of writing system with real-time translation
JP5754257B2 (en) * 2011-06-20 2015-07-29 大日本印刷株式会社 Information processing system and program
US9348438B2 (en) * 2013-02-19 2016-05-24 Dell Products L.P. Advanced in-cell touch optical pen
CN104035685A (en) * 2013-03-07 2014-09-10 龙旗科技(上海)有限公司 Hand-held terminal unlocking method based on motion sensing
US10671795B2 (en) * 2014-12-23 2020-06-02 Lenovo (Singapore) Pte. Ltd. Handwriting preview window
US10754442B2 (en) * 2015-07-09 2020-08-25 YewSavin, Inc. Films or surfaces including positional tracking marks

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020041290A1 (en) * 2000-10-06 2002-04-11 International Business Machines Corporation Extending the GUI desktop/paper metaphor to incorporate physical paper input
US20020197589A1 (en) * 2001-06-26 2002-12-26 Leapfrog Enterprises, Inc. Interactive educational apparatus with number array
US20030087219A1 (en) * 2001-07-18 2003-05-08 Berger Lawrence J. System and method for real-time observation assessment
US6608618B2 (en) * 2001-06-20 2003-08-19 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US20030162162A1 (en) * 2002-02-06 2003-08-28 Leapfrog Enterprises, Inc. Write on interactive apparatus and method
US20040229195A1 (en) * 2003-03-18 2004-11-18 Leapfrog Enterprises, Inc. Scanning apparatus

Family Cites Families (102)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
AUPQ131399A0 (en) * 1999-06-30 1999-07-22 Silverbrook Research Pty Ltd A method and apparatus (NPAGE02)
JPS61199119A (en) 1985-03-01 1986-09-03 Nec Corp Hand-writing tablet with display function
US4841387A (en) * 1987-12-15 1989-06-20 Rindfuss Diane J Arrangement for recording and indexing information
JPH0322259A (en) * 1989-03-22 1991-01-30 Seiko Epson Corp Small-sized data display and reproducing device
US5484292A (en) * 1989-08-21 1996-01-16 Mctaggart; Stephen I. Apparatus for combining audio and visual indicia
US5209665A (en) * 1989-10-12 1993-05-11 Sight & Sound Incorporated Interactive audio visual work
JP2784825B2 (en) * 1989-12-05 1998-08-06 ソニー株式会社 Information input control device
JP2925359B2 (en) * 1991-06-19 1999-07-28 キヤノン株式会社 Character processing method and apparatus
JP3120085B2 (en) * 1991-11-21 2000-12-25 株式会社セガ Electronic devices and information carriers
US5294792A (en) * 1991-12-31 1994-03-15 Texas Instruments Incorporated Writing tip position sensing and processing apparatus
WO1994015272A1 (en) * 1992-12-22 1994-07-07 Morgan Michael W Pen-based electronic teaching system
US5409381A (en) * 1992-12-31 1995-04-25 Sundberg Learning Systems, Inc. Educational display device and method
US6853293B2 (en) * 1993-05-28 2005-02-08 Symbol Technologies, Inc. Wearable communication system
US5413486A (en) * 1993-06-18 1995-05-09 Joshua Morris Publishing, Inc. Interactive book
US5640193A (en) * 1994-08-15 1997-06-17 Lucent Technologies Inc. Multimedia service access by reading marks on an object
US5652714A (en) * 1994-09-30 1997-07-29 Apple Computer, Inc. Method and apparatus for capturing transient events in a multimedia product using an authoring tool on a computer system
WO1996015837A1 (en) * 1994-11-21 1996-05-30 Compaq Computer Corporation Interactive play with a computer
US5520544A (en) * 1995-03-27 1996-05-28 Eastman Kodak Company Talking picture album
US5730602A (en) * 1995-04-28 1998-03-24 Penmanship, Inc. Computerized method and apparatus for teaching handwriting
US5978773A (en) * 1995-06-20 1999-11-02 Neomedia Technologies, Inc. System and method for using an ordinary article of commerce to access a remote computer
JPH0926769A (en) * 1995-07-10 1997-01-28 Hitachi Ltd Picture display device
JPH0944598A (en) * 1995-07-31 1997-02-14 Sanyo Electric Co Ltd Handwritten character input device
DE69637146T2 (en) * 1995-08-03 2008-02-28 Interval Research Corp., Palo Alto COMPUTER INTERACTOR SYSTEM AND METHOD FOR PROVIDING IT
US7498509B2 (en) * 1995-09-28 2009-03-03 Fiberspar Corporation Composite coiled tubing end connector
US6081261A (en) * 1995-11-01 2000-06-27 Ricoh Corporation Manual entry interactive paper and electronic document handling and processing system
US5767457A (en) * 1995-11-13 1998-06-16 Cirque Corporation Apparatus and method for audible feedback from input device
US5902968A (en) * 1996-02-20 1999-05-11 Ricoh Company, Ltd. Pen-shaped handwriting input apparatus using accelerometers and gyroscopes and an associated operational device for determining pen movement
US5757361A (en) * 1996-03-20 1998-05-26 International Business Machines Corporation Method and apparatus in computer systems to selectively map tablet input devices using a virtual boundary
US5903729A (en) * 1996-09-23 1999-05-11 Motorola, Inc. Method, system, and article of manufacture for navigating to a resource in an electronic network
US6218964B1 (en) * 1996-09-25 2001-04-17 Christ G. Ellis Mechanical and digital reading pen
US5803748A (en) * 1996-09-30 1998-09-08 Publications International, Ltd. Apparatus for producing audible sounds in response to visual indicia
SE509327C2 (en) * 1996-11-01 1999-01-11 C Technologies Ab Method and device for registering characters using a pen
US5937110A (en) * 1996-12-20 1999-08-10 Xerox Corporation Parallel propagating embedded binary sequences for characterizing objects in N-dimensional address space
KR100224618B1 (en) * 1997-03-27 1999-10-15 윤종용 View changing method for multi-purpose educational device
KR100208019B1 (en) * 1997-07-16 1999-07-15 윤종용 Multi-purpose training system
US5910009A (en) * 1997-08-25 1999-06-08 Leff; Ruth B. Communication aid using multiple membrane switches
US6252564B1 (en) * 1997-08-28 2001-06-26 E Ink Corporation Tiled displays
US6201903B1 (en) * 1997-09-30 2001-03-13 Ricoh Company, Ltd. Method and apparatus for pen-based faxing
US6518950B1 (en) * 1997-10-07 2003-02-11 Interval Research Corporation Methods and systems for providing human/computer interfaces
US6256638B1 (en) * 1998-04-14 2001-07-03 Interval Research Corporation Printable interfaces and digital linkmarks
JPH11122401A (en) * 1997-10-17 1999-04-30 Noritsu Koki Co Ltd Device for preparing photograph provided with voice code
US6181329B1 (en) * 1997-12-23 2001-01-30 Ricoh Company, Ltd. Method and apparatus for tracking a hand-held writing instrument with multiple sensors that are calibrated by placing the writing instrument in predetermined positions with respect to the writing surface
US6456749B1 (en) * 1998-02-27 2002-09-24 Carnegie Mellon University Handheld apparatus for recognition of writing, for remote communication, and for user defined input templates
US6064855A (en) * 1998-04-27 2000-05-16 Ho; Frederick Pak Wai Voice book system
US6199042B1 (en) * 1998-06-19 2001-03-06 L&H Applications Usa, Inc. Reading system
WO2000011596A1 (en) * 1998-08-18 2000-03-02 Digital Ink, Inc. Handwriting device with detection sensors for absolute and relative positioning
JP2000206631A (en) * 1999-01-18 2000-07-28 Olympus Optical Co Ltd Photographing device
US20020000468A1 (en) * 1999-04-19 2002-01-03 Pradeep K. Bansal System and method for scanning & storing universal resource locator codes
US7106888B1 (en) * 1999-05-25 2006-09-12 Silverbrook Research Pty Ltd Signature capture via interface surface
AUPQ363299A0 (en) * 1999-10-25 1999-11-18 Silverbrook Research Pty Ltd Paper based information interface
US6832717B1 (en) * 1999-05-25 2004-12-21 Silverbrook Research Pty Ltd Computer system interface surface
JP4785310B2 (en) * 1999-05-28 2011-10-05 アノト アクティエボラーク Products used to record information
JP2000357046A (en) 1999-06-15 2000-12-26 Mitsubishi Electric Corp Handwriting input device and computer readable recording medium recording handwriting input program
SE516561C2 (en) * 1999-06-28 2002-01-29 C Technologies Ab Reading pen for reading text with light emitting diodes placed in the body on the large face of a printed circuit board to supply illumination
US6405167B1 (en) * 1999-07-16 2002-06-11 Mary Ann Cogliano Interactive book
US6363239B1 (en) * 1999-08-11 2002-03-26 Eastman Kodak Company Print having attached audio data storage and method of providing same
US6564249B2 (en) * 1999-10-13 2003-05-13 Dh Labs, Inc. Method and system for creating and sending handwritten or handdrawn messages
US6886036B1 (en) * 1999-11-02 2005-04-26 Nokia Corporation System and method for enhanced data access efficiency using an electronic book over data networks
US7006116B1 (en) * 1999-11-16 2006-02-28 Nokia Corporation Tangibly encoded media identification in a book cover
WO2001048685A1 (en) * 1999-12-23 2001-07-05 Anoto Ab General information management system
US20030046256A1 (en) * 1999-12-23 2003-03-06 Ola Hugosson Distributed information management
US6532314B1 (en) * 2000-01-28 2003-03-11 Learning Resources, Inc. Talking toy scanner
US6697602B1 (en) * 2000-02-04 2004-02-24 Mattel, Inc. Talking book
US6738053B1 (en) * 2000-02-16 2004-05-18 Telefonaktiebolaget Lm Ericsson (Publ) Predefined electronic pen applications in specially formatted paper
US6556188B1 (en) * 2000-02-25 2003-04-29 Ncr Corporation Three-dimensional check image viewer and a method of handling check images in an image-based check processing system
US6572378B1 (en) * 2000-02-29 2003-06-03 Rehco, Llc Electronic drawing assist toy
US7564995B1 (en) * 2000-03-07 2009-07-21 Apple Inc. Method and apparatus for acquiring and organizing ink information in pen-aware computer systems
SE0000949L (en) * 2000-03-21 2001-09-22 Anoto Ab location information
FR2807267B1 (en) * 2000-03-28 2002-12-06 Schlumberger Systems & Service EXTENDED MOBILE RADIOTELEPHONY NETWORK AND PUBLIC PHONE FOR IMPLEMENTING SUCH A NETWORK
US7094977B2 (en) * 2000-04-05 2006-08-22 Anoto Ip Lic Handelsbolag Method and system for information association
US6771283B2 (en) * 2000-04-26 2004-08-03 International Business Machines Corporation Method and system for accessing interactive multimedia information or services by touching highlighted items on physical documents
US6349491B1 (en) * 2000-05-16 2002-02-26 Roy Eugene Able Foldable poster sized card display apparatus having pockets and header
US20020023957A1 (en) * 2000-08-21 2002-02-28 A. John Michaelis Method and apparatus for providing audio/visual feedback to scanning pen users
WO2002019151A1 (en) * 2000-08-31 2002-03-07 The Gadget Factory Computer publication
US6704699B2 (en) * 2000-09-05 2004-03-09 Einat H. Nir Language acquisition aide
IL151213A0 (en) * 2000-12-15 2003-04-10 Finger System Inc Pen type optical mouse device and method of controlling the same
US7139982B2 (en) * 2000-12-21 2006-11-21 Xerox Corporation Navigation methods, systems, and computer program products for virtual three-dimensional books
US6924822B2 (en) * 2000-12-21 2005-08-02 Xerox Corporation Magnification methods, systems, and computer program products for virtual three-dimensional books
US7316566B2 (en) * 2001-03-15 2008-01-08 International Business Machines Corporation Method for accessing interactive multimedia information or services from Braille documents
US7107533B2 (en) * 2001-04-09 2006-09-12 International Business Machines Corporation Electronic book with multimode I/O
US6535799B2 (en) * 2001-04-30 2003-03-18 International Business Machines Corporation Dynamic technique for using corrective actions on vehicles undergoing excessive turns
US6954199B2 (en) * 2001-06-18 2005-10-11 Leapfrog Enterprises, Inc. Three dimensional interactive system
CN1518730A (en) 2001-06-20 2004-08-04 跳蛙企业股份有限公司 Interactive apparatus using print media
US7202861B2 (en) * 2001-06-25 2007-04-10 Anoto Ab Control of a unit provided with a processor
US6732927B2 (en) * 2001-06-26 2004-05-11 Anoto Ab Method and device for data decoding
US6966495B2 (en) * 2001-06-26 2005-11-22 Anoto Ab Devices method and computer program for position determination
US20030001020A1 (en) * 2001-06-27 2003-01-02 Kardach James P. Paper identification information to associate a printed application with an electronic application
US20030013483A1 (en) * 2001-07-06 2003-01-16 Ausems Michiel R. User interface for handheld communication device
US20030024975A1 (en) * 2001-07-18 2003-02-06 Rajasekharan Ajit V. System and method for authoring and providing information relevant to the physical world
US6516181B1 (en) * 2001-07-25 2003-02-04 Debbie Giampapa Kirwan Interactive picture book with voice recording features and method of use
GB2378073B (en) * 2001-07-27 2005-08-31 Hewlett Packard Co Paper-to-computer interfaces
JP4261145B2 (en) * 2001-09-19 2009-04-30 株式会社リコー Information processing apparatus, information processing apparatus control method, and program for causing computer to execute the method
US20030071850A1 (en) * 2001-10-12 2003-04-17 Microsoft Corporation In-place adaptive handwriting input method and system
US20030089777A1 (en) * 2001-11-15 2003-05-15 Rajasekharan Ajit V. Method and system for authoring and playback of audio coincident with label detection
KR20050027093A (en) * 2002-05-24 2005-03-17 에스엠티엠 테크놀러지스 엘엘씨 Method and system for skills-based testing and training
JP2004021760A (en) 2002-06-19 2004-01-22 Toshiba Corp Character recognition device and control method thereof
US6915103B2 (en) * 2002-07-31 2005-07-05 Hewlett-Packard Development Company, L.P. System for enhancing books with special paper
US7090020B2 (en) * 2002-10-30 2006-08-15 Schlumberger Technology Corp. Multi-cycle dump valve
US20040121298A1 (en) * 2002-11-06 2004-06-24 Ctb/Mcgraw-Hill System and method of capturing and processing hand-written responses in the administration of assessments
US20050024346A1 (en) * 2003-07-30 2005-02-03 Jean-Luc Dupraz Digital pen function control
US7555705B2 (en) * 2003-09-10 2009-06-30 Microsoft Corporation Annotation management in a pen-based computing system
US20060033725A1 (en) * 2004-06-03 2006-02-16 Leapfrog Enterprises, Inc. User created interactive interface

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20020041290A1 (en) * 2000-10-06 2002-04-11 International Business Machines Corporation Extending the GUI desktop/paper metaphor to incorporate physical paper input
US6608618B2 (en) * 2001-06-20 2003-08-19 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US20030218604A1 (en) * 2001-06-20 2003-11-27 Leapfrog Enterprises, Inc. Interactive apparatus using print media
US20020197589A1 (en) * 2001-06-26 2002-12-26 Leapfrog Enterprises, Inc. Interactive educational apparatus with number array
US20030087219A1 (en) * 2001-07-18 2003-05-08 Berger Lawrence J. System and method for real-time observation assessment
US20030162162A1 (en) * 2002-02-06 2003-08-28 Leapfrog Enterprises, Inc. Write on interactive apparatus and method
US20040229195A1 (en) * 2003-03-18 2004-11-18 Leapfrog Enterprises, Inc. Scanning apparatus

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2854046A3 (en) * 2013-07-25 2015-04-29 Brother Kogyo Kabushiki Kaisha Paper medium, input device, and a non-transitory computer-readable medium for input device
EP2854011A3 (en) * 2013-09-17 2015-04-29 Brother Kogyo Kabushiki Kaisha Paper medium, input device, and computer-readable medium storing computer-readable instructions for input device

Also Published As

Publication number Publication date
KR100806240B1 (en) 2008-02-22
JP2006195995A (en) 2006-07-27
KR20070104309A (en) 2007-10-25
EP1684160A1 (en) 2006-07-26
WO2006076079A3 (en) 2007-02-01
CN1855013A (en) 2006-11-01
KR20060082428A (en) 2006-07-18
CA2532447A1 (en) 2006-07-12
US20060078866A1 (en) 2006-04-13

Similar Documents

Publication Publication Date Title
EP1684160A1 (en) System and method for identifying termination of data entry
US7853193B2 (en) Method and device for audibly instructing a user to interact with a function
KR100847851B1 (en) Device user interface through recognized text and bounded areas
US7831933B2 (en) Method and system for implementing a user interface for a device employing written graphical elements
KR100815534B1 (en) Providing a user interface having interactive elements on a writable surface
KR100814052B1 (en) A mehod and device for associating a user writing with a user-writable element
EP1780628A1 (en) Computer implemented user interface
WO2007136846A2 (en) Recording and playback of voice messages associated with a surface
US20090236152A1 (en) System and method for data organization by identification
EP1681623A1 (en) Device user interface through recognized text and bounded areas
WO2006076118A2 (en) Interactive device and method
CA2535505A1 (en) Computer system and method for audibly instructing a user

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 05825849

Country of ref document: EP

Kind code of ref document: A2