CA2221427A1 - Information input method and apparatus - Google Patents

Information input method and apparatus

Info

Publication number
CA2221427A1
CA2221427A1
Authority
CA
Canada
Prior art keywords
information
input
pattern
image recognition
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
CA002221427A
Other languages
French (fr)
Inventor
Motohiro Kobayashi
Mitsuhiro Miyazaki
Hiroshi Usuda
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Sony Corp
Original Assignee
Sony Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Sony Corp filed Critical Sony Corp
Publication of CA2221427A1

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/041Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means
    • G06F3/042Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means
    • G06F3/0425Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected
    • G06F3/0426Digitisers, e.g. for touch screens or touch pads, characterised by the transducing means by opto-electronic means using a single imaging device like a video camera for tracking the absolute position of a single or a plurality of objects with respect to an imaged reference surface, e.g. video camera imaging a display or a projection screen, a table or a wall surface, on which a computer generated image is displayed or projected tracking fingers with respect to a virtual keyboard projected or printed on the surface

Landscapes

  • Engineering & Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Human Computer Interaction (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • User Interface Of Digital Computer (AREA)
  • Position Input By Displaying (AREA)
  • Image Processing (AREA)
  • Digital Computer Display Output (AREA)
  • Information Transfer Between Computers (AREA)
  • Computer And Data Communications (AREA)

Abstract

An information input method and interface, which can easily be operated without direct awareness of hardware, are provided. Image recognition is performed with an image recognition unit on image information obtained from imaging a target pattern on an object surface and an access indication pattern created by a user. The input patterns are imaged with an image pickup unit. The target pattern is associated with corresponding information stored in a storage unit. An access indication input which specifies a subset of the information is generated by performing image recognition on the access indication pattern. A control operation corresponding to the access indication input is then performed by a control unit.

Description

CA 02221427 1997-11-19

SONY-P4100 PATENT

INFORMATION INPUT METHOD AND APPARATUS

INVENTORS: Mitsuhiro Miyazaki, Hiroshi Usuda, Motohiro Kobayashi

RELATED APPLICATIONS

The present application claims foreign priority under 35 U.S.C. Section 119(a)-(d) based on Japanese Patent Application P08-315290 filed in the Japanese Patent Office on November 26, 1996.
BACKGROUND OF THE INVENTION

Field of the Invention

The present invention relates to a method and apparatus for entering data and/or control information into an electronic device. More particularly, the present invention relates to a system and method for employing image recognition to enter information into a computer or other data and/or control system.

Description of the Related Art

Various new multimedia devices that combine the functions of existing audio-visual (AV), information, and communication devices have been proposed which enable input/output of multimedia information. Examples of such new devices include portable information terminals with communications capabilities, camcorders with communications capabilities, and personal computers with both communication capabilities and AV functions. In addition, systems designed to allow these new multimedia devices to communicate with each other via different types of networks are under development. Some of the different types of networks that are being used for inter-device communication include Ethernet local area networks (LANs), Token Ring LANs, ATM wide area networks (WANs), wire communication networks like public telephone networks, and wireless communication networks such as infrared communication systems, cellular telephone systems, and satellite communication systems.
Many of these new multimedia devices use the same conventional interfaces for inputting information as the existing AV, information, and communication devices. For example, both the conventional and new multimedia devices use keyboards, mice, touch panels, dedicated controllers and microphones.
As depicted in Fig. 15A, bar codes are another example of a conventional means used for entering information into multimedia systems. A bar code typically includes a pattern of lines of varying thicknesses which represent binary codes. They are typically used for the machine identification of goods and other articles, or to specify a target object. Bar codes can be one or two dimensional. In other words, bar codes are arranged so that they can be scanned by a bar code reader in either one or two different scanning directions. Fig. 15B depicts a two dimensional bar code.
Unfortunately, operating conventional input interfaces is awkward and not suited to the human senses. This is not to say that existing devices themselves are not ergonomic, but rather that their method of operation is not intuitive and they typically require special training and practice before they can be used efficiently. For example, a keyboard can have many kinds of adjustments and special comfort features, but until the user is taught and then practices touch-typing, even the most ergonomic keyboard is very difficult to use efficiently. The same is true for mice, touch panels, dedicated controllers and microphones used in conventional multimedia devices. Such input devices simply do not suit the human senses. In other words, users who are not accustomed to operating such devices do not intuitively understand the associated operating methods, and need special knowledge and training to become skilled in operating these devices.
In addition to not being easily recognized or read by humans, systems that use bar codes suffer from the added problem that there are only a finite number of codes for a given bar pattern display area. The more different articles that need to be identified by such a system, the more unique patterns are required. Eventually, either the number of bar code patterns has to be increased or the number of articles that can be identified must be limited. Increasing the number of patterns necessitates increasing the area of the bar pattern display on the article. This is not a preferred solution. For example, in the case of a product, the surface area of the article is finite and usually used for displaying other information. Further, adding more unique patterns by adding additional bars, which typically each only add two bits worth of information, increases the time required to recognize the bar code.
Thus, it is an object of the present invention to provide an information input apparatus, and an information input method, which can be easily, efficiently, and intuitively operated without the user having to learn how to use or even become fully aware of specialized hardware.
It is a further object of the present invention to provide an information input apparatus, and an information input method, which enables a user to specify, associate, and reference information from among a large volume of data through a simple operation.

SUMMARY OF THE INVENTION

The above and other objects of the invention are achieved by the present invention of an information input method and apparatus. The information input method according to the present invention includes the steps of recognizing a predefined target pattern on an object's surface, associating the recognized pattern with information, receiving an access indication input signal that is generated by image recognition of an access indication input pattern and that corresponds to the associated information, and performing a control operation corresponding to the access indication input signal.
The information input apparatus of the present invention includes image pickup means for imaging a target pattern and an access indication pattern, image recognition means for recognizing the target pattern and the access indication pattern imaged by the image pickup means, storage means for storing information associated with the target pattern, authentication processing means for associating the target pattern recognized by the image recognition means with information, and control means for receiving an access indication input signal based on the information and for performing a control operation based on the access indication input signal.

These and other features and advantages of the present invention will be understood upon consideration of the following detailed description of the invention and the accompanying drawings.

BRIEF DESCRIPTION OF THE DRAWINGS

Figure 1 is a functional block diagram depicting an embodiment of an information input apparatus according to the present invention.
Figure 2 is a block diagram depicting the structure of a multimedia system within which the information input apparatus of Fig. 1 is applied.
Figure 3 is a block diagram depicting an embodiment of a hardware structure of a portion of the information input apparatus of Fig. 1.
Figure 4 is a plan view illustration depicting an example arrangement of an embodiment of an input sheet of the information input apparatus of Fig. 1.
Figure 5A is an illustration depicting an example of code value assignments of rotated icon codes that can be used on the input sheet of Fig. 4.
Figure 5B is an illustration depicting a first example of rotated icon codes that can be used on the input sheet of Fig. 4.
Figure 6 is a plan view illustration depicting an example embodiment of an input sheet of Fig. 4 for controlling an electronic device having conventional video cassette recorder controls.
Figure 7 is an illustration depicting an exemplary output display of an output unit of an embodiment of an information input apparatus according to the present invention using the input sheet of Fig. 6.
Figure 8 is a flowchart depicting an embodiment of an identification processing method of recognizing a target with an image recognition unit of an embodiment of an information input apparatus according to the present invention.
Figure 9 is an illustration depicting the amount of shift detected during position correction by the image recognition unit of an embodiment of an information input apparatus according to the present invention.

Figure 10 is an illustration depicting the center of mass and a circumscribed rectangle of each icon extracted by rotated icon extraction processing in an embodiment of an identification processing method of recognizing a target according to the present invention.
Figure 11 is an illustration depicting a part of a pre-defined rotated icon code group used in rotated icon code determination processing in an embodiment of an identification processing method of recognizing a target according to the present invention.
Figure 12 is a flowchart depicting an embodiment of an input/selection indication processing method in an embodiment of an image recognition unit of an embodiment of an information input apparatus of the present invention.
Figure 13 is an illustration depicting an access indication input pattern giving an input/selection indication extracted by input/selection indication extraction processing performed by an image recognition unit of an embodiment of an information input apparatus of the present invention.
Figure 14 is an illustration depicting a second example of rotated icons described on the input sheet of Fig. 4.
Figure 15A is an illustration depicting different formats of one dimensional bar codes used with input systems of the prior art.
Figure 15B is an illustration depicting a two dimensional bar code used with input systems of the prior art.

DETAILED DESCRIPTION OF THE PREFERRED EMBODIMENTS
Turning to Fig. 1, a preferred embodiment of an information input apparatus 100 according to the present invention includes an image pickup unit 101, an image recognition unit 102, a control unit 103, a storage unit 104 and an output unit 105. The control unit 103 is connected to an information distribution unit 106 via communication link 108 and to an external device 107 via communication line 109. Communication link 108 can be, for example, a network, while communication line 109 can be, for example, a home bus.
Fig. 2 depicts an application of the information input apparatus 100 of Fig. 1. In other words, Fig. 2 illustrates the present invention as it could be used in, for example, a multimedia system. Referring to both Figs. 1 and 2, the image pickup unit 101 images a recognition target pattern appended onto an object surface, such as paper, and an access indication pattern made by a user with a finger or any other similar pointer. Specifically, the image pickup unit 101 is preferably embodied as a sensitive digital camera such as a charge-coupled-device (CCD) video camera 201 as shown in Fig. 2 or a Quantum Well Infrared Photodetector (QWIP) video camera. The CCD video camera 201 is preferably located on top of a viewer 205A as in the multimedia system of Fig. 2. The CCD video camera 201 is focused to capture images of an input sheet 209 set on a medium base 208.
The medium base 208 is adjacent to the viewer 205A which is disposed in an upright position so that the CCD video camera 201, located on top of the viewer 205A, can be focused on input sheet 209 on the medium base 208.
In such an embodiment, the input sheet 209 serves as an information selection medium which is segmented into different category areas. The input selection sheet 209 can be made, for example, of paper or a thin plastic sheet. Turning to Fig. 4, an embodiment of an arrangement of the input sheet 209 of Fig. 2 is shown. The embodiment depicted in Fig. 4 illustrates an example input sheet 209 that includes five different category areas labeled A, B, C, D and E. Note that a particular category can include multiple distinct regions on the input sheet 209.
In the first category area A of the input sheet 209, an authentication icon is provided that can be used for authenticating that a particular plastic sheet or piece of paper is an input sheet 209 that will be permitted to be used with the information input apparatus 100 of the present invention. In other words, if the input sheet 209 includes an authentication icon in category area A that the information input apparatus 100 is able to authenticate, then the system will enable operation with that particular input sheet 209. Otherwise, the system will be disabled from operating with that particular input sheet 209. The combination of an authentication icon located in a predefined category area is referred to as a recognition target pattern.
In the second category area B of a preferred embodiment of an input sheet 209 there is an identification icon (or icons) for associating a particular sheet with stored or accessible electronic information. The identification icon is also referred to as a recognition target pattern. In the third category area C, an input sheet 209 position correction mark is provided. In the fourth category area D, a graphic for a user to indicate a selection is provided. Finally, in the fifth category area E, some form of a listing of the information content of the system is provided. The structure of an input sheet 209 and the arrangement of the graphics it provides is described in greater detail below.
The image recognition unit 102 of Fig. 1 performs image recognition on image information provided by the image pickup unit 101. In the embodiment of Fig. 2, the CCD video camera 201 generates image information from imaging the input sheet 209.
The image recognition unit 102 is embodied within a control box 202 in the multimedia system shown in Fig. 2. The control box 202 with the image recognition unit 102 receives the image information output from the CCD video camera 201. The image recognition unit 102 performs identification and authentication processing of the recognition target patterns contained within the image information received from the CCD video camera 201. Based on the identification and authentication processing, the image recognition unit 102 outputs an identification result and an authentication result to the control unit 103.
As will be described below in detail, once an association between an input sheet 209 and electronic information is made as a result of recognizing the recognition target pattern, the image recognition unit 102 proceeds to recognize an input/selection indication made by the user and then outputs a recognition result signal to the control unit 103.
Along with the image recognition unit 102, the control unit 103 is also provided within the control box 202 of the multimedia system shown in Fig. 2. The control unit 103 controls transmission and reception of information between the various other components. Based on the recognition result of the recognition target pattern by the image recognition unit 102, the control unit 103 determines whether the recognition target pattern is associated with information that is stored locally in the storage unit 104. If so, the control unit 103 accesses the associated locally stored information.
If the recognition target pattern is associated with information that is not stored locally, the recognition result is transmitted to the information distribution unit 106. In this case, the associated information is stored in the information distribution unit 106 or in an information processing unit located on an external communication network connected to the information distribution unit 106. The information distribution unit 106 accesses the associated remotely stored information.
In either case, once the associated information is located, the control unit 103 transfers a copy of the associated information into a temporary storage unit (not pictured) provided within the control unit 103. The associated information preferably includes several types of data. For example, information associated with an input sheet 209 preferably includes display information for providing the user feedback on the output unit 105, control information for controlling an external device 107, link information to locate other related information, and command configuration information for interpreting and using the input sheet 209.
Once the control unit 103 has a copy of the associated information in the temporary storage unit, authentication is performed for each input/selection indication image received from the image recognition unit 102. As the input/selection indication images are authenticated, a display screen on the output unit 105 is updated and control functions of the external device 107 are performed. In other words, representations indicating the functioning of these different operations and information distribution results are displayed on the output unit 105 as they are performed.
The above-listed types of information that are associated with different input sheets 209, or different areas of an input sheet 209, will now be explained in further detail.
The display information includes characters, images, layout information, sound information and the like. Based on this display information, the control unit 103 creates displays on the output unit 105 and controls sound generation.
The control information includes control commands for controlling an external device 107 in response to the user making an input/selection indication on an input sheet 209. The control unit 103 controls the external device 107 based on the control information associated with the input sheet 209. For example, if the external device 107 is a video cassette recorder (VCR), the control information includes reproduction (play), stop, fast forward, rewind, and pause control information for controlling the operation of a VCR. When, for example, the user points to a graphic on the input sheet 209 that represents the play control information, the image pickup unit 101 sends image information to the image recognition unit 102 which, in response, recognizes the play input/selection indication image and sends a play command to the VCR.
The link information includes addresses for reading out other information when the user makes an input/selection indication. The links can point to information stored in the storage unit 104, the information distribution unit 106, or an external information processing unit connected to the communication network. Based on the link information, the control unit 103 accesses other additional information. For example, a Uniform Resource Locator (URL), as used on the Internet, may be used as link information in the present invention.
The command configuration information includes information indicating which functions are to be performed when a graphic within the D or E category areas of the information input sheet 209 is selected by the user. The process of a user selecting a graphic on the input sheet 209 is also referred to herein as a user input/selection indication. The command configuration information is in the form of a physical position on the input sheet 209 as selected by the user and a next processing command corresponding to that selected position. For example, the command configuration information could be a position coordinate and a processing command for executing the above-described control information or link information when the appropriate position coordinate is selected by the user.
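For illustration only, the following minimal Python sketch shows one way such command configuration information could be represented: each entry pairs a rectangular region of the position-corrected input sheet 209 with a processing command. The region coordinates and command names are invented for this example and are not taken from the disclosure.

```python
# Hypothetical command configuration: each rectangular sheet region
# (x0, y0, x1, y1) maps to the processing command executed when the user
# points inside that region. Coordinates and names are illustrative only.
from typing import Optional

COMMAND_CONFIGURATION = {
    (10, 200, 60, 240): "vcr.play",
    (70, 200, 120, 240): "vcr.stop",
    (130, 200, 180, 240): "vcr.pause",
    (10, 250, 60, 290): "vcr.rewind",
    (70, 250, 120, 290): "vcr.fast_forward",
    (130, 250, 180, 290): "vcr.eject",
}

def lookup_command(x: int, y: int) -> Optional[str]:
    """Return the processing command for a selected sheet position, if any."""
    for (x0, y0, x1, y1), command in COMMAND_CONFIGURATION.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return command
    return None

# Pointing at (85, 220) falls inside the second region and selects "vcr.stop".
assert lookup_command(85, 220) == "vcr.stop"
```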
In addition to the control unit 103 and the image recognition unit 102, the storage unit 104 is also provided within the control box 202 in the multimedia system shown in Fig. 2. The storage unit 104 stores information associated with the recognition target pattern. The information stored in the storage unit 104 includes the information for controlling the external device 107. This information can be retrieved from an external information processing unit connected to the network via the information distribution unit 106 or, for example, from a remote information source via a satellite communications system. By locating the storage unit 104 within the control box 202, the traffic on the network may be reduced and the response time to user input/selection indications may be shortened.
The output unit 105 preferably includes a display, a speaker, and/or a printer. Thus, the output unit 105 can provide sound and a visual display of both the function of the operation selected by the user and the information distribution results. Fig. 7 is an example of sample information output to an output unit 105 display screen. On the left side of the display screen 700 there is a graphic 710 illustrating the function of the operation selected by the user. Note that VCR controls are represented and the play command graphic 720 (a right pointing arrow) is darkened. This indicates the user selection of the play command. On the right side of the display screen 700 there is an image 730 output from the external device 107 which represents the information distribution results. Image outputs of various other devices under control of the system may also be synthesized and outputted to the output unit 105 as shown in Fig. 7. The output unit 105 of the multimedia system shown in Fig. 2 is embodied as a viewer 205A and a video scan converter 205B.
The information distribution unit 106 can be connected to the control unit 103 via wire communication networks such as Ethernet LANs, Token Ring LANs, or ATM WANs. Alternatively, wireless communication networks such as infrared, PHS telephone, digital cellular telephone, or satellite communication systems can be employed.
When the control unit 103 determines that the recognition result from the image recognition unit 102 requires information associated with a particular input sheet 209 that is not available locally in the storage unit 104, the information distribution unit 106 retrieves the associated information from an external information processing unit connected to the system via the communication network. In the multimedia system shown in Fig. 2, the information distribution unit 106 is embodied as a programmed workstation 206.
The external device 107 can be any number of different electronic devices that can be controlled by the control unit 103 in response to the input/selection indication. For example, in Fig. 2, the external device 107 includes an AV controller device 207A, a VCR 207B, a modem 207C, a telephone set 207D, and a camera 207E. The external device 107 could also include many other types of electronic devices, such as computers and game machines. The external device 107 can be connected to the control unit 103 via wire communication lines or wireless communication systems.
Fig. 3 depicts an exemplary hardware embodiment of the control box 202 of Fig. 2.
The control box 202 of Fig. 2 includes a central processing unit (CPU) 301, an image processing unit 302, a camera interface 303, a display interface 304, a network interface 305, a home network interface 306, a read only memory (ROM) 307, a random access memory (RAM) 308, and a hard disk drive (HDD) 309. All of the components of the control box 202 can be interconnected via a bus 310. Alternatively, these components can be connected to each other via dedicated communications paths. For example, the camera interface 303 is shown connected directly to the image processing unit 302 by a dedicated link. Likewise, the CPU can have a second separate bus for connecting to the ROM 307 and/or RAM 308.
The CPU 301 is adapted to control the control box 202 in accordance with a system program stored in the ROM 307 or a program developed in the RAM 308. The CPU 301 provides some of the functions of the image recognition unit 102 and some of the functions of the control unit 103 depicted in Fig. 1. In other words, both the image recognition unit 102 and the control unit 103 can be in part embodied as a CPU 301.
Programs and data necessary for the CPU 301 to perform various types of processing are stored in the ROM 307. The ROM 307 provides some of the functions of the control unit 103 of Fig. 1. The RAM 308 is adapted to develop and temporarily store programs and data necessary for the CPU 301 to perform various types of processing. The RAM 308 also provides some of the functions of the control unit 103 of Fig. 1. In other words, the control unit 103 can be in part embodied as a ROM 307 and a RAM 308 together with a CPU 301. The HDD 309 provides the functions of the storage unit 104 of Fig. 1 and thus, the storage unit 104 can be embodied as a HDD 309.
The image processing unit 302 receives image information from the CCD video camera 201 via the camera interface 303. Various types of image processing are performed by the image processing unit 302, such as image recognition processing. The image processing unit 302 provides some of the functions of the image recognition unit 102 of Fig. 1. The camera interface 303 receives image information from the CCD video camera 201 and then converts the image information to a signal format compatible with the image processing unit 302. The camera interface 303 then outputs the converted image information to the image processing unit 302. Thus, the image recognition unit 102 can be embodied in part as a camera interface 303 and an image processing unit 302 together with a CPU 301.

The display interface 304 receives display data processed by the CPU 301 and the image processing unit 302, converts the display data to signals compatible with the viewer 205A, and then outputs the converted signals to the viewer 205A. The display interface 304 and the viewer 205A provide some of the functions of the output unit 105 of Fig. 1. Thus, the output unit 105 can be embodied in part as a display interface 304 and a viewer 205A.
The network interface 305 provides a connection to the workstation 206, thereby enabling access to an external network. The network interface 305 and the workstation 206 provide some of the functions of the communication link 108 and the information distribution unit 106 of Fig. 1. Thus, the communication link 108 and the information distribution unit 106 can be embodied in part as a network interface 305 and a workstation 206.
The home network interface 306 provides a connection to the external device 107. The CPU 301 is thus able to control the external units 207A-207E via the home network interface 306. The home network interface 306 and the external units 207A-207E provide some of the functions of the communication line 109 and the external device 107 of Fig. 1. Thus, the communication line 109 and the external device 107 can be embodied in part as a home network interface 306 and a plurality of external units 207A-207E such as VCRs, modems, video cameras, etc.
The above described constituent components 301-309 are connected to one another via a bus 310. Together, they provide the various functions of the information input apparatus as described above with reference to Fig. 1.
The input sheet 209 imaged by the information input apparatus as described with reference to Figs. 1 to 3 above is explained below in greater detail with reference to Fig. 4. As described above, the input sheet 209 is preferably segmented into the five category areas A, B, C, D and E.
In the first category area A of the input sheet 209, an authentication icon is provided for authenticating that a particular plastic sheet or piece of paper is an input sheet 209 that will be permitted to be used with the information input apparatus 100 of the present invention. As explained above with reference to Figs. 1, 2 and 3, the authentication icon is provided as a recognition target pattern. The authentication icon in the category area A is imaged by the CCD video camera 201 and compared with comparative icon information stored in the ROM 307 by the CPU 301. When the CPU 301 is able to find a match between the authentication icon of the input sheet 209 and the comparative icon information stored in the ROM 307, operation proceeds to identification processing using the category area B. If no match exists, an error message is displayed on the viewer 205A.
In the second category area B, an identification icon (or icons) for associating the particular sheet with electronic information is provided as a recognition target pattern.
The identification icon has a directionality and a predefined number of recognizably distinct orientations. For example, in the Fig. 4 embodiment of the input sheet 209, an icon resembling a key is used. The key shape has a narrow end and a wide end that gives it directionality and allows one to identify its orientation. Further, different orientations of the identification icon represent different values, codes or meanings. In the embodiment pictured in Fig. 4, each identification icon that serves as part of the recognition target pattern in the category area B has eight distinct orientations. Thus, each identification icon can be oriented such that eight distinct code values can be represented based solely on the orientation of the icon. Fig. 5A illustrates an example of eight distinct orientations for the key shaped icon with one of eight distinct code values assigned to each orientation.
Through the use of just a few additional distinctly oriented icons together in a sequence, many more distinct codes can be represented. For example, using a sequence of four identification icons, each having eight distinct orientations, 4096 different code values can be represented. The number of values that can be represented by four icons with eight orientations is computed as follows:

(8 orientations)icon1 × (8 orientations)icon2 × (8 orientations)icon3 × (8 orientations)icon4 = 8^4 = 4096 values

Using the example orientation value assignments depicted in Fig. 5A, the ordered sequence of identification icons 500 shown in Fig. 5B represents a code value of "2574". That is to say, the first key icon 510 represents "2", the second key icon 520 represents "5", the third key icon 530 represents "7", and the fourth key icon 540 represents "4".
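For illustration, the base-8 arithmetic behind this scheme can be sketched in a few lines of Python. The orientation-to-digit mapping below is an assumed stand-in for the assignment of Fig. 5A, which is not reproduced here; only the eight-orientations-per-icon structure and the "2574" example are taken from the text.

```python
# Each icon orientation carries one octal digit; this particular mapping is
# an illustrative assumption, not the actual assignment of Fig. 5A.
ORIENTATION_DIGIT = {
    "up": 0, "upper-right": 1, "right": 2, "lower-right": 3,
    "down": 4, "lower-left": 5, "left": 6, "upper-left": 7,
}

def sheet_code(orientations):
    """Concatenate one octal digit per icon, reading the icons left to right."""
    return "".join(str(ORIENTATION_DIGIT[o]) for o in orientations)

# Four icons with eight orientations each yield 8**4 == 4096 distinct codes.
assert len(ORIENTATION_DIGIT) ** 4 == 4096
# A sequence whose icons carry the digits 2, 5, 7 and 4 encodes "2574".
assert sheet_code(["right", "lower-left", "upper-left", "down"]) == "2574"
```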
In the third category area C, position correction marks are provided. The position correction marks are used for performing position correction of the image picked up by the CCD video camera 201.
In the fourth category area D, graphics are provided for the user to perform a selection indication. By selecting a graphic in area D, the user can select information which is displayed on the viewer 205A but is not displayed in the category area E of the input sheet 209. That is, the category area D is used to make a selection indication whenever the selection information provided in the category area E of the input sheet 209 differs from the selection information displayed on the viewer 205A. This allows the information input apparatus 100 to dynamically add input selection options not present in category area E, via the viewer 205A.
For example, as shown in Fig. 4, when up, down, left, and right arrowhead graphics are provided in category area D, the user can choose an arbitrary arrowhead graphic in category area D to perform the shifting of a cursor and/or the designation of a menu selection on a menu screen displayed on the viewer 205A. Although the graphics provided in the category area D can be embodied as up, down, left, and/or right arrow figures to provide cursor movement functions, the graphics are not necessarily limited to such arrowhead shaped figures and the functions are not necessarily limited to such cursor movement functions.
For example, if the input sheet is to be used as a common medium for controlling both a VCR and an audio system, graphics for controlling functions common to both external devices are provided. VCRs and audio systems frequently have functions in common such as playback, stop, and record functions. Thus, by associating graphics in the category area D with command configuration information which includes embedded processing commands and position information, execution of the embedded processing commands is enabled. In the above-described example, the embedded processing commands include: upward, downward, leftward, and rightward cursor shift commands, and play, stop, and record commands. But for example, for a compact disc (CD) player, similar graphics could be used for commands such as forward within track, forward to next track, reverse within track, reverse to previous track, eject, stop, and change information display mode. Recall that command configuration information is copied into the temporary storage unit within the control unit 103 once an information sheet 209 is authenticated and identified.
The content of the substantive information available via the information input apparatus 100 is provided in the fifth category area E. This contents listing can be displayed in many different ways. For example, it can be displayed in the form of a table that enumerates general topics arranged in a subject matter based order, or as an image map analogous to a hyper-text transfer protocol (HTTP) image map used on World Wide Web (WWW) pages on the Internet, and/or as a detailed index that itemizes each and every bit of substantive information available in alphabetical order. By viewing the information described in this category area E, the user can learn what information is accessible with the particular input sheet 209 currently imaged by the CCD video camera 201.
In addition, once a particular input sheet 209 is recognized by the information input apparatus 100 shown in Fig. 1, the user may select from the contents information provided in this category area E by simply pointing to the desired item of information within the contents information. The step of pointing provides an indication selection image pattern that can be recognized by the interface system 100. The indication of a particular selection is then provided to the control unit 103 which then can perform some predetermined processing.
As an example, consider the input sheet 209 of Fig. 6. The category area E includes a graphic image map of VCR control buttons. The graphics 600 that represent the six different control buttons provide an intuitive set of controls that correspond to the VCR functions of play 610, stop 620, pause 630, rewind 640, fast forward 650, and eject 660. This example arrangement allows the user to easily operate the VCR using the six control buttons. Although the information contents described in the category area E have been explained using the control buttons for a VCR, one skilled in the art would realize that many different devices can be controlled using an appropriate input sheet 209 and, in particular, that the information contents are not necessarily limited to such VCR control buttons.
As a further example, the contents information in the category area E could be in the form of a series of menus. Thus, in response to the user pointing to a particular menu in the category area E using a finger, a subsequent display of menu items that correspond to the selected menu is displayed on the viewer 205A. Thus, by associating menus in the category area E with link information and command configuration information which includes embedded processing commands and position information, the display of link information and execution of the embedded processing commands is enabled. In this example a subsequent list of menu items is displayed on the viewer 205A in response to the user pointing to the location of the related initial menu on the input sheet 209. Once again, associated link information and command configuration information are copied into the temporary storage unit within the control unit 103 once an information sheet 209 is authenticated and identified.
In a multimedia system structured as described above, the image recognition unit 102 carries out identification processing for identifying the type of input sheet 209 being used. This happens once the input sheet 209 has been authenticated by authentication processing of the image of the input sheet 209 received from the CCD video camera 201.
The authentication processing is carried out by first performing pre-processing, such as elimination of noise, variable density processing, and adjustment of threshold value. Next, position correction is performed. The system then proceeds to extract the center of mass and contour of the authentication icon used as the recognition target pattern in the category area A at the upper center of the input sheet 209. Finally, the system attempts to match the image information with a stored authentication icon pattern.
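The disclosure does not specify how this match is scored. As one plausible sketch, the following Python compares the extracted category area A patch against the stored icon using a normalized correlation; the 0.9 acceptance threshold is an arbitrary assumption.

```python
import numpy as np

# A sketch of the final matching step of authentication processing. patch is
# the pre-processed, position-corrected image of category area A; template is
# the stored authentication icon (both assumed to have the same shape).
# Normalized correlation is an assumed stand-in for whatever matching the
# recognition unit actually uses.

def authenticates(patch: np.ndarray, template: np.ndarray,
                  min_score: float = 0.9) -> bool:
    a = patch.astype(float) - patch.mean()
    b = template.astype(float) - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0.0:
        return False          # a blank patch can never authenticate
    return float((a * b).sum() / denom) >= min_score
```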
Next, identification processing takes place. For example, as shown in the flowchart of Fig. 8, pre-processing (S801) is first performed, such as elimination of noise, variable density processing, and adjustment of a threshold value, with respect to the image information obtained as imaging output of the CCD video camera 201 focused on the input sheet 209. Then, position correction processing (S802) is performed by first detecting and measuring a positional shift based on the image information of the position correction mark provided in the category area C, preferably located at the four corners of the input sheet 209. The position correction processing (S802) then performs a transformation of the coordinate system corresponding to the amount of the positional shift. Next, extraction processing (S803) of the identification icon provided as a recognition target pattern in the category area B of the input sheet 209 is carried out and determination processing (S804) of a rotated icon code is performed.

In the position correction processing (S802), the positional shift from a reference position is detected based on the image information obtained as imaging output of the CCD video camera 201 focused on the category area C of the input sheet 209. As shown in Fig. 9, the system overlays a correction pattern on the position correction marks and detects the direction and magnitude of any position error. For example, coincidence 900 occurs if the sheet is properly aligned, shift to upper left 910 occurs if the input sheet 209 is too high and off to the left, and shift to upper right 920 occurs if the input sheet 209 is too high and off to the right. Unless the coincidence 900 pattern is detected, position correction is then carried out by mathematically transforming the coordinate system of the image recognition in accordance with the direction and magnitude of the positional shift.
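A minimal sketch of this correction, assuming the simplest case in which the detected shift is a pure translation (as in the Fig. 9 examples); the reference mark coordinates below are invented for illustration, and a full implementation would transform the coordinate system rather than only subtract an offset.

```python
import numpy as np

# Hypothetical reference positions of the four corner marks (x, y in pixels).
REFERENCE_MARKS = np.array([[20, 20], [620, 20], [20, 460], [620, 460]], float)

def correction_offset(detected_marks: np.ndarray) -> np.ndarray:
    """Mean (dx, dy) displacement of the detected marks from their references."""
    return (detected_marks - REFERENCE_MARKS).mean(axis=0)

def correct(points: np.ndarray, offset: np.ndarray) -> np.ndarray:
    """Transform image coordinates back into the sheet's reference frame."""
    return points - offset

detected = REFERENCE_MARKS + [5.0, -3.0]           # sheet shifted right and up
offset = correction_offset(detected)               # -> [ 5., -3.]
print(correct(np.array([[105.0, 97.0]]), offset))  # -> [[100., 100.]]
```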
In the extraction processing of the identification icon (S803), the center of mass of variable density level and a circumscribed rectangle are found for each icon from the image information obtained as imaging output of the CCD video camera 201 focused on the category area B of the input sheet 209, as shown in Fig. 10. In Fig. 10, the center of mass of each icon as determined by the identification icon extraction processing (S803) is indicated by a white spot and the circumscribed rectangle is indicated by a broken line.
In the rotated icon code determination processing (S804), one of eight different orientations of the icon is determined by referring to a direction characteristic table of the rotated icon. The direction characteristic table defines the eight different orientations in terms of two parameters: the center of mass and the aspect ratio of the circumscribed rectangle found in the identification icon extraction processing (S803). The direction characteristic table is expressed as shown below in Table 1. It is predetermined based on the conditions indicated by characteristics of each icon orientation. For example, the second row of the table is read as: "if the aspect ratio of the circumscribed rectangle is 2.5 or greater and the position of the center of mass is in the upper half of the circumscribed rectangle, the rotated icon is oriented downward." Based on the assignment of the icon's orientation with a code value as shown in Fig. 5A, the rotated icon is converted to a code. This processing is repeated for each of the four icons so as to determine a sequence of four codes.

Table 1

Orientation of Icon | Characteristics
upward | aspect ratio 5:2, center of mass located in lower half of rectangle
downward | aspect ratio 5:2, center of mass located in upper half of rectangle
rightward | aspect ratio 2:5, center of mass located in left half of rectangle
leftward | aspect ratio 2:5, center of mass located in right half of rectangle
upper rightward | aspect ratio 1:1, center of mass located in lower left area of rectangle
lower rightward | aspect ratio 1:1, center of mass located in upper left area of rectangle
upper leftward | aspect ratio 1:1, center of mass located in lower right area of rectangle
lower leftward | aspect ratio 1:1, center of mass located in upper right area of rectangle

Next, the rotated icon code sequence found from the four rotated icons is compared with a pre-defined rotated icon code group as shown in Fig. 11. The best match between the rotated icon code sequence and one of the pre-defined code groups is selected as the rotated icon code of the input sheet 209. The best match is determined by finding the pre-defined rotated icon code group with the smallest sum of differences between the respective parts of the pre-defined rotated icon code group and the rotated icon code sequence. If the sum of differences of all the pre-defined rotated icon code groups exceeds a threshold value, the recognition target is not recognized as a known code and is judged as unrecognizable.
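As one way to put Table 1 and the best-match rule into code, the Python sketch below classifies an icon's orientation from the circumscribed rectangle's aspect ratio and normalized center of mass, then matches the resulting digit sequence against pre-defined code groups by smallest sum of differences. The exact aspect-ratio cutoffs, the threshold of 4, and the candidate code groups are assumptions.

```python
# Orientation lookup per Table 1. aspect = height / width of the
# circumscribed rectangle (so 5:2 reads as 2.5); (cx, cy) is the center of
# mass normalized to [0, 1] within that rectangle, with y increasing downward.

def classify_orientation(aspect: float, cx: float, cy: float) -> str:
    if aspect >= 2.5:                                # tall rectangle (5:2)
        return "upward" if cy > 0.5 else "downward"
    if aspect <= 0.4:                                # wide rectangle (2:5)
        return "rightward" if cx < 0.5 else "leftward"
    # roughly square (1:1): the quadrant of the center of mass decides
    if cy > 0.5:                                     # lower half
        return "upper rightward" if cx < 0.5 else "upper leftward"
    return "lower rightward" if cx < 0.5 else "lower leftward"

def best_code_match(observed, code_groups, threshold=4):
    """Smallest sum of per-icon differences wins; None means unrecognizable."""
    def distance(group):
        return sum(abs(a - b) for a, b in zip(observed, group))
    best = min(code_groups, key=distance)
    return best if distance(best) <= threshold else None

# A tall rectangle whose center of mass sits in its lower half reads "upward".
assert classify_orientation(2.5, 0.5, 0.7) == "upward"
# The observed digit sequence (2, 5, 7, 4) matches its code group exactly.
assert best_code_match((2, 5, 7, 4), [(0, 1, 2, 3), (2, 5, 7, 4)]) == (2, 5, 7, 4)
```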
As described above, the control unit 103 determines whether or not an association between the recognition result of the recognition target by the image recognition unit 102 and some of the information stored in the storage unit 104 can be made. If an association can be made, that information is accessed. That is, if association with information stored in the apparatus itself can be made, the associated information is immediately accessed. If there is no information associated with the recognition result from the image recognition unit 102 that is currently stored in the storage unit 104, the recognition result is transmitted to the information distribution unit 106.

In this case, the associated information is stored in the information distribution unit 106 or the information processing unit existing in the external communication network connected to the information distribution unit 106. Thus, after the information associated with the recognition result of the recognition target by the image recognition unit 102 is located by the information distribution unit 106, the control unit 103 reads the associated information into the temporary storage unit provided in the control unit 103.
Thus, when identification of an input sheet 209 and determination of its type is completed, processing is performed based on the information read into the temporary storage unit provided in the control unit 103. This all occurs in response to the input/selection indication from the image recognition unit 102.
Fig. 12 illustrates an embodiment of the steps used in the input/selection indication processing by the image recognition unit 102. Pre-processing (S1201) is first performed. This includes functions such as elimination of noise, variable density processing, and adjustment of threshold value. The pre-processing (S1201) is performed on the image information obtained as imaging output of the CCD video camera 201 during the input/selection indication on the input sheet 209 as made by the user.
Next, position correction processing (S1202) is begun by detecting a positional shift as shown in Fig. 9 based on the image information of the position correction mark provided in the category area C at the four corners of the input sheet 209. Position correction processing (S1202) proceeds by computing the mathematical transformation of the coordinate system corresponding to the magnitude and direction of the positional shift as described above with reference to Fig. 9. As a result, position-corrected binary image information is obtained.
In the next step, extraction processing (S1203) of the input/selection indication is performed, and then determination processing (S1204) of the input/selection indication is performed.
In the input/selection indication extraction processing (S1203), a pre-stored original image of the input sheet 209 is compared to a variable-density image where the input/selection indication is detected. The position-corrected variable-density image information which, for example, could be a pattern indicated by a finger as shown in Fig. 13, is extracted as a differential image from the original image of the input sheet 209.
That is, an access indication input pattern is determined for specifying the input/selection indication. An indication area includes the entire extracted differential image. Thus, a circumscribed rectangle and a pixel distribution within the indication area can be determined.
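For illustration, this extraction step can be sketched as a simple image difference. The sketch below assumes 8-bit grayscale images and an arbitrary difference threshold of 30; both values are invented for this example.

```python
import numpy as np

# A sketch of step S1203: subtract the stored original sheet image from the
# current position-corrected frame and take the circumscribed rectangle of
# the differential image (the pointing finger) as the indication area.

def indication_area(original: np.ndarray, current: np.ndarray, thresh: int = 30):
    """Return (x0, y0, x1, y1) bounding the differential image, or None."""
    diff = np.abs(current.astype(int) - original.astype(int)) > thresh
    ys, xs = np.nonzero(diff)
    if xs.size == 0:
        return None               # no input/selection indication present
    return int(xs.min()), int(ys.min()), int(xs.max()), int(ys.max())

original = np.zeros((480, 640), np.uint8)
current = original.copy()
current[200:400, 300:330] = 255   # a finger-like region entering the image
print(indication_area(original, current))  # -> (300, 200, 329, 399)
```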
Next, in the input/selection indication determination processing (S1204), the direction of entry of the pointer into the indication area is determined from the pixel density distribution on each edge of the circumscribed rectangle. Image scanning within the rectangle is carried out relative to the entry direction and thereby the end point position of the pointer is determined. In the example embodiment of Fig. 13, the entry direction of the pointer is found by confirming whether a large number of pixels of high density are distributed on a particular edge of the indication area. The category area E is divided into a plurality of blocks of known area and the image scanning is carried out by scanning for pixels within these blocks based on the pointer entry direction and the corresponding scanning pattern specified in Table 2. Thus, by scanning only the indication area of the image, which is divided into blocks, high-speed detection of the pointer end point within a limited scanning area is enabled.

Table 2

Entry Direction | Scanning Pattern
entry from right edge | scanning of left end blocks in rectangle from upper edge
entry from left edge | scanning of right end blocks in rectangle from upper edge
entry from upper edge | scanning of lower end blocks in rectangle from left edge
entry from lower edge | scanning of upper end blocks in rectangle from left edge
entry from upper right edge | scanning of lower left end blocks in rectangle from left edge
entry from lower right edge | scanning of upper left end blocks in rectangle from left edge
entry from upper left edge | scanning of lower right end blocks in rectangle from left edge
entry from lower left edge | scanning of upper right end blocks in rectangle from left edge
entry from right edge sc~nning of left end blocks in rect~ngle from upper edge entry from left edge sc~nning of right end block in rectangle from upper edge entry from upper edge sc~nning of lower end blocks in rectangle from left edge entry from lower edge sc~nnin~ of upper end blocks in rect~ngle from left edge entry from upper right edge sc~nning of lower left end blocks in rectangle from left edge entry from lower right edge sc~nning of upper left end blocks in rectangle from left edge entry from upper left edge sc~nnin~ of lower right end blocks in rectangle from left edge entry from lower left edge sc~nning of upper right end blocks in rectangle from left edge The detected ènd point position is converted to corresponding comm~n-l/information, which is transmitted to the control unit 103. In the example shown in Fig. 13, the end point position of the finger is converted to the corresponding area number "6". In Fig. 13, the area indicated by cross h~t~hing is a block within the category area E corresponding to the area number "6".
Finally, the control unit 103 executes processing corresponding to the area number "6" which is defined in the command configuration information contained in the retrieved associated information of the identified input sheet 209 stored in the temporary storage unit of the control unit 103.
Therefore, in the information input apparatus 100, authentication processing is performed by image recognition of the recognition target pattern provided on an input sheet 209. That is, a static medium can be used as the recognition target and an association between the recognition target and corresponding information can be made.
The access indication input specifying the associated information is detected by image recognition of the access indication input pattern. Thus, dynamic access of information can be carried out using a recognition target displayed on a static medium.
The recognition target pattern is not limited to a key-shaped pattern as in the above described embodiment. It can be any distinct pattern with some directionality. Therefore, by combining the shape and the orientation of other recognition patterns, for example, as shown in Fig. 14, an infinite number of different types of input sheets 209 can be defined.
Also, after recognition of the input sheet 209, the input/selection indication may be performed by merely pointing with a finger to the desired input/selection indication contents of the input sheet 209. Thus, the information input/output/selection operation may be easily carried out without directly becoming aware of the hardware, and the input/selection indication contents can be associated with specified information from among a very large volume of data.
In addition, in the above-described embodiment, the center of mass of a variable density level and the circumscribed rectangle of an icon's image are used for recognition of the rotated icon code. However, the orientation may instead be found by matching processing with respect to a prepared template image.
As is described above, in an information input apparatus and method according to the present invention, image recognition of a recognition target pattern displayed on an object surface is performed, and an optically recognized target pattern is associated with corresponding information. Then, an access indication input corresponding to associated information is received by image recognition of an access indication input pattern. A control operation corresponding to the access indication input, which is received based on the information associated with the recognition target pattern, is performed. Thus, the user can easily carry out information input/output/selection operation without directly becoming aware of hardware.
In addition, in an information input apparatus and method according to the present invention, control contents that correspond to the access indication input, and information transmission results, are displayed according to the information associated with the recognition target pattern. In other words, the present invention can be used to index a vast amount of data using a small display area and a user friendly interface. Thus, the information input apparatus of the present invention can be used to easily, efficiently, and distinctly specify particular information from among a large volume of data.
Also, in the information input apparatus and method according to the present invention, an authentication pattern and an identification pattern are recognized as a recognition target pattern displayed on an object surface. Only after authentication processing is performed on the recognized authentication pattern is the recognized identification pattern associated with the corresponding information. Thus, illegitimate access indication input is prevented and only access indications input by a legitimate user will result in information retrieval.
Further, in the information input apparatus and the information input method according to the present invention, each time an access indication input is received by image recognition of the access indication input pattern, authentication processing is performed on the recognized authentication pattern. Only if the pattern is authenticated will a control operation corresponding to the access indication be performed. Thus, illegitimate access indication inputs are prevented and only access indication inputs by a legitimate user will result in execution of an operation.
Thus, the present invention provides an information input method and an information input apparatus which the user can easily operate without directly becoming aware of hardware. In addition, the present invention enables direct association of a small, simple interface with particular information contained within a large volume of data. By simple operations the user is able to directly access the desired information.
Various other modifications and alterations in the structure and method of operation of this invention will be apparent to those skilled in the art without departing from the scope and spirit of the invention. Although the invention has been described in connection with specific preferred embodiments, it should be understood that the invention as claimed should not be unduly limited to such specific embodiments. It is intended that the following claims define the scope of the present invention and that structures and methods within the scope of these claims and their equivalents be covered thereby.

Claims (7)

1. An information input method comprising the steps of:
performing image recognition of a target pattern on an object;
associating the recognized target pattern with information;
performing image recognition of an access indication pattern on the object;
associating the recognized access indication with a subset of the associated information; and
performing a control operation in response to the associated subset of the associated information.
2. The method of claim 1, further comprising the steps of:
displaying a control operation acknowledgement corresponding to the recognized access indication pattern; and
displaying output information corresponding to the performed control operation.
3. The method of claim 1, wherein the step of performing image recognition of a target pattern on an object includes the steps of:
performing image recognition of an authentication pattern;
performing authentication processing on the recognized authentication pattern; and
performing image recognition of an identification pattern.
4. The method of claim 3, further comprising the step of:
performing authentication processing on the recognized authentication pattern before the step of performing a control operation.
5. An information input apparatus comprising:
image pickup means for imaging a target pattern on an object and an access indication pattern;
image recognition means for performing image recognition of image information imaged by the image pickup means;
storage means for storing information associated with the target pattern;
processing means for associating the target pattern recognized by the image recognition means with corresponding information; and
control means for receiving an access indication input from the image recognition means in response to recognizing the access indication pattern, and for performing a control operation corresponding to the access indication input.
6. An information input apparatus comprising:
an image recognition circuit for recognizing a target pattern and an access indication pattern;
a data access circuit, coupled to the image recognition circuit, for accessing operation information specified based on the recognized target pattern and the access indication pattern;
a control circuit, coupled to the data access circuit, for performing an operation in response to the operation information accessed by the data access circuit.
7. The apparatus of claim 6 further comprising:
an output display, coupled to the control circuit, for indicating the performance of the operation and for displaying output information generated by the performance of the operation.
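For illustration only, and not as part of the claims themselves: the circuit decomposition recited in claims 6 and 7 might map onto interfaces like the following sketch, in which every class and method name is hypothetical.

```python
from typing import Any, Optional, Protocol

class ImageRecognitionCircuit(Protocol):
    """Recognizes the target pattern and the access indication pattern (claim 6)."""
    def recognize_target(self, frame: Any) -> Optional[str]: ...
    def recognize_access_indication(self, frame: Any) -> Optional[str]: ...

class DataAccessCircuit(Protocol):
    """Accesses operation information specified by the recognized patterns (claim 6)."""
    def lookup(self, target_id: str, access_id: str) -> Optional[dict]: ...

class ControlCircuit(Protocol):
    """Performs an operation in response to the accessed operation information (claim 6)."""
    def execute(self, operation_info: dict) -> None: ...

class OutputDisplay(Protocol):
    """Indicates performance of the operation and displays its output (claim 7)."""
    def show(self, operation_info: dict, output: Any) -> None: ...

def run_once(frame: Any,
             recognizer: ImageRecognitionCircuit,
             data_access: DataAccessCircuit,
             control: ControlCircuit) -> None:
    # Wire the circuits together in the order the claims describe.
    target = recognizer.recognize_target(frame)
    access = recognizer.recognize_access_indication(frame)
    if target is None or access is None:
        return
    operation_info = data_access.lookup(target, access)
    if operation_info is not None:
        control.execute(operation_info)
```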
CA002221427A 1996-11-26 1997-11-19 Information input method and apparatus Abandoned CA2221427A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JPP08-315290 1996-11-26
JP31529096 1996-11-26

Publications (1)

Publication Number Publication Date
CA2221427A1 true CA2221427A1 (en) 1998-05-26

Family

ID=18063626

Family Applications (1)

Application Number Title Priority Date Filing Date
CA002221427A Abandoned CA2221427A1 (en) 1996-11-26 1997-11-19 Information input method and apparatus

Country Status (9)

Country Link
US (1) US6115513A (en)
EP (1) EP0844552A1 (en)
JP (1) JP4304729B2 (en)
CN (1) CN1183589A (en)
CA (1) CA2221427A1 (en)
ID (1) ID18978A (en)
MY (1) MY118364A (en)
SG (1) SG55417A1 (en)
TW (1) TW430774B (en)

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6561428B2 (en) * 1997-10-17 2003-05-13 Hand Held Products, Inc. Imaging device having indicia-controlled image parsing mode
CN1114166C (en) 1998-05-29 2003-07-09 索尼公司 Information providing system
JP4352287B2 (en) * 1998-08-31 2009-10-28 ソニー株式会社 Image processing apparatus, image processing method, and image processing program medium
DE69939858D1 (en) * 1998-08-31 2008-12-18 Sony Corp image processing
US6265993B1 (en) * 1998-10-01 2001-07-24 Lucent Technologies, Inc. Furlable keyboard
US6637882B1 (en) 1998-11-24 2003-10-28 Welch Allyn, Inc. Eye viewing device for retinal viewing through undilated pupil
US6675133B2 (en) * 2001-03-05 2004-01-06 Ncs Pearsons, Inc. Pre-data-collection applications test processing system
US7111787B2 (en) 2001-05-15 2006-09-26 Hand Held Products, Inc. Multimode image capturing and decoding optical reader
US7526122B2 (en) * 2001-07-12 2009-04-28 Sony Corporation Information inputting/specifying method and information inputting/specifying device
US6834807B2 (en) 2001-07-13 2004-12-28 Hand Held Products, Inc. Optical reader having a color imager
JP3622729B2 (en) * 2002-01-30 2005-02-23 株式会社日立製作所 Image input device
US7637430B2 (en) 2003-05-12 2009-12-29 Hand Held Products, Inc. Picture taking optical reader
US20050015370A1 (en) * 2003-07-14 2005-01-20 Stavely Donald J. Information management system and method
JP4590851B2 (en) * 2003-10-15 2010-12-01 カシオ計算機株式会社 Non-contact control device and program
FR2868648B1 (en) * 2004-03-31 2006-07-07 Wavecom Sa METHOD FOR TRANSMITTING DIGITAL DATA TO A PORTABLE ELECTRONIC DEVICE, CORRESPONDING SIGNAL, DEVICE AND APPLICATION
US7293712B2 (en) 2004-10-05 2007-11-13 Hand Held Products, Inc. System and method to automatically discriminate between a signature and a dataform
CN100361134C (en) * 2004-12-16 2008-01-09 赵建洋 Automatic reading method for analog meter
EP2153649A2 (en) * 2007-04-25 2010-02-17 David Chaum Video copy prevention systems with interaction and compression
US8972739B1 (en) 2007-06-01 2015-03-03 Plantronics, Inc. Methods and systems for secure pass-set entry in an I/O device
CN102034081B (en) 2009-09-25 2016-06-22 神基科技股份有限公司 Use image as the calculator device of Data Source
JP5005758B2 (en) * 2009-12-25 2012-08-22 株式会社ホンダアクセス In-vehicle device operating device in automobile
US9652914B2 (en) 2010-08-31 2017-05-16 Plantronics, Inc. Methods and systems for secure pass-set entry
CN103221912A (en) * 2010-10-05 2013-07-24 惠普发展公司,有限责任合伙企业 Entering a command
JP5673304B2 (en) * 2011-03-31 2015-02-18 日本電気株式会社 Authentication device, authentication system, authentication method, and program
US8657200B2 (en) 2011-06-20 2014-02-25 Metrologic Instruments, Inc. Indicia reading terminal with color frame processing
WO2013101221A1 (en) * 2011-12-30 2013-07-04 Intel Corporation Interactive drawing recognition using status determination
JP5954049B2 (en) * 2012-08-24 2016-07-20 カシオ電子工業株式会社 Data processing apparatus and program
CN104123551B (en) * 2013-04-26 2017-09-29 联想(北京)有限公司 A kind of information processing method, processor and message input device
US9733728B2 (en) * 2014-03-03 2017-08-15 Seiko Epson Corporation Position detecting device and position detecting method
CN105426900A (en) * 2014-09-18 2016-03-23 董醒华 Shadow analysis method of key photograph
JP6592904B2 (en) * 2015-01-22 2019-10-23 セイコーエプソン株式会社 Electronic equipment, program
JP6197920B2 (en) * 2016-06-08 2017-09-20 カシオ計算機株式会社 Data processing apparatus and program
GB2598628B (en) * 2020-09-08 2022-10-26 Inventivio Gmbh Tactile graphics reader
CN114322409B (en) * 2021-06-23 2023-09-19 海信视像科技股份有限公司 Refrigerator and method for displaying indoor scenery pictures

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5887652A (en) * 1981-11-19 1983-05-25 Ricoh Co Ltd Picture processor
US4776464A (en) * 1985-06-17 1988-10-11 Bae Automated Systems, Inc. Automated article handling system and process
JPH01220075A (en) * 1988-02-29 1989-09-01 Nippon Taisanbin Kogyo Kk Displaying part reading method
US4958064A (en) * 1989-01-30 1990-09-18 Image Recognition Equipment Corporation Bar code locator for video scanner/reader system
JPH04137027A (en) * 1990-09-28 1992-05-12 Ezel Inc Input device for computer
US5189292A (en) * 1990-10-30 1993-02-23 Omniplanar, Inc. Finder pattern for optically encoded machine readable symbols
US5245165A (en) * 1991-12-27 1993-09-14 Xerox Corporation Self-clocking glyph code for encoding dual bit digital values robustly
US5221833A (en) * 1991-12-27 1993-06-22 Xerox Corporation Methods and means for reducing bit error rates in reading self-clocking glyph codes
JPH05334470A (en) * 1991-12-27 1993-12-17 Xerox Corp Self-clocking graphic mark code
US5288986A (en) * 1992-09-17 1994-02-22 Motorola, Inc. Binary code matrix having data and parity bits
IT1268511B1 (en) * 1993-02-23 1997-03-04 Zeltron Spa MAN-MACHINE INTERFACE SYSTEM
EP0622722B1 (en) * 1993-04-30 2002-07-17 Xerox Corporation Interactive copying system
JPH07160412A (en) * 1993-12-10 1995-06-23 Nippon Telegr & Teleph Corp <Ntt> Pointed position detecting method
JP2788604B2 (en) * 1994-06-20 1998-08-20 インターナショナル・ビジネス・マシーンズ・コーポレイション Information display tag having two-dimensional information pattern, image processing method and image processing apparatus using the same
US5718457A (en) * 1994-07-29 1998-02-17 Elpatronic Ag Method of marking objects with code symbols
US5887140A (en) * 1995-03-27 1999-03-23 Kabushiki Kaisha Toshiba Computer network system and personal identification system adapted for use in the same
JPH11502654A (en) * 1995-03-31 1999-03-02 キウィソフト プログラムス リミティッド Machine readable label

Also Published As

Publication number Publication date
JP4304729B2 (en) 2009-07-29
US6115513A (en) 2000-09-05
SG55417A1 (en) 1998-12-21
TW430774B (en) 2001-04-21
JPH10214153A (en) 1998-08-11
CN1183589A (en) 1998-06-03
MY118364A (en) 2004-10-30
ID18978A (en) 1998-05-28
EP0844552A1 (en) 1998-05-27

Similar Documents

Publication Publication Date Title
CA2221427A1 (en) Information input method and apparatus
CA2221669C (en) Information input method, information input sheet, and information input apparatus
US6690357B1 (en) Input device using scanning sensors
EP0949578A2 (en) Input device and method utilizing fingerprints of a user
JP5987780B2 (en) Information processing apparatus and information processing program
WO2001061454A1 (en) Controlling an electronic device
JP3804212B2 (en) Information input device
US20050053907A1 (en) Education-learning controller used with learning cards
US20080074386A1 (en) Virtual input device and the input method thereof
CN100410849C (en) Information processing device for setting background image, display method and program thereof
KR100874289B1 (en) Electronic pen-computer multimedia interactive system
GB2389935A (en) Document including element for interfacing with a computer
JPWO2003007137A1 (en) Information input / instruction method and information input / instruction device
JP3879208B2 (en) Information input method, information input medium, and information input device
JPH0660168A (en) Signature recognition device
KR19980042731A (en) Information input method and device
CN100524305C (en) Guidance apparatus related to operation of information equipment and guidance method related to operation of information equipment
JP3903540B2 (en) Image extraction method and apparatus, recording medium on which image extraction program is recorded, information input / output / selection method and apparatus, and recording medium on which information input / output / selection processing program is recorded
JP6868665B2 (en) Data entry device, data entry method and data entry program
CN101769756B (en) Fingerprint navigation method for establishing link between fingerprint and navigation destination and navigation device
JP2008188323A (en) Drawing logic making device, drawing logic making server, drawing logic making method and program
JP6734445B2 (en) Data input device, data input method, and data input program
KR100847943B1 (en) Creating responses for an electronic pen-computer multimedia interactive system
EP1695262A1 (en) Method of scanning an image using surface coordinate values and device using thereof
JPH10285389A (en) Image-processing method, image-processing unit and recording medium

Legal Events

Date Code Title Description
EEER Examination request
FZDE Discontinued