WO2005072372A2 - System for and method of finger initiated actions - Google Patents

System for and method of finger initiated actions

Info

Publication number
WO2005072372A2
WO2005072372A2 PCT/US2005/002547 US2005002547W
Authority
WO
WIPO (PCT)
Prior art keywords
electronic device
task
biometric data
fingerprint
tasks
Prior art date
Application number
PCT/US2005/002547
Other languages
French (fr)
Other versions
WO2005072372A3 (en)
Inventor
Mark J. Howell
Anthony P. Russo
Marcia Tsuchiya
Frank H. Chen
Original Assignee
Atrua Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Atrua Technologies, Inc. filed Critical Atrua Technologies, Inc.
Publication of WO2005072372A2 publication Critical patent/WO2005072372A2/en
Publication of WO2005072372A3 publication Critical patent/WO2005072372A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/011Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/13Sensors therefor

Definitions

  • This invention relates to device controllers. More specifically, this invention relates to device controllers that use fingerprints to automatically perform tasks on an electronic device.
  • Shortcuts are used to reduce the number of steps and to simplify the steps required by a user to perform a task on an electronic device. For example, shortcut links are placed on the desktop of a personal computer; PDA's have a shortcut button to launch a calendar application; mobile phones support speed dial phone numbers. To many users, even shortcuts are not convenient enough. Short cuts on the desktop of a personal computer still require the user to move a pointer to select and launch a software application. The speed dial function of mobile phones still requires users to push at least 2 or 3 buttons. Users often forget the sequence of keystrokes or mouse clicks that provide a short cut. Moreover, these keypads and mice require valuable space, a disadvantage especially on portable electronic devices.
  • a fingerprint control system provides a method of and system for reading a biometric image and performing an associated task having a chain of tasks.
  • the task can include launching a computer program, such as an executable file, a macro, and a script; executing a function on an electronic device; or any combination of these.
  • a method of performing a task on an electronic device comprises matching read biometric data to stored biometric data having a corresponding task comprising a first chain of tasks and automatically performing the corresponding task on the electronic device.
  • the biometric data comprises fingerprint data.
  • the corresponding task is selected from a plurality of tasks that the electronic device is configured to perform.
  • the corresponding task has optional parameters.
  • automatically performing a task comprises retrieving user profile data corresponding to a user and the task, and using the user profile data to perform the task.
  • the user profile data comprises a telephone number and the corresponding task comprises dialing the telephone number.
  • the user profile data comprises login information for accessing a host system and the corresponding task comprises transmitting the login information to the host system.
  • the corresponding task comprises accessing a resource over a network such as a local area network or the Internet.
  • a task in the first chain of tasks comprises encrypting the login information before transmitting the login information to the host system.
  • the corresponding task comprises an interactive task.
  • performing the corresponding task comprises executing a computer game, remotely controlling a remote-controlled system, or performing a non-inherent function of the electronic device.
  • the corresponding task further comprises a second chain of tasks.
  • the first chain of tasks and the second chain of tasks are performed in parallel.
  • the method further comprises reading biometric data.
  • Reading biometric data comprises reading data captured during a finger placement on a fingerprint image sensor.
  • reading biometric data comprises reading data captured during a finger swipe over a fingerprint image sensor.
  • matching biometric data to stored biometric data comprises identifying a direction of the finger swipe, a first direction having the corresponding task and a second direction having a different corresponding task.
  • Matching read biometric data to stored biometric data comprises determining whether a threshold number of points of read fingerprint data coincide with a number of points of stored fingerprint data.
  • the electronic device is portable, such as a personal digital assistant, a telephone, or any other hand-held device.
  • the electronic device comprises a personal computer, a remote controller, a security system, a television set, an audio player, a game device, or any combination of these.
  • the stored biometric data is an ordered set of biometric data, such as a permutation of biometric data or a combination of biometric data.
  • the read biometric data must also be a permutation or a combination of biometric data in order to match the stored biometric data.
  • the corresponding task relates to a context of an application executing on the electronic device.
  • an electronic device comprises (a) a biometric sensor for reading biometric data; (b) a memory storing a plurality of stored biometric data each having a corresponding task identifier used to perform a corresponding task, at least one task comprising a chain of tasks; and (c) a processor coupled to both the biometric sensor and the memory, the processor configured to match read biometric data with stored biometric data and to automatically perform a corresponding task on the electronic device.
  • the biometric sensor comprises a fingerprint image sensor, such as a placement sensor or a swipe sensor.
  • the swipe sensor is configured to detect a direction of a swipe, a first direction having a first task identifier and a second direction having a second task identifier.
  • the memory is further configured so that a task identifier also has user profile data corresponding to a user and the task.
  • the task identifier has optional parameters.
  • the corresponding task relates to a context of an application executing on the electronic device.
  • the electronic device further comprises a telephone operatively coupled to the processor.
  • the user profile data comprises a telephone number and the corresponding task comprises dialing the telephone number on the telephone.
  • the electronic device further comprises a link to a network, such as the Internet.
  • the user profile data comprises a resource locator, and the corresponding task comprises connecting the electronic device to a host identified by the resource locator and accessible over the network.
  • the link comprises a wireless transmitter.
  • the fingerprint image sensor comprises a thermal sensor, an optical sensor, a pressure sensor, or a capacitive sensor.
  • the processor is configured to execute two tasks in parallel.
  • a method of initializing an electronic device comprises reading biometric data, storing the biometric data, and mapping the stored biometric data to a chain of tasks that are automatically performed on an electronic device.
  • Figure 1 is a block diagram of a fingerprint control system in accordance with one embodiment of the present invention.
  • Figure 2 shows an array used to store fingerprint images and to associate each fingerprint image with a task and the task's parameters in accordance with one embodiment of the present invention.
  • Figure 3 is a flow chart illustrating the steps executed by a fingerprint control system in accordance with one embodiment of the present invention.
  • Figure 4 is a flow chart illustrating the steps executed by a fingerprint control system to perform the task of automatically logging into an e-mail server in accordance with one embodiment of the present invention.
  • Figure 5 is a flow chart illustrating the steps executed by a fingerprint control system to perform the task of automatically dialing a telephone number in accordance with one embodiment of the present invention.
  • Figure 6 is a flow chart illustrating the steps used to enroll a fingerprint image in a fingerprint control system and to associate a task with the fingerprint image in accordance with one embodiment of the present invention.
  • Figure 7 is a screen shot of a display screen schematically illustrating a plurality of hands each having a highlighted finger and the associated task for each highlighted finger.
  • Figure 8 is a schematic diagram illustrating relationships between engines used to set up and run a fingerprint control system in accordance with the present invention.
  • Figure 9 is a flow chart illustrating the steps used to spawn child processes so that multiple tasks can be performed concurrently.
  • Figure 10 is a graph showing a parent task that automatically spawns child tasks, each child task in a chain of tasks.
  • a fingerprint image can be used to initiate one or more associated tasks on an electronic device.
  • the electronic device reads a fingerprint image and performs a specific task associated with a fingerprint image.
  • the electronic device is a personal computer (PC)
  • PC personal computer
  • the PC launches a web browser
  • the PC connects to the e-mail account of the user
  • the PC runs a calculator program.
  • each fingerprint image can be used to perform tasks on an electronic device.
  • a pair of fingerprint images can be used to perform a task on the electronic device. If a single finger is used, ten unique tasks can be performed. If a pair of fingerprint images is used, up to one hundred unique tasks can be performed.
  • a task is any operation on an electronic device.
  • the performance of a task includes, but is not limited to, (1) the execution of an inherent function of an electronic device, such as copying on a photocopier; (2) the launching of a program, such as a Web browser, e-mail program, or electronic calculator, that is not specific or inherent to a particular electronic device, including interactive programs; (3) any performance of one or more steps on the electronic device to perform any operation; or (4) any combination or permutation of the above.
  • a task can include the inherent function of powering ON an electronic device, launching a Web browser to connect to a remote host machine, and automatically transmitting a user name and password to the host machine to log on to the host machine.
  • the words "tasks” and “actions” are used interchangeably herein.
  • the term "computer program” refers to executable files, scripts, macros, and any sequence of instructions that can control the performance of tasks on an electronic device.
  • Embodiments of the present invention thus advantageously (1) reduce the number of entries that a user must make to perform a task: rather than entering numerous keystrokes to connect to an e-mail account, a single finger placement or swipe can accomplish the same task; (2) reduce the footprint of an electronic device, since keypads, function buttons, joy sticks, and mice can be replaced with a fingerprint image sensor; (3) reduce the complexity of user interfaces, since a user does not have to remember keystrokes, logon information, or other information needed to perform a task; and (4) increase the security of electronic devices since, using a fingerprint, a user can be authenticated each time she requests that a task be performed, so that the authentication and the performance of the task can be accomplished using a single finger swipe or placement.
  • FIG. 1 is a schematic diagram of a fingerprint control system 100 in accordance with the present invention.
  • the fingerprint control system 100 comprises a central processing unit 110 electronically coupled to a fingerprint sensor 115, a storage unit 120, a display unit 130, and an input unit 135.
  • the storage unit 120 comprises a mapping store 125 containing fingerprint mapping information, such as the task information and associated parameters described in more detail below.
  • the fingerprint control system 100 can be a stand-alone system comprising an electronic device or can be coupled to an external electronic device (not shown) and used to control the external electronic device as described below.
  • the fingerprint sensor 115 is used to read the image of a fingerprint placed upon it.
  • the fingerprint sensor 115 can be a placement sensor or a swipe sensor. Placement sensors require a user to place her finger on a reading surface of the fingerprint sensor until a fingerprint image is captured. Placement sensors are designed to actively sense the entire surface of a finger at once. Placement sensors can be based on optical, thermal, pressure, electrical, or other sensing means. In general, placement sensors are designed to have a reading surface with an area as large as the pad of a typical human finger, typically 15 mm². It will be appreciated that the area of the reading surface in accordance with the present invention can be larger or smaller than 15 mm². Alternatively, the fingerprint sensor 115 can be a swipe sensor.
  • swipe sensors are fully sized in one direction (typically in width) but abbreviated in the other (typically in height). Swipe sensors are thus configured to sense only a small rectangular portion of a finger at any one time. To capture a fingerprint image, a user needs to swipe his finger over the sensor. Swipe sensors are especially suitable for portable devices because they are smaller than placement sensors. Methods of and systems for fingerprint sensing are described in detail in the U.S. Patent Application Serial Number 10/194,994, filed July 12, 2002, and titled "Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans," and in the U.S.
  • the fingerprint sensor is an ATW100 capacitive swipe sensor by Atrua Technologies, Inc., at 1696 Dell Avenue, Campbell, California 95008. It will be appreciated that any sensor technology can be used in accordance with the present invention.
  • the input device 135 is used to input information, such as task information, into the fingerprint control system 100.
  • an administrator or user may use the input device to enter a user name and password associated with a fingerprint image.
  • a user places her thumb on the fingerprint sensor 115, a Web browser is opened, a connection made to an e-mail server, and the user name and password are automatically transmitted to the e-mail server so that an e-mail session is automatically initiated for the user.
  • the input device can be used to type in the task information associated with her thumb print.
  • Such task information can include (1) parameters such as a user name and password and (2) the name or memory address of one or more programs that launch a Web browser, connect to an e-mail server, and transmit the user name and password combination.
  • the input device 135 can be any type of device for inputting information including, but not limited to, a keyboard, a mouse, a touch screen, and a port for receiving information from an electronic device such as a personal digital assistant (PDA).
  • PDA personal digital assistant
  • the typed information can be displayed on the display unit 130.
  • the input device 135 and the display unit 130 are typically used only when the fingerprint control system 100 is being initialized, such as described in Figure 6, and thus both are optional and can be absent when the fingerprint control system 100 is used in normal operation, such as performing a task.
  • the storage unit 120 comprises a fingerprint mapping store 125 used to store fingerprint mapping information.
  • the fingerprint mapping information comprises a plurality of fingerprint images, their associated tasks, and the parameters used by the tasks.
  • a control program executed by the CPU 110 compares the read fingerprint image to fingerprint images stored in the fingerprint mapping store 125.
  • the stored fingerprint images are referred to as "enrolled fingerprint images" or "stored fingerprint images."
  • the control program matches the read fingerprint image to a stored fingerprint image, the CPU 110 will execute the tasks associated with the matched stored fingerprint image, using the associated parameters, if any.
  • the fingerprint mapping information is stored in a fingerprint array (table) where the index or key is a fingerprint image and the associated task is the value for the array.
  • Figure 2 shows one embodiment of a fingerprint array, described in more detail below.
  • the control program can quickly associate a fingerprint image to its corresponding task.
  • the electronic device to be controlled by the fingerprint control system 100 can include the same hardware as the fingerprint control system 100.
  • the PC can also comprise one or more of the CPU 110, the fingerprint sensor 115, the storage 120, the display 130, and the input device 135.
  • the CPU 110 can execute fingerprint operations as well as other application programs.
  • the fingerprint control system 100 can reside on and form part of the telephone.
  • the fingerprint sensor 115 can be integrated into a telephone case, the display 130 can be a telephone liquid crystal display unit, and the input 135 can be the telephone keypad.
  • the fingerprint control system 100 can reside on hardware separate from the electronic device it controls. Thus, for example, if the electronic device is a coffee maker, the fingerprint control system can be electronically coupled to the coffee maker. Thus, a user can use the fingerprint control system 100 to launch a program, and the program can send control signals to the coffee maker, thereby controlling the coffee maker. It will be appreciated that the fingerprint control system 100 can be coupled to the electronic device in any number of ways, depending on the application at hand.
  • the fingerprint control system 100 can be directly coupled to the electronic device using control wires. Alternatively, the fingerprint control system 100 can be coupled to the electronic device over a local area network, using for example the Ethernet protocol, or over a wide area network, using, for example, TCP/IP. Also, the fingerprint control system 100 can be wirelessly coupled to the electronic device using radio or infrared signals. It will also be appreciated that the electronic device can also be coupled to a remote host using, for example, a wireless transmitter or a wireless transceiver.
  • Figure 2 illustrates one embodiment of a fingerprint array 200 in accordance with one embodiment of the present invention.
  • the fingerprint array 200 comprises rows 210, 220, 230, 240, and 250 and columns 201, 204, and 206. It will be understood by those of ordinary skill in the art that rows and columns are representative only. Any other database technique can be used with equal success.
  • the column 201 contains multiple fingerprint images, 211, 221, 231, 241, and 251.
  • the column 204 contains associated tasks (here, scripts or executable files).
  • the column 206 contains any parameters for each associated task. In one embodiment, the parameters are input to the programs or scripts shown in the column 204.
  • the row 210 contains the stored fingerprint image 211, its associated task 214, and the parameter list 216 used to perform the associated task 214.
  • the associated task 214 contains the PERL script email.pls, which is used to access an email account.
  • the parameter list 216 contains a first element, "John Doe", and a second element, "password1". The performance of a task to access an email account using the associated task 214 and its parameter list 216 is described in detail relative to Figure 4 below, describing the steps performed to execute a task when a read fingerprint image matches the stored fingerprint image 211.
  • the first element, "John Doe", and the second element, "password1", are one example of what are referred to as user profile data, that is, data specific to a fingerprint image (or other fingerprint data) and used to perform a task.
  • the row 220 contains the stored fingerprint image 221 and its associated task 224.
  • the empty cell 226 indicates that no parameters are required to perform the associated task 224.
  • the associated task 224 comprises one operation executed by the Visual BASIC script calc.vbs, which launches a calculator, allowing a user to perform mathematical operations on the electronic device.
  • the control program launches (e.g., calls) calc.vbs, which displays a calculator GUI on a display device (e.g., 130, Figure 1).
  • the user can then enter onto an input device (e.g., 135, Figure 1) operands in a mathematical equation, which the calculator computes and displays an answer.
  • the row 230 contains the stored fingerprint image 231, its associated task 234, and the parameter 236 used to perform the associated task 234.
  • the associated task 234 is performed by launching the executable file "Wordprocessor.exe".
  • the control program calls Wordprocessor.exe, which presents a user interface to a word processor, such as Wordperfect, Word, vi, or any other kind of word processor.
  • the parameter 236 can be passed to the word processing program in a variety of ways. For example, the parameters 236 can be sent to the word processing program in a macro that is automatically called, by a template, by a style sheet, or by any other means.
  • the row 240 contains the stored fingerprint image 241, its associated task 244, and parameter 246 used by the associated task 244.
  • the associated task 244 is performed by the PERL script phone.pls.
  • the parameter 246 comprises the string "10102504085551212" designating a calling card code and a telephone number.
  • the control program calls phone.pls to launch a telephone program that dials the digits "10102504085551212".
  • Figure 5 describes the tasks executed when a read fingerprint image matches the stored fingerprint image 241.
  • the row 250 contains the fingerprint image 251, its associated task 254, and the parameter list 256 used by the associated task 254.
  • the associated task 254 is performed by the PERL scripts ftp.pls and email.pls.
  • the first element of the parameter list 256 comprises the string "http://www.site.com”
  • the second element of the parameter list 256 comprises the string "joe@yahoo.com."
  • the control program calls the program ftp.pls and passes it the first element of the parameter list 256, "http://www.site.com", to download a file from the Web site "http://www.site.com" using the file transfer protocol.
  • the control program then attaches the file to an e-mail addressed to the e-mail address stored in the second element of the parameter list 256, "joe@yahoo.com", and then calls the script email.pls to send the e-mail.
  • a file stored at a Web address and containing information that may be updated periodically can be automatically sent to a specific user.
  • This example illustrates nested tasks, where one task can include the performance of multiple tasks, here the downloading of a file and the sending of the file by email.
  • the task performed by the script ftp.pls is said to be chained to the task performed by the script email.pls.
  • fingerprint data that uniquely identifies a fingerprint image rather than an entire fingerprint image is stored in the fingerprint array 200.
  • the fingerprint data will correspond to a subset of the entire fingerprint image, thus requiring less data storage.
  • This fingerprint data can correspond, for example, to multiple minutiae points for a fingerprint.
  • an extracted set of unique fingerprint data is stored in a template. It will be appreciated that any reference herein to fingerprint images will also correspond to fingerprint data.
  • fingerprint images can be correlated to tasks using structures other than the fingerprint array 200.
  • other fingerprint identifiers that uniquely identify a fingerprint image can be used.
  • pointers to fingerprint images can be stored.
  • the fingerprint pointers can contain the address of fingerprint images stored elsewhere in the storage 120 (Figure 1) or at other locations, as described below.
  • fingerprint images, associated tasks and their parameters can be stored in other data structures such as a hash (associative array), with each fingerprint image as a key, and the corresponding tasks and parameters as values.
  • the fingerprint image and corresponding tasks and parameters can be stored in a database.
  • Many structures can be used, depending on the application at hand. For example, if the electronic device is a PC, then a database and its associated database management system can be used in conjunction with the control program. If the electronic device is smaller, such as a PDA, and has less memory, an array can be used.
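  • As a minimal illustration of the mapping structures described above, the following Python sketch keys a dictionary by a fingerprint template identifier and stores a chain of tasks with optional parameters, mirroring the rows of the fingerprint array 200 (the template names, script names, and parameter values are illustrative assumptions, not part of the disclosure):

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    program: str                                      # e.g. "email.pls" or "calc.vbs"
    parameters: list = field(default_factory=list)    # optional parameters

# One entry per enrolled finger; the keys stand in for stored fingerprint
# templates (e.g. extracted minutiae data), not full images.
mapping_store = {
    "template-right-thumb": [Task("email.pls", ["John Doe", "password1"])],
    "template-right-index": [Task("calc.vbs")],                      # no parameters
    "template-right-ring":  [Task("ftp.pls", ["http://www.site.com"]),
                             Task("email.pls", ["joe@yahoo.com"])],  # chained tasks
}

def tasks_for(template_id):
    """Return the task chain mapped to a stored template, or None if unmapped."""
    return mapping_store.get(template_id)
```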
  • FIG 3 is a flow chart illustrating the steps 300 taken by a control program in accordance with the present invention and used to perform a task after a fingerprint image is read from the fingerprint sensor 115 of Figure 1.
  • the fingerprint sensor 115 and any data and variables used by the control program are initialized.
  • the user swipes or places her finger on the fingerprint sensor 115.
  • the control program determines whether the image quality of the read (scanned) fingerprint image is acceptable.
  • the control program then loops back to the step 305. It will be appreciated that the control program can be configured to allow only a pre-determined number of unacceptable images to be scanned. After this limit is reached, the control program can alert the user that the fingerprint sensor 115 is dirty or damaged.
  • the control program continues to the step 315, where it compares the read fingerprint image with the stored fingerprint images in the column 201 of the fingerprint array 200 of Figure 2. Fingerprint images can be compared sequentially or using some weight based on the number of times a particular fingerprint was matched in earlier sessions. A stored fingerprint image (and thus a corresponding task) may match more often (e.g., corresponding to a popular task) and thus will be compared before other stored fingerprint images.
  • the control program checks whether the read fingerprint image matches the current candidate stored fingerprint image. If a match is not found, the control program proceeds to the step 305. If a match was found (e.g., matching the read fingerprint image to the stored fingerprint image 211), the control program continues processing at the step 325.
  • the control program retrieves the corresponding task (e.g., the script email.pls in the field 214) and the corresponding parameters (e.g., the parameter list 216).
  • the control program performs the corresponding task (e.g., calls or launches the script email.pls) using the corresponding parameters (e.g., the user name "John Doe" and the password "password1"). It will be appreciated that executing one task can comprise executing one or more tasks in a task chain.
  • the control program ENDS.
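  • The control flow of Figure 3 can be summarized in the following hedged Python sketch; the sensor, matcher, and store objects are hypothetical stand-ins for the fingerprint sensor 115, the matching routine, and the mapping store 125, and the retry limit is an assumed value:

```python
MAX_BAD_READS = 3    # assumed limit on unacceptable scans before warning the user

def control_loop(sensor, matcher, store):
    """Read a fingerprint, check quality, match it, and run the mapped task chain."""
    bad_reads = 0
    while True:
        image = sensor.read()                     # step 305: finger swipe or placement
        if not matcher.quality_ok(image):         # step 310: image quality check
            bad_reads += 1
            if bad_reads >= MAX_BAD_READS:
                print("fingerprint sensor may be dirty or damaged")
                bad_reads = 0
            continue
        bad_reads = 0
        template_id = matcher.best_match(image, store)   # steps 315/320: compare, match
        if template_id is None:
            continue                              # no match: wait for the next swipe
        for task in store.tasks_for(template_id): # steps 325/330: run the task chain
            task.run()
```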
  • a "test" fingerprint image (also referred to as a "read fingerprint image" or a "candidate fingerprint image") is compared to the fingerprint images in a set of stored fingerprint images.
  • the read fingerprint image is read when a user places or swipes her finger on a fingerprint sensor.
  • the read fingerprint image may not exactly match any image in the set of stored fingerprint images. Therefore, the candidate fingerprint image is compared to all of the stored fingerprint images to compute a matching score, a number reflecting the number of matching minutiae points.
  • the stored fingerprint image with the highest match score is considered the best match, as long as the score is above a predetermined match threshold. If it is not above this threshold, none of the stored fingerprint images is considered a match.
  • the threshold is required because a security system cannot otherwise be sure whether an impostor is trying to fool it. In certain circumstances the cost of illicit access can be excessively high (hence the need for fingerprints in the first place). In low-security systems, the match threshold can be made artificially low, so that the best matching fingerprint image is always selected.
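  • A minimal sketch of this best-match-above-threshold rule, assuming minutiae are reduced to comparable point sets and that the threshold value is illustrative:

```python
MATCH_THRESHOLD = 40   # assumed value; a real system tunes this to its security needs

def best_match(candidate_minutiae, enrolled):
    """Return the id of the enrolled template with the highest score, or None.

    `enrolled` maps template ids to sets of minutiae points. Counting coinciding
    points is a simplification of real matching, which also handles rotation,
    translation, and ridge orientation.
    """
    best_id, best_score = None, 0
    for template_id, minutiae in enrolled.items():
        score = len(candidate_minutiae & minutiae)     # number of coinciding points
        if score > best_score:
            best_id, best_score = template_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None
```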
  • Figure 4 illustrates the steps 400 taken to perform the task of automatically logging a user into her e-mail account so that the logon page is automatically bypassed, and the command page is displayed.
  • the steps 400 correspond to the step 330 in Figure 3 and are performed, for example, when a read fingerprint image matches the stored fingerprint image 211 in Figure 2.
  • the stored fingerprint image 211 is read by the fingerprint sensor 115.
  • the fingerprint sensor 115 and any data structures used by the control program are initialized.
  • a Web browser is automatically launched on the user's electronic device.
  • a Uniform Resource Locator (URL) for her e-mail account is automatically input into the address field of the Web browser in the step 415.
  • the home page for the user's e-mail server is received and displayed on the user's electronic device.
  • URL Uniform Resource Locator
  • the user's login ID ("John Doe", the first element in the parameter list 216) and password ("password1", the second element in the parameter list 216) are automatically input into the appropriate fields of the e-mail home page.
  • the SUBMIT command is then automatically invoked in the step 430.
  • the user's mailbox can then be manipulated in the step 435, either manually, by the user, or automatically, by software in accordance with the present invention.
  • the mailbox can be manipulated by, for example, displaying messages from the user's e-mail account, automatically composing a new e-mail message, allowing the user to compose a new e-mail or delete an old one.
  • the process ends in the step 440 when, for example, the user logs off from her e-mail account.
  • the control program can call other software, such as Messaging Application Programming Interface (MAPI), which allows a user to seamlessly manipulate e-mail accounts and mailboxes.
  • MAPI Messaging Application Programming Interface
  • the control program can invoke a Web browser and post an HTML form to automatically log on to an e-mail server to access her account.
  • the user can manipulate her e-mail account without using a Web browser, by using, for example, a command line such as used in the UNIX environment.
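  • The automatic login chain of Figure 4 might look like the following hedged Python sketch; the URL, form field names, and use of the third-party requests library are illustrative assumptions, since a real deployment would match the e-mail provider's actual login form (or use MAPI or a command-line mail client instead of a browser):

```python
import requests

def auto_login(base_url, username, password):
    """Steps 410-430: open the mail server's page and submit stored credentials."""
    session = requests.Session()
    session.get(base_url)                              # step 420: fetch the home page
    response = session.post(
        base_url + "/login",                           # assumed login endpoint
        data={"user": username, "passwd": password},   # step 425: fill in the fields
    )
    return response.ok                                 # step 430: SUBMIT invoked

# Usage with the user profile data from row 210 of the fingerprint array:
# auto_login("http://mail.example.com", "John Doe", "password1")
```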
  • Figure 5 illustrates the steps 500 used to perform the task of automatically dialing (i.e., speed dialing) a telephone number in accordance with the present invention.
  • the steps 500 correspond to the step 330 in Figure 3.
  • the control program launches the PERL script phone.pls, passing it the parameter "10102504085551212".
  • the PERL script phone.pls then executes the tasks displayed in the steps 500.
  • the fingerprint sensor 115 and any data structures used by the PERL script phone.pls are initialized.
  • the control program turns the electronic device (here, a telephone) ON.
  • phone.pls reads the parameters 246, the sequence of digits "10102504085551212" containing a calling card code and a telephone number.
  • phone.pls sequentially transmits the digits to a tone generator in the telephone, thus automatically dialing the calling card code and the telephone number.
  • the process ends.
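  • A hedged sketch of the Figure 5 speed-dial flow follows; the split between calling card code and phone number and the tone-generator callback are assumptions made for illustration:

```python
def speed_dial(digit_string, send_tone):
    """Figure 5 flow: read the stored digits and feed them, in order, to the
    handset's tone generator. `send_tone` stands in for the hardware call."""
    card_code, phone_number = digit_string[:7], digit_string[7:]   # assumed split
    for digit in card_code + phone_number:
        send_tone(digit)                     # one DTMF tone per digit

# Example using the parameter 246, printing tones instead of driving hardware:
speed_dial("10102504085551212", send_tone=lambda d: print("DTMF", d))
```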
  • FIG 6 illustrates the steps 600 performed by an Enrollment Program to store fingerprint images and their associated tasks and parameters in the electronic device 100 (Figure 1) during a set up process.
  • the Enrollment Program starts in the START step 601.
  • the START step 601 can be entered in a variety of ways, such as by pressing a START button (not shown) on the input device 135 of Figure 1.
  • the user is prompted to select a task, which in turn can execute a chain of tasks.
  • Tasks can be presented in a number of ways. For example, a user can be presented with a menu of tasks on the display device 130, any number of which can be selected by, for example, entering a selected task on the input device 135.
  • a user can type in the name of a script (e.g., phone.pls in the array cell 244 of Figure 2) and its associated parameters (e.g., "10102504085551212" in the array cell 246 in Figure 2).
  • the Enrollment Program checks whether the selected task has already been associated with a fingerprint image. The Enrollment Program can do this by, for example, parsing the column 204 in Figure 2 to see whether any of the tasks stored there correspond to the task selected in the step 605.
  • the Enrollment Program proceeds to the step 615; otherwise the Enrollment Program proceeds to the step 620.
  • the Enrollment Program prompts the user whether she would like to change the mapping (association) currently stored in the fingerprint array 200.
  • the system of the present invention disallows two fingerprint images from mapping to the same task. It will be appreciated, however, that two fingers can be mapped to the same task. If the user would not like to remap the action already stored in the fingerprint array 200, the Enrollment Program proceeds to the step 605; otherwise, the Enrollment Program proceeds to the step 620.
  • the user is prompted to select a finger that she will later swipe and whose image will be mapped to the selected task.
  • the user can be prompted in a variety of ways. For example, she can be presented with a screen image (similar to column 201 in Figure 4) of fingers and a means (e.g., mouse click or keypad entry) for selecting one of the fingers presented.
  • the user swipes (or places) her finger on the fingerprint sensor 115, and in the step 635 the Enrollment Program checks whether the fingerprint image quality is acceptable. It will be appreciated by those skilled in the art that this can be accomplished in many ways, such as by ensuring an adequate number of dark ridges.
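  • The enrollment flow of Figure 6 can be sketched as follows; the store layout, the stand-in sensor capture, and the duplicate-task handling (a simple refusal rather than a remap prompt) are simplifying assumptions:

```python
def enroll(task_name, parameters, capture_template, store):
    """Steps 605-635: pick a task, check for an existing mapping, capture the
    chosen finger, and record the association in the mapping store."""
    if any(entry["task"] == task_name for entry in store.values()):
        raise ValueError(f"{task_name} is already mapped to a finger (step 610)")
    template = capture_template()            # steps 625/630: swipe the chosen finger
    if template is None:                     # step 635: image quality unacceptable
        raise RuntimeError("fingerprint image quality unacceptable; swipe again")
    store[template] = {"task": task_name, "parameters": parameters}

# Example with stand-ins for the sensor read and an in-memory store:
store = {}
enroll("phone.pls", ["10102504085551212"],
       capture_template=lambda: "template-right-thumb", store=store)
print(store)
```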
  • FIG. 7 is a screen shot 700 of a display showing the relationship between fingers (and thus fingerprint images) and the corresponding tasks. Embodiments of the present invention can display the screen shot 700 to remind a user of the tasks associated with each finger.
  • the screen shot 700 shows that when the user swipes her right pinkie finger on the fingerprint sensor 115, a Web browser is automatically launched; when she swipes her right ring finger, an e-mail program is automatically launched; when she swipes her middle finger, a calculator program is automatically launched; when she swipes her right index finger, a word processor is automatically launched; and when she swipes her right thumb, a telephone number is automatically dialed.
  • Figure 8 illustrates the relationship between the elements of a fingerprint control system 800 comprising a fingerprint reader 845, an initialization program 825, a fingerprint array 835, and a control program 840.
  • the initialization program 825 comprises an Enrollment engine 810, a Task Definition engine 820, and a Task Configuration engine 815.
  • the fingerprint array 835 contains a column of 10 fingerprint images labeled A through J (also referred to as row A through row J, respectively). Each fingerprint image has associated with it a task chain.
  • the task chain corresponds to multiple tasks that are executed by the control program 840 when a fingerprint image read by the fingerprint reader 845 matches a fingerprint image stored in the fingerprint array 835.
  • read fingerprint data (e.g., minutiae points)
  • the row A has associated with it the chain of tasks (e.g., executable files or scripts) A1, A2, and A3.
  • Each task A1, A2, and A3 can have one or more parameters associated with it. The result of each task depends on the values of these parameters.
  • Figure 8 shows a linear (one-dimensional) task chain A1, A2, and A3, such that task A1 is performed, followed by task A2 and then task A3; tasks can also be performed in parallel.
  • a first task A1 and a second task A11 can be performed in parallel, thus allowing for the performance of tasks in a two-dimensional manner.
  • tasks can be performed linearly (e.g., sequentially) and in parallel (e.g., concurrently).
  • tasks can be executed in many dimensions.
  • One such embodiment is illustrated in Figure 10.
  • a task in a task chain can be performed when called (e.g., launched) by a control program, when called by another task (e.g., program performing a task) in the task chain, or in any other manner.
  • Figure 8 also shows two hands having associated fingers and their corresponding fingerprint images 860A through 860J.
  • the fingerprint image corresponding to the finger 860A is stored in the fingerprint array 835 at the location A; the fingerprint image corresponding to the finger 860B is stored in the fingerprint array 835 at the location B; etc.
  • when the user chooses to enroll a fingerprint image into the fingerprint control system 800, the Enrollment engine 810 is executed.
  • the Enrollment engine 810 processes a fingerprint image captured by the fingerprint scanner 845 and passes the fingerprint image, a pointer to a memory location containing the fingerprint image, or any other fingerprint image identifier to the Task Configuration engine 815.
  • a Task Definition (i.e., associated tasks or programs)
  • the user types tasks using an input device such as the input device 135 shown in Figure 1.
  • the Task Configuration engine 815 then stores the fingerprint image identifier (e.g., a fingerprint image, fingerprint minutiae points, or a pointer to fingerprint minutiae points) into the first column of a row in the Fingerprint Array 835 (e.g., A) and then stores the task chain (e.g., action sequence) and their associated parameters in the remaining elements in the row (e.g., A1, A2, and A3).
  • the fingerprint image can have any number of tasks in its respective task chain.
  • the fingerprint image B has four tasks B1-B4 in its task chain; the fingerprint image C has one task C1 in its task chain; etc.
  • Each task in a task chain has any number of optional parameters. Parameters can include those used to perform a task (e.g., execute a program) regardless of the fingerprint data that are matched, or those used when particular fingerprint data are read (e.g., user profile data), or any combination of these.
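  • One row of the fingerprint array 835 can be executed with a sketch like the one below; the program names and parameters are illustrative, and running each entry to completion before the next is one of the sequencing options described above:

```python
import subprocess

def run_chain(chain):
    """Run one task chain (e.g. A1, A2, A3) in order; each entry is a
    (program, parameters) pair, and the parameter list may be empty."""
    for program, parameters in chain:
        subprocess.run([program, *parameters], check=True)   # wait, then move on

# A hypothetical chain for row A: download a file, then e-mail it onward.
chain_A = [("ftp.pls", ["http://www.site.com"]),
           ("email.pls", ["joe@yahoo.com"])]
# run_chain(chain_A)   # would require these scripts to exist on this system
```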
  • the fingerprint array 835 and the initialization program 825 can both reside in a single memory storage, such as the storage 120 in Figure 1.
  • the fingerprint array 835 and the initialization program 825 can reside on different storage devices at different locations.
  • the initialization program 825 can reside on an electronic device (not shown) and the fingerprint array 835 can be coupled to the electronic device over a network.
  • the initialization program 825 is not required after the set up process.
  • fingerprint images are stored on a storage device in encrypted form, thus adding an additional level of security. It will be appreciated that other modifications can be made to this embodiment in accordance with the present invention.
  • the read fingerprint can be compared to decrypted stored fingerprint images, though such a modification will require multiple decryptions and thus may take longer.
  • Embodiments of the present invention can be used to launch any number of tasks.
  • the electronic device (e.g., platform)
  • the left index finger can launch a browser and bring up a web site, while the right middle finger can launch an email program.
  • the platform is a mobile phone
  • the middle finger can trigger the phone to speed dial the home phone number of the user.
  • the platform is a game device, the thumb can launch a particular game to play on the game device.
  • the platform is a television remote control
  • the index finger can launch a program to switch to a parent-controlled channel while the last finger will turn off the television.
  • the right thumb can perform the inherent function of turning the automatic coffee maker ON and then launch a program that causes the coffee maker to brew a cup of cappuccino; if the right index finger is swiped, the program can direct the coffee maker to prepare a cup of mocha.
  • the platform is an MP3 player, swiping the right index finger will direct the MP3 player to play the second song on a disc.
  • an electronic device is part of a high security environment, a user will normally swipe his left index finger to pass a security checkpoint. If he is under duress or danger, he will swipe his left middle finger to pass the security checkpoint and also indicate a potential danger.
  • any number of electronic devices can be controlled with embodiments of the present invention.
  • the present invention is independent of device types and operating systems. It can work with all types of fingerprint authentication solutions and all types of finger image sensors (e.g., capacitive, optical, thermal, pressure, etc.), swipe or placement.
  • the present invention can also be executed in specialized hardware or firmware instead of in software. Tasks or parts of a task can be implemented in software while other portions are implemented in firmware, hardware, software, or any combination of these.
  • the control program can launch multiple tasks that run concurrently.
  • Figure 9 illustrates the steps 900 taken by a parent program (e.g., process) in accordance with one embodiment of the present invention.
  • the parent program initializes any parameters used to control the multiple processes, such as process identifiers (PIDs).
  • PIDs process identifiers
  • the parent program waits for a finger swipe on a fingerprint sensor.
  • the parent program can, for example, wait for a swipe event, be interrupted as part of an interrupt service routine triggered when a finger is swiped on a fingerprint sensor, or be triggered in other ways.
  • the parent program spawns a child process or thread, passing it the fingerprint image identifier, such as a fingerprint image or the address of a memory location containing the fingerprint image.
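  • A hedged sketch of the Figure 9 parent loop, using Python threads and an in-process event queue as stand-ins for the swipe interrupt and the fingerprint image identifier:

```python
import queue
import threading

def parent_loop(swipe_events, handle_swipe):
    """Figure 9 loop: wait for a swipe event, then spawn a child thread that
    receives the fingerprint image identifier and runs the matched task chain,
    so a long-running chain does not block the next swipe."""
    while True:
        image_id = swipe_events.get()            # block until a finger is swiped
        if image_id is None:                     # sentinel used here to stop the loop
            break
        child = threading.Thread(target=handle_swipe, args=(image_id,), daemon=True)
        child.start()                            # child matches and executes concurrently

# Example: feed two synthetic swipe events through the loop.
events = queue.Queue()
for item in ("template-right-thumb", "template-right-index", None):
    events.put(item)
parent_loop(events, handle_swipe=lambda image_id: print("handling", image_id))
```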
  • FIG. 10 shows a graph 920 illustrating how one task can automatically launch a plurality of tasks, each task being a part of a task chain, in accordance with one embodiment of the present invention.
  • the associated task 925 launches the tasks (also referred to as sub tasks) 930A, 940A, 950A, and 960A.
  • Each task 930A, 940A, 950A, and 960A can have any number of associated parameters (e.g., user profile data) that can be dependent on the stored fingerprint data.
  • the task 930A can have a first element that corresponds to a script (e.g., 214 in Figure 2) and a second element that corresponds to the parameters used by the script (e.g., 216 in Figure 2).
  • the tasks 930A, 940A, 950A, and 960A are launched concurrently and can execute in parallel.
  • the task 940A in turn calls the task 940B, which in turn calls the task 940C.
  • the tasks 940B and 940C also both have optional parameters.
  • the task 940A can call the task 940B just before the task 940A completes so that the tasks 940A and 940B are not executing in parallel.
  • the task 940A can call the task 940B at another time, so that the tasks are executing in parallel.
  • the tasks 940A and 940B, as well as the other tasks illustrated in Figure 10, can also be coordinated so that they can share data.
  • the task 940A can read stock data on a Web page, and then pass the stock data to the task 940B, which includes the data in an e-mail to a user, thereby notifying the user of changes in stock prices.
  • the task 950A calls the task 950B.
  • the task 960A calls the task 960B, which calls the task 960C, which calls the task 960D.
  • the tasks 925 and 930A are said to form a first task chain
  • the tasks 925, 940A, 940B, and 940C are said to form a second task chain
  • the tasks 925, 950A, and 950B are said to form a third task chain
  • the tasks 925, 960A, 960B, 960C, and 960D are said to form a fourth task chain.
  • one or more of the tasks can each be a parent task, launching one or more parallel tasks such as done by 925. In this way, an n-dimensional chain can be formed with concurrently executing tasks.
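  • The chained and concurrent execution of Figure 10 can be approximated with the following sketch; the stock-quote and e-mail tasks echo the example above, and the data values and task bodies are placeholders:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_stock_quote(symbol):            # stands in for task 940A
    return f"{symbol}: 123.45"            # placeholder data, not a real feed

def email_quote(quote):                   # stands in for task 940B
    print("emailing:", quote)

def chain_940():
    email_quote(fetch_stock_quote("XYZ")) # 940A's output is passed to 940B

def chain_930():
    print("running task 930A")            # a one-task chain

# The parent task (925) launches the chains so that they run concurrently.
with ThreadPoolExecutor() as pool:
    pool.submit(chain_930)
    pool.submit(chain_940)
```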
  • the electronic device 100 can have separate profiles for each user.
  • Each profile will consist of the enrolled fingerprint data (e.g., minutiae points) and the mapping information (e.g., associated tasks and their parameters) of the user. Different users enroll their fingers and associated tasks and parameters. Since fingerprint authentication is technically reliable, a fingerprint image from a user will not trigger the performance of a task associated with the fingerprint image of another user. It will be appreciated that the fingerprint array 200 can hold fingerprint images from one person or more than one person.
  • fingerprint control system in accordance with the present invention, with each person's fingerprints mapping to particular tasks.
  • one person's right index fingerprint image can map to the task of opening a Web browser and a second person's right index fingerprint image can map to the task of automatically logging in to the second person's e-mail account.
  • fingerprint data do not have to be stored in user profiles.
  • fingerprint images of all users are stored together in a single file.
  • one or more fingerprint arrays can be used to store fingerprint images and their mapping information, the associated tasks and parameters.
  • each user has a corresponding fingerprint array.
  • all subsequent fingerprint images are compared to fingerprint images stored in her corresponding fingerprint array.
  • fewer fingerprint images (e.g., entries in the fingerprint array)
  • all of the fingerprint images are stored in a single fingerprint array. In this embodiment, users do not have to log in and out.
  • the captured fingerprint image is compared against all of the fingerprint images in the single fingerprint array to determine the best match.
  • ten tasks can be performed by a single user, one for each finger.
  • Another embodiment expands the number of action sequences by mapping permutations of fingerprint images. If two fingerprint images are required to start an action sequence, then the maximum number of tasks that can be performed is 10² or 100. The two images may come from the same finger or different fingers of the same person.
  • a user may define the sequence of fingerprint images for her left small finger followed by her right small finger to map to the action sequence that turns OFF the electronic device.
  • the associated tasks will be appropriately mapped to allow users to link and map multiple finger images to an action sequence.
  • the number of action sequences equals 10ⁿ, where n is the required number of finger images to start a task. It will be appreciated that in accordance with the present invention, tasks can be nested using combinations or permutations of fingerprint images to perform a particular task.
  • a fingerprint control system in accordance with the present invention can associate the combination of swiping the user's ring finger followed by the swiping of her thumb to perform a certain task.
  • the combination of the ring finger and the thumb, in any order, can thus be used to perform the task.
  • the fingerprint control system can associate permutations of finger swipes to unique tasks.
  • the swiping of a ring finger followed by a thumb can associate with (e.g., trigger) one task, but the swiping of a thumb followed by a ring finger can associate with a second task, different from the first. It will be appreciated that any level of nesting can be used.
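  • A small sketch of mapping ordered pairs of swipes (permutations) to tasks, with an order-insensitive combination lookup alongside it; the finger names and task names are illustrative:

```python
# Ordered sequences (permutations): reversing the order triggers a different task.
permutation_map = {
    ("right-ring", "right-thumb"): "open_document_vault",
    ("right-thumb", "right-ring"): "lock_document_vault",
    ("left-pinkie", "right-pinkie"): "power_off_device",
}

# Unordered pairs (combinations): the order of the two swipes does not matter.
combination_map = {
    frozenset({"right-ring", "right-thumb"}): "open_document_vault",
}

def task_for_permutation(swipes):
    return permutation_map.get(tuple(swipes))

def task_for_combination(swipes):
    return combination_map.get(frozenset(swipes))

print(task_for_permutation(["right-ring", "right-thumb"]))   # open_document_vault
print(task_for_combination(["right-thumb", "right-ring"]))   # open_document_vault
```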
  • a fingerprint sensor is configured to detect the direction of a fingerprint swipe. Most swipe sensors require that a finger be swiped in a direction perpendicular to the length of the swipe sensor surface.
  • the swipe sensor can capture fingerprint images in either swipe direction, and can tell which direction a finger has been swiped.
  • Using a direction-sensitive swipe sensor, a finger can map to two different tasks or action sequences, depending on the swipe direction, thus increasing the number of tasks that can be mapped to a user's set of fingerprint images.
  • two fingerprint images can map to four tasks.
  • Ten fingerprint images can map to twenty tasks.
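  • With a direction-sensitive sensor, the lookup key simply grows to include the swipe direction, doubling the available slots; the entries below are illustrative:

```python
direction_map = {
    ("right-index", "down"): "launch_web_browser",
    ("right-index", "up"):   "close_web_browser",
}

def task_for_swipe(finger, direction):
    """Ten fingers and two directions give up to twenty distinct mappings."""
    return direction_map.get((finger, direction))
```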
  • the same fingerprint image is mapped to different tasks based upon the context.
  • the context can be defined as the application that is currently active and occupying the Desktop.
  • a fingerprint image read when a text editor is active may launch the task of adding a signature at the cursor, but the same fingerprint image read while a financial software application is running may launch the task of authorizing a transaction.
  • a mobile phone program can be launched to automatically dial the corresponding phone number.
  • detecting the same fingerprint image will emit a series of DTMF tones that represent the calling card numbers. It will be appreciated that in this embodiment, the electronic device must support at least two contexts.
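  • Context-dependent mapping amounts to keying the lookup on both the matched finger and the currently active application; the application and task names below are assumptions for illustration:

```python
context_map = {
    ("right-index", "text_editor"):  "insert_signature_at_cursor",
    ("right-index", "finance_app"):  "authorize_transaction",
    ("right-index", "phone_dialer"): "send_calling_card_dtmf",
}

def task_in_context(finger, active_app):
    """Return the task mapped to this finger in the currently active application."""
    return context_map.get((finger, active_app))

print(task_in_context("right-index", "finance_app"))   # authorize_transaction
```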
  • swiping a finger launches the combined tasks of granting a user access to the electronic device and launching another application program.
  • a user who is computer illiterate or unable to type can start a computer and launch different applications depending on which finger is used for authentication.
  • the user may use her left index finger to start a computer with a browser pointing to the news, or she may use her right thumb to open the same computer and start with online radio.
  • each fingerprint image corresponds to a separate start-up profile of the user.

Abstract

A device for and method of performing a task associated with fingerprint data is disclosed (120). The method comprises reading fingerprint data (115), matching the read fingerprint data to stored fingerprint data (120), the stored fingerprint data having an associated task (125), and performing the task. The associated task is part of a chain of tasks that are automatically executed when the read fingerprint data (115) is matched to the stored fingerprint data (120). Preferably, at least one task has associated user profile data that is used to perform the task. An electronic device that can be controlled in accordance with the present invention includes, but is not limited to, a personal computer, a personal digital assistant, and a remote controlled device.

Description

SYSTEM FOR AND METHOD OF FINGER INITIATED ACTIONS
Related Application This application claims priority under 35 U.S.C. § 119(e) of the co-pending U.S. provisional application Serial Number 60/540,950, filed on January 29, 2004, and titled "SYSTEM FOR AND METHOD OF FINGER INITIATED ACTIONS." The provisional application Serial Number 60/540,950, filed on January 29, 2004, and titled "SYSTEM FOR AND METHOD OF FINGER INITIATED ACTIONS" is hereby incorporated by reference.
Field of the Invention This invention relates to device controllers. More specifically, this invention relates to device controllers that use fingerprints to automatically perform tasks on an electronic device.
Background of the Invention Electronic computing platforms are used in every aspect of our daily lives. Devices such as personal computers, personal digital assistants (PDAs), mobile phones, portable game consoles, remote controls, and digital cameras all provide multiple functions and services on which we depend heavily. To use services available on these platforms, however, a user typically must make a series of inputs or selections. For example, a personal computer user must find the icon of an application and double click on it to launch the application; a mobile phone user must enter the destination phone number and then press the send button to make a phone call; an e-mail user must enter her user name and password to access her e-mail account. When the same services are used or the same phone numbers are dialed every day, users often look for more convenient solutions. Shortcuts are used to reduce the number of steps and to simplify the steps required by a user to perform a task on an electronic device. For example, shortcut links are placed on the desktop of a personal computer; PDA's have a shortcut button to launch a calendar application; mobile phones support speed dial phone numbers. To many users, even shortcuts are not convenient enough. Short cuts on the desktop of a personal computer still require the user to move a pointer to select and launch a software application. The speed dial function of mobile phones still requires users to push at least 2 or 3 buttons. Users often forget the sequence of keystrokes or mouse clicks that provide a short cut. Moreover, these keypads and mice require valuable space, a disadvantage especially on portable electronic devices. Accordingly, what is needed is a system for and a method of executing tasks on an electronic device using a minimal footprint. What is also needed is a system for and a method of executing tasks on an electronic device using a minimal amount of entries, such as keystrokes or button presses. What is also needed is a system for and a method of authenticating a user before allowing him or her to perform tasks on an electronic device, using a minimal number of entries.
Brief Summary of the Invention A fingerprint control system provides a method of and system for reading a biometric image and performing an associated task having a chain of tasks. The task can include launching a computer program, such as an executable file, a macro, and a script; executing a function on an electronic device; or any combination of these. In a first aspect of the present invention, a method of performing a task on an electronic device comprises matching read biometric data to stored biometric data having a corresponding task comprising a first chain of tasks and automatically performing the corresponding task on the electronic device. Preferably, the biometric data comprises fingerprint data. The corresponding task is selected from a plurality of tasks that the electronic device is configured to perform. The corresponding task has optional parameters. In one embodiment, automatically performing a task comprises retrieving user profile data corresponding to a user and the task, and using the user profile data to perform the task. The user profile data comprises a telephone number and the corresponding task comprises dialing the telephone number. Alternatively, the user profile data comprises login information for accessing a host system and the corresponding task comprises transmitting the login information to the host system. The corresponding task comprises accessing a resource over a network such as a local area network or the Internet. A task in the first chain of tasks comprises encrypting the login information before transmitting the login information to the host system. The corresponding task comprises an interactive task. In one embodiment, performing the corresponding task comprises executing a computer game, remotely controlling a remote-controlled system, or performing a non-inherent function of the electronic device. Preferably, the corresponding task further comprises a second chain of tasks. In one embodiment, the first chain of tasks and the second chain of tasks are performed in parallel. In one embodiment, the method further comprises reading biometric data. Reading biometric data comprises reading data captured during a finger placement on a fingerprint image sensor. Alternatively, reading biometric data comprises reading data captured during a finger swipe over a fingerprint image sensor. In one embodiment, matching biometric data to stored biometric data comprises identifying a direction of the finger swipe, a first direction having the corresponding task and a second direction having a different corresponding task. Matching read biometric data to stored biometric data comprises determining whether a threshold number of points of read fingerprint data coincide with a number of points of stored fingerprint data. Preferably, the electronic device is portable, such as a personal digital assistant, a telephone, or any other hand-held device. Alternatively, the electronic device comprises a personal computer, a remote controller, a security system, a television set, an audio player, a game device, or any combination of these. In another embodiment, the stored biometric data is an ordered set of biometric data, such as a permutation of biometric data or a combination of biometric data. The read biometric data must also be a permutation or a combination of biometric data in order to match the stored biometric data. In another embodiment, the corresponding task relates to a context of an application executing on the electronic device.
The stored biometric data corresponds to one set of stored biometric data from a plurality of sets of stored biometric data from a plurality of users. In a second aspect of the present invention, an electronic device comprises (a) a biometric sensor for reading biometric data; (b) a memory storing a plurality of stored biometric data each having a corresponding task identifier used to perform a corresponding task, at least one task comprising a chain of tasks; and (c) a processor coupled to both the biometric sensor and the memory, the processor configured to match read biometric data with stored biometric data and to automatically perform a corresponding task on the electronic device. Preferably, the biometric sensor comprises a fingerprint image sensor, such as a placement sensor or a swipe sensor. In one embodiment, the swipe sensor is configured to detect a direction of a swipe, a first direction having a first task identifier and a second direction having a second task identifier. In one embodiment, the memory is further configured so that a task identifier also has user profile data corresponding to a user and the task. The task identifier has optional parameters. The corresponding task relates to a context of an application executing on the electronic device. In one embodiment, the electronic device further comprises a telephone operatively coupled to the processor. The user profile data comprises a telephone number and the corresponding task comprises dialing the telephone number on the telephone. The electronic device further comprises a link to a network, such as the Internet. The user profile data comprises a resource locator, and the corresponding task comprises connecting the electronic device to a host identified by the resource locator and accessible over the network. In one embodiment, the link comprises a wireless transmitter. The fingerprint image sensor comprises a thermal sensor, an optical sensor, a pressure sensor, or a capacitive sensor. Preferably, the processor is configured to execute two tasks in parallel. In a third aspect of the present invention, a method of initializing an electronic device comprises reading biometric data, storing the biometric data, and mapping the stored biometric data to a chain of tasks that are automatically performed on an electronic device.
Brief Description of the Several Views of the Drawings Figure 1 is a block diagram of a fingerprint control system in accordance with one embodiment of the present invention. Figure 2 shows an array used to store fingerprint images and to associate each fingerprint image with a task and the task's parameters in accordance with one embodiment of the present invention. Figure 3 is a flow chart illustrating the steps executed by a fingerprint control system in accordance with one embodiment of the present invention. Figure 4 is a flow chart illustrating the steps executed by a fingerprint control system to perform the task of automatically logging into an e-mail server in accordance with one embodiment of the present invention. Figure 5 is a flow chart illustrating the steps executed by a fingerprint control system to perform the task of automatically dialing a telephone number in accordance with one embodiment of the present invention. Figure 6 is a flow chart illustrating the steps used to enroll a fingerprint image in a fingerprint control system and to associate a task with the fingerprint image in accordance with one embodiment of the present invention. Figure 7 is a screen shot of a display screen schematically illustrating a plurality of hands each having a highlighted finger and the associated task for each highlighted finger. Figure 8 is a schematic diagram illustrating relationships between engines used to set up and run a fingerprint control system in accordance with the present invention. Figure 9 is a flow chart illustrating the steps used to spawn child processes so that multiple tasks can be performed concurrently. Figure 10 is a graph showing a parent task that automatically spawns child tasks, each child task being part of a chain of tasks.
Detailed Description of the Invention In accordance with the present invention, a fingerprint image can be used to initiate one or more associated tasks on an electronic device. In one embodiment, the electronic device reads a fingerprint image and performs a specific task associated with the fingerprint image. Thus, for example, if the electronic device is a personal computer (PC), when a user places or swipes her index finger on a fingerprint sensor coupled to or part of the PC, the PC launches a web browser; when the user places or swipes her thumb on the fingerprint sensor, the PC connects to the e-mail account of the user; when the user places or swipes her ring finger on the fingerprint sensor, the PC runs a calculator program. Thus, each fingerprint image can be used to perform tasks on an electronic device. Likewise, a pair of fingerprint images can be used to perform a task on the electronic device. If a single finger is used, ten unique tasks can be performed. If a pair of fingerprint images is used, up to one hundred unique tasks can be performed. As used herein, a task is any operation on an electronic device. Thus, the performance of a task includes, but is not limited to, (1) the execution of an inherent function of an electronic device, such as copying on a photocopier; (2) the launching of a program, such as a Web browser, e-mail program, or electronic calculator, that is not specific or inherent to a particular electronic device, including interactive programs; (3) any performance of one or more steps on the electronic device to perform any operation; or (4) any combination or permutation of the above. Thus, a task can include the inherent function of powering ON an electronic device, launching a Web browser to connect to a remote host machine, and automatically transmitting a user name and password to the host machine to log on to the host machine. The words "tasks" and "actions" are used interchangeably herein. As used herein, the term "computer program" refers to executable files, scripts, macros, and any sequence of instructions that can control the performance of tasks on an electronic device. Embodiments of the present invention thus advantageously (1) reduce the number of entries that a user must make to perform a task: rather than entering numerous keystrokes to connect to an e-mail account, a single finger placement or swipe can accomplish the same task; (2) reduce the footprint of an electronic device, since keypads, function buttons, joy sticks, and mice can be replaced with a fingerprint image sensor; (3) reduce the complexity of user interfaces, since a user does not have to remember keystrokes, logon information, or other information needed to perform a task; and (4) increase the security of electronic devices since, using a fingerprint, a user can be authenticated each time she requests that a task be performed, so that the authentication and the performance of the task can be accomplished using a single finger swipe or placement. Embodiments of the present invention accomplish any one or more of these results. Figure 1 is a schematic diagram of a fingerprint control system 100 in accordance with the present invention. The fingerprint control system 100 comprises a central processing unit 110 electronically coupled to a fingerprint sensor 115, a storage unit 120, a display unit 130, and an input unit 135.
The storage unit 120 comprises a mapping store 125 containing fingerprint mapping information, such as the task information and associated parameters described in more detail below. It will be appreciated that the fingerprint control system 100 can be a stand-alone system comprising an electronic device or can be coupled to an external electronic device (not shown) and used to control the external electronic device as described below. The fingerprint sensor 115 is used to read the image of a fingerprint placed upon it. The fingerprint sensor 115 can be a placement sensor or a swipe sensor. Placement sensors require a user to place her finger on a reading surface of the fingerprint sensor until a fingerprint image is captured. Placement sensors are designed to actively sense the entire surface of a finger at once. Placement sensors can be based on optical, thermal, pressure, electrical, or other sensing means. In general, placement sensors are designed to have a reading surface with an area as large as the pad of a typical human finger, typically 15 mm2. It will be appreciated that the area of the reading surface in accordance with the present invention can be larger or smaller than 15 mm2. Alternatively, the fingerprint sensor 115 can be a swipe sensor. In general, swipe sensors are fully sized in one direction (typically in width) but abbreviated in the other (typically in height). Swipe sensors are thus configured to sense only a small rectangular portion of a finger at any one time. To capture a fingerprint image, a user needs to swipe his finger over the sensor. Swipe sensors are especially suitable for portable devices because they are smaller than placement sensors. Methods of and systems for fingerprint sensing are described in detail in the U.S. Patent Application Serial Number 10/194,994, filed July 12, 2002, and titled "Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans," and in the U.S. Patent Application Serial Number 10/099,558, filed March 13, 2002, and titled "Fingerprint Biometric Capture Device and Method with Integrated On-Chip Data Buffering," both of which are hereby incorporated by reference in their entireties. In the preferred embodiment, the fingerprint sensor is an ATW100 capacitive swipe sensor by Atrua Technologies, Inc., at 1696 Dell Avenue, Campbell, California 95008. It will be appreciated that any sensor technology can be used in accordance with the present invention. The input device 135 is used to input information, such as task information, into the fingerprint control system 100. For example, as described in more detail below, an administrator or user may use the input device to enter a user name and password associated with a fingerprint image. Thus, for example, when a user places her thumb on the fingerprint sensor 115, a Web browser is opened, a connection is made to an e-mail server, and the user name and password are automatically transmitted to the e-mail server so that an e-mail session is automatically initiated for the user. The input device can be used to type in the task information associated with her thumbprint. Such task information can include (1) parameters such as a user name and password and (2) the name or memory address of one or more programs that launch a Web browser, connect to an e-mail server, and transmit the user name and password combination.
The input device 135 can be any type of device for inputting information including, but not limited to, a keyboard, a mouse, a touch screen, and a port for receiving information from an electronic device such as a personal digital assistant (PDA). When entering the task information, such as a user name and password, into the fingerprint control system 100, the typed information can be displayed on the display unit 130. It will be appreciated that the input device 135 and the display unit 130 are typically used only when the fingerprint control system 100 is being initialized, such as described in Figure 6, and thus both are optional and can be absent when the fingerprint control system 100 is used in normal operation, such as performing a task. The storage unit 120 comprises a fingerprint mapping store 125 used to store fingerprint mapping information. As described in more detail below, the fingerprint mapping information comprises a plurality of fingerprint images, their associated tasks, and the parameters used by the tasks. Thus, when the fingerprint sensor 115 reads a fingerprint image (the read fingerprint image), a control program executed by the CPU 110 compares the read fingerprint image to fingerprint images stored in the fingerprint mapping store 125. These stored fingerprint images are referred to as "enrolled fingerprint images" or "stored fingerprint images." When the control program matches the read fingerprint image to a stored fingerprint image, the CPU 110 will execute the tasks associated with the matched stored fingerprint image, using the associated parameters, if any. In a preferred embodiment, the fingerprint mapping information is stored in a fingerprint array (table) where the index or key is a fingerprint image and the associated task is the value for the array. Figure 2 shows one embodiment of a fingerprint array, described in more detail below. Using a fingerprint array, the control program can quickly associate a fingerprint image with its corresponding task. The electronic device to be controlled by the fingerprint control system 100 can include the same hardware as the fingerprint control system 100. Thus, for example, if the electronic device is a PC, the PC can also comprise one or more of the CPU 110, the fingerprint sensor 115, the storage 120, the display 130, and the input device 135. The CPU 110 can execute fingerprint operations as well as other application programs. Similarly, if the electronic device is a telephone, the fingerprint control system 100 can reside on and form part of the telephone. Thus, the fingerprint sensor 115 can be integrated into a telephone case, the display 130 can be a telephone liquid crystal display unit, and the input 135 can be the telephone keypad. Alternatively, the fingerprint control system 100 can reside on hardware separate from the electronic device it controls. Thus, for example, if the electronic device is a coffee maker, the fingerprint control system can be electronically coupled to the coffee maker. Thus, a user can use the fingerprint control system 100 to launch a program, and the program can send control signals to the coffee maker, thereby controlling the coffee maker. It will be appreciated that the fingerprint control system 100 can be coupled to the electronic device in any number of ways, depending on the application at hand. The fingerprint control system 100 can be directly coupled to the electronic device using control wires.
Alternatively, the fingerprint control system 100 can be coupled to the electronic device over a local area network, using for example the Ethernet protocol, or over a wide area network, using, for example, TCP/IP. Also, the fingerprint control system 100 can be wirelessly coupled to the electronic device using radio or infrared signals. It will also be appreciated that the electronic device can also be coupled to a remote host using, for example, a wireless transmitter or a wireless transceiver. Figure 2 illustrates one embodiment of a fingerprint array 200 in accordance with one embodiment of the present invention. The fingerprint array 200 comprises rows 210, 220, 230, 240, and 250 and columns 201, 204, and 206. It will be understood by those of ordinary skill in the art that rows and columns are representative only. Any other database technique can be used with equal success. The column 201 contains multiple fingerprint images 211, 221, 231, 241, and 251. The column 204 contains associated tasks (here, scripts or executable files). The column 206 contains any parameters for each associated task. In one embodiment, the parameters are input to the programs or scripts shown in the column 204. The row 210 contains the stored fingerprint image 211, its associated task 214, and the parameter list 216 used to perform the associated task 214. In this example, the associated task 214 contains the PERL script email.pls, which is used to access an e-mail account. Of course, any other software language can be used. The parameter list 216 contains a first element, "John Doe", and a second element, "password1". The performance of a task to access an e-mail account using the associated task 214 and its parameter list 216 is described in detail relative to Figure 4 below, which describes the steps performed to execute a task when a read fingerprint image matches the stored fingerprint image 211. The first element, "John Doe", and the second element, "password1", are one example of what are referred to as user profile data, that is, data specific to a fingerprint image (or other fingerprint data) and used to perform a task. The row 220 contains the stored fingerprint image 221 and its associated task 224. The empty cell 226 indicates that no parameters are required to perform the associated task 224. The associated task 224 comprises one operation executed by the Visual BASIC script calc.vbs, which launches a calculator, allowing a user to perform mathematical operations on the electronic device. Thus, when a read fingerprint image matches the stored fingerprint image 221, the control program launches (e.g., calls) calc.vbs, which displays a calculator GUI on a display device (e.g., 130, Figure 1). The user can then enter onto an input device (e.g., 135, Figure 1) the operands in a mathematical equation, which the calculator computes, displaying an answer. The row 230 contains the stored fingerprint image 231, its associated task 234, and the parameter 236 used to perform the associated task 234. The associated task 234 is performed by launching the executable file "Wordprocessor.exe". In this example, when a read fingerprint image matches the stored fingerprint image 231, the control program calls Wordprocessor.exe, which presents a user interface to a word processor, such as WordPerfect, Word, vi, or any other kind of word processor. The parameter 236 passed to the word processing program, "Font=Roman; size=12", can be used to automatically set the font style to "Roman" and the size (point) to "12".
The parameter 236 can be passed to the word processing program in a variety of ways. For example, the parameter 236 can be sent to the word processing program in a macro that is automatically called, by a template, by a style sheet, or by any other means. The row 240 contains the stored fingerprint image 241, its associated task 244, and the parameter 246 used by the associated task 244. The associated task 244 is performed by the PERL script phone.pls. The parameter 246 comprises the string "10102504085551212", designating a calling card code and a telephone number. Thus, when a read fingerprint image matches the fingerprint image 241, the control program calls phone.pls to launch a telephone program that dials the digits "10102504085551212". In accordance with one embodiment of the present invention, Figure 5 describes the tasks executed when a read fingerprint image matches the stored fingerprint image 241. The row 250 contains the fingerprint image 251, its associated task 254, and the parameter list 256 used by the associated task 254. The associated task 254 is performed by the PERL scripts ftp.pls and email.pls. The first element of the parameter list 256 comprises the string "http://www.site.com", and the second element of the parameter list 256 comprises the string "joe@yahoo.com". In this example, when a read fingerprint image matches the stored fingerprint image 251, the control program calls the program ftp.pls and passes it the first element of the parameter list 256, "http://www.site.com", to download a file from the Web site "http://www.site.com" using the file transfer protocol. The control program then attaches the file to an e-mail addressed to the e-mail address stored in the second element of the parameter list 256, "joe@yahoo.com", and then calls the script email.pls to send the e-mail. Thus, in this example, a file stored at a Web address and containing information that may be updated periodically can be automatically sent to a specific user. This example illustrates nested tasks, where one task can include the performance of multiple tasks, here the downloading of a file and the sending of the file by e-mail. In this example, the task performed by the script ftp.pls is said to be chained to the task performed by the script email.pls. Preferably, fingerprint data that uniquely identify a fingerprint image, rather than the entire fingerprint image, are stored in the fingerprint array 200. The fingerprint data will correspond to a subset of the entire fingerprint image, thus requiring less data storage. The fingerprint data can correspond, for example, to multiple minutiae points for a fingerprint. Thus, for example, when a fingerprint image is read by a fingerprint sensor, an extracted set of unique fingerprint data is stored in a template. It will be appreciated that any reference herein to fingerprint images will also correspond to fingerprint data. It will be appreciated that fingerprint images can be correlated to tasks using structures other than the fingerprint array 200. Rather than storing the fingerprint images 211, 221, 231, 241, and 251 in the column 201, other fingerprint identifiers that uniquely identify a fingerprint image can be used. For example, rather than storing fingerprint images in the array 200, pointers to fingerprint images can be stored. The fingerprint pointers can contain the address of fingerprint images stored elsewhere in the storage 120 (Figure 1) or at other locations, as described below.
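For illustration only, the contents of the fingerprint array 200 of Figure 2 can be modeled as a small associative structure keyed by a fingerprint identifier; the key strings below merely stand in for enrolled fingerprint images or minutiae templates, and the structure itself is an assumption of this sketch rather than a detail taken from the specification.

```python
# A minimal sketch of the Figure 2 mapping: each enrolled fingerprint
# identifier maps to a task chain (script or executable names) and the
# parameters used by that chain. The entries mirror rows 210-250.
fingerprint_map = {
    "print_211": {"tasks": ["email.pls"], "params": ["John Doe", "password1"]},
    "print_221": {"tasks": ["calc.vbs"], "params": []},
    "print_231": {"tasks": ["Wordprocessor.exe"], "params": ["Font=Roman; size=12"]},
    "print_241": {"tasks": ["phone.pls"], "params": ["10102504085551212"]},
    # Chained tasks (row 250): download a file, then e-mail it.
    "print_251": {"tasks": ["ftp.pls", "email.pls"],
                  "params": ["http://www.site.com", "joe@yahoo.com"]},
}

def lookup(fingerprint_id):
    """Return the task chain and parameters enrolled for a fingerprint identifier."""
    return fingerprint_map.get(fingerprint_id)
```

With such a structure, the control program can retrieve the task chain and its parameters in a single lookup once a read fingerprint has been matched to an enrolled identifier.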
In addition, fingerprint images, associated tasks, and their parameters can be stored in other data structures, such as a hash (associative array), with each fingerprint image as a key and the corresponding tasks and parameters as values. Alternatively, the fingerprint images and corresponding tasks and parameters can be stored in a database. Many structures can be used, depending on the application at hand. For example, if the electronic device is a PC, then a database and its associated database management system can be used in conjunction with the control program. If the electronic device is smaller, such as a PDA, and has less memory, an array can be used. Figure 3 is a flow chart illustrating the steps 300 taken by a control program in accordance with the present invention and used to perform a task after a fingerprint image is read from the fingerprint sensor 115 of Figure 1. Referring to Figures 1 and 3, in the start step 301, the fingerprint sensor 115 and any data and variables used by the control program are initialized. Next, in the step 305, the user swipes or places her finger on the fingerprint sensor 115. Next, in the step 310, the control program determines whether the image quality of the read (scanned) fingerprint image is acceptable. If it is not, the user is prompted (for example, on the display device 130) to again swipe or place her finger on the fingerprint sensor 115. Alternatively, if an optional display is absent, the unit can beep or blink an LED to signal the user. The control program then loops back to the step 305. It will be appreciated that the control program can be configured to allow only a pre-determined number of unacceptable images to be scanned. After this limit is reached, the control program can alert the user that the fingerprint sensor 115 is dirty or damaged. If it is determined in the step 310 that the quality of the read fingerprint image is acceptable, the control program continues to the step 315, where it compares the read fingerprint image with the stored fingerprint images in the column 201 of the fingerprint array 200 of Figure 2. Fingerprint images can be compared sequentially or in an order weighted by the number of times a particular fingerprint was matched in earlier sessions. A stored fingerprint image (and thus a corresponding task) may match more often than others (e.g., it corresponds to a popular task) and can therefore be compared before the other stored fingerprint images. By comparing a read fingerprint image first with an often-accessed fingerprint image, the average number of comparisons on a fingerprint control system can be reduced. Next, in the step 320, the control program checks whether the read fingerprint image matches the current candidate stored fingerprint image. If a match is not found, the control program proceeds to the step 305. If a match is found (e.g., matching the read fingerprint image to the stored fingerprint image 211), the control program continues processing at the step 325. At the step 325, the control program retrieves the corresponding task (e.g., the script email.pls in the field 214) and the corresponding parameters (e.g., the parameter list 216). Next, in the step 330, the control program performs the corresponding task (e.g., calls or launches the script email.pls) using the corresponding parameters (e.g., the user name "John Doe" and the password "password1"). It will be appreciated that executing one task can comprise executing one or more tasks in a task chain. Finally, the control program ends.
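A compact sketch of this control loop is shown below, for illustration only; the quality check, the minutiae-overlap matcher, the threshold value, and the use of subprocess to launch the mapped scripts are all assumptions made for this example rather than details taken from the specification.

```python
import subprocess

MATCH_THRESHOLD = 40   # assumed minimum number of coinciding minutiae points
MAX_BAD_READS = 3      # assumed limit before warning that the sensor may be dirty

def best_match(read_minutiae, enrolled):
    """Return the enrolled identifier whose minutiae overlap the read data the
    most (steps 315-320), or None if no overlap reaches the threshold."""
    best_id, best_score = None, 0
    for print_id, minutiae in enrolled.items():
        score = len(read_minutiae & minutiae)      # count coinciding minutiae points
        if score > best_score:
            best_id, best_score = print_id, score
    return best_id if best_score >= MATCH_THRESHOLD else None

def control_loop(read_image, quality_ok, extract_minutiae, enrolled, fingerprint_map):
    """Sketch of the steps 300: read a print, check its quality, match it, then
    launch the mapped task chain. The callables passed in are assumptions."""
    bad_reads = 0
    while True:
        image = read_image()                       # step 305: finger swipe or placement
        if not quality_ok(image):                  # step 310: image quality check
            bad_reads += 1
            if bad_reads >= MAX_BAD_READS:
                print("Fingerprint sensor may be dirty or damaged")
                bad_reads = 0
            continue
        match_id = best_match(extract_minutiae(image), enrolled)
        if match_id is None:
            continue                               # no match: wait for another swipe
        entry = fingerprint_map[match_id]          # step 325: retrieve task and parameters
        for task in entry["tasks"]:                # step 330: perform the task chain
            subprocess.run([task, *entry["params"]])
```

Passing the sensor access, quality check, and minutiae extraction in as callables keeps the sketch independent of any particular sensor hardware.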
In accordance with one embodiment of the present invention, a "test" fingerprint image (also referred to as a "read fingerprint image" or a "candidate fingerprint image") is compared to the fingerprint images in a set of stored fingerprint images. The read fingerprint image is read when a user places or swipes her finger on a fingerprint sensor. The read fingerprint image may not exactly match any image in the set of stored fingerprint images. Therefore, the candidate fingerprint image is compared to all of the stored fingerprint images to compute a matching score, a number reflecting the number of matching minutiae points. The stored fingerprint image with the highest match score is considered the best match, as long as the score is above a predetermined match threshold. If it is not above this threshold, none of the stored fingerprint images is considered a match. The threshold is required because a security system otherwise cannot be sure whether an impostor is trying to fool it. In certain circumstances the cost of illicit access can be excessively high (hence the need for fingerprints in the first place). In low-security systems, the match threshold can be made artificially low, so that the best matching fingerprint image is always selected. In this way, fingerprint images that might not match with a higher threshold (due to noise, for instance) will be properly matched at the lower threshold, thereby reducing user inconvenience due to system error. Such an approach is only acceptable if the finger-initiated actions do not need high-security protection (e.g., the actions can also be started in other ways with more steps or keystrokes). Figure 4 illustrates the steps 400 taken to perform the task of automatically logging a user into her e-mail account so that the logon page is automatically bypassed and the command page is displayed. The steps 400 correspond to the step 330 in Figure 3 and are performed, for example, when a read fingerprint image matches the stored fingerprint image 211 in Figure 2. Referring to Figures 1 and 2, for this example, it is assumed that the stored fingerprint image 211 is read by the fingerprint sensor 115. First, in the Start step 401, the fingerprint sensor 115 and any data structures used by the control program are initialized. Next, in the step 410, a Web browser is automatically launched on the user's electronic device. A Uniform Resource Locator (URL) for her e-mail account is automatically input into the address field of the Web browser in the step 415. In the step 420, the home page for the user's e-mail server is received and displayed on the user's electronic device. Next, in the step 425, the user's login ID ("John Doe", the first element in the parameter list 216) and password ("password1", the second element in the parameter list 216) are automatically input into the appropriate fields of the e-mail home page. The SUBMIT command is then automatically invoked in the step 430. The user's mailbox can then be manipulated in the step 435, either manually, by the user, or automatically, by software in accordance with the present invention. The mailbox can be manipulated by, for example, displaying messages from the user's e-mail account, automatically composing a new e-mail message, or allowing the user to compose a new e-mail or delete an old one. The process ends in the step 440 when, for example, the user logs off from her e-mail account.
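For illustration only, the form-based login flow just described might be sketched as follows; the URL, the form field names "login" and "passwd", and the use of the third-party requests library are assumptions made for this example and are not taken from the specification.

```python
import requests

def auto_login(base_url, username, password):
    """Sketch of the Figure 4 flow: load the e-mail server's home page, submit
    the stored credentials, and return the logged-in session (step 435 can then
    manipulate the mailbox). Field names and URL paths are assumptions."""
    session = requests.Session()
    session.get(base_url)                            # steps 415-420: load the home page
    response = session.post(base_url + "/login",     # steps 425-430: fill fields and SUBMIT
                            data={"login": username, "passwd": password})
    response.raise_for_status()
    return session

# Example invocation using the user profile data stored for fingerprint image 211:
# session = auto_login("http://mail.example.com", "John Doe", "password1")
```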
It will be appreciated that a user can access her mailbox or e-mail account in accordance with the present invention in other ways. For example, the control program can call other software, such as the Messaging Application Programming Interface (MAPI), which allows a user to seamlessly manipulate e-mail accounts and mailboxes. The control program can invoke a Web browser and post an HTML form to automatically log on to an e-mail server to access her account. The user can also manipulate her e-mail account without using a Web browser, by using, for example, a command line such as is used in the UNIX environment. It will be appreciated that user names, passwords, and other secured information can be used in other environments in accordance with the present invention, such as with online banking or other online purchasing where credit card and other confidential information can be transmitted. Figure 5 illustrates the steps 500 used to perform the task of automatically dialing (i.e., speed dialing) a telephone number in accordance with the present invention. The steps 500 correspond to the step 330 in Figure 3. Referring to Figures 1 and 2, for this example, it is assumed that the fingerprint image 241 is read by the fingerprint sensor 115. The control program launches the PERL script phone.pls, passing it the parameter "10102504085551212". The PERL script phone.pls then executes the tasks displayed in the steps 500. First, in the start step 501, the fingerprint sensor 115 and any data structures used by the PERL script phone.pls are initialized. Next, in the step 505, the control program turns the electronic device (here, a telephone) ON. Next, in the step 510, phone.pls reads the parameter 246, the sequence of digits "10102504085551212" containing a calling card code and a telephone number. Next, in the step 520, phone.pls sequentially transmits the digits to a tone generator in the telephone, thus automatically dialing the calling card code and the telephone number. Next, in the step 530, the process ends. Figure 6 illustrates the steps 600 performed by an Enrollment Program to store fingerprint images and their associated tasks and parameters in the electronic device 100 (Figure 1) during a set-up process. The Enrollment Program starts in the START step 601. The START step 601 can be entered in a variety of ways, such as by pressing a START button (not shown) on the input device 135 of Figure 1. Next, in the step 605, the user is prompted to select a task, which in turn can execute a chain of tasks. Tasks can be presented in a number of ways. For example, a user can be presented with a menu of tasks on the display device 130, any number of which can be selected by, for example, entering a selected task on the input device 135. In addition, a user can type in the name of a script (e.g., phone.pls in the array cell 244 of Figure 2) and its associated parameters (e.g., "10102504085551212" in the array cell 246 in Figure 2). Next, in the step 610, the Enrollment Program checks whether the selected task has already been associated with a fingerprint image. The Enrollment Program can do this by, for example, parsing the column 204 in Figure 2 to see whether any of the tasks stored there corresponds to the task selected in the step 605. If the selected task corresponds to a fingerprint image in the fingerprint array 200, then the Enrollment Program proceeds to the step 615; otherwise the Enrollment Program proceeds to the step 620. In the step 615, the Enrollment Program prompts the user whether she would like to change the mapping (association) currently stored in the fingerprint array 200.
Preferably, the system of the present invention disallows two fingerprint images from mapping to the same task. It will be appreciated, however, that two fingers can be mapped to the same task. If the user would not like to remap the action already stored in the fingerprint array 200, the Enrollment Program proceeds to the step 605; otherwise, the Enrollment Program proceeds to the step 620. In the step 620, the user is prompted to select a finger that she will later swipe and whose image will be mapped to the selected task. The user can be prompted in a variety of ways. For example, she can be presented with a screen image of fingers (similar to the column 201 in Figure 2) and a means (e.g., a mouse click or keypad entry) for selecting one of the fingers presented. Next, in the step 630, the user swipes (or places) her finger on the fingerprint sensor 115, and in the step 635 the Enrollment Program checks whether the fingerprint image quality is acceptable. It will be appreciated by those skilled in the art that this can be accomplished in many ways, such as by ensuring an adequate number of dark ridges. If the fingerprint image quality is acceptable, the Enrollment Program proceeds to the step 640; otherwise the Enrollment Program proceeds to the step 630. In the step 640, the Enrollment Program stores in the fingerprint array 200 the fingerprint image, its corresponding task, and the task's corresponding parameters. Next, in the step 645, the Enrollment Program ends. Figure 7 is a screen shot 700 of a display showing the relationship between fingers (and thus fingerprint images) and the corresponding tasks. Embodiments of the present invention can display the screen shot 700 to remind a user of the tasks associated with each finger. For example, referring to Figure 1, by pressing a button on the input device 135, the screen shot can be displayed on the display unit 130. The screen shot 700 shows that when the user swipes her right pinkie finger on the fingerprint sensor 115, a Web browser is automatically launched; when she swipes her right ring finger, an e-mail program is automatically launched; when she swipes her right middle finger, a calculator program is automatically launched; when she swipes her right index finger, a word processor is automatically launched; and when she swipes her right thumb, a telephone number is automatically dialed. Figure 8 illustrates the relationship between the elements of a fingerprint control system 800 comprising a fingerprint reader 845, an initialization program 825, a fingerprint array 835, and a control program 840. The initialization program 825 comprises an Enrollment engine 810, a Task Definition engine 820, and a Task Configuration engine 815. The fingerprint array 835 contains a column of 10 fingerprint images labeled A through J (also referred to as row A through row J, respectively). Each fingerprint image has associated with it a task chain. The task chain corresponds to multiple tasks that are executed by the control program 840 when a fingerprint image read by the fingerprint reader 845 matches a fingerprint image stored in the fingerprint array 835. Preferably, read fingerprint data (e.g., minutiae points) are compared to fingerprint data stored in the fingerprint array 835. The row A has associated with it the chain of tasks (e.g., executable files or scripts) A1, A2, and A3. Each task A1, A2, and A3 can have one or more parameters associated with it. The result of each task depends on the values of these parameters.
It will also be appreciated that while Figure 8 shows a linear (one-dimensional) task chain A1, A2, and A3, such that the task A1 is performed, followed by the task A2 and then the task A3, tasks can also be performed in parallel. Thus, for example, when a read fingerprint image matches the fingerprint image A, a first task A1 and a second task A11 (not shown) can be performed in parallel, thus allowing for the performance of tasks in a two-dimensional manner. Thus, in accordance with the present invention, tasks can be performed linearly (e.g., sequentially) and in parallel (e.g., concurrently). It will be appreciated that in accordance with the present invention, tasks can be executed in many dimensions. One such embodiment is illustrated in Figure 10. It will also be appreciated that a task in a task chain can be performed when called (e.g., launched) by a control program, when called by another task (e.g., a program performing a task) in the task chain, or in any other manner. Figure 8 also shows two hands having associated fingers and their corresponding fingerprint images 860A through 860J. The fingerprint image corresponding to the finger 860A is stored in the fingerprint array 835 at the location A; the fingerprint image corresponding to the finger 860B is stored in the fingerprint array 835 at the location B; etc. As shown in Figure 8, when the user chooses to enroll a fingerprint image into the fingerprint control system 800, the Enrollment engine 810 is executed. The Enrollment engine 810 processes a fingerprint image captured by the fingerprint reader 845 and passes the fingerprint image, a pointer to a memory location containing the fingerprint image, or any other fingerprint image identifier to the Task Configuration engine 815. A Task Definition (i.e., the associated tasks or programs) is then supplied to the Task Definition engine 820. In one embodiment, the user types in tasks using an input device such as the input device 135 shown in Figure 1. The Task Configuration engine 815 then stores the fingerprint image identifier (e.g., a fingerprint image, fingerprint minutiae points, or a pointer to fingerprint minutiae points) into the first column of a row in the fingerprint array 835 (e.g., the row A) and then stores the task chain (e.g., action sequence) and its associated parameters in the remaining elements in the row (e.g., A1, A2, and A3). It will be appreciated that a fingerprint image can have any number of tasks in its respective task chain. Thus, for example, the fingerprint image B has four tasks B1-B4 in its task chain; the fingerprint image C has one task C1 in its task chain; etc. Each task in a task chain has any number of optional parameters. Parameters can include those used to perform a task (e.g., execute a program) regardless of the fingerprint data that are matched, those used when particular fingerprint data are read (e.g., user profile data), or any combination of these. In one embodiment, the fingerprint array 835 and the initialization program 825 can both reside in a single memory storage, such as the storage 120 in Figure 1. It will be appreciated, however, that the fingerprint array 835 and the initialization program 825 can reside on different storage devices at different locations. For example, the initialization program 825 can reside on an electronic device (not shown) and the fingerprint array 835 can be coupled to the electronic device over a network. The initialization program 825 is not required after the set-up process.
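For illustration only, the hand-off from the Enrollment engine to the Task Configuration engine might store one row of the fingerprint array as sketched below; the class and function names are assumptions invented for this example, not elements of the specification.

```python
from dataclasses import dataclass, field

@dataclass
class TaskEntry:
    script: str                          # e.g., a script name such as "email.pls"
    params: list = field(default_factory=list)

@dataclass
class FingerprintRow:
    fingerprint_id: str                  # image, minutiae template, or a pointer to one
    task_chain: list                     # ordered list of TaskEntry objects

fingerprint_array = {}                   # rows keyed by fingerprint identifier

def configure_task(fingerprint_id, task_chain):
    """Sketch of the Task Configuration engine 815: store the fingerprint
    identifier and its task chain (with per-task parameters) as one row."""
    fingerprint_array[fingerprint_id] = FingerprintRow(fingerprint_id, task_chain)

# Example: enroll finger "A" with a three-task chain corresponding to A1, A2, A3.
configure_task("A", [TaskEntry("A1.pls", ["param1"]),
                     TaskEntry("A2.pls"),
                     TaskEntry("A3.pls", ["param2", "param3"])])
```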
In one embodiment, fingerprint images are stored on a storage device in encrypted form, thus adding an additional level of security. It will be appreciated that other modifications can be made to this embodiment in accordance with the present invention. For example, the read fingerprint can be compared to decrypted stored fingerprint images, though such a modification will require multiple decryptions and thus may take longer. Embodiments of the present invention can be used to launch any number of tasks. For example, if the electronic device (e.g., platform) is a personal computer, the left index finger can launch a browser and bring up a Web site, while the right middle finger can launch an e-mail program. If the platform is a mobile phone, the middle finger can trigger the phone to speed dial the home phone number of the user. If the platform is a game device, the thumb can launch a particular game to play on the game device. If the platform is a television remote control, the index finger can launch a program to switch to a parent-controlled channel while the last finger will turn off the television. If the platform is an automatic coffee maker, the right thumb can perform the inherent function of turning the automatic coffee maker ON and then launch a program that causes the coffee maker to brew a cup of cappuccino; if the right index finger is swiped, the program can direct the coffee maker to prepare a cup of mocha. If the platform is an MP3 player, swiping the right index finger will direct the MP3 player to play the second song on a disc. If an electronic device is part of a high-security environment, a user will normally swipe his left index finger to pass a security checkpoint. If he is under duress or danger, he will swipe his left middle finger to pass the security checkpoint and also indicate a potential danger. In sum, any number of electronic devices can be controlled with embodiments of the present invention. It will be appreciated that the present invention is independent of device types and operating systems. It can work with all types of fingerprint authentication solutions and all types of finger image sensors (e.g., capacitive, optical, thermal, or pressure), whether swipe or placement. The present invention can also be executed in specialized hardware or firmware instead of in software. Tasks or parts of a task can be implemented in software while other portions are implemented in firmware, hardware, or any combination of these. The control program can launch multiple tasks that run concurrently. Figure 9 illustrates the steps 900 taken by a parent program (e.g., process) in accordance with one embodiment of the present invention. First, in the step 901, the parent program initializes any parameters used to control the multiple processes, such as process identifiers (PIDs). Next, in the step 905, the parent program waits for a finger swipe on a fingerprint sensor. The parent program can, for example, wait for a swipe event, be interrupted as part of an interrupt service routine triggered when a finger is swiped on a fingerprint sensor, or be triggered in other ways. Next, in the step 910, the parent program spawns a child process or thread, passing it the fingerprint image identifier, such as a fingerprint image or the address of a memory location containing the fingerprint image. The child process then performs the tasks associated with the fingerprint image, as illustrated, for example, in the steps 300 of Figure 3. The parent program then loops back to the step 905.
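A minimal sketch of such a parent loop is shown below, using Python threads purely as an assumed stand-in for the processes or threads described above; the event queue and function names are inventions of this example.

```python
import queue
import threading

def perform_task_chain(fingerprint_id):
    """Placeholder for the child's work: look up and perform the task chain
    mapped to fingerprint_id (e.g., the steps 300 of Figure 3)."""
    print(f"performing task chain for {fingerprint_id}")

def parent_loop(swipe_events):
    """Sketch of the Figure 9 flow: wait for a swipe event (step 905), then
    spawn a child (step 910) so several task chains can run concurrently."""
    while True:
        fingerprint_id = swipe_events.get()          # blocks until a swipe arrives
        if fingerprint_id is None:                   # sentinel used here to end the sketch
            break
        child = threading.Thread(target=perform_task_chain,
                                 args=(fingerprint_id,),
                                 daemon=True)
        child.start()                                # parent immediately loops back to waiting

# Example: simulate two swipes followed by a shutdown sentinel.
events = queue.Queue()
for item in ("A", "B", None):
    events.put(item)
parent_loop(events)
```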
Thus, in a windowing environment, multiple windows for multiple tasks can be open at once in accordance with the present invention. It will be appreciated that the parent program can perform other steps in accordance with the present invention. Figure 10 shows a graph 920 illustrating how one task can automatically launch a plurality of tasks, each task being a part of a task chain, in accordance with one embodiment of the present invention. Thus, for example, when read fingerprint data correspond to stored fingerprint data having an associated task 925, the associated task 925 launches the tasks (also referred to as sub-tasks) 930A, 940A, 950A, and 960A. Each task 930A, 940A, 950A, and 960A can have any number of associated parameters (e.g., user profile data) that can be dependent on the stored fingerprint data. Thus, in one embodiment, the task 930A can have a first element that corresponds to a script (e.g., 214 in Figure 2) and a second element that corresponds to the parameters used by the script (e.g., 216 in Figure 2). For the embodiment shown in Figure 10, the tasks 930A, 940A, 950A, and 960A are launched concurrently and can execute in parallel. Still referring to Figure 10, the task 940A in turn calls the task 940B, which in turn calls the task 940C. The tasks 940B and 940C also both have optional parameters. The task 940A can call the task 940B just before the task 940A completes, so that the tasks 940A and 940B are not executing in parallel. Alternatively, the task 940A can call the task 940B at another time, so that the tasks are executing in parallel. The tasks 940A and 940B, as well as the other tasks illustrated in Figure 10, can also be coordinated so that they can share data. For example, the task 940A can read stock data on a Web page and then pass the stock data to the task 940B, which includes the data in an e-mail to a user, thereby notifying the user of changes in stock prices. In a similar manner, the task 950A calls the task 950B. The task 960A calls the task 960B, which calls the task 960C, which calls the task 960D. The tasks 925 and 930A are said to form a first task chain, the tasks 925, 940A, 940B, and 940C are said to form a second task chain, the tasks 925, 950A, and 950B are said to form a third task chain, and the tasks 925, 960A, 960B, 960C, and 960D are said to form a fourth task chain. It will be appreciated that one or more of the tasks (e.g., 930A, 940A-C, 950A-B, and 960A-D) can each be a parent task, launching one or more parallel tasks such as is done by the task 925. In this way, an n-dimensional chain can be formed with concurrently executing tasks. Multiple users may share the electronic device 100 of Figure 1. In such a case, the electronic device 100 can have separate profiles for each user. Each profile will consist of the enrolled fingerprint data (e.g., minutiae points) and the mapping information (e.g., associated tasks and their parameters) of the user. Different users enroll their fingers and associated tasks and parameters. Since fingerprint authentication is technically reliable, a fingerprint image from a user will not trigger the performance of a task associated with the fingerprint image of another user. It will be appreciated that the fingerprint array 200 can hold fingerprint images from one person or more than one person. Thus, multiple persons can use a single fingerprint control system in accordance with the present invention, with each person's fingerprints mapping to particular tasks.
Thus, one person's right index fingerprint image can map to the task of opening a Web browser and a second person's right index fingerprint image can map to the task of automatically logging in to the second person's e-mail account. It will be appreciated that fingerprint data do not have to be stored in user profiles. In an alternative embodiment, fingerprint images of all users are stored together in a single file. When more than one user has fingerprint images enrolled in a fingerprint control system, one or more fingerprint arrays can be used to store the fingerprint images and their mapping information, the associated tasks and parameters. In one embodiment, each user has a corresponding fingerprint array. When a user first swipes his finger on the fingerprint reader, all subsequent fingerprint images (until he logs out) are compared to fingerprint images stored in that user's corresponding fingerprint array. When a second user logs in to the fingerprint control system, all subsequent fingerprint images (until she logs out) are compared to fingerprint images stored in her corresponding fingerprint array. In this embodiment, fewer fingerprint images (e.g., entries in the fingerprint array) must be compared before finding the best match. In a second embodiment, all of the fingerprint images are stored in a single fingerprint array. In this embodiment, users do not have to log in and out. Instead, when a finger is swiped, the captured fingerprint image is compared against all of the fingerprint images in the single fingerprint array to determine the best match. It will be appreciated that other methods and structures for storing and comparing fingerprint images can be used in accordance with the present invention. In one embodiment of the present invention, ten tasks can be performed by a single user, one for each finger. Another embodiment expands the number of action sequences by mapping permutations of fingerprint images. If two fingerprint images are required to start an action sequence, then the maximum number of tasks that can be performed is 10², or 100. The two images may come from the same finger or different fingers of the same person. For example, a user may define the sequence of fingerprint images for her left small finger followed by her right small finger to map to the action sequence that turns OFF the electronic device. The associated tasks will be appropriately mapped to allow users to link and map multiple finger images to an action sequence. In general, the number of action sequences equals 10ⁿ, where n is the required number of finger images to start a task. It will be appreciated that in accordance with the present invention, tasks can be nested using combinations or permutations of fingerprint images to perform a particular task. Thus, for example, a fingerprint control system in accordance with the present invention can associate the combination of swiping the user's ring finger followed by the swiping of her thumb with the performance of a certain task. The combination of the ring finger and the thumb, in any order, can thus be used to perform the task. Alternatively, the fingerprint control system can associate permutations of finger swipes with unique tasks. Thus, the swiping of a ring finger followed by a thumb can associate with (e.g., trigger) one task, but the swiping of a thumb followed by a ring finger can associate with a second task, different from the first. It will be appreciated that any level of nesting can be used.
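As an illustration only, mapping ordered finger sequences (permutations) and unordered finger sets (combinations) to tasks might be sketched as follows; the finger names and task strings are placeholders invented for this example.

```python
# Permutations: the order of the swipes matters, so (ring, thumb) and
# (thumb, ring) are distinct keys and can trigger different tasks.
permutation_map = {
    ("right_ring", "right_thumb"): "task_one",
    ("right_thumb", "right_ring"): "task_two",
    ("left_small", "right_small"): "power_off",   # e.g., turn the device OFF
}

# Combinations: the order does not matter, so the key is a frozenset.
combination_map = {
    frozenset({"left_index", "left_middle"}): "shared_task",
}

def resolve(sequence):
    """Return the task mapped to an ordered swipe sequence, falling back to the
    order-independent (combination) mapping when no permutation matches."""
    task = permutation_map.get(tuple(sequence))
    if task is None:
        task = combination_map.get(frozenset(sequence))
    return task

# With n required finger images and ten fingers, up to 10**n ordered sequences exist.
print(resolve(["right_thumb", "right_ring"]))    # -> task_two
print(resolve(["left_middle", "left_index"]))    # -> shared_task (order ignored)
```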
In another embodiment, in which multiple people must authorize a particular task, at least one fingerprint image from each of two or more people can be required. Thus, for example, a task will be performed only if a first person swipes her thumb on a fingerprint sensor and then a second person swipes his index finger on the fingerprint sensor. In this way, an additional level of security is provided. In accordance with another embodiment of the present invention, a fingerprint sensor is configured to detect the direction of a fingerprint swipe. Most swipe sensors require that a finger be swiped in a direction perpendicular to the length of the swipe sensor surface. The swipe sensor can capture fingerprint images in either swipe direction and can tell in which direction a finger has been swiped. Using a direction-sensitive swipe sensor, a finger can map to two different tasks or action sequences, depending on the swipe direction, thus increasing the number of tasks that can be mapped to a user's set of fingerprint images. Thus, since a single fingerprint image can map to two tasks depending on the swipe direction, two fingerprint images can map to four tasks. Ten fingerprint images can map to twenty tasks. In accordance with another embodiment of the present invention, the same fingerprint image is mapped to different tasks based upon the context. For example, on a personal computer, the context can be defined as the application that is currently active and occupying the Desktop. Thus, a fingerprint image read when a text editor is active may launch the task of adding a signature at the cursor, but the same fingerprint image read while a financial software application is running may launch the task of authorizing a transaction. As a second example, when the left index finger is detected, a mobile phone program can be launched to automatically dial the corresponding phone number. However, after the call is connected, detecting the same fingerprint image will trigger the emission of a series of DTMF tones that represent the calling card number. It will be appreciated that in this embodiment, the electronic device must support at least two contexts. In accordance with another embodiment of the present invention, swiping a finger launches the combined tasks of granting a user access to the electronic device and launching another application program. Thus, for example, a user who is computer illiterate or unable to type can start a computer and launch different applications depending on which finger is used for authentication. The user may use her left index finger to start a computer with a browser pointing to the news, or she may use her right thumb to open the same computer and start with online radio. Thus, each fingerprint image corresponds to a separate start-up profile of the user. It will be readily apparent to one skilled in the art that various modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.

Claims

We claim:
1. A method of performing a task on an electronic device, the method comprising: a. matching read biometric data to stored biometric data, the stored biometric data having a corresponding task comprising a first chain of tasks; and b. automatically performing the corresponding task on the electronic device.
2. The method of claim 1, wherein the biometric data comprises fingerprint data.
3. The method of claim 1, wherein the corresponding task is selected from a plurality of tasks that the electronic device is configured to perform.
4. The method of claim 1, wherein the corresponding task has optional parameters.
5. The method of claim 1, wherein automatically performing a task comprises retrieving user profile data corresponding to a user and the task, and using the user profile data to perform the task.
6. The method of claim 5, wherein the user profile data comprises a telephone number and the corresponding task comprises dialing the telephone number.
7. The method of claim 5, wherein the user profile data comprises login information for accessing a host system and the corresponding task comprises transmitting the login information to the host system.
8. The method of claim 1, wherein the corresponding task comprises accessing a resource over a network.
9. The method of claim 8, wherein the network comprises the Internet.
10. The method of claim 7, wherein a task in the first chain of tasks comprises encrypting the login information before transmitting the login information to the host system.
11. The method of claim 1, wherein the corresponding task comprises an interactive task.
12. The method of claim 1, wherein performing the corresponding task comprises executing a computer game.
13. The method of claim 1, wherein performing the corresponding task comprises remotely controlling a remote-controlled system.
14. The method of claim 1, wherein the corresponding task comprises a non-inherent function of the electronic device.
15. The method of claim 1, wherein the corresponding task further comprises a second chain of tasks.
16. The method of claim 15, wherein the first chain of tasks and the second chain of tasks are performed in parallel.
17. The method of claim 1, further comprising reading biometric data.
18. The method of claim 17, wherein reading biometric data comprises reading data captured during a finger placement on a fingerprint image sensor.
19. The method of claim 17, wherein reading biometric data comprises reading data captured during a finger swipe over a fingerprint image sensor.
20. The method of claim 19, wherein matching biometric data to stored biometric data comprises identifying a direction of the finger swipe, a first direction having the corresponding task and a second direction having a different corresponding task.
21. The method of claim 1, wherein matching biometric data to stored biometric data comprises determining whether a threshold number of points of read fingerprint data coincide with a number of points of stored fingerprint data.
22. The method of claim 1, wherein the electronic device is portable.
23. The method of claim 22, wherein the electronic device is hand held.
24. The method of claim 1, wherein the electronic device comprises a personal computer.
25. The method of claim 1, wherein the electronic device comprises a personal digital assistant.
26. The method of claim 1, wherein the electronic device comprises a telephone.
27. The method of claim 1, wherein the electronic device comprises a device selected from the group consisting of a remote controller, a security system, a television set, an audio player, and a game device.
28. The method of claim 1, wherein the stored biometric data is an ordered set of biometric data.
29. The method of claim 28, wherein the ordered set of biometric data is a permutation of biometric data.
30. The method of claim 28, wherein the ordered set of biometric data is a combination of biometric data.
31. The method of claim 1 , wherein the corresponding task relates to a context of an application executing on the electronic device.
32. The method of claim 1, wherein the stored biometric data corresponds to one set of stored biometric data from a plurality of sets of stored biometric data from a plurality of users.
33. An electronic device comprising: a. a biometric sensor for reading biometric data; b. a memory storing a plurality of stored biometric data each having a corresponding task identifier used to perform a corresponding task, at least one task comprising a chain of tasks; and c. a processor coupled to both the biometric sensor and the memory, the processor configured to match read biometric data with stored biometric data and to automatically perform a corresponding task on the electronic device.
34. The electronic device of claim 33, wherein the biometric sensor comprises a fingerprint image sensor.
35. The electronic device of claim 34, wherein the fingerprint image sensor is a placement sensor.
36. The electronic device of claim 34, wherein the fingerprint image sensor is a swipe sensor.
37. The electronic device of claim 36, wherein the swipe sensor is configured to detect a direction of a swipe, a first direction having a first task identifier and a second direction having a second task identifier.
38. The electronic device of claim 33, wherein the memory is further configured so that a task identifier also has user profile data corresponding to a user and the task.
39. The electronic device of claim 33, wherein the task identifier has optional parameters.
40. The electronic device of claim 33, wherein the corresponding task relates to a context of an application executing on the electronic device.
41. The electronic device of claim 38, wherein the electronic device further comprises a telephone operatively coupled to the processor, the user profile data comprises a telephone number, and the corresponding task comprises dialing the telephone number on the telephone.
42. The electronic device of claim 38, wherein the electronic device further comprises a link to a network, the user profile data comprises a resource locator, and the corresponding task comprises connecting the electronic device to a host identified by the resource locator and accessible over the network.
43. The electronic device of claim 42, wherein the link comprises a wireless transmitter.
44. The electronic device of claim 43, wherein the network comprises the Internet.
45. The electronic device of claim 36, wherein the fingerprint image sensor comprises a thermal sensor.
46. The electronic device of claim 36, wherein the fingerprint image sensor comprises an optical sensor.
47. The electronic device of claim 36, wherein the fingerprint image sensor comprises a pressure sensor.
48. The electronic device of claim 36, wherein the fingerprint image sensor comprises a capacitive sensor.
49. The electronic device of claim 33, wherein the processor is configured to execute two tasks in parallel.
50. The electronic device of claim 33, further comprising a personal computer.
51. The electronic device of claim 33, wherein the electronic device is portable.
52. The electronic device of claim 51, wherein the electronic device is hand held.
53. The electronic device of claim 52, wherein the electronic device is a personal digital assistant.
54. The electronic device of claim 33, wherein the electronic device is a remote controller.
55. The electronic device of claim 33, wherein stored biometric data comprises an ordered set of biometric data.
56. The electronic device of claim 55, wherein the ordered set of biometric data comprises a permutation of biometric data.
57. The electronic device of claim 55, wherein the ordered set of biometric data comprises a combination of biometric data.
58. A method of initializing an electronic device, the method comprising: a. reading biometric data; b. storing the biometric data; and c. mapping the stored biometric data to a chain of tasks that are automatically performed on an electronic device.
59. The method of claim 58, further comprising storing user profile data for a task.
60. The method of claim 58, wherein the stored biometric data comprises an ordered set of biometric data.
61. The method of claim 60, wherein the ordered set of biometric data is a permutation of biometric data.
62. The method of claim 60, wherein the ordered set of biometric data is a combination of biometric data.
63. The method of claim 58, wherein reading biometric data comprises reading fingerprint images corresponding to a plurality of users.
64. A method of performing a task on an electronic device, the method comprising: a. reading biometric data; b. matching the read biometric data to one of a plurality of stored biometric data, the one of the plurality of stored biometric data having a corresponding task selected from a pool of tasks, the pool of tasks comprising a first task performed by launching a computer program; and c. automatically performing the corresponding task on the electronic device.
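Claims 1, 21, 33 and 64 recite stored biometric data mapped to a corresponding task or chain of tasks, a match decided by whether a threshold number of read fingerprint points coincide with stored points, and automatic performance of the matched chain. The following is a minimal sketch of that flow, not the specification's implementation; the names (Template, MINUTIAE_THRESHOLD, dispatch) and the 12-point threshold are illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Callable, List, Set, Tuple

Minutia = Tuple[int, int]            # (x, y) of a fingerprint feature point

@dataclass
class Template:
    minutiae: Set[Minutia]           # stored biometric data
    tasks: List[Callable[[], None]]  # corresponding chain of tasks

MINUTIAE_THRESHOLD = 12              # assumed number of coinciding points for a match

def matches(read: Set[Minutia], stored: Set[Minutia]) -> bool:
    """Claim 21: a threshold number of read points coincide with stored points."""
    return len(read & stored) >= MINUTIAE_THRESHOLD

def dispatch(read: Set[Minutia], enrolled: List[Template]) -> bool:
    """Claims 1/33/64: match read data to stored data and automatically
    perform the corresponding chain of tasks in order."""
    for template in enrolled:
        if matches(read, template.minutiae):
            for task in template.tasks:
                task()               # e.g. launch a program, dial a number
            return True
    return False

if __name__ == "__main__":
    stored = {(i, i) for i in range(20)}
    enrolled = [Template(minutiae=stored,
                         tasks=[lambda: print("unlock device"),
                                lambda: print("launch e-mail client")])]
    read = {(i, i) for i in range(15)}   # shares 15 points with the stored template
    dispatch(read, enrolled)
```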
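Claims 20 and 37 recite that a swipe in a first direction selects one task identifier and a swipe in a second direction selects a different one. Below is a hedged sketch of one way such a selection could work; the frame representation and the centroid heuristic are assumptions, not the claimed algorithm.

```python
from statistics import mean
from typing import Dict, List, Tuple

Frame = List[Tuple[int, int]]      # feature points seen in one sensor frame

def swipe_direction(frames: List[Frame]) -> str:
    """Infer 'up' or 'down' from the first and last frame centroids."""
    first_y = mean(y for _, y in frames[0])
    last_y = mean(y for _, y in frames[-1])
    return "up" if last_y < first_y else "down"

def select_task(frames: List[Frame], direction_tasks: Dict[str, str]) -> str:
    """Claims 20/37: a first direction has one task identifier, a second another."""
    return direction_tasks[swipe_direction(frames)]

if __name__ == "__main__":
    tasks_by_direction = {"up": "launch_browser", "down": "lock_device"}
    downward = [[(5, 2), (7, 3)], [(5, 6), (7, 7)], [(5, 10), (7, 11)]]
    print(select_task(downward, tasks_by_direction))   # -> lock_device
```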
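Claim 58 recites initialization of the device: reading biometric data, storing it, and mapping it to a chain of tasks, optionally together with user profile data (claim 59). A minimal enrollment sketch under those assumptions follows; the sensor read is stubbed and all names are illustrative.

```python
from dataclasses import dataclass
from typing import Dict, List, Optional, Set, Tuple

Minutia = Tuple[int, int]

@dataclass
class Enrollment:
    minutiae: Set[Minutia]                    # b. stored biometric data
    task_chain: List[str]                     # c. mapped chain of task identifiers
    profile: Optional[Dict[str, str]] = None  # claim 59: optional user profile data

def read_sensor() -> Set[Minutia]:
    """Stub standing in for a placement or swipe read of a fingerprint sensor."""
    return {(i, 2 * i) for i in range(20)}

def enroll(task_chain: List[str],
           profile: Optional[Dict[str, str]] = None) -> Enrollment:
    data = read_sensor()                      # a. read biometric data
    return Enrollment(data, task_chain, profile)

if __name__ == "__main__":
    database: List[Enrollment] = []
    database.append(enroll(["unlock", "open_browser", "login"],
                           profile={"url": "https://example.com"}))
    database.append(enroll(["dial"], profile={"phone": "555-0100"}))
    print(len(database), "fingers enrolled")
```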
PCT/US2005/002547 2004-01-29 2005-01-24 System for and method of finger initiated actions WO2005072372A2 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US54095004P 2004-01-29 2004-01-29
US60/540,950 2004-01-29
US10/882,787 2004-06-30
US10/882,787 US7697729B2 (en) 2004-01-29 2004-06-30 System for and method of finger initiated actions

Publications (2)

Publication Number Publication Date
WO2005072372A2 true WO2005072372A2 (en) 2005-08-11
WO2005072372A3 WO2005072372A3 (en) 2007-09-27

Family

ID=34811423

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2005/002547 WO2005072372A2 (en) 2004-01-29 2005-01-24 System for and method of finger initiated actions

Country Status (2)

Country Link
US (1) US7697729B2 (en)
WO (1) WO2005072372A2 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577739A (en) * 2013-11-15 2014-02-12 青岛尚慧信息技术有限公司 Intelligent mobile terminal and setting and accessing control method thereof
WO2018023579A1 (en) * 2016-08-04 2018-02-08 薄冰 Method for stopping using fingerprint-enabled software according to user feedback, and mobile phone system

Families Citing this family (199)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9292111B2 (en) 1998-01-26 2016-03-22 Apple Inc. Gesturing with a multipoint sensing device
US8479122B2 (en) 2004-07-30 2013-07-02 Apple Inc. Gestures for touch sensitive input devices
WO2005001611A2 (en) 2003-05-30 2005-01-06 Privaris, Inc. A system and methods for assignation and use of media content subscription service privileges
US7474772B2 (en) * 2003-06-25 2009-01-06 Atrua Technologies, Inc. System and method for a miniature user input device
US7587072B2 (en) * 2003-08-22 2009-09-08 Authentec, Inc. System for and method of generating rotational inputs
US20050085217A1 (en) * 2003-10-21 2005-04-21 Chae-Yi Lim Method for setting shortcut key and performing function based on fingerprint recognition and wireless communication terminal using thereof
WO2005079413A2 (en) * 2004-02-12 2005-09-01 Atrua Technologies, Inc. System and method of emulating mouse operations using finger image sensors
US8447077B2 (en) 2006-09-11 2013-05-21 Validity Sensors, Inc. Method and apparatus for fingerprint motion tracking using an in-line array
US8229184B2 (en) 2004-04-16 2012-07-24 Validity Sensors, Inc. Method and algorithm for accurate finger motion tracking
US8358815B2 (en) 2004-04-16 2013-01-22 Validity Sensors, Inc. Method and apparatus for two-dimensional finger motion tracking and control
US8131026B2 (en) 2004-04-16 2012-03-06 Validity Sensors, Inc. Method and apparatus for fingerprint image reconstruction
US7751601B2 (en) 2004-10-04 2010-07-06 Validity Sensors, Inc. Fingerprint sensing assemblies and methods of making
US8165355B2 (en) 2006-09-11 2012-04-24 Validity Sensors, Inc. Method and apparatus for fingerprint motion tracking using an in-line array for use in navigation applications
US8175345B2 (en) 2004-04-16 2012-05-08 Validity Sensors, Inc. Unitized ergonomic two-dimensional fingerprint motion tracking device and method
WO2005106774A2 (en) * 2004-04-23 2005-11-10 Validity Sensors, Inc. Methods and apparatus for acquiring a swiped fingerprint image
US7412700B2 (en) * 2004-05-18 2008-08-12 Oracle International Corporation Product packaging and installation mechanism
US8522205B2 (en) * 2004-05-18 2013-08-27 Oracle International Corporation Packaging multiple groups of read-only files of an application's components into multiple shared libraries
US8381135B2 (en) 2004-07-30 2013-02-19 Apple Inc. Proximity detector in handheld device
JP2006172439A (en) * 2004-11-26 2006-06-29 Oce Technologies Bv Desktop scanning using manual operation
CN101116114A (en) * 2005-02-10 2008-01-30 皇家飞利浦电子股份有限公司 Improved security device
US7831070B1 (en) 2005-02-18 2010-11-09 Authentec, Inc. Dynamic finger detection mechanism for a fingerprint sensor
US8231056B2 (en) * 2005-04-08 2012-07-31 Authentec, Inc. System for and method of protecting an integrated circuit from over currents
US20060282679A1 (en) * 2005-06-10 2006-12-14 Microsoft Corporation Secure rapid navigation and power control for a computer
US7505613B2 (en) * 2005-07-12 2009-03-17 Atrua Technologies, Inc. System for and method of securing fingerprint biometric systems against fake-finger spoofing
JP2007065858A (en) * 2005-08-30 2007-03-15 Fujitsu Ltd Authentication method, authentication device and program
US20070061126A1 (en) * 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US7940249B2 (en) * 2005-11-01 2011-05-10 Authentec, Inc. Devices using a metal layer with an array of vias to reduce degradation
KR100663437B1 (en) * 2005-11-01 2007-01-02 삼성전자주식회사 Remote inputting method using finger print sensor
KR100856203B1 (en) * 2006-06-27 2008-09-03 삼성전자주식회사 User inputting apparatus and method using finger mark recognition sensor
US7885436B2 (en) * 2006-07-13 2011-02-08 Authentec, Inc. System for and method of assigning confidence values to fingerprint minutiae points
US20080049980A1 (en) * 2006-08-28 2008-02-28 Motorola, Inc. Button with integrated biometric sensor
US20080144144A1 (en) * 2006-10-31 2008-06-19 Ricoh Corporation Ltd. Confirming a state of a device
US8107212B2 (en) 2007-04-30 2012-01-31 Validity Sensors, Inc. Apparatus and method for protecting fingerprint sensing circuitry from electrostatic discharge
US8290150B2 (en) 2007-05-11 2012-10-16 Validity Sensors, Inc. Method and system for electronically securing an electronic device using physically unclonable functions
KR20110114732A (en) 2007-09-24 2011-10-19 애플 인크. Embedded authentication systems in an electronic device
US8204281B2 (en) 2007-12-14 2012-06-19 Validity Sensors, Inc. System and method to remove artifacts from fingerprint sensor scans
US8276816B2 (en) 2007-12-14 2012-10-02 Validity Sensors, Inc. Smart card system with ergonomic fingerprint sensor and method of using
US8600120B2 (en) 2008-01-03 2013-12-03 Apple Inc. Personal computing device control using face detection and recognition
WO2009100230A1 (en) * 2008-02-07 2009-08-13 Inflexis Corporation Mobile electronic security apparatus and method
US9785330B1 (en) 2008-02-13 2017-10-10 Apple Inc. Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length
US8005276B2 (en) 2008-04-04 2011-08-23 Validity Sensors, Inc. Apparatus and method for reducing parasitic capacitive coupling and noise in fingerprint sensing circuits
US8116540B2 (en) 2008-04-04 2012-02-14 Validity Sensors, Inc. Apparatus and method for reducing noise in fingerprint sensing circuits
GB2474999B (en) 2008-07-22 2013-02-20 Validity Sensors Inc System and method for securing a device component
US8391568B2 (en) 2008-11-10 2013-03-05 Validity Sensors, Inc. System and method for improved scanning of fingerprint edges
EP2192519B1 (en) * 2008-12-01 2015-02-25 BlackBerry Limited System and method of providing biometric quick launch
US20100138914A1 (en) * 2008-12-01 2010-06-03 Research In Motion Limited System and method of providing biometric quick launch
US8074880B2 (en) 2008-12-01 2011-12-13 Research In Motion Limited Method, system and mobile device employing enhanced fingerprint authentication
US20100182126A1 (en) * 2008-12-18 2010-07-22 Martis Dinesh J Biometric sensing apparatus and methods incorporating the same
US8278946B2 (en) 2009-01-15 2012-10-02 Validity Sensors, Inc. Apparatus and method for detecting finger activity on a fingerprint sensor
US8600122B2 (en) 2009-01-15 2013-12-03 Validity Sensors, Inc. Apparatus and method for culling substantially redundant data in fingerprint sensing circuits
US20100192096A1 (en) * 2009-01-27 2010-07-29 Sony Corporation Biometrics based menu privileges
US8374407B2 (en) 2009-01-28 2013-02-12 Validity Sensors, Inc. Live finger detection
KR101549558B1 (en) * 2009-03-18 2015-09-03 엘지전자 주식회사 Mobile terminal and control method thereof
US20100245288A1 (en) * 2009-03-29 2010-09-30 Harris Technology, Llc Touch Tunnels
US9485339B2 (en) 2009-05-19 2016-11-01 At&T Mobility Ii Llc Systems, methods, and mobile devices for providing a user interface to facilitate access to prepaid wireless account information
US20100302212A1 (en) * 2009-06-02 2010-12-02 Microsoft Corporation Touch personalization for a display device
US8455961B2 (en) * 2009-06-19 2013-06-04 Authentec, Inc. Illuminated finger sensor assembly for providing visual light indications including IC finger sensor grid array package
US8432252B2 (en) * 2009-06-19 2013-04-30 Authentec, Inc. Finger sensor having remote web based notifications
KR101032863B1 (en) * 2009-07-01 2011-05-06 주식회사 슈프리마 Fingerprint authentification apparatus and method using a plurality of sensor
US20110006880A1 (en) * 2009-07-09 2011-01-13 Medtronic Minimed, Inc. Fingerprint-linked control of a portable medical device
US8984596B2 (en) * 2009-09-30 2015-03-17 Authentec, Inc. Electronic device for displaying a plurality of web links based upon finger authentication and associated methods
WO2011044775A1 (en) * 2009-10-16 2011-04-21 华为终端有限公司 Data card, method and system for identifying fingerprint by data card
JP2011087785A (en) * 2009-10-23 2011-05-06 Hitachi Ltd Operation processor, operation processing method and operation processing program
US9336428B2 (en) 2009-10-30 2016-05-10 Synaptics Incorporated Integrated fingerprint sensor and display
US9274553B2 (en) 2009-10-30 2016-03-01 Synaptics Incorporated Fingerprint sensor and integratable electronic display
US9400911B2 (en) 2009-10-30 2016-07-26 Synaptics Incorporated Fingerprint sensor and integratable electronic display
JP4719296B1 (en) * 2009-12-25 2011-07-06 株式会社東芝 Information processing apparatus and information processing method
US8866347B2 (en) 2010-01-15 2014-10-21 Idex Asa Biometric image sensing
US8421890B2 (en) 2010-01-15 2013-04-16 Picofield Technologies, Inc. Electronic imager using an impedance sensor grid array and method of making
US8791792B2 (en) 2010-01-15 2014-07-29 Idex Asa Electronic imager using an impedance sensor grid array mounted on or about a switch and method of making
US8878791B2 (en) 2010-01-19 2014-11-04 Avaya Inc. Event generation based on print portion identification
US9666635B2 (en) 2010-02-19 2017-05-30 Synaptics Incorporated Fingerprint sensing circuit
US8716613B2 (en) 2010-03-02 2014-05-06 Synaptics Incoporated Apparatus and method for electrostatic discharge protection
US8532343B1 (en) * 2010-04-16 2013-09-10 Steven Jay Freedman System for non-repudiable registration of an online identity
US9282921B2 (en) * 2010-05-03 2016-03-15 Roche Diabetes Care, Inc. Measurement system for an analyte determination and a method
KR101678812B1 (en) * 2010-05-06 2016-11-23 엘지전자 주식회사 Mobile terminal and operation control method thereof
JPWO2011148719A1 (en) * 2010-05-28 2013-07-25 日本電気株式会社 Information processing apparatus, GUI operation support method, and GUI operation support program
JPWO2011152224A1 (en) * 2010-06-01 2013-07-25 日本電気株式会社 Terminal, process selection method, control program, and recording medium
US9001040B2 (en) 2010-06-02 2015-04-07 Synaptics Incorporated Integrated fingerprint sensor and navigation device
US8331096B2 (en) 2010-08-20 2012-12-11 Validity Sensors, Inc. Fingerprint acquisition expansion card apparatus
US20120090757A1 (en) 2010-10-18 2012-04-19 Qualcomm Mems Technologies, Inc. Fabrication of touch, handwriting and fingerprint sensor
US8604906B1 (en) 2010-11-18 2013-12-10 Sprint Spectrum L.P. Method and system for secret fingerprint scanning and reporting
US9201539B2 (en) 2010-12-17 2015-12-01 Microsoft Technology Licensing, Llc Supplementing a touch input mechanism with fingerprint detection
US8594393B2 (en) 2011-01-26 2013-11-26 Validity Sensors System for and method of image reconstruction with dual line scanner using line counts
US8538097B2 (en) 2011-01-26 2013-09-17 Validity Sensors, Inc. User input utilizing dual line scanner apparatus and method
JP5651494B2 (en) 2011-02-09 2015-01-14 日立マクセル株式会社 Information processing device
GB2489100A (en) 2011-03-16 2012-09-19 Validity Sensors Inc Wafer-level packaging for a fingerprint sensor
US8553001B2 (en) * 2011-03-22 2013-10-08 Adobe Systems Incorporated Methods and apparatus for determining local coordinate frames for a human hand
US8593421B2 (en) 2011-03-22 2013-11-26 Adobe Systems Incorporated Local coordinate frame user interface for multitouch-enabled devices
US8938101B2 (en) * 2011-04-26 2015-01-20 Sony Computer Entertainment America Llc Apparatus, system, and method for real-time identification of finger impressions for multiple users
GB201107273D0 (en) * 2011-04-28 2011-06-15 Inq Entpr Ltd Application control in electronic devices
USD667444S1 (en) * 2011-09-12 2012-09-18 Microsoft Corporation Display screen with icon
US8953889B1 (en) * 2011-09-14 2015-02-10 Rawles Llc Object datastore in an augmented reality environment
US8810367B2 (en) * 2011-09-22 2014-08-19 Apple Inc. Electronic device with multimode fingerprint reader
US8769624B2 (en) 2011-09-29 2014-07-01 Apple Inc. Access control utilizing indirect authentication
US9002322B2 (en) 2011-09-29 2015-04-07 Apple Inc. Authentication with secondary approver
KR101160681B1 (en) 2011-10-19 2012-06-28 배경덕 Method, mobile communication terminal and computer-readable recording medium for operating specific function when activaing of mobile communication terminal
US10043052B2 (en) 2011-10-27 2018-08-07 Synaptics Incorporated Electronic device packages and methods
US9223948B2 (en) 2011-11-01 2015-12-29 Blackberry Limited Combined passcode and activity launch modifier
US10082950B2 (en) * 2011-11-09 2018-09-25 Joseph T. LAPP Finger-mapped character entry systems
US11475105B2 (en) 2011-12-09 2022-10-18 Rightquestion, Llc Authentication translation
US9294452B1 (en) 2011-12-09 2016-03-22 Rightquestion, Llc Authentication translation
US9195877B2 (en) 2011-12-23 2015-11-24 Synaptics Incorporated Methods and devices for capacitive image sensing
US9785299B2 (en) 2012-01-03 2017-10-10 Synaptics Incorporated Structures and manufacturing methods for glass covered electronic devices
TWI533231B (en) 2012-01-17 2016-05-11 蘋果公司 Finger sensor having pixel sensing circuitry for coupling electrodes and pixel sensing traces and related methods
TW201335833A (en) * 2012-02-29 2013-09-01 Hon Hai Prec Ind Co Ltd Method and system for changing edit tools of electronic device
KR20150093254A (en) 2012-03-01 2015-08-17 시스-테크 솔루션스 인코포레이티드 Unique identification information from marked features
US20150379321A1 (en) 2012-03-01 2015-12-31 Sys-Tech Solutions, Inc. Methods and a system for verifying the authenticity of a mark
US10839365B2 (en) * 2012-03-01 2020-11-17 Paypal, Inc. Finger print funding source selection
US20150169928A1 (en) 2012-03-01 2015-06-18 Sys-Tech Solutions, Inc. Methods and a system for verifying the identity of a printed item
US9384518B2 (en) * 2012-03-26 2016-07-05 Amerasia International Technology, Inc. Biometric registration and verification system and method
US9137438B2 (en) 2012-03-27 2015-09-15 Synaptics Incorporated Biometric object sensor and method
US9268991B2 (en) 2012-03-27 2016-02-23 Synaptics Incorporated Method of and system for enrolling and matching biometric data
US9251329B2 (en) 2012-03-27 2016-02-02 Synaptics Incorporated Button depress wakeup and wakeup strategy
US9600709B2 (en) 2012-03-28 2017-03-21 Synaptics Incorporated Methods and systems for enrolling biometric data
US9152838B2 (en) 2012-03-29 2015-10-06 Synaptics Incorporated Fingerprint sensor packagings and methods
WO2013155224A1 (en) 2012-04-10 2013-10-17 Picofield Technologies Inc. Biometric sensing
US20130279768A1 (en) * 2012-04-19 2013-10-24 Authentec, Inc. Electronic device including finger-operated input device based biometric enrollment and related methods
US9348987B2 (en) 2012-04-19 2016-05-24 Apple Inc. Electronic device including finger-operated input device based biometric matching and related methods
US9024910B2 (en) 2012-04-23 2015-05-05 Qualcomm Mems Technologies, Inc. Touchscreen with bridged force-sensitive resistors
US20130298224A1 (en) 2012-05-03 2013-11-07 Authentec, Inc. Electronic device including a finger sensor having a valid authentication threshold time period and related methods
US8903141B2 (en) 2012-05-03 2014-12-02 Authentec, Inc. Electronic device including finger sensor having orientation based authentication and related methods
US9390307B2 (en) 2012-05-04 2016-07-12 Apple Inc. Finger biometric sensing device including error compensation circuitry and related methods
US9581628B2 (en) 2012-05-04 2017-02-28 Apple Inc. Electronic device including device ground coupled finger coupling electrode and array shielding electrode and related methods
US9322794B2 (en) 2012-12-18 2016-04-26 Apple Inc. Biometric finger sensor including array shielding electrode and related methods
CN106133748B (en) * 2012-05-18 2020-01-31 苹果公司 Device, method and graphical user interface for manipulating a user interface based on fingerprint sensor input
US8616451B1 (en) 2012-06-21 2013-12-31 Authentec, Inc. Finger sensing device including finger sensing integrated circuit die within a recess in a mounting substrate and related methods
US9965607B2 (en) 2012-06-29 2018-05-08 Apple Inc. Expedited biometric validation
US9710092B2 (en) 2012-06-29 2017-07-18 Apple Inc. Biometric initiated communication
US9471764B2 (en) 2012-07-19 2016-10-18 Apple Inc. Electronic device switchable to a user-interface unlocked mode based upon spoof detection and related methods
CN102902353B (en) * 2012-08-17 2016-06-08 广东欧珀移动通信有限公司 A kind of changing method of the camera installation of intelligent terminal
US9436864B2 (en) 2012-08-23 2016-09-06 Apple Inc. Electronic device performing finger biometric pre-matching and related methods
US8907914B2 (en) * 2012-08-31 2014-12-09 General Electric Company Methods and apparatus for documenting a procedure
US9665762B2 (en) * 2013-01-11 2017-05-30 Synaptics Incorporated Tiered wakeup strategy
US9104901B2 (en) 2013-03-15 2015-08-11 Apple Inc. Electronic device including interleaved biometric spoof detection data acquisition and related methods
KR102090750B1 (en) * 2013-08-23 2020-03-18 삼성전자주식회사 Electronic device and method for recognizing fingerprint
IN2013CH03958A (en) * 2013-09-04 2015-08-07 Samsung India Software Operations Pvt Ltd
US9898642B2 (en) 2013-09-09 2018-02-20 Apple Inc. Device, method, and graphical user interface for manipulating user interfaces based on fingerprint sensor inputs
US20150071508A1 (en) * 2013-09-09 2015-03-12 Apple Inc. Background Enrollment and Authentication of a User
US9928355B2 (en) 2013-09-09 2018-03-27 Apple Inc. Background enrollment and authentication of a user
USD741889S1 (en) * 2013-09-10 2015-10-27 Apple Inc. Display screen or portion thereof with animated graphical user interface
KR102126568B1 (en) * 2013-10-31 2020-06-24 삼성전자주식회사 Method for processing data and an electronic device thereof
CN103593214A (en) * 2013-11-07 2014-02-19 健雄职业技术学院 Method for starting and logging onto software through touch display screen and touch display screen
EP2884470A1 (en) * 2013-12-11 2015-06-17 Panasonic Intellectual Property Management Co., Ltd. Mobile payment terminal device
US9705676B2 (en) * 2013-12-12 2017-07-11 International Business Machines Corporation Continuous monitoring of fingerprint signature on a mobile touchscreen for identity management
KR102206394B1 (en) * 2013-12-17 2021-01-22 삼성전자 주식회사 Electronic Device And Method for Setting Task Of The Same
US10713466B2 (en) 2014-03-07 2020-07-14 Egis Technology Inc. Fingerprint recognition method and electronic device using the same
TWI517057B (en) * 2014-03-07 2016-01-11 神盾股份有限公司 Fingerprint recognition method and device
US10043185B2 (en) 2014-05-29 2018-08-07 Apple Inc. User interface for payments
US10289260B2 (en) * 2014-08-27 2019-05-14 Honda Motor Co., Ltd. Systems and techniques for application multi-tasking
CN105593868B (en) * 2014-09-09 2020-08-07 华为技术有限公司 Fingerprint identification method and device and mobile terminal
USD755839S1 (en) 2014-09-09 2016-05-10 Apple Inc. Display screen or portion thereof with graphical user interface
KR20160061163A (en) * 2014-11-21 2016-05-31 삼성전자주식회사 Method for registration and certification fingerprint and electronic device implementing the same
JP6055459B2 (en) * 2014-12-17 2016-12-27 京セラドキュメントソリューションズ株式会社 Touch panel device and image processing device
US9940572B2 (en) 2015-02-17 2018-04-10 Sys-Tech Solutions, Inc. Methods and a computing device for determining whether a mark is genuine
ES2915025T3 (en) 2015-06-16 2022-06-20 Sys Tech Solutions Inc Methods and a computer device to determine if a mark is genuine
CN106293191B (en) * 2015-06-19 2019-09-10 北京智谷睿拓技术服务有限公司 Information processing method and equipment
CN106325723B (en) 2015-06-19 2019-09-20 北京智谷睿拓技术服务有限公司 Information processing method and equipment
CN106293443B (en) * 2015-06-19 2019-09-20 北京智谷睿拓技术服务有限公司 Information processing method and equipment
JP7061465B2 (en) 2015-10-13 2022-04-28 華為技術有限公司 Operation methods, devices, and mobile devices that use fingerprint recognition
CN105302278B (en) 2015-10-19 2018-08-03 广东欧珀移动通信有限公司 The control method and device and mobile terminal of fingerprint sensor Serial Peripheral Interface (SPI)
US20170153696A1 (en) * 2015-11-30 2017-06-01 International Business Machines Corporation Method and system for association of biometric sensor data with dynamic actions
US10719689B2 (en) * 2015-12-15 2020-07-21 Huawei Technologies Co., Ltd. Electronic device and fingerprint recognition method
KR102509018B1 (en) * 2016-01-11 2023-03-14 삼성디스플레이 주식회사 Display device and driving method thereof
AU2017234124B2 (en) 2016-03-14 2018-11-22 Sys-Tech Solutions, Inc. Methods and a computing device for determining whether a mark is genuine
US10713697B2 (en) 2016-03-24 2020-07-14 Avante International Technology, Inc. Farm product exchange system and method suitable for multiple small producers
CN106028141A (en) * 2016-04-28 2016-10-12 乐视控股(北京)有限公司 Interface display method and device of intelligent terminal
US9977946B2 (en) * 2016-05-03 2018-05-22 Novatek Microelectronics Corp. Fingerprint sensor apparatus and method for sensing fingerprint
DK179186B1 (en) 2016-05-19 2018-01-15 Apple Inc REMOTE AUTHORIZATION TO CONTINUE WITH AN ACTION
DK179593B1 (en) 2016-06-12 2019-02-25 Apple Inc. User interface for managing controllable external devices
WO2018023598A1 (en) * 2016-08-04 2018-02-08 薄冰 Method and login system for matching software account number based on fingerprint
WO2018023596A1 (en) * 2016-08-04 2018-02-08 薄冰 Method for suspending matching technology according to fingerprint and software account, and login system
WO2018068328A1 (en) 2016-10-14 2018-04-19 华为技术有限公司 Interface display method and terminal
USD804508S1 (en) 2016-10-26 2017-12-05 Apple Inc. Display screen or portion thereof with graphical user interface
US10402161B2 (en) 2016-11-13 2019-09-03 Honda Motor Co., Ltd. Human-vehicle interaction
KR102616793B1 (en) * 2016-11-15 2023-12-26 삼성전자 주식회사 Electronic device and method for providing scrren thereof
CN106843737B (en) * 2017-02-13 2020-05-08 北京新美互通科技有限公司 Text input method and device and terminal equipment
US10169631B2 (en) 2017-03-06 2019-01-01 International Business Machines Corporation Recognizing fingerprints and fingerprint combinations as inputs
US10534899B2 (en) 2017-08-24 2020-01-14 Blackberry Limited Utilizing inputs for accessing devices
KR102185854B1 (en) 2017-09-09 2020-12-02 애플 인크. Implementation of biometric authentication
EP4156129A1 (en) 2017-09-09 2023-03-29 Apple Inc. Implementation of biometric enrollment
US10140502B1 (en) * 2018-02-13 2018-11-27 Conduit Ltd Selecting data items using biometric features
CN111684762B (en) 2018-03-28 2022-11-18 华为技术有限公司 Terminal device management method and terminal device
EP4113989A1 (en) 2018-05-07 2023-01-04 Apple Inc. User interfaces for viewing live video feeds and recorded video
DE102018004376A1 (en) * 2018-06-01 2019-12-05 Psa Automobiles Sa Operating device with fingerprint triggering functions
US11170085B2 (en) 2018-06-03 2021-11-09 Apple Inc. Implementation of biometric authentication
US10860096B2 (en) 2018-09-28 2020-12-08 Apple Inc. Device control using gaze information
US11100349B2 (en) 2018-09-28 2021-08-24 Apple Inc. Audio assisted enrollment
KR102634349B1 (en) * 2018-10-11 2024-02-07 현대자동차주식회사 Apparatus and method for controlling display of vehicle
KR20200091751A (en) * 2019-01-23 2020-07-31 휴렛-팩커드 디벨롭먼트 컴퍼니, 엘.피. Image forming operation based on biometric information
US11698959B2 (en) * 2019-03-26 2023-07-11 Gear Radio Electronics Corp. Setup method, recognition method and electronic device using the same
TWI735171B (en) * 2019-05-10 2021-08-01 聚睿電子股份有限公司 Setup method, recognition method and electronic device using the same
US10904029B2 (en) 2019-05-31 2021-01-26 Apple Inc. User interfaces for managing controllable external devices
US11363071B2 (en) 2019-05-31 2022-06-14 Apple Inc. User interfaces for managing a local network
US11513667B2 (en) 2020-05-11 2022-11-29 Apple Inc. User interface for audio message
KR102499625B1 (en) * 2020-09-10 2023-02-14 코나아이 (주) Multi card including fingerprint input unit and payment method using the same
US11409410B2 (en) 2020-09-14 2022-08-09 Apple Inc. User input interfaces
US11907342B2 (en) * 2020-11-20 2024-02-20 Qualcomm Incorporated Selection of authentication function according to environment of user device
USD1007521S1 (en) 2021-06-04 2023-12-12 Apple Inc. Display screen or portion thereof with graphical user interface

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030028811A1 (en) * 2000-07-12 2003-02-06 Walker John David Method, apparatus and system for authenticating fingerprints, and communicating and processing commands and information based on the fingerprint authentication
US20050012714A1 (en) * 2003-06-25 2005-01-20 Russo Anthony P. System and method for a miniature user input device
US7003670B2 (en) * 2001-06-08 2006-02-21 Musicrypt, Inc. Biometric rights management system

Family Cites Families (178)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US208348A (en) * 1878-09-24 Improvement in printers galleys
EP0173972B1 (en) 1984-08-30 1991-02-27 Nec Corporation Pre-processing system for pre-processing an image signal succession prior to identification
GB8914235D0 (en) 1989-06-21 1989-08-09 Tait David A G Finger operable control devices
US5327161A (en) 1989-08-09 1994-07-05 Microtouch Systems, Inc. System and method for emulating a mouse input device with a touchpad input device
JPH0471079A (en) 1990-07-12 1992-03-05 Takayama:Kk Positioning method for image
US5170364A (en) 1990-12-06 1992-12-08 Biomechanics Corporation Of America Feedback system for load bearing surface
US5666113A (en) 1991-07-31 1997-09-09 Microtouch Systems, Inc. System for using a touchpad input device for cursor control and keyboard emulation
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
US5821930A (en) 1992-08-23 1998-10-13 U S West, Inc. Method and system for generating a working window in a computer system
US5612719A (en) 1992-12-03 1997-03-18 Apple Computer, Inc. Gesture sensitive buttons for graphical user interfaces
WO1995008167A1 (en) 1993-09-13 1995-03-23 Asher David J Joystick with membrane sensor
US6546112B1 (en) 1993-11-18 2003-04-08 Digimarc Corporation Security document with steganographically-encoded authentication data
US5825907A (en) 1994-12-28 1998-10-20 Lucent Technologies Inc. Neural network system for classifying fingerprints
FR2730810B1 (en) 1995-02-21 1997-03-14 Thomson Csf HIGHLY SELECTIVE CHEMICAL SENSOR
US5740276A (en) 1995-07-27 1998-04-14 Mytec Technologies Inc. Holographic method for encrypting and decrypting information using a fingerprint
US5825352A (en) 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5841888A (en) 1996-01-23 1998-11-24 Harris Corporation Method for fingerprint indexing and searching
US5963679A (en) 1996-01-26 1999-10-05 Harris Corporation Electric field fingerprint sensor apparatus and related methods
US6067368A (en) 1996-01-26 2000-05-23 Authentec, Inc. Fingerprint sensor having filtering and power conserving features and related methods
US5828773A (en) 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
US5956415A (en) 1996-01-26 1999-09-21 Harris Corporation Enhanced security fingerprint sensor package and related methods
JP3747520B2 (en) 1996-01-30 2006-02-22 富士ゼロックス株式会社 Information processing apparatus and information processing method
US5995630A (en) 1996-03-07 1999-11-30 Dew Engineering And Development Limited Biometric input with encryption
FR2749955B1 (en) 1996-06-14 1998-09-11 Thomson Csf FINGERPRINT READING SYSTEM
US5943044A (en) 1996-08-05 1999-08-24 Interlink Electronics Force sensing semiconductive touchpad
US6208329B1 (en) 1996-08-13 2001-03-27 Lsi Logic Corporation Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device
JPH1069346A (en) 1996-08-28 1998-03-10 Alps Electric Co Ltd Coordinate input device and its control method
JPH1079948A (en) 1996-09-03 1998-03-24 Mitsubishi Electric Corp Image encoding device
US6219793B1 (en) 1996-09-11 2001-04-17 Hush, Inc. Method of using fingerprints to authenticate wireless communications
US6337918B1 (en) 1996-11-04 2002-01-08 Compaq Computer Corporation Computer system with integratable touchpad/security subsystem
FR2755526B1 (en) 1996-11-05 1999-01-22 Thomson Csf FINGERPRINT READING SYSTEM WITH INTEGRATED HEATING RESISTORS
US6061051A (en) 1997-01-17 2000-05-09 Tritech Microelectronics Command set for touchpad pen-input mouse
US6057830A (en) 1997-01-17 2000-05-02 Tritech Microelectronics International Ltd. Touchpad mouse controller
US5995084A (en) 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US5982894A (en) 1997-02-06 1999-11-09 Authentec, Inc. System including separable protected components and associated methods
US5909211A (en) 1997-03-25 1999-06-01 International Business Machines Corporation Touch pad overlay driven computer system
CA2203212A1 (en) 1997-04-21 1998-10-21 Vijayakumar Bhagavatula Methodology for biometric encryption
US6125192A (en) 1997-04-21 2000-09-26 Digital Persona, Inc. Fingerprint recognition system
US5903225A (en) 1997-05-16 1999-05-11 Harris Corporation Access control system including fingerprint sensor enrollment and associated methods
US6088471A (en) 1997-05-16 2000-07-11 Authentec, Inc. Fingerprint sensor including an anisotropic dielectric coating and associated methods
US6088585A (en) 1997-05-16 2000-07-11 Authentec, Inc. Portable telecommunication device including a fingerprint sensor and related methods
US5953441A (en) 1997-05-16 1999-09-14 Harris Corporation Fingerprint sensor having spoof reduction features and related methods
US5920640A (en) 1997-05-16 1999-07-06 Harris Corporation Fingerprint sensor and token reader and associated methods
US6259804B1 (en) 1997-05-16 2001-07-10 Authentic, Inc. Fingerprint sensor with gain control features and associated methods
US6098330A (en) 1997-05-16 2000-08-08 Authentec, Inc. Machine including vibration and shock resistant fingerprint sensor and related methods
US5940526A (en) 1997-05-16 1999-08-17 Harris Corporation Electric field fingerprint sensor having enhanced features and related methods
US5943052A (en) 1997-08-12 1999-08-24 Synaptics, Incorporated Method and apparatus for scroll bar control
US6011849A (en) 1997-08-28 2000-01-04 Syndata Technologies, Inc. Encryption-based selection system for steganography
US6483931B2 (en) 1997-09-11 2002-11-19 Stmicroelectronics, Inc. Electrostatic discharge protection of a capacitve type fingerprint sensing array
US6035398A (en) 1997-11-14 2000-03-07 Digitalpersona, Inc. Cryptographic key generation using biometric data
US6028773A (en) 1997-11-14 2000-02-22 Stmicroelectronics, Inc. Packaging for silicon sensors
US6330345B1 (en) 1997-11-17 2001-12-11 Veridicom, Inc. Automatic adjustment processing for sensor devices
US6047281A (en) 1997-12-05 2000-04-04 Authentec, Inc. Method and apparatus for expandable biometric searching
US6070159A (en) 1997-12-05 2000-05-30 Authentec, Inc. Method and apparatus for expandable biometric searching
US6047282A (en) 1997-12-05 2000-04-04 Authentec, Inc. Apparatus and method for expandable biometric searching
US6317508B1 (en) 1998-01-13 2001-11-13 Stmicroelectronics, Inc. Scanning capacitive semiconductor fingerprint detector
US6408087B1 (en) 1998-01-13 2002-06-18 Stmicroelectronics, Inc. Capacitive semiconductor user input device
EP1717681B1 (en) 1998-01-26 2015-04-29 Apple Inc. Method for integrating manual input
US6141753A (en) 1998-02-10 2000-10-31 Fraunhofer Gesellschaft Secure distribution of digital representations
EP0940652B1 (en) 1998-03-05 2004-12-22 Nippon Telegraph and Telephone Corporation Surface shape recognition sensor and method of fabricating the same
US6278443B1 (en) 1998-04-30 2001-08-21 International Business Machines Corporation Touch screen with random finger placement and rolling on screen to control the movement of information on-screen
US6057540A (en) 1998-04-30 2000-05-02 Hewlett-Packard Co Mouseless optical and position translation type screen pointer control for a computer system
EP1076878B1 (en) 1998-05-08 2002-07-24 Siemens Aktiengesellschaft Method for producing a reference image for pattern recognition tasks
US6400836B2 (en) 1998-05-15 2002-06-04 International Business Machines Corporation Combined fingerprint acquisition and control device
US6404900B1 (en) 1998-06-22 2002-06-11 Sharp Laboratories Of America, Inc. Method for robust human face tracking in presence of multiple persons
GB9814398D0 (en) * 1998-07-02 1998-09-02 Nokia Mobile Phones Ltd Electronic apparatus
CA2273560A1 (en) 1998-07-17 2000-01-17 David Andrew Inglis Finger sensor operating technique
US6135958A (en) 1998-08-06 2000-10-24 Acuson Corporation Ultrasound imaging system with touch-pad pointing device
JP2000056877A (en) 1998-08-07 2000-02-25 Nec Corp Touch panel type layout free keyboard
US6950539B2 (en) 1998-09-16 2005-09-27 Digital Persona Configurable multi-function touchpad device
US6256022B1 (en) 1998-11-06 2001-07-03 Stmicroelectronics S.R.L. Low-cost semiconductor user input device
US6442286B1 (en) 1998-12-22 2002-08-27 Stmicroelectronics, Inc. High security flash memory and method
US6320975B1 (en) 1999-04-22 2001-11-20 Thomas Vieweg Firearm holster lock with fingerprint identification means
US6535622B1 (en) 1999-04-26 2003-03-18 Veridicom, Inc. Method for imaging fingerprints and concealing latent fingerprints
US6628812B1 (en) 1999-05-11 2003-09-30 Authentec, Inc. Fingerprint sensor package having enhanced electrostatic discharge protection and associated methods
US6683971B1 (en) 1999-05-11 2004-01-27 Authentec, Inc. Fingerprint sensor with leadframe bent pin conductive path and associated methods
NO307770B1 (en) 1999-05-20 2000-05-22 Idex As Method and system for verifying the identity of a sensor
US6744910B1 (en) 1999-06-25 2004-06-01 Cross Match Technologies, Inc. Hand-held fingerprint scanner with on-board image normalization data storage
JP2001023473A (en) 1999-07-07 2001-01-26 Matsushita Electric Ind Co Ltd Mobile communication terminal unit and transparent touch panel switch for use in it
US6681034B1 (en) 1999-07-15 2004-01-20 Precise Biometrics Method and system for fingerprint template matching
US6546122B1 (en) 1999-07-29 2003-04-08 Veridicom, Inc. Method for combining fingerprint templates representing various sensed areas of a fingerprint to derive one fingerprint template representing the fingerprint
DE19935910A1 (en) 1999-07-30 2001-02-08 Siemens Ag Passivation layer structure
DE19936322C2 (en) 1999-08-02 2001-08-09 Infineon Technologies Ag Semiconductor component with scratch-resistant coating
WO2001029731A1 (en) 1999-10-21 2001-04-26 3Com Corporation Access control using a personal digital assistant-type
JP2003529130A (en) 1999-10-27 2003-09-30 ガーサビアン、フィルーツ Integrated keypad system
US6654484B2 (en) * 1999-10-28 2003-11-25 Catherine Topping Secure control data entry system
US20030013849A1 (en) * 1999-10-29 2003-01-16 Ward William W. Renilla reniformis green fluorescent protein
WO2001039134A2 (en) 1999-11-25 2001-05-31 Infineon Technologies Ag Security system comprising a biometric sensor
US7054470B2 (en) 1999-12-02 2006-05-30 International Business Machines Corporation System and method for distortion characterization in fingerprint and palm-print image sequences and using this distortion as a behavioral biometrics
GB2357335B (en) 1999-12-17 2004-04-07 Nokia Mobile Phones Ltd Fingerprint recognition and pointing device
US6920560B2 (en) 1999-12-30 2005-07-19 Clyde Riley Wallace, Jr. Secure network user states
EP1113383A3 (en) 1999-12-30 2003-12-17 STMicroelectronics, Inc. Enhanced fingerprint detection
US6512381B2 (en) 1999-12-30 2003-01-28 Stmicroelectronics, Inc. Enhanced fingerprint detection
US7239227B1 (en) 1999-12-30 2007-07-03 Upek, Inc. Command interface using fingerprint sensor input system
US20040252867A1 (en) 2000-01-05 2004-12-16 Je-Hsiung Lan Biometric sensor
US20010032319A1 (en) 2000-01-10 2001-10-18 Authentec, Inc. Biometric security system for computers and related method
US6563101B1 (en) 2000-01-19 2003-05-13 Barclay J. Tullis Non-rectilinear sensor arrays for tracking an image
US6754365B1 (en) 2000-02-16 2004-06-22 Eastman Kodak Company Detecting embedded information in images
DE10009539A1 (en) 2000-02-29 2001-09-20 Infineon Technologies Ag Analysis of electronically generated fingerprint images
ATE280976T1 (en) 2000-03-24 2004-11-15 Infineon Technologies Ag HOUSING FOR BIOMETRIC SENSOR CHIPS
JP4426733B2 (en) 2000-03-31 2010-03-03 富士通株式会社 Fingerprint data synthesizing method, fingerprint data synthesizing device, fingerprint data synthesizing program, and computer-readable recording medium recording the program
US6819784B1 (en) 2000-04-04 2004-11-16 Upek, Inc. Method of and system for compensating for injection gradient in a capacitive sensing circuit array
EP1143373B1 (en) 2000-04-05 2008-07-16 Infineon Technologies AG Method for error-free image acquisition using an electronic sensor
EP1146471B1 (en) 2000-04-14 2005-11-23 Infineon Technologies AG Capacitive biometric sensor
US6518560B1 (en) 2000-04-27 2003-02-11 Veridicom, Inc. Automatic gain amplifier for biometric sensor device
CN100342422C (en) 2000-05-24 2007-10-10 英默森公司 Haptic devices using electroactive polymers
NO315016B1 (en) 2000-06-09 2003-06-23 Idex Asa Miniature sensor
NO314647B1 (en) 2000-06-09 2003-04-22 Idex Asa Fingerprint sensor measurement system
NO315017B1 (en) 2000-06-09 2003-06-23 Idex Asa Sensor chip, especially for measuring structures in a finger surface
NO20003006L (en) 2000-06-09 2001-12-10 Idex Asa Mouse
NO20003007L (en) 2000-06-09 2001-12-10 Idex Asa Fingerprint sensor speed calculation
NO20003002L (en) 2000-06-09 2001-12-10 Idex Asa Speed calculation of fingerprint measurement using flank measurement
US6667439B2 (en) 2000-08-17 2003-12-23 Authentec, Inc. Integrated circuit package including opening exposing portion of an IC
WO2002015209A2 (en) 2000-08-17 2002-02-21 Authentec Inc. Methods and apparatus for making integrated circuit package including opening exposing portion of the ic
US6501284B1 (en) 2000-08-28 2002-12-31 Stmicroelectronics, Inc. Capacitive finger detection for fingerprint sensor
US6661631B1 (en) 2000-09-09 2003-12-09 Stmicroelectronics, Inc. Automatic latchup recovery circuit for fingerprint sensor
DE10059099C1 (en) 2000-11-28 2002-06-06 Infineon Technologies Ag Component with ESD protection, e.g. Foil sensor for biometric recognition (fingerprint recognition sensor)
JP2002244781A (en) 2001-02-15 2002-08-30 Wacom Co Ltd Input system, program, and recording medium
DE10109327A1 (en) 2001-02-27 2002-09-12 Infineon Technologies Ag Semiconductor chip and manufacturing method for a package
DE10110724A1 (en) 2001-03-06 2002-09-26 Infineon Technologies Ag Fingerprint sensor with potential modulation of the ESD protective grid
DE10111805A1 (en) 2001-03-12 2002-09-26 Infineon Technologies Ag authentication medium
US6603462B2 (en) 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
DE10120067C1 (en) 2001-04-24 2002-06-13 Siemens Ag Mobile communications device has incorporated biometric sensor for fingerprint checking for activation of communications device
US7256589B2 (en) 2001-04-27 2007-08-14 Atrua Technologies, Inc. Capacitive sensor system with improved capacitance measuring sensitivity
US6515488B1 (en) 2001-05-07 2003-02-04 Stmicroelectronics, Inc. Fingerprint detector with scratch resistant surface and embedded ESD protection grid
US7369688B2 (en) 2001-05-09 2008-05-06 Nanyang Technological Univeristy Method and device for computer-based processing a template minutia set of a fingerprint and a computer readable storage medium
DE10123330A1 (en) 2001-05-14 2002-11-28 Infineon Technologies Ag Detection of falsified fingerprints, e.g. a silicon casting of a fingerprint, using a dynamic, software-based method for detection of falsified fingerprints that is quick and efficient
GB0112161D0 (en) 2001-05-18 2001-07-11 Rogers Alan J Distributed fibre polarimetry for communications and sensing
TW506580U (en) 2001-06-06 2002-10-11 First Int Computer Inc Wireless remote control device of notebook computer
US7203347B2 (en) 2001-06-27 2007-04-10 Activcard Ireland Limited Method and system for extracting an area of interest from within a swipe image of a biological surface
WO2003007125A2 (en) 2001-07-12 2003-01-23 Icontrol Transactions, Inc. Secure network and networked devices using biometrics
US20030021495A1 (en) 2001-07-12 2003-01-30 Ericson Cheng Fingerprint biometric capture device and method with integrated on-chip data buffering
US6597289B2 (en) 2001-07-31 2003-07-22 Stmicroelectronics, Inc. Fingerprint sensor power management detection of overcurrent
DE10139382A1 (en) 2001-08-10 2003-02-27 Infineon Technologies Ag Chip card with integrated fingerprint sensor
US20030035568A1 (en) 2001-08-20 2003-02-20 Mitev Mitko G. User interface including multifunction fingerprint roller and computer including the same
US20030038824A1 (en) * 2001-08-24 2003-02-27 Ryder Brian D. Addition of mouse scrolling and hot-key functionality to biometric security fingerprint readers in notebook computers
JP2003075135A (en) 2001-08-31 2003-03-12 Nec Corp Fingerprint image input device and organism discrimination method by fingerprint image
US7131004B1 (en) 2001-08-31 2006-10-31 Silicon Image, Inc. Method and apparatus for encrypting data transmitted over a serial link
US20030123714A1 (en) * 2001-11-06 2003-07-03 O'gorman Lawrence Method and system for capturing fingerprints from multiple swipe images
JP3773442B2 (en) 2001-11-22 2006-05-10 シャープ株式会社 Image forming apparatus
NO316776B1 (en) 2001-12-07 2004-05-03 Idex Asa Package solution for fingerprint sensor
NO318294B1 (en) 2001-12-07 2005-02-28 Idex Asa Navigation Concept
NO316002B1 (en) 2001-12-07 2003-11-24 Idex Asa Method and apparatus for generating sound effects
DE60214044T2 (en) 2001-12-07 2007-02-15 Idex Asa SENSOR FOR MEASUREMENTS ON WETS AND DRY FINGERS
US20030108227A1 (en) 2001-12-11 2003-06-12 Koninklijke Philips Electronics N.V. Remote control with the fingerprint recognition capability
US7929951B2 (en) * 2001-12-20 2011-04-19 Stevens Lawrence A Systems and methods for storage of user information and for verifying user identity
US7002553B2 (en) 2001-12-27 2006-02-21 Mark Shkolnikov Active keyboard system for handheld electronic devices
US20030135764A1 (en) 2002-01-14 2003-07-17 Kun-Shan Lu Authentication system and apparatus having fingerprint verification capabilities thereof
US7013030B2 (en) * 2002-02-14 2006-03-14 Wong Jacob Y Personal choice biometric signature
NO316796B1 (en) 2002-03-01 2004-05-10 Idex Asa Sensor module for painting structures in a surface, especially a finger surface
US7076089B2 (en) 2002-05-17 2006-07-11 Authentec, Inc. Fingerprint sensor having enhanced ESD protection and associated methods
JP2004110438A (en) 2002-09-18 2004-04-08 Nec Corp Image processing device, image processing method, and program
US7116805B2 (en) * 2003-01-07 2006-10-03 Avagotechnologies Ecbu Ip (Singapore) Pte. Ltd. Fingerprint verification device
US7404086B2 (en) * 2003-01-24 2008-07-22 Ac Technology, Inc. Method and apparatus for biometric authentication
US20070034783A1 (en) 2003-03-12 2007-02-15 Eliasson Jonas O P Multitasking radiation sensor
US7941849B2 (en) * 2003-03-21 2011-05-10 Imprivata, Inc. System and method for audit tracking
AU2004227886A1 (en) 2003-04-04 2004-10-21 Lumidigm, Inc. Multispectral biometric sensor
US7274808B2 (en) * 2003-04-18 2007-09-25 Avago Technologies Ecbu Ip (Singapore)Pte Ltd Imaging system and apparatus for combining finger recognition and finger navigation
GB2401979B (en) 2003-05-21 2007-03-21 Research In Motion Ltd Apparatus and method of input and finger print recognition on a handheld electronic device
US20070038867A1 (en) 2003-06-02 2007-02-15 Verbauwhede Ingrid M System for biometric signal processing with hardware and software acceleration
US7088220B2 (en) 2003-06-20 2006-08-08 Motorola, Inc. Method and apparatus using biometric sensors for controlling access to a wireless communication device
US7587072B2 (en) * 2003-08-22 2009-09-08 Authentec, Inc. System for and method of generating rotational inputs
JP4859053B2 (en) 2003-09-12 2012-01-18 フラットフロッグ・ラボラトリーズ・アクチボラゲット System and method for locating radiation scattering / reflecting elements
US7577659B2 (en) 2003-10-24 2009-08-18 Microsoft Corporation Interoperable credential gathering and access modularity
TWI260525B (en) 2003-12-30 2006-08-21 Icp Electronics Inc Switch control system for multiple input devices and method thereof
WO2005079413A2 (en) 2004-02-12 2005-09-01 Atrua Technologies, Inc. System and method of emulating mouse operations using finger image sensors
US7574022B2 (en) 2004-05-20 2009-08-11 Atrua Technologies Secure system and method of creating and processing partial finger images
US7113179B2 (en) 2004-06-23 2006-09-26 Interlink Electronics, Inc. Force sensing resistor with calibration element and method of manufacturing same
JP2006053629A (en) 2004-08-10 2006-02-23 Toshiba Corp Electronic equipment, control method and control program
US7797750B2 (en) 2004-08-10 2010-09-14 Newport Scientific Research Llc Data security system
US7280679B2 (en) 2004-10-08 2007-10-09 Atrua Technologies, Inc. System for and method of determining pressure on a finger sensor
US20060103633A1 (en) 2004-11-17 2006-05-18 Atrua Technologies, Inc. Customizable touch input module for an electronic device
US20060242268A1 (en) 2005-04-25 2006-10-26 General Electric Company Mobile radiology system with automated DICOM image transfer and PPS queue management
US7505613B2 (en) 2005-07-12 2009-03-17 Atrua Technologies, Inc. System for and method of securing fingerprint biometric systems against fake-finger spoofing
US20070061126A1 (en) 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US8090945B2 (en) 2005-09-16 2012-01-03 Tara Chand Singhal Systems and methods for multi-factor remote user authentication
US7791596B2 (en) 2005-12-27 2010-09-07 Interlink Electronics, Inc. Touch input device having interleaved scroll sensors
US7885436B2 (en) 2006-07-13 2011-02-08 Authentec, Inc. System for and method of assigning confidence values to fingerprint minutiae points

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030028811A1 (en) * 2000-07-12 2003-02-06 Walker John David Method, apparatus and system for authenticating fingerprints, and communicating and processing commands and information based on the fingerprint authentication
US7003670B2 (en) * 2001-06-08 2006-02-21 Musicrypt, Inc. Biometric rights management system
US20050012714A1 (en) * 2003-06-25 2005-01-20 Russo Anthony P. System and method for a miniature user input device

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103577739A (en) * 2013-11-15 2014-02-12 青岛尚慧信息技术有限公司 Intelligent mobile terminal and setting and accessing control method thereof
CN103577739B (en) * 2013-11-15 2016-08-17 上海快应信息科技有限公司 A kind of intelligent mobile terminal and setting thereof and access control method
WO2018023579A1 (en) * 2016-08-04 2018-02-08 薄冰 Method for stopping using fingerprint-enabled software according to user feedback, and mobile phone system

Also Published As

Publication number Publication date
WO2005072372A3 (en) 2007-09-27
US20050169503A1 (en) 2005-08-04
US7697729B2 (en) 2010-04-13

Similar Documents

Publication Publication Date Title
US7697729B2 (en) System for and method of finger initiated actions
US10621324B2 (en) Fingerprint gestures
US7673149B2 (en) Identification and/or authentication method
TWI490725B (en) Electronic device including finger-operated input device based biometric enrollment and related methods
EP3014509B1 (en) User verification for changing a setting of an electronic device
US8856543B2 (en) User identification with biokinematic input
US9348987B2 (en) Electronic device including finger-operated input device based biometric matching and related methods
KR101848948B1 (en) Methods and systems for enrolling biometric data
CN204833267U (en) Use biological measurement configuration electronic equipment's of remote user system and electronic equipment
JP2012521170A (en) Biometric recognition scan configuration and method
WO2013165801A1 (en) Electronic device including a finger sensor having a valid authentication threshold time period and related methods
WO2011126515A1 (en) Authenticating a person's identity using rfid card, biometric signature recognition and facial recognition
CN108629174B (en) Method and device for checking character strings
JP6407772B2 (en) Input device
GB2447752A (en) Registering fingerprints for application software login
JP2005004490A (en) Document processor and its program
JP4193123B2 (en) Document processing apparatus and document processing method
JP2009159539A (en) Electronic appliance
US20190130085A1 (en) Systems and methods of providing seamless and secure operations of authenticating and advertising on mobile communication terminals
JP2007265219A (en) Biometrics system
JP2003150557A (en) Automatic input method of information by organismic authentication, its automatic input system and its automatic input program
US11423183B2 (en) Thermal imaging protection
WO2020128693A1 (en) Device and method to control access to protected functionality of applications
JP3687569B2 (en) Portable display device and program

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

NENP Non-entry into the national phase

Ref country code: DE

WWW Wipo information: withdrawn in national office

Country of ref document: DE

121 Ep: the epo has been informed by wipo that ep was designated in this application
122 Ep: pct application non-entry in european phase