WO2002017221A1 - Fingerprint scanner auto-capture system and method - Google Patents

Fingerprint scanner auto-capture system and method

Info

Publication number
WO2002017221A1
WO2002017221A1 PCT/US2000/035434
Authority
WO
WIPO (PCT)
Prior art keywords
image
darkness
fingerprint
acceptable
test
Prior art date
Application number
PCT/US2000/035434
Other languages
French (fr)
Inventor
David C. Smith
Original Assignee
Cross Match Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Cross Match Technologies, Inc. filed Critical Cross Match Technologies, Inc.
Priority to JP2002521215A priority Critical patent/JP2004506993A/en
Priority to DE60027207T priority patent/DE60027207T2/en
Priority to AU2001222942A priority patent/AU2001222942A1/en
Priority to EP00986761A priority patent/EP1312040B1/en
Publication of WO2002017221A1 publication Critical patent/WO2002017221A1/en

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00 - Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10 - Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12 - Fingerprints or palmprints
    • G06V40/13 - Sensors therefor
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/98 - Detection or correction of errors, e.g. by rescanning the pattern or by human intervention; Evaluation of the quality of the acquired patterns
    • G06V10/993 - Evaluation of the quality of the acquired pattern

Definitions

  • FIG. 3 is an illustration of a routine 300 for capturing an acceptable fingerprint image according to an embodiment of the present invention.
  • In a first step 305, an initial fingerprint image is captured at a nominal integration time.
  • During this step, the fingerprint scanner is "waiting" for the presence of a finger.
  • The first step 305 involves the fingerprint scanner continually capturing images at the nominal integration time until the presence of a finger is detected. The presence of a finger is detected by performing a darkness test after each image is captured at the nominal integration time.
  • The darkness test used can be a darkness test according to the present invention, described more fully below in connection with FIGs. 4A and 4B.
  • The nominal integration time can be an integration time expected to capture an acceptable fingerprint image based on the intensity of the light source used and the sensitivity of the camera, discounting any variations due to the quality of the contact between the finger and the fingerprint capture surface.
  • There is a range of integration times associated with a given camera, for example from 20-120 milliseconds.
  • The nominal integration time can thus be determined in advance, based on expected conditions, as a particular integration time from within the typical range for a given camera.
  • A typical nominal integration time can be 50 ms, though other nominal integration times could be chosen without departing from the scope of the present invention.
  • For example, a nominal integration time from within the range of 40 ms to 60 ms could be selected for a camera with an integration time range of 20-120 ms.
  • Next, in a step 310, an intermediate fingerprint image is captured at a first intermediate integration time.
  • The present invention uses a set of integration times to find an optimal integration time once an initial fingerprint image is captured at the nominal integration time.
  • The set of integration times can be derived from the nominal integration time.
  • For example, the set of integration times can include six integration times that are each equal to the nominal integration time multiplied by an appropriate scaling factor.
  • Specifically, the integration times can be equal to 6/7, 7/7, 8/7, 9/7, 10/7, and 11/7 of the nominal integration time.
  • For a nominal integration time of 50 ms, the integration times used in the routine would be 43 ms, 50 ms, 57 ms, 64 ms, 71 ms, and 79 ms.
  • In this example, the integration time is first shortened to 43 ms and an intermediate fingerprint image is captured.
  • Additional intermediate fingerprint images can then be captured at higher integration times until an acceptable fingerprint image is captured. It should thus be apparent to one skilled in the relevant art that the particular integration times used are not critical, so long as a range of integration times around the nominal integration time is used.
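  • As a minimal illustration of this derivation, the candidate integration times can be computed as follows (Python sketch; the function name and the rounding to whole milliseconds are assumptions made for illustration, not part of the disclosure):

    def candidate_integration_times(nominal_ms=50):
        # Scaling factors 6/7 through 11/7 applied to the nominal integration time.
        factors = (6, 7, 8, 9, 10, 11)
        return [round(nominal_ms * f / 7) for f in factors]

    # candidate_integration_times(50) -> [43, 50, 57, 64, 71, 79]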
  • In a step 315, an image darkness test of the intermediate image captured in step 310 is performed. Such an image darkness test is used to determine whether the intermediate image is sufficiently dark.
  • An image darkness test of the present invention, as discussed below in connection with FIGs. 4A and 4B, can be used. Other image darkness tests could also be used without departing from the scope of the present invention. For example, simply averaging the values of all the pixels in the camera can give an indication of the darkness level of the captured intermediate image.
  • Based on the outcome of the darkness test, a next step 325 or 330 is performed, as shown in FIG. 3 at 320.
  • The particular level of darkness required for an acceptable darkness level is not critical and could be determined by one skilled in the relevant art given this disclosure.
  • The acceptable darkness level can be environment- and use-specific and thus can be set by the manufacturer or user, as appropriate.
  • If the darkness level is unacceptable, a next step 325 of incrementing the image integration time and capturing another intermediate image at the incremented integration time is performed. The only exception is when the integration time cannot be incremented because the intermediate fingerprint image was captured at the highest integration time. In such a case, the routine returns to step 305.
  • Thus, routine 300 includes a loop in which steps 315, 320, and 325 repeat until an intermediate image with an acceptable darkness level has been captured.
  • Once an intermediate image with an acceptable darkness level has been captured, an image definition test is performed at a step 330.
  • The image definition test used can be an image definition test according to the present invention, discussed below in connection with FIGs. 5A and 5B. Such an image definition test counts the number of ridges in predefined areas by focusing on pixel patterns that include minimum numbers of consecutive light and dark pixels generally representative of the presence of the ridges and valleys characteristic of a fingerprint image. Alternatively, any image definition test that tests the captured image for its level of detail can be used without departing from the scope of the present invention.
  • The particular level of image definition required for an acceptable image definition level is not critical and could be determined by one skilled in the relevant art given this disclosure.
  • The acceptable image definition level can be environment- and use-specific and thus can be set by the manufacturer or user, as appropriate.
  • Once the image definition test has been performed in step 330, one of two different steps is conducted based on the outcome of that test, as shown at 335.
  • If the image definition test 330 indicated that the intermediate fingerprint image was of unacceptable definition, then the routine returns to step 325, discussed above. As with the above description of step 325, if the integration time cannot be incremented because the captured image was a result of the maximum integration time, routine 300 returns to step 305 to await a new initial fingerprint image.
  • If both tests are passed, an acceptable fingerprint image has been captured and routine 300 concludes at a step 340. Step 340 can include providing a signal that an acceptable fingerprint image has been captured. This signal can be audible, visible, or both.
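  • The overall flow of routine 300 described above can be summarized in a brief sketch (Python, for illustration only; capture_image, darkness_acceptable, and definition_acceptable are hypothetical helpers standing in for the camera interface and for the tests of FIGs. 4A and 5A, and are not part of the disclosure):

    def auto_capture(capture_image, darkness_acceptable, definition_acceptable,
                     nominal_ms=50):
        # Candidate integration times: 6/7 through 11/7 of the nominal time.
        times = [round(nominal_ms * f / 7) for f in (6, 7, 8, 9, 10, 11)]
        while True:
            # Step 305: wait for a finger by capturing at the nominal time
            # until a captured image passes the darkness test.
            if not darkness_acceptable(capture_image(nominal_ms)):
                continue
            # Steps 310-335: step through the intermediate integration times.
            for t in times:
                image = capture_image(t)              # steps 310 / 325
                if not darkness_acceptable(image):    # steps 315 / 320
                    continue
                if definition_acceptable(image):      # steps 330 / 335
                    return image                      # step 340: acceptable image
            # Maximum integration time reached without an acceptable image:
            # fall through and return to step 305 for a new capture event.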
  • FIG. 4A illustrates a routine 400 for testing image darkness in accordance with the present invention.
  • First, image darkness test lines are selected from a captured image.
  • FIG. 4B shows the details of such image test lines according to one example.
  • FIG. 4B illustrates an arrangement of image darkness test lines used in an image darkness test according to the present invention.
  • In FIG. 4B, image capture surface 210 is depicted with an expected image capture area 420.
  • Expected image capture area 420 is a region in which a fingerprint is expected to be located during an image capture event. The precise size and location of image capture area 420 can differ from that shown in the figure without departing from the scope of the invention.
  • Image test lines are situated throughout expected image capture area 420. Specifically, in the arrangement of FIG. 4B, there are ten image test lines 435, 436, and the like, arranged in five pairs of image test lines 430-434.
  • Each image test line 435, 436 is a diagonal arrangement of 32 pixels. Other numbers of pixels and arrangements of image test lines could be used without departing from the scope of the present invention.
  • Next, an average darkness value for each image darkness test line is calculated. Such an average can be calculated by adding the darkness value for each pixel in an image darkness test line and then dividing that sum by the number of pixels in the image darkness test line.
  • In a step 403, acceptable overall image darkness is verified. This verification can be done, for example, by verifying that a predetermined number of image darkness test lines have an associated average image darkness level above a threshold darkness level. In an embodiment, the predetermined number (or percentage) of image darkness test lines is eight (or 80% of the image darkness test lines). If eight of the image darkness test lines have an average image darkness level above the threshold darkness level, the overall image darkness is considered acceptable.
  • A next step 404 of verifying acceptability of image darkness distribution is then performed. It should be noted that if the previous step 403 resulted in a determination that overall image darkness was not acceptable for the tested image, it is not necessary that routine 400 continue; it could instead stop at step 403.
  • In step 404, image darkness distribution is tested. Despite the determination in step 403 that overall image darkness was acceptable, this darkness may have been concentrated in a particular region. For example, if all image darkness test lines in pairs 430-433, as shown in FIG. 4B, have acceptable darkness levels, the image will have an acceptable overall image darkness despite a lack of acceptable darkness in both image darkness test lines of pair 434. Thus, step 404 is used to verify that the darkness of the image is distributed throughout the expected image capture area 420. The step can be performed by verifying that at least one image darkness test line in each of the five pairs 430-434 of image darkness test lines has an acceptable darkness level. As with step 403, this can be done by comparing the average darkness value of each darkness test line with a predetermined threshold darkness value.
  • This threshold darkness value can be the same value used in connection with step 403.
  • The particular threshold darkness level chosen is not critical and could be determined by one skilled in the relevant art given this disclosure.
  • The acceptable darkness level can be based on the specific environment in which the fingerprint scanner is used, as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate. Because step 404 of the routine 400 shown in FIG. 4A verifies that the image darkness is distributed throughout expected image capture region 420, the routine 400 of FIG. 4A can be used to verify an acceptable darkness level throughout a particular region. Accordingly, such a routine 400 can be used as the image darkness test within the routine 300 shown in FIG. 3.
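  • A compact sketch of this two-part darkness test might look like the following (Python; representing each test line as a sequence of 32 pixel darkness values and passing an explicit threshold are illustrative assumptions, not part of the disclosure):

    def darkness_test(line_pairs, threshold, required_lines=8):
        # line_pairs: five pairs of image darkness test lines; each line is a
        # sequence of pixel darkness values (e.g. 32 pixels, as in FIG. 4B).
        def line_ok(line):
            return sum(line) / len(line) > threshold

        all_lines = [line for pair in line_pairs for line in pair]

        # Step 403: overall darkness -- at least `required_lines` of the test
        # lines must have an average darkness above the threshold.
        overall_ok = sum(line_ok(line) for line in all_lines) >= required_lines

        # Step 404: darkness distribution -- at least one line in each pair
        # must have an acceptable average darkness.
        distributed_ok = all(any(line_ok(line) for line in pair)
                             for pair in line_pairs)
        return overall_ok and distributed_ok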
  • FIG. 5A is an illustration of a routine 500 for testing image definition in accordance with an embodiment of the present invention. While the routine 400 of FIG. 4A tested an image for an acceptable darkness level, the routine 500 of FIG. 5A tests an image for an acceptable level of definition. Such a test is useful because, for example, a particular image may have an acceptable level of darkness while lacking the necessary ridge details characteristic of an acceptable fingerprint image. Thus, routine 500 tests an image for its definition level.
  • Routine 500 tests for image definition by counting ridges and valleys along image definition test lines.
  • First, image definition test lines are selected from a captured image to be tested. This will be explained in connection with FIG. 5B.
  • FIG. 5B illustrates an arrangement of image definition test lines used in an image definition test according to the present invention.
  • In FIG. 5B, image capture surface 210 is depicted with an expected image capture area 520.
  • Expected image capture area 520 is a region in which a fingerprint is expected to be located during an image capture event.
  • The precise size and location of image capture area 520 can differ from that shown in the figure without departing from the scope of the invention.
  • Within the image capture area 520 are arranged two groups 530, 540 of image definition test lines 531, 541, and the like.
  • Each image definition test line is a line of pixels within the image capture area 520.
  • The first group of image definition test lines 530 includes five vertically arranged parallel image definition test lines, e.g., 531.
  • The second group of image definition test lines 540 includes seven horizontally arranged parallel image definition test lines, e.g., 541. While specific numbers of image definition test lines have been depicted, other numbers of image definition test lines could be used without departing from the scope of the present invention. Likewise, while the arrangement of FIG. 5B has been selected to include more horizontally arranged lines than vertically arranged lines, different arrangements could be used without departing from the scope of the present invention.
  • In a step 502, a ridge count for each image definition test line is determined.
  • Such a ridge count can be determined by looking for a pattern of pixel undulations representative of an expected pattern of fingerprint ridges.
  • In a captured image, ridges are shown as adjacent dark areas separated from each other by intervening light areas representative of valleys.
  • Thus, a line of pixels that includes a number of fingerprint ridges will include a substantially continuous group of comparatively dark pixels followed by a substantially continuous group of comparatively light pixels. Whether a pixel is considered comparatively dark or light can be determined by selecting a mid-range light level.
  • This mid-range light level can be a single light level or a range of light levels.
  • A comparatively dark pixel is one that is on the dark side of this mid-range light level, while a comparatively light pixel is one that is on the light side of this mid-range light level.
  • A ridge can be identified by the presence of, for example, three or more continuous comparatively dark pixels bounded by, for example, three or more comparatively light pixels.
  • Thus, the number of ridges within one image definition test line can be determined in step 502 by counting groups of comparatively dark pixels separated by groups of comparatively light pixels.
  • The actual number of comparatively dark pixels necessary to define a ridge could be determined by one skilled in the relevant art given this disclosure.
  • Next, the ridge counts of the image definition test lines determined in step 502 are used to verify image definition acceptability. This can be done, for example, by verifying that the ridge count for each image definition test line is greater than a threshold ridge count value associated with that image definition test line.
  • The particular threshold ridge count values used are not critical and could be determined by one skilled in the relevant art given this disclosure. Rather than having a threshold ridge count value for each image definition test line, a single threshold ridge count value could be used for all the image definition test lines.
  • The acceptable image definition level can be based on the specific environment in which the fingerprint scanner is used, as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate.
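  • The ridge-counting definition test of routine 500 can be sketched as follows (Python; the use of darkness values where larger means darker, a single mid-range threshold, the example run length of three pixels, a single shared ridge-count threshold, and a simplified "preceded by" version of the bounding rule are all illustrative assumptions, and the helper names are not from the disclosure):

    def count_ridges(line, mid_level, run_length=3):
        # Count ridges along one image definition test line (step 502).
        # A ridge is counted as a run of at least `run_length` comparatively
        # dark pixels preceded by a run of at least `run_length` comparatively
        # light pixels; pixel values are darkness levels (larger = darker).
        ridges = 0
        dark_run = 0
        light_run = run_length      # treat the start of the line as a light region
        preceding_light = 0
        for value in line:
            if value > mid_level:               # comparatively dark pixel
                if dark_run == 0:
                    preceding_light, light_run = light_run, 0
                dark_run += 1
                if dark_run == run_length and preceding_light >= run_length:
                    ridges += 1
            else:                               # comparatively light pixel
                dark_run = 0
                light_run += 1
        return ridges

    def definition_test(vertical_lines, horizontal_lines, mid_level, min_ridges):
        # Sketch of routine 500: the five vertical and seven horizontal test
        # lines of FIG. 5B must each show at least `min_ridges` ridges.
        all_lines = list(vertical_lines) + list(horizontal_lines)
        return all(count_ridges(line, mid_level) >= min_ridges for line in all_lines)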

Abstract

A system and method of capturing an acceptable fingerprint image is disclosed herein. The method includes a step of capturing an initial fingerprint image at a nominal image integration time. Once this initial fingerprint image is captured, a first intermediate fingerprint image at a first intermediate image integration time is captured. Next, an image darkness test is performed followed by an image definition test. If one or more of these tests indicates that the first intermediate fingerprint image is unacceptable, a subsequent intermediate fingerprint image at a subsequent intermediate image integration time is captured. This subsequent intermediate fingerprint image can be captured before the image definition test is performed. Additional intermediate fingerprint images can be captured until an image that has an acceptable darkness level as well as an acceptable definition level is captured. Also disclosed is a fingerprint scanner that performs this method.

Description

Fingerprint Scanner Auto-Capture System and Method
Background of the Invention
Field of the Invention
The present invention relates generally to fingerprint scanning and imaging. More specifically, the present invention relates to a system and method for capturing a fingerprint image.
Related Art
Biometrics are a group of technologies that provide a high level of security. Fingerprint capture and recognition is an important biometric technology. Law enforcement, banking, voting, and other industries increasingly rely upon fingerprints as a biometric to recognize or verify identity. See Biometrics Explained, v. 2.0, G. Roethenbaugh, International Computer Society Assn., Carlisle, PA, 1998, pages 1-34. Fingerprint scanners having cameras are available that capture an image of a fingerprint. A signal representative of the captured image is then sent over a data communication interface to a host computer for further processing. For example, the host can perform one-to-one or one-to-many fingerprint matching.
In order to capture a fingerprint image electronically, a light source is typically directed towards a fingerprint capture surface that reflects light from the light source towards a camera. The fingerprint capture surface is generally glass. Contact between the surface of a finger and the fingerprint capture surface causes the reflected light to be representative of the fingerprint of the particular finger placed against the fingerprint capture surface. This reflection then must be captured by the camera. The intensity of the reflected light varies greatly in such a system. For example, variations due to manufacturing tolerances and techniques used to produce the light source can affect the intensity of light directed towards the fingerprint capture surface. Such a variation can, however, be determined at the time of manufacture and can be factored into the design of the system. Other variations cannot be determined in advance, and so must be compensated for in the field.
For example, the quality of contact between a finger and the fingerprint capture surface plays a large role in the intensity of the reflected light. A very dry skin surface on a clean fingerprint capture surface may result in a low intensity level of reflected light. On the other hand, an oily skin surface and/or a less-clean fingerprint capture surface may result in a high level of reflected light.
As a result of the above variations, a fingerprint scanner system and method that captures an acceptable fingerprint image is needed. Moreover, in order to produce an effective and simple-to-use fingerprint scanner, it is desired that such a system and method for capturing an acceptable fingerprint image be implemented with little user input.
Summary of the Invention
A method of capturing an acceptable fingerprint image is disclosed herein.
This method includes a step of capturing an initial fingerprint image at a nominal image integration time. Once this initial fingerprint image is captured, a first intermediate fingerprint image at a first intermediate image integration time is captured. Next, an image darkness test is performed followed by an image definition test. If one or more of these tests indicates that the first intermediate fingerprint image is unacceptable, a subsequent intermediate fingerprint image at a subsequent intermediate image integration time is captured. This subsequent intermediate fingerprint image can be captured before the image definition test is performed. Additional intermediate fingerprint images can be captured until an image that has an acceptable darkness level as well as an acceptable definition level is captured. These additional intermediate fingerprint images can be captured at incremented intermediate integration times. The intermediate integration times can be derived from the nominal image integration time by multiplying the nominal image integration time by multiples of 1/7.
A method according to the present invention can include calculating average darkness values for a number of image darkness test lines. Once these image darkness values are calculated, acceptable overall image darkness and acceptable image darkness distribution are verified. Overall image darkness can be verified by calculating average darkness values for a number of image darkness lines arranged in pairs of image darkness lines, the pairs of image darkness lines situated within an expected image capture region. Next, it is verified that a predetermined number of the image darkness test lines have associated calculated average darkness values that exceed a darkness threshold value. The predetermined number can be eight.
Meanwhile, acceptable image definition can be assessed by determining a ridge count for each of the image definition test lines, and then verifying that image definition is acceptable based on the ridge counts. These ridge counts can be determined for each of a predetermined number, for example five, of vertical image definition test lines and for each of a predetermined number, for example seven, of horizontal image definition test lines.

Also disclosed is a fingerprint scanner for capturing an acceptable fingerprint image that includes a camera that captures an initial fingerprint image at a nominal image integration time and captures a first intermediate fingerprint image at a first intermediate image integration time, as well as a processor that performs an image darkness test and an image definition test. Such a fingerprint scanner can further capture a subsequent intermediate fingerprint image at a subsequent intermediate image integration time when the processor performs an image darkness test that results in an unacceptable darkness level. The fingerprint scanner's camera can continue to capture additional intermediate fingerprint images at subsequent intermediate integration times until the processor performs an image darkness test that results in an acceptable darkness level. These intermediate integration times can be derived from the nominal integration time in a manner like that used in connection with the method disclosed herein.
The fingerprint scanner's camera continues to capture subsequent intermediate fingerprint images at subsequent intermediate integration times until the processor performs an image darkness test and an image definition test that both result in acceptable image darkness and definition levels, respectively, for a single intermediate fingerprint image, or until a maximum intermediate integration time is reached.
A fingerprint scanner according to the present invention can perform the image darkness and image definition tests described herein.
Further embodiments, features, and advantages of the invention, as well as the structure and operation of the various embodiments of the invention, are described in detail below with reference to the accompanying drawings.
Brief Description of the Figures
FIGs. 1A, 1B, and 1C are illustrations of three fingerprint images having different light levels.
FIG. 2A is an illustration of a fingerprint scanner according to the present invention.
FIGs. 2B and 2C illustrate an example of the outward appearance of a mobile, hand-held remote fingerprint scanner according to FIG. 2A.
FIG. 3 is an illustration of a routine for capturing an acceptable fingerprint image according to an embodiment of the present invention.
FIG. 4A illustrates a routine 400 for testing image darkness in accordance with the present invention. FIG. 4B illustrates an arrangement of image test lines used in an image darkness test according to the present invention.
FIG. 5A is an illustration of a routine for testing image definition in accordance with the present invention. FIG. 5B illustrates an arrangement of image definition test lines used in an image definition test according to the present invention.
Detailed Description of the Preferred Embodiments
Terminology
As used herein, the term "fingerprint scanner" is used to refer to a fingerprint scanner that scans a fingerprint and then processes the image data or transmits the image data to a host processor. Such a fingerprint scanner can be a remote fingerprint scanner where "remote" is meant to imply that the fingerprint scanning can take place at a location physically separate from the host processor. A remote fingerprint scanner and a host processor may be considered physically separate even though they may be connected through a data interface, permanent or otherwise.
As used herein, the term "fingerprint capture event" is used to refer to a single act of capturing a fingerprint image with a fingerprint scanner. This term is not meant to imply any temporal limitations but is instead intended to refer to the event along with the particular characteristics of the event that can change from event to event. Such characteristics include the particular finger and its physical characteristics as well as other factors like the cleanliness of the image capture surface that can affect fingerprint capture. As used herein, the term "fingerprint image" is used to refer to any type of detected fingerprint image including, but not limited to, an image of all or part of one or more fingerprints, a rolled fingerprint, a flat stationary fingerprint, a palm print, and/or prints of multiple fingers.
As used herein, the term "acceptable fingerprint image" is used to refer to a fingerprint image that has both acceptable darkness as well as acceptable definition. The particular acceptable darkness and definition levels are not critical and can be determined by one skilled in the relevant art given this disclosure, as discussed herein.
Auto-Capture System and Method
FIGs. 1A-1C are illustrations of three fingerprint images having different light levels. The fingerprint image in FIG. 1A is comparatively darker than those of FIGs. 1B and 1C. In a number of places in the fingerprint image of FIG. 1A, adjacent ridges are not discernable since the valleys between such ridges cannot be seen in the image. Such a situation occurs due to over-sensitivity of a camera for a particular reflected image, as will now be described in terms of a fingerprint scanner according to the present invention.
FIG. 2A is an illustration of a fingerprint scanner 200 according to the present invention. Fingerprint scanner 200 includes a light source 205. Light source 205 can be one or more light emitting diodes (LEDs). Alternatively, light source 205 can be another type of light source suitable for use within a fingerprint scanner, as would be apparent to one skilled in the relevant art given this description. Light source 205 directs light toward a fingerprint capture surface 210. Fingerprint capture surface 210 is a transparent or semi-transparent material upon which a finger can be placed so as to cause light from light source 205 to be reflected towards a camera 215. Fingerprint capture surface 210 can be glass, though other materials apparent to one skilled in the relevant art can be used without departing from the scope of the present invention. As discussed above, the light reflected towards camera 215 by fingerprint capture surface 210 is representative of the contact of a finger with fingerprint capture surface 210. Specifically, contact of ridges on a finger with fingerprint capture surface 210 results in light being reflected in areas corresponding to that contact. Thus, the quality of the contact plays a role in the quantity of reflected light. This contact quality is affected by the dryness of the subject's skin, the cleanliness of the fingerprint contact surface 210, the pressure applied by the subject, and the like.

Camera 215 captures the reflected light within, for example, an array of photo-sensitive pixels. The image is then stored in a memory 220. Memory 220 can include both non-volatile and volatile memory. In one example, memory 220 includes non-volatile memory that stores the executable code necessary for device operation and volatile memory for storing data representative of the captured image. Any type of non-volatile memory may be used, for example an electrically-erasable read only memory (EEPROM) or an optically-erasable read only memory (Flash-EPROM), though the invention is not limited to these specific types of non-volatile memory. Volatile memory can be a random-access memory for storing detected fingerprint images. For example, the image can be stored as an array of values representing a gray-scale value associated with each pixel. Other types of memory (flash memory, floppy drives, disks, mini-floppy drives, etc.) can be used in alternative embodiments of the present invention. Volatile memory can include mini-floppy drives (such as those available from Sandisk Corp. or Intel Corp.). In this way, multiple prints can be stored locally. This is especially important in border control, crime scene, and accident site applications.

While camera 215 is responsive to light reflected from fingerprint capture surface 210, pixel light intensity is converted into a darkness level so that the stored image is like those appearing in FIGs. 1A-1C. In other words, the actual stored image is represented by dark pixels where light was detected, such that an image of the actual received light pattern would appear as a "negative" of what is shown in FIGs. 1A-1C. Alternatively, the stored image could correspond to actual light levels received, without departing from the scope of the present invention.

Camera 215 can include a 1 inch x 1 inch array of 500 x 500 pixels. Other size arrays could also be used, for example a 620 x 480 pixel array, without departing from the scope of the present invention. Camera 215 can be a CMOS square pixel array. For example, a CMOS camera manufactured by Motorola Corporation can be used.
Camera 215 has a sensitivity to light that is controlled by an integration time. The integration time is the length of time the pixels in camera 215 collect light. A longer integration time means more light collected, and thus a brighter (or darker after conversion) image. Before discussing the remaining elements in the fingerprint scanner 200 of FIG. 2, the relationship between integration time and the captured image will be discussed in connection with the fingerprint images of FIGs. 1A-1C.
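The "darker after conversion" remark above can be made concrete with a small sketch (Python; the 8-bit gray-scale range and the simple inversion are assumptions consistent with the 256-level images mentioned elsewhere in this document, not a specification of the actual sensor pipeline):

    def to_darkness_image(intensity_pixels, max_level=255):
        # Invert captured light intensities into stored darkness values: areas
        # where finger ridges touch the capture surface reflect more light and
        # therefore become dark pixels, so the stored image is the "negative"
        # of the received light pattern.
        return [max_level - min(max(int(p), 0), max_level) for p in intensity_pixels]

    # A longer integration time raises the collected intensities, which after
    # this inversion yields a darker stored fingerprint image.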
The fingerprint images of FIGs. 1A-1C illustrate how the quality of a captured fingerprint can be affected by the integration time of the camera. The fingerprint image of FIG. 1A is darker than that of FIG. 1B. This increased darkness can be characterized as an over-sensitivity to light by the capturing camera (keeping in mind that the image received by the camera is the negative of the image shown in the figure). This over-sensitivity can be corrected by shortening the integration time. Thus, by simply shortening the integration time, an image like that of FIG. 1B can be produced for the same fingerprint capture event. The fingerprint image of FIG. 1B is superior in quality to that of FIG. 1A since the shorter integration time results in less saturation of pixels within the camera, while still capturing a high percentage of fingerprint images. Meanwhile, the fingerprint image of FIG. 1C is lighter than that of FIG. 1B. This can be characterized as an under-sensitivity to light by the capturing camera. This under-sensitivity results in the loss of several ridges throughout the captured image in FIG. 1C. As with FIG. 1A, the sensitivity of the capturing camera can be adjusted by changing its integration time. Thus, by lengthening the integration time of the capturing camera, more light can be collected and an image like FIG. 1B can be captured. Thus, FIGs. 1A-1C are representative of fingerprint images captured during a single fingerprint capture event at different integration times.
Two points should be noted about the images of FIGs. 1A-1C. The first is that the differences between the images are meant to illustrate changes in quality and are in no way intended to imply a particular level of quality required before an image is considered "acceptable." In other words, FIG. 1B is meant to illustrate an image with improved quality relative to the images of FIGs. 1A and 1C, but is not meant to illustrate the quality needed to produce an acceptable fingerprint image. Fingerprint image acceptability is determined by particular light levels and ridge count details as can be determined through the darkness and ridge count tests discussed below. Thus, the fingerprint images of FIGs. 1A and 1C might be considered acceptable fingerprint images as that term is used herein. The second point to note is that the images of FIGs. 1A-1C correspond to a particular fingerprint capture event. The integration time corresponding to FIG. 1B could just as easily produce an image like that of FIG. 1A in a subsequent fingerprint capture event. Since many of the variables that affect the quality of the captured fingerprint image vary between fingerprint capture events, the optimal integration time should be determined each time a fingerprint image is captured, as discussed more fully elsewhere herein.
Returning to the fingerprint scanner 200 of FIG. 2A, a system controller (also referred to herein as a processor) 225 is also included. System controller 225, using the executable code stored in memory 220, is capable of performing the necessary functions associated with device operation, such as image sensor control in response to user input. System controller 225 also performs the tests associated with capturing an acceptable fingerprint image, as discussed more fully below.

As would be apparent to a person skilled in the art, other types of memory, circuitry and/or processing capability may be included within fingerprint scanner 200, examples of which include a frame grabber and an analog/digital converter.
Also included in the fingerprint scanner 200 shown in FIG. 2 are a power supply 230, a Universal Serial Bus (USB) interface 240, indicators 235, and user input controls 236 (the latter two shown as indicators and buttons in FIG. 2B). While a USB interface is used in connection with the preferred embodiments, the invention is not limited to such an interface. Any communications interface can be used. For example, an IEEE 1394 High Performance Serial Bus interface, an RF interface, or even a proprietary interface may be used without departing from the scope of the present invention.
FIGs. 2B and 2C illustrate an example of the outward appearance of a mobile, hand-held remote fingerprint scanner according to FIG. 2A. Fingerprint scanner 202 is ergonomically designed to fit the hand naturally. The oblong, cylindrical shape (similar to a flashlight) does not contain sharp edges. The device is small enough to be gripped by large or small hands without awkward or unnatural movement. The device is comfortable to use without muscle strain on the operator or subject. In one example, fingerprint scanner 202 is 1.5 x 8.0 x 1.5 inches (height x length x width), weighs about 340 grams (12 oz.), and has an image capture surface 210 size of about 1" x 1".
Fingerprint scanner 202 has controls and status indicators on the front face of the unit for single (left or right) hand operation. The non-intimidating appearance of the fingerprint scanner 202 is designed to resemble a typical flashlight - a device that is not generally threatening to the public. Fingerprint scanner 202 has no sharp edges and is constructed of a lightweight aluminum housing that is coated with a polymer to give the device a "rubberized" feel. Because fingerprint scanner 202 is small and lightweight, it may be carried on the officer's utility belt upon exiting a vehicle. The device is designed for one-hand use, allowing the officer to have a free hand for protective actions. Fingerprint scanner 202 is designed for harsh environments, withstanding conditions such as dramatic temperature changes and unintentional abuse.
Fingerprint scanner 202 contains a simple push button and a set of three LEDs that provide user activation and status indication. The user need only press one button to activate the unit. Once activated, the fingerprint scanner 202 awaits a finger to be introduced to the fingerprint capture surface. The digital (or analog) image is automatically captured when an acceptable image is detected. The image is then tested for quality of data prior to notifying the operator with an indication (e.g., a visual indication and/or an audible tone) of acceptance. A routine for automatically capturing an acceptable fingerprint image can be performed in accordance with the present invention, as is discussed elsewhere herein. The unit emits a tone to indicate a completed process. The officer may introduce the unit to a docking station blindly, maintaining his eyes on the subject for safety. Once seated in the docking station, the fingerprint is automatically transferred to the mobile computer without operator intervention. The detected image is scalable to conform to FBI-provided software (cropped or padded to 512 pixels by 512 pixels), although the standard image size is 1" x 1", 500 dpi, 256 levels of gray-scale (ANSI-NIST). Other details of fingerprint scanner 202 can be found in co-pending U.S. patent application no. 09/430,296, entitled Hand-Held Fingerprint Scanner With On-Board Image Normalization Data Storage, filed October 29, 1999 (attorney docket no. 1823.0100000).
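By way of illustration only, the following Python sketch shows one way a captured 500 dpi, 1" x 1" grayscale image (nominally 500 x 500 pixels) could be cropped or padded to the 512 pixel by 512 pixel size mentioned above. The function name, the use of plain lists of pixel rows, and the choice of white (255) as the padding value are assumptions made for this sketch, not details taken from this disclosure.

# Conform a grayscale image (a list of rows of 0-255 gray levels) to 512 x 512
# by cropping rows/columns that are too long and padding short ones with white.
TARGET_SIZE = 512
WHITE = 255

def conform_to_512(image):
    rows = image[:TARGET_SIZE]                                  # crop extra rows
    out = []
    for row in rows:
        row = row[:TARGET_SIZE]                                 # crop extra columns
        out.append(row + [WHITE] * (TARGET_SIZE - len(row)))    # pad right edge
    while len(out) < TARGET_SIZE:                               # pad bottom edge
        out.append([WHITE] * TARGET_SIZE)
    return out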
Fingerprint scanner 202 is held in either hand and used to capture a person's fingerprint. The fingerprint is captured from a cooperative individual (frontal approach) or an uncooperative individual (handcuffed subject - most commonly face down). Fingerprint scanner 202 can be operated with one hand, allowing the officer to have a hand ready for protective actions. The officer need not have fingerprinting knowledge to capture the fingerprint.
As discussed above, the integration time of camera 215 within fingerprint scanner 200 can be adjusted to compensate for light level changes introduced by variations in the contact quality between a finger and the fingerprint capture surface during any particular fingerprint capture event. Such compensation can be done automatically, i.e., without operator input, within the fingerprint scanner 200 according to a method that will next be described.
FIG. 3 is an illustration of a routine 300 for capturing an acceptable fingerprint image according to an embodiment of the present invention. In a first step 305, an initial fingerprint image is captured at a nominal integration time. In the first step 305, the fingerprint scanner is "waiting" for the presence of a finger. Thus, the first step 305 involves the fingerprint scanner continually capturing images at the nominal integration time until the presence of a finger is detected. The presence of a finger is detected by performing a darkness test after each image is captured at the nominal integration time. Once the result of a darkness test is positive, meaning a fingerprint image with sufficient darkness has been detected, an initial fingerprint image has been captured, thus completing the first step 305. The darkness test used can be a darkness test according to the present invention, described more fully below in connection with FIGs. 4A and 4B. The nominal integration time can be an integration time expected to capture an acceptable fingerprint image based on the intensity of the light source used and the sensitivity of the camera, discounting any variations due to the quality of the contact between the finger and the fingerprint capture surface. Typically, there is a range of integration times associated with a given camera, for example from 20-120 milliseconds. The nominal integration time can thus be determined in advance, based on expected conditions, as a particular integration time from within the typical range for a given camera. For example, a typical nominal integration time can be 50 ms, though other nominal integration times could be chosen without departing from the scope of the present invention. For example, a nominal integration time from within the range of 40 ms to 60 ms could be selected for a camera with an integration time range of 20-120 ms.
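A minimal sketch of the waiting loop of step 305 follows, assuming hypothetical firmware helpers capture_image(integration_ms) and darkness_test(image); both helper names and the 50 ms nominal value are illustrative choices rather than requirements of this disclosure.

NOMINAL_INTEGRATION_MS = 50   # chosen in advance from the camera's typical 20-120 ms range

def wait_for_finger(capture_image, darkness_test):
    # Step 305: capture at the nominal integration time until the darkness
    # test indicates that a finger is present on the capture surface.
    while True:
        image = capture_image(NOMINAL_INTEGRATION_MS)
        if darkness_test(image):
            return image          # the initial fingerprint image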
In a next step 310 of the routine 300 shown in FIG. 3, an intermediate fingerprint image is captured at a first integration time. The present invention uses a set of integration times to find an optimal integration time once an initial fingerprint image is captured at the nominal integration time. The set of integration times can be derived from the nominal integration time. For example, the set of integration times can include six integration times that are each equal to the nominal integration time multiplied by an appropriate scaling factor. In an embodiment, the integration times can be equal to 6/7, 7/7, 8/7, 9/7, 10/7, and 11/7 multiplied by the nominal integration time. Thus, if the nominal integration time is selected to be 50 ms, the integration times used in a routine according to an embodiment of the present invention would be: 43 ms, 50 ms, 57 ms, 64 ms, 71 ms, and 79 ms. Thus, continuing with this example, once the initial image is captured at 50 ms, the integration time is shortened to 43 ms and an intermediate fingerprint image is captured. As will be discussed below, additional intermediate fingerprint images can be captured at higher integration times until an acceptable fingerprint image is captured. It should thus be apparent to one skilled in the relevant art that the particular integration times used are not critical, so long as a range of integration times around the nominal integration time is used.

In a next step 315 of the routine 300 of FIG. 3, an image darkness test of the intermediate image captured in step 310 is performed. Such an image darkness test is used to determine whether the intermediate image is sufficiently dark. An image darkness test of the present invention, as discussed below in connection with FIGs. 4A and 4B, can be used. Other image darkness tests could also be used without departing from the scope of the present invention. For example, simply averaging the values of all the pixels in the camera can give an indication of the darkness level of the captured intermediate image.
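By way of illustration only, the following sketch derives the example set of integration times described above from the nominal integration time using the 6/7 through 11/7 scaling factors; rounding to whole milliseconds is an assumption of the sketch.

from fractions import Fraction

def integration_time_set(nominal_ms=50):
    # Scaling factors 6/7, 7/7, 8/7, 9/7, 10/7, and 11/7 applied to the
    # nominal integration time, rounded to the nearest millisecond.
    scale_factors = [Fraction(n, 7) for n in range(6, 12)]
    return [round(nominal_ms * factor) for factor in scale_factors]

print(integration_time_set(50))   # [43, 50, 57, 64, 71, 79]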
Depending on the outcome of the image darkness test performed in step 315, a next step 325 or 330 is performed, as shown in FIG. 3 at 320. The particular level of darkness required for an acceptable darkness level is not critical and could be determined by one skilled in the relevant art given this disclosure. The acceptable darkness level can be environment and use specific and thus can be set by the manufacturer or user, as appropriate. If the image darkness test of step 315 results in an unacceptable darkness level, then a next step 325 of incrementing the image integration time and capturing another intermediate image at the incremented integration time is performed. The only exception to this step is when the integration time cannot be incremented to a higher integration time because the intermediate fingerprint image was captured at the highest integration time. In such a case, the routine returns to step 305.
If the image integration time has been incremented and another intermediate image captured, the routine returns to step 315 to perform the darkness test again. Thus, routine 300 includes a loop with steps 315, 320, and 325 repeating until an intermediate image with an acceptable darkness level has been captured.
Once an intermediate fingerprint image with an acceptable darkness level has been captured, an image definition test is performed at a step 330. The image definition test used can be an image definition test according to the present invention, as discussed below in connection with FIGs. 5A and 5B. Such an image definition test counts the number of ridges in predefined areas by focusing on pixel patterns that include minimum numbers of consecutive light and dark pixels generally representative of the presence of the ridges and valleys characteristic of a fingerprint image. Alternatively, any image definition test that tests the captured image for its level of detail can be used without departing from the scope of the present invention. The particular level of image definition required for an acceptable image definition level is not critical and could be determined by one skilled in the relevant art given this disclosure. The acceptable image definition level can be environment and use specific and thus can be set by the manufacturer or user, as appropriate.
Once the image definition test has been performed in step 330, one of two different steps is conducted based on the outcome of that test, as shown at 335.
If the image definition test 330 indicates that the intermediate fingerprint image was of unacceptable definition, then the routine returns to step 325, discussed above. As with the above description of step 325, if the integration time cannot be incremented because the captured image was a result of the maximum integration time, routine 300 returns to step 305 to await a new initial fingerprint image.
If the image definition test 330 indicates that the intermediate fingerprint image was of acceptable definition, then the intermediate fingerprint image is an acceptable fingerprint image in terms of both darkness and definition. Thus, in a final step 340, the intermediate fingerprint image that has passed both tests is an acceptable fingerprint image and the routine is complete. In this way, routine 300 has automatically captured an acceptable fingerprint image. Step 340 can include a step of providing a signal that an acceptable fingerprint image has been captured. This signal can be audible, visible, or both.
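Pulling the steps of routine 300 together, a minimal sketch follows, again assuming hypothetical helpers capture_image(integration_ms), darkness_test(image), and definition_test(image); the helper names and the optional signalling callback are illustrative only.

def auto_capture(capture_image, darkness_test, definition_test,
                 integration_times, nominal_ms=50, signal_accepted=None):
    while True:
        # Step 305: wait for a finger at the nominal integration time.
        while not darkness_test(capture_image(nominal_ms)):
            pass
        # Steps 310-335: try each integration time, shortest first.
        for integration_ms in sorted(integration_times):
            image = capture_image(integration_ms)
            if darkness_test(image) and definition_test(image):
                if signal_accepted is not None:
                    signal_accepted()        # step 340: audible/visible signal
                return image                 # acceptable fingerprint image
        # Highest integration time reached without success: return to step 305.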
Details of an image darkness test and an image definition test in accordance with the present invention will now be described in terms of FIGs. 4A, 4B, 5A, and 5B. FIG. 4A illustrates a routine 400 for testing image darkness in accordance with the present invention. In a first step 401 of routine 400, image darkness test lines are selected from a captured image. Thus, rather than observing pixels from the entire image to determine darkness, only a few lines of pixels are selected. The present inventor has discovered that by selecting particular test lines, the image darkness test can not only ensure adequate image darkness from testing only a handful of lines, but can also ensure proper fingerprint placement on the image capture surface of a fingerprint scanner. FIG. 4B shows the details of such image test lines according to one example. FIG. 4B illustrates an arrangement of image darkness test lines used in an image darkness test according to the present invention. In FIG. 4B, image capture surface 210 is depicted with an expected image capture area 420. Expected image capture area 420 is a region in which a fingerprint is expected to be located during an image capture event. The precise size and location of image capture area 420 can differ from that shown in the figure without departing from the scope of the invention. In order to ensure that the dark areas present during a darkness test are arranged in an expected area, image test lines are situated throughout expected image capture area 420. Specifically, in the arrangement of FIG. 4B, there are ten image test lines 435, 436, and the like. These ten image test lines are arranged in five pairs of image test lines 430-434. These five pairs of image test lines 430-434 are spaced throughout the expected image capture area 420 as shown in FIG. 4B. In an embodiment of the invention, each image test line 435, 436 is a diagonal arrangement of 32 pixels. Other numbers of pixels and arrangements of image test lines could be used without departing from the scope of the present invention.
In a next step 402 of the routine 400 shown in FIG. 4A, an average darkness value for each image darkness test line is calculated. Such an average can be calculated by adding the darkness value for each pixel in an image darkness test line and then dividing that sum by the number of pixels in the image darkness test line. In a next step 403, acceptable overall image darkness is verified. This verification can be done, for example, by verifying that a predetermined number of image darkness test lines have an associated average image darkness level above a threshold darkness level. In an embodiment, the predetermined number (or percentage) of image darkness test lines is eight (or 80% of the image darkness test lines). If eight image darkness test lines have an average image darkness level above the threshold darkness level, the overall image darkness is considered acceptable. Other numbers (or percentages) of image darkness test lines can be used without departing from the scope of the present invention. Likewise, the particular threshold darkness level chosen is not critical and could be determined by one skilled in the relevant art given this disclosure. The acceptable darkness level can be based on the specific environment in which the fingerprint scanner is used as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate.

Once overall image darkness has been verified as acceptable in step 403, a next step 404 of verifying acceptability of image darkness distribution is performed. It should be noted that if the previous step 403 resulted in a determination that overall image darkness was not acceptable for the tested image, it is not necessary that routine 400 continue; it could instead stop at step 403. In step 404, image darkness distribution is tested. Despite the determination in step 403 that overall image darkness was acceptable, this darkness may have been concentrated in a particular region. For example, if all image darkness test lines in pairs 430-433, as shown in FIG. 4B, have acceptable darkness levels, the image will have an acceptable overall image darkness despite a lack of acceptable darkness in both image darkness test lines of pair 434. Thus, step 404 is used to verify that the darkness of the image is distributed throughout the expected image capture area 420. This step can be performed by verifying that at least one image darkness test line in each of the five pairs 430-434 of image darkness test lines has an acceptable darkness level. As with step 403, this can be done by comparing the average darkness value of each darkness test line with a predetermined threshold darkness value. This threshold darkness value can be the same value used in connection with step 403. Likewise, as with step 403, the particular threshold darkness level chosen is not critical and could be determined by one skilled in the relevant art given this disclosure. The acceptable darkness level can be based on the specific environment in which the fingerprint scanner is used as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate. Because step 404 of the routine 400 shown in FIG. 4A verifies that the image darkness is distributed throughout the expected image capture region 420, the routine 400 of FIG. 4A can be used to verify an acceptable darkness level throughout a particular region.
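A minimal sketch of routine 400 follows, under these assumptions (not taken from the figures): the image is a list of rows of darkness values in which larger numbers mean darker pixels, each test line is supplied as a list of (x, y) pixel coordinates such as a 32-pixel diagonal, the ten lines are grouped into five pairs, and the threshold value of 128 is illustrative only.

DARKNESS_THRESHOLD = 128   # illustrative threshold darkness level
MIN_DARK_LINES = 8         # step 403: eight of the ten lines must be dark enough

def line_average(image, line):
    # Step 402: average darkness value of one image darkness test line.
    return sum(image[y][x] for (x, y) in line) / len(line)

def darkness_test(image, line_pairs):
    pair_averages = [[line_average(image, line) for line in pair]
                     for pair in line_pairs]
    all_averages = [avg for pair in pair_averages for avg in pair]
    # Step 403: overall darkness -- enough individual test lines exceed the threshold.
    overall_ok = sum(avg > DARKNESS_THRESHOLD for avg in all_averages) >= MIN_DARK_LINES
    # Step 404: distribution -- at least one sufficiently dark line in every pair.
    distributed_ok = all(any(avg > DARKNESS_THRESHOLD for avg in pair)
                         for pair in pair_averages)
    return overall_ok and distributed_ok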
Accordingly, such a routine 400 can be used as the image darkness test within the routine 300 shown in FIG. 3. Meanwhile, the image definition test 330, also shown in routine 300, can be performed with a routine like that shown in FIG. 5A. FIG. 5A is an illustration of a routine 500 for testing image definition in accordance with an embodiment of the present invention. While the routine 400 of FIG. 4A tested an image for an acceptable darkness level, the routine 500 of FIG. 5A tests an image for an acceptable level of definition. Such a test is useful because, for example, a particular image may have an acceptable level of darkness while lacking the necessary ridge details characteristic of an acceptable fingerprint image. Thus, routine 500 tests an image for its definition level. Since a fingerprint image should have the dark ridges separated by light valleys characteristic of an acceptable fingerprint image, routine 500 tests for image definition by counting ridges and valleys along image definition test lines. In a first step 501 of the routine 500 of FIG. 5A, image definition test lines are selected from a captured image to be tested. This will be explained in connection with FIG. 5B.
FIG. 5B illustrates an arrangement of image definition test lines used in an image definition test according to the present invention. In FIG. 5B, image capture surface 210 is depicted with an expected image capture area 520. As with the arrangement shown in FIG. 4B, expected image capture area 520 is a region in which a fingerprint is expected to be located during an image capture event. The precise size and location of image capture area 520 can differ from that shown in the figure without departing from the scope of the invention. Within the image capture area 520 are arranged two groups 530, 540 of image definition test lines 531, 541, and the like. Each image definition test line is a line of pixels within the image capture area 520. The first group of image definition test lines 530 includes five vertically arranged parallel image definition test lines, e.g. 531. The second group of image definition test lines 540 includes seven horizontally arranged parallel image definition test lines, e.g. 541. While specific numbers of image definition test lines have been depicted, other numbers of image definition test lines could be used without departing from the scope of the present invention. Likewise, while the arrangement of FIG. 5B has been selected to include more horizontally arranged lines than vertically arranged lines, different arrangements could be used without departing from the scope of the present invention.
In a next step 502 of the routine 500 shown in FIG. 5A, a ridge count for each image definition test line is determined. Such a ridge count can be determined by looking for a pattern of pixel undulations representative of an expected pattern of fingerprint ridges. In a fingerprint image, ridges are shown as adjacent dark areas separated from each other by intervening light areas representative of valleys. Thus, a line of pixels that includes a number of fingerprint ridges will include a substantially continuous group of comparatively dark pixels followed by a substantially continuous group of comparatively light pixels. Whether a pixel is considered comparatively dark or light can be determined by selecting a mid-range light level. This mid-range light level can be a single light level or a range of light levels. A comparatively dark pixel is one that is on the dark side of this mid-range light level, while a comparatively light pixel is one that is on the light side of this mid-range light level. Thus, a ridge can be determined by the presence of, for example, three or more continuous comparatively dark pixels bounded by, for example, three or more comparatively light pixels. In this way, the number of ridges within one image definition test line can be determined in step 502 by counting groups of comparatively dark pixels separated by groups of comparatively light pixels. The actual number of comparatively dark pixels necessary to define a ridge could be determined by one skilled in the relevant art given this disclosure.
In a final step 503, the ridge counts of the image definition test lines determined in step 502 are used to verify image definition acceptability. This can be done, for example, by verifying that the ridge count for each image definition test line is greater than a threshold ridge count value associated with that image definition test line. The particular threshold ridge count values used are not critical and could be determined by one skilled in the relevant art given this disclosure. Rather than having a threshold ridge count value for each image definition test line, a single threshold ridge count value could be used for all of the image definition test lines. As with acceptable image darkness, the acceptable image definition level can be based on the specific environment in which the fingerprint scanner is used as well as requirements associated with the field in which the fingerprint scanner is used, and thus can be set by the manufacturer or user, as appropriate.
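A minimal sketch of routine 500 follows, with each test line given as a list of darkness values (larger numbers meaning darker pixels); the mid-range level of 128, the three-pixel run length, and the per-line ridge count threshold of five are illustrative assumptions rather than values taken from this disclosure.

MID_RANGE_LEVEL = 128   # splits comparatively dark pixels from comparatively light ones
MIN_RUN = 3             # minimum consecutive pixels treated as a ridge or valley

def count_ridges(line):
    # Step 502: count runs of dark pixels (ridges) each followed by a run
    # of light pixels (valleys) along one image definition test line.
    ridges = 0
    dark_run = light_run = 0
    ridge_pending = False
    for value in line:
        if value > MID_RANGE_LEVEL:              # comparatively dark pixel
            dark_run += 1
            light_run = 0
            if dark_run >= MIN_RUN:
                ridge_pending = True
        else:                                    # comparatively light pixel
            light_run += 1
            dark_run = 0
            if ridge_pending and light_run >= MIN_RUN:
                ridges += 1                      # ridge bounded by a valley
                ridge_pending = False
    return ridges

def definition_test(test_lines, ridge_threshold=5):
    # Step 503: every image definition test line must show enough ridges.
    return all(count_ridges(line) > ridge_threshold for line in test_lines)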
Conclusion
While various embodiments of the present invention have been described above, it should be understood that they have been presented by way of example only, and not limitation. It will be understood by those skilled in the art that various changes in form and details may be made therein without departing from the spirit and scope of the invention as defined in the appended claims. Thus, the breadth and scope of the present invention should not be limited by any of the above-described exemplary embodiments, but should be defined only in accordance with the following claims and their equivalents.

Claims

What Is Claimed Is:
1. A method of capturing an acceptable fingerprint image comprising the steps of:
(a) capturing an initial fingerprint image at a nominal image integration time;
(b) capturing a first intermediate fingerprint image at a first intermediate image integration time;
(c) performing an image darkness test; and
(d) performing an image definition test.
2. The method of claim 1, further comprising a step (e) of capturing a subsequent intermediate fingerprint image at a subsequent intermediate image integration time prior to said step (d) when said step (c) results in an unacceptable darkness level.
3. The method of claim 2, further comprising repeating said step (e) at additional subsequent intermediate integration times until said step (c) results in an acceptable darkness level.
4. The method of claim 3, wherein said intermediate integration times are within a range of times that includes said nominal image integration time.
5. The method of claim 4, wherein said intermediate integration times comprise multiples of 1/7 of the nominal image integration time.
6. The method of claim 2, further comprising repeating said steps (b), (c), (d), and (e) until said step (d) results in an acceptable image definition level.
7. The method of claim 1, wherein said step (c) further comprises the steps of: (f) calculating average darkness values for a plurality of image darkness test lines;
(g) verifying that overall image darkness is acceptable; and (h) verifying that image darkness distribution is acceptable.
8. The method of claim 7, wherein said step (f) further comprises calculating average darkness values for a plurality of image darkness lines arranged in pairs of image darkness lines, said pairs of image darkness lines situated within an expected image capture region.
9. The method of claim 8, wherein said step (g) further comprises verifying that a predetermined number of said plurality of image darkness test lines have associated calculated average darkness values that exceed a darkness threshold value.
10. The method of claim 9, wherein said step (g) further comprises verifying that eight of said plurality of image darkness test lines have associated calculated average darkness values that exceed a darkness threshold value, and wherein said plurality of image darkness test lines includes ten image darkness test lines.
11. The method of claim 1, wherein said step (d) further comprises the steps of:
(i) determining a ridge count for each of a plurality of image definition test lines; and
(j) verifying that image definition is acceptable based on the ridge counts determined in said step (i).
12. The method of claim 11, wherein said step (i) further comprises determining a ridge count for each of a predetermined number of a first set of image definition test lines and for each of a predetermined number of a second set of image definition test lines.
13. The method of claim 12, wherein said first set of image definition test lines comprises five vertical image definition test lines and said second set of image definition test lines comprises seven horizontal image definition test lines, and wherein said step (i) further comprises determining a ridge count for each of said five vertical image definition test lines and for each of said seven horizontal image definition test lines.
14. A fingerprint scanner for capturing an acceptable fingerprint image comprising: a camera that captures an initial fingerprint image at a nominal image integration time and captures a first intermediate fingerprint image at a first intermediate image integration time; and a processor that performs an image darkness test and an image definition test.
15. The fingerprint scanner of claim 14, wherein said camera further captures a subsequent intermediate fingerprint image at a subsequent intermediate image integration time when said processor performs an image darkness test that results in an unacceptable darkness level.
16. The fingerprint scanner of claim 15, wherein said camera captures additional subsequent intermediate fingerprint images at additional subsequent intermediate integration times until said processor performs an image darkness test that results in an acceptable darkness level.
17. The fingerprint scanner of claim 16, wherein said intermediate integration times are derived from said nominal image integration time.
18. The fingerprint scanner of claim 17, wherein said intermediate integration times are derived from said nominal image integration time by multiplying said nominal image integration time by multiples of 1/7.
19. The fingerprint scanner of claim 15, wherein said camera captures subsequent intermediate fingerprint images at subsequent intermediate integration times until said processor performs an image darkness test and an image definition test that both result in acceptable image darkness and definition levels, respectively, for a single intermediate fingerprint image.
20. The fingerprint scanner of claim 14, wherein said processor calculates average darkness values for a plurality of image darkness test lines, verifies that overall image darkness is acceptable, and verifies that image darkness distribution is acceptable.
21. The fingerprint scanner of claim 20, wherein said processor calculates average darkness values for a plurality of image darkness lines arranged in pairs of image darkness lines, said pairs of image darkness lines situated within an expected image capture region.
22. The fingerprint scanner of claim 21, wherein said processor verifies that a predetermined number of said plurality of image darkness test lines have associated calculated average darkness values that exceed a darkness threshold value.
23. The fingerprint scanner of claim 22, wherein said processor verifies that eight of said plurality of image darkness test lines have associated calculated average darkness values that exceed a darkness threshold value, and wherein said plurality of image darkness test lines includes ten image darkness test lines.
24. The fingerprint scanner of claim 14, wherein said processor determines a ridge count for each of a plurality of image definition test lines and verifies that image definition is acceptable based on the ridge count for each of the plurality of image definition test lines.
25. The fingerprint scanner of claim 24, wherein said processor determines a ridge count for each of a predetermined number of vertical image definition test lines and for each of a predetermined number of horizontal image definition test lines.
26. The fingerprint scanner of claim 25, wherein said processor determines a ridge count for each of five vertical image definition test lines and for each of seven horizontal image definition test lines.
27. A method of capturing an acceptable fingerprint image comprising the steps of:
(a) capturing a first intermediate fingerprint image at a first intermediate image integration time;
(b) performing an image darkness test; and
(c) performing an image definition test.
28. The method of claim 27, further comprising a step (d) of capturing a subsequent intermediate fingerprint image at a subsequent intermediate image integration time prior to said step (c) when said step (b) results in an unacceptable darkness level.
29. The method of claim 28, further comprising repeating said step (d) at additional subsequent intermediate integration times until said step (b) results in an acceptable darkness level.
30. A fingerprint scanner for capturing an acceptable fingerprint image comprising: means for capturing an initial fingerprint image at a nominal image integration time and for capturing a first intermediate fingerprint image at a first intermediate image integration time; and means for performing an image darkness test and an image definition test.
31. A system controller for use in a fingerprint scanner, wherein said system controller performs an image darkness test, and performs an image definition test.
32. The system controller of claim 31, wherein said system controller calculates average darkness values for a plurality of image darkness test lines within a fingerprint image and verifies that overall image darkness and image darkness distribution are both acceptable.
33. The system controller of claim 31, wherein said system controller determines a ridge count for each of a plurality of image definition test lines within a fingerprint image and verifies that image definition is acceptable based on the ridge count for each of the plurality of image definition test lines.
PCT/US2000/035434 2000-08-18 2000-12-28 Fingerprint scanner auto-capture system and method WO2002017221A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
JP2002521215A JP2004506993A (en) 2000-08-18 2000-12-28 Fingerprint scanner automatic capture system and method
DE60027207T DE60027207T2 (en) 2000-08-18 2000-12-28 SYSTEM AND METHOD FOR AUTOMATIC CONTROL OF A FINGERPRINT KEYSTONE
AU2001222942A AU2001222942A1 (en) 2000-08-18 2000-12-28 Fingerprint scanner auto-capture system and method
EP00986761A EP1312040B1 (en) 2000-08-18 2000-12-28 Fingerprint scanner auto-capture system and method

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US22609200P 2000-08-18 2000-08-18
US60/226,092 2000-08-18

Publications (1)

Publication Number Publication Date
WO2002017221A1 true WO2002017221A1 (en) 2002-02-28

Family

ID=22847516

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2000/035434 WO2002017221A1 (en) 2000-08-18 2000-12-28 Fingerprint scanner auto-capture system and method

Country Status (7)

Country Link
US (2) US6983062B2 (en)
EP (1) EP1312040B1 (en)
JP (1) JP2004506993A (en)
AT (1) ATE322720T1 (en)
AU (1) AU2001222942A1 (en)
DE (1) DE60027207T2 (en)
WO (1) WO2002017221A1 (en)



Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0623890A2 (en) * 1993-05-07 1994-11-09 Nippon Telegraph And Telephone Corporation Method and apparatus for image processing
WO1999026187A1 (en) * 1997-11-17 1999-05-27 Veridicom, Inc. Automatic adjustment processing for sensor devices

Family Cites Families (173)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US2500017A (en) 1948-07-07 1950-03-07 Eastman Kodak Co Apochromatic telescope objectives and systems including same
US3200701A (en) 1962-01-29 1965-08-17 Ling Temco Vought Inc Method for optical comparison of skin friction-ridge patterns
US3540025A (en) * 1967-01-20 1970-11-10 Sierracin Corp Ice detector
US3482498A (en) 1967-05-09 1969-12-09 Trw Inc Ridge pattern recording apparatus
DE1968844U (en) * 1967-05-23 1967-09-21 Eltro G M B H & Co Ges Fuer St ELECTRIC HEATING FOR EXIT WINDOWS OR FRONT LENSES OF OPTICAL DEVICES.
US3475588A (en) * 1968-08-20 1969-10-28 Permaglass Defrosting and deicing window assembly
US3527535A (en) 1968-11-15 1970-09-08 Eg & G Inc Fingerprint observation and recording apparatus
US3617120A (en) 1969-06-02 1971-11-02 Stephen Roka Fingerprint comparison apparatus
US3699519A (en) 1971-04-30 1972-10-17 North American Rockwell Fingerprint analysis device
US3906520A (en) * 1973-08-03 1975-09-16 Optics Technology Inc Apparatus for producing a high contrast visible image from an object
US4032975A (en) 1974-02-25 1977-06-28 Mcdonnell Douglas Corporation Detector array gain compensation
US4063226A (en) 1974-03-18 1977-12-13 Harris Corporation Optical information storage system
US3947128A (en) 1974-04-19 1976-03-30 Zvi Weinberger Pattern comparison
US3968476A (en) 1974-07-17 1976-07-06 Sperry Rand Corporation Spurious signal removal in optical processor fingerprint identification apparatus
US3975711A (en) * 1974-08-30 1976-08-17 Sperry Rand Corporation Real time fingerprint recording terminal
US4210899A (en) 1975-06-23 1980-07-01 Fingermatrix, Inc. Fingerprint-based access control and identification apparatus
US4209481A (en) * 1976-04-19 1980-06-24 Toray Industries, Inc. Process for producing an anisotropically electroconductive sheet
US4120585A (en) * 1976-11-19 1978-10-17 Calspan Corporation Fingerprint identification system using a pliable optical prism
US4152056A (en) * 1977-09-02 1979-05-01 Fowler Randall C Fingerprinting arrangement
US4322163A (en) * 1977-10-25 1982-03-30 Fingermatrix Inc. Finger identification
CA1087735A (en) * 1978-07-28 1980-10-14 Szymon Szwarcbier Process and apparatus for positive identification of customers
EP0031163B1 (en) 1979-12-24 1987-09-23 El-De Electro-Optic Developments Limited Method and device for carrying out a comparison between certain patterns, especially finger prints
US4544267A (en) 1980-11-25 1985-10-01 Fingermatrix, Inc. Finger identification
GB2089545A (en) 1980-12-11 1982-06-23 Watson Graham Michael Optical Image Formation
EP0101772A1 (en) 1982-09-01 1984-03-07 Jerome Hal Lemelson Computer security systems
JPS59153514U (en) 1983-03-31 1984-10-15 株式会社東海理化電機製作所 Contact pattern observation device
US4553837A (en) * 1983-10-24 1985-11-19 Fingermatrix, Inc. Roll fingerprint processing apparatus
US4537484A (en) 1984-01-30 1985-08-27 Identix Incorporated Fingerprint imaging apparatus
DE3577243D1 (en) * 1984-07-18 1990-05-23 Nec Corp IMAGE INPUT DEVICE.
US4601195A (en) 1985-04-11 1986-07-22 Rheometrics, Inc. Apparatus and method for measuring viscoelastic properties of materials
DK155242C (en) 1985-05-02 1989-07-31 Jydsk Telefon As METHOD AND APPARATUS FOR AUTOMATIC DETECTION OF FINGERPRINT
US4942482A (en) * 1985-08-09 1990-07-17 Sony Corporation Automatic page-turning device
US4783823A (en) 1985-09-16 1988-11-08 Omron Tateisi Electronics, Co. Card identifying method and apparatus
US4669487A (en) * 1985-10-30 1987-06-02 Edward Frieling Identification device and method
US5187747A (en) 1986-01-07 1993-02-16 Capello Richard D Method and apparatus for contextual data enhancement
US4876726A (en) 1986-01-07 1989-10-24 De La Rue Printrak, Inc. Method and apparatus for contextual data enhancement
US4684802A (en) * 1986-02-18 1987-08-04 International Business Machines Corporation Elliptical finger press scanner with rotating light source
EP0244498B1 (en) 1986-05-06 1991-06-12 Siemens Aktiengesellschaft Arrangement and process for determining the authenticity of persons by verifying their finger prints
US5067162A (en) 1986-06-30 1991-11-19 Identix Incorporated Method and apparatus for verifying identity using image correlation
US4701772A (en) * 1986-11-26 1987-10-20 Xerox Corporation Thermally activated image bar
US4792226A (en) 1987-02-27 1988-12-20 C.F.A. Technologies, Inc. Optical fingerprinting system
US4811414A (en) 1987-02-27 1989-03-07 C.F.A. Technologies, Inc. Methods for digitally noise averaging and illumination equalizing fingerprint images
EP0308162A3 (en) 1987-09-15 1990-06-06 Identix Incorporated Optical system for fingerprint imaging
US4933976A (en) 1988-01-25 1990-06-12 C.F.A. Technologies, Inc. System for generating rolled fingerprint images
FI893028A (en) 1988-06-23 1989-12-24 Fujitsu Ltd ANORDING FOR THE PURPOSE OF DATA FRAON EN OJAEMN YTA.
US5222153A (en) * 1988-09-02 1993-06-22 Thumbscan, Inc. Apparatus for matching a fingerprint using a tacky finger platen
US4946276A (en) * 1988-09-23 1990-08-07 Fingermatrix, Inc. Full roll fingerprint apparatus
US5067749A (en) 1989-01-09 1991-11-26 Land Larry D Method and apparatus for obtaining and recording fingerprint indicia
CA1326304C (en) 1989-01-17 1994-01-18 Marcel Graves Secure data interchange system
CA1286032C (en) 1989-09-28 1991-07-09 James H. Lougheed Optical scanning and recording apparatus for fingerprints
US5261266A (en) * 1990-01-24 1993-11-16 Wisconsin Alumni Research Foundation Sensor tip for a robotic gripper and method of manufacture
EP0439357B1 (en) * 1990-01-25 1997-07-23 Hewlett-Packard Company Method and apparatus for providing sensor compensation in a document scanner
US5146102A (en) 1990-02-22 1992-09-08 Kabushiki Kaisha Toshiba Fingerprint image input apparatus including a cylindrical lens
US5054090A (en) 1990-07-20 1991-10-01 Knight Arnold W Fingerprint correlation system with parallel FIFO processor
US5047861A (en) * 1990-07-31 1991-09-10 Eastman Kodak Company Method and apparatus for pixel non-uniformity correction
US5230025A (en) 1990-08-31 1993-07-20 Digital Biometrics, Inc. Method and apparatus for capturing skin print images
JP2779053B2 (en) * 1990-09-25 1998-07-23 株式会社日立製作所 Optical scanning device
US5131038A (en) 1990-11-07 1992-07-14 Motorola, Inc. Portable authentification system
US5249370A (en) 1990-11-15 1993-10-05 Digital Biometrics, Inc. Method and apparatus for fingerprint image processing
KR930001001Y1 (en) * 1990-11-17 1993-03-02 주식회사 금성사 Fingerprint recognition apparatus
US5157497A (en) * 1991-02-25 1992-10-20 Matsushita Electric Industrial Co., Ltd. Method and apparatus for detecting and compensating for white shading errors in a digitized video signal
US5185673A (en) * 1991-06-12 1993-02-09 Hewlett-Packard Company Automated image calibration
US5467403A (en) 1991-11-19 1995-11-14 Digital Biometrics, Inc. Portable fingerprint scanning apparatus for identification verification
US5222152A (en) 1991-11-19 1993-06-22 Digital Biometrics, Inc. Portable fingerprint scanning apparatus for identification verification
USD348445S (en) 1992-01-31 1994-07-05 Digital Biometrics, Inc. Hand held fingerprint scanner for imaging and capturing a photographic image
US5335288A (en) * 1992-02-10 1994-08-02 Faulkner Keith W Apparatus and method for biometric identification
US6347163B2 (en) * 1994-10-26 2002-02-12 Symbol Technologies, Inc. System for reading two-dimensional images using ambient and/or projected light
US5729334A (en) * 1992-03-10 1998-03-17 Van Ruyven; Lodewijk Johan Fraud-proof identification system
US5363318A (en) * 1992-03-23 1994-11-08 Eastman Kodak Company Method and apparatus for adaptive color characterization and calibration
US5855433A (en) * 1992-04-28 1999-01-05 Velho; Luiz C. Method of color halftoning using space filling curves
GB2267771A (en) 1992-06-06 1993-12-15 Central Research Lab Ltd Finger guide
US5351127A (en) * 1992-06-17 1994-09-27 Hewlett-Packard Company Surface plasmon resonance measuring instruments
JPH06259541A (en) * 1992-10-30 1994-09-16 Toshiba Corp Method for correcting image distorting and its system
US5291318A (en) * 1992-11-02 1994-03-01 Xerox Corporation Holographic member for a real-time clock in a raster output scanner
DE4311295A1 (en) 1993-04-02 1994-10-06 Borus Spezialverfahren Identification system
US6204331B1 (en) * 1993-06-01 2001-03-20 Spalding Sports Worldwide, Inc. Multi-layer golf ball utilizing silicone materials
DE4322445C1 (en) 1993-07-06 1995-02-09 Alfons Behnke Method for coding identification cards and for identifying such coded identification cards and means for carrying out the method, such as identification card, fingerprint sensor, fingerprint acceptance and comparison device
US5416573A (en) 1993-09-10 1995-05-16 Indentix Incorporated Apparatus for producing fingerprint images which are substantially free of artifacts attributable to moisture on the finger being imaged
US6064398A (en) * 1993-09-10 2000-05-16 Geovector Corporation Electro-optic vision systems
DE4332411A1 (en) 1993-09-23 1995-03-30 Bayerische Motoren Werke Ag Theft protection for motor vehicles with several control units for vehicle components
US5471240A (en) * 1993-11-15 1995-11-28 Hughes Aircraft Company Nonuniformity correction of an imaging sensor using region-based correction terms
CA2109682C (en) 1993-11-22 1998-11-03 Lee F. Hartley Multiple bus interface
USD351144S (en) 1993-12-07 1994-10-04 Digital Biometrics, Inc. Handheld finger print scanner for imaging and capturing a photographic image
US5384621A (en) 1994-01-04 1995-01-24 Xerox Corporation Document detection apparatus
DE69518233D1 (en) 1994-02-18 2000-09-07 Imedge Technology Inc COMPACT DEVICE FOR PRODUCING AN IMAGE OF THE SURFACE TOPOLOGY OF OBJECTS AND METHOD FOR PRODUCING THE DEVICE
US5973731A (en) 1994-03-03 1999-10-26 Schwab; Barry H. Secure identification system
US5528355A (en) 1994-03-11 1996-06-18 Idnetix Incorporated Electro-optic palm scanner system employing a non-planar platen
US5598474A (en) 1994-03-29 1997-01-28 Neldon P Johnson Process for encrypting a fingerprint onto an I.D. card
DE4416507C5 (en) 1994-05-10 2006-10-19 Volkswagen Ag Method for detecting a use authorization for a vehicle
US5448649A (en) * 1994-05-24 1995-09-05 Chen; Wang S. Apparatus for imaging fingerprint or topographic relief pattern on the surface of an object
US5473144A (en) 1994-05-27 1995-12-05 Mathurin, Jr.; Trevor R. Credit card with digitized finger print and reading apparatus
US5509083A (en) 1994-06-15 1996-04-16 Nooral S. Abtahi Method and apparatus for confirming the identity of an individual presenting an identification card
US5469506A (en) 1994-06-27 1995-11-21 Pitney Bowes Inc. Apparatus for verifying an identification card and identifying a person by means of a biometric characteristic
US5689529A (en) 1994-08-02 1997-11-18 International Automated Systems, Inc. Communications method and apparatus for digital information
US5517528A (en) 1994-08-02 1996-05-14 International Automated Systems, Inc. Modulation method and apparatus for digital communications
US5640422A (en) 1994-08-02 1997-06-17 International Automated Systems, Inc. Digital communications modulation method and apparatus
US5613014A (en) 1994-10-12 1997-03-18 Martin Marietta Corp. Fingerprint matching system
GB9420634D0 (en) 1994-10-13 1994-11-30 Central Research Lab Ltd Apparatus and method for imaging skin ridges
US5596454A (en) 1994-10-28 1997-01-21 The National Registry, Inc. Uneven surface image transfer apparatus
US5615277A (en) 1994-11-28 1997-03-25 Hoffman; Ned Tokenless security system for authorizing access to a secured computer system
US5757278A (en) * 1994-12-26 1998-05-26 Kabushiki Kaisha Toshiba Personal verification system
US5591949A (en) 1995-01-06 1997-01-07 Bernstein; Robert J. Automatic portable account controller for remotely arranging for payment of debt to a vendor
US5657400A (en) * 1995-01-31 1997-08-12 General Electric Company Automatic identification and correction of bad pixels in a large area solid state x-ray detector
DE69625398T2 (en) * 1995-02-24 2003-09-04 Eastman Kodak Co Black pattern correction for a charge transfer sensor
US5548394A (en) 1995-03-16 1996-08-20 Printrak International Inc. Scanning fingerprint reading
US5625448A (en) 1995-03-16 1997-04-29 Printrak International, Inc. Fingerprint imaging
US5946135A (en) * 1995-04-20 1999-08-31 Leica Geosystems Ag Retroreflector
US5942761A (en) * 1995-06-07 1999-08-24 Tuli; Raja Singh Enhancement methods and devices for reading a fingerprint image
US5822445A (en) 1995-06-27 1998-10-13 Dew Engineering And Development Limited Apparatus for identifying fingerprints
US5629764A (en) * 1995-07-07 1997-05-13 Advanced Precision Technology, Inc. Prism fingerprint sensor using a holographic optical element
CA2156236C (en) 1995-08-16 1999-07-20 Stephen J. Borza Biometrically secured control system for preventing the unauthorized use of a vehicle
US5815252A (en) 1995-09-05 1998-09-29 Canon Kabushiki Kaisha Biometric identification process and system utilizing multiple parameters scans for reduction of false negatives
DE69633515D1 (en) 1995-10-05 2004-11-04 Digital Biometrics Inc GAME CHIP DETECTION SYSTEM
JP3522918B2 (en) * 1995-10-05 2004-04-26 富士写真フイルム株式会社 Image input device
US5805777A (en) 1995-10-11 1998-09-08 Eastman Kodak Company Extended printer control interface
US5818956A (en) 1995-10-23 1998-10-06 Tuli; Raja Singh Extended fingerprint reading apparatus
US5650842A (en) 1995-10-27 1997-07-22 Identix Incorporated Device and method for obtaining a plain image of multiple fingerprints
US5825474A (en) 1995-10-27 1998-10-20 Identix Corporation Heated optical platen cover for a fingerprint imaging system
US5745684A (en) 1995-11-06 1998-04-28 Sun Microsystems, Inc. Apparatus and method for providing a generic interface between a host system and an asynchronous transfer mode core functional block
US5907627A (en) 1995-11-06 1999-05-25 Dew Engineering And Development Limited Contact imaging device
US5793218A (en) 1995-12-15 1998-08-11 Lear Astronics Corporation Generic interface test adapter
US5717777A (en) 1996-01-11 1998-02-10 Dew Engineering And Development Limited Longest line method and apparatus for fingerprint alignment
US5828773A (en) 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
US5848231A (en) 1996-02-12 1998-12-08 Teitelbaum; Neil System configuration contingent upon secure input
US5859420A (en) 1996-02-12 1999-01-12 Dew Engineering And Development Limited Optical imaging device
US5832244A (en) 1996-02-20 1998-11-03 Iomega Corporation Multiple interface input/output port for a peripheral device
US5778089A (en) 1996-03-04 1998-07-07 Dew Engineering And Development Limited Driver circuit for a contact imaging array
US5859710A (en) * 1996-03-20 1999-01-12 Intel Corporation Digital copying system using a high speed data bus without the use of data buffers
US5809172A (en) * 1996-04-17 1998-09-15 Canon Kabushiki Kaisha Non-linear aggregation mapping compression of image data and method
US5748766A (en) 1996-04-30 1998-05-05 Identix Incorporated Method and device for reducing smear in a rolled fingerprint image
US6122394A (en) * 1996-05-01 2000-09-19 Xros, Inc. Compact, simple, 2D raster, image-building fingerprint scanner
JP3678875B2 (en) * 1996-05-10 2005-08-03 株式会社リコー Image forming apparatus
US5801681A (en) * 1996-06-24 1998-09-01 Sayag; Michel Method and apparatus for generating a control signal
GB2313441A (en) 1996-05-18 1997-11-26 Motorola Israel Ltd Power conserving scanning method
JP3473658B2 (en) * 1996-07-18 2003-12-08 アルプス電気株式会社 Fingerprint reader
US5755748A (en) 1996-07-24 1998-05-26 Dew Engineering & Development Limited Transcutaneous energy transfer device
US5736734A (en) * 1996-08-12 1998-04-07 Fingermatrix, Inc. Liquid platen fingerprint image enhancement
US5680205A (en) 1996-08-16 1997-10-21 Dew Engineering And Development Ltd. Fingerprint imaging apparatus with auxiliary lens
JPH11514771A (en) * 1996-08-27 1999-12-14 カバ シュリースシステーメ アーゲー Method and apparatus for identifying undeployed fingerprints
US5963657A (en) * 1996-09-09 1999-10-05 Arete Associates Economical skin-pattern-acquisition and analysis apparatus for access control; systems controlled thereby
US5872834A (en) 1996-09-16 1999-02-16 Dew Engineering And Development Limited Telephone with biometric sensing device
US5869822A (en) 1996-10-04 1999-02-09 Meadows, Ii; Dexter L. Automated fingerprint identification system
FR2754168B1 (en) * 1996-10-04 1998-12-18 Thomson Csf METHOD FOR ACQUIRING FINGERPRINTS AND DEVICE FOR IMPLEMENTING IT
US6072891A (en) * 1997-02-21 2000-06-06 Dew Engineering And Development Limited Method of gathering biometric information
JP4005165B2 (en) 1997-01-29 2007-11-07 Kabushiki Kaisha Toshiba Image input system and image input method
JP3011126B2 (en) * 1997-03-27 2000-02-21 NEC Corporation Fingerprint detection device
US5900993A (en) 1997-05-09 1999-05-04 Cross Check Corporation Lens systems for use in fingerprint detection
US5920640A (en) 1997-05-16 1999-07-06 Harris Corporation Fingerprint sensor and token reader and associated methods
US6064753A (en) * 1997-06-10 2000-05-16 International Business Machines Corporation System and method for distortion control in live-scan inkless fingerprint images
JP3353878B2 (en) 1997-07-03 2002-12-03 Fujitsu Limited Rotating fingerprint impression collection method
EP0996922A4 (en) * 1997-07-23 2001-01-17 Xros Inc Improved handheld document scanner
US5960100A (en) 1997-07-23 1999-09-28 Hargrove; Tom Credit card reader with thumb print verification means
US5999307A (en) * 1997-09-04 1999-12-07 The University Of British Columbia Method and apparatus for controllable frustration of total internal reflection
US6038332A (en) * 1997-09-05 2000-03-14 Digital Biometrics, Inc. Method and apparatus for capturing the image of a palm
EP0905646A1 (en) 1997-09-30 1999-03-31 Compaq Computer Corporation Pointing and fingerprint identifier mechanism for a computer system
US6281931B1 (en) * 1997-11-04 2001-08-28 Tien Ren Tsao Method and apparatus for determining and correcting geometric distortions in electronic imaging systems
US5928347A (en) 1997-11-18 1999-07-27 Shuttle Technology Group Ltd. Universal memory card interface apparatus
US6134340A (en) * 1997-12-22 2000-10-17 Trw Inc. Fingerprint feature correlator
US5920384A (en) 1997-12-09 1999-07-06 Dew Engineering And Development Limited Optical imaging device
US6041410A (en) 1997-12-22 2000-03-21 Trw Inc. Personal identification fob
US6097873A (en) * 1998-01-14 2000-08-01 Lucent Technologies Inc. Optical fiber attenuator device using an elastomeric attenuator member
US6195447B1 (en) * 1998-01-16 2001-02-27 Lucent Technologies Inc. System and method for fingerprint data verification
JP4150942B2 (en) * 1998-03-11 2008-09-17 Sony Corporation Fingerprint image processing apparatus and fingerprint image processing method
US6166787A (en) * 1998-03-17 2000-12-26 Motorola, Inc. Optical display device having prismatic film for enhanced viewing
US6241288B1 (en) * 1998-04-02 2001-06-05 Precise Biometrics Ab Fingerprint identification/verification system
US6178255B1 (en) * 1998-04-28 2001-01-23 Cross Match Technologies, Inc. Individualized fingerprint scanner
US6259108B1 (en) * 1998-10-09 2001-07-10 Kinetic Sciences Inc. Fingerprint image optical input apparatus
US6154285A (en) * 1998-12-21 2000-11-28 Secugen Corporation Surface treatment for optical image capturing system
US6327047B1 (en) * 1999-01-22 2001-12-04 Electronics For Imaging, Inc. Automatic scanner calibration
US6272562B1 (en) * 1999-05-28 2001-08-07 Cross Match Technologies, Inc. Access control unit interface
US6658164B1 (en) * 1999-08-09 2003-12-02 Cross Match Technologies, Inc. Calibration and correction in a fingerprint scanner
ATE322720T1 (en) * 2000-08-18 2006-04-15 Cross Match Technologies Inc System and method for automatically controlling a fingerprint scan
JP2002062983A (en) * 2000-08-21 2002-02-28 Hitachi Ltd Pointing device

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0623890A2 (en) * 1993-05-07 1994-11-09 Nippon Telegraph And Telephone Corporation Method and apparatus for image processing
WO1999026187A1 (en) * 1997-11-17 1999-05-27 Veridicom, Inc. Automatic adjustment processing for sensor devices

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP1215620A2 (en) * 2000-12-15 2002-06-19 Nippon Telegraph and Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus
EP1215620A3 (en) * 2000-12-15 2004-12-22 Nippon Telegraph and Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus
US6990219B2 (en) 2000-12-15 2006-01-24 Nippon Telegraph And Telephone Corporation Image capturing method and apparatus and fingerprint collation method and apparatus

Also Published As

Publication number Publication date
US20020021827A1 (en) 2002-02-21
JP2004506993A (en) 2004-03-04
DE60027207T2 (en) 2006-11-16
AU2001222942A1 (en) 2002-03-04
EP1312040B1 (en) 2006-04-05
US20060110016A1 (en) 2006-05-25
US7657067B2 (en) 2010-02-02
ATE322720T1 (en) 2006-04-15
EP1312040A1 (en) 2003-05-21
US6983062B2 (en) 2006-01-03
DE60027207D1 (en) 2006-05-18

Similar Documents

Publication Title
EP1312040B1 (en) Fingerprint scanner auto-capture system and method
US7073711B2 (en) Mobile handheld code reader and print scanner system and method
US6744910B1 (en) Hand-held fingerprint scanner with on-board image normalization data storage
US8073209B2 (en) Biometric imaging system and method
US7218761B2 (en) System for obtaining print and other hand characteristic information using a non-planar prism
US7899216B2 (en) Biometric information processing apparatus and biometric information processing method
US7119890B2 (en) System and method for illuminating a platen in a live scanner and producing high-contrast print images
US6867850B2 (en) Light wedge for illuminating a platen in a print scanner
EP1661060B1 (en) Biometric imaging capture system and method
US20050249390A1 (en) Method and apparatus for discriminating ambient light in a fingerprint scanner
GB2400714A (en) Combined optical fingerprint recogniser and navigation control
US20060133656A1 (en) System and method for counting ridges in a captured print image
US20040170303A1 (en) Dynamic image adaption method for adjusting the quality of digital prints
EP1226544B1 (en) Hand-held fingerprint scanner with on-board image normalization data storage
JP4477258B2 (en) Fingerprint verification device
EP1466294B1 (en) Optical biometric sensor with planar waveguide
JPS6128175A (en) Image input device

Legal Events

Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 EP: The EPO has been informed by WIPO that EP was designated in this application
DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (PCT application filed before 20040101)
WWE WIPO information: entry into national phase

Ref document number: 2002521215

Country of ref document: JP

WWE WIPO information: entry into national phase

Ref document number: 2000986761

Country of ref document: EP

WWP WIPO information: published in national office

Ref document number: 2000986761

Country of ref document: EP

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

WWG WIPO information: grant in national office

Ref document number: 2000986761

Country of ref document: EP