WO2005022458A2 - System for and method of generating rotational inputs - Google Patents

System for and method of generating rotational inputs

Info

Publication number
WO2005022458A2
Authority
WO
WIPO (PCT)
Prior art keywords
image
patterned
sensor
correlating
linear
Prior art date
Application number
PCT/US2004/025528
Other languages
French (fr)
Other versions
WO2005022458A3 (en)
Inventor
Anthony P. Russo
David L. Weigand
Original Assignee
Atrua Technologies, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Atrua Technologies, Inc. filed Critical Atrua Technologies, Inc.
Priority to EP04780370A priority Critical patent/EP1661085A2/en
Priority to JP2006524682A priority patent/JP2007519064A/en
Publication of WO2005022458A2 publication Critical patent/WO2005022458A2/en
Publication of WO2005022458A3 publication Critical patent/WO2005022458A3/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F3/00Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F3/01Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F3/03Arrangements for converting the position or the displacement of a member into a coded form
    • G06F3/033Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor
    • G06F3/0354Pointing devices displaced or positioned by the user, e.g. mice, trackballs, pens or joysticks; Accessories therefor with detection of 2D relative movements between the device, or an operating part thereof, and a plane or surface, e.g. 2D mice, trackballs, pens or pucks
    • G06F3/03547Touch pads, in which fingers can move on a surface
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V40/00Recognition of biometric, human-related or animal-related patterns in image or video data
    • G06V40/10Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
    • G06V40/12Fingerprints or palmprints
    • G06V40/1335Combining adjacent partial images (e.g. slices) to create a composite input or reference pattern; Tracking a sweeping finger movement
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06FELECTRIC DIGITAL DATA PROCESSING
    • G06F2203/00Indexing scheme relating to G06F3/00 - G06F3/048
    • G06F2203/033Indexing scheme relating to G06F3/033
    • G06F2203/0338Fingerprint track pad, i.e. fingerprint sensor used as pointing device tracking the fingertip image

Definitions

  • This invention relates to the field of biometric sensors.
  • this invention relates to systems and methods that use fingerprint images to emulate electronic positioning devices.
  • Portable electronic computing platforms need these user inputs for multiple purposes, including (a) navigating a cursor or a pointer to a certain location on a display, (b) selecting (e.g., choosing or not choosing) an item or an action, and (c) orientating (e.g., changing direction with or without visual feedback) an input device.
  • Concepts for user input from much larger personal computers have been borrowed.
  • Micro joysticks, navigation bars, scroll wheels, touch pads, steering wheels and buttons have all been adopted, with limited success, in conventional devices. All these positioning devices consume substantial amounts of valuable surface real estate on a portable device.
  • the present invention discloses a system for and method of obtaining rotation information from a patterned image, such as a fingerprint image.
  • Embodiments of the present invention thus require smaller footprints than those that use joy sticks, steering wheels, and other, larger devices that require additional power.
  • Embodiments of the present invention use linear correlation methods that are easier to use than rotational and other methods such as those using trigonometric functions.
  • Embodiments of the present invention thus use simpler algorithms that can be performed faster and more reliably.
  • a method of obtaining rotation information comprises capturing a plurality of patterned images from a plurality of locations, correlating the plurality of patterned images to generate sets of linear differences, and using the sets of linear differences to generate the rotation information.
  • the plurality of locations comprise a first part of a sensor and a second part of the sensor.
  • a first of the plurality of patterned images is captured in the first part of the sensor and a second of the plurality of patterned images is captured in the second part of the sensor.
  • the sensor is a biometric image sensor, such as a finger image sensor.
  • the first of the plurality of patterned images and the second of the plurality of patterned images together correspond to a fingerprint image in a first position on the sensor.
  • a third of the plurality of patterned images is captured in the first part of the sensor and a fourth of the plurality of patterned images is captured in the second part of the sensor.
  • the third of the plurality of patterned images and the fourth of the plurality of patterned images together correspond to the fingerprint image in a second position on the sensor.
  • the rotation information corresponds to an angular difference between the first position and the second position.
  • correlating the plurality of patterned images comprises correlating the first patterned image with the third patterned image to generate a first set of linear differences from the sets of linear differences, correlating the second patterned image with the fourth patterned image to generate a second set of linear differences from the sets of linear differences, and correlating a first combination of the first patterned image and the second patterned image with a second combination of the third patterned image and the fourth patterned image to generate a third set of linear differences from the sets of linear differences.
  • Correlating the first patterned image with the third patterned image, correlating the second patterned image with the fourth patterned image, and correlating the first combination with the second combination all comprise performing a cross correlation. In one embodiment, the cross correlation is either a normalized cross correlation or a standardized cross correlation.
  • the first part of the sensor and the second part of the sensor are contiguous. Alternatively, the first part of the sensor and the second part of the sensor are not contiguous.
  • the first part of the sensor comprises a first sub-frame of pixels and the second part of the sensor comprises a second sub-frame of pixels.
  • capturing the first patterned image comprises storing in the first sub-frame first data corresponding to the first patterned image
  • capturing the second patterned image comprises storing in the second sub-frame second data corresponding to the second patterned image
  • capturing the third patterned image comprises storing in the first sub-frame third data corresponding to the third patterned image
  • capturing the fourth patterned image comprises storing in the second sub-frame fourth data corresponding to the fourth patterned image.
  • Correlating the first patterned image with the third patterned image comprises correlating the first data with the third data to generate first and second linear differences from the first set of linear differences.
  • Correlating the second patterned image with the fourth patterned image comprises correlating the second data with the fourth data to generate first and second linear differences from the second set of linear differences.
  • correlating the first combination with the second combination comprises correlating a combination of the first data and the second data with a combination of the third data and the fourth data to generate first and second linear differences from the third set of linear differences.
  • correlating comprises determining a lag to correlate elements of one of the first and second sub-frames, the lag and a difference between the elements corresponding to first and second linear differences from one of the sets of linear differences.
  • Each element corresponds to a row of one of the first and second sub-frames.
  • each element corresponds to a column of one of the first and second sub-frames.
  • the method further comprises filtering the first set of linear differences, the second set of linear differences, the third set of linear differences, and the rotation information.
  • Filtering comprises multiplying by a scaling factor, performing a smoothing function, and performing a clipping function.
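  • As an illustrative sketch only (the gain, smoothing factor, and clip limit below are assumed values, not taken from this disclosure), the filtering just described can be expressed as a scale, a smoothing of each per-iteration difference against its previous filtered value, and a clip:

    def filter_difference(raw, previous_filtered, gain=1.0, smoothing=0.5, clip=8.0):
        """Scale, smooth (simple exponential average), and clip one difference value."""
        scaled = gain * raw
        smoothed = smoothing * previous_filtered + (1.0 - smoothing) * scaled
        return max(-clip, min(clip, smoothed))

    # Example: filter a stream of raw y-differences from successive frames.
    filtered = 0.0
    for dy in [0, 1, 3, 12, 2, -1]:
        filtered = filter_difference(dy, filtered)
        print(f"raw={dy:3d}  filtered={filtered:5.2f}")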
  • the finger image sensor is a finger swipe sensor.
  • the finger image sensor is a finger placement sensor.
  • the method further comprises using the rotation information on a host platform having a display screen, the rotation information used to rotate an object on the display screen, thereby emulating a computer input device.
  • the computer input device is selected from the group consisting of a steering wheel, a joystick, and a navigation bar.
  • Emulating a computer input device comprises moving the object on the display screen at a rate related to the angular difference or the angular position.
  • a system for obtaining rotation information comprises means for capturing a plurality of patterned images from a plurality of locations and means for correlating the plurality of patterned images to generate sets of linear differences and for using the sets of linear differences to generate the rotation information.
  • a method of emulating a rotational device using a pattern comprises capturing a first image of the pattern at a first orientation, capturing a second image of the pattern at a second orientation, correlating the first image with the second image to calculate linear differences between the first orientation and the second orientation, translating the linear difference into rotational data, and using the rotational data to emulate the movement of a rotational device.
  • a system for emulating a positional device comprises a sensor for capturing an image of a pattern and a processor coupled to the sensor.
  • the processor is configured to calculate linear differences between a first position of the image of the pattern and a second position of the image of the pattern and to translate the linear differences into rotational data corresponding to a rotation of the image of the pattern.
  • a method of sensing rotation of an object on an image sensor comprises sensing a first image of the object, sensing a second image of the object, and comparing the first image with the second image to determine whether there is linear motion in each of at least two portions of an area containing the first image and the second image to determine whether the object remained stationary, moved in a linear manner, or rotated.
  • Figures 1A and 1B show how an electronic image is rotated by rotating a finger on a finger image sensor, in accordance with the present invention.
  • Figure 2 shows a fingerprint image where ridges are shown in black and valleys are shown in white and indicating areas of bifurcation and ridge endings.
  • Figures 3A-D show left- and right-hand sections of a fingerprint sensor with a portion of a fingerprint image placed upon it as the fingerprint image is linearly moved and rotated in accordance with the present invention.
  • Figure 4 is a flowchart of a frame acquisition and image correlation procedure in accordance with the present invention.
  • Figure 5A shows a finger image sensor in a horizontal orientation.
  • Figure 5B shows a finger image sensor in a vertical orientation.
  • Figure 6 shows pixel data from a frame (slice) from a fingerprint sensor.
  • Figure 7 shows pixel data from a fingerprint sensor during different iterations of reconstruction in accordance with the present invention.
  • Figures 8-12 show different partition configurations of a frame in accordance with the present invention.
  • the present invention is directed to systems for and methods of determining the rotational position and movement of an arbitrary patterned material imaged by an imaging sensor.
  • the arbitrary patterned material is a finger and the rotational position and movement of the image of the finger are determined.
  • Embodiments of the present invention advantageously determine and collect finger rotational information for use in a digital device and most preferably in personal computing devices.
  • rotational position correlators which are non-linear, requiring trigonometric functions like sine, cosine, and tangent calculations
  • embodiments of the present invention use a linear correlation method that is easier to implement and more computationally efficient.
  • Embodiments of the present invention allow for extremely efficient calculation of linear motion from the components used to determine the rotational motion, thereby reducing the complexity of systems that require one sensor to be used to gather both linear and rotational movement inputs.
  • a system in accordance with embodiments of the present invention reconstructs fingerprint images from swipe sensors, thereby efficiently providing rotational motion data along with data necessary to reconstruct the image.
  • the system is particularly well suited for applications that do not require high precision rotational information.
  • the fingerprint sensor is an Atrua Wings ATW 100 capacitive swipe sensor by Atrua Technologies, Inc., at 1696 Dell Avenue, Campbell, California 95008.
  • a key aspect of the present invention is determining rotation from linear correlation rather than prior art methods that determine rotation by rotating one frame with respect to another, and then applying standard correlation methods. The prior art methods require choosing a pivot point (center of origin) and then performing additional computation.
  • embodiments of the present invention use the simpler linear correlation methods to determine rotational movement, which will occur when motion of the left side of a finger image is in an opposite direction to that of the right side. For instance, if the left half is moving upward and the right half downward, there is clockwise rotational movement as shown in Figure 3D, discussed below. If the linear movement detected at the left edge and right edge of the sensor is substantially equal but opposite, the center of rotation is at or near the center of the sensor.
  • If the linear movement is not equal but opposite, the center of rotation can be calculated and will be closer to the end of the sensor with the smaller amount of linear movement. If both sides are moving in the same direction, as in Figure 3B, then likely only overall linear movement is observed. It will be appreciated that the present invention can determine an angle of rotation even if the center of rotation is displaced, as when a finger slides along the sensor as it rotates.
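  • The decision logic described in the preceding two paragraphs can be sketched as follows. This is an illustrative example rather than the patented implementation: the threshold value is assumed, dy_left and dy_right stand for the per-half vertical movements already measured by linear correlation, and the clockwise convention follows the statement above (left half up and right half down is clockwise, with y growing downward in image coordinates).

    def classify_motion(dy_left, dy_right, threshold=1):
        """Classify frame-to-frame motion as stationary, linear, or rotational."""
        if abs(dy_left) < threshold and abs(dy_right) < threshold:
            return "stationary"
        if dy_left * dy_right < 0:
            # Opposite directions on the two halves indicate rotation (shear).
            # Convention from the text: left half up (dy < 0) and right half down is clockwise.
            return "clockwise" if dy_left < 0 else "counterclockwise"
        return "linear"  # both halves move the same way: overall linear movement

    def center_of_rotation(dy_left, dy_right, sensor_width):
        """Estimate where along the sensor the pivot lies (0 = left edge).

        Equal-but-opposite movement puts the pivot at the sensor center; otherwise
        the pivot lies closer to the end that moved less.
        """
        if dy_left == dy_right:        # pure translation: no finite pivot
            return None
        return sensor_width * dy_left / (dy_left - dy_right)

    print(classify_motion(-3, 3))                          # clockwise
    print(center_of_rotation(-3, 3, sensor_width=128))     # 64.0 (sensor center)
    print(center_of_rotation(-1, 5, sensor_width=128))     # about 21.3 (nearer the left end)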
  • FIG. 1A shows a system 10 used to illustrate one embodiment of the present invention.
  • the system 10 comprises a finger image sensor 25 coupled to a display device 11, displaying a triangular electronic image 15.
  • the display device 11 can be a monitor used with a personal computer, a screen on a personal digital assistant (PDA) or game device, or any other kind of display device.
  • Figure 1A also shows a finger 30 placed on the finger image sensor 25. It will be appreciated that the finger 30 has patterns on a surface contacting the finger image sensor 25 and that the finger image sensor 25 captures images of those patterns.
  • a coordinate axis 20 makes an angle α0 with an edge 21 of the triangular image 15.
  • a coordinate axis 35 makes an angle β0 with a line segment 36 associated with the finger 30.
  • Figure 1B shows the system 10 after the finger 30 has been rotated so that the coordinate axis 35 makes an angle β1 with the line segment 36.
  • the triangular image 15 is rotated so that the coordinate axis 20 makes an angle α1 with the edge 21.
  • rotating the finger 30 through an angle β1 - β0 (Δβ) results in rotating the triangular electronic image through an angle α1 - α0 (Δα).
  • Δα can correspond to Δβ in any number of ways.
  • Δα can equal Δβ.
  • Δα can be a multiple of Δβ.
  • Δα can be a fraction of Δβ.
  • Δα can be a multiple of Δβ plus some offset, etc.
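  • A minimal sketch of one such correspondence, assuming a simple gain-plus-offset relationship (the parameter values are illustrative only, not taken from this disclosure):

    def finger_to_image_rotation(delta_beta, gain=1.0, offset=0.0):
        """Return delta_alpha for a measured finger rotation delta_beta."""
        return gain * delta_beta + offset

    print(finger_to_image_rotation(10.0))                        # 10.0: delta_alpha equals delta_beta
    print(finger_to_image_rotation(10.0, gain=0.5))              # 5.0: a fraction of delta_beta
    print(finger_to_image_rotation(10.0, gain=2.0, offset=1.0))  # 21.0: a multiple plus an offset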
  • the finger 30 does not have to maintain a pivot point on the finger image sensor 25. The finger 30 can be moved horizontally or vertically on the finger image sensor 25 before, while, or after it is rotated so that the line segment 36 is displaced horizontally, or vertically, or both, and the angle Δβ still determined.
  • the finger image sensor 25 is depicted as a placement sensor, it will be appreciated that other types of sensors can be used in accordance with the present invention.
  • the finger image sensor 25 is a swipe sensor, described in more detail below.
  • Figure 2 shows a typical fingerprint image, including physical ridges and valleys on the surface of the finger.
  • the pattern of ridges and valleys has proven to be unique among very large populations of human beings, especially the ridge endings and bifurcations called "minutiae.”
  • Fingers also often have other measurable surface features such as pores, scars, and wrinkles. It is the overall pattern of features, not the unique individual features, that is tracked to measure the distance, rotation, direction, or speed that a fingerprint has moved on the finger image sensor.
  • Many prior art electronic finger imaging sensors actively sense the entire surface of a fingerprint at the same time.
  • these sensors have a surface area at least as large as a typical person's fingertip pad (typically 15mm x 15mm). Using these sensors the user simply places his finger on the sensor until the image is captured.
  • These sensors, now known as placement sensors, contain rows and columns and can capture large images, typically ranging from 250-500 rows and 200-500 columns depending on the sensor's capabilities and size.
  • Such devices are capable of sensing rotational input, and can indeed be used with the new invention to collect rotational information, but they are larger than today's more miniaturized finger sensors.
  • the most promising of the miniaturized sensors is one that is fully sized in one direction (typically in width) but abbreviated in the other (typically height).
  • Embodiments of the present invention can acquire rotational position data from any device capable of imaging the surface of a human finger or other patterned material and is therefore not limited to use with placement or swipe finger image sensors, which typically provide at least 250 dots per inch resolution.
  • the present invention will also work with lower or higher resolution devices that may become available in the future.
  • Figure 3A shows the left half 205A and the right half 205B of a finger image sensor 205.
  • the finger image sensor 205 has placed upon it a finger identified by the fingerprint image 203 having identifiable ridge portions 201 and 202.
  • Figure 3A also shows an x-coordinate axis and a y-coordinate axis, with the arrow of each axis pointing in a direction of increasing values for the respective axis.
  • Figure 3A shows the fingerprint image 203 in a first orientation on the finger image sensor 205 and thus in a first orientation with respect to the x-coordinate axis and the y-coordinate axis.
  • Figure 3B shows the fingerprint image after it has been moved linearly in the vertical direction, toward decreasing values of the y-coordinate.
  • Figure 3C shows the fingerprint image 203 after it has been rotated counterclockwise on the finger image sensor 205.
  • Figure 3D shows the fingerprint image 203 after it has been rotated clockwise on the finger image sensor 205. It is seen by comparing the orientations of the fingerprint image 203 from Figure 3C to Figure 3D that the y-coordinates of the identifiable portions 201 and 202 in the left half 205A have both increased and that the y-coordinates in the right half 205B have both decreased.
  • FIG. 4 is a flowchart for an algorithm 210 for determining rotational movement or placement in accordance with a preferred embodiment of the present invention.
  • a user is prompted by an operating system or application executing on a host (not shown).
  • a finger image sensor (not shown) in accordance with the present invention is initialized, readied for reading data.
  • This step 212 comprises powering on the sensor and making sure it is ready to capture a fingerprint image.
  • Step 212 can also include setting contrast and brightness levels, setting the sensor to a desired data acquisition mode, calibrating the sensor, or otherwise initializing the sensor. It will be appreciated that step 212 is not required if the sensor has already been initialized.
  • a frame is read by the sensor at a rate supported by it or by the hardware platform's computing power and bandwidth.
  • the properties of the frame are estimated to determine whether it is useful.
  • the metrics of the frame are analyzed in the step 220, to determine whether the frame is useful. If the frame is useful, it is kept and processing continues in the step 225; otherwise, the frame is disregarded, and processing continues in the step 255.
  • the usefulness of a frame is determined by measuring image statistics such as the average value and the variance of pixel data in the frame. The usefulness of a frame is directly related to whether or not a finger is present on the sensor.
  • the step 215 can be eliminated if a less efficient implementation is acceptable, or when the sensor only generates frames when a finger is present on it.
  • the current frame (e.g., the frame most recently read and currently being processed)
  • the current frame is copied to the last useful frame.
  • the frame is divided into a left half and a right half. It will be appreciated, however, that a frame can be divided into any number of parts.
  • the linear movement of the left half of the frame and the linear movement of the right half of the frame are both calculated.
  • in the step 235, using the linear movement of the left half of the frame and the linear movement of the right half of the frame, the overall linear movement of the frame is calculated. This calculation is described in more detail below.
  • in the step 240, the calculations made in the step 235 are used to calculate the rotational movement of the fingerprint image.
  • in the step 245, the process checks whether there was any movement, linear or rotational, of the fingerprint image. If there was movement, the process continues in the step 250, otherwise it continues in the step 255. In the step 250, the last useful frame is updated, and in the step 251, the last useful frame is stored. Processing then continues in the step 225. In the step 255, the process checks whether more frames are to be acquired.
  • the process continues to the step 260, where a counter is incremented, and then continues on to the step 213. If no more frames are to be acquired, the process continues to the step 265, where it stops.
  • the pixels for the current frame are correlated to the pixels of the last useful frame to determine the amount of rotational or linear motion. If overall linear movement has been calculated in the step 235, data corresponding to the movement are sent to whatever downstream process needs it.
  • a program (e.g., an application program, a device driver, or an operating system) receives the data corresponding to the movement and can use the corresponding data to rotate an image on the display screen.
  • the last useful frame is replaced by the current frame and the algorithm continues by acquiring new image data from the sensor.
  • the system iterates in real time.
  • the system stores all the frames in a memory buffer and calculates movement after multiple frames are acquired from the sensor.
  • the iteration halts when either the application or operating system tells it to stop.
  • the process can continue indefinitely.
  • the algorithm starts whenever there is a need for rotational feedback, such as power up of a device or start of a game. The algorithm terminates when rotational information is no longer needed.
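  • The overall loop of Figure 4 can be sketched as below. The helper functions (read_frame, frame_is_useful, estimate_motion, report_motion, keep_running) are assumed placeholders for the sensor driver, the usefulness test, the correlation described in the following paragraphs, and the host interface; they are not named in this disclosure.

    def rotation_input_loop(read_frame, frame_is_useful, estimate_motion,
                            report_motion, keep_running):
        """One pass over the flowchart of Figure 4 per loop iteration."""
        last_useful = None
        while keep_running():                      # step 255: are more frames wanted?
            frame = read_frame()                   # step 213: acquire a frame
            if not frame_is_useful(frame):         # steps 215/220: frame statistics check
                continue                           # noise-only frames are disregarded
            if last_useful is None:
                last_useful = frame                # first useful frame becomes the reference
                continue
            # steps 225-240: per-half and whole-frame linear motion, then rotation
            dx, dy, dtheta = estimate_motion(last_useful, frame)
            if dx or dy or dtheta:                 # step 245: any movement at all?
                report_motion(dx, dy, dtheta)      # hand the motion to the host program
                last_useful = frame                # steps 250/251: update the stored frame

  • In such a sketch, read_frame would wrap the sensor interface of Figures 5A and 5B, and estimate_motion would wrap the per-half correlation described below.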
  • the system executes on a computing device that is connected to a swipe fingerprint sensor 310 shown in Figure 5A.
  • the swipe sensor 310 is mounted horizontally with respect to the x and y directions so that image frames as shown in Figure 6 are captured.
  • the x direction is along the longest side of the sensor 310 while the y direction is perpendicular to the x direction.
  • the sensor 310 can be mounted in any orientation.
  • the dimension of the sensor along the x-axis will always be referred to as the length of the sensor.
  • Figure 5B shows a sensor 315 having a second orientation used in accordance with the present invention.
  • the sensor 315 is mounted vertically with respect to the x and y directions.
  • the fingerprint sensor (e.g., 310 or 315) provides a single frame of data to a program upon request. As described below, a single frame can be logically divided into any number of smaller sub-frames. Sensors that can provide more than one frame per request can also be used in accordance with the present invention. It will also be appreciated that fingerprint sensors in accordance with the present invention can be mounted at orientations other than those shown in Figures 5A and 5B. Typically, swipe sensors are capable of delivering anywhere from 250 to 3000 frames per second (the "frame rate"), depending on the sensor's capabilities, the interface used and the speed of the host personal computer.
  • Figure 6 shows image data 400 captured by a finger image sensor in accordance with the present invention.
  • the image data 400 comprises N rows by M columns of picture elements, or pixels, with each pixel typically represented by a single byte (8 bits) of data.
  • M can be any positive value (100-300 is typical, depending on resolution) and N must be at least 2 (the typical number of rows in a frame is 8-32).
  • a value of a pixel is a gray level, such that the image frame resembles a finger image when displayed on a computer monitor. Typically, this value ranges from 0 to 255, with 0 representing black and indicating the presence of a fingerprint ridge, and 255 representing white and indicating the presence of a valley.
  • Figure 7 shows a current frame and the last useful frame for several iterations of the algorithm 210 in Figure 4. Iteration 1, shown in column 401, shows a last useful frame 405 and a current frame 410. No rows of the frame 405 correspond to the frame 410, so no recognizable movement can be identified. Iteration 2, shown in column 402, shows a last useful frame 410 and a current frame 415. It will be appreciated that the last useful frame 410 corresponds to the current frame from the previous iteration, iteration 1.
  • Row 0 of frame 410 corresponds to row 1 of frame 415 (also indicated by an arrow).
  • Linear movement in a y direction is thus detected.
  • Iteration 3, shown in column 403, shows a last useful frame 415 and a current frame 420.
  • Frame 415 is identical to the frame 420, so zero motion is recognized. (E.g., the finger has not been moved on the finger image sensor.) In this example, the last useful frame does not have to be updated.
  • Iteration 4, shown in column 404 shows a last useful frame 420 and a current frame 425. It will be appreciated that the last useful frame 420 corresponds to the current frame from iteration 3.
  • Row 1 of frame 420 corresponds to row 2 of frame 425 (also indicated by a straight arrow).
  • This vertical shift in rows indicates a downward movement of a finger on a finger image sensor, referred to as movement in a positive y direction.
  • column 2 of frame 420 (indicated by a squiggly arrow) corresponds to column 1 of frame 425 (also indicated by a squiggly arrow).
  • This horizontal shift in columns indicates a left movement of a finger on a finger image sensor, here labeled movement in a negative x-direction.
  • because a positive y-movement and a negative x-movement have been detected, a clockwise rotation of the fingerprint image is recognized.
  • the signs (positive or negative) assigned to a particular direction in the x-direction and the y-direction are arbitrarily chosen for the sake of explanation. The signs can be reversed.
  • the algorithm 210 in Figure 4 is now described in more detail.
  • the algorithm 210 iterates, requesting new frames of data (step 213) and correlating them to a previous image frame (step 225) stored in a buffer for just this purpose.
  • the sensing device (e.g., a finger image sensor).
  • once the frame is collected, it is analyzed to determine its usefulness. If the frame is deemed useful, it is saved in local memory for later use; if it is deemed useless, it is simply disregarded.
  • the usefulness of a frame is determined by ensuring the frame contains at least some finger image information.
  • the metrics (calculated in the step 215) can also be calculated using just portions of the frame (rather than the entire frame), where the portions are arbitrary sub-images or are obtained by skipping every p-th pixel in the calculation.
  • the frame is considered noise, and thereby disregarded, if μ ≥ Noise_average_threshold_high, or if μ ≤ Noise_average_threshold_low, or if σ² ≤ Variance_average_threshold, where μ is the average and σ² is the variance of the pixel values in the frame.
  • In one embodiment, Noise_average_threshold_high = 240, Noise_average_threshold_low = 30, and Variance_average_threshold = 10.
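  • As an illustrative sketch of this usefulness test (assuming 8-bit grayscale frames held as NumPy arrays and using the example threshold values quoted above):

    import numpy as np

    NOISE_AVERAGE_THRESHOLD_HIGH = 240
    NOISE_AVERAGE_THRESHOLD_LOW = 30
    VARIANCE_AVERAGE_THRESHOLD = 10

    def frame_is_useful(frame):
        """Return False when the frame looks like noise (no finger present)."""
        mu = frame.mean()    # average gray level of the frame (or of a sub-image)
        var = frame.var()    # variance of the gray levels
        if mu >= NOISE_AVERAGE_THRESHOLD_HIGH:    # nearly all white: no finger on the sensor
            return False
        if mu <= NOISE_AVERAGE_THRESHOLD_LOW:     # nearly all black: saturated or faulty data
            return False
        if var <= VARIANCE_AVERAGE_THRESHOLD:     # too uniform to contain ridge detail
            return False
        return True

    # A blank (all-white) frame is rejected; a ridge-and-valley pattern is kept.
    blank = np.full((8, 128), 255, dtype=np.uint8)
    ridges = np.tile(np.array([0, 255], dtype=np.uint8), (8, 64))
    print(frame_is_useful(blank), frame_is_useful(ridges))    # False True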
  • Standard cross-correlation SCC of row R of the last useful frame with row S of the current frame is mathematically expressed as: [Equation 1] where d is referred to as the "lag" or “offset.”
  • the lag is related to the horizontal movement of the data in one frame with respect to the data in another, and can also be thought of as a velocity.
  • Since the lag, or offset, of the information in the current frame relative to that in the last frame corresponds to an unknown amount of movement in the x-direction, NCC_whole(R,S,d) must typically be calculated for multiple values of d, for example over the range -L ≤ d ≤ +L, to find the one that corresponds to the best fit.
  • L should be chosen so that it is as large as the maximum x-velocity that can occur from frame to frame.
  • a smaller L is more computationally efficient, but will produce inaccurate results if the finger shifts more than ⁇ L from one frame to the next.
  • L can be a function of the d_peakwhole from the previous iteration i-1. For example, L(at iteration i) = d_peakwhole(at iteration i-1) + e, where e is typically equal to 1 or 2.
  • L can be a function of the row number in the frame (i.e., R and/or S).
  • the numerator and denominator of the NCC equations can be scaled so that floating-point operations can be avoided, and for computing purposes it is also possible to use NCC-squared to avoid an expensive square-root operation.
  • the PeakNCC_whole corresponds to the correlation coefficient of the best fit, while d_peakwhole corresponds to the amount of movement in the x direction. A method to obtain the amount of motion in the y direction is described below.
  • the NCC calculation in Equation 2 below can be restated, for last frame oF and current frame cF, as NCC_whole(R, S, d) = (A - B) / (C^(1/2) × D), where the sum runs along row S from column 1 through column M/2 - d. Furthermore, the NCC for the left and right halves of each row can be determined using corresponding sums over the columns of each half.
  • the last useful frame at iteration i has rows numbered 1 through N, as shown in Figure 6, where N is the number of rows supplied by the sensor. Similarly, the current frame to be processed has rows 1 through N. For a given row R in the last frame, PeakNCC and d_peak are calculated as in the equations above.
  • 1. MaxPeakNCC: the value of the correlation of the best pair of matching rows in the current frame and the last frame
  • 2. bestR: the row of the last frame that results in MaxPeakNCC
  • 3. bestS: the row of the current frame that results in MaxPeakNCC
  • 4. d_peakMax: the correlation lag at which MaxPeakNCC is reached
  • bestR and bestS are the pair of rows that yield the highest correlation value (MaxPeakNCC). If MaxPeakNCC ≥ corr_threshold, then: 1. Δx(i) = d_peakMax, which is the x-velocity at iteration i; 2. Δy(i) = bestS - bestR, which is the y-velocity at iteration i; otherwise: 3. Δx(i) = 0; 4. Δy(i) = 0; where corr_threshold is used to make sure the correlation is high enough to indicate an overlap at all.
  • In one embodiment, corr_threshold = 0.75, but other values can be used to tune the performance of the algorithm. If the correlation is below the threshold, it is impossible to determine actual values for the x- and y-velocities, so the algorithm simply outputs no motion vector. However, in accordance with alternative embodiments other values can be output as the default, such as maximum movements of N rows and M columns, or combinations thereof.
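  • The row-correlation search just described can be sketched as follows. This example uses the ordinary zero-mean normalized cross-correlation rather than the exact A, B, C, D formulation above, and the frame size, the default lag range L, and the synthetic test data are illustrative assumptions only.

    import numpy as np

    def ncc(a, b):
        """Zero-mean normalized cross-correlation of two equal-length vectors."""
        a = a - a.mean()
        b = b - b.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        return float((a * b).sum() / denom) if denom else 0.0

    def row_motion(last_frame, cur_frame, L=8, corr_threshold=0.75):
        """Return (dx, dy, peak): best lag and row offset, or (0, 0, peak) below threshold."""
        n_rows, n_cols = last_frame.shape
        best = (0.0, 0, 0, 0)                      # (MaxPeakNCC, bestR, bestS, d_peakMax)
        for R in range(n_rows):                    # row of the last useful frame
            for S in range(n_rows):                # candidate row of the current frame
                for d in range(-L, L + 1):         # lag = candidate x-shift
                    lo, hi = max(0, d), min(n_cols, n_cols + d)
                    c = ncc(last_frame[R, lo - d:hi - d], cur_frame[S, lo:hi])
                    if c > best[0]:
                        best = (c, R, S, d)
        peak, bestR, bestS, d_peak = best
        if peak >= corr_threshold:
            return d_peak, bestS - bestR, peak     # dx from the lag, dy from the row offset
        return 0, 0, peak                          # correlation too weak: report no motion

    # Synthetic test: the second frame is the first shifted right by 2 columns and
    # down by 1 row, so the search should report dx = 2 and dy = 1.
    rng = np.random.default_rng(0)
    f0 = rng.integers(0, 256, size=(8, 64)).astype(float)
    f1 = np.roll(np.roll(f0, 2, axis=1), 1, axis=0)
    print(row_motion(f0, f1))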
  • Step 3.1: Acquire a frame and calculate Δy_left(i) and Δy_right(i) as in Equation 1.
  • The resulting two estimated values can be averaged together or otherwise combined to form the final Δtheta(i).
  • the ⁇ theta(i) are made available to the host application and/or operating system.
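  • Because the exact equation for Δtheta(i) is not reproduced above, the sketch below uses an assumed small-angle approximation: each half's vertical shift over an assumed lever arm of half the sensor width gives one angle estimate, and the two estimates are averaged, as described above.

    import math

    def delta_theta(dy_left, dy_right, sensor_width_px):
        """Estimate the rotation increment (radians) from the per-half y-shifts."""
        arm = sensor_width_px / 2.0              # assumed lever arm for each half
        theta_left = math.atan2(-dy_left, arm)   # left half moving up -> positive (clockwise)
        theta_right = math.atan2(dy_right, arm)  # right half moving down -> positive (clockwise)
        return (theta_left + theta_right) / 2.0  # combine the two per-half estimates

    # Left half up 3 pixels and right half down 3 pixels on a 128-pixel-wide
    # sensor gives roughly a 2.7 degree clockwise increment.
    print(math.degrees(delta_theta(-3, 3, 128)))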
  • a standard correlation is used instead of normalized cross correlation.
  • Standard cross correlation given in Equation 1 could be used instead of Normalized Cross Correlation. It is straightforward to split Equation 1 into terms from the left and right sides of each row.
  • the SCC value for the entire row is simply the sum of the correlation values of each half. It will also be appreciated that the maximum standard cross-correlation between a row S of a frame and a row R of the last useful frame can be given by other expressions. For example, weighted or voting schemes can be used. In one embodiment,
  • Weighted_MAX is a function that assigns a plurality of predetermined weights to its elements before generating a value
  • d_peakwhole(R,S,L) is the value of d at which Equation 8 is satisfied
  • L is approximately equal to the maximum horizontal speed from the last useful frame to the current frame. While the preferred embodiment splits each row into a left and right side of equal length, alternative embodiments use any arbitrary division of each row, including more than two sections.
  • Figure 8 shows a fingerprint sensor having a left half (sub-frame) 605 contiguous with a right half 610.
  • Figure 9 shows a fingerprint sensor having a first section (sub-frame) 615, a second section 620, a third section 625, and a fourth section 630.
  • Section 615 is contiguous with section 620
  • section 620 is contiguous with 625
  • section 625 is contiguous with section 630.
  • Figure 10 shows a finger image sensor having two sections 631 and 632 that are not contiguous.
  • Figure 11 shows a finger image sensor having four sections 635, 640, 645, and 650, none of which are contiguous.
  • Figure 12 shows a finger image sensor having a section 655 contiguous with a section 660, and a section 665 contiguous with a section 670.
  • filtering is used to describe the function of processing an input in a well-defined way to produce an output.
  • correlation is computationally intensive. Accordingly, in one embodiment, the calculation of Δtheta(i) and/or the Δx(i), Δy(i) for the left, right, and whole array are performed on a separate processor or dedicated hardware. In this embodiment, the hardware can be integrated into the silicon fingerprint sensor itself.
  • the hardware performing the correlation must have access to the current frame and the last useful frame, both of which can be stored in memory on the device. If, for example, this is integrated into the finger image sensor, such a device would obviously have access to the current frame (since the device itself created it), and it could save the last useful frame in volatile memory registers. In such a case the device would also need to determine whether a frame is useful or not, using the method described here in the preferred embodiment. In such an embodiment the host computing device is not necessary. Obviously, such a hardware implementation could also be used to reconstruct fingerprint images since doing so only requires the Δx(i) and Δy(i) for the whole array. It will be readily apparent to one skilled in the art that various modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.

Abstract

A system for and method of obtaining rotation information is disclosed. The method comprises capturing a plurality of patterned images from a plurality of locations, correlating the plurality of patterned images to generate sets of linear differences, and using the sets of linear differences to generate the rotation information. Preferably, the plurality of locations comprise a first part of a fingerprint swipe sensor and a second part of the fingerprint swipe sensor, each part configured to capture a part of a fingerprint image. Each part captures a part of the fingerprint image at two positions and correlates the parts at the two positions to determine one set of linear differences. Together, the sets of linear differences are used to calculate the rotation information, which can be used to emulate a rotational device such as a steering wheel, a joystick, or a navigation bar.

Description

SYSTEM FOR AND METHOD OF GENERATING ROTATIONAL INPUTS
RELATED APPLICATION This application claims priority under 35 U.S.C. § 119(e) of the co-pending U.S. provisional application Serial Number 60/497,045, filed on August 22, 2003, and titled
"ROTATIONAL INPUT METHOD PATENT." The provisional application Serial Number 60/497,045, filed on August 22, 2003, and titled "ROTATIONAL INPUT METHOD PATENT" is hereby incorporated by reference.
FIELD OF THE INVENTION This invention relates to the field of biometric sensors. In particular, this invention relates to systems and methods that use fingerprint images to emulate electronic positioning devices.
BACKGROUND OF THE INVENTION The emergence of portable electronic computing platforms allows functions and services to be enjoyed wherever necessary. Palmtop computers, personal digital assistants, mobile phones, portable game consoles, biometric/health monitors, remote controls, digital cameras, to name a few, are some daily-life examples of portable electronic computing platforms. The desire for portability has driven these computing platforms to become smaller and, consequently, to have longer battery life. A dilemma occurs when these ever-smaller devices require efficient ways to collect user input. Portable electronic computing platforms need these user inputs for multiple purposes, including (a) navigating a cursor or a pointer to a certain location on a display, (b) selecting (e.g., choosing or not choosing) an item or an action, and (c) orientating (e.g., changing direction with or without visual feedback) an input device. Concepts for user input from much larger personal computers have been borrowed. Micro joysticks, navigation bars, scroll wheels, touch pads, steering wheels and buttons have all been adopted, with limited success, in conventional devices. All these positioning devices consume substantial amounts of valuable surface real estate on a portable device.
Mechanical positioning devices such as joysticks, navigation bars and scroll wheels can wear out and become unreliable. Their sizes and required movements often preclude optimal ergonomic placement on portable computing platforms. Prior art methods calculate rotation by rotating one frame with respect to another and then applying standard correlation methods. These methods require the selection of a pivot point (e.g., the origin), followed by additional computations. These computations are not helpful for determining linear motion (e.g., non-rotational movement in the x- and y- directions). Such a shortcoming makes prior art systems even more inefficient when used in portable devices, in which both rotational and linear movement are required, such as when emulating, respectively, a steering wheel and a pointing device.
SUMMARY OF THE INVENTION The present invention discloses a system for and method of obtaining rotation information from a patterned image, such as a fingerprint image. Embodiments of the present invention thus require smaller footprints than those that use joy sticks, steering wheels, and other, larger devices that require additional power. Embodiments of the present invention use linear correlation methods that are easier to use than rotational and other methods such as those using trigonometric functions. Embodiments of the present invention thus use simpler algorithms that can be performed faster and more reliably. In a first aspect of the present invention, a method of obtaining rotation information comprises capturing a plurality of patterned images from a plurality of locations, correlating the plurality of patterned images to generate sets of linear differences, and using the sets of linear differences to generate the rotation information. The plurality of locations comprise a first part of a sensor and a second part of the sensor. A first of the plurality of patterned images is captured in the first part of the sensor and a second of the plurality of patterned images is captured in the second part of the sensor. Preferably, the sensor is a biometric image sensor, such as a finger image sensor. The first of the plurality of patterned images and the second of the plurality of patterned images together correspond to a fingerprint image in a first position on the sensor. A third of the plurality of patterned images is captured in the first part of the sensor and a fourth of the plurality of patterned images is captured in the second part of the sensor. The third of the plurality of patterned images and the fourth of the plurality of patterned images together correspond to the fingerprint image in a second position on the sensor. In one embodiment, the rotation information corresponds to an angular difference between the first position and the second position. In one embodiment, correlating the plurality of patterned images comprises correlating the first patterned image with the third patterned image to generate a first set of linear differences from the sets of linear differences, correlating the second patterned image with the fourth patterned image to generate a second set of linear differences from the sets of linear differences, and correlating a first combination of the first patterned image and the second patterned image with a second combination of the third patterned image and the fourth patterned image to generate a third set of linear differences from the sets of linear differences. Correlating the first patterned image with the third patterned image, correlating the second patterned image with the fourth patterned image, and correlating the first combination with the second combination all comprise performing a cross correlation. In one embodiment, the cross correlation is either a normalized cross correlation or a standardized cross correlation. In one embodiment, the first part of the sensor and the second part of the sensor are contiguous. Alternatively, the first part of the sensor and the second part of the sensor are not contiguous. In one embodiment, the first part of the sensor comprises a first sub-frame of pixels and the second part of the sensor comprises a second sub-frame of pixels.
In this embodiment, capturing the first patterned image comprises storing in the first sub-frame first data corresponding to the first patterned image, capturing the second patterned image comprises storing in the second sub-frame second data corresponding to the second patterned image, capturing the third patterned image comprises storing in the first sub-frame third data corresponding to the third patterned image, and capturing the fourth patterned image comprises storing in the second sub-frame fourth data corresponding to the fourth patterned image. Correlating the first patterned image with the third patterned image comprises correlating the first data with the third data to generate first and second linear differences from the first set of linear differences. Correlating the second patterned image with the fourth patterned image comprises correlating the second data with the fourth data to generate first and second linear differences from the second set of linear differences. And correlating the first combination with the second combination comprises correlating a combination of the first data and the second data with a combination of the third data and the fourth data to generate first and second linear differences from the third set of linear differences. In another embodiment, correlating comprises determining a lag to correlate elements of one of the first and second sub- frames, the lag and a difference between the elements corresponding to first and second linear differences from one of the sets of linear differences. Each element corresponds to a row of one of the first and second sub-frames. Alternatively, each element corresponds to a column of one of the first and second sub- frames. In another embodiment, the method further comprises filtering the first set of linear differences, the second set of linear differences, the third set of linear differences, and the rotation information. Filtering comprises multiplying by a scaling factor, performing a smoothing function, and performing a clipping function. Preferably, the finger image sensor is a finger swipe sensor. Alternatively, the finger image sensor is a finger placement sensor. In another embodiment, the method further comprises using the rotation information on a host platform having a display screen, the rotation information used to rotate an object on the display screen, thereby emulating a computer input device. The computer input device is selected from the group consisting of a steering wheel, a joystick, and a navigation bar. Emulating a computer input device comprises moving the object on the display screen at a rate related to the angular difference or the angular position. In accordance with a second aspect of the invention, a system for obtaining rotation information comprises means for capturing a plurality of patterned images from a plurality of locations and means for correlating the plurality of patterned images to generate sets of linear differences and for using the sets of linear differences to generate the rotation information. 
In accordance with a third aspect of the present invention, a method of emulating a rotational device using a pattern comprises capturing a first image of the pattern at a first orientation, capturing a second image of the pattern at a second orientation, correlating the first image with the second image to calculate linear differences between the first orientation and the second orientation, translating the linear difference into rotational data, and using the rotational data to emulate the movement of a rotational device. In accordance with a fourth aspect of the present invention, a system for emulating a positional device comprises a sensor for capturing an image of a pattern and a processor coupled to the sensor. The processor is configured to calculate linear differences between a first position of the image of the pattern and a second position of the image of the pattern and to translate the linear differences into rotational data corresponding to a rotation of the image of the pattern. In accordance with a fifth aspect of the present invention, a method of sensing rotation of an object on an image sensor comprises sensing a first image of the object, sensing a second image of the object, and comparing the first image with the second image to determine whether there is linear motion in each of at least two portions of an area containing the first image and the second image to determine whether the object remained stationary, moved in a linear manner, or rotated. BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS Figures 1A and 1B show how an electronic image is rotated by rotating a finger on a finger image sensor, in accordance with the present invention. Figure 2 shows a fingerprint image where ridges are shown in black and valleys are shown in white and indicating areas of bifurcation and ridge endings. Figures 3A-D show left- and right-hand sections of a fingerprint sensor with a portion of a fingerprint image placed upon it as the fingerprint image is linearly moved and rotated in accordance with the present invention. Figure 4 is a flowchart of a frame acquisition and image correlation procedure in accordance with the present invention. Figure 5A shows a finger image sensor in a horizontal orientation. Figure 5B shows a finger image sensor in a vertical orientation. Figure 6 shows pixel data from a frame (slice) from a fingerprint sensor. Figure 7 shows pixel data from a fingerprint sensor during different iterations of reconstruction in accordance with the present invention. Figures 8-12 show different partition configurations of a frame in accordance with the present invention.
DETAILED DESCRIPTION OF THE INVENTION The present invention is directed to systems for and methods of determining the rotational position and movement of an arbitrary patterned material imaged by an imaging sensor. Preferably, the arbitrary patterned material is a finger and the rotational position and movement of the image of the finger are determined. Embodiments of the present invention advantageously determine and collect finger rotational information for use in a digital device and most preferably in personal computing devices. Unlike prior art rotational position correlators, which are non-linear, requiring trigonometric functions like sine, cosine, and tangent calculations, embodiments of the present invention use a linear correlation method that is easier to implement and more computationally efficient. Embodiments of the present invention allow for extremely efficient calculation of linear motion from the components used to determine the rotational motion, thereby reducing the complexity of systems that require one sensor to be used to gather both linear and rotational movement inputs. A system in accordance with embodiments of the present invention reconstructs fingerprint images from swipe sensors, thereby efficiently providing rotational motion data along with data necessary to reconstruct the image. The system is particularly well suited for applications that do not require high precision rotational information. Methods of and systems for fingerprint sensing are described in detail in the U.S. Patent Application Serial Number 10/194,994, filed July 12, 2002, and titled "Method and System for Biometric Image Assembly from Multiple Partial Biometric Frame Scans," and in the U.S. Patent Application
Serial Number 10/099,558, filed March 13, 2002, and titled "Fingerprint Biometric Capture Device and Method with Integrated On-Chip Data Buffering," both of which are hereby incorporated by reference in their entireties. In the preferred embodiment, the fingerprint sensor is an Atrua Wings ATW 100 capacitive swipe sensor by Atrua Technologies, Inc., at 1696 Dell Avenue, Campbell, California 95008. A key aspect of the present invention is determining rotation from linear correlation rather than prior art methods that determine rotation by rotating one frame with respect to another, and then applying standard correlation methods. The prior art methods require choosing a pivot point (center of origin) and then performing additional computation. Furthermore, such computation is not helpful for determining linear motion (non-rotational movement in the x- and y-directions). These computations are even more inefficient in portable electronic devices where it may be important to calculate both kinds of movement, for instance, when emulating a pointing device and a steering wheel on one component. Most of the prior art is concentrated on calculating exact rotational movement, and therefore requires the more precise steps outlined above. However, many applications do not require such precision, and it is these cases where the new invention is best suited. Embodiments of the present invention make use of the fact that, as a finger rotates clockwise, the left side of the image will appear to be moving upward, while the right half of the finger will appear to be moving downward. This is sometimes referred to as shear. The opposite is true of counterclockwise motion. Furthermore, in both cases, the left side will appear to be moving toward the right, and the right side will appear to be moving toward the left, as shown in Figures 3C and 3D. On a typical swipe type of finger imaging sensor, which is mounted horizontally so that its height is much smaller than its width, the upward and downward motion is easily observed and exploitable, but because the sensor is so small in height, observing the motion in the x-direction is more difficult. If a swipe sensor is mounted vertically, then it will be easier to observe motion in the x-direction than in the y-direction due to the device's small width in this case. The invention is equally applicable to both cases. It is also equally applicable on a large fingerprint placement sensor, where movement in both the x- and y-directions is easily observable and can be exploited. Using the observations outlined above, embodiments of the present invention use the simpler linear correlation methods to determine rotational movement, which will occur when motion of the left side of a finger image is in an opposite direction to that of the right side. For instance, if the left half is moving upward and the right half downward, there is clockwise rotational movement as shown in Figure 3D, discussed below. If the linear movement detected at the left edge and right edge of the sensor is substantially equal but opposite, the center of rotation is at or near the center of the sensor. If the linear movement is not equal but opposite, the center of rotation can be calculated and will be closer to the end of the sensor with the smaller amount of linear movement. If both sides are moving in the same direction, as in Figure 3B, then likely only overall linear movement is observed.
It will be appreciated that the present invention can determine an angle of rotation even if the center of rotation is displaced, as when a finger slides along the sensor as it rotates. Accordingly, the present invention offers a reliable and computationally-efficient solution to obtain high-resolution rotational information about a user's finger as it contacts a finger imaging sensor, so that such a sensor can, for example, emulate a steering wheel for use in gaming, or rotate the image of a map for easier viewing on a display. Figure 1A shows a system 10 used to illustrate one embodiment of the present invention. The system 10 comprises a finger image sensor 25 coupled to a display device 11, displaying a triangular electronic image 15. The display device 11 can be a monitor used with a personal computer, a screen on a personal digital assistant (PDA) or game device, or any other kind of display device. Figure 1A also shows a finger 30 placed on the finger image sensor 25. It will be appreciated that the finger 30 has patterns on a surface contacting the finger image sensor 25 and that the finger image sensor 25 captures images of those patterns. A coordinate axis 20 makes an angle α0 with an edge 21 of the triangular image 15.
A coordinate axis 35 makes an angle β0 with a line segment 36 associated with the finger 30. Figure 1B shows the system 10 after the finger 30 has been rotated so that the coordinate axis 35 makes an angle β1 with the line segment 36. In accordance with the present invention, the triangular image 15 is rotated so that the coordinate axis 20 makes an angle α1 with the edge 21. Thus, in accordance with one embodiment, rotating the finger 30 through an angle β1 - β0
(Δβ) results in rotating the triangular electronic image through an angle α1 - α0 (Δα). It will be appreciated that Δα can correspond to Δβ in any number of ways. For example, Δα can equal Δβ, Δα can be a multiple of Δβ, Δα can be a fraction of Δβ, Δα can be a multiple of Δβ plus some offset, etc. It will also be appreciated that in accordance with one embodiment, the finger 30 does not have to maintain a pivot point on the finger image sensor 25. The finger 30 can be moved horizontally or vertically on the finger image sensor 25 before, while, or after it is rotated so that the line segment 36 is displaced horizontally, or vertically, or both, and the angle Δβ can still be determined. It will also be appreciated that vertical and horizontal movements of the finger 30 can also be captured to vertically and horizontally displace the triangular image 15 on the display device 11. It will also be appreciated that the triangular image 15 can be moved at a rate related to Δβ (called the rate mode) or at a rate related to β1. While the finger image sensor 25 is depicted as a placement sensor, it will be appreciated that other types of sensors can be used in accordance with the present invention.
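As a small illustration of the correspondence between Δβ and Δα described above, the mapping can be as simple as a gain and an offset. The function below is a hypothetical sketch; the parameter names are not taken from the patent.

```python
def map_finger_to_image_rotation(delta_beta, gain=1.0, offset=0.0):
    """Map a measured finger rotation (delta_beta, in degrees) to an image
    rotation (delta_alpha): gain=1 makes them equal, gain=2 a multiple,
    gain=0.5 a fraction, and a nonzero offset adds a fixed bias."""
    return gain * delta_beta + offset

# Example: a 10-degree finger rotation doubled to a 20-degree image rotation.
delta_alpha = map_finger_to_image_rotation(10.0, gain=2.0)
```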
Preferably, the finger image sensor 25 is a swipe sensor, described in more detail below. Figure 2 shows a typical fingerprint image, including physical ridges and valleys on the surface of the finger. The pattern of ridges and valleys has proven to be unique among very large populations of human beings, especially the ridge endings and bifurcations called "minutiae." Fingers also often have other measurable surface features such as pores, scars, and wrinkles. It is the overall pattern of features, not the unique individual features, that is tracked to measure the distance, rotation, direction, or speed that a fingerprint has moved on the finger image sensor. Many prior art electronic finger imaging sensors actively sense the entire surface of a fingerprint at the same time. Whether based on optical or electrical sensing methods, these sensors have a surface area at least as large as a typical person's fingertip pad (typically 15mm x 15mm). Using these sensors, the user simply places his finger on the sensor until the image is captured. These sensors, now known as placement sensors, contain rows and columns and can capture large images, typically ranging from 250-500 rows and 200-500 columns depending on the sensor's capabilities and size. Such devices are capable of sensing rotational input, and can indeed be used with the new invention to collect rotational information, but they are larger than today's more miniaturized finger sensors. The most promising of the miniaturized sensors is one that is fully sized in one direction (typically in width) but abbreviated in the other (typically height). This results in a sensor that is only capable of sensing a small rectangular portion of the fingerprint at any one time. Such smaller sensors are better suited for use with the present invention, not only because they are more appropriate for portable devices, but also because they produce smaller images. The smaller images have less data in them, making the computations less intense. While it is possible to ignore or mask off data from a larger sensor to make it resemble a smaller one, such an approach is not ideal, because it does not guarantee that the finger of the user is even touching the area of interest. With a swipe sensor, this is not an issue. Embodiments of the present invention can acquire rotational position data from any device capable of imaging the surface of a human finger or other patterned material and are therefore not limited to use with placement or swipe finger image sensors, which typically provide at least 250 dots per inch resolution. The present invention will also work with lower or higher resolution devices that may become available in the future. Figure 3A shows the left half 205A and the right half 205B of a finger image sensor 205. The finger image sensor 205 has placed upon it a finger identified by the fingerprint image 203 having identifiable ridge portions 201 and 202. Figure 3A also shows an x-coordinate axis and a y-coordinate axis, with the arrow of each axis pointing in a direction of increasing values for the respective axis. Figure 3A shows the fingerprint image 203 in a first orientation on the finger image sensor 205 and thus in a first orientation with respect to the x-coordinate axis and the y-coordinate axis. Figure 3B shows the fingerprint image after it has been moved linearly in a vertical direction, in a direction of decreasing values for the y-coordinate.
Figure 3C shows the fingerprint image 203 after it has been rotated counterclockwise on the finger image sensor 205. It is seen by comparing the orientations of the fingerprint image 203 from Figure 3B to Figure 3C that the y-coordinates of the identifiable portions 201 and 202 in the left half 205A have both decreased and that the y-coordinates in the right half 205B have both increased. Figure 3D shows the fingerprint image 203 after it has been rotated clockwise on the finger image sensor 205. It is seen by comparing the orientations of the fingerprint image 203 from Figure 3C to Figure 3D that the y-coordinates of the identifiable portions 201 and 202 in the left half 205A have both increased and that the y-coordinates in the right half 205B have both decreased. Figure 4 is a flowchart for an algorithm 210 for determining rotational movement or placement in accordance with a preferred embodiment of the present invention. In the start step 211, a user is prompted by an operating system or application executing on a host (not shown). Next, in the step 212, a finger image sensor (not shown) in accordance with the present invention is initialized, readied for reading data. This step 212 comprises powering the sensor on and making sure it is ready to capture a fingerprint image. Step 212 can also include setting contrast and brightness levels, setting the sensor to a desired data acquisition mode, calibrating the sensor, or otherwise initializing the sensor. It will be appreciated that step 212 is not required if the sensor has already been initialized. Next, in the step 213, a frame is read by the sensor at a rate supported by it or by the hardware platform's computing power and bandwidth. In the step 215, the properties of the frame are estimated to determine whether it is useful. The metrics of the frame are analyzed in the step 220 to determine whether the frame is useful. If the frame is useful, it is kept and processing continues in the step 225; otherwise, the frame is disregarded, and processing continues in the step 255. As described in more detail below, in a preferred embodiment the usefulness of a frame is determined by measuring image statistics such as the average value and the variance of pixel data in the frame. The usefulness of a frame is directly related to whether or not a finger is present on the sensor. It will be appreciated that the step 215 can be eliminated if a less efficient implementation is acceptable, or when the sensor only generates frames when a finger is present on it. In the step 225, the current frame (i.e., the frame most recently read and currently being processed) is correlated with the last stored useful frame. On the very first iteration, since there is no "last useful frame," the current frame is copied to the last useful frame. In a preferred embodiment, the frame is divided into a left half and a right half. It will be appreciated, however, that a frame can be divided into any number of parts. Next, in the step 230, the linear movement of the left half of the frame and the linear movement of the right half of the frame are both calculated. In the step 235, using the linear movement of the left half of the frame and the linear movement of the right half of the frame, the overall linear movement of the frame is calculated. This calculation is described in more detail below. In the step 240, the calculations made in the step 235 are used to calculate the rotational movement of the fingerprint image.
Next, in the step 245, the process checks whether there was any movement, linear or rotational, of the fingerprint image. If there was movement, the process continues in the step 250, otherwise it continues in the step 255. In the step 250, the last useful frame is updated, and in the step 251, the last useful frame is stored. Processing then continues in the step 225. In the step 255, the process checks whether more frames are to be acquired. If more frames are to be acquired, the process continues to the step 260, where a counter is incremented, and then continues on to the step 213. If no more frames are to be acquired, the process continues to the step 265, where it stops. As described above, the pixels for the current frame are correlated to the pixels of the last useful frame to determine the amount of rotational or linear motion. If overall linear movement has been calculated in the step 235, data corresponding to the movement are sent to whatever downstream process needs it. For example, a program (e.g., an application program, a device driver, or an operating system) can use the corresponding data to linearly position a pointer on a display screen. If any overall rotational movement has been calculated in the step 240, data corresponding to the movement are sent to whatever downstream process needs it. For example, a program can use the corresponding data to rotate an image on the display screen. Once it is determined that movement has occurred, the last useful frame is replaced by the current frame and the algorithm continues by acquiring new image data from the sensor. In a preferred embodiment, the system iterates in real time. Alternatively, the system stores all the frames in a memory buffer and calculates movement after multiple frames are acquired from the sensor. Preferably, the iteration halts when either the application or operating system tells it to stop. When the system is used as a pointing device for an operating system, the process can continue indefinitely. The algorithm starts whenever there is a need for rotational feedback, such as power up of a device or start of a game. The algorithm terminates when rotational information is no longer needed. In a preferred embodiment, the system executes on a computing device that is connected to a swipe fingerprint sensor 310 shown in Figure 5A. The swipe sensor 310 is mounted horizontally with respect to the x and y directions so that image frames as shown in Figure 6 are captured. The x direction is along the longest side of the sensor 310 while the y direction is perpendicular to the x direction. It will be appreciated that the sensor 310 can be mounted in any orientation. For consistency, the dimension of the sensor along the x-axis will always be referred to as the length of the sensor. Figure 5B shows a sensor 315 having a second orientation used in accordance with the present invention. The sensor 315 is mounted vertically with respect to the x and y directions. In the preferred embodiment, the fingerprint sensor (e.g., 310 or 315) provides a single frame of data to a program upon request. As described below, a single frame can be logically divided into any number of smaller sub-frames. Sensors that can provide more than one frame per request can also be used in accordance with the present invention. It will also be appreciated that fingerprint sensors in accordance with the present invention can be mounted at orientations other than those shown in Figures 5A and 5B.
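Tying the steps of algorithm 210 together, the outline below sketches one possible main loop. It is illustrative only: read_frame, is_useful, correlate_halves, and estimate_rotation are hypothetical helpers standing in for steps 213-240, and averaging the two halves for the overall linear motion is a simplification of the whole-row correlation actually described below.

```python
def track_motion(sensor, frames_to_acquire):
    """Illustrative outline of algorithm 210."""
    last_useful = None
    motions = []
    for _ in range(frames_to_acquire):                 # steps 255/260: frame counter
        frame = sensor.read_frame()                    # step 213: read a frame
        if not is_useful(frame):                       # steps 215/220: frame metrics
            continue                                   # disregard noise frames
        if last_useful is None:                        # very first useful frame
            last_useful = frame
            continue
        dy_left, dy_right = correlate_halves(last_useful, frame)   # steps 225/230
        dy = (dy_left + dy_right) / 2.0                # step 235 (simplified)
        dtheta = estimate_rotation(dy_left, dy_right)  # step 240
        if dy or dtheta:                               # step 245: any movement?
            last_useful = frame                        # steps 250/251: update stored frame
            motions.append((dy, dtheta))
    return motions                                     # step 265: done
```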
Typically, swipe sensors are capable of delivering anywhere from 250 to 3000 frames per second (the "frame rate"), depending on the sensor's capabilities, the interface used and the speed of the host personal computer. Figure 6 shows image data 400 captured by a finger image sensor in accordance with the present invention. The image data 400 comprises N rows by M columns of picture elements, or pixels, with each pixel typically represented by a single byte (8 bits) of data. M can be any positive value (100-300 is typical, depending on resolution) and N must be at least 2 (the typical number of rows in a frame is 8-32). Preferably, N=8 and M=192. A value of a pixel is a gray level, such that the image frame resembles a finger image when displayed on a computer monitor. Typically, this value ranges from 0 to 255, with 0 representing black and indicating the presence of a fingerprint ridge, and 255 representing white and indicating the presence of a valley. Other ranges of data and other representations of such data are possible without affecting the nature of the invention. It is possible, in alternative embodiments, to use the system with little or no modification, with other similar types of sensors, such as optical document scanners. In one embodiment, the system of the present invention executes in specialized hardware or firmware instead of in software. In another embodiment, part of the algorithm executes on a general-purpose CPU and other portions execute solely in hardware. Figure 7 shows a current frame and the last useful frame for several iterations of the algorithm 210 in Figure 4. Iteration 1, shown in column 401, shows a last useful frame 405 and a current frame 410. No rows of the frame 405 correspond to the frame 410, so no recognizable movement can be identified. Iteration 2, shown in column 402, shows a last useful frame 410 and a current frame 415. It will be appreciated that the last useful frame
410 corresponds to the current frame from the previous iteration, iteration 1. Row 0 of frame 410 (indicated by an arrow) corresponds to row 1 of frame 415 (also indicated by an arrow). Linear movement in a y direction is thus detected. Iteration 3, shown in column 403, shows a last useful frame 415 and a current frame 420. Frame 415 is identical to the frame 420, so zero motion is recognized. (I.e., the finger has not been moved on the finger image sensor.) In this example, the last useful frame does not have to be updated. Iteration 4, shown in column 404, shows a last useful frame 420 and a current frame 425. It will be appreciated that the last useful frame 420 corresponds to the current frame from iteration 3. Row 1 of frame 420 (indicated by a straight arrow) corresponds to row 2 of frame 425 (also indicated by a straight arrow). This vertical shift in rows (i.e., tracking images of fingerprint patterns as they, and hence a finger, move from row 1 to row 2) indicates a downward movement of a finger on a finger image sensor, referred to as movement in a positive y direction. Furthermore, column 2 of frame 420 (indicated by a squiggly arrow) corresponds to column 1 of frame 425 (also indicated by a squiggly arrow). This horizontal shift in columns (i.e., tracking images of fingerprint patterns as they, and hence a finger, move from column 2 to column 1) indicates a left movement of a finger on a finger image sensor, here labeled movement in a negative x-direction. In this example, because a positive y-movement and a negative x-movement have been detected, a clockwise rotation of the fingerprint image is recognized. It will be appreciated that the signs (positive or negative) assigned to a particular direction in an x-direction and a y-direction are arbitrarily chosen for the sake of explanation. The signs can be reversed. The algorithm 210 in Figure 4 is now described in more detail. As described above, the algorithm 210 iterates, requesting new frames of data (step 213) and correlating them to a previous image frame (step 225) stored in a buffer for just this purpose. At the i-th iteration of the algorithm 210, one frame of data is requested from the sensing device (e.g., a finger image sensor). Once the frame is collected, it is analyzed to determine its usefulness. If the frame is deemed useful, it is saved in local memory for later use; if it is deemed useless, it is simply disregarded. In the preferred embodiment, the usefulness of a frame is determined by ensuring the frame contains at least some finger image information. For instance, if a frame is collected when no finger is on the device, that frame likely will contain only noise or a blank image. This is done using rules based on measuring image statistics of the frame, namely the average value and the variance. Some sensors provide information on finger presence, and that can be used in systems where it is available, either by itself or in conjunction with the above statistics. Mathematically, if the pixel in the nth row and mth column is given by frame[n,m], then:
FrameAverage = Φ = [ Σ_{n=1..N} Σ_{m=1..M} frame[n,m] ] / (N × M)

FrameVariance = Ψ = [ Σ_{n=1..N} Σ_{m=1..M} (frame[n,m] - Φ)² ] / (N × M)
For the purposes of efficiency, the metrics (calculated in the step 215) can also be calculated using just portions of the frame (rather than the entire frame), where the portions are arbitrary sub-images or are obtained by skipping every pth pixel in the calculation. The frame is considered noise, and thereby disregarded, if: Φ ≥ Noise_average_threshold_high, or if Φ < Noise_average_threshold_low, or if Ψ ≤ Variance_average_threshold
In other words, if the average is above or below a certain level (e.g., a threshold), or if the variance is less than expected for a normal finger, it may indicate, depending on the sensor used, that no finger exists on the device at that moment. In the preferred embodiment, typical values of these thresholds are:
Noise_average_threshold_high = 240
Noise_average_threshold_low = 30
Variance_average_threshold = 10
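A direct transcription of this usefulness test, assuming the frame is available as an N x M NumPy array of gray levels (the function name is illustrative, not from the patent):

```python
import numpy as np

NOISE_AVERAGE_THRESHOLD_HIGH = 240
NOISE_AVERAGE_THRESHOLD_LOW = 30
VARIANCE_AVERAGE_THRESHOLD = 10

def is_useful(frame):
    """Return True when the frame appears to contain finger data, using the
    frame average (Phi) and frame variance (Psi) defined above."""
    phi = frame.mean()   # FrameAverage
    psi = frame.var()    # FrameVariance
    if phi >= NOISE_AVERAGE_THRESHOLD_HIGH or phi < NOISE_AVERAGE_THRESHOLD_LOW:
        return False     # too bright or too dark: likely no finger
    if psi <= VARIANCE_AVERAGE_THRESHOLD:
        return False     # too flat: likely noise or a blank image
    return True
```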
Note that other values can be used to tune the algorithm to the desired performance, and other more complicated combinations of the above statistics can also be used to determine the usefulness of a frame. Of course, other metrics, alone or in combination with the above, can also be used. Once the current frame has been found useful, it is next correlated to the last useful frame (stored in the step 251) to determine the finger movement, if any, that occurred. Once it is determined that finger movement has occurred, the last useful frame (step 251) is replaced by the current frame and the algorithm continues by acquiring a new frame from the sensor. In accordance with the present invention, a new frame ("cF") is correlated with an older one stored in memory ("oF"). Correlation is well known by any person skilled in the art, but it is described in more detail here to better explain the present invention. Standard cross-correlation SCC of row R of the last useful frame with row S of the current frame is mathematically expressed as: [Equation 1]
SCC_whole(R,S,d) = Σ_{m=d+1..M-d} ( oF[R,m] × cF[S,m-d] )
where d is referred to as the "lag" or "offset." The lag is related to the horizontal movement of the data in one frame with respect to the data in another, and can also be thought of as a velocity. Typically, the lag is -L <= d <= +L, where L is much less than M. All of the equations in this description are written assuming d >= 0 to keep the equations clear. It will be appreciated, however, that negative lag values can be processed by interchanging the column indices on oF and cF as shown below:
SCC_whole(R,S,d) = Σ_{m=|d|+1..M-|d|} ( oF[R, m-|d|] × cF[S,m] )
where |d| is the absolute value of d.
This interchange method is valid for all correlation equations in this document, not just SCC but also normalized cross-correlation NCC, discussed below. Though it is feasible to use standard correlation, the preferred embodiment uses a slightly modified version of the correlation called Normalized Cross Correlation NCC, defined in Equation 2 below, which is better suited to image registration tasks like fingerprint image reconstruction. Unlike standard correlation, NCC is invariant to changes in image intensity, has a range that is independent of the number of pixels used in the calculation, and is more accurate because it is less dependent on local properties of the image frames being correlated.
NCC_whole(R,S,d) = [ Σ_{m=d+1..M-d} { oF[R,m] × cF[S,m-d] } / (M-2d) - ( oFsum[R,d] / (M-2d) ) × ( cFsum[S,d] / (M-2d) ) ] / sqrt( [ Σ_{m=d+1..M-d} { oF[R,m] }² / (M-2d) - ( oFsum[R,d] / (M-2d) )² ] × [ Σ_{m=d+1..M-d} { cF[S,m-d] }² / (M-2d) - ( cFsum[S,d] / (M-2d) )² ] )

[Equation 2]

where:

oFsum[R,d] = Σ_{m=d+1..M-d} { oF[R,m] }
is the sum along the row R from column d+1 through column M-d, and
cFsum[S,d] = Σ_{m=1..M-2d} { cF[S,m] }
is the sum along row S from column 1 through column M-2d. The above equations are in terms of rows of each frame, but it is more general to think of "patches" of each frame, where a patch can be some subset or superset of a row. While the patches of each frame to be correlated can be any arbitrary set of pixels, the preferred embodiment uses a patch centered in the middle of each frame. While any row or subset of a row could be used, if the patch is too small, the statistical significance of the cross-correlation value will erode. Since the lag, or offset, of the information in the current frame to that in the last frame corresponds to an unknown amount of movement in the x-direction, NCC_whole(R,S,d) must typically be calculated for multiple values of d to find the one that corresponds to the best fit.
Therefore, in one embodiment:
PeakNCCwhole(R,S,L) = MAX{ NCC_whole(R,S,d) } for d = -L to d = L. [Equation 3]
dpeakwhole(R,S,L) = the value of d at which the above equation is satisfied.
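As an illustration of Equations 2 and 3, the sketch below computes the normalized cross-correlation of one row pair over a range of lags and returns the peak. Indexing is 0-based, negative lags are handled by interchanging the rows as described for Equation 1, and the helper names are not taken from the patent.

```python
import numpy as np

def ncc_row(of_row, cf_row, d):
    """Normalized cross-correlation (Equation 2) of a row of the last useful
    frame (of_row) against a row of the current frame (cf_row) at lag d."""
    if d < 0:                          # negative lag: interchange the rows
        of_row, cf_row, d = cf_row, of_row, -d
    m = len(of_row)
    a = of_row[d:m - d].astype(float)      # columns d+1 .. M-d (1-based)
    b = cf_row[:m - 2 * d].astype(float)   # columns 1 .. M-2d (1-based)
    cov = (a * b).mean() - a.mean() * b.mean()
    denom = np.sqrt(a.var() * b.var())
    return cov / denom if denom > 0 else 0.0

def peak_ncc(of_row, cf_row, max_lag=8):
    """Equation 3: search lags -L..L, return (PeakNCC, dpeak)."""
    best_val, best_lag = -np.inf, 0
    for d in range(-max_lag, max_lag + 1):
        val = ncc_row(of_row, cf_row, d)
        if val > best_val:
            best_val, best_lag = val, d
    return best_val, best_lag
```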
In the preferred embodiment, L = 8, but L should be chosen so that it is as large as the maximum x-velocity that can occur from frame to frame. A smaller L is more computationally efficient, but will produce inaccurate results if the finger shifts more than ±L from one frame to the next. In an alternative embodiment, L can be a function of the dpeakwhole from the previous iteration i-1. For example,
L(at iteration i) = dpeakwhole (at iteration i-1) + e, where e is typically equal to 1 or 2. In yet another embodiment, L can be a function of the row number in the frame (i.e. R and/or S). Also note that it is possible to use scaled versions of the NCC equations so that floating-point operations can be avoided, and that for computing purposes it is also possible to use NCC-squared to avoid an expensive square-root operation. The PeakNCCwhole corresponds to the correlation coefficient of the best fit, while dpeakwhole corresponds to the amount of movement in the x direction. A method to obtain the amount of motion in the y direction is described below. The NCC calculation in Equation 2 below can be restated, for last frame oF and current frame cF,
NCC_whole(R,S,d) = [ Σ_{m=d+1..M-d} { oF[R,m] × cF[S,m-d] } / (M-2d) - ( oFsum[R,d] / (M-2d) ) × ( cFsum[S,d] / (M-2d) ) ] / sqrt( [ Σ_{m=d+1..M-d} { oF[R,m] }² / (M-2d) - ( oFsum[R,d] / (M-2d) )² ] × [ Σ_{m=d+1..M-d} { cF[S,m-d] }² / (M-2d) - ( cFsum[S,d] / (M-2d) )² ] )

[Equation 2]
as:
NCC_whole(R,S,d) = [ (M-2d) Σ_{m=d+1..M-d} { oF[R,m] × cF[S,m-d] } - oFsum[R,d] × cFsum[S,d] ] / sqrt( [ (M-2d) Σ_{m=d+1..M-d} { oF[R,m] }² - { oFsum[R,d] }² ] × [ (M-2d) Σ_{m=d+1..M-d} { cF[S,m-d] }² - { cFsum[S,d] }² ] )
where the numerator and denominator have both been multiplied by (M-2d)² to make it simpler to compute and understand. This can be separated into left and right halves of each row as:
NCC_whole(R,S,d) = (A - B) / sqrt(C × D)

[Equation 4]

where

A = (M-2d) × [ Σ_{m=d+1..M/2} { oF[R,m] × cF[S,m-d] } + Σ_{m=M/2+1..M-d} { oF[R,m] × cF[S,m-d] } ],

B = [ oFsumleft[R,d] + oFsumright[R,d] ] × [ cFsumleft[S,d] + cFsumright[S,d] ],

C = (M-2d) × [ Σ_{m=d+1..M/2} { oF[R,m] }² + Σ_{m=M/2+1..M-d} { oF[R,m] }² ] - [ oFsumleft[R,d] + oFsumright[R,d] ]²,

and

D = (M-2d) × [ Σ_{m=d+1..M/2} { cF[S,m-d] }² + Σ_{m=M/2+1..M-d} { cF[S,m-d] }² ] - [ cFsumleft[S,d] + cFsumright[S,d] ]²,
where:
oFsumleft[R,d] = Σ_{m=d+1..M/2} { oF[R,m] }

is the sum along row R from column d+1 through column M/2,

oFsumright[R,d] = Σ_{m=M/2+1..M-d} { oF[R,m] }

is the sum along row R from column M/2+1 through column M-d,

cFsumleft[S,d] = Σ_{m=1..M/2-d} { cF[S,m] }

is the sum along row S from column 1 through column M/2-d, and

cFsumright[S,d] = Σ_{m=M/2-d+1..M-2d} { cF[S,m] }

is the sum along row S from column M/2-d+1 through column M-2d. Furthermore, the NCC for the left and right halves of each row can be determined using:
NCC_left(R,S,d) = [ (M/2-d) Σ_{m=d+1..M/2} { oF[R,m] × cF[S,m-d] } - oFsumleft[R,d] × cFsumleft[S,d] ] / sqrt( [ (M/2-d) Σ_{m=d+1..M/2} { oF[R,m] }² - { oFsumleft[R,d] }² ] × [ (M/2-d) Σ_{m=d+1..M/2} { cF[S,m-d] }² - { cFsumleft[S,d] }² ] )

[Equation 5a]

NCC_right(R,S,d) = [ (M/2-d) Σ_{m=M/2+1..M-d} { oF[R,m] × cF[S,m-d] } - oFsumright[R,d] × cFsumright[S,d] ] / sqrt( [ (M/2-d) Σ_{m=M/2+1..M-d} { oF[R,m] }² - { oFsumright[R,d] }² ] × [ (M/2-d) Σ_{m=M/2+1..M-d} { cF[S,m-d] }² - { cFsumright[S,d] }² ] )

[Equation 5b] In addition, the PeakNCC for the left and right sides is given by:
PeakNCCleft(R,S,L) = MAX{ NCC_left(R,S,d) } for d = -L to d = L. [Equation 6a]
dpeakleft(R,S,L) = the value of d at which the above equation is satisfied.

PeakNCCright(R,S,L) = MAX{ NCC_right(R,S,d) } for d = -L to d = L. [Equation 6b]
dpeakright(R,S,L) = the value of d at which the above equation is satisfied.
These equations allow the left and right sides of the sensor array to be treated separately, and efficiently determine the rotational movement as described below. Once all the NCC terms for the left and right sides are calculated in Equations 5a and 5b, only a few addition and division operations are required to calculate NCC_whole for the entire sensing array using Equation 4. Then, overall linear motion can be calculated using Equation 3 as before, while rotational movement is calculated using the linear motion for the left and right sides of the sensing array. Using PeakNCC(R,S,L) defined above in Equation 3 or Equations 6a and 6b, the calculation of exact x and y motion is straightforward. The last useful frame at iteration i has rows numbered 1 through N, as shown in Figure 6, where N is the number of rows supplied by the sensor. Similarly, the current frame to be processed has rows 1 through N. For a given row R in the last frame, PeakNCC and dpeak are calculated as in Equations
3, 6a and 6b with respect to rows 1 through N of the current frame, and the dpeak that corresponds to the maximum PeakNCC is taken. Preferably, this is done for two values of R: R=1 and R=N. In this way, both upward and downward motion in the y direction can be determined while maximizing the speed at which a user can move his finger. It is also possible to choose only one R, at R=N/2 (or very near the middle row). However, that is sub-optimal. It is also possible to choose values other than 1 or N, such as R=2 and R=N-1, which may be advantageous for accuracy reasons since they are not on the edge of the sensor array. Table 1 shows the pseudo-code for performing a single frame iteration for a given value of R. Although the calculations are carried out separately for the left side, right side, and whole row, only the generic case is described by the pseudo-code in Table 1.
Step 0: START
Step 1: Set R = 1 ///// start with R=1 and then do R=N
Step 2: Set L = 8 ///// lag value fixed at 8 columns
Step 3: Set bestS = 0 ///// initialize
Step 4: Set MaxPeakNCC = 0 ///// initialize
Step 5: Set dpeakMax = 0 ///// initialize
Step 6: Loop from currentRow = 1 through N // loop over N rows of current frame
{
Step 6.1: set tmp = PeakNCC(R, currentRow, L)
Step 6.2: if tmp greater than MaxPeakNCC then
{
Step 6.2.1: set MaxPeakNCC = tmp // keep largest
Step 6.2.2: set dpeakMax = dpeak(R, currentRow, L)
Step 6.2.3: set bestS = currentRow
Step 6.2.4: set bestR = R
}
}
Step 7: if R equals 1 then
Step 7.1: set R = N
Step 7.2: Go to Step 2
Step 8: STOP
Table 1
The pseudo-code in Table 1 can be summarized as:
MaxPeakNCC = NCC_whole(bestR, bestS, dpeak(bestR, bestS, L)) [Equation 6c]
dpeakMax = dpeak(bestR, bestS, L) [Equation 6d]
Thus, after the above calculations, the following information is obtained:
1. MaxPeakNCC, the value of the correlation of the best pair of matching rows in the current frame and the last frame
2. bestR, the row of the last frame that results in MaxPeakNCC
3. bestS, the row of the current frame that results in MaxPeakNCC
4. dpeakMax, the correlation lag where MaxPeakNCC is reached
bestR and bestS are the pair of rows that yield the highest correlation value (MaxPeakNCC). Typically, MaxPeakNCC will be close to 1.0 (the closer to 1.0, the stronger the correlation), but if the finger being analyzed is moved too quickly, it is possible that the current frame does not have any rows in common with the last frame (i.e. a non-overlapping case). Therefore, MaxPeakNCC must be checked to ensure that it is large enough. Using the above information, the following calculations are performed:
if MaxPeakNCC > corr_threshold:
1. Δx(i) = dpeakMax, which is the x-velocity at iteration i
2. Δy(i) = bestS - bestR, which is the y-velocity at iteration i
otherwise:
3. Δx(i) = 0
4. Δy(i) = 0
where corr_threshold is used to make sure the correlation is high enough to indicate an overlap at all. Preferably, corr_threshold = 0.75, but other values can be used to tune the performance of the algorithm. If the correlation is below a threshold, it is impossible to determine actual values for the x- and y-velocities, so the algorithm simply outputs no motion vector. However, in accordance with alternative embodiments, other values can be output as the default, such as maximum movements of N rows and M columns, or combinations thereof. In any case, after the x and y motion for the left side, right side, and whole rows for the current iteration i, denoted by Δxleft(i) and Δyleft(i); Δxright(i) and Δyright(i); Δxwhole(i) and Δywhole(i), respectively, have been calculated, the rotational movement Δtheta(i) can be determined. The Δxwhole(i) and Δywhole(i) are made available to the host requiring the rotational information, and represent the overall linear x- and y-motion. Table 2 shows the pseudo code for determining rotational movement. The pseudo code continues iterating until told to stop by the application program or operating system using the rotational data.
Step 0: START
Step 1: Set cumulativeDelYleft(0) = 0
Step 2: Set cumulativeDelYright(0) = 0
Step 3: Loop from iteration i = 1 through infinity
{
Step 3.1: Acquire a frame and calculate Δyleft(i) and Δyright(i)
Step 3.2: Set cumulativeDelYleft(i) = cumulativeDelYleft(i-1) + Δyleft(i)
Step 3.3: Set cumulativeDelYright(i) = cumulativeDelYright(i-1) + Δyright(i)
Step 3.4: If absolute value (cumulativeDelYleft(i) - cumulativeDelYright(i)) >= THRESH, then
Step 3.5: Set Δtheta(i) = cumulativeDelYleft(i) - cumulativeDelYright(i)
Step 3.6: Set cumulativeDelYleft(i) = 0
Step 3.7: Set cumulativeDelYright(i) = 0
Else:
Step 3.8: Set Δtheta(i) = 0
}
where THRESH can be any value >= 1. Preferably, THRESH = 1.
Table 2
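A compact rendering of the Table 2 logic follows. The expression in Step 3.5 is garbled in the source; the sketch assumes it is the difference of the two accumulators, and all names are illustrative.

```python
def update_rotation(state, dy_left, dy_right, thresh=1):
    """One iteration of Table 2.  `state` is a two-element list holding
    [cumulativeDelY_left, cumulativeDelY_right]; returns delta_theta."""
    state[0] += dy_left
    state[1] += dy_right
    if abs(state[0] - state[1]) >= thresh:      # Step 3.4
        delta_theta = state[0] - state[1]       # Step 3.5 (assumed difference)
        state[0] = state[1] = 0                 # Steps 3.6 / 3.7
        return delta_theta
    return 0                                    # Step 3.8
```

For a vertically mounted sensor, the same routine would simply be fed the x-direction deltas instead, as noted next.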
It will be appreciated that there are alternative ways to compute the rotational delta, including arbitrary functions of the Δyleft(i) and Δyright(i). For alternative mountings of the sensor, where the x and y directions are transposed, Δxleft(i) and Δxright(i) are used in the pseudo code in Table 2. On full size placement sensors, more accuracy can be achieved using both Δxleft(i), Δxright(i) and Δyleft(i), Δyright(i). This is achieved in one embodiment by calculating a Δtheta(i) using the pseudo code in Table 2 with the Δx values and again with the Δy values. The resulting two estimated values can be averaged together or otherwise combined to form the final Δtheta(i). The Δtheta(i) are made available to the host application and/or operating system. In other embodiments, the standard cross correlation given in Equation 1 is used instead of Normalized Cross Correlation. It is straightforward to split Equation 1 into terms from the left and right sides of each row.

SCC_whole(R,S,d) = Σ_{m=d+1..M/2} ( oF[R,m] × cF[S,m-d] ) + Σ_{m=M/2+1..M-d} ( oF[R,m] × cF[S,m-d] )
[Equation 7] In this case, which is much simpler than the NCC case in the preferred embodiment, the SCC value for the entire row is simply the sum of the correlation values of each half. It will also be appreciated that the maximum standard cross-correlation between a row S of a frame and a row R of the last useful frame can be given by other expressions. For example, weighted or voting schemes can be used. In one embodiment,
PeakSCCwhole(R,S,L) = Weighted_MAX{ SCC_whole(R,S,d) } for d = -L to d = L
[Equation 8]
where Weighted_MAX is a function that assigns a plurality of predetermined weights to its elements before generating a value, dpeakwhole(R,S,L) is the value of d at which Equation 8 is satisfied, and L is approximately equal to the maximum horizontal speed from the last useful frame to the current frame. While the preferred embodiment splits each row into a left and right side of equal length, alternative embodiments use any arbitrary division of each row, including more than
2 equal parts instead of 2, and also using divisions that are of differing length. It is also not necessary to have each division touch the next. Figures 8-12 show some of the possibilities, each with its own advantages. For example, Figure 8 shows a fingerprint sensor having a left half (sub-frame) 605 contiguous with a right half 610. Figure 9 shows a fingerprint sensor having a first section
(sub-frame) 615, a second section 620, a third section 625, and a fourth section 630. Section 615 is contiguous with section 620, section 620 is contiguous with section 625, and section 625 is contiguous with section 630. Figure 10 shows a finger image sensor having two sections 631 and 632 that are not contiguous. Figure 11 shows a finger image sensor having four sections 635, 640, 645, and 650, none of which are contiguous. And Figure 12 shows a finger image sensor having a section 655 contiguous with a section 660, and a section 665 contiguous with a section 670. It will be appreciated that other configurations are also possible, each preferably having more than a single division and using linear correlation to determine rotational movement. The number and configurations of the sections can be chosen based on production cost, surface area of the finger image sensor, computational algorithms, processor configuration, speed required, and other criteria. In other embodiments, it is desirable to modify the raw values Δxwhole(i), Δywhole(i), and Δtheta(i) before sending them to the host. These types of modifications involve three different mathematical transformations, generically called filtering, where the transformed output is denoted by the ' notation:
1. Scaling: applying a linear or non-linear scaling factor to multiply the original movement values to obtain scaled versions more appropriate to the displayed coordinate system of the host. An example is to multiply all values by a factor of 2. Another example is to multiply x movement by a factor of 2 and y movement by a factor of 0.5.
2. Smoothing: applying a smoothing (low-pass) filter to successive movement values in order to make the values less jagged over time. Examples are:
Linear average: Δx'whole(i) = [Δxwhole(i) + Δxwhole(i-1) + Δxwhole(i-2)] / 3
Exponential average: Δtheta'(i) = [Δtheta'(i-1) + Δtheta(i)] / 2
3. Clipping: limiting values of the movement to arbitrary values. An example is: If Δy(i) > T, then Δy'(i) = T. Otherwise, Δy'(i) = Δy(i).
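A minimal sketch combining the three operations on a history of raw movement values follows; the parameter values are arbitrary examples, not values prescribed by the patent.

```python
def filter_motion(raw_values, scale=2.0, clip=None):
    """Scale the raw movement values, smooth them with a 3-sample linear
    average, and optionally clip the result to the range [-clip, clip]."""
    if not raw_values:
        return 0.0
    scaled = [scale * v for v in raw_values]        # 1. scaling
    window = scaled[-3:]                            # 2. smoothing (linear average)
    smoothed = sum(window) / len(window)
    if clip is not None:                            # 3. clipping
        smoothed = max(-clip, min(clip, smoothed))
    return smoothed
```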
These operations can be combined in many ways in accordance with the present invention. Here, "filtering" is used to describe the function of processing an input in a well-defined way to produce an output. Those skilled in the art will recognize that correlation is computationally intensive. Accordingly, in one embodiment, the calculation of Δtheta(i) and/or the Δx(i), Δy(i) for the left, right, and whole array is performed on a separate processor or dedicated hardware. In this embodiment, the hardware can be integrated into the silicon fingerprint sensor itself.
The hardware performing the correlation must have access to the current frame and the last useful frame, both of which can be stored in memory on the device. If, for example, this is integrated into the finger image sensor, such a device would obviously have access to the current frame (since the device itself created it), and it could save the last useful frame in volatile memory registers. In such a case the device would also need to determine whether a frame is useful or not, using the method described here in the preferred embodiment. In such an embodiment the host computing device is not necessary. Obviously, such a hardware implementation could also be used to reconstruct fingerprint images since doing so only requires the Δx(i) and Δy(i) for the whole array. It will be readily apparent to one skilled in the art that various modifications may be made to the embodiments without departing from the spirit and scope of the invention as defined by the appended claims.

Claims

CLAIMS: I claim:
1. A method of obtaining rotation information, the method comprising: a. capturing a plurality of patterned images from a plurality of locations; b. correlating the plurality of patterned images to generate sets of linear differences; and c. using the sets of linear differences to generate the rotation information.
2. The method of claim 1, wherein the plurality of locations comprise a first part of a sensor and a second part of the sensor.
3. The method of claim 2, wherein a first of the plurality of patterned images is captured in the first part of the sensor and a second of the plurality of patterned images is captured in the second part of the sensor.
4. The method of claim 3, wherein the sensor is a biometric image sensor.
5. The method of claim 4, wherein the biometric image sensor is a finger image sensor.
6. The method of claim 5, wherein the first of the plurality of patterned images and the second of the plurality of patterned images together correspond to a fingerprint image in a first position on the sensor.
7. The method of claim 6, wherein a third of the plurality of patterned images is captured in the first part of the sensor and a fourth of the plurality of patterned images is captured in the second part of the sensor.
8. The method of claim 7, wherein the third of the plurality of patterned images and the fourth of the plurality of patterned images together correspond to the fingerprint image in a second position on the sensor.
9. The method of claim 8, wherein the rotation information corresponds to an angular difference between the first position and the second position.
10. The method of claim 9, wherein correlating the plurality of patterned images comprises: a. correlating the first patterned image with the third patterned image to generate a first set of linear differences from the sets of linear differences; b. correlating the second patterned image with the fourth patterned image to generate a second set of linear differences from the sets of linear differences; and c. correlating a first combination of the first patterned image and the second patterned image with a second combination of the third patterned image and the fourth patterned image to generate a third set of linear differences from the sets of linear differences.
11. The method of claim 10, wherein correlating the first patterned image with the third patterned image, correlating the second patterned image with the fourth patterned image, and correlating the first combination with the second combination all comprise performing a cross correlation.
12. The method of claim 11, wherein the cross correlation is a normalized cross correlation.
13. The method of claim 11, wherein the cross correlation is a standardized cross correlation.
14. The method of claim 2, wherein the first part of the sensor and the second part of the sensor are contiguous.
15. The method of claim 2, wherein the first part of the sensor and the second part of the sensor are not contiguous.
16. The method of claim 11, wherein the first part of the sensor comprises a first sub-frame of pixels and the second part of the sensor comprises a second sub-frame of pixels.
17. The method of claim 16, wherein capturing the first patterned image comprises storing in the first sub-frame first data corresponding to the first patterned image, capturing the second patterned image comprises storing in the second sub-frame second data corresponding to the second patterned image, capturing the third patterned image comprises storing in the first sub-frame third data corresponding to the third patterned image, and capturing the fourth patterned image comprises storing in the second sub-frame fourth data corresponding to the fourth patterned image.
18. The method of claim 17, wherein correlating the first patterned image with the third patterned image comprises correlating the first data with the third data to generate first and second linear differences from the first set of linear differences.
19. The method of claim 18, wherein correlating the second patterned image with the fourth patterned image comprises correlating the second data with the fourth data to generate first and second linear differences from the second set of linear differences.
20. The method of claim 19, wherein correlating the first combination with the second combination comprises correlating a combination of the first data and the second data with a combination of the third data and the fourth data to generate first and second linear differences from the third set of linear differences.
21. The method of claim 20, wherein correlating comprises determining a lag to correlate elements of one of the first and the second sub-frames, the lag and a difference between the elements corresponding to first and second linear differences from one of the sets of linear differences.
22. The method of claim 21, wherein each element corresponds to a row of one of the first and the second sub-frames.
23. The method of claim 21, wherein each element corresponds to a column of one of the first and the second sub-frames.
24. The method of claim 11, further comprising filtering the first set of linear differences, the second set of linear differences, the third set of linear differences, and the rotation information.
25. The method of claim 24, wherein filtering comprises multiplying by a scaling factor.
26. The method of claim 25, wherein filtering further comprises performing a smoothing function.
27. The method of claim 26, wherein filtering further comprises performing a clipping function.
28. The method of claim 10, wherein the finger image sensor is a finger placement sensor.
29. The method of claim 10, wherein the finger image sensor is a finger swipe sensor.
30. The method of claim 9, further comprising using the rotation information on a host platform having a display screen, the rotation information used to rotate an object on the display screen, thereby emulating a computer input device.
31. The method of claim 30, wherein the computer input device is selected from the group consisting of a steering wheel, a joystick, and a navigation bar.
32. The method of claim 30, wherein emulating a computer input device comprises moving the object on the display screen at a rate related to the angular difference.
33. A system for obtaining rotation information, the system comprising: a. means for capturing a plurality of patterned images from a plurality of locations; and b. means for correlating the plurality of patterned images to generate sets of linear differences and for using the sets of linear differences to generate the rotation information.
34. The system of claim 33, wherein the means for capturing comprises a sensor having a first part and a second part.
35. The system of claim 34, wherein the sensor is a biometric image sensor.
36. The system of claim 35, wherein the biometric image sensor is a finger image sensor.
37. The system of claim 36, wherein the first part of the sensor is configured to capture a first of the plurality of patterned images and the second part of the sensor is configured to capture a second of the plurality of patterned images.
38. The system of claim 37, wherein the first of the plurality of patterned images and the second of the plurality of patterned images together correspond to a fingerprint image in a first position.
39. The system of claim 38, wherein the first part of the sensor is further configured to capture a third of the plurality of patterned images and the second part of the sensor is further configured to capture a fourth of the plurality of patterned images.
40. The system of claim 39, wherein the third of the plurality of patterned images and the fourth of the plurality of patterned images together correspond to the fingerprint image in a second position.
41. The system of claim 40, wherein the rotation information corresponds to an angular difference between the first position and the second position.
42. The system of claim 40, wherein correlating the plurality of patterned images comprises: a. correlating the first patterned image with the third patterned image to generate a first set of linear differences from the sets of linear differences; b. correlating the second patterned image with the fourth patterned image to generate a second set of linear differences from the sets of linear differences; and c. correlating a first combination of the first patterned image and the second patterned image with a second combination of the third patterned image and the fourth patterned image to generate a third set of linear differences from the sets of linear differences.
43. The system of claim 42, wherein correlating the first patterned image with the third patterned image, correlating the second patterned image with the fourth patterned image, and correlating the first combination with the second combination all comprise performing a cross correlation.
44. The system of claim 43, wherein the cross correlation is a normalized cross correlation.
45. The system of claim 43, wherein the cross correlation is a standardized cross correlation.
46. The system of claim 34, wherein the first part of the sensor and the second part of the sensor are contiguous.
47. The system of claim 34, wherein the first part of the sensor and the second part of the sensor are not contiguous.
48. The system of claim 43, wherein the first part of the sensor comprises a first sub-frame of pixels and the second part of the sensor comprises a second sub-frame of pixels.
49. The system of claim 48, wherein capturing the first patterned image comprises storing in the first sub-frame first data corresponding to the first patterned image, capturing the second patterned image comprises storing in the second sub-frame second data corresponding to the second patterned image, capturing the third patterned image comprises storing in the first sub-frame third data corresponding to the third patterned image, and capturing the fourth patterned image comprises storing in the second sub-frame fourth data corresponding to the fourth patterned image.
50. The system of claim 49, wherein correlating the first patterned image with the third patterned image comprises correlating the first data with the third data to generate first and second linear differences from the first set of linear differences.
51. The system of claim 50, wherein correlating the second patterned image with the fourth patterned image comprises correlating the second data with the fourth data to generate first and second linear differences from the second set of linear differences.
52. The system of claim 51, wherein correlating the first combination with the second combination comprises correlating a combination of the first data and the second data with a combination of the third data and the fourth data to generate first and second linear differences from the third set of linear differences.
53. The system of claim 52, wherein correlating comprises determining a lag to correlate elements of one of the first and second sub-frames, the lag and a difference between the elements corresponding to the first and second linear differences from one of the sets of linear differences.
54. The system of claim 53, wherein each element corresponds to a row of one of the first and second sub-frames.
55. The system of claim 53, wherein each element corresponds to a column of one of the first and second sub-frames.
56. The system of claim 33, wherein the means for correlating is further configured for filtering the first set of linear differences, the second set of linear differences, the third set of linear differences, and the rotation information.
57. The system of claim 56, wherein filtering comprises multiplying by a scaling factor.
58. The system of claim 57, wherein filtering further comprises performing a smoothing function.
59. The system of claim 58, wherein filtering further comprises performing a clipping function.
60. The system of claim 33, wherein the means for capturing comprises a finger placement sensor.
61. The system of claim 33, wherein the means for capturing comprises a finger swipe sensor.
62. The system of claim 41, further comprising a host device coupled to the means for correlating, the host device having a display screen and configured to receive the rotation information and use the rotation information to control an object on the display screen, thereby emulating a computer input device.
63. The system of claim 62, wherein the computer input device is selected from the group consisting of a steering wheel, a joystick, and a navigation bar.
64. The system of claim 63, wherein the host device is a portable device.
65. The system of claim 64, wherein the portable device is a device selected from the group consisting of a personal computer, a portable telephone, a portable electronic game device, and a digital camera.
66. The system of claim 62, wherein emulating a computer input device comprises moving the object on the display screen at a rate proportional to the angular difference.
67. The system of claim 41, further comprising a host device integral with at least one of the means for capturing and the means for correlating.
68. A method of emulating a rotational device using a pattern, the method comprising: a. capturing a first image of the pattern at a first orientation; b. capturing a second image of the pattern at a second orientation; c. correlating the first image with the second image to calculate linear differences between the first orientation and the second orientation; d. translating the linear differences into rotational data; and e. using the rotational data to emulate the movement of a rotational device.
69. The method of claim 68, wherein the image comprises a fingerprint image.
70. The method of claim 69, wherein the rotational device is further configured to emulate a linear positioning device.
71. A system for emulating a positional device, the system comprising: a. a sensor for capturing an image of a pattern; and b. a processor coupled to the sensor, the processor configured to calculate linear differences between a first position of the image of the pattern and a second position of the image of the pattern and to translate the linear differences into rotational data corresponding to a rotation of the image of the pattern.
72. A method of sensing rotation of an object on an image sensor comprising: a. sensing a first image of the object; b. sensing a second image of the object; and c. comparing the first image with the second image to determine whether there is linear motion in each of at least two portions of an area containing the first image and the second image to determine whether the object remained stationary, moved in a linear manner, or rotated.
PCT/US2004/025528 2003-08-22 2004-08-05 System for and method of generating rotational inputs WO2005022458A2 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP04780370A EP1661085A2 (en) 2003-08-22 2004-08-05 System for and method of generating rotational inputs
JP2006524682A JP2007519064A (en) 2003-08-22 2004-08-05 System and method for generating rotational input information

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US49704503P 2003-08-22 2003-08-22
US60/497,045 2003-08-22
US10/912,655 2004-08-04
US10/912,655 US7587072B2 (en) 2003-08-22 2004-08-04 System for and method of generating rotational inputs

Publications (2)

Publication Number Publication Date
WO2005022458A2 true WO2005022458A2 (en) 2005-03-10
WO2005022458A3 WO2005022458A3 (en) 2007-01-25

Family

ID=34198224

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2004/025528 WO2005022458A2 (en) 2003-08-22 2004-08-05 System for and method of generating rotational inputs

Country Status (4)

Country Link
US (1) US7587072B2 (en)
EP (1) EP1661085A2 (en)
JP (1) JP2007519064A (en)
WO (1) WO2005022458A2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006136644A1 (en) 2005-06-23 2006-12-28 Nokia Corporation Method and program of controlling electronic device, electronic device and subscriber equipment
JP2007000648A (en) * 2005-04-22 2007-01-11 Hitachi Omron Terminal Solutions Corp Biometrics authentication apparatus, terminal equipment, and consumer transaction facility
JP2007183901A (en) * 2005-12-30 2007-07-19 Altek Corp Method for processing moving image

Families Citing this family (51)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7190251B2 (en) * 1999-05-25 2007-03-13 Varatouch Technology Incorporated Variable resistance devices and methods
US7474772B2 (en) * 2003-06-25 2009-01-06 Atrua Technologies, Inc. System and method for a miniature user input device
US7161585B2 (en) * 2003-07-01 2007-01-09 Em Microelectronic-Marin Sa Displacement data post-processing and reporting in an optical pointing device
US7697729B2 (en) * 2004-01-29 2010-04-13 Authentec, Inc. System for and method of finger initiated actions
US20050179657A1 (en) * 2004-02-12 2005-08-18 Atrua Technologies, Inc. System and method of emulating mouse operations using finger image sensors
FR2869723A1 (en) * 2004-04-29 2005-11-04 Thomson Licensing Sa NON-CONTACT TRANSITION ELEMENT BETWEEN A WAVEGUIDE AND A MOCRORUBAN LINE
JP4471761B2 (en) * 2004-07-26 2010-06-02 任天堂株式会社 GAME PROGRAM, GAME DEVICE, AND INPUT DEVICE
JP3734819B1 (en) * 2004-07-26 2006-01-11 任天堂株式会社 GAME PROGRAM, GAME DEVICE, AND INPUT DEVICE
JP2006107366A (en) * 2004-10-08 2006-04-20 Fujitsu Ltd Living body information input device, living body authentication device, living body information processing method, living body information processing program and computer readable recording medium with the program recorded thereon
US7831070B1 (en) 2005-02-18 2010-11-09 Authentec, Inc. Dynamic finger detection mechanism for a fingerprint sensor
US8231056B2 (en) * 2005-04-08 2012-07-31 Authentec, Inc. System for and method of protecting an integrated circuit from over currents
WO2006126310A1 (en) * 2005-05-27 2006-11-30 Sharp Kabushiki Kaisha Display device
US7505613B2 (en) * 2005-07-12 2009-03-17 Atrua Technologies, Inc. System for and method of securing fingerprint biometric systems against fake-finger spoofing
US20070061126A1 (en) * 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US7940249B2 (en) * 2005-11-01 2011-05-10 Authentec, Inc. Devices using a metal layer with an array of vias to reduce degradation
US7809211B2 (en) * 2005-11-17 2010-10-05 Upek, Inc. Image normalization for computed image construction
US7684953B2 (en) * 2006-02-10 2010-03-23 Authentec, Inc. Systems using variable resistance zones and stops for generating inputs to an electronic device
US7885436B2 (en) * 2006-07-13 2011-02-08 Authentec, Inc. System for and method of assigning confidence values to fingerprint minutiae points
US20080013805A1 (en) * 2006-07-17 2008-01-17 Authentec, Inc. Finger sensing device using indexing and associated methods
US9235274B1 (en) 2006-07-25 2016-01-12 Apple Inc. Low-profile or ultra-thin navigation pointing or haptic feedback device
WO2008127752A2 (en) 2007-01-25 2008-10-23 Magna Electronics Radar sensing system for vehicle
US8494234B1 (en) * 2007-03-07 2013-07-23 MotionDSP, Inc. Video hashing system and method
GB0711834D0 (en) * 2007-06-19 2007-07-25 Innometriks Ltd Methods of and apparatus for forming a blometric image
US20080317306A1 (en) * 2007-06-19 2008-12-25 Robin Hamilton Methods of and apparatus for forming a biometric image
US9785330B1 (en) 2008-02-13 2017-10-10 Apple Inc. Systems for and methods of providing inertial scrolling and navigation using a fingerprint sensor calculating swiping speed and length
US8634604B2 (en) * 2008-05-05 2014-01-21 Sonavation, Inc. Method and system for enhanced image alignment
JP5053177B2 (en) * 2008-05-23 2012-10-17 ラピスセミコンダクタ株式会社 Image processing device
US8775023B2 (en) 2009-02-15 2014-07-08 Neanode Inc. Light-based touch controls on a steering wheel and dashboard
US8773390B1 (en) * 2009-04-24 2014-07-08 Cypress Semiconductor Corporation Biometric identification devices, methods and systems having touch surfaces
US8421890B2 (en) 2010-01-15 2013-04-16 Picofield Technologies, Inc. Electronic imager using an impedance sensor grid array and method of making
US8866347B2 (en) 2010-01-15 2014-10-21 Idex Asa Biometric image sensing
US8791792B2 (en) 2010-01-15 2014-07-29 Idex Asa Electronic imager using an impedance sensor grid array mounted on or about a switch and method of making
US9092125B2 (en) * 2010-04-08 2015-07-28 Avaya Inc. Multi-mode touchscreen user interface for a multi-state touchscreen device
US20120092279A1 (en) 2010-10-18 2012-04-19 Qualcomm Mems Technologies, Inc. Touch sensor with force-actuated switched capacitor
JP5815932B2 (en) * 2010-10-27 2015-11-17 Kyocera Corporation Electronics
KR101160681B1 (en) 2011-10-19 2012-06-28 배경덕 Method, mobile communication terminal and computer-readable recording medium for operating specific function when activating mobile communication terminal
US20130279769A1 (en) 2012-04-10 2013-10-24 Picofield Technologies Inc. Biometric Sensing
US9024910B2 (en) 2012-04-23 2015-05-05 Qualcomm Mems Technologies, Inc. Touchscreen with bridged force-sensitive resistors
KR101312097B1 (en) * 2012-10-29 2013-09-25 CrucialSoft Co., Ltd. Method, apparatus and computer-readable recording medium for recognizing fingerprint
US9092093B2 (en) 2012-11-27 2015-07-28 Neonode Inc. Steering wheel user interface
US9754149B2 (en) * 2013-04-01 2017-09-05 AMI Research & Development, LLC Fingerprint based smart phone user verification
US10121049B2 (en) 2013-04-01 2018-11-06 AMI Research & Development, LLC Fingerprint based smart phone user verification
US9117100B2 (en) * 2013-09-11 2015-08-25 Qualcomm Incorporated Dynamic learning for object tracking
US9230152B2 (en) * 2014-06-03 2016-01-05 Apple Inc. Electronic device for processing composite finger matching biometric data and related methods
TWI557649B (en) * 2014-08-01 2016-11-11 Egis Technology Inc. Electronic device and control method for fingerprint recognition apparatus
US9521314B2 (en) * 2015-02-06 2016-12-13 Fingerprint Cards Ab Fingerprint enrollment using elongated fingerprint sensor
SE1550281A1 (en) 2015-03-06 2016-09-07 Fingerprint Cards Ab Method and system for estimating finger movement
CN105159571A (en) * 2015-07-07 2015-12-16 Nubia Technology Co., Ltd. Page slide control method and mobile terminal
CN107091704B (en) * 2016-02-17 2020-10-09 Beijing Xiaomi Mobile Software Co., Ltd. Pressure detection method and device
US10489920B2 (en) 2017-01-11 2019-11-26 Egis Technology Inc. Method and electronic device for determining moving direction of a finger
US11429230B2 (en) 2018-11-28 2022-08-30 Neonode Inc Motorist user interface sensor


Family Cites Families (165)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US1684461A (en) 1922-12-01 1928-09-18 Dubilier Condenser Corp Electrical device
US1660161A (en) 1923-11-02 1928-02-21 Edmund H Hansen Light-dimmer rheostat
US3393390A (en) 1966-09-15 1968-07-16 Markite Corp Potentiometer resistance device employing conductive plastic and a parallel resistance
US3624584A (en) 1969-02-20 1971-11-30 Nippon Musical Instruments Mfg Variable resistance device for an electronic musical instrument
US3610887A (en) 1970-01-21 1971-10-05 Roper Corp Control arrangement for heating unit in an electric range or the like
US3621439A (en) 1970-06-08 1971-11-16 Gen Instrument Corp Variable resistor
US3863195A (en) 1972-09-15 1975-01-28 Johnson Co E F Sliding variable resistor
US3960044A (en) 1973-10-18 1976-06-01 Nippon Gakki Seizo Kabushiki Kaisha Keyboard arrangement having after-control signal detecting sensor in electronic musical instrument
US4152304A (en) 1975-02-06 1979-05-01 Universal Oil Products Company Pressure-sensitive flexible resistors
US3997863A (en) 1975-04-03 1976-12-14 Norlin Music, Inc. Helically wound pitch-determining element for electronic musical instrument
CA1096161A (en) 1976-12-24 1981-02-24 Katsuhiko Kanamori Pressure-sensitive, electrically conductive elastomeric composition
US4257305A (en) 1977-12-23 1981-03-24 Arp Instruments, Inc. Pressure sensitive controller for electronic musical instruments
US4333068A (en) 1980-07-28 1982-06-01 Sangamo Weston, Inc. Position transducer
DE3039256A1 (en) 1980-10-17 1982-04-29 Bosch-Siemens Hausgeräte GmbH, 7000 Stuttgart RESISTANCE-VARIABLE SWITCHING DEVICE
US4438158A (en) 1980-12-29 1984-03-20 General Electric Company Method for fabrication of electrical resistor
US4479392A (en) 1983-01-03 1984-10-30 Illinois Tool Works Inc. Force transducer
EP0173972B1 (en) * 1984-08-30 1991-02-27 Nec Corporation Pre-processing system for pre-processing an image signal succession prior to identification
US4604509A (en) 1985-02-01 1986-08-05 Honeywell Inc. Elastomeric push button return element for providing enhanced tactile feedback
DE3674086D1 (en) 1985-07-03 1990-10-18 Mitsuboshi Belting Ltd PRESSURE-SENSITIVE CONDUCTIVE RUBBER MATERIAL.
US4775765A (en) 1985-11-28 1988-10-04 Hitachi, Ltd. Coordinate input apparatus
US4745301A (en) 1985-12-13 1988-05-17 Advanced Micro-Matrix, Inc. Pressure sensitive electro-conductive materials
US4746894A (en) 1986-01-21 1988-05-24 Maurice Zeldman Method and apparatus for sensing position of contact along an elongated member
US4833440A (en) * 1987-01-16 1989-05-23 Eaton Corporation Conductive elastomers in potentiometers & rheostats
JPS63174401U (en) 1987-02-25 1988-11-11
DE3809770A1 (en) 1988-03-23 1989-10-05 Preh Elektro Feinmechanik KEY SWITCH
JPH0256903A (en) 1988-08-23 1990-02-26 Fine Rubber Kenkyusho:Kk Variable resistance device
GB2224400B (en) 1988-09-14 1992-07-08 Gates Rubber Co Electrical sensing element
GB8914235D0 (en) * 1989-06-21 1989-08-09 Tait David A G Finger operable control devices
US5327161A (en) * 1989-08-09 1994-07-05 Microtouch Systems, Inc. System and method for emulating a mouse input device with a touchpad input device
US5231386A (en) 1990-07-24 1993-07-27 Home Row, Inc. Keyswitch-integrated pointing assembly
US5457368A (en) * 1993-03-09 1995-10-10 University Of Utah Research Foundation Mechanical/electrical displacement transducer
US4933660A (en) 1989-10-27 1990-06-12 Elographics, Inc. Touch sensor with touch pressure capability
US5060527A (en) 1990-02-14 1991-10-29 Burgess Lester E Tactile sensing transducer
DE4011636A1 (en) 1990-04-11 1991-10-24 Nokia Unterhaltungselektronik PRESSURE SENSITIVE SWITCH
JPH0471079A (en) * 1990-07-12 1992-03-05 Takayama:Kk Positioning method for image
US5541622A (en) * 1990-07-24 1996-07-30 Incontrol Solutions, Inc. Miniature isometric joystick
US5170364A (en) * 1990-12-06 1992-12-08 Biomechanics Corporation Of America Feedback system for load bearing surface
US5666113A (en) * 1991-07-31 1997-09-09 Microtouch Systems, Inc. System for using a touchpad input device for cursor control and keyboard emulation
US5999084A (en) 1998-06-29 1999-12-07 Armstrong; Brad A. Variable-conductance sensor
JPH0758234B2 (en) 1992-04-16 1995-06-21 Enix Co., Ltd. Semiconductor matrix type fine surface pressure distribution sensor
US5880411A (en) 1992-06-08 1999-03-09 Synaptics, Incorporated Object position detector with edge motion feature and gesture recognition
JPH0621531A (en) 1992-07-01 1994-01-28 Rohm Co Ltd Neuro element
US5821930A (en) 1992-08-23 1998-10-13 U S West, Inc. Method and system for generating a working window in a computer system
DE4228297A1 (en) 1992-08-26 1994-03-03 Siemens Ag Variable high-current resistor, especially for use as a protective element in power switching technology, and circuit using the high-current resistor
US5612719A (en) * 1992-12-03 1997-03-18 Apple Computer, Inc. Gesture sensitive buttons for graphical user interfaces
US5376913A (en) 1993-07-12 1994-12-27 Motorola, Inc. Variable resistor utilizing an elastomeric actuator
WO1995008167A1 (en) 1993-09-13 1995-03-23 Asher David J Joystick with membrane sensor
US6546112B1 (en) * 1993-11-18 2003-04-08 Digimarc Corporation Security document with steganographically-encoded authentication data
US5825907A (en) * 1994-12-28 1998-10-20 Lucent Technologies Inc. Neural network system for classifying fingerprints
FR2730810B1 (en) 1995-02-21 1997-03-14 Thomson Csf HIGHLY SELECTIVE CHEMICAL SENSOR
US5675309A (en) 1995-06-29 1997-10-07 Devolpi Dean Curved disc joystick pointing device
US5740276A (en) * 1995-07-27 1998-04-14 Mytec Technologies Inc. Holographic method for encrypting and decrypting information using a fingerprint
US5614881A (en) 1995-08-11 1997-03-25 General Electric Company Current limiting device
US5825352A (en) * 1996-01-04 1998-10-20 Logitech, Inc. Multiple fingers contact sensing method for emulating mouse buttons and mouse operations on a touch sensor pad
US5841888A (en) 1996-01-23 1998-11-24 Harris Corporation Method for fingerprint indexing and searching
US6067368A (en) 1996-01-26 2000-05-23 Authentec, Inc. Fingerprint sensor having filtering and power conserving features and related methods
US5963679A (en) 1996-01-26 1999-10-05 Harris Corporation Electric field fingerprint sensor apparatus and related methods
US5956415A (en) 1996-01-26 1999-09-21 Harris Corporation Enhanced security fingerprint sensor package and related methods
US5828773A (en) 1996-01-26 1998-10-27 Harris Corporation Fingerprint sensing method with finger position indication
JP3747520B2 (en) 1996-01-30 2006-02-22 Fuji Xerox Co., Ltd. Information processing apparatus and information processing method
US5995630A (en) 1996-03-07 1999-11-30 Dew Engineering And Development Limited Biometric input with encryption
FR2749955B1 (en) * 1996-06-14 1998-09-11 Thomson Csf FINGERPRINT READING SYSTEM
US5943044A (en) * 1996-08-05 1999-08-24 Interlink Electronics Force sensing semiconductive touchpad
US6208329B1 (en) * 1996-08-13 2001-03-27 Lsi Logic Corporation Supplemental mouse button emulation system, method and apparatus for a coordinate based data input device
JPH1069346A (en) * 1996-08-28 1998-03-10 Alps Electric Co Ltd Coordinate input device and its control method
JPH1079948A (en) 1996-09-03 1998-03-24 Mitsubishi Electric Corp Image encoding device
US6219793B1 (en) * 1996-09-11 2001-04-17 Hush, Inc. Method of using fingerprints to authenticate wireless communications
US5945929A (en) 1996-09-27 1999-08-31 The Challenge Machinery Company Touch control potentiometer
US6337918B1 (en) * 1996-11-04 2002-01-08 Compaq Computer Corporation Computer system with integratable touchpad/security subsystem
FR2755526B1 (en) 1996-11-05 1999-01-22 Thomson Csf FINGERPRINT READING SYSTEM WITH INTEGRATED HEATING RESISTORS
US6057830A (en) * 1997-01-17 2000-05-02 Tritech Microelectronics International Ltd. Touchpad mouse controller
US5995084A (en) 1997-01-17 1999-11-30 Tritech Microelectronics, Ltd. Touchpad pen-input and mouse controller
US6061051A (en) * 1997-01-17 2000-05-09 Tritech Microelectronics Command set for touchpad pen-input mouse
US5982894A (en) 1997-02-06 1999-11-09 Authentec, Inc. System including separable protected components and associated methods
US6809462B2 (en) * 2000-04-05 2004-10-26 Sri International Electroactive polymer sensors
US5909211A (en) * 1997-03-25 1999-06-01 International Business Machines Corporation Touch pad overlay driven computer system
CA2203212A1 (en) * 1997-04-21 1998-10-21 Vijayakumar Bhagavatula Methodology for biometric encryption
US6088471A (en) 1997-05-16 2000-07-11 Authentec, Inc. Fingerprint sensor including an anisotropic dielectric coating and associated methods
US5953441A (en) 1997-05-16 1999-09-14 Harris Corporation Fingerprint sensor having spoof reduction features and related methods
US6088585A (en) * 1997-05-16 2000-07-11 Authentec, Inc. Portable telecommunication device including a fingerprint sensor and related methods
US5920640A (en) 1997-05-16 1999-07-06 Harris Corporation Fingerprint sensor and token reader and associated methods
US6259804B1 (en) * 1997-05-16 2001-07-10 Authentec, Inc. Fingerprint sensor with gain control features and associated methods
US5940526A (en) 1997-05-16 1999-08-17 Harris Corporation Electric field fingerprint sensor having enhanced features and related methods
US5903225A (en) 1997-05-16 1999-05-11 Harris Corporation Access control system including fingerprint sensor enrollment and associated methods
US6098330A (en) 1997-05-16 2000-08-08 Authentec, Inc. Machine including vibration and shock resistant fingerprint sensor and related methods
US5943052A (en) * 1997-08-12 1999-08-24 Synaptics, Incorporated Method and apparatus for scroll bar control
US6011849A (en) * 1997-08-28 2000-01-04 Syndata Technologies, Inc. Encryption-based selection system for steganography
US5876106A (en) 1997-09-04 1999-03-02 Cts Corporation Illuminated controller
US5912612A (en) 1997-10-14 1999-06-15 Devolpi; Dean R. Multi-speed multi-direction analog pointing device
US6028773A (en) 1997-11-14 2000-02-22 Stmicroelectronics, Inc. Packaging for silicon sensors
US6035398A (en) * 1997-11-14 2000-03-07 Digitalpersona, Inc. Cryptographic key generation using biometric data
US6047282A (en) 1997-12-05 2000-04-04 Authentec, Inc. Apparatus and method for expandable biometric searching
US6047281A (en) 1997-12-05 2000-04-04 Authentec, Inc. Method and apparatus for expandable biometric searching
US6070159A (en) 1997-12-05 2000-05-30 Authentec, Inc. Method and apparatus for expandable biometric searching
US6408087B1 (en) * 1998-01-13 2002-06-18 Stmicroelectronics, Inc. Capacitive semiconductor user input device
US6141753A (en) * 1998-02-10 2000-10-31 Fraunhofer Gesellschaft Secure distribution of digital representations
EP0940652B1 (en) * 1998-03-05 2004-12-22 Nippon Telegraph and Telephone Corporation Surface shape recognition sensor and method of fabricating the same
US6278443B1 (en) * 1998-04-30 2001-08-21 International Business Machines Corporation Touch screen with random finger placement and rolling on screen to control the movement of information on-screen
US6057540A (en) * 1998-04-30 2000-05-02 Hewlett-Packard Co Mouseless optical and position translation type screen pointer control for a computer system
US6404900B1 (en) * 1998-06-22 2002-06-11 Sharp Laboratories Of America, Inc. Method for robust human face tracking in presence of multiple persons
CA2273560A1 (en) * 1998-07-17 2000-01-17 David Andrew Inglis Finger sensor operating technique
US6135958A (en) * 1998-08-06 2000-10-24 Acuson Corporation Ultrasound imaging system with touch-pad pointing device
US6950539B2 (en) * 1998-09-16 2005-09-27 Digital Persona Configurable multi-function touchpad device
US6256022B1 (en) * 1998-11-06 2001-07-03 Stmicroelectronics S.R.L. Low-cost semiconductor user input device
US6320975B1 (en) * 1999-04-22 2001-11-20 Thomas Vieweg Firearm holster lock with fingerprint identification means
US6535622B1 (en) * 1999-04-26 2003-03-18 Veridicom, Inc. Method for imaging fingerprints and concealing latent fingerprints
US6744910B1 (en) * 1999-06-25 2004-06-01 Cross Match Technologies, Inc. Hand-held fingerprint scanner with on-board image normalization data storage
US6681034B1 (en) * 1999-07-15 2004-01-20 Precise Biometrics Method and system for fingerprint template matching
US6546122B1 (en) * 1999-07-29 2003-04-08 Veridicom, Inc. Method for combining fingerprint templates representing various sensed areas of a fingerprint to derive one fingerprint template representing the fingerprint
KR20020048987A (en) * 1999-10-27 2002-06-24 Firooz Ghassabian Integrated Keypad System
US7054470B2 (en) * 1999-12-02 2006-05-30 International Business Machines Corporation System and method for distortion characterization in fingerprint and palm-print image sequences and using this distortion as a behavioral biometrics
GB2357335B (en) * 1999-12-17 2004-04-07 Nokia Mobile Phones Ltd Fingerprint recognition and pointing device
US6920560B2 (en) * 1999-12-30 2005-07-19 Clyde Riley Wallace, Jr. Secure network user states
US6754365B1 (en) * 2000-02-16 2004-06-22 Eastman Kodak Company Detecting embedded information in images
US6518560B1 (en) * 2000-04-27 2003-02-11 Veridicom, Inc. Automatic gain amplifier for biometric sensor device
WO2001091100A1 (en) * 2000-05-24 2001-11-29 Immersion Corporation Haptic devices using electroactive polymers
US20030028811A1 (en) * 2000-07-12 2003-02-06 Walker John David Method, apparatus and system for authenticating fingerprints, and communicating and processing commands and information based on the fingerprint authentication
JP2002062983A (en) * 2000-08-21 2002-02-28 Hitachi Ltd Pointing device
JP2002132444A (en) * 2000-10-27 2002-05-10 Fuji Xerox Co Ltd Mouse
JP2002196882A (en) * 2000-12-27 2002-07-12 Hitachi Ltd Optical remote controller
JP2002244781A (en) * 2001-02-15 2002-08-30 Wacom Co Ltd Input system, program, and recording medium
DE10110724A1 (en) * 2001-03-06 2002-09-26 Infineon Technologies Ag Fingerprint sensor with potential modulation of the ESD protective grid
US6977645B2 (en) * 2001-03-16 2005-12-20 Agilent Technologies, Inc. Portable electronic device with mouse-like capabilities
US6621483B2 (en) * 2001-03-16 2003-09-16 Agilent Technologies, Inc. Optical screen pointing device with inertial properties
US6677929B2 (en) * 2001-03-21 2004-01-13 Agilent Technologies, Inc. Optical pseudo trackball controls the operation of an appliance or machine
US6603462B2 (en) * 2001-03-21 2003-08-05 Multidigit, Inc. System and method for selecting functions based on a finger feature such as a fingerprint
WO2002089038A2 (en) * 2001-04-27 2002-11-07 Atrua Technologies, Inc. Capacitive sensor system with improved capacitance measuring sensitivity
US7369688B2 (en) * 2001-05-09 2008-05-06 Nanyang Technological University Method and device for computer-based processing a template minutia set of a fingerprint and a computer readable storage medium
KR100430054B1 (en) * 2001-05-25 2004-05-03 Cecrop Co., Ltd. Method for combining fingerprint by digital linear image sensor
US7003670B2 (en) * 2001-06-08 2006-02-21 Musicrypt, Inc. Biometric rights management system
JP2005531935A (en) * 2001-07-12 2005-10-20 Atrua Technologies, Inc. Method and system for biometric image assembly from multiple partial biometric frame scans
US20030021495A1 (en) * 2001-07-12 2003-01-30 Ericson Cheng Fingerprint biometric capture device and method with integrated on-chip data buffering
KR100434491B1 (en) * 2001-08-17 2004-06-05 Samsung Electronics Co., Ltd. Resist or etching by-products removing composition and resist removing method using the same
US20030035568A1 (en) * 2001-08-20 2003-02-20 Mitev Mitko G. User interface including multifunction fingerprint roller and computer including the same
JP2003075135A (en) * 2001-08-31 2003-03-12 Nec Corp Fingerprint image input device and organism discrimination method by fingerprint image
US7131004B1 (en) * 2001-08-31 2006-10-31 Silicon Image, Inc. Method and apparatus for encrypting data transmitted over a serial link
US20030123714A1 (en) * 2001-11-06 2003-07-03 O'gorman Lawrence Method and system for capturing fingerprints from multiple swipe images
JP3773442B2 (en) * 2001-11-22 2006-05-10 Sharp Corporation Image forming apparatus
US7929951B2 (en) * 2001-12-20 2011-04-19 Stevens Lawrence A Systems and methods for storage of user information and for verifying user identity
US7002553B2 (en) * 2001-12-27 2006-02-21 Mark Shkolnikov Active keyboard system for handheld electronic devices
US20030135764A1 (en) * 2002-01-14 2003-07-17 Kun-Shan Lu Authentication system and apparatus having fingerprint verification capabilities thereof
JP4022090B2 (en) * 2002-03-27 2007-12-12 Fujitsu Limited Finger movement detection method and detection apparatus
JP2004110438A (en) * 2002-09-18 2004-04-08 Nec Corp Image processing device, image processing method, and program
US7404086B2 (en) * 2003-01-24 2008-07-22 Ac Technology, Inc. Method and apparatus for biometric authentication
CN1777860A (en) * 2003-03-12 2006-05-24 O-Pen Co. Multi-task ray sensor
US7941849B2 (en) * 2003-03-21 2011-05-10 Imprivata, Inc. System and method for audit tracking
KR20060002923A (en) * 2003-04-04 2006-01-09 Lumidigm Inc. Multispectral biometric sensor
US7274808B2 (en) * 2003-04-18 2007-09-25 Avago Technologies Ecbu Ip (Singapore)Pte Ltd Imaging system and apparatus for combining finger recognition and finger navigation
WO2005001751A1 (en) * 2003-06-02 2005-01-06 Regents Of The University Of California System for biometric signal processing with hardware and software acceleration
US7474772B2 (en) * 2003-06-25 2009-01-06 Atrua Technologies, Inc. System and method for a miniature user input device
JP4859053B2 (en) * 2003-09-12 2012-01-18 FlatFrog Laboratories AB System and method for locating radiation scattering/reflecting elements
JP3924558B2 (en) * 2003-11-17 2007-06-06 Fujitsu Limited Biological information collection device
TWI260525B (en) * 2003-12-30 2006-08-21 Icp Electronics Inc Switch control system for multiple input devices and method thereof
US7697729B2 (en) * 2004-01-29 2010-04-13 Authentec, Inc. System for and method of finger initiated actions
US20050179657A1 (en) * 2004-02-12 2005-08-18 Atrua Technologies, Inc. System and method of emulating mouse operations using finger image sensors
US7113179B2 (en) * 2004-06-23 2006-09-26 Interlink Electronics, Inc. Force sensing resistor with calibration element and method of manufacturing same
JP2006053629A (en) * 2004-08-10 2006-02-23 Toshiba Corp Electronic equipment, control method and control program
US7280679B2 (en) * 2004-10-08 2007-10-09 Atrua Technologies, Inc. System for and method of determining pressure on a finger sensor
US20060103633A1 (en) * 2004-11-17 2006-05-18 Atrua Technologies, Inc. Customizable touch input module for an electronic device
US20060242268A1 (en) * 2005-04-25 2006-10-26 General Electric Company Mobile radiology system with automated DICOM image transfer and PPS queue management
US7505613B2 (en) * 2005-07-12 2009-03-17 Atrua Technologies, Inc. System for and method of securing fingerprint biometric systems against fake-finger spoofing
US20070061126A1 (en) * 2005-09-01 2007-03-15 Anthony Russo System for and method of emulating electronic input devices
US8090945B2 (en) * 2005-09-16 2012-01-03 Tara Chand Singhal Systems and methods for multi-factor remote user authentication
US7791596B2 (en) * 2005-12-27 2010-09-07 Interlink Electronics, Inc. Touch input device having interleaved scroll sensors
US7885436B2 (en) * 2006-07-13 2011-02-08 Authentec, Inc. System for and method of assigning confidence values to fingerprint minutiae points

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6323846B1 (en) * 1998-01-26 2001-11-27 University Of Delaware Method and apparatus for integrating manual input
US20030002718A1 (en) * 2001-06-27 2003-01-02 Laurence Hamid Method and system for extracting an area of interest from within a swipe image of a biological surface

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
BALLARD: 'Computer Vision', 1982, pages 66-69, XP008074778 *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007000648A (en) * 2005-04-22 2007-01-11 Hitachi Omron Terminal Solutions Corp Biometrics authentication apparatus, terminal equipment, and consumer transaction facility
WO2006136644A1 (en) 2005-06-23 2006-12-28 Nokia Corporation Method and program of controlling electronic device, electronic device and subscriber equipment
US9152840B2 (en) 2005-06-23 2015-10-06 Nokia Technologies Oy Method and program of controlling electronic device, electronic device and subscriber equipment
JP2007183901A (en) * 2005-12-30 2007-07-19 Altek Corp Method for processing moving image

Also Published As

Publication number Publication date
EP1661085A2 (en) 2006-05-31
US7587072B2 (en) 2009-09-08
US20050041885A1 (en) 2005-02-24
JP2007519064A (en) 2007-07-12
WO2005022458A3 (en) 2007-01-25

Similar Documents

Publication Publication Date Title
US7587072B2 (en) System for and method of generating rotational inputs
US7474772B2 (en) System and method for a miniature user input device
US7409107B2 (en) Input device, information device, and control information generation method
US7263212B2 (en) Generation of reconstructed image data based on moved distance and tilt of slice data
US20110285648A1 (en) Use of fingerprint scanning sensor data to detect finger roll and pitch angles
EP1645989B1 (en) Collecting biometric information
US8417060B2 (en) Methods for multi-point descriptors for image registrations
US20050249386A1 (en) Pointing device having fingerprint image recognition function, fingerprint image recognition and pointing method, and method for providing portable terminal service using thereof
US7403658B2 (en) Direct homography computation by local linearization
KR101793769B1 (en) System and method for determining object information using an estimated deflection response
US7257240B2 (en) Input device, information device, and control information generation method
US20050179657A1 (en) System and method of emulating mouse operations using finger image sensors
WO2007053484A2 (en) Monocular tracking of 3d human motion with a coordinated mixture of factor analyzers
US20060204101A1 (en) Spatial transforms from displayed codes
WO2011143661A2 (en) Methods and systems for pointing device using acoustic impediography
US20100103092A1 (en) Video-based handwritten character input apparatus and method thereof
KR20140037026A (en) System and method for determining object information using an estimated rigid motion response
KR100562632B1 (en) A video based handwriting recognition system and method
Liu et al. 3D Human motion tracking by exemplar-based conditional particle filter
Duan et al. Estimating 3D finger pose via 2D-3D fingerprint matching
JP4229201B2 (en) Input device, information device, and control information generation method
JP4605280B2 (en) Input device, information device, and control information generation method
Evreinova et al. Video as input: spiral search with the sparse angular sampling
CN117590954A (en) Pen state detection circuit and method and input system
Ran et al. STGauntlet: Recognizing Hand Gestures over Multiple Hand-Worn Motion Sensors

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A2

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BW BY BZ CA CH CN CO CR CU CZ DE DK DM DZ EC EE EG ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NA NI NO NZ OM PG PH PL PT RO RU SC SD SE SG SK SL SY TJ TM TN TR TT TZ UA UG US UZ VC VN YU ZA ZM ZW

AL Designated countries for regional patents

Kind code of ref document: A2

Designated state(s): BW GH GM KE LS MW MZ NA SD SL SZ TZ UG ZM ZW AM AZ BY KG KZ MD RU TJ TM AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IT LU MC NL PL PT RO SE SI SK TR BF BJ CF CG CI CM GA GN GQ GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
DPEN Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed from 20040101)
WWE Wipo information: entry into national phase

Ref document number: 2004780370

Country of ref document: EP

Ref document number: 2006524682

Country of ref document: JP

WWP Wipo information: published in national office

Ref document number: 2004780370

Country of ref document: EP