CA2166904A1 - Freehand image scanning device and method - Google Patents
Freehand image scanning device and method
- Publication number
- CA2166904A1
- Authority
- CA
- Canada
- Prior art keywords
- image
- navigation
- original
- sensor
- scanning device
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/047—Detection, control or error compensation of scanning velocity or position
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/10—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using flat picture-bearing surfaces
- H04N1/107—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using flat picture-bearing surfaces with manual scanning
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N1/00—Scanning, transmission or reproduction of documents or the like, e.g. facsimile transmission; Details thereof
- H04N1/04—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa
- H04N1/19—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays
- H04N1/191—Scanning arrangements, i.e. arrangements for the displacement of active reading or reproducing elements relative to the original or reproducing medium, or vice versa using multi-element arrays the array comprising a one-dimensional array, or a combination of one-dimensional arrays, or a substantially one-dimensional array, e.g. an array of staggered elements
- H04N1/192—Simultaneously or substantially simultaneously scanning picture elements on one main scanning line
- H04N1/193—Simultaneously or substantially simultaneously scanning picture elements on one main scanning line using electrically scanned linear arrays, e.g. linear CCD arrays
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/024—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof deleted
- H04N2201/02406—Arrangements for positioning elements within a head
- H04N2201/02439—Positioning method
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/0402—Arrangements not specific to a particular one of the scanning methods covered by groups H04N1/04 - H04N1/207
- H04N2201/0414—Scanning an image in a series of overlapping zones
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/0402—Arrangements not specific to a particular one of the scanning methods covered by groups H04N1/04 - H04N1/207
- H04N2201/043—Viewing the scanned area
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/0471—Detection of scanning velocity or position using dedicated detectors
- H04N2201/04712—Detection of scanning velocity or position using dedicated detectors using unbroken arrays of detectors, i.e. detectors mounted on the same substrate
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/04729—Detection of scanning velocity or position in the main-scan direction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/04731—Detection of scanning velocity or position in the sub-scan direction
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/04734—Detecting at frequent intervals, e.g. once per line for sub-scan control
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/04737—Detection of scanning velocity or position by detecting the scanned medium directly, e.g. a leading edge
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04701—Detection of scanning velocity or position
- H04N2201/04743—Detection of scanning velocity or position by detecting the image directly
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04753—Control or error compensation of scanning position or velocity
- H04N2201/04758—Control or error compensation of scanning position or velocity by controlling the position of the scanned image area
- H04N2201/04787—Control or error compensation of scanning position or velocity by controlling the position of the scanned image area by changing or controlling the addresses or values of pixels, e.g. in an array, in a memory, by interpolation
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N2201/00—Indexing scheme relating to scanning, transmission or reproduction of documents or the like, and to details thereof
- H04N2201/04—Scanning arrangements
- H04N2201/047—Detection, control or error compensation of scanning velocity or position
- H04N2201/04753—Control or error compensation of scanning position or velocity
- H04N2201/04794—Varying the control or compensation during the scan, e.g. using continuous feedback or from line to line
Abstract
A scanning device (10) and method of forming a scanned electronic image (54) include an imaging sensor (22) and at least one navigation sensor (24 and 26). In the preferred embodiment, the imaging sensor is a linear array of sensor elements, with a two-dimensional navigation sensor array at each end. The scanning device has three degrees of freedom, since position information from the navigation sensors allows manipulation of an image signal from the imaging sensor to reduce distortion artifacts caused by curvilinear scanning. Acceptable sources of the position information include printed matter and contrast variations dictated by variations in the inherent structure-related properties (64) of the medium (14) on which the scanned image is formed.
Illumination for optimal operation of the navigation system may be introduced at a grazing angle (30) in some applications or in the normal to a plane of the original in other applications, but this is not essential.
Description
Freehand Image Scanning Device and Method
Technical Field
The present invention relates generally to devices and methods for forming scanned electronic images of originals, and more particularly to scanning devices and methods that accommodate imprecise movements during image capture.
Background Art
Scanners for electronically forming an image of an original are known.
Typically, the captured image provided by a scanner is a pixel data array that is stored in memory in a digital format. A distortion-free image requires a faithful mapping of the original image to the pixel data array. Scanners typically include at least one means for imposing a mechanical constraint during the image capture process in order to maximize the likelihood of faithful mapping.
The four types of scanners known in the art are drum scanners, flatbed scanners, two-dimensional array scanners and hand scanners. Drum scanners attach the original to the surface of a cylindrical drum that rotates at a substantially fixed velocity.
During the rotation of the drum, an image sensor is moved in a direction parallel to the rotational axis of the drum. The combination of the linear displacement of the image sensor and the rotation of the original on the drum allows the entire original to be scanned.
At any moment during the imaging process, the current position within the pixel data array relative to the original can be determined by measuring the angular position of the drum and the translational position of the sensor. The position of the pixel data array with respect to the original is fixed as long as the original is properly attached to the drum, the drum rotation is properly controlled, and the sensor is properly controlled in its displacement along the linear path.
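The position bookkeeping described above is simple: the circumferential coordinate follows from the drum angle, the axial coordinate from the sensor's linear travel. A minimal sketch of that mapping (the function name and units are illustrative, not from the patent):

```python
def drum_scan_position(theta_rad, sensor_travel_mm, drum_radius_mm):
    """Map drum angle and axial sensor travel to a point on the original.

    The circumferential coordinate is arc length (radius * angle); the
    axial coordinate is the sensor's displacement along the drum axis.
    """
    u = drum_radius_mm * theta_rad   # position around the drum (mm)
    v = sensor_travel_mm             # position along the drum axis (mm)
    return u, v
```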
Flatbed scanners include a linear array sensor that is moved relative to the original along an axis that is perpendicular to the axis of the array. Thus, the position of the sensor in one dimension may be known by tracking the relative movement of the sensor. The position of the sensor in the perpendicular direction is implicitly fixed by addressing a particular array element at which intensity is to be measured. In one embodiment of the flatbed scanner, the original is placed on a transparent platen and the sensor, along with an image illumination source, is placed on a side of the platen opposite to the original. As long as the original is not moved relative to the platen, the pixel data array will be fixed with respect to the image to be captured. In another embodiment, the original is moved, rather than the sensor. This second embodiment is typical of facsimile machines. Precision paper transports provide a high degree of positional accuracy during the image-capture process.
HP: 1094285-1 .APL
Advantages of the drum and flatbed scanners include the ability to accommodate documents at least as large as A4, or 8.5" x 11" paper. Moreover, some of these scanners can handle A1 paper in a single setup. However, the scanners are not generally portable, since they require a host computer for control, data storage and image manipulation.
Two-dimensional array scanners may be used in the absence of mechanical encoding constraints, and require only that the array and the original be held motionless during an exposure period. A two-dimensional array of photosensitive elements directly accomplishes the mapping of the image of the original into a pixel data array. However, because a single 300 dpi mapping of an 8.5" x 11" original requires an image sensor having an array of 2500 x 3300 elements, i.e. 8.25 million pixels, these scanners are cost-prohibitive in most applications.
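The cost argument is simple arithmetic; note that 8.5 in x 300 dpi computes to 2550 columns, slightly more than the 2500 quoted above:

```python
DPI = 300
WIDTH_IN, HEIGHT_IN = 8.5, 11.0

cols = round(WIDTH_IN * DPI)    # 2550 elements across
rows = round(HEIGHT_IN * DPI)   # 3300 elements down
total = cols * rows             # roughly 8.4 million sensor elements
```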
Conventional hand scanners require a user to move a linear array of electrooptical sensor elements over an original. The movement is by hand manipulation.
Array-position information is determined using methods such as those employed in the operation of a computer "mouse." As a linear sensor array is moved, the rotation of wheels, balls or rollers that are in contact with the original is sensed, and the position information is determined from the mechanical details of the rotation. In general, the surface of the mechanical element in contact with the original has a high coefficient of friction, e.g. rubber, so as to resist slip and skid. A cylindrical roller or two wheels connected by a rigid axle may be used to enforce a single translational degree of freedom during the scanning process. A straight-edge or other fixture is often used to fix the scan direction with respect to the original and to further enforce the translational constraint provided by the pair of wheels or the roller. Nevertheless, the position encoder approach is one that is often susceptible to slips and skips, so that the pixel data array loses its correspondence with the image on the original.
Hand scanners are typically connected directly to a personal computer for image data storage, processing, and use. Data rates from the image sensor tend to limit the scanning speed. The scanners provide feedback to the user, typically by means of green or red light emitting diodes, to maintain the appropriate speed for the desired image resolution. Some hand scanners use electromagnetic brakes to prevent the user from dragging the scanner over the image too rapidly, with the mechanical resistance increasing with increases in scanning speed.
Hand scanners utilize relatively small imaging arrays and generally cannot handle larger than A6 documents in a single pass. This requires stitching algorithms to join together multiple swaths of a larger document. Swath stitching is done in a separate operation by the personal computer. Scanning a multi-page business document or report with a hand scanner is a tedious process that often yields low-quality results.
As previously noted, some type of fixture is typically used with a hand scanner. In the absence of a fixture, there is a tendency to impose some rotation as the hand scanner is moved across an original. If the user's elbow is resting on a flat surface during movement of the scanner, the rotation is likely to have a radius defined by the distance between the scanner and the user's elbow. As a consequence, the scanned electronic image will be distorted. Other curvilinear movements during a swath of the scanner will also create distortions.
What is needed is a scanning device that accommodates curvilinear movement during a scanning process, with accommodation being achieved in a low-cost manner and with a high degree of correspondence between the original image and the resulting image.
Summary of the Invention
A scanning device and method for forming a scanned electronic image include using navigation information that is acquired along with image data, and then rectifying the image data based upon the navigation and image information. In the preferred embodiment, the navigation information is acquired by means of at least one navigation sensor that detects inherent structure-related properties of an original being scanned. Movement of an image sensor along the original may be tracked by monitoring variations of the inherent structure-related properties as the image sensor is moved relative to the original. Preferably, the inherent structure-related properties that are monitored are inherent structural features, such as paper fibers, or other constituents of the original. Navigation may also be speckle-based, wherein movement of the image sensor along the original is tracked by monitoring variations of speckle patterns produced using coherent illumination for acquiring the navigation information.
"Inherent structure-related properties" are defined herein as properties of the original that are attributable to factors that are independent of forming image data and/or of systematic registration data on the original. The navigation information may be formed by generating a position signal that is responsive to detection of inherent structure-related properties, such as a position signal of speckle information or a position signal that permits tracking of individual inherent structural features. "Inherent structural features" are defined herein as those features of an original that are characteristic of processes of forming the original and are independent of forming image data and/or systematic registration data on the original. For example, if the original recorded media is a paper product, the inherent structural features of interest may be paper fibers. As another example, navigation of the image sensor across a glossy original or an overhead transparency film may be determined by tracking surface texture variations that affect specular fields. Typically, the inherent structural features are microscopic, e.g. between 10 and 40 µm, features of surface texture.
Thus, the contemplated approaches to acquiring navigation information vary in scope. In the broadest approach, there is no limitation to the sources of navigation information that is to be used to remove distortion artifacts of curvilinear and rotational movement of the scanning device along a scan path. The navigation signal may therefore be in the form of a position signal that is responsive to detection of image data on the original (e.g., identification of edges of text characters), with the position signal then being used in the manipulation of an image signal. A narrower second approach is one in which a position signal is responsive to detection of inherent structure-related properties, such as the properties that determine speckle patterns. The third approach is to track navigation of the scanning device by monitoring the positions of individual inherent structural features (e.g., paper fibers) over time. This third approach is the narrowest of the three, since it is actually a subcategory of the second approach.
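All three approaches reduce to the same computational core: estimating the frame-to-frame displacement of the navigation sensor by correlating successive small navigation images. The following is a minimal sketch of that idea under assumed conventions (exhaustive search, mean-squared-difference scoring); it is illustrative only and does not reflect the patent's actual hardware implementation:

```python
import numpy as np

def estimate_shift(prev, curr, max_shift=3):
    """Estimate the (dy, dx) translation between two equal-sized square
    navigation frames by exhaustive search over a small window, scoring
    each candidate by mean squared difference on the overlapping region.
    """
    n = prev.shape[0]

    def crop(img, dy, dx):
        # overlapping sub-window of img for a candidate offset (dy, dx)
        return img[max(0, dy):n + min(0, dy), max(0, dx):n + min(0, dx)]

    best, best_score = (0, 0), -np.inf
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            a = crop(prev, -dy, -dx)          # reference patch
            b = crop(curr, dy, dx)            # candidate-shifted patch
            score = -np.mean((a - b) ** 2)    # higher is better
            if score > best_score:
                best_score, best = score, (dy, dx)
    return best
```

Summing these per-frame displacements over time yields the scan-path position used to tag each captured image line.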
In the preferred embodiment, the image sensor is a linear array of electrooptical elements, while the navigation approach utilizes at least one two-dimensional array of navigation sensor elements. By placing a separate two-dimensional navigation array at each end of the image sensor, the scanner is afforded three degrees of freedom of movement. If the original is planar, two of the degrees of freedom are translational and are perpendicular to each other within the plane of the original, while the third degree of freedom is rotational about the normal to the plane of the original. The accuracy of rotation tracking is enhanced by the use of two navigation arrays, with each array having a smaller array extent than would be necessary if only a single navigation array were used. While the preferred embodiment is one in which a navigation sensor is a two-dimensional array, linear arrays may also be used. Moreover, as will be described more fully below, navigation information for rectifying image data could feasibly be acquired by fixing other position-tracking means to the scanning device, including encoding wheels and balls, computer mice track balls, registration grid-detectors, accelerometers, mechanical linkages, non-contacting electromagnetic and electrostatic linkages and time-delay integration sensor arrays. In many of these alternative embodiments, navigation information for rectifying the image data is acquired in manners independent of any inherent structure-related properties of the original, since position tracking does not include image acquisition.
The navigation sensors are in a known position relative to the image sensor.
Preferably, the navigation sensors are as close to the end points of the imaging sensor as possible, so that the navigation sensors are less susceptible to traveling beyond the edge of an original as the image array is moved. The image sensor forms a signal that is representative of an image of interest. Simultaneously, each navigation sensor forms a signal representative of the inherent structure-related properties of the original. The scanning device may be moved in a freehand meandering pattern, such as one of alternating left-to-right and right-to-left movements with descent along the original, with the device remaining in contact with the original. Each one of the side-to-side swaths should overlap a portion of the previous swath, so that the image may be manipulated with respect to position and stitched either during or following the scanning process. The manipulation of the image signal is a "rectification" of the image data, i.e., an operation of arranging and modifying acquired image data based upon navigation data in order to achieve conformance between original and output images, with the rectification being based upon the relative movement between the navigation sensor or sensors and the inherent structure-related properties detected by the navigation sensors. The stitching is used to connect image data acquired during successive swaths.
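The rectification step can be illustrated schematically: each captured sensor line carries a pose (x, y, theta) recovered from the navigation data, and its pixels are placed into the output grid accordingly. The sketch below uses nearest-neighbor placement under assumed conventions; it is not the patent's method, which would also interpolate and stitch overlapping swaths:

```python
import numpy as np

def rectify(lines, poses, out_shape):
    """Place each captured sensor line into an output pixel grid using
    its pose (x, y, theta): the position of the array midpoint and the
    array's rotation about the surface normal.
    """
    out = np.zeros(out_shape)
    for line, (x, y, theta) in zip(lines, poses):
        line = np.asarray(line, dtype=float)
        # element offsets along the array, centered on its midpoint
        t = np.arange(line.size) - (line.size - 1) / 2.0
        xs = np.rint(x + t * np.cos(theta)).astype(int)
        ys = np.rint(y + t * np.sin(theta)).astype(int)
        # drop elements that land outside the output grid
        ok = (xs >= 0) & (xs < out_shape[1]) & (ys >= 0) & (ys < out_shape[0])
        out[ys[ok], xs[ok]] = line[ok]
    return out
```

Because the pose carries the rotation angle, a curvilinear or rotationally misaligned scan path still maps each pixel to its correct place in the output image.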
Preferably, each navigation sensor includes one or more light sources designed to provide contrast dependent upon the inherent structure-related properties of the original. Emitted light may be in the visible range, but this is not essential. For example, "grazing" light that has large angles of incidence relative to the surface normal will interact with paper fibers at or near the surface of an original that is a paper product, creating contrast-enhancing shadows among the fibers. On the other hand, if the original has a glossy surface, such as a photographic print, a clay-coated paper or an overhead transparency film, normally incident light will produce an image in the specular field that has image-contrast features sufficient for purposes of navigation. Optical elements such as filters and one or more imaging lenses further improve detection of inherent structure-related properties.
An advantage of the invention is that the scanning device and method allow three degrees of freedom of movement of the scanning device while still affording quality image capture. Thus, a portable, pocket-sized scanning device may be manufactured and used in the absence of mechanical constraints, other than that afforded by contact with the surface of the original throughout the image capture process. In fact, for embodiments in which image rectification is provided by correlation of navigation images, the device-to-original contact constraint may be eliminated. Another advantage is that because the scanning device of the preferred embodiment forms an electronic image based upon detection of inherent structural features, large areas of "whitespace" between image features of the original will be preserved and therefore not result in the image features being moved closer together during a stitching step.
Brief Description of the Drawings
Fig. 1 is a perspective view of a hand-held scanning device following a meandering path on an original in accordance with the invention.
Fig. 2 is a rearward view of imaging and navigation sensors of the scanning device of Fig. 1.
Fig. 3 is a perspective view of the scanning device of Fig. 1, shown with the imaging and navigation sensors exposed.
Fig. 4 is a schematical side view of an illumination system for one of the navigation sensors of Fig. 3.
Fig. 5 is a side schematical view of a light emitting diode and optical elements for providing the illumination described with reference to Fig. 4.
Fig. 6 is a conceptual view of the image capture operation of the scanning device of Fig. 1.
Fig. 7 is an operational view of one embodiment of the navigation processing of the scanning device of Fig. 1.
Fig. 8 is a schematical view of selected steps of Fig. 7.
Fig. 9 is a block diagram of the components for carrying out the steps of Fig. 8.
Fig. 10 is a representation of a position-tagged data stream typical of that output from Fig. 9.
Figs. 11 and 12 are representations of swaths by the scanning device of Fig. 1.
Fig. 13 is a representation of a registration tile that may be utilized to achieve stitching of successive swaths.
Best Mode for Carrying Out the Invention
With reference to Fig. 1, a portable, hand-held scanning device 10 is shown as having followed a meandering path 12 along an original 14. In the preferred embodiment, the original is a piece of paper, an overhead transparency film, or any other image-bearing surface upon which inherent structure-related properties of the original generate sufficient contrast to provide position information during navigation along the meandering path. Typically, the positions of inherent structural features are tracked and the position information is used to rectify image data, but other embodiments will be described. The scanning device is preferably self-contained and battery powered, but may include a connection to an external source of power or to data ports of computers or networks.
The scanning device 10 of Fig. 1 includes an image display 16. The display may provide almost immediate viewing of a captured image. However, a display is not essential to the use of the scanning device.
The scanning device 10 allows three degrees of freedom, with two being in translation and one in rotation. The first degree is the side-to-side movement (X axis movement) along the original 14. The second degree of freedom is movement upwardly and downwardly along the original (Y axis movement). The third degree of freedom is the ability to operate the device with rotational misalignment of a linear array of image sensor elements relative to an edge of the original 14 (θ axis movement). That is, the linear array of imaging elements may have an angle of attack that is not perpendicular to the direction of device translation.
Referring now to Figs. 1-3, the forward side 18 of the scanning device 10 includes a pivoting member 20 that aids in maintaining proper contact between the original 14 and an imaging sensor 22. Navigation sensors 24 and 26 are located at the opposed ends of the imaging sensor. Because the navigation sensors are mounted on the pivoting member, the navigation sensors are in a fixed location relative to the imaging sensor.
For reasons of physical compactness, the imaging sensor array 22 is preferably a contact image device, but for applications in which compactness is less of a concern or a smaller image is desired, sensors employing projection optics may be employed, with magnification less than unity. In such applications, the elements of the imaging sensor 22 should be smaller and more closely packed together. Contact imaging devices typically employ lenses sold under the trademark SELFOC, which is a federally registered mark of Nippon Sheet Glass Company Limited. Less conventionally, contact imaging can be obtained using interleaved array elements of sources and proximal sensors, without any imaging lenses. Conventional imaging sensors for scanning applications may be used. The imaging sensor may be part of a unit that also includes an illumination source, illumination optics, and image transfer optics.
The imaging sensor is shown as a linear array of discrete optically sensitive elements. The spacing of the elements plays a role in determining the spatial resolution of the scanner 10. For example, a linear array having a length of 101.6 mm requires 1200 sensor elements to achieve a resolution of 300 dpi. The sensor may be a charge coupled device, an amorphous silicon photodiode array, or any other type of linear array sensor known in the art.
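The element count quoted above follows directly from the array length and the target resolution:

```python
MM_PER_INCH = 25.4

length_mm = 101.6   # sensor array length
dpi = 300           # target spatial resolution

# 101.6 mm is exactly 4.0 in, so 4.0 in * 300 elements/in = 1200 elements
elements = round(length_mm / MM_PER_INCH * dpi)
```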
A key consideration in the design of the imaging sensor unit is speed. The imaging sensor 22 preferably is able to image each pixel at approximately 10K samples per second. Linear imaging arrays generally produce a serial data stream, wherein pixel values, i.e. charges, are placed into a shift register and then shifted out. Achieving the desired speed requires either very fast serial transfer rates out of the entire image array or multiple taps, so that pixel values can be shifted through fewer cells. This introduces parallelism, which is advantageous for digital processing.
Another consequence of the speed requirement is that the products of pixel areas, at the surface of the original, and their solid angles of emitted light collected and conveyed to each array element should be sufficiently large to produce a detectable signal in integration times on the order of 100 microseconds. An enhancement option is to add an optical element to the sensor to increase the effective fraction of sensor pitch for which each sensing element is responsive. Since there is typically unused area in the array matrix, such light collection optics increase sensitivity.
A straightforward modification of the imaging sensor 22 enables sensing of color images. Three linear arrays parallel to one another, each with at least one embedded filter element that selectively passes red, green and blue components of the incident light respectively, would allow color imaging. Alternatively, a single array having broad-band sensitivity may be sequentially illuminated by red, green and blue light sources.
Regarding illumination for improving the operation of the imaging sensor 22, a linear array of high intensity light emitting diodes at the amber wavelength may be used.
However, the selection of the preferred illumination source and any optical elements is dependent upon the medium of the original. The wavelength of the light is selected to maximize the contrast of image data acquired during the scan of a given area of the original 14, while disregarding unwanted signals. Illumination optics may consist of LED dome lenses or may include a light pipe consisting of a precision-molded optical element that channels the illumination onto the original with a minimal amount of light loss. Such a design can afford a relatively uniform illumination of a target area of the original at a wide range of angles, but blocks normal incident rays in order to avoid specular surface reflections.
In Fig. 1, the meandering path 12 is shown as having four and a fraction swaths, i.e. side-to-side passes across the original 14. A useful imaging sensor 22 for most opportunistic applications has a length within the range of 25.4 mm to 101.6 mm.
If the sensor 22 has a length of 63.5 mm, an A4 paper can be scanned in four or five swaths. As will be explained more fully below, the swaths should include regions of overlap, so that a stitching process can be used to produce a faithful reproduction of the original image.
The scanning device 10 typically includes at least one navigation sensor 24 or 26. In the preferred embodiment, the device includes a pair of navigation sensors, with the sensors being at opposite ends of the imaging sensor 22. While a one-dimensional array of optoelectronic elements may be used, in the preferred embodiment, each navigation sensor is a two-dimensional array of elements. The navigation sensors 24 and 26 are used to track movement of the scanning device 10 relative to the original 14.
In the preferred embodiment, each navigation sensor 24 and 26 captures images related to inherent structure-related properties of the original in order to produce information related to the position of the scanning device 10. For most prior art scanning devices, inherent structural features are considered to be noise. For the scanning device 10 of Figs. 1-3, such features are noise with regard to the imaging sensor 22, but may be used to provide a basis for the navigation sensors 24 and 26 to generate position information. Useful, high-contrast images of surface texture can be generated by detecting structural variations that are inherent to the medium or are formed on the medium, e.g., text. For example, images may be formed based upon the contrast between shadows in valleys and bright spots at the peaks of inherent structural features. Such features are typically microscopic in nature, often ranging between 10 µm and 40 µm in size on common printed media. As an alternative, speckle may be used, since specular reflection of a coherent beam produces a contrast pattern of bright and dark areas. A third source of contrast information is color. Color contrast is independent of surface texture. Even when illuminating the texture-free surface with light in the visible range, color contrast exists between regions of different colors, e.g., between different shades of gray.
However, it is contemplated to use the invention for applications in which navigation information is independent of inherent structure-related properties of the original. For example, one or both of the navigation sensors 24 and 26 of Fig. 2 may be used to form successive images of print on the original, with correlation of the successive images being used to determine the position and the orientation of the image sensor 22 along the original 14. In this embodiment, all three sensors 22, 24 and 26 image text on the original, but only the signal from the sensor 22 is used to acquire image data. The signals from the navigation sensors 24 and 26 are used to acquire image-based navigation information.
Non-imaging approaches can also be used to acquire and process X, Y and theta position information. Unfortunately, many of the alternative means impose various limitations upon compactness, convenience of use, speed, freedom of motion, power consumption, accuracy, precision, and/or cost. One imaging-independent alternative for acquiring position information is to provide one or more encoding wheels in place of the navigation sensors. The encoding wheels may then roll without slip upon the scanned surface, enabling the scanning device to travel along straight or curvilinear trajectories. It is not critical that the encoding wheels be on a common axis. The wheels may be mounted to swivel. Encoders coupled to monitor rotations would provide the input data from which to calculate position and orientation of an imaging sensor relative to a starting position and orientation.
Another image-free approach to acquiring navigation information is to use track balls similar to those for a computer mouse. A track ball could be used in place of each encoder wheel described above. Encoders would be used to obtain two-dimensional displacement information from each track ball. In another approach, optical or electronic (capacitive, resistive or inductive) sensors may be used in place of the navigation sensors of Fig. 2 in order to sense position and orientation relative to a cooperative (active or passive) grid or other reference constructed in an underlying tablet that, in turn, serves as a support for the original being scanned.
Another image-free approach to acquiring position and orientation information is to provide an accelerometer. An on-board inertial navigation platform may be used, with accelerations being sensed and integrated either once to obtain velocities or twice to obtain positions. Or velocities of spring-suspended masses could be sensed and integrated once in order to obtain positions. Gyroscopes could be employed in a direct sensing of orientation.
Yet another alternative approach would be to use any of a variety of mechanical linkages with which to track position and orientation relative to reference coordinates fixed with respect to the medium being scanned. Position and orientation information could be obtained by means of sensors coupled to measure the relative movement of the mechanical members. These sensors could be of either the relative or absolute type and could be based on direct position and orientation sensing, or the sensing of accelerations or velocities that would then be integrated with respect to time, once or twice, to obtain positions. Non-contacting remote sensing could also be used to measure position and orientation of the scanning device relative to reference coordinates fixed with respect to the scanned original. Examples of such non-contacting sensing would include those that use electromagnetic fields, waves or beams (e.g. at optical or radio frequencies); electric effects (e.g. capacitive); and/or magnetic effects (e.g. inductive).
These approaches could utilize standard or differential Global Positioning technologies and potentially could use satellites. These approaches can also include traditional navigation/surveying methods, such as triangulation. They could also include techniques employed in robotics technologies, such as using shaped light beams and interpreting position from images of where these beams intercept the moving object.
The navigation sensors 24 and 26 of Fig. 2 effectively observe a moving image of the original 14 and produce an indication of the displacement in two planar dimensions between successive observations. As will be explained more fully below, pixel values from the navigation sensors are operated upon by processing elements to determine proper mapping of image data from the imaging sensor 22. The processing elements operate on a particular pixel and its nearest neighbors to produce an array of correlation values at each pixel location. The correlation values are based upon comparisons between a current image of the surface structure and a stored image representing a known position of inherent structural features, wherein the stored image serves as a position reference. However, operations other than the correlation process may be employed in manipulating the input image data to form the output image.
Referring now to Figs. 4 and 5, navigation sensor 24 is shown as being operatively associated with illumination optics. If the original 14 is a paper product for which paper fibers are to be detected by the navigation sensor 24, the introduction of light at a grazing angle of incidence is preferred. While not essential, one or more light emitting diodes (LEDs) 28 may be used. The grazing angle 30, which is the complement of the angle of incidence, is preferably in the range of zero to fifteen degrees, but this may change depending upon the properties of the original 14. In Fig. 5, the source 28 is shown with illumination optics 34. The optics may comprise a single element or a combination of lenses, filters and/or holographic elements to accomplish suitable collimated and generally uniform illumination of the target surface. The wavelength of the light emitted by the source 28 should be selected to enhance the spatial frequency information available for navigation. Fixed pattern noise in the illumination field should be minimized. The output of source 28 may require adjustment to accommodate wide dynamic ranges of reflectivity of the medium as the scanning device proceeds over printed materials with absorbing or reflecting inks or other marking agents.
In Fig. 4, light from a source 35 is collimated at illumination optics 36 and then redirected by an amplitude splitting beam-splitter 37. That portion of the light energy from the LED that travels directly to and through the beam-splitter is not shown in Fig. 4.
The light energy from the beam-splitter illuminates the original 14 along the normal to the surface.
Also represented in Fig. 4 is the portion of the light energy that is reflected or scattered from the original 14 and passed through the beam-splitter 37 for aperturing and filtering at element 38 and focusing to an image at element 39. The portion of light energy passing from the original to the beam-splitter and reflecting from the beam-splitter is not shown. The magnification of navigation imaging optics should be constant over the field-of-view of the two-dimensional sensor array 24 which detects the focused light. In many applications, the modulation transfer functions, i.e. the amplitude measure of optical frequency response, of the navigation optics must be such as to provide attenuation before the Nyquist frequency that is determined by the pitch of the sensor elements of the navigation sensor and by the magnification of the optical elements. The optical elements should also be designed to prevent background illumination from creating noise. Note that a wavefront splitting beam-splitter could also be used.
The selection of the angle of incidence depends upon the material properties of the original. Grazing angles of illumination generate longer shadows and more apparent contrast, or AC signal if the surface of the original is not glossy. The DC signal level, however, increases as the illumination angle approaches the normal to the original.
Illuminating the target region of the original 14 at a grazing angle 30 works well for applications in which the surface of the original has a high degree of unevenness at the microscopic level. For example, the introduction of light from the source 28 at a grazing angle provides a high signal-to-noise ratio of data related to inherent structural features when the original is stationery, cardboard, fabric, or human skin. On the other hand, the use of incoherent light at a normal angle of incidence may be preferred in applications in which position data is needed to track scanner movement along such originals as photographs, glossy magazine pages, and overhead transparency films. With normal illumination, using incoherent light, viewing the original in the specularly reflected field will provide an image that is sufficiently rich in texture content to allow image- and correlation-based navigation. The surface of the original has a microscopic relief such that the surface reflects light as if the surface were a mosaic of tiles, or facets. Many of the "tiles" of an original reflect light in directions slightly perturbed from the normal. A field of view that includes the scattered light and the specularly reflected light can thus be modeled as though the surface were composed of many such tiles, each tilted somewhat differently with respect to the normal. This modeling is similar to that of W.W. Barkas in an article entitled "Analysis of Light Scattered from a Surface of Low Gloss into Its Specular and Diffuse Components," in Proc. Phys. Soc., Vol. 51, pages 274-292 (1939).
Fig. 4 shows illumination by a source 35 of incoherent light, which is directed along the normal of the surface of the original 14. Fig. 5 describes illumination at a grazing angle 30. In a third embodiment, no illumination is provided. Instead, the navigation information is accumulated using background light, i.e. light from the environment.
In a fourth embodiment, coherent illumination is introduced at normal incidence to permit speckle-based navigation. Relative motion between a scanning device and an original may be tracked by monitoring motion of speckle relative to the navigation sensors. If coherent illumination is used without using imaging optics, then by selecting a small area of illumination and by having a relatively large separation between the surface of the original and the photodetector array of the navigation sensor 24, the resulting predominant speckle cell sizes with coherent illumination are sufficiently large to satisfy the Nyquist sampling criterion. The use of a beam splitter allows the direction of both the incident illumination and the detected scatter to be near to normal to the surface of the original, as similarly accomplished in Fig. 4.
Referring now to Fig. 6, the scanner 10 is shown as being moved across an original 44 having a block 46 imprinted onto a surface of the original. Because the scanner 10 is not subjected to any kinematic constraints in the plane of the original, there is a tendency for a user to follow a curvilinear path across the original, as when the hand and forearm of the user rotate about the elbow. In Fig. 6, the scanning device is shown as following a curved path 48 across the block 46. If the lower edge of the scanning device is the edge that is closer to the elbow that defines the axis of rotation, the lower edge will have a shorter radius. Consequently, imaging elements of the imaging sensor will vary with respect to the time and distance required to pass over the block 46. A distorted image 50 of the block is captured as the device is moved to the second position 52, shown in dashed lines.
The captured image 50 would be the stored image in the absence of processing to be described below. However, as the imaging sensor captures data related to the block 46, navigation information is acquired. In the preferred embodiment, one or more navigation sensors capture data related to inherent structural features of the original 44. Movement of the inherent structural features relative to the scanning device 10 is tracked in order to determine displacement of the imaging sensor relative to the block 46.
A faithful captured image 54 may then be formed. The image 54 is defined herein as the "rectified" image.
In Fig. 7, one embodiment of navigation processing is shown. The navigation processing is performed by correlating successive frames of navigation information, such as data related to inherent structural features. The correlations compare the positions of the inherent structural features in successive frames to provide information related to the position of a navigation sensor at a particular time. The navigation information is then used to rectify image data. The processing of Fig. 7 is typically performed for each navigation sensor.
In a first step 56, a reference frame is acquired. In effect, the reference frame is a start position. The position of a navigation sensor at a later time may be determined by acquiring 58 a sample frame of position data from the navigation sensor at the later time and then computing correlations 60 between the reference frame and the later-acquired sample frame.
Acquiring the initial reference frame 56 may take place upon initiation of the imaging process. For example, the acquisition may be triggered by mere placement of the scanning device into contact with the original. Alternatively, the scanning device may include a start button that initiates the image process and the navigation process. Initiation may also take place by a periodic pulsing of the illumination system of each navigator. If there is a reflected signal that exceeds a prescribed threshold of reflection or a correlation signal that indicates motion, the reference frame is then acquired.
While the navigation processing is performed computationally, the concepts of this embodiment may be described with reference to Figs. 7 and 8. A reference frame 62 is shown as having an image of a T-shaped inherent structural feature 64. The size of the reference frame depends upon factors such as the maximum scanning speed of the scanning device, the dominant spatial frequencies in the imaging of the structural features, and the image resolution of the sensor. A practical size of the reference frame for a navigation sensor that is thirty-two pixels (N) by sixty-four pixels (M) is 24 x 56 pixels.
At a later time (dt) a navigation sensor acquires a sample frame 66 which is displaced with respect to frame 62, but which shows substantially the same inherent structural features. The duration dt is preferably set such that the relative displacement of the T-shaped feature 64 is less than one pixel of the navigation sensor at the velocity of translation of the scanning device. An acceptable time period is 50 µs for velocities of 0.45 meters/sec at 600 dpi. This relative displacement is referred to herein as a "microstep."
If the scanning device has moved during the time period between acquiring 56 the reference frame 62 and acquiring 58 the sample frame 66, the first and second images of the T-shaped feature will be ones in which the feature has shifted. While the preferred embodiment is one in which dt is less than the time that allows a full-pixel movement, the schematic representation of Fig. 8 is one in which the feature 64 is allowed to shift up and to the right by one pixel. The full-pixel shift is assumed only to simplify the representation.
Element 70 in Fig. 8 represents a sequential shifting of the pixel values of frame 68 into the eight nearest-neighbor pixels. That is, step "0" does not include a shift, step "1" is a diagonal shift upward and to the left, step "2" is an upward shift, etc. In this manner, the pixel-shifted frames can be combined with the sample frame 66 to produce the array 72 of position frames. The position frame designated as "Position 0" does not include a shift, so that the result is merely a combination of frames 66 and 68. "Position 3" has the minimum number of shaded pixels, and therefore is the frame with the highest correlation. Based upon the correlation results, the position of the T-shaped feature 64 in the sample frame 66 is determined to be a diagonal rightward and upward shift relative to the position of the same feature in the earlier-acquired reference frame 62, which implies that the scanning device has moved leftwardly and downwardly during time dt.
While other correlation approaches may be employed, an acceptable approach is a "sum of the squared differences" correlation. For the embodiment of Fig. 8, there are nine correlation coefficients (Ck = C0, C1 ... C8) formed from the nine offsets at element 70, with the correlation coefficients being determined by the equation:

Ck = Σi Σj (Sij − Rij(k))²

where Sij denotes the navigation sensor-measured value at the position ij of the sample frame 66 and Rij(k) denotes the navigation sensor-measured value of frame 68 as shifted at the element 70 in the k direction, with k being the identifier of the shift at element 70.
In Fig. 8, k=3 provides the correlation coefficient with the lowest value.
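The nine-offset computation can be sketched in C as follows. The frame size, the shift table and the handling of pixels shifted out of frame are illustrative assumptions, since the text leaves these to the implementation.

```c
#include <limits.h>

#define W 8   /* illustrative frame width  */
#define H 8   /* illustrative frame height */

/* Offsets for shifts k = 0..8: k = 0 is no shift, the rest are the
   eight nearest-neighbor directions (cf. element 70 of Fig. 8). */
static const int dx[9] = { 0, -1,  0,  1, -1, 1, -1, 0, 1 };
static const int dy[9] = { 0, -1, -1, -1,  0, 0,  1, 1, 1 };

/* Return the shift k whose sum-of-squared-differences correlation Ck
   between sample frame S and shifted reference frame R is smallest.
   Pixels shifted outside the frame are simply skipped. */
int best_shift(int S[H][W], int R[H][W])
{
    long best = LONG_MAX;
    int  best_k = 0;
    for (int k = 0; k < 9; k++) {
        long c = 0;
        for (int i = 0; i < H; i++)
            for (int j = 0; j < W; j++) {
                int ri = i + dy[k], rj = j + dx[k];
                if (ri < 0 || ri >= H || rj < 0 || rj >= W)
                    continue;        /* pixel shifted out of frame */
                long d = S[i][j] - R[ri][rj];
                c += d * d;
            }
        if (c < best) { best = c; best_k = k; }
    }
    return best_k;
}
```

When the sample frame equals the reference frame displaced by one of the nine offsets, the matching k yields a zero coefficient and is returned.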
Correlations are used to find the locations of identical features in successive frames in order to determine the displacements of the features from frame-to-frame.
Summing or integrating these displacements and correcting for scale factors introduced through the design of the relevant optics determine the displacements of the imaging sensor as a scanning procedure progresses.
As previously noted, the frame-to-frame correlations are referred to as "microsteps," since frame rates are chosen to be sufficiently high to ensure that the displacements do not exceed the dimension of a single pixel. Oversampling can provide sub-pixel displacement precision. Referring to Fig. 7, a determination 74 of whether a microstep is to be taken is made following each computation 64 of the correlations. If a microstep is required, the reference frame is shifted at 76. In this step, the sample frame 66 of Fig. 8 becomes the reference frame and a new sample frame is acquired. The correlation computation is then repeated.
While the process provides a high degree of correlation match, any errors that do occur will accumulate with each successive shift 76 of a sample frame 66 to the reference frame designation. In order to place a restriction on the growth rate of this "random walk" error, a sample frame is stored in a separate buffer memory. This separately stored sample frame becomes a new reference frame for a subsequent series of correlation computations. The latter correlation is referred to as a "macrostep."
By using macrosteps, a more precise determination of scanner displacement across a distance of m image frame displacements, i.e. m microsteps, can be obtained. The error in one macrostep is a result of a single correlation calculation, whereas the equivalent error of m microsteps is m^(1/2) times the error in a single microstep. Although the average of errors in m microsteps approaches zero as m increases, the standard deviation in the average of errors grows as m^(1/2). Thus, it is advantageous to reduce the standard deviation of accumulated error by using macrosteps having m as large as practical, as long as the two frames that define a macrostep are not so far spaced from one another that they have no significant region of common image content.
The sampling period dt does not have to be constant. The sampling period may be determined as a function of previous measurements. One method that employs a variable dt is to improve the accuracy of displacement calculation by keeping the relative displacement between successive reference frames within certain bounds. For example, the upper bound may be one-pixel displacement, while the lower bound is determined by numerical roundoff considerations in the processing of the navigation data.
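One plausible realization of the variable sampling period is a simple adjustment rule; the halving/doubling policy below is purely an assumption for illustration, since the text only requires keeping the measured displacement between the stated bounds.

```c
/* Adjust the sampling period dt based on the displacement (in pixels)
   measured over the previous period, keeping it between a lower bound
   (numerical roundoff) and an upper bound (one pixel). Illustrative
   policy; the patent does not specify the adjustment rule. */
double next_dt(double dt, double last_disp, double lower, double upper)
{
    if (last_disp > upper)      /* moving too fast: sample sooner */
        return dt * 0.5;
    if (last_disp < lower)      /* barely moving: sample later    */
        return dt * 2.0;
    return dt;                  /* within bounds: keep the period */
}
```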
Referring again to Fig. 9, the image signal generated at the imaging sensor 22 may then be "position-tagged" based upon the navigation data. In one embodiment, pixel values from the two navigation sensors 24 and 26 are received by a navigation processor 80 for performing the operations of Figs. 7 and 8. Based upon the computed correlations, coordinates are determined for the current position of the first navigation sensor 24 (X1, Y1) and for the current position of the second navigation sensor 26 (X2, Y2).
The navigation processor 80 also receives pixel values of the imaging sensor 22 via a pixel amplifier 82 and an analog-to-digital converter 84. Although Fig. 9 shows only a single tap from the image sensor 22 and a single A/D converter 84, multiple taps, each with an A/D converter, are within the scope of the invention. The current position coordinates of the navigation sensors are "tagged" at the ends of a line of data that corresponds to the number of pixels within the imaging sensor. The output 86 of the navigation processor 80 is therefore a position-tagged data stream. In Fig. 10 an increment 88 of the data stream is shown as having position coordinate cells 90, 92, 94 and 96 at the opposite ends of N pixel cells, although this ordering is not essential.
The position-tagged data stream at the output 86 of the navigation processor 80 may be first stored in image space that allows the image to fill memory locations which provide continuity in both the X and Y axes. Consequently, image acquisition is not restricted to scanning from an upper-left corner of an original to the lower-right corner.
Because each image pixel is associated with a relative (X,Y) displacement from an arbitrary starting point, the image can expand in X and Y to the full size of the image memory.
The imaging sensor 22 is clocked as the scanning device moves across an original. The clocking ensures that the fastest moving element of the sensor samples at least once per pixel displacement. As previously noted with reference to Fig. 6, in the case of significant curvature of the scanning device 10 during image capture, one end of the imaging array will translate more rapidly than the other end, causing pixels at the slower end to be oversampled. This situation can be handled by either recording the most recent reading (for grayscales) or by recording in a logical OR mode (for binary images) at a specific pixel location in image space.
The next operation is to map the position-tagged increments. In one embodiment, the end points of the increments are joined by a line. Since the spacing of the pixels of the imaging sensor 22 is fixed, the physical location of each pixel relative to the line can be calculated. One approach for determining the physical locations of each pixel is a modification of the Bresenham Raster Line Technique. The modification is that, because the number of pixels in the array of the imaging sensor is fixed, the number of iterations of the line loop is fixed at that same number. That is, the usual Bresenham algorithm is one in which the number of iterations in the line loop is the greater of delta_x and delta_y, i.e. max(delta_x, delta_y), but for the modified algorithm the number (N) of pixels along the array is used where max(delta_x, delta_y) is customarily used, so that the loop runs N times. The following program element describes this algorithm:
/* Load pixel values with get_pixel() using location pairs (xa, ya) and
   (xb, yb) of the endpoints of an N-element array of pixel values using
   a modified Bresenham line draw algorithm */
delta_x = xb - xa;
delta_y = yb - ya;
inc_x = (delta_x > 0) - (delta_x < 0);  /* increments are +1 or -1 */
inc_y = (delta_y > 0) - (delta_y < 0);
delta_x *= inc_x;                       /* take absolute values */
delta_y *= inc_y;
x = xa;
y = ya;
x_err = 0;
y_err = 0;
for (i = 0; i < N; i++) {
    get_pixel(i/2, x/2, y/2);
    x_err += delta_x;
    y_err += delta_y;
    if (x_err >= N) {
        x_err -= N;
        x += inc_x;
    }
    if (y_err >= N) {
        y_err -= N;
        y += inc_y;
    }
}
Thus, given two points on a raster, (xa, ya) and (xb, yb), which are the end points of an imaging sensor of N pixels, the purpose is to find successively the points (x, y) on the raster where each pixel is to be read. These points form the best approximation to a straight line connecting the end points at a and b. Take the differences in x and y. From the sign of the differences between a and b, determine whether x and y will be incremented or decremented as the line is traversed. Start at x = xa, y = ya, with two error registers x_err and y_err set to zero, and begin the loop. Next, read the value at (x, y) and write it to the output raster using get_pixel(). Given a linear image sensor with half the resolution of the navigation sensor, use i/2, x/2, y/2 for the pixel number in the sensor and the position in the output raster. Add delta_x and delta_y to the respective error registers, then test both error registers to see if they exceed N. If so, subtract N from them and change x and/or y by the increment. If an error register does not exceed N, continue to use the current value of x or y. The process continues until the loop has run N times.
The next step is to stitch successive image swaths within their region of overlap. This must be done in such a way as to identify and correct most of the accumulated navigation error and to mask any residual error. This "masking" can be done in areas of black print on a white background, for example, by stitching only in white space areas, i.e. areas with intensity values above a prescribed or adaptive threshold. The following paragraphs describe how redundant data from areas of overlap is identified (to be discarded) and how the navigation error is measured and corrected.
Techniques for stitching image swaths are known in the scanning art. These techniques typically require a pair of complete image swaths and produce a single, global transformation which brings the two swaths into registration. In this case, however, continuous navigation data provides the registration information needed for stitching.
Since the navigation signal tends to accumulate error, it is continually amended by feeding back a correction signal derived from analysis of feature offsets.
Some area of overlap is necessary in order to stitch two image swaths, since the navigation correction is calculated by correlating features within this area. Consider the situation portrayed in Fig. 11, where Swath #1 is being resampled by the return pass, Swath #2. At time T, a partial swath has thus far been scanned. Fig. 12 highlights this overlap area 108. As shown in Fig. 12, during collection of Swath #1, quadrilateral image segments (henceforth called "registration tiles") are periodically labeled along the lower edge of the swath with the locations of Tags 110, 112 and 114 that are described above. On a later pass (Swath #2) the "Surplus Overlap Area" 108 above the tagged areas of Swath #1 is clipped, using navigation information to determine where to clip. As each segment length in Swath #2 is acquired, the registration tile from Swath #1 is located in the top of what remains of Swath #2, after the "surplus" has been clipped. If the navigation data is perfect, there will be no offset between the location of Tag #1 and the location of that tile's rescanned image in Swath #2. More realistically, some navigation error will have accumulated since the last registration was performed. The offset between these two tiles produces a correction factor which is then used to update future navigation position-tags, associated with the data, in order to minimize the total accumulated error.
In this way the total accumulated error in the navigation data is prevented from growing so large that it introduces an obvious distortion in the region of the swath overlap.
Since both Swath #1 and Swath #2 are combined to produce a single image, a buffer is used to temporarily store a copy of an original registration tile until Swath #2 has been located in it. The entire registration tile could be used for this correlation, but in the preferred embodiment a small area of high-frequency contrast (henceforth called a ~feature~) consisting of a rectangular tile (e.g.,15x15 pixels) of grey scale image is located within the registration tile of Swath #1 and saved to buffer. When the location of this feature is crossed for a second time, the offset between the location of the saved feature and the same feature in Swath #2 produces a navigation correction signal, i.e. the translation required to bring the two features into close correspondence. While other correlation approaches could be employed, an acceptable approach is a Usum of squared difference" correlation. A small search area is defined around the original location of the feature and correlation coefficients are determined by equation:
C_{k,l} = Σ_i Σ_j (T_{i,j} − I_{i+k,j+l})²

where T_{i,j} denotes the grey scale values of the feature from Swath #1 and I_{i+k,j+l} denotes the grey scale values of the newly acquired feature from Swath #2. Indices i and j specify locations within the features, while k and l specify the magnitude of the proposed translational offset (constrained to remain within the search space). The smallest element in the resulting correlation array denotes the offset between the two features. Sub-pixel positional accuracy may be obtained using interpolation to find the minimum of this bowl-shaped result.
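The correlation search described above can be sketched in code; the function name, the search radius, and the use of NumPy are illustrative assumptions rather than anything specified in the patent:

```python
import numpy as np

def ssd_offset(feature, region, search=2):
    """Locate `feature` (a small grey scale tile saved from Swath #1) within
    `region` (the newly acquired data from Swath #2) by "sum of squared
    difference" correlation. The (k, l) pair giving the smallest C_{k,l}
    is the navigation offset between the two features."""
    h, w = feature.shape
    best_c, best_offset = None, (0, 0)
    # k and l range over the proposed translational offsets, constrained
    # to remain within the search space around the original location.
    for k in range(-search, search + 1):
        for l in range(-search, search + 1):
            window = region[search + k:search + k + h,
                            search + l:search + l + w]
            c = np.sum((feature - window) ** 2)
            if best_c is None or c < best_c:
                best_c, best_offset = c, (k, l)
    return best_offset
```

The smallest element of the correlation array gives an integer offset; sub-pixel accuracy would then come from interpolating the bowl-shaped values around that minimum.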
The feature within the registration tile is selected to maximize image variance, since this improves the accuracy of the correlation method. In one possible embodiment, only a subset of locations within the region is considered. These locations 116, 118, 120, 122 and 124 are shown in Fig. 13 as lying along the principal axes 126 and 128 of the registration tile (lines joining opposite midpoints of the lines that define the region) and are sampled at the intersection and halfway between the intersection and each endpoint of the axis. For each location 116, 118, 120, 122 and 124, the variance VAR_{k,l} is calculated using the equations:
SUM_{k,l} = Σ_i Σ_j I_{k+i,l+j}

SUM2_{k,l} = Σ_i Σ_j (I_{k+i,l+j})²

VAR_{k,l} = SUM2_{k,l}/N − (SUM_{k,l})²/N²
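A minimal sketch of this selection step, assuming the grey scale data is held as a NumPy array; the function names and the fixed 15 x 15 window are illustrative:

```python
import numpy as np

def tile_variance(image, k, l, size=15):
    """VAR_{k,l} = SUM2/N - SUM^2/N^2 for the size x size window whose
    top-left corner sits at (k, l)."""
    win = image[k:k + size, l:l + size].astype(float)
    n = win.size
    return (win ** 2).sum() / n - (win.sum() / n) ** 2

def pick_feature(image, candidates, size=15):
    """Among the candidate locations (e.g., the five points along the
    registration tile's principal axes), keep the one whose window has
    the greatest variance."""
    return max(candidates,
               key=lambda kl: tile_variance(image, kl[0], kl[1], size))
```

A flat (zero-variance) window would correlate poorly, which is why the highest-variance candidate is kept.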
In order to prevent obvious distortions in the final representative image, the error estimate is applied slowly; the "position-tags" are modified in small fixed magnitude steps as each new row of linear sensor data is loaded into memory, until there has been an accounting for the entire error.
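This gradual accounting can be sketched as follows, with the simplifying assumption that each position-tag is a single scalar coordinate; real tags carry two-dimensional positions:

```python
def apply_correction(tags, error, step=0.25):
    """Fold a navigation correction `error` into successive position-tags
    in small fixed-magnitude steps (at most `step` per new row of linear
    sensor data), so no single row is displaced abruptly."""
    applied, remaining, out = 0.0, error, []
    for tag in tags:
        if remaining != 0.0:
            delta = max(-step, min(step, remaining))  # clamp to the step size
            applied += delta
            remaining -= delta
        out.append(tag + applied)
    return out
```

After enough rows the full error has been absorbed and subsequent tags are simply shifted by the whole correction.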
In the preferred embodiment, the processing electronics for image reconstruction, stitching and image management is contained within the housing that defines the scanning device 10 of Fig. 1. Thus, the scanned image may be immediately presented at the image display 16. However, the scanning device may contain memory to store the position-tagged image data, but without processing and file management electronics and firmware.
As noted in reference to Fig. 3, the navigation and imaging sensors 22, 24 and 26 are preferably mounted on a pivoting member 20. In one embodiment, the pivoting member is connected to the remainder of the housing by at least one elastomer for which one end of the elastomer is connected to the stationary portion of the housing and the other end is connected to the pivoting member. The elastomer acts as a hinge. Thus, the pivoting portion is allowed to "float" without the use of frictional elements. Power, control and data signals may be conducted to the sensors via flex cables that are shielded in order to minimize electromagnetic interference. Other methods of pivotally attaching the pivoting member can be used. If the pivoting member is deleted and the sensors are in a fixed position on the housing, care must be taken not to tilt the scanning device 10 excessively during image capture. In this embodiment, the design of illumination and optical elements must be given increased attention.
While the invention has been described and illustrated as one in which a planar original is scanned, this is not critical. In fact, persons skilled in the art will readily understand how many of the techniques may be used for scanning three-dimensional images. However, the preferred embodiment is one in which the image of interest is formed on a medium, such as a piece of paper, a transparency, or a photograph, and the scanning device is in contact with the medium.
Reference Numerals

HP Docket No. 1094285-1
Freehand Image Scanning Device and Method

10 scanning device
12 meandering path
14 original
16 image display
18 forward end
20 pivoting member
22 imaging sensor
24 navigation sensor
26 navigation sensor
28 LED
30 angle of incidence
32 lens
34 holographic diffuser
35 source
36 optical elements
37 beam splitter
38 optical elements
39 element
40 optical elements
42 target region
44 original
46 block
48 curved path
50 distorted captured image
52 second position
54 captured image
56 acquire reference frame
58 acquire sample frame
60 correlating
62 reference frame
64 T-shaped feature
66 sample frame
68 frame
70 representation
72 array of position frames
74 microstep determination
76 shift reference frame
78 step
80 navigation processor
82 pixel amp
84 A/D converter
86 output
88 increment
90 position coordinate cells
92 position coordinate cells
94 position coordinate cells
96 position coordinate cells
108 overlap area
110 tags
112 tags
114 tags
116 location
118 location
120 location
122 location
124 location
126 axes
128 axes
Description

Freehand Image Scanning Device and Method

Technical Field

The present invention relates generally to devices and methods for forming scanned electronic images of originals and more particularly to scanning devices and methods that accommodate imprecise movements during image capture.
Background Art

Scanners for electronically forming an image of an original are known.
Typically, the captured image provided by a scanner is a pixel data array that is stored in memory in a digital format. A distortion-free image requires a faithful mapping of the original image to the pixel data array. Scanners typically include at least one means for imposing a mechanical constraint during the image capture process in order to maximize the likelihood of faithful mapping.
The four types of scanners known in the art are drum scanners, flatbed scanners, two-dimensional array scanners and hand scanners. Drum scanners attach the original to the surface of a cylindrical drum that rotates at a substantially fixed velocity.
During the rotation of the drum, an image sensor is moved in a direction parallel to the rotational axis of the drum. The combination of the linear displacement of the image sensor and the rotation of the original on the drum allows the entire original to be scanned.
At any moment during the imaging process, the current position within the pixel data array relative to the original can be determined by measuring the angular position of the drum and the translational position of the sensor. The position of the pixel data array with respect to the original is fixed as long as the original is properly attached to the drum, the drum rotation is properly controlled, and the sensor is properly controlled in its displacement along the linear path.
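That determination amounts to a simple mapping from the two measured quantities onto the unwrapped original; a toy illustration of the background-art geometry (names and units are assumptions):

```python
import math

def drum_position(theta_rad, carriage_mm, drum_radius_mm):
    """Map the drum's angular position and the sensor's translational
    position to an (x, y) point on the unwrapped original: x is the arc
    length swept by the drum, y the travel along the rotational axis."""
    return drum_radius_mm * theta_rad, carriage_mm
```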
Flatbed scanners include a linear array sensor that is moved relative to the original along an axis that is perpendicular to the axis of the array. Thus, the position of the sensor in one dimension may be known by tracking the relative movement of the sensor. The position of the sensor in the perpendicular direction is implicitly fixed by addressing a particular array element at which intensity is to be measured. In one embodiment of the flatbed scanner, the original is placed on a transparent platen and the sensor, along with an image illumination source, is placed on a side of the platen opposite to the original. As long as the original is not moved relative to the platen, the pixel data array will be fixed with respect to the image to be captured. In another embodiment, the original is moved, rather than the sensor. This second embodiment is typical of facsimile machines. Precision paper transports provide a high degree of positional accuracy during the image-capture process.
Advantages of the drum and flatbed scanners include the ability to accommodate documents at least as large as A4, or 8.5" x 11" paper. Moreover, some of these scanners can handle A1 paper in a single setup. However, the scanners are not generally portable, since they require a host computer for control, data storage and image manipulation.
Two-dimensional array scanners may be used in the absence of mechanical encoding constraints, and require only that the array and the original be held motionless during an exposure period. A two-dimensional array of photosensitive elements directly accomplishes the mapping of the image of the original into a pixel data array. However, because a single 300 dpi mapping of an 8.5" x 11" original requires an image sensor having an array of 2500 x 3300 elements, i.e. 8.25 million pixels, these scanners are cost-prohibitive in most applications.
Conventional hand scanners require a user to move a linear array of electrooptical sensor elements over an original. The movement is by hand manipulation.
Array-position information is determined using methods such as those employed in operation of a computer "mouse." As a linear sensor array is moved, the rotation of wheels, balls or rollers that are in contact with the original is sensed, and the position information is determined from the mechanical details of the rotation. In general, the surface of the mechanical element in contact with the original has a high coefficient of friction, e.g. rubber, so as to resist slip and skid. A cylindrical roller or two wheels connected by a rigid axle may be used to enforce a single translational degree of freedom during the scanning process. A straight-edge or other fixture is often used to fix the scan direction with respect to the original and to further enforce the translational constraint provided by the pair of wheels or the roller. Nevertheless, the position encoder approach is one that is often susceptible to slips and skips, so that the pixel data array loses its correspondence with the image on the original.
Hand scanners are typically connected directly to a personal computer for image data storage, processing, and use. Data rates from the image sensor tend to limit the scanning speed. The scanners provide feedback to the user, typically by means of green or red light emitting diodes, to maintain the appropriate speed for the desired image resolution. Some hand scanners use electromagnetic brakes to prevent the user from dragging the scanner over the image too rapidly, with the mechanical resistance increasing with increases in scanning speed.
Hand scanners utilize relatively small imaging arrays and generally cannot handle larger than A6 documents in a single pass. This requires stitching algorithms to join together multiple swaths of a larger document. Swath stitching is done in a separate operation by the personal computer. Scanning a multi-page business document or report with a hand scanner is a tedious process that often yields low-quality results.
As previously noted, some type of fixture is typically used with a hand scanner. In the absence of a fixture, there is a tendency to impose some rotation as the hand scanner is moved across an original. If the user's elbow is resting on a flat surface during movement of the scanner, the rotation is likely to have a radius defined by the distance between the scanner and the user's elbow. As a consequence, the scanned electronic image will be distorted. Other curvilinear movements during a swath of the scanner will also create distortions.
What is needed is a scanning device that accommodates curvilinear movement during a scanning process, with accommodation being achieved in a low-cost manner and with a high degree of correspondence between the original image and the resulting image.
Summary of the Invention

A scanning device and method for forming a scanned electronic image include using navigation information that is acquired along with image data, and then rectifying the image data based upon the navigation and image information. In the preferred embodiment, the navigation information is acquired by means of at least one navigation sensor that detects inherent structure-related properties of an original being scanned. Movement of an image sensor along the original may be tracked by monitoring variations of the inherent structure-related properties as the image sensor is moved relative to the original. Preferably, the inherent structure-related properties that are monitored are inherent structural features, such as paper fibers, or other constituents of the original. Navigation may also be speckle-based, wherein movement of the image sensor along the original is tracked by monitoring variations of speckle patterns produced using coherent illumination for acquiring the navigation information.
"Inherent structure-related properties" are defined herein as properties of the original that are attributable to factors that are independent of forming image data and/or of systematic registration data on the original. The navigation information may be formed by generating a position signal that is responsive to detection of inherent structure-related properties, such as a position signal of speckle information or a position signal that permits tracking of individual inherent structural features. "Inherent structural features" are defined herein as those features of an original that are characteristic of processes of forming the original and are independent of forming image data and/or systematic registration data on the original. For example, if the original recorded media is a paper product, the inherent structural features of interest may be paper fibers. As another example, navigation of the image sensor across a glossy original or an overhead transparency film may be determined by tracking surface texture variations that affect specular fields. Typically, the inherent structural features are microscopic, e.g. between 10 and 40 µm, features of surface texture.
Thus, the contemplated approaches to acquiring navigation information vary in scope. In the broadest approach, there is no limitation to the sources of navigation information that is to be used to remove distortion artifacts of curvilinear and rotational movement of the scanning device along a scan path. The navigation signal may therefore be in the form of a position signal that is responsive to detection of image data on the original (e.g., identification of edges of text characters), with the position signal then being used in the manipulation of an image signal. A narrower second approach is one in which a position signal is responsive to detection of inherent-structure related properties, such as the properties that determine speckle patterns. The third approach is to track navigation of the scanning device by monitoring the positions of individual inherent structural features (e.g., paper fibers) over time. This third approach is the narrowest of the three, since it is actually a subcategory of the second approach.
In the preferred embodiment, the image sensor is a linear array of electrooptical elements, while the navigation approach utilizes at least one two-dimensional array of navigation sensor elements. By placing a separate two-dimensional navigation array at each end of the image sensor, the scanner is afforded three degrees of freedom of movement. If the original is planar, two of the degrees of freedom are translational and are perpendicular to each other within the plane of the original, while the third degree of freedom is rotational about the normal to the plane of the original. The accuracy of rotation tracking is enhanced by the use of two navigation arrays, with each array having a smaller array extent than would be necessary if only a single navigation array were used. While the preferred embodiment is one in which a navigation sensor is a two-dimensional array, linear arrays may also be used. Moreover, as will be described more fully below, navigation information for rectifying image data could feasibly be acquired by fixing other position-tracking means to the scanning device, including encoding wheels and balls, computer mice track balls, registration grid-detectors, accelerometers, mechanical linkages, non-contacting electromagnetic and electrostatic linkages and time-delay integration sensor arrays. In many of these alternative embodiments, navigation information for rectifying the image data is acquired in manners independent of any inherent structure-related properties of the original, since position tracking does not include image acquisition.
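How two endpoint arrays yield the third, rotational degree of freedom can be sketched with a small-angle pose update; this is an illustrative reconstruction under assumed names, not the patent's disclosed processing:

```python
import math

def pose_increment(d_left, d_right, baseline):
    """Frame-to-frame motion from the (dx, dy) displacements reported by
    the navigation arrays at the two ends of the imaging sensor, which are
    separated by `baseline`. Translation is the mean displacement; rotation
    about the normal to the original follows from the differential motion
    of the two ends."""
    tx = (d_left[0] + d_right[0]) / 2.0
    ty = (d_left[1] + d_right[1]) / 2.0
    dtheta = math.atan2(d_right[1] - d_left[1], baseline)
    return tx, ty, dtheta
```

Because rotation is recovered from the difference of two measurements across the full baseline, each array can remain small while the angular resolution stays high.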
The navigation sensors are in a known position relative to the image sensor.
Preferably, the navigation sensors are as close to the end points of the imaging sensor as possible, so that the navigation sensors are less susceptible to traveling beyond the edge of an original as the image array is moved. The image sensor forms a signal that is representative of an image of interest. Simultaneously, each navigation sensor forms a signal representative of the inherent structure-related properties of the original. The scanning device may be moved in a freehand meandering pattern, such as one of alternating left-to-right and right-to-left movements with descent along the original, with the device remaining in contact with the original. Each one of the side-to-side swaths should overlap a portion of the previous swath, so that the image may be manipulated with respect to position and stitched either during or following the scanning process. The manipulation of the image signal is a rectification of image data, with the rectification being based upon the relative movement between the navigation sensor or sensors and the inherent structure-related properties detected by the navigation sensors. The manipulation is a "rectification" of the image signal, i.e., an operation of arranging and modifying acquired image data based upon navigation data in order to achieve conformance between original and output images. The stitching is used to connect image data acquired during successive swaths.
Preferably, each navigation sensor includes one or more light sources designed to provide contrast dependent upon the inherent structure-related properties of the original. Emitted light may be in the visible range, but this is not essential. For example, "grazing" light that has large angles of incidence relative to the surface normal will interact with paper fibers at or near the surface of an original that is a paper product, creating contrast-enhancing shadows among the fibers. On the other hand, if the original has a glossy surface, such as a photographic print, a clay-coated paper or an overhead transparency film, normally incident light will produce an image in the specular field that has image-contrast features sufficient for purposes of navigation. Optical elements such as filters and one or more imaging lenses further improve detection of inherent structure-related properties.
An advantage of the invention is that the scanning device and method allow three degrees of freedom of movement of the scanning device while still affording quality image capture. Thus, a portable, pocket-sized scanning device may be manufactured and used in the absence of mechanical constraints, other than that afforded by contact with the surface of the original throughout the image capture process. In fact, for embodiments in which image rectification is provided by correlation of navigation images, the device-to-original contact constraint may be eliminated. Another advantage is that because the scanning device of the preferred embodiment forms an electronic image based upon detection of inherent structural features, large areas of "whitespace" between image features of the original will be preserved and therefore not result in the image features being moved closer together during a stitching step.
Brief Description of the Drawings

Fig. 1 is a perspective view of a hand-held scanning device following a meandering path on an original in accordance with the invention.
Fig. 2 is a rearward view of imaging and navigation sensors of the scanning device of Fig. 1.
Fig. 3 is a perspective view of the scanning device of Fig. 1, shown with the imaging and navigation sensors exposed.
Fig. 4 is a schematical side view of an illumination system for one of the navigation sensors of Fig. 3.
Fig. 5 is a side schematical view of a light emitting diode and optical elements for providing the illumination described with reference to Fig. 4.
Fig. 6 is a conceptual view of the image capture operation of the scanning device of Fig. 1.
Fig. 7 is an operational view of one embodiment of the navigation processing of the scanning device of Fig. 1.
Fig. 8 is a schematical view of selected steps of Fig. 7.
Fig. 9 is a block diagram of the components for carrying out the steps of Fig. 8.
Fig. 10 is a representation of a position-tagged data stream typical of that output from Fig. 9.

Figs. 11 and 12 are representations of swaths by the scanning device of Fig. 1.
Fig. 13 is a representation of a registration tile that may be utilized to achieve stitching of successive swaths.
Best Mode for Carrying Out the Invention

With reference to Fig. 1, a portable, hand-held scanning device 10 is shown as having followed a meandering path 12 along an original 14. In the preferred embodiment, the original is a piece of paper, an overhead transparency film, or any other image-bearing surface upon which inherent structure-related properties of the original generate sufficient contrast to provide position information during navigation along the meandering path. Typically, the positions of inherent structural features are tracked and the position information is used to rectify image data, but other embodiments will be described. The scanning device is preferably self-contained and battery powered, but may include a connection to an external source of power or to data ports of computers or networks.
The scanning device 10 of Fig. 1 includes an image display 16. The display may provide almost immediate viewing of a captured image. However, a display is not essential to the use of the scanning device.
The scanning device 10 allows three degrees of freedom, with two being in translation and one in rotation. The first degree is the side-to-side movement (X axis movement) along the original 14. The second degree of freedom is movement upwardly and downwardly along the original (Y axis movement). The third degree of freedom is the ability to operate the device with rotational misalignment of a linear array of image sensor elements relative to an edge of the original 14 (θ axis movement). That is, the linear array of imaging elements may have an angle of attack that is not perpendicular to the direction of device translation.
Referring now to Figs. 1-3, the forward side 18 of the scanning device 10 includes a pivoting member 20 that aids in maintaining proper contact between the original 14 and an imaging sensor 22. Navigation sensors 24 and 26 are located at the opposed ends of the imaging sensor. Because the navigation sensors are mounted on the pivoting member, the navigation sensors are in a fixed location relative to the imaging sensor.
For reasons of physical compactness, the imaging sensor array 22 is preferably a contact image device, but for applications in which compactness is less of a concern or a smaller image is desired, sensors employing projection optics may be employed, with magnification less than unity. In such applications, the elements of the imaging sensor 22 should be smaller and more closely packed together. Contact imaging devices typically employ lenses sold under the trademark SELFOC, which is a federally registered mark of Nippon Sheet Glass Company Limited. Less conventionally, contact imaging can be obtained using interleaved array elements of sources and proximal sensors, without any imaging lenses. Conventional imaging sensors for scanning applications may be used. The imaging sensor may be part of a unit that also includes an illumination source, illumination optics, and image transfer optics.
The imaging sensor is shown as a linear array of discrete optically sensitive elements. The spacing of the elements plays a role in determining the spatial resolution of the scanner 10. For example, a linear array having a length of 101.6 mm requires 1200 sensor elements to achieve a resolution of 300 dpi. The sensor may be a charge-coupled device, an amorphous silicon photodiode array, or any other type of linear array sensor known in the art.
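The 1200-element figure follows directly from the arithmetic (101.6 mm is 4 inches, and 4 inches x 300 dpi = 1200); a one-line helper with hypothetical naming:

```python
def elements_for_resolution(length_mm, dpi):
    """Linear-array element count needed for a given array length and
    spatial resolution (25.4 mm per inch)."""
    return round(length_mm / 25.4 * dpi)
```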
A key consideration in the design of the imaging sensor unit is speed. The imaging sensor 22 preferably is able to image each pixel at approximately 10K samples per second. Linear imaging arrays generally produce a serial data stream, wherein pixel values, i.e. charges, are placed into a shift register and then shifted out. Achieving the desired speed requires either very fast serial transfer rates out of the entire image array or multiple taps, so that pixel values can be shifted through fewer cells. This introduces parallelism, which is advantageous for digital processing.
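Reading the 10K samples-per-second figure as a 10 kHz line rate (an interpretation), the trade-off works out numerically: a single tap on a 1200-element array must shift pixels at 12 MHz, while four taps need only 3 MHz each. A hypothetical helper:

```python
def per_tap_rate(n_elements, line_rate_hz, n_taps):
    """Shift rate each tap must sustain: every line period, each tap
    clocks out only its share of the array's pixel values."""
    return n_elements * line_rate_hz / n_taps
```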
Another consequence of the speed requirement is that the products of pixel areas, at the surface of the original, and their solid angles of emitted light collected and conveyed to each array element should be sufficiently large to produce a detectable signal in integration times on the order of 100 microseconds. An enhancement option is to add an optical element to the sensor to increase the effective fraction of sensor pitch for which each sensing element is responsive. Since there is typically unused area in the array matrix, such light collection optics increase sensitivity.
A straightforward modification of the imaging sensor 22 enables sensing of color images. Three linear arrays parallel to one another, each with at least one embedded filter element that selectively passes red, green and blue components of the incident light respectively, would allow color imaging. Alternatively, a single array having broad-band sensitivity may be sequentially illuminated by red, green and blue light sources.
Regarding illumination for improving the operation of the imaging sensor 22, a linear array of high intensity light emitting diodes at the amber wavelength may be used.
However, the selection of the preferred illumination source and any optical elements is dependent upon the medium of the original. The wavelength of the light is selected to maximize the contrast image data acquired during the scan of a given area of the original 14, while disregarding unwanted signals. Illumination optics may consist of LED dome lenses or may include a light pipe consisting of a precision-molded optical element that channels the illumination onto the original with a minimal amount of light loss. Such a design can afford a relatively uniform illumination of a target area of the original at a wide range of angles, but blocks normal incident rays in order to avoid specular surface reflections.
In Fig. 1, the meandering path 12 is shown as having four and a fraction swaths, i.e. side-to-side passes across the original 14. A useful imaging sensor 22 for most opportunistic applications has a length within the range of 25.4 mm and 101.6 mm.
If the sensor 22 has a length of 63.5 mm, an A4 paper can be scanned in four or five swaths. As will be explained more fully below, the swaths should include regions of overlap, so that a stitching process can be used to produce a faithful reproduction of the original image.
The scanning device 10 typically includes at least one navigation sensor 24 or 26. In the preferred embodiment, the device includes a pair of navigation sensors, with the sensors being at opposite ends of the imaging sensor 22. While a one-dimensional array of optoelectronic elements may be used, in the preferred embodiment, each navigation sensor is a two-dimensional array of elements. The navigation sensors 24 and 26 are used to track movement of the scanning device 10 relative to the original 14.
In the preferred embodiment, each navigation sensor 24 and 26 captures images related to inherent structure-related properties of the original in order to produce information related to the position of the scanning device 10. For most prior art scanning devices, inherent structural features are considered to be noise. For the scanning device 10 of Figs. 1-3, such features are noise with regard to the imaging sensor 22, but may be used to provide a basis for the navigation sensors 24 and 26 to generate position information. Useful, high-contrast images of surface texture can be generated by detecting structural variations that are inherent to the medium or are formed on the medium, e.g., text. For example, images may be formed based upon the contrast between shadows in valleys and bright spots at the peaks of inherent structural features. Such features are typically microscopic in nature, often ranging between 10 µm and 40 µm in size on common printed media. As an alternative, speckle may be used, since specular reflection of a coherent beam produces a contrast pattern of bright and dark areas. A third source of contrast information is color. Color contrast is independent of surface texture. Even when illuminating the texture-free surface with light in the visible range, color contrast exists between regions of different colors, e.g., between different shades of gray.
However, it is contemplated to use the invention for applications in which navigation information is independent of inherent structure-related properties of the original. For example, one or both of the navigation sensors 24 and 26 of Fig. 2 may be used to form successive images of print on the original, with correlation of the successive images being used to determine the position and the orientation of the image sensor 22 along the original 14. In this embodiment, all three sensors 22, 24 and 26 image text on the original, but only the signal from the sensor 22 is used to acquire image data. The signals from the navigation sensors 24 and 26 are used to acquire image-based navigation information.
Non-imaging approaches can also be used to acquire and process X, Y and theta position information. Unfortunately, many of the alternative means impose various limitations upon compactness, convenience of use, speed, freedom of motion, power consumption, accuracy, precision, and/or cost. One imaging-independent alternative for acquiring position information is to provide one or more encoding wheels in place of the navigation sensors. The encoding wheels may then roll without slip upon the scanned surface, enabling the scanning device to travel along straight or curvilinear trajectories. It is not critical that the encoding wheels be on a common axis. The wheels may be mounted to swivel. Encoders coupled to monitor rotations would provide the input data from which to calculate position and orientation of an imaging sensor relative to a starting position and orientation.
Another image-free approach to acquiring navigation information is to use track balls similar to those for a computer mouse. A track ball could be used in place of each encoder wheel described above. Encoders would be used to obtain two-dimensional displacement information from each track ball. In another approach, optical or electronic (capacitive, resistive or inductive) sensors may be used in place of the navigation sensors of Fig. 2 in order to sense position and orientation relative to a cooperative (active or passive) grid or other reference constructed in an underlying tablet that, in turn, serves as a support for the original being scanned.
Another image-free approach to acquiring position and orientation information is to provide an accelerometer. An on-board inertial navigation platform may be used, with accelerations being sensed and integrated either once to obtain velocities or twice to obtain positions. Or velocities of spring-suspended masses could be sensed and integrated once in order to obtain positions. Gyroscopes could be employed in a direct sensing of orientation.
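The accelerometer alternative reduces to numerical integration; a rectangular-rule sketch for a single axis (illustrative only; a real inertial platform would also compensate for bias and orientation):

```python
def positions_from_accels(accels, dt):
    """Integrate sampled accelerations once to get velocities and a second
    time to get positions, as in the on-board inertial navigation
    alternative."""
    v, x, xs = 0.0, 0.0, []
    for a in accels:
        v += a * dt   # first integration: velocity
        x += v * dt   # second integration: position
        xs.append(x)
    return xs
```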
Yet another alternative approach would be to use any of a variety of mechanical linkages with which to track position and orientation relative to reference coordinates fixed with respect to the medium being scanned. Position and orientation information could be obtained by means of sensors coupled to measure the relative movement of the mechanical members. These sensors could be of either the relative or absolute type and could be based on direct position and orientation sensing, or the sensing of accelerations or velocities that would then be integrated with respect to time, once or twice, to obtain positions. Non-contacting remote sensing could also be used to measure position and orientation of the scanning device relative to reference coordinates fixed with respect to the scanned original. Examples of such non-contacting sensing would include those that use electro-magnetic fields, waves or beams (e.g. at optical or radio frequencies); electric effects (e.g. capacitive); and/or magnetic effects (e.g. inductive).
These approaches could utilize standard or differential Global Positioning technologies and potentially could use satellites. These approaches can also include traditional navigation/surveying methods, such as triangulations. They could also include techniques employed in robotics technologies, such as using shaped light beams and interpreting position from images of where these beams intercept the moving object.
The navigation sensors 24 and 26 of Fig. 2 effectively observe a moving image of the original 14 and produce an indication of the displacement in two planar dimensions between successive observations. As will be explained more fully below, pixel values from the navigation sensors are operated upon by processing elements to determine proper mapping of image data from the imaging sensor 22. The processing elements operate on a particular pixel and its nearest neighbors to produce an array of correlation values at each pixel location. The correlation values are based upon comparisons between a current image of the surface structure and a stored image representing a known position of inherent structural features, wherein the stored image serves as a position reference. However, operations other than the correlation process may be employed in manipulating the input image data to form the output image.
Referring now to Figs. 4 and 5, navigation sensor 24 is shown as being operatively associated with illumination optics. If the original 14 is a paper product for which paper fibers are to be detected by the navigation sensor 24, the introduction of light at a grazing angle of incidence is preferred. While not essential, one or more light emitting diodes (LEDs) 28 may be used. The grazing angle 30, which is the complement of the angle of incidence, is preferably in the range of zero to fifteen degrees, but this may change depending upon the properties of the original 14. In Fig. 5, the source 28 is shown with illumination optics 34. The optics may comprise a single element or a combination of lenses, filters and/or holographic elements to accomplish suitable collimated and generally uniform illumination of the target surface. The wavelength of the
light emitted by the source 28 should be selected to enhance the spatial frequency information available for navigation. Fixed pattern noise in the illumination field should be minimized. The output of source 28 may require adjustment to accommodate wide dynamic ranges of reflectivity of the medium as the scanning device proceeds over printed materials with absorbing or reflecting inks or other marking agents.
In Fig. 4, light from a source 35 is collimated at illumination optics 36 and then redirected by an amplitude splitting beam-splitter 37. That portion of the light energy from the LED that is transmitted directly through the beam-splitter is not shown in Fig. 4.
The light energy from the beam-splitter illuminates the original 14 along the normal to the surface.
Also represented in Fig. 4 is the portion of the light energy that is reflected or scattered from the original 14 and passed through the beam-splitter 37 for aperturing and filtering at element 38 and focusing to an image at element 39. The portion of light energy passing from the original to the beam-splitter and reflecting from the beam-splitter is not shown. The magnification of navigation imaging optics should be constant over the field-of-view of the two-dimensional sensor array 24 which detects the focused light. In many applications, the modulation transfer functions, i.e. the amplitude measure of optical frequency response, of the navigation optics must be such as to provide attenuation before the Nyquist frequency that is determined by the pitch of the sensor elements of the navigation sensor and by the magnification of the optical elements. The optical elements should also be designed to prevent background illumination from creating noise. Note that a wavefront splitting beam-splitter could also be used.
The selection of the angle of incidence depends upon the material properties of the original. Grazing angles of illumination generate longer shadows and more apparent contrast, or AC signal if the surface of the original is not glossy. The DC signal level, however, increases as the illumination angle approaches the normal to the original.
Illuminating the target region of the original 14 at a grazing angle 30 works well for applications in which the surface of the original has a high degree of unevenness at the microscopic level. For example, the introduction of light from the source 28 at a grazing angle provides a high signal-to-noise ratio of data related to inherent structural features when the original is stationery, cardboard, fabric, or human skin. On the other hand, the use of incoherent light at a normal angle of incidence may be preferred in applications in which position data is needed to track scanner movement along such originals as photographs, glossy magazine pages, and overhead transparency films. With normal illumination, using incoherent light, viewing the original in the specularly reflected field will provide an image that is sufficiently rich in texture content to allow image and correlation-based navigation. The surface of the original has a microscopic relief such that the surface reflects light as if the surface were a mosaic of tiles, or facets. Many of the "tiles" of an original reflect light in directions slightly perturbed from the normal. A field of view that includes the scattered light and the specularly reflected light can thus be modeled as though the surface were composed of many such tiles, each tilted somewhat differently with respect to the normal. This modeling is similar to that of W.W. Barkas in an article entitled "Analysis of Light Scattered from a Surface of Low Gloss into Its Specular and Diffuse Components," in Proc. Phys. Soc., Vol. 51, pages 274-292 (1939).
Fig. 4 shows illumination by a source 35 of incoherent light, which is directed along the normal of the surface of the original 14. Fig. 5 describes illumination at a grazing angle 30. In a third embodiment, no illumination is provided. Instead, the navigation information is accumulated using background light, i.e. light from the environment.
In a fourth embodiment, coherent illumination is introduced at normal incidence to permit speckle-based navigation. Relative motion between a scanning device and an original may be tracked by monitoring motion of speckle relative to the navigation sensors. If coherent illumination is used without using imaging optics, then by selecting a small area of illumination and by having a relatively large separation between the surface of the original and the photodetector array of the navigation sensor 24, the resulting predominant speckle cell sizes with coherent illumination are sufficiently large to satisfy the Nyquist sampling criterion. The use of a beam splitter allows the direction of both the incident illumination and the detected scatter to be near to normal to the surface of the original, as similarly accomplished in Fig. 4.
Referring now to Fig. 6, the scanner 10 is shown as being moved across an original 44 having a block 46 imprinted onto a surface of the original. Because the scanner 10 is not subjected to any kinematic constraints in the plane of the original, there is a tendency for a user to follow a curvilinear path across the original, as when the hand and forearm of the user rotate about the elbow. In Fig. 6, the scanning device is shown as following a curved path 48 across the block 46. If the lower edge of the scanning device is the edge that is closer to the elbow that defines the axis of rotation, the lower edge will have a shorter radius. Consequently, imaging elements of an imaging sensor will vary with respect to the time and distance required to pass over the block 46. A distorted image 50 of the block is captured as the device is moved to the second position 52, shown in dashed lines.
The captured image 50 would be the stored image in the absence of processing to be described below. However, as the imaging sensor captures data related to the block 46, navigation information is acquired. In the preferred embodiment, one or more navigation sensors capture data related to inherent structural features of the original 44. Movement of the inherent structural features relative to the scanning device 10 is tracked in order to determine displacement of the imaging sensor relative to the block 46.
A faithful captured image 54 may then be formed. The image 54 is defined herein as the "rectified" image.
In Fig. 7, one embodiment of navigation processing is shown. The navigation processing is performed by correlating successive frames of navigation information, such as data related to inherent structural features. The correlations compare the positions of the inherent structural features in successive frames to provide information related to the position of a navigation sensor at a particular time. The navigation information is then used to rectify image data. The processing of Fig. 7 is typically performed for each navigation sensor.
In a first step 56, a reference frame is acquired. In effect, the reference frame is a start position. The position of a navigation sensor at a later time may be determined by acquiring 58 a sample frame of position data from the navigation sensor at the later time and then computing correlations 60 between the reference frame and the later-acquired sample frame.
Acquiring the initial reference frame 56 may take place upon initiation of the imaging process. For example, the acquisition may be triggered by mere placement of the scanning device into contact with the original. Alternatively, the scanning device may include a start button that initiates the image process and the navigation process. Initiation may also take place by a periodic pulsing of the illumination system of each navigator. If there is a reflected signal that exceeds a prescribed threshold of reflection or a correlation signal that indicates motion, the reference frame is then acquired.
While the navigation processing is performed computationally, the concepts of this embodiment may be described with reference to Figs. 7 and 8. A reference frame 62 is shown as having an image of a T-shaped inherent structural feature 64. The size of the reference frame depends upon factors such as the maximum scanning speed of the scanning device, the dominant spatial frequencies in the imaging of the structural features, and the image resolution of the sensor. A practical size of the reference frame for a navigation sensor that is thirty-two pixels (N) by sixty-four pixels (M) is 24 x 56 pixels.
At a later time (dt) a navigation sensor acquires a sample frame 66 which is displaced with respect to frame 62, but which shows substantially the same inherent structural features. The duration dt is preferably set such that the relative displacement of the T-shaped feature 64 is less than one pixel of the navigation sensor at the velocity of translation of the scanning device. An acceptable time period is 50 µs for velocities of 0.45 meters/sec at 600 dpi. This relative displacement is referred to herein as a "microstep."
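The microstep condition above can be checked with a little arithmetic (the two helper functions are our illustrative additions, not part of the patent): at 600 dpi one navigation pixel spans 25.4 mm / 600 ≈ 42.3 µm, so a 50 µs sampling period at 0.45 m/s displaces the device only 22.5 µm, safely under one pixel.

```c
/* Sketch of the sub-pixel timing check implied by the text; helper
   names are our assumptions. */

double pixel_pitch_um(double dpi)
{
    return 25400.0 / dpi;                    /* one inch is 25400 micrometres */
}

double displacement_per_sample_um(double velocity_m_s, double period_s)
{
    return velocity_m_s * period_s * 1e6;    /* metres to micrometres */
}
```

With the figures from the text, displacement_per_sample_um(0.45, 50e-6) gives 22.5 µm against a 42.3 µm pixel pitch, so successive frames differ by well under one pixel.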
If the scanning device has moved during the time period between acquiring 56 the reference frame 62 and acquiring 58 the sample frame 66, the first and second images of the T-shaped feature will be ones in which the feature has shifted. While the preferred embodiment is one in which dt is less than the time that allows a full-pixel movement, the schematic representation of Fig. 8 is one in which the feature 64 is allowed to shift up and to the right by one pixel. The full-pixel shift is assumed only to simplify the representation.
Element 70 in Fig. 8 represents a sequential shifting of the pixel values of frame 68 into the eight nearest-neighbor pixels. That is, step "0" does not include a shift, step "1" is a diagonal shift upward and to the left, step "2" is an upward shift, etc. In this manner, the pixel-shifted frames can be combined with the sample frame 66 to produce the array 72 of position frames. The position frame designated as "Position 0" does not include a shift, so that the result is merely a combination of frames 66 and 68. "Position 3" has the minimum number of shaded pixels, and therefore is the frame with the highest correlation. Based upon the correlation results, the position of the T-shaped feature 64 in the sample frame 66 is determined to be a diagonal rightward and upward shift relative to the position of the same feature in the earlier-acquired reference frame 62, which implies that the scanning device has moved leftwardly and downwardly during time dt.
While other correlation approaches may be employed, an acceptable approach is a "sum of the squared differences" correlation. For the embodiment of Fig. 8, there are nine correlation coefficients (Ck = C0, C1 ... C8) formed from the nine offsets at element 70, with the correlation coefficients being determined by the equation:
Ck = Σi Σj (Sij - R(ij)+k)²

where Sij denotes the navigation sensor-measured value at the position ij of the sample frame 66 and R(ij)+k denotes the navigation sensor-measured value of the frame 68 as shifted at the element 70 in the k direction, with k being the identifier of the shift at element 70.
In Fig. 8, k=3 provides the correlation coefficient with the lowest value.
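As a concrete sketch of this nine-offset correlation (frame size, the shift ordering and all names are our assumptions; the patent fixes none of them), the following evaluates Ck for each of the nine shifts and returns the k with the smallest, i.e. best, correlation value:

```c
/* Nine-offset "sum of the squared differences" correlation sketch.
   ssd_correlation() evaluates Ck for one shift k; best_shift() returns
   the shift with the lowest correlation value. */

#define FRAME_H 8
#define FRAME_W 8

/* Shift k = 0 is "no shift"; k = 1..8 are the eight nearest-neighbour
   shifts (an assumed ordering for element 70 of Fig. 8). */
static const int shift_dx[9] = { 0, -1, 0, 1, -1, 1, -1, 0, 1 };
static const int shift_dy[9] = { 0, -1, -1, -1, 0, 0, 1, 1, 1 };

long ssd_correlation(unsigned char S[FRAME_H][FRAME_W],
                     unsigned char R[FRAME_H][FRAME_W], int k)
{
    long c = 0;
    int i, j;
    for (i = 0; i < FRAME_H; i++)
        for (j = 0; j < FRAME_W; j++) {
            int ri = i + shift_dy[k];        /* reference index after shift k */
            int rj = j + shift_dx[k];
            if (ri < 0 || ri >= FRAME_H || rj < 0 || rj >= FRAME_W)
                continue;                    /* pixels shifted off-frame are skipped */
            long d = (long)S[i][j] - (long)R[ri][rj];
            c += d * d;
        }
    return c;
}

int best_shift(unsigned char S[FRAME_H][FRAME_W],
               unsigned char R[FRAME_H][FRAME_W])
{
    int k, best = 0;
    long cmin = ssd_correlation(S, R, 0);
    for (k = 1; k < 9; k++) {
        long c = ssd_correlation(S, R, k);
        if (c < cmin) { cmin = c; best = k; }
    }
    return best;
}
```

With a feature displaced by one pixel between the two frames, best_shift() picks the shift that brings them back into registration, mirroring the "Position" search of Fig. 8.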
Correlations are used to find the locations of identical features in successive frames in order to determine the displacements of the features from frame-to-frame.
Summing or integrating these displacements and correcting for scale factors introduced through the design of the relevant optics determine the displacements of the imaging sensor as a scanning procedure progresses.
As previously noted, the frame-to-frame correlations are referred to as "microsteps," since frame rates are chosen to be sufficiently high to ensure that the displacements do not exceed the dimension of a single pixel. Oversampling can provide sub-pixel displacement precision. Referring to Fig. 7, a determination 74 of whether a microstep is to be taken is made following each computation 60 of the correlations. If a microstep is required, the reference frame is shifted at 76. In this step, the sample frame 66 of Fig. 8 becomes the reference frame and a new sample frame is acquired. The correlation computation is then repeated.
While the process provides a high degree of correlation match, any errors that do occur will accumulate with each successive shift 76 of a sample frame 66 to the reference frame designation. In order to place a restriction on the growth rate of this "random walk" error, a sample frame is stored in a separate buffer memory. This separately stored sample frame becomes a new reference frame for a subsequent series of correlation computations. The latter correlation is referred to as a "macrostep."
By using macrosteps, a more precise determination of scanner displacement across a distance of m image frame displacements, i.e. m microsteps, can be obtained.
The error in one macrostep is a result of a single correlation calculation, whereas the equivalent error of m microsteps is m^1/2 times the error in a single microstep. Although the average of errors in m microsteps approaches zero as m increases, the standard deviation in the accumulated error grows as m^1/2. Thus, it is advantageous to reduce the standard deviation of accumulated error by using macrosteps having m as large as practical, as long as the two frames that define a macrostep are not so far spaced from one another that they have no significant region of common image content.
The sampling period dt does not have to be constant. The sampling period may be determined as a function of previous measurements. One method that employs a variable dt is to improve the accuracy of displacement calculation by keeping the relative displacement between successive reference frames within certain bounds. For example, the upper bound may be one-pixel displacement, while the lower bound is determined by numerical roundoff considerations in the processing of the navigation data.
Referring again to Fig. 9, the image signal generated at the imaging sensor 22 may then be "position-tagged" based upon the navigation data. In one embodiment, pixel values from the two navigation sensors 24 and 26 are received by a navigation processor 80 for performing the operations of Figs. 7 and 8. Based upon the computed correlations, coordinates are determined for the current position of the first navigation sensor 24 (X1, Y1) and for the current position of the second navigation sensor 26 (X2, Y2).
The navigation processor 80 also receives pixel values of the imaging sensor 22 via a pixel amplifier 82 and an analog-to-digital converter 84. Although Fig. 9 shows only a single tap from the image sensor 22 and a single A/D converter 84, multiple taps, each with an A/D
converter, are within the scope of the invention. The current position coordinates of the navigation sensors are "tagged" at the ends of a line of data that corresponds to the number of pixels within the imaging sensor. The output 86 of the navigation processor 80 is therefore a position-tagged data stream. In Fig. 10 an increment 88 of the data stream is shown as having position coordinate cells 90, 92, 94 and 96 at the opposite ends of N
pixel cells, although this ordering is not essential.
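One increment 88 of the position-tagged data stream might be represented as follows (the field names, widths and pixel count are illustrative assumptions; the text only requires coordinate cells at the two ends of a line of N pixel cells):

```c
/* Sketch of one increment of the position-tagged data stream (Fig. 10).
   Layout and names are our assumptions, not the patent's. */

#define N_PIXELS 128    /* pixels per line of the imaging sensor (assumed) */

struct position_tagged_increment {
    short x1, y1;                       /* position of navigation sensor 24 */
    unsigned char pixels[N_PIXELS];     /* one line of image data */
    short x2, y2;                       /* position of navigation sensor 26 */
};

/* Tag a line of imaging-sensor data with the current navigation
   coordinates, producing one increment of the output stream. */
struct position_tagged_increment tag_line(short x1, short y1,
                                          short x2, short y2,
                                          const unsigned char *line)
{
    struct position_tagged_increment inc;
    int i;
    inc.x1 = x1; inc.y1 = y1;
    inc.x2 = x2; inc.y2 = y2;
    for (i = 0; i < N_PIXELS; i++)
        inc.pixels[i] = line[i];
    return inc;
}
```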
The position-tagged data stream at the output 86 of the navigation processor 80 may be first stored in image space that allows the image to fill memory locations which provide continuity in both the X and Y axes. Consequently, image acquisition is not restricted to scanning from an upper-left corner of an original to the lower-right corner.
Because each image pixel is associated with a relative (X,Y) displacement from an arbitrary starting point, the image can expand in X and Y to the full size of the image memory.
The imaging sensor 22 is clocked as the scanning device moves across an original. The clocking ensures that the fastest moving element of the sensor samples at least once per pixel displacement. As previously noted with reference to Fig. 6, in the case of significant curvature of the scanning device 10 during image capture, one end of the imaging array will translate more rapidly than the other end, causing pixels at the slower end to be oversampled. This situation can be handled by either recording the most recent reading (for grayscales) or by recording in a logical OR mode (for binary images) at a specific pixel location in image space.
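The two recording policies for an oversampled pixel can be sketched as follows (the image buffer and function names are our assumptions):

```c
/* Write policies for revisited image-space pixels: a grayscale scan
   keeps the most recent reading, while a binary scan ORs the new
   reading in.  Buffer layout and names are illustrative. */

#define IMG_W 64
#define IMG_H 64

unsigned char image[IMG_H][IMG_W];

void write_gray(int x, int y, unsigned char value)
{
    image[y][x] = value;      /* most recent reading wins */
}

void write_binary(int x, int y, unsigned char value)
{
    image[y][x] |= value;     /* logical OR accumulates any mark seen */
}
```

Under the OR policy a black mark seen on any pass survives later passes over the same cell, whereas the grayscale policy simply replaces the stored value.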
The next operation is to map the position-tagged increments. In one embodiment, the end points of the increments are joined by a line. Since the distance of each pixel of the imaging sensor 22 is fixed, the physical location of the pixels relative to the line can be calculated. One approach for determining the physical locations of each pixel is a modification of the Bresenham Raster Line Technique. The modification is that because the array of pixels in the imaging sensor is fixed, the line loop will be fixed at that same number. That is, the usual Bresenham algorithm is one in which the number of iterations in the line loop is the greater of delta_x and delta_y, i.e., max (delta_x, delta_y), but for the modified algorithm the number (N) of pixels along the array is used where max (delta_x, delta_y) is customarily used, so that the loop runs N times. The following program element describes this algorithm:
/*
 * Load pixel values with get_pixel() using location pairs (xa,ya) and
 * (xb,yb) of the endpoints of an N-element array of pixel values,
 * using a modified Bresenham line draw algorithm.
 */
delta_x = xb - xa;
delta_y = yb - ya;
inc_x = (delta_x > 0) - (delta_x < 0);  /* increments are +1 or -1 */
inc_y = (delta_y > 0) - (delta_y < 0);
delta_x *= inc_x;                       /* take absolute values */
delta_y *= inc_y;
x = xa;
y = ya;
x_err = 0;
y_err = 0;
for (i = 0; i < N; i++) {
    get_pixel(i/2, x/2, y/2);
    x_err += delta_x;
    y_err += delta_y;
    if (x_err >= N) {
        x_err -= N;
        x += inc_x;
    }
    if (y_err >= N) {
        y_err -= N;
        y += inc_y;
    }
}
Thus, given two points on a raster (xa, ya) and (xb, yb) which are the end points of an imaging sensor of N pixels, the purpose is to find successively the points (x, y) on the raster where each pixel is to be read. These points form the best approximation to a straight line connecting the end points at a and b. Take the differences in x and y. From the sign of the differences between a and b, determine whether x and y will be incremented or decremented as the line is traversed. Start at x = xa, y = ya, with two error registers x_err and y_err set to zero and begin the loop. Next, read the value at (x, y) and write it to the output raster using get_pixel(). Given a linear image sensor with half the resolution of the navigation, use i/2, x/2, y/2 for the pixel number in the sensor and the position in the output raster. Add delta_x and delta_y to the respective error registers, then test both error registers to see if they exceed N. If so, subtract N from them and change x and/or y by the increment. If an error register does not exceed N, continue to use the current value of x or y. The process continues until the loop has run N times.
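The loop can be exercised in isolation with a stub in place of get_pixel(); the wrapper function, the stub and the example endpoints below are our additions:

```c
/* Runnable sketch of the modified Bresenham mapping above.  get_pixel()
   is a stub that only counts visits; the final (x, y) reached by the
   loop is returned so it can be checked against the second endpoint. */

#define N 8             /* pixels along the imaging array (example value) */

static int calls;       /* number of times get_pixel() was invoked */

static void get_pixel(int pixel, int x, int y)
{
    (void)pixel; (void)x; (void)y;   /* a real scanner would read/store here */
    calls++;
}

void map_increment(int xa, int ya, int xb, int yb, int *final_x, int *final_y)
{
    int delta_x = xb - xa;
    int delta_y = yb - ya;
    int inc_x = (delta_x > 0) - (delta_x < 0);   /* +1, 0 or -1 */
    int inc_y = (delta_y > 0) - (delta_y < 0);
    int x = xa, y = ya, x_err = 0, y_err = 0, i;

    delta_x *= inc_x;                /* absolute values */
    delta_y *= inc_y;

    for (i = 0; i < N; i++) {
        get_pixel(i / 2, x / 2, y / 2);
        x_err += delta_x;
        y_err += delta_y;
        if (x_err >= N) { x_err -= N; x += inc_x; }
        if (y_err >= N) { y_err -= N; y += inc_y; }
    }
    *final_x = x;
    *final_y = y;
}
```

Running the loop from (0,0) to (8,4) makes exactly N calls to get_pixel() and finishes at (8,4), confirming that the fixed N-iteration loop still lands on the far endpoint when the endpoint separation does not exceed N.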
The next step is to stitch successive image swaths within their region of overlap. This must be done in such a way as to identify and correct most of the accumulated navigation error and to mask any residual error. This "masking" can be done in areas of black print on a white background, for example, by stitching only in white space areas, i.e. areas with intensity values above a prescribed or adaptive threshold. The following paragraphs describe how redundant data from areas of overlap is identified (to be discarded) and how the navigation error is measured and corrected.
Techniques for stitching image swaths are known in the scanning art. These techniques typically require a pair of complete image swaths and produce a single, global transformation which brings the two swaths into registration. In this case, however, continuous navigation data provides the registration information needed for stitching.
Since the navigation signal tends to accumulate error, it is continually amended by feeding back a correction signal derived from analysis of feature offsets.
Some area of overlap is necessary in order to stitch two image swaths, since the navigation correction is calculated by correlating features within this area. Consider the situation portrayed in Fig. 11, where the Swath #1 is being resampled by the return pass, Swath #2. At time T, a partial swath has thus far been scanned. Fig. 12 highlights this overlap area 108. As shown in Fig. 12, during collection of Swath #1, quadrilateral image segments (henceforth called "registration tiles") are periodically labeled along the lower edge of the swath with the location of Tags 110, 112 and 114 that are described above. On a later pass (Swath #2) the "Surplus Overlap Area" 108 above the tagged areas of Swath #1 is clipped, using navigation information to determine where to clip. As each segment length in Swath #2 is acquired, the registration tile from Swath #1 is located in the top of what remains of Swath #2, after the "surplus" has been clipped. If the navigation data is perfect, there will be no offset between the location of Tag #1 and the location of that tile's rescanned image in Swath #2. More realistically, some navigation error will have accumulated since the last registration was performed. The offset between these two tiles produces a correction factor which is then used to update future navigation position-tags, associated with the data, in order to minimize the total accumulated error.
In this way the total accumulated error in the navigation data is prevented from growing so large that it introduces an obvious distortion in the region of the swath overlap.
Since both Swath #1 and Swath #2 are combined to produce a single image, a buffer is used to temporarily store a copy of an original registration tile until Swath #2 has been located in it. The entire registration tile could be used for this correlation, but in the preferred embodiment a small area of high-frequency contrast (henceforth called a "feature") consisting of a rectangular tile (e.g., 15x15 pixels) of grey scale image is located within the registration tile of Swath #1 and saved to buffer. When the location of this feature is crossed for a second time, the offset between the location of the saved feature and the same feature in Swath #2 produces a navigation correction signal, i.e. the translation required to bring the two features into close correspondence. While other correlation approaches could be employed, an acceptable approach is a "sum of squared differences" correlation. A small search area is defined around the original location of the feature and correlation coefficients are determined by the equation:

Ck,l = Σi Σj (Ti,j - Ii+k,j+l)²

where Ti,j denotes the grey scale values of the feature from Swath #1 and Ii+k,j+l denotes the grey scale values of the newly acquired feature from Swath #2. Indices i and j specify locations within the features, while k and l specify the magnitude of the proposed translational offset (constrained to remain within the search space). The smallest element in the resulting correlation array denotes the offset between the two features. Sub-pixel positional accuracy may be obtained using interpolation to find the minima of this bowl-shaped result.
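This offset search can be sketched as follows (the 15x15 tile size is from the text; the search radius, array layout and function names are our assumptions). The sub-pixel step is shown as a one-dimensional parabolic fit through the minimum and its two neighbours, which is one common way to realize the interpolation the text mentions:

```c
#define FEAT 15                   /* feature tile size, as in the text */
#define SEARCH 2                  /* k, l range over [-SEARCH, +SEARCH] (assumed) */
#define WIN (FEAT + 2 * SEARCH)   /* search window held in the new swath */

/* Sum of squared differences between the saved feature T (FEAT x FEAT)
   and the window I (WIN x WIN) at translational offset (k, l). */
long ssd(const unsigned char *T, const unsigned char *I, int k, int l)
{
    long c = 0;
    int i, j;
    for (i = 0; i < FEAT; i++)
        for (j = 0; j < FEAT; j++) {
            long d = (long)T[i * FEAT + j]
                   - (long)I[(i + k + SEARCH) * WIN + (j + l + SEARCH)];
            c += d * d;
        }
    return c;
}

/* Exhaustive search of the small offset space for the smallest SSD. */
void best_offset(const unsigned char *T, const unsigned char *I,
                 int *bk, int *bl)
{
    long cmin = -1;
    int k, l;
    for (k = -SEARCH; k <= SEARCH; k++)
        for (l = -SEARCH; l <= SEARCH; l++) {
            long c = ssd(T, I, k, l);
            if (cmin < 0 || c < cmin) { cmin = c; *bk = k; *bl = l; }
        }
}

/* Sub-pixel refinement along one axis: cm, c0, cp are the correlation
   values at offsets -1, 0, +1 around the integer minimum; the vertex
   of the parabola through them gives the fractional offset. */
double subpixel_offset(double cm, double c0, double cp)
{
    return 0.5 * (cm - cp) / (cm - 2.0 * c0 + cp);
}
```

The returned integer offset is the navigation correction; the parabolic vertex adds the fractional part of the bowl-shaped minimum.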
The feature within the registration tile is selected to maximize image variance, since this improves the accuracy of the correlation method. In one possible embodiment, only a subset of locations within the region is considered. These locations 116, 118, 120, 122 and 124 are shown in Fig. 13 as lying along the principal axes 126 and 128 of the registration tile (lines joining opposite midpoints of lines that define the region) and are sampled at the intersection and halfway between the intersection and each endpoint of the axis. For each location 116, 118, 120, 122 and 124, the variance VARk,l is calculated using the equations:

SUMk,l = Σi Σj Ik+i,l+j

SUM2k,l = Σi Σj (Ik+i,l+j)²

VARk,l = SUM2k,l / N - (SUMk,l)² / N²
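The variance test can be sketched directly from these equations (image layout and names are our assumptions):

```c
/* Variance of an F x F candidate tile with top-left corner (k, l) in a
   grey scale image, computed as VAR = SUM2/N - SUM^2/N^2 per the
   equations above. */

#define F 15                    /* candidate tile size, as in the text */

double tile_variance(const unsigned char *img, int stride, int k, int l)
{
    double sum = 0.0, sum2 = 0.0;
    int i, j;
    const int n = F * F;
    for (i = 0; i < F; i++)
        for (j = 0; j < F; j++) {
            double v = img[(k + i) * stride + (l + j)];
            sum  += v;          /* SUM  accumulates grey levels        */
            sum2 += v * v;      /* SUM2 accumulates squared grey levels */
        }
    return sum2 / n - (sum / n) * (sum / n);
}
```

The candidate location whose tile maximizes this variance is taken as the feature, since a flat tile (variance near zero) gives a poorly conditioned correlation.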
In order to prevent obvious distortions in the final representative image, the error estimate is applied slowly; the "position-tags" are modified in small fixed magnitude steps as each new row of linear sensor data is loaded into memory, until there has been an accounting for the entire error.
In the preferred embodiment, the processing electronics for image reconstruction, stitching and image management is contained within the housing that defines the scanning device 10 of Fig. 1. Thus, the scanned image may be immediately presented at the image display 16. However, the scanning device may contain memory to store the position-tagged image data, but without processing and file management electronics and firmware.
As noted in reference to Fig. 3, the navigation and imaging sensors 22, 24 and 26 are preferably mounted on a pivoting member 20. In one embodiment, the pivoting member is connected to the remainder of the housing by at least one elastomer for which one end of the elastomer is connected to the stationary portion of the housing and the other end is connected to the pivoting member. The elastomer acts as a hinge. Thus, the pivoting portion is allowed to "float" without the use of frictional elements. Power, control and data signals may be conducted to the sensors via flex cables that are shielded in order to minimize electromagnetic interference. Other methods of pivotally attaching the pivoting member can be used. If the pivoting member is deleted and the sensors are in a fixed position on the housing, care must be taken not to tilt the scanning device 10 excessively during image capture. In this embodiment, the design of illumination and optical elements must be given increased attention.
While the invention has been described and illustrated as one in which a planar original is scanned, this is not critical. In fact, persons skilled in the art will readily understand how many of the techniques may be used for scanning three-dimensional images. However, the preferred embodiment is one in which the image of interest is formed on a medium, such as a piece of paper, a transparency, or a photograph, and the scanning device is in contact with the medium.
Reference Numerals
HP Docket No. 1094285-1
Freehand Image Scanning Device and Method

10 scanning device
12 meandering path
14 original
16 image display
18 forward end
20 pivoting member
22 imaging sensor
24 navigation sensor
26 navigation sensor
28 LED
30 angle of incidence
32 lens
34 holographic diffuser
35 source
36 optical elements
37 beam splitter
38 optical elements
39 element
40 optical elements
42 target region
44 original
46 block
48 curved path
50 distorted captured image
52 second position
54 captured image
56 acquire reference frame
58 acquire sample frame
60 correlating
62 reference frame
64 T-shaped feature
66 sample frame
68 frame
70 representation
72 array of position frames
74 microstep determination
76 shift reference frame
78 step
80 navigation processor
82 pixel amp
84 A/D converter
86 output
88 increment
90 position coordinate cells
92 position coordinate cells
94 position coordinate cells
96 position coordinate cells
108 overlap area
110 tags
112 tags
114 tags
116 location
118 location
120 location
122 location
124 location
126 axes
128 axes
Claims (10)
1. A method of forming a scanned electronic image comprising the steps of:
moving a scanning device (10) relative to an original (14) having an image, said scanning device having an imaging means (22) for detecting said image, said relative movement defining a scan path (12);
capturing a sequence of image data (50) formed as said imaging device moves along said scan path;
forming navigation information (56) representative of travel of said scanning device along said scan path; and forming an output image (54) from said image data, including removing distortion artifacts of curvilinear and rotational movement of said scanning device with travel along said scan path, said removing distortion artifacts being based upon said navigation information.
2. The method of claim 1 wherein said step of forming navigation information (56) includes detecting variations of inherent structure-related properties (64) of said original.
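Claims 1 and 2 describe forming navigation information by tracking inherent structure-related properties of the original (reference numerals 56–64: acquiring a reference frame, acquiring a sample frame, and correlating the two). A minimal sketch of such frame-to-frame correlation follows; the exhaustive search over candidate shifts and the function name are illustrative assumptions, not the patent's microstep determination method:

```python
import numpy as np

def estimate_shift(reference, sample, max_shift=2):
    """Estimate the (dy, dx) displacement between two navigation frames.

    Each candidate shift is scored by the mean squared difference over
    the overlapping region of the two frames; the best-matching shift
    is taken as the movement of the surface texture between captures.
    """
    best, best_err = None, None
    h, w = reference.shape
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            # Overlapping windows of the two frames under this shift
            ref = reference[max(dy, 0):h + min(dy, 0),
                            max(dx, 0):w + min(dx, 0)]
            smp = sample[max(-dy, 0):h + min(-dy, 0),
                         max(-dx, 0):w + min(-dx, 0)]
            err = np.mean((ref.astype(float) - smp.astype(float)) ** 2)
            if best_err is None or err < best_err:
                best_err, best = err, (dy, dx)
    return best
```

Accumulating these per-frame shifts yields the position signal used later to tag each captured image line with a location on the original.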
3. A scanning device comprising:
sensor means (22) for forming image signals upon relative movement between said sensor means and an original (14) having an image, said relative movement being a scanning process, said image signals being responsive to said image;
navigation means (24 and 26), in fixed position relative to said sensor means, for forming at least one position signal (56) responsive to detection of inherent structure-related properties (64) of said original during said scanning process; and
processor means (80), responsive to said position signal, for manipulating said image signals based upon relative movement between said navigation means and said original as determined by variations of said inherent structure-related properties, said manipulating being conducted to increase correspondence between an output image (54) and said image of said original.
4. The device of claim 3 wherein said navigation means includes a first two-dimensional array (24) of navigation sensor elements.
5. The device of claim 4 wherein said navigation means includes a second two-dimensional array (26) of navigation sensor elements, said first array (24) being spaced apart from said second array.
6. The device of claim 3, 4 or 5 wherein said sensor means (22) and said navigation means (24 and 26) are fixed relative to a contact surface (18) to be brought into contact with said original (14), said device further comprising a first light means (28) positioned for directing light onto said original at an acute angle (30) relative to said contact surface.
7. The device of claim 3, 4 or 5 wherein said sensor means (22) and said navigation means (24 and 26) are fixed relative to a contact surface (18) to be brought into contact with said original (14), said device further comprising a second light means (35) for directing light onto said original at an angle that is generally perpendicular to said contact surface.
8. The device of claim 3, 4, 5, 6 or 7 further comprising a hand-manipulatable housing (10), said sensor means (22) and said navigation means (24 and 26) being attached to said housing.
9. The device of claim 8 wherein said sensor means (22) and said navigation means (24 and 26) are pivotally attached to said housing.
10. The device of claim 8 or 9 further comprising an image display (16) connected to said processor means (80) to form an image, said image display being attached to said housing (10).
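The processor means of claim 3 increases correspondence between the output image and the original by placing captured pixels according to the navigation position signal rather than assuming straight-line travel. A simple sketch of that idea, under the assumption that each captured pixel arrives tagged with a navigated (y, x) position; nearest-cell binning with averaging of rescanned cells is an illustrative choice, not the patent's method:

```python
import numpy as np

def rectify(captures, height, width):
    """Assemble a rectilinear output image from freehand captures.

    `captures` is an iterable of (y, x, value) triples: each captured
    pixel tagged with the position reported by the navigation sensors.
    Pixels are binned into the nearest output cell; samples that land
    in the same cell (e.g. rescanned regions) are averaged.
    """
    acc = np.zeros((height, width))
    cnt = np.zeros((height, width))
    for y, x, v in captures:
        iy, ix = int(round(y)), int(round(x))
        if 0 <= iy < height and 0 <= ix < width:
            acc[iy, ix] += v
            cnt[iy, ix] += 1
    out = np.zeros((height, width))
    mask = cnt > 0
    out[mask] = acc[mask] / cnt[mask]  # average overlapping samples
    return out
```

Because placement depends only on the navigated positions, curvilinear or rotational motion of the scanning device along the scan path does not distort the assembled image.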
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/396,826 | 1995-03-02 | ||
US08/396,826 US5578813A (en) | 1995-03-02 | 1995-03-02 | Freehand image scanning device which compensates for non-linear movement |
Publications (1)
Publication Number | Publication Date |
---|---|
CA2166904A1 true CA2166904A1 (en) | 1996-09-03 |
Family
ID=23568777
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CA002166904A Abandoned CA2166904A1 (en) | 1995-03-02 | 1996-01-10 | Freehand image scanning device and method |
Country Status (9)
Country | Link |
---|---|
US (4) | US5578813A (en) |
EP (2) | EP0730366B1 (en) |
JP (2) | JP3860242B2 (en) |
KR (1) | KR100463947B1 (en) |
CN (2) | CN1120444C (en) |
AU (1) | AU719574B2 (en) |
CA (1) | CA2166904A1 (en) |
DE (2) | DE69609096T2 (en) |
WO (1) | WO1996027257A2 (en) |
Families Citing this family (485)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6631842B1 (en) * | 2000-06-07 | 2003-10-14 | Metrologic Instruments, Inc. | Method of and system for producing images of objects using planar laser illumination beams and image detection arrays |
GB2288512B (en) * | 1994-04-14 | 1998-08-26 | Matsushita Electric Ind Co Ltd | Image processing apparatus |
US5578813A (en) * | 1995-03-02 | 1996-11-26 | Allen; Ross R. | Freehand image scanning device which compensates for non-linear movement |
JPH08298568A (en) * | 1995-04-27 | 1996-11-12 | Brother Ind Ltd | Input/output equipment |
US6950094B2 (en) * | 1998-03-30 | 2005-09-27 | Agilent Technologies, Inc | Seeing eye mouse for a computer system |
US5786804A (en) | 1995-10-06 | 1998-07-28 | Hewlett-Packard Company | Method and system for tracking attitude |
US5729008A (en) * | 1996-01-25 | 1998-03-17 | Hewlett-Packard Company | Method and device for tracking relative movement by correlating signals from an array of photoelements |
US6345116B1 (en) * | 1996-05-31 | 2002-02-05 | Matsushita Electric Industrial Co., Ltd. | Image processing apparatus |
IL118914A0 (en) * | 1996-07-22 | 1996-10-31 | Zohar Argamanit Ltd | Hand-holdable optical scanner particularly useful as electronic translator |
EP0873003B1 (en) * | 1997-04-14 | 2007-03-21 | Hewlett-Packard Company, A Delaware Corporation | Image scanning device and method |
US6256016B1 (en) | 1997-06-05 | 2001-07-03 | Logitech, Inc. | Optical detection system, device, and method utilizing optical matching |
US6104979A (en) | 1997-06-11 | 2000-08-15 | Starlink, Inc. | Integrated swath guidance system |
EP0884890B1 (en) * | 1997-06-12 | 2003-07-09 | Hewlett-Packard Company, A Delaware Corporation | Image processing method and device |
DE69816185T2 (en) | 1997-06-12 | 2004-04-15 | Hewlett-Packard Co. (N.D.Ges.D.Staates Delaware), Palo Alto | Image processing method and device |
US6002815A (en) * | 1997-07-16 | 1999-12-14 | Kinetic Sciences, Inc. | Linear sensor imaging method and apparatus |
JPH1158844A (en) * | 1997-08-08 | 1999-03-02 | Hewlett Packard Co <Hp> | Handy printer system |
US6249591B1 (en) * | 1997-08-25 | 2001-06-19 | Hewlett-Packard Company | Method and apparatus for control of robotic grip or for activating contrast-based navigation |
US6466701B1 (en) * | 1997-09-10 | 2002-10-15 | Ricoh Company, Ltd. | System and method for displaying an image indicating a positional relation between partially overlapping images |
US7028899B2 (en) * | 1999-06-07 | 2006-04-18 | Metrologic Instruments, Inc. | Method of speckle-noise pattern reduction and apparatus therefore based on reducing the temporal-coherence of the planar laser illumination beam before it illuminates the target object by applying temporal phase modulation techniques during the transmission of the plib towards the target |
US6201903B1 (en) * | 1997-09-30 | 2001-03-13 | Ricoh Company, Ltd. | Method and apparatus for pen-based faxing |
US6611629B2 (en) * | 1997-11-03 | 2003-08-26 | Intel Corporation | Correcting correlation errors in a composite image |
US6233368B1 (en) * | 1998-03-18 | 2001-05-15 | Agilent Technologies, Inc. | CMOS digital optical navigation chip |
US6002124A (en) * | 1998-03-20 | 1999-12-14 | Hewlett-Packard Company | Portable image scanner with optical position sensors |
US6366317B1 (en) * | 1998-03-27 | 2002-04-02 | Intel Corporation | Motion estimation using intrapixel logic |
US6151015A (en) * | 1998-04-27 | 2000-11-21 | Agilent Technologies | Pen like computer pointing device |
US5994710A (en) * | 1998-04-30 | 1999-11-30 | Hewlett-Packard Company | Scanning mouse for a computer system |
US6033086A (en) * | 1998-04-30 | 2000-03-07 | Hewlett-Packard Company | Compact illumination system for image scanner |
US6173213B1 (en) | 1998-05-11 | 2001-01-09 | Ellison Machinery Company | Motorized inbound laser orientation and wheel recognition station |
US6064062A (en) * | 1998-06-02 | 2000-05-16 | Hewlett-Packard Company | Optical stray light baffle for image scanner |
USD420344S (en) * | 1998-07-13 | 2000-02-08 | Hewlett-Packard Company | Hand held optical scanner |
USD422995S (en) * | 1998-07-13 | 2000-04-18 | Hewlett-Packard Company | Hand held optical scanner front face portion |
USD421418S (en) * | 1998-07-13 | 2000-03-07 | Hewlett-Packard Company | Battery cover for hand held apparatus such as an optical scanner |
US6043503A (en) * | 1998-07-22 | 2000-03-28 | Hewlett-Packard Company | Hand held scanning device |
IT1306266B1 (en) * | 1998-07-24 | 2001-06-04 | Gd Spa | CONTROL METHOD OF A PRINTED OBJECT |
US6160926A (en) * | 1998-08-07 | 2000-12-12 | Hewlett-Packard Company | Appliance and method for menu navigation |
US6611291B1 (en) * | 1998-08-07 | 2003-08-26 | Hewlett-Packard Development Company | Appliance and method for communicating and viewing multiple captured images |
US6232973B1 (en) * | 1998-08-07 | 2001-05-15 | Hewlett-Packard Company | Appliance and method for navigating among multiple captured images and functional menus |
JP2000069243A (en) * | 1998-08-24 | 2000-03-03 | Matsushita Electric Ind Co Ltd | Image processing method and image processor |
US6330082B1 (en) | 1998-08-28 | 2001-12-11 | Hewlett-Packard Company | Converter for optical scanner |
US6229137B1 (en) | 1998-09-14 | 2001-05-08 | Hewlett-Packard Company | Scan line illumination system utilizing hollow reflector |
US6195475B1 (en) | 1998-09-15 | 2001-02-27 | Hewlett-Packard Company | Navigation system for handheld scanner |
US6118132A (en) * | 1998-09-17 | 2000-09-12 | Agilent Technologies | System for measuring the velocity, displacement and strain on a moving surface or web of material |
US6188058B1 (en) | 1998-09-17 | 2001-02-13 | Agilent Technologies Inc. | System for taking displacement measurements having photosensors with imaged pattern arrangement |
US6459823B2 (en) * | 1998-10-28 | 2002-10-01 | Hewlett-Packard Company | Apparatus and method of increasing scanner resolution |
US6403941B1 (en) | 1998-10-29 | 2002-06-11 | Hewlett-Packard Company | Image scanner with real time pixel resampling |
US6427022B1 (en) * | 1998-11-10 | 2002-07-30 | Western Research Company, Inc. | Image comparator system and method for detecting changes in skin lesions |
US6292274B1 (en) | 1998-12-11 | 2001-09-18 | Hewlett-Packard Company | Portable scanner with tilting body design |
US6878922B1 (en) | 1998-12-23 | 2005-04-12 | Hewlett-Packard Development Company, L.P. | Optical system for compensating for non-uniform illumination of an object |
US6388773B1 (en) | 1999-02-24 | 2002-05-14 | Hewlett-Packard Company | Simultaneous multi-mode capture button behavior |
US6222174B1 (en) | 1999-03-05 | 2001-04-24 | Hewlett-Packard Company | Method of correlating immediately acquired and previously stored feature information for motion sensing |
US6246050B1 (en) | 1999-03-08 | 2001-06-12 | Hewlett-Packard Company | Optical encoders using non-patterned targets |
US6160250A (en) * | 1999-03-31 | 2000-12-12 | Hewlett-Packard Company | Integrated optical imaging assembly |
US6584214B1 (en) | 1999-04-23 | 2003-06-24 | Massachusetts Institute Of Technology | Identification and verification using complex, three-dimensional structural features |
US6816272B2 (en) | 1999-05-13 | 2004-11-09 | Hewlett-Packard Development Company, L.P. | System and method for selectively downloading data files from an optical scanner |
US6633332B1 (en) | 1999-05-13 | 2003-10-14 | Hewlett-Packard Development Company, L.P. | Digital camera system and method capable of performing document scans |
AUPQ439299A0 (en) * | 1999-12-01 | 1999-12-23 | Silverbrook Research Pty Ltd | Interface system |
US7106888B1 (en) * | 1999-05-25 | 2006-09-12 | Silverbrook Research Pty Ltd | Signature capture via interface surface |
US7707082B1 (en) * | 1999-05-25 | 2010-04-27 | Silverbrook Research Pty Ltd | Method and system for bill management |
IT1309271B1 (en) * | 1999-05-27 | 2002-01-16 | Gd Spa | BANKNOTE CHECK METHOD |
US6362465B1 (en) | 1999-06-14 | 2002-03-26 | Hewlett-Packard Company | Optical scanning system and method capable of receiving inputs from a user |
US6242731B1 (en) | 1999-06-30 | 2001-06-05 | Hewlett Packard Company | Imaging device having an integrated position sensing device |
US6207945B1 (en) | 1999-06-30 | 2001-03-27 | Hewlett-Packard Company | Integral positioning and imaging device |
US6265706B1 (en) | 1999-07-12 | 2001-07-24 | Hewlett-Packard Company | Edge to edge image sensor and navigator for portable scanner |
GB2387734B (en) * | 1999-07-12 | 2004-01-07 | Hewlett Packard Co | Edge to edge image sensor and navigator for portable scanner |
US6556315B1 (en) | 1999-07-30 | 2003-04-29 | Hewlett-Packard Company | Digital image scanner with compensation for misalignment of photosensor array segments |
US6300645B1 (en) | 1999-08-25 | 2001-10-09 | Hewlett-Packard Company | Position sensing device having a single photosensing element |
US6350980B1 (en) | 1999-08-27 | 2002-02-26 | Hewlett-Packard Company | Imaging assembly with a side mounted optical detector for a scanner |
US6380529B1 (en) * | 1999-09-29 | 2002-04-30 | Hewlett-Packard Company | Position sensing device having a movable photosensing element |
US6555812B1 (en) | 1999-09-29 | 2003-04-29 | Hewlett Packard Development Company, L.P. | Optics device dust seal |
EP1091560A1 (en) | 1999-10-05 | 2001-04-11 | Hewlett-Packard Company | Method and apparatus for scanning oversized documents |
US6376834B1 (en) * | 1999-10-26 | 2002-04-23 | Hewlett-Packard Company | Moire interference pattern reduction for scanning of half-toned images |
US6312124B1 (en) | 1999-10-27 | 2001-11-06 | Hewlett-Packard Company | Solid and semi-flexible body inkjet printing system |
US6455840B1 (en) | 1999-10-28 | 2002-09-24 | Hewlett-Packard Company | Predictive and pulsed illumination of a surface in a micro-texture navigation technique |
US6429422B1 (en) * | 1999-11-12 | 2002-08-06 | Hewlett-Packard Company | Scanner navigation system with variable aperture |
US6414293B1 (en) | 1999-11-12 | 2002-07-02 | Hewlett-Packard Company | Optical position sensing device and method using a contoured transparent sheet |
US6568777B1 (en) * | 1999-11-16 | 2003-05-27 | Agilent Technologies, Inc. | Optical navigation system and method |
US6303921B1 (en) | 1999-11-23 | 2001-10-16 | Hewlett-Packard Company | Method and system for capturing large format documents using a portable hand-held scanner |
DE19956467A1 (en) * | 1999-11-24 | 2001-06-07 | Wincor Nixdorf Gmbh & Co Kg | Process, data processing system and program for the correction of scanning errors and associated text recognition system |
GB2357209B (en) | 1999-12-07 | 2004-04-14 | Hewlett Packard Co | Hand-held image capture apparatus |
US6418372B1 (en) | 1999-12-10 | 2002-07-09 | Siemens Technology-To-Business Center, Llc | Electronic visitor guidance system |
US6538243B1 (en) | 2000-01-04 | 2003-03-25 | Hewlett-Packard Company | Contact image sensor with light guide having least reflectivity near a light source |
US6346699B1 (en) | 2000-01-04 | 2002-02-12 | Hewlett-Packard Company | Optical assembly having a reduced width |
US6563101B1 (en) | 2000-01-19 | 2003-05-13 | Barclay J. Tullis | Non-rectilinear sensor arrays for tracking an image |
US7133068B2 (en) | 2000-03-06 | 2006-11-07 | Sony Corporation | System and method for creating still images by utilizing a video camera device |
US6912076B2 (en) * | 2000-03-17 | 2005-06-28 | Accu-Sort Systems, Inc. | Coplanar camera scanning system |
US6377888B1 (en) | 2000-04-03 | 2002-04-23 | Disney Enterprises, Inc. | System for controlling movement of a vehicle |
US6426498B1 (en) | 2000-04-03 | 2002-07-30 | Hewlett-Packard Co. | Optics module for optical scanning device |
US6525306B1 (en) | 2000-04-25 | 2003-02-25 | Hewlett-Packard Company | Computer mouse with integral digital camera and method for using the same |
WO2001082586A2 (en) * | 2000-04-26 | 2001-11-01 | Raja Tuli | Document scanning apparatus integrated with a writing device |
US6930703B1 (en) | 2000-04-29 | 2005-08-16 | Hewlett-Packard Development Company, L.P. | Method and apparatus for automatically capturing a plurality of images during a pan |
US6618038B1 (en) | 2000-06-02 | 2003-09-09 | Hewlett-Packard Development Company, Lp. | Pointing device having rotational sensing mechanisms |
US7161578B1 (en) | 2000-08-02 | 2007-01-09 | Logitech Europe S.A. | Universal presentation device |
US7289649B1 (en) * | 2000-08-10 | 2007-10-30 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Fingerprint imager |
US6781570B1 (en) * | 2000-11-09 | 2004-08-24 | Logitech Europe S.A. | Wireless optical input device |
US7164810B2 (en) * | 2001-11-21 | 2007-01-16 | Metrologic Instruments, Inc. | Planar light illumination and linear imaging (PLILIM) device with image-based velocity detection and aspect ratio compensation |
US20030098352A1 (en) * | 2000-11-24 | 2003-05-29 | Metrologic Instruments, Inc. | Handheld imaging device employing planar light illumination and linear imaging with image-based velocity detection and aspect ratio compensation |
US6711501B2 (en) * | 2000-12-08 | 2004-03-23 | Satloc, Llc | Vehicle navigation system and method for swathing applications |
JP2002196877A (en) * | 2000-12-25 | 2002-07-12 | Hitachi Ltd | Electronic equipment using image sensor |
US6357939B1 (en) | 2001-02-02 | 2002-03-19 | Hewlett-Packard Company | Method of and apparatus for handheld printing of images on a media |
US6621483B2 (en) | 2001-03-16 | 2003-09-16 | Agilent Technologies, Inc. | Optical screen pointing device with inertial properties |
US6977645B2 (en) | 2001-03-16 | 2005-12-20 | Agilent Technologies, Inc. | Portable electronic device with mouse-like capabilities |
US7184026B2 (en) * | 2001-03-19 | 2007-02-27 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Impedance sensing screen pointing device |
US6478415B2 (en) | 2001-03-21 | 2002-11-12 | Hewlett-Packard Company | Rejuvenation station and printer cartridge therefore |
US6677929B2 (en) | 2001-03-21 | 2004-01-13 | Agilent Technologies, Inc. | Optical pseudo trackball controls the operation of an appliance or machine |
US20020140677A1 (en) * | 2001-03-28 | 2002-10-03 | Misek Brian J. | Optical mouse having an integrated processor |
US6603108B2 (en) * | 2001-04-09 | 2003-08-05 | Syscan Technology (Shenzhen) Co. Limited | Image sensing modules for portable optical scanners |
US6719467B2 (en) | 2001-04-30 | 2004-04-13 | Hewlett-Packard Development Company, L.P. | Floor printer |
US7333083B1 (en) | 2001-05-10 | 2008-02-19 | Logitech Europe S.A. | Optical based performance improvement for an optical illumination configuration |
US6809723B2 (en) | 2001-05-14 | 2004-10-26 | Agilent Technologies, Inc. | Pushbutton optical screen pointing device |
US6937135B2 (en) * | 2001-05-30 | 2005-08-30 | Hewlett-Packard Development Company, L.P. | Face and environment sensing watch |
US20020184196A1 (en) * | 2001-06-04 | 2002-12-05 | Lehmeier Michelle R. | System and method for combining voice annotation and recognition search criteria with traditional search criteria into metadata |
GB2376283B (en) * | 2001-06-04 | 2005-03-16 | Hewlett Packard Co | Foot activated user input |
SE523273C2 (en) | 2001-07-13 | 2004-04-06 | Print Dreams Europe Ab | Device and method |
SE519352C2 (en) * | 2001-07-13 | 2003-02-18 | Print Dreams Europe Ab | Handheld and hand operated random movement typing apparatus and method of writing thereof. |
US6795056B2 (en) | 2001-07-24 | 2004-09-21 | Agilent Technologies, Inc. | System and method for reducing power consumption in an optical screen pointing device |
US6935564B2 (en) * | 2001-07-30 | 2005-08-30 | Bayer Healthcare Llc | Circuit and method for correcting influence of AC coupling |
US6823077B2 (en) * | 2001-07-30 | 2004-11-23 | Agilent Technologies, Inc. | Simplified interpolation for an optical navigation system that correlates images of one bit resolution |
US6664948B2 (en) | 2001-07-30 | 2003-12-16 | Microsoft Corporation | Tracking pointing device motion using a single buffer for cross and auto correlation determination |
US6847353B1 (en) | 2001-07-31 | 2005-01-25 | Logitech Europe S.A. | Multiple sensor device and method |
US6703633B2 (en) | 2001-08-16 | 2004-03-09 | Hewlett-Packard Development Company, L.P. | Method and apparatus for authenticating a signature |
US7126585B2 (en) * | 2001-08-17 | 2006-10-24 | Jeffery Davis | One chip USB optical mouse sensor solution |
US6773177B2 (en) | 2001-09-14 | 2004-08-10 | Fuji Xerox Co., Ltd. | Method and system for position-aware freeform printing within a position-sensed area |
US6772000B2 (en) * | 2001-10-19 | 2004-08-03 | Scimed Life Systems, Inc. | Magnetic resonance imaging devices with a contrast medium for improved imaging |
US6657184B2 (en) * | 2001-10-23 | 2003-12-02 | Agilent Technologies, Inc. | Optical navigation upon grainy surfaces using multiple navigation sensors |
US6770863B2 (en) | 2001-10-26 | 2004-08-03 | Agilent Technologies, Inc. | Apparatus and method for three-dimensional relative movement sensing |
US6937357B1 (en) | 2001-10-30 | 2005-08-30 | Hewlett-Packard Development Company, L.P. | Hard copy system including rewritable media |
US7034805B2 (en) * | 2001-11-02 | 2006-04-25 | Kye Systems Corp. | Optical trackball |
US7042439B2 (en) * | 2001-11-06 | 2006-05-09 | Omnivision Technologies, Inc. | Method and apparatus for determining relative movement in an optical mouse |
US6859199B2 (en) | 2001-11-06 | 2005-02-22 | Omnivision Technologies, Inc. | Method and apparatus for determining relative movement in an optical mouse using feature extraction |
US6765555B2 (en) | 2001-11-07 | 2004-07-20 | Omnivision Technologies, Inc. | Passive optical mouse using image sensor with optional dual mode capability |
TWI263942B (en) * | 2001-12-05 | 2006-10-11 | Em Microelectronic Marin Sa | Method and sensing device for motion detection in an optical pointing device, such as an optical mouse |
US7583293B2 (en) * | 2001-12-06 | 2009-09-01 | Aptina Imaging Corporation | Apparatus and method for generating multi-image scenes with a camera |
US6646244B2 (en) | 2001-12-19 | 2003-11-11 | Hewlett-Packard Development Company, L.P. | Optical imaging device with speed variable illumination |
US6806453B1 (en) | 2002-01-17 | 2004-10-19 | Hewlett-Packard Development Company, L.P. | Scanning, copying, and printing with rewritable media |
US7333250B2 (en) * | 2002-01-31 | 2008-02-19 | Hewlett-Packard Development Company, L.P. | Image scanner with a single motor providing two-dimensional movement of photosensors |
AUPS049402A0 (en) * | 2002-02-13 | 2002-03-07 | Silverbrook Research Pty. Ltd. | Methods and apparatus (ap55) |
US7948769B2 (en) | 2007-09-27 | 2011-05-24 | Hemisphere Gps Llc | Tightly-coupled PCB GNSS circuit and manufacturing method |
SE527210C2 (en) * | 2002-03-11 | 2006-01-17 | Printdreams Europ Ab | Sensor and print head unit and method for a hand operated handwriting device |
SE527212C2 (en) * | 2002-03-11 | 2006-01-17 | Printdreams Europ Ab | Device and method of a handheld hand operated printer |
SE527211C2 (en) * | 2002-03-11 | 2006-01-17 | Printdreams Europ Ab | Sensor and print head unit of a hand operated handwriting device |
US6788875B1 (en) | 2002-04-08 | 2004-09-07 | Logitech Europe S.A. | Suspension system for use in an optical displacement detection system |
US6974947B2 (en) * | 2002-04-08 | 2005-12-13 | Agilent Technologies, Inc. | Apparatus and method for sensing rotation based on multiple sets of movement data |
US7131751B1 (en) | 2002-04-12 | 2006-11-07 | Logitech, Inc. | Attachment system for use in an optical illumination system |
DE10316208A1 (en) * | 2002-04-12 | 2003-11-20 | Samsung Electro Mech | Navigation system and navigation method |
US7362480B2 (en) * | 2002-04-24 | 2008-04-22 | Transpacific Ip, Ltd. | Method and system for changing a scanning resolution |
US7045763B2 (en) * | 2002-06-28 | 2006-05-16 | Hewlett-Packard Development Company, L.P. | Object-recognition lock |
US6983080B2 (en) * | 2002-07-19 | 2006-01-03 | Agilent Technologies, Inc. | Resolution and image quality improvements for small image sensors |
US7167604B2 (en) * | 2002-08-07 | 2007-01-23 | Hewlett-Packard Development Company, L.P. | Portable document scan accessory for use with a wireless handheld communications device |
US7015448B2 (en) * | 2002-08-22 | 2006-03-21 | Micron Technology, Inc. | Dark current reduction circuitry for CMOS active pixel sensors |
EP2241896B1 (en) * | 2002-09-23 | 2012-03-14 | Stefan Reich | Stabilising system for missiles |
US8077568B2 (en) * | 2002-11-12 | 2011-12-13 | Spencer Charles A | Method and system for synchronizing information specific to a location on a surface with an external source |
US6881949B2 (en) | 2002-11-12 | 2005-04-19 | Charles A. Spencer | Method and system for synchronizing information specific to a location on a surface with an external source |
US6956500B1 (en) | 2002-11-29 | 2005-10-18 | M & M Systems, Inc. | Real-time residential energy monitor |
US7885745B2 (en) | 2002-12-11 | 2011-02-08 | Hemisphere Gps Llc | GNSS control system and method |
US7142956B2 (en) * | 2004-03-19 | 2006-11-28 | Hemisphere Gps Llc | Automatic steering system and method |
US7162348B2 (en) | 2002-12-11 | 2007-01-09 | Hemisphere Gps Llc | Articulated equipment position control system and method |
US7689354B2 (en) * | 2003-03-20 | 2010-03-30 | Hemisphere Gps Llc | Adaptive guidance system and method |
US6924812B2 (en) * | 2002-12-24 | 2005-08-02 | Intel Corporation | Method and apparatus for reading texture data from a cache |
US6995748B2 (en) * | 2003-01-07 | 2006-02-07 | Agilent Technologies, Inc. | Apparatus for controlling a screen pointer with a frame rate based on velocity |
US7295186B2 (en) * | 2003-01-14 | 2007-11-13 | Avago Technologies Ecbuip (Singapore) Pte Ltd | Apparatus for controlling a screen pointer that distinguishes between ambient light and light from its light source |
EP1439689A1 (en) * | 2003-01-15 | 2004-07-21 | Siemens Aktiengesellschaft | Mobile telephone with scanning functionality |
US20040165224A1 (en) * | 2003-02-21 | 2004-08-26 | Allen Ross R. | Compact optical scanner |
US7009598B1 (en) | 2003-03-07 | 2006-03-07 | Microsoft Corporation | Multiple channel light guide for optically tracking pointing and input devices |
US7129929B1 (en) | 2003-03-07 | 2006-10-31 | Microsoft Corporation | Computer input device with multi-purpose light guide |
US9002565B2 (en) | 2003-03-20 | 2015-04-07 | Agjunction Llc | GNSS and optical guidance and machine control |
US20040212533A1 (en) * | 2003-04-23 | 2004-10-28 | Whitehead Michael L. | Method and system for satellite based phase measurements for relative positioning of fixed or slow moving points in close proximity |
US8271194B2 (en) | 2004-03-19 | 2012-09-18 | Hemisphere Gps Llc | Method and system using GNSS phase measurements for relative positioning |
US8594879B2 (en) | 2003-03-20 | 2013-11-26 | Agjunction Llc | GNSS guidance and machine control |
US8140223B2 (en) | 2003-03-20 | 2012-03-20 | Hemisphere Gps Llc | Multiple-antenna GNSS control system and method |
US8190337B2 (en) | 2003-03-20 | 2012-05-29 | Hemisphere GPS, LLC | Satellite based vehicle guidance control in straight and contour modes |
US8214111B2 (en) | 2005-07-19 | 2012-07-03 | Hemisphere Gps Llc | Adaptive machine control system and method |
US8138970B2 (en) | 2003-03-20 | 2012-03-20 | Hemisphere Gps Llc | GNSS-based tracking of fixed or slow-moving structures |
US8686900B2 (en) * | 2003-03-20 | 2014-04-01 | Hemisphere GNSS, Inc. | Multi-antenna GNSS positioning method and system |
US8634993B2 (en) | 2003-03-20 | 2014-01-21 | Agjunction Llc | GNSS based control for dispensing material from vehicle |
US8265826B2 (en) * | 2003-03-20 | 2012-09-11 | Hemisphere GPS, LLC | Combined GNSS gyroscope control system and method |
US20040189849A1 (en) * | 2003-03-31 | 2004-09-30 | Hofer Gregory V. | Panoramic sequence guide |
SE0300913D0 (en) * | 2003-03-31 | 2003-03-31 | Print Dreams Europe Ab | Method for navigation with optical sensors, and a device utilizing the method |
US7158659B2 (en) * | 2003-04-18 | 2007-01-02 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | System and method for multiplexing illumination in combined finger recognition and finger navigation module |
US7164782B2 (en) * | 2003-04-18 | 2007-01-16 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | System and method for time-space multiplexing in finger-imaging applications |
US7274808B2 (en) * | 2003-04-18 | 2007-09-25 | Avago Technologies Ecbu Ip (Singapore)Pte Ltd | Imaging system and apparatus for combining finger recognition and finger navigation |
TW576534U (en) * | 2003-04-23 | 2004-02-11 | Sunplus Technology Co Ltd | Light-guiding apparatus of optical mouse |
US7994877B1 (en) | 2008-11-10 | 2011-08-09 | Hrl Laboratories, Llc | MEMS-based quartz hybrid filters and a method of making the same |
US8766745B1 (en) | 2007-07-25 | 2014-07-01 | Hrl Laboratories, Llc | Quartz-based disk resonator gyro with ultra-thin conductive outer electrodes and method of making same |
CN100373310C (en) * | 2003-05-09 | 2008-03-05 | 凌阳科技股份有限公司 | Optical conducting device for optical mouse |
US20040227954A1 (en) * | 2003-05-16 | 2004-11-18 | Tong Xie | Interferometer based navigation device |
US7321359B2 (en) * | 2003-07-30 | 2008-01-22 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Method and device for optical navigation |
US7116427B2 (en) | 2003-10-30 | 2006-10-03 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Low power consumption, broad navigability optical mouse |
US7161586B2 (en) * | 2003-07-01 | 2007-01-09 | Em Microelectronic-Marin Sa | Method of operating an optical motion sensing device and optical motion sensing device implementing this method |
US7161585B2 (en) * | 2003-07-01 | 2007-01-09 | Em Microelectronic-Marin Sa | Displacement data post-processing and reporting in an optical pointing device |
US6963059B2 (en) * | 2003-07-01 | 2005-11-08 | Em Microelectronic-Marin Sa | Method and system for optimizing illumination power and integration time in an optical sensing device |
JP4169661B2 (en) * | 2003-07-24 | 2008-10-22 | オリンパス株式会社 | Imaging device |
JP2005041623A (en) * | 2003-07-25 | 2005-02-17 | Fuji Xerox Co Ltd | Carrying device and image forming device |
US7466356B2 (en) * | 2003-07-25 | 2008-12-16 | Hewlett-Packard Development Company, L.P. | Method and apparatus for setting a marker on an object and tracking the position of the object |
US20050022686A1 (en) * | 2003-07-28 | 2005-02-03 | Dreampatch, Llc | Apparatus, method, and computer program product for animation pad transfer |
US6934037B2 (en) * | 2003-10-06 | 2005-08-23 | Agilent Technologies, Inc. | System and method for optical navigation using a projected fringe technique |
US20050024346A1 (en) * | 2003-07-30 | 2005-02-03 | Jean-Luc Dupraz | Digital pen function control |
US7205521B2 (en) * | 2003-07-31 | 2007-04-17 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Speckle based sensor for three dimensional navigation |
US20050024690A1 (en) * | 2003-07-31 | 2005-02-03 | Picciotto Carl E. | Pen with tag reader and navigation system |
US7227531B2 (en) * | 2003-08-15 | 2007-06-05 | Microsoft Corporation | Data input device for tracking and detecting lift-off from a tracking surface by a reflected laser speckle pattern |
US7161582B2 (en) * | 2003-08-29 | 2007-01-09 | Microsoft Corporation | Data input device for tracking and detecting lift-off from a tracking surface by a reflected laser speckle pattern |
US7423227B2 (en) * | 2003-09-04 | 2008-09-09 | Avago Technologies Ecbu Ip Pte Ltd | Apparatus for optical navigation |
EP1517119B1 (en) * | 2003-09-22 | 2008-04-09 | Xitact S.A. | Optical device for determining the longitudinal and angular position of a rotationally symmetrical apparatus |
US7034279B2 (en) | 2003-09-25 | 2006-04-25 | Hewlett-Packard Development Company, L.P. | Method and system for printhead rotation detection using photosensors |
KR100683248B1 (en) | 2003-10-29 | 2007-02-15 | 주식회사 애트랩 | Method of sub-pixel motion calculation and Sensor for chasing a position using this method |
US7167162B2 (en) * | 2003-12-12 | 2007-01-23 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Apparatus and method for controlling a screen pointer |
US7869078B2 (en) * | 2003-12-18 | 2011-01-11 | Xerox Corporation | Reference marking system and tracking system for large area printing |
KR100545066B1 (en) * | 2004-02-16 | 2006-01-24 | 삼성전기주식회사 | Optical sensing equipment for navigating position and navigation method using it |
US7221356B2 (en) * | 2004-02-26 | 2007-05-22 | Microsoft Corporation | Data input device and method for detecting an off-surface condition by a laser speckle size characteristic |
US7613329B2 (en) * | 2004-03-08 | 2009-11-03 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Apparatus for controlling the position of a screen pointer that detects defective pixels |
EP1574825A1 (en) * | 2004-03-12 | 2005-09-14 | Xitact S.A. | Device for determining the longitudinal and angular position of a rotationally symmetrical apparatus |
TW200531724A (en) * | 2004-03-17 | 2005-10-01 | Zeroplus Technology Co Ltd | Game controlling system with displacement detecting capability |
US8583315B2 (en) | 2004-03-19 | 2013-11-12 | Agjunction Llc | Multi-antenna GNSS control system and method |
US7474297B2 (en) * | 2004-03-22 | 2009-01-06 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Contaminant-resistant optical mouse and cradle |
US7446756B2 (en) * | 2004-03-22 | 2008-11-04 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Apparatus for controlling the position of a screen pointer with low sensitivity to particle contamination |
DE102004014994A1 (en) * | 2004-03-26 | 2005-10-13 | Hofmann Mess- Und Auswuchttechnik Gmbh & Co. Kg | Method and device for detecting a relative movement between a detector and a body having a surface structure |
US7242466B2 (en) * | 2004-03-31 | 2007-07-10 | Microsoft Corporation | Remote pointing system, device, and methods for identifying absolute position and relative movement on an encoded surface by remote optical method |
US7174260B2 (en) * | 2004-04-01 | 2007-02-06 | Blue Line Innovations Inc. | System and method for reading power meters |
US7439954B2 (en) * | 2004-04-15 | 2008-10-21 | Logitech Europe S.A. | Multi-light-source illumination system for optical pointing devices |
US8325140B2 (en) | 2004-04-20 | 2012-12-04 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Illumination spot alignment |
US7292232B2 (en) * | 2004-04-30 | 2007-11-06 | Microsoft Corporation | Data input devices and methods for detecting movement of a tracking surface by a laser speckle pattern |
TWI238358B (en) * | 2004-05-12 | 2005-08-21 | Sunplus Technology Co Ltd | Optical mouse with shielding compensation, and the method for shielding compensation |
US7268341B2 (en) * | 2004-05-21 | 2007-09-11 | Silicon Light Machines Corporation | Optical position sensing device including interlaced groups of photosensitive elements |
US20050259097A1 (en) * | 2004-05-21 | 2005-11-24 | Silicon Light Machines Corporation | Optical positioning device using different combinations of interlaced photosensitive elements |
US20050258346A1 (en) * | 2004-05-21 | 2005-11-24 | Silicon Light Machines Corporation | Optical positioning device resistant to speckle fading |
US20050259078A1 (en) * | 2004-05-21 | 2005-11-24 | Silicon Light Machines Corporation | Optical positioning device with multi-row detector array |
US7285766B2 (en) * | 2004-05-21 | 2007-10-23 | Silicon Light Machines Corporation | Optical positioning device having shaped illumination |
US7773070B2 (en) | 2004-05-21 | 2010-08-10 | Cypress Semiconductor Corporation | Optical positioning device using telecentric imaging |
US7042575B2 (en) * | 2004-05-21 | 2006-05-09 | Silicon Light Machines Corporation | Speckle sizing and sensor dimensions in optical positioning device |
TWI240207B (en) * | 2004-06-11 | 2005-09-21 | Sunplus Technology Co Ltd | Method and system for real-time determining abnormality of pixel values for captured image |
US7653260B2 (en) * | 2004-06-17 | 2010-01-26 | Carl Zeiss MicroImaging GmbH | System and method of registering field of view |
US7565034B2 (en) * | 2004-06-17 | 2009-07-21 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Determination of a navigation window in an optical navigation system |
US7315013B2 (en) * | 2004-06-17 | 2008-01-01 | Avago Technologies Ecbu Ip (Singapore) Pte Ltd. | Optical navigation using one-dimensional correlation |
US8532338B2 (en) * | 2004-07-06 | 2013-09-10 | Hewlett-Packard Development Company, L.P. | System and method for compensating for motion blur in optical navigation |
US20060015804A1 (en) * | 2004-07-15 | 2006-01-19 | Microsoft Corporation | Method and system for presenting editable spreadsheet page layout view |
US7656395B2 (en) * | 2004-07-15 | 2010-02-02 | Microsoft Corporation | Methods and apparatuses for compound tracking systems |
US20060204061A1 (en) * | 2004-07-16 | 2006-09-14 | Atmel Grenoble S.A. | Method for the acquisition of an image of a finger print |
US20060023970A1 (en) * | 2004-07-29 | 2006-02-02 | Chinlee Wang | Optical tracking sensor method |
US7057148B2 (en) * | 2004-07-29 | 2006-06-06 | Ami Semiconductor, Inc. | Optical tracking sensor method |
US7176442B2 (en) * | 2004-08-13 | 2007-02-13 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical navigation device with optical navigation quality detector |
US7166831B2 (en) * | 2004-09-01 | 2007-01-23 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical mouse with replaceable contaminant barrier |
US7423633B2 (en) * | 2004-09-01 | 2008-09-09 | Avago Technologies Ecbu Ip Pte Ltd | Apparatus for controlling the position of a screen pointer with low sensitivity to fixed pattern noise |
US7126586B2 (en) * | 2004-09-17 | 2006-10-24 | Microsoft Corporation | Data input devices and methods for detecting movement of a tracking surface by detecting laser doppler self-mixing effects of a frequency modulated laser light beam |
US20060060653A1 (en) * | 2004-09-23 | 2006-03-23 | Carl Wittenberg | Scanner system and method for simultaneously acquiring data images from multiple object planes |
US7138620B2 (en) | 2004-10-29 | 2006-11-21 | Silicon Light Machines Corporation | Two-dimensional motion sensor |
TWI290221B (en) * | 2004-10-29 | 2007-11-21 | Silicon Light Machines Corp | Two-dimensional motion sensor |
US7189985B2 (en) * | 2004-10-30 | 2007-03-13 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Tracking separation between an object and a surface using a reducing structure |
US7248345B2 (en) * | 2004-11-12 | 2007-07-24 | Silicon Light Machines Corporation | Signal processing method for use with an optical navigation system |
WO2006068746A2 (en) | 2004-11-19 | 2006-06-29 | Silicon Light Machines Corporation | Dense multi-axis array for motion sensing |
WO2006060798A2 (en) * | 2004-12-02 | 2006-06-08 | Silicon Light Machines Corporation | Signal processing method for optical sensors |
US7313271B2 (en) | 2004-12-07 | 2007-12-25 | Avago Technologies Ecbuip (Singapore) Pte. Ltd. | Color detection using grayscale and position information |
US7379049B2 (en) | 2004-12-13 | 2008-05-27 | Avago Technologies Ecbu Ip Pte Ltd | Apparatus for controlling the position of a screen pointer based on projection data |
US7619612B2 (en) * | 2004-12-20 | 2009-11-17 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Pointing device with light source for providing visible light through a moveable puck |
US7557796B2 (en) * | 2004-12-22 | 2009-07-07 | Delphi Technologies, Inc. | Joystick sensor with two-dimensional image sensing |
US20060149425A1 (en) * | 2004-12-22 | 2006-07-06 | Davis Raymond A | Motion sensor system |
TWI288353B (en) * | 2004-12-24 | 2007-10-11 | Lite On Semiconductor Corp | Motion detection method |
US7114383B2 (en) * | 2005-01-07 | 2006-10-03 | Bridgestone Firestone North American Tire, Llc | Method and apparatus for monitoring tire performance |
US7499090B2 (en) | 2005-01-27 | 2009-03-03 | Datalogic Scanning, Inc. | Rolling-reset imager with optical filter |
US7215493B2 (en) * | 2005-01-27 | 2007-05-08 | Psc Scanning, Inc. | Imaging system with a lens having increased light collection efficiency and a deblurring equalizer |
US7224540B2 (en) * | 2005-01-31 | 2007-05-29 | Datalogic Scanning, Inc. | Extended depth of field imaging system using chromatic aberration |
US20060219863A1 (en) * | 2005-03-11 | 2006-10-05 | Burch Jefferson B | Obtaining data from a utility meter using image-based movement tracking |
EP1866871A4 (en) * | 2005-03-30 | 2012-01-04 | Worcester Polytech Inst | Free-hand three-dimensional ultrasound diagnostic imaging with position and angle determination sensors |
JP4616692B2 (en) * | 2005-04-21 | 2011-01-19 | 株式会社ミツトヨ | Displacement detector |
US7474848B2 (en) * | 2005-05-05 | 2009-01-06 | Hewlett-Packard Development Company, L.P. | Method for achieving correct exposure of a panoramic photograph |
US7656428B2 (en) * | 2005-05-05 | 2010-02-02 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Imaging device employing optical motion sensor as gyroscope |
CN100420260C (en) * | 2005-05-25 | 2008-09-17 | 光宝科技股份有限公司 | Image combination method |
US7898524B2 (en) | 2005-06-30 | 2011-03-01 | Logitech Europe S.A. | Optical displacement detection over varied surfaces |
US8300015B2 (en) * | 2005-07-05 | 2012-10-30 | Stmicroelectronics S.A. | Method of detecting the movement of an entity equipped with an image sensor and device for implementing same |
US8179967B2 (en) * | 2005-07-05 | 2012-05-15 | Stmicroelectronics S.A. | Method and device for detecting movement of an entity provided with an image sensor |
US20070023997A1 (en) * | 2005-07-28 | 2007-02-01 | Ertel John P | System and method for optically detecting media feeding malfunctions in an image forming apparatus |
US20070032318A1 (en) * | 2005-08-04 | 2007-02-08 | Nishimura Ken A | Motion sensor in sporting equipment |
US7522746B2 (en) * | 2005-08-12 | 2009-04-21 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Object tracking using optical correlation and feedback |
US7399954B2 (en) * | 2005-08-16 | 2008-07-15 | Avago Technologies Ecbu Ip Pte Ltd | System and method for an optical navigation device configured to generate navigation information through an optically transparent layer and to have skating functionality |
US7763875B2 (en) * | 2005-09-07 | 2010-07-27 | Romanov Nikolai L | System and method for sensing position utilizing an uncalibrated surface |
US7598979B2 (en) * | 2005-09-21 | 2009-10-06 | Aptina Imaging Corporation | Imaging device with blur reduction system including a primary array and at least one navigation array |
US7293459B2 (en) * | 2005-09-22 | 2007-11-13 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Image-based sensing of acceleration |
US7500732B2 (en) * | 2005-09-30 | 2009-03-10 | Lexmark International, Inc. | Maintenance and docking station for a hand-held printer |
US20070076082A1 (en) * | 2005-09-30 | 2007-04-05 | Lexmark International, Inc. | Methods and apparatuses for measuring print area using hand-held printer |
US7388539B2 (en) | 2005-10-19 | 2008-06-17 | Hemisphere Gps Inc. | Carrier track loop for GNSS derived attitude |
US20070109271A1 (en) * | 2005-11-14 | 2007-05-17 | Phison Electronics Corp. | [a portable storage device with handwritten input device] |
US7735951B2 (en) * | 2005-11-15 | 2010-06-15 | Lexmark International, Inc. | Alignment method for hand-operated printer |
US20070120937A1 (en) * | 2005-11-30 | 2007-05-31 | Lexmark International, Inc. | System and method for hand-held printing |
TW200722835A (en) * | 2005-12-09 | 2007-06-16 | Ind Tech Res Inst | Polymer dispersed liquid crystal emulsion and polymer dispersed liquid crystal composite film |
US7567235B2 (en) | 2005-12-12 | 2009-07-28 | Cypress Semiconductor Corporation | Self-aligning optical sensor package |
US8072502B2 (en) * | 2005-12-12 | 2011-12-06 | Sony Ericsson Mobile Communications Ab | Multi-mega pixel resolution with small sensor |
US8471191B2 (en) | 2005-12-16 | 2013-06-25 | Cypress Semiconductor Corporation | Optical navigation system having a filter-window to seal an enclosure thereof |
US7765251B2 (en) * | 2005-12-16 | 2010-07-27 | Cypress Semiconductor Corporation | Signal averaging circuit and method for sample averaging |
US7399129B2 (en) * | 2005-12-20 | 2008-07-15 | Lexmark International, Inc. | User interface for a hand-operated printer |
US7524051B2 (en) | 2005-12-20 | 2009-04-28 | Lexmark International, Inc. | Hand-operated printer having a user interface |
US7737948B2 (en) * | 2005-12-20 | 2010-06-15 | Cypress Semiconductor Corporation | Speckle navigation system |
TWI287828B (en) * | 2005-12-30 | 2007-10-01 | Ind Tech Res Inst | Method for printing a pattern and data processing method thereof |
US7298460B2 (en) * | 2006-01-03 | 2007-11-20 | Silicon Light Machines Corporation | Method for determining motion using a velocity predictor |
US20070181785A1 (en) * | 2006-02-09 | 2007-08-09 | Helbing Rene P | Compact optical navigation module and microlens array therefore |
US7884801B1 (en) | 2006-02-16 | 2011-02-08 | Cypress Semiconductor Corporation | Circuit and method for determining motion with redundant comb-arrays |
US7593833B2 (en) * | 2006-03-03 | 2009-09-22 | At&T Intellectual Property I, L.P. | System and method for determining performance of network lines |
US7557338B2 (en) * | 2006-03-14 | 2009-07-07 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Electronic device with integrated optical navigation module and microlens array therefore |
US7297912B1 (en) | 2006-03-27 | 2007-11-20 | Silicon Light Machines Corporation | Circuit and method for reducing power consumption in an optical navigation system having redundant arrays |
US7721609B2 (en) | 2006-03-31 | 2010-05-25 | Cypress Semiconductor Corporation | Method and apparatus for sensing the force with which a button is pressed |
US7809035B2 (en) * | 2006-03-31 | 2010-10-05 | Cypress Semiconductor Corporation | Eye-safe laser navigation sensor |
US20070237561A1 (en) * | 2006-04-11 | 2007-10-11 | Lexmark International Inc. | Methods and apparatuses for sensing a print area using a hand-held printer |
US7378643B2 (en) * | 2006-04-24 | 2008-05-27 | Avago Technologies General Ip Pte Ltd | Optical projection encoder with patterned mask |
US7748839B2 (en) | 2006-05-09 | 2010-07-06 | Lexmark International, Inc. | Handheld printing with reference indicia |
US7682017B2 (en) | 2006-05-10 | 2010-03-23 | Lexmark International, Inc. | Handheld printer minimizing printing defects |
US7470887B2 (en) * | 2006-05-23 | 2008-12-30 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Collapsible structure for optical navigation system |
US7492445B1 (en) | 2006-06-05 | 2009-02-17 | Cypress Semiconductor Corporation | Method and apparatus for robust velocity prediction |
US7755604B2 (en) | 2006-06-19 | 2010-07-13 | Cypress Semiconductor Corporation | Optical navigation sensor with tracking and lift detection for optically transparent contact surfaces |
US7787145B2 (en) * | 2006-06-29 | 2010-08-31 | Lexmark International, Inc. | Methods for improving print quality in a hand-held printer |
US7728816B2 (en) * | 2006-07-10 | 2010-06-01 | Cypress Semiconductor Corporation | Optical navigation sensor with variable tracking resolution |
US20080030534A1 (en) * | 2006-08-02 | 2008-02-07 | Adam Jude Ahne | Hand Held Micro-fluid Ejection Devices Configured to Eject Fluid without Referential Position Information and Method of Ejecting Fluid |
US7555824B2 (en) * | 2006-08-09 | 2009-07-07 | Hrl Laboratories, Llc | Method for large scale integration of quartz-based devices |
US7442916B2 (en) * | 2006-08-25 | 2008-10-28 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Lift detection adapted for navigation on a transparent structure |
US7675020B2 (en) * | 2006-08-28 | 2010-03-09 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Input apparatus and methods having diffuse and specular tracking modes |
US20080059069A1 (en) * | 2006-08-30 | 2008-03-06 | Trutna William R | System and method for detecting an object in the path of a vehicle |
US7889178B2 (en) | 2006-09-08 | 2011-02-15 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Programmable resolution for optical pointing device |
US8210758B2 (en) * | 2006-09-21 | 2012-07-03 | Lexmark International, Inc. | Guiding a hand-operated printer |
US20080075511A1 (en) * | 2006-09-21 | 2008-03-27 | William Henry Reed | Method for Position Acquisition for Use with a Hand-operated Printer |
US20080079956A1 (en) * | 2006-09-21 | 2008-04-03 | Mahesan Chelvayohan | Hand-Held Printer Having An Integrated Digital Camera Scanner |
US20080075513A1 (en) * | 2006-09-26 | 2008-03-27 | Douglas Laurence Robertson | Methods for a Maintenance Algorithm in Hand Held Printers |
US7748840B2 (en) | 2006-09-27 | 2010-07-06 | Lexmark International, Inc. | Methods and apparatus for handheld printing with optical positioning |
US7938531B2 (en) | 2006-09-27 | 2011-05-10 | Lexmark International, Inc. | Methods and apparatus for handheld printing with optical positioning |
US7918519B2 (en) | 2006-09-27 | 2011-04-05 | Lexmark International, Inc. | Methods and apparatus for handheld printing with optical positioning |
US7742514B1 (en) | 2006-10-31 | 2010-06-22 | Cypress Semiconductor Corporation | Laser navigation sensor |
US7570348B2 (en) | 2006-12-18 | 2009-08-04 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Methods and apparatus for navigating a surface |
US7514668B2 (en) * | 2006-12-19 | 2009-04-07 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical navigation device that utilizes a vertical cavity surface emitting laser (VCSEL) configured to emit visible coherent light |
US8072429B2 (en) * | 2006-12-22 | 2011-12-06 | Cypress Semiconductor Corporation | Multi-axial touch-sensor device with multi-touch resolution |
US9052759B2 (en) * | 2007-04-11 | 2015-06-09 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Dynamically reconfigurable pixel array for optical navigation |
US7567341B2 (en) * | 2006-12-29 | 2009-07-28 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical navigation device adapted for navigation on a transparent structure |
US7965278B2 (en) * | 2006-12-29 | 2011-06-21 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical navigation device adapted for navigation on a transparent plate |
US9411431B2 (en) * | 2006-12-29 | 2016-08-09 | Marvell World Trade Ltd. | Tracking a position in relation to a surface |
US8226194B1 (en) | 2007-01-02 | 2012-07-24 | Marvell International Ltd. | Printing on planar or non-planar print surface with handheld printing device |
US7949370B1 (en) | 2007-01-03 | 2011-05-24 | Marvell International Ltd. | Scanner for a mobile device |
US8000740B1 (en) | 2007-01-03 | 2011-08-16 | Marvell International Ltd. | Image translation device for a mobile device |
US8632266B1 (en) | 2007-01-03 | 2014-01-21 | Marvell International Ltd. | Printer for a mobile device |
US8077343B1 (en) | 2007-01-03 | 2011-12-13 | Marvell International Ltd. | Determining end of print job in handheld image translation device |
US7835832B2 (en) | 2007-01-05 | 2010-11-16 | Hemisphere Gps Llc | Vehicle control system |
US8311696B2 (en) | 2009-07-17 | 2012-11-13 | Hemisphere Gps Llc | Optical tracking vehicle control system and method |
USRE48527E1 (en) | 2007-01-05 | 2021-04-20 | Agjunction Llc | Optical tracking vehicle control system and method |
US8768558B2 (en) | 2007-01-05 | 2014-07-01 | Agjunction Llc | Optical tracking vehicle control system and method |
US8472066B1 (en) | 2007-01-11 | 2013-06-25 | Marvell International Ltd. | Usage maps in image deposition devices |
US8342627B1 (en) | 2007-01-11 | 2013-01-01 | Marvell International Ltd. | Adaptive filtering scheme in handheld positioning device |
US8396654B1 (en) | 2007-01-18 | 2013-03-12 | Marvell International Ltd. | Sensor positioning in handheld image translation device |
US7675630B2 (en) * | 2007-01-24 | 2010-03-09 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | System and method for selectively setting optical navigation resolution |
US7938532B2 (en) | 2007-02-16 | 2011-05-10 | Lexmark International, Inc. | Hand held printer with vertical misalignment correction |
US20080204489A1 (en) * | 2007-02-22 | 2008-08-28 | Mckinley Patrick A | Self-propelled image translation device |
EP2114688A1 (en) * | 2007-02-23 | 2009-11-11 | Marvell World Trade Ltd | Determining positioning of a handheld image translation device |
US8223384B1 (en) | 2007-02-23 | 2012-07-17 | Marvell International Ltd. | Defining a print image in memory for handheld image translation devices |
US8351062B2 (en) | 2007-02-26 | 2013-01-08 | Marvell World Trade Ltd. | Bit selection from print image in memory of handheld image translation device |
US8000381B2 (en) | 2007-02-27 | 2011-08-16 | Hemisphere Gps Llc | Unbiased code phase discriminator |
US8107108B1 (en) | 2007-02-28 | 2012-01-31 | Marvell International Ltd. | Providing user feedback in handheld device |
US20080213018A1 (en) * | 2007-03-02 | 2008-09-04 | Mealy James | Hand-propelled scrapbooking printer |
US8079765B1 (en) | 2007-03-02 | 2011-12-20 | Marvell International Ltd. | Hand-propelled labeling printer |
US8083422B1 (en) * | 2007-03-02 | 2011-12-27 | Marvell International Ltd. | Handheld tattoo printer |
US9294649B2 (en) * | 2007-03-02 | 2016-03-22 | Marvell World Trade Ltd. | Position correction in handheld image translation device |
US8339675B2 (en) * | 2007-03-02 | 2012-12-25 | Marvell World Trade Ltd. | Dynamic image dithering |
US8096713B1 (en) | 2007-03-02 | 2012-01-17 | Marvell International Ltd. | Managing project information with a hand-propelled device |
WO2008109543A1 (en) * | 2007-03-02 | 2008-09-12 | Marvell World Trade Ltd. | Position correction for handheld printer |
US20080219737A1 (en) * | 2007-03-07 | 2008-09-11 | Michael David Stilz | Hand Held Printer Having A Doppler Position Sensor |
US20080231600A1 (en) | 2007-03-23 | 2008-09-25 | Smith George E | Near-Normal Incidence Optical Mouse Illumination System with Prism |
US9180686B1 (en) * | 2007-04-05 | 2015-11-10 | Marvell International Ltd. | Image translation device providing navigational data feedback to communication device |
US7933166B2 (en) * | 2007-04-09 | 2011-04-26 | Schlumberger Technology Corporation | Autonomous depth control for wellbore equipment |
GB2462711B (en) * | 2007-04-09 | 2010-04-14 | Schlumberger Holdings | Autonomous depth control for wellbore equipment |
US8509487B2 (en) * | 2007-04-19 | 2013-08-13 | Avago Technologies General Ip (Singapore) Pte. Ltd. | System and method for optically measuring a parameter of an object |
US20080257962A1 (en) * | 2007-04-23 | 2008-10-23 | Chiu Lihu M | Acceleration-corrected barcode verification |
US8123322B1 (en) | 2007-06-11 | 2012-02-28 | Marvell International Ltd. | Manually operated image translation device |
US8705117B1 (en) | 2007-06-18 | 2014-04-22 | Marvell International Ltd. | Hand-held printing device and method for tuning ink jet color for printing on colored paper |
US20090015875A1 (en) * | 2007-06-20 | 2009-01-15 | Ctb/Mcgraw-Hill Companies, Inc. | Image manipulation of digitized images of documents |
EP2171641A4 (en) * | 2007-06-21 | 2012-11-14 | Univ Johns Hopkins | Manipulation device for navigating virtual microscopy slides/digital images and methods related thereto |
US8092006B2 (en) | 2007-06-22 | 2012-01-10 | Lexmark International, Inc. | Handheld printer configuration |
US8314774B1 (en) | 2007-07-09 | 2012-11-20 | Cypress Semiconductor Corporation | Method and apparatus for quasi-3D tracking using 2D optical motion sensors |
US10266398B1 (en) | 2007-07-25 | 2019-04-23 | Hrl Laboratories, Llc | ALD metal coatings for high Q MEMS structures |
US8263921B2 (en) | 2007-08-06 | 2012-09-11 | Cypress Semiconductor Corporation | Processing methods for speckle-based motion sensing |
US9555645B1 (en) | 2007-08-07 | 2017-01-31 | Marvell International Ltd. | Controlling a plurality of nozzles of a handheld printer |
US20090040286A1 (en) * | 2007-08-08 | 2009-02-12 | Tan Theresa Joy L | Print scheduling in handheld printers |
US8119975B2 (en) * | 2007-09-26 | 2012-02-21 | Crowsocs, Inc. | High speed deterministic, non-contact, 3-axis free trajectory measurement device and free trajectory imaging device |
US7808428B2 (en) | 2007-10-08 | 2010-10-05 | Hemisphere Gps Llc | GNSS receiver and external storage device system and GNSS data processing method |
US20090102793A1 (en) * | 2007-10-22 | 2009-04-23 | Microsoft Corporation | Optical mouse |
US8244062B2 (en) * | 2007-10-22 | 2012-08-14 | Hewlett-Packard Development Company, L.P. | Correction of distortion in captured images |
US20090135140A1 (en) * | 2007-11-27 | 2009-05-28 | Logitech Europe S.A. | System and method for accurate lift-detection of an input device |
US8847888B2 (en) * | 2007-12-18 | 2014-09-30 | Microsoft Corporation | Optical mouse with limited wavelength optics |
US20090160772A1 (en) * | 2007-12-20 | 2009-06-25 | Microsoft Corporation | Diffuse optics in an optical mouse |
US20090160773A1 (en) * | 2007-12-20 | 2009-06-25 | Microsoft Corporation | Optical mouse |
US8259069B1 (en) | 2008-01-11 | 2012-09-04 | Cypress Semiconductor Corporation | Speckle-based optical navigation on curved tracking surface |
US8031176B1 (en) | 2008-01-22 | 2011-10-04 | Cypress Semiconductor Corporation | Optical navigation system using a single-package motion sensor |
US8151640B1 (en) * | 2008-02-05 | 2012-04-10 | Hrl Laboratories, Llc | MEMS on-chip inertial navigation system with error correction |
US9002566B2 (en) | 2008-02-10 | 2015-04-07 | AgJunction, LLC | Visual, GNSS and gyro autosteering control |
US7802356B1 (en) | 2008-02-21 | 2010-09-28 | Hrl Laboratories, Llc | Method of fabricating an ultra thin quartz resonator component |
WO2009117253A1 (en) * | 2008-03-18 | 2009-09-24 | Marvell World Trade Ltd. | Handheld mobile printing device capable of real-time in-line tagging of print surfaces |
WO2009126587A1 (en) | 2008-04-08 | 2009-10-15 | Hemisphere Gps Llc | Gnss-based mobile communication system and method |
US8238639B2 (en) | 2008-04-09 | 2012-08-07 | Cognex Corporation | Method and system for dynamic feature detection |
DE102008024104A1 (en) * | 2008-05-17 | 2010-05-27 | Robert Bosch Gmbh | A material mark sensor and method for detecting a mark on or in a material |
US20100060592A1 (en) * | 2008-09-10 | 2010-03-11 | Jeffrey Traer Bernstein | Data Transmission and Reception Using Optical In-LCD Sensing |
US8541727B1 (en) | 2008-09-30 | 2013-09-24 | Cypress Semiconductor Corporation | Signal monitoring and control system for an optical navigation sensor |
US8212794B2 (en) | 2008-09-30 | 2012-07-03 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Optical finger navigation utilizing quantized movement information |
US7723659B1 (en) | 2008-10-10 | 2010-05-25 | Cypress Semiconductor Corporation | System and method for screening semiconductor lasers |
US8780206B2 (en) * | 2008-11-25 | 2014-07-15 | De La Rue North America Inc. | Sequenced illumination |
US8265346B2 (en) | 2008-11-25 | 2012-09-11 | De La Rue North America Inc. | Determining document fitness using sequenced illumination |
US8217833B2 (en) | 2008-12-11 | 2012-07-10 | Hemisphere Gps Llc | GNSS superband ASIC with simultaneous multi-frequency down conversion |
US8217334B1 (en) | 2008-12-24 | 2012-07-10 | Cypress Semiconductor Corporation | Optical navigation sensor including a spatial frequency filter |
US8315434B2 (en) | 2009-01-06 | 2012-11-20 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Absolute tracking in a sub-pixel range |
JP2011029832A (en) * | 2009-01-06 | 2011-02-10 | Seiko Epson Corp | Document reading apparatus |
US8386129B2 (en) | 2009-01-17 | 2013-02-26 | Hemisphere GPS, LLC | Raster-based contour swathing for guidance and variable-rate chemical application |
US8797298B2 (en) * | 2009-01-23 | 2014-08-05 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Optical fingerprint navigation device with light guide film |
US20100188332A1 (en) | 2009-01-23 | 2010-07-29 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Thin-film transistor imager |
US8259068B2 (en) | 2009-02-24 | 2012-09-04 | Avago Technologies Ecbu Ip (Singapore) Pte. Ltd. | Light beam shaping element for an optical navigation input device |
US8085196B2 (en) | 2009-03-11 | 2011-12-27 | Hemisphere Gps Llc | Removing biases in dual frequency GNSS receivers using SBAS |
US8711096B1 (en) | 2009-03-27 | 2014-04-29 | Cypress Semiconductor Corporation | Dual protocol input device |
US8339467B2 (en) * | 2010-03-25 | 2012-12-25 | Dacuda Ag | Synchronization of navigation and image information for handheld scanner |
US20100296133A1 (en) * | 2009-05-20 | 2010-11-25 | Dacuda Ag | Mode switching in a handheld scanner |
US8441696B2 (en) | 2009-05-20 | 2013-05-14 | Dacuda Ag | Continuous scanning with a handheld scanner |
US9300834B2 (en) | 2009-05-20 | 2016-03-29 | Dacuda Ag | Image processing for handheld scanner |
US8582182B2 (en) * | 2009-05-20 | 2013-11-12 | Dacuda Ag | Automatic sizing of images acquired by a handheld scanner |
US8441695B2 (en) * | 2009-05-20 | 2013-05-14 | Dacuda Ag | Handheld scanner with high image quality |
GB2470925A (en) * | 2009-06-09 | 2010-12-15 | Neopost Technologies | Automatically Adjusting Scanner Carrier Apparatus |
US8401704B2 (en) | 2009-07-22 | 2013-03-19 | Hemisphere GPS, LLC | GNSS control system and method for irrigation and related applications |
US8174437B2 (en) | 2009-07-29 | 2012-05-08 | Hemisphere Gps Llc | System and method for augmenting DGNSS with internally-generated differential correction |
US8611584B2 (en) * | 2009-08-17 | 2013-12-17 | Avago Technologies General Ip (Singapore) Pte. Ltd. | System and method for performing optical navigation using portions of captured frames of image data |
US8749767B2 (en) | 2009-09-02 | 2014-06-10 | De La Rue North America Inc. | Systems and methods for detecting tape on a document |
US8334804B2 (en) | 2009-09-04 | 2012-12-18 | Hemisphere Gps Llc | Multi-frequency GNSS receiver baseband DSP |
US8664548B2 (en) | 2009-09-11 | 2014-03-04 | Apple Inc. | Touch controller with improved diagnostics calibration and communications support |
US8649930B2 (en) | 2009-09-17 | 2014-02-11 | Agjunction Llc | GNSS integrated multi-sensor control system and method |
CN102022981B (en) * | 2009-09-22 | 2013-04-03 | 重庆工商大学 | Peak-valley motion detection method and device for measuring sub-pixel displacement |
US8176607B1 (en) | 2009-10-08 | 2012-05-15 | Hrl Laboratories, Llc | Method of fabricating quartz resonators |
US8194237B2 (en) | 2009-10-15 | 2012-06-05 | Authentix, Inc. | Document sensor |
US8548649B2 (en) | 2009-10-19 | 2013-10-01 | Agjunction Llc | GNSS optimized aircraft control system and method |
JP5359783B2 (en) * | 2009-10-28 | 2013-12-04 | ソニー株式会社 | Image processing apparatus and method, and program |
CN102052900B (en) * | 2009-11-02 | 2013-09-25 | 重庆工商大学 | Peak valley motion detection method and device for quickly measuring sub-pixel displacement |
US8687060B1 (en) | 2009-12-28 | 2014-04-01 | Cognex Corporation | System and method for providing distance-based pulses relative to motion of a surface scanned by a vision system |
US8583326B2 (en) | 2010-02-09 | 2013-11-12 | Agjunction Llc | GNSS contour guidance path selection |
US8497840B2 (en) * | 2010-03-25 | 2013-07-30 | Dacuda Ag | Computer peripheral for scanning |
US20120197461A1 (en) * | 2010-04-03 | 2012-08-02 | Geoffrey Louis Barrows | Vision Based Hover in Place |
US8912711B1 (en) | 2010-06-22 | 2014-12-16 | Hrl Laboratories, Llc | Thermal stress resistant resonator, and a method for fabricating same |
US9851849B2 (en) | 2010-12-03 | 2017-12-26 | Apple Inc. | Touch device communication |
WO2012075629A1 (en) * | 2010-12-08 | 2012-06-14 | Nokia Corporation | User interface |
US9429940B2 (en) | 2011-01-05 | 2016-08-30 | Sphero, Inc. | Self propelled device with magnetic coupling |
US9090214B2 (en) | 2011-01-05 | 2015-07-28 | Orbotix, Inc. | Magnetically coupled accessory for a self-propelled device |
US9218316B2 (en) | 2011-01-05 | 2015-12-22 | Sphero, Inc. | Remotely controlling a self-propelled device in a virtualized environment |
US10281915B2 (en) | 2011-01-05 | 2019-05-07 | Sphero, Inc. | Multi-purposed self-propelled device |
US9114838B2 (en) | 2011-01-05 | 2015-08-25 | Sphero, Inc. | Self-propelled device for interpreting input from a controller device |
TWI432009B (en) * | 2011-01-14 | 2014-03-21 | Genesys Logic Inc | Hand-held scanning system and method thereof |
WO2012173640A1 (en) | 2011-06-16 | 2012-12-20 | Cypress Semiconductor Corporation | An optical navigation module with capacitive sensor |
KR101830870B1 (en) * | 2011-06-22 | 2018-02-21 | 엘지전자 주식회사 | Method for displaying a scan image, display apparatus thereof, and method for acquiring information for a scan image, input apparatus thereof |
US9223440B2 (en) | 2011-06-28 | 2015-12-29 | Avago Technologies General Ip (Singapore) Pte. Ltd. | Optical navigation utilizing speed based algorithm selection |
US8730518B2 (en) * | 2011-08-18 | 2014-05-20 | Raytheon Company | Application of color imagery to a rewritable color surface |
US8896553B1 (en) | 2011-11-30 | 2014-11-25 | Cypress Semiconductor Corporation | Hybrid sensor module |
CN103295005B (en) * | 2012-03-01 | 2016-08-10 | 汉王科技股份有限公司 | Character string scanning method and scanning device |
US9827487B2 (en) | 2012-05-14 | 2017-11-28 | Sphero, Inc. | Interactive augmented reality using a self-propelled device |
US9292758B2 (en) | 2012-05-14 | 2016-03-22 | Sphero, Inc. | Augmentation of elements in data content |
KR20150012274A (en) | 2012-05-14 | 2015-02-03 | 오보틱스, 아이엔씨. | Operating a computing device by detecting rounded objects in image |
US10056791B2 (en) | 2012-07-13 | 2018-08-21 | Sphero, Inc. | Self-optimizing power transfer |
US8781685B2 (en) | 2012-07-17 | 2014-07-15 | Agjunction Llc | System and method for integrating automatic electrical steering with GNSS guidance |
US9053596B2 (en) | 2012-07-31 | 2015-06-09 | De La Rue North America Inc. | Systems and methods for spectral authentication of a feature of a document |
CN103791912A (en) * | 2012-10-30 | 2014-05-14 | 大陆汽车投资(上海)有限公司 | Navigation path planning device supporting hand-drawn paths |
US10278584B2 (en) | 2013-03-11 | 2019-05-07 | Carestream Dental Technology Topco Limited | Method and system for three-dimensional imaging |
WO2014139079A1 (en) * | 2013-03-11 | 2014-09-18 | Carestream Health, Inc. | A method and system for three-dimensional imaging |
US9250074B1 (en) | 2013-04-12 | 2016-02-02 | Hrl Laboratories, Llc | Resonator assembly comprising a silicon resonator and a quartz resonator |
WO2015028587A2 (en) | 2013-08-31 | 2015-03-05 | Dacuda Ag | User feedback for real-time checking and improving quality of scanned image |
US9599470B1 (en) | 2013-09-11 | 2017-03-21 | Hrl Laboratories, Llc | Dielectric high Q MEMS shell gyroscope structure |
DE102013110581B4 (en) * | 2013-09-24 | 2018-10-11 | Faro Technologies, Inc. | Method for optically scanning and measuring an environment and device therefor |
EP3089101A1 (en) | 2013-12-03 | 2016-11-02 | Dacuda AG | User feedback for real-time checking and improving quality of scanned image |
US9829882B2 (en) | 2013-12-20 | 2017-11-28 | Sphero, Inc. | Self-propelled device with center of mass drive system |
WO2015104236A1 (en) | 2014-01-07 | 2015-07-16 | Dacuda Ag | Adaptive camera control for reducing motion blur during real-time image capture |
EP4113457A1 (en) | 2014-01-07 | 2023-01-04 | ML Netherlands C.V. | Dynamic updating of composite images |
TWI534453B (en) * | 2014-02-18 | 2016-05-21 | 原相科技股份有限公司 | Relative position positioning system and tracking system |
US9977097B1 (en) | 2014-02-21 | 2018-05-22 | Hrl Laboratories, Llc | Micro-scale piezoelectric resonating magnetometer |
US9991863B1 (en) | 2014-04-08 | 2018-06-05 | Hrl Laboratories, Llc | Rounded and curved integrated tethers for quartz resonators |
US10484561B2 (en) | 2014-05-12 | 2019-11-19 | Ml Netherlands C.V. | Method and apparatus for scanning and printing a 3D object |
US10308505B1 (en) | 2014-08-11 | 2019-06-04 | Hrl Laboratories, Llc | Method and apparatus for the monolithic encapsulation of a micro-scale inertial navigation sensor suite |
KR102309863B1 (en) | 2014-10-15 | 2021-10-08 | 삼성전자주식회사 | Electronic device, controlling method thereof and recording medium |
US10031191B1 (en) | 2015-01-16 | 2018-07-24 | Hrl Laboratories, Llc | Piezoelectric magnetometer capable of sensing a magnetic field in multiple vectors |
US10110198B1 (en) | 2015-12-17 | 2018-10-23 | Hrl Laboratories, Llc | Integrated quartz MEMS tuning fork resonator/oscillator |
CN105630206B (en) * | 2015-12-23 | 2018-10-16 | 广州中国科学院先进技术研究所 | Touch localization method and system based on DIC |
US10175307B1 (en) | 2016-01-15 | 2019-01-08 | Hrl Laboratories, Llc | FM demodulation system for quartz MEMS magnetometer |
CN107370904A (en) * | 2016-05-13 | 2017-11-21 | 菱光科技股份有限公司 | Image-taking device and electronic system |
RU169458U1 (en) * | 2016-12-23 | 2017-03-21 | Акционерное общество "НПО "Орион" | Image signal generator based on a matrix photodetector with gradient correction of heterogeneity and defects of photosensitive elements |
EP3367655B1 (en) * | 2017-02-28 | 2021-08-18 | Global Scanning Denmark A/S | Optical flatbed scanner with document presser member |
CN111133356B (en) * | 2017-09-20 | 2022-03-01 | 富士胶片株式会社 | Image pickup apparatus, image pickup apparatus main body, and focus control method for image pickup apparatus |
EP3625609A4 (en) * | 2017-11-30 | 2021-03-03 | Leica Biosystems Imaging, Inc. | Impulse rescan system |
US10900776B2 (en) * | 2018-02-06 | 2021-01-26 | Saudi Arabian Oil Company | Sensor device for distance offset measurements |
US10679320B1 (en) * | 2018-07-23 | 2020-06-09 | Ambarella International Lp | High dynamic range sensor system with row increment operation |
JP7081520B2 (en) * | 2019-02-06 | 2022-06-07 | コニカミノルタ株式会社 | Measuring device, scanning direction determination system and scanning direction determination program |
DE102019108426A1 (en) | 2019-04-01 | 2020-10-01 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Device and method for three-dimensional acquisition of at least one object |
CN111178167B (en) * | 2019-12-12 | 2023-07-25 | 咪咕文化科技有限公司 | Method and device for checking lasting lens, electronic equipment and storage medium |
US11347327B2 (en) * | 2020-06-26 | 2022-05-31 | Logitech Europe S.A. | Surface classification and sensor tuning for a computer peripheral device |
KR102227531B1 (en) * | 2020-07-06 | 2021-03-15 | 주식회사 딥노이드 | Image processing apparatus and method for x-ray search apparatus |
US11736640B1 (en) * | 2022-08-22 | 2023-08-22 | Kyocera Document Solutions, Inc. | Method and apparatus for detecting sheet-fed scanner double-feeds using neural network classifier |
KR102567729B1 (en) * | 2023-03-15 | 2023-08-17 | (주)에이스디이씨 | Coordinate calculation system |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPS6042990B2 (en) * | 1978-05-22 | 1985-09-26 | 株式会社日立製作所 | Pattern recognition method |
FR2504673A1 (en) * | 1981-04-27 | 1982-10-29 | Thomson Csf | CARTOGRAPHIC INDICATOR RECORDED ON PHOTOGRAPHIC FILM |
US4723297A (en) * | 1984-09-27 | 1988-02-02 | Siemens Aktiengesellschaft | Method for automatic correction of character skew in the acquisition of a text original in the form of digital scan results |
US5023922A (en) * | 1985-06-07 | 1991-06-11 | Soricon Corporation | Optical character reader |
US4819083A (en) * | 1986-01-31 | 1989-04-04 | Konishiroku Photo Industry Co., Ltd. | Moving type image recording apparatus |
EP0249212B1 (en) * | 1986-06-11 | 1992-09-09 | Casio Computer Company Limited | Hand held manually sweeping printing apparatus |
US4797544A (en) * | 1986-07-23 | 1989-01-10 | Montgomery James R | Optical scanner including position sensors |
US4847786A (en) * | 1986-08-20 | 1989-07-11 | The Regents Of The University Of California | Object analysis of multi-valued images |
US4984287A (en) * | 1988-11-15 | 1991-01-08 | Msc Technologies, Inc. | Method for orienting a dual mouse optical scanner |
US4951214A (en) * | 1988-11-18 | 1990-08-21 | Texas Instruments Incorporated | Method for passively determining the relative position of a moving observer with respect to a stationary object |
US5089712A (en) * | 1989-06-08 | 1992-02-18 | Hewlett-Packard Company | Sheet advancement control system detecting fiber pattern of sheet |
JP2917155B2 (en) * | 1989-12-18 | 1999-07-12 | 株式会社日立製作所 | Image combining device and method |
US5355146A (en) * | 1990-03-05 | 1994-10-11 | Bmc Micro-Industries Ltd. | Multi-directional hand scanner and mouse |
US5675672A (en) * | 1990-06-26 | 1997-10-07 | Seiko Epson Corporation | Two dimensional linker for character string data |
US5185673A (en) * | 1991-06-12 | 1993-02-09 | Hewlett-Packard Company | Automated image calibration |
AU662947B2 (en) * | 1991-12-10 | 1995-09-21 | Logitech, Inc. | Apparatus and methods for automerging images |
US5686960A (en) * | 1992-01-14 | 1997-11-11 | Michael Sussman | Image input device having optical deflection elements for capturing multiple sub-images |
US5306908A (en) * | 1993-03-15 | 1994-04-26 | Hewlett-Packard Company | Manually operated hand-held optical scanner with tactile speed control assembly |
US5497150A (en) * | 1993-04-05 | 1996-03-05 | Smk Corporation | Image scanner |
GB2288512B (en) * | 1994-04-14 | 1998-08-26 | Matsushita Electric Ind Co Ltd | Image processing apparatus |
US5578813A (en) * | 1995-03-02 | 1996-11-26 | Allen; Ross R. | Freehand image scanning device which compensates for non-linear movement |
US5729008A (en) * | 1996-01-25 | 1998-03-17 | Hewlett-Packard Company | Method and device for tracking relative movement by correlating signals from an array of photoelements |
- 1995
  - 1995-03-02: US application US08/396,826, granted as US5578813A (not active, Expired - Lifetime)
  - 1995-12-14: AU application AU40460/95, granted as AU719574B2 (not active, Ceased)
- 1996
  - 1996-01-10: CA application CA002166904, published as CA2166904A1 (not active, Abandoned)
  - 1996-02-06: JP application JP02013596, granted as JP3860242B2 (not active, Expired - Fee Related)
  - 1996-02-07: DE application DE69609096T, granted as DE69609096T2 (not active, Expired - Lifetime)
  - 1996-02-07: EP application EP96300830, granted as EP0730366B1 (not active, Expired - Lifetime)
  - 1996-02-26: CN application CN96100020, granted as CN1120444C (not active, Expired - Fee Related)
  - 1996-02-29: KR application KR1019960005378, granted as KR100463947B1 (not active, IP Right Cessation)
  - 1996-03-04: EP application EP96904937, granted as EP0812505B1 (not active, Expired - Lifetime)
  - 1996-03-04: CN application CNB961923040, granted as CN1156143C (not active, Expired - Fee Related)
  - 1996-03-04: JP application JP52611296, granted as JP3720367B2 (not active, Expired - Lifetime)
  - 1996-03-04: DE application DE69608262T, granted as DE69608262T2 (not active, Expired - Lifetime)
  - 1996-03-04: WO application PCT/GB1996/000492, published as WO1996027257A2 (active, IP Right Grant)
  - 1996-03-04: US application US08/860,652, granted as US6005681A (not active, Expired - Lifetime)
  - 1996-08-14: US application US08/696,713, granted as US5644139A (not active, Expired - Lifetime)
- 1997
  - 1997-02-25: US application US08/805,963, granted as US5825044A (not active, Expired - Lifetime)
Also Published As
Publication number | Publication date |
---|---|
CN1156143C (en) | 2004-06-30 |
CN1135062A (en) | 1996-11-06 |
EP0730366B1 (en) | 2000-07-05 |
US6005681A (en) | 1999-12-21 |
EP0730366A2 (en) | 1996-09-04 |
CN1120444C (en) | 2003-09-03 |
JP3720367B2 (en) | 2005-11-24 |
CN1183873A (en) | 1998-06-03 |
DE69608262D1 (en) | 2000-06-15 |
KR100463947B1 (en) | 2005-05-27 |
US5825044A (en) | 1998-10-20 |
AU719574B2 (en) | 2000-05-11 |
WO1996027257A2 (en) | 1996-09-06 |
JPH08265518A (en) | 1996-10-11 |
JP3860242B2 (en) | 2006-12-20 |
DE69608262T2 (en) | 2001-02-08 |
US5578813A (en) | 1996-11-26 |
DE69609096T2 (en) | 2000-11-16 |
AU4046095A (en) | 1996-09-12 |
EP0730366A3 (en) | 1996-10-09 |
WO1996027257A3 (en) | 1997-02-20 |
EP0812505B1 (en) | 2000-05-10 |
KR960036517A (en) | 1996-10-28 |
DE69609096D1 (en) | 2000-08-10 |
EP0812505A2 (en) | 1997-12-17 |
JPH11501174A (en) | 1999-01-26 |
US5644139A (en) | 1997-07-01 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5578813A (en) | Freehand image scanning device which compensates for non-linear movement | |
US6249360B1 (en) | Image scanning device and method | |
US6195475B1 (en) | Navigation system for handheld scanner | |
US6303921B1 (en) | Method and system for capturing large format documents using a portable hand-held scanner | |
US6259826B1 (en) | Image processing method and device | |
EP0800307B1 (en) | Acquisition of data related to the surface topography of a medium | |
US6002124A (en) | Portable image scanner with optical position sensors | |
EP0277964A1 (en) | Optical scanner including position sensors. | |
JPS6139674A (en) | Handy scanning and inputting unit and system | |
EP0884890B1 (en) | Image processing method and device | |
EP1099934B1 (en) | Position sensing device and method | |
CN1621953A (en) | Document handler with improved optics | |
Allen et al. | Processes for Freehand Image Capture-HP CapShare Technology. |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| EEER | Examination request | |
| FZDE | Dead | |