US5761070A - Automatic color and grain sorting of materials - Google Patents
- Publication number
- US5761070A
- Authority
- US
- United States
- Prior art keywords
- color
- black
- cameras
- white
- image
- Prior art date
- Legal status
- Expired - Lifetime
Classifications
- B—PERFORMING OPERATIONS; TRANSPORTING
- B07—SEPARATING SOLIDS FROM SOLIDS; SORTING
- B07C—POSTAL SORTING; SORTING INDIVIDUAL ARTICLES, OR BULK MATERIAL FIT TO BE SORTED PIECE-MEAL, e.g. BY PICKING
- B07C5/00—Sorting according to a characteristic or feature of the articles or material being sorted, e.g. by control effected by devices which detect or measure such characteristic or feature; Sorting by manually actuated devices, e.g. switches
- B07C5/34—Sorting according to other particular properties
- B07C5/342—Sorting according to other particular properties according to optical properties, e.g. colour
- B07C5/3422—Sorting according to other particular properties according to optical properties, e.g. colour using video scanning devices, e.g. TV-cameras
Definitions
- the present invention relates to the automatic color and/or grain sorting of materials.
- the materials which can be sorted by this method include any man-made or natural materials which may vary in color or surface grain structure from one part to another due to natural causes or inconsistencies in manufacturing processes.
- Examples of materials which can be sorted by use of this invention include finished and unfinished hardwood and softwood parts; wood veneers; the hides of animals, birds, and reptiles and products made from these hides; ceramic tiles; man-made wood products with and without painted, imprinted, or embossed grain structure; metal and plastic parts with and without painted, imprinted, embossed or molded grain structure; woven and knitted textile products either natural, bleached, dyed, or printed; carpet; and bricks.
- the color and grain sorting of edge glued panel parts can be conceptually divided into three basic operations. First, since each part has two faces, the color class of each face must be determined. Second, using the information about the color class of each face, a decision must be made as to which of the two part faces is the better looking. The "best" face will be the face that appears in the "front" panel face, i.e., in a door front or a table top. Finally, the grain class of the better face must be determined so that the complete color and grain sort can be made. In many applications only a color sort is required. A grain sort is usually only needed for demanding applications such as the export market for Europe.
- the color sorting process is complicated by a number of factors. These factors usually result from the heterogeneity of the wood material. A part is not one color but a distribution of colors. There is a low frequency variation in color along the length and width of the piece. This situation is further complicated by a high frequency variation in the grain pattern appearing on a part face. There is further the variation in color from one hardwood species to another. Clearly, the sorting process must vary with hardwood species, i.e., there will be different color classes for each hardwood species. The sorting process may also vary with the product being manufactured, i.e., there might be tighter color quality standards for a high priced item than a low priced item.
- the variation that is acceptable across a panel face is usually product quality dependent. In very high quality hardwood tables, chairs, cabinets, etc., the variation in color across the panel must be slight while in lower quality products the variation can be much higher.
- any color matching system must provide a mechanism for easily altering the number and types of classes that the system will handle. That is, the system needs to provide the capability of handling a variety of species and provide a mechanism to conveniently switch from one species to another. The system needs to be able to ignore character marks that have been left in part faces while performing the color match. The system needs to have some adjustable parameters that allow a manufacturer to adjust the color uniformity of the sort. High quality products need to have parts sorted into a class in such a way that there is very little color variation from one part to another, while lower quality, less expensive products can be sorted in such a way as to allow a good deal of variation from one part to the next within a color class.
- a color sorting device should allow the manufacturer to easily change the priorities of the color classes. If a particular color is badly needed to complete the day's production schedule, any part that can reasonably be sorted into this color class should be so sorted. Finally, wood colors can vary significantly from one part to another. If a part does not reasonably belong to any color class that a sorting system has been taught to recognize, this part should not be assigned to any of these classes but rather should be sorted into an out class.
- the grain sorting component of the system needs to have similar flexibility. Since the grain patterns vary significantly with hardwood species, the system must provide the capability to easily switch back and forth among species. As is the case with color sorting the grain sorting operation should not be affected by the presence of character marks. The system needs some parameters that allow the manufacturer to control the uniformity of the grain sort. As is the case with color sorting the quality of the grain match on a sort depends on the quality and price of the item being produced. Also, grain patterns vary significantly from one part to another. If a part has a grain pattern that does not reasonably belong to any of the grain patterns the system has been taught to recognize, it should be sorted into an out class so that plant employees can make the decision as to what to do with the part.
- the color of any material is determined by the spectral makeup of light that is reflected from its surface.
- a material preferentially absorbs certain wavelengths of incident light while reflecting others, e.g., a material that is red in color reflects a large proportion of light in the visible red part of the electromagnetic spectrum, i.e., 600 to 700 nanometers (nm), while reflecting less light in the green (500 to 600 nm) and blue (400 to 500 nm) parts of the spectrum. Since a material's color is based on the preferential absorption and reflectance of incident light, the spectral characteristics of the incident light can alter the human perception of the material's color. If, for example, a blue light, i.e., one that produces only visible light in the 400 to 500 nm range, is used to illuminate the red material discussed above, this material will appear black to the human observer.
- the challenge was to define a meaningful standard(s) for the light source to be used and to define the spectral characteristics of the human visual system.
- a source of radiant energy may be characterized by its spectral energy distribution which specifies the time rate of energy the source emits per unit wavelength interval.
- a body that exists at an elevated temperature radiates electromagnetic energy.
- the amount of energy radiated is related to its temperature.
- a blackbody is an idealized type of heat radiator whose radiant flux is the maximum obtainable at any wavelength for a body at a fixed temperature.
- the spectral energy distribution of a blackbody is given by Planck's law.
- the key parameter which specifies the distribution of emitted light energy of a blackbody is the temperature of the body given in degrees Kelvin.
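The dependence of a blackbody's spectral balance on its temperature can be illustrated with a short sketch (not part of the patent); the 650 nm and 450 nm comparison wavelengths are illustrative choices:

```python
import math

# Physical constants (SI units)
H = 6.62607015e-34   # Planck constant, J*s
C = 2.99792458e8     # speed of light, m/s
K_B = 1.380649e-23   # Boltzmann constant, J/K

def blackbody_exitance(wavelength_nm: float, temp_k: float) -> float:
    """Spectral radiant exitance of a blackbody (Planck's law), W/m^3."""
    lam = wavelength_nm * 1e-9
    return (2 * math.pi * H * C**2 / lam**5) / math.expm1(H * C / (lam * K_B * temp_k))

# A 2854 K source (the tungsten-filament color temperature of C.I.E. source S_A)
# emits relatively more red than blue, while a 6700 K source is far more
# balanced across the visible band.
for temp in (2854.0, 6700.0):
    ratio = blackbody_exitance(650, temp) / blackbody_exitance(450, temp)
    print(f"{temp:.0f} K: red(650nm)/blue(450nm) exitance ratio = {ratio:.2f}")
```

At the lower color temperature the red-to-blue ratio is well above one, which is why a tungsten source renders colors so differently from afternoon sunlight.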
- the C.I.E. standard source S A is a tungsten filament bulb whose effective color temperature is 2854° K.
- Standard source S B approximates afternoon sunlight between 400 to 700 nm and has an effective color temperature of 6700° K.
- Standard source S C approximates light from an overcast sky between 400 to 700 nm and has an effective color temperature of 4900° K.
- C(λ) denote the spectrum of visible wavelengths, λ, of the color C
- R(λ) denote the red human response curve of FIG. 11
- G(λ) denote the green human response curve of FIG. 11
- B(λ) denote the blue human response curve of FIG. 11
- I(λ) denote the spectral characteristics of the illumination source.
- the spectral characteristics of the detector must be constant over the time interval measurements are being made. Theoretically, the best detector would be one that matches the spectral characteristics of a standard human observer.
- a metric needs to be defined on the three-dimensional color space so that the relative similarity of any two colors can be gauged.
- a spectrophotometer measures the spectral reflectance characteristics of a material at, typically, a small circular spot on the material's surface.
- the result is a vector valued measurement, M, where each element is the gauged reflectance value for some interval of wavelengths, Δλ_i.
- the dot product of the vector M with vector representations for R( ⁇ ), G( ⁇ ), and B( ⁇ ) can be taken to yield the triple (r, g, b), i.e., the color space coordinate for the color.
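The dot-product step above can be sketched as follows; the four-band measurement vector and the coarse response-curve samples are invented for illustration (real systems sample the C.I.E. observer curves far more finely):

```python
# Hypothetical 4-band spectrophotometer measurement M (reflectance per band,
# blue to red) and response-curve samples for the same bands.
M = [0.2, 0.3, 0.5, 0.8]
R = [0.0, 0.1, 0.4, 0.9]   # red response sampled at the same bands
G = [0.1, 0.6, 0.7, 0.2]
B = [0.9, 0.4, 0.1, 0.0]

def dot(u, v):
    """Dot product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

# The triple (r, g, b) is the color-space coordinate for the measured color.
r, g, b = dot(M, R), dot(M, G), dot(M, B)
print((r, g, b))
```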
- Solid state charge coupled device (CCD) cameras have varying sensitivity from one sensing element to another. This means that two spots on a part that have the same color can have different (r, g, b) values as measured by the camera. If all points on a part's face that have the same color are to have the same measured (r, g, b), then the lighting must be perfectly uniform across the part face. Achieving this uniformity is physically impossible to accomplish given today's state of lighting technology. Hence, a way around these problems must be found.
- Another problem involves the fact that color cameras are not calibrated by the manufacturer. Two different cameras can give different color measurements when both are looking at the same part under identical lighting conditions. This fact is of concern here because one of the objectives of a color and grain sorting system is to determine which of a part's two faces is the better to use in a panel. The reason for this is that to meet throughput requirements, two cameras must be used, one imaging each part face. Therefore, a solution to this problem must be found.
- the best standard light source to use in the measurement of color is the C.I.E. standard source S B that approximates the light of the afternoon sun.
- the only light sources capable of generating a 6700° K effective color temperature are arc lamps. These lamps are expensive to buy. Consequently, for at least this application, prudence demands that one look at light sources that have lower color temperatures.
- the requirements for a useful light source are that (1) it have a smooth spectral energy distribution from 400 to 700 nm, (2) that it not have the spectral lines that are present in the light output of most fluorescent bulbs, (3) that its spectral energy distribution function not change significantly across a bulb's lifetime, and (4) that it be relatively inexpensive.
- the first requirement suggests that some type of incandescent source be used.
- the second requirement rules out the most inexpensive incandescent technology available, i.e., the normal light bulb, as well as special incandescent bulbs used in color photography that have relatively high color temperatures. Seemingly the best light source for meeting all the above requirements is the quartz tungsten halogen bulb. This is probably the reason the C.I.E. has one standard source based on this technology even though this standard source does not correspond to any naturally occurring lighting situation.
- the amount of light emitted by a quartz tungsten halogen bulb only drops by about 2 percent over the course of its lifetime. The reason for this is the halogen cycle.
- the halogen in the bulb causes tungsten atoms that are thrown off the filament to reunite with the filament. This, in turn, markedly reduces filament thinning.
- These bulbs operate at very high temperatures, so high in fact that the filament enclosure must be made of clear quartz rather than glass.
- any color can be represented by a point (r, g, b) in three-dimensional color space.
- the storage requirement for one color is exactly three numbers, to be precise three integers, because the output of the selected color sensor is digitized for input into a computer.
- in order to develop a viable edge glued panel part color matching system, a viable color metric must be found that has a simple mathematical definition, that can be quickly computed, and that can, at least partially, mimic human judgments in selected sectors of color space. Defining this metric then is another problem that is not solved by the prior art, and must be addressed.
- the sorting algorithm makes the color assignment by determining which interval of the R color axis r n is in, which interval of the G color axis g n is in, and which interval of the B color axis b n is in.
- This sorting algorithm is capable of only crude color sorts.
- using the normalized quantities (r_n, g_n, b_n) means the system is insensitive to how light or dark a color is. This system can only separate colors with different hues and saturations. The above limitations mean that this system is inappropriate for the color sorting of edge glued panel parts.
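The interval-based prior-art scheme described above can be sketched as follows; the boundary values and function names are illustrative assumptions, not from any cited system:

```python
def normalize(r, g, b):
    """Normalized chromaticity coordinates: insensitive to lightness."""
    s = r + g + b
    return (r / s, g / s, b / s)

def interval_index(value, boundaries):
    """Index of the interval along one color axis that contains value."""
    for i, bound in enumerate(boundaries):
        if value < bound:
            return i
    return len(boundaries)

# Two boundaries per axis -> 3 intervals per axis, 27 possible (crude) classes.
BOUNDS = [0.30, 0.40]

def crude_class(r, g, b):
    rn, gn, bn = normalize(r, g, b)
    return (interval_index(rn, BOUNDS),
            interval_index(gn, BOUNDS),
            interval_index(bn, BOUNDS))

# A light brown and a dark brown normalize to the same chromaticity,
# so this scheme cannot separate them:
print(crude_class(200, 150, 100), crude_class(100, 75, 50))
```

Both calls land in the same class, which demonstrates why the normalized scheme cannot distinguish lightness and is inadequate for panel-part sorting.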
- U.S. Pat. No. 5,020,675 to Cowlin et al., titled "Apparatus for Sorting Conveyed Articles," describes the materials handling and imaging components of a system for finding blemishes on fruits and vegetables.
- the corresponding method for finding the blemishes is described in European Patent Application Publication No. 0 194 148.
- the blemish finding algorithm disclosed in the European Patent Application is based on finding areas that have significantly different color from the surrounding areas. As such this system does not address the problem of sorting parts into a preselected number of color classes, i.e., the problem that needs to be addressed in sorting edge glued panel parts.
- the Purdue system has serious flaws.
- the color of grain varies significantly from one part to another, as does the color of defects and even the color of pink, white, and brown. It is possible, even very probable, that one brown part may have grain the same color as the non-grain areas of another brown part. Given this situation, it seems very unlikely that any statistical classifier that defines decision boundaries in three-dimensional color space will give accurate results for the variety of color classes needed to color match edge glued panel parts.
- the major difficulty is that parts from one color class may have portions of their surfaces which are the same color as areas on parts from another color class.
- a system that can automatically color and/or grain sort materials, in particular panel-type parts having two opposite faces (a first or top face and a second or bottom face) to be color and grain sorted, is achieved by the provision of a materials conveyor for moving the part; a pair of color CCD line scan cameras positioned to view the top and bottom faces, respectively, of the hardwood part; quartz tungsten halogen lamps positioned to illuminate the top and bottom faces, respectively, of the hardwood part; a single switching power supply providing DC power to the tungsten halogen light sources; and a computer for analyzing the data from the top and bottom cameras.
- Each color camera has the appropriate infrared cutoff filter so that the sensitive locations on the charge coupled device (CCD) only respond to light in the visible range.
- CCD charge coupled device
- the cameras are contained in cooled enclosures. The cooling assures the cameras operate at a constant temperature.
- the power for the quartz tungsten halogen bulbs comes from a single switching power supply which provides DC power so that the color images obtained from the color cameras do not have striping caused by lighting variations induced by AC power.
- the switching power supply used has an input line that is used to control how much power is supplied to the bulbs.
- the power supplied depends on the voltage presented on the input line.
- the computer controls that voltage on this line using a digital to analog converter.
- the amount of voltage supplied is based on an examination of a target appearing in one of the camera's field of view outside the area in which the part appears. Picture elements (pixels) that cover an area of this target are averaged. This average is compared to a standard. If the average is slightly below the standard more voltage is applied to the line to the switching power supply. This causes the power supply to increase the voltage supplied to all the quartz tungsten halogen bulbs. If the computed average is slightly above the standard the computer reduces the voltage on the line to the power supply.
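One iteration of the closed-loop lamp control described above can be sketched as follows; the step size and deadband are illustrative tuning constants not given in the patent:

```python
def regulate_lamp_voltage(target_pixels, standard, control_voltage,
                          step=0.01, deadband=1.0):
    """One iteration of the target-based lamp power feedback loop.

    target_pixels   -- pixel values covering the reference target area
    standard        -- the average the system was calibrated to produce
    control_voltage -- current voltage on the power supply's control line
    step, deadband  -- illustrative tuning constants (not from the patent)
    """
    average = sum(target_pixels) / len(target_pixels)
    if average < standard - deadband:
        control_voltage += step     # target reads dark: raise bulb power
    elif average > standard + deadband:
        control_voltage -= step     # target reads bright: lower bulb power
    return control_voltage

# The target reads darker than the standard, so the computer nudges the
# control-line voltage upward via the digital to analog converter.
v = regulate_lamp_voltage([180, 182, 179, 181], standard=200.0, control_voltage=5.0)
print(v)
```

In the real system this loop runs continuously, so bulb aging and supply drift are compensated without operator intervention.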
- the computer and the switching power supply are enclosed in a dust free enclosure that is air-conditioned to maintain a steady operating temperature, to improve computer reliability and the stability of the power provided to the quartz tungsten halogen bulbs.
- the light emitted from the quartz tungsten halogen bulbs is sent through a fiber optic light line.
- the light emitting from these lines illuminates the parts.
- the bulbs directly illuminate the surface.
- camera controllers accept analog data from the camera heads and digitize it so that the information can be passed to the computer for analysis.
- a single analog to digital (A/D) converter is used in the controller, and blue filters are used on the cameras. These filters assure that red, green, and blue values will be reasonably close when a white target is inserted into a camera's field of view.
- three different A/D converters are used, each having its own offset and gain, thus eliminating the need for the blue filters.
- the color image data coming from a camera is directly transferred to the computer using a direct memory access (DMA) computer interface, allowing image data to be placed in computer memory without the need for the computer's central processing unit to have to perform any functions.
- DMA direct memory access
- an optical sensor and an ultrasonic width sensor are also provided.
- the optical sensor determines when a part is about to enter the field of view of the two cameras and turns the cameras on and off.
- the ultrasonic sensor is used to determine the number of pixels on each CCD line scan camera that must be read.
- data from a color camera is first passed through a high speed processing board that performs certain operations on the image data.
- the results of this processing are then passed to the computer via the DMA interface discussed above.
- the width and the location of the leading and lagging edges of the part are determined by the high speed processing board. Therefore, no sensors are provided, the cameras stay on all of the time, and the processing board only sends data for each part to the computer.
- two pneumatic devices controlled by the computer selectively insert a white target into the field of view of each camera where the edge glued panel parts pass through.
- the white target allows the computer to collect the data needed to do "shading compensation" on the collected color image data.
- the white targets are attached to the opposite faces of a carrier and are transported to the cameras by the materials conveyor.
- the shading compensation is done only on the digitized imagery.
- a "rough" analog shading correction is performed by the camera controller prior to the A/D conversion of the color imagery. After conversion, another "fine” shading correction is performed on the digital imagery.
- the rough analog shading correction is performed by controlling the offset and gain for each pixel going into each A/D converter.
- Three basic categories of algorithms are used in the sorting system in accordance with the present invention: (1) training algorithms, (2) real-time operation algorithms, and (3) utility algorithms.
- the training algorithms are used to teach the system to identify the color and grain classes that are to be used during the sorting process.
- the real-time algorithms actually perform the color and grain sorts with parts moving through the system at the desired throughput rate.
- the utility algorithms allow operators to display images, perform system tests, etc.
- Redundancy in the training algorithm for grain matching is eliminated by removing the effects of varying intensities among the color classes from the output of the edge detectors, using an equal probability quantizing algorithm.
- a training sample is one face of an edge-glued panel part considered to be prototypical of the color class to which the face has been assigned.
- the color sorting training algorithm is performed by analyzing a plurality of training samples, and includes the steps of: collecting a color image of a training sample; applying the shading compensation algorithm to the color image to produce a shading compensated color image; using the shading compensated color image to average the red, green, and blue channel values to form a black and white image; applying a preselected threshold to the black and white image, removing background pixels from further consideration; using the shading compensated color image to compute a three-dimensional color histogram for the training sample, ignoring any background pixels; using the black and white image to compute a black/white histogram for the training sample, again ignoring any background pixels; applying a character mark detection algorithm to the black/white histogram to find a threshold value; removing character mark pixels from the three-dimensional color histogram; normalizing the three-dimensional color histogram to convert it to an estimated probability function; adding the estimated probability function
- the remaining steps of the algorithm depend upon whether the decision as to color class is based on a single prototype used to represent each class or on a pattern classification method using a k-nearest neighbor classification procedure. Where color class is based on a single prototype, the algorithm includes the remaining steps of: using the color lookup table to map each running sum of three-dimensional estimated probability functions for a color class into a reduced measure vector; determining the prototype for each class based on the reduced measure vector; and for each class estimating the threshold by examining all the training samples for all the classes and selecting the threshold that gives the minimum probability of error.
- the algorithm includes the remaining steps of: using the color lookup table to map each estimated probability function into a reduced measure vector; and for each class, estimating the threshold by examining all of the training samples for all the classes and selecting the threshold that gives the best results.
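The histogram-building core of the training algorithm can be sketched as follows; the bin count, 8-bit channel range, and the simple brightness test for character-mark pixels are illustrative assumptions:

```python
def color_histogram_3d(pixels, bins=16, marks_threshold=None):
    """Three-dimensional (r, g, b) histogram of a part face.

    Background pixels are assumed to have been removed already; pixels
    darker than marks_threshold (average of the three channels) are
    treated as character marks and excluded from the color match.
    """
    hist = {}
    scale = 256 // bins          # illustrative: 8-bit channels, 16 bins/axis
    for r, g, b in pixels:
        if marks_threshold is not None and (r + g + b) / 3 < marks_threshold:
            continue             # character-mark pixel: ignore
        key = (r // scale, g // scale, b // scale)
        hist[key] = hist.get(key, 0) + 1
    return hist

def to_probability_function(hist):
    """Normalize the histogram so its entries sum to one, giving an
    estimated probability function for the face's color distribution."""
    total = sum(hist.values())
    return {key: count / total for key, count in hist.items()}

face = [(200, 150, 100)] * 3 + [(40, 30, 20)]   # three clear pixels, one dark mark
p = to_probability_function(color_histogram_3d(face, marks_threshold=60))
print(p)
```

Normalizing to a probability function makes faces of different sizes directly comparable, which is why the training algorithm performs this step before forming prototypes.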
- the shading correction algorithm is performed by collecting a selected number of lines of color image data from one of the cameras while the lens cap is on, computing the average response for each pixel in each channel of data, removing the lens cap and scanning the selected number of lines of color image data off the white target, computing the average response for each pixel in each channel of data, and applying the steps of a standard shading correction algorithm to the color imagery as it is being collected.
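The per-pixel correction applied to each scan line can be sketched with the standard two-point (dark frame plus white target) flat-field formula; the function and variable names are assumptions:

```python
def shading_correct(raw_line, dark_avg, white_avg, full_scale=255.0):
    """Per-pixel flat-field correction for one channel of one scan line.

    dark_avg  -- average lens-cap response per pixel (fixed-pattern offset)
    white_avg -- average response per pixel off the uniform white target
    """
    corrected = []
    for raw, dark, white in zip(raw_line, dark_avg, white_avg):
        span = white - dark
        value = (raw - dark) / span * full_scale if span > 0 else 0.0
        corrected.append(min(max(value, 0.0), full_scale))
    return corrected

# A pixel under weak illumination (white reads 200) and one under strong
# illumination (white reads 250) both map to full scale on the white target,
# and both map to the same value for the same physical reflectance:
dark = [10.0, 12.0]
white = [200.0, 250.0]
print(shading_correct([200.0, 250.0], dark, white))   # [255.0, 255.0]
print(shading_correct([105.0, 131.0], dark, white))   # [127.5, 127.5]
```

This is what removes the combined effect of non-uniform lighting and varying sensing-element sensitivity, so that equal colors measure equal anywhere in the field of view.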
- the character mark detection algorithm accepts as input a black and white histogram of a part face, and outputs a threshold used to eliminate the effects of character mark from the color sorting of the parts.
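The patent does not spell out the thresholding rule here, so the sketch below uses a simple valley-seeking heuristic as an illustrative stand-in: find the deepest valley between the two most-populated gray levels that are far enough apart.

```python
def character_mark_threshold(histogram, min_gap=5):
    """Valley-seeking sketch of the character mark detector (illustrative
    assumption; the patent's exact rule is not reproduced here).

    Returns the gray level of the deepest valley between the two
    most-populated gray levels at least min_gap apart; pixels at or below
    that level are treated as character marks.  Returns 0 for a unimodal
    (clear-face) histogram.
    """
    occupied = [g for g in range(len(histogram)) if histogram[g] > 0]
    peaks = sorted(occupied, key=lambda g: histogram[g], reverse=True)
    main = peaks[0]
    second = next((g for g in peaks[1:] if abs(g - main) >= min_gap), None)
    if second is None:
        return 0          # no separated dark mode: nothing to remove
    lo, hi = sorted((main, second))
    return min(range(lo, hi + 1), key=lambda g: histogram[g])

# Bimodal histogram: mineral-streak mode near level 2, clear wood near 8;
# unimodal histogram: a completely clear face.
marked = [0, 4, 9, 3, 1, 0, 2, 7, 12, 6]
clear = [0, 0, 1, 2, 30, 25, 3, 1, 0, 0]
print(character_mark_threshold(marked), character_mark_threshold(clear))
```

The two example histograms mirror the shapes shown in FIGS. 4 through 8: the bimodal case yields a usable threshold, the clear-face case yields none.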
- a color mapping algorithm is used to reduce the size of the measurement vector needed to represent each part face, and also to remove colors that might be character marks from the color class prototypes (if a simple pattern classification algorithm is employed), or from the training samples themselves (if a k-nearest neighbor pattern classification procedure is used), as well as from the two reduced measurement vectors that are used to represent the color of each part face during real-time operation.
- the color mapping algorithm is based on the Heckbert color mapping algorithm, with additional steps for removing character marks.
- the real-time color sorting algorithm has two embodiments, one based on using the distance from class prototype classifier, and the other based on using the k-nearest neighbor algorithm. Both embodiments have in common the steps of: collecting a color image of a sample part face; shade compensating the color image to remove non-uniformities in lighting and in sensing element sensitivities across the field of view; averaging the red, green, and blue components of each color pixel in the shade compensated color image to create a black and white image; applying a threshold to the black and white image to remove background pixels from consideration; computing a reduced measurement vector using the color lookup table; computing a histogram of the black and white image of the part face; applying the character mark algorithm to the histogram; removing character mark pixels from the reduced measurement vector; and normalizing the reduced measurement vector to form the normalized reduced measurement vector.
- the algorithm includes the additional steps of: removing character mark colors from each of the prototypes; forming a new set of modified prototypes with the effects of character marks removed from consideration; computing the distance from each of the prototypes; and assigning a color class to the part face based on this distance.
- the algorithm includes the additional steps of: removing character mark colors from each training sample to create a modified measurement vector for the sample; normalizing the modified measurement vector; computing the distance from the modified training sample; finding the k smallest of the distance values; finding the color class label that occurs the most often in the training vectors with the k smallest distances; and assigning the part face to a color class based on the shortest distance.
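The k-nearest-neighbor branch can be sketched as follows; the Euclidean distance, the tiny training set, and the class labels are illustrative assumptions standing in for the normalized reduced measurement vectors of a real system:

```python
from collections import Counter

def knn_color_class(sample_vector, training_set, k=3):
    """k-nearest-neighbor class assignment over measurement vectors.

    training_set is a list of (vector, class_label) pairs; Euclidean
    distance stands in for whatever metric a production system would use.
    """
    def distance(u, v):
        return sum((a - b) ** 2 for a, b in zip(u, v)) ** 0.5

    ranked = sorted(training_set, key=lambda tv: distance(sample_vector, tv[0]))
    nearest = ranked[:k]
    votes = Counter(label for _, label in nearest)
    top_count = max(votes.values())
    tied = {label for label, n in votes.items() if n == top_count}
    # Break ties by the shortest distance among the tied classes.
    for _, label in nearest:
        if label in tied:
            return label

training = [
    ((0.9, 0.1, 0.1), "red-brown"),
    ((0.8, 0.2, 0.1), "red-brown"),
    ((0.3, 0.3, 0.3), "gray"),
    ((0.2, 0.3, 0.4), "gray"),
]
print(knn_color_class((0.85, 0.15, 0.1), training, k=3))
```

A sample whose nearest neighbors are mostly "red-brown" training faces is assigned that class; the out-class test of the patent would additionally reject samples whose smallest distance exceeds the learned threshold.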
- the real-time color sorting algorithms pass three pieces of information about each part face to the best face selection algorithm: (1) the class label of the color class to which the face has been assigned; (2) a distance measure; and (3) Area_clear, the clear area of the part face.
- the best face selection algorithm then uses this information to determine which of the two faces is the best face.
- the grain matching training algorithm is performed by analyzing a plurality of training samples, and includes the steps of: collecting a color image of a training sample; applying the shading compensation algorithm to the color image to remove non-uniformities in lighting and in camera sensing element sensitivities across the field of view to produce a shading compensated color image; using the shading compensated color image to average the red, green, and blue channel values to form a black and white image; applying a preselected threshold to the black and white image, removing background pixels from further consideration; using the black and white image to compute a black/white histogram for the part face, ignoring any background pixels; applying a character mark algorithm to the black/white histogram to find the threshold value; removing character mark pixels from the black/white histogram; removing character mark areas from further consideration by labeling them as background pixels.
- the remaining steps of the algorithm depend upon whether the decision as to grain class is based on a single prototype used to represent each class or on a pattern classification method using a k-nearest neighbor classification procedure. Where grain class is based on a single prototype, the algorithm includes the remaining steps of: finding the class prototype for each class; and for each class estimating the threshold by examining all the training samples for all the classes and selecting the threshold that gives the minimum probability of error.
- the algorithm includes the remaining step of estimating the threshold for each class by examining all the training samples for all the classes and selecting the threshold that gives the best results.
- the real-time grain sorting algorithms are performed after both part faces have been color sorted, and after the best face of the part has been selected.
- the real-time grain sorting algorithm has two embodiments, one based on using the distance from class prototype classifier, and the other based on using the k-nearest neighbor algorithm.
- Both embodiments have in common the steps of: removing character mark pixels from the black and white histogram of the best part face; removing character mark areas from further consideration by labeling them as background pixels in the black and white image of the best part face; computing the normalizing factor; applying the equal probability quantizing algorithm to the black and white image of the best part face, the black/white histogram of the black and white image of the best part face, the normalizing factor, and the number of gray levels appearing in the requantized black and white image to obtain a requantized version of the black and white image; applying edge detectors respectively most sensitive to the vertical, horizontal, right diagonal, and left diagonal edges to the requantized version of the black and white image to obtain the gradient images which record the absolute values of, respectively, the vertical, horizontal, right diagonal, and left diagonal edges; averaging the left and right diagonal gradient images to find a single gradient image that indicates the number of diagonal edges present in either the right or left diagonal dimensions; and creating an edge histogram and normalizing it.
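The equal probability quantizing and directional edge detection steps above can be sketched as follows; the Prewitt-style kernels and the bin-selection rule are illustrative choices, since the patent does not specify the exact operators:

```python
def equal_probability_levels(histogram, n_levels):
    """Equal probability quantizing: pick gray-level cutpoints so each
    requantized level holds roughly the same number of pixels, removing
    overall-intensity differences between classes before edge detection."""
    total = sum(histogram)
    cutpoints, running, share = [], 0, total / n_levels
    for gray, count in enumerate(histogram):
        running += count
        if running >= share and len(cutpoints) < n_levels - 1:
            cutpoints.append(gray)
            share += total / n_levels
    return cutpoints

def requantize(image, cutpoints):
    """Map each pixel to its equal-probability level index."""
    def level(v):
        for i, cut in enumerate(cutpoints):
            if v <= cut:
                return i
        return len(cutpoints)
    return [[level(v) for v in row] for row in image]

# 3x3 kernels most sensitive to vertical, horizontal, right-diagonal, and
# left-diagonal edges (Prewitt-style choices; not specified by the patent).
KERNELS = {
    "vertical":       [[-1, 0, 1], [-1, 0, 1], [-1, 0, 1]],
    "horizontal":     [[-1, -1, -1], [0, 0, 0], [1, 1, 1]],
    "right_diagonal": [[0, 1, 1], [-1, 0, 1], [-1, -1, 0]],
    "left_diagonal":  [[1, 1, 0], [1, 0, -1], [0, -1, -1]],
}

def gradient_image(image, kernel):
    """Absolute value of the kernel response at each interior pixel."""
    rows, cols = len(image), len(image[0])
    out = [[0] * cols for _ in range(rows)]
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            acc = sum(kernel[j][i] * image[y - 1 + j][x - 1 + i]
                      for j in range(3) for i in range(3))
            out[y][x] = abs(acc)
    return out

# Grain running as a vertical boundary produces vertical-edge responses
# but no horizontal-edge responses:
stripes = [[0, 0, 3, 3] for _ in range(4)]
g_vert = gradient_image(stripes, KERNELS["vertical"])
g_horz = gradient_image(stripes, KERNELS["horizontal"])
print(sum(map(sum, g_vert)), sum(map(sum, g_horz)))
```

Histogramming these gradient responses (with the two diagonal images averaged) yields the normalized edge histogram that characterizes a face's grain pattern.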
- the algorithm includes the additional steps of: computing the distance from each of the prototypes; and assigning a grain pattern class to the part face based on the distance.
- the algorithm includes the additional steps of: computing the distance from the training sample; finding the k smallest of the distance values; finding the grain class label that occurs the most often in the training vectors with the k smallest distances; and assigning the part face to a grain class based on the shortest distance.
- the normalizing algorithm uses ten to fifteen color samples spanning the intensity levels that the cameras have been set up to handle, and includes the steps of: for each color sample, scanning the color sample to collect a color image of the sample from both cameras, applying the shading correction algorithm to the color images from both cameras, and computing the values of matrices representing the relative red, green, and blue responses of the two cameras based on the shading corrected color component images collected from both cameras; creating a two-dimensional plot of the relative red responses of the two cameras to all of the color targets, with the horizontal axis representing the output gray level value for the red channel of the first camera and the vertical axis representing the output gray level value for the red channel of the second camera; creating a two-dimensional plot of the relative green responses of the two cameras to all of the color targets, with the horizontal axis representing the output gray level value for the green channel of the first camera and the vertical
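The per-channel relative-response plots described above suggest fitting a mapping from one camera's output onto the other's; the least-squares linear fit below is an assumption for illustration (the patent text here only calls for the plots):

```python
def fit_channel_map(cam2_values, cam1_values):
    """Least-squares line mapping camera 2's channel output onto camera 1's.

    The (cam2, cam1) pairs come from imaging the same color samples with
    both cameras after shading correction.
    """
    n = len(cam2_values)
    mx = sum(cam2_values) / n
    my = sum(cam1_values) / n
    sxx = sum((x - mx) ** 2 for x in cam2_values)
    sxy = sum((x - mx) * (y - my) for x, y in zip(cam2_values, cam1_values))
    gain = sxy / sxx
    offset = my - gain * mx
    return gain, offset

# Red-channel readings of the same samples from the two (uncalibrated) cameras;
# here camera 1 consistently reads 10 gray levels higher.
cam2_red = [20.0, 60.0, 100.0, 140.0, 180.0]
cam1_red = [30.0, 70.0, 110.0, 150.0, 190.0]
gain, offset = fit_channel_map(cam2_red, cam1_red)
print(gain, offset)
```

Applying the fitted gain and offset to every pixel of the second camera makes the two faces of a part directly comparable, which is what the best-face selection step requires.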
- FIG. 1 is a diagrammatic view of apparatus for the color and grain sorting of a hardwood part, in accordance with the present invention.
- FIG. 2 is a top perspective view of the conveyor, illumination, and imaging elements of the apparatus of FIG. 1.
- FIG. 3A is a side elevational view of the conveyor, illumination, and imaging elements of FIG. 2, in which the targets used for shading compensation are placed in the camera fields of view by pneumatic devices.
- FIG. 3B is a side elevational view of the conveyor, illumination, and imaging elements of FIG. 2, in which the targets used for shading compensation are transported to the camera fields of view by a carrier moved by the conveyor.
- FIG. 4 is a graph of the typical shape of a black and white histogram extracted from a completely clear part face.
- FIG. 5 is a graph of the typical shape of a black and white histogram extracted from a part face that has a small amount of light-colored character marks present.
- FIG. 6 is a graph of the typical shape of a black and white histogram extracted from a part face that has a good deal of light-colored character marks.
- FIG. 7 is a graph of the typical shape of a black and white histogram extracted from a part face that has areas of very dark character marks and areas of light character marks.
- FIG. 8 is a graph of the typical shape of a black and white histogram extracted from a part face that has only areas of dark mineral streaks.
- FIG. 9 is a graph of the typical shape of a black and white histogram extracted from a completely clear part face.
- FIG. 10 is a graph illustrating the representation of a color C in three dimensional color space.
- FIG. 11 illustrates the spectral distribution curves specified by the C.I.E. for its standard narrow field observer.
- Referring to FIGS. 1, 2, 3A, and 3B, there is shown the apparatus for carrying out the system in accordance with one embodiment of the present invention in connection with the color and grain sorting of hardwood edge-glued parts, so that panels of uniform color and grain pattern can be created.
- the apparatus can be varied depending on, among other things, the material being color and/or grain sorted.
- the apparatus as shown in FIGS. 1-3 includes a materials conveyor 2 for moving a hardwood part P having first and second opposite faces F1 and F2 to be color and grain sorted.
- Hardwood part P is positioned on driving belt 4 of conveyor 2, supported at an angle of between approximately 70° and approximately 80° by a support fence 10, with face F1 facing away from support fence 10 (and henceforth referred to as the top face) and face F2 facing toward support fence 10 (and henceforth referred to as the bottom face), for a purpose to be described hereinafter.
- Hardwood part P is propelled by friction between its bottom edge and belt 4.
- Hardwood part P is illuminated by top and bottom sets of quartz tungsten halogen lamps 12a and 12b positioned to illuminate the top and bottom faces, F1 and F2, respectively.
- DC power for quartz tungsten halogen lamps 12a and 12b comes from a single switching power supply 14.
- Top and bottom sets of lamps 12a and 12b are horizontally offset from each other in a plane lying at an angle to belt 4, to illuminate opposite faces F1 and F2 of part P at different positions on conveyor 2.
- quartz tungsten halogen lamps 12a and 12b do not directly illuminate faces F1 and F2 of edge glued panel part P. Rather, the light emitted from lamps 12a and 12b is sent through fiber optic light lines 16. Faces F1 and F2 are illuminated by the light emitted from lines 16. There are several reasons for using lines 16, including, but not limited to, removing the lamps 12a and 12b from the area to be illuminated, so they will not heat the cameras; providing bright and even light over the length of the camera scan line; and facilitating changing a lamp in case it burns out. If a lamp burns out, both it and its enclosure can be easily removed without waiting for the lamp to cool. In another embodiment of the present invention, lamps 12a and 12b directly illuminate faces F1 and F2 of part P.
- a pair of color CCD line scan cameras 20a and 20b are positioned above and below top and bottom faces F1 and F2 to view them respectively.
- Each color camera 20a and 20b has a field of view larger than the hardwood part P, so that the hardwood part P only passes through a portion or area of the field of view.
- the optical axes of top and bottom cameras 20a and 20b are offset from each other horizontally and lie in a plane at an angle to the horizontal.
- the optical axes are parallel to each other and perpendicular to fence 10 (and hence, to faces F1 and F2 of part P), and both intersect fence 10 at a point approximately one (1) inch (2.54 cm) above drive belt 4.
- Edge glued panel parts are always at least one inch wide. Locating the optical axes one inch above drive belt 4 prevents top and bottom cameras 20a and 20b from imaging the top edge of hardwood part P as well as top and bottom faces F1 and F2.
- Hardwood part P and cameras 20a and 20b are set at an angle as described above so that neither of cameras 20a and 20b has to be located under part P as it is being imaged. Locating a camera below the area where part P traverses would result in dust, sawdust, and other debris falling off a part and onto the lens of the camera below.
- Color CCD line scan cameras are used because they provide a convenient method for handling parts of different lengths and they also provide a method for independently controlling both the cross part and down part spatial resolutions.
- Each color camera 20a and 20b has the appropriate infrared cutoff filter 22 (FIG. 2) so that the sensitive locations on the CCD only respond to light in the visible range.
- To reduce the amount of dark current produced by each of cameras 20a and 20b they are contained in cooled enclosures 24a and 24b (FIG. 1). The cooling assures that cameras 20a and 20b operate at a constant temperature, preferably less than or equal to 70°.
- power supply 14 provides DC power to lamps 12a and 12b so that the color images obtained from color cameras 20a and 20b do not have striping caused by lighting variations induced by AC power.
- Power supply 14 has a single input line 30 that is used to control how much power is supplied to each of lamps 12a and 12b. By supplying the power to each of lamps 12a and 12b over a single input line 30, lamps 12a and 12b can be controlled simultaneously and uniformly. The power supplied depends on the voltage presented on the input line 30.
- a computer 32 controls the voltage on input line 30 using a digital-to-analog converter 34. To improve the reliability of computer 32 and the stability of the power provided to the quartz tungsten halogen lamps 12a and 12b, computer 32 and the switching power supply 14 are both enclosed in a dust free enclosure 36 that is air-conditioned to maintain a steady operating temperature.
- the amount of voltage supplied to lamps 12a and 12b is based on an examination of a first target 40a appearing in the field of view of camera 20a. Although first target 40a appears in the field of view of camera 20a, it appears outside the area through which part P passes.
- the picture elements (pixels) of the viewing camera (i.e., camera 20a) that cover an area of first target 40a are averaged. This average is compared by computer 32 to a standard stored in the memory of computer 32. If the average is slightly below the standard, more voltage is applied on line 30 to switching power supply 14. This causes power supply 14 to increase the voltage supplied to all the quartz tungsten halogen lamps 12a and 12b. If the computed average is slightly above the standard, computer 32 reduces the voltage on line 30 to power supply 14. This causes the power supply to decrease the amount of power supplied to all the quartz tungsten halogen lamps 12a and 12b. If the average is markedly different from the standard, system operation is halted and an alarm is displayed on the system console 42 to inform operators that a lamp has burned out or that the shading compensation values need to be recalculated.
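- The lamp-voltage feedback just described can be sketched as a simple control loop. All names and the step, tolerance, and alarm-band values below are illustrative assumptions; the excerpt specifies only the qualitative behavior.

```python
def lamp_control_voltage(target_pixels, standard, voltage,
                         step=0.05, tolerance=0.02, alarm_band=0.2):
    """Adjust the lamp supply voltage from the averaged target pixels.

    Returns (new_voltage, alarm). The numeric parameters are hypothetical;
    the patent gives only the slightly-below / slightly-above / markedly-
    different behavior.
    """
    avg = sum(target_pixels) / len(target_pixels)
    deviation = (avg - standard) / standard
    if abs(deviation) > alarm_band:
        # Markedly different from the standard: halt and raise an alarm
        # (lamp burn-out, or shading compensation needs recalculation).
        return voltage, True
    if deviation < -tolerance:        # slightly below standard: more voltage
        voltage += step
    elif deviation > tolerance:       # slightly above standard: less voltage
        voltage -= step
    return voltage, False
```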
- a second target 40b similar to target 40a appears in the field of view of camera 20b, also outside the area through which part P passes. Pixels from camera 20b that cover an area of second target 40b are also averaged. If the value of this average is markedly different from the standard, system operation is halted and an alarm is displayed on system console 42 to inform operators that a lamp has burned out or that the shading compensation values need to be recalculated.
- targets 40a and 40b are ceramic, and are white, although they can also be some shade of gray, or other colors.
- First and second camera controllers 50a and 50b accept analog data from the heads of cameras 20a and 20b, respectively, and digitize it so that the data can be passed to computer 32 for analysis.
- each controller 50a and 50b uses a single analog to digital (A/D) converter 52.
- the same gain and offset are used for the red, green, and blue pixels.
- blue filters 54 must be used on cameras 20a and 20b.
- Blue filters 54 (FIG. 2) assure that red, green, and blue values will be reasonably close when shading corrections are made, as described hereinafter.
- three different A/D converters 52 are used, each having its own offset and gain. This removes the need for blue filters 54. Removing blue filters 54 markedly increases the amount of light reaching the heads of cameras 20a and 20b. With more light reaching the camera heads, either fewer or less intense lamps 12a and 12b can be used. Because less intense lamps typically have longer expected lifetimes, their use can markedly reduce routine maintenance requirements.
- the color image data coming from cameras 20a and 20b are directly transferred to computer 32 using direct memory access (DMA) computer interfaces 60a and 60b.
- the use of interfaces 60a and 60b allows image data to be placed in computer memory without the need for the computer's central processing unit to have to perform any functions.
- Computer 32 is thus allowed to process image data from a previously-viewed part P while image data from the next part P is being collected.
- optical sensor 62 for determining when a part P is about to enter the field of view of cameras 20a and 20b
- ultrasonic width sensor 64 for determining the width of the entering part P.
- Optical sensor 62 is used to turn cameras 20a and 20b on and off.
- Ultrasonic sensor 64 is used to determine the number of pixels on each of CCD cameras 20a and 20b that must be read. The goal of both of these sensors 62 and 64 is to minimize the amount of data that is collected and must be analyzed.
- data from color cameras 20a and 20b is first passed through high speed processing boards 70a and 70b that perform certain operations on the image data.
- the data processed by processing boards 70a and 70b are then passed to computer 32 via DMA interfaces 60a and 60b.
- the use of processing boards 70a and 70b allows the system in accordance with the present invention to have significantly higher throughput.
- the width and the location of the leading and lagging edges of a part P are determined by the high speed processing boards 70a and 70b. Therefore, no sensors are provided, cameras 20a and 20b stay on all of the time, and the processing boards 70a and 70b only send data to the computer 32 for the pixels representing the part P. All other pixels are deleted by the processing boards 70a and 70b.
- a pair of pneumatic devices 80a and 80b controlled by computer 32 provide the capability of selectively inserting targets 82a and 82b, respectively, into the fields of view of cameras 20a and 20b, respectively, where the panel parts P pass through.
- a board 80 having targets 82a and 82b set into its faces can selectively be placed on the conveyor 2 for use in shading compensation.
- Targets 82a and 82b can, for example, be set into grooves routed in the faces of board 80 and are positioned on board 80 so that when board 80 is placed on conveyor 2 in the proper location, targets 82a and 82b are in the scan lines for cameras 20a and 20b, respectively, where the edge glued panel parts P pass through.
- targets 82a and 82b have faces which lie in the same plane as faces F1 and F2 will lie when part P is being scanned, and should be "whiter" than any part P that will be inspected.
- targets 82a and 82b should have a reflectivity which substantially matches that of the faces F1 and F2 of the edge glued panel parts P being inspected. For implementations requiring less sensitivity, targets having a reflectance of 99% are adequate, as a reflectance of 99% will encompass almost any article being examined.
- Targets 82a and 82b preferably are white ceramic tiles which are uniform in color across their surfaces. Targets 82a and 82b can also be some shade of gray, but must be equally reflective at each color channel of cameras 20a and 20b.
- Targets 82a and 82b allow computer 32 to collect the data needed to do "shading compensation" on the collected color image data.
- the shading compensation algorithm is used to remove any variations in lighting that occur across the fields of view of cameras 20a and 20b. It also removes the variations in sensitivity across the sensitive elements of the CCD cameras 20a and 20b.
- the shading compensation is done only on the digitized imagery.
- a "rough" analog shading correction is performed by camera controllers 50a and 50b prior to the A/D conversion of the color imagery by A/D converters 52. After conversion, another "fine” shading correction is performed on the digital imagery.
- the rough analog shading correction is performed by controlling the offset and gain for each pixel going into each A/D converter 52.
- an ink jet printer or the like (not shown) is positioned to spray onto the parts P a code indicating the color code and the best face of each part P.
- conveyor 2 is extended, and provided with a plurality of pneumatically-operated kick-off assemblies (not shown) which sort the parts P into bins (also not shown).
- the algorithms used in the system can be broken down into three basic categories: (1) training algorithms, (2) real-time operation algorithms, and (3) utility algorithms.
- the training algorithms are used to teach the system to identify the color and grain classes that are to be used during the sorting process.
- the real-time algorithms actually perform the color and grain sorts with parts moving through the system at the desired throughput rate.
- the utility algorithms allow operators to display images, perform system tests, etc.
- the number and type of utility algorithms can and will vary slightly with each device that is produced, in a manner which will be understood by those of skill in the art.
- these utility algorithms are used only for checks and are unrelated to the uniqueness of the present invention. Hence, only the training and real-time implementation algorithms will be described herein.
- One of the objects of the present invention is to eliminate this "redundant" training by removing the effects of varying intensities among the color classes from the output of the edge detectors used.
- a relatively standard procedure for removing the differences in intensities across the training samples is to use the equal probability quantizing (EPQ) algorithm as described by R. Conners and C. Harlow, "Equal Probability Quantizing and Texture Analysis of Radiographic Images," 8 Computer Graphics and Image Processing (1978), pp. 447-463. This algorithm is as follows:
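- The core idea of equal probability quantizing can be sketched as follows. This is a generic formulation of the EPQ idea, not a reproduction of the Conners and Harlow step listing; the function name and its boundary handling are assumptions.

```python
def equal_probability_quantize(histogram, n_levels):
    """Map input gray levels to n_levels output levels so that each output
    level receives an approximately equal share of the pixels.

    histogram[g] is the pixel count at gray level g; the returned list
    gives mapping[g], the requantized value of gray level g.
    """
    total = sum(histogram)
    target = total / n_levels          # pixels each output level should get
    mapping, cum, level = [], 0.0, 0
    for count in histogram:
        mapping.append(level)
        cum += count
        # Advance to the next output level each time a full share is filled.
        while cum >= target and level < n_levels - 1:
            cum -= target
            level += 1
    return mapping
```

A flat histogram therefore maps onto evenly sized gray-level bands, which removes overall intensity differences between training samples before the edge detectors are applied.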
- N color classes have been selected for the sort.
- a training sample is one face of an edge-glued panel part. This face represents what is considered to be a prototypical example of the color class to which the face has been assigned.
- To perform system training the selected parts are all run through the system with their selected faces up.
- the training algorithm is as follows:
- a single prototype is used to represent each class.
- a part face is assigned to a class based on the distance its color measurement vector is from the various class prototypes.
- the prototype that produces the minimum distance is the class to which the part face is assigned given that this distance is less than some specified threshold.
- Using these thresholds provides the capability of having an out class.
- a part face is assigned to the out class if it lies too far away from the prototype of the class to which it has been assigned. This distance from class prototype color sorting method is useful if the parts in each class have a very uniform color.
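- The distance-from-prototype sort with an out class can be sketched as follows. This is a hypothetical Python sketch: the l 1 distance is used because the text later names l 1 as the metric employed, and all names are illustrative.

```python
def classify_by_prototype(vector, prototypes, thresholds):
    """Assign a part face to the color class whose prototype is nearest
    (l1 distance). Faces farther from their nearest prototype than that
    class's threshold go to the out class, returned here as None."""
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    best_class = min(prototypes, key=lambda n: l1(vector, prototypes[n]))
    d = l1(vector, prototypes[best_class])
    if d > thresholds[best_class]:
        return None, d            # too far from every prototype: out class
    return best_class, d
```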
- a second embodiment of this aspect of the invention uses the k-nearest neighbor classification procedure, as described by R. Duda and P. Hart in Pattern Classification and Scene Analysis (John Wiley, New York 1973).
- This pattern classification method is useful if one or more of the color classes is made up of parts that have a variety of different colors.
- This method for color sorting parts is much more computationally complex than the method described above and hence should only be used when one or more of the color classes contain parts that have a variety of different colors.
- To use this classification method the measurement vector from each sample in each color class must be saved in computer memory. The measurement vector used to characterize the color of a part face is computed. The distance from this measurement vector to each of the measurement vectors computed from each of the training samples is determined.
- the k measurement vectors of the training samples that are the closest to the one computed from the part face are then found.
- the part face is assigned to the class to which most of the k-nearest training sample measurement vectors belong.
- To provide for an out class the distance between the part face's measurement vector and the closest training sample in the class to which the part face has been assigned is compared to a threshold value. If this distance is less than or equal to the threshold value, the part face's classification remains unchanged. If, however, this distance is greater than the threshold value, the part face's classification is changed to the out class.
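- The k-nearest neighbor sort with the out-class test described above can be sketched as follows (names are illustrative; the l 1 distance is assumed, as elsewhere in the text):

```python
from collections import Counter

def knn_classify(vector, training, k, thresholds):
    """k-nearest-neighbor color sort with an out class.

    training is a list of (class_label, measurement_vector) pairs saved
    from the training samples, as the text requires."""
    def l1(a, b):
        return sum(abs(x - y) for x, y in zip(a, b))
    # Find the k training vectors closest to the part face's vector.
    ranked = sorted(training, key=lambda s: l1(vector, s[1]))
    nearest = ranked[:k]
    # Majority vote among the k nearest neighbors.
    label = Counter(lbl for lbl, _ in nearest).most_common(1)[0][0]
    # Out-class test: distance to the closest training sample of the
    # winning class, compared against that class's threshold.
    d = min(l1(vector, v) for lbl, v in training if lbl == label)
    if d > thresholds[label]:
        return None, d
    return label, d
```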
- the results of the training algorithm are stored in disk files that are read by the system's real-time operation algorithm when it is started up. In this way, training files for a number of different species can be created and stored on disk. This allows the system to handle a variety of different species.
- the appropriate training files for the species to be processed only need be loaded into the real-time algorithm at the time it is started up.
- the thresholds T n defined in either Step 6 or Step 5', depending on the classification method being used, are only approximations to the actual values for these thresholds. This follows from the fact that under typical training conditions only a very limited number of training samples are used. Plant personnel can change these values after the training is performed. These parameters provide the mechanism for plant personnel to change the uniformity of the sort. If a T n is made small, then all the parts sorted into color class n will have approximately the same color. If T n is made larger, then parts with more diverse colors will be sorted into that color class. Having a separate parameter for each color class allows optimal control of the sorting process.
- the purpose of the shading correction algorithm is to remove nonuniformities in lighting and/or sensitivity of sensing elements across a camera's field of view.
- a standard shading correction algorithm as described by A. Sawchuk, "Real-time Correction of Intensity Nonlinearities in Imaging System," IEEE Transactions on Computers, Vol. 26, No. 1 (1977), pp. 34-39, is employed in this invention. Since color line scan cameras are being used, the description given will be for such cameras, though the methodologies used can be employed on array cameras as well, as will be understood by those of skill in the art.
- shading correction can be applied to color imagery as it is being collected.
- the procedure is as follows:
- the above represents an all-digital way to shade compensate color image data.
- the all-digital method is used in one embodiment of this aspect of the invention.
- shading correction is performed as a two step process.
- This analog correction is based on the digital processing described immediately above in Steps 1 through 4.
- This information is then used to control the offset and gain on analog-to-digital (A/D) converters 52 found in the camera controller 50a or 50b.
- In this mode of operation there are three A/D converters 52 used, one each for the red, green, and blue color channels.
- the offset and gain to each converter 52 is varied as a function of i just as is done in the all-digital algorithm that is described above.
- the rough analog shading correction is done, then the purely digital method is applied as a second step. Only after the analog method is running can Steps 1 through 4 be performed to create the coefficients for the purely digital algorithm.
- This variation has a number of advantages. First, it gets rid of the blue color filters that are needed to adjust camera sensitivity. Removing the blue filters greatly increases the amount of light that reaches the camera for any setting of light source intensity. Hence, lower power bulbs can be used thus increasing bulb life. Secondly, the combined analog/digital processing greatly reduces the data loss that can occur in purely digital processing when lighting intensity varies significantly over the field of view.
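- The digital portion of the shading correction can be sketched as a standard flat-field normalization using a scan of a white target and a dark (offset) reference. This is a common formulation of such corrections, not the patent's exact coefficient computation, and all names are illustrative.

```python
def shading_correct(scanline, white_ref, black_ref, out_max=255):
    """Per-pixel flat-field correction for one color channel of a line
    scan camera: each pixel is offset by the dark reference and scaled by
    the white-target response recorded at the same pixel position, which
    removes lighting and sensor-sensitivity variation across the line."""
    corrected = []
    for raw, w, b in zip(scanline, white_ref, black_ref):
        span = max(w - b, 1e-9)                  # avoid division by zero
        value = (raw - b) * out_max / span
        corrected.append(min(max(value, 0.0), out_max))  # clip to range
    return corrected
```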
- FIG. 4 shows the typical shape of H BW extracted from a completely clear part face.
- the values of H BW are initially zero.
- a gray level value of 0 corresponds to black and a gray level value of 255 is completely white.
- once the elements of H BW become non-zero, they rapidly increase in value until reaching a peak.
- this peak is a double peak as shown in the figure.
- a double peak results from the difference in color between early wood and late wood on a part face.
- FIG. 5 shows the typical shape of H BW extracted from a part face that has a small area of light-colored character marks present. Note the difference in this shape from that shown in FIG. 4.
- the shape difference can be characterized by the fact that once the elements of H BW become non-zero they initially do not increase in value as rapidly as is the case for completely clear parts. Note that variations in the shape of H BW caused by character marks will always appear on the left side of the peak because character marks are always darker than clear wood.
- FIG. 6 shows yet another possibility. It shows the typical shape of H BW extracted from a part face that has a good deal of light-colored character marks. Statistically speaking the resulting shape is caused by the mixing of two populations each having a slightly different mean value. The key to recognizing this shape is that there is a significant inflection point that occurs on the left side of the clear wood peak.
- FIG. 7 shows a slightly more complicated situation. It shows the typical shape of H BW extracted from a part face that has areas of very dark character marks and areas of light character marks. The areas of dark character marks cause the small peak shown. The highest peak must be clear wood since edge glued panel parts have the vast majority of their surface area composed of completely clear wood. The areas of light character marks in the part face once again cause the inflection point on the left side of the main peak.
- FIG. 8 shows the last situation that generally occurs. This shape for H BW results from parts that have only areas of dark character marks. No inflection points occur in this instance.
- the Gaussian filter is defined by an input variable, σ 2 , and by the equation g(i) = (1/(√(2π)σ))·exp(−i 2 /(2σ 2 )). 2. Apply a standard peak detection method to H SBW to locate the positions of all the peaks in H SBW .
- Peak NH ≧ HF × Peak H , where HF is a program input variable and 0 < HF < 1.0
- DF is a program input variable
- Steps 4 through 6 are aimed at identifying potential double peaks caused by early wood and late wood. If such peaks exist, the desire is to locate the left-most one. Once the left-most peak is located, the algorithm starts looking from the location of this peak backwards in an effort to find either a negative-to-positive (looking left to right) inflection point or a valley. The importance of inflection points was discussed above with respect to FIGS. 6 and 7. The objective is to find the first negative-to-positive inflection point as one proceeds from the left-most peak to gray level 0. If such an inflection point is found first, then T CM is set equal to the position of this inflection point, as is suggested by FIGS. 6 and 7.
- T CM is set equal to the position of this valley.
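- The T CM estimation just described (smooth the histogram, find the clear-wood peak, walk left until an inflection point or a valley is found) can be sketched as follows. The smoothing width and the discrete second-difference test are assumptions; a pure clear-wood peak will also yield its natural left-flank inflection here.

```python
import math

def estimate_tcm(hist, sigma=2.0):
    """Estimate the character-mark threshold T_CM from a black and white
    histogram: Gaussian-smooth it, locate the dominant (clear wood) peak,
    then scan leftward for an inflection point or a valley."""
    # Gaussian smoothing of the histogram (edge values are clamped).
    radius = int(3 * sigma)
    kernel = [math.exp(-(t * t) / (2 * sigma * sigma))
              for t in range(-radius, radius + 1)]
    ksum = sum(kernel)
    n = len(hist)
    smooth = [sum(hist[min(max(i + t, 0), n - 1)] * kernel[t + radius]
                  for t in range(-radius, radius + 1)) / ksum
              for i in range(n)]
    peak = max(range(n), key=lambda i: smooth[i])  # dominant clear-wood peak
    # Walk left from the peak: a positive second difference marks the
    # concave-up (inflection) region; a local minimum marks a valley.
    for i in range(peak - 1, 0, -1):
        if i + 1 < n:
            d2 = smooth[i + 1] - 2 * smooth[i] + smooth[i - 1]
            if d2 > 0:
                return i                    # inflection point found
            if smooth[i] <= smooth[i - 1] and smooth[i] < smooth[i + 1]:
                return i                    # valley found
    return 0
```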
- the color mapping algorithm is very similar to the ones used to reduce the number of colors needed to give a good visual rendering of a color image.
- this algorithm has an additional very important requirement, by which it is distinguished from other color mapping algorithms.
- This algorithm must make it computationally simple to remove those elements of a reduced measurement vector that correspond to colors that might lie in areas containing character marks. That is, once the threshold T CM has been estimated from the black and white histogram H BW , the algorithm must provide a convenient means for setting to zero the elements of a reduced measurement vector that correspond to colors whose black and white values are less than T CM .
- the measurement vectors involved include the reduced measurement of a part face and the reduced measurement vectors of the color class prototypes, if the distance-from-prototype classification algorithm is employed, or the reduced measurement vectors of the individual training samples themselves, if the k-nearest neighbor pattern classification procedure is used.
- the color mapping algorithm employed in this invention starts with a traditional algorithm, one conceived by Paul Heckbert, "Color Image Quantization for Frame Buffer Display,” 16 Computer Graphics, No.3 (July 1982), pp. 297-307 ("the Heckbert article"), to which we have added features so that the additional requirement can be met.
- a GL (l) is the average black and white value of the distribution of colors defined by s(i, j, k) in the l th rectangular parallelepiped.
- the algorithm repeatedly subdivides color space into smaller and smaller rectangular parallelepipeds. It starts by identifying the parallelepiped (box) that tightly encloses all the non-zero elements, s(i, j, k), of S. Then recursion is used to identify the final rectangular parallelepiped. The recursion is based on the following:
- the box is "shrunk" to fit tightly around the non zero elements it encloses by finding the minimum and maximum coordinates for each of these elements.
- this box is partitioned. To do this the enclosed points are sorted along the longest dimension of the box and segregated into two halves at the median point. Approximately equal numbers of points will fall on each side of the cutting plane.
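- One subdivision step of this recursion can be sketched as follows. This is a simplified sketch in the spirit of Heckbert's median-cut algorithm; the function names and the point representation (RGB triples) are illustrative.

```python
def shrink_box(points):
    """Tightly fit a box (per-axis min and max) around RGB color points."""
    mins = tuple(min(p[a] for p in points) for a in range(3))
    maxs = tuple(max(p[a] for p in points) for a in range(3))
    return mins, maxs

def split_box(points):
    """One median-cut step: sort the enclosed points along the box's
    longest dimension and cut at the median, so approximately equal
    numbers of points fall on each side of the cutting plane."""
    mins, maxs = shrink_box(points)
    axis = max(range(3), key=lambda a: maxs[a] - mins[a])  # longest side
    ordered = sorted(points, key=lambda p: p[axis])
    mid = len(ordered) // 2
    return ordered[:mid], ordered[mid:]
```

Applying `split_box` recursively to each half yields the final set of rectangular parallelepipeds described in the text.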
- the matrix A GL is generated.
- INTEGER is a function that converts numbers into integers without rounding.
- Mxgl is a program input variable that specifies the number of gray levels being used. This variable has the same value as the dimension of the color space being used.
- the real-time operation has two contemplated embodiments. One is based on using the distance from class prototype classifier. The other is based on using the k-nearest neighbor algorithm. The steps that are common to these two embodiments are given below:
- Shade compensate this color image to remove nonuniformities in lighting and in sensing element sensitivities across the field of view, producing SCI s .
- SCI s is composed of three matrices, SR s , SG s , and SB s just as above.
- Compute the black and white histogram H BW = h bw (i)! of the black and white image of the part face. That is, for all x, y such that bw(x, y) ≠ 0
- each color class n has a total of m samples (n) training samples. Then, if the k-nearest neighbor classification rule is used, the following steps need to be performed:
- the computer processes one part face at a time, with the same steps being applied to each face.
- the real-time system loads parameters needed to do the classification from a disk file. This file is created by the training algorithm.
- the files to be loaded are user defined.
- to process red oak, an operator must type in the file names of the red oak training files. If, later, the desire is to change from red oak to hard maple, the operator halts red oak real-time processing. Then he restarts the real-time system, this time typing in the file names of the hard maple files.
- additional classes may have to be added in order to span the set of colors that occur in a hardwood species' parts.
- the l p norm or metric used in this embodiment is l 1 .
- This norm was chosen because it seemingly provided reasonably accurate results while, at the same time, being very easy to compute. As a greater understanding of this technology is developed a different l p norm may be found that gives better sorting accuracies.
- the real-time classification algorithms pass three pieces of information about each part face to the best face selection algorithm. These are (1) the class label of the color class to which the face has been assigned; (2) a distance measure; and (3) Area clear , the clear area of the part face. If the distance from prototype algorithm is used, the distance passed is D n , the distance defined in Step 14 of the first embodiment of the real-time color sorting algorithm. If the k-nearest neighbor algorithm is used, the distance passed is D color .sbsb.label , defined in Step 13' of the second embodiment of the real-time color sorting algorithm.
- label 1 is the color class label assigned to face 1 of the part
- D 1 is the distance measure passed for this face
- Area 1 is the clear area of this part face
- Similarly, label 2 is the color class label assigned to face 2 of the part, D 2 is the distance measure passed for this face, and Area 2 is the clear area of the second part face.
- T Area is a program input variable.
- If label 1 has a higher priority than label 2 , then assign the best face to face 1 and sort the part into color class label 1 .
- the priority of each class is a program input parameter.
- If label 2 has a higher priority than label 1 , then assign the best face to face 2 and sort the part into color class label 2 .
- If color classes label 1 and label 2 both have the same priority and D 1 ≦ D 2 , assign the best face to face 1 and sort the part into color class label 1 .
- If color classes label 1 and label 2 both have the same priority and D 2 < D 1 , assign the best face to face 2 and sort the part into color class label 2 .
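- The best face selection rules above can be collected into one hypothetical function. The clear-area step is an assumption: the excerpt names the input variable T Area but does not spell out its use, so here a face whose clear area exceeds the other's by more than T Area is assumed to win outright before priorities are consulted.

```python
def select_best_face(face1, face2, priority, t_area):
    """Choose the best face of a part. Each face is a (label, distance,
    clear_area) triple; priority maps class labels to priority values.
    Returns (face_number, class_label)."""
    (lab1, d1, a1), (lab2, d2, a2) = face1, face2
    # Assumed clear-area step: a markedly clearer face wins outright.
    if a1 - a2 > t_area:
        return 1, lab1
    if a2 - a1 > t_area:
        return 2, lab2
    # Priority rules from the text.
    if priority[lab1] > priority[lab2]:
        return 1, lab1
    if priority[lab2] > priority[lab1]:
        return 2, lab2
    # Equal priority: the face with the smaller distance measure wins.
    return (1, lab1) if d1 <= d2 else (2, lab2)
```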
- a robust color sorting system must provide the capability for management to easily change the relative priorities of the various color classes to meet the ever-changing needs of the manufacturing process.
- This invention attempts to do just that.
- the priorities of the various color classes are stored in a disk file. At the time that real-time operation is begun these parameters are read from disk and stored in computer memory.
- an operator need only halt real-time operation, and access a utility program that can alter the priorities of the color classes as they are defined in the disk file. Once the modifications have been saved to the file, the operator only need start real-time operation again.
- Training for grain matching is done independently of the training that is done for color sorting.
- the grain matching operation requires only black and white imagery. Since, for color matching, color images are collected, one of the first operations is to convert a color image of a face into a black and white image of the face.
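- The color-to-black-and-white conversion can be sketched as a per-pixel average of the three color components. The patent does not state the exact conversion, so the equal weighting here is an assumption; a luminance-style weighting would be an equally plausible choice.

```python
def to_black_and_white(red, green, blue):
    """Convert co-registered color planes (2-D lists of equal shape) to a
    black and white image by averaging the three color components at each
    pixel."""
    return [[(r + g + b) / 3.0 for r, g, b in zip(rr, gg, bb)]
            for rr, gg, bb in zip(red, green, blue)]
```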
- the grain matching training procedure shares a number of common steps with the training used for color matching.
- N_g grain classes have been selected for the sort.
- a training sample is one face of an edge-glued panel part. This face represents what is considered to be a prototypical example of the grain class to which the face has been assigned.
- To perform system training the selected parts are all run through the system with their selected faces up.
- the training algorithm is as follows:
- CI_nm: Collect the color image of the m-th training sample of class n.
- T_background is a preselected threshold.
- G_D = [g_D(x, y)], which indicates the number of diagonal edges present in either the right or left diagonal directions, i.e., for all x and y let
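The diagonal edge image G_D above, formed per g_D(x, y) = INTEGER[(g_RD(x, y) + g_LD(x, y))/2 + 0.5], can be sketched as follows. The 2x2 diagonal-difference operators are an assumption; this excerpt does not reproduce the patent's exact edge operators:

```python
import numpy as np

def diagonal_edge_image(bw):
    """Combine right- and left-diagonal edge responses of a black
    and white image into G_D, rounding per
    g_D(x, y) = INTEGER[(g_RD(x, y) + g_LD(x, y))/2 + 0.5]."""
    bw = bw.astype(int)
    g_rd = np.abs(bw[1:, 1:] - bw[:-1, :-1])   # right-diagonal response
    g_ld = np.abs(bw[1:, :-1] - bw[:-1, 1:])   # left-diagonal response
    return ((g_rd + g_ld) / 2 + 0.5).astype(int)

# A small test pattern with one strong diagonal edge.
g_d = diagonal_edge_image(np.array([[0, 0, 10], [0, 10, 0], [10, 0, 0]]))
assert (g_d == np.array([[5, 0], [0, 5]])).all()
```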
- a single prototype is used to represent each class.
- a part face is assigned to a class based on the distance its grain pattern measurement vector is from the various class prototypes.
- the part face is assigned to the class of the prototype that produces the minimum distance, provided that this distance is less than some specified threshold.
- Using these thresholds provides the capability of having an out class.
- a part face is assigned to the out class if it lies too far away from the prototype of the class to which it has been assigned. This distance-from-class-prototype grain sorting method is useful if the parts in each class have a very uniform grain direction.
- Pr_n = [pr_n(Δi, Δj, Δk)] is the prototype for class n.
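The distance-from-class-prototype classifier with an out class can be sketched as follows; the l_1 distance matches the norm named later in this description, but the class names, prototype values, and helper names are illustrative assumptions:

```python
import numpy as np

def classify_by_prototype(eg, prototypes, thresholds):
    """Assign a grain measurement vector to the class whose
    prototype is nearest (l1 distance); route it to the out class
    when that minimum distance exceeds the class threshold T_gn."""
    dists = {n: np.abs(eg - pr).sum() for n, pr in prototypes.items()}
    n_best = min(dists, key=dists.get)
    if dists[n_best] > thresholds[n_best]:
        return "out"
    return n_best

prototypes = {"straight": np.array([1.0, 0.0]), "wavy": np.array([0.0, 1.0])}
thresholds = {"straight": 0.5, "wavy": 0.5}
assert classify_by_prototype(np.array([0.9, 0.1]), prototypes, thresholds) == "straight"
assert classify_by_prototype(np.array([0.5, 0.5]), prototypes, thresholds) == "out"
```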
- a second embodiment of the grain sorting training algorithm in accordance with the present invention uses the k-nearest neighbor classification procedure, as described by Duda and Hart.
- This pattern classification method is useful if one or more of the grain pattern classes is made up of parts that have a variety of different grain patterns. Note, however, that it is much more computationally complex than the method described above and hence should only be used in that situation.
- To use this classification method the measurement vector from each sample in each grain class must be saved in computer memory. The measurement vector used to characterize the grain pattern of a part face is computed. The distance from this measurement vector to each of the measurement vectors computed from each of the training samples is determined.
- the k measurement vectors of the training samples that are the closest to the one computed from the part face are then found.
- the part face is assigned to the class to which most of the k-nearest training sample measurement vectors belong.
- To provide for an out class the distance between the part face's measurement vector and the closest training sample in the class to which the part face has been assigned is compared to a threshold value. If this distance is less than or equal to the threshold value the part face's classification remains unchanged. If, however, this distance is greater than the threshold value, the part face's classification is changed to the out class.
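The k-nearest neighbor procedure with an out class, as described above, can be sketched as follows. The l_1 distance, class names, and sample vectors are illustrative assumptions:

```python
import numpy as np
from collections import Counter

def knn_classify(v, samples, k, out_thresholds):
    """k-nearest-neighbor grain classification with an out class.
    `samples` is the list of (class, vector) pairs saved from the
    training samples; l1 is used as the distance measure."""
    ranked = sorted(samples, key=lambda s: np.abs(v - s[1]).sum())
    votes = Counter(cls for cls, _ in ranked[:k])
    winner = votes.most_common(1)[0][0]
    # Distance to the closest training sample of the winning class
    # decides whether the face is reassigned to the out class.
    d_min = min(np.abs(v - vec).sum() for cls, vec in samples if cls == winner)
    return winner if d_min <= out_thresholds[winner] else "out"

samples = [("straight", np.array([1.0, 0.0])),
           ("straight", np.array([0.9, 0.1])),
           ("wavy", np.array([0.0, 1.0]))]
assert knn_classify(np.array([0.95, 0.05]), samples, k=3,
                    out_thresholds={"straight": 0.3, "wavy": 0.3}) == "straight"
```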
- the step needed to implement this second embodiment of the training algorithm is as follows:
- For n = 1, . . . , N_g, estimate the threshold T_gn by examining all the training samples, EG_nm, for all the classes and selecting the T_gn that gives the best results.
- the results of the grain sorting training algorithm are stored in disk files that are read by the system's real-time grain sorting algorithm when it is started up.
- training files for a number of different species can be created and stored on disk. This allows the system to handle a variety of different species.
- the appropriate training files for the species to be processed only need be loaded into the real-time algorithm at the time it is started up.
- T_gn defined in either Step 3 or in Step 2', depending on the classification method being used, are only approximations to the actual values for these thresholds. This follows from the fact that under typical training conditions only a very limited number of training samples are used. Plant personnel can change these values after the training is performed. These parameters provide the mechanism for plant personnel to change the uniformity of the sort. If a T_gn is made smaller, then all the parts sorted into a grain class will have approximately the same grain pattern. If T_gn is made larger, then parts with more diverse grain patterns will be sorted into a grain class. Having a separate parameter for each grain class allows optimal control of the sorting process.
- the real-time operation has two embodiments. One is based on using the distance from class prototype classifier. The other is based on using the k-nearest neighbor algorithm. The steps that are common to these two embodiments are given below:
- H_BW = [h_BW(i)] of the best part face, i.e., for 0 ≦ i ≦ T_CM
- G_D = [g_D(x, y)], which indicates the number of diagonal edges present in either the right or left diagonal directions, i.e., for all x and y let
- each grain class n has a total of m_g(n) training samples. Then, if the k-nearest neighbor classification rule is used, the following steps need be performed:
- the real-time system loads parameters needed to do the classification from a disk file. This file is created by the grain sorting training algorithm.
- the files to be loaded are user defined. Hence, to process red oak, an operator must input the file names of the red oak training files. If, later, the desire is to change from red oak to hard maple, the operator halts red oak real-time processing. Then he restarts the real-time system, this time typing in the file names of the hard maple files.
- additional classes may have to be added in order to span the set of grain patterns that occur in a hardwood species' parts.
- the l_p norm or metric used in this embodiment is l_1.
- This norm was chosen because it seemingly provides reasonably accurate results while, at the same time, being very easy to compute. As a greater understanding of this technology is developed a different l p norm may be found that gives better sorting accuracies.
- two color cameras can have varying sensitivities. Because two color cameras 20a and 20b must be used for purposes of this invention, and because a "best" face must be selected based on the data from each, it is important that the top and bottom cameras be equally sensitive to each color. To ensure this will be the case, we have created a normalizing algorithm that effectively maps the output of one camera into the output of the second, so that the response of both cameras is the same for each possible color.
- this normalizing algorithm requires that data be collected before it is employed in real-time data collection, whether that data collection is for purposes of training or for real-time sorting. After the cameras 20a and 20b have been pointed, the lights 12a and 12b (or 16) adjusted, and the automatic light adjustment program is running, the required data can be collected. It is further assumed that the shading correction algorithm has been set up and is capable of correcting imagery from both color cameras 20a and 20b. Note that each camera 20a and 20b will need its own set of calibration data.
- N_color is the number of color samples to be used. These color samples must span the intensity levels that the cameras 20a and 20b have been set up to handle. In fact, each "color" could be a different shade of gray. Each color sample must be presented to each camera in the same plane as a part face would be presented.
- the data collection part of the algorithm is as follows:
- the final corrected green and blue response is found in a similar manner.
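The per-channel mapping can be sketched as below. The sample responses and the piecewise-linear fit for f are assumptions made for illustration; only the rounding rule ch_lookup(i) = INTEGER(f(i) + 0.5) is taken from this description:

```python
import numpy as np

# Mean red responses of the two cameras to the same N_color samples
# (illustrative numbers standing in for the collected data).
red_a = np.array([10.0, 80.0, 150.0, 220.0])   # camera 20a
red_b = np.array([12.0, 85.0, 158.0, 230.0])   # camera 20b

# Fit f by piecewise-linear interpolation over all 256 gray levels,
# then round per ch_lookup(i) = INTEGER(f(i) + 0.5), so camera 20a's
# red values are mapped into camera 20b's response scale.
f = np.interp(np.arange(256), red_a, red_b)
red_lookup = (f + 0.5).astype(int)

assert red_lookup[80] == 85    # a measured sample maps exactly
assert red_lookup[10] == 12
```

The green and blue lookup tables would be built in exactly the same way from the green and blue sample responses.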
- this invention is not limited to the color sorting and grain sorting of wooden parts. It has a broader range of applicability. In fact, the range of problems that this device can address can be precisely defined mathematically.
- this device is applicable to any problem where there is some fluctuation of an item's color across its surface and where each member of a color class that is to be sorted is such that all members of the class share some common colors.
- a problem domain requires items to be sorted into M color classes, say C_1, C_2, . . . , C_M.
- the Heckbert algorithm is the color mapping algorithm employed to reduce the dimensionality of the measurement vector used to characterize an item's surface color. It was selected because of its computational simplicity and because this color mapping method is considered a very robust one by persons knowledgeable in the art. However, other color mapping methods have been developed and, indeed, the selection of which algorithm to use may be application-dependent.
- neural networks may offer an attractive alternative to the pattern recognition methods currently being employed.
- the nonlinearity and the adaptability of the network seemingly offers an attractive way to better match human judgments of color similarity.
- grain sorting is not limited to wood products; grain sorting is nothing more than sorting parts based on surface texture.
- the methods developed can be applied to other surface texture sorting problems.
- the domain of applicability for the texture sorting methods can be precisely defined mathematically.
- a problem domain requires items to be sorted into M surface texture classes, say T_1, T_2, . . . , T_M.
- G_D = [g_D(x, y)].
- EG = [eg(Δi, Δj, Δk)] is three-dimensional.
- some problems may require that right-diagonal edges be differentiated from left-diagonal edges. In such cases G_RD and G_LD should not be averaged. Therefore EG becomes four-dimensional.
- the rest of the analysis proceeds from the above-described analysis in a straightforward manner which will be understood by those of skill in the art.
- the Heckbert algorithm or some other color mapping algorithm can be applied to it to reduce the dimensionality of the measurement vector needed to do texture sorting. In the case of the Heckbert algorithm, this is true even when EG is four-dimensional. It is also true for a number of other color mapping algorithms. Because of the similarities between the texture sorting and color sorting operations, incorporating this step into the texture sorting process can be done in a straightforward manner by those skilled in the art. One of skill in the art need only examine the color sorting algorithm to see how it should be done.
- As to the "best" face algorithm, it will vary with the application problem being considered. For example, in the sorting of wooden parts, there are applications where grain is very important. For such applications, knowing the grain class of a face becomes important in determining which part face is the "best" face.
- the system needs a color image of surfaces that need to be examined.
- Color array cameras could be used to image the surfaces.
- Three black and white line scan cameras could also be employed to image each surface. In the embodiment in which three black and white line scan cameras are used, one camera would have a red color filter, one camera would have a green color filter, and one camera would have a blue color filter.
- Three black and white array cameras could be used in a similar manner to image each part face.
- any light sensing device that is sensitive in the 400 to 700 nanometer range and that allows filters to be used can be used as an imaging device for this invention.
- Each filter would effectively determine the color that any one sensing element can sense.
- At least three filters would have to be used and the colors they represent would have to be able to span color space. That is, suppose n filters are used and let F_1, . . . , F_n represent the colors of these filters. Then any color C must be expressible as
- a_1, a_2, . . . , a_n are non-negative real numbers. If an n-filter imaging system is used, then the histograms used to do the color sorting will have to be n-dimensional. In this case, the analysis would proceed in a straightforward manner similar to that described above, as will be understood by those of skill in the art. Also, as described above, the Heckbert algorithm could be applied to this n-dimensional problem to reduce the dimensionality of the resulting analysis problem.
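The n-dimensional histogram mentioned above can be sketched as follows; the n = 3 filter system, the 4-level quantization, and the random image data are illustrative assumptions:

```python
import numpy as np

# An 8x8 image from a hypothetical n = 3 filter system, each band
# quantized to 4 levels (illustrative data only).
rng = np.random.default_rng(0)
img = rng.integers(0, 4, size=(8, 8, 3))

# The color histogram becomes n-dimensional: one axis per filter.
hist, _ = np.histogramdd(img.reshape(-1, 3), bins=(4, 4, 4),
                         range=((0, 4), (0, 4), (0, 4)))
assert hist.shape == (4, 4, 4)
assert hist.sum() == 64        # one count per pixel
```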
- the imaging device need not have its optical axis perpendicular to the surface of the item to be analyzed. Also, the optical axis of the imaging device need not be positioned as described above for the color and grain sorting system for wooden parts.
- the imaging geometry is very much application-dependent. The geometry described above is used with respect to the color and grain sorting system for wooden parts because it minimizes the analysis that has to be performed on each surface. Regardless of the orientation of the imaging device's optical axis, the white target used for shading correction must be positioned in the plane defined by the surface that is to be analyzed.
- As for the white target used in the shading correction, its reflectance characteristics are application-dependent. It should have a lighter intensity than any surface and/or part of a surface that is to be analyzed. Further, it need not be inserted in the manner described above.
- the method for inserting the white target is problem-dependent. The easiest, most economical method of inserting this target preferably should be used. In some cases, an employee may insert the target manually. This is also true for the lens cap. Indeed, a lens cap might not be used. Any device that blocks light from entering the imaging device can be used.
- any lighting technology that provides adequate illumination for a part face to be imaged can be employed.
- ordinary incandescent technology can be used.
- Drifts in the amount of light produced over the lifetime of the bulb can be adjusted by an illumination control system similar to the one described above with respect to the color and grain sorting system for wooden parts.
- High pressure gas discharge bulbs can also be used. These are becoming popular with the automotive industry and, hence, the cost of these devices should fall. Theoretically, such devices are even better than the tungsten halogen bulbs described above. However, they are currently much more expensive. Low pressure gas discharge bulbs, arc lamps, and even fluorescent bulbs could be used.
- lasers also could be used. Three different lasers would be needed, all different in color, or a mixed gas laser such as that described in U.S. Pat. No. 3,945,729 could be used.
- n filters of colors F_1, . . . , F_n are being used by an imaging device.
- the only requirement a light source or sources must satisfy is that it or they illuminate the part face in such a way that enough light passes through each filter for each associated sensing element to register a reading. It is believed that the best illumination source to use will be application-dependent and will also depend on the type of imaging technology being used. For example, if an array camera is being used, a strobe light source might be appropriate so as to allow the item being examined to move continuously through the field of view without having to be stopped to be imaged.
- As to the number of illuminators required, enough should be used to provide adequate surface illumination.
- the number to be used will be dependent on flow rate, the dimensions of the surfaces to be imaged, the distance the illuminators are from the surface they are illuminating, the candle power of each illuminator, and finally the distance each imaging device is from the surface it is imaging.
- any geometry that allows the sources to provide adequate illumination can be used.
- surface speculars must be considered.
- the lighting should be arranged so that the imaging device will not image a surface specular.
- the imaging geometry is application-dependent.
- bulbs can be powered by alternating current power sources.
- light drift control is optional, although poorer color sorting results would be expected to occur without light drift control. In some problem areas where only rough color sorts are needed, maintaining constant illumination on an item's surface may not be required.
- a number of vendors sell light sources. Many vendors provide a direct current or an alternating current power supply in each source, where each source typically contains only one illuminator, e.g., a bulb. Hence, for practical engineering reasons, the number of power supplies used can range from one to the number of illuminators used on a system. Note that the lighting control algorithm increases in complexity as the number of light sources goes up.
- the present invention uses a color line scan camera, the same camera that is used to image a part face, to sense light source drift.
- There are a number of other ways that drift can be corrected. All that is required is a light sensor and a control circuit.
- a possible sensor is a photo diode.
- the control circuit can be a specially-designed analog circuit, a special purpose digital circuit, or a digital circuit employing a microprocessor/microcontroller.
- the photo diode can be located above a target close to the field of view of a color imaging device, or be located adjacent an illuminator, again looking at a light-reflecting target. It is known that at least one light source vendor is planning to develop a light source that has a photo diode and control circuitry contained in the source to reduce light drift.
- the nature of the materials-handling system changes as a function of the item being analyzed.
- item size, item weight, item shape, and item throughput play important roles in designing an appropriate materials-handling system.
- Design requirements for the materials-handling system also depend on the imaging systems being used. If line scan technology is being employed, the materials-handling system is usually designed to maintain a constant velocity of throughput. If array imaging technology is being used, then the materials-handling system may have to stop the item while it is in the field of view of the camera so that the item can be imaged.
- the need to air condition the electronic components of the system is also application-dependent. It depends on the environment where the application system will be located. In situations where the equipment is located in dusty, potentially hot areas, using dust-free enclosures and air conditioning makes good sense. In more protected conditions, such enclosures would not be needed.
Abstract
Description
C = rR + gG + bB
r = ∫ C(λ)R(λ)I(λ) dλ
g = ∫ C(λ)G(λ)I(λ) dλ
b = ∫ C(λ)B(λ)I(λ) dλ
m(C_1, C_2) ≦ m(C_1, C_3).
r_n = r/(r + g + b)
g_n = g/(r + g + b)
b_n = b/(r + g + b).
p_r(r) = Σ_g Σ_b p(r, g, b)
p_g(g) = Σ_r Σ_b p(r, g, b)
p_b(b) = Σ_r Σ_g p(r, g, b).
bw(x, y) = Lookup(bw(x, y)).
bw_nm(x, y) = (sr_nm(x, y) + sg_nm(x, y) + sb_nm(x, y))/3.
bw_nm(x, y) = 0
h_nm(sr_nm(x, y), sg_nm(x, y), sb_nm(x, y)) = h_nm(sr_nm(x, y), sg_nm(x, y), sb_nm(x, y)) + 1.
h_BW(bw(x, y)) = h_BW(bw(x, y)) + 1.
h_nm(i, j, k) = 0.
s_n(i, j, k) = s_n(i, j, k) + p_nm(i, j, k).
s(i, j, k) = s(i, j, k) + p_nm(i, j, k).
pr_n(l) = pr_n(l)/F_n
sd(i_I) ≧ 0, and
sd(i_I - j_Back) < 0
sd(i_I + j_Forward) > 0,
h_SBW(i_v) ≦ h_SBW(i_v - j_Back)
h_SBW(i_v) ≦ h_SBW(i_v + j_Forward)
mni(l) = min_i {(i, j, k) ∈ RP_l},
mnj(l) = min_j {(i, j, k) ∈ RP_l},
mnk(l) = min_k {(i, j, k) ∈ RP_l},
mxi(l) = max_i {(i, j, k) ∈ RP_l},
mxj(l) = max_j {(i, j, k) ∈ RP_l},
mxk(l) = max_k {(i, j, k) ∈ RP_l}.
(mni(l), mnj(l), mnk(l)),
(mxi(l), mnj(l), mnk(l)),
(mni(l), mxj(l), mnk(l)),
(mni(l), mnj(l), mxk(l)),
(mni(l), mxj(l), mxk(l)),
(mxi(l), mnj(l), mxk(l)),
(mxi(l), mxj(l), mnk(l)),
(mxi(l), mxj(l), mxk(l)).
c_lookup(i, j, k) = gl
c_lookup(i, j, k) = index.
bw_s(x, y) = (sr_s(x, y) + sg_s(x, y) + sb_s(x, y))/3.
bw_s(x, y) = 0
m_s(c_lookup(sr_s(x, y), sg_s(x, y), sb_s(x, y))) = m_s(c_lookup(sr_s(x, y), sg_s(x, y), sb_s(x, y))) + 1
h_BW(bw(x, y)) = h_BW(bw(x, y)) + 1.
m_s(l) = 0.
m_s(l) = m_s(l)/Area_clear.
pr_n'(l) = 0
pr_n'(l) = pr_n(l).
pr_n"(l) = pr_n'(l)/F.
mmv_nm(l) = 0
mmv_nm(l) = cmv_nm(l).
mmv_nm(l) = mmv_nm(l)/F
{D_nm, n = 1, . . . , N, m = 1, . . . , m_samples(n)}.
bw_nm(x, y) = (sr_nm(x, y) + sg_nm(x, y) + sb_nm(x, y))/3.
bw_nm(x, y) = 0.
h_BW(bw(x, y)) = h_BW(bw(x, y)) + 1.
h_BW(i) = 0.
bw(x, y) = 0.
g_D(x, y) = INTEGER[(g_RD(x, y) + g_LD(x, y))/2 + 0.5]
eg_nm(g_V(x, y), g_H(x, y), g_D(x, y)) = eg_nm(g_V(x, y), g_H(x, y), g_D(x, y)) + 1.
eg_nm(Δi, Δj, Δk) = eg_nm(Δi, Δj, Δk)/F.
pr_n(Δi, Δj, Δk) = pr_n(Δi, Δj, Δk)/F.
h_BW(i) = 0.
bw(x, y) = 0.
g_D(x, y) = INTEGER[(g_RD(x, y) + g_LD(x, y))/2 + 0.5]
eg(g_V(x, y), g_H(x, y), g_D(x, y)) = eg(g_V(x, y), g_H(x, y), g_D(x, y)) + 1.
eg(Δi, Δj, Δk) = eg(Δi, Δj, Δk)/F.
{D_nm, n = 1, . . . , N_g, m = 1, . . . , m_g(n)}
ch_lookup(i) = INTEGER(f(i) + 0.5)
red_final(i) = red_lookup(r_corrected(i)).
p(i, j, k) = h(i, j, k)/F
C = a_1 F_1 + a_2 F_2 + . . . + a_n F_n
Claims (55)
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/556,815 US5761070A (en) | 1995-11-02 | 1995-11-02 | Automatic color and grain sorting of materials |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US08/556,815 US5761070A (en) | 1995-11-02 | 1995-11-02 | Automatic color and grain sorting of materials |
Publications (1)
Publication Number | Publication Date |
---|---|
US5761070A true US5761070A (en) | 1998-06-02 |
Family
ID=24222979
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US08/556,815 Expired - Lifetime US5761070A (en) | 1995-11-02 | 1995-11-02 | Automatic color and grain sorting of materials |
Country Status (1)
Country | Link |
---|---|
US (1) | US5761070A (en) |
Cited By (68)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6072890A (en) * | 1998-05-06 | 2000-06-06 | Forintek Canada Corp. | Automatic lumber sorting |
US6160912A (en) * | 1996-05-24 | 2000-12-12 | Fuji Photo Film Co., Ltd. | Method of correcting color conversion data with accuracy |
US6295385B1 (en) * | 1997-07-03 | 2001-09-25 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US6324531B1 (en) * | 1997-12-12 | 2001-11-27 | Florida Department Of Citrus | System and method for identifying the geographic origin of a fresh commodity |
US6360008B1 (en) * | 1998-03-25 | 2002-03-19 | Fujitsu Limited | Method of and apparatus for converting color data |
US20020067511A1 (en) * | 2000-08-03 | 2002-06-06 | Toru Fujita | Electrophotographic image forming apparatus and image forming program product therefor |
WO2002060605A1 (en) * | 2001-01-30 | 2002-08-08 | E.L.C.O.S. S.R.L. | Method of classifying the aesthetic quality of planar materials, and system for automatically sorting planar materials using such a method |
US20020184168A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | System and method for determining acceptability of proposed color solution using an artificial intelligence based tolerance model |
US20020181766A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | Color management and solution distribution system and method |
US20020184171A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | System and method for organizing color values using an artificial intelligence based cluster model |
US20020191843A1 (en) * | 2001-06-05 | 2002-12-19 | Mcclanahan Craig J. | Color management and solution distribution system and method |
US6507803B1 (en) * | 1999-07-31 | 2003-01-14 | Abb Research Ltd. | Method for determining spraying parameters for a paint spraying unit |
WO2003047772A2 (en) * | 2001-12-06 | 2003-06-12 | Tubitak-Bilten (Turkiye Bilimsel Ve Teknik Arastirma Kurumu-Bilgi Teknolojileri Ve Elektronik Arastirma Enstitusu) | Apparatus and method for arranging plaques into the same orientation |
GB2385662A (en) * | 2001-10-05 | 2003-08-27 | Millennium Venture Holdings Lt | Classifying workpieces according to their tonal variation |
US6714924B1 (en) | 2001-02-07 | 2004-03-30 | Basf Corporation | Computer-implemented neural network color matching formulation system |
US20040098164A1 (en) * | 2002-11-18 | 2004-05-20 | James L. Taylor Manufacturing Company | Color and size matching of wooden boards |
US20040131756A1 (en) * | 2002-11-21 | 2004-07-08 | Skierski Thomas J. | Method of color matching wood stains |
US6804390B2 (en) | 2001-02-07 | 2004-10-12 | Basf Corporation | Computer-implemented neural network color matching formulation applications |
US20050018223A1 (en) * | 2003-06-18 | 2005-01-27 | University Of Southem California | Color matching in lighting reproduction systems |
US20050075754A1 (en) * | 2003-10-07 | 2005-04-07 | Zeitler David W. | Conveyor induct system with probability estimator |
US20050094007A1 (en) * | 2003-10-31 | 2005-05-05 | Yoshikuni Nomura | Image processing apparatus, image processing method, and program |
US20050188630A1 (en) * | 2004-01-16 | 2005-09-01 | Schlyper Omer T. | Simulated divided light products and processes and systems for making such products |
US6993512B2 (en) | 2001-06-05 | 2006-01-31 | Basf Corporation | System and method for converting a color formula using an artificial intelligence based conversion model |
WO2006042411A1 (en) * | 2004-10-21 | 2006-04-27 | Moore Stuart G | Method and system for detecting characteristics of lumber using end scanning |
US20060100939A1 (en) * | 2002-06-07 | 2006-05-11 | Rejean Boyer | Method and system for managing commodity information in a supply chain of production |
US20060233435A1 (en) * | 2005-04-14 | 2006-10-19 | Jeld-Wen, Inc. | Systems and methods of identifying and manipulating objects |
US7130834B2 (en) | 1997-12-12 | 2006-10-31 | Idaho Potato Commission | Identification system and method for determining the geographic origin of a fresh commodity |
US7167246B1 (en) | 2002-07-12 | 2007-01-23 | The Sherwin-Williams Company | Method of color matching metallic paints |
US7212654B2 (en) * | 1997-06-20 | 2007-05-01 | Dawn Foods, Inc. | Measurement of fruit particles |
US7218775B2 (en) | 2001-09-17 | 2007-05-15 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Agriculture And Agrifood | Method and apparatus for identifying and quantifying characteristics of seeds and other small objects |
US20070196862A1 (en) * | 2003-01-02 | 2007-08-23 | Kuo-Jeng Wang | Method for detecting a response of each probe zone on a test strip |
US20070223781A1 (en) * | 2002-12-27 | 2007-09-27 | Kuo-Jeng Wang | Method for determining a response of each probe zone on a test strip |
US7275803B2 (en) | 2003-03-18 | 2007-10-02 | Autolog, Inc. | System and method for printing a code on an elongate article and the code so printed |
US20080027986A1 (en) * | 2006-07-25 | 2008-01-31 | Pitney Bowes Incorporated | Method and system for sorting mail |
US20080055673A1 (en) * | 2006-09-04 | 2008-03-06 | Chromasens Gmbh | Method and Device for Document Scanning With Line Camera |
US20090223600A1 (en) * | 2008-03-10 | 2009-09-10 | Tappan John C | Automated floor board texturing cell and method |
ES2328770A1 (en) * | 2007-04-18 | 2009-11-17 | Tecmec Iberica, S.A. | Method of characterization of wooden colors (Machine-translation by Google Translate, not legally binding) |
US20110037874A1 (en) * | 2009-08-17 | 2011-02-17 | Canon Kabushiki Kaisha | Image pick-up apparatus to pick up static image |
US20110167970A1 (en) * | 2007-12-21 | 2011-07-14 | Robert Bosch Gmbh | Machine tool device |
WO2011087454A1 (en) * | 2010-01-18 | 2011-07-21 | Azimuth Intellectual Products Pte Ltd | Apparatus and methods for manipulating and acquiring images of a pallet load |
US20110202169A1 (en) * | 2010-02-17 | 2011-08-18 | Dow Agrosciences Llc | Apparatus and method for sorting plant material |
US8400628B2 (en) | 2011-04-28 | 2013-03-19 | Centre De Recherche Industrielle Du Quebec | Enclosure for an optical inspection apparatus |
US8708582B2 (en) | 2011-04-28 | 2014-04-29 | Centre de Recherche-Industrielle du Qubec | Camera enclosure assembly |
US8723945B2 (en) | 2011-04-28 | 2014-05-13 | Centre De Recherche Industrielle Du Quebec | Optical inspection apparatus and method |
US20140192372A1 (en) * | 2008-09-24 | 2014-07-10 | Samsung Electronics Co., Ltd | Method of processing image and image forming apparatus using the same |
KR101466596B1 (en) * | 2006-11-27 | 2014-11-28 | 돌비 레버러토리즈 라이쎈싱 코오포레이션 | Apparatus and methods for boosting dynamic range in digital images |
CN105170487A (en) * | 2015-06-05 | 2015-12-23 | 中山市利光电子有限公司 | Automatic control system for LED sorting machine |
WO2015199850A1 (en) * | 2014-06-27 | 2015-12-30 | Key Technology, Inc. | Method and apparatus for sorting |
US20160103115A1 (en) * | 2014-10-09 | 2016-04-14 | Haskan, Llc | Scanning system for wood |
US9463493B1 (en) | 2012-03-01 | 2016-10-11 | General Mills, Inc. | Method of producing gluten free oats |
CN106024666A (en) * | 2016-07-14 | 2016-10-12 | 无锡宏纳科技有限公司 | Method of picking out unqualified dies through camera |
US9588098B2 (en) | 2015-03-18 | 2017-03-07 | Centre De Recherche Industrielle Du Quebec | Optical method and apparatus for identifying wood species of a raw wooden log |
US20180093497A1 (en) * | 2016-10-04 | 2018-04-05 | Océ Holding B.V. | Method for processing a web in an apparatus |
CN108873955A (en) * | 2018-04-27 | 2018-11-23 | 昆山保扬新型材料科技有限公司 | A kind of color matching method of original liquid coloring textile material |
US10195647B2 (en) | 2016-01-15 | 2019-02-05 | Key Technology, Inc | Method and apparatus for sorting |
US20190223335A1 (en) * | 2014-11-06 | 2019-07-18 | Fuji Corporation | Component supply device |
US10363582B2 (en) | 2016-01-15 | 2019-07-30 | Key Technology, Inc. | Method and apparatus for sorting |
EP3466553A4 (en) * | 2016-06-07 | 2020-02-19 | Federacion Nacional De Cafeteros De Colombia | Device and method for classifying seeds |
CN110945561A (en) * | 2017-06-13 | 2020-03-31 | 爱色丽公司 | Hyperspectral imaging spectrophotometer and system |
CN110947637A (en) * | 2019-12-31 | 2020-04-03 | 佛山喀视科技有限公司 | Ceramic tile sorting system |
US10624258B2 (en) * | 2015-05-29 | 2020-04-21 | Cnh Industrial America Llc | Controller for a harvester |
CN111112143A (en) * | 2018-10-31 | 2020-05-08 | 北京服装学院 | Online fiber product identification and sorting device and method |
CN111515143A (en) * | 2020-04-28 | 2020-08-11 | 杭州径霖家纺有限公司 | Utilize weaving product of luminousness principle to retrieve sorter |
US10769776B2 (en) | 2016-02-12 | 2020-09-08 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
CN112950574A (en) * | 2021-02-26 | 2021-06-11 | 合肥高晶光电科技有限公司 | Image recognition algorithm capable of classifying mushrooms in grades |
US11062525B1 (en) * | 2020-01-28 | 2021-07-13 | United States Of America, As Represented By The Secretary Of The Army | Method for generating an augmented set of images |
US11376636B2 (en) | 2018-08-20 | 2022-07-05 | General Mills, Inc. | Method of producing gluten free oats through hyperspectral imaging |
WO2024000039A1 (en) * | 2022-06-30 | 2024-01-04 | Deimos Laboratory Pty Ltd | An apparatus and method for visual inspection |
Citations (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US3945729A (en) * | 1974-12-30 | 1976-03-23 | Stanford Research Institute | Combined ranging and color sensor |
US4132314A (en) * | 1977-06-13 | 1979-01-02 | Joerg Walter VON Beckmann | Electronic size and color sorter |
US4278538A (en) * | 1979-04-10 | 1981-07-14 | Western Electric Company, Inc. | Methods and apparatus for sorting workpieces according to their color signature |
EP0194148A2 (en) * | 1985-03-06 | 1986-09-10 | Lockwood Graders (U.K.) Limited | Method and apparatus for detecting coloured regions, and method and apparatus for sorting articles thereby |
US4992949A (en) * | 1989-01-27 | 1991-02-12 | Macmillan Bloedel Limited | Color sorting of lumber |
US5020675A (en) * | 1986-11-12 | 1991-06-04 | Lockwood Graders (Uk) Limited | Apparatus for sorting conveyed articles |
US5075768A (en) * | 1988-09-02 | 1991-12-24 | Itek Graphix Corporation | Method and apparatus for color separation scanning |
US5085325A (en) * | 1988-03-08 | 1992-02-04 | Simco/Ramic Corporation | Color sorting system and method |
US5440127A (en) * | 1993-05-17 | 1995-08-08 | Simco/Ramic Corporation | Method and apparatus for illuminating target specimens in inspection systems |
- 1995-11-02: US application US08/556,815 filed; granted as US5761070A. Legal status: Expired - Lifetime.
Non-Patent Citations (6)
Title |
---|
Adel, M. et al., "Evaluation of Colour Spaces in Computer Vision Application of Wood Defects Detection," Proceedings of the IEEE International Conference on Systems, Man and Cybernetics, Part 2 (of 5), pp. 499-504, Oct. 17, 1993. |
L. Haney et al., "Color Matching of Wood with a Real-Time Machine Vision System," 1994 International Winter Meeting, The American Society of Agricultural Engineers (Dec. 13-16, 1994). |
Lemstrom, G. et al., "Color Line Scan Technology in Industrial Applications," SPIE-International Society for Optical Engineering, Oct. 23-26, 1995. |
R. Conners et al., "A Machine Vision System for Automatically Grading Hardwood Lumber," Industrial Metrology 2 (Elsevier Science Publishers B.V. 1992), pp. 317-342. |
R. Conners et al., "The Utility of Color Information in the Location and Identification of Defects in Surfaced Hardwood Lumber," First International Conference on Scanning Technology in Sawmilling (Oct. 10-11, 1985). |
S. Yoo et al., "Color Machine Vision Used to Establish Color Grading Standards for Hardwood Dimension Parts," 1992 International Winter Meeting, The American Society of Agricultural Engineers (Dec. 15-18, 1992). |
Cited By (109)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6160912A (en) * | 1996-05-24 | 2000-12-12 | Fuji Photo Film Co., Ltd. | Method of correcting color conversion data with accuracy |
US7212654B2 (en) * | 1997-06-20 | 2007-05-01 | Dawn Foods, Inc. | Measurement of fruit particles |
US6295385B1 (en) * | 1997-07-03 | 2001-09-25 | Canon Kabushiki Kaisha | Image processing apparatus and image processing method |
US6324531B1 (en) * | 1997-12-12 | 2001-11-27 | Florida Department Of Citrus | System and method for identifying the geographic origin of a fresh commodity |
US7130834B2 (en) | 1997-12-12 | 2006-10-31 | Idaho Potato Commission | Identification system and method for determining the geographic origin of a fresh commodity |
US6360008B1 (en) * | 1998-03-25 | 2002-03-19 | Fujitsu Limited | Method of and apparatus for converting color data |
US6072890A (en) * | 1998-05-06 | 2000-06-06 | Forintek Canada Corp. | Automatic lumber sorting |
US6507803B1 (en) * | 1999-07-31 | 2003-01-14 | Abb Research Ltd. | Method for determining spraying parameters for a paint spraying unit |
US7009729B2 (en) * | 2000-08-03 | 2006-03-07 | Seiko Epson Corporation | Electrophotographic image forming apparatus and image forming program product therefor |
US20020067511A1 (en) * | 2000-08-03 | 2002-06-06 | Toru Fujita | Electrophotographic image forming apparatus and image forming program product therefor |
WO2002060605A1 (en) * | 2001-01-30 | 2002-08-08 | E.L.C.O.S. S.R.L. | Method of classifying the aesthetic quality of planar materials, and system for automatically sorting planar materials using such a method |
US6714924B1 (en) | 2001-02-07 | 2004-03-30 | Basf Corporation | Computer-implemented neural network color matching formulation system |
US6804390B2 (en) | 2001-02-07 | 2004-10-12 | Basf Corporation | Computer-implemented neural network color matching formulation applications |
US6973211B2 (en) | 2001-06-05 | 2005-12-06 | Basf Corporation | Color management and solution distribution system and method |
US6999615B2 (en) | 2001-06-05 | 2006-02-14 | Basf Corporation | Color management and solution distribution system and method |
US20020184168A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | System and method for determining acceptability of proposed color solution using an artificial intelligence based tolerance model |
US20020181766A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | Color management and solution distribution system and method |
US20020184171A1 (en) * | 2001-06-05 | 2002-12-05 | Mcclanahan Craig J. | System and method for organizing color values using an artificial intelligence based cluster model |
US6993512B2 (en) | 2001-06-05 | 2006-01-31 | Basf Corporation | System and method for converting a color formula using an artificial intelligence based conversion model |
US6892194B2 (en) * | 2001-06-05 | 2005-05-10 | Basf Corporation | System and method for organizing color values using an artificial intelligence based cluster model |
US20020191843A1 (en) * | 2001-06-05 | 2002-12-19 | Mcclanahan Craig J. | Color management and solution distribution system and method |
US7218775B2 (en) | 2001-09-17 | 2007-05-15 | Her Majesty The Queen In Right Of Canada, As Represented By The Minister Of Agriculture And Agrifood | Method and apparatus for identifying and quantifying characteristics of seeds and other small objects |
GB2385662A (en) * | 2001-10-05 | 2003-08-27 | Millennium Venture Holdings Lt | Classifying workpieces according to their tonal variation |
WO2003047772A2 (en) * | 2001-12-06 | 2003-06-12 | Tubitak-Bilten (Turkiye Bilimsel Ve Teknik Arastirma Kurumu-Bilgi Teknolojileri Ve Elektronik Arastirma Enstitusu) | Apparatus and method for arranging plaques into the same orientation |
WO2003047772A3 (en) * | 2001-12-06 | 2004-03-18 | Tubitak Bilten Turkiye Bilimse | Apparatus and method for arranging plaques into the same orientation |
US20060100939A1 (en) * | 2002-06-07 | 2006-05-11 | Rejean Boyer | Method and system for managing commodity information in a supply chain of production |
US7167246B1 (en) | 2002-07-12 | 2007-01-23 | The Sherwin-Williams Company | Method of color matching metallic paints |
US7571818B2 (en) * | 2002-11-18 | 2009-08-11 | James L. Taylor Manufacturing Company | Color and size matching of wooden boards |
US20040098164A1 (en) * | 2002-11-18 | 2004-05-20 | James L. Taylor Manufacturing Company | Color and size matching of wooden boards |
US20040131756A1 (en) * | 2002-11-21 | 2004-07-08 | Skierski Thomas J. | Method of color matching wood stains |
US7116420B2 (en) | 2002-11-21 | 2006-10-03 | The Sherwin-Williams Company | Method of color matching wood stains |
US7885444B2 (en) * | 2002-12-27 | 2011-02-08 | Transpacific Systems, Llc | Method for determining a response of each probe zone on a test strip |
US20070223781A1 (en) * | 2002-12-27 | 2007-09-27 | Kuo-Jeng Wang | Method for determining a response of each probe zone on a test strip |
US7822245B2 (en) | 2003-01-02 | 2010-10-26 | Kuo-Jeng Wang | Method for detecting a response of each probe zone on a test strip |
US20070196862A1 (en) * | 2003-01-02 | 2007-08-23 | Kuo-Jeng Wang | Method for detecting a response of each probe zone on a test strip |
US7275803B2 (en) | 2003-03-18 | 2007-10-02 | Autolog, Inc. | System and method for printing a code on an elongate article and the code so printed |
US20050018223A1 (en) * | 2003-06-18 | 2005-01-27 | University Of Southern California | Color matching in lighting reproduction systems |
US7529004B2 (en) * | 2003-06-18 | 2009-05-05 | University Of Southern California | Color matching in lighting reproduction systems |
US20050075754A1 (en) * | 2003-10-07 | 2005-04-07 | Zeitler David W. | Conveyor induct system with probability estimator |
US7191895B2 (en) * | 2003-10-07 | 2007-03-20 | Dematic Corp. | Conveyor induct system with probability estimator |
US7413071B2 (en) | 2003-10-07 | 2008-08-19 | Dematic Corp. | Conveyor induction subsystem and method of inducting articles using feedback-gappers |
US7412092B2 (en) * | 2003-10-31 | 2008-08-12 | Sony Corporation | Image processing apparatus, image processing method, and program |
US20050094007A1 (en) * | 2003-10-31 | 2005-05-05 | Yoshikuni Nomura | Image processing apparatus, image processing method, and program |
US20050188630A1 (en) * | 2004-01-16 | 2005-09-01 | Schlyper Omer T. | Simulated divided light products and processes and systems for making such products |
US7854097B2 (en) | 2004-01-16 | 2010-12-21 | Jeld-Wen, Inc. | Simulated divided light products and processes and systems for making such products |
US20080140248A1 (en) * | 2004-10-21 | 2008-06-12 | Stuart G. Moore Holding Inc. | Method and System for Determining Characteristics of Lumber Using End Scanning |
WO2006042411A1 (en) * | 2004-10-21 | 2006-04-27 | Moore Stuart G | Method and system for detecting characteristics of lumber using end scanning |
US20060233435A1 (en) * | 2005-04-14 | 2006-10-19 | Jeld-Wen, Inc. | Systems and methods of identifying and manipulating objects |
US20090299516A1 (en) * | 2005-04-14 | 2009-12-03 | Jeld-Wen, Inc. | Systems and methods of identifying and manipulating objects |
US7640073B2 (en) | 2005-04-14 | 2009-12-29 | Jeld-Wen, Inc. | Systems and methods of identifying and manipulating objects |
US7801638B2 (en) | 2005-04-14 | 2010-09-21 | Jeld-Wen, Inc. | Systems and methods of identifying and manipulating objects |
US20080027986A1 (en) * | 2006-07-25 | 2008-01-31 | Pitney Bowes Incorporated | Method and system for sorting mail |
US7769765B2 (en) | 2006-07-25 | 2010-08-03 | Lockheed Martin Corporation | Method and system for sorting mail |
US20080055673A1 (en) * | 2006-09-04 | 2008-03-06 | Chromasens Gmbh | Method and Device for Document Scanning With Line Camera |
US7986446B2 (en) * | 2006-09-04 | 2011-07-26 | Chromasens Gmbh | Method and device for document scanning with line camera |
KR101466596B1 (en) * | 2006-11-27 | 2014-11-28 | 돌비 레버러토리즈 라이쎈싱 코오포레이션 | Apparatus and methods for boosting dynamic range in digital images |
ES2328770A1 (en) * | 2007-04-18 | 2009-11-17 | Tecmec Iberica, S.A. | Method of characterization of wooden colors (Machine-translation by Google Translate, not legally binding) |
US20110167970A1 (en) * | 2007-12-21 | 2011-07-14 | Robert Bosch Gmbh | Machine tool device |
US8948903B2 (en) * | 2007-12-21 | 2015-02-03 | Robert Bosch Gmbh | Machine tool device having a computing unit adapted to distinguish at least two motions |
US20090223600A1 (en) * | 2008-03-10 | 2009-09-10 | Tappan John C | Automated floor board texturing cell and method |
US8186399B2 (en) * | 2008-03-10 | 2012-05-29 | Unilin Flooring Nc Llc | Automated floor board texturing cell and method |
US9635215B2 (en) * | 2008-09-24 | 2017-04-25 | Samsung Electronics Co., Ltd. | Method of processing image and image forming apparatus using the same |
US20140192372A1 (en) * | 2008-09-24 | 2014-07-10 | Samsung Electronics Co., Ltd | Method of processing image and image forming apparatus using the same |
US8908088B2 (en) * | 2009-08-17 | 2014-12-09 | Canon Kabushiki Kaisha | Image pick-up apparatus capable of correcting shading due to a closing travel operation of a shutter to pick up static image |
US8390735B2 (en) * | 2009-08-17 | 2013-03-05 | Canon Kabushiki Kaisha | Image pick-up apparatus having rotating shutter blades that move in mutually opposite directions for picking up a static image |
US20110037874A1 (en) * | 2009-08-17 | 2011-02-17 | Canon Kabushiki Kaisha | Image pick-up apparatus to pick up static image |
WO2011087454A1 (en) * | 2010-01-18 | 2011-07-21 | Azimuth Intellectual Products Pte Ltd | Apparatus and methods for manipulating and acquiring images of a pallet load |
US8253054B2 (en) * | 2010-02-17 | 2012-08-28 | Dow Agrosciences, Llc. | Apparatus and method for sorting plant material |
US20110202169A1 (en) * | 2010-02-17 | 2011-08-18 | Dow Agrosciences Llc | Apparatus and method for sorting plant material |
US8400628B2 (en) | 2011-04-28 | 2013-03-19 | Centre De Recherche Industrielle Du Quebec | Enclosure for an optical inspection apparatus |
US8708582B2 (en) | 2011-04-28 | 2014-04-29 | Centre De Recherche Industrielle Du Quebec | Camera enclosure assembly |
US8723945B2 (en) | 2011-04-28 | 2014-05-13 | Centre De Recherche Industrielle Du Quebec | Optical inspection apparatus and method |
US9463493B1 (en) | 2012-03-01 | 2016-10-11 | General Mills, Inc. | Method of producing gluten free oats |
US10596603B2 (en) | 2012-03-01 | 2020-03-24 | General Mills, Inc. | Method of producing gluten free oats |
US9266148B2 (en) | 2014-06-27 | 2016-02-23 | Key Technology, Inc. | Method and apparatus for sorting |
US20160129480A1 (en) * | 2014-06-27 | 2016-05-12 | Key Technology, Inc. | Method and apparatus for sorting |
US10478862B2 (en) | 2014-06-27 | 2019-11-19 | Key Technology, Inc. | Method and apparatus for sorting |
US9517491B2 (en) | 2014-06-27 | 2016-12-13 | Key Technology, Inc. | Method and apparatus for sorting |
US9573168B2 (en) | 2014-06-27 | 2017-02-21 | Key Technology, Inc. | Method and apparatus for sorting |
WO2015199850A1 (en) * | 2014-06-27 | 2015-12-30 | Key Technology, Inc. | Method and apparatus for sorting |
US9795996B2 (en) * | 2014-06-27 | 2017-10-24 | Key Technology, Inc. | Method and apparatus for sorting |
US20160103115A1 (en) * | 2014-10-09 | 2016-04-14 | Haskan, Llc | Scanning system for wood |
US9958428B2 (en) * | 2014-10-09 | 2018-05-01 | Haskan, Llc | Scanning system for wood |
US20190223335A1 (en) * | 2014-11-06 | 2019-07-18 | Fuji Corporation | Component supply device |
US10813258B2 (en) * | 2014-11-06 | 2020-10-20 | Fuji Corporation | Component supply device |
US9588098B2 (en) | 2015-03-18 | 2017-03-07 | Centre De Recherche Industrielle Du Quebec | Optical method and apparatus for identifying wood species of a raw wooden log |
US10624258B2 (en) * | 2015-05-29 | 2020-04-21 | Cnh Industrial America Llc | Controller for a harvester |
CN105170487A (en) * | 2015-06-05 | 2015-12-23 | 中山市利光电子有限公司 | Automatic control system for LED sorting machine |
CN105170487B (en) * | 2015-06-05 | 2018-07-24 | 中山市利光电子有限公司 | Automatic control system for an LED sorting machine |
US10195647B2 (en) | 2016-01-15 | 2019-02-05 | Key Technology, Inc. | Method and apparatus for sorting |
US10363582B2 (en) | 2016-01-15 | 2019-07-30 | Key Technology, Inc. | Method and apparatus for sorting |
US11676301B2 (en) | 2016-02-12 | 2023-06-13 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
US10769776B2 (en) | 2016-02-12 | 2020-09-08 | Cognex Corporation | System and method for efficiently scoring probes in an image with a vision system |
EP3466553A4 (en) * | 2016-06-07 | 2020-02-19 | Federacion Nacional De Cafeteros De Colombia | Device and method for classifying seeds |
CN106024666A (en) * | 2016-07-14 | 2016-10-12 | 无锡宏纳科技有限公司 | Method of picking out unqualified dies through camera |
US10589546B2 (en) * | 2016-10-04 | 2020-03-17 | Canon Production Printing Holding B.V. | Method for processing a web in an apparatus |
US20180093497A1 (en) * | 2016-10-04 | 2018-04-05 | Océ Holding B.V. | Method for processing a web in an apparatus |
CN110945561A (en) * | 2017-06-13 | 2020-03-31 | 爱色丽公司 | Hyperspectral imaging spectrophotometer and system |
CN108873955A (en) * | 2018-04-27 | 2018-11-23 | 昆山保扬新型材料科技有限公司 | Color matching method for dope-dyed textile material |
CN108873955B (en) * | 2018-04-27 | 2021-05-11 | 昆山保扬新型材料科技有限公司 | Color matching method for dope-dyed textile material |
US11376636B2 (en) | 2018-08-20 | 2022-07-05 | General Mills, Inc. | Method of producing gluten free oats through hyperspectral imaging |
CN111112143A (en) * | 2018-10-31 | 2020-05-08 | 北京服装学院 | Online fiber product identification and sorting device and method |
CN111112143B (en) * | 2018-10-31 | 2021-07-27 | 北京服装学院 | Online fiber product identification and sorting device and method |
CN110947637A (en) * | 2019-12-31 | 2020-04-03 | 佛山喀视科技有限公司 | Ceramic tile sorting system |
US11062525B1 (en) * | 2020-01-28 | 2021-07-13 | United States Of America, As Represented By The Secretary Of The Army | Method for generating an augmented set of images |
CN111515143A (en) * | 2020-04-28 | 2020-08-11 | 杭州径霖家纺有限公司 | Textile product recovery and sorting device using the light-transmittance principle |
CN112950574A (en) * | 2021-02-26 | 2021-06-11 | 合肥高晶光电科技有限公司 | Image recognition algorithm capable of classifying mushrooms in grades |
CN112950574B (en) * | 2021-02-26 | 2022-10-18 | 合肥高晶光电科技有限公司 | Image recognition algorithm capable of classifying mushrooms in grades |
WO2024000039A1 (en) * | 2022-06-30 | 2024-01-04 | Deimos Laboratory Pty Ltd | An apparatus and method for visual inspection |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US5761070A (en) | Automatic color and grain sorting of materials | |
US5850472A (en) | Colorimetric imaging system for measuring color and appearance | |
US20230368426A1 (en) | Hyperspectral imaging spectrophotometer and system | |
US4278538A (en) | Methods and apparatus for sorting workpieces according to their color signature | |
Polder et al. | Spectral image analysis for measuring ripeness of tomatoes | |
US4972093A (en) | Inspection lighting system | |
EP0948191B1 (en) | Scanner illumination | |
US20110282613A1 (en) | Characterization of a model-based spectral reflectance sensing device | |
US5062714A (en) | Apparatus and method for pattern recognition | |
US4977522A (en) | Apparatus for determining the formulation of paint for use in bodywork repair | |
CN101510942B (en) | Image reading apparatus | |
US20230252617A1 (en) | Systems and methods for automatically grading pre-owned electronic devices | |
WO1986005643A1 (en) | Method for determining the color of a scene illuminant from a color image of the scene | |
JP2003106900A (en) | System and method determining spectrum using fuzzy inference algorithm employing measurement from led sensor | |
US20040208359A1 (en) | Image highlight correction using illumination specific hsv color coordinate | |
Marszalec et al. | Some aspects of RGB vision and its applications in industry | |
CN208177898U (en) | For detecting the device of the object in material flow | |
Marszalec et al. | Color measurements based on a color camera | |
Kauppinen et al. | The effect of illumination variations on color-based wood defect classification | |
CA2153647A1 (en) | Method and apparatus for recognizing geometrical features of parallelepiped-shaped parts of polygonal section | |
US20210215618A1 (en) | Multi-color surface inspection system, method for inspecting a surface, and method for calibrating the multi-color surface inspection system | |
EP4260027A1 (en) | System, robot and method for measuring the color of an area of a sample or of a vehicle's part | |
EP1041378A1 (en) | Produce recognition system including a produce data collector | |
Srikanteswara et al. | Real-time implementation of a color sorting system | |
Lu | A real-time system for color sorting edge-glued panel parts |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: VIRGINIA POLYTECHNIC INSTITUTE AND STATE UNIVERSITY Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:CONNERS, RICHARD W.;LU, QIANG;REEL/FRAME:007807/0824 Effective date: 19960126 |
|
AS | Assignment |
Owner name: VIRGINIA TECH INTELLECTUAL PROPERTIES, INC., VIRGINIA Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNOR:VIRGINIA POLYTECHNIC INSTITUTE AND STATE UNIVERSITY;REEL/FRAME:007830/0141 Effective date: 19960214 |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |
|
REFU | Refund |
Free format text: REFUND - PAYMENT OF MAINTENANCE FEE, 4TH YEAR, LARGE ENTITY (ORIGINAL EVENT CODE: R183); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
FPAY | Fee payment |
Year of fee payment: 4 |
|
FEPP | Fee payment procedure |
Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
|
REMI | Maintenance fee reminder mailed | ||
FPAY | Fee payment |
Year of fee payment: 8 |
|
FPAY | Fee payment |
Year of fee payment: 12 |
|
FEPP | Fee payment procedure |
Free format text: PAYOR NUMBER ASSIGNED (ORIGINAL EVENT CODE: ASPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY Free format text: PAYER NUMBER DE-ASSIGNED (ORIGINAL EVENT CODE: RMPN); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |