WO2012122542A2 - Intelligent airfoil component surface imaging inspection

Info

Publication number
WO2012122542A2
Authority
WO
WIPO (PCT)
Application number
PCT/US2012/028636
Other languages
French (fr)
Other versions
WO2012122542A3 (en)
Inventor
Amir SHIRKHODAIE
Robert E. MORIARTY
Kong Ma
Matthew T. Kush
Original Assignee
Rolls-Royce Corporation
Application filed by Rolls-Royce Corporation filed Critical Rolls-Royce Corporation
Publication of WO2012122542A2 publication Critical patent/WO2012122542A2/en
Publication of WO2012122542A3 publication Critical patent/WO2012122542A3/en

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/91 Investigating the presence of flaws or contamination using penetration of dyes, e.g. fluorescent ink
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/0006 Industrial image inspection using a design-rule based approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G06T7/001 Industrial image inspection using an image reference approach
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • G06T2207/30164 Workpiece; Machine component

Definitions

  • The present invention generally relates to one or more aspects of systems useful for inspection and evaluation, such as, but not limited to, automated surface inspection and evaluation using fuzzy logic analysis, an automated grain structure characterization process including fuzzy logic analysis, an automated surface inspection process including fuzzy logic analysis, a protocol-based inspection system, automated object analysis manipulators, and a continuous diffuse illumination system.
  • One embodiment of the present invention is a unique surface imaging inspection process.
  • Other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a surface imaging inspection process utilizing fuzzy logic analysis.
  • Still another embodiment of the present invention is a unique grain structure characterization process. Other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a grain structure characterization process.
  • A further embodiment of the present invention is a unique protocol-based inspection system. Still further embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a protocol-based inspection system.
  • Yet another embodiment of the present invention is a unique automated object manipulation system. Yet still other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for an automated object manipulation system. A still further embodiment of the present invention is a unique object illumination system. Still further embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a continuous diffuse illumination system for an inspection system. Further embodiments, forms, features, aspects, benefits, and advantages of the present application shall become apparent from the description and figures provided herewith.
  • Figure 1 is an illustration of one embodiment of a surface imaging inspection system.
  • Figure 2 is a flow diagram of one embodiment of the present application.
  • Figure 3 is a flow diagram of one embodiment of an image acquisition module from Figure 2.
  • Figure 4 is a flow diagram of one embodiment of an image registration module from Figure 2.
  • Figure 5 is a flow diagram of one embodiment of an inspection module from Figure 2.
  • Figure 6 is a flow diagram of one embodiment of a condition assessment module from Figure 2.
  • Figure 6a is a flow diagram of one embodiment of a fuzzy logic analysis module.
  • Figure 7 is a flow diagram of one embodiment of a reporting module from Figure 2.
  • Figure 8 is a flow diagram of one embodiment of an airfoil library module from Figure 5.
  • Figure 9 is an illustration of one embodiment of a grain structure characterization system.
  • Figure 10 is a flow diagram of one embodiment of a grain structure characterization process.
  • Figure 11 is a flow diagram of a characterization process from Figure 10.
  • Figure 12 is a diagram illustrating one embodiment of a characterization system of the present application.
  • Figure 13 is an illustration of one embodiment of a surface inspection system.
  • Figure 14 is a flow diagram of one embodiment of an inspection process.
  • Figure 15 is a flow diagram of a process from Figure 14.
  • Figure 16 is a schematic of an embodiment of an inspection system of the present application.
  • Figure 17 is an illustration of a graphical user interface of an embodiment of an inspection system of the present application.
  • Figure 18 is a process flow diagram of an embodiment of an inspection process of the present application.
  • Figure 19 is a schematic diagram of an embodiment of a component of an inspection system.
  • Figure 20 is a process flow diagram of embodiment software of an inspection system of the present application.
  • Figure 21 is an illustration of an embodiment of an object manipulation system.
  • Figure 22 is an exploded view of an embodiment of an object manipulation system.
  • Figure 23a is a front view illustration of an embodiment of an object manipulation system.
  • Figure 23b is a side view illustration of an embodiment of an object manipulation system.
  • Figure 23c is a back view illustration of an embodiment of an object manipulation system.
  • Figures 24a and 24b are illustrations of an embodiment of an object manipulation system.
  • Figure 25 is an illustration of a portion of an embodiment of an object manipulation system.
  • Figure 26 is an illustration demonstrating movement of a portion of an embodiment of an object manipulation system.
  • Figure 27 is an illustration of one degree of freedom of an embodiment of an object manipulation system.
  • Figure 28 is an illustration of another degree of freedom of an embodiment of an object manipulation system.
  • Figure 29 is an illustration of an embodiment of an illumination system of the present application.
  • Figure 30 is a diagram of an embodiment of the present application.
  • Figure 31 is an arrangement of components for an embodiment of the present application.
  • Figure 32 is a process flow diagram of an embodiment of the present application.

DETAILED DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
  • Referring to FIG. 1, an illustration is shown of a surface imaging inspection system 1100 representing an embodiment of the present application, including an automated imaging process, algorithms, sensors, robotic positioning and other analysis to locate, evaluate and report surface images.
  • Surface imaging inspection system 1100 is shown to include an inspection assembly 1120 and a controller 1130.
  • Inspection assembly 1120 includes a positioning system 1124 and an imaging system 1126.
  • Positioning system 1124 of this embodiment operates with a part presentation technique based on an algorithm for manipulating a part 1122 in an efficient manner with minimum hunting for part surfaces and anomalies.
  • Embodiments of positioning system 1124 can include a robotic part manipulator. Robotic part manipulation can provide consistent positioning of part 1122 during the inspection process which can reduce variation and improve efficiency of the inspection process.
  • Part manipulation can include presenting the part to a detection device such as a camera.
  • imaging system 1126 acquires images used to identify the type of part 1122 being inspected during a registration process. From the registration process, a positioning algorithm is selected to provide predetermined part manipulation during further imaging processes.
  • Controller 1130 of surface imaging inspection system 1100 is shown in the embodiment of Figure 1 as a single component containing modules capable of performing various functions.
  • Controller 1130 can also include one or more microprocessors where a single microprocessor provides the functions of each module or separate microprocessors are used for one or more of the control modules.
  • Controller 1130 as shown is capable of operating an image data processing system 1132 and a robotic manipulation module 1138.
  • Robotic manipulation module 1138 is shown in Figure 1 as part of controller 1130.
  • Robotic manipulation module 1138 can be part of the positioning equipment in positioning system 1124 as a single system or as separate components.
  • robotic manipulation module 1138 is capable of providing a positioning algorithm, a component type recognition database and a set of predetermined part manipulation instructions to surface imaging inspection system 1100.
  • Image data processing system 1132 can include an analyzer module 1134 and an imaging module 1136.
  • imaging module 1136 can include a controlled electromagnetic radiation configuration with a radiation media generator and a radiation detector whether the detector is physical or digital.
  • the radiation media can include visible light, radio waves, microwaves, infrared radiation, ultraviolet light, x-rays and gamma rays to name a few.
  • the intensity of the emitted radiation media can be adjusted to ensure adequate imaging.
  • the type of radiation media can be selected based on criteria such as but not limited to, equipment availability, component sensitivity, material, estimated defect characteristics and the like.
  • surface imaging inspection system 1100 can utilize a visible light generator with an optical camera for imaging system 1126 to produce images of a component as well as produce images of a surface or multiple surfaces of the component. Imaging module 1136 is then able to analyze the produced image(s) for surface features. In another embodiment, imaging module 1136 can interface with imaging system 1126 providing equipment controls as an alternative to controls provided directly with the imaging equipment or from another source.
  • Surface features analyzed by imaging module 1136 of surface imaging inspection system 1100 can include, but are not limited to, cracks, porosity, damage, curvature, dimensions and the like.
  • the component being analyzed can include a single crystal, a directionally solidified, and/or an equiaxed microstructure.
  • the component can include an airfoil component of a gas turbine engine.
  • One embodiment operates to mechanically locate, evaluate, and report surface features on families of airfoil type components.
  • Another embodiment of the present system generates a report of the sizes, locations and types of features on the surface of the component in tabular or graphical form.
  • analyzer module 1134 can be a fuzzy logic analyzer module capable of providing analysis of the image data sets from imaging system 1126.
  • fuzzy logic can be used in surface imaging inspection system 1100 to deal with fuzzy concepts— concepts that cannot be expressed as "true” or "false” but rather as “partial truths.” Fuzzy logic analysis allows an automated inspection to access a deposit of component images 1140 or a knowledge bank to apply cognitive characterization of features and provide a level of consistency to determine a pass/fail status according to a component specification.
  • Another embodiment of the present application applies a lighting configuration, a part presentation technique, and a fuzzy logic based image processing technique for identifying surface features in a single crystal cast airfoil component.
  • Yet another embodiment includes an algorithm for manipulating a part with respect to lighting and camera positions in an efficient manner with minimum hunting for a subject and a fuzzy logic based image processing algorithm to identify surface features which indicate a surface defect.
  • Inspection process 1200, one embodiment of which is shown in Figure 2, includes five modules: an image acquisition module 1300, an image registration module 1400, an inspection module 1500, a condition assessment module 1600 and a reporting module 1700.
  • a particular embodiment of the present application can transition between modules and some aspects may or may not be evident in each embodiment.
  • Image acquisition module 1300 is shown with further detail for one embodiment in Figure 3.
  • Image acquisition module 1300 of this exemplary embodiment begins by acquiring an image in operation 1310 with a detection device such as a camera or electronic image sensor.
  • image acquisition of module 1300 can include processing, compression and storage before the data is further processed in image acquisition module 1300 or inspection process 1200.
  • Modules which can be a part of the acquisition process to improve the quality of the data created during the acquisition process of module 1300 can include image quality verification in conditional 1320, image illumination adjustment in operation 1330, and background image removal in operation 1340.
  • Conditional 1320 verifies the image quality.
  • Image quality verification in conditional 1320 can include a comparison with an image from a deposit of component images 1800. If the image quality cannot be verified, image acquisition module 1300 returns to operation 1310 to acquire another image. Once an image can be verified, image acquisition module 1300 moves to operation 1330. In operation 1330, the illumination can be adjusted, for example to improve contrast or otherwise enhance the acquired image.
  • Operation 1330 is shown as preceding operation 1340 where a background image(s) can be removed.
  • background image removal could provide an image with fewer variations to evaluate or compare.
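  • To make the acquisition steps above concrete, the following minimal Python sketch (using numpy) shows one plausible implementation of quality verification, illumination adjustment, and background removal; the sharpness metric, thresholds, and 8-bit image assumptions are illustrative, not taken from the patent.

```python
import numpy as np

def sharpness(img: np.ndarray) -> float:
    """Variance of a simple Laplacian response; low values suggest blur."""
    f = img.astype(np.float64)
    lap = (-4 * f[1:-1, 1:-1] + f[:-2, 1:-1] + f[2:, 1:-1]
           + f[1:-1, :-2] + f[1:-1, 2:])
    return float(lap.var())

def verify_quality(img: np.ndarray, min_sharpness: float = 50.0) -> bool:
    """Conditional 1320: reject blurry images so operation 1310 reacquires."""
    return sharpness(img) >= min_sharpness

def adjust_illumination(img: np.ndarray) -> np.ndarray:
    """Operation 1330: linear contrast stretch to the full 8-bit range."""
    f = img.astype(np.float64)
    lo, hi = f.min(), f.max()
    return ((f - lo) * (255.0 / max(hi - lo, 1.0))).astype(np.uint8)

def remove_background(img: np.ndarray, background: np.ndarray) -> np.ndarray:
    """Operation 1340: zero out pixels matching a stored background image."""
    mask = np.abs(img.astype(int) - background.astype(int)) > 20
    return np.where(mask, img, 0).astype(np.uint8)
```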
  • image registration module 1400 is shown with further detail from one embodiment in Figure 4 and can begin with operation 1420 including image registration.
  • Image registration of operation 1420 takes the image from image acquisition module 1300 and identifies a set of predetermined features.
  • the part being tested can be identified by component type from the set of predetermined features where further processing routines such as inspection part manipulation can be set up according to the identified component type.
  • the part identification in module 1400 can be based on a comparison between a collected image or images of the part being inspected and a database of part responses.
  • the system provides a predetermined manipulation algorithm for part presentation during an imaging process.
  • a part with multiple surfaces suitable for inspection can be automatically rotated and positioned according to a predetermined manipulation algorithm allowing for consistent multiple image acquisitions.
  • a cylindrical component can require rotation about its axis to expose a substantially complete exterior surface.
  • a predetermined manipulation algorithm determined by the system can then include manipulation instructions for positioning the cylindrical component so its surface is perpendicular to the angle of incidence and rotating the component 360° while maintaining the perpendicular orientation, as in the sketch below.
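  • A minimal sketch of such a predetermined manipulation algorithm for the cylindrical example, assuming a simple (rotation, tilt) pose format and a 15° step size chosen purely for illustration:

```python
# Hypothetical pose generator for the cylindrical-part example: the tilt is
# held at zero so the viewed surface patch stays perpendicular to the camera
# while the part steps through a full rotation about its axis.

def cylinder_poses(step_deg: float = 15.0):
    """Yield (rotation_deg, tilt_deg) poses covering the full exterior."""
    steps = int(360 / step_deg)
    for i in range(steps):
        yield (i * step_deg, 0.0)

waypoints = list(cylinder_poses())   # 24 poses at 15-degree increments
```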
  • image registration module 1400 can also contain a macro component quality review with conditional 1430 where an image registration can be verified.
  • Conditional 1430 determines whether or not the sample image was properly identified by component type.
  • a part without proper features can have an image with missing or non-corresponding form.
  • the missing or non-corresponding form can be an indication of a non-conforming part which requires repair or rejection.
  • a part which has been identified to have non-conforming areas in conditional 1430 can be evaluated under operation 1460 to determine if the part should be rejected or repaired based on a degree of structural non-conformity.
  • a report can be generated in operation 1470 and a human inspector can be contacted in operation 1480 for confirmation.
  • a part can be labeled as a rejected part in conditional 1430 following image registration in operation 1420.
  • an airfoil casting with incomplete mold fill would not have an image comparable to a standard image but would show a lack of material with a non-corresponding form.
  • Image registration 1420 can determine an airfoil component using an outline image but registration verification 1430 can identify a missing portion of the image. The degree of deformity and location of the deformity can be factors in deciding whether the part is rejected or repaired in operation 1460.
  • a casting with a uniform surface can have a profile comparable to a standard image in image registration 1420.
  • Registration verification 1430 can determine if no macro deformities are present when comparing the acquired image with a standard image and verifying the part is ready for inspection.
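  • As an illustration of this macro quality review, the sketch below compares binary silhouettes of the acquired image and a standard reference; the mask representation and the 1% tolerance are assumptions, not values from the patent.

```python
import numpy as np

def missing_material_fraction(part_mask: np.ndarray, ref_mask: np.ndarray) -> float:
    """Fraction of the reference silhouette not covered by the part.
    Both inputs are boolean arrays of the same shape."""
    missing = ref_mask & ~part_mask            # present in standard, absent in part
    return missing.sum() / max(ref_mask.sum(), 1)

def registration_verified(part_mask: np.ndarray, ref_mask: np.ndarray,
                          tol: float = 0.01) -> bool:
    # more than ~1% missing area flags the part for repair/reject review (1460)
    return missing_material_fraction(part_mask, ref_mask) <= tol
```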
  • inspection process 1200 can continue with inspection of the part in inspection module 1500.
  • Inspection module 1500 is shown with further detail for one embodiment in Figure 5.
  • module 1500 can be capable of presenting the part with a predetermined manipulation algorithm based on the automated recognition and registration in module 1400.
  • operation 1510 includes anomaly detection.
  • Anomaly detection can include collecting surface images as the part is manipulated to pre-determined positions. Surface images can be generated and captured by processes such as but not limited to reflected light, negative light, luminescence, fluorescence, x-ray and the like.
  • a surface with grooves would reflect radiation media in planes related to the grooves; radiation media detected outside the expected planes can indicate undesirable groove geometries or anomalies.
  • evaluation in operation 1520 can include an image data processing module of a controller which analyzes surface images for variations outside a standard with model based irregularity detection. Variations can include low reflective response or excessive reflective response, a low negative light or excessive negative light, no luminescence detected or uncharacteristic diffraction patterns depending on the type of surface image generation applied and the method for capturing the image.
  • irregularities are determined when collected images are compared with an airfoil surface imperfection detection (ASID) technique data fusion 1530 or component inspection requirements 1540. Both ASID data fusion 1530 and inspection requirements 1540 can operate with access to deposit of component images 1800. Deposit of component images 1800 can provide standard data sets for comparison.
  • Inspection module 1500 can be followed by condition assessment module 1600, which applies a fuzzy logic analysis of the sensed image(s).
  • Condition assessment module 1600 is shown with further detail from one embodiment in Figure 6.
  • Module 1600 is shown to begin with conditional 1610 where an initial determination of part quality can result in parts with no indications of anomalies moving to a final report in module 1700. Parts with identified anomalies move on for further assessment in module 1600 including an automated anomaly characterization in operation 1630.
  • Operation 1630 accesses a knowledge bank 1640 to apply cognitive characterization of anomalies or features indicated by the image processing.
  • Operation 1630 references inspection guidelines 1620. Inspection guidelines 1620 can result from design specifications, industry standards, or others.
  • a fuzzy logic analysis is performed in module 1650. Fuzzy logic analysis and cognitive characterization in module 1600 provides an objective ability to determine a consistent pass/fail status for parts being inspected.
  • An automated image processing method in an embodiment of the present application can include fuzzy logic analysis to enable the system to use an analysis tool with appropriate processing times for part inspection.
  • a fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules.
  • An example rule in a fuzzy logic system can be of the form: IF variable IS property THEN action.
  • the rule's premise describes the degree to which the rule applies, while the rule's consequent assigns a membership function to the output variable(s), where the set of rules in a fuzzy logic analysis system is known as the rule base or knowledge base.
  • Data processing in a fuzzy logic analysis system of an embodiment of the present application can include four high level steps that correspond roughly to an input stage, a processing stage, a compilation stage and an output stage.
  • fuzzy logic is a mathematical model for addressing inherently imprecise data
  • Fuzzy logic provides a mathematical model of the vagueness found in non-precise measurements allowing automated determinations regarding component analysis such as surface imaging.
  • Figure 6a shows four operations that can be part of fuzzy logic algorithm 1650 which are input 1651, processing 1652, compilation 1653 and output 1654. These operations can be described in slightly differing terms and can be combined, expanded or omitted based on the way the fuzzy logic analysis is described without changing the meaning or intent of using fuzzy logic in this embodiment of the present application.
  • Input Stage 1651 - Fuzzification The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise.
  • the input variables in a fuzzy control system are in general mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. All the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
  • Processing stage 1652 - Inference The truth value for the premise of each rule can be computed and applied to its consequent. This computation results in one fuzzy subset being assigned to each output variable. The computation result in turn can be mapped into a membership function and truth value controlling the output variable.
  • Compilation stage 1653 - Composition All of the fuzzy subsets assigned to each output variable can be combined together to form a single fuzzy output subset for each output variable.
  • Output stage 1654 - Defuzzification The fuzzy output subset for each output variable can be convertible to a unique solution or a 'crisp' answer.
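  • As an illustrative aside, the four stages above can be sketched in a few lines of Python for a single 'brightness' input and an 'anomaly severity' output. The membership shapes, the two rules, and all numeric values are assumptions, not the patent's rule base.

```python
import numpy as np

def mf(x, xs, ys):
    """Piecewise-linear membership function (np.interp clamps at the ends)."""
    return np.interp(x, xs, ys)

def infer_severity(brightness: float) -> float:
    y = np.linspace(0.0, 1.0, 101)                  # output universe
    # 1651 Fuzzification: degree of truth of each rule premise
    dim    = mf(brightness, [0, 128], [1.0, 0.0])   # "brightness IS dim"
    bright = mf(brightness, [64, 255], [0.0, 1.0])  # "brightness IS bright"
    # 1652 Inference: clip each rule's consequent by its premise truth
    low  = np.minimum(mf(y, [0.0, 0.5], [1.0, 0.0]), dim)     # IF dim THEN low
    high = np.minimum(mf(y, [0.5, 1.0], [0.0, 1.0]), bright)  # IF bright THEN high
    # 1653 Composition: merge the clipped subsets into one fuzzy output set
    combined = np.maximum(low, high)
    # 1654 Defuzzification: the centroid of the combined set is the crisp answer
    return float((y * combined).sum() / max(combined.sum(), 1e-9))

print(infer_severity(200.0))   # a bright indication defuzzifies near 'high'
```
  • For a bright region, the 'bright' premise dominates, the 'high severity' consequent is clipped to that truth value, and the centroid lands near the high end of the severity range.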
  • a component image is acquired and a data set is created.
  • Module 1650 compares the image data set to a set of rules assigning a degree of conformity to the data set.
  • the degree of conformity is a representation of the amount of variation between the component image and an image from a deposit of images or set of guidelines.
  • the degree of conformity can be representative of other levels of comparison in other embodiments.
  • the degree of conformity is compiled to produce an output data set related to position and level of conformity.
  • the output data set is compared to data sets in the knowledge bank to determine whether the output data sets are consistent with anomalies. Output data sets consistent with anomalies provide an indication of the anomalies present in the component.
  • Automated review of the conformity data set in this exemplary embodiment is capable of reducing variation found in surface indication detection.
  • a part can move on to a final report in module 1700 following a passing result from operation 1650 or a part can move on to a failure report generation in operation 1670 following a failing result from operation 1650. Additionally, a part labeled as rejected can be provided for human inspection in cases where operation 1650 provides an uncertain result.
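  • One way to picture this routing, with two illustrative thresholds defining pass, fail, and the uncertain band (the threshold values are assumptions, not the patent's):

```python
def route_part(severity: float) -> str:
    """Route a part based on the crisp result of fuzzy logic analysis 1650."""
    if severity < 0.3:
        return "final report (module 1700)"       # passing result
    if severity > 0.7:
        return "failure report (operation 1670)"  # failing result
    return "human inspection"                     # uncertain result
```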
  • Reporting module 1700 can follow the fuzzy logic analysis in operation 1650. Reporting module 1700 can also follow other operations which end inspection such as rejected components in module 1400. In this embodiment, a reporting module 1700 is shown with further detail from one embodiment in Figure 7. Reporting module 1700 can include conditional 1710 which reviews whether the inspection is complete. In this exemplary embodiment, an incomplete inspection returns to inspection process 1200 to conduct the further inspections. A particular example of an inspection can include variations such as multiple planar surfaces or a single surface divided into several inspection tasks.
  • operation 1730 can produce a report.
  • the report from operation 1730 can be in tabular or graphical form intended to communicate the location and degree of deviation for the indicated features.
  • module 1700 can provide a report regarding the features from conditional 1430 and the results of the fuzzy logic analysis in operation 1650.
  • reporting module 1700 can include identifying the surface features indicated by the imaging process, allowing the automatic detection of features on the surface of the part, and applying accept/reject criteria that utilize the results from fuzzy logic algorithm 1650.
  • the features can be cracks, pores, damage, missing material and combinations thereof.
  • a deposit of component images module 1800 is accessed during multiple operations such as those found in conditional 1320, operation 1530, operation 1540 and operation 1630.
  • Surface indication detection databases can be populated with data sets from components with known characteristics including conforming surfaces, non-conforming surfaces, defects, imperfections and the like. Data sets can also be generated through theoretical data from design applications including for example CAD and simulation software.
  • deposit of component images module 1800 can include a set of component poses 1810 for reference.
  • a negative skeleton model 1840 of a part under inspection is produced with edge strengthened formation and then the sensed image is compared with a norm reference 1840a.
  • Norm reference 1840a is analyzed with generalized reference model features in operation 1850.
  • a context-based adaptive surface irregularity detection parameter tuning can be applied in operation 1860.
  • Parameter tuning 1860 can include ensuring selected features are properly detected with applied enhancements and related detection system.
  • the skeleton model from operation 1840 can be further analyzed with ASID techniques such as but not limited to zero-crossing, constant false alarm rates (CFAR), salient-points, neural networks and the like to provide ASID data fusion in operation 1530 for irregularity detection in module 1500.
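  • Of the ASID techniques named above, a cell-averaging constant false alarm rate (CA-CFAR) detector is straightforward to sketch: run over a scan line of surface response values, it flags cells exceeding an adaptive threshold. The window sizes and scale factor below are illustrative assumptions.

```python
import numpy as np

def ca_cfar(signal: np.ndarray, guard: int = 2, train: int = 8,
            scale: float = 3.0) -> np.ndarray:
    """Flag cells whose value exceeds scale * the local noise estimate."""
    n = len(signal)
    hits = np.zeros(n, dtype=bool)
    half = guard + train
    for i in range(half, n - half):
        # average the training cells on both sides, skipping the guard cells
        left = signal[i - half : i - guard]
        right = signal[i + guard + 1 : i + half + 1]
        noise = np.concatenate([left, right]).mean()
        hits[i] = signal[i] > scale * noise       # adaptive threshold test
    return hits
```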
  • Deposit of component images module 1800 can also utilize a positive reference model 1820.
  • Positive reference model 1820 can compare an image with a reference norm 1820a representative of a positive reference model revealing desirable surface conditions.
  • Norm 1820a can be applied to inspection requirements in operation 1540 of inspection module 1500.
  • deposit module 1800 could be capable of retaining images produced during inspection process 1200 and categorizing the images with information determined according to the image analysis. New images could be stored in a knowledge bank. Deposit module 1800 could thereby learn from the inspection process.
  • a method with this system includes applying a surface imaging process to a component, applying an algorithm to efficiently manipulate the component with robotic positioning and applying fuzzy logic analysis to identify surface features of the component shown by the surface imaging process.
  • Figure 9 illustrates a grain structure characterization system 2100 representing an embodiment of the present application, including an inspection process, algorithms, sensors, robotic positioning and analysis to locate, evaluate and report grain structures.
  • Grain structure characterization system 2100 is shown to include a positioning system 2110, a surface scanning system 2120 and a controller 2130.
  • Positioning system 2110 of this embodiment operates with a component presentation technique based on an algorithm for manipulating a component C with respect to surface scanning system 2120 in an efficient manner with minimum hunting for component surfaces and anomalies.
  • the positioning algorithm can be provided or can be selected based on other presented parameters designated to identify the type of component being characterized. Presented parameters could include a part number, a geometry or geometrical feature, a weight, and the like.
  • Embodiments of positioning system 2110 can include a robotic component manipulator.
  • positioning system 2110 utilizes an imaging system to identify the type of component being characterized and determines a positioning algorithm with which to provide predetermined component manipulation during characterization. Robotic component manipulation can provide consistent positioning of the component, such as component C, and of the camera or vision system during the characterization process.
  • component manipulation can include presenting the component to a scanning device such as but not limited to a camera.
  • Surface scanning system 2120 includes a light source 2122 and a detector unit 2124.
  • light source 2122 includes bright field incident illumination such as but not limited to that of light microscopes in reflective mode.
  • Light source 2122 can produce illumination in a variety of wavelengths depending on the material of the component and the available equipment; the term "light" is thus used as shorthand and is not limited to visible light but can include the entire electromagnetic spectrum.
  • light from light source 2122 is directed at the component.
  • Detector unit 2124 detects the light reflected by the surface of component C. Variations in the reflectivity data collected by detector unit 2124 can be the result of differences in the reflectance of various features of the surface of component C.
  • the direction of reflected light can be determined by several factors including but not limited to surface topography and grain structure.
  • Surface topography can include surface variations resulting from voids, grain boundaries (with or without etching) and grain defects.
  • a single crystal cast component can be subjected to the characterization system of the present application.
  • the surface of the cast component can be scanned to produce a reflectivity signal.
  • the reflectivity signal can indicate grain boundaries or multiple phases due to varying surface texture.
  • the varying surface texture can reflect directed light differently allowing reflectivity differentiation between the phases.
  • the component to be inspected includes one or more polished surfaces and/or as-cast surfaces that uniformly reflect light and otherwise show no contrast.
  • the surface of such a component can be prepared, such as, but not limited to, by chemical etching, to optically enhance microstructural features so that the surface does not uniformly reflect light and instead shows image contrast.
  • Defects in the grain structure of a single crystal component can be characterized by an embodiment of the present application.
  • Various grain defects can find their way into a single crystal manufacturing process such as but not limited to high angle grain boundaries, low angle grain boundaries, recrystallized grains, and twinning defects. Examples of defects can include precipitates, dislocations and impurities.
  • One embodiment of the present application is designed to automatically locate, evaluate and report grain defects on families of single crystal cast airfoil type components.
  • Embodiments disclosed herein can also generate a report of the sizes and the locations of the grain defects on the component.
  • One embodiment of the present application includes identifying defects, defect locations and defect density.
  • Controller 2130 of grain structure characterization system 2100 is shown schematically in the embodiment of Figure 9 as a single component containing modules capable of performing various functions. Each function can be located on the same or separate pieces of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Controller 2130 can also include one or more microprocessors where a single microprocessor provides the functions of each module or separate microprocessors are used for one or more of the control modules.
  • Controller 2130 as shown in Figure 9 is capable of operating a scanned data processing module 2132 and a robotic manipulation module 2138.
  • Robotic manipulation module 2138 is shown in Figure 9 as part of controller 2130.
  • Robotic manipulation module 2138 can be part of the positioning equipment in positioning system 2110 as a single system or as separate components.
  • robotic manipulation module 2138 is capable of providing a positioning algorithm and predetermined part manipulation instructions to positioning system 2110 in response to an identification of component C.
  • Scanned data processing system 2132 can include an analyzer module 2134 and a scanning module 2136.
  • scanning module 2136 can interact with surface scanning system 2120 to provide equipment controls as an alternative to controls provided directly with the scanning system equipment or from another source, and can be capable of providing acquisition and manipulation capabilities for data sets provided by surface scanning system 2120.
  • analyzer module 2134 can be a fuzzy logic analyzer module capable of providing analysis of the scanned data sets from surface scanning system 2120.
  • Fuzzy logic can be used in grain structure characterization system 2100 to deal with fuzzy concepts— concepts that cannot be expressed as "true” or “false” but rather as “partial truths.”
  • a reflectivity analysis method in an embodiment of the present application can include fuzzy logic analysis to enable the system to use an analysis tool with appropriate processing times for part inspection.
  • a fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules.
  • An example rule in a fuzzy logic system may be of the form: IF variable IS property THEN action.
  • the rule's premise describes the degree to which the rule applies, while the rule's consequent assigns a membership function to the output variable(s), where the set of rules in a fuzzy logic analysis system is known as the rule base or knowledge base.
  • fuzzy logic is a mathematical model for addressing inherently imprecise data
  • a fuzzy logic analysis can be applied to the present application because such measurements are inherently imprecise: luminescence may be a quantity, but 'brightness' is not, and a sharp cut-off does not exist between 'bright' and 'not bright.'
  • Fuzzy logic provides a mathematical model of the vagueness found in non-precise measurements of reflectivity allowing automated determinations regarding component analysis such as grain structure characterization.
  • FIG 10 shows a flow diagram of an embodiment of the present application including a scanning process 2200.
  • Scanning process 2200 begins with operation 2210 where light is directed at a localized area of a component's surface.
  • Operation 2210 may utilize light source 2122 from Figure 9.
  • operation 2220 senses the light which is reflected by the component surface.
  • Operation 2220 may be accomplished with detector unit 2124 from Figure 9 to detect the reflected light over a specified range of angles.
  • a reflection signal 2230 representing the intensity of the reflected light is provided for a reporting operation 2240.
  • Operation 2240 may include further analysis regarding the reflected light intensity resulting in a characterization of the component surface. Analysis from operation 2240 can include fuzzy logic analysis.
  • grain structure characterization of a component surface as part of operation 2240 may include defect inspection of a single crystal airfoil casting.
  • Figure 11 shows an embodiment of a fuzzy logic analysis that can be part of reporting operation 2240 from Figure 10.
  • Data processing in a fuzzy logic analysis system of an embodiment of the present application can include four high level steps that correspond roughly to an input stage 2310, a processing stage 2320, a compilation stage 2330 and an output stage 2340. These operations can be described in slightly differing terms and can be combined, expanded or omitted based on the way the fuzzy logic analysis is described without changing the meaning or intent of using fuzzy logic in this embodiment of the present application.
  • Input Stage 2310 - Fuzzification The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise.
  • the input variables in a fuzzy control system are in general mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. All the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
  • Output stage 2340 - Defuzzification The fuzzy output subset for each output variable can be convertible to a unique solution or a 'crisp' answer.
  • a grain structure characterization system of the present application utilizes fuzzy logic analysis to determine grain defects of a component.
  • the characterization system presents a surface of the component to a scanning system by manipulating the component according to a positioning algorithm.
  • the scanning system produces a reflectivity data set as a result of light directed to the surface of the component reflecting back.
  • the reflectivity data set would include the intensity of reflecting light and location on the surface of the component.
  • Fuzzy logic analysis is applied where the reflectivity data set is collected as the input variables.
  • the input variables are assigned a degree of intensity.
  • the degree of intensity is compiled to produce an output data set related to the level of reflectivity and location.
  • the output data set can be characterized to indicate grain structure or, more specifically, grain defects. Characterization can be performed by comparing the output data set to standard data sets in an airfoil defect knowledge bank.
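  • A minimal sketch of this reflectivity characterization, assuming a 2-D reflectivity map, a standard reference map of the same shape, and an arbitrary deviation tolerance:

```python
import numpy as np

def degree_of_intensity(values: np.ndarray) -> np.ndarray:
    """Map raw reflectivity readings to a degree of intensity in [0, 1]."""
    lo, hi = float(values.min()), float(values.max())
    return (values - lo) / max(hi - lo, 1e-9)

def candidate_defects(reflectivity: np.ndarray, reference: np.ndarray,
                      tol: float = 0.2) -> np.ndarray:
    """Boolean map of locations whose degree of intensity deviates from the
    standard data set enough to suggest a grain structure feature."""
    return np.abs(degree_of_intensity(reflectivity)
                  - degree_of_intensity(reference)) > tol
```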
  • an apparatus includes a controller 2130 with various components illustrated as representative modules, inputs, outputs and intermediate data parameters.
  • Module 2410 is a position module structured to determine a positioning algorithm 2414 in response to the identification of a type of component 2412 and a component positioning database 2416.
  • Various systems can be available for identifying the component in component identification 2412 such as but not limited to bar code, scan, operator input, CMM data, imaging data and the like.
  • Component manipulation instructions 2418 for the positioning equipment are provided in response to positioning algorithm 2414.
  • Manipulation instructions 2418 provide a component orientation data set 2401.
  • manipulation instructions 2418 can provide robotic manipulation of the component during operation of grain structure characterization system 2100.
  • Module 2420 is a reflection module structured to direct a source of light 2422 on to the surface of a component and detect a quantity of reflected light 2424 with a detection unit. Module 2420 is further structured to provide a reflectivity data set 2426 in response to the quantity of reflected light 2424.
  • Module 2430 is a characterization module where a fuzzy logic algorithm 2432 can be applied to the reflectivity data set. Fuzzy logic algorithm 2432 applies reflectivity data set 2426 to a set of input variables 2433. A set of fuzzy logic membership functions 2435 assigns a degree of intensity to the set of input variables 2433. Fuzzy logic algorithm 2432 determines an output data set 2434 which is converted into a solution set 2436.
  • Indication module 2440 is structured to identify an indication 2445 of a grain structure feature in response to solution set 2436 and can also be in response to component orientation data set 2401.
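  • The dataflow among modules 2410-2440 can be mirrored with hypothetical types and functions; the field names, score scaling, and 0.5 threshold below are placeholders for illustration, not the patent's implementation.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ReflectivityDataSet:                  # data set 2426 from module 2420
    locations: List[Tuple[float, float]]    # surface coordinates scanned
    intensities: List[float]                # detected reflected light 2424

def characterization_2430(data: ReflectivityDataSet) -> List[float]:
    """Stand-in for fuzzy logic algorithm 2432: assign each reading a degree
    of intensity and reduce it to a per-location score (solution set 2436)."""
    hi = max(data.intensities) or 1.0
    return [i / hi for i in data.intensities]

def indication_2440(data: ReflectivityDataSet, solution: List[float],
                    threshold: float = 0.5) -> List[Tuple[float, float]]:
    """Identify indications 2445 of grain structure features by location."""
    return [loc for loc, s in zip(data.locations, solution) if s > threshold]
```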
  • Grain structure features indicated by a grain structure characterization system as part of an embodiment of the present application can include but are not limited to grain structure, grain defects, grain locations, grain size, and grain defect density.
  • the part can include a single crystal, a directionally solidified, and/or an equiaxed grain structure.
  • the part can include an airfoil component of a gas turbine engine.
  • One embodiment of a grain structure characterization system can operate to mechanically locate, evaluate, and report grain structure features.
  • Another embodiment of the system can generate a report of the sizes, locations and types of grain structures of the component in tabular or graphical form.
  • An embodiment of the present application applies a special lighting configuration, a part presentation technique, and a fuzzy logic based processing technique for identifying grain structures in a single crystal cast airfoil component.
  • Embodiments of the present application can be applied to components requiring grain defect inspection such as but not limited to single crystal cast components, directionally solidified cast components, and equiaxed solidified cast components.
  • a method includes applying a grain structure characterization process to a component, applying an algorithm to efficiently position the component with an automatic positioning algorithm and applying fuzzy logic analysis to identify grain structure characterizations of the component.
  • Figure 13 illustrates a surface inspection system 3100 representing a non-limiting embodiment of the present invention, including an automated surface inspection process, algorithms, sensors, robotic positioning, and analysis to locate, evaluate and report surface variances.
  • Surface inspection system 3100 is shown to include a preparation system 3110, an inspection system 3120 and a controller 3130.
  • Preparation system 3110 has four stages. In other embodiments, each stage can have multiple levels and one stage can be combined with another stage. In yet other embodiments, one or more of the stages may not be included.
  • the embodiment shown with preparation system 3110 includes an initial cleaning process 3112, an indicator application process 3114, an excess indicator removal process 3116, and a developer application process 3118.
  • Initial cleaning process 3112 can be included when the surface of a part 3122 contains contamination such as but not limited to lubricant and material shavings from previous manufacturing processes or other sources.
  • a surface of a part that is clear of oil or debris can reduce the opportunity for obscuring an anomaly or falsely indicating a defect on the surface.
  • Indicator application process 3114 can include application techniques available to an operator including but not limited to dipping, brushing and spraying.
  • Indicators can include liquid indicators such as a dye or non-liquid indicators such as magnetic-particles.
  • Application parameters for indicator application process 3114 can depend on the indicator chosen and the types of anomalies anticipated. For example, dyes with lower viscosity may penetrate faster and small anomalies may require more time for penetration. In some applications, surface porosity may affect the ability of a liquid indicator to adequately indicate surface defects and adjustments can be made to the application parameters.
  • Excess indicator removal process 3116 can remove substantially all of the excess indicator from a surface without removing too much indicator, which can affect the accuracy of a surface inspection test. Not removing enough of the excess indicator can lead to false indications, and removing more than just the excess indicator can deplete the amount of indicator necessary on the surface for indicating anomalies.
  • In developer application process 3118, a developer can be used in some embodiments that apply certain types of indicators, to provide additional contrast between a fluorescent dye and the surrounding surfaces.
  • Inspection system 3120 includes a positioning system 3124 and an indication system 3126.
  • Positioning system 3124 of this embodiment operates with a part presentation technique based on an algorithm for manipulating part 3122 in an efficient manner with minimum hunting for part surfaces and anomalies.
  • Embodiments of positioning system 3124 can include a robotic part manipulator with a discussion of further details to follow.
  • positioning system 3124 utilizes illumination and imaging components to identify the type of part 3122 being inspected. Illumination can be, for example, supplied for reflection detection or shadow detection.
  • An imaging component can be, for example, a camera capable of reproducing the image, a photo sensor capable of detecting illumination, or the like.
  • Positioning system 3124 can determine the identity of part 3122 by analyzing the outline of part 3122 generated when the robotic part manipulator places the part in a predetermined position between a light source and an imaging component. In another embodiment, positioning system 3124 can analyze a reflection image based on light emitted toward part 3122 and reflected back to an imaging component. Radiation types other than light can be emitted. A detected image of part 3122 can be analyzed by comparison to a standard image within a library of images accessible by positioning system 3124. Comparison may include determining predetermined data points and comparing data points, overlaying images and determining differences, and other such methods known in the art.
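  • A sketch of such library-based identification, scoring binary silhouettes by intersection-over-union; the library structure and the 0.9 acceptance score are assumptions for illustration.

```python
import numpy as np

def iou(a: np.ndarray, b: np.ndarray) -> float:
    """Intersection-over-union of two boolean silhouette masks."""
    union = np.logical_or(a, b).sum()
    return np.logical_and(a, b).sum() / max(union, 1)

def identify_part(outline: np.ndarray, library: dict, min_score: float = 0.9):
    """Return the best-matching part type from a {name: mask} library,
    or None if nothing matches well enough."""
    best = max(library, key=lambda name: iou(outline, library[name]))
    return best if iou(outline, library[best]) >= min_score else None
```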
  • robotic part manipulation may include robotic positioning of part 3122 with preset coordinates placing predetermined features of a part in a predetermined position relative to recognition equipment according to a positioning algorithm.
  • Part manipulation can also include predetermined repositioning of a part during further steps of the inspection process.
  • robotic part manipulation can provide consistent part positioning during the inspection process which can reduce variation and can improve efficiency of the inspection process.
  • positioning system 3124 can determine the positioning algorithm which would provide predetermined part manipulation based on part 3122 identification.
  • Indication system 3126 of inspection system 3120 may include an image capture device such as but not limited to a camera which may be capable of capturing the visible spectrum, a photo-emission sensor for various wavelengths including but not limited to ultraviolet and x-ray, detectors capable of sensing electromagnetic radiation, and the like. Other capture devices structured to capture an indication from suitable indicators are also contemplated herein.
  • a light source can be a laser, a discharge tube, or other radiation source.
  • indication system 3126 includes equipment with the capability to provide a radiation source to react with a fluorescent penetrant indicator causing an emission which can be detected by equipment of indication system 3126.
  • Equipment of indication system 3126 can be contained in a single housing as shown in Figure 13 or can be contained in separate housings.
  • Indication system 3126 can also include multiple radiation or illuminating sources and/or detection components. Components of indication system 3126 can also provide illuminating and image acquisition for use with positioning system 3124.
  • Controller 3130 of surface inspection system 3100 is shown in the embodiment of Figure 13 as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Controller 3130 can also include one or more microprocessors where, in one embodiment, a single microprocessor can provide the functions of each module or separate microprocessors can be used for one or more of the control modules. One skilled in the art would be able to determine a controller architecture.
  • Controller 3130 in the embodiment of Figure 13 is shown as being capable of operating an indication data processing system 3132 and a robotic manipulation module 3138.
  • Indication data processing system 3132 can include an analyzer module 3134 with further details to follow and a sensor module 3136.
  • the analyzer module 3134 is a fuzzy logic analyzer.
  • sensor module 3136 can interact with indication system 3126 to provide equipment controls as an alternative to controls provided directly with the indication equipment or from another source.
  • Sensor module 3136 can be capable of providing acquisition and manipulation capabilities for data sets obtained by indication system 3126.
  • analyzer module 3134 is a fuzzy logic analyzer module capable of providing analysis of the indication data sets from indication system 3126.
  • Fuzzy logic analysis provides a mathematical model of the vagueness found in non- precise measurements of surface inspection techniques such as but not limited to FPI and magnetic-particle inspection. Fuzzy logic can be used in machine control in order to deal with fuzzy concepts— concepts that cannot be expressed as "true” or "false” but rather as “partial truths.”
  • Fuzzy logic analyzer module can include an input stage, a processing stage, a compilation stage and an output stage.
  • the input stage maps sensor or other inputs to appropriate membership functions and truth values.
  • the processing stage invokes an appropriate set of logic rules in the form of IF-THEN statements - IF variable IS property THEN action.
  • the compilation stage combines the results of the rules.
  • the output stage converts the combined results into a control output value.
  • an indication data processing method in an embodiment of the present invention includes fuzzy logic analysis to enable a system to use an analysis tool with appropriate processing times for part inspection.
  • a fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules.
  • An example rule in a fuzzy logic system may be of the form: IF variable IS property THEN action.
  • fuzzy logic is a mathematical model for addressing inherently imprecise data
  • surface anomalies are indicated by areas of brightness due to the presence of the fluorescent penetrant.
  • the concept of 'brightness' is not mathematically expressed in an equation. Luminescence may be a quantity but 'brightness' is not. A sharp cut off does not exist between 'bright' and 'not bright.' One cannot simply say that 'bright' is at X luminescence but 'not bright' is at X-1 luminescence.
  • an operator may be able to infer differing 'brightness' for the areas of a sample with differing levels of fluorescent penetrant responding to the radiation. How much 'brightness' is recorded will vary between operators, leading to reduced repeatability.
  • a radiance data set is collected and compared to a set of rules assigning a degree of intensity to the radiance data set.
  • the degree of intensity in this embodiment is a representation of the amount of radiance the fluorescent penetrant produces when radiated.
  • the degree of intensity may be representative of other levels of indicators in other embodiments.
  • the degree of intensity is compiled to produce an output data set related to position and level of radiance.
  • the output data set is compared to data sets in a knowledge bank to determine whether the output data sets are consistent with anomalies. Output data sets consistent with anomalies provide an indication of the anomalies present in the component. Automated review of the radiance data set in this embodiment is capable of reducing variation found in surface variance detection.
  • Robotic manipulation module 3138 is shown in Figure 13 as part of controller 3130. Robotic manipulation module 3138 can, in the alternative, be part of the positioning equipment in positioning system 3124 as a single system or as separate components. For one embodiment, robotic manipulation module 3138 is capable of providing a positioning algorithm, a component type recognition database and predetermined part manipulation instructions.
  • a positioning algorithm can include predetermined coordinates for a robotic part manipulator where coordinates can be based on an absolute or comparative capacity. For one embodiment, once a part has been identified and the position of certain features determined in relation to a part manipulator, a positioning algorithm produces manipulation instructions such that inspection begins with predetermined initial coordinates within the robotic manipulator's coordinate measuring system.
  • the positioning algorithm could then control movement of the part manipulator allowing the inspection to be systematically applied to related components.
  • Surface inspection system 3100 of Figure 13 can also include a final cleaning process 3140 which may allow a part 3122 to be returned to a manufacturing line following a surface inspection test.
  • the care and degree of cleaning necessary can depend on the remaining manufacturing processes and the final function of the parts being tested.
  • Surface variances indicated by an embodiment of surface inspection system 3100 can include but are not limited to micro and macro porosity, inclusion defects, inhomogeneities, and discontinuities.
  • the part would include a single crystal, a directionally solidified, and/or an equiaxed microstructure.
  • the part could include an airfoil component of a gas turbine engine.
  • Another embodiment can operate to mechanically locate, evaluate, and report surface variances on families of airfoil type components.
  • Yet another embodiment of the present application generates a report of the sizes and locations of the variances on the surface of a component in tabular or graphical form.
  • Inspection process 3200, shown in Figure 14, begins with operation 3220 which includes surface defect indicator preparation. Shown as following operation 3220 in process 3200 is optional operation 3230 which includes recognizing the part being tested. The recognition in operation 3230 can be based on a comparison with the sensed image of the part and a database of part responses. Operation 3240 is then capable of applying a predetermined positioning algorithm based on the automated recognition of operation 3230 to manipulate the part. Automatic part positioning may reduce variability and improve the efficiency of the test.
  • operation 3250 provides a source of excitement and senses the response from the surface of the part to collect an indication data set.
  • UV radiation is directed toward a surface of a test part to irradiate a fluorescent dye.
  • iron particles are placed on a ferromagnetic component's surface and a magnetic field is applied to the component. The magnetic flux of the applied magnetic field leaks at surface anomalies. The iron particles are attracted to areas of flux leakage, producing an indicator of the surface anomalies.
  • operation 3260 which applies a fuzzy logic analysis.
  • Figure 15 shows further detail regarding operation 3260 where, in one embodiment, four exemplary operations form the fuzzy logic analysis. These operations may be described in slightly differing terms and may be combined, expanded, or omitted depending on how the fuzzy logic analysis is described, without changing the meaning or intent of using fuzzy logic in this embodiment of the present invention. A minimal sketch of the four stages follows the list below.
  • Input Stage - Fuzzification (3262) The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise.
  • the input variables in a fuzzy control system can be, in general, mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. Any of the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
  • Processing stage - Inference (3264) The truth value for the premise of each rule may be computed and applied to its consequent. This computation results in one fuzzy subset being assigned to each output variable. The computation result may be mapped into a membership function and truth value controlling the output variable.
  • Compilation stage - Composition (3266) All of the fuzzy subsets assigned to each output variable may be combined to form a single fuzzy output subset for each output variable.
  • Output stage - Defuzzification (3268) The fuzzy output subset for each output variable may be convertible to a unique solution or a 'crisp' answer.
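The four stages can be illustrated with a minimal Python sketch. The input variable (indication brightness), output variable (defect severity), triangular membership functions, and the three rules are all illustrative assumptions; a production rule base would be far richer.

```python
# Minimal sketch of the four fuzzy logic stages. The "brightness" input,
# "severity" output, membership functions, and rules are illustrative
# assumptions, not taken from the specification.

def tri(x, a, b, c):
    """Triangular membership function rising from a, peaking at b, falling to c."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x < b else (c - x) / (c - b)

def fuzzify(brightness):
    """Input stage (3262): map a crisp reading onto the input fuzzy sets."""
    return {
        "dim":    tri(brightness, -1.0, 0.0, 0.5),
        "medium": tri(brightness,  0.0, 0.5, 1.0),
        "bright": tri(brightness,  0.5, 1.0, 2.0),
    }

# Output membership functions for the "severity" variable.
SEVERITY = {
    "low":  lambda y: tri(y, -1.0, 0.0, 0.5),
    "mid":  lambda y: tri(y,  0.0, 0.5, 1.0),
    "high": lambda y: tri(y,  0.5, 1.0, 2.0),
}

# Rule premises and consequents: IF brightness IS k THEN severity IS RULES[k].
RULES = {"dim": "low", "medium": "mid", "bright": "high"}

def compose(memberships):
    """Processing stage (3264): clip each consequent at its premise truth.
    Compilation stage (3266): max-combine the clipped subsets into one
    fuzzy output subset."""
    return lambda y: max(min(truth, SEVERITY[RULES[name]](y))
                         for name, truth in memberships.items())

def defuzzify(subset, steps=100):
    """Output stage (3268): the centroid of the fuzzy output subset gives
    a single crisp severity value."""
    ys = [i / steps for i in range(steps + 1)]
    den = sum(subset(y) for y in ys) or 1.0
    return sum(y * subset(y) for y in ys) / den

crisp_severity = defuzzify(compose(fuzzify(0.8)))  # 'crisp' answer in [0, 1]
```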
  • Operation 3270 follows the fuzzy logic analysis in operation 3260.
  • Operation 3270 allows an automated identification of anomalies on the surface of the part as indicated by the indicator.
  • the anomalies can be inhomogeneities, micro structural discontinuities, inclusions, micro-porosity, grain structure and combinations thereof.
  • the fuzzy logic algorithm from operation 3260 can produce a characterization data set for comparison with a knowledge bank. This comparison in operation 3270 allows the automated inspection process to apply cognitive characterization.
  • the knowledge bank includes, but is not limited to, data sets from previous surface inspection applications to standard components or data sets generated from theoretical calculations or simulations. Fuzzy logic analysis and cognitive characterization in operation 3270 can directly affect the ability to determine an automated pass/fail status for the part.
  • Operation 3280 includes the application of accept/reject criteria which utilize the results from the fuzzy logic algorithm in operation 3260 and the anomaly indication in operation 3270. Operation 3280 can also provide a report (3280a) regarding the anomalies from operation 3270 and the results of the fuzzy logic analysis in operation 3260. For some embodiments, the report from operation 3280 can be in tabular or graphical form intended to communicate the location and degree of deviation for the indicated anomalies.
  • inspection process variation can be greatly reduced by automating the detection of variances and the application of pass/fail criteria using fuzzy logic analysis. Fuzzy logic analysis allows an automated inspection to access a knowledge bank to apply cognitive characterization of defects and provide a level of consistency to determine a pass/fail status according to a specification. A minimal sketch of such a knowledge-bank comparison follows.
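The sketch below assumes feature vectors, a Euclidean distance metric, and a fixed tolerance, none of which appear in the specification; it shows only the shape of a knowledge-bank lookup.

```python
# Sketch of comparing a characterization data set against a knowledge bank
# to determine pass/fail status. The feature vectors, Euclidean metric, and
# tolerance are illustrative assumptions.
import math

KNOWLEDGE_BANK = [
    # (reference characterization data set, known disposition)
    ([0.10, 0.02, 0.00], "pass"),   # clean standard component
    ([0.65, 0.40, 0.30], "fail"),   # known micro-porosity signature
]

def disposition(features, tolerance=0.25):
    """Return the disposition of the nearest knowledge-bank entry, or flag
    the part for human review when nothing is within tolerance."""
    ref, status = min(KNOWLEDGE_BANK, key=lambda e: math.dist(features, e[0]))
    return status if math.dist(features, ref) <= tolerance else "review"

print(disposition([0.12, 0.05, 0.01]))  # "pass"
```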
  • Another embodiment of the present application applies a special lighting configuration, a part presentation technique, and a fuzzy logic based image processing technique for identifying inhomogeneity in a single crystal cast airfoil component using a fluorescent penetrant process.
  • Yet another embodiment includes an algorithm for manipulating a part with respect to lighting and camera positions in an efficient manner with minimum hunting and a fuzzy logic based image processing algorithm to identify anomalies which may indicate a surface defect.
  • Embodiments from the present application can be applied to components utilizing FPI or magnetic-particle defect inspection, such as, but not limited to, single crystal cast components, directionally solidified cast components, and equiaxed solidified cast components.
  • an embodiment of an intelligent automated visual inspection system 4100 is disclosed which is capable of acquiring and processing images of components such as, but not limited to, engine components such as airfoils of gas turbine assemblies.
  • the embodiment of inspection system 4100 as shown in Figure 16 includes an illumination system 4110, an imaging system 4120, a manipulation system 4130, a user interface 4140, an inspection processor 4150, and an image library 4160.
  • An illumination system such as system 4110 can include a source of radiance to be directed toward a component C under inspection.
  • the radiance can be reflected by the surface of component C and detected by imaging system 4120.
  • Radiance type can include various wavelengths in the electromagnetic spectrum including but not limited to the visible spectrum, ultraviolet light, near infrared, and x-rays.
  • the source of radiance can include a laser, a discharge tube and the like.
  • an imaging system can be a camera utilizing conventional light or another electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like.
  • An embodiment of manipulation system 4130 can include a robotic part manipulator and positioning algorithms to provide predetermined part presentation and positioning during an inspection process.
  • User interface 4140 includes an interface having parameters within modules that can be selected by a user to determine a set of inspection protocols.
  • the inspection protocols can provide control of illumination system 4110, imaging system 4120 and manipulation system 4130 to produce an acquired image of component C under inspection.
  • the acquired image can be analyzed by inspection processor 4150.
  • the inspection protocols can further be applied to the analysis of the acquired image.
  • the analysis includes referencing image library 4160.
  • Inspection system 4100 can be used to analyze and determine characteristics or manufacturing flaws in components being inspected.
  • inspection system 4100 is a protocol-based visual inspection system with image processing algorithms and techniques implemented in system software.
  • a system of such an embodiment can offer intuitive and easy-to-use interfaces to develop visual inspection procedures of components.
  • inspection system 4100 can be used without writing lines of programming code.
  • An inspection system of another embodiment of the present application is fully automated, adaptive, and customizable to perform complex visual inspection comparable to that of a human inspector.
  • the protocol-based system of yet another embodiment can have a built-in capability to simultaneously facilitate automated control of the visual inspection system including the accompanying illumination, imaging, and component manipulation systems.
  • An inspection system of one embodiment can have a protocol-based development technique which follows an interactive process. Through a process such as the one found in this embodiment of the present application, inspectors can fine tune the inspection system control parameters to achieve the inspection of components to various degrees of requirements, yet within an acceptable margin recommended by the Engineering Inspection Standards (EIS).
  • Figure 17 illustrates an exemplary inspection system protocol development graphical user interface (GUI).
  • an inspection protocol is interactively designed to meet a selected inspection requirement with inspection protocol development tools selected in an inspection protocol module 4215.
  • a user designs a protocol by selecting a series of available inspection options in an inspection setup requirement module 4205 and by defining four inspection parameters in an inspection process control parameters module 4210 including: contrast strength, border strength, edge strength, and noise strength.
  • an inspection of a component can consist of individual protocols executed during the inspection process one by one, in the order they are constructed, as sketched below.
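For illustration, a protocol could be represented as a record of the four inspection process control parameters and executed in construction order. The field and function names below are assumptions, not part of the specification.

```python
# Sketch of a protocol record holding the four inspection process control
# parameters, with protocols executed one by one in construction order.
from dataclasses import dataclass, field

@dataclass
class InspectionProtocol:
    name: str
    contrast_strength: float
    border_strength: float
    edge_strength: float
    noise_strength: float
    regions: list = field(default_factory=list)  # designated component regions

def inspect(image, protocol):
    """Placeholder for the image-processing pipeline a real system would
    drive with the protocol's four control parameters."""
    return {"protocol": protocol.name, "findings": []}

def run_inspection_protocols(protocols, component_image):
    """Execute the protocols one by one, in the order they were constructed."""
    return [inspect(component_image, p) for p in protocols]

leading_edge = InspectionProtocol("leading edge scan", contrast_strength=0.8,
                                  border_strength=0.5, edge_strength=0.9,
                                  noise_strength=0.2)
results = run_inspection_protocols([leading_edge], component_image=None)
```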
  • this embodiment can further allow the user to select inspection regions of components with a designated component regions module 4220 consistent with the specified EIS.
  • the user can specify the section of a component to be inspected in a component section inspection module 4230 according to component linguistic terminology.
  • the exemplary inspection system protocol development GUI shown in Figure 17 can allow the user to specify surface conditions of components being inspected with an intrinsic surface conditions module 4250.
  • a still further embodiment can allow the user to control parameters enabling synchronization of inspection subcomponents including, but not limited to, illumination, imaging, and component manipulation systems.
  • the embodiments of the inspection systems and associated systems described herein can be used to control the illumination system, camera parameters, and component manipulators together, either concurrently or in a controlled sequence, with minimal to no additional interaction from an operator.
  • Figure 18 illustrates a flow chart representing steps for an automated inspection process 4300 that can be performed through an embodiment of a GUI, for example the embodiment disclosed herein.
  • Inspection process 4300 is shown as initiating with operation 4305 which develops the inspection protocol.
  • a designed inspection protocol of one embodiment can control the illumination system, the imaging parameters, and the component manipulation system in response to the inspection protocol.
  • Built-in communication capabilities of an inspection system of an embodiment of the present application can facilitate synchronization of inspection hardware and software.
  • operation 4310 is an image acquisition operation accessing an imaging parameters module 4312, an illumination parameters module 4314, a component manipulation requirements module 4316 and an image depository requirement module 4318.
  • An imaging system operating under imaging parameters module 4312 can be a camera utilizing conventional light or another electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like.
  • Illumination parameters module 4314 can correspond with the technology of imaging parameters module 4312.
  • Component manipulation requirements module 4316 can include manual, automated, or semi-automated instructions and controls to manipulate a component during inspection.
  • automated component manipulation controls can be determined in response to a component identification process. The identification process can be integrated with image depository requirement module 4318.
  • Operation 4320 is a feature extraction process including an image segmentation module 4322, a feature vector formation module 4324, a feature vector clustering module 4326 and a weak feature pruning module 4328.
  • modules 4322, 4324, 4326 and 4328 of feature extraction operation 4320 can identify and remove segments of the component image acquired in operation 4310 deemed unnecessary or peripheral. In various embodiments, removal of these segments yields an image with sharper edges for edge detection analysis or smoother shading for defect detection analysis, for example. Once segmented in feature extraction operation 4320, the image background information can be ignored. A minimal sketch of this pipeline follows.
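The sketch below assumes intensity-threshold segmentation, (area, mean intensity) feature vectors, strength-based clustering, and area-based pruning; the thresholds and feature choices are illustrative, and SciPy's connected-component labelling is assumed available.

```python
# Sketch of the feature extraction operation: segmentation, feature vector
# formation, clustering, and weak feature pruning. Thresholds and feature
# choices are illustrative assumptions.
import numpy as np
from scipy import ndimage  # connected-component labelling

def extract_features(image, fg_thresh=0.5, min_area=5):
    # Image segmentation 4322: separate foreground from ignorable background.
    labels, n = ndimage.label(image > fg_thresh)

    # Feature vector formation 4324: (area, mean intensity) per segment.
    vectors = []
    for i in range(1, n + 1):
        mask = labels == i
        vectors.append((int(mask.sum()), float(image[mask].mean())))

    # Feature vector clustering 4326: crude grouping by response strength.
    clusters = {"faint": [], "strong": []}
    for area, strength in vectors:
        clusters["strong" if strength > 0.75 else "faint"].append((area, strength))

    # Weak feature pruning 4328: discard segments too small to be meaningful.
    for key in clusters:
        clusters[key] = [v for v in clusters[key] if v[0] >= min_area]
    return clusters

print(extract_features(np.random.rand(64, 64)))
```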
  • the ability to include and exclude certain features of the component C can also be provided.
  • the GUI described above can include, or alternatively take the form of, a mask construction GUI that permits an operator to mask a specific region of the component C.
  • a polygon mask can be used in some forms and can have any geometrical shape useful to identify certain areas of the component C.
  • the GUI can also permit an operator to import and export masks associated with an inspection protocol.
  • the features available to the operator can permit the mask to be translated, rotated, expanded, shrunk, etc., to identify certain areas.
  • one or more vertices of a polygon mask can be manipulated through the GUI. Two types of masks can be used in the various embodiments of the system described herein.
  • An "Include Mask” and an “Exclude Mask” can be used.
  • the Include Mask can enclose a section of the component C that is subjected to inspection, while the Exclude Mack can define sections of the component C that should be excluded from inspection.
  • one Include Mask and one Exclude mask are permitted for any given protocol.
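The sketch below evaluates Include/Exclude masks with a standard ray-casting point-in-polygon test; the coordinates and API shape are illustrative assumptions.

```python
# Sketch of Include/Exclude polygon masks using a ray-casting
# point-in-polygon test. Coordinates and API shape are assumptions.

def point_in_polygon(x, y, polygon):
    """Ray-casting test: count edge crossings of a horizontal ray."""
    inside = False
    n = len(polygon)
    for i in range(n):
        x1, y1 = polygon[i]
        x2, y2 = polygon[(i + 1) % n]
        if (y1 > y) != (y2 > y):
            if x < x1 + (y - y1) * (x2 - x1) / (y2 - y1):
                inside = not inside
    return inside

def should_inspect(x, y, include_mask, exclude_mask=None):
    """A point is inspected when inside the Include Mask and outside the
    Exclude Mask (at most one of each per protocol)."""
    if not point_in_polygon(x, y, include_mask):
        return False
    return exclude_mask is None or not point_in_polygon(x, y, exclude_mask)

# Example: rectangular include region with a triangular exclusion.
include = [(0, 0), (100, 0), (100, 50), (0, 50)]
exclude = [(40, 10), (60, 10), (50, 30)]
print(should_inspect(20, 20, include, exclude))  # True
print(should_inspect(50, 15, include, exclude))  # False, inside exclusion
```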
  • Upon segmentation of the foreground, including the component subject to inspection, from the background, defective regions can be determined in defect detection and validation operation 4330.
  • Defects can include burrs, nicks, marks, scores, pitting, dents, and visible cracks to name a few.
  • Operation 4330 includes a defect spatial registration module 4332 and a defect verification checks module 4334.
  • Defect spatial registration module 4332 can, for example, in one embodiment provide location information of a determined defect by coordinating with a component manipulation system. The spatial information can be used to communicate the location of the detected defect to a user.
  • Defect verification checks module 4334 can operate to provide information regarding characterization of a defect such as, but not limited to, the severity and type of defect detected.
  • Defect verification checks module 4334 can provide this characterization information to the next operation.
  • Operation 4340 is shown following operation 4330 and is a defect characterization operation including quantitative and qualitative analysis.
  • Operation 4340 applies a defect statistical data measurement module 4342 to define geometrical properties of an identified defective area.
  • fuzzy logic analysis can be applied in one or more portions of the inspection process 4300.
  • the qualitative judgment can provide an indication of the acceptability of a component with a defect according to the inspection standards being applied.
  • each defective area can be characterized based on both quantitative and qualitative measures with the application of a defect severity assessment module 4344 and a defect distribution assessment module 4346. Severity and distribution assessment can provide information relevant to determining a cause for the detected defects, in addition to contributing to decisions regarding acceptability of a component. A sketch of such characterization follows.
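The sketch below computes geometric properties for one defective region and bins it into a qualitative severity class; the properties chosen and the severity thresholds are illustrative assumptions.

```python
# Sketch of quantitative defect measurement plus a qualitative severity bin
# for one defective region. Thresholds are illustrative assumptions.

def characterize_defect(pixels):
    """pixels: list of (row, col) coordinates belonging to one defect."""
    rows = [r for r, _ in pixels]
    cols = [c for _, c in pixels]
    area = len(pixels)
    stats = {
        "area": area,
        "centroid": (sum(rows) / area, sum(cols) / area),
        "bbox": (min(rows), min(cols), max(rows), max(cols)),
        "aspect": (max(rows) - min(rows) + 1) / (max(cols) - min(cols) + 1),
    }
    # Qualitative severity assessment on an assumed area scale.
    stats["severity"] = ("low" if area < 10 else
                         "medium" if area < 100 else "high")
    return stats

print(characterize_defect([(3, 4), (3, 5), (4, 4), (4, 5), (5, 4)]))
```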
  • inspection process 4300 can use an analysis technique to perform defect condition reasoning with respect to the inspection engineering standards and an image library. Fuzzy logic analysis can be applied in operation 4350. With the assessment of operation 4350, a recommendation can be made for passing, recalling, or rejecting the inspected component in decision making module 4352 and a report can be generated in report generation module 4354.
  • Figure 19 illustrates an embodiment of functional components within an inspection processor 4450 of an inspection system 4400.
  • Inspection processor 4450 is represented as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art.
  • Processor 4450 can also include one or more microprocessors where in one embodiment a single
  • microprocessor can provide the function of each module or separate microprocessors can be used for one or more of the modules.
  • Functional components can include a graphical user interface 4410, a component manipulation system interface 4420, an imaging system interface 4425, an image processing library 4430 and an inspection preference interface 4440.
  • the image processing library 4430 is capable of providing images for identification, verification and assessment of images acquired from a component under inspection. Further, image processing library 4430 is capable of storing acquired images for application in subsequent image analysis.
  • component manipulation system interface 4420 provides a communication interface to a manipulation system. In some forms the system 4420 can be used to pass information from a manipulation programs module 4421 when positioning a component. Manipulation programs module 4421 can provide instructions for manipulating a component during an inspection process.
  • Manipulation programs module 4421 can also assess an object to determine the instructions to be applied when manipulating the component.
  • Imaging system interface 4425 provides a communication interface to an imaging system.
  • the system 4425 can be used to pass information from an image calibration settings module 4426 when acquiring an image of the component.
  • Image calibration settings module 4426 can provide assessment and control of the imaging system to ensure consistent performance.
  • component manipulation system interface 4420 can be a communication interface.
  • imaging system interface 4425 can be a communication interface.
  • inspection processor 4450 is shown with a technique module 4460, a protocol module 4470 and an inspection tool module 4480.
  • Technique module 4460 can include protocol based inspection 4462, quantitative characterization 4464, and fuzzy logic qualitative reasoning 4466. These techniques can be applied during image analysis.
  • Protocol module 4470 can include design standards 4472, inspection requirements 4474, and protocol designs 4476 selected with a user interface to provide parameters for the inspection process.
  • Inspection tool module 4480 can include post-inspection analysis 4482, defect training 4484, and results recording 4486.
  • Module 4480 can operate in coordination with image processing library 4430 to store products of inspection tool module 4480. Inspection system 4400 can also provide a component quality report 4490 with status such as pass, reject, repair and recall, for example.
  • an inspection system of one embodiment includes a defect training module.
  • the inspection system supports an interactive process by which an inspector can train the inspection system to detect certain defect conditions.
  • the inspector can train the system with two different types of defects including Positive and Negative defects.
  • Each defect category can be associated with a relative scaling factor of low, medium or large on a qualitative basis.
  • the training can be used to specify the identified defect either as a "pass", a "reject", or a "rework" defective class, as in the sketch below.
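A sketch of what a defect training record and interactive training call might look like; the record fields mirror the polarity, scaling factor, and defective class described above, while the structure itself is an assumption.

```python
# Sketch of a defect training record: an inspector labels an identified
# defect as a Positive or Negative example, assigns a qualitative scaling
# factor, and places it in a defective class. Structure is an assumption.
from dataclasses import dataclass

@dataclass
class DefectTrainingRecord:
    image_id: str
    polarity: str         # "positive" or "negative" training example
    scale: str            # relative scaling factor: "low" | "medium" | "large"
    defective_class: str  # "pass" | "reject" | "rework"

training_library = []     # later referenced during defect condition reasoning

def train(image_id, polarity, scale, defective_class):
    record = DefectTrainingRecord(image_id, polarity, scale, defective_class)
    training_library.append(record)
    return record

train("blade_017_roi3", "positive", "medium", "rework")
```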
  • the system can maintain a library of inspection information such as a surface defect database.
  • the library can contain hundreds of different surface conditions.
  • the inspection library can be referenced when performing calculated assessments and intelligent reasoning about the condition of observed defects.
  • an inspection system can support utilities for registering and displaying complete airfoil surface defect maps.
  • the exemplary system can allow an inspector to view a substantially 360 degree surface defect map of inspected airfoil models.
  • the inspection system can register and maintain spatial locations of defects in a traceable quad-tree format, as sketched below.
  • the inspection system can also be capable of displaying historical inspection occurrence maps to allow the inspector to correlate defects with other input factors such as design and manufacturing parameters.
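A sketch of registering defect locations in a quad-tree, which keeps spatial queries over a surface defect map cheap; the node capacity and map bounds are illustrative assumptions.

```python
# Sketch of registering defect locations in a traceable quad-tree.
# Capacity and bounds are illustrative assumptions.

class QuadTree:
    def __init__(self, x, y, w, h, capacity=4):
        self.bounds = (x, y, w, h)   # region covered by this node
        self.capacity = capacity
        self.points = []             # (x, y, defect_id)
        self.children = None

    def insert(self, px, py, defect_id):
        x, y, w, h = self.bounds
        if not (x <= px < x + w and y <= py < y + h):
            return False             # point lies outside this node
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((px, py, defect_id))
                return True
            self._subdivide()
        return any(c.insert(px, py, defect_id) for c in self.children)

    def _subdivide(self):
        x, y, w, h = self.bounds
        hw, hh = w / 2, h / 2
        self.children = [QuadTree(x, y, hw, hh), QuadTree(x + hw, y, hw, hh),
                         QuadTree(x, y + hh, hw, hh),
                         QuadTree(x + hw, y + hh, hw, hh)]
        for p in self.points:        # push existing points down a level
            any(c.insert(*p) for c in self.children)
        self.points = []

defect_map = QuadTree(0, 0, 256, 256)
defect_map.insert(120.5, 40.2, "nick_0042")
```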
  • Figure 20 illustrates a process flow chart of one embodiment of the system software of an inspection process 4900.
  • Inspection process 4900 as shown includes image acquisition 4300 and quality verification 4400 with comparison to a library of images 4800 for inspection.
  • quality verification 4400 can include modifying the image acquired to provide an image with edge strength for example.
  • illumination can be adjusted to produce a specified image quality or the image can be segmented and background images can be removed.
  • Image inspection 4500 can provide an indication whether a component under inspection includes anomalies and irregularities in reference to images from library of images 4800.
  • In condition assessment 4600, analysis of negative and positive imperfection and anomaly detections can be conducted using various techniques including model-based and cognitive reasoning.
  • a report generator 4700 can produce a report regarding the results of the various analysis techniques which can be made available to indicate component quality acceptability.
  • manipulation system 5100 includes a support structure 5120 and two robotic fingers 5130, 5140, each with five degrees of freedom, but other embodiments of the system 5100 can include greater or fewer degrees of freedom.
  • the five degrees of freedom can include, but are not limited to, three degrees of rotational freedom and two degrees of linear freedom.
  • the degrees of rotational freedom provide object positioning capabilities during an object analysis process while the two degrees of linear freedom aid in object capture and alignment.
  • an automated object manipulation system can include a single robotic finger.
  • Robotic fingers or end effectors capable of physically grasping an object with direct force can include various forms of mechanical grippers including parallel jaws, claws, grapples, tongs, multiple fingers, and the like.
  • robotic fingers 5130, 5140 are shown to have a sliding drive 5131, 5141 for opening and closing a set of parallel jaws 5135, 5145.
  • Robotic fingers 5130, 5140 are further shown having a y-axis rotary drive 5150, 5151 which is capable of providing a degree of rotational freedom about the y-axis.
  • L-brackets 5161, 5171 are shown linking robotic fingers 5130, 5140 with z-axis rotary drives 5160, 5170.
  • Z-axis rotary drives 5160, 5170 are capable of providing a degree of rotational freedom about the z-axis.
  • Robotic fingers 5130, 5140 share a circular frame 5181 with an orientation controlled by an x-axis rotary drive 5180 shown mounted below circular frame 5181.
  • X- axis rotary drive 5180 is capable of providing a degree of rotational freedom about the x- axis.
  • Circular frame 5181 is shown housed inside a cavity of support structure 5120. For exemplary purposes, embodiments are described with a right-hand coordinate frame and should not be construed as limiting.
  • object manipulation system 5100 is controlled by a processor 5110.
  • Processor 5110 can contain modules for predetermined object manipulation by the fingers and thereby the fingers are capable of positioning the object in various positions to provide automated object presentation during an analysis.
  • Processor 5110 is represented as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Processor 5110 can also include one or more microprocessors where in one embodiment a single microprocessor can provide the function of each module or separate microprocessors can be used for one or more of the modules. In a further embodiment, processor 5110 can include a data storage module 5111, an instruction module 5112 and a control module 5113. Computerized control can allow preprogrammed and operator initiated control of object manipulation system 5100.
  • Control module 5113 can provide object features and position data from a sensor resulting from an object assessment.
  • the feature and position data can be fed to instruction module 5112.
  • Instruction module 5112 can supply preprogrammed manipulation instructions in response to the feature and position data of the object assessment.
  • the preprogrammed manipulation instructions can be retrieved from data storage module 5111.
  • the feature and position data from the object assessment can be stored in data storage module 5111, as in the sketch below.
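A sketch of the control/instruction/data-storage exchange described above; the module interfaces and the stand-in Sensor and Manipulator classes are illustrative assumptions.

```python
# Sketch of the control / instruction / data-storage exchange. The Sensor
# and Manipulator stand-ins and module interfaces are assumptions.

class DataStorageModule:                       # cf. module 5111
    def __init__(self):
        self.assessments = []
        self.instructions = {"airfoil_type_A": ["rotate_y_90", "rotate_z_180"]}

    def save(self, assessment):
        self.assessments.append(assessment)    # retain feature/position data

class InstructionModule:                       # cf. module 5112
    def __init__(self, storage):
        self.storage = storage

    def instructions_for(self, assessment):
        self.storage.save(assessment)
        return self.storage.instructions.get(assessment["profile"], [])

class Sensor:                                  # stand-in profiling assessment
    def assess(self):
        return {"profile": "airfoil_type_A", "position": (0.0, 0.0, 12.5)}

class Manipulator:                             # stand-in finger assemblies
    def execute(self, step):
        print("executing", step)

def control_loop(sensor, manipulator, instruction_module):  # cf. module 5113
    assessment = sensor.assess()
    for step in instruction_module.instructions_for(assessment):
        manipulator.execute(step)

control_loop(Sensor(), Manipulator(), InstructionModule(DataStorageModule()))
```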
  • a processor can include manipulation instructions which are capable of controlling the position and alignment of an object in relation to a sensor for an object analysis in response to data obtained by a profiling assessment.
  • a profiling assessment of an object can provide identification features to be used to establish a profile and preprogrammed manipulation during the object analysis.
  • a mark or feature can be used to establish a zero reference point.
  • An automated object manipulation system can allow for repeatable analysis on multiple objects utilizing object features and preprogrammed manipulation.
  • a processor can include programming to continuously interpret data received from profiling assessments and object analysis in a repeatable manner, for instance.
  • Object manipulation by finger assemblies of a manipulation system in response to an analysis program from a processor can repeatedly provide data including the positions of an object and indications of features above and below the surface as well as anomalies.
  • the object can be transferred to a second finger assembly in order to continue inspection of another end of the object.
  • One embodiment of the present application can include multiple stepper motors, position encoders, and limit switches to name a few which can be used to controllably position objects with the finger assemblies of a manipulation system.
  • a drive motor and position encoder can be included for each degree of freedom, which in some cases can be five degrees, to allow exposure of the object surfaces to a sensor.
  • a processor can receive input signals from transducers and position encoders associated with each degree of freedom where the signals can be incorporated as part of the object manipulation programming.
  • Figure 22 illustrates an exploded view of a manipulation system 5200 of one embodiment of the present application.
  • a finger assembly 5230 can be installed on a circular frame 5281, mechanically coupling finger assembly 5230 and a rotary drive 5280; the entire circular frame assembly 5201 can then be mounted inside the cavity of a support structure 5220.
  • the circular frame can be integrally manufactured in the support structure and the finger assembly can be installed onto the circular frame as part of the support structure. Assembly parameters and procedures can be determined based on the size and profile of the objects being analyzed, as would be known to one skilled in the art.
  • Figures 23a, 23b, and 23c illustrate three views of one embodiment of an automated object manipulation system 5300.
  • Figure 23a is representative of a front view.
  • Figure 23b is representative of a side view.
  • Figure 23c is representative of a back view.
  • System 5300 can be adjusted to accommodate objects of varying sizes and profiles.
  • Figures 24a and 24b illustrate one embodiment of a finger assembly 5400 showing a C-bracket 5436 with an L-bracket linking structure 5461 and an alignment system 5431 including adjustable parallel jaws 5435a, 5435b for holding an object 5401 securely.
  • Parallel jaws 5435a, 5435b are operated up and down by drive system 5431 to accommodate varying sizes and shapes of objects 5401.
  • the L-bracket linking structure 5461 can be slidingly received with a component that allows the finger assembly 5400 to be adjusted toward or away from the opposing finger assembly (not shown). In this way the finger assemblies can be adjusted to alter the gap between the assemblies.
  • the bracket between the rotary drive and the finger assembly is one example of a device that permits the L-bracket linking structure 5461 to be adjusted.
  • the jaw 5435a can be slidingly adjusted relative to the C-bracket 5436 using a screw depicted at the top of the C-bracket, while the jaw 5435b is adjusted relative to the C-bracket using the drive system 5431.
  • the jaw 5435a is relatively static and the jaw 5435b is usually moveable to capture or release the object 5401.
  • Figure 25 illustrates movement in one embodiment of the parallel jaws on a finger assembly.
  • Finger assembly 5500 shown here includes a support bracket 5536 and two parallel jaws - a first jaw 5537 and a second jaw 5538.
  • Second jaw 5538 height can be adjustable by a drive system 5531, a screw or other such means to accommodate objects 5501 with different dimensions.
  • First jaw 5537 displacement can be controlled by a small scale linear actuator 5534.
  • the mechanism of drive system 5531 and actuator 5534 can be alternated or the displacement adjustment can be controlled or accomplished using other mechanisms known in the art.
  • jaws 5537, 5538 slide up and down a grooved slot 5539 on support bracket 5536 housing parallel jaws 5537, 5538. This arrangement can allow a secure alignment of parallel jaws 5537, 5538 with respect to object 5501 and improve analysis repeatability.
  • first jaw 5537 and second jaw 5538 are adjusted to an open position allowing placement of an object 5501 between jaws 5537, 5538. Jaws 5537, 5538 are then adjusted to a closed position, thereby holding object 5501 for manipulation and analysis.
  • Parallel jaws 5537, 5538 of finger assembly 5500 can include pads 5533 which can be replaceable and/or constructed of a high density polymer material to facilitate a firm and secure grip for the object.
  • a second finger assembly receives conformal padding to accommodate the airfoil blade shape.
  • Figure 26 illustrates movement of a finger assembly 5632 about a z-axis.
  • the right-hand coordinate frame assumed for exemplary purpose includes an origin at a center of a circular frame 5681 of an analysis system 5600.
  • the z-axis points upward and aligns with the axis of rotation of a motor drive system 5660.
  • This embodiment can provide substantially full angular rotation around the z-axis and finger assembly 5632 can alter the position relative to another finger assembly 5633.
  • the z-axis motion can facilitate analysis of relative bottom and top sections of an object as well as but not limited to components with intricate fillets, orifices, and labels (e.g., part number or serial number) engraved or embossed on a relative bottom surface of an object.
  • finger assembly 5632 is rotated partially around the z-axis by motor drive system 5660
  • finger assembly 5632 is rotated around the z-axis by motor drive system 5660 approximately 90° exposing a relative bottom surface 5602 of object 5601.
  • finger assembly 5632 is returned to the original position.
  • Figure 27 illustrates movement of a finger assembly 5732 about the x-axis.
  • a rotary drive system 5780 is employed to achieve x-axis motion of finger assembly 5732 in an automated object manipulation system 5700.
  • Rotary drive system 5780 can be mounted below a circular frame 5781.
  • circular frame 5781 is mechanically coupled to finger assembly 5732 and rotary drive 5780.
  • rotary drive system 5780 causes circular frame 5781 to rotate along a track.
  • Circular frame 5781 can be capable of rotating in related clockwise and counterclockwise directions.
  • circular frame 5781 can be capable of rotating a substantially complete 360° during object manipulation.
  • a circular frame can have a tooth-belt acting as a rack on its circumferential wall. The pinion on the rotary drive engages with the outer rack of the circular frame to facilitate the x-axis motion of a finger assembly.
  • Figure 28 illustrates movement of a finger assembly 5800 about the y-axis.
  • a y-axis motor drive 5850 rotates a support bracket 5836 holding an object 5801 in a set of parallel jaws 5837, 5838. Substantially full 360-degree rotation of an object about the y-axis is achievable, as the sequence of y-axis motions shown in the series of illustrations in Figure 28 demonstrates. A sketch composing the three rotary degrees of freedom follows.
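The sketch below composes the three rotational degrees of freedom (x-, y-, and z-axis drives) into one object orientation using standard rotation matrices in the assumed right-hand frame; the drive ordering chosen here (x-frame, then z-drive, then y-drive) is an assumption.

```python
# Sketch of composing the three rotary drives into one object orientation
# using standard rotation matrices. The composition order is an assumption.
import math

def rot(axis, deg):
    c, s = math.cos(math.radians(deg)), math.sin(math.radians(deg))
    if axis == "x":
        return [[1, 0, 0], [0, c, -s], [0, s, c]]
    if axis == "y":
        return [[c, 0, s], [0, 1, 0], [-s, 0, c]]
    return [[c, -s, 0], [s, c, 0], [0, 0, 1]]      # z-axis

def matmul(a, b):
    return [[sum(a[i][k] * b[k][j] for k in range(3)) for j in range(3)]
            for i in range(3)]

# Circular frame (x-drive), then z-drive, then y-drive of the finger.
pose = matmul(rot("x", 30), matmul(rot("z", 90), rot("y", 45)))
print(pose)  # 3x3 orientation matrix for the presented object
```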
  • a linking bracket 5861 of finger assembly 5800 can be adjustable. By adjusting the spacing of linking brackets 5861 in one embodiment, finger assembly 5800 can accommodate objects of differing dimensions.
  • the x-axis can point towards a sensor system (not shown).
  • the sensor can be positioned to operate along another axis.
  • the sensor system can include various equipment such as illumination and imaging devices. These devices can operate with the generation/detection of electromagnetic radiation, visible light, x-ray, ultraviolet and the like.
  • a sensor can also be based on sound or physical detection.
  • the sensor can be located a fixed distance from the object manipulator along the symmetric axis of the circular frame.
  • the sensor system can be applied for object assessment and/or analysis of an object presented by an object manipulation system.
  • An automated object manipulation system of the present application can include an automated analysis apparatus capable of improving the accuracy of repeated object manipulation with various components.
  • an automated object manipulation system can include a mechanical object manipulation support means having five degrees of freedom for supporting, aligning and positioning objects in proximity of an analysis tool.
  • an automated object manipulation system can adjust the location of an object to a position and orientation allowing analysis to be performed repeatedly and reliably.
  • an object illumination system 6100 of one embodiment of the present application having a set of concentric fluorescent light bulbs 6110, a diffusion shield 6120 positioned radially inwards of concentric fluorescent light bulbs 6110, and a cylindrical illumination harvesting shield 6130 positioned radially outward from concentric fluorescent light bulbs 6110.
  • an imaging system 6140 is shown located to capture an image of a component 6101 positioned within diffusion shield 6120 by a manipulation system 6160.
  • Concentric fluorescent light bulbs 6110 can serve as the light source for object illumination system 6100.
  • concentric fluorescent light bulbs 6110 can include a single light or illumination source.
  • concentric fluorescent light bulbs 6110 can include partial rings or configurations positioned in concentric circles or related arrangements. Illumination from concentric fluorescent light bulbs 6110 can include fluorescent, incandescent, LED or other illumination as known in the art.
  • fluorescent light bulbs/sources used in any given embodiment can be replaced individually and/or collectively in other embodiments with other light bulbs/sources such as incandescent, LED, etc. Configurations and shapes other than rings or circles can, in some embodiments, be used for concentric fluorescent light bulbs 6110 to provide illumination to the portions of a component presented to the imaging system.
  • Illumination arrangements can be determined based on component parameters.
  • Other forms of radiance can be applied where diffused light can illuminate an object and produce an image having a relatively balanced intensity of radiance.
  • multiple light sources within a single system can provide a variation of illumination arrangements which can be specified based on component parameters such as material, shape, and size, to name a few. Further, longitudinal adjustment of the distance between the light source and the position of the component can allow control of the extent of illumination intensity provided to the component, as in the sketch below.
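A sketch of the longitudinal-distance intensity control, assuming an approximately inverse-square falloff for an unshielded source (the diffusion and harvesting shields described herein would modify this in practice); the constants are illustrative.

```python
# Sketch of controlling illumination intensity at the component by adjusting
# source-to-component distance, assuming approximate inverse-square falloff
# for an unshielded source. Constants are illustrative assumptions.

def intensity_at(distance_mm, source_power=1.0):
    """Approximate relative radiant intensity at the component."""
    return source_power / max(distance_mm, 1.0) ** 2

def distance_for(target_intensity, source_power=1.0):
    """Longitudinal distance that yields the requested intensity."""
    return (source_power / target_intensity) ** 0.5

# Halving the distance roughly quadruples illumination on the component.
print(intensity_at(200.0) * 4 == intensity_at(100.0))  # True
```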
  • diffusion shield 6120 can have a truncated-type cone shape.
  • Concentric fluorescent light bulbs 6110 can be positioned radially outwards of a small radius base 6121 of cone-shaped diffusion shield 6120.
  • the truncated cone-shape provides an opening for placing the detection portion of imaging system 6140 at small radius base 6121 of diffusion shield 6120.
  • a background 6150 is positioned at a large radius base 6122 of cone-shaped diffusion shield 6120. In one form the background 6150 can be green.
  • Diffusion shield 6120 can be composed of a material which allows penetration of bright illumination from a light source external to diffusion shield 6120 to diffuse to the inside of a conic tunnel 6170 of diffusion shield 6120 where component 6101 can be positioned for imaging purposes.
  • the diffusion shield can be a plastic polymer material.
  • Diffusion shield 6120 is structured to diffuse light from concentric fluorescent light bulbs 6110 as the light passes through diffusion shield 6120 to illuminate the component.
  • a cylindrical illumination harvesting shield 6130 is structured to redirect at least a portion of the light from concentric fluorescent light bulbs 6110 to diffusion shield 6120.
  • Cylindrical illumination harvesting shield 6130 is shown in this embodiment with a cylindrical shape.
  • the shape of a harvesting shield can be structured to accommodate various selections of illumination arrangements, diffusion shields, and components, for example.
  • a harvesting shield can be permanently placed in the system or can be removable to provide flexibility in an illumination system.
  • the harvesting shield can be composed of various materials. In some embodiments, the material can be selected to provide a degree of redirection or reflectivity.
  • Illumination harvesting shield 6130 can be used to intensify and redirect ambient light through diffusion shield 6120 to illuminate component 6101.
  • imaging system 6140 can further include a camera utilizing conventional light or another electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like.
  • images taken of an object by imaging system 6140 can be utilized for conducting an analysis of the component.
  • the analysis can include component identification, defect determination, quality assessments and the like, some or all of which are associated with an inspection regime.
  • the embodiments disclosed herein can be used with an inspection regime that involves evaluating gas turbine engines and associated components, as will be appreciated from other aspects of the instant application.
  • Such components can include vanes, blades, etc. of the gas turbine engine.
  • An illumination system of one embodiment is structured to place an object at one end of an illumination tunnel created by a generally cone-shaped diffusion shield.
  • An illumination source produces illumination which is diffused as it passes through the diffusion shield. The diffused light then illuminates the object.
  • the object being illuminated can be positioned near a large radius base of the diffusion shield with a manipulation system.
  • the manipulation system can include a robotic system, such as a multi-axis robotic system.
  • an imaging system such as a camera is capable of capturing perspective images of the object as the object is manipulated by the multi-axis robotic system.
  • the imaging system can be attached at a small radius base of the cone-shaped diffusion shield. The imaging system is capable of capturing images of the object for further analysis.
  • a multi-axis robotic system is capable of performing a series of controlled motions to expose different sections of a component to an imaging system producing at least one image of the component within a diffusion system.
  • the diffusion system can include a diffusion shield and a harvesting shield.
  • the diffusion shield can be structured to diffuse illumination from an illumination arrangement as the illumination passes through the diffusion shield.
  • the harvesting shield can be structured to redirect at least a portion of the illumination from the illumination arrangement toward the diffusion shield.
  • the extent of illumination intensity reaching the component can be controlled.
  • the component can be exposed to diffused lighting without any direct projection of a light source on the component, and hence a well-balanced intensity of illumination can be achieved.
  • an imaging system can be capable of acquiring images of a light reflecting component without irregularities from reflections.
  • an external illumination harvesting shield can be introduced to change the intensity and uniformity of supplied ambient illumination to an object.
  • Figure 30 is a cross-sectional schematic diagram illustrating the mechanics of an illumination harvesting shield 6230 of one embodiment of the present application.
  • Illumination harvesting shield 6230 allows illumination produced from a fluorescent light source 6210 to be redirected toward a cone-shaped diffusion shield 6220 where an object 6201 within the cone-shaped diffusion shield 6220 can be exposed to an imaging system 6240.
  • fluorescent light source 6210 is shown as a first ring fluorescent light bulb 6211 and a second ring fluorescent light bulb 6212.
  • Fluorescent light source 6210 can have other shapes and configurations designed to meet the requirements of an inspection system including the object being inspected.
  • Illumination harvesting shield 6230 can be used to intensify and unify the ambient radiance from fluorescent light bulbs 6211, 6212 which can diffuse through cone-shaped diffusion shield 6220 to illuminate object 6201 positioned within the conic tunnel of cone-shaped diffusion shield 6220.
  • Figure 31 demonstrates an illumination system 6300 having a cone-shaped diffusion shield 6320 with a small radius base 6321 and a large radius base 6322.
  • Illumination system 6300 is also shown with a manipulation system 6360 to position a component (not shown) within diffusion shield 6320 to expose the component to an imaging system 6340.
  • Fluorescent light bulbs are present but not visible in Figure 31, on the opposing side of an illumination support 6380.
  • An illumination harvesting shield can be used in the embodiment of Figure 31 even though such a shield is not currently depicted.
  • An embodiment with a manipulation system 6360 can include a robotic part manipulator and positioning algorithms to provide predetermined part presentation and positioning during an inspection process.
  • Manipulator system 6360 is capable of presenting a component in at least one position for imaging system 6340.
  • Manipulation system 6360 can include various forms of component positioning equipment.
  • the manipulation system provides a single stationary position for the component.
  • FIG 32 illustrates one embodiment of an illumination process 6400 utilizing an illumination system.
  • Operation 6401 is shown initiating illumination process 6400 and includes providing a source of illumination or light 6410.
  • the source of light 6410 can be from a set of concentric fluorescent light bulbs.
  • Illumination process 6400 continues with operation 6402 which creates a diffused light.
  • Operation 6402 utilizes a diffusion system 6425.
  • Diffusion system 6425 can include a diffusion shield 6420 and a harvesting shield 6430.
  • Diffusion shield 6420 can be structured to diffuse illumination from the source of illumination as the illumination passes through the diffusion shield.
  • Harvesting shield 6430 can be structured to redirect at least a portion of the illumination from the source of illumination toward the diffusion shield.
  • the diffused light produced in operation 6402 is provided in order to illuminate an object in operation 6403.
  • An imaging system 6440 is then able to create an image of the object illuminated by the diffused light from operation 6403 in operation 6405.
  • Imaging system 6440 can include a camera and image analysis.
  • operation 6404 can vary the position of the illuminated object during operation 6403 and 6405.
  • Operation 6404 can utilize an automated manipulation system 6460.
  • any of the embodiments disclosed in one or more figures above can be used in the others as will be appreciated by those in the art.
  • any of the positioning systems disclosed above can be used in any of the various other embodiments, just as can any of the imaging systems, databases, controllers, protocol development tools, fuzzy logic analysis, report generators, etc.

Abstract

Devices and systems used to inspect and evaluate components. Some embodiments include an imaging system, positioning system, and controller. Robotic manipulation can be provided to manipulate the component. Some embodiments include an illumination system. A graphical user interface can be used to develop and/or run inspection protocols. A fuzzy logic analysis can be coupled with one or more devices/systems to provide evaluation of an inspection.

Description

INTELLIGENT AIRFOIL COMPONENT SURFACE IMAGING INSPECTION
CROSS REFERENCE TO RELATED APPLICATIONS
The present application claims the benefit of U.S. Provisional Patent Application 61/451,005, filed March 9, 2011, U.S. Provisional Patent Application 61/451,038, filed March 9, 2011, U.S. Provisional Patent Application 61/451,036, filed March 9, 2011, U.S. Provisional Patent Application 61/451,035, U.S. Provisional Patent Application 61/450,973, filed March 9, 2011, U.S. Provisional Patent Application 61/450,963, filed March 9, 2011, each of which is incorporated herein by reference.
TECHNICAL FIELD
The present invention generally relates to one or more aspects of systems useful for inspection and evaluation, such as, but not limited to, automated surface inspection and evaluation using fuzzy logic analysis, automated grain structure characterization process including fuzzy logic analysis, automated surface inspection process including fuzzy logic analysis, protocol-based inspection system, automated object analysis manipulators, and continuous diffuse illumination system.
BACKGROUND
Present approaches to inspection and evaluation suffer from a variety of drawbacks, limitations, disadvantages and problems including those respecting efficiency, repeatability and others. There is a need for the unique and inventive surface imaging inspection apparatuses, systems and methods disclosed herein.
SUMMARY
One embodiment of the present invention is a unique surface imaging inspection process. Other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a surface imaging inspection process utilizing fuzzy logic analysis. Still another embodiment of the present invention is a unique grain structure
characterization process. Other embodiments also include apparatuses, systems, devices, hardware, methods, and combinations for an automated grain structure characterization process including fuzzy logic analysis. Another embodiment of the present invention is a unique automated surface inspection process. Still other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for an automated surface inspection process including fuzzy logic analysis. A further embodiment of the present invention is a unique protocol-based inspection system. Still further
embodiments include apparatuses, systems, devices, hardware, methods, and
combinations for a protocol-based inspection. Yet another embodiment of the present invention is a unique automated object manipulation system. Yet still other embodiments include apparatuses, systems, devices, hardware, methods, and combinations for an automated object manipulation system. A still further embodiment of the present invention is a unique object illumination system. Still further embodiments include apparatuses, systems, devices, hardware, methods, and combinations for a continuous diffuse illumination system for an inspection system. Further embodiments, forms, features, aspects, benefits, and advantages of the present application shall become apparent from the description and figures provided herewith.
BRIEF DESCRIPTION OF THE FIGURES
Figure 1 is an illustration of one embodiment of a surface imaging inspection system.
Figure 2 is a flow diagram of one embodiment of the present application.
Figure 3 is a flow diagram of one embodiment of an image acquisition module from Figure 2.
Figure 4 is a flow diagram of one embodiment of an image registration module from Figure 2.
Figure 5 is a flow diagram of one embodiment of an inspection module from Figure 2.
Figure 6 is a flow diagram of one embodiment of a condition assessment module from Figure 2.
Figure 6a is a flow diagram of one embodiment of a fuzzy logic analysis module.
Figure 7 is a flow diagram of one embodiment of a reporting module from Figure 2.
Figure 8 is a flow diagram of one embodiment of an airfoil library module from Figure 5.
Figure 9 is an illustration of one embodiment of a grain structure characterization system.
Figure 10 is a flow diagram of one embodiment of a grain structure characterization process.
Figure 11 is a flow diagram of a characterization process from Figure 2.
Figure 12 is a diagram illustrating one embodiment of a characterization system of the present application.
Figure 13 is an illustration of one embodiment of a surface inspection system.
Figure 14 is a flow diagram of one embodiment of an inspection process.
Figure 15 is a flow diagram of a process from Figure 14.
Figure 16 is a schematic of an embodiment of an inspection system of the present application.
Figure 17 is an illustration of a graphical user interface of an embodiment of an inspection system of the present application.
Figure 18 is a process flow diagram of an embodiment of an inspection process of the present application.
Figure 19 is a schematic diagram of an embodiment of a component of an inspection system.
Figure 20 is a process flow diagram of embodiment software of an inspection system of the present application.
Figure 21 is an illustration of an embodiment of an object manipulation system.
Figure 22 is an exploded view of an embodiment of an object manipulation system.
Figure 23a is a front view illustration of an embodiment of an object manipulation system.
Figure 23b is a side view illustration of an embodiment of an object manipulation system.
Figure 23c is a back view illustration of an embodiment of an object manipulation system.
Figures 24a and 24b are illustrations of an embodiment of an object manipulation system.
Figure 25 is an illustration of a portion of an embodiment of an object manipulation system.
Figure 26 is an illustration demonstrating movement of a portion of an embodiment of an object manipulation system.
Figure 27 is an illustration of one degree of freedom of an embodiment of an object manipulation system.
Figure 28 is an illustration of another degree of freedom of an embodiment of an object manipulation system.
Figure 29 is an illustration of an embodiment of an illumination system of the present application.
Figure 30 is a diagram of an embodiment of the present application.
Figure 31 is an arrangement of components for an embodiment of the present application.
Figure 32 is a process flow diagram of an embodiment of the present application.
DETAILED DESCRIPTION OF THE ILLUSTRATIVE EMBODIMENTS
For the purposes of promoting an understanding of the principles of the invention, reference will now be made to the embodiments illustrated in the drawings and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the invention is thereby intended. Any alterations and further modifications in the described embodiments, and any further applications of the principles of the invention as described herein are contemplated as would normally occur to one skilled in the art to which the invention relates.
With reference to Figure 1, an illustration is shown for a surface imaging inspection system 1100 representing an embodiment of the present application including an automated imaging process, algorithms, sensors, robotic positioning and other analysis to locate, evaluate and report surface images. Surface imaging inspection system 1100 is shown to include an inspection assembly 1120 and a controller 1130.
Inspection assembly 1120, as shown in the embodiment of Figure 1, includes a positioning system 1124 and an imaging system 1126. Positioning system 1124 of this embodiment operates with a part presentation technique based on an algorithm for manipulating a part 1122 in an efficient manner with minimum hunting for part surfaces and anomalies. Embodiments of positioning system 1124 can include a robotic part manipulator. Robotic part manipulation can provide consistent positioning of part 1122 during the inspection process which can reduce variation and improve efficiency of the inspection process.
In some embodiments, part manipulation can include presenting the part to a detection device such as a camera. In one particular embodiment, while positioning system 1124 utilizes lighting and image acquisition positions, imaging system 1126 acquires images used to identify the type of part 1122 being inspected during a registration process. From the registration process, a positioning algorithm is selected to provide predetermined part manipulation during further imaging processes.
Controller 1130 of surface imaging inspection system 1100 is shown
schematically in the exemplary embodiment of Figure 1 as a single component containing modules capable of performing various functions. Each function can be located on the same or separate pieces of hardware and can be one of several hardware varieties available and arrangable by one skilled in the art. Controller 1130 can also include one or more microprocessors where a single microprocessor provides the functions of each module or separate microprocessors are used for one or more of the control modules.
Controller 1130 as shown is capable of operating an image data processing system 1132 and a robotic manipulation module 1138. Robotic manipulation module 1138 is shown in Figure 1 as part of controller 1130. Robotic manipulation module 1138 can be part of the positioning equipment in positioning system 1124 as a single system or as separate components. For one specific embodiment, robotic manipulation module 1138 is capable of providing a positioning algorithm, a component type recognition database and a set of predetermined part manipulation instructions to surface imaging inspection system 1100.
Image data processing system 1132 can include an analyzer module 1134 and an imaging module 1136. In one embodiment, imaging module 1136 can include a controlled electromagnetic radiation configuration with a radiation media generator and a radiation detector whether the detector is physical or digital. The radiation media can include visible light, radio waves, microwaves, infrared radiation, ultraviolet light, x-rays and gamma rays to name a few. The intensity of the emitted radiation media can be adjusted to ensure adequate imaging. The type of radiation media can be selected based on criteria such as but not limited to, equipment availability, component sensitivity, material, estimated defect characteristics and the like.
In one specific embodiment, surface imaging inspection system 1100 can utilize a visible light generator with an optical camera for imaging system 1126 to produce images of a component as well as produce images of a surface or multiple surfaces of the component. Imaging module 1136 is then able to analyze the produced image(s) for surface features. In another embodiment, imaging module 1136 can interface with imaging system 1126 providing equipment controls as an alternative to controls provided directly with the imaging equipment or from another source.
Surface features indicated by imaging module 1136 of surface imaging inspection system 1100 can include but are not limited to cracks, porosity, damage, curvature, dimensions and the like. In some embodiments, the component being analyzed can include a single crystal, a directionally solidified, and/or an equiaxed microstructure. In a further embodiment, the component can include an airfoil component of a gas turbine engine. One embodiment operates to mechanically locate, evaluate, and report surface features on families of airfoil type components. Another embodiment of the present system generates a report of the sizes, locations and types of features on the surface of the component in tabular or graphical form.
Using one embodiment from the present application, the process variation for evaluating surface images can be reduced by automating the detection and evaluation of surface features and the application of pass/fail criteria using an analyzer module 1134. In one form analyzer module 1134 can be a fuzzy logic analyzer module capable of providing analysis of the image data sets from imaging system 1126. As will be discussed further herein, fuzzy logic can be used in surface imaging inspection system 1100 to deal with fuzzy concepts: concepts that cannot be expressed as "true" or "false" but rather as "partial truths." Fuzzy logic analysis allows an automated inspection to access a deposit of component images 1140 or a knowledge bank to apply cognitive characterization of features and provide a level of consistency to determine a pass/fail status according to a component specification.
Another embodiment of the present application applies a lighting configuration, a part presentation technique, and a fuzzy logic based image processing technique for identifying surface features in a single crystal cast airfoil component. Yet another embodiment includes an algorithm for manipulating a part with respect to lighting and camera positions in an efficient manner with minimum hunting for a subject and a fuzzy logic based image processing algorithm to identify surface features which indicate a surface defect.
One embodiment of the present application is shown in Figure 2 with a flow diagram of inspection process 1200. This embodiment shows inspection process 1200 to include five modules: an image acquisition module 1300, an image registration module 1400, an inspection module 1500, a condition assessment module 1600 and a reporting module 1700. A particular embodiment of the present application can transition between modules and some aspects may or may not be evident in each embodiment.
Inspection process 1200 is shown in this embodiment to begin with module 1300. Image acquisition module 1300 is shown with further detail for one embodiment in Figure 3. Image acquisition module 1300 of this exemplary embodiment begins by acquiring an image in operation 1310 with a detection device, camera or electronic image sensor, for example. Various methods are available for acquiring an image such as but not limited to multi-spectral imaging, single shot, multi-shot or scanning image capture. In various embodiments, image acquisition of module 1300 can include processing, compression and storage before the data is further processed in image acquisition module 1300 or inspection process 1200.
Modules which can be a part of the acquisition process to improve the quality of the data created during the acquisition process of module 1300 can include image quality verification in conditional 1320, image illumination adjustment in operation 1330, and background image removal in operation 1340. Conditional 1320 verifies the image quality. Image quality verification in conditional 1320 can include a comparison with an image from a deposit of component images 1800. If the image quality cannot be verified, image acquisition module 1300 returns to operation 1310 to acquire another image. Once an image can be verified, image acquisition module 1300 moves to operation 1330. In operation 1330, the illumination can be adjusted, for example, to improve contrast or to increase or reduce illumination or glare. Operation 1330 is shown as preceding operation 1340 where a background image(s) can be removed. In one instance, background image removal could provide an image with fewer variations to evaluate or compare.
Following image acquisition module 1300 in inspection process 1200 as shown in Figure 2, image registration module 1400 is shown with further detail from one embodiment in Figure 4 and can begin with operation 1420 including image registration. Image registration of operation 1420 takes the image from image acquisition module 1300 and identifies a set of predetermined features. The part being tested can be identified by component type from the set of predetermined features, and further processing routines such as inspection part manipulation can be set up according to the identified component type. The part identification in module 1400 can be based on a comparison between a collected image or images of the part being inspected and a database of part responses. In one embodiment of the present application, once the component type has been identified, the system provides a predetermined manipulation algorithm for part presentation during an imaging process. A part with multiple surfaces suitable for inspection can be automatically rotated and positioned according to a predetermined manipulation algorithm allowing for consistent multiple image acquisitions. For example, a cylindrical component can require rotation about its axis to expose a substantially complete exterior surface. A predetermined manipulation algorithm determined by the system can then include manipulation instructions for positioning the cylinder surface perpendicular to the angle of incidence and rotating the component through 360° while maintaining the perpendicular orientation.
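As a hedged illustration of such a manipulation algorithm, the short Python sketch below generates evenly spaced rotation waypoints for a cylindrical part; the function name, view count, and camera standoff are assumptions chosen for demonstration, not part of the disclosure.

```python
import math

def cylinder_pose_waypoints(num_views: int = 12, standoff_mm: float = 25.0):
    """Hypothetical helper: generate rotation waypoints that keep a
    cylinder's viewed surface patch perpendicular to a fixed camera.

    Each waypoint is (rotation_angle_deg, camera_standoff_mm); the part
    rotates about its axis while the standoff stays constant, so the
    perpendicular orientation is maintained for every acquisition.
    """
    step = 360.0 / num_views
    return [(i * step, standoff_mm) for i in range(num_views)]

# Example: 12 images at 30-degree increments cover the full exterior.
for angle, standoff in cylinder_pose_waypoints():
    print(f"rotate part to {angle:5.1f} deg, acquire image at {standoff} mm")
```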
In another embodiment, image registration module 1400 can also contain a macro component quality review with conditional 1430 where an image registration can be verified. Conditional 1430 determines whether or not the sample image was properly identified by component type. A part without proper features can have an image with missing or non-corresponding form. The missing or non-corresponding form can be an indication of a non-conforming part which requires repair or rejection. In a further embodiment, a part which has been identified to have non-conforming areas in conditional 1430 can be evaluated under operation 1460 to determine if the part should be rejected or repaired based on a degree of structural non-conformity. A report can be generated in operation 1470 and a human inspector can be contacted in operation 1480 for confirmation. A part can be labeled as a rejected part in conditional 1430 following image registration in operation 1420.
In one exemplary embodiment, an airfoil casting with incomplete mold fill would not have an image comparable to a standard image but would show a lack of material with a non-corresponding form. Image registration 1420 can identify an airfoil component using an outline image, but registration verification 1430 can identify a missing portion of the image. The degree of deformity and location of the deformity can be factors in deciding whether the part is rejected or repaired in operation 1460. In another non-limiting example, a casting with a uniform surface can have a profile comparable to a standard image in image registration 1420. Registration verification 1430 can determine that no macro deformities are present by comparing the acquired image with a standard image, verifying the part is ready for inspection.
Once a part is registered in module 1400, inspection process 1200 can continue with inspection of the part in inspection module 1500. Inspection module 1500 is shown with further detail for one embodiment in Figure 5. For inspection of a part, module 1500 can be capable of presenting the part with a predetermined manipulation algorithm based on the automated recognition and registration in module 1400. As the part is presented, operation 1510 includes anomaly detection. Anomaly detection can include collecting surface images as the part is manipulated to pre-determined positions. Surface images can be generated and captured by processes such as but not limited to reflected light, negative light, luminescence, fluorescence, x-ray and the like. In a specific embodiment, a surface with grooves would reflect radiation media in planes related to the grooves; radiation media detected outside the expected planes can indicate undesirable groove geometries or anomalies.
Collected surface images are evaluated in operation 1520 for detecting potential irregularities. In one embodiment, evaluation in operation 1520 can include an image data processing module of a controller which analyzes surface images for variations outside a standard with model-based irregularity detection. Variations can include a low or excessive reflective response, low or excessive negative light, no detected luminescence, or uncharacteristic diffraction patterns, depending on the type of surface image generation applied and the method for capturing the image. In another embodiment, irregularities are determined when collected images are compared with an airfoil surface imperfection detection (ASID) technique data fusion 1530 or component inspection requirements 1540. Both ASID data fusion 1530 and inspection requirements 1540 can operate with access to deposit of component images 1800. Deposit of component images 1800 can provide standard data sets for comparison.
The image data collected during module 1500 can be provided to module 1600 which applies a fuzzy logic analysis of the sensed image(s). Condition assessment module 1600 is shown with further detail from one embodiment in Figure 6. Module 1600 is shown to begin with conditional 1610 where an initial determination of part quality can result in parts with no indications of anomalies moving to a final report in module 1700. Parts with identified anomalies move on for further assessment in module 1600 including an automated anomaly characterization in operation 1630. Operation 1630 accesses a knowledge bank 1640 to apply cognitive characterization of anomalies or features indicated by the image processing. Operation 1630 references inspection guidelines 1620. Inspection guidelines 1620 can result from design specifications, industry standards, or other sources. Guidelines can include specific criteria for smoothness, dimensional stability, porosity, density, chemical composition as well as physical characteristics such as cracks, damage, and other defects. With input from operation 1630, a fuzzy logic analysis is performed in module 1650. Fuzzy logic analysis and cognitive characterization in module 1600 provide an objective ability to determine a consistent pass/fail status for parts being inspected.
An automated image processing method in an embodiment of the present application can include fuzzy logic analysis to enable the system to use an analysis tool with appropriate processing times for part inspection. A fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules. An example rule in a fuzzy logic system can be of the form:
If x is low and y is high, then z is low,
where x and y are input variables, z is an output variable, "low" is a membership function defined on x and z, and "high" is a membership function defined on y.
The rule's premise describes the degree to which the rule applies, while the rule's consequent assigns a membership function to the output variable(s), where the set of rules in a fuzzy logic analysis system is known as the rule base or knowledge base. Data processing in a fuzzy logic analysis system of an embodiment of the present application can include four high level steps that correspond roughly to an input stage, a processing stage, a compilation stage and an output stage.
Because fuzzy logic is a mathematical model for addressing inherently imprecise data, a fuzzy logic analysis can be applied to the present application. Fuzzy logic provides a mathematical model of the vagueness found in non-precise measurements allowing automated determinations regarding component analysis such as surface imaging.
Figure 6a shows four operations that can be part of fuzzy logic algorithm 1650 which are input 1651, processing 1652, compilation 1653 and output 1654. These operations can be described in slightly differing terms and can be combined, expanded or omitted based on the way the fuzzy logic analysis is described without changing the meaning or intent of using fuzzy logic in this embodiment of the present application.
1. Input Stage 1651 - Fuzzification: The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise. The input variables in a fuzzy control system are in general mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. All the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
2. Processing stage 1652 - Inference: The truth value for the premise of each rule can be computed and applied to its consequent. This computation results in one fuzzy subset being assigned to each output variable. The computation result in turn can be mapped into a membership function and truth value controlling the output variable.
3. Compilation stage 1653 - Composition: All of the fuzzy subsets assigned to each output variable can be combined together to form a single fuzzy output subset for each output variable.
4. Output stage 1654 - Defuzzification: The fuzzy output subset for each output variable can be convertible to a unique solution or a 'crisp' answer.
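As a non-limiting illustration of these four stages, the following Python sketch implements a minimal Mamdani-style pipeline for a single output variable. The input names (contrast, edge deviation), the membership shapes, and the two rules are assumptions chosen for demonstration; they are not the rule base of fuzzy logic algorithm 1650.

```python
import numpy as np

def tri(x, a, b, c):
    """Triangular membership function: 0 at a, rising to 1 at b, falling to 0 at c."""
    return np.maximum(np.minimum((x - a) / (b - a), (c - x) / (c - b)), 0.0)

def fuzzy_conformity(contrast, edge_deviation):
    """Minimal Mamdani-style pipeline mirroring stages 1651-1654; all
    variable ranges and rules are illustrative assumptions."""
    z = np.linspace(0.0, 1.0, 201)  # output universe: conformity score

    # 1. Fuzzification (1651): map crisp inputs to membership degrees.
    contrast_high  = tri(contrast, 0.5, 1.0, 1.5)
    contrast_low   = tri(contrast, -0.5, 0.0, 0.5)
    deviation_low  = tri(edge_deviation, -0.5, 0.0, 0.5)
    deviation_high = tri(edge_deviation, 0.5, 1.0, 1.5)

    # 2. Inference (1652): each rule premise clips its consequent subset,
    #    e.g. "if contrast is high and deviation is low, conformity is high".
    rule1 = np.minimum(min(contrast_high, deviation_low), tri(z, 0.5, 1.0, 1.5))
    rule2 = np.minimum(min(contrast_low, deviation_high), tri(z, -0.5, 0.0, 0.5))

    # 3. Composition (1653): combine clipped subsets into one output subset.
    aggregate = np.maximum(rule1, rule2)

    # 4. Defuzzification (1654): centroid converts the subset to a crisp answer.
    return float((z * aggregate).sum() / aggregate.sum()) if aggregate.sum() else 0.5

print(fuzzy_conformity(contrast=0.9, edge_deviation=0.1))  # near 1: conforming
print(fuzzy_conformity(contrast=0.2, edge_deviation=0.8))  # near 0: anomalous
```

The same skeleton extends to more inputs and rules by adding membership functions and taking the maximum over all clipped consequents.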
In an exemplary embodiment, a component image is acquired and a data set is created. Module 1650 compares the image data set to a set of rules assigning a degree of conformity to the data set. The degree of conformity is a representation of the amount of variation between the component image and an image from a deposit of images or set of guidelines. The degree of conformity can be representative of other levels of comparison in other embodiments. Continuing with this embodiment, the degree of conformity is compiled to produce an output data set related to position and level of conformity. The output data set is compared to data sets in the knowledge bank to determine whether the output data sets are consistent with anomalies. Output data sets consistent with anomalies provide an indication of the anomalies present in the component. Automated review of the conformity data set in this exemplary embodiment is capable of reducing variation found in surface indication detection.
Upon completion of fuzzy logic analysis in operation 1650, a part can move on to a final report in module 1700 following a passing result from operation 1650 or a part can move on to a failure report generation in operation 1670 following a failing result from operation 1650. Additionally, a part labeled as rejected can be provided for human inspection in cases where operation 1650 provides an uncertain result.
Reporting module 1700 can follow the fuzzy logic analysis in operation 1650. Reporting module 1700 can also follow other operations which end inspection such as rejected components in module 1400. In this embodiment, a reporting module 1700 is shown with further detail from one embodiment in Figure 7. Reporting module 1700 can include conditional 1710 which reviews whether the inspection is complete. In this exemplary embodiment, an incomplete inspection returns to inspection process 1200 to conduct further inspections. A particular example of an inspection can include variations such as multiple planar surfaces or a single surface divided into several inspection tasks.
Once conditional 1710 determines the inspection is complete, operation 1730 can produce a report. The report from operation 1730 can be in tabular or graphical form intended to communicate the location and degree of deviation for the indicated features. In the embodiment shown in Figure 7, module 1700 can provide a report regarding the features from conditional 1430 and the results of the fuzzy logic analysis in operation 1650. In further embodiments, reporting module 1700 can include identifying the surface features indicated by the imaging process, allowing the automatic detection of features on the surface of the part, and applying accept/reject criteria which utilize the results from fuzzy logic algorithm 1650. The features can be cracks, pores, damage, missing material and combinations thereof.
In one embodiment, a deposit of component images module 1800 is accessed during multiple operations such as those found in conditional 1320, operation 1530, operation 1540 and operation 1630. Surface indication detection databases can be populated with data sets from components with known characteristics including conforming surfaces, non-conforming surfaces, defects, imperfections and the like. Data sets can also be generated through theoretical data from design applications including for example CAD and simulation software.
In the embodiment shown in Figure 8, deposit of component images module 1800 can include a set of component poses 1810 for reference. For example, a negative skeleton model 1840 of a part under inspection is produced with edge-strengthened formation, and the sensed image is then compared with a norm reference 1840a. Norm reference 1840a is analyzed with generalized reference model features in operation 1850. Context-based adaptive surface irregularity detection parameter tuning can be applied in operation 1860. Parameter tuning 1860 can include ensuring selected features are properly detected with the applied enhancements and related detection system.
The skeleton model from operation 1840 can be further analyzed with ASID techniques such as but not limited to zero-crossing, constant false alarm rates (CFAR), salient-points, neural networks and the like to provide ASID data fusion in operation 1530 for irregularity detection in module 1500. Deposit of component images module 1800 can also utilize a positive reference model 1820. Positive reference model 1820 can compare an image with a reference norm 1820a revealing desirable surface conditions. Norm 1820a can be applied to inspection requirements in operation 1540 of inspection module 1500. In a further embodiment, deposit module 1800 could be capable of retaining images produced during inspection process 1200 and categorizing the images with information determined according to the image analysis. New images could be stored in a knowledge bank. Deposit module 1800 could thereby learn from the inspection process.
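The disclosure names CFAR only as one candidate ASID technique without detailing it. As a hedged illustration, a textbook one-dimensional cell-averaging CFAR applied to a reflectance line scan might look like the sketch below; the guard, training, and scale parameters are arbitrary assumptions.

```python
import numpy as np

def ca_cfar(signal, guard=2, train=8, scale=3.0):
    """Generic cell-averaging CFAR over a 1-D response profile (a textbook
    formulation; the patent does not specify its CFAR variant).

    For each cell, the noise level is estimated from `train` cells on each
    side, skipping `guard` cells around the cell under test; the cell is
    flagged when it exceeds `scale` times that estimate.
    """
    n = len(signal)
    hits = np.zeros(n, dtype=bool)
    for i in range(train + guard, n - train - guard):
        lead = signal[i - guard - train : i - guard]
        lag = signal[i + guard + 1 : i + guard + train + 1]
        noise = (lead.sum() + lag.sum()) / (2 * train)
        hits[i] = signal[i] > scale * noise
    return hits

# Example: a reflectance line scan with one bright irregularity at index 50.
rng = np.random.default_rng(0)
scan = rng.rayleigh(1.0, 100)
scan[50] += 8.0
print(np.flatnonzero(ca_cfar(scan)))  # expected to include index 50
```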
In one embodiment of the present application, a method with this system includes applying a surface imaging process to a component, applying an algorithm to efficiently manipulate the component with robotic positioning and applying fuzzy logic analysis to identify surface features of the component shown by the surface imaging process.
With reference to Figure 9, an illustration is shown for a grain structure characterization system 2100 representing an embodiment of the present application including an inspection process, algorithms, sensors, robotic positioning and analysis to locate, evaluate and report grain structures. Grain structure characterization system 2100 is shown to include a positioning system 2110, a surface scanning system 2120 and a controller 2130.
Positioning system 2110 of this embodiment operates with a component presentation technique based on an algorithm for manipulating a component C with respect to surface scanning system 2120 in an efficient manner with minimum hunting for component surfaces and anomalies. In various embodiments, the positioning algorithm can be provided or can be selected based on other presented parameters designated to identify the type of component being characterized. Presented parameters could include a part number, a geometry or geometrical feature, a weight, and the like. Embodiments of positioning system 2110 can include a robotic component manipulator. In one particular embodiment, positioning system 2110 utilizes an imaging system to identify the type of component being characterized and determines a positioning algorithm with which to provide predetermined component manipulation during characterization. Robotic component manipulation can provide consistent positioning of components, such as component C, relative to the camera or vision system during the characterization process, which can reduce variation and improve efficiency of the characterization process. In some embodiments, component manipulation can include presenting the component to a scanning device such as but not limited to a camera.
Surface scanning system 2120, as shown in the embodiment of Figure 9, includes a light source 2122 and a detector unit 2124. In one embodiment, light source 2122 includes bright field incident illumination such as but not limited to that of light microscopes in reflective mode. Light source 2122 can produce illumination in a variety of wavelengths depending on the material of the component and available equipment; thus the term "light" is used as shorthand and is not limited to visible light but encompasses the entire electromagnetic spectrum. In one non-limiting embodiment, light from light source 2122 is directed at the component. Detector unit 2124 then detects the light reflected by the surface of component C. Variations in the reflectivity data collected by detector unit 2124 can be the result of differences in the reflectance of various features of the surface of component C.
The direction of reflected light can be determined by several factors including but not limited to surface topography and grain structure. Surface topography can include surface variations resulting from voids, grain boundaries (with or without etching) and grain defects.
In an exemplary embodiment, a single crystal cast component can be subjected to the characterization system of the present application. The surface of the cast component can be scanned to produce a reflectivity signal. The reflectivity signal can indicate grain boundaries or multiple phases due to varying surface texture. The varying surface texture can reflect directed light differently allowing reflectivity differentiation between the phases.
Most materials reflect under bright field incident illumination with specular reflection where the reflected light is strongest in a single direction. Specular reflection follows the physical principle where the angle of incidence equals the angle of reflection. The angle of incidence and the angle of reflection are measured from the normal to the reflecting surface. As the normal changes direction with variations in the reflecting surface, the angle of incidence and angle of reflection change accordingly. Surfaces with different normals will reflect light differently. In some applications, the component to be inspected includes one or more polished surfaces and/or as-cast surfaces that would uniformly reflect light and otherwise show no contrast. The surface of such a component can be prepared, such as by, but not limited to, chemical etching, to optically enhance microstructural features so that the surface does not uniformly reflect light and instead shows image contrast.
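The specular relationship can be made concrete with the standard reflection formula R = I - 2(I·N)N, where N is the unit surface normal. The short sketch below (an illustration only, not from the disclosure) shows how a small local tilt in the normal redirects the reflected ray away from a detector aligned with the nominal reflection direction.

```python
import numpy as np

def reflect(incident, normal):
    """Law of specular reflection: R = I - 2 (I . N) N, with N normalized.
    Used here only to illustrate why a tilted grain or groove redirects the
    reflected ray away from the detector."""
    n = normal / np.linalg.norm(normal)
    return incident - 2.0 * np.dot(incident, n) * n

nominal = np.array([0.0, 0.0, 1.0])                    # flat surface normal
tilted = np.array([np.sin(0.05), 0.0, np.cos(0.05)])   # ~3 degree local tilt
ray = np.array([0.0, 0.0, -1.0])                       # illumination direction

print(reflect(ray, nominal))  # [0, 0, 1]: straight back toward the detector
print(reflect(ray, tilted))   # deflected by twice the tilt: intensity drops
```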
Defects in the grain structure of a single crystal component can be characterized by an embodiment of the present application. Various grain defects can arise during a single crystal manufacturing process, such as but not limited to high angle grain boundaries, low angle grain boundaries, recrystallized grains, and twinning defects. Examples of defects can include precipitates, dislocations and impurities. One embodiment of the present application is designed to automatically locate, evaluate and report grain defects on families of single crystal cast airfoil type components.
Embodiments disclosed herein can also generate a report of the sizes and the locations of the grain defects on the component. One embodiment of the present application includes identifying defects, defect locations and defect density.
Controller 2130 of grain structure characterization system 2100 is shown schematically in the embodiment of Figure 9 as a single component containing modules capable of performing various functions. Each function can be located on the same or separate pieces of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Controller 2130 can also include one or more microprocessors where a single microprocessor provides the functions of each module or separate microprocessors are used for one or more of the control modules.
Controller 2130 as shown in Figure 9 is capable of operating a scanned data processing system 2132 and a robotic manipulation module 2138. Robotic manipulation module 2138 is shown in Figure 9 as part of controller 2130. Robotic manipulation module 2138 can be part of the positioning equipment in positioning system 2110 as a single system or as separate components. For one specific embodiment, robotic manipulation module 2138 is capable of providing a positioning algorithm and predetermined part manipulation instructions to positioning system 2110 in response to an identification of component C.
Scanned data processing system 2132 can include an analyzer module 2134 and a scanning module 2136. In one embodiment, scanning module 2136 can interact with surface scanning system 2120 to provide equipment controls as an alternative to controls provided directly with the scanning system equipment or from another source and can be capable of providing acquisition and manipulation capabilities for data sets provided by surface scanning system 2120.
In one form, analyzer module 2134 can be a fuzzy logic analyzer module capable of providing analysis of the scanned data sets from surface scanning system 2120. Fuzzy logic can be used in grain structure characterization system 2100 to deal with fuzzy concepts: concepts that cannot be expressed as "true" or "false" but rather as "partial truths." In an automated grain structure characterization system, a reflectivity analysis method in an embodiment of the present application can include fuzzy logic analysis to enable the system to use an analysis tool with appropriate processing times for part inspection. In general, a fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules. An example rule in a fuzzy logic system may be of the form:
If x is low and y is high, then z is low,
where x and y are input variables, z is an output variable, "low" is a membership function defined on x and z, and "high" is a membership function defined on y.
The rule's premise describes the degree to which the rule applies, while the rule's consequent assigns a membership function to the output variable(s), where the set of rules in a fuzzy logic analysis system is known as the rule base or knowledge base.
Because fuzzy logic is a mathematical model for addressing inherently imprecise data, a fuzzy logic analysis can be applied to the present application. For example, the concept of 'brightness' is not mathematically expressed in an equation. Luminescence may be a quantity but 'brightness' is not. A sharp cut off does not exist between 'bright' and 'not bright.' One cannot say that 'bright' is at X luminescence but 'not bright' is at X-1 luminescence.
An operator is able to detect differing 'reflectivity' for a sample. How much 'reflectivity' is observed will vary between operators, leading to reduced repeatability in component characterization. Fuzzy logic provides a mathematical model of the vagueness found in non-precise measurements of reflectivity allowing automated determinations regarding component analysis such as grain structure characterization.
Figure 10 shows a flow diagram of an embodiment of the present application including a scanning process 2200. Scanning process 2200 begins with operation 2210 where light is directed at a localized area of a component's surface. Operation 2210 may utilize light source 2122 from Figure 9. In response to operation 2210, operation 2220 senses the light which is reflected by the component surface. Operation 2220 may be accomplished with detector unit 2124 from Figure 9 to detect the reflected light over a specified range of angles. A reflection signal 2230 representing the intensity of the reflected light is provided for a reporting operation 2240. Operation 2240 may include further analysis regarding the reflected light intensity resulting in a characterization of the component surface. Analysis from operation 2240 can include fuzzy logic analysis. In an embodiment of the present application, grain structure characterization of a component surface as part of operation 2240 may include defect inspection of a single crystal airfoil casting.
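A minimal sketch of how scanning process 2200 could be orchestrated in software follows; the two callables standing in for light source 2122 and detector unit 2124 are hypothetical placeholders, not disclosed interfaces.

```python
def scan_surface(positions, direct_light, sense_reflection):
    """Sketch of scanning process 2200: direct light at each localized
    area (operation 2210), sense the reflected intensity (operation 2220),
    and assemble the reflection signal (2230) for reporting operation 2240.
    The two callables are hypothetical stand-ins for the hardware."""
    reflection_signal = []
    for xy in positions:
        direct_light(xy)                   # operation 2210
        intensity = sense_reflection(xy)   # operation 2220
        reflection_signal.append((xy, intensity))
    return reflection_signal               # reflection signal 2230

# Example with dummy stand-ins for the hardware:
signal = scan_surface(
    positions=[(x, 0) for x in range(5)],
    direct_light=lambda xy: None,
    sense_reflection=lambda xy: 1.0 - 0.1 * xy[0],
)
print(signal)
```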
Figure 11 shows an embodiment of a fuzzy logic analysis that can be part of reporting operation 2240 from Figure 10. Data processing in a fuzzy logic analysis system of an embodiment of the present application can include four high level steps that correspond roughly to an input stage 2310, a processing stage 2320, a compilation stage 2330 and an output stage 2340. These operations can be described in slightly differing terms and can be combined, expanded or omitted based on the way the fuzzy logic analysis is described without changing the meaning or intent of using fuzzy logic in this embodiment of the present application.
1. Input Stage 2310 - Fuzzification: The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise. The input variables in a fuzzy control system are in general mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. All the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
2. Processing stage 2320 - Inference: The truth value for the premise of each rule can be computed and applied to its consequent. This computation results in one fuzzy subset being assigned to each output variable. The computation result in turn can be mapped into a membership function and truth value controlling the output variable.
3. Compilation stage 2330 - Composition: All of the fuzzy subsets assigned to each output variable can be combined together to form a single fuzzy output subset for each output variable.
4. Output stage 2340 - Defuzzification: The fuzzy output subset for each output variable can be convertible to a unique solution or a 'crisp' answer.
In a specific embodiment, a grain structure characterization system of the present application utilizes fuzzy logic analysis to determine grain defects of a component. The characterization system presents a surface of the component to a scanning system by manipulating the component according to a positioning algorithm. The scanning system produces a reflectivity data set as a result of light directed to the surface of the component reflecting back. The reflectivity data set includes the intensity of the reflected light and its location on the surface of the component. Fuzzy logic analysis is applied where the reflectivity data set is collected as the input variables. The input variables are assigned a degree of intensity. The degree of intensity is compiled to produce an output data set related to the level of reflectivity and location. The output data set can be characterized to indicate grain structure or, more specifically, grain defects. Characterization can be performed by comparing the output data set to standard data sets in an airfoil defect knowledge bank.
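As one hedged illustration of that comparison step, the sketch below labels an output data set with the class of the nearest standard data set in the knowledge bank. The Euclidean metric and the bank entries are invented for demonstration; the disclosure does not prescribe a specific comparison.

```python
import numpy as np

def characterize(output_data_set, knowledge_bank):
    """Hypothetical characterization: label the scanned output data set with
    the defect class of the closest standard data set (nearest neighbor by
    Euclidean distance; the patent does not specify a metric)."""
    best_label, best_dist = "unknown", float("inf")
    for label, standard in knowledge_bank.items():
        dist = np.linalg.norm(np.asarray(output_data_set) - np.asarray(standard))
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label

# Invented standard data sets keyed by defect class:
bank = {
    "conforming":           [0.9, 0.9, 0.9, 0.9],
    "low-angle boundary":   [0.9, 0.6, 0.6, 0.9],
    "recrystallized grain": [0.4, 0.4, 0.4, 0.4],
}
print(characterize([0.88, 0.58, 0.62, 0.91], bank))  # low-angle boundary
```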
Referencing Figure 12, one embodiment of an apparatus includes a controller 2130 with various components illustrated as representative modules, inputs, outputs and intermediate data parameters. Module 2410 is a position module structured to determine a positioning algorithm 2414 in response to the identification of a type of component 2412 and a component positioning database 2416. Various systems can be available for identifying the component in component identification 2412 such as but not limited to bar code, scan, operator input, CMM data, imaging data and the like.
Component manipulation instructions 2418 for the positioning equipment are provided in response to positioning algorithm 2414. Manipulation instructions 2418 provide a component orientation data set 2401. In one embodiment, manipulation instructions 2418 can provide robotic manipulation of the component during grain structure characterization process 2100.
Module 2420 is a reflection module structured to direct a source of light 2422 on to the surface of a component and detect a quantity of reflected light 2424 with a detection unit. Module 2420 is further structured to provide a reflectivity data set 2426 in response to the quantity of reflected light 2424. Module 2430 is a characterization module where a fuzzy logic algorithm 2432 can be applied to the reflectivity data set. Fuzzy logic algorithm 2432 applies reflectivity data set 2426 to a set of input variables 2433. A set of fuzzy logic membership functions 2435 assigns a degree of intensity to the set of input variables 2433. Fuzzy logic algorithm 2432 determines an output data set 2434 which is converted into a solution set 2436. Indication module 2440 is structured to identify an indication 2445 of a grain structure feature in response to solution set 2436 and can also be in response to component orientation data set 2401.
Grain structure features indicated by a grain structure characterization system as part of an embodiment of the present application can include but are not limited to grain structure, grain defects, grain locations, grain size, and grain defect density. In some embodiments the part can include a single crystal, a directionally solidified, and/or an equiaxed grain structure. In a further embodiment the part can include an airfoil component of a gas turbine engine. One embodiment of a grain structure characterization system can operate to mechanically locate, evaluate, and report grain structure
characterization features on families of airfoil type components. Another embodiment of the system can generate a report of the sizes, locations and types of grain structures of the component in tabular or graphical form.
An embodiment of the present application applies a special lighting configuration, a part presentation technique, and a fuzzy logic based processing technique for identifying grain structures in a single crystal cast airfoil component. Yet another embodiment can include an algorithm for manipulating a part with respect to lighting and sensing positions in an efficient manner with minimum hunting and a fuzzy logic based processing algorithm to identify grain structure features which can indicate a microstructural defect. Embodiments of the present application can be applied to components requiring grain defect inspection such as but not limited to single crystal cast components, directionally solidified cast components, and equiax solidified cast components.
In one embodiment of the present application, a method includes applying a grain structure characterization process to a component, applying an algorithm to efficiently position the component with an automatic positioning algorithm and applying fuzzy logic analysis to identify grain structure characterizations of the component.
With reference to Figure 13, an illustration is shown for a surface inspection system 3100 representing a non-limiting embodiment of the present invention including an automated surface inspection process, algorithms, sensors, robotic positioning, and analysis to locate, evaluate and report surface variances. Surface inspection system 3100 is shown to include a preparation system 3110, an inspection system 3120 and a controller 3130.
Preparation system 3110, as shown in this embodiment, has four stages. In other embodiments, each stage can have multiple levels and one stage can be combined with another stage. In yet other embodiments, one or more of the stages may not be included. The embodiment shown with preparation system 3110 includes an initial cleaning process 3112, an indicator application process 3114, an excess indicator removal process 3116, and a developer application process 3118.
Initial cleaning process 3112 can be included when the surface of a part 3122 contains contamination such as but not limited to lubricant and material shavings from previous manufacturing processes or other sources. A surface of a part that is clear of oil or debris can reduce the opportunity for obscuring an anomaly or falsely indicating a defect on the surface. Indicator application process 3114 can include application techniques available to an operator including but not limited to dipping, brushing and spraying. Indicators can include liquid indicators such as a dye or non-liquid indicators such as magnetic-particles.
Application parameters for indicator application process 3114 can depend on the indicator chosen and the types of anomalies anticipated. For example, dyes with lower viscosity may penetrate faster and small anomalies may require more time for penetration. In some applications, surface porosity may affect the ability of a liquid indicator to adequately indicate surface defects and adjustments can be made to the application parameters.
Excess indicator removal process 3116 can remove substantially all of the excess indicator from a surface without removing too much indicator, which can affect the accuracy of a surface inspection test. Not removing enough of the excess indicator can lead to false indications and removing more than just the excess indicator can deplete the amount of indicator necessary on the surface for indicating anomalies. In developer application process 3118, a developer can be used in some embodiments with certain types of indicators to provide additional contrast between, for example, a fluorescent dye and the surrounding surfaces.
Once part 3122 has been prepared with preparation system 3110, an embodiment of surface inspection system 3100 can continue with inspection system 3120. Inspection system 3120, as shown in the embodiment of Figure 13, includes a positioning system 3124 and an indication system 3126.
Positioning system 3124 of this embodiment operates with a part presentation technique based on an algorithm for manipulating part 3122 in an efficient manner with minimum hunting for part surfaces and anomalies. Embodiments of positioning system 3124 can include a robotic part manipulator with a discussion of further details to follow. In one particular embodiment, positioning system 3124 utilizes illumination and imaging components to identify the type of part 3122 being inspected. Illumination can be, for example, supplied for reflection detection or shadow detection. An imaging component can be, for example, a camera capable of reproducing the image, a photo sensor capable of detecting illumination, or the like.
Positioning system 3124 can determine the identity of part 3122 by analyzing the outline of part 3122 generated when the robotic part manipulator places the part in a predetermined position between a light source and an imaging component. In another embodiment, positioning system 3124 can analyze a reflection image based on light emitted toward part 3122 and reflected back to an imaging component. Radiation types other than light can be emitted. A detected image of part 3122 can be analyzed by comparison to a standard image within a library of images accessible by positioning system 3124. Comparison may include determining predetermined data points and comparing data points, overlaying images and determining differences, and other such methods known in the art.
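One hedged sketch of the overlay-and-difference comparison follows, scoring a sensed binary silhouette against a library of standard images by intersection-over-union; the metric, threshold, and part names are illustrative assumptions rather than the disclosed method.

```python
import numpy as np

def identify_part(outline, library, threshold=0.9):
    """Sketch of silhouette-based part identification: overlay the sensed
    binary outline on each standard image and score the overlap
    (intersection-over-union). The patent only requires 'overlaying images
    and determining differences'; this metric is one possible choice."""
    best_type, best_score = None, 0.0
    for part_type, standard in library.items():
        inter = np.logical_and(outline, standard).sum()
        union = np.logical_or(outline, standard).sum()
        score = inter / union if union else 0.0
        if score > best_score:
            best_type, best_score = part_type, score
    return best_type if best_score >= threshold else None

# Example with tiny 4x4 silhouettes and hypothetical part names:
sensed = np.array([[0, 1, 1, 0]] * 4, dtype=bool)
library = {"blade-A": sensed.copy(), "vane-B": np.eye(4, dtype=bool)}
print(identify_part(sensed, library))  # blade-A
```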
In one embodiment, robotic part manipulation may include robotic positioning of part 3122 with preset coordinates placing predetermined features of a part in a predetermined position relative to recognition equipment according to a positioning algorithm. Part manipulation can also include predetermined repositioning of a part during further steps of the inspection process. In a specific embodiment, robotic part manipulation can provide consistent part positioning during the inspection process which can reduce variation and can improve efficiency of the inspection process. In another embodiment, positioning system 3124 can determine the positioning algorithm which would provide predetermined part manipulation based on part 3122 identification.
Indication system 3126 of inspection system 3120 may include an image capture device such as but not limited to a camera which may be capable of capturing the visible spectrum, a photo-emission sensor for various wavelengths including but not limited to ultraviolet and x-ray, detectors capable of sensing electromagnetic radiation, and the like. Other capture devices structured to capture an indication from suitable indicators are also contemplated herein. A light source can be a laser, a discharge tube, or other radiation source. In a non-limiting exemplary embodiment, indication system 3126 includes equipment with the capability to provide a radiation source to react with a fluorescent penetrant indicator causing an emission which can be detected by equipment of indication system 3126. Equipment of indication system 3126 can be contained in a single housing as shown in Figure 13 or can be contained in separate housings. Indication system 3126 can also include multiple radiation or illuminating sources and/or detection components. Components of indication system 3126 can also provide illumination and image acquisition for use with positioning system 3124.
Controller 3130 of surface inspection system 3100 is shown in the embodiment of Figure 13 as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Controller 3130 can also include one or more microprocessors where, in one embodiment, a single microprocessor can provide the functions of each module or separate microprocessors can be used for one or more of the control modules. One skilled in the art would be able to determine a controller architecture.
Controller 3130 in the embodiment of Figure 13 is shown as being capable of operating an indication data processing system 3132 and a robotic manipulation module 3138. Indication data processing system 3132 can include an analyzer module 3134 with further details to follow and a sensor module 3136. In one non-limiting form the analyzer module 3134 is a fuzzy logic analyzer. In this embodiment, sensor module 3136 can interact with indication system 3126 to provide equipment controls as an alternative to controls provided directly with the indication equipment or from another source. Sensor module 3136 can be capable of providing acquisition and manipulation capabilities for data sets obtained by indication system 3126.
In one embodiment, analyzer module 3134 is a fuzzy logic analyzer module capable of providing analysis of the indication data sets from indication system 3126. Fuzzy logic analysis provides a mathematical model of the vagueness found in non-precise measurements of surface inspection techniques such as but not limited to FPI and magnetic-particle inspection. Fuzzy logic can be used in machine control in order to deal with fuzzy concepts: concepts that cannot be expressed as "true" or "false" but rather as "partial truths."
The fuzzy logic analyzer module can include an input stage, a processing stage, a compilation stage and an output stage. The input stage maps sensor or other inputs to appropriate membership functions and truth values. The processing stage invokes an appropriate set of logic rules in the form of IF-THEN statements - IF variable IS property THEN action. The compilation stage combines the results of the rules. Finally, the output stage converts the combined results into a control output value.
For an automated surface inspection system, an indication data processing method in an embodiment of the present invention includes fuzzy logic analysis to enable a system to use an analysis tool with appropriate processing times for part inspection. In general, a fuzzy logic analysis system is a logic analysis system operable to process data by replacing what are commonly Boolean logic rules with a collection of fuzzy membership functions and rules. An example rule in a fuzzy logic system may be of the form:
If x is low and y is high, then z is low,
where x and y are input variables, z is an output variable, "low" is a membership function defined on x and z, and "high" is a membership function defined on y.
Because fuzzy logic is a mathematical model for addressing inherently imprecise data, a fuzzy logic analysis can be applied to the present application. For an exemplary embodiment including a fluorescent penetrant indicator (FPI), surface anomalies are indicated by areas of brightness due to the presence of the fluorescent penetrant. The concept of 'brightness' is not mathematically expressed in an equation. Luminescence may be a quantity but 'brightness' is not. A sharp cut off does not exist between 'bright' and 'not bright.' One cannot simply say that 'bright' is at X luminescence but 'not bright' is at X-1 luminescence. During FPI for example, an operator may be able to infer differing 'brightness' for the areas of a sample with differing levels of fluorescent penetrant responding to the radiation. How much 'brightness' recorded will vary between operators leading to reduced repeatability.
In an exemplary embodiment including FPI, a radiance data set is collected and compared to a set of rules assigning a degree of intensity to the radiance data set. The degree of intensity in this embodiment is a representation of the amount of radiance the fluorescent penetrant produces when radiated. The degree of intensity may be representative of other levels of indicators in other embodiments. Continuing with this embodiment, the degree of intensity is compiled to produce an output data set related to position and level of radiance. The output data set is compared to data sets in a knowledge bank to determine whether the output data sets are consistent with anomalies. Output data sets consistent with anomalies provide an indication of the anomalies present in the component. Automated review of the radiance data set in this embodiment is capable of reducing variation found in surface variance detection.
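A simplified sketch of such a radiance review follows. The piecewise-linear degree-of-intensity mapping and its thresholds stand in for the patented rule set and are assumptions for illustration only.

```python
import numpy as np

def grade_radiance(radiance_map, bright=0.7, dim=0.3):
    """Illustrative grading of an FPI radiance data set: assign each pixel a
    degree of intensity between 0 and 1 (a simple piecewise-linear membership
    standing in for the patented rules) and report positions whose degree is
    consistent with an anomaly."""
    degree = np.clip((radiance_map - dim) / (bright - dim), 0.0, 1.0)
    anomalies = np.argwhere(degree > 0.5)  # position + level of radiance
    return [(tuple(p), float(degree[tuple(p)])) for p in anomalies]

field = np.full((4, 4), 0.1)   # dim background radiance
field[2, 3] = 0.9              # fluorescent penetrant bleeding from a crack
print(grade_radiance(field))   # [((2, 3), 1.0)]
```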
Robotic manipulation module 3138 is shown in Figure 13 as part of controller 3130. Robotic manipulation module 3138 can, in the alternative, be part of the positioning equipment in positioning system 3124 as a single system or as separate components. For one embodiment, robotic manipulation module 3138 is capable of providing a positioning algorithm, a component type recognition database and predetermined part manipulation instructions.
A positioning algorithm can include predetermined coordinates for a robotic part manipulator where coordinates can be based on an absolute or comparative capacity. For one embodiment, once a part has been identified and the position of certain features determined in relation to a part manipulator, a positioning algorithm produces
predetermined rotation and positioning of the part during inspection thereby increasing the consistency in detecting variances with recognition equipment. For example, in one embodiment, inspection begins with predetermined initial coordinates within the robotic manipulator's coordinate measuring system. The positioning algorithm could then control movement of the part manipulator allowing the inspection to be systematically applied to related components.
Surface inspection system 3100 of Figure 13 can also include a final cleaning process 3140 which may allow a part 3122 to be returned to a manufacturing line following a surface inspection test. The care and degree of cleaning necessary can depend on the remaining manufacturing processes and the final function of the parts being tested.
Features of a surface indicated by an embodiment of surface inspection system 3100 can include but are not limited to micro and macro porosity, inclusion defects, inhomogeneities, and discontinuities. In some embodiments the part would include a single crystal, a directionally solidified, and/or an equiaxed microstructure. In a further embodiment the part could include an airfoil component of a gas turbine engine. Another embodiment can operate to mechanically locate, evaluate, and report surface variances on families of airfoil type components. Yet another embodiment of the present application generates a report of the sizes and locations of the variances on the surface of a component in tabular or graphical form.
With regard to Figure 14, an exemplary inspection process 3200 is shown.
Inspection process 3200 begins with operation 3220 which includes surface defect indicator preparation. Shown as following operation 3220 in process 3200 is optional operation 3230 which includes recognizing the part being tested. The recognition in operation 3230 can be based on a comparison with the sensed image of the part and a database of part responses. Operation 3240 is then capable of applying a predetermined positioning algorithm based on the automated recognition of operation 3230 to manipulate the part. Automatic part positioning may reduce variability and improve the efficiency of the test.
As the part is manipulated with operation 3240, operation 3250 provides a source of excitation and senses the response from the surface of the part to collect an indication data set. In one embodiment, UV radiation is directed toward a surface of a test part to irradiate a fluorescent dye. In another embodiment, ferrous iron particles are placed on a ferromagnetic component's surface and a magnetic field is applied to the component. The magnetic flux of the applied magnetic field leaks at surface anomalies. The iron particles are attracted to areas of flux leakage producing an indicator of the surface anomalies.
The indication data collected during operation 3250 is provided to operation 3260 which applies a fuzzy logic analysis. Figure 15 shows further detail regarding operation 3260 where, in one embodiment, four exemplary operations are part of a fuzzy logic analysis. These operations may be described in slightly differing terms and may be combined, expanded or omitted based on the way the fuzzy logic analysis is described without changing the meaning or intent of using fuzzy logic in this embodiment of the present invention.
1. Input Stage - Fuzzification (3262): The membership functions defined for the input variables can be applied to the actual values of the input variables to determine the degree of truth for each rule premise. The input variables in a fuzzy control system can be, in general, mapped into sets of membership functions known as "fuzzy sets" in the process of converting an input value to a fuzzy value. Any of the rules that apply can be invoked, using the membership functions and truth values obtained from the inputs, to determine the results of the rules.
2. Processing stage - Inference (3264): The truth value for the premise of each rule may be computed and applied to its consequent. This computation results in one fuzzy subset being assigned to each output variable. The computation result may be mapped into a membership function and truth value controlling the output variable.
3. Compilation stage - Composition (3266): All of the fuzzy subsets assigned to each output variable may be combined together to form a single fuzzy output subset for each output variable.
4. Output stage - Defuzzification (3268): The fuzzy output subset for each output variable may be convertible to a unique solution or a 'crisp' answer.
Returning to Figure 14, operation 3270 follows the fuzzy logic analysis in operation 3260. Operation 3270 allows an automated identification of anomalies on the surface of the part as indicated by the indicator. The anomalies can be inhomogeneities, microstructural discontinuities, inclusions, micro-porosity, grain structure and combinations thereof. The fuzzy logic algorithm from operation 3260 can produce a characterization data set for comparison with a knowledge bank. This comparison in operation 3270 allows the automated inspection process to apply cognitive
characterization of defects indicated by an indication process. In one embodiment, the knowledge bank includes, but is not limited to, data sets from previous surface inspection applications to standard components or data sets generated from theoretical calculations or simulations. Fuzzy logic analysis and cognitive characterization in operation 3270 can directly affect the ability to determine an automated pass/fail status for the part.
Operation 3280 includes the application of accept/reject criteria which utilize the results from the fuzzy logic algorithm in operation 3260 and the anomaly indication in operation 3270. Operation 3280 can also provide a report (3280a) regarding the anomalies from operation 3270 and the results of the fuzzy logic analysis in operation 3260. For some embodiments, the report from operation 3280 can be in tabular or graphical form intended to communicate the location and degree of deviation for the indicated anomalies. Using one embodiment of the present invention, inspection process variation can be greatly reduced by automating the detection of variances and the application of pass/fail criteria using fuzzy logic analysis. Fuzzy logic analysis allows an automated inspection to access a knowledge bank to apply cognitive characterization of defects and provide a level of consistency to determine a pass/fail status according to a specification.
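A tabular report of the kind described could be produced along the following lines; the column layout and the single specification limit are illustrative assumptions, not a disclosed report format.

```python
def anomaly_report(anomalies, limit=0.5):
    """Hypothetical tabular report for report 3280a: one row per indicated
    anomaly with its location, degree of deviation, and the accept/reject
    result of comparing that degree against a specification limit."""
    lines = [f"{'location':<12}{'deviation':<12}{'disposition'}"]
    for loc, degree in anomalies:
        verdict = "REJECT" if degree > limit else "ACCEPT"
        lines.append(f"{str(loc):<12}{degree:<12.2f}{verdict}")
    return "\n".join(lines)

print(anomaly_report([((2, 3), 1.0), ((0, 1), 0.2)]))
```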
Another embodiment of the present application applies a special lighting configuration, a part presentation technique, and a fuzzy logic based image processing technique for identifying inhomogeneity in a single crystal cast airfoil component using a fluorescent penetrant process. Yet another embodiment includes an algorithm for manipulating a part with respect to lighting and camera positions in an efficient manner with minimum hunting and a fuzzy logic based image processing algorithm to identify anomalies which may indicate a surface defect. Embodiments from the present application can be applied to components utilizing FPI or magnetic-particle defect inspection such as but not limited to single crystal cast components, directionally solidified cast components, and equiax solidified cast components.
With respect to Figure 16, an embodiment of an intelligent automated visual inspection system 4100 is disclosed which is capable of acquiring and processing images of components such as, but not limited to, engine components such as airfoils of gas turbine assemblies. The embodiment of inspection system 4100 as shown in Figure 16 includes an illumination system 4110, an imaging system 4120, a manipulation system 4130, a user interface 4140, an inspection processor 4150, and an image library 4160.
An illumination system such as system 4110 can include a source of radiance to be directed toward a component C under inspection. The radiance can be reflected by the surface of component C and detected by imaging system 4120. Radiance type can include various wavelengths in the electromagnetic spectrum including but not limited to the visible spectrum, ultraviolet light, near infrared, and x-rays. The source of radiance can include a laser, a discharge tube and the like. In one embodiment, an imaging system can be a camera utilizing a conventional light or other electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like. An embodiment of manipulation system 4130 can include a robotic part manipulator and positioning algorithms to provide predetermined part presentation and positioning during an inspection process. User interface 4140 includes an interface having parameters within modules to be selected by a user in determining a set of inspection protocols. The inspection protocols can provide control of illumination system 4110, imaging system 4120 and manipulation system 4130 to produce an acquired image of component C under inspection. The acquired image can be analyzed by inspection processor 4150. The inspection protocols can further be applied to the analysis of the acquired image. In another embodiment, the analysis includes referencing image library 4160.
Inspection system 4100 can be used to analyze and determine characteristics or manufacturing flaws in components being inspected. In one embodiment, inspection system 4100 is a protocol-based visual inspection system with image processing algorithms and techniques implemented in system software. A system of such an embodiment can offer intuitive and easy-to-use interfaces to develop visual inspection procedures of components. In some forms, inspection system 4100 can be used without writing lines of programming code. An inspection system of another embodiment of the present application is fully automated, adaptive, and customizable to perform complex visual inspection comparable to that of a human inspector. The protocol-based system of yet another embodiment can have a built-in capability to simultaneously facilitate automated control of the visual inspection system including the accompanying illumination, imaging, and component manipulation systems.
An inspection system of one embodiment can have a protocol-based development technique which follows an interactive process. Through a process such as the one found in this embodiment of the present application, inspectors can fine-tune the inspection system control parameters to achieve the inspection of components to various degrees of requirements, yet within an acceptable margin recommended by the Engineering Inspection Standards (EIS) among other potential standards, targets, goals, etc. The system protocols of this embodiment can have built-in memory where detected component features or flaws can be registered as a historical representation of previously detected manufacturing imperfections. As such, a component designer can view a surface map of previously encountered manufacturing features, troubleshoot the cause of the features, and revise the design and/or the manufacturing processes which potentially may be causing such component imperfections. With respect to Figure 17, one embodiment of an inspection system's graphical user interface (GUI) for development of inspection protocols is illustrated. The description that follows is an example of a manner of interacting with a GUI and configuring and/or executing an inspection protocol. It will be appreciated that any number of variations in the GUI, in protocol development and execution, etc. are contemplated herein. Furthermore, additional or fewer GUI options, combinations of features used in the GUI, etc., than those described herein are contemplated.
In some forms, code programming is not required for development of the protocols. In one embodiment, an inspection protocol is interactively designed to meet a selected inspection requirement with inspection protocol development tools selected in an inspection protocol module 4215. A user designs a protocol by selecting a series of available inspection options in an inspection setup requirement module 4205 and by defining four inspection parameters in an inspection process control parameters module 4210: contrast strength, border strength, edge strength, and noise strength. In a further embodiment, an inspection of a component can consist of individual protocols executed during the inspection process one by one in the order they are constructed.
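By way of a hedged illustration, a protocol of this kind might be represented and executed as in the Python sketch below; the dataclass fields mirror the four control parameters named above, while the step callables, value ranges, and function names are assumptions for the example.

```python
from dataclasses import dataclass, field
from typing import Callable, List

@dataclass
class InspectionProtocol:
    """One protocol; the four strengths mirror the control parameters above.

    The 0..1 value range and the step-callable signature are assumptions."""
    name: str
    contrast_strength: float
    border_strength: float
    edge_strength: float
    noise_strength: float
    steps: List[Callable] = field(default_factory=list)

def run_inspection(protocols: List[InspectionProtocol], image):
    """Execute protocols one by one, in the order they were constructed."""
    results = []
    for protocol in protocols:
        for step in protocol.steps:
            image = step(image, protocol)   # each step transforms the image
        results.append((protocol.name, image))
    return results

# Hypothetical usage: two protocols run in construction order.
edge_pass = InspectionProtocol("edge pass", 0.7, 0.5, 0.9, 0.2)
noise_pass = InspectionProtocol("noise pass", 0.4, 0.3, 0.2, 0.8)
run_inspection([edge_pass, noise_pass], image=None)
```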
As demonstrated in Figure 17, an exemplary inspection system in one embodiment can further allow the user to select inspection regions of components with a designated component regions module 4220 consistent with the specified EIS requirements in an EIS specification requirements module 4240. In a further embodiment, per protocol, the user can specify the section of a component to be inspected in a component section inspection module 4230 according to component linguistic terminology. The exemplary inspection system protocol development GUI shown in Figure 17 can allow the user to specify surface conditions of components being inspected with an intrinsic surface conditions module 4250. A still further embodiment can allow the user to control parameters enabling synchronization of inspection subcomponents including, but not limited to, illumination, imaging, and component manipulation systems. For example, the embodiments of the inspection systems and associated systems described herein can be used to control the illumination system, camera parameters, and component manipulators together, either concurrently or in a controlled sequence, with minimal to no additional interaction from an operator.
Figure 18 illustrates a flow chart representing steps for an automated inspection process 4300 that can be done through an embodiment of a GUI, for example an embodiment that is disclosed herein. Inspection process 4300 is shown as initiating with operation 4305 which develops the inspection protocol. A designed inspection protocol of one embodiment can control the illumination system, the imaging parameters, and the component manipulation system in response to the inspection protocol. Built-in communication capabilities of an inspection system of an embodiment of the present application can facilitate synchronization of inspection hardware and software.
Following operation 4305 is operation 4310 which is an image acquisition operation accessing an imaging parameters module 4312, an illumination parameters module 4314, a component manipulation requirements module 4316 and an image depository requirement module 4318. An imaging system operating under imaging parameters module 4312 can be a camera utilizing a conventional light or other electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like.
Illumination parameters module 4314 can correspond with the technology of imaging parameters module 4312. Component manipulation requirements module 4316 can include manual, automated, or semi-automated instructions and controls to manipulate a component during inspection. In a specific embodiment, automated component manipulation controls can be determined in response to a component identification process. The identification process can be integrated with image depository requirement module 4318.
Upon acquisition of the component image, inspection process 4300 performs steps to segment the background from the foreground and component in operation 4320 shown as following operation 4310. Operation 4320 is a feature extraction process including an image segmentation module 4322, a feature vector formation module 4324, a feature vector clustering module 4326 and a weak feature pruning module 4328.
Separately or in combination, modules 4322, 4324, 4326 and 4328 of feature extraction operation 4320 can identify and remove segments of the component image acquired in operation 4310 deemed unnecessary or peripheral. In various embodiments, removal of these segments allows an image with sharper edges for edge detection analysis or smooth shading for defect detection analysis, for example. Once segmented in feature extraction of operation 4320, the image background information can be ignored.
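As a non-authoritative illustration of a pipeline of this kind, the following Python sketch chains the four stages above; the global threshold, the hand-rolled k-means loop, and the pruning fraction are assumptions for illustration, not the method of the specification.

```python
import numpy as np

def extract_features(image, n_clusters=3, prune_below=0.05, iters=10):
    """Segment foreground, form per-pixel feature vectors, cluster them,
    and prune weak (sparsely populated) clusters.

    Assumes a 2-D grayscale array with at least n_clusters foreground pixels."""
    # 1. Image segmentation: a simple global threshold separates background.
    foreground = image > image.mean()
    # 2. Feature vector formation: intensity plus normalized coordinates.
    ys, xs = np.nonzero(foreground)
    feats = np.column_stack([image[ys, xs],
                             ys / image.shape[0],
                             xs / image.shape[1]])
    # 3. Feature vector clustering: a few plain k-means iterations.
    centers = feats[np.random.choice(len(feats), n_clusters, replace=False)]
    for _ in range(iters):
        labels = np.argmin(((feats[:, None] - centers) ** 2).sum(2), axis=1)
        centers = np.array([feats[labels == k].mean(0) if (labels == k).any()
                            else centers[k] for k in range(n_clusters)])
    # 4. Weak feature pruning: drop clusters holding few pixels.
    keep = [k for k in range(n_clusters) if (labels == k).mean() >= prune_below]
    return feats, labels, keep
```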
In yet a further embodiment, the ability to include and exclude certain features of the component C can also be provided. For example, the GUI described above can include, or alternatively take the form of, a mask construction GUI that permits an operator to mask a specific region of the component C. A polygon mask can be used in some forms and can have any geometrical shape useful to identify certain areas of the component C. The GUI can also permit an operator to import and export masks associated with an inspection protocol. The features available to the operator can permit the mask to be translated, rotated, expanded, shrunk, etc. to identify certain areas. In some forms one or more vertices of a polygon mask can be manipulated through the GUI. Two types of masks can be used in the various embodiments of the system described herein. An "Include Mask" and an "Exclude Mask" can be used. The Include Mask can enclose a section of the component C that is subjected to inspection, while the Exclude Mask can define sections of the component C that should be excluded from inspection. In some forms one Include Mask and one Exclude Mask are permitted for any given protocol.
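A minimal sketch of Include/Exclude polygon masking is shown below, assuming raster images and matplotlib's point-in-polygon test; the vertex lists, image size, and function names are hypothetical.

```python
import numpy as np
from matplotlib.path import Path

def polygon_mask(shape, vertices):
    """Rasterize a polygon (list of (x, y) vertices) into a boolean mask."""
    h, w = shape
    xs, ys = np.meshgrid(np.arange(w), np.arange(h))
    points = np.column_stack([xs.ravel(), ys.ravel()])
    return Path(vertices).contains_points(points).reshape(h, w)

def inspection_region(shape, include_verts, exclude_verts=None):
    """Pixels inside the Include Mask and outside the Exclude Mask."""
    region = polygon_mask(shape, include_verts)
    if exclude_verts is not None:
        region &= ~polygon_mask(shape, exclude_verts)
    return region

# Hypothetical masks: inspect a central region, skipping a corner patch.
region = inspection_region(
    (480, 640),
    include_verts=[(50, 50), (590, 50), (590, 430), (50, 430)],
    exclude_verts=[(50, 50), (200, 50), (50, 200)])
```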
Upon segmentation of the foreground including the component subject to inspection from the background, defective regions can be determined in defect detection and validation operation 4330. Defects can include burrs, nicks, marks, scores, pitting, dents, and visible cracks to name a few. Operation 4330 includes a defect spatial registration module 4332 and a defect verification checks module 4334. Defect spatial registration module 4332 can, for example, in one embodiment provide location information of a determined defect by coordinating with a component manipulation system. The spatial information can be used to communicate the location of the detected defect to a user. Defect verification checks module 4334 can operate to provide information regarding characterization of a defect such as, but not limited to, the severity and type of defect detected. Defect verification checks module 4334 can provide this characterization information to the next operation. Operation 4340 is shown following operation 4330 and is a defect characterization operation including quantitative and qualitative analysis. Operation 4340 applies a defect statistical data measurement module 4342 to define geometrical properties of an identified defective area. In one embodiment, fuzzy logic analysis can be applied in one or more portions of the inspection process 4300. A qualitative judgment can provide an indication of the acceptability of a component with a defect according to the inspection standards being applied. Furthermore, each defective area can be characterized based on both quantitative and qualitative measures with the application of a defect severity assessment module 4344 and a defect distribution assessment module 4346. Severity and distribution assessment can provide information relevant to determining a cause for the detected defects in addition to contributing to decisions regarding acceptability of a component.
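The quantitative side of such a characterization might resemble the following Python sketch, which measures simple geometrical properties of a defective region; the chosen properties and the boolean-mask input are illustrative assumptions, not the measurement set of module 4342.

```python
import numpy as np

def defect_geometry(defect_mask):
    """Quantitative measures of one defective region from a boolean mask.

    Assumes the mask contains at least one True pixel."""
    ys, xs = np.nonzero(defect_mask)
    area = len(xs)
    h = ys.max() - ys.min() + 1               # bounding-box height
    w = xs.max() - xs.min() + 1               # bounding-box width
    return {
        "area_px": area,
        "centroid": (ys.mean(), xs.mean()),
        "bbox": (ys.min(), xs.min(), ys.max(), xs.max()),
        "aspect_ratio": w / h,
        "extent": area / (w * h),             # fill of the bounding box
    }
```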
In operation 4350, defect reasoning and decision making, inspection process 4300 can use an analysis technique to perform defect condition reasoning with respect to the inspection engineering standards and an image library. Fuzzy logic analysis can be applied in operation 4350. With the assessment of operation 4350, a recommendation can be made for passing, recalling, or rejecting the inspected component in decision making module 4352 and a report can be generated in report generation module 4354.
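For illustration, a decision step of this general shape is sketched below; the threshold values standing in for inspection engineering standards and the pass/recall/reject boundaries are purely hypothetical.

```python
def recommend(severity, distribution, standards):
    """Combine fuzzy severity and distribution scores (0..1) into a
    pass / recall / reject recommendation against standard limits."""
    if severity >= standards["reject_severity"]:
        return "reject"
    if severity >= standards["review_severity"] or \
       distribution >= standards["review_distribution"]:
        return "recall"   # borderline: route for human re-inspection
    return "pass"

# Hypothetical limits standing in for the engineering inspection standards.
eis = {"reject_severity": 0.8, "review_severity": 0.5,
       "review_distribution": 0.6}
print(recommend(0.55, 0.2, eis))   # -> 'recall'
```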
Figure 19 illustrates an embodiment of functional components within an inspection processor 4450 of an inspection system 4400. Inspection processor 4450 is represented as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Processor 4450 can also include one or more microprocessors where, in one embodiment, a single microprocessor can provide the function of each module or separate microprocessors can be used for one or more of the modules.
Functional components can include a graphical user interface 4410, a component manipulation system interface 4420, an imaging system interface 4425, an image processing library 4430 and an inspection preference interface 4440. In various embodiments, image processing library 4430 is capable of providing images for identification, verification and assessment of images acquired from a component under inspection. Further, image processing library 4430 is capable of storing acquired images for application in subsequent image analysis. In one embodiment, component manipulation system interface 4420 provides a communication interface to a manipulation system. In some forms, interface 4420 can be used to pass information from a manipulation programs module 4421 when positioning a component. Manipulation programs module 4421 can provide instructions for manipulating a component during an inspection process. Manipulation programs module 4421 can also assess an object to determine the instructions to be applied when manipulating the component.
Imaging system interface 4425 provides a communication interface to an imaging system. In some forms, interface 4425 can be used to pass information from an image calibration settings module 4426 when acquiring an image of the component. Image calibration settings module 4426 can provide assessment and control of the imaging system to ensure consistent performance. In one form, component manipulation system interface 4420 can be a communication interface. In an alternative and/or additional form, imaging system interface 4425 can be a communication interface.
In the embodiment shown in Figure 19, inspection processor 4450 is shown with a technique module 4460, a protocol module 4470 and an inspection tool module 4480. Technique module 4460 can include protocol based inspection 4462, quantitative characterization 4464, and fuzzy logic qualitative reasoning 4466. These techniques can be applied during image analysis. Protocol module 4470 can include design standards 4472, inspection requirements 4474, and protocol designs 4476 selected with a user interface to provide parameters for the inspection process. Inspection tool module 4480 can include post-inspection analysis 4482, defect training 4484, and results recording 4486. Module 4480 can operate in coordination with image processing library 4430 to store products of inspection tool module 4480. Inspection system 4400 can also provide a component quality report 4490 with status such as pass, reject, repair and recall, for example.
In another embodiment, an inspection system includes a defect training module. The inspection system supports an interactive process by which an inspector can train the inspection system to detect certain defect conditions. In one particular embodiment, the inspector can train the system with two different types of defects including Positive and Negative defects. Each defect category can be associated with a relative scaling factor of low, medium or large on a qualitative basis. Furthermore, the training can be used to specify the identified defect either as a "pass", a "reject", or a "rework" defective class. The system can maintain a library of inspection information such as a surface defect database. In one form, the library can contain hundreds of different surface conditions. The inspection library can be referenced when performing calculated assessments and intelligent reasoning about the condition of observed defects.
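A sketch of how such training records and a small defect library might be organized is given below; the record fields follow the Positive/Negative, low/medium/large, and pass/reject/rework categories described above, while the nearest-neighbor lookup and the feature tuples are assumptions for illustration.

```python
from dataclasses import dataclass

@dataclass
class TrainedDefect:
    """One operator-labeled training example for the defect library."""
    polarity: str        # 'positive' or 'negative' defect type
    scale: str           # qualitative scaling: 'low', 'medium', or 'large'
    disposition: str     # 'pass', 'reject', or 'rework'
    feature_vector: tuple

class DefectLibrary:
    """Surface defect database the inspector trains interactively."""
    def __init__(self):
        self.records = []

    def train(self, example: TrainedDefect):
        self.records.append(example)

    def nearest(self, feature_vector):
        """Return the stored defect most similar to a new observation."""
        dist = lambda r: sum((a - b) ** 2
                             for a, b in zip(r.feature_vector, feature_vector))
        return min(self.records, key=dist)

lib = DefectLibrary()
lib.train(TrainedDefect("positive", "low", "pass", (0.1, 0.2)))
lib.train(TrainedDefect("positive", "large", "reject", (0.9, 0.7)))
print(lib.nearest((0.8, 0.6)).disposition)   # -> 'reject'
```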
In yet another embodiment, an inspection system can support utilities for registering and displaying complete airfoil surface defect maps. The exemplary system can allow an inspector to view a substantially 360 degree surface defect map of inspected airfoil models. The inspection system can register and maintain spatial locations of defects in a traceable quad-tree format. The inspection system can also be capable of displaying historical inspection occurrence maps to allow the inspector to correlate defects with other input factors such as design and manufacturing parameters.
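A minimal point quad-tree of the kind that could back such traceable spatial registration is sketched below; the unrolled (u, v) surface coordinates, node capacity, and payload dictionary are assumptions for illustration.

```python
class QuadTree:
    """Point quad-tree registering defect locations on an unrolled
    (u, v) surface map; a stand-in for the traceable format above."""
    def __init__(self, x0, y0, x1, y1, capacity=4):
        self.bounds = (x0, y0, x1, y1)
        self.capacity = capacity
        self.points = []
        self.children = None

    def insert(self, x, y, payload=None):
        x0, y0, x1, y1 = self.bounds
        if not (x0 <= x < x1 and y0 <= y < y1):
            return False                      # outside this node's quadrant
        if self.children is None:
            if len(self.points) < self.capacity:
                self.points.append((x, y, payload))
                return True
            self._split()
        return any(c.insert(x, y, payload) for c in self.children)

    def _split(self):
        x0, y0, x1, y1 = self.bounds
        mx, my = (x0 + x1) / 2, (y0 + y1) / 2
        self.children = [QuadTree(x0, y0, mx, my), QuadTree(mx, y0, x1, my),
                         QuadTree(x0, my, mx, y1), QuadTree(mx, my, x1, y1)]
        for p in self.points:                 # push stored points down a level
            any(c.insert(*p) for c in self.children)
        self.points = []

tree = QuadTree(0, 0, 360, 100)               # 360-degree map, 100 span units
tree.insert(127.5, 42.0, {"type": "pit", "severity": 0.3})
```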
Figure 20 illustrates a process flow chart of one embodiment of the system software of an inspection process 4900. Inspection process 4900 as shown includes image acquisition 4300 and quality verification 4400 with comparison to a library of images 4800 for inspection. In one embodiment, quality verification 4400 can include modifying the image acquired to provide an image with edge strength, for example. In another example, illumination can be adjusted to produce a specified image quality or the image can be segmented and background images can be removed. Image inspection 4500 can provide an indication whether a component under inspection includes anomalies and irregularities in reference to images from library of images 4800. In condition assessment 4600, analysis of negative and positive imperfection and anomaly detections can be conducted using various techniques including model based, cognitive characterization and fuzzy logic. A report generator 4700 can produce a report regarding the results of the various analysis techniques which can be made available to indicate component quality acceptability.
With reference to Figure 21, one embodiment of an automated object manipulation system 5100 includes a support structure 5120 and two robotic fingers 5130, 5140, each with five degrees of freedom, but other embodiments of the system 5100 can include greater or fewer degrees of freedom. The five degrees of freedom can include, but are not limited to, three degrees of rotational freedom and two degrees of linear freedom. In one embodiment, the degrees of rotational freedom provide object positioning capabilities during an object analysis process while the two degrees of linear freedom aid in object capture and alignment. In one embodiment, an automated object manipulation system can include a single robotic finger. Robotic fingers or end effectors capable of physically grasping an object with direct force can include various forms of mechanical grippers including parallel jaws, claws, grapples, tongs, multiple fingers, and the like.
In Figure 21, robotic fingers 5130, 5140 are shown to have a sliding drive 5131, 5141 for opening and closing a set of parallel jaws 5135, 5145. Robotic fingers 5130, 5140 are further shown having a y-axis rotary drive 5150, 5151 which is capable of providing a degree of rotational freedom about the y-axis. An L-bracket 5161, 5171 is shown linking robotic fingers 5130, 5140 with a z-axis rotary drive 5160, 5170. Z-axis rotary drives 5160, 5170 are capable of providing a degree of rotational freedom about the z-axis. Robotic fingers 5130, 5140 share a circular frame 5181 with an orientation controlled by an x-axis rotary drive 5180 shown mounted below circular frame 5181. X-axis rotary drive 5180 is capable of providing a degree of rotational freedom about the x-axis. Circular frame 5181 is shown housed inside a cavity of support structure 5120. For exemplary purposes, embodiments are described with a right-hand coordinate frame and should not be construed as limiting.
In one embodiment of Figure 21, object manipulation system 5100 is controlled by a processor 5110. Processor 5110 can contain modules for predetermined object manipulation by the fingers and thereby the fingers are capable of positioning the object in various positions to provide automated object presentation during an analysis.
Processor 5110 is represented as a single component containing hardware capable of performing various functions. Each function can be located on a separate piece of hardware and can be one of several hardware varieties available and arranged by one skilled in the art. Processor 5110 can also include one or more microprocessors where in one embodiment a single microprocessor can provide the function of each module or separate microprocessors can be used for one or more of the modules. In a further embodiment, processor 5110 can include a data storage module 5111, an instruction module 5112 and a control module 5113. Computerized control can allow preprogrammed and operator initiated control of object manipulation system 5100.
Control module 5113 can provide object features and position data from a sensor resulting from an object assessment. The feature and position data can be fed to instruction module 5112. Instruction module 5112 can supply preprogrammed manipulation instructions in response to the feature and position data of the object assessment. The preprogrammed manipulation instructions can be retrieved from data storage module 5111. In a further embodiment, the feature and position data from the object assessment can be stored in data storage module 5111.
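The loop among the control, instruction, and data storage modules might be orchestrated as in the following sketch; `sensor`, `storage`, and `control` are hypothetical interfaces standing in for the hardware and modules 5111 through 5113, not APIs from the specification.

```python
def manipulation_cycle(sensor, storage, control):
    """One pass of the control -> instruction -> storage loop sketched above.

    All three arguments are assumed interfaces: `sensor` yields an object
    assessment, `storage` holds preprogrammed moves, `control` drives motion."""
    # Control module: collect object features and position from the sensor.
    features, position = sensor.assess()
    # Instruction module: look up preprogrammed moves for this assessment.
    moves = storage.lookup(features)
    for move in moves:
        control.execute(move, reference=position)
    # Persist the assessment for later repeatable analysis.
    storage.record(features, position)
```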
In another embodiment, a processor can include manipulation instructions which are capable of controlling the position and alignment of an object in relation to a sensor for an object analysis in response to data obtained by a profiling assessment. A profiling assessment of an object can provide identification features to be used to establish a profile and preprogrammed manipulation during the object analysis. A mark or feature can be used to establish a zero reference point. An automated object manipulation system can allow for repeatable analysis on multiple objects utilizing object features and preprogrammed manipulation.
In yet another embodiment, a processor can include programming to continuously interpret data received from profiling assessments and object analysis in a repeatable manner, for instance. Object manipulation by finger assemblies of a manipulation system in response to an analysis program from a processor can repeatedly provide data including the positions of an object and indications of features above and below the surface as well as anomalies. For a further embodiment, once a first finger assembly has completed an assigned task, the object can be transferred to a second finger assembly in order to continue inspection of another end of the object.
One embodiment of the present application can include multiple stepper motors, position encoders, and limit switches to name a few which can be used to controllably position objects with the finger assemblies of a manipulation system. In a specific embodiment, a drive motor and position encoder can be included for each degree of freedom, which in some cases can be five degrees, to allow exposure of the object surfaces to a sensor. In another embodiment, a processor can receive input signals from transducers and position encoders associated with each degree of freedom where the signals can be incorporated as part of the object manipulation programming.
Figure 22 illustrates an exploded view of a manipulation system 5200 of one embodiment of the present application. In a specific embodiment, a finger assembly 5230 can be installed on a circular frame 5281 mechanically coupling finger assembly 5230 and a rotary drive 5280 and then the entire circular frame assembly 5201 can be mounted inside the cavity of a support structure 5220. In other embodiments, the circular frame can be integrally manufactured in the support structure and the finger assembly can be installed on to the circular frame as part of the support structure. Assembly parameters and procedures can be determined based on the size and profile of the objects being analyzed as would be known to one skilled in the art.
Figures 23a, 23b, and 23c illustrate three views of one embodiment of an automated object manipulation system 5300. Figure 23a is representative of a front view. Figure 23b is representative of a side view. Figure 23c is representative of a back view. System 5300 can be adjusted to accommodate objects of varying sizes and profiles.
Figures 24a and 24b illustrate one embodiment of a finger assembly 5400 showing a C-bracket 5436 with an L-bracket linking structure 5461 and an alignment system 5431 including adjustable parallel jaws 5435a, 5435b for holding an object 5401 securely. Parallel jaws 5435a, 5435b are operated up and down by drive system 5431 to accommodate varying sizes and shapes of objects 5401. The L-bracket linking structure 5461 can be slidingly received with a component that allows the finger assembly 5400 to be adjusted toward or away from the opposing finger assembly (not shown). In this way the finger assemblies can be adjusted to alter the gap between the assemblies. The bracket between the rotary drive and the finger assembly is one example of a device that permits the L-bracket linking structure 5461 to be adjusted. In one embodiment the jaw 5435a can be slidingly adjusted relative to the C-bracket 5436 using a screw depicted at the top of the C-bracket, while the jaw 5435b is adjusted relative to the C-bracket using the drive system 5431. In this way the jaw 5435a is relatively static and the jaw 5435b is usually moveable to capture or release the object 5401. Other manners of driving the jaws 5435a and 5435b are contemplated herein.

Figure 25 illustrates movement in one embodiment of the parallel jaws on a finger assembly. Finger assembly 5500 shown here includes a support bracket 5536 and two parallel jaws: a first jaw 5537 and a second jaw 5538. The height of second jaw 5538 can be adjusted by a drive system 5531, a screw, or other such means to accommodate objects 5501 with different dimensions. First jaw 5537 displacement can be controlled by a small-scale linear actuator 5534. In other embodiments, the mechanisms of drive system 5531 and actuator 5534 can be alternated, or the displacement adjustment can be controlled or accomplished using other mechanisms known in the art. To allow repeatability of the grip of parallel jaws 5537, 5538, the jaws slide up and down a grooved slot 5539 on support bracket 5536 housing parallel jaws 5537, 5538. This arrangement can allow a secure alignment of parallel jaws 5537, 5538 with respect to object 5501 and improve analysis repeatability.
The positions of first jaw 5537 and second jaw 5538 are adjusted to an open position allowing placement of an object 5501 between jaws 5537, 5538. Jaws 5537, 5538 are then adjusted to a closed position, thereby holding object 5501 for manipulation and analysis. Parallel jaws 5537, 5538 of finger assembly 5500 can include pads 5533 which can be replaceable and/or constructed of a high density polymer material to facilitate a firm and secure grip on the object. For example, in one specific embodiment for an airfoil component, if a first finger assembly is equipped with padding to secure an airfoil fir tree section, a second finger assembly receives padding conformal to accommodate the airfoil blade shape.
Figure 26 illustrates movement of a finger assembly 5632 about a z-axis. The right-hand coordinate frame assumed for exemplary purpose includes an origin at a center of a circular frame 5681 of an analysis system 5600. The z-axis points upward and aligns with the axis of rotation of a motor drive system 5660. This embodiment can provide substantially full angular rotation around the z-axis and finger assembly 5632 can alter the position relative to another finger assembly 5633. In one embodiment, the z-axis motion can facilitate analysis of relative bottom and top sections of an object as well as but not limited to components with intricate fillets, orifices, and labels (e.g., part number or serial number) engraved or embossed on a relative bottom surface of an object. In Position A of Figure 26, finger assembly 5632 is rotated partially around the z-axis by motor drive system 5660. In Position B, finger assembly 5632 is rotated around the z-axis by motor drive system 5660 approximately 90°, exposing a relative bottom surface 5602 of object 5601. In Position C, finger assembly 5632 is returned to the original position.
Figure 27 illustrates movement of a finger assembly 5732 about the x-axis. To achieve x-axis motion of finger assembly 5732 in an automated object manipulation system 5700, a rotary drive system 5780 is employed. Rotary drive system 5780 can be mounted below a circular frame 5781. As shown, circular frame 5781 is mechanically coupled to finger assembly 5732 and rotary drive 5780. In this particular embodiment, rotary drive system 5780 causes circular frame 5781 to rotate along a track. Circular frame 5781 can be capable of rotating in related clockwise and counterclockwise directions. In some embodiments, circular frame 5781 can be capable of rotating a substantially complete 360° during object manipulation. For a specific embodiment, a circular frame has on the circumferential wall a tooth-belt acting as a rack. The pinion on the rotary drive engages with the outer rack of the circular frame to facilitate the x-axis motion of a finger assembly.
Figure 28 illustrates movement of a finger assembly 5800 about the y-axis. A y-axis motor drive 5850 rotates a support bracket 5836 holding an object 5801 in a set of parallel jaws 5837, 5838. Substantially full 360-degree rotation of an object about the y-axis is achievable, as shown by the sequence of y-axis motions in the series of illustrations in Figure 28. A linking bracket 5861 of finger assembly 5800 can be adjustable. By adjusting the spacing of linking brackets 5861 in one embodiment, finger assembly 5800 can accommodate objects of differing dimensions.
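As a worked example of composing these axis motions, the sketch below builds right-hand rotation matrices about the principal axes and applies a hypothetical presentation move to a point on the gripped object; the angles and coordinates are illustrative, not drawn from the specification.

```python
import numpy as np

def rot(axis, degrees):
    """Right-hand rotation matrix about a principal axis ('x', 'y', or 'z')."""
    c, s = np.cos(np.radians(degrees)), np.sin(np.radians(degrees))
    return {
        "x": np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
        "y": np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
        "z": np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]]),
    }[axis]

# Hypothetical presentation move: 90 deg about z (expose a bottom surface),
# then 45 deg about y, applied to a point given in the object frame.
pose = rot("y", 45) @ rot("z", 90)
point = np.array([10.0, 0.0, 2.0])
print(pose @ point)   # the point's coordinates after the presentation move
```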
In one embodiment, the x-axis can point towards a sensor system (not shown). In other embodiments, the sensor can be positioned to operate along another axis. The sensor system can include various equipment such as illumination and imaging devices. These devices can operate with the generation/detection of electromagnetic radiation, visible light, x-ray, ultraviolet and the like. A sensor can also be based on sound or physical detection. For another embodiment, the sensor can be located a fixed distance from the object manipulator along the symmetric axis of the circular frame. The sensor system can be applied for object assessment and/or analysis of an object presented by an object manipulation system. An automated object manipulation system of the present application can include an automated analysis apparatus capable of improving the accuracy of repeated object manipulation with various components. In one embodiment, an automated object manipulation system can include a mechanical object manipulation support means having five degrees of freedom for supporting, aligning and positioning objects in proximity of an analysis tool. In another embodiment of the present application, an automated object manipulation system can adjust the location of an object to a position and orientation allowing analysis to be performed repeatedly and reliably.
With reference to Figure 29, an object illumination system 6100 of one embodiment of the present application is shown having a set of concentric fluorescent light bulbs 6110, a diffusion shield 6120 positioned radially inwards of concentric fluorescent light bulbs 6110, and a cylindrical illumination harvesting shield 6130 positioned radially outward from concentric fluorescent light bulbs 6110. In this embodiment, an imaging system 6140 is shown located to capture an image of a component 6101 positioned within diffusion shield 6120 by a manipulation system 6160.
Concentric fluorescent light bulbs 6110 can serve as the light source for object illumination system 6100. In one embodiment, concentric fluorescent light bulbs 6110 can include a single light or illumination source. In other embodiments, concentric fluorescent light bulbs 6110 can include partial rings or configurations positioned in concentric circles or related arrangements. Illumination from concentric fluorescent light bulbs 6110 can include fluorescent, incandescent, LED or other illumination as known in the art. Thus, as used herein, fluorescent light bulbs/sources used in any given embodiment can be replaced individually and/or collectively in other embodiments with other light bulbs/sources such as incandescent, LED, etc. Configurations and shapes other than rings or circles can, in some embodiments, be used for concentric fluorescent light bulbs 6110 to provide illumination to the portions of a component presented to the imaging system. Illumination arrangements can be determined based on component parameters. Other forms of radiance can be applied where diffused light can illuminate an object and produce an image having a relatively balanced intensity of radiance. In further embodiments, multiple light sources within a single system can provide a variation of illumination arrangements which can be specified based on component parameters such as material, shape, and size to name a few. Further, longitudinal adjustment of the distance between the light source and the position of the component can allow control of the extent of illumination intensity provided to the component.
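As a rough worked example of that longitudinal distance adjustment, the sketch below uses the idealized inverse-square falloff of a point source; a real ring or diffused source falls off more slowly, so the numbers are only indicative.

```python
def relative_intensity(distance, reference_distance=1.0):
    """Inverse-square estimate of illumination falloff for an idealized
    point source; actual ring/diffuse sources deviate from this model."""
    return (reference_distance / distance) ** 2

# Moving the bulbs from 1.0 to 1.5 distance units cuts intensity to ~44%.
print(round(relative_intensity(1.5), 2))   # -> 0.44
```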
In another embodiment, diffusion shield 6120 can have a truncated-type cone shape. Concentric fluorescent light bulbs 6110 can be positioned radially outwards of a small radius base 6121 of cone-shaped diffusion shield 6120. The truncated cone-shape provides an opening for placing the detection portion of imaging system 6140 at small radius base 6121 of diffusion shield 6120. In yet other embodiments, a background 6150 is positioned at a large radius base 6122 of cone-shaped diffusion shield 6120. In one form the background 6150 can be green.
Diffusion shield 6120 can be composed of a material which allows penetration of bright illumination from a light source external to diffusion shield 6120 to diffuse to the inside of a conic tunnel 6170 of diffusion shield 6120 where component 6101 can be positioned for imaging purposes. In one form, the diffusion shield can be a plastic polymer material. Diffusion shield 6120 is structured to diffuse light from concentric fluorescent light bulbs 6110 as the light passes through diffusion shield 6120 to illuminate the component.
A cylindrical illumination harvesting shield 6130 is structured to redirect at least a portion of the light from concentric fluorescent light bulbs 6110 to diffusion shield 6120. Cylindrical illumination harvesting shield 6130 is shown in this embodiment with a cylindrical shape. The shape of a harvesting shield can be structured to accommodate various selections of illumination arrangements, diffusion shields, and components, for example. A harvesting shield can be permanently placed in the system or can be removable to provide flexibility in an illumination system. The harvesting shield can be composed of various materials. In some embodiments, the material can be selected to provide a degree of redirection or reflectivity. Illumination harvesting shield 6130 can be used to intensify and redirect ambient light through diffusion shield 6120 to illuminate component 6101.
In embodiments with imaging system 6140 positioned in relation to diffusion shield 6120 so as to produce an image of component 6101, imaging system 6140 can further include a camera utilizing a conventional light or other electromagnetic radiation type such as x-ray, ultraviolet, fluorescent and the like. In various embodiments, images taken of an object by imaging system 6140 can be utilized for conducting an analysis of the component. The analysis can include component identification, defect determination, quality assessments and the like, some or all of which are associated with an inspection regime. For example, the embodiments disclosed herein can be used with an inspection regime that involves evaluating gas turbine engines and associated components, as will be appreciated from other aspects of the instant application. Such components can include vanes, blades, etc. of the gas turbine engine.
An illumination system of one embodiment is structured to place an object at one end of an illumination tunnel created by a generally cone-shaped diffusion shield. An illumination source produces illumination which is diffused as it passes through the diffusion shield. The diffused light then illuminates the object. In a further embodiment, the object being illuminated can be positioned near a large radius base of the diffusion shield with a manipulation system. The manipulation system can include a robotic system, such as a multi-axis robotic system. Once the object is positioned, an imaging system such as a camera is capable of capturing perspective images of the object as the object is manipulated by the multi-axis robotic system. In one form, the imaging system can be attached at a small radius base of the cone-shaped diffusion shield. The imaging system is capable of capturing images of the object for further analysis.
In further embodiments, a multi-axis robotic system is capable of performing a series of controlled motions to expose different sections of a component to an imaging system producing at least one image of the component within a diffusion system. The diffusion system can include a diffusion shield and a harvesting shield. The diffusion shield can be structured to diffuse illumination from an illumination arrangement as the illumination passes through the diffusion shield. The harvesting shield can be structured to redirect at least a portion of the illumination from the illumination arrangement toward the diffusion shield.
In yet another embodiment, by longitudinal adjustment of the distance of a set of fluorescent light bulbs to the location where a component is positioned by a manipulation system, the extent of illumination intensity reaching the component can be controlled. Inside an illumination tunnel of a diffusion system, the component can be exposed to diffused lighting without any direct projection of a light source on the component, and hence a well-balanced intensity of illumination can be achieved. With diffused illumination, an imaging system can be capable of acquiring images of a light reflecting component without irregularities from reflections.
In another embodiment, an external illumination harvesting shield can be introduced to change the intensity and uniformity of supplied ambient illumination to an object. Figure 30 is a cross-sectional schematic diagram illustrating the mechanics of an illumination harvesting shield 6230 of one embodiment of the present application.
Illumination harvesting shield 6230 allows illumination produced from a fluorescent light source 6210 to be redirected toward a cone-shaped diffusion shield 6220 where an object 6201 within the cone-shaped diffusion shield 6220 can be exposed to an imaging system 6240. In this embodiment, fluorescent light source 6210 is shown as a first ring fluorescent light bulb 6211 and a second ring fluorescent light bulb 6212. Fluorescent light source 6210 can have other shapes and configurations designed to meet the requirements of an inspection system including the object being inspected. Illumination harvesting shield 6230 can be used to intensify and unify the ambient radiance from fluorescent light bulbs 6211, 6212 which can diffuse through cone-shaped diffusion shield 6220 to illuminate object 6201 positioned within the conic tunnel of cone-shaped diffusion shield 6220.
Figure 31 demonstrates an illumination system 6300 having a cone-shaped diffusion shield 6320 with a small radius base 6321 and a large radius base 6322.
Illumination system 6300 is also shown with a manipulation system 6360 to position a component (not shown) within diffusion shield 6320 to expose the component to an imaging system 6340. Fluorescent lighting bulbs are present but not visible in Figure 31 on the opposing side of an illumination support 6380. An illumination harvesting shield can be used in the embodiment of Figure 31 even though such a shield is not currently depicted.
An embodiment with a manipulation system 6360 can include a robotic part manipulator and positioning algorithms to provide predetermined part presentation and positioning during an inspection process. Manipulation system 6360 is capable of presenting a component in at least one position for imaging system 6340. Manipulation system 6360 can include various forms of component positioning equipment. In one embodiment, the manipulation system provides a single stationary position for the component.
Figure 32 illustrates one embodiment of an illumination process 6400 utilizing an illumination system. Operation 6401 is shown initiating illumination process 6400 and includes providing a source of illumination or light 6410. In one embodiment, the source of light 6410 can be from a set of concentric fluorescent light bulbs. Illumination process 6400 continues with operation 6402 which creates a diffused light. Operation 6402 utilizes a diffusion system 6425. Diffusion system 6425 can include a diffusion shield 6420 and a harvesting shield 6430. Diffusion shield 6420 can be structured to diffuse illumination from the source of illumination as the illumination passes through the diffusion shield. Harvesting shield 6430 can be structured to redirect at least a portion of the illumination from the source of illumination toward the diffusion shield.
The diffused light produced in operation 6402 is provided in order to illuminate an object in operation 6403. An imaging system 6440 is then able to create an image of the object illuminated by the diffused light from operation 6403 in operation 6405.
Imaging system 6440 can include a camera and image analysis. In an alternative, operation 6404 can vary the position of the illuminated object during operations 6403 and 6405. Operation 6404 can utilize an automated manipulation system 6460.
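Tying the operations together, process 6400 might be orchestrated as in the sketch below; all five arguments are hypothetical interfaces standing in for elements 6410 through 6460, not APIs defined by the specification.

```python
def illumination_process(light, diffuser, manipulator, camera, poses):
    """Sketch of process 6400: light -> diffuse -> illuminate -> image,
    re-posing the object between exposures. All objects are assumed
    interfaces for the source, diffusion system, manipulator, and camera."""
    light.on()                    # operation 6401: provide the light source
    diffuser.engage()             # operation 6402: create the diffused light
    images = []
    for pose in poses:            # operation 6404: vary the object position
        manipulator.move_to(pose)
        images.append(camera.capture())   # operation 6405: image the object
    light.off()
    return images
```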
Any of the embodiments disclosed in one or more figures above can be used in the others, as will be appreciated by those in the art. For example, any of the positioning systems disclosed above can be used in any of the various other embodiments, just as can any of the imaging systems, databases, controllers, protocol development tools, fuzzy logic analysis, report generators, etc.
While the invention has been illustrated and described in detail in the drawings and foregoing description, the same is to be considered as illustrative and not restrictive in character, it being understood that only the preferred embodiments have been shown and described and that all changes and modifications that come within the spirit of the inventions are desired to be protected. It should be understood that while the use of words such as preferable, preferably, preferred or more preferred utilized in the description above indicate that the feature so described may be more desirable, it nonetheless may not be necessary and embodiments lacking the same may be contemplated as within the scope of the invention, the scope being defined by the claims that follow. In reading the claims, it is intended that when words such as "a," "an," "at least one," or "at least one portion" are used there is no intention to limit the claim to only one item unless specifically stated to the contrary in the claim. When the language "at least a portion" and/or "a portion" is used the item can include a portion and/or the entire item unless specifically stated to the contrary.

CLAIMS

WHAT IS CLAIMED IS:
1. A method comprising:
acquiring a surface image from a surface of a component;
providing an image registration for the surface image;
inspecting the component in response to the image registration to produce an input data set;
creating an output data set in response to the input data set utilizing a fuzzy logic algorithm; and
identifying a surface feature in response to the surface image and the output data set.
2. The method of claim 1, wherein acquiring the surface image further includes generating a radiation media;
directing the radiation media at the component;
detecting a responding radiation media in response to the directed radiation media and the component; and
creating the surface image in response to detecting the responding radiation media.
3. The method of claim 2, wherein acquiring the surface image further includes adjusting the radiation media in response to the surface image and a standard image.
4. The method of claim 1, wherein providing the image registration further includes accessing a deposit of component images wherein the deposit of component images is retrievable by a set of generalized features.
5. The method of claim 4, wherein providing the image registration further includes determining a failure response to a non-conformity indicated by assessing the surface image.
6. The method of claim 1, wherein inspecting the component further includes operating a part manipulator structured to position the component in response to the image registration and a positioning algorithm.
7. The method of claim 1, wherein inspecting the component further includes providing a radiation media configuration of the component to a detection device in response to the image registration and a positioning algorithm.
8. The method of claim 1, wherein inspecting the component further includes retrieving a set of inspection requirements from a deposit of component images.
9. The method of claim 1, wherein creating the output data set further includes conducting a fuzzy logic analysis and a learning process utilizing a surface component library.
10. The method of claim 1, further including generating a surface feature report.
11. A method comprising:
acquiring a surface image from a surface of a component utilizing an image acquisition process;
determining a degree of conformity in response to a set of generalized features;
determining a compliance status in response to the degree of conformity;
providing an image registration for the surface image in response to the compliance status;
inspecting the surface of the component utilizing a positioning system selected in response to the image registration to produce an input data set;
creating an output data set in response to the input data set utilizing a fuzzy logic algorithm; and
identifying a surface feature in response to the surface image and the output data set.
12. The method of claim 11, wherein the image acquisition process further includes generating a radiation media;
directing the radiation media at the component utilizing uniform diffused light;
detecting a responding radiation media in response to the directed radiation media and the component; and
creating the surface image in response to detecting the responding radiation media.
13. The method of claim 12, wherein the image acquisition process further includes adjusting the radiation media in response to the surface image and a standard image.
14. The method of claim 11, wherein the positioning system further includes a surface positioner structured to position the component in response to the image registration and a positioning algorithm.
15. An apparatus comprising:
a positioning system including a component manipulator structured to position a component in response to a positioning algorithm;
an image acquiring system structured to generate a component image including a radiation media director and a radiation media detector;
an image data processing system utilizing a fuzzy logic algorithm capable of:
applying the component image as a set of input variables;
assigning a degree of conformity to the set of input variables;
determining an analysis data set in response to the set of input variables and the degree of conformity; and
converting the analysis data set to a set of solutions; and
a microprocessor structured to provide at least one surface indication in response to the set of solutions and a cognitive characterization process utilizing a deposit of component images.
16. The apparatus of claim 15, wherein providing the at least one surface indication further includes characterization of at least one surface anomaly.
17. A system comprising:
a component manipulator structured to position a component in response to a positioning algorithm;
a surface scanner capable of
producing an electromagnetic radiation directed at a surface of the component,
detecting the electromagnetic radiation from the surface of the component, and
producing a reflectivity signal in response to the detecting; and
a processor structured to:
apply a fuzzy logic algorithm to the reflectivity signal and determine a solution set in response to the reflectivity signal; and
produce a grain structure characterization in response to the solution set.
18. The system of claim 17, wherein the grain structure characterization further includes a pass/fail signal, and wherein the electromagnetic radiation is a visible or near-visible light.
19. The system of claim 17, wherein the grain structure characterization further includes a characterization report.
20. The system of claim 17, wherein the grain structure characterization further includes a grain structure selected from a group consisting of: grain structure, grain defects, grain locations, grain size, grain defect density and combinations thereof.
21. A method comprising:
applying a positioning algorithm to manipulate at least one piece of positioning equipment in response to a component;
analyzing a surface of the component with a light source and a reflectivity detection unit;
producing a reflectivity data set in response to analyzing the surface of the component;
applying a fuzzy logic analysis to the reflectivity data set wherein the fuzzy logic analysis is capable of producing a solution data set; and
providing a grain structure characterization in response to the solution data set.
22. The method of claim 21, wherein the fuzzy logic analysis includes an input collecting module, a processing module, a compiling module, and an output producing module.
23. The method of claim 22, wherein the input collecting module includes collecting the reflectivity data set.
24. The method of claim 22, wherein the output producing module includes producing the solution data set.
25. The method of claim 21, wherein the at least one piece of positioning equipment further includes a robotic manipulator capable of positioning the component in response to the component and the positioning algorithm.
26. The method of claim 21, wherein the grain structure characterization further includes indicating a grain structure selected from a group consisting of grain structure, grain defects, grain locations, grain size, grain defect density and combinations thereof.
27. The method of claim 26, wherein the grain structure characterization further includes a pass/fail signal.
28. The method of claim 26, wherein the grain structure characterization further includes a characterization report.
29. An apparatus comprising:
a position module structured to position a component;
a reflection module structured to provide
a light source directed to a surface of the component,
a detection unit to detect a quantity of reflected light from the surface of the component, and
a reflectivity data set in response to the quantity of reflected light;
a grain structure characterization module utilizing a fuzzy logic algorithm capable of:
applying the reflectivity data set as a set of input variables;
assigning a degree of intensity to the set of input variables;
determining an output data set; and
converting the output data set to a set of solutions; and
an indication module structured to provide at least one indication of a grain structure in response to the set of solutions.
30. The apparatus of claim 29, wherein providing the at least one indication further includes providing a characterization of at least one grain defect.
31. The apparatus of claim 29, wherein the at least one indication further includes a grain structure selected from a group consisting of: grain structure, grain defects, grain locations, grain size, grain defect density and combinations thereof.
32. The apparatus of claim 29, wherein the at least one indication further includes a grain structure characterization in response to the set of solutions and a cognitive characterization process utilizing an airfoil defect knowledge bank.
33. The apparatus of claim 29, wherein the position module is further structured to include a component manipulator wherein the component manipulator positions the component in response to a positioning algorithm.
34. The apparatus of claim 33, wherein the component manipulator utilizes an imaging system to identify a component type of the component and determines the positioning algorithm in response to the component type.
35. The apparatus of claim 33, wherein the positioning algorithm further includes identifying the component in response to the reflectivity data set and a component orientation database.
36. The apparatus of claim 35, wherein the component further includes a single crystal cast airfoil-type component.
37. An apparatus comprising:
a positioning system;
a surface indicator system structured to collect an indication data set from a surface of a component;
a surface indicator composition applied to the component and configured to emit an indication that can be sensed by the surface indicator system;
an indication data processing system structured to create an output data set in response to the indication data set utilizing a fuzzy logic algorithm; and
a microprocessor structured to provide at least one surface variance in response to the indication data set and the output data set.
38. The apparatus of claim 37, wherein the positioning system further includes a manipulator structured to position the component in response to a positioning algorithm, and wherein the surface indicator composition can emit an electromagnetic signal.
39. The apparatus of claim 38, wherein the positioning algorithm further includes identifying the component with a component position database in response to the indication data set and a recognition source data set.
40. The apparatus of claim 37, wherein the surface indicator composition is a Fluorescent Penetrant Indicator (FPI).
41. The apparatus of claim 40, wherein the surface indicator system further includes an indicator application system capable of applying the FPI to the surface of the component.
42. The apparatus of claim 37, wherein the at least one surface variance includes a variance selected from a group consisting of inhomogeneities, microstructural discontinuities, inclusions, micro and macro porosity and combinations thereof.
43. The apparatus of claim 42, wherein the at least one surface variance further includes a pass/fail signal.
44. The apparatus of claim 42, wherein the at least one surface variance further includes a variance report.
45. A method comprising:
applying a surface indicator material to a component;
utilizing a positioning algorithm to manipulate at least one piece of positioning equipment in response to the component;
directing an indication source to a surface of the component having the surface indicator material;
collecting an indication data set in response to directing the indication source to the surface of the component;
applying a fuzzy logic analysis in response to the indication data set, the fuzzy logic analysis capable of providing an output data set; and
providing at least one surface variance in response to the output data set.
46. The method of claim 45, wherein the surface indicator material is configured for use in a fluorescent penetration process.
47. The method of claim 45, wherein the fuzzy logic analysis includes an input collecting module, a processing module, a compiling module, and an output collecting module.
48. The method of claim 47, wherein the input collecting module further includes collecting the indication data set.
49. The method of claim 47, wherein the output collecting module further includes collecting the output data set.
50. The method of claim 45, further including performing a part recognition method with a component position database in response to at least one of the indication data set, a light source data set, and a recognition source data set.
51. The method of claim 45, wherein the at least one piece of positioning equipment further includes a robotic manipulator structured to position the component in response to the component and the positioning algorithm.
52. The method of claim 45, wherein the at least one surface variance includes a variance selected from a group consisting of inhomogeneities, microstructural discontinuities, inclusions, micro-porosity, and combinations thereof.
53. The method of claim 52, wherein the at least one surface variance further includes a pass/fail signal.
54. The method of claim 52, wherein the at least one surface variance further includes a variance report.
55. An apparatus comprising:
a positioning system having a manipulator device structured to position a component in response to a positioning algorithm;
an indication system structured to collect an indication data set utilizing a fluorescent penetration process;
an indication data processing system utilizing a fuzzy logic algorithm capable of:
applying the indication data set as a set of input variables;
assigning a degree of intensity to the set of input variables;
determining an output data set; and
converting the output data set to a set of solutions; and
a microprocessor structured to provide at least one surface variance in response to the set of solutions.
56. The apparatus of claim 55, wherein providing the at least one surface variance further includes characterization of at least one surface anomaly.
57. A system comprising:
an illumination source capable of providing an electromagnetic illumination of an engine component;
an imaging system structured to capture illumination of the engine component;
a component manipulation system structured to position the engine component in a variety of orientations relative to one of the illumination source and the imaging system;
a computer based user interface capable of identifying an inspection protocol defined to acquire an image of the engine component at the variety of orientations using the illumination source, the imaging system, and the component manipulation system; and
a processor configured to process the inspection protocol for the purposes of analyzing the acquired image in response to the inspection protocol.
58. The system of claim 57, wherein the user interface is configured to provide a plurality of inspection protocol attributes capable of being selected by an operator.
59. The system of claim 58, wherein the plurality of inspection protocol attributes includes one of image settings, lighting settings, and component features.
60. The system of claim 58, wherein the plurality of inspection protocol attributes includes a masking feature to be applied to the acquired image.
61. The system of claim 57, wherein the user interface permits a selection of a predefined inspection protocol.
62. The system of claim 58, wherein the user interface is capable of building an inspection protocol that can be saved and re-used at a later inspection activity.
63. The system of claim 62, which further includes an image library that can be referenced as a result of an inspection protocol, the image library providing a reference for determination of component inspection.
64. The system of claim 63, wherein the processor is capable of performing fuzzy analysis instructions to assess a defect in the engine component.
65. The system of claim 64, wherein the inspection protocol is further capable of being synchronized to facilitate automated control of at least one of the illumination source, the imaging system, and the component manipulation system.
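(Illustrative note, not part of the claims.) Claims 57 through 65 treat the inspection protocol as a selectable, saveable bundle of attributes: image settings, lighting settings, component features, and masks. Below is a minimal sketch of such a bundle with hypothetical attribute names chosen for illustration only.

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class InspectionProtocol:
    # Hypothetical attribute names; the claims require only that image,
    # lighting, and component-feature settings be operator-selectable.
    image_settings: dict = field(default_factory=lambda: {"contrast": 1.0,
                                                          "brightness": 0.0})
    lighting_settings: dict = field(default_factory=lambda: {"uv_intensity": 0.8})
    mask_regions: list = field(default_factory=list)  # (x, y, w, h) rectangles

    def save(self, path: str) -> None:
        # Persisting the protocol lets it be re-used at a later inspection
        # activity, as claim 62 contemplates.
        with open(path, "w") as f:
            json.dump(asdict(self), f)

    @classmethod
    def load(cls, path: str) -> "InspectionProtocol":
        with open(path) as f:
            return cls(**json.load(f))

protocol = InspectionProtocol()
protocol.mask_regions.append((10, 10, 64, 64))  # mask applied to acquired images
protocol.save("protocol.json")
print(InspectionProtocol.load("protocol.json"))
```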
66. A method comprising:
developing an inspection protocol of an article of inspection with a graphical user interface, the developing including:
setting an image parameter including one of contrast, brightness, and noise;
selecting a protocol option including one of removing a background, labeling a flaw, detecting an edge, and applying a mask;
identifying a location of the article of inspection to be evaluated;
as a result of the developing, launching the inspection protocol; and
synchronizing a component illumination, image acquisition, and a component manipulation in operative communication with a processor that receives information from the developing.
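(Illustrative note, not part of the claims.) The synchronizing step of claim 66 coordinates three subsystems: illumination, image acquisition, and component manipulation. A toy Python sketch of such a loop follows; the Light, Stage, and Camera classes are stand-ins invented for this example, not drivers from the disclosure.

```python
class Light:
    def configure(self, settings: dict) -> None:
        print("lighting:", settings)           # component illumination

class Stage:
    def move_to(self, pose: str) -> None:
        print("manipulator ->", pose)          # component manipulation

class Camera:
    def grab_frame(self, settings: dict) -> str:
        return f"frame({settings})"            # image acquisition

def run_protocol(image_settings: dict, lighting_settings: dict,
                 poses: list[str]) -> list[str]:
    light, stage, camera = Light(), Stage(), Camera()
    light.configure(lighting_settings)
    frames = []
    for pose in poses:  # the locations identified during the developing
        stage.move_to(pose)
        frames.append(camera.grab_frame(image_settings))
    return frames

print(run_protocol({"contrast": 1.2, "brightness": 0.1}, {"uv": True},
                   ["leading edge", "trailing edge", "tip"]))
```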
67. The method of claim 66, wherein the developing is the result of loading an inspection protocol from a predefined protocol.
68. The method of claim 66, wherein the developing includes repeating the developing after the synchronizing prior to storing the inspection protocol.
69. The method of claim 66, which further includes evaluating data produced from an inspection after the synchronizing.
70. The method of claim 69, wherein the evaluating includes conducting a fuzzy logic analysis to produce an output.
71. The method of claim 66, wherein the developing further includes applying a mask to an image provided by the image acquisition.
72. The method of claim 66, which further includes generating a report that describes the result of the inspection.
73. The method of claim 72, which further includes determining at least one of a pass grade, a reject grade, and a repair grade prior to the generating.
74. The method of claim 66, which further includes interfacing with a historical library and comparing an inspection conclusion of pass/fail as a result of the synchronizing with the historical library.
75. The method of claim 66, wherein the component manipulation includes placing the article of inspection in a variety of positions in which an image is acquired of the article of inspection.
76. An apparatus comprising:
a support base;
a finger assembly mechanically coupled to the support base;
a first drive unit operable to rotate the finger assembly about a first axis;
a second drive unit operable to rotate the finger assembly about a second axis; and
a third drive unit operable to rotate the finger assembly about a third axis.
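(Illustrative note, not part of the claims.) With three drive units each rotating the finger assembly about its own axis, the assembly's net orientation is a composition of three rotations. The sketch below shows that composition with NumPy; the z-y-x ordering and the axis assignments are assumptions made for the example, since the claim fixes neither.

```python
import numpy as np

def rot(axis: str, degrees: float) -> np.ndarray:
    # Rotation matrix about a principal axis.
    c, s = np.cos(np.radians(degrees)), np.sin(np.radians(degrees))
    return {"x": np.array([[1, 0, 0], [0, c, -s], [0, s, c]]),
            "y": np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]]),
            "z": np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])}[axis]

# Net orientation after the three drive units act in turn.
pose = rot("z", 30.0) @ rot("y", 45.0) @ rot("x", 10.0)
print(pose @ np.array([1.0, 0.0, 0.0]))  # where a gripped point is carried
```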
77. The apparatus of claim 76, further including a control system capable of positioning each of the first drive unit, second drive unit, and third drive unit.
78. The apparatus of claim 76, wherein the finger assembly further includes a support bracket having a set of parallel jaws and wherein the support bracket is mechanically coupled to the first drive unit.
79. The apparatus of claim 78, wherein the set of parallel jaws include a first jaw with a first padded portion and a second jaw with a second padded portion.
80. The apparatus of claim 79, wherein the finger assembly further includes an alignment mechanism structured to adjust the set of parallel jaws.
81. The apparatus of claim 80, wherein the alignment mechanism further includes a screw adjustment for the first jaw and a linear actuator for the second jaw.
82. The apparatus of claim 76, wherein the finger assembly further includes a linking bracket mechanically coupled to the second drive unit.
83. The apparatus of claim 76, wherein the support base further includes a circular frame mechanically coupled to the finger assembly and the third drive unit.
84. The apparatus of claim 76, further including a supplemental finger assembly with a supplemental first drive unit and a supplemental second drive unit.
85. The apparatus of claim 84, wherein the supplemental finger assembly further includes a supplemental support bracket having a set of supplemental parallel jaws and wherein the supplemental support bracket is mechanically coupled to the supplemental first drive unit.
86. The apparatus of claim 84, wherein the supplemental finger assembly further includes a supplemental linking bracket mechanically coupled to the supplemental second drive unit.
87. The apparatus of claim 84, wherein the support base further includes a circular frame mechanically coupled to the finger assembly, the supplemental finger assembly and the third drive unit.
88. An apparatus comprising:
an object manipulation system having a carrier that supports opposing first and second finger assemblies, each of the first and second finger assemblies capable of being rotated in unison in the carrier about a carrier axis, the first finger assembly rotatable relative to the carrier about a first finger first axis and a first finger second axis, the first finger assembly also having a first finger mechanism to adjust a gap between a plurality of first fingers, the second finger assembly rotatable relative to the carrier about a second finger first axis and a second finger second axis, the second finger assembly also having a second finger mechanism to adjust a gap between a plurality of second fingers.
89. The apparatus of claim 88, wherein a spacing between the opposing first and second finger assemblies is configured to be adjusted through a support bracket.
90. The apparatus of claim 89, which further includes a first finger assembly drive unit structured to rotate the first finger assembly about the first finger first axis.
91. The apparatus of claim 88, wherein the first finger mechanism includes an actuator capable of altering the gap between the plurality of first fingers.
92. The apparatus of claim 91, wherein the first finger mechanism includes an adjustment member capable of adjusting one of the plurality of first fingers, the actuator capable of adjusting another of the plurality of first fingers.
93. The apparatus of claim 88, wherein the first finger assembly is a mirror image of the second finger assembly.
94. The apparatus of claim 88, wherein the carrier is rotatingly received in a static support structure.
95. A method comprising:
grasping a gas turbine engine component with a plurality of fingers of a first manipulator;
handing off the gas turbine engine component to a second manipulator structured to rotate in a carriage that also includes the first manipulator; and
rotating the gas turbine engine component about a first axis and a second axis fixed in the carriage prior to the handing off such that the first axis and second axis rotate with the carriage.
96. An apparatus comprising:
a set of concentric light sources capable of providing a quantity of light; and
a diffusion shield structured to diffuse the quantity of light and produce a diffused illumination, said diffused illumination being provided to a component positioned in an interior portion of the diffusion shield.
97. The apparatus of claim 96, further including a support structure, wherein the diffusion shield is mechanically coupled with the support structure at a first base portion of the diffusion shield, and wherein the set of concentric light sources is mechanically coupled with the support structure and positioned radially outward from the first base portion of the diffusion shield.
98. The apparatus of claim 97, further including a harvesting shield, wherein the harvesting shield is mechanically coupled with the support structure and positioned radially outward from the set of concentric light sources; and wherein the harvesting shield redirects at least a portion of the quantity of light from the set of concentric light sources toward the diffusion shield.
99. The apparatus of claim 97, further including an imaging system positioned at the first base portion of the diffusion shield and capable of producing at least one image of the component.
100. The apparatus of claim 99, further including a manipulation system capable of presenting the component to the imaging system within the diffusion shield, and wherein the manipulation system is positioned proximate a second base portion of the diffusion shield.
101. An apparatus comprising:
a fluorescent illumination arrangement; and
a system of shields wherein an object is substantially uniformly illuminated by the fluorescent illumination arrangement and the system of shields.
102. The apparatus of claim 101, further including a quantity of illumination from the fluorescent illumination arrangement being diffused by the system of shields to produce a diffused quantity of illumination, wherein the object is substantially uniformly illuminated by the diffused quantity of illumination.
103. The apparatus of claim 101, wherein the system of shields further includes a diffusion shield and a harvesting shield.
104. The apparatus of claim 103, wherein the diffusion shield is structured to diffuse a quantity of illumination from the fluorescent illumination arrangement as the quantity of illumination passes through the diffusion shield.
105. The apparatus of claim 103, wherein the harvesting shield is structured to redirect at least a portion of a quantity of illumination from the fluorescent illumination arrangement toward the diffusion shield.
106. The apparatus of claim 101, wherein the fluorescent illumination arrangement further includes a set of concentric illumination sources.
107. The apparatus of claim 103, further including an imaging system positioned at a first base portion of the diffusion shield and capable of producing at least one image of the object.
108. The apparatus of claim 107, further including a manipulation system capable of presenting the object to the imaging system within the diffusion shield, and wherein the manipulation system is positioned proximate a second base portion of the diffusion shield.
109. A method comprising:
providing a quantity of light from a light source;
creating a quantity of diffused light by diffusing at least a portion of the quantity of light by passing the at least a portion of the quantity of light through a diffusion shield;
providing the quantity of diffused light without direct projection of light from the light source on a component; and
observing the component with an imaging system.
110. The method of claim 109, further including providing the imaging system at a first end of the diffusion shield.
111. The method of claim 109, further including providing a manipulation system capable of positioning the component at a second end of the diffusion shield.
112. The method of claim 109, further including harvesting at least a portion of the quantity of light from the light source with a harvesting shield wherein the harvesting shield is structured to redirect the at least a portion of the quantity of light from the light source toward the diffusion shield.
113. The method of claim 109, wherein the quantity of diffused light is substantially continuous.
PCT/US2012/028636 2011-03-09 2012-03-09 Intelligent airfoil component surface imaging inspection WO2012122542A2 (en)

Applications Claiming Priority (12)

Application Number Priority Date Filing Date Title
US201161451035P 2011-03-09 2011-03-09
US201161451038P 2011-03-09 2011-03-09
US201161451005P 2011-03-09 2011-03-09
US201161450963P 2011-03-09 2011-03-09
US201161451036P 2011-03-09 2011-03-09
US201161450973P 2011-03-09 2011-03-09
US61/451,005 2011-03-09
US61/451,035 2011-03-09
US61/451,036 2011-03-09
US61/450,963 2011-03-09
US61/450,973 2011-03-09
US61/451,038 2011-03-09

Publications (2)

Publication Number Publication Date
WO2012122542A2 true WO2012122542A2 (en) 2012-09-13
WO2012122542A3 WO2012122542A3 (en) 2014-05-01

Family

ID=46798851

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2012/028636 WO2012122542A2 (en) 2011-03-09 2012-03-09 Intelligent airfoil component surface imaging inspection

Country Status (1)

Country Link
WO (1) WO2012122542A2 (en)

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US4968892A (en) * 1986-12-24 1990-11-06 General Electric Company Fluorescent penetrant inspection sensor
US5544256A (en) * 1993-10-22 1996-08-06 International Business Machines Corporation Automated defect classification system
US6091846A (en) * 1996-05-31 2000-07-18 Texas Instruments Incorporated Method and system for anomaly detection
US7397550B2 (en) * 1998-07-08 2008-07-08 Charles A. Lemaire Parts manipulation and inspection system and method
US20050220335A1 (en) * 2004-03-30 2005-10-06 Budd Gerald W Surface inspection technology for the detection of porosity and surface imperfections on machined metal surfaces

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2891879A1 (en) * 2014-01-03 2015-07-08 Bell Helicopter Textron Inc. Automated nital etch inspection system
EP2891878A1 (en) * 2014-01-03 2015-07-08 Bell Helicopter Textron Inc. Automated magnetic particle and fluorescent penetrant defect detection system
US9501820B2 (en) 2014-01-03 2016-11-22 Bell Helicopter Textron Inc. Automated nital etch inspection system
US9921132B2 (en) 2014-01-03 2018-03-20 Bell Helicopter Textron Inc. Automated magnetic particle and fluorescent penetrant defect detection system
EP3462165A1 (en) * 2017-09-27 2019-04-03 United Technologies Corporation System and method for automated fluorescent penetrant inspection
US11867774B2 (en) 2018-04-17 2024-01-09 Illinois Tool Works Inc. Systems and methods to use customized quality control tasks for non-destructive testing
FR3082941A1 (en) * 2018-06-20 2019-12-27 Safran Aircraft Engines Mexico WETTING DETECTION TOOL FOR AERONAUTICAL MECHANICAL PART, ASSEMBLY COMPRISING SUCH TOOLS AND METHOD OF DETECTION BY WETTING WITH SUCH TOOLS
WO2020214249A1 (en) * 2019-04-15 2020-10-22 Illinois Tool Works Inc. System for visual scanning articles during non-destructive (ndt) inspections
US11022562B2 (en) 2019-04-15 2021-06-01 Illinois Tool Works Inc. Methods and systems for vision system assisted inspections
CN114041051A (en) * 2019-04-15 2022-02-11 伊利诺斯工具制品有限公司 System for visually scanning an article during non-destructive (NDT) inspection
JP2022529924A (en) * 2019-04-15 2022-06-27 イリノイ トゥール ワークス インコーポレイティド A system that visually scans articles during non-destructive (NDT) inspection
EP3992738A1 (en) * 2020-10-29 2022-05-04 Oriental Bluesky Titanium Technology Co. Ltd Fluorescent inspection visual data processing system and processing method thereof based on mes

Also Published As

Publication number Publication date
WO2012122542A3 (en) 2014-05-01

Similar Documents

Publication Publication Date Title
CN111272763B (en) System and method for workpiece inspection
WO2012122542A2 (en) Intelligent airfoil component surface imaging inspection
CA2829575C (en) Protocol-based inspection system
US8050486B2 (en) System and method for identifying a feature of a workpiece
US7689003B2 (en) Combined 2D and 3D nondestructive examination
CA2829589C (en) Intelligent airfoil component grain defect inspection
CN113748311B (en) Training method of automatic detection system for blade defects of turbine engine
CN113030108A (en) Coating defect detection system and method based on machine vision
KR20010024617A (en) Automatic lens inspection system
JP2001326263A (en) Method of assessing structural defect on wafer surface
JP2019032268A (en) Surface defect determination method and surface defect inspection device
CN103218805B (en) Handle the method and system for the image examined for object
US20170030842A1 (en) Device and method for optically inspecting and analyzing stent-like objects
EP3662272B1 (en) Inspection system and method for turbine vanes and blades
CN111383208A (en) Coating quality detection system and method
US9020878B2 (en) Intelligent airfoil component surface inspection
CA2829576C (en) Intelligent airfoil component surface imaging inspection
EP2846155B1 (en) Apparatus and method for inspecting an article
Mundy Industrial machine vision - is it practical?
US9383310B2 (en) Apparatus and method for inspecting an article
WO2024026932A1 (en) Optimized Path Planning for Defect Inspection based on Effective Region Coverage
Rosell et al. Machine learning-based system to automate visual inspection in aerospace engine manufacturing
CN115668292A (en) Optimized path planning for defect detection based on coverage of active area

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12755084

Country of ref document: EP

Kind code of ref document: A2

122 Ep: pct application non-entry in european phase

Ref document number: 12755084

Country of ref document: EP

Kind code of ref document: A2