WO1998051452A1 - Enabling process control technology for automated dry media depaint system - Google Patents

Enabling process control technology for automated dry media depaint system

Info

Publication number
WO1998051452A1
Authority
WO
WIPO (PCT)
Prior art keywords
effector
video
quality
information
stripping
Prior art date
Application number
PCT/CA1998/000464
Other languages
French (fr)
Inventor
Jean-Bernard Dambrin
Original Assignee
Cae Electronics Ltd.
Priority date
Filing date
Publication date
Application filed by Cae Electronics Ltd. filed Critical Cae Electronics Ltd.
Priority to AU74204/98A
Publication of WO1998051452A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24C ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
    • B24C1/00 Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods
    • B24C1/08 Methods for use of abrasive blasting for producing particular effects; Use of auxiliary equipment in connection with such methods for polishing surfaces, e.g. smoothing a surface by making use of liquid-borne abrasives
    • B24C1/086 Descaling; Removing coating films
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B24 GRINDING; POLISHING
    • B24C ABRASIVE OR RELATED BLASTING WITH PARTICULATE MATERIAL
    • B24C3/00 Abrasive blasting machines or devices; Plants
    • B24C3/02 Abrasive blasting machines or devices; Plants characterised by the arrangement of the component assemblies with respect to each other
    • B24C3/06 Abrasive blasting machines or devices; Plants characterised by the arrangement of the component assemblies with respect to each other movable; portable
    • B PERFORMING OPERATIONS; TRANSPORTING
    • B25 HAND TOOLS; PORTABLE POWER-DRIVEN TOOLS; MANIPULATORS
    • B25J MANIPULATORS; CHAMBERS PROVIDED WITH MANIPULATION DEVICES
    • B25J9/00 Programme-controlled manipulators
    • B25J9/16 Programme controls
    • B25J9/1679 Programme controls characterised by the tasks executed
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/37 Measurements
    • G05B2219/37572 Camera, tv, vision
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/40 Robotics, robotics mapping to robotics vision
    • G05B2219/40613 Camera, laser scanner on end effector, hand eye manipulator, local
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/45 Nc applications
    • G05B2219/45071 Aircraft, airplane, ship cleaning manipulator, paint stripping
    • G PHYSICS
    • G05 CONTROLLING; REGULATING
    • G05B CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B2219/00 Program-control systems
    • G05B2219/30 Nc systems
    • G05B2219/50 Machine tool, machine tool null till machine tool work handling
    • G05B2219/50353 Tool, probe inclination, orientation to surface, posture, attitude


Abstract

A real-time computer-vision controller for automatic dry media depainting which provides both step-down control and strip-trace overlap control, in order to avoid excessive or insufficient overlap between consecutive traces. The invention also provides depaint performance and degree of coating removal control by means of at least one camera in association with a computer-based system by way of real-time image analysis.

Description

Enabling Process Control Technology for Automated Dry Media Depaint System
Field of the Invention
This invention relates to automated dry-media blasting coating removal systems, more particularly to the automated starch media dry-stripping (SMDS) or plastic media blasting (PMB) processes, and to real-time computer-vision controllers for guiding and controlling the depainting nozzle(s) for optimal coating removal or stripping performance.
Background of the Invention
In the aircraft industry, it is necessary to remove an old layer of coating before repainting part of, or the entire, aircraft. It is known that hundreds of pounds of paint may be used for painting an entire transport-size aircraft. If old coatings are not removed before repainting, the weight of the aircraft will increase accordingly, which will further increase fuel consumption and affect the operation and appearance of the aircraft.
Traditionally, chemical paint strippers have been used in the aircraft industry for depainting aircraft before repainting. Increasing environmental awareness and worker health and safety concerns have led to significant efforts to find more acceptable ways to safely remove paint from aircraft. The use of traditional harsh, toxic chemical paint strippers is being restricted by legislation in many parts of the world. The most promising alternate technology uses wheat starch as the stripping medium. The wheat starch is used in the same way that sand is used in sand blasting. Wheat starch has proven to be an effective paint stripper that does not harm or degrade aluminum alloys and composite surfaces. On certain aircraft, the plastic media blasting (PMB) process has also been found acceptable.
Such coating removal and depainting systems only work adequately when the depainting nozzle, which blasts dry particles such as the wheat starch media onto the surface to be depainted, is guided and controlled at a specific angle with respect to the surface and at a constant speed. Sophisticated vision controllers are needed to enhance the performance of these blasting systems. In recent years, many vision controller products have appeared on the market, but each product has been designed for a specific application. It is therefore difficult to find an off-the-shelf vision system that satisfies the particular requirements of all stripping systems. Due to the unique requirements of the automated depaint process, the applicant developed its own system, which is the subject matter of the present invention.
Summary of the Invention
It is an object of the present invention to provide an automated depaint system capable of depainting aircraft using dry-stripping processes, including SMDS and PMB. The blasting nozzle used for projecting the high-speed dry medium (e.g. the wheat starch medium) onto the surface to be depainted may have a flat rectangular section rather than a circular one, in order to ensure efficient and uniform coating removal over the entire surface to be treated. Since such a nozzle must be manipulated at a regulated, controlled speed and at a particular distance and incidence angle with respect to the surface to be depainted, automatic controllers for holding, moving and guiding the blasting nozzle are required. Thus, it is also an object of this invention to provide a computer-vision controller for guiding the nozzle in real time over the surface to be depainted, using image information of that surface collected through at least one video camera installed on the nozzle carrier, hereinafter called the end-effector. The computer-vision system uses an edge-tracking method for determining the linear edge between the painted area and the depainted area in order to avoid depainting the same surface twice. Image color acquired by the cameras is used for assessing the paint stripping level and for optimizing the traverse speed of the end-effector to obtain the desired depaint results. Another object of the computer-vision controller according to the present invention is to provide depainting quality information via the same camera. The control system is a closed-loop system, so that the nozzle direction and speed may be adjusted in real time depending on the information provided by the cameras.
The main components of this automated system are the carrier, the Process Equipment Trailer (PET), the operator control station, and a robotic manipulator. The robot arm comprises a travel beam, a four-degree-of-freedom serial-link manipulator, and an end effector. The four-degree-of-freedom manipulator mounted on the travel beam provides the vertical movement and the compliance of the end effector to the aircraft surface. The travel beam mounted on the carrier provides lateral movement for the robot arm and end effector. The end effector contains the blasting nozzle and the vacuum hose. The blasting nozzle delivers the stripping media and the vacuum hose recovers the used media and paint. Sensors mounted on the end effector are used to guide the robot in real time through its desired stripping path. Among the sensors attached to the end effector there is at least one camera, which is used to monitor the effectiveness of the stripping process. The camera defines the acquisition system of the vision control system developed by the inventor for the automated depaint system. This vision system is a crucial part of the automation process, as it provides the capability of edge tracking and quality control during the coating removal process. Edge tracking is used for strip-trace overlap control and for step-down control at the end of a trace. Quality control is mainly used to control the degree of depainting. One of the main requirements addressed in the vision system is the capability to remove coatings selectively, i.e. to remove the top coat while leaving the primer.
Accordingly, it is a broad object of the present invention to provide an automated coating removal system for treating and stripping painted surfaces, said system comprising: a coat removal end-effector for treating and stripping painted surfaces; sensor means attached to said end-effector for acquiring surface information to be used for controlling and driving said end-effector along a particular path; blasting nozzle means for blasting media particles onto said painted surface, said blasting nozzle means being attached to said coating removal end-effector; vision controller means for analyzing said surface information and providing a control signal for controlling and driving said end-effector along said path; and robot means for driving said end-effector along said path according to said control signal provided by said vision controller means.
It is another broad aspect of the invention to provide an apparatus for processing a series of video image signals from at least one video camera and for determining a position of a substantially linear boundary between a stripped and an unstripped portion of a painted surface, said video image signals comprising an intensity component and a color or hue component, said apparatus comprising: means for comparing a contrast quality of said intensity component and said color component of at least one of said image signals to determine whether said linear boundary is best defined by said intensity component or by said color component and to output an image type control signal; and means for selectively analyzing one of said intensity component and said color component of said video signals in response to said image type control signal to obtain a position value of said linear boundary.
Yet another object of the invention is to provide an apparatus for processing a series of video image signals from a video camera to generate a trajectory control signal for an at least partially automated paint stripping blast end effector robotic system, said apparatus comprising: means for mounting said camera forward of at least one blast nozzle of said end effector; means for analyzing said image signals to obtain a position signal of a substantially linear boundary between a stripped and an unstripped portion of a painted surface; and means for generating said trajectory signal from said position signal, whereby said robotic system tracks said boundary.
According to another broad aspect of the invention, there is provided an apparatus for processing a series of video image signals from a video camera and for determining a quality of a stripped portion of a painted surface, said video image signals comprising an intensity component and a color or hue component, said apparatus comprising: means for comparing a contrast quality of said intensity component and said color component of at least one of said image signals to determine whether said quality is best determined by said intensity component or by said color component and to output an image type control signal; and means for selectively analyzing one of said intensity component and said color component of said video signals in response to said image type control signal to obtain a quality value of said depainted portion.
Brief Description of the Drawings
FIG. 1 represents a detailed view of the end-effector;
FIG. 2 shows the hardware block diagram of the vision system;
FIG. 3 represents the software block diagram of the vision system;
FIG. 4 shows the strip-trace overlap definition as used in the present specification;
FIG. 5.A shows the intensity contrast between an aluminum panel and twelve samples of the most common colors used in the aircraft industry;
FIG. 5.B shows the color contrast between an aluminum panel and twelve samples of the most common colors used in the aircraft industry;
FIG. 6 shows the first step in the edge detection, which is the three-dimensional display of the acquired image color intensity level;
FIG. 7 shows the gradient and filtering step in the edge detection;
FIG. 8 shows the gradient selection during the edge detection process;
FIG. 9 shows the final edge representation in the XY plane;
FIG. 10 represents the color and intensity contrast levels for twelve overexposed top coats;
FIG. 11 represents the color and intensity contrast levels for twelve underexposed top coats;
FIG. 12 shows the best situations for detecting the edge for twelve surface samples;
FIG. 13 represents the processing time as a function of the processed area;
FIG. 14 shows the block diagram related to the color detection and quality control feature according to the preferred embodiment of the invention; and
FIG. 15 shows the block diagram related to the edge tracking according to the preferred embodiment of the invention.
Description of the Preferred Embodiments
In an automated depaint process, machine vision is required to address two main issues: strip-trace overlap control and stripping process quality control. This invention discloses an automated SMDS vision controller, also called hereinafter an automated coating removal system, capable of satisfying both of the above requirements even when used in the highly constrained environment specific to this field of activity.
The automated coating removal system disclosed by the present invention may comprise a carrier for carrying a Process Equipment Trailer (PET) on which a robot arm is mounted. The robot arm comprises an end-effector at its free extremity, which is responsible for the effective paint stripping activity. The present invention relates specifically to the computerized vision controller which is used for driving the end-effector. The vision controller ensures that the end-effector is driven through the right path so that overlap between consecutive traces is avoided. It also performs depaint quality control by analyzing the color of the stripped trace, so that the speed of the end-effector may be adjusted to obtain the desired coat removal and stripping quality. Figure 1 illustrates the end-effector 10 during a typical stripping operation. The end effector 10, which is mounted on the robot arm (not shown), is moved horizontally across the surface to be stripped 22. During the depaint process, the end effector 10 moves from left to right, then steps down at the end of the trace, and goes back from right to left. The dark zone in Figure 1 is the stripped zone and the light zone is the painted one. The blasting nozzle means may comprise two nozzles 16 and 18, located on the right side of the end effector 10, which project wheat starch. The mixture's residues are collected with the vacuum hose 20 on the left side of the end effector 10.
If, for example, we have an aluminum surface painted with a white topcoat, then the line between the stripped area and the unstripped one is well defined. This line is referred to as the lateral edge. To be able to strip a second trace located under the first one, the vision controller must determine the real position of the lateral edge. This data is used by the robot's controller to ensure minimum overlap between traces. The image acquisition part of the system, which is used to detect the color edge, may be composed of sensor means, such as two micro cameras 12 and 14, which are located at each end of the end-effector 10. Automatic switching between cameras 12 and 14 is performed based on the travel direction, although a single camera may also be used.
The second problem addressed by the vision controller is quality control. For a constant pressure of wheat starch, and for a given surface, the stripping quality is a non-linear function of the robot's speed. During the stripping process, the vision system processes images acquired from the aircraft surface in order to define the speed required to maintain the quality of stripping.
Typically, for most aircraft, there are three different layers of paint: the topcoat, the primer, which is used to increase the adhesion of the paint on the surface, and a chemical protection layer used against corrosion. In a depaint process, complete stripping refers to the removal of the top coat and the primer. Selective stripping refers to the removal of the top coat only.
From an image processing point of view, quality control means quantifying the level of primer and substrate seen by the camera. From these levels, an appropriate analysis is performed to determine if the surface is perfectly stripped. For example, in the case of complete stripping, quality control determines whether both top coat and primer are removed. The quality information generated is used to determine the stripping speed that leads to the desired performance.
Figure 2 illustrates the vision system's configuration. Shown in this figure are the image acquisition system composed of the two cameras 12 and 14, the vision processing unit 24 composed of a vision station 26, a server 28 and a robot controller 30, and the link between the vision station 26 and the robot motion controller 30. Positioning and quality data generated by the vision station 26 are sent to the controller 30, where they are used to guide the end-effector 10 for edge tracking and for stripping quality control.
The acquisition system shown in Figure 2 is composed of two cameras 12 and 14 used for edge tracking and quality control, and the lighting system for each camera. In order to define the sensor and the light source, a spectral study is performed on different aircraft surfaces and on different colors representing top coat and primer. The cameras and the light source are housed in a box which is optimally designed to take into account the viewing distance, the position, and the size and weight constraints. To avoid the specular reflections generated by the light source when illuminating bright surfaces, polarizing filters may be used. Each picture acquired by the camera shows a part of the stripped zone and a part of the unstripped zone. The picture is then sent to the processing unit 24 for processing.
The flowchart of the vision system is illustrated in Figure 3. The processing unit 24 is a PC-based platform. As processing time is a major constraint, some dedicated image processing boards are used to perform pipeline processing. Only a part of the image processing is done on these boards. The analysis and other specific tasks performed with the image are done at the host level. The resulting data from the analysis may be sent to the robot's arm motion controller 30 through an Ethernet link.
The first step of the process is the digitization of the analog signal for both the quality and the lateral edge tracking tasks. These tasks may be performed by two 24-bit RGB frame grabbers 32, each one receiving one of the two interlaced signals 34 or 36 from one of the two cameras 12 or 14. Then, numerical conversion is performed on the digitized data by the A/N converters 38 and 40. Different image processing algorithms are applied to the data, which is then sent from the dedicated hardware module 42 to the host 44. The edge coordinates and the quality information are deduced from this analysis. In addition to the image analysis tasks, the host 44 is also used for operator data display and for validation of the data sent to the motion controller 30.
The validity of the data is important. Even after all necessary precautions are taken, the rule of thumb in numerical vision is to treat all data as potentially wrong. To avoid a dangerous situation due to corrupted data, different validation measures are implemented at the image processing level and at the motion control level.
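As an illustration of such a validation gate, the sketch below rejects implausible edge coordinates before they reach the motion controller. This is a minimal Python example; the function name, the thresholds and the column-to-column jump criterion are assumptions for illustration, not values taken from the patent.

```python
import numpy as np

def validate_edge(rows, image_height: int, max_jump_px: float = 5.0) -> bool:
    """Plausibility check on detected edge rows (one per image column).

    Rejects the detection when no edge was found, when coordinates fall
    outside the image, or when the edge jumps between adjacent columns
    more than a physical stripped/unstripped boundary could.
    Thresholds are illustrative assumptions.
    """
    rows = np.asarray(rows, dtype=float)
    if rows.size == 0:
        return False
    if rows.min() < 0 or rows.max() >= image_height:
        return False
    if rows.size > 1 and np.abs(np.diff(rows)).max() > max_jump_px:
        return False
    return True
```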
Edge tracking
Edge tracking is used to generate the necessary data in real-time in order to guide the robot during the coat removal and stripping process, and thus minimize overlaps between consecutive traces. Edge detection is also used between traces when the robot steps down, and at the beginning of the stripping for each zone. Edge detection during the step-down phase ensures that vertical edges between traces are aligned. Edge detection during the starting phase ensures that the stripping starts at the desired position and hence overlaps between zones are controlled.
The same sensors and the same algorithms are used for vertical edge detection, lateral edge detection, and starting position identification. Hence, in this specification, we focus on lateral edge detection only.
Edge detection is useful to avoid a positive or negative overlap. Figure 4 shows the difference between these two overlaps. A positive overlap is defined by an unstripped zone located between two consecutive traces. A negative overlap is defined by a zone stripped twice at the junction of the two traces.
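The sign convention can be made concrete with a short sketch. Assuming image rows increase downward and that the lateral edges of the previous and current traces have already been detected, the following hypothetical helper classifies the overlap; the names and the tolerance are illustrative, not from the patent.

```python
def classify_overlap(prev_trace_bottom_px: float, new_trace_top_px: float,
                     tolerance_px: float = 2.0) -> str:
    """Classify the strip-trace overlap from two detected lateral edges.

    prev_trace_bottom_px: image row of the lower edge of the previous trace.
    new_trace_top_px:     image row of the upper edge of the current trace.
    """
    gap = new_trace_top_px - prev_trace_bottom_px
    if gap > tolerance_px:
        return "positive"   # unstripped band left between the two traces
    if gap < -tolerance_px:
        return "negative"   # band stripped twice at the junction
    return "aligned"        # traces meet within tolerance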
The edge being detected comes from a difference in contrast between two regions of the acquired image. The easiest way to detect this contrast is to use the intensity contrast level reflected by the aircraft surface. To improve the robustness of the detection, especially when the intensity contrast is not good enough, the contrast in the color scale may be used as an alternate parameter for detecting the edge. One of the main features of the present stripping controller is its capability of choosing between the color information and the intensity information, and using the chosen information for edge detection (the same choice is performed for quality control). The colors of the stripped and the unstripped portions are compared using comparing means of the coat removal system. The same procedure is applied to the intensity of the two portions, and the best contrast for this particular surface is chosen for further coat removal. The system continues controlling the blasting end-effector using the chosen information, in order to find the position value of the linear boundary between the two portions.
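A minimal sketch of this intensity-versus-color selection is given below, assuming boolean masks for the stripped and unstripped regions are available (from the previous trace geometry, for example). The normalization, the red-chromaticity proxy for the color channel, and all names are assumptions for illustration; the patent does not specify a color space.

```python
import numpy as np

def region_contrast(channel, stripped_mask, unstripped_mask):
    """Absolute difference of the mean channel value over the two regions."""
    return abs(float(channel[stripped_mask].mean()) -
               float(channel[unstripped_mask].mean()))

def select_edge_channel(img_rgb, stripped_mask, unstripped_mask):
    """Return the channel ('intensity' or 'color') with the best contrast.

    img_rgb: H x W x 3 uint8 image; masks: H x W boolean arrays.
    Both candidate channels are normalized to [0, 1] so their contrasts
    are directly comparable.
    """
    img = img_rgb.astype(np.float32)
    intensity = img.mean(axis=2) / 255.0              # gray level in [0, 1]
    color = img[..., 0] / (img.sum(axis=2) + 1e-6)    # crude red chromaticity
    c_int = region_contrast(intensity, stripped_mask, unstripped_mask)
    c_col = region_contrast(color, stripped_mask, unstripped_mask)
    return ("intensity", intensity) if c_int >= c_col else ("color", color)
```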
To illustrate this approach, Figures 5.A and 5.B show the average difference in intensity and in color between a stripped aluminum panel and panels covered with the most common colors used in the aircraft industry. Figure 5.A shows the intensity contrast between an aluminum panel (light column) and different classical topcoats (dark column). As illustrated in this figure, detection becomes possible when the difference between two columns reaches fifty or more. For example, sample number six illustrates a non-reflective light blue sample compared to an aluminum sample stripped of its paint. As shown in Figure 5.A, there is no contrast in intensity. However, Figure 5.B shows that the color information is better for detecting the edge. The detection of a line between two consecutive traces of the end-effector 10 is referred to as edge detection or detection of a substantially linear boundary. According to the invention, the processing apparatus processes a series of video images acquired by the video cameras 12 and 14 in order to determine the boundary between the stripped and the unstripped portions. Figures 6 to 9 show the same picture acquired by one of the cameras 12 or 14 at different steps of the processing phase. These figures show the three-dimensional plot of intensity level versus location in the image (x,y) plane.
Figure 6 shows a separation between a substrate and classical paint. The goal is to isolate the junction between the two regions. Figure 7 represents the gradient taken on the previous image after filtering. Figure 8 shows the highest gradient selection and, finally, Figure 9 shows the final edge representation in the XY plane. That isolated line can be analyzed to determine if it is really an edge. At the end of the edge detection analysis, the coordinates of the extracted edge are validated. All these processing steps are summarized in Figure 15, which represents the high-level software diagram for the edge detection.
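The filter/gradient/selection chain of Figures 6 to 9 can be sketched as follows. This is a plain NumPy/SciPy approximation under the assumption of a roughly horizontal edge and a channel normalized to [0, 1]; the smoothing sigma, the strength threshold and the support test are illustrative, not values from the patent.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def detect_lateral_edge(channel, min_strength: float = 0.05):
    """Locate a roughly horizontal stripped/unstripped boundary.

    channel: 2-D array (the chosen intensity or color channel).
    Returns (cols, rows, is_edge): one edge-row estimate per retained
    column, and a flag telling whether the line is trusted as an edge.
    """
    smoothed = gaussian_filter(channel, sigma=2.0)     # Fig. 7: filtering
    grad = np.abs(np.gradient(smoothed, axis=0))       # vertical gradient
    edge_rows = np.argmax(grad, axis=0)                # Fig. 8: strongest response per column
    cols_all = np.arange(channel.shape[1])
    strong = grad[edge_rows, cols_all] > min_strength  # drop weak responses
    cols, rows = cols_all[strong], edge_rows[strong]   # Fig. 9: edge in the XY plane
    is_edge = cols.size > channel.shape[1] // 2        # enough support to trust it
    return cols, rows, is_edge
```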
The light source level used has a large impact on the robustness of the system. Reflective or non-reflective panels could generate, for a specific light level, an overexposed or underexposed image. Consequently, the operator may have to adjust the light source level as a function of the topcoat characteristics. To reduce operator interventions, the system must work over a specific range of lighting variations. Figures 10 and 11 show the difference in the behavior of the color and intensity contrasts for 12 top coats overexposed to light. Figure 10 shows that in the case of an overexposed panel, samples five to twelve easily lose their intensity contrast, but the color contrast is still good to work with.
Figure 11 shows a light underexposure situation. It may be observed that for this underexposed situation, samples two to eight lose their intensity contrast, but the color contrast is still useful.
Figure 12 presents the same information differently. Each axis shows a sample with the color or intensity contrast level for one optimal light level. The goal, from an algorithm point of view, is to stay on the perimeter of the total surface drawn by the two curves in order to avoid any situation where no edge detection would be possible. The little circle shown in the center illustrates the dangerous zone that must be avoided.
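In code, staying out of that central zone amounts to a guard such as the one below: if neither contrast is usable, the detection is declared unsafe and the light level (or camera gain) must be changed. The threshold and names are assumed for illustration.

```python
def exposure_is_safe(intensity_contrast: float, color_contrast: float,
                     min_contrast: float = 0.2) -> bool:
    """True when at least one contrast (normalized to [0, 1]) is usable,
    i.e. when the operating point lies outside the central danger zone
    of Figure 12 where no edge detection is possible."""
    return max(intensity_contrast, color_contrast) >= min_contrast
```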
Quality Control
Quality control is based on color detection. In order to detect a quality variation, the vision system 24 must learn different color mixtures. The learning process is done during the calibration period. In order to perform a quick and safe calibration, the end-effector 10 is commanded to strip twenty inches of surface using a constant acceleration. Hence, the trace shows a constant variation of quality. For example, for an aluminum panel, we will see a progression from completely stripped aluminum to the topcoat. Then, this sample will be used to teach the vision system a minimum of ten variations along the trace. Using these ten variations, a mathematical model is built. During the real stripping process, the real picture is compared with the model in order to determine the level of primer or aluminum seen on the surface.
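A minimal sketch of such a calibration model follows: each patch sampled along the accelerated trace contributes one feature/quality pair, and estimation is a nearest-sample lookup. A mean channel value stands in for whatever color-mixture features the actual system learned; all names and the 0-to-1 quality scale are assumptions.

```python
import numpy as np

class QualityModel:
    """Nearest-sample quality model learned from the calibration trace."""

    def __init__(self):
        self.features = []   # one mean channel value per calibration patch
        self.levels = []     # quality level assigned to that patch (0..1)

    def learn(self, patch, level: float):
        """Record one calibration patch: 0 = untouched topcoat,
        1 = fully stripped substrate (scale is illustrative)."""
        self.features.append(float(np.asarray(patch).mean()))
        self.levels.append(float(level))

    def estimate(self, patch) -> float:
        """Return the quality level of the closest calibration sample."""
        f = float(np.asarray(patch).mean())
        idx = int(np.argmin([abs(f - s) for s in self.features]))
        return self.levels[idx]
```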
For the coat removal control too, the system uses the same selection between the color information and the intensity information of the two portions, from the stripped portion and the unstripped portion of the surface to be depainted respectively. This procedure has been described in greater detail in the previous section. It allows the selection of the best information, either the color or the intensity information, provided by the surface to the vision controller, which uses it to compute and control the coat removal quality of the blasting nozzles by adjusting the speed of the end-effector.
The simplified quality control software diagram is shown in Figure 14. First, an image of the stripped surface is acquired by one of the cameras 12 or 14, depending on the direction of stripping. The image is sent through cables to the vision station 26, where it is converted from analog to digital by the A/N converter 40. From there, the intensity contrast or the color contrast is computed and compared to a series of samples previously recorded by the system. The best match is found and is used to adjust the speed of the end-effector, controlling at the same time the quality of the stripping process performed on the aircraft surface.
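The closed loop from matched quality to traverse speed could then look like the sketch below. A single proportional gain is used purely for illustration; as noted above, the real speed/quality relation is non-linear, so a calibrated curve would replace it in practice, and all parameter values are assumptions.

```python
def adjust_speed(current_speed: float, quality: float, target: float = 0.95,
                 gain: float = 0.5, v_min: float = 10.0, v_max: float = 100.0) -> float:
    """Correct the end-effector traverse speed (e.g. in mm/s) from the
    estimated stripping quality: slow down when under-stripped, speed up
    once the target quality is exceeded, clamped to safe limits."""
    new_speed = current_speed * (1.0 + gain * (quality - target))
    return min(max(new_speed, v_min), v_max)
```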
Figure 13 shows the processing time as a function of the size of the image processed, for both quality and edge detection. Obviously, the larger the picture, the longer the processing time required. The vision controller disclosed by the present invention is integrated within the overall coating removal system. As previously presented, the first task of this system is to generate the data required to modify the real-time trajectory of the robot arm so that minimum overlap is obtained between consecutive traces. The second task of the vision controller is to provide quality information used to define the stripping speed so that the desired stripping quality is obtained.

Claims

CLAIMS:
1. An automated coating removal system for treating and stripping painted surfaces, said system comprising: a coat removal end-effector for treating and stripping painted surfaces; sensor means attached to said end-effector for acquiring surface information to be used for controlling and driving said end-effector along a particular path; blasting nozzle means for blasting media particles onto said painted surface, said blasting nozzle means being attached to said coating removal end-effector; vision controller means for analyzing said surface information and providing a control signal for controlling and driving said end-effector along said path; and robot means for driving said end-effector along said path according to said control signal provided by said vision controller means.
2. The automated paint stripping system as claimed in claim 1, wherein said sensor means comprises at least one video camera.
3. The automated coating removal system as claimed in claim 2, wherein said vision controller means uses edge position information for driving said end-effector along said path, said vision controller means receiving from said video camera a video signal comprising a series of video images wherein each one of said images comprises a stripped portion of said surface and an unstripped portion of said surface, said vision controller means using information from said images to compute a location of an edge between said stripped portion and said unstripped portion, said vision controller means using said computed location to provide said control signal for driving said end-effector along said path.
4. The automated paint stripping system as claimed in any one of claims 1 to 3, wherein said information from said images is one of a color information and a contrast information.
5. The automated paint stripping system as claimed in any one of claims 1 to 4, wherein said vision computer system uses blasting quality information for driving and controlling said end-effector, said blasting quality information being computed using one of a color information and a contrast information of said surface.
6. An apparatus for processing a series of video image signals from at least one video camera and for determining a position of a substantially linear boundary between a stripped and an unstripped portion of a painted surface, said video image signals comprising an intensity component and a color or hue component, said apparatus comprising: means for comparing a contrast quality of said intensity component and said color component of at least one of said image signals to determine whether said linear boundary is best defined by said intensity component or by said color component and to output an image type control signal; and means for selectively analyzing one of said intensity component and said color component of said video signals in response to said image type control signal to obtain a position value of said linear boundary.
7. The apparatus claimed in claim 6, further comprising: a coat removal end-effector for blasting painted bands, said end-effector comprising at least one blasting nozzle for blasting dry particles onto a surface to be depainted and said video camera for acquiring said series of video images; and a robotic system means for driving said end-effector along said linear boundary using said position value of said linear boundary.
8. An apparatus as claimed in claim 6, wherein said at least one video camera acquires said video images so that each of said video images comprises a part of both said stripped and said unstripped portions of said painted surface, allowing said apparatus to compute said linear boundary using information from at least one of said video images.
9. An apparatus as claimed in claim 7, wherein said robotic system means adjusts a position and a speed of said end-effector using said position value of said linear boundary for optimizing a coat removal path and a coat removal quality of said stripping path.
10. An apparatus for processing a series of video image signals from a video camera to generate a trajectory control signal for an at least partially automated paint stripping blast end effector robotic system, said apparatus comprising: means for mounting said camera forward of at least one blast nozzle of said end effector; means for analyzing said image signals to obtain a position signal of a substantially linear boundary between a stripped and an unstripped portion of a painted surface; and means for generating said trajectory signal from said position signal, whereby said robotic system tracks said boundary.
11. The apparatus as claimed in claim 10, wherein said generating means comprises: means for calibrating a position of depaint edge produced by said at least one nozzle as a function of a quality of said painted surface, a speed of said end effector and a quality of media flow through said at least one nozzle; and means for adjusting said trajectory signal as a function of said depaint edge position.
12. The apparatus as claimed in claim 11, wherein said calibrating means comprises a trailing camera for acquiring images of said surface at said linear boundary after stripping by said at least one nozzle.
13. An apparatus for processing a series of video image signals from a video camera and for determining a quality of a stripped portion of a painted surface, said video image signals comprising an intensity component and a color or hue component, said apparatus comprising: means for comparing a contrast quality of said intensity component and said color component of at least one of said image signals to determine whether said quality is best determined by said intensity component or by said color component and to output an image type control signal; and means for selectively analyzing one of said intensity component and said color component of said video signals in response to said image type control signal to obtain a quality value of said depainted portion.
14. An apparatus as claimed in claim 13, further comprising: coat removal end-effector means for stripping said painted surface; robotic means for driving said end-effector means along said painted surface; and means for controlling a speed of said end-effector means using said quality value.
15. An apparatus as claimed in claim 13, wherein said analyzing means comprises: means for storing sample values of stripped surfaces, said sample values representing a series of painted surfaces stripped to different degrees; means for comparing each of said sample values with a currently stripped surface for matching a closest sample value; and means for controlling a traverse speed of a blasting end-effector using said closest sample value.
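A minimal sketch of the sample-matching idea in claim 15, assuming a mean-value feature and an illustrative speed table (neither is specified by the claims): stored values characterize surfaces stripped to known degrees, the live frame's value is matched to the nearest stored sample, and the traverse speed is looked up from that match.

import numpy as np

# Degree of strip (%) -> stored sample value of the selected image plane (assumed)
REFERENCE_SAMPLES = {50: 90.0, 75: 140.0, 90: 180.0, 100: 210.0}
# Degree of strip (%) -> commanded traverse speed (mm/s): slow down when under-stripped
SPEED_TABLE = {50: 5.0, 75: 8.0, 90: 12.0, 100: 15.0}

def traverse_speed(plane):
    """Match the current surface to the closest stored sample and return a speed command."""
    value = float(np.mean(plane))
    closest = min(REFERENCE_SAMPLES, key=lambda d: abs(REFERENCE_SAMPLES[d] - value))
    return SPEED_TABLE[closest]

print(traverse_speed(np.full((480, 640), 150.0)))   # matches the 75% sample -> 8.0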
16. The automated coating removal system as claimed in claim 3, wherein said control signal comprises information related to a position of said end-effector.
17. The automated coating removal system as claimed in claim 16, wherein said control signal further comprises information related to a direction of said end-effector.
PCT/CA1998/000464 1997-05-13 1998-05-13 Enabling process control technology for automated dry media depaint system WO1998051452A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU74204/98A AU7420498A (en) 1997-05-13 1998-05-13 Enabling process control technology for automated dry media depaint system

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US4632897P 1997-05-13 1997-05-13
US60/046,328 1997-05-13

Publications (1)

Publication Number Publication Date
WO1998051452A1 true WO1998051452A1 (en) 1998-11-19

Family

ID=21942871

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA1998/000464 WO1998051452A1 (en) 1997-05-13 1998-05-13 Enabling process control technology for automated dry media depaint system

Country Status (2)

Country Link
AU (1) AU7420498A (en)
WO (1) WO1998051452A1 (en)

Patent Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP0165911A2 (en) * 1984-06-22 1985-12-27 VIANOVA S.p.A. Method and robot platform for washing, sandblasting and painting in shipbuilding dry dock
US5067085A (en) * 1989-05-15 1991-11-19 Southwest Research Institute Optical robotic canopy polishing system
WO1991014539A1 (en) * 1990-03-27 1991-10-03 Southwest Research Institute Robotic system for paint removal
US5077941A (en) * 1990-05-15 1992-01-07 Space Time Analyses, Ltd. Automatic grinding method and system
US5394654A (en) * 1990-12-28 1995-03-07 Mazda Motor Corporation Method of wet-sanding defective parts of coating on vehicle body and system for carrying out the method
US5477268A (en) * 1991-08-08 1995-12-19 Mazda Motor Corporation Method of and apparatus for finishing a surface of workpiece
DE4428069A1 (en) * 1993-08-31 1995-03-02 Putzmeister Maschf Surface treatment arrangement, especially for cleaning the surfaces of large objects

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006097133A1 (en) * 2005-03-14 2006-09-21 Workinter Limited Device and method for nozzle stripping by spraying a fluid loaded with solid particles forming an optimized stripping front
WO2006097134A1 (en) * 2005-03-14 2006-09-21 Workinter Limited Shoe and device for stripping surfaces having a curvature by directed spraying a discharge of a flow of particles
FR2891175A1 (en) * 2005-09-27 2007-03-30 Applic Lorraine Des Tech Nouve Metal surface cleaning system e.g. for ships' hulls uses high-pressure water jets on carriage and vacuum pump to collect waste
KR102211010B1 2013-03-15 2021-02-02 Carnegie Mellon University A supervised autonomous robotic system for complex surface inspection and processing
EP2973074A4 (en) * 2013-03-15 2016-11-16 Univ Carnegie Mellon A supervised autonomous robotic system for complex surface inspection and processing
US9796089B2 (en) 2013-03-15 2017-10-24 Carnegie Mellon University Supervised autonomous robotic system for complex surface inspection and processing
KR20160014585A * 2013-03-15 2016-02-11 Carnegie Mellon University A supervised autonomous robotic system for complex surface inspection and processing
ES2695627A1 * 2017-05-31 2019-01-09 Vilarino David Roca Robot - automatic structure painting machine (R-MAPE) (Machine-translation by Google Translate, not legally binding)
ES2727675A1 * 2018-04-16 2019-10-17 Eseki S A L Automatic parts grinding system (Machine-translation by Google Translate, not legally binding)
WO2019202192A1 (en) * 2018-04-16 2019-10-24 Eseki, S.A.L. Automatic system for shot-blasting workpieces
FR3101562A1 (en) * 2019-10-08 2021-04-09 Safran Aircraft Engines Process for stripping at least one area of a turbine engine blade
CN114227690A (en) * 2021-12-30 2022-03-25 无锡荣恩科技有限公司 Paint removing method for aviation component
CN114227690B (en) * 2021-12-30 2023-11-03 无锡荣恩科技有限公司 Paint removal method for aviation part

Also Published As

Publication number Publication date
AU7420498A (en) 1998-12-08

Similar Documents

Publication Publication Date Title
US8923602B2 (en) Automated guidance and recognition system and method of the same
US8098928B2 (en) Apparatus for picking up objects
US7283661B2 (en) Image processing apparatus
US20130057678A1 (en) Inspection system and method of defect detection on specular surfaces
CN111905983B (en) Vision following-based dispensing track correction method, device, system and medium
CN111923053A (en) Industrial robot object grabbing teaching system and method based on depth vision
WO1998051452A1 (en) Enabling process control technology for automated dry media depaint system
CN112334760A (en) Method and device for locating points on complex surfaces in space
CN107527368A (en) Three-dimensional attitude localization method and device based on Quick Response Code
JP2023090683A (en) Finishing automation system and method thereof
US20200047207A1 (en) Method and painting system for painting a workpiece by means of an atomizer
CN109079777B (en) Manipulator hand-eye coordination operation system
Prabhu et al. Dynamic alignment control using depth imagery for automated wheel assembly
CN114833040B (en) Gluing method and new energy electric drive end cover gluing equipment
WO2023118470A1 (en) Method and apparatus for cutting and removing parts
JP3543329B2 (en) Robot teaching device
CN113055562A (en) Input/output signal information display system
JPH0666732A (en) Inspecting and finishing method for painting
CN108435456A (en) Industrial robot spraying control system
Yanagihara et al. Task world reality for human and robot system. A multimodal teaching advisor and its implementation
JP3206849B2 (en) Robot coating equipment
KR100944425B1 (en) Apparatus for the detecting defect mark on steel plate
Ersü et al. Vision system for robot guidance and quality measurement systems in automotive industry
Saputra et al. Development of Object Tracking System Using Remotely Operated Vehicle Based on Visual sensor
Sitti et al. Visual tracking for moving multiple objects: an integration of vision and control

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AL AM AT AU AZ BA BB BG BR BY CA CH CN CU CZ DE DK EE ES FI GB GE GH GM GW HU ID IL IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MD MG MK MN MW MX NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT UA UG US UZ VN YU ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW SD SZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE BF BJ CF CG CI CM GA GN ML MR NE SN TD TG

DFPE Request for preliminary examination filed prior to expiration of 19th month from priority date (pct application filed before 20040101)
121 Ep: the epo has been informed by wipo that ep was designated in this application
NENP Non-entry into the national phase

Ref country code: CA

REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase