Int J Adv Manuf Technol (2018) 94:13–29

DOI 10.1007/s00170-016-9481-8

ORIGINAL ARTICLE

A robust butt welding seam finding technique for intelligent robotic welding system using active laser vision

Jawad Muhammad¹ & Halis Altun² & Essam Abo-Serie³

Received: 5 March 2016 / Accepted: 15 September 2016 / Published online: 14 October 2016
© Springer-Verlag London 2016

Abstract Intelligent robotic welding requires automatic finding of the seam geometrical features in order for an efficient intelligent control. Performance of the system, therefore, heavily depends on the success of the seam finding stage. Among various seam finding techniques, active laser vision is the most effective approach. It typically requires high-quality lasers, camera and optical filters. The success of the algorithm is highly sensitive to the image processing and feature extraction algorithms. In this work, sequential image processing and feature extraction algorithms are proposed to effectively extract the seam geometrical properties from a low-quality laser image captured without the conventional narrow band filter. A novel method of laser segmentation and detection is proposed. The segmentation method involves averaging, colour processing and blob analysis. The detection method is based on a novel median filtering technique that involves enhancing of the image object based on its underlying structure and orientation in the image. The method, when applied, enhances the vertically oriented laser stripe in the image, which improves the laser peak detection. The image processing steps are performed to make sure that the laser profile is accurately extracted within the region of interest (ROI). A feature extraction algorithm based on pixels' intensity distribution and neighbourhood search is also proposed that can effectively extract the seam feature points. The proposed algorithms have been implemented and evaluated on various background complexities, seam sizes, material types and laser types before and during the welding operation.

Keywords Seam finding · Robotic welding · Computer vision · Feature extraction · Intelligent sensors · Robotic application

* Jawad Muhammad
mohdjawadi@yahoo.com

Halis Altun
halisaltun@gmail.com

Essam Abo-Serie
essamaboserie@gmail.com

1 Graduate School of Natural and Applied Sciences, Alaeddin Keykubat Yerleskesi, Selcuk Universitesi, 42250 Konya, Turkey
2 Department of Electrical and Electronics Engineering, Karatay University, Konya, Turkey
3 School of Mechanical, Aerospace and Automotive Engineering, Coventry University, Coventry, UK

1 Introduction

Research in vision-based intelligent robotic welding is one of the most rapidly growing areas for robotic applications. However, to a large extent, the teaching strategy of the robot for the welding task remains 'teach and playback'. Hence, a fully automated robotic welding system is yet to be effectively achieved. This is due to the harsh environmental conditions of welding and various factors such as welding spatter and arc light disturbance, welding material types, distortions due to welding heat generation and the varying structure of the welding seams [1]. An intelligent robotic welding system comprises three basic components: (1) tracking and profiling of the welding seam and pool, (2) robot trajectory planning and control and (3) parameter control of the welding process []. Robot trajectory planning and control involves teaching the robot to follow the welding seam path, at a specific welding torch orientation, and perform the required welding task efficiently. Intelligent methods for the path planning have been proposed by researchers [–4]. The path to follow and the orientation of the torch to be set depend on the welding process parameters and
the welding seam geometrical properties. To achieve high-quality welding, the process parameters need to be effectively controlled. Researchers have proposed various techniques to control the process parameters [5–8]. However, these techniques depend on the geometrical measurements of the welding bead and seam before, during and after the welding process. The geometrical measurements are obtained through the tracking and profiling of the welding seam and pool. Methods for tracking and profiling the welding pool and seam can be categorised into (i) optical sensing and (ii) non-optical sensing methods. One of the most popular non-optical sensing methods is the through-arc sensing method [1, 8, 9]. Optical sensing methods can be further categorised into either passive vision or active vision methods. The distinction between these two methods relies on the use of an optional light source. In active vision, a camera device and a light source are used, while in passive vision, two camera devices are used without a light source. In passive vision, due to the complex nature of welding environments, a wide range of methods has been proposed by researchers [10–17]. Using a passive vision system, two sets of information can be obtained: (1) the seam profile, which can alternatively be acquired by using active vision, and (2) the welding pool profile, which can only be acquired with passive vision. More so, passive vision systems can be used to acquire the seam path holistically, as opposed to active vision, which only provides one point at a time. Numerous techniques have been proposed for the image pre-processing, seam profiling and weld pool profiling of the passive vision system. For welding seam profiling, various methods are proposed, such as grey-level distribution methods [10] that consider the darkness characteristic of the welding seam region; conventional edge detection methods like Canny edge detection [17] and Sobel edge detection [11] that extract edges in the image from which the seam is then extracted; and template matching methods [16] that search for a known pattern in an image by using a predefined template which identifies the seam. For welding pool profiling, the basic task is detecting the edges of the welding pool, from which the pool dimensional features can be determined. Various methods have been proposed by researchers to detect the pool edges, among which is the use of conventional edge detection methods [12] such as Canny and Sobel edge detection. Another notable technique is the histogram analysis method [13] that analyses the histogram of the welding pool image in order to obtain the boundaries of the pool as the edges. In [14, 15], an analysis model and experimental technique for computing seam finding accuracy by using passive vision have been proposed; the serial work takes the path planning a further step forward in the passive vision field.

On the other hand, in the active vision system, the concept of triangulation is applied to find the seam geometrical properties. It employs a camera and a light source device to capture the image of the welding seam. A comprehensive survey of the active vision methods of seam finding has been performed [18]. Seam finding in active vision can be simply broken down into two tasks: detecting the laser stripe in the image and extracting the feature points from the detected laser line that identify the seam.

The maximum intensity is the most common feature used for laser stripe extraction. Due to the high intensity values (higher brightness) at the region of the laser stripe, many authors proposed a simple technique which exploits this characteristic by assuming the maximum intensity to be the laser stripe position while extracting the laser stripe region from an image [19–24]. The idea behind the maximum intensity strategy is to consider every row or column separately as a 1D signal, depending on whether the orientation of the laser stripe is horizontal or vertical. For a horizontal laser stripe, the columns in the image are treated independently. The row position in each column that has the maximum intensity value is selected as a point in the laser stripe. Combining these points from all the columns together makes up the position of the laser stripe profile in the image. Sometimes, instead of taking one point with peak intensity, multiple peaks are chosen to produce a stripe of more than one pixel width, and the peaks are subsequently discarded based on definite criteria [19, 20, 23]. In [20, 23], the middle pixel among the multiple peaks is selected as the laser stripe profile location. In [21], after searching and combining pixel points with maximum intensity, those points caused by false imaging are rejected by using temporal and spatial continuity constraints, and the profile is obtained by using linear interpolation and Gaussian filtering. In [22], five horizontal laser stripe lines are extracted by the maximum intensity method. The lines are extracted by sorting the intensity values sequentially in decreasing order along each column and then taking the first five values as the positions of the peaks for the five lines in that column. Instead of the traditional maximum intensity, a more accurate approach is to obtain the peak position at sub-pixel accuracy [25–28]. These methods consider the imperfection of the laser stripe distribution, which could make a pixel position of the peak erroneous. To accurately detect the peak at sub-pixel level, methods such as Gaussian approximation [25], centre of mass [25], linear interpolation [25], the Blais and Rioux detectors [24] and the parabolic estimator [25] are used. The distinction between these methods depends on the assumption made about the intensity distribution of the laser stripe. Gaussian approximation and centre of mass assume that the spread of intensity values across the stripe conforms to a Gaussian distribution. Linear interpolation assumes that a simple linear relationship defines the spread of intensity values across the stripe. A comparative analysis of the effectiveness and accuracy of these sub-pixel methods was performed in [25]. The authors concluded that these methods display comparable performance within the same range.
After the laser stripe profile extraction, the turning point and corner points are then extracted as feature points. In an ideal condition, extracting these points could be a simple task of performing turning point or corner point detection. However, in reality, the extracted laser stripe is far from its ideal shape. The extracted stripe may experience discontinuities along the lines, and the noise could shift the feature points higher or lower than their actual values. This makes it more challenging to accurately detect these points. Researchers have proposed methods to efficiently detect these feature points. The use of the split and merge algorithm [22, 29], the second central difference (second CD) [20, 30], local maxima and minima [21, 24, 31–33] and rule-based techniques [27, 28, 34–37] have proven to be effective in extracting the feature points. In [22], the researchers make use of the combination of split and merge [38] and a template matching method to determine the feature points. In the split and merge method, an approximate straight line is generated from three points according to the turning angle between the points. The points in the stripe profile are scanned by considering three points at a time. A point is discarded as noise if the turning angle between the two lines joining the three points exceeds a certain threshold. The feature points are then extracted by comparing the generated straight line with a welding joint template. In [29], a similar strategy of selecting the feature points based on the computed turning angle of a point is proposed. The feature points are determined by a set of rules based on the position and value of the turning angle at each point. For each of the feature points, there exist associated rules that define the nature of the point. In [20], a method based on the second CD of the row index of each point in a horizontally oriented laser stripe profile is proposed. First, the second CD is computed for all the points in the laser stripe. Based on the values of the second CD, a point scanning algorithm is proposed that searches for the feature points that meet predefined criteria. In [30], unlike in [20], a group of points that have a CD value greater than 70 % of the maximum CD value is selected, and the centre point of the group is taken as the position of the feature point. The feature points can also be extracted as the local maxima and minima of the second derivative of the laser stripe profile [21, 24, 31–33]. The method involves computing the second derivative of the laser stripe, searching for the local minima and maxima and selecting the points that correspond to the local maxima and minima as the feature points. Another notable technique for extracting the feature points is the rule-based approach [27, 28, 34–37]. The method involves approximation of the extracted laser profile into line segments. The line segments are labelled according to predefined welding segments. The labelled line segments are systematically combined to form a feature string. Based on certain defined criteria and classification methods, the feature string is interpreted as one of the predefined welding joint types, and the feature points are extracted from the joint.

In the aforementioned methods, single-channel images (grey images) are used because of the conventional narrow band filter usually installed on the camera to increase its sensitivity to the laser. The narrow band filter, when used during welding, may reduce the laser contrast with respect to the welding arc noise. This is because the most dominant noise in the welding environment is the white light produced by the welding arc. As the white light contains all wavelengths of light, the filter cannot suppress the noise in the laser spectral band due to white light. This makes the camera capture the laser and the white noise with similar intensity, thereby reducing the laser contrast with respect to the welding arc noise. Also, the extracted position profile of the laser stripe could be noisy, which adversely affects the performance of the feature point detection. Only a few works propose an additional processing step to filter out the peaks. Furthermore, the majority of the feature point extraction methods consider the feature points as corners and employ corner detection methods for the feature extraction. However, in the presence of noise or a low-quality laser, false corners are inevitable, and this can lead to catastrophic results. In this work, based on these issues, we propose four major contributions: (1) sequential image processing steps that use the images of a low-quality cheap laser and accurately determine the position of the seam; (2) a proposed active vision system design without a narrow band filter but with additional software-based colour processing to increase the laser contrast with respect to the welding arc noise; (3) a novel laser profile pre-processing that involves enhancing the laser image with a novel oriented median filter; this filter enhances an object based on its underlying structure and orientation in the image, and it enhances the vertically oriented laser stripe in the image, which improves the laser profile extraction; and (4) a proposed feature extraction algorithm that involves a pixel neighbourhood search based on a detected laser base line, where the points are extracted independently and irrespective of their turning angle or strategic position.

In the next section, the details of the system design and the proposed algorithms will be discussed, followed by the results and discussions.

2 The proposed algorithm

2.1 System configuration

The proposed system comprises a laser light source, a camera device and a welding robot. For the light source, a 5-mW laser with a wavelength of 650 nm was used. This wavelength was chosen according to the spectrum analysis of the welding process as shown in Fig. 1a. From the figure, it is evident that the arc light, which has components all over the spectrum, is at its weakest intensity in the wavelength range of 620–720 nm [10]. A DFK 23G274I Sony CCD GigE industrial camera device with a resolution of 1600 × 1200 was used. A KUKA industrial welding robot coupled with a CR-4 controller was used. Figure 1b shows the system configuration.

Fig. 1 a The spectral analysis of GMAW for low-carbon steel of Q235 [10]. b System configuration showing the camera, robot and welding machine. c Welding image with a narrow band filter and d welding without the narrow band filter

2.2 Proposed algorithm

The proposed algorithm comprises two steps: (1) detection of the laser base line that represents the laser stripe without deformation and (2) seam feature point extraction. The aim of these algorithms is to extract the seam features that can be used for automatic teaching of the welding robot before welding and for online tracking and control of the welding robot during the welding process. In the following sections, the details of these steps will be described.

2.2.1 Laser base line detection

The proposed processing for detecting the base line consists of three steps: a pre-processing step, a laser peak point detection step and a line fitting step.

Step 1: Pre-processing

The pre-processing stage is performed in order to remove unwanted objects in the image. Because the colour of the laser light in the image is predominantly red, a pre-processing technique is proposed that utilises this property. Usually, a narrow band optical filter is used with the camera to provide more sensitivity and a selective passing of the red light with a specific wavelength through to the camera. However, the use of these filters is not flexible and, during welding, it may reduce the laser contrast with respect to the welding white noise, which makes it difficult to separate from the laser stripe. To demonstrate this, Fig. 1c, d shows the images captured with a narrow band filter and without the filter, respectively. The high red contrast of the laser in Fig. 1d can be observed from the image. Therefore, a software-based colour filtration is proposed to segment the laser stripe. The original image to be used is depicted in Fig. 2a. The process starts with the application of an averaging filter described by Eq. (1) in order to spread the laser red colour around neighbouring pixels. The average filtering process is necessary to smoothen the high-intensity saturated pixels (nearly white coloured) that may be present in the centre of the laser due to the non-uniform intensity of the laser light, and also in order to suppress high-intensity noise in the background. The result of applying the averaging filter is shown in Fig. 2b.

F(i, j) = (1 / LW²) Σ_i^LW Σ_j^LW I(i, j)    (1)

where LW = 10 is the maximum expected laser width, I(i, j) is the image intensity at row i and column j and F(i, j) is the filter result at row i and column j.

The processed image is then converted from the RGB colour space into the hue, saturation, value (HSV) colour space. This is a major step in order to extract the red colour of the laser line from the image. The hue represents the colour itself; the hue range is between 0 and 360 (0 to 1 if normalised). Saturation indicates the degree to which the hue differs from a neutral grey, or simply, it indicates the purity of a colour. The saturation value varies from 0, which means no colour saturation (pure white), to 1 (pure colour), which is the fullest saturation of a given hue at a given illumination. The value is the illumination level, or a measure of the level of brightness.
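The averaging step of Eq. (1) can be sketched as follows. The extracted equation does not make the window limits explicit, so anchoring the LW × LW window at the current pixel, and truncating it at the image border, are our assumptions.

```python
def box_average(image, lw=10):
    """LW x LW averaging filter in the spirit of Eq. (1): each output
    pixel is the mean of the LW x LW neighbourhood anchored at (i, j),
    spreading the laser colour over neighbouring pixels. Border pixels
    use the available part of the window."""
    rows, cols = len(image), len(image[0])
    out = [[0.0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = [image[r][c]
                      for r in range(i, min(i + lw, rows))
                      for c in range(j, min(j + lw, cols))]
            out[i][j] = sum(window) / len(window)
    return out
```

In practice this is equivalent to a box (mean) filter and would normally be done with an optimised library routine; the nested loops are kept here only to mirror the equation.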
Fig. 2 a Original image, b the averaged image of Fig. 2a, c hue channel of the input image, d saturation channel of the image, e value channel of the input image, f generated hue mask (M1), g generated saturation mask (M2), h generated value mask (M3), i the mask generated after thresholding HSV masks, j the processed mask after morphological operation and blob analysis, k the grey image of the RGB input image and l the masked segmented image after colour processing and median filtering
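The mask generation listed in the caption, i.e. the thresholding of Eqs. (2)–(5) on normalised HSV channels, can be sketched as below; the channels are passed as list-of-lists arrays of values in [0, 1].

```python
def hsv_masks(h, s, v):
    """Per-pixel masks of Eqs. (2)-(5): hue near 0 or 1 (the red ends of
    the hue circle), saturation > 0.2 and value > 0.2; the final
    exclusion mask M is the intersection M1 & M2 & M3."""
    rows, cols = len(h), len(h[0])
    m = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            m1 = 1 if (h[i][j] < 0.1 or h[i][j] > 0.9) else 0
            m2 = 1 if s[i][j] > 0.2 else 0
            m3 = 1 if v[i][j] > 0.2 else 0
            m[i][j] = m1 & m2 & m3
    return m
```

A saturated red pixel (hue near 0 or near 1) passes all three tests; a grey or dark pixel fails the saturation or value test and is excluded.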

After converting the processed image into the HSV colour space, the resulting HSV channels are shown in Fig. 2. It can be observed from the hue image of Fig. 2c that the red laser line region has the least hue value, represented by black in the hue image, while the background of the input image has higher hue values, represented by the brighter colour. The saturation image of Fig. 2d also shows the red laser region, of which the brightest colour indicates the purity of the colour combinations of the input image. The value image of Fig. 2e depicts the illumination of the input image, with the laser region showing higher illumination. The three HSV channels are thresholded by using Eqs. (2), (3) and (4), and the three generated masks are shown in Fig. 2f, g, h respectively. The final exclusion mask is generated by using (5), as shown in Fig. 2i. The threshold values for the hue channel are chosen to cover the whole range of the red region in the hue value range. The saturation and value thresholds are chosen to allow a poor-contrast laser stripe, typical of a low-quality laser, to be accommodated.

M1(i, j) = 1 if H(i, j) < 0.1 or H(i, j) > 0.9, and 0 otherwise    (2)
M2(i, j) = 1 if S(i, j) > 0.2, and 0 otherwise    (3)
M3(i, j) = 1 if V(i, j) > 0.2, and 0 otherwise    (4)
M = M1 ∩ M2 ∩ M3    (5)

where M1, M2 and M3 are the thresholded masks for the H, S and V channels respectively, i, j are the row and column numbers and M is the final exclusion mask.

It can be observed that the exclusion mask has multiple binary objects. Hence, there is a need to remove all other binary objects in the mask except the one object which includes the laser line. Two steps are followed in order to have a mask with only one object. The first step is applying a morphological dilation operation on the mask to close and connect disconnected objects that may belong together. The second is applying blob analysis on the resulting mask to remove all the binary objects except the longest object. The final exclusion mask is shown in Fig. 2j. The three-channel RGB image needs to be converted into a single channel that can be masked. The conversion is either by converting to a grey image or by selecting the red channel, depending on the laser quality. The resulting single-channel image is shown in Fig. 2k. The generated exclusion mask in Fig. 2j is then applied to the single-channel grey image. The final segmented image after the masking operation is further filtered with a conventional square median filter with a 5 × 5 kernel in order to filter some of the white and long trail noises that may be created by the reflection of the laser light on an object or due to the welding arc light spatter. The resulting image is shown in Fig. 2l. It can be observed that some of the small high-frequency reflections of the input image are suppressed and the salt-and-pepper particle noise is removed. This is the image that will be used in the subsequent image processing steps.

Step 2: Laser peak detection

The laser peak detection is aimed at extracting the profile pixels that will represent the laser stripe during the feature point extraction stages. Typical intensity distributions of the processed image for two of its rows, marked in Fig. 3a as the 600th and 700th, are shown in Fig. 3e. It can be observed that each row has its peak pixel somewhere within the laser stripe region. To extract the peak in each row, the maximum intensity pixel is taken in each row as the position of the laser stripe in that row.
Fig. 3 a The processed image from Fig. 2l. b Typical intensity distribution of the two rows marked in a. c The peak line extracted from the image. d The enhanced image using vertically oriented median filter. e Typical intensity distribution of the enhanced image. f The peak line extracted from the enhanced image

However, as shown in Fig. 3e, there can be more than one maximum pixel of the same intensity. A peak extraction algorithm is proposed to extract these peaks. The algorithm is given by algorithm 1 (appendix). The algorithm only considers peaks that are greater than 80 % of the maximum intensity level. If a peak is less than the given threshold, the peak position is assigned to be zero and will not be considered by the subsequent processing steps. The extracted peak line from this algorithm is shown in Fig. 3b.

Table 1 Details of the real-time test in the study

Image processing average running time: 300 ms
Image capture rate: four images per second
Camera resolution: 1600 × 1200
Video frame rate: five frames per second
Robot travelling speed/welding speed: 0.01 m per second
Welding machine wire feed rate: 40 m per second

It can be observed that the extracted peak line has a lot of unnecessary breaks along its length. This is due to noisy spikes in the rows of the processed image. Therefore, in order to alleviate this problem of the noisy spikes, a novel approach is proposed which effectively improves the peak detection. The proposed approach is based on a novel oriented median filtering to enhance an oriented object in a noisy image. This new approach treats the horizontally oriented objects as spurious noise and effectively suppresses the noisy effect that can deteriorate the performance of the peak extraction. The conventional way of applying the median filter is to select a suitable square kernel size similar to that used in the previous section (as shown in Fig. 2l). The selection of the kernel size determines the neighbourhood around the pixels to be considered and does not consider the underlying structure present in the image. As we have an image with a vertical object structure, the kernel of the proposed median filter is selected as a vertically oriented long rectangular kernel with an LW × 200 neighbourhood, where LW is the expected maximum width of the laser.
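The proposed oriented median filtering can be sketched as below. Small default kernel sizes are used for illustration in place of the LW × 200 kernel, and the border handling (truncating the window at the image edge) is our choice.

```python
def oriented_median(image, width=3, height=7):
    """Median filter with a tall, narrow (vertically oriented) kernel.
    Objects much shorter than the kernel height are suppressed, while a
    long vertical stripe survives, which is the enhancement effect
    described in the text."""
    rows, cols = len(image), len(image[0])
    hw, hh = width // 2, height // 2
    out = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            window = sorted(
                image[r][c]
                for r in range(max(0, i - hh), min(rows, i + hh + 1))
                for c in range(max(0, j - hw), min(cols, j + hw + 1)))
            out[i][j] = window[len(window) // 2]
    return out
```

With these defaults, an isolated bright pixel is voted out by its mostly dark window, while a stripe spanning most of the kernel height keeps its intensity.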
The size of the kernel was determined experimentally in order to produce an enhanced laser stripe in the image. The result of applying the proposed vertically oriented median filter to the processed image is shown in Fig. 3c. As can be seen from the results, the proposed median filter emphasises the vertical laser line object in the image and attempts to cancel any object shorter than its vertical length. The effect of the filter on the intensity distribution of the image is also shown in Fig. 3f. The filter has successfully reduced some of the noisy spikes located outside the laser stripe region. As a result, the width of the intensity distribution is reduced and the distribution has groups of spikes that can be easily differentiated. The ultimate effect of the proposed filter is clearly shown in the extracted profile result shown in Fig. 3d.

Step 3: Line fitting

The extracted peak point positions are then fitted to a straight line by using a polynomial curve fitting algorithm [39]. The profile points may contain empty rows with no points, due to the rejection of pixels with a maximum less than the given threshold in algorithm 1 (appendix). These empty rows are treated as outliers and will not be considered by the line fitting algorithm. The result after the line fitting algorithm proposed in [39] is shown in Fig. 4. The line returned by the line fitting algorithm is the detected position of the laser base line, as shown in Fig. 4.

Fig. 4 Detected laser base line positions

2.2.2 Seam feature point extraction

The terminologies used for the feature points in this section are explained in Fig. 5. It can be observed that the deformation region is the region along the laser stripe that contains all the three feature points. The proposed image processing steps for extracting the feature points from the detected laser stripe include (a) vertical ROI determination, (b) junction point labelling and grouping, (c) horizontal ROI determination and (d) determination of the feature points. The vertical ROI is extracted along the extracted laser line in order to determine the top and bottom feature points. On the other hand, the horizontal ROI is extracted in order to extract the seam peak feature point; it is vertically bounded by the top and bottom feature points.

Fig. 5 The welding seam joint, showing the top point, the deformation region, the seam peak and the bottom point

Step 1: Determination of vertical ROI

From the original image in Fig. 2a, it is evident that the feature extraction process can only affect the region where the laser stripe is found. Marking this region as the ROI will greatly simplify the feature extraction. The vertical ROI is determined by cropping the processed filtered image (shown in Fig. 2l) around the region of the previously detected laser line. Equation (6) is used for this purpose.

ROI(i, c) = I(i, j)    (6)

where p − LW/2 ≤ j ≤ p + LW/2 and 0 ≤ i ≤ M; LW is the expected laser width, M is the number of rows in the image I, I(i, j) is the processed image intensity at row i and column j, ROI(i, c) is the ROI image and p is the column index of the previously detected laser line in the original image.

Figure 6a, b shows the ROI marked in the image and the extracted ROI image, respectively. In order to clearly depict the effect of the vertical ROI, the ROI is also marked on a different image containing a tilted laser object, and the images are shown in Fig. 6c, d.
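A sketch of the base line fitting and the vertical ROI crop of Eq. (6) follows. The paper fits with the polynomial curve fitting of [39]; a plain least-squares straight-line fit is used here as a stand-in, and rejected rows are represented by None.

```python
def fit_baseline(points):
    """Least-squares straight-line fit j = a*i + b through the extracted
    peak points (i = row index, j = column); rows rejected by the peak
    detector are passed as None and skipped as outliers."""
    data = [(i, j) for i, j in enumerate(points) if j is not None]
    n = len(data)
    si = sum(i for i, _ in data)
    sj = sum(j for _, j in data)
    sii = sum(i * i for i, _ in data)
    sij = sum(i * j for i, j in data)
    a = (n * sij - si * sj) / (n * sii - si * si)
    b = (sj - a * si) / n
    return a, b

def vertical_roi(image, p, lw):
    """Eq. (6): crop the columns p - LW/2 .. p + LW/2 around the detected
    laser column p, keeping every row."""
    lo, hi = p - lw // 2, p + lw // 2 + 1
    return [row[max(0, lo):hi] for row in image]
```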
Fig. 6 a The ROI region marked in the original image and b the extracted ROI. c The ROI marked in another image and d the extracted ROI

shown in Fig. 6c, d. It can be observed that the ROI will trim some of the laser deformation region.

Step 2: Labelling, grouping and selection of junction points

Junction points are disjointed points along the laser stripe profile that correspond to the pattern produced by the projection of the laser light on a welding joint. They are simply the points within the laser deformation region from which the top and bottom feature points can be extracted. However, due to inherent noise, multiple junction point groups can be found. Figure 7 shows an example of labelled junction point groups on an extracted laser stripe. Three groups of junction points are shown in Fig. 7a. The first and third groups, labelled '1' and '3' respectively, are obviously false junction groups caused by inherent noise. On the other hand, the second group, labelled '2', is the true junction point group that corresponds to the laser deformation region. Algorithm 2 (appendix) is used to label a laser stripe profile point as a junction point belonging to a particular group. The main idea of determining a junction point is to use the projection of the point on the fitted laser line, as shown in Fig. 7b. The farther the point is from the line, the more likely it is to be labelled as a junction point.

According to the junction group selection criteria of algorithm 2, a junction group is selected by using three criteria: (a) the number of points in the group, (b) the position of the group along the vertical fitted line and (c) the average maximum intensity of the group. In moving along the laser stripe region, the deformation of the welding joint is presumed to be relatively large, at least 5 pixels in height; hence, all junction groups with fewer than 5 points are discarded. Also, the deformation that corresponds to the true junction group is always anchored by vertical up and down line stripe segments; hence, all junction groups positioned at the beginning or the end of the vertical line are discarded. The remaining junction groups are evaluated based on their average maximum intensities. The average maximum intensity of a junction group is calculated by using (7).

$$\mathrm{AverageMax} = \frac{1}{N}\sum_{i=1}^{N}\;\max_{1 \le j \le LW} F\big(JP(i),\, j\big) \qquad (7)$$

where N is the number of points in the junction point group JP, LW is the maximum expected laser width and F(i, j) is the processed image.
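The scoring of Eq. (7) and the three selection criteria can be sketched in a few lines of numpy. This is an illustrative reading of the text, not the authors' implementation; the function names and the `min_height` constant are our assumptions:

```python
import numpy as np

def average_max_intensity(roi, group_rows, laser_width):
    """Eq. (7): mean, over the group's rows JP(i), of the row-wise
    maximum intensity within the expected laser width LW."""
    return float(np.mean([roi[r, :laser_width].max() for r in group_rows]))

def select_junction_group(roi, groups, laser_width, min_height=5):
    """Apply the three criteria from the text: discard groups shorter
    than 5 rows or anchored at the first/last row of the ROI, then
    keep the group with the LOWEST average maximum intensity."""
    last_row = roi.shape[0] - 1
    candidates = [g for g in groups
                  if len(g) >= min_height and g[0] != 0 and g[-1] != last_row]
    if not candidates:
        return None
    return min(candidates,
               key=lambda g: average_max_intensity(roi, g, laser_width))
```

Taking the lowest score follows the observation below that the true junction group is the one with the least average maximum intensity.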
Fig. 7 a Labelled junction points on laser profile. b Fitted laser line with labelled junction points

The group among the remaining groups with the least average maximum intensity is selected as the junction group. The result after applying algorithm 2 to the ROI image is shown in Fig. 8. In this example, there are two
candidate junction groups, labelled 1 and 2 respectively. The false junction group is marked in a red rectangle, and the selected junction group is marked in green. It can be observed that, according to the previously stated rules, junction group 1 from Fig. 8 is selected even though its average maximum intensity of 86.8 is greater than that of group 2, whose average maximum intensity is 55.0. This is because group 2 is positioned at the end of the stripe, which violates one of the selection criteria. To further demonstrate the effect of algorithm 2, the algorithm is tested on a more complex ROI extracted from a different image, as shown in Fig. 9a. Despite the number of junction groups present in the ROI, the algorithm was able to successfully apply the selection criteria and select group 9 as the correct junction group, as shown in Fig. 9b.

From the selected junction group, the first and last points in the group correspond to the top and bottom feature points. The two selected points are marked in Fig. 10a.

Fig. 8 Marked junction points with their respective average maximum intensities in ROI image and zoomed out from the ROI image

Fig. 9 Complex ROI with many junction groups. a Original ROI image and b marked junction points with their respective average maximum intensities

Fig. 10 Marked top and bottom point; horizontal ROI marked by the rectangle a before rotation correction and b after correction

Fig. 11 Demonstration of the effect of rotation on the horizontal ROI extraction with the original image, marked ROI on the original image and extracted ROI: a–c without rotation correction and d–f with rotation correction

Step 3: Determination of horizontal ROI
Fig. 12 Zoomed images of a extracted horizontal ROI, b extracted profile from the ROI by using algorithm 1, c processed profile by using algorithm 3, d processed profile points fitted in straight line, e final seam peak point detected by using algorithm 3 together with the fitted line. f The extracted, processed, fitted profile and detected seam peak point

The vertical region of interest marked previously only selects the region around the detected laser base line and does not account for the wider region around the deformation. As such, a new ROI that covers a wider region around the stripe deformation needs to be selected in order to correctly identify the seam peak point. The new ROI can be extracted horizontally from the original processed image (shown in Fig. 2l) by using the knowledge about the position of the top and bottom feature points. This is necessary in order to find the farthest point in this region, which corresponds to the seam peak point of the welding joint. To properly select the region, the orientation of the laser stripe line must be taken into consideration. Hence, before the extraction, the tilt in the original image must be removed. The correction is performed by rotating the image with the inverse of the tilt angle θ, which is calculated by using Eq. (8). The horizontal ROI is then determined by using Eq. (9).

$$\theta = -\tan^{-1}\!\big( (Top_x - Bottom_x) \,/\, (Top_y - Bottom_y) \big) \qquad (8)$$

$$ROI(c, j) = I'(i, j), \qquad Top'_y \le i \le Bottom'_y,\;\; \min(Top'_x, Bottom'_x) \le j \le N \qquad (9)$$

where θ is the tilt angle of the laser stripe with regard to the vertical axis, and Top, Top′, Bottom and Bottom′ (with x and y coordinates as subscripts) are the top and bottom points before and after rotating

Fig. 13 Final detected points for a original image and b zoomed original image for clarity of the detected point

Fig. 14 Extracting the feature points from rotated laser stripe images

with θ respectively. N is the number of columns in the image I. I′(i, j) is the processed image intensity at row i and column j after rotation by the angle θ. ROI(c, j) is the ROI image.

The marked horizontal ROI after the rotation correction is shown in Fig. 10b. In order to demonstrate the need for correcting rotation before the horizontal ROI extraction, an image with a very wide seam and a rotated laser stripe is used, as shown in Fig. 11. Due to the slight rotation, if the horizontal ROI is extracted without correcting the rotation, as shown in Fig. 11a–c, some of the laser deformation region may not be selected, which could lead to the deselection of the seam peak point region. However, the ROI is properly extracted if the rotation is corrected first, as shown in Fig. 11d–f.

Step 4: Detection of seam peak point

The seam peak point is the uttermost, or farthest, point from the laser stripe line, corresponding to the end of the laser obstruction on the welding seam. The position of this point is crucial because any small background noise could deter or completely change the position of the point and subsequently change the orientation of the whole seam from the laser stripe perspective. Algorithm 3 (appendix) is used in determining this point. First, the profile points are extracted from the horizontal ROI (shown in Fig. 12a) by using algorithm 1, and the resulting image is shown in Fig. 12b. In the extracted profile, breaks and some out-of-place noisy points can be observed. These points are filtered with algorithm 3. The algorithm works by dividing the laser stripe profile into two parts (upper and lower) and scanning each part independently. The top and bottom feature points already determined are in the extracted profile, with the top point being the first point in the stripe and the bottom point being the last. Hence, the first point in the upper part is the top point and the last point in the lower part is the bottom point. Therefore, the upper part is scanned starting from its first point to its last point, while the lower part is scanned from its last point to its first. During the scanning, the change in column index between a row and its previous row is checked to determine whether it is within a given limit. If it is out of the limit, the column index of the current row is changed to the limit value. The limit −LW ≤ column index change ≤ LW is chosen. This is based on the assumption that, for consecutive rows, the laser profile point column positions should be as close to each other as possible, at least within the laser

Fig. 15 High-quality laser with strong reflection image result

Fig. 16 Low quality and complex background laser image results

width. After the scanning, two new profiles will be generated from the two parts. The two profiles, when combined, make up the processed profile shown in Fig. 12c. The two profiles are then each fitted to a straight line by using the polynomial fitting algorithm. The result is shown in Fig. 12d. The intersection of the two lines is selected as the detected peak, shown in Fig. 12e. It can be observed that the original extracted profile (shown in Fig. 12b) has been filtered and the outlier points corrected by drawing them closer to the profile, as shown in Fig. 12c. Figure 12f elaborates the relationship between the original profile, the processed profile, the fitted lines and the detected seam peak point when plotted on a graph.

After the laser stripe detection and the feature point extraction stages, the top, bottom and seam peak feature points have been successfully extracted, as shown in Fig. 13.

3 Results and evaluation

The proposed algorithm is implemented on a Windows 8 computer with an Intel Core i7 2.0 GHz processor and 8 GB RAM. The computer is directly connected to the robot and the camera. The algorithm performance is evaluated in two ways: (1) image processing results' evaluation and (2) real-time robot motion results.

To evaluate the effectiveness of the proposed image processing and feature extraction method, five different sets of images are used: (1) images containing a rotated laser stripe, (2) high-quality laser with strong reflection

Fig. 17 Welding images without a narrow band filter result

Fig. 18 Welding images with a narrow band filter result

images, (3) low-quality and complex background laser images, (4) welding images with a narrow band filter and (5) welding images without a narrow band filter. The aim is to test the robustness of the proposed approach against rotation, poor-quality lasers, background complexities, reflection and welding noise.

For rotation, the results are shown in Fig. 14. Because part of the algorithm implementation relies on vertical objects, the rotation test is important to establish how much the algorithm is affected when the laser is rotated. The images contain the laser rotated from −30° to 40°, with anticlockwise being the positive direction. From the results, it can be observed that as the rotation increases, the detected base line gets slightly out of phase. However, due to the robustness of the feature extraction stage, the position of the feature points is not affected. The algorithm fails when the rotation reaches 40°. Nevertheless, the results indicate that rotation of the laser up to a certain degree does not affect the extracted seam feature points.

The results for the high-quality laser with strong reflection images are shown in Fig. 15. Due to the nature of the laser, the reflection in the image is very strong. Nonetheless, the algorithm has successfully detected the seam positions in these images. The result in Fig. 15c shows an extracted seam peak point slightly away from the expected location. This is because, during the seam peak extraction stage, the outermost laser pixels in this region, having slightly lower intensity, are assumed to be part of the reflections and thresholded out.

Figure 16 shows the results for the low-quality laser images. The algorithm successfully detects all the seam feature points in all the images except that of Fig. 16d, where the laser is badly disrupted by the strong lighting source. It is important to note that, despite the complexity of Fig. 16c and the strong illumination of Fig. 16f, the algorithm was able to detect the feature points in these cases.

The results for welding images captured with and without the narrow band filter are shown in Figs. 17 and 18 respectively. The distinction between these two sets of images can be examined. For the images captured without the filter, the red laser is well pronounced, and the proposed algorithm was able to accurately extract the feature points despite the strong white arc noise present. For the images with the filter, the laser is somewhat normalised with the welding noise and its contrast is greatly affected. However, the proposed algorithm was still able to extract the feature points accurately. This can be attributed to the proposed oriented median filtering, which suppresses the horizontally oriented, randomly distributed welding noise.

Fig. 19 Robot motion result compared with the result from the extracted seam path

In order to evaluate the effectiveness of the proposed method by using the robot motion, the algorithm was tested in real time with the robot. During the test run, the extracted image coordinates were converted to the robot world coordinates by using Eqs. (10), (11) and (12). Equation (10) was obtained after camera calibration, where the camera extrinsic and intrinsic parameters were determined.

$$ s\begin{bmatrix} u \\ v \\ 1 \end{bmatrix} = \begin{bmatrix} 2006.68 & 0 & 791.77 \\ 0 & 2006.10 & 590.38 \\ 0 & 0 & 1 \end{bmatrix} \left( \begin{bmatrix} 0.09 & -1.00 & -0.04 \\ -1.00 & -0.09 & 0.01 \\ -0.01 & 0.04 & -1.00 \end{bmatrix} \begin{bmatrix} X \\ Y \\ 2.5 \end{bmatrix} + \begin{bmatrix} -8.24 \\ 139.88 \\ 110.96 \end{bmatrix} \right) \qquad (10)$$

$$X_R = X_{TCP} - X \qquad (11)$$

$$Y_R = Y_{TCP} - Y \qquad (12)$$

where (u, v) are the extracted image coordinates, (X, Y) are the converted camera coordinates, (X_TCP, Y_TCP) is the current position of the tool centre point (TCP) and (X_R, Y_R) are the converted final robot coordinates.

The details of the real-time test relevant to this work are shown in Table I below. From Table I, it can be observed that the average image processing running time is about 300 ms for an image of size 1200 × 1600. This means that the system can process at least three images in 1 s. For a welding process, which is usually very slow (like the 0.01 m/s speed used here, depending on the welding scenario), processing three images per second is practically sufficient.

The robot was first programmed manually to approximately follow the centre path of the test curvature workpiece, as shown in Fig. 19. The trajectory of the robot teach and playback is shown in Fig. 19 as a dashed line. For the same path, the robot was then guided by the proposed algorithm, and the extracted seam path is labelled as the detected seam left and the detected seam right. It is evident from the figure that the proposed seam finding method is successful in extracting the seam path. From the extracted seam path, the centroid coordinates of the path can be computed as the robot path. Figure 20 shows the comparison of the computed robot path with the actual manually programmed robot path. The two paths can be observed to approximately fit each other.

Fig. 20 Robot motion result compared with the result from the computed curve path from the extracted seam

4 Conclusion

In this paper, we present a novel method that can effectively find the seam geometrical information by using active vision. It utilises a combination of colour processing, median filtering and pixel neighbourhood search. It was implemented and tested with various background complexities, laser types and seam sizes. It has proven to be effective with inexpensive, low-quality laser images and robust to reasonably rotated laser stripes. Using the proposed approach, we have successfully implemented an active vision system with the following advantages: (1) it does not use the conventional narrow band filter for laser wavelength selection, which results in cost reduction and increased laser contrast, (2) it is able to work with very low-quality lasers, (3) it is robust to laser rotation and (4) it is effective in extracting the required geometrical features.

Acknowledgments The authors would like to acknowledge the support provided by The Scientific and Technological Research Council of Turkey-TÜBİTAK (grant no. 114M756) for conducting this research work.

Appendix

Algorithm 1:

Laser Line Peak Detection (I, Laser_Width, Threshold)


Parameters List:
Laser_Width: expected width of laser
Threshold: minimum laser intensity
Laser_Line: array containing column index in each row of the laser stripe line
row: index of the current row

01 INITIALISE Laser_Line=ARRAY[number of rows of image I ]; row=1;


02 LOOP FOR EACH row in the image I
03 FIND maximum intensity in the row
04 IF maximum intensity IS GREATER THAN Threshold
05 FIND all columns equal maximum intensity
06 GROUP maximum intensity columns that is at least one column apart
07 SELECT the GROUP that has the highest number of elements and with number of elements less
than Laser_Width
08 SET Laser_Line[row]=column index of centre element in the selected group
09 ELSE
10 SET Laser_Line[row]=0
11 ENDIF
12 SET row=row+1;
13 END LOOP
14 RETURN Laser_Line
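A Python sketch of Algorithm 1, assuming a grayscale numpy image. The choice of centre element for even-sized groups is ours, as the pseudocode leaves it open:

```python
import numpy as np

def laser_line_peak_detection(img, laser_width, threshold):
    """Per row: group the columns that attain the row maximum and take
    the centre of the largest group narrower than the expected laser
    width; rows with no laser (max below threshold) stay 0."""
    laser_line = np.zeros(img.shape[0], dtype=int)
    for row in range(img.shape[0]):
        row_max = img[row].max()
        if row_max <= threshold:
            continue                       # leave 0: no laser here
        cols = np.flatnonzero(img[row] == row_max)
        # Split the column indices into runs of adjacent columns.
        groups = np.split(cols, np.where(np.diff(cols) > 1)[0] + 1)
        groups = [g for g in groups if len(g) < laser_width]
        if groups:
            best = max(groups, key=len)
            laser_line[row] = best[len(best) // 2]   # centre element
    return laser_line
```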

Algorithm 2:

Junction Points labelling Algorithm (ROI, Threshold)


Parameter List:
ROI: the extracted ROI
Threshold: cut off intensity value for non-laser line regions
Junction_Points: binary array indicating if a row is a Junction point or not
row: index of the current row

01 INITIALISE Junction_Points=ARRAY [number of rows of ROI ]=0; row=1;


02 LOOP FOR EACH row in the ROI
03 FIND maximum intensity in the row
04 IF maximum intensity > Threshold
05 SET Junction_Points [row]=1
06 END IF
07 SET row=row+1;
08 END LOOP
09 GROUP Junction_Points that are at least more than one row apart
10 SELECT the GROUP that satisfies the junction group selection criteria
11 SET Junction_Points [NOT Selected GROUPs]=0
12 RETURN Junction_Points
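The labelling and grouping steps of Algorithm 2 (lines 02–09) can be sketched as below; the selection criteria of line 10 (group size, position and the Eq. (7) score) are then applied to the returned groups. Names are illustrative:

```python
import numpy as np

def label_junction_rows(roi, threshold):
    """Mark every row of the ROI whose maximum intensity exceeds the
    threshold as a junction row (lines 02-08), then split the marked
    rows into groups of consecutive rows (line 09)."""
    marked = np.flatnonzero(roi.max(axis=1) > threshold)
    if marked.size == 0:
        return []
    groups = np.split(marked, np.where(np.diff(marked) > 1)[0] + 1)
    return [g.tolist() for g in groups]
```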
28 Int J Adv Manuf Technol (2018) 94:13–29

Algorithm 3:
Seam Peak Detection Algorithm (ROI, Laser_Width)
Parameter List:
seam_ROI, ROI , Line_1 and Line_2: vectors containing the seam
point column indexes of each row of the seam ROI
Laser_Width: expected width of laser
current_Ind: seam point column index in the current row;
prev_Ind: seam point column index of the previous row
row: index of the current row;
rows: total number of rows in the ROI; half_rows: rows/2
Zero_Count: number of rows with zero column index after the last Non-zero index row
seam_peak: row index of the seam peak position

01: SET seam_ROI=ROI(RANGE(1 TO half_rows));


02: INITIALISE row=1; prev_Ind=seam_ROI[row]; Zero_Count=0;
03: LOOP FOR EACH row in the seam_ROI
04 SET current_Ind=seam_ROI[row]
05 IF current_Ind IS 0
06 Zero_Count=Zero_Count+1; CONTINUE LOOPING;
07 ENDIF
08 SET d=current_Ind-prev_Ind
09 IF absolute(d) > (Zero_Count + 1) * Laser_Width
10 IF d > 0; SET current_Ind=prev_Ind+Laser_Width;
11 ELSE SET current_Ind=prev_Ind-Laser_Width;
12 ENDIF
13 ENDIF
14 SET Zero_Count=0; prev_Ind=current_Ind; seam_ROI[row]=current_Ind; row=row+1;
15 END LOOP
16: SET Line_1=LINE FITTING(seam_ROI);
17 SET seam_ROI=ROI(RANGE(rows TO half_rows)); Goto Line 02
18 SET Line_2=LINE FITTING(INVERSE OF seam_ROI)
19 SET seam_peak=INTERSECTION OF Line_1 AND Line_2
20 RETURN seam_peak
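A Python reading of Algorithm 3. The printed pseudocode is ambiguous in places (the test on line 05 references d before line 08 assigns it), so the sketch below interprets the zero-count logic as: rows with a zero column index are breaks that widen the allowed jump for the next valid row. Line fitting uses numpy.polyfit; all names are illustrative:

```python
import numpy as np

def clamp_profile(cols, laser_width):
    """Scanning step: walk the profile and clamp any column-index jump
    larger than the allowed limit; zero rows (breaks) enlarge the
    limit for the next non-zero row."""
    out = list(cols)
    prev, zeros = out[0], 0
    for r in range(1, len(out)):
        if out[r] == 0:                  # break in the extracted profile
            zeros += 1
            continue
        limit = (zeros + 1) * laser_width
        d = out[r] - prev
        if abs(d) > limit:
            out[r] = prev + limit if d > 0 else prev - limit
        prev, zeros = out[r], 0
    return out

def seam_peak(upper_cols, lower_cols, laser_width):
    """Fit each clamped half-profile with a straight line and return
    the (fractional) row index where the two lines intersect."""
    u = clamp_profile(upper_cols, laser_width)
    lo = clamp_profile(lower_cols[::-1], laser_width)[::-1]  # scan lower part backwards
    rows_u = np.arange(len(u))
    rows_l = np.arange(len(u), len(u) + len(lo))
    m1, b1 = np.polyfit(rows_u, u, 1)
    m2, b2 = np.polyfit(rows_l, lo, 1)
    return (b2 - b1) / (m1 - m2)         # row where the two lines cross
```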

