
Journal of Physics: Conference Series

PAPER • OPEN ACCESS

Common Sensors in Industrial Robots: A Review

To cite this article: Peng Li and Xiangpeng Liu 2019 J. Phys.: Conf. Ser. 1267 012036


AIACT 2019 IOP Publishing
IOP Conf. Series: Journal of Physics: Conf. Series 1267 (2019) 012036 doi:10.1088/1742-6596/1267/1/012036

Common Sensors in Industrial Robots: A Review

Peng Li1 and Xiangpeng Liu1,*

1 School of Mechanical Engineering, Shanghai Jiao Tong University, 800 Dongchuan Road, Shanghai 200240, China
* Corresponding e-mail: x.liu@sjtu.edu.cn

Abstract. The application of industrial robots has greatly promoted the development of industry over the past decades. Now, with the proposal and prevalence of Industry 4.0, industrial robots are required to be more independent and intelligent so that they can accomplish more complex and flexible tasks. Their advancement relies on the development and progress of multiple technologies, among which sensors are an indispensable part: they acquire the rich information that industrial robots need to implement their functions. This paper reviews the recent literature and summarizes the sensors commonly used in industrial robots. The applications of these sensors in the diverse functions of industrial robots are also presented. Finally, the future directions and challenges of industrial robots are discussed in the last part of this article.

1. Introduction
Nowadays many countries focus on Industry 4.0 to promote the transition and upgrading of the manufacturing sector. Industry 4.0, considered the fourth industrial revolution, was proposed in Germany at the Hannover industrial fair in 2013. Its objective is to make manufacturing intelligent, flexible and personalized with the help of the Internet of Things (IoT), digital information technology, and computer technology [1]. In Industry 4.0, industrial robots are an indispensable component.
Industrial robots have a history of more than 60 years [2]. Their development and utilization have vastly improved productivity and promoted the progress of industry, as well as relieved workers of onerous, repetitive tasks. Moreover, with the recent advancement of sensor technology, automation and computer information technology, industrial robots have become more intelligent. They can now implement multiple functions and are extensively employed in many fields of manufacturing, e.g., welding, assembly, materials transportation, and mechanical processing [2].
Sensors are essential for industrial robots to accomplish their tasks. The information acquired by sensors can be used to judge the state of the robot and the circumstances of the external environment, thus helping to control and regulate the robot to execute its appointed functions. This paper reviews the recent literature and summarizes the frequently used sensors and their applications in industrial robots.
The structure of this article is organized as follows: the introduction of the background in Section 1 is followed by the presentation and analysis of the common sensors employed in industrial robots in Section 2. Section 3 introduces the concrete applications of these sensors in industrial robots, and the last section concludes the article and discusses future work.

Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution
of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Published under licence by IOP Publishing Ltd

2. Common Sensors in Industrial Robots


Sensors are of great importance to industrial robots. They are the equivalent of the robot's sensorium, helping it recognize its motion status and the circumstances of the environment. With the help of this information, the controller can issue corresponding instructions to make the robot complete the required operations. Based on a survey of the literature of recent years, the common sensors deployed in industrial robots are presented as follows.

2.1. Tactile Sensors


Just as tactile perception lets people perceive multiple properties of external objects and interact with the environment, industrial robots also need a sense of touch to cognize the outer world. Tactile sensors, which endow industrial robots with the ability of tactile sensation, are therefore vital components in making them more intelligent.
Tactile sensors have been developed for several decades and are now gradually being used in various robots. The main types of tactile sensors are capacitive, piezoelectric, piezo-resistive and optical [3]. The capacitive tactile sensor utilizes the change of capacitance to measure the contact force. It has good spatial resolution and low power consumption, but its robustness to interference is poor [4]. The piezoelectric tactile sensor depends on the piezoelectric effect, in which an electrical charge emerges on the surface of piezoelectric materials when an external load is exerted. Its frequency response is good and its measuring range is large, but its resolution is not very satisfying [3]. The piezo-resistive tactile sensor is based on the piezo-resistive effect, the change of resistance when an external force is applied. It has a large measuring range and good robustness; however, it is susceptible to hysteresis [4]. The optical tactile sensor transduces external contact information into changes in the parameters of light. It is resistant to external interference and possesses good spatial resolution [5]. Furthermore, some researchers have developed artificial skins that imitate human skin, which may be the development trend of the future: multiple sensors are deployed on each cell of the skin to provide abundant data.
Although tactile sensors are attracting more and more attention and interest, their performance in aspects such as versatility and adaptability is not yet satisfying. Their development depends on advances in various domains of technology, such as materials, electronics, and the relevant algorithms [5]. There is still a long way to go to reach the level of human tactile perception.

2.2. Visual Sensors


Visual sensing technology has developed rapidly over the past few years, and it is now extensively utilized in many fields, such as face recognition, three-dimensional reconstruction, and various kinds of robots. Images captured by visual sensors are processed to extract information useful for specific tasks.
Visual sensors mainly comprise various kinds of cameras, such as RGB cameras, multispectral cameras, and depth cameras [6]. The photosensitive elements equipped in these cameras are usually CCD or CMOS chips, which transduce light into electrical signals based on the photoelectric effect. The flexibility and image quality of CCD cameras are better than those of CMOS cameras, but CMOS has its superiority in cost and power consumption [7].
Different types of cameras provide different information. RGB cameras, the most commonly used in daily life, capture chromatic images based on the principle that every visible color can be obtained by combining three primaries: red, green and blue. Multispectral cameras acquire images in diverse bands of the spectrum, including visible and invisible wavelengths; consequently, they can obtain information that RGB cameras cannot provide [8]. Depth cameras add distance information to two-dimensional images, thus realizing stereo imaging. According to their operating principles, they can be classified as RGB binocular, structured-light and TOF cameras.
Visual sensors enjoy great popularity owing to their low cost, the rich information they supply, and their ease of use [6]. Nonetheless, processing the data from visual sensors is complex and time-consuming. Although many algorithms have been put forward, their applicability and flexibility are still not very satisfying.

2.3. Laser Sensors


The laser was invented in the twentieth century. It has outstanding monochromaticity, directivity and brightness [9] and is therefore widely employed in diverse applications. Laser sensors use laser technology to accomplish measuring tasks. They are usually composed of a laser emitter, a detector and a measuring circuit. The working substance of the emitter falls into four main categories: solid, liquid, gas and semiconductor [10].
Laser sensors are mostly utilized to measure physical parameters such as distance, velocity and vibration. The common types are the laser range finder, laser displacement sensor, laser scanner, laser tracker, etc. Laser range measurement rests mainly on three principles: time of flight (TOF), triangulation and optical interference [11]. TOF refers to the time from projecting the laser to receiving the reflected light. It is often used in laser range finders for long-distance measurement, and because light travels so fast, the accuracy depends on how precisely the flight time can be measured. The triangulation method uses similar-triangle geometry and trigonometric functions to calculate the distance of objects; laser displacement sensors are based on this method for short-range measurement. Optical interference is the phenomenon in which the superposition of two light beams with different phases generates bright and dark fringes. It is employed in the laser tracker to measure the distance moved by a target fitted with a reflector.
Laser sensors can realize non-contact remote measurement with satisfying speed and accuracy [6]. However, the wavelength of the laser is influenced by temperature, atmospheric pressure and air humidity, so compensation is required when these parameters change.
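The two geometric ranging principles above can be sketched numerically. This is an illustrative sketch only: the round-trip timing and the similar-triangle relation follow the description in the text, while the concrete parameter values (baseline, focal length, spot offset) are assumed example numbers, not figures from any cited sensor.

```python
# Illustrative sketch of the two laser-ranging principles described above.
# All numeric parameters are assumed example values.

C = 299_792_458.0  # speed of light in m/s

def tof_distance(round_trip_time_s: float) -> float:
    """Time of flight: the laser travels to the target and back,
    so the range is half the round-trip path length."""
    return C * round_trip_time_s / 2.0

def triangulation_distance(baseline_m: float, focal_length_m: float,
                           spot_offset_m: float) -> float:
    """Laser triangulation: the reflected spot's offset on the detector
    and the emitter-detector baseline form similar triangles with the
    target distance (simplified perpendicular-surface geometry)."""
    return baseline_m * focal_length_m / spot_offset_m

# A 200 ns round trip corresponds to roughly 30 m of range.
print(tof_distance(200e-9))                       # ~29.98 m
print(triangulation_distance(0.05, 0.02, 0.001))  # 1.0 m
```

The TOF formula also shows why accuracy hinges on timing precision, as noted above: a 1 ns timing error already corresponds to about 15 cm of range error.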

2.4. Encoder
The encoder is a kind of sensor that transduces angular displacement or velocity into electrical pulses or digital quantities. According to the detection principle, encoders can be divided into four categories: photoelectric, magnetic, inductive and capacitive [12].
The most frequently used type is the photoelectric encoder, which utilizes the photoelectric effect to complete the signal conversion. It is usually composed of an optical coded disc and a photoelectric detecting device. In terms of the calibration mode of the coded disc, photoelectric encoders can be classified into incremental encoders and absolute encoders. The output of an incremental photoelectric encoder is a series of square-wave pulses. The rotation angle can be calculated from the number of pulses, and a zero reference position is required to determine the absolute position of the rotating shaft. Absolute photoelectric encoders output a binary digital quantity that corresponds to each location of the axis; therefore, they obtain the absolute position directly.
Encoders have been widely used for many years on account of their compactness, long service life, ease of use and mature technology [13]. The resolution of an encoder depends on the number of lines scribed on the coded disc per revolution: more lines allow smaller angles to be discriminated, yielding higher resolution, but also a higher cost.
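The pulse-counting relation above can be made concrete with a minimal sketch. The line count and the quadrature factor here are assumed example values (a common configuration, not one specified in the text).

```python
# Sketch: converting incremental-encoder pulse counts to a shaft angle,
# measured from the zero reference position mentioned above.
# Line count and quadrature factor are assumed example values.

LINES_PER_REV = 1024   # scribed lines on the coded disc (assumed)
QUADRATURE = 4         # 4x decoding of the two square-wave channels

def shaft_angle_deg(pulse_count: int) -> float:
    """Angle turned since the zero reference, from counted pulses.
    Resolution = 360 / (lines * quadrature) degrees per pulse."""
    counts_per_rev = LINES_PER_REV * QUADRATURE
    return 360.0 * pulse_count / counts_per_rev

print(shaft_angle_deg(1024))  # a quarter turn: 90.0 degrees
```

Doubling the line count halves the angle represented by each pulse, which is exactly the resolution-versus-cost trade-off noted above.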

2.5. Other Sensors


Apart from the above four kinds of sensors, some other sensors are also deployed in industrial robots to realize certain functions, such as proximity sensors, inertial sensors, torque sensors, acoustic sensors, magnetic sensors and ultrasonic sensors.
The proximity sensor is a non-contact device that detects the approach of objects and outputs corresponding switching signals. According to its operating principle, it can be categorized as capacitive, inductive or photoelectric [14]. Capacitive proximity sensors perceive approaching objects through the change in circuit status caused by the capacitance variation of the detecting electrode. Inductive proximity sensors operate on electromagnetic induction: their sensing elements are detecting coils whose inductance changes when a metallic conductor comes near. Photoelectric proximity sensors are usually composed of a light-emitting diode and a photoelectric device. The light emitted from the diode is reflected onto the photoelectric device when objects approach, and the detecting circuit then generates the corresponding output signals.
Inertial sensors usually refer to accelerometers, gyroscopes and magnetometers. They are widely employed to measure the kinematic parameters of moving objects, including acceleration, angular speed and azimuthal angle. In particular, their triaxial combination is known as the inertial measurement unit (IMU). The measurement principle of inertial sensors is dead reckoning (DR), which uses integration to calculate the amount of an object's movement. The precision of inertial sensors is satisfying over a short time, but the error grows larger and larger as time goes by [15].
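The dead-reckoning integration just described can be sketched in one dimension. The sample values are invented for illustration; the sketch also makes the drift problem visible, since any constant accelerometer bias is integrated twice and therefore grows quadratically in position.

```python
# Minimal one-dimensional dead-reckoning sketch: integrating
# accelerometer samples twice (Euler method) to track position.
# Sample values are invented for illustration.

def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """Integrate acceleration -> velocity -> position."""
    v, x = v0, x0
    for a in accels:
        v += a * dt   # first integration: velocity
        x += v * dt   # second integration: position
    return x, v

# Constant 1 m/s^2 over 10 steps of 0.1 s: v = 1.0 m/s, x = 0.55 m
x, v = dead_reckon([1.0] * 10, 0.1)
print(x, v)
```

Feeding the same function a small constant bias instead of true acceleration shows how the position error accumulates without bound, which is why IMU-only navigation degrades over time as the text notes.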
Table 1. A brief summary of the presented sensors.

Sensor type        | Principle                                                 | Information obtained                          | Applications in industrial robots
Tactile sensors    | capacitive, piezoelectric, piezo-resistive, optical       | contact force, area, position                 | human-robot collaboration (HRC), object grasping, quality monitoring
Visual sensors     | CCD or CMOS imaging                                       | images                                        | HRC, navigation, manipulator control, assembly, robot programming
Laser sensors      | time of flight (TOF), triangulation, optical interference | distance, displacement                        | HRC, navigation, manipulator control
Encoders           | photoelectric, magnetic, inductive, capacitive            | angular displacement                          | navigation, manipulator control
Proximity sensors  | capacitive, inductive, photoelectric                      | approach of objects                           | HRC, object grasping
Inertial sensors   | dead reckoning (DR)                                       | acceleration, angular speed, azimuthal angle  | navigation, manipulator control
Torque sensors     | inductive, resistance strain                              | torque                                        | HRC, object grasping, robot programming
Acoustic sensors   | capacitive                                                | sound signals                                 | HRC, welding
Magnetic sensors   | Hall effect                                               | magnetic field intensity                      | navigation
Ultrasonic sensors | time of flight (TOF)                                      | distance                                      | obstacle avoidance
Torque sensors are mostly used to measure the torque exerted on a mechanical axis. The common types are the inductive torque sensor and the resistance-strain torque sensor. Their structure usually consists of a torsion bar and detecting elements such as coils or resistance strain gauges. The input shaft and the output shaft are connected by the torsion bar, and the torsional deformation of the bar caused by a torque is transduced into electric signals via changes in the parameters of the detecting elements, thus realizing the torque measurement.
Acoustic sensors transduce sound waves into electric signals. A capacitive electret microphone is installed in them: sound waves cause the electret membrane in the microphone to vibrate, changing the capacitance and generating a weak voltage, which is then transformed for subsequent processing.
Magnetic sensors are mainly deployed to detect magnetic field intensity. They are based on the Hall effect, the phenomenon in which, when an electric current flows through a conductor placed in a magnetic field, an electric field perpendicular to both the current and the magnetic field is generated, causing a potential difference across the surface of the conductor.
Ultrasonic sensors are often used to detect obstacles. They estimate the range of objects from the time between projecting the ultrasonic waves and detecting the echo. They are small and light, and their cost and power consumption are low [16].
Table 1 provides a brief summary of the above presented sensors.

3. Concrete Applications
Sensors are frequently deployed to help the controller make industrial robots implement their appointed functions, which involve various aspects such as human-robot collaboration (HRC), autonomous navigation and manipulator control. The following part presents and analyses the concrete applications of the above sensors in industrial robots.

3.1. Human-Robot Collaboration


Recently the concept of human-robot collaboration (HRC) has been put forward to realize the collaborative operation of workers and robots. This mode of production combines the cognitive ability and adaptability of human beings with the accuracy and tirelessness of robots to improve the flexibility and adaptability of manufacturing systems [17].
The primary concern in HRC is safety [18]. Robots should be able to detect and recognize objects to avoid collisions, or stop moving immediately when collisions occur. The sensors commonly applied to implement this function are vision sensors, laser sensors, proximity sensors, torque sensors and tactile sensors. For example, in [19, 20], tactile sensors are deployed to perceive physical contact and locate the position of collisions to protect the workers cooperating with the robot, and internal joint torque sensors are utilized in [21] to detect and classify collisions by estimating the external forces. However, tactile sensors and torque sensors share a shortcoming: they can only detect collisions, not avoid them. The other three kinds of sensors can fill this gap. In [22, 23], capacitive or photoelectric proximity sensors are deployed to detect approaching humans or objects so that the industrial robot can react in time before a collision happens. Depth cameras are used in [24] to recognize humans within the operating scope of industrial robots to ensure safety, and in [25], laser scanners are equipped to monitor the working circumstances of robots for collision avoidance. Furthermore, multi-sensor fusion is applied to guarantee collaborative safety in HRC. For instance, in [26], an RGBD camera, proximity sensors and a laser range finder are utilized to monitor the working area of industrial robots and detect approaching human beings. In [27], RGBD cameras and laser scanners are employed to recognize and track the workers in the working range of robots.
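The torque-based detection idea can be illustrated with a simplified sketch. This is not the method of any cited work: it only shows the general principle of flagging a collision when measured joint torque deviates from a model prediction; the threshold and torque values are assumed placeholders.

```python
# Simplified illustration of torque-based collision detection: a collision
# is flagged at a joint when the measured torque deviates from the torque
# predicted by the robot's dynamic model by more than a threshold.
# Threshold and sample torques are assumed placeholder values.

COLLISION_THRESHOLD_NM = 5.0  # assumed per-joint threshold in N*m

def detect_collision(measured_torques, predicted_torques,
                     threshold=COLLISION_THRESHOLD_NM):
    """Return indices of joints whose residual (external) torque exceeds
    the threshold, i.e. the likely contact locations."""
    return [i for i, (m, p) in enumerate(zip(measured_torques,
                                             predicted_torques))
            if abs(m - p) > threshold]

# Joint 2 shows a 12 N*m residual -> collision detected there.
print(detect_collision([1.0, 3.0, 20.0], [1.2, 2.5, 8.0]))  # [2]
```

In a real controller the predicted torques would come from the robot's dynamic model and the threshold would be tuned per joint; the sketch only captures the residual-comparison step.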
The interaction between humans and robots is also an important part of HRC, as it lets workers control robots without conventional computer programming. For example, speech recognition assisted by acoustic sensors is exploited in [28] to help humans interact with robots, but the performance of this method may not be satisfying in noisy environments. Targeting this problem, hand-gesture recognition with the help of vision sensors is proposed in [29] to control the robots.

3.2. AGV Navigation


Nowadays the AGV (Automated Guided Vehicle) is widely deployed in industrial environments for materials transportation, flexible production lines and other applications [30]. Autonomous navigation is its primary function, and the current technology can be classified into two main types: physical paths and virtual paths [31].
Physical paths mainly refer to magnetic tapes or optical lines. In this method, magnetic sensors [32] or vision sensors [33] are commonly used to detect the path and estimate the position deviation of the AGV, and the controller then makes corresponding adjustments to ensure that the AGV travels along the predetermined trajectory. The physical path has the advantages of low cost, high reliability and maturity, but its flexibility
is poor [31]. In contrast, the route of the AGV is more convenient to alter under virtual paths, for which the sensors frequently utilized are vision sensors, laser sensors, encoders and inertial sensors. For instance, in [34], a TOF camera is deployed to perceive the environment and to locate and guide a forklift
AGV. In [35], laser sensors are employed to calculate the position of AGV robots based on the triangulation method. Encoders and inertial sensors are also used to estimate the displacement of the AGV for navigation [36]. Additionally, multi-sensor fusion is utilized to improve the performance of navigation: in [37], magnetic sensors are added to calibrate the location of the AGV and correct the cumulative deviation of the encoders and inertial sensors, and TOF cameras and laser range finders are used together in [38] to complete autonomous navigation.
In addition to the above navigation technology, obstacle detection and avoidance is also an essential component of the automated travel of an AGV. Ultrasonic sensors [39] or laser sensors [40] are usually deployed to detect obstacles so that the controller can re-plan the route of the AGV for collision avoidance.
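The path-following adjustment described for physical paths can be sketched as a minimal proportional correction: the lateral deviation reported by a magnetic or vision sensor is turned into a steering command that drives the AGV back onto the guide line. The gain and saturation limit are assumed example values; real AGV controllers are typically more elaborate.

```python
# Minimal sketch of guide-path following: a proportional controller
# steers against the measured lateral deviation from the magnetic tape
# or optical line. Gain and saturation limit are assumed values.

K_P = 1.5             # proportional gain (assumed)
MAX_STEER_RAD = 0.5   # steering saturation (assumed)

def steering_correction(lateral_deviation_m: float) -> float:
    """Steering command opposing the measured deviation, saturated."""
    cmd = -K_P * lateral_deviation_m
    return max(-MAX_STEER_RAD, min(MAX_STEER_RAD, cmd))

print(steering_correction(0.1))   # -0.15 rad: steer back toward the path
print(steering_correction(-1.0))  # saturated at +0.5 rad
```

The saturation reflects the physical steering limit of the vehicle; without it, a large deviation would command an unrealizable turn.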

3.3. Manipulator Control


Manipulator control is of great importance to the accuracy of the various tasks implemented by industrial robots, such as mechanical processing, assembly and welding. It involves the position and motion-trajectory control of the end-effector. To obtain high precision, the errors caused by factors such as joint compliance, gear transmission and temperature variation need to be measured and corrected [41].
The sensors commonly used in this respect are laser sensors, vision sensors, inertial sensors and encoders. For example, in [42], laser trackers are deployed to measure the current location of the end-effector and obtain the absolute position deviation, so that the controller can calibrate in real time and improve the accuracy of the robot. In [43, 44], visual servoing is utilized to compensate the errors of the robot actuator promptly and realize accurate path tracking. An IMU is also used to measure the joint angles and estimate the position of the end-effector [45], but its precision is not very satisfying because of accumulated errors. In addition, the accuracy of the encoders equipped on the servo motors of the robot joints is limited because they cannot detect the deviation generated by mechanical deformation or gear transmission. Targeting this problem, secondary encoders are deployed on each output axis to obtain high positional accuracy of the actuator and improve the performance of mechanical processing [46]; however, they are rather expensive. Furthermore, the combination of inertial sensors and encoders is employed in [47] to obtain a better location estimate of the robot end-effector.

3.4. Objects Grasping and Assembly


Object grasping is a common operation in industrial manufacturing. To complete this task, the robot needs to recognize the target object and determine its position coordinates, and the actuator should then pick it up with an appropriate force to avoid damage or slippage. The sensors frequently applied in this process are vision sensors, proximity sensors, torque sensors and tactile sensors. In [48], depth cameras are deployed to identify the location and posture of the objects to help the controller plan the motion path of the robot end-effector, and in [49], proximity sensors are configured to help correct the approach deviation and improve the accuracy of grasping. The grasping force of the end-effector is ordinarily estimated and controlled by torque sensors [50]. Additionally, in order to detect and avoid slippage of the grasped object, tactile sensors are often employed to help adjust the picking force [51, 52].
Assembly is an important component of industrial production lines. Apart from the above operation of part grasping, recognition of the assembly position is also essential to autonomous assembly. In [53, 54], vision sensors are utilized to detect and localize the installation site to guide the end-effector in implementing the assembly. Moreover, torque sensors [55] or laser sensors [56] are also configured to help compensate the location error of the visual sensors and improve the accuracy and efficiency of assembly.

3.5. Other Applications


In addition to the applications introduced above, the sensors presented in Section 2 are also applied in other aspects of industrial robots, such as welding, quality monitoring and robot programming.
For instance, in [57, 58], visual sensors are deployed to capture real-time welding images, which are processed to obtain the position of the welding seam so that the controller can guide the end-effector to weld accurately. Acoustic sensors are also used in [59] to realize welding-seam tracking by analysing the characteristics of the arc sound signals, and in [60], vision sensors and acoustic sensors are integrated to monitor the weld penetration status and thus control the quality of the welding.
For quality monitoring on the production line, stereo vision sensors are utilized in [61] to inspect the quality of products via processing and feature extraction of the acquired images, and tactile sensors are deployed on the robot end-effector in [62] to implement quality control by perceiving the surface information of the products.
Currently, several intuitive, sensor-assisted methods are employed to simplify the process and improve the efficiency of robot programming. For example, in [63, 64], torque sensors are utilized to measure forces and torques and estimate the motion parameters of the robot manipulator during manual-guidance programming. In [65], cameras and a luminous marker are used to track the movement of the human wrist to realize demonstration programming of industrial robots. Additionally, in [66], tactile sensors are deployed to recognize tactile gestures for robot programming.

4. Conclusion
This article has reviewed the common sensors and their concrete applications in industrial robots in recent years. The literature shows that industrial robots are currently developing toward greater independence and intelligence, which will help realize more flexible, accurate and personalized industrial manufacturing. One challenge in their development is to build sensors with better properties to improve the robots' perception of the external environment and of their own status, which relies on the development of multidisciplinary technologies. Additionally, the algorithms for handling the information acquired from the sensors also need to be optimized to improve processing speed and accuracy, so as to meet the higher performance requirements of industrial robots.

References
[1] Keliang Zhou, Taigang Liu, Lifeng Zhou. Industry 4.0: Towards Future Industrial Opportunities
and Challenges. 12th International Conference on Fuzzy Systems and Knowledge Discovery
(FSKD), Zhangjiajie, China, August 15-17, 2015
[2] Tianmiao Wang, Yong Tao, 2014. Research status and industrialization development strategy of
Chinese industrial robot. Journal of Mechanical Engineering, 50(9), 1-13.
[3] Liang Zou, Chang Ge, Z. Jane Wang, Edmond Cretu, Xiaoou Li, 2017. Novel tactile sensor
technology and smart tactile sensing systems: a review. Sensors, 17, 2653.
[4] Uriel Martinez-Hernandez, 2016. Tactile sensors (Paris: Atlantis Press).
[5] Mohsin I. Tiwana, Stephen J. Redmond, Nigel H. Lovell, 2012. A review of tactile sensing
technologies with applications in biomedical engineering. Sensors and Actuators A: Physical, 179,
17-31.
[6] Francisco Yandun Narvaez, Giulio Reina, Miguel Torres-Torriti, George Kantor, Fernando Auat
Cheein, 2017. A Survey of Ranging and Imaging Techniques for Precision Agriculture
Phenotyping. IEEE/ASME Transactions on Mechatronics, 22(6), 2428-2439.
[7] Dave Litwiller, 2001. CCD vs. CMOS: Facts and Fiction. Photonics Spectra, Laurin Publishing
Co. Inc.
[8] Francisco Yandun Narvaez, Giulio Reina, Miguel Torres-Torriti, George Kantor, Fernando Auat
Cheein, 2017. A survey of ranging and imaging techniques for precision agriculture phenotyping.
IEEE/ASME Transactions on Mechatronics, 22(6), 2428-2439.
[9] Avanish Kumar Dubey, Vinod Yadava, 2008. Laser beam machining—A review. International
Journal of Machine Tools and Manufacture, 48(6), 609-628.


[10] J.D. Majumdar, I. Manna, 2003. Laser processing of materials. Sadhana, 28(3-4), 495-562.
[11] Markus-Christian Amann, Thierry M. Bosch, Marc Lescure, Risto A. Myllylae, Marc Rioux,
2001. Laser ranging: a critical review of unusual techniques for distance measurement. Optical
Engineering, 40(1), https://doi.org/10.1117/1.1330700.
[12] Josef Janisch, 2006. Summary of Untouched Circumgyrate Coder. Global Electronics China, 4,
53-54,56.
[13] Shan Chen, 2018. The review of patented encoder technology. China Science and Technology
Information, 22, 18-21.
[14] Dick Johnson, 2007. Proximity sensors. Computer Engineering & Software, 11, 34-38.
[15] Hossein Mousazadeh, 2013. A technical review on navigation systems of agricultural autonomous
off-road vehicles. Journal of Terramechanics, 50(3), 211-232.
[16] He Zhao, Zheyao Wang, 2012. Motion Measurement Using Inertial Sensors, Ultrasonic Sensors,
and Magnetometers With Extended Kalman Filter for Data Fusion. IEEE Sensors Journal, 12(5),
943-953.
[17] George Michalos, et al., 2014. ROBO-PARTNER: Seamless Human-Robot Cooperation for
Intelligent, Flexible and Safe Operations in the Assembly Factories of the Future. Procedia CIRP,
23, 71-76.
[18] Ales Vysocky, Petr Novak, 2016. Human-Robot Collaboration in Industry. MM Science Journal,
2, 903-906.
[19] John O'Neill, Jason Lu, Rodney Dockter, Timothy Kowalewski. Practical, Stretchable Smart Skin
Sensors for Contact-Aware Robots in Safe and Collaborative Interactions. IEEE International
Conference on Robotics and Automation (ICRA), Seattle, WA, USA, May 26-30, 2015.
[20] Markus Fritzsche, Jose Saenz, Felix Penzlin. A Large Scale Tactile Sensor for Safe Mobile Robot
Manipulation. 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI),
Christchurch, New Zealand, March 7-10, 2016.
[21] Dmitry Popov, Alexandr Klimchik, Nikolaos Mavridis. Collision Detection, Localization &
Classification for Industrial Robots with Joint Torque Sensors. 26th IEEE International
Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal, Aug
28-Sept 1, 2017.
[22] Alwin Hoffmann, Alexander Poeppel, Andreas Schierl, Wolfgang Reif. Environment-aware
Proximity Detection with Capacitive Sensors for Human-Robot-Interaction. IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, October 9-
14, 2016.
[23] Nicola Maria Ceriani, Andrea Maria Zanchettin, Paolo Rocco, Andreas Stolt, Anders Robertsson,
2015. Reactive Task Adaptation Based on Hierarchical Constraints Classification for Safe
Industrial Robots. IEEE/ASME Transactions on Mechatronics, 20(6), 2935-2949.
[24] Abdullah Mohammed, Bernard Schmidt, Lihui Wang, 2017. Active collision avoidance for
human–robot collaboration driven by vision sensors. International Journal of Computer
Integrated Manufacturing, 30(9), 970-980.
[25] Dario Antonelli, Giulia Bruno, 2017. Human-Robot Collaboration Using Industrial Robots.
Advances in Engineering Research, 86, 99-102.
[26] Stephan Kallweit, Robert Walenta, Michael Gottschalk, 2016. ROS Based Safety Concept for
Collaborative Robots in Industrial Applications. Advances in Robot Design and Intelligent
Control, 371, 27-35.
[27] José de Gea Fernández, et al, 2017. Multimodal sensor-based whole-body control for human–
robot collaboration in industrial settings. Robotics and Autonomous Systems, 94, 102-119.
[28] Jayashri Vajpai, Avnish Bora, 2016. Industrial Applications of Automatic Speech
Recognition Systems. Journal of Engineering Research and Applications, 6(3), 88-95.
[29] Gilbert Tang, Seemal Asif, Phil Webb, 2015. The integration of contactless static pose recognition
and dynamic hand motion tracking control system for industrial human and robot collaboration.
Industrial Robot: An International Journal, 42(5), 416-428.
[30] B.Y. Qi, Q.L. Yang, Y.Y. Zhou. Application of AGV in intelligent logistics system. 5th Asia
International Symposium on Mechatronics (AISM), Guilin, China, October 7-10, 2015.
[31] Alberto Vale, Rodrigo Ventura, Pedro Lopes, Isabel Ribeiro, 2017. Assessment of navigation
technologies for automated guided vehicle in nuclear fusion facilities. Robotics and Autonomous
Systems, 97, 153-170.
[32] BoYang Xu, Dongqing Wang. Magnetic Locating AGV Navigation Based on Kalman Filter and
PID Control. Chinese Automation Congress (CAC), Xi'an, China, November 30- December 2,
2018.
[33] Chunfu Wu, Xiaolong Wang, Qingxie Chen, Xiaowei Cai, Guodong Li. Research on Visual
Navigation Algorithm of AGV used in the Small Agile Warehouse. Chinese Automation
Congress (CAC), Jinan, China, October 20-22, 2017.
[34] Ulrich Behrje, Marian Himstedt, Erik Maehle. An Autonomous Forklift with 3D Time-of-
Flight Camera-Based Localization and Navigation. 15th International Conference on Control,
Automation, Robotics and Vision (ICARCV), Singapore, Singapore, November 18-21, 2018.
[35] Hang Li, Andrey V. Savkin, 2018. Robotics and Computer Integrated Manufacturing. Robotics
and Computer-Integrated Manufacturing, 54, 65-82.
[36] Hyunhak Cho, Eun Kyeong Kim, Eunseok Jang, Sungshin Kim, 2017. Improved Positioning
Method for Magnetic Encoder Type AGV Using Extended Kalman Filter and Encoder
Compensation Method. International Journal of Control, Automation and Systems, 15(4), 1844-
1856.
[37] Pingbao Yin, Wenfeng Li, Ying Duan. Combinatorial Inertial Guidance System for an Automated
Guided Vehicle. IEEE 15th International Conference on Networking, Sensing and Control
(ICNSC), Zhuhai, China, March 27-29, 2018.
[38] Andreas Dömel, et al, 2017. Toward fully autonomous mobile manipulation for industrial
environments. International Journal of Advanced Robotic Systems, 1-19.
[39] J. Sankari, R. Imtiaz. Automated guided vehicle (AGV) for industrial sector. 10th International
Conference on Intelligent Systems and Control (ISCO), Coimbatore, India, January 7-8, 2016.
[40] Elena Cardarelli, Valerio Digani, Lorenzo Sabattini, Cristian Secchi, Cesare Fantuzzi, 2017.
Cooperative cloud robotics architecture for the coordination of multi-AGV systems in industrial
warehouses. Mechatronics, 45, 1-13.
[41] Christian Moeller, et al, 2017. Real Time Pose Control of an Industrial Robotic System for
Machining of Large Scale Components in Aerospace Industry Using Laser Tracker System. SAE
Int. J. Aerosp, 10(2), 100-108.
[42] J. R. Diaz Posada, et al. High Accurate Robotic Drilling with External Sensor and Compliance
Model-Based Compensation. IEEE International Conference on Robotics and Automation
(ICRA), Stockholm, Sweden, May 16-21, 2016.
[43] Tingting Shu, Sepehr Gharaaty, WenFang Xie, Ahmed Joubair, Ilian A. Bonev, 2018. Dynamic
Path Tracking of Industrial Robots with High Accuracy Using Photogrammetry Sensor.
IEEE/ASME Transactions on Mechatronics, 23(3), 1159-1170.
[44] Mohammad Keshmiri, Wenfang Xie, 2017. Image-Based Visual Servoing Using an Optimized
Trajectory Planning Technique. IEEE/ASME Transactions on Mechatronics, 22(1), 359-370.
[45] Luciano Cantelli, Giovanni Muscato, Marco Nunnari, Davide Spina, 2015. A Joint-Angle
Estimation Method for Industrial Manipulators Using Inertial Sensors. IEEE/ASME Transactions
on Mechatronics, 20(5), 2486-2495.
[46] A. Klimchik, A. Pashkevich, 2018. Robotic manipulators with double encoders: accuracy
improvement based on advanced stiffness modeling and intelligent control. IFAC PapersOnLine,
51(11), 740–745.
[47] Benigno Munoz-Barron, Jesus R. Rivera-Guillen, Roque A. Osornio-Rios, Rene J. Romero-
Troncoso, 2015. Sensor Fusion for Joint Kinematic Estimation in Serial Robots Using Encoder,
Accelerometer and Gyroscope. Journal of Intelligent & Robotic Systems, 78(3-4), 529-540.
[48] Ryo Kabutan, et al. Development of Robotic Intelligent Space Using Multiple RGB-D
Cameras for Industrial Robots. ICT-ROBOT, Busan, Korea, September 2016.
[49] Om Prakash Sahu, Bibhuti Bhusan Biswal, Saptarshi Mukharjee, Panchanand Jha, 2015.
Development of Robotic End-effector Using Sensors for Part Recognition and Grasping.
International Journal of Materials Science and Engineering, 3(1), 39-43.
[50] Eduardo Moreira, Luís F. Rocha, Andry M. Pinto, A. Paulo Moreira, Germano Veiga, 2016.
Assessment of Robotic Picking Operations Using a 6 Axis Force/Torque Sensor. IEEE Robotics
and Automation Letters, 1(2), 768-775.
[51] Rocco A. Romeo, Calogero M. Oddo, Maria Chiara Carrozza, Eugenio Guglielmelli, Loredana
Zollo, 2017. Slippage Detection with Piezoresistive Tactile Sensors. Sensors, 17, 1844.
[52] Giuseppe De Maria, Pietro Falco, Ciro Natale, Salvatore Pirozzi. Integrated Force/Tactile Sensing:
The Enabling Technology for Slipping Detection and Avoidance. IEEE International Conference
on Robotics and Automation (ICRA), Seattle, Washington, USA, May 26-30, 2015.
[53] Chyi-Yeu Lin, Le Thai Son, Yu-Lun Chang, Ya-Shiun Shiue, 2017. Image-Sensor-Based Fast
Industrial-Robot Positioning System for Assembly Implementation. Sensors and Materials, 29(7),
935-945.
[54] Rafiq Ahmad, Peter Plapper, 2016. Safe and Automated Assembly Process using Vision assisted
Robot Manipulator. Procedia CIRP, 41, 771-776.
[55] Mustafa W. Abdullah, Hubert Roth, Michael Weyrich, Jürgen Wahrburg, 2015. An Approach for
Peg-in-Hole Assembling using Intuitive Search Algorithm based on Human Behavior and Carried
by Sensors Guided Industrial Robot. IFAC-PapersOnLine, 48(3), 1476–1481.
[56] Zhengke Qin, Peng Wang, Jia Sun, Jinyan Lu, Hong Qiao, 2016. Precise Robotic Assembly for
Large-Scale Objects Based on Automatic Guidance and Alignment. IEEE Transactions on
Instrumentation and Measurement, 65(6), 1398-1411.
[57] Yanling Xu, Gu Fang, Na Lv, Shanben Chen, Jujia Zou, 2015. Computer vision technology for
seam tracking in robotic GTAW and GMAW. Robotics and Computer-Integrated Manufacturing,
32, 25-36.
[58] Yanling Xu, et al, 2017. Welding seam tracking in robotic gas metal arc welding. Journal of
Materials Processing Tech., 248, 18-30.
[59] Na Lv, et al, 2017. Real-time monitoring of welding path in pulse metal-inert gas robotic welding
using a dual-microphone array. The International Journal of Advanced Manufacturing
Technology, 90(9-12), 2955–2968.
[60] Tao Zhu, Yonghua Shi, Shuwan Cui, Yanxin Cui, 2019. Recognition of Weld Penetration During
K-TIG Welding Based on Acoustic and Visual Sensing. Sensing and Imaging, 20, 3,
https://doi.org/10.1007/s11220-018-0224-9.
[61] Daniel Frank, Jimmy Chhor, Robert Schmitt. Stereo-vision for autonomous industrial inspection
robots. IEEE International Conference on Robotics and Biomimetics, Macau SAR, China,
December 5-8, 2017.
[62] Nathan F. Lepora, Benjamin Ward-Cherrier, 2016. Tactile Quality Control with Biomimetic
Active Touch. IEEE Robotics and Automation Letters, 1(2), 646-652.
[63] Rebecca Hollmann, Arne Rost, Martin Hägele, Alexander Verl. A HMM-based Approach to
Learning Probability Models of Programming Strategies for Industrial Robots. IEEE
International Conference on Robotics and Automation, Anchorage, Alaska, USA, May 3-8, 2010.
[64] Daniele Massa, Massimo Callegari, Cristina Cristalli, 2015. Manual guidance for industrial robot
programming. Industrial Robot: An International Journal, 42(5), 457-465.
[65] Marcos Ferreira, Paulo Costa, Luís Rocha, A. Paulo Moreira, 2016. Stereo-based real-time 6-DoF
work tool tracking for robot programing by demonstration. The International Journal of
Advanced Manufacturing Technology, 85(1-4), 57-69.
[66] Daniel Kubus, Arne Muxfeldt, Konrad Kissener, Jan Haus, Jochen Steil. Robust Recognition of
Tactile Gestures for Intuitive Robot Programming and Control. IEEE/RSJ International
Conference on Intelligent Robots and Systems (IROS), Vancouver, BC, Canada, September 24–
28, 2017.