Abstract. The application of industrial robots has greatly promoted the development of industry in the past decades. Now, with the proposal and prevalence of Industry 4.0, industrial robots are required to be more independent and intelligent in order to accomplish more complex and flexible tasks. The advancement of industrial robots relies on the development and progress of multiple technologies, among which sensors are an indispensable part: they acquire the abundant information that industrial robots need to implement their functions. This paper reviews the recent literature and gives a summary and introduction of the sensors commonly used in industrial robots. Additionally, the applications of these sensors in diverse functions of industrial robots are presented. Finally, the future development directions and challenges of industrial robots are discussed in the last part of this article.
1. Introduction
Nowadays many countries focus on the development of Industry 4.0 to promote the transition and upgrade of the manufacturing sector. Industry 4.0, which is considered to be the fourth industrial revolution, was proposed by Germany at the Hannover industrial fair in 2013. Its objective is to realize intelligent, flexible and personalized manufacturing processes with the help of the Internet of Things (IoT), digital information technology, and computer technology [1]. In Industry 4.0, industrial robots are an indispensable component.
Industrial robots have a history of more than 60 years [2]. Their development and utilization have vastly improved productivity and promoted the progress of industry, as well as relieving workers of onerous and repetitive tasks. Moreover, with the advancement of sensor technology, automation and computer information technology in recent years, industrial robots have become more intelligent. They can now implement multiple functions and are extensively employed in many fields of manufacturing, e.g., welding, assembly, materials transportation, mechanical processing, etc. [2].
Sensors are essential for industrial robots to accomplish their tasks. The information acquired by sensors can be used to judge the state of the robot and the circumstances of the external environment, thus helping to control and regulate the robot to execute appointed functions. This paper reviews the recent literature and summarizes the frequently used sensors and their applications in industrial robots.
The structure of this article is organized as follows: the introduction of the background in Section 1 is followed by the presentation and analysis of the common sensors employed in industrial robots in Section 2. Section 3 introduces the concrete applications of these sensors in industrial robots. The last section concludes this article and discusses future work.
Content from this work may be used under the terms of the Creative Commons Attribution 3.0 licence. Any further distribution
of this work must maintain attribution to the author(s) and the title of the work, journal citation and DOI.
Published under licence by IOP Publishing Ltd
AIACT 2019 IOP Publishing
IOP Conf. Series: Journal of Physics: Conf. Series 1267 (2019) 012036 doi:10.1088/1742-6596/1267/1/012036
2.4. Encoder
The encoder is a sensor that transduces angular displacement or velocity into electrical pulses or digital quantities. According to the detection principle, encoders can be divided into four categories: photoelectric, magnetic, inductive and capacitive [12].
The most frequently used type is the photoelectric encoder, which utilizes the photoelectric effect to complete the signal conversion. It is usually composed of an optical coded disc and a photoelectric detecting device. In terms of the calibration mode of the coded disc, photoelectric encoders can be classified into incremental encoders and absolute encoders. The output of an incremental photoelectric encoder is a series of square-wave pulses. The rotation angle can be calculated from the number of pulses, and a zero reference position is required to determine the absolute position of the rotating shaft. Absolute photoelectric encoders output a binary digital quantity corresponding to each location of the axis and can therefore obtain the absolute position directly.
Encoders have been widely used for many years on account of their compactness, long service life, ease of use, and mature technology [13]. The resolution of an encoder depends on the number of lines scribed on the coded disc in one circle: more lines allow smaller angles to be discriminated, giving higher resolution, but also a higher cost.
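The relationship between pulse count, disc line count and rotation angle described above can be sketched as follows. This is an illustrative example, not from the reviewed literature; the function name and the 4x quadrature assumption (counting both edges of the two output channels) are the author's own.

```python
import math

def pulses_to_angle(pulse_count, lines_per_rev, quadrature=4):
    """Convert an incremental-encoder pulse count to a shaft angle in radians.

    lines_per_rev: number of lines scribed on the coded disc per revolution.
    quadrature: counting both edges of both channels multiplies resolution by 4.
    """
    counts_per_rev = lines_per_rev * quadrature
    return 2 * math.pi * pulse_count / counts_per_rev

# A 1024-line disc counted in 4x quadrature resolves 2*pi/4096 rad per count,
# illustrating how more lines yield a finer discriminated angle.
resolution = pulses_to_angle(1, 1024)
```

Note that this gives only a relative angle; as the text states, an incremental encoder still needs a zero reference position to recover the absolute shaft position.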
photoelectric device. The light emitted from the diode is reflected back to the photoelectric device when an object approaches, and the detecting circuit then generates the corresponding output signal.
Inertial sensors usually refer to accelerometers, gyroscopes and magnetometers. They are widely employed to measure the kinematic parameters of moving objects, including acceleration, angular speed and azimuthal angle. In particular, a triaxial combination of them is known as an inertial measurement unit (IMU). The measurement principle of inertial sensors is dead reckoning (DR), which integrates the measured quantities to calculate the amount of the object's movement. The precision of inertial sensors is satisfactory over short intervals, but the error grows larger and larger as time goes by [15].
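The dead-reckoning principle, and the reason inertial estimates drift, can be illustrated with a minimal one-dimensional sketch (the function name and sampling setup are illustrative, not from the cited works):

```python
def dead_reckon(accels, dt, v0=0.0, x0=0.0):
    """1-D dead reckoning: integrate acceleration samples twice to get position.

    Any bias in the acceleration samples accumulates through both integrals,
    which is why the error of inertial estimates grows over time.
    """
    v, x = v0, x0
    for a in accels:
        v += a * dt  # first integral: velocity
        x += v * dt  # second integral: position
    return x, v

# Constant 1 m/s^2 for 1 s, sampled every 10 ms:
x, v = dead_reckon([1.0] * 100, 0.01)
```

A constant measurement bias b would add roughly b*t to velocity and b*t^2/2 to position, growing without bound, which motivates the landmark-based corrections discussed in Section 3.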
Table 1. A brief summary of the presented sensors.
| Sensor type | Principle | Information obtained | Applications in industrial robots |
|---|---|---|---|
| Tactile sensors | capacitive, piezoelectric, piezoresistive, optical | contact force, area, position | human-robot collaboration (HRC), object grasping, quality monitoring |
| Visual sensors | CCD or CMOS imaging | images | HRC, navigation, manipulator control, assembly, robot programming |
| Laser sensors | time of flight (TOF), triangulation, optical interference | distance, displacement | HRC, navigation, manipulator control |
| Encoders | photoelectric, magnetic, inductive, capacitive | angular displacement | navigation, manipulator control |
| Proximity sensors | capacitive, inductive, photoelectric | approach of objects | HRC, object grasping |
| Inertial sensors | dead reckoning (DR) | acceleration, angular speed, azimuthal angle | navigation, manipulator control |
| Torque sensors | inductive, resistance strain | torque | HRC, object grasping, robot programming |
| Acoustic sensors | capacitive | sound signals | HRC, welding |
| Magnetic sensors | Hall effect | magnetic field intensity | navigation |
| Ultrasonic sensors | time of flight (TOF) | distance | obstacle avoidance |
Torque sensors are mostly used to measure the torque exerted on a mechanical axis. The common types are the inductive torque sensor and the resistance-strain torque sensor. Their structure usually consists of a torsion bar and detecting elements such as coils or resistance strain gauges. The input shaft and the output shaft are connected by the torsion bar, and the torsional deformation of the bar caused by a torque is transduced into electric signals via the variation of the parameters of the detecting elements, thus realizing the torque measurement.
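For a torsion bar in its elastic range, the measured twist maps back to the applied torque through the standard elastic torsion formula tau = G*J*theta/L. The following sketch applies that textbook relation; it is an illustration with made-up parameter values, not a model of any specific sensor from the cited works.

```python
def torsion_bar_torque(twist_rad, shear_modulus, polar_moment, bar_length):
    """Torque recovered from the measured torsional deformation of the bar.

    tau = G * J * theta / L  (elastic torsion formula)
    twist_rad:     measured twist angle theta (rad)
    shear_modulus: G of the bar material (Pa)
    polar_moment:  J, polar second moment of area of the cross-section (m^4)
    bar_length:    L, length of the torsion bar (m)
    """
    return shear_modulus * polar_moment * twist_rad / bar_length

# Example with plausible steel-bar values: G = 80 GPa, J = 1e-8 m^4, L = 0.1 m.
tau = torsion_bar_torque(0.01, 80e9, 1e-8, 0.1)
```

In a real sensor the twist itself is inferred from the detecting elements (coil inductance or strain-gauge resistance), so this is the final step of the measurement chain.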
Acoustic sensors transduce sound waves into electric signals. A capacitive electret microphone is installed in them; sound waves cause the electret membrane in the microphone to vibrate, changing the capacitance and generating a weak voltage, which is then transformed for subsequent processing.
Magnetic sensors are mainly deployed to detect magnetic field intensity. Their principle is the Hall effect, the phenomenon whereby, when a current flows through a conductor placed in a magnetic field, an electric field perpendicular to both the current and the magnetic field is generated, causing a potential difference across the surface of the conductor.
Ultrasonic sensors are often used to detect obstacles. They estimate the range of an object from the time between emitting an ultrasonic wave and detecting its echo. They are small and light, and their cost and power consumption are low [16].
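The time-of-flight ranging just described reduces to one line of arithmetic: the wave travels to the obstacle and back, so the one-way distance is half the round-trip path. A minimal sketch (the function name and the nominal speed of sound in air are the author's illustrative choices):

```python
def echo_distance(time_of_flight_s, speed_of_sound=343.0):
    """Range from echo time: the wave travels out and back, so divide by 2.

    speed_of_sound: ~343 m/s in air at room temperature; it varies with
    temperature, which is one practical error source for these sensors.
    """
    return speed_of_sound * time_of_flight_s / 2.0

# A 10 ms round trip at 343 m/s corresponds to a range of 1.715 m.
d = echo_distance(0.010)
```

The same relation underlies the laser TOF sensors listed in Table 1, only with the speed of light in place of the speed of sound.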
Table 1 provides a brief summary of the above presented sensors.
3. Concrete Applications
Sensors are frequently deployed to assist the controller to control industrial robots to implement
appointed functions, which involve various aspects, such as human-robot collaboration (HRC),
autonomous navigation, manipulator control, etc. In the following part, this article gives a presentation
and analysis of the above sensors’ concrete applications in industrial robots.
instance, in [34], a TOF camera is deployed to perceive the environment and to locate and guide a forklift AGV. In [35], laser sensors are employed to calculate the position of AGV robots based on the triangulation method. Encoders and inertial sensors are also used to estimate the displacement of the AGV for navigation [36]. Additionally, multi-sensor fusion is utilized to improve navigation performance. In [37], magnetic sensors are added to calibrate the location of the AGV and correct the cumulative deviation of the encoders and inertial sensors. And TOF cameras and laser range finders are used together to accomplish autonomous navigation in [38].
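The fusion idea in [37] — an absolute fix correcting a drifting dead-reckoned estimate — can be caricatured in one dimension. This is a deliberately simplified complementary-style correction chosen by the author for illustration; the cited work uses a proper Kalman-filter formulation.

```python
def fuse_position(odometry_x, landmark_x, gain=0.3):
    """Pull the drifting dead-reckoned estimate toward an absolute fix.

    odometry_x: position from encoders/inertial sensors (accumulates drift)
    landmark_x: absolute position observed at e.g. a magnetic landmark
    gain:       0..1, how strongly the fix overrides the odometry
                (a Kalman filter computes this weight from the error
                covariances instead of fixing it)
    """
    return odometry_x + gain * (landmark_x - odometry_x)

# Odometry says 10.0 m, a landmark observation says 9.0 m:
corrected = fuse_position(10.0, 9.0, gain=0.5)
```

Between landmarks the AGV runs on odometry alone; each landmark crossing resets the accumulated deviation, which is exactly the role magnetic sensors play in [37].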
In addition to the above navigation technology, obstacle detection and avoidance is also an essential component of automated AGV travel. Ultrasonic sensors [39] or laser sensors [40] are usually deployed to detect obstacles and help the controller re-plan the AGV's route for collision avoidance.
other aspects of industrial robots, such as welding, quality monitoring, robot programming, etc.
For instance, in [57, 58], visual sensors are deployed to capture real-time welding images, which are processed to obtain the position of the welding seam so that the controller can direct the end-effector to execute the weld accurately. Furthermore, acoustic sensors are used to realize welding-seam tracking by analysing the characteristics of arc sound signals in [59]. And in [60], vision sensors and acoustic sensors are integrated to monitor the weld penetration status and thereby control welding quality.
As for quality monitoring on the production line, in [61], stereo vision sensors are utilized to inspect product quality through the processing and feature extraction of the acquired images. Tactile sensors are also deployed on the robot end-effector to implement quality control by perceiving the surface information of products in [62].
Currently, some intuitive methods are employed, with the help of several sensors, to simplify the process and improve the efficiency of robot programming. For example, in [63, 64], torque sensors are utilized to measure forces and torques and thereby estimate the motion parameters of the robot manipulator during manual guidance programming. In [65], cameras and a luminous marker are used to track the movement of the human wrist to realize demonstration programming of industrial robots. Additionally, in [66], tactile sensors are deployed to recognize tactile gestures for robot programming.
4. Conclusion
This article has reviewed the common sensors and their concrete applications in industrial robots in recent years. From the literature, it can be seen that industrial robots are currently developing toward greater independence and intelligence, which will help realize more flexible, accurate and personalized industrial manufacturing. However, one challenge in the development of industrial robots is to develop sensors with better properties to improve the robots' perception of the external environment and of their own status, which relies on the development of multidisciplinary technologies. Additionally, the algorithms for handling the information acquired from sensors also need to be optimized to improve processing speed and accuracy and thus meet the higher performance requirements of industrial robots.
References
[1] Keliang Zhou, Taigang Liu, Lifeng Zhou. Industry 4.0: Towards Future Industrial Opportunities
and Challenges. 12th International Conference on Fuzzy Systems and Knowledge Discovery
(FSKD), Zhangjiajie, China, August 15-17, 2015
[2] Tianmiao Wang, Yong Tao, 2014. Research status and industrialization development strategy of
Chinese industrial robot. Journal of Mechanical Engineering, 50(9), 1-13.
[3] Liang Zou, Chang Ge, Z. Jane Wang, Edmond Cretu, Xiaoou Li, 2017. Novel tactile sensor
technology and smart tactile sensing systems: a review. Sensors, 17, 2653.
[4] Uriel Martinez-Hernandez, 2016. Tactile sensors (Paris: Atlantis Press).
[5] Mohsin I. Tiwana, Stephen J. Redmond, Nigel H. Lovell, 2012. A review of tactile sensing
technologies with applications in biomedical engineering. Sensors and Actuators A: Physical, 179,
17-31.
[6] Francisco Yandun Narvaez, Giulio Reina, Miguel Torres-Torriti, George Kantor, Fernando Auat
Cheein, 2017. A Survey of Ranging and Imaging Techniques for Precision Agriculture
Phenotyping. IEEE/ASME Transactions on Mechatronics, 22(6), 2428-2439.
[7] Dave Litwiller, 2001. CCD vs. CMOS: Facts and Fiction. Photonics Spectra, Laurin Publishing
Co. Inc.
[8] Francisco Yandun Narvaez, Giulio Reina, Miguel Torres-Torriti, George Kantor, Fernando Auat
Cheein, 2017. A survey of ranging and imaging techniques for precision agriculture phenotyping.
IEEE/ASME Transactions on Mechatronics, 22(6), 2428-2439.
[9] Avanish Kumar Dubey, Vinod Yadava, 2008. Laser beam machining—A review. International
Journal of Machine Tools and Manufacture, 48(6), 609-628.
[10] J.D. Majumdar, I. Manna, 2003. Laser processing of materials. Sadhana, 28(3-4), 495-562.
[11] Markus-Christian Amann, Thierry M. Bosch, Marc Lescure, Risto A. Myllylae, Marc Rioux,
2001. Laser ranging: a critical review of unusual techniques for distance measurement. Optical
Engineering, 40(1), https://doi.org/10.1117/1.1330700.
[12] Josef Janisch, 2006. Summary of Untouched Circumgyrate Coder. Global Electronics China, 4,
53-54,56.
[13] Shan Chen, 2018. The review of patented encoder technology. China Science and Technology
Information, 22, 18-21.
[14] Dick Johnson, 2007. Proximity sensors. Computer Engineering & Software, 11, 34-38.
[15] Hossein Mousazadeh, 2013. A technical review on navigation systems of agricultural autonomous
off-road vehicles. Journal of Terramechanics, 50(3), 211-232.
[16] He Zhao, Zheyao Wang, 2012. Motion Measurement Using Inertial Sensors, Ultrasonic Sensors,
and Magnetometers With Extended Kalman Filter for Data Fusion. IEEE Sensors Journal, 12(5),
943-953.
[17] George Michalos, et al, 2014. ROBO-PARTNER: Seamless Human-Robot Cooperation for
Intelligent, Flexible and Safe Operations in the Assembly Factories of the Future. Procedia CIRP,
23, 71-76.
[18] Ales Vysocky, Petr Novak, 2016. Human-Robot Collaboration in Industry. MM Science Journal,
2, 903-906.
[19] John O'Neill, Jason Lu, Rodney Dockter, Timothy Kowalewski. Practical, Stretchable Smart Skin
Sensors for Contact-Aware Robots in Safe and Collaborative Interactions. IEEE International
Conference on Robotics and Automation (ICRA), Seattle, WA, USA, May 26-30, 2015.
[20] Markus Fritzsche, Jose Saenz, Felix Penzlin. A Large Scale Tactile Sensor for Safe Mobile Robot
Manipulation. 11th ACM/IEEE International Conference on Human-Robot Interaction (HRI),
Christchurch, New Zealand, March 7-10, 2016.
[21] Dmitry Popov, Alexandr Klimchik, Nikolaos Mavridis. Collision Detection, Localization &
Classification for Industrial Robots with Joint Torque Sensors. 26th IEEE International
Symposium on Robot and Human Interactive Communication (RO-MAN), Lisbon, Portugal, Aug
28-Sept 1, 2017.
[22] Alwin Hoffmann, Alexander Poeppel, Andreas Schierl, Wolfgang Reif. Environment-aware
Proximity Detection with Capacitive Sensors for Human-Robot-Interaction. IEEE/RSJ
International Conference on Intelligent Robots and Systems (IROS), Daejeon, Korea, October 9-
14, 2016.
[23] Nicola Maria Ceriani, Andrea Maria Zanchettin, Paolo Rocco, Andreas Stolt, Anders Robertsson,
2015. Reactive Task Adaptation Based on Hierarchical Constraints Classification for Safe
Industrial Robots. IEEE/ASME Transactions on Mechatronics, 20(6), 2935-2949.
[24] Abdullah Mohammed, Bernard Schmidt, Lihui Wang, 2017. Active collision avoidance for
human–robot collaboration driven by vision sensors. International Journal of Computer
Integrated Manufacturing, 30(9), 970-980.
[25] Dario Antonelli, Giulia Bruno, 2017. Human-Robot Collaboration Using Industrial Robots.
Advances in Engineering Research, 86, 99-102.
[26] Stephan Kallweit, Robert Walenta, Michael Gottschalk, 2016. ROS Based Safety Concept for
Collaborative Robots in Industrial Applications. Advances in Robot Design and Intelligent
Control, 371, 27-35.
[27] José de Gea Fernández, et al, 2017. Multimodal sensor-based whole-body control for human–
robot collaboration in industrial settings. Robotics and Autonomous Systems, 94, 102-119.
[28] Dr. Jayashri Vajpai, Avnish Bora, 2016. Industrial Applications of Automatic Speech
Recognition Systems. Journal of Engineering Research and Applications, 6(3), 88-95.
[29] Gilbert Tang, Seemal Asif, Phil Webb, 2015. The integration of contactless static pose recognition
and dynamic hand motion tracking control system for industrial human and robot collaboration.
Industrial Robot: An International Journal, 42(5), 416-428.
[30] B.Y. Qi, Q.L. Yang, Y.Y. Zhou. Application of AGV in intelligent logistics system. 5th Asia
International Symposium on Mechatronics (AISM), Guilin, China, October 7-10, 2015.
[31] Alberto Vale, Rodrigo Ventura, Pedro Lopes, Isabel Ribeiro, 2017. Assessment of navigation
technologies for automated guided vehicle in nuclear fusion facilities. Robotics and Autonomous
Systems, 97, 153-170.
[32] BoYang Xu, Dongqing Wang. Magnetic Locating AGV Navigation Based on Kalman Filter and
PID Control. Chinese Automation Congress (CAC), Xi'an, China, November 30- December 2,
2018.
[33] Chunfu Wu, Xiaolong Wang, Qingxie Chen, Xiaowei Cai, Guodong Li. Research on Visual
Navigation Algorithm of AGV used in the Small Agile Warehouse. Chinese Automation
Congress (CAC), Jinan, China, October 20-22, 2017.
[34] Ulrich Behrje, Marian Himstedt, and Erik Maehle. An Autonomous Forklift with 3D Time-of-
Flight Camera-Based Localization and Navigation. 15th International Conference on Control,
Automation, Robotics and Vision (ICARCV), Singapore, Singapore, November 18-21, 2018.
[35] Hang Li, Andrey V. Savkin, 2018. Robotics and Computer Integrated Manufacturing. Robotics
and Computer Integrated Manufacturing, 54, 65-82.
[36] Hyunhak Cho, Eun Kyeong Kim, Eunseok Jang, Sungshin Kim, 2017. Improved Positioning
Method for Magnetic Encoder Type AGV Using Extended Kalman Filter and Encoder
Compensation Method. International Journal of Control, Automation and Systems, 15(4), 1844-
1856.
[37] Pingbao Yin, Wenfeng Li, Ying Duan. Combinatorial Inertial Guidance System for an Automated
Guided Vehicle. IEEE 15th International Conference on Networking, Sensing and Control
(ICNSC), Zhuhai, China, March 27-29, 2018.
[38] Andreas Dömel, et al, 2017. Toward fully autonomous mobile manipulation for industrial
environments. International Journal of Advanced Robotic Systems, 1-19.
[39] J. Sankari, R. Imtiaz. Automated guided vehicle (AGV) for industrial sector. 10th International
Conference on Intelligent Systems and Control (ISCO), Coimbatore, India, January 7-8, 2016.
[40] Elena Cardarelli, Valerio Digani, Lorenzo Sabattini, Cristian Secchi, Cesare Fantuzzi, 2017.
Cooperative cloud robotics architecture for the coordination of multi-AGV systems in industrial
warehouses. Mechatronics, 45, 1-13.
[41] Christian Moeller, et al, 2017. Real Time Pose Control of an Industrial Robotic System for
Machining of Large Scale Components in Aerospace Industry Using Laser Tracker System. SAE
Int. J. Aerosp, 10(2), 100-108.
[42] J. R. Diaz Posada, et al. High Accurate Robotic Drilling with External Sensor and Compliance
Model-Based Compensation. IEEE International Conference on Robotics and Automation
(ICRA), Stockholm, Sweden, May 16-21, 2016.
[43] Tingting Shu, Sepehr Gharaaty, WenFang Xie, Ahmed Joubair, Ilian A. Bonev, 2018. Dynamic
Path Tracking of Industrial Robots with High Accuracy Using Photogrammetry Sensor.
IEEE/ASME Transactions on Mechatronics, 23(3), 1159-1170.
[44] Mohammad Keshmiri, Wenfang Xie, 2017. Image-Based Visual Servoing Using an Optimized
Trajectory Planning Technique. IEEE/ASME Transactions on Mechatronics, 22(1), 359-370.
[45] Luciano Cantelli, Giovanni Muscato, Marco Nunnari, Davide Spina, 2015. A Joint-Angle
Estimation Method for Industrial Manipulators Using Inertial Sensors. IEEE/ASME Transactions
on Mechatronics, 20(5), 2486-2495.
[46] A. Klimchik, A. Pashkevich, 2018. Robotic manipulators with double encoders: accuracy
improvement based on advanced stiffness modeling and intelligent control. IFAC PapersOnLine,
51(11), 740–745.
[47] Benigno Munoz-Barron, Jesus R. Rivera-Guillen, Roque A. Osornio-Rios, Rene J. Romero-
Troncoso, 2015. Sensor Fusion for Joint Kinematic Estimation in Serial Robots Using Encoder,
Accelerometer and Gyroscope. Journal of Intelligent & Robotic Systems, 78(3-4), 529-540.
[48] Ryo KABUTAN, et al. Development of Robotic Intelligent Space Using Multiple RGB-D