SSRG International Journal of Electrical and Electronics Engineering, Volume 11, Issue 5, 261-267, May 2024
ISSN: 2348-8379 / https://doi.org/10.14445/23488379/IJEEE-V11I5P123 © 2024 Seventh Sense Research Group®

Original Article

An Investigation of a Novel Control Strategy for AMR Mapping and Inventory Management Using a Lidar Sensor

Thi-Mai-Phuong Dao¹, Thu-Ha Nguyen¹, Duy-Thuan Vu², Thi-Duyen Bui³, Duc-Hiep Nguyen¹, Ngoc-Khoat Nguyen³*

¹Faculty of Electrical Engineering, Hanoi University of Industry, Hanoi, Vietnam.
²Faculty of Energy Technology, Electric Power University, Hanoi, Vietnam.
³Faculty of Control and Automation, Electric Power University, Hanoi, Vietnam.

*Corresponding Author: khoatnn@epu.edu.vn

Received: 14 March 2024    Revised: 14 April 2024    Accepted: 12 May 2024    Published: 29 May 2024

This is an open access article under the CC BY-NC-ND license (http://creativecommons.org/licenses/by-nc-nd/4.0/)

Abstract - Traditional material handling methods in industrial environments often involve manually operated carts and forklifts. These methods are labor-intensive, inherently hazardous, and characterized by repetitive tasks. Autonomous Mobile Robots (AMRs) have emerged as a promising solution to address these limitations, offering the potential for significant efficiency improvements in factories and warehouses. This paper presents a research study on an AMR system designed for material handling applications. The robot is built on the Raspberry Pi 3B+ embedded computing platform and utilizes the LM298 power amplifier module. These hardware components are integrated to form a robust control system. The Raspberry Pi serves as the Central Processing Unit (CPU), receiving sensor data from the A1M8 Lidar sensor. The processed data is then transmitted to the Arduino Mega 2560 microcontroller, which controls the LM298 driver circuit. The Lidar sensor plays a critical role in constructing a map of the surrounding environment and providing essential data to the Raspberry Pi. The LM298 driver circuit effectively controls the motors, enabling the robot's movement. The Raspberry Pi's Broadcom BCM2837B0 processor, a quad-core Cortex-A53 (ARMv8) 64-bit SoC operating at 1.4 GHz, ensures efficient data processing and control capabilities. Experimental results verify the applicability of the control system proposed in this study in achieving reliable and efficient material handling operations.

Keywords - AMR, AMR vehicle, ROS, Mapping, Lidar sensor.

1. Introduction
Amidst the current drive for economic growth, factories and workshops are rapidly expanding in scale and capacity, accompanied by a corresponding increase in workshop size. However, the movement of goods and materials within workshops has become a labor-intensive and time-consuming task. Autonomous Mobile Robots (AMRs) are gaining widespread adoption globally, playing a pivotal role in reducing the labor burden on humans and delivering superior performance.

The AMRs, with their self-navigation capabilities, represent a class of mobile robots that operate without the need for human intervention. They are characterized by their ability to self-localize, navigate, and plan their movement based on information gathered from integrated sensors and control systems. AMRs are typically equipped with a suite of sensors, including Lidar, cameras, tactile sensors, and infrared sensors, to collect data from the surrounding environment. Utilizing data processing algorithms and control systems, AMRs are able to determine their location, create maps, and detect and avoid obstacles when moving. AMRs employ the popular Robot Operating System (ROS) environment to deploy autonomous functionalities. ROS provides software packages, tools, and methods for building programs that control self-directed robots [1].

The AMRs represent one of the fastest-growing scientific research fields today. Their applications span a vast spectrum of domains, including industrial automation, agriculture, healthcare, eldercare, planetary exploration, entertainment, search and rescue operations, transportation, personal services, product distribution, smart warehousing, construction, sports, self-driving cars, unmanned aerial vehicle applications, and numerous others. AMRs are particularly adept at assisting humans in performing tasks in hazardous or specialized environments [2].

This diverse range of applications ensures robust growth and economic impact. Even amidst the global industry disruption caused by the COVID-19 pandemic, the mobile robotics market was projected to continue its record-breaking growth, reaching USD 23 billion in 2021 and USD 54 billion by 2023 [3, 4]. Therefore, AMRs are widely employed and will serve as the focus of this paper's research [5-10].
This paper proposes novel research and design for an AMR carrying an embedded computer and a Lidar sensor operating within a factory environment. The Lidar sensor utilized in this study is the A1M8 Lidar, which gathers spatial information surrounding the robot. Concurrently, the paper presents an intelligent control method based on Lidar data processing algorithms to ensure the robot can accurately determine its position, recognize objects, and interact with them in the working environment.

The research integrates systems for map building and navigation on the constructed map. This work developed an algorithm to identify the shortest path and move along the planned path. Subsequently, the research established and constructed an application based on the software packages available in the ROS community and developed additional software packages to suit practical requirements.

2. Hardware System Design and Dynamical Analysis
2.1. Hardware System Block Diagram
The AMR autonomous vehicle is constructed from the following two hardware components: an LM298 power amplifier and a Raspberry Pi 3B+ computer. These hardware devices are interconnected according to the connection diagram in Figure 1 (Lidar sensor → Raspberry Pi → Arduino Mega 2560 microcontroller → LM298 module → motors with encoders). At the heart of the system lies the Raspberry Pi, serving as the central processing unit.

Fig. 1 Configuration of the hardware system

It receives signals from the Lidar sensor, processes them, and relays the processed signals to the Arduino Mega 2560 microcontroller for control of the LM298. The Lidar plays a crucial role in scanning the surrounding environment and transmitting the acquired map data to the Raspberry Pi for further processing. The LM298 acts as a buffer stage, amplifying and conditioning the control signals to drive the motors. The Lidar sensor A1M8 operates on the principle of emitting and receiving laser pulses to determine the distance and spatial information surrounding the sensor. The fundamental operational principle of the sensor employs Laser Imaging Detection and Ranging (Lidar) technology.

This sensor emits a narrow laser beam and tracks the time it takes for the laser beam to travel from the sensor to objects within the scanning range. Upon encountering an object, the laser beam is reflected and returned to the sensor. The A1M8 LiDAR sensor operates by employing a combination of filters and highly sensitive detectors. This system captures and measures the time-of-flight of a laser pulse, which is the time required for the pulse to reflect off a target and return to the sensor. The Lidar A1M8 employs a laser beam to measure distances. By varying the beam's angle, it constructs a 3D point cloud depicting the surrounding environment. Analysis of the generated point cloud data subsequently enables the localization and shape recognition of objects within the sensor's field of view.

Fig. 2 Lidar sensor A1M8 [2] (specifications: angular range 0-360°; distance range 0.15-6 m; resolution 0.5 mm; power supply 5 V; communication interface UART/USB)
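To make this concrete, each 360° scan of (angle, distance) pairs can be projected into Cartesian points before being fed to the mapping stack. The following is a minimal sketch, assuming a generic driver that yields (angle in degrees, distance in metres) tuples; the function name and data format are illustrative, not the vendor API.

```python
import math

def scan_to_points(scan, min_range=0.15, max_range=6.0):
    """Convert one 360-degree Lidar scan into 2D Cartesian points.

    scan: iterable of (angle_deg, distance_m) pairs, as produced by a
    generic RPLidar driver (illustrative format, not the vendor API).
    Ranges outside the A1M8's rated 0.15-6 m window are discarded.
    """
    points = []
    for angle_deg, dist in scan:
        if not (min_range <= dist <= max_range):
            continue  # drop out-of-range or failed returns
        theta = math.radians(angle_deg)
        # Sensor frame: x forward, y to the left (REP-103 convention).
        points.append((dist * math.cos(theta), dist * math.sin(theta)))
    return points

# Example: three beams at 0, 90, and 180 degrees, all 1 m away.
print(scan_to_points([(0, 1.0), (90, 1.0), (180, 1.0)]))
```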

2.2. Design of the AMR-Based Vehicle
In the design of autonomous vehicles, achieving simplicity, symmetrical structure, and ease of control is paramount. Utilizing basic geometric shapes with bilateral symmetry facilitates maneuverability. The design of the vehicle incorporates a two-tier architecture [11-14].

The uppermost stratum of the system houses the control circuitry, motor driver units, and an integrated computer. The lower tier accommodates the lifting frame for the Lidar sensor. The real technical specifications are presented in Appendix 1.

The AMRs are designed for autonomous operation and receive control signals transmitted by the Robot Operating System (ROS). These signals typically specify the desired linear velocity (v) and rotational velocity (ω). The embedded program on the AMR is then responsible for interpreting these commands and controlling the vehicle's actuators to achieve the specified velocities.

Fig. 3 Mechanical design of the AMR vehicle

Fig. 4 Four-wheel construction of the AMR vehicle robot

The data relating to the velocity of the robot obtained from the sensors must be converted into the desired translational velocities $v_x$, $v_y$ and rotational speed $\omega_z$. This work aims to achieve perfect velocity matching between the vehicle's center and the target velocity. To address the unique challenges posed by the four-wheel mecanum configuration with its offset rollers, a multi-differential system is implemented.

This system functions similarly to conventional differential systems by regulating rotational forces applied to each wheel. However, unlike conventional differentials, the multi-differential system allows for independent lateral movement of each wheel, granting the vehicle improved maneuverability [5]. The determination of each wheel's specific angular speed is based on kinetic analysis. The four rotational velocities corresponding to the four wheels are computed in (1):

$$
\begin{cases}
\omega_1 = \dfrac{1}{r}\left(v_x - v_y - (l_x + l_y)\,\omega_z\right),\\
\omega_2 = \dfrac{1}{r}\left(v_x + v_y + (l_x + l_y)\,\omega_z\right),\\
\omega_3 = \dfrac{1}{r}\left(v_x + v_y - (l_x + l_y)\,\omega_z\right),\\
\omega_4 = \dfrac{1}{r}\left(v_x - v_y + (l_x + l_y)\,\omega_z\right),
\end{cases} \tag{1}
$$

From (1), it is straightforward to deduce the following:

$$
\begin{bmatrix} v_x \\ v_y \\ \omega_z \end{bmatrix}
= \frac{r}{4}
\begin{bmatrix}
1 & 1 & 1 & 1 \\
-1 & 1 & 1 & -1 \\
-\dfrac{1}{l_x + l_y} & \dfrac{1}{l_x + l_y} & -\dfrac{1}{l_x + l_y} & \dfrac{1}{l_x + l_y}
\end{bmatrix}
\begin{bmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \\ \omega_4 \end{bmatrix} \tag{2}
$$

The linear velocity in the x-direction is calculated in (3):

$$v_x(t) = \frac{r}{4}\,(\omega_1 + \omega_2 + \omega_3 + \omega_4) \tag{3}$$

The linear velocity in the y-direction is computed in (4):

$$v_y(t) = \frac{r}{4}\,(-\omega_1 + \omega_2 + \omega_3 - \omega_4) \tag{4}$$

The rotational speed should be computed as follows:

$$\omega_z(t) = \frac{r}{4(l_x + l_y)}\,(-\omega_1 + \omega_2 - \omega_3 + \omega_4) \tag{5}$$

where $v_x$ and $v_y$ denote the two components of the vehicle velocity in the x and y directions; $l_x$ is half the front wheel distance; $l_y$ is half the distance between the two wheels (front and rear sides); $\omega_z$ denotes the rotational speed; $\omega_1$, $\omega_2$, $\omega_3$, $\omega_4$ are the four rotational velocities corresponding to the four wheels; and $r$ denotes the radius of each wheel of the robot.
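To make the kinematics concrete, the sketch below implements Equation (1) and Equations (3)-(5) directly in Python; the numeric values chosen for r, lx, and ly are placeholders, not the vehicle's actual dimensions.

```python
R = 0.03   # wheel radius r [m] (placeholder value)
LX = 0.10  # half the front wheel distance lx [m] (placeholder)
LY = 0.10  # half the front-rear wheel distance ly [m] (placeholder)

def inverse_kinematics(vx, vy, wz, r=R, lx=LX, ly=LY):
    """Equation (1): wheel speeds w1..w4 from the body twist (vx, vy, wz)."""
    k = lx + ly
    w1 = (vx - vy - k * wz) / r
    w2 = (vx + vy + k * wz) / r
    w3 = (vx + vy - k * wz) / r
    w4 = (vx - vy + k * wz) / r
    return w1, w2, w3, w4

def forward_kinematics(w1, w2, w3, w4, r=R, lx=LX, ly=LY):
    """Equations (3)-(5): body twist recovered from the four wheel speeds."""
    vx = r * (w1 + w2 + w3 + w4) / 4.0
    vy = r * (-w1 + w2 + w3 - w4) / 4.0
    wz = r * (-w1 + w2 - w3 + w4) / (4.0 * (lx + ly))
    return vx, vy, wz

# Round trip: a pure sideways move of 0.2 m/s maps to wheel speeds and back.
print(forward_kinematics(*inverse_kinematics(0.0, 0.2, 0.0)))
```

The round trip at the bottom confirms that the forward kinematics (3)-(5) invert Equation (1), which is what allows the same relations to serve both for commanding the wheels and for odometry.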
3. Design of Control Algorithm
3.1. Algorithmic Diagram for the Whole System
3.1.1. Flow Chart for Motion Control
Upon completion of each timer operation cycle, the microcontroller acquires velocity values (v, ω) from global variables and calculates the required number of pulses for each left and right motor. Subsequently, it outputs the pulse signals to the driver. If an encoder is present in the system, the encoder signal is fed back to the pulse calculation unit to generate the appropriate number of pulses for the motors.

During operation, if new signals are received, data from the Raspberry Pi is transferred to the buffer via USB. Upon receiving a sufficient amount of data (bytes) in the USB buffer, an interrupt signal triggers the microprocessor. This signal prompts the microprocessor to perform the following actions (a sketch of the corresponding command flow follows this list):

(i) Load the data structure from the buffer object;
(ii) Compute the necessary velocity values based on the retrieved data; and
(iii) Update the global variables with the calculated velocity values.
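To illustrate this command flow on the Raspberry Pi side, the sketch below sends (v, ω) pairs over the USB serial link using pyserial. The port name, baud rate, and the comma-separated line framing are assumptions for illustration; the actual firmware protocol is not specified in this paper.

```python
import serial  # pyserial

def send_velocity(port, v, omega):
    """Send one (v, omega) command over USB to the Arduino Mega 2560.

    The firmware is assumed to buffer incoming bytes, fire an interrupt
    once a full line has arrived, parse it, and update the global
    velocity variables read by the timer routine (steps (i)-(iii)).
    The "V,<v>,<omega>\n" framing is an illustrative choice, not the
    vehicle's actual protocol.
    """
    port.write("V,{:.3f},{:.3f}\n".format(v, omega).encode("ascii"))

if __name__ == "__main__":
    # Typical port name for an Arduino attached to a Raspberry Pi (assumption).
    with serial.Serial("/dev/ttyACM0", 115200, timeout=0.1) as port:
        send_velocity(port, 0.20, 0.0)  # 0.2 m/s straight ahead
        send_velocity(port, 0.0, 0.5)   # rotate in place at 0.5 rad/s
```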
3.1.2. SLAM Mapping Diagram
To implement Simultaneous Localization and Mapping (SLAM), we utilize the hector_slam package for the AMRs. This is achieved by collecting data from the Rplidar sensor and converting the base frame (where the sensor is mounted) to a laser frame with the part structure shown in Figure 6.

The SLAM process involves a network of interconnected nodes that work together to achieve accurate navigation and mapping. The core nodes involved in this process are:

(1) Rplidar_node: This node operates the Lidar sensor and acquires sensor data. The sensor data is then published to the robot via the topic "Rplidar_msgs/LaserScan".
(2) Key_teleop: This node facilitates robot movement control by publishing velocity data. The published velocity data guides the robot's motion.
(3) AMR_core: This node receives data from Key_teleop, including the robot's angular deviation and movement speed. While publishing the "Odom" message for odometry and robot state estimation, it also publishes the relative coordinates converted in the order odom → base_footprint → base_link → base_scan.
(4) Hector_mapping: This node generates a map based on input data comprising "tf" and "scan" messages. The output of this node is an occupancy grid map (OccupancyGrid).
(5) Map_server: This node stores the processed map from the mapping node into files named "map.pgm" and "map.yaml". These files represent the map used for navigation and path planning.
Fig. 5 Motor control – an algorithmic block representation (Begin → initialize global parameters → obtain (v, ω) → compute the pulses required by the driver → output to the motor driver, with encoder feedback from the motors)

Fig. 6 SLAM mapping diagram (Rplidar_node publishes /scan; Key_teleop feeds AMR_core, which publishes odom and /tf over the frame chain odom → base_footprint → base_link → base_scan; Hector_mapping produces /map for Map_server)
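For reference, a node in the spirit of Key_teleop reduces to a rospy publisher on /cmd_vel. The sketch below publishes a fixed Twist command using the standard ROS 1 Python API; the node name and the constant velocities are illustrative.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import Twist

def main():
    """Publish simple velocity commands, as the Key_teleop node does."""
    rospy.init_node("simple_teleop")
    pub = rospy.Publisher("/cmd_vel", Twist, queue_size=10)
    rate = rospy.Rate(10)  # 10 Hz command rate
    cmd = Twist()
    cmd.linear.x = 0.2   # m/s forward (illustrative fixed command)
    cmd.angular.z = 0.0  # rad/s
    while not rospy.is_shutdown():
        pub.publish(cmd)
        rate.sleep()

if __name__ == "__main__":
    main()
```

Once the robot has covered the area, the standard map_server tool can save the finished map (rosrun map_server map_saver -f map), producing the map.pgm and map.yaml files mentioned above.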

3.1.3. Navigation Diagram for the AMRs
Upon acquiring environmental and localization information from the SLAM system, the robot can be assigned target locations. However, these targets must be confined within the mapped environmental region to ensure successful navigation. The robot's navigation system then assumes responsibility for guaranteeing safe and autonomous motion in the real environment. This entails effectively avoiding both dynamic and static obstacles while closely adhering to a pre-defined trajectory that guides the robot from its initial position to the desired target within the mapped range.

The robot's navigation system relies on the following elements: a local planner and a global planner [15-18]. These components work together to achieve efficient navigation. Path planning for the robot involves mapping the surrounding environment using data from the perception system. Leveraging a digital map representation of the environment, the global planner within the robot's control system is tasked with strategically determining the optimal path for navigation. This process entails meticulously calculating the shortest path between the robot's current location, designated as the initial position, and the designated target location.

Fig. 7 Navigation diagram for the AMR proposed in this study (AMCL, map_server, sensor sources, sensor transforms, odometry, global/local costmaps, global/local planners, and recovery_behaviors arranged around the planning core)

• Odometry: Odometry utilizes encoder readings to determine the number of pulses generated by the encoder during wheel rotation. By applying kinematic principles, the displacement of the wheel can be calculated. This information is then employed to determine the wheel's relative position compared to its initial location.
• Sensor transforms: Sensor transforms encompass the transformation of sensor data from its local frame to a common reference frame, enabling the integration of sensor measurements into a unified coordinate system. This typically involves sensors like the encoder and Lidar.
• AMCL: AMCL is an algorithm that estimates the location of the robot within a map relying on sensor data. It employs a probabilistic approach to maintain multiple hypotheses about the robot's location, gradually refining its estimation as sensor measurements become available.
• Map_server: Map_server serves as a ROS node that publishes a 2D map, typically generated from laser scans or other mapping sensors. This map provides a representation of the robot's environment and is utilized for navigation purposes.
• Sensor source: Sensor source refers to the origin of sensor data, in this case, laser scans. Laser scanners emit laser beams and measure the reflected light to create a detailed representation of the surrounding environment.
• /move_base_simple/goal: This topic serves as the input for navigation commands. It specifies the desired destination for the robot, typically in the form of a pose (x, y, orientation) within the map (a minimal goal-publishing sketch follows this list).
• /cmd_vel: The /cmd_vel topic carries the linear and angular velocity commands for the robot. The base controller utilizes this information to regulate the robot's movement according to the desired motion.
• Global_costmap: Global_costmap represents a 2D map that has been integrated and processed by the global planner. It incorporates obstacle information from the robot's sensors and serves as the basis for path planning.
• Global_planner: Global_planner serves as the strategic decision-maker within the robot's navigation system. Its primary function entails the computation of an optimal path for the robot to traverse, facilitating its movement towards the designated goal location. It utilizes various path-planning algorithms to determine the most efficient and collision-free route within the environment.
• Local_costmap: Local_costmap represents a smaller windowed region within the global map that is centered around the current location of the robot. It provides a localized view of the environment for more immediate obstacle avoidance and reactive navigation.
• Local_planner: Local_planner employs algorithms like DWA to generate a collision-free path within the local_costmap. It considers the robot's immediate surroundings and provides velocity commands to ensure smooth and safe navigation.
• /tf: /tf provides the transform information between the various components in the robot system, which includes the relationship between the robot's base frame, sensor frames, and end-effector frames.
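A goal enters this pipeline as a single PoseStamped message on /move_base_simple/goal. The sketch below publishes one goal with the standard rospy API; the target coordinates are placeholders, and the orientation is left as the identity quaternion for simplicity.

```python
#!/usr/bin/env python
import rospy
from geometry_msgs.msg import PoseStamped

def send_goal(x, y):
    """Publish one navigation goal in the map frame.

    x, y are placeholder coordinates on the constructed map.
    """
    pub = rospy.Publisher("/move_base_simple/goal", PoseStamped,
                          queue_size=1, latch=True)
    rospy.sleep(1.0)  # give the publisher time to connect
    goal = PoseStamped()
    goal.header.frame_id = "map"
    goal.header.stamp = rospy.Time.now()
    goal.pose.position.x = x
    goal.pose.position.y = y
    goal.pose.orientation.w = 1.0  # identity quaternion (no rotation)
    pub.publish(goal)

if __name__ == "__main__":
    rospy.init_node("send_nav_goal")
    send_goal(2.0, 1.5)  # drive to (2.0 m, 1.5 m) on the map
```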
4. Simulation and Experiment Results
4.1. Mapping Process and Map Results
Figure 8 illustrates the mapping process. The black area represents the projection of an obstacle onto the 2D plane of the map, while the red lines represent laser scans extracted from LaserScan at a fixed height above the ground.

4.2. Robot Navigation Process
To initialize the robot's position, it is mandatory first to inform it of its starting location and then specify its destination. Subsequently, the global planner will generate a path from the starting position to the desired destination.

Fig. 8 AMR vehicle scans signals that match the constructed 2D map

Fig. 9 Plan routes for AMR vehicles

Fig. 10 The AMR vehicle tracks a scheduled route and reaches the destination

Fig. 11 Re-planning process for the AMR in case of obstacles

Fig. 12 The AMR is moving to avoid obstacles

Fig. 13 An illustration of the AMR reaching the destination

In Figure 9, the blue area represents the local costmap. The robot then moves autonomously according to the path plan calculated by the local planner. Figure 10 illustrates the robot's successful arrival at the predetermined destination.

4.3. Control of the AMR when Obstacles Occur
Obstacle avoidance during navigation is achieved using the Dynamic Window Approach (DWA) algorithm. Figure 11 illustrates the path re-planning process that occurs when obstacles are encountered during navigation. This dynamic adaptation to environmental changes is crucial for ensuring the robot's ability to reach its destination safely, as the navigation system re-plans its path in real time whenever obstacles are detected (see Figure 12). The robot has reached its destination, as shown in Figure 13. The AMR's ability to promptly detect and respond to obstacles highlights the effectiveness of the proposed control algorithm, and its real-time path re-planning capabilities enable it to navigate around obstacles and reach its intended destination successfully. This underscores the high feasibility of the proposed control algorithm in practical applications.
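To convey the idea behind DWA, the sketch below is a deliberately simplified version of its velocity-sampling loop: candidate (v, ω) pairs are forward-simulated over a short horizon and scored by obstacle clearance and progress toward the goal. It is illustrative only; the vehicle itself relies on the standard ROS local planner, and the sampling grids, horizon, and weights here are arbitrary choices.

```python
import math

def dwa_step(pose, goal, obstacles, dt=0.1, horizon=1.0):
    """Pick the best (v, w) command by sampling and scoring rollouts.

    pose: (x, y, theta); goal: (x, y); obstacles: list of (x, y).
    A toy version of the Dynamic Window Approach, for illustration.
    """
    best, best_score = (0.0, 0.0), -float("inf")
    for v in [0.0, 0.1, 0.2, 0.3]:             # sampled linear velocities
        for w in [-0.6, -0.3, 0.0, 0.3, 0.6]:  # sampled angular velocities
            x, y, th = pose
            clearance = float("inf")
            for _ in range(int(horizon / dt)):  # forward-simulate rollout
                th += w * dt
                x += v * math.cos(th) * dt
                y += v * math.sin(th) * dt
                for ox, oy in obstacles:
                    clearance = min(clearance, math.hypot(x - ox, y - oy))
            if clearance < 0.2:
                continue  # trajectory passes too close to an obstacle
            progress = -math.hypot(goal[0] - x, goal[1] - y)
            score = progress + 0.5 * min(clearance, 1.0)
            if score > best_score:
                best, best_score = (v, w), score
    return best

# An obstacle near the straight-line path forces a detouring command.
print(dwa_step((0.0, 0.0, 0.0), (2.0, 0.0), [(1.0, 0.05)]))
```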

5. Conclusion and Future Work
Through the comprehensive computational analysis, simulations, experiments, and discussions presented in this paper, we have achieved significant advancements in developing a novel control algorithm for steering robots within the ROS framework. The implemented algorithm effectively utilizes a Lidar sensor on an autonomous vehicle to perform map scanning and stable navigation. This study successfully identified the time-varying parameters governing the controller, leveraging information regarding the desired linear and angular velocities of the autonomous vehicle. By implementing these time-varying parameters, the controller is able to maintain the stability of the vehicle even when navigating across a variety of complex trajectories. Additionally, an embedded computer has been integrated into the vehicle, reducing its overall size and enhancing its maneuverability. The presented research lays the foundation for the construction and development of autonomous vehicles at a higher level. Future research directions will focus on refining the overall system for more comprehensive and stable operation across diverse and even challenging environments.

References
[1] Gerald Cook, and Feitian Zhang, Mobile Robots: Navigation, Control and Sensing, Surface Robots and AUVs, IEEE Press, 2020. [Google
Scholar] [Publisher Link]
[2] Francisco Rubio, Francisco Valero, and Carlos Llopis-Albert, “A Review of Mobile Robots: Concepts, Methods, Theoretical Framework,
and Applications,” International Journal of Advanced Robotic Systems, vol. 16, no. 2, 2019. [CrossRef] [Google Scholar] [Publisher Link]
[3] Manuel Cardona, Allan Palma, and Josue Manzanares, “COVID-19 Pandemic Impact on Mobile Robotics Market,” 2020 IEEE Andescon,
Ecuador, pp. 1-4, 2020. [CrossRef] [Google Scholar] [Publisher Link]
[4] Rplidar A1: Low Cost 360 Degree Laser Range Scanner, Introduction and Datasheet, Shanghai Slamtec. Co. Ltd., 2016. [Online].
Available: https://www.generationrobots.com/media/rplidar-a1m8-360-degree-laser-scanner-development-kit-datasheet-1.pdf
[5] Lentin Joseph, and Jonathan Cacace, Mastering ROS for Robotics Programming: Design, Build, and Simulate Complex Robots using the
Robot Operating System, 2nd ed., Packt Publishing Ltd., 2018. [Google Scholar] [Publisher Link]
[6] Giuseppe Fragapane et al., “Planning and Control of Autonomous Mobile Robots for Intralogistics: Literature Review and Research Agenda,” European Journal of Operational Research, vol. 294, no. 2, pp. 405-426, 2021. [CrossRef] [Google Scholar] [Publisher Link]
[7] Shuzhi Sam Ge, and Frank L. Lewis, Autonomous Mobile Robots: Sensing, Control, Decision Making, and Applications, CRC Press,
2006. [Google Scholar] [Publisher Link]
[8] Wikipedia, Automated Guided Vehicle. [Online]. Available: https://en.wikipedia.org/wiki/Automated_guided_vehicle
[9] Roland Siegwart, Illah R. Nourbakhsh, and Davide Scaramuzza, Introduction to Autonomous Mobile Robots, The MIT Press, 2nd ed.,
2011. [Google Scholar] [Publisher Link]
[10] ROS.org, Costmap_2D. [Online]. Available: http://wiki.ros.org/costmap_2d
[11] Kaveh Azadeh, Rene De Koster, and Debjit Roy, “Robotized and Automated Warehouse Systems: Review and Recent Developments,”
Transportation Science, vol. 53, no. 4, pp. 917-1212, 2019. [CrossRef] [Google Scholar] [Publisher Link]
[12] Takeshi Shimmura, Ryosuke Ichikari, and Takashi Okuma, “Human-Robot Hybrid Service System Introduction for Enhancing Labor and
Robot Productivity,” Advances in Production Management Systems, Towards Smart and Digital Manufacturing, pp. 661-669, 2020.
[CrossRef] [Google Scholar] [Publisher Link]
[13] Samyeul Noh, Jiyoung Park, and Junhee Park, “Autonomous Mobile Robot Navigation in Indoor Environments: Mapping, Localization,
and Planning,” 2020 International Conference on Information and Communication Technology Convergence (ICTC), Korea (South), pp.
908-913, 2020. [CrossRef] [Google Scholar] [Publisher Link]
[14] N. Dinesh Kumar et al., “Mapping and Navigation of Autonomous Robots with Lidar for Indoor Applications,” pp. 1-11, 2023.
[15] Alexandre de Oliveira Junior, “Combining Particle Filter and Fiducial Markers in a SLAM-Based Approach to Indoor Localization of Mobile Robots,” Master Thesis, IPB Digital Library, 2022. [Google Scholar] [Publisher Link]
[16] Morgan Quigley, Brian Gerkey, and William D. Smart, Programming Robots with ROS: A Practical Introduction to the Robot Operating
System, O’Reilly Media Inc., USA, 2015. [Google Scholar] [Publisher Link]
[17] ROS.org, Move_base. [Online]. Available: http://wiki.ros.org/move%20_base
[18] Dang Thai Son et al., “The Practice of Mapping-based Navigation System for Indoor Robot with RPLidar and Raspberry Pi,” International
Conference on System Science and Engineering (ICSSE), Vietnam, pp. 279-282, 2021. [CrossRef] [Google Scholar] [Publisher Link]

Appendix 1. Technical Specifications of the AMR Vehicle


Vehicle Length: 245 mm; Vehicle Width: 250 mm; Vehicle Height: 260 mm; Distance between Tier 1 and Tier 2: 100 mm;
Camera Mount Height: 50 mm.
