Robotics (EL-422) Spring 22 Part 5
Robotics
Teacher:
Dr. Syed Riaz un Nabi Jafri
Moving Robots
• A robot with the capability to change its
location is known as a moving robot
• A variety of moving robots is available
Moving Robots
Structure
• Different kinds of structures are in use
according to demand
• This presentation covers the Differential Drive
Rover, which is the most common structure for
indoor/outdoor environments
Structure
Main body
Joints
Controller
Actuators
Processor
Sensors
Moving Robot Components
• Rover: the main body of the robot
• Tx/Rx: communication modules
• Joints: provide connectivity between different parts
• Actuators: agents responsible for providing the force that moves the
different parts
• Sensors: agents responsible for sensing the rate of motion of the robot
itself, or for sensing objects near the robot in further detail
• End effector: the part at the last joint (hand/gripper)
• Controller: a numerically supported agent responsible for controlling all
motions of the robot
• Processor (Brain): a numerically supported agent responsible for
determining and estimating the behavior of the robot and its surroundings,
and for generating commands to the different parts
• Software: the series of instructions provided to the Processor
• Base station: for data logging and control
Block Diagram
(Figure: block diagram showing the Base Station, Motion Sensors, and Perception Sensors)
Workspace
Movement Principle
Two actuators are used, one connected to each
wheel. A balancing wheel can be placed at the
front or the back.
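The resulting kinematics can be sketched as follows: the two wheel speeds determine the body's forward and turning motion. A minimal Python sketch; the wheel radius and wheel separation values are illustrative assumptions, not from the slides.

```python
# Differential drive kinematics sketch: wheel angular speeds -> body motion.
# Wheel radius r (m) and wheel separation L (m) are assumed example values.
def diff_drive_velocity(w_left, w_right, r=0.05, L=0.30):
    """Return (v, omega): forward speed and turn rate of the robot body."""
    v_l = r * w_left           # ground speed of the left wheel
    v_r = r * w_right          # ground speed of the right wheel
    v = (v_r + v_l) / 2.0      # linear velocity of the robot center
    omega = (v_r - v_l) / L    # angular velocity (positive = turn left)
    return v, omega

v, w = diff_drive_velocity(10.0, 10.0)
print(round(v, 3), round(w, 3))  # 0.5 0.0 -> equal speeds drive straight
v, w = diff_drive_velocity(12.0, 8.0)
print(round(v, 3), round(w, 3))  # left faster -> negative omega (turns right)
```

Driving the wheels at equal speeds moves the rover straight; any speed difference produces rotation, which is why no steering mechanism is needed.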
Movement Principle
One actuator is connected to the front shaft
shared by the two wheels. It is responsible for
turning the vehicle right or left.
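This steering geometry can be approximated with the bicycle model: for a steering angle of the front shaft and a given wheelbase, the turning radius is R = wheelbase / tan(angle). A minimal sketch; the wheelbase value below is an illustrative assumption.

```python
import math

def turning_radius(wheelbase, steer_deg):
    """Turning radius for a front-steered vehicle (bicycle-model sketch)."""
    return wheelbase / math.tan(math.radians(steer_deg))

# Illustrative: a 1 m wheelbase with the front shaft turned 45 degrees
print(round(turning_radius(1.0, 45.0), 3))  # 1.0
```

A smaller steering angle gives a larger turning radius, so such a vehicle cannot turn in place, unlike the differential drive rover above.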
Examples
Applications
• In use for indoor/outdoor cleaning
• For surveillance
• As an industrial loader
• For inspection
• Other domestic uses
Degree Of Freedom (DOF)
Actuators
Actuators are referred to as the muscles of a robot
• Electric motors
Servo motors
Stepper motors
• Pneumatic actuators
• Hydraulic actuators
Actuators
Pneumatic actuator
DC Servomotor
Hydraulic actuator
Stepper motor
Actuator Selection
• Select the best actuator for each of the following:
Sensors
• Potentiometers
• Encoders
• LVDT
• Resolvers
• GPS (Global positioning system)
• IMU (inertial measurement unit)
• Proximity Sensors
• Range Finders
• Object Identifiers
Sensors
Potentiometer
Encoder
IMU
Resolver
GPS
Sensors
IR Sensor
Camera
Kinect (RGB-D)
Sensors Combination
• Position sensing (Localization)
Indoor? (encoder, potentiometer, IMU, GPS)
Outdoor?
Processors (Brain)
A variety of processor solutions is available,
depending on the requirements.
Processors (Brain)
ARM
Raspberry Pi
FPGA
Tern
Reference Frame
Sensor Working Principle
(Figure: sensor frame axes XS and YS)
Objects points in Sensor Frame
Objects features in Robotic Frame
Objects features in World Frame
(Transformations)
Real Environment Features
Most of the scan points have been merged into line segments, but
some points have not been merged. This is a demerit of the
feature-based approach.
Scan points conversion into features
(Split & Merge)
(Figure: scan points in the world frame Xw-Yw; translation with dZ = 0)
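The split step named above can be sketched as follows. This is a minimal version: it recursively splits the scan at the point farthest from the chord between the endpoints, the merge pass that joins collinear neighbours is omitted, and the distance threshold is an illustrative value.

```python
import math

def split_and_merge(points, threshold=0.1):
    """Split phase of split-and-merge: return line segments as (start, end)
    point pairs. Assumes the endpoints of each subsequence are distinct."""
    if len(points) < 3:
        return [(points[0], points[-1])]
    (x1, y1), (x2, y2) = points[0], points[-1]
    norm = math.hypot(x2 - x1, y2 - y1)
    def dist(p):  # perpendicular distance of p from the endpoint chord
        return abs((y2 - y1) * p[0] - (x2 - x1) * p[1] + x2 * y1 - y2 * x1) / norm
    idx = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[idx]) > threshold:  # split at the farthest point
        return (split_and_merge(points[:idx + 1], threshold)
                + split_and_merge(points[idx:], threshold))
    return [(points[0], points[-1])]

# An L-shaped scan: two walls meeting at (2, 0)
scan = [(0, 0), (1, 0), (2, 0), (2, 1), (2, 2)]
print(split_and_merge(scan))  # [((0, 0), (2, 0)), ((2, 0), (2, 2))]
```

Points that end up far from every fitted segment are the unmerged leftovers mentioned on the previous slide.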
Transformations
Example:
Consider the following wall W1, which has been
sensed by the laser scanner. The robot center is
placed at (5,5) with orientation 40 degrees. The
scanner center has a displacement of 10 units
from the robot center. Both scanner and robot
frames are aligned with each other.
(Figure: robot at (5,5), scanner offset 10, wall W1)
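The example above can be worked through numerically with 2D homogeneous transformations. A minimal Python sketch; the function names are illustrative, not part of the course material.

```python
import math

def pose_to_matrix(x, y, theta_deg):
    """2D homogeneous transform for a frame at (x, y) rotated by theta_deg."""
    c, s = math.cos(math.radians(theta_deg)), math.sin(math.radians(theta_deg))
    return [[c, -s, x],
            [s,  c, y],
            [0,  0, 1]]

def apply(T, p):
    """Apply a 3x3 homogeneous transform to a 2D point."""
    return (T[0][0] * p[0] + T[0][1] * p[1] + T[0][2],
            T[1][0] * p[0] + T[1][1] * p[1] + T[1][2])

T_wr = pose_to_matrix(5, 5, 40)   # robot frame expressed in the world frame
T_rs = pose_to_matrix(10, 0, 0)   # scanner frame: 10 units ahead, aligned

# A point measured in the scanner frame, e.g. (4, 6), maps to the world frame:
p_w = apply(T_wr, apply(T_rs, (4, 6)))
print(round(p_w[0], 2), round(p_w[1], 2))  # 11.87 18.6
```

The chain is sensor frame to robot frame, then robot frame to world frame, matching the frame setup in the example.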
Transformations
The scanned end points of the wall W1 are given as
(4,6) and (7,2) in the sensor frame (XS).
Transformations
Matlab script:
P_s = [4; 6; 0; 1]                              % end point in sensor frame
T2  = [1 0 0 10; 0 1 0 0; 0 0 1 0; 0 0 0 1]     % Trans(10,0,0): scanner offset
T1  = [cosd(40) -sind(40) 0 5; sind(40) cosd(40) 0 5; 0 0 1 0; 0 0 0 1]  % Rot(z,40) with Trans(5,5,0): robot pose
P_w = T1 * T2 * P_s                             % end point in world frame
Transformations
Exercise:
The robot center is placed at (5,0) with orientation 0 degrees. The sensor
center has a displacement of 5 units from the robot center. Both sensor and
robot frames are aligned with each other. The robot's sensor detects a line
end point at (5,0) in the sensor frame. Determine the end point coordinates
in the world frame. Also verify the result by plotting all frames manually.
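One way to check the answer numerically (a sketch; with orientation 0 and aligned frames, the offsets simply add along the x-axis):

```python
import math

def sensor_to_world(robot_xy, robot_theta_deg, sensor_offset, p_sensor):
    """Map a sensor-frame point to the world frame (sensor aligned with, and
    displaced along, the robot's x-axis)."""
    c = math.cos(math.radians(robot_theta_deg))
    s = math.sin(math.radians(robot_theta_deg))
    xr, yr = p_sensor[0] + sensor_offset, p_sensor[1]  # sensor -> robot frame
    return (robot_xy[0] + c * xr - s * yr,             # robot -> world frame
            robot_xy[1] + s * xr + c * yr)

print(sensor_to_world((5, 0), 0, 5, (5, 0)))  # (15.0, 0.0)
```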
Transformations
Exercise:
(Figure: world frame XW, robot frame XR, sensor frame XS, with offsets 1 and 3 and a point at (1,1))
Transformations
Line parameters conversion
(Figure: a line with parameters rg = 4, ϕg = 90 degrees relative to the robot frame XR; robot at 6, 0 degrees in the world frame XW)
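Line parameters in (r, ϕ) normal form transform between frames as ϕw = ϕr + θ and rw = rr + x·cos(ϕw) + y·sin(ϕw), where (x, y, θ) is the robot pose in the world frame. A sketch with illustrative values, since the exact numbers in the figure are hard to recover from the slide:

```python
import math

def line_robot_to_world(r_r, phi_r_deg, x, y, theta_deg):
    """Convert a line in (r, phi) normal form from the robot frame
    to the world frame, given the robot pose (x, y, theta)."""
    phi_w = math.radians(phi_r_deg + theta_deg)
    r_w = r_r + x * math.cos(phi_w) + y * math.sin(phi_w)
    return r_w, math.degrees(phi_w)

# Illustrative: a line at (r=4, phi=90 deg) seen from a robot at (6, 0), heading 0.
# Shifting along x does not change the distance to a line whose normal is the y-axis.
r_w, phi_w = line_robot_to_world(4.0, 90.0, 6.0, 0.0, 0.0)
print(round(r_w, 3), round(phi_w, 3))  # 4.0 90.0
```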
Ultrasonic Sensor Scans
Environment Perception (Feature-based Map)
Localization
We compare two consecutive scans to
determine the robot's new pose
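A naive sketch of the idea, for translation only and with point correspondences assumed known; real scan matching (e.g. ICP) must also find the correspondences and the rotation:

```python
# Naive scan-alignment sketch (translation only, correspondences known):
# a static point seen from the robot shifts opposite to the robot's own
# motion, so the robot displacement is the mean of (previous - current)
# matched point positions.
def estimate_motion(scan_prev, scan_curr):
    n = len(scan_prev)
    dx = sum(a[0] - b[0] for a, b in zip(scan_prev, scan_curr)) / n
    dy = sum(a[1] - b[1] for a, b in zip(scan_prev, scan_curr)) / n
    return dx, dy

# The robot moved 1 unit along x: every scanned point appears 1 unit closer in x
prev = [(4.0, 2.0), (6.0, -1.0)]
curr = [(3.0, 2.0), (5.0, -1.0)]
print(estimate_motion(prev, curr))  # (1.0, 0.0)
```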
Simultaneous Localization and Mapping
(SLAM)
What is SLAM?
Estimate the pose of a robot and a map of the environment
at the same time
SLAM is hard, because
a map is needed for localization, and
a good pose estimate is needed for mapping
Localization: inferring the location given a map
Mapping: inferring a map given locations
SLAM: learning a map and locating the robot simultaneously
Various algorithms are available, such as EKF SLAM and particle filter SLAM
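A minimal sketch of why the two estimates are coupled: the dead-reckoned pose is used to place landmarks in the map. A real EKF or particle filter SLAM system would also correct the pose from the map; all names below are illustrative.

```python
import math

def slam_step(pose, odom, observations, world_map):
    """One dead-reckoning + mapping step (illustrative, no correction)."""
    x, y, th = pose
    d, dth = odom                    # odometry: forward distance, heading change
    x, y = x + d * math.cos(th), y + d * math.sin(th)
    th = th + dth                    # predicted pose from odometry alone
    for rng, brg in observations:    # each observation: range and bearing
        lx = x + rng * math.cos(th + brg)
        ly = y + rng * math.sin(th + brg)
        world_map.append((lx, ly))   # mapping uses the estimated pose
    return (x, y, th), world_map

pose, world_map = slam_step((0.0, 0.0, 0.0), (1.0, 0.0), [(2.0, 0.0)], [])
print(pose, world_map)  # (1.0, 0.0, 0.0) [(3.0, 0.0)]
```

Any odometry error leaks directly into the landmark positions, which is exactly the chicken-and-egg problem stated above.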
Environment Perception (3D)
Single Camera as Range & Bearing Sensor
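One simple way a single camera can yield range and bearing is the pinhole model: a known object size projects inversely with distance, and the pixel offset from the image center gives the bearing. A sketch; the focal length, object height, and pixel values are illustrative assumptions.

```python
import math

# Pinhole-camera sketch: an object of known height H (meters) appears with
# height h (pixels) in the image; with focal length f (pixels), Z = f * H / h.
def range_from_height(f_px, H_m, h_px):
    """Range to the object along the optical axis, in meters."""
    return f_px * H_m / h_px

def bearing_deg(f_px, u_px, cx_px):
    """Bearing of the object from its pixel column offset from the center."""
    return math.degrees(math.atan((u_px - cx_px) / f_px))

print(range_from_height(500.0, 2.0, 100.0))       # 10.0 (meters)
print(round(bearing_deg(500.0, 750.0, 250.0), 2)) # 45.0 (degrees)
```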
Useful links
• Books:
Introduction to Robotics: Analysis, Control, Applications (by Saeed B. Niku)
Robotics, Vision and Control (by Peter Corke)
Probabilistic Robotics (by Thrun et al.)
• Journals/Conferences
IEEE Transactions on Robotics, Advanced Robotics, Journal of Field Robotics, IEEE ICRA, IEEE
IROS, IEEE ROBIO
• Online courses
MIT, Stanford, Coursera, edX, IIT India
• Platforms
ROS, Webots, V-REP, OpenCV
• Groups
robotics-worldwide mailing list (search for more specialized groups)