Paper Presentation On
HAPTIC TECHNOLOGY
Authored By
SANTOSH BHARADWAJ REDDY
Email: help@matlabcodes.com
ABSTRACT
"Haptics" is a technology that adds the sense of touch to virtual
environments. Haptic interfaces allow the user to feel as well as see virtual
objects on a computer, creating the illusion of touching surfaces, shaping
virtual clay, or moving objects around.
With this technology we can now sit down at a computer terminal
and touch objects that exist only in the "mind" of the computer. By using
special input/output devices (joysticks, data gloves, or other devices), users
can receive feedback from computer applications in the form of felt
sensations in the hand or other parts of the body. In combination with a
visual display, haptic technology can be used to train people for tasks
requiring hand-eye coordination, such as surgery and spacecraft maneuvers.
In this paper we explain how sensors and actuators are used to
track the position and movement of the haptic device moved by the
operator. We describe the different types of force-rendering algorithms,
move on to a few applications of haptic technology, and finally conclude
by mentioning a few future developments.
Introduction
1 What is Haptics?
Haptics refers to sensing and manipulation through touch. The word
comes from the Greek ‘haptesthai’, meaning ‘to touch’.
The history of the haptic interface dates back to the 1950s, when a
master-slave system was proposed by Goertz (1952). Haptic interfaces grew
out of the field of tele-operation, which was then employed in the remote
manipulation of radioactive materials. The ultimate goal of the tele-operation
system was "transparency": a user interacting with the master device in a
master-slave pair should not be able to distinguish between using the master
controller and manipulating the actual tool itself. Early haptic interface
systems were therefore developed purely for telerobotic applications.
Working of Haptic Devices
Haptic rendering is the process by which desired sensory stimuli are imposed
on the user to convey information about a virtual haptic object.
The human operator typically holds or wears the haptic interface device
and perceives audiovisual feedback from audio (computer speakers,
headphones, and so on) and visual displays (a computer screen or head-
mounted display, for example).
The audio and visual channels feature unidirectional information and energy
flow (from the simulation engine toward the user), whereas the haptic
modality exchanges information and energy in both directions, from and
toward the user. This bidirectionality is often referred to as the single most
important feature of the haptic interaction modality.
System architecture for haptic rendering:
An avatar is the virtual representation of the haptic interface through
which the user physically interacts with the virtual environment.
Haptic-rendering algorithms compute the correct interaction forces
between the haptic interface representation inside the virtual environment
and the virtual objects populating the environment. Moreover, haptic
rendering algorithms ensure that the haptic device correctly renders such
forces on the human operator.
1.) Collision-detection algorithms detect collisions between objects and avatars
in the virtual environment and yield information about where, when, and
ideally to what extent collisions (penetrations, indentations, contact area, and
so on) have occurred.
2.) Force-response algorithms compute the interaction force between avatars
and virtual objects when a collision is detected. This force approximates as
closely as possible the contact forces that would normally arise during contact
between real objects.
Hardware limitations prevent haptic devices from applying the exact force
computed by the force-response algorithms to the user.
3.) Control algorithms command the haptic device in such a way that
minimizes the error between ideal and applicable forces. The discrete-time
nature of the haptic- rendering algorithms often makes this difficult.
The force response algorithms’ return values are the actual force and torque
vectors that will be commanded to the haptic device.
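The three stages above can be sketched as one tick of a simplified one-dimensional servo loop. This is a hypothetical illustration, not a real device API; the function names, the spring constant, and the force limit are all assumed values, and real systems run such loops at roughly 1 kHz.

```python
# Minimal sketch of a haptic-rendering loop for a 1-DOF device.
# All names and constants are illustrative, not from a real device API.

def collision_detection(avatar_pos, surface_pos=0.0):
    """Return penetration depth (> 0 means the avatar is inside the object)."""
    return max(0.0, surface_pos - avatar_pos)

def force_response(depth, k=500.0):
    """Spring model: force proportional to penetration depth."""
    return k * depth

def control(ideal_force, max_force=10.0):
    """Clamp the ideal force to what the hardware can actually apply."""
    return max(-max_force, min(max_force, ideal_force))

# One tick of the servo loop for an avatar 2 mm inside a surface at x = 0.
depth = collision_detection(avatar_pos=-0.002)
force = control(force_response(depth))
print(depth, force)  # 0.002 m of penetration -> 1.0 N commanded
```

The clamp in `control` is the simplest possible stand-in for the error-minimizing control algorithms described above.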
Existing haptic rendering techniques are currently based upon two main
principles: "point-interaction" or "ray-based".
In point interaction, a single point, usually the distal point of a probe,
thimble, or stylus held by the user, is employed in the simulation of
collisions. The point penetrates the virtual objects, and the depth of
indentation is calculated between the current point and a point on the
surface of the object. Forces are then generated according to physical models,
such as spring stiffness or a spring-damper model.
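The spring and spring-damper models just mentioned can be written down directly; the stiffness and damping values below are illustrative assumptions, not figures from any particular device.

```python
def spring_force(depth, k=800.0):
    """Pure spring: force proportional to indentation depth (Hooke's law)."""
    return k * depth

def spring_damper_force(depth, velocity, k=800.0, b=2.0):
    """Spring-damper: the damping term resists motion into the surface,
    reducing the oscillatory feel of a pure spring."""
    # velocity > 0 means the point is moving deeper into the object
    return k * depth + b * velocity

print(spring_force(0.005))              # 4.0 N for a 5 mm indentation
print(spring_damper_force(0.005, 0.1))  # 4.2 N while still pressing inward
```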
In ray-based rendering, the user interface mechanism, for example, a
probe, is modeled in the virtual environment as a finite ray. Orientation is thus
taken into account, and collisions are determined between the simulated
probe and virtual objects. Collision detection algorithms return the intersection
point between the ray and the surface of the simulated object.
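A minimal version of the ray-based collision query, shown here against a flat surface (a plane) rather than an arbitrary object, might look like the following sketch; the probe and plane coordinates are hypothetical.

```python
def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Return the point where a ray hits a plane, or None if the ray is
    parallel to the plane or the hit lies behind the ray origin."""
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None  # ray is parallel to the plane
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(d * n for d, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None  # intersection is behind the probe tip
    return tuple(o + t * d for o, d in zip(origin, direction))

# A probe at (0, 0, 1) pointing straight down at the floor plane z = 0.
hit = ray_plane_intersection((0.0, 0.0, 1.0), (0.0, 0.0, -1.0),
                             (0.0, 0.0, 0.0), (0.0, 0.0, 1.0))
print(hit)  # (0.0, 0.0, 0.0)
```

Because the whole ray is tested rather than a single point, the probe's orientation matters, which is exactly the property ray-based rendering adds over point interaction.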
Computing contact-response forces:
Humans perceive contact with real objects through sensors (mechanoreceptors) located in
their skin, joints, tendons, and muscles. We make a simple distinction between the information
these two types of sensors can acquire.
1. Tactile information refers to the information acquired through sensors in the skin with particular
reference to the spatial distribution of pressure, or more generally, tractions, across the contact
area.
To handle flexible materials like fabric and paper, we sense the pressure
variation across the fingertip. Tactile sensing is also the basis of complex
perceptual tasks like medical palpation, where physicians locate hidden
anatomical structures and evaluate tissue properties using their hands.
2. Kinesthetic information refers to the information acquired through the sensors in the joints.
Interaction forces are normally perceived through a combination of these two.
To provide a haptic simulation experience, systems are designed to recreate the contact
forces a user would perceive when touching a real object.
All real surfaces contain tiny irregularities or indentations. Modeling them accurately improves
realism; higher accuracy, however, sacrifices speed, a critical factor in real-time applications. Any
choice of modeling technique must consider this trade-off. Keeping it in mind, researchers have
developed more accurate haptic-rendering algorithms for friction.
In computer graphics, texture mapping adds realism to computer-
generated scenes by projecting a bitmap image onto surfaces being rendered.
The same can be done haptically.
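As a rough illustration of haptic texture, the contact force can be modulated by a small height field sampled at the contact point, so a flat plane feels ridged as the probe slides across it. The sinusoidal bump pattern and all constants below are assumptions made for the sketch.

```python
import math

def textured_force(depth, x, y, k=500.0, bump_height=0.0005, bump_freq=2000.0):
    """Perturb the penetration depth with a small sinusoidal height field,
    then apply the usual spring model to the adjusted depth."""
    h = bump_height * math.sin(bump_freq * x) * math.sin(bump_freq * y)
    return k * max(0.0, depth + h)

# Sliding along x at a constant 1 mm depth: the force ripples with the bumps.
forces = [textured_force(0.001, x * 1e-4, 1e-3) for x in range(5)]
print(forces)
```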
Controlling forces delivered through haptic interfaces:
Once such forces have been computed, they must be applied to the user. Limitations of
haptic device technology, however, sometimes make it impossible to apply the exact force
value computed by the force-rendering algorithms. These limitations are as follows:
• Haptic interfaces can only exert forces of limited magnitude, and not equally well in all
directions.
• Haptic devices aren't ideal force transducers. An ideal haptic device would render zero
impedance when simulating movement in free space, and any finite impedance when
simulating contact with an object featuring such impedance characteristics. The friction,
inertia, and backlash present in most haptic devices prevent them from meeting this ideal.
• Haptic-rendering algorithms operate in discrete time, whereas users operate in
continuous time.
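The discrete-time issue in the last point has a well-known consequence: the stiffness a device can stably render is bounded by its physical damping and servo rate. One commonly cited passivity condition (due to Colgate and colleagues, taking zero virtual damping) requires b ≥ kT/2; the numbers below are illustrative.

```python
def max_stable_stiffness(damping_b, sample_period_T):
    """Largest virtual stiffness k (N/m) satisfying the simplified
    passivity condition b >= k*T/2 for a sampled haptic simulation."""
    return 2.0 * damping_b / sample_period_T

# A device with 2 N·s/m of inherent damping, servoed at 1 kHz (T = 1 ms):
print(max_stable_stiffness(2.0, 0.001))  # 4000.0 N/m
```

This is why haptic servo loops run at such high rates: halving the sample period doubles the stiffest wall the device can render without buzzing or instability.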
Haptic devices allow users to touch and manipulate three-dimensional virtual
objects. Some widely used examples are described below.
PHANTOM:
The PHANTOM provides accurate, ground-referenced force feedback when the
user comes into 'contact' with the virtual model. The physical working space is
determined by the extent of the arm, and a number of models are available to
suit different user requirements.
The PHANTOM system is controlled by three direct-current (DC) motors that
have sensors and encoders attached to them. The number of motors
corresponds to the number of degrees of freedom a particular PHANTOM
system has, although most systems produced have three motors.
The encoders track the user's motion or position along the x, y, and z
coordinates, while the motors apply the forces exerted on the user along the
same axes. From each motor a cable connects to an aluminum linkage, which
connects to a passive gimbal that attaches to the thimble or stylus. A
gimbal is a device that permits a body freedom of motion in any direction or
suspends it so that it remains level at all times.
The PHANTOM is used in surgical simulations and in the remote operation of robots in hazardous environments.
CyberGlove:
The CyberGlove can sense the position and movement of the fingers and wrist.
CyberGrasp:
The CyberGrasp is a full-hand force-feedback exoskeletal device that is worn
over the CyberGlove.
The CyberGrasp consists of a lightweight mechanical assembly, or exoskeleton,
that fits over a motion-capture glove. About 20 flexible semiconductor sensors
sewn into the fabric of the glove measure hand, wrist, and finger
movement. The sensors send their readings to a computer that displays a
virtual hand mimicking the real hand's flexes, tilts, dips, waves, and swivels.
The same program that moves the virtual hand on the screen also directs
machinery that exerts palpable forces on the real hand, creating the illusion
of touching and grasping. A special computer called a force control unit
calculates how much the exoskeleton assembly should resist movement of the
real hand in order to simulate the onscreen action. Each of five actuator
motors turns a spool that rolls or unrolls a cable. The cable conveys the
resulting pushes or pulls to a finger via the exoskeleton.
Applications
Medical training applications:
Such training systems use the PHANTOM's force-display capabilities to let
medical trainees experience and learn the subtle and complex physical
interactions needed to become skillful in their art.
A computer-based teaching tool has been developed using haptic technology
to train veterinary students to examine the bovine reproductive tract,
simulating rectal palpation. The student receives touch feedback from a
haptic device while palpating virtual objects, and the teacher can visualize
the student's actions on a screen and give training and guidance.
Collision Detection:
The main goal of this project is to measure forces and torques exerted by the
surgeon during minimally-invasive surgery in order to optimize haptic
feedback. A standard da Vinci tool affixed with a 6 DOF force/torque
transducer will be used to perform basic surgical procedures and the forces
applied by the tool will be recorded and analyzed. This will help determine in
which degrees of freedom forces are most commonly applied.
Stroke patients:
Stroke patients who face months of tedious rehabilitation to regain the use of
impaired limbs may benefit from new haptics systems -- interfaces that add
the sense of touch to virtual computer environments -- in development at the
University of Southern California's Integrated Media Systems Center (IMSC).
Prostate Cancer:
Prostate cancer is the third leading cause of death among American men,
resulting in approximately 31,000 deaths annually. A common treatment
method is to insert needles into the prostate to distribute radioactive seeds,
destroying the cancerous tissue. This procedure is known as brachytherapy.
The prostate itself and the surrounding organs are all soft tissue. Tissue
deformation makes it difficult to distribute the seeds as planned. In our
research we have developed a device to minimize this deformation, improving
brachytherapy by increasing the seed distribution accuracy.
Cataract surgery:
Surgeons complete removal of the lens segments in the same way: by holding
them at the mouth of the laser/aspiration probe using vacuum and firing the
laser to fragment them for aspiration. However, several surgeons have
developed different techniques for nuclear disassembly. These include:
Settings: Aspiration: 275 to 300 mmHg; Air infusion: 80 to 100 mmHg; Laser pulses: 1
Hz.
Wehner backcracking. This technique, developed by Wolfram Wehner, M.D., uses the
Wehner Spoon, an irrigating handpiece that resembles a shovel at the tip. The
surgeon lifts the nucleus using the laser/aspiration probe, inserts the Wehner spoon
underneath, and uses the two probes to backcrack the nucleus. The Wehner spoon
provides support during removal of the lens segments.
Settings: Aspiration: 275 mmHg; Air infusion: 95 mmHg; Laser pulses: 3 Hz.
Intelligent machines:
Research on intelligent machines spans Computer Science, the Department of
Electrical and Computer Engineering, and the Department of Mechanical
Engineering. It is this diversity of interests, along with a spirit of
collaboration, that forms the driving force behind this dynamic research
community.
Human fingers are able to manipulate delicate objects without either dropping
or breaking them, but lose this ability to a certain degree when using a tele-
operated system. One reason for this is that human fingers are equipped with
sensors that tell us when our fingerprints at the edge of the contact area start
to come off the object we are holding, allowing us to apply the minimum force
necessary to hold the object. While several other researchers have built
synthetic skins for their robot fingers that work in a similar way to human
fingerprints, a tactile haptic device is needed to display these sensations to a
human using a tele-operated system. For this purpose we have designed the
2-degree-of-freedom Haptic Slip Display. We have conducted psychophysical
experiments validating the device design and demonstrating that it can
improve user performance in a delicate manipulation task in a virtual
environment.
Gaming technology:
Flight Simulations: Motors and actuators push, pull, and shake the flight yoke,
throttle, rudder pedals, and cockpit shell, replicating all the tactile and
kinesthetic cues of real flight. Some examples of the simulator's haptic
capabilities include resistance in the yoke when pulling out of a hard dive, the
shaking caused by stalls, and the bumps felt when rolling down a concrete
runway. These flight simulators look and feel so real that a pilot who
successfully completes training on a top-of-the-line Level 5 simulator can
immediately start flying a real commercial airliner.
Today, all major video consoles have built-in tactile feedback capability.
Various sports games, for example, let you feel bone-crushing tackles or the
different vibrations caused by skateboarding over plywood, asphalt, and
concrete. Altogether, more than 500 games use force feedback, and more
than 20 peripheral manufacturers now market in excess of 100 haptics
hardware products for gaming.
Mobile Phones: Samsung has made a phone that vibrates differently for
different callers. Motorola too has made haptic phones.
Cars: For the past two model years, the BMW 7 Series has contained the
iDrive (based on Immersion Corp.'s technology), which uses a small wheel on
the console to give haptic feedback, letting the driver control peripherals
such as the stereo, heating, and navigation system through menus on a video
screen.
The firm introduced haptic technology for an X-by-Wire system, which was
showcased at the Alps Show 2005 in Tokyo. The system consisted of a
"cockpit" with a steering wheel, a gearshift lever, and pedals that embed
haptic technology, plus a remote-control car. Visitors could drive the
remote-control car by operating the steering wheel, gearshift lever, and
pedals in the cockpit while watching a screen in front of the cockpit, onto
which the view from a camera mounted on the car was projected.
Experiments on robot control using haptic devices have shown the
effectiveness of haptic feedback in a mobile robot tele-operation system for
safe navigation in a shared autonomy scenario.
Future Enhancements:
Force Feedback Provided In Web Pages:
The Virtual Braille Display (VBD) project was created to investigate the
possibility of using the lateral skin-stretch technology of the STReSS tactile
display for Braille. The project was initially conducted at VisuAide Inc. and is
now being continued in McGill's Haptics Laboratory.
CONCLUSION:
Haptic technology adds the sense of touch to virtual environments, and its
bidirectional exchange of information and energy sets it apart from the audio
and visual modalities. With devices such as the PHANTOM, CyberGlove, and
CyberGrasp, and applications ranging from surgical training and rehabilitation
to gaming and automotive interfaces, haptics is steadily maturing; ongoing
work on tactile displays such as the Virtual Braille Display points toward
force feedback reaching everyday interfaces, including web pages.
REFERENCES:
http://www.sensable.com/products/datafiles/phantom_ghost/Salisbury_Haptics95.pdf
http://www.wam.umd.edu/~prmartin/3degrees/HAPTIC%20TECHNOLOGY1.doc
http://www.computer.org/cga/cg2004/g2024.pdf
http://www.dcs.gla.ac.uk/~stephen/papers/EVA2001.pdf
http://cda.mrs.umn.edu/~lopezdr/seminar/spring2000/potts.pdf
http://www.sensable.com
http://www.immersion.com
http://www.logitech.com
http://www.technologyreview.com