Case Study Report

1. The document discusses two humanoid robots, Sophia and Pepper. Sophia was created by Hanson Robotics and can display over 50 facial expressions. Pepper is manufactured by SoftBank Robotics and can read human emotions.
2. Both robots use sensors and artificial intelligence to interact with humans. Sophia uses computer vision, speech recognition and emotional analysis software. Pepper analyzes facial expressions to understand emotions.
3. The document outlines the purpose, features, and advantages of using humanoid robots like Sophia and Pepper for healthcare, customer service, education, and other applications.


CASE STUDY REPORT

ON
HUMANOID ROBOTS

(SOPHIA AND PEPPER)

INTRODUCTION TO HUMANOID ROBOTS


A humanoid robot is a robot with its body shape built to
resemble the human body. The design may be for functional
purposes, such as interacting with human tools and
environments, for experimental purposes, such as the study
of bipedal locomotion, or for other purposes. In general, humanoid
robots have a torso, a head, two arms, and two legs, though
some forms of humanoid robots may model only part of the
body, for example, from the waist up. Some humanoid robots
also have heads designed to replicate human facial features
such as eyes and mouths. Androids are humanoid robots built
to aesthetically resemble humans.

PURPOSE
Humanoid robots are now used as research tools in several
scientific areas. Researchers study the human body structure
and behavior (biomechanics) to build humanoid robots. On
the other hand, the attempt to simulate the human body leads to
a better understanding of it. Human cognition is a field of
study which is focused on how humans learn from sensory
information in order to acquire perceptual and motor skills.
This knowledge is used to develop computational models of
human behavior and it has been improving over time.
It has been suggested that very advanced robotics will
facilitate the enhancement of ordinary humans.
Although the initial aim of humanoid research was to build
better orthoses and prostheses for human beings, knowledge
has been transferred between the two disciplines. A few
examples are powered leg prostheses for the neuromuscularly
impaired, ankle-foot orthoses, biologically realistic leg
prostheses and forearm prostheses.
Besides the research, humanoid robots are being developed to
perform human tasks like personal assistance, through which
they should be able to assist the sick and elderly, and dirty or
dangerous jobs. Humanoids are also suitable for some
procedurally-based vocations, such as reception-desk
administrators and automotive manufacturing line workers. In
essence, since they can use tools and operate equipment and
vehicles designed for the human form, humanoids could
theoretically perform any task a human being can, so long as
they have the proper software. However, the complexity of
doing so is immense.
SENSORS
A sensor is a device that measures some attribute of the world.
Being one of the three primitives of robotics (besides planning
and control), sensing plays an important role in robotic
paradigms. Sensors can be classified according to the physical
process with which they work or according to the type of
measurement information that they give as output. This
report uses the second approach.
Proprioceptive sensors
Proprioceptive sensors sense the position, the orientation and
the speed of the humanoid's body and joints.
In human beings the otoliths and semi-circular canals (in the
inner ear) are used to maintain balance and orientation. In
addition humans use their own proprioceptive sensors (e.g.
touch, muscle extension, limb position) to help with their
orientation. Humanoid robots use accelerometers to measure
the acceleration, from which velocity can be calculated by
integration; tilt sensors to measure inclination; force
sensors in the robot's hands and feet to measure contact
force with the environment; and position sensors that
indicate the actual position of the robot (from which
velocity can be calculated by differentiation), or even
speed sensors.
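The integration and differentiation steps described above can be sketched in a few lines. This is a minimal illustration with made-up sample data, not any particular robot's sensor API:

```python
def integrate_acceleration(samples, dt):
    """Trapezoidal integration: acceleration (m/s^2) -> velocity (m/s)."""
    v = 0.0
    velocities = [v]
    for a_prev, a_next in zip(samples, samples[1:]):
        v += 0.5 * (a_prev + a_next) * dt  # one trapezoid step
        velocities.append(v)
    return velocities

def differentiate_position(positions, dt):
    """Finite differences: position (m) -> velocity (m/s)."""
    return [(p1 - p0) / dt for p0, p1 in zip(positions, positions[1:])]

# A constant 1 m/s^2 acceleration sampled at 100 Hz for 0.1 s:
vel = integrate_acceleration([1.0] * 11, dt=0.01)
print(vel[-1])  # ≈ 0.1 m/s
```

In practice pure integration drifts as sensor noise accumulates, which is one reason real humanoids combine accelerometers with the tilt and position sensors mentioned above.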
Exteroceptive sensors
Arrays of tactels can be used to provide data on what has
been touched. The Shadow Hand uses an array of 34 tactels
arranged beneath its polyurethane skin on each fingertip.
Tactile sensors also provide information about forces and
torques transferred between the robot and other objects.
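As an illustration of how raw tactel readings might be reduced to a net contact force and a contact location, here is a minimal sketch. The 4x4 grid, the units, and the function are invented for this document, not the Shadow Hand's actual interface:

```python
def contact_summary(tactels):
    """Reduce a 2D grid of tactel readings to (net force, contact centroid)."""
    total = sum(sum(row) for row in tactels)
    if total == 0:
        return 0.0, None  # nothing touching the array
    # Force-weighted average of tactel coordinates gives the contact point.
    cy = sum(y * sum(row) for y, row in enumerate(tactels)) / total
    cx = sum(x * v for row in tactels for x, v in enumerate(row)) / total
    return total, (cx, cy)

grid = [
    [0, 0, 0, 0],
    [0, 2, 4, 0],
    [0, 2, 4, 0],
    [0, 0, 0, 0],
]
force, centroid = contact_summary(grid)
print(force, centroid)  # net force 12, centroid pulled toward the stronger column
```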
Vision refers to processing data from any modality which uses
the electromagnetic spectrum to produce an image. In
humanoid robots it is used to recognize objects and determine
their properties. Vision sensors work most similarly to the
eyes of human beings. Most humanoid robots
use CCD cameras as vision sensors.
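A deliberately tiny sketch of the object-detection idea: threshold a grayscale "image" and report the bright region's bounding box. The pixel values are invented, and real humanoid vision pipelines are far more sophisticated than this:

```python
def bright_bbox(image, threshold):
    """Return (min_x, min_y, max_x, max_y) of pixels at or above threshold."""
    coords = [(x, y) for y, row in enumerate(image)
                     for x, v in enumerate(row) if v >= threshold]
    if not coords:
        return None  # no object found
    xs = [x for x, _ in coords]
    ys = [y for _, y in coords]
    return min(xs), min(ys), max(xs), max(ys)

# A 4x3 grayscale image with a bright object in the middle:
image = [
    [10, 10, 10, 10],
    [10, 200, 220, 10],
    [10, 210, 230, 10],
]
print(bright_bbox(image, threshold=128))  # (1, 1, 2, 2)
```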
Sound sensors allow humanoid robots to hear speech and
environmental sounds, functioning as the robot's ears.
Microphones are usually used for this task.
CASE STUDY
ON
HUMANOID ROBOTS
SOPHIA – THE ROBOT
Sophia is a social humanoid robot developed by the Hong
Kong-based company Hanson Robotics. Sophia was activated on
April 19, 2015 and made her first public appearance at the
South by Southwest Festival in mid-March 2016 in Austin,
Texas, United States. She is able to display more than 50
facial expressions.
Sophia has been covered by media around the globe and has
participated in many high-profile interviews. In October
2017, Sophia became the first robot to receive citizenship
of any country.
HISTORY
• Sophia was created by Hanson Robotics in collaboration
with AI developers.
• The robot was modelled after actress Audrey Hepburn and
is known for her human-like appearance.
• Sophia also imitates human gestures and facial
expressions and is able to answer certain questions and
make simple conversation on predefined topics (e.g. the
weather).
• Sophia uses voice recognition technology from Alphabet
Inc. and is designed to get smarter over time.
• Sophia has seven humanoid robot “siblings” who were also
created by Hanson Robotics.
FEATURES
• Cameras within Sophia's eyes combined with computer
algorithms allow her to see.
• She can follow faces, sustain eye contact and recognize
individuals.
• She is also able to process speech and have conversations
using Alphabet's Google Chrome voice recognition technology
and other tools.
• The software has been programmed to give pre-written
responses to specific questions or phrases.
• Sophia would ultimately be a good fit to serve in
healthcare, customer service, therapy and education.
PUBLIC FIGURE

• On November 21, 2017, Sophia was named the United Nations
Development Programme's first-ever Innovation Champion for
Asia and the Pacific.
• She is the first robot in the world to be recognized with
citizenship, a historic first.
• Sophia also visited India for the first time at IIT
Bombay's Techfest on 30 December.
• The robot appeared in traditional Indian attire and
greeted the crowd with a “Namaste”.
EVENTS
• On October 11, 2017, Sophia was introduced to the United
Nations and had a brief conversation with the United
Nations Deputy Secretary-General, Amina J. Mohammed.
• In an interview with Business Insider's chief UK editor
Jim Edwards, he predicted this was a step towards
“conversational artificial intelligence”.
• Sophia also impressed a CNBC interviewer, who interviewed
her like a human.
• On March 21, 2018, Sophia addressed a conference in
Kathmandu, Nepal as part of the UN's Sustainable
Development Goals in Asia.
ARTIFICIAL INTELLIGENCE USED IN SOPHIA
On the perceptual side, Hanson Robotics have used deep
neural networks to create tools that assess a person's
emotion from their facial expression and their tone of
voice. The idea is that assessment of the user's emotional
state will allow the system to modulate its behavior
appropriately and significantly enhance the interactive
bonding experience of the user. They have also developed
software enabling a robot
or avatar to recognize and mirror a human’s facial expressions
and vocal quality. They have also done significant work
fleshing out the motivational and emotional aspect of the
OpenCog system. These aspects are critical because the
Hanson robots have the ability to express emotion via choice
of animations, modulation of animations, and tone of voice;
and to “experience” emotions in the OpenCog system
controlling it, by modulating its action selection and cognitive
processes based on emotional factors. OpenPsi, the core
framework underlying OpenCog’s motivational system, was
originally based on MicroPsi, Joscha Bach's elaboration of
Dietrich Dörner's Psi theory, a psychological theory
covering human intention selection, action regulation, and
emotion. In their work on the Loving AI project, they have extended
OpenPsi to incorporate aspects of Klaus Scherer’s Component
Process Model (CPM) [Sch84], an appraisal theory of human
emotion. In Psi theory, emotion is understood as an emergent
property of the modulation of perception, behavior, and
cognitive processing. Emotions are interpreted as different
state space configurations of cognitive modulators along with
the valence (pleasure/distress) dimension, the assessment of
cognitive urges, and the experience of accompanying physical
sensations that result from the effects of the particular
modulator settings on the physiology of the system. In CPM
events relevant to needs, goals or values trigger dynamical,
recursive emotion processes. Events and their consequences
are appraised with a set of criteria on multiple levels of
processing.
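The Psi-theory view sketched above (emotion as an emergent configuration of cognitive modulators together with a valence dimension) can be caricatured in a few lines of code. The modulator names and the emotion regions below are illustrative inventions, not OpenPsi's or CPM's actual internals:

```python
from dataclasses import dataclass

@dataclass
class ModulatorState:
    valence: float     # pleasure (+) / distress (-)
    arousal: float     # general activation level
    resolution: float  # depth of cognitive processing

def label_emotion(s: ModulatorState) -> str:
    """Map a region of modulator state space to a coarse emotion label.
    The regions and thresholds are invented for illustration only."""
    if s.valence > 0.3 and s.arousal > 0.5:
        return "joy"
    if s.valence < -0.3 and s.arousal > 0.5:
        return "anger"
    if s.valence < -0.3:
        return "sadness"
    return "neutral"

state = ModulatorState(valence=0.8, arousal=0.7, resolution=0.5)
print(label_emotion(state))  # joy
```

The point of the sketch is that no "emotion variable" is stored anywhere: the label emerges from where the system currently sits in modulator space, which mirrors the Psi-theory claim that emotions are state-space configurations rather than discrete modules.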
ADVANTAGES
• Sophia can be used in the healthcare sector.
• Sophia could assist agencies such as the CBI and FBI with
their cases.
• She can be a good friend to children and adults.
• She can express feelings: she wants to live and work with
humans, so she needs to express emotions to understand
humans and build trust with people.
• She can be a good teacher in villages and city colleges.
• She can be a business robot, as she has artificial
intelligence.
PEPPER – THE HUMANOID ROBOT
PEPPER - Pepper is a semi-humanoid robot manufactured
by SoftBank Robotics (formerly Aldebaran Robotics), which
is owned by SoftBank, designed with the ability to read
emotions. It was introduced in a conference on 5 June 2014,
and was showcased in Softbank mobile phone stores in Japan
beginning the next day. Pepper's ability to read emotions
comes from analyzing facial expressions and voice tones.

Manufacturer: Aldebaran Robotics (now SoftBank Robotics);
Foxconn
Country: France; Japan
Year of creation: 2014 (prototype)
Type: Humanoid
Purpose: Technology demonstrator

A robot designed to interact with humans. Standing 120 cm
tall, Pepper has no trouble perceiving his environment and
entering into a conversation when he sees a person. The
touch screen on his chest displays content to highlight
messages and support speech. His curvy design ensures
danger-free use and a high level of acceptance by users.

Functionalities of PEPPER:
1. 20 degrees of freedom for natural and expressive
movements.
2. Speech recognition and dialogue available in 15
languages: English, French, Spanish, German, Italian,
Arabic, Dutch...
3. Perception modules to recognize and interact with the
person talking to him.
4. Touch sensors, LEDs and microphones for multimodal
interactions.
5. Infrared sensors, bumpers, an inertial unit, 2D and 3D
cameras, and sonars for omnidirectional and autonomous
navigation.
6. An open and fully programmable platform.
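Because the platform is open and programmable, the multimodal pattern listed above (a sensor event triggering LEDs and speech) can be sketched with a small event dispatcher. This is not SoftBank's actual SDK; the class, method names, and event names are invented for illustration:

```python
class Robot:
    """Hypothetical event-driven robot: sensors emit events,
    handlers react through output channels (speech, LEDs)."""

    def __init__(self):
        self.handlers = {}
        self.log = []  # records actions, standing in for real hardware

    def on(self, event, handler):
        self.handlers.setdefault(event, []).append(handler)

    def emit(self, event):
        for handler in self.handlers.get(event, []):
            handler(self)

    def say(self, text):
        self.log.append(f"say:{text}")

    def set_leds(self, color):
        self.log.append(f"leds:{color}")

robot = Robot()
# Wire a touch sensor event to LED and speech reactions:
robot.on("head_touched", lambda r: (r.set_leds("blue"), r.say("Hello!")))
robot.emit("head_touched")
print(robot.log)  # ['leds:blue', 'say:Hello!']
```

The design choice mirrors the list above: each modality is just another event source or output channel, so adding a new sensor means registering another handler rather than rewriting the control loop.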


DESIGN
Purpose

Pepper is not a functional robot for domestic use. Instead,


Pepper is intended "to make people happy", enhance people's
lives, facilitate relationships, have fun with people and
connect people with the outside world. Pepper's creators hope
that independent developers will create new content and uses
for Pepper.
Specifications
The robot's head has four microphones, two HD cameras (in
the mouth and forehead), and a 3-D depth sensor (behind the
eyes). There is a gyroscope in the torso and touch sensors in
the head and hands. The mobile base has two sonars,
six lasers, three bumper sensors, and a gyroscope.
USE OF PEPPER
Commercial
Pepper is currently being used as a receptionist at several
offices in the UK and is able to identify visitors with the use
of facial recognition, send alerts for meeting organisers and
arrange for drinks to be made. Pepper is said to be able to chat
to prospective clients.
The robot has also been used at banks and medical
facilities in Japan, using applications created by Seikatsu
Kakumei, and is also used at the four hundred branches of
Hamazushi restaurants in Japan.
Consumer
Pepper is used in over a thousand homes in Japan.
Academic
Pepper is available as a research and educational robot for
schools, colleges and universities to teach programming and
conduct research into human-robot interactions. In 2017, an
international team began research into using Pepper as a
versatile robot to help look after older people in care
homes or sheltered accommodation. The project received
funding worth two million pounds, with donors including the
European Union and the Japanese government. The project was
expected to run for three years. Institutions involved in
the research include Middlesex University and the
University of Bedfordshire. On Tuesday 16 October 2018, a
Pepper robot from the project gave evidence to the
Education Committee of the House of Commons of the United
Kingdom Parliament.
