Artificial Intelligence - The Nature of Environments


16CS314 - Artificial Intelligence
PEAS


E.g., the task of designing an automated taxi

Performance measure?? safety, destination, legality, comfort

Environment?? US streets/freeways, pedestrians, weather

Actuators?? steering, accelerator, brake, horn,
speaker/display

Sensors?? video, accelerometers, gauges, engine sensors,
keyboard, GPS
Agent Type: Taxi driver
Performance Measure: safe, fast, legal, comfortable trip, maximize profits
Environment: roads, other traffic, pedestrians, customers
Actuators: steering, accelerator, brake, signal, horn, display
Sensors: cameras, sonar, speedometer, GPS, odometer, engine sensors, keyboard, accelerometer
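The PEAS description above can be captured as plain data. A minimal sketch in Python (the dictionary name and field labels are my own choices; the entries come from the table):

```python
# Hypothetical sketch: a PEAS description for the taxi-driver agent,
# stored as a simple dictionary of lists (field names assumed, not standard).
peas_taxi = {
    "Performance measure": ["safe", "fast", "legal", "comfortable trip",
                            "maximize profits"],
    "Environment": ["roads", "other traffic", "pedestrians", "customers"],
    "Actuators": ["steering", "accelerator", "brake", "signal", "horn",
                  "display"],
    "Sensors": ["cameras", "sonar", "speedometer", "GPS", "odometer",
                "engine sensors", "keyboard", "accelerometer"],
}

print("Agent type: taxi driver")
for field, values in peas_taxi.items():
    print(f"{field}: {', '.join(values)}")
```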
PROPERTIES

• Fully observable vs Partially observable
• Deterministic vs Stochastic
• Episodic vs Sequential
• Static vs Dynamic
• Discrete vs Continuous
• Single agent vs Multi agent
Fully observable vs Partially Observable

 Fully observable
 The agent's sensors give it complete access to the state of the environment.
 Partially observable
 Noise or inaccurate sensors may cause the agent to miss some state information.
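The distinction can be illustrated with a toy sketch (all names, the example state, and the drop probability are hypothetical, not from the slides): a perfect sensor returns the whole environment state, while a noisy one may omit parts of it.

```python
import random

# Toy environment state (invented for illustration).
state = {"position": 3, "fuel": 0.8, "obstacle_ahead": True}

def fully_observable_percept(state):
    # The agent's sensors report the complete environment state.
    return dict(state)

def partially_observable_percept(state, drop_prob=0.5, seed=0):
    # Noisy/limited sensors: each state variable may be missing from
    # the percept, so the agent sees only part of the true state.
    rng = random.Random(seed)
    return {k: v for k, v in state.items() if rng.random() > drop_prob}

print(fully_observable_percept(state))
print(partially_observable_percept(state))
```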


Fully observable

 Examples
 Puzzle game
 Image analysis
Partially observable

 Example: card games - the agent cannot see the other players' cards.
Deterministic vs Stochastic

 Deterministic
 The next state is completely determined by the current state and the agent's action; the agent can take its next action (e.g., process the remaining part of an image) from its current knowledge alone.

 Stochastic
 The environment changes with random probability; to reach the goal, the agent must choose each action from all current and previous percepts.
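A minimal sketch of the two transition styles (the one-dimensional grid walk and the 0.8 success probability are assumptions invented for illustration):

```python
import random

def deterministic_step(position, action):
    # Next state is fully determined by current state + action.
    return position + (1 if action == "forward" else -1)

def stochastic_step(position, action, rng):
    # With probability 0.8 the action succeeds; otherwise the agent
    # slips and stays where it is, so the outcome is random.
    intended = position + (1 if action == "forward" else -1)
    return intended if rng.random() < 0.8 else position

rng = random.Random(42)
print(deterministic_step(0, "forward"))  # always the same next state
print([stochastic_step(0, "forward", rng) for _ in range(5)])
```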
Deterministic - Example

 Video analysis

Stochastic - Example

 Car driving
 Boat driving - the agent's next move cannot be based on the current state alone.
Episodic vs Sequential

 Episodic
 The agent's experience is divided into atomic episodes; each episode consists of the agent perceiving and then performing a single action.
 A previous episode does not affect the current action.

 Sequential
 The current decision could affect all future decisions.
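The two cases can be sketched as toy agents (the names and policies are hypothetical): the episodic agent looks only at the current percept, while the sequential agent keeps a history that can influence every later decision.

```python
def episodic_agent(percept):
    # Episodic: decide from the current percept alone,
    # e.g. accept/reject the part currently being inspected.
    return "reject" if percept == "defective" else "accept"

class SequentialAgent:
    def __init__(self):
        self.history = []

    def act(self, percept):
        # Sequential: the decision may depend on every earlier percept.
        self.history.append(percept)
        return f"move {len(self.history)}"  # toy history-based policy

print(episodic_agent("defective"))
agent = SequentialAgent()
print(agent.act("e4"), agent.act("e5"))
```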


Episodic example

 An agent finding the defective part of an assembled computer machine: it inspects the current part and takes an action that does not depend on previous decisions.
 Blood testing for a patient
 Card games
Sequential example

 Chess: the agent takes actions based on previous decisions.
 Chess with a clock
 Refinery controller
Static vs Dynamic

 Static
 Easy to tackle: the environment does not change while the agent is taking actions, so the agent need not worry about changes around it.

 Example
 8-queens puzzle
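Because the 8-queens board does not change while the agent deliberates, a candidate solution can be checked at leisure. A small sketch (the tuple encoding, entry i = row of the queen in column i, is a common convention assumed here):

```python
def is_valid(queens):
    # queens[i] is the row of the queen in column i.
    # Columns are distinct by construction; check rows and diagonals.
    n = len(queens)
    for i in range(n):
        for j in range(i + 1, n):
            same_row = queens[i] == queens[j]
            same_diag = abs(queens[i] - queens[j]) == j - i
            if same_row or same_diag:
                return False
    return True

print(is_valid((0, 4, 7, 5, 2, 6, 1, 3)))  # a known solution -> True
print(is_valid((0, 1, 2, 3, 4, 5, 6, 7)))  # all on one diagonal -> False
```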
Static vs Dynamic

 Dynamic
 The environment keeps changing continuously, which forces the agent to be more attentive when deciding how to act.

 Example
 Driving a boat (a big wave can come, or it can become more windy)
Discrete vs Continuous

 Discrete
 Has a fixed, finite set of discrete states over time, and each state has associated percepts and actions.
 Tic-tac-toe: every state is stable, has an associated percept, and is the outcome of some action.
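The finiteness of a discrete environment can be made concrete by counting all raw tic-tac-toe board assignments (this is an upper bound; not every assignment is reachable in legal play):

```python
from itertools import product

# Each of the 9 cells is empty, X, or O, so there are at most
# 3**9 = 19683 raw board assignments: a finite, discrete state set.
states = list(product(" XO", repeat=9))
print(len(states))  # 19683
```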
Discrete vs Continuous

 Continuous
 Not stable at any given point in time; the state changes continually, so the agent must learn continuously and make decisions as it goes.
 Example: flight controller
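In a continuous environment the controlled quantity is real-valued and must be corrected repeatedly rather than chosen from a fixed set of moves. A toy sketch (the proportional gain and altitude numbers are invented for illustration, not a real flight controller):

```python
def proportional_controller(altitude, target, gain=0.5):
    # Control output is a continuous correction proportional to the error.
    return gain * (target - altitude)

# Repeatedly apply the correction: the error shrinks by half each step,
# so the altitude converges toward the target.
altitude, target = 900.0, 1000.0
for _ in range(10):
    altitude += proportional_controller(altitude, target)
print(round(altitude, 1))
```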
Single agent vs Multi agent

 Single agent
 A well-defined single agent
 Boat driving (here a single agent perceives and acts)
Single agent vs Multi agent

 Multi agent
 Various agents, or various groups of agents, work together to make decisions.

1. Multi-agent independent environment
 Many agents in a maze game
Single agent vs Multi agent

2. Multi-agent cooperative environment
 Many agents work together to achieve a goal.
 Football

3. Multi-agent competitive environment
 Many agents are working, but in opposition to each other.
 Trading agents
Single agent vs Multi agent

4. Multi-agent antagonistic environment
 Multiple agents work in opposition to each other, but one side (an agent or agent team) has a negative goal.
 War games
Properties of Environments

 Accessible/ Inaccessible.
 If an agent's sensors give it access to the complete state of the
environment needed to choose an action, the environment is accessible.
 Such environments are convenient, since the agent is freed from the task
of keeping track of the changes in the environment.
 Deterministic/ Nondeterministic.
 An environment is deterministic if the next state of the environment is
completely determined by the current state of the environment and the
action of the agent.
 In an accessible and deterministic environment the agent need not deal
with uncertainty.
01/21/2022 Artificial Intelligence 24
Contd..
 Episodic/ Non episodic.
 An episodic environment means that subsequent episodes do not
depend on what actions occurred in previous episodes.
 Such environments do not require the agent to plan ahead.



Properties of Environments

 Static/ Dynamic.
 An environment which does not change while the
agent is thinking is static.
 In a static environment the agent need not worry
about the passage of time while he is thinking, nor
does he have to observe the world while he is thinking.
 Discrete/ Continuous.
 If the number of distinct percepts and actions is limited
the environment is discrete, otherwise it is continuous.
Contd..
 With/ Without rational adversaries.
 If an environment does not contain other rationally thinking, adversarial agents, the agent need not worry about strategic, game-theoretic aspects of the environment.
 As an example of a game with a rational adversary, try the Prisoner's Dilemma.

