2.1 Agent and Environment
AI
CH:02
Intelligent Agents
Content
• Introduction to agents
• Structure of an intelligent agent
• Characteristics of intelligent agents
• Types of agents: Simple Reflex, Model-Based, Goal-Based,
Utility-Based agents
• Environment types: Deterministic, Stochastic,
Static, Dynamic, Fully Observable, Partially Observable,
Single Agent, Multi-Agent
Agent
• An agent is anything that can be viewed as
perceiving its environment through sensors
and acting upon that environment through
actuators.
Vacuum-cleaner world
• Situatedness
The agent receives some form of sensory input from its
environment, and it performs some action that changes its
environment in some way.
• Examples of environments: Vacuum cleaner
• Autonomy
The agent can act without direct intervention by humans or other
agents and that it has control over its own actions and internal
state.
• Adaptivity
The agent is capable of (1) reacting flexibly to changes in its
environment; (2) taking goal-directed initiative (i.e., is pro-active),
when appropriate; and (3) learning from its own experience, its
environment, and interactions with others.
• Sociability
The agent is capable of interacting in a peer-to-peer manner with
other agents or humans.
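The percept-and-act cycle for the vacuum-cleaner world above can be sketched as a tiny reflex agent; the square names "A"/"B" and the action names are illustrative assumptions, not part of the slides:

```python
# Minimal sketch of the percept -> action cycle for a
# two-square vacuum-cleaner world (names are illustrative).

def reflex_vacuum_agent(percept):
    """Map a percept (location, status) directly to an action."""
    location, status = percept
    if status == "Dirty":
        return "Suck"      # act on the environment: clean this square
    if location == "A":
        return "Right"     # move to the other square
    return "Left"

# Sense, then act: one step of the agent loop.
print(reflex_vacuum_agent(("A", "Dirty")))   # -> Suck
print(reflex_vacuum_agent(("B", "Clean")))   # -> Left
```

This illustrates situatedness: the agent's action is a direct function of what its sensors report about the environment.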
Types of Environment
• Fully observable vs. partially observable
• Single agent vs. multiagent
• Deterministic vs. stochastic
• Episodic vs. sequential
• Static vs. dynamic
• Discrete vs. continuous
• Known vs. unknown
1.Fully observable vs. partially
observable:
• If an agent’s sensors give it access to the complete state of the
environment at each point in time, then we say that the task
environment is fully observable.
• Fully observable environments are convenient because the agent need not
maintain any internal state to keep track of the world.
• An environment may be partially observable because of noisy or inaccurate
sensors, or because parts of the state are missing from the sensor data.
• For example, a vacuum agent with only a local dirt sensor cannot tell
whether there is dirt in other squares, and an automated taxi cannot see
what other drivers are thinking.
2.Single agent vs. multiagent:
• For example, an agent solving a crossword puzzle by itself is clearly
in a single-agent environment,
• An agent playing carrom is in a multiagent environment.
3.Deterministic vs. stochastic:
• If the next state of the environment is completely
determined by the current state and the action executed by
the agent, then we say the environment is deterministic;
• otherwise, it is stochastic.
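The distinction can be sketched with the vacuum world's "Suck" action; the 0.9 success probability is an illustrative assumption:

```python
import random

# Sketch of deterministic vs. stochastic transitions for a
# "Suck" action in the vacuum world (probability is illustrative).

def deterministic_suck(status):
    # The next state is fully determined by the current state and action.
    return "Clean"

def stochastic_suck(status, p_success=0.9):
    # The same state and action can lead to different next states.
    return "Clean" if random.random() < p_success else status

print(deterministic_suck("Dirty"))              # -> Clean, every time
print(stochastic_suck("Dirty", p_success=1.0))  # -> Clean
```

In the stochastic version an agent cannot predict the outcome of its action with certainty, which is why it may need to keep sensing after acting.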
Difference between goal-based agents and
utility-based agents
• Goal-based agents decide their actions based on goals, whereas
utility-based agents decide their actions based on utilities.
• Goal-based agents are more flexible, whereas utility-based agents are
less flexible.
• Goal-based agents are slower, whereas utility-based agents are faster.
• Goal-based agents are not enough to generate high-quality behavior in
most environments, whereas utility-based agents can generate
high-quality behavior in most environments.
• Goal-based agents cannot specify the appropriate tradeoff between
conflicting goals, whereas utility-based agents can.
• Goal-based agents are less safe, whereas utility-based agents are
safer.
Rational
• The dictionary meaning of rational is something
logical and sensible, not emotional.
• A rational agent is one that does the right
thing.
• The rationality of an agent is measured by its
performance measure.
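A performance measure can be made concrete for the vacuum world; awarding one point per clean square per time step is an illustrative choice, not the only possible measure:

```python
# Sketch: a performance measure for the vacuum world that awards
# one point per clean square per time step (an illustrative choice).

def performance(history):
    """history: one dict per time step mapping square -> status."""
    return sum(1 for step in history
               for status in step.values()
               if status == "Clean")

history = [{"A": "Dirty", "B": "Clean"},   # step 1: one clean square
           {"A": "Clean", "B": "Clean"}]   # step 2: two clean squares
print(performance(history))  # -> 3
```

A rational agent then chooses actions expected to maximize this score, which is how "doing the right thing" is made precise.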
Thank you!