SEN CHP 4

The document outlines key aspects of software project estimation, focusing on the Management Spectrum's four P's: People, Product, Process, and Project. It discusses various metrics for size estimation, including Lines of Code (LOC) and Function Points (FP), as well as different estimation techniques such as heuristic, analytical, and empirical methods. Additionally, it introduces the COCOMO model for cost estimation and highlights the importance of risk management in software projects.


Software Project Estimation

The Management Spectrum


4 P’s
The Management Spectrum describes how to manage a software project and make it successful.
It focuses on the following four P's:
- The People
- The Product
- The Process
- The Project
 The people
 Deals with the cultivation of motivated, highly skilled
people
 Includes everyone from manager to developer, and from customer to end-user.
 The product
 Product objectives and scope should be established
before a project can be planned
 The process
 The software process provides the framework from
which a comprehensive plan for software
development can be established
 The project
 Planning and controlling a software project is done
for one primary reason…it is the only known way to
manage complexity
Metrics for size estimation
Estimation of the size of software is an essential part of
Software Project Management.
 It helps the project manager to further predict the effort
and time which will be needed to build the project.
In order to be able to accurately estimate the project
size, some important metrics should be defined in terms
of which the project size can be expressed
Two metrics are currently in wide use for size estimation: lines of code (LOC) and function points (FP).
The usage of each of these metrics in project size
estimation has its own advantages and disadvantages.
Lines of Code (LOC)
As the name suggests, LOC counts the total number of lines of source code in a project. Common units of LOC are:
KLOC- Thousand lines of code
NLOC- Non comment lines of code
KDSI- Thousands of delivered source instruction
The size is estimated by comparing the project with existing systems of the same kind. Experts use this comparison to predict the required size of the various components of the software and then add them to get the total size.
Advantages:
Universally accepted and is used in many
models like COCOMO.
Estimation is closer to developer’s perspective.
Simple to use.
Disadvantages:
The same functionality requires a different number of lines in different programming languages.
No proper industry standard exists for this technique.
It is difficult to estimate size with this technique in the early stages of a project.
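As a rough illustration of what a LOC counter does, the sketch below counts total, blank, and comment lines for a simple "#"-comment language. It is a simplification: real counters must also handle block comments and string literals.

```python
def count_loc(source: str) -> dict:
    """Count total lines, blank lines, comment-only lines, and
    non-comment lines (NLOC) for a '#'-comment language."""
    total = blank = comment = 0
    for line in source.splitlines():
        total += 1
        stripped = line.strip()
        if not stripped:
            blank += 1
        elif stripped.startswith("#"):
            comment += 1
    nloc = total - blank - comment
    return {"LOC": total, "NLOC": nloc, "KLOC": total / 1000}

sample = "# setup\n\nx = 1\ny = x + 1  # inline comment still counts as code\n"
print(count_loc(sample))
```

Note how NLOC excludes both blank and comment-only lines, which is why LOC and NLOC can differ substantially for the same program.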
Function Point
In this method, the number and types of functions supported by the software are used to find the FPC (function point count).
The various parameters are:
 Number of inputs
 Number of user outputs
 Number of inquiries
 Number of files.
 Number of interfaces
 Once the data have been collected, a complexity value is associated with each count: each entry can be simple, average, or complex. Each entry is weighted according to its complexity, and the weighted counts are summed to give the count-total.
 To compute function points, we use
 FP = count-total × [0.65 + 0.01 × Σ Fi]
 where count-total is the sum of all weighted entries obtained, and Fi (i = 1 to 14) are complexity adjustment values based on responses to 14 questions, the first five of which are given below.
 Fi
1. Does the system require reliable backup and recovery?
2. Is performance critical?
3. Does the system require on-line entry?
4 .Is the internal processing complex?
5. Is the code designed to be reusable?
Rate each factor on a scale of 0 to 5
0 - No influence
1 - Incidental
2 - Moderate
3 - Average
4 - Significant
5 - Essential
Count-total is the sum of all FP entries.
Once function points have been calculated, productivity, quality, cost, and documentation can be evaluated.
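Putting the pieces together, here is a sketch of the FP computation. The weights are the commonly published Albrecht values for each complexity level; the project counts and the all-average ratings are invented for illustration.

```python
# Commonly published FP weights per parameter: (simple, average, complex)
WEIGHTS = {
    "inputs":     (3, 4, 6),
    "outputs":    (4, 5, 7),
    "inquiries":  (3, 4, 6),
    "files":      (7, 10, 15),
    "interfaces": (5, 7, 10),
}

def function_points(counts, complexity, fi):
    """counts: {param: n}; complexity: {param: 0|1|2} indexing WEIGHTS;
    fi: the 14 complexity adjustment ratings, each 0..5."""
    assert len(fi) == 14 and all(0 <= f <= 5 for f in fi)
    count_total = sum(n * WEIGHTS[p][complexity[p]]
                      for p, n in counts.items())
    return count_total * (0.65 + 0.01 * sum(fi))

# Hypothetical project: counts per parameter, all rated "average"
counts = {"inputs": 10, "outputs": 5, "inquiries": 4,
          "files": 3, "interfaces": 2}
complexity = {p: 1 for p in counts}   # 1 = average
fi = [3] * 14                         # every factor rated "average"
fp = function_points(counts, complexity, fi)
print(round(fp, 2))
```

With all Fi at 3, the adjustment factor is 0.65 + 0.42 = 1.07, so the weighted count-total is scaled up by 7%.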
PROJECT COST ESTIMATION APPROACHES
Estimation techniques

1. Heuristic techniques
2. Analytical estimation techniques
3. Empirical estimation techniques
Heuristic technique
It assumes that the relationships among the
different project parameters can be modelled
using suitable mathematical expressions.
Once the basic parameters are known, the other
parameters can be easily determined by
substituting the value of the basic parameters in
the mathematical expression.
 Different heuristic models can be divided into two classes: single-variable models and multivariable models.
Single Variable Estimation
Models:
 It provides a means to estimate a desired characteristic of the software product from some previously estimated basic characteristic, such as its size.
 A single-variable estimation model takes the following form:
 Estimated Parameter = c1 × e^d1
 e = the characteristic that has already been estimated.
 Estimated Parameter is the dependent parameter to be estimated; it could be effort, duration, staff size, etc.
 c1 and d1 are constants calculated from past projects.
 Basic COCOMO is an example of this type of model.
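In code, the single-variable form is a one-liner. The constants below are illustrative only (they happen to match Basic COCOMO's organic-mode effort equation), not values calibrated for any real project:

```python
# Single-variable model: Estimated Parameter = c1 * e^d1,
# where e is an already-estimated basic characteristic (e.g., KLOC).
def single_variable_estimate(e: float, c1: float, d1: float) -> float:
    return c1 * e ** d1

# Illustrative constants; real c1/d1 come from past-project data.
effort = single_variable_estimate(e=32.0, c1=2.4, d1=1.05)
print(round(effort, 1))  # estimated effort in person-months
```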
Multivariable Cost Estimation Model:
 It has the following form:
 Estimated Resource = c1 × e1^d1 + c2 × e2^d2 + …
 e1 and e2 are basic independent characteristics of the software that have already been estimated.
 c1, c2, d1, d2, … are constants.
 Multivariable estimation models are expected to give more accurate estimates than single-variable models, since a project parameter is typically influenced by several independent parameters.
 The independent parameters influence the dependent parameter to different extents; this is modelled by the constants c1, c2, d1, d2, …, which are determined from historical data.
 The Intermediate COCOMO model is an example of this.
Analytical Estimation technique
It derives the required results starting with basic assumptions regarding the project.
Thus, unlike empirical and heuristic techniques, analytical techniques do have a scientific basis.
Halstead's software science is an example of an analytical technique.
 It can be used to derive some interesting results
starting with a few simple assumptions.
It is especially useful for estimating software
maintenance efforts.
In fact, it outperforms both empirical and heuristic
techniques when used for predicting software
maintenance efforts.
Halstead complexity
measures
 In 1977, Mr. Maurice Howard Halstead introduced metrics to
measure software complexity.
 Halstead’s metrics depend upon the actual implementation of a program. They are computed directly from the operators and operands in the source code, in a static manner, and allow one to evaluate testing time, vocabulary, size, difficulty, errors, and effort for C/C++/Java source code.
 According to Halstead, “A computer program is an implementation of an algorithm considered to be a collection of tokens which can be classified as either operators or operands”. Halstead’s metrics treat a program as a sequence of operators and their associated operands.
 He defined various indicators to measure the complexity of a module. Halstead’s measures are:
Halstead Program Length – The total number of operator occurrences plus the total number of operand occurrences:
N = N1 + N2
The estimated program length is N̂ = n1 log2(n1) + n2 log2(n2)

Halstead Vocabulary – The number of unique operators plus the number of unique operands:
n = n1 + n2
Program Volume – Proportional to program size, it represents the size, in bits, of the space necessary for storing the program. This parameter depends on the specific algorithm implementation. The properties V, N, and the number of lines in the code are shown to be linearly connected and equally valid for measuring relative program size:
V = N × log2(n)
The unit of measurement of volume is the common unit for size, “bits”; it is the actual size of a program if a uniform binary encoding for the vocabulary is used. An estimate of delivered errors is:
errors = V / 3000
Program Difficulty – This parameter shows how difficult the program is to handle:
D = (n1 / 2) × (N2 / n2)
D = 1 / L, where L is the program level.
As the volume of the implementation of a program increases, the program level decreases and the difficulty increases.
Thus, programming practices such as redundant usage of operands, or the failure to use higher-level control constructs, will tend to increase the volume as well as the difficulty.
Programming Effort – Measures the amount of mental activity needed to translate the existing algorithm into an implementation in the specified programming language:
E = V / L = D × V

Language Level – Shows the level of the programming language used for the implementation. The same algorithm demands additional effort if it is written in a low-level programming language; for example, it is easier to program in Pascal than in Assembler.
λ = L × V* = L² × V = V / D²
Intelligence Content – Determines the amount of intelligence presented (stated) in the program. This parameter provides a measurement of program complexity, independent of the programming language in which it was implemented:
I = V / D
Programming Time – Shows the time needed to translate the existing algorithm into an implementation in the specified programming language:
T = E / S
This uses the concept of the processing rate of the human brain, developed by the psychologist John Stroud. Stroud defined a “moment” as the time required by the human brain to carry out the most elementary decision. The Stroud number S is therefore measured in moments per second, with 5 ≤ S ≤ 20. The value of S was developed from empirical psychological reasoning; its recommended value for programming applications is 18, which Halstead uses, giving T in seconds.
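The measures above can be computed mechanically once the token counts are known. In the sketch below the counts (n1, n2, N1, N2) are hypothetical, as if already extracted from a small program by a tokenizer:

```python
import math

def halstead(n1, n2, N1, N2, S=18.0):
    """Halstead measures from token counts: n1/n2 = distinct
    operators/operands, N1/N2 = total operator/operand occurrences."""
    n = n1 + n2                                       # vocabulary
    N = N1 + N2                                       # program length
    N_hat = n1 * math.log2(n1) + n2 * math.log2(n2)   # estimated length
    V = N * math.log2(n)                              # volume (bits)
    D = (n1 / 2) * (N2 / n2)                          # difficulty
    L = 1 / D                                         # program level
    E = D * V                                         # effort
    T = E / S                                         # time (seconds)
    B = V / 3000                                      # delivered errors
    return {"n": n, "N": N, "N_hat": N_hat, "V": V,
            "D": D, "L": L, "E": E, "T": T, "B": B}

# Hypothetical counts for a small program
m = halstead(n1=10, n2=15, N1=40, N2=60)
print({k: round(v, 2) for k, v in m.items()})
```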
Overview of empirical
estimation
Empirical estimation techniques are based partly on data taken from previous projects and partly on guesses and assumptions.
Expert judgment technique
 Expert judgment is a technique in which judgment is provided based upon a specific set of criteria and/or expertise that has been acquired in a specific knowledge area, application area, or product area, a particular discipline, an industry, etc. Such expertise may be provided by any group or person with specialized education, knowledge, skill, experience, or training. This knowledge base can be provided by a member of the project team, multiple members of the project team, or one or more team leaders. However, expert judgment typically requires expertise that is not present within the project team, and as such it is common for an external group or person with a specific relevant skill set or knowledge base to be brought in for a consultation.
Delphi cost estimation
 Delphi Method is a structured communication technique,
originally developed as a systematic, interactive forecasting
method which relies on a panel of experts. The experts answer
questionnaires in two or more rounds. After each round, a
facilitator provides an anonymous summary of the experts’
forecasts from the previous round with the reasons for their
judgments. Experts are then encouraged to revise their earlier
answers in light of the replies of other members of the panel.
 It is believed that during this process the range of answers will
decrease and the group will converge towards the "correct"
answer. Finally, the process is stopped after a predefined stop criterion (e.g., number of rounds, achievement of consensus, or stability of results), and the mean or median scores of the final rounds determine the results.
 Delphi Method was developed in the 1950-1960s at the RAND
Corporation.
COCOMO(Constructive cost model)
 COCOMO (Constructive Cost Model) is a regression model based on LOC, i.e., the number of lines of code. It is a procedural cost-estimation model for software projects, often used to reliably predict the various parameters associated with a project, such as size, effort, cost, time, and quality. It was proposed by Barry Boehm in 1981 and is based on a study of 63 projects, which makes it one of the best-documented models.
 The key parameters which define the quality of any software products,
which are also an outcome of the Cocomo are primarily Effort &
Schedule:
 Effort: Amount of labor that will be required to complete a task. It is
measured in person-months units.
 Schedule: Simply means the amount of time required for the completion of the job, which is, of course, proportional to the effort put in. It is measured in units of time such as weeks or months.
Basic COCOMO
 Basic COCOMO computes software development effort (and cost) as a function of program size. Program size is expressed in estimated thousands of lines of code (KLOC). COCOMO applies to three classes of software projects:
 • Organic projects - "small" teams with "good" experience
working with "less than rigid" requirements
 • Semi-detached projects - "medium" teams with mixed
experience working with a mix of rigid and less than rigid
requirements
 • Embedded projects - developed within a set of "tight"
constraints (hardware, software, operational, ......)
 The basic COCOMO equations take the form:
 Effort Applied = a_b × (KLOC)^b_b  [person-months]
 Development Time = c_b × (Effort Applied)^d_b  [months]
 People Required = Effort Applied / Development Time  [count]
The coefficients a_b, b_b, c_b, and d_b are given in the following table (Boehm's published values):

Software project    a_b    b_b    c_b    d_b
Organic             2.4    1.05   2.5    0.38
Semi-detached       3.0    1.12   2.5    0.35
Embedded            3.6    1.20   2.5    0.32
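A minimal sketch of these equations in code, using Boehm's published Basic COCOMO coefficients; the 32-KLOC project size is an invented example:

```python
# Boehm's Basic COCOMO coefficients: (a_b, b_b, c_b, d_b) per mode
COEFF = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, mode):
    """Return (effort in person-months, time in months, avg. staff)."""
    a, b, c, d = COEFF[mode]
    effort = a * kloc ** b                 # Effort Applied
    time = c * effort ** d                 # Development Time
    return effort, time, effort / time    # People Required

effort, time, people = basic_cocomo(32.0, "organic")
print(f"{effort:.1f} PM, {time:.1f} months, {people:.1f} people")
```

Note how the schedule grows much more slowly than the effort (exponent 0.38): doubling the effort does not double the calendar time, which is why average staffing rises with project size.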

 Basic COCOMO is good for quick estimates of software costs. However, it does not account for differences in hardware constraints, personnel quality and experience, use of modern tools and techniques, and so on.
 Different models of Cocomo have been proposed to predict the cost
estimation at different levels, based on the amount of accuracy and
correctness required.
 Boehm’s definition of organic, semidetached, and embedded
systems:
 Organic – A software project is said to be an organic type if the
team size required is adequately small, the problem is well
understood and has been solved in the past and also the team
members have a nominal experience regarding the problem.
 Semi-detached – A software project is said to be a Semi-detached
type if the vital characteristics such as team-size, experience,
knowledge of the various programming environment lie in between
that of organic and Embedded. The projects classified as Semi-
Detached are comparatively less familiar and difficult to develop
compared to the organic ones and require more experience and
better guidance and creativity. Eg: Compilers or different Embedded
Systems can be considered of Semi-Detached type.
 Embedded – A software project requiring the highest level of complexity, creativity, and experience falls under this category. Such software requires a larger team size than the other two models, and the developers need to be sufficiently experienced and creative to develop such complex models.
Types of Models:
 COCOMO consists of a hierarchy of three
increasingly detailed and accurate forms. Any of
the three forms can be adopted according to our
requirements. These are types of COCOMO
model:
Basic COCOMO Model
Intermediate COCOMO Model
Detailed COCOMO Model
The first level, Basic COCOMO, can be used for quick and slightly rough calculations of software costs. Its accuracy is somewhat restricted due to the absence of sufficient factor considerations.
Intermediate COCOMO takes these cost drivers into account, and Detailed COCOMO additionally accounts for the influence of individual project phases: it considers the cost drivers and also performs the calculations phase-wise, thereby producing a more accurate result.
Risk Management
Definition of Risk
 A risk is a potential problem – it might happen and it might not
 Conceptual definition of risk
 Risk concerns future happenings
 Risk involves change in mind, opinion, actions, places, etc.
 Risk involves choice and the uncertainty that choice entails
 Two characteristics of risk
 Uncertainty – the risk may or may not happen, that is, there
are no 100% risks (those, instead, are called constraints)
 Loss – the risk becomes a reality and unwanted
consequences or losses occur

Kinds of Risk
 Schedule Risk:
The project schedule slips when project tasks and schedule-release risks are not addressed properly. Schedule risks mainly affect the project, and ultimately the company's economy, and may lead to project failure.
 Schedules often slip due to the following
reasons:
 Wrong time estimation
 Resources (staff, systems, individual skills, etc.) are not tracked properly.
 Failure to identify complex functionalities and time
required to develop those functionalities.
 Unexpected project scope expansions
Budget Risk:
 Wrong budget estimation.
 Cost overruns
 Project scope expansion

Operational Risks:
Risks of loss due to improper process implementation, failed systems, or some external events.
Causes of Operational risks:
 Failure to address priority conflicts
 Failure to resolve the responsibilities
 Insufficient resources
 No proper subject training
 No resource planning
 No communication in the team.
Technical risks:
Technical risks generally lead to failure of functionality and
performance.
Causes of technical risks are:
 Continuous changing requirements
 No advanced technology available or the existing technology is in
initial stages.
 The product is complex to implement.
 Difficult project modules integration.

Programmatic Risks:
These are external risks beyond the operational limits; they are uncertain risks outside the control of the program.
These external events can be:
 Running out of funds.
 Market development
 Changing customer product strategy and priority
 Government rule changes.
Risk Assessment
 Risk assessment is a term used to describe the overall process
or method where you:
 Identify hazards and risk factors that have the potential to
cause harm (hazard identification).
 Analyze and evaluate the risk associated with that hazard (risk
analysis, and risk evaluation).
 Determine appropriate ways to eliminate the hazard, or control
the risk when the hazard cannot be eliminated (risk control).
 A risk assessment is a thorough look at your workplace to
identify those things, situations, processes, etc. that may
cause harm, particularly to people. After identification is made,
you analyze and evaluate how likely and severe the risk is.
When this determination is made, you can next, decide what
measures should be in place to effectively eliminate or control
the harm from happening.
Risk Identification
 It is a systematic attempt to specify threats to the project plans.

Two different types of risk:

1. Generic risks: These risks are a potential threat to each software project.
 2. Product-specific risks: These risks are recognized by those with a clear
understanding of the technology, the people and the environment which is
specific to the software that is to be built.
 A method for recognizing risks is to create an item checklist.
 The checklist is used for risk identification, focusing on the subset of known and predictable risks in the following categories:

1. Product size
2. Business impact
3. Customer characteristic
4. Process definition
5. Development environment
6. Technology to be built
7. Staff size and experience
Risk Analysis
 Software risk analysis is a very important aspect of risk management. In this phase the risk is identified and then categorized. After the categorization of risk, the level, likelihood (percentage), and impact of the risk are analyzed. Likelihood is defined as a percentage after examining the chances of the risk occurring due to various technical conditions.
These technical conditions can be:
 Complexity of the technology
 Technical knowledge possessed by the testing team
 Conflicts within the team
 Teams being distributed over a large geographical area
 Usage of poor quality testing tools
 By impact we mean the consequence of a risk in case it happens. It is important to know the impact because it shows how the business can be affected:
 What will be the loss to the customer
 How would the business suffer
 Loss of reputation or harm to society
 Monetary losses
 Legal actions against the company
 Cancellation of business license
 Level of risk is identified with the help of:
 Qualitative Risk Analysis: Here you rate each risk as:
 High
 Medium
 Low
 Quantitative Risk Analysis: This can be used for software risk analysis, but is considered inappropriate because the risk level is defined as a percentage, which does not give a very clear picture.
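One simple way to combine likelihood and impact quantitatively is Boehm's risk exposure: exposure = probability × loss. The risks and numbers below are hypothetical examples for illustration, not from the text:

```python
# Hypothetical risk register: (risk, probability, loss in person-days)
risks = [
    ("schedule slip on integration", 0.4, 30),
    ("key developer leaves",         0.1, 60),
    ("test tooling inadequate",      0.3, 15),
]

def rank_risks(risks):
    """Sort risks by exposure = probability * loss, highest first."""
    return sorted(((name, p * loss) for name, p, loss in risks),
                  key=lambda r: -r[1])

ranked = rank_risks(risks)
for name, exposure in ranked:
    print(f"{name}: exposure = {exposure:.1f} person-days")
```

Ranking by exposure gives a clearer prioritization than likelihood alone: here the frequent-but-cheap risk outranks the rare-but-expensive one.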
Risk Containment
 Risk management means risk containment and mitigation.
First, you’ve got to identify and plan. Then be ready to act
when a risk arises, drawing upon the experience and
knowledge of the entire team to minimize the impact to
the project.

Risk management includes the following tasks:

 Identify risks and their triggers
 Classify and prioritize all risks
 Craft a plan that links each risk to a mitigation
 Monitor for risk triggers during the project
 Implement the mitigating action if any risk materializes
 Communicate risk status throughout project
RMMM Strategy
 Risk analysis supports the project team in constructing a strategy to deal with risks.
 There are three important issues considered in developing an effective strategy:
 Risk avoidance or mitigation – the primary strategy, which is fulfilled through a plan.
 Risk monitoring – the project manager monitors the risk factors and gives an indication of whether a risk is becoming more or less likely.
 Risk management and planning – assumes that the mitigation effort failed and the risk has become a reality.
 RMMM Plan
 It is a part of the software development plan or a separate document.
 The RMMM plan documents all work executed as a part of risk analysis and
used by the project manager as a part of the overall project plan.
 Risk mitigation and monitoring start once the project has started and the RMMM documentation is completed.
