SPPM 1st and 2nd Units Notes
What is CMM?
CMM (the Capability Maturity Model) was developed at the Software Engineering Institute in the late 1980s, as the result of a study financed by the U.S. Air Force to evaluate the work of subcontractors. Later, in improved versions, it was used to track the quality of the software development process.
Today CMM acts as a "seal of approval" in the software industry. It helps improve software quality in several ways:
- It guides teams toward a repeatable, standard process, reducing the time spent learning how to get things done.
- Practicing CMM means following a standard protocol for development, which saves the team time and gives a clear view of what to do and what to expect.
- Quality activities gel with the project rather than being treated as separate events.
- It acts as a communication channel between the project and the team.
- CMM efforts are always directed toward improvement of the process.
1. CMM came first; it was later improved and succeeded by CMMI.
2. The different CMMs had problems with overlaps, contradictions, and lack of standardization; CMMI later addressed these problems.
3. The original CMM dealt specifically with software engineering, whereas CMMI describes integrated processes and disciplines, applying to both software and systems engineering.
4. CMMI is more useful and universal than the older CMM.
The People Capability Maturity Model consists of five maturity levels. Each maturity level is
an evolutionary plateau at which one or more domains of the organization’s processes are
transformed to achieve a new level of organizational capability. The five levels of People CMM
are defined as follows:
The People Capability Maturity Model was designed initially for knowledge-intensive organizations and workforce management processes. However, it can be applied in almost any organizational setting, either as a guide for implementing workforce improvement activities or as a vehicle for assessing workforce practices.
The PSP can be applied to many parts of the software development process, including
- small-program development
- requirement definition
- document writing
- systems tests
- systems maintenance
- enhancement of large software systems
Benefits of TSP
The TSP provides a defined process framework for managing, tracking and reporting
the team's progress.
Using TSP, an organization can build self-directed teams that plan and track their work,
establish goals, and own their processes and plans. These can be pure software teams or
integrated product teams of 3 to 20 engineers.
TSP will help your organization establish a mature and disciplined
engineering practice that produces secure, reliable software.
2. In order to manage and control all of the intellectual freedom associated with software development, one should follow these steps:
3. Since the testing phase comes at the end of the development cycle in the waterfall model, it is risky and invites failure.
Software process and project management Page 17
Either the requirements must be modified or substantial design changes are warranted, breaking the software into different pieces.
- There are five improvements to the basic waterfall model that would eliminate most of the development risks:
a) Complete program design before analysis and coding begin (program design comes first):-
- With this technique, the program designer assures that the software will not fail because of storage, timing, or data fluctuations.
- Begin the design process with program designers, not analysts or programmers.
- Write an overview document that is understandable, informative, and current, so that every worker on the project can gain an elemental understanding of the system.
b) Maintain current and complete documentation (Document the design):-
- It is necessary to provide a lot of documentation on most software programs.
- This documentation supports later modification by a separate test team, a separate maintenance team, and operations personnel who are not software literate.
c) Do the job twice, if possible (Do it twice):-
- If a computer program is developed for the first time, arrange matters so that the version finally delivered
to the customer for operational deployment is actually the second version insofar as critical design/operations are
concerned.
- “Do it N times” approach is the principle of modern-day iterative development.
d) Plan, control, and monitor testing:-
- The biggest user of project resources is the test phase. This is the phase of greatest risk in terms of cost and
schedule.
- In order to carry out proper testing, the following things should be done:
i) Employ a team of test specialists who were not responsible for the original design.
ii) Employ visual inspections to spot obvious errors such as dropped minus signs, missing factors of two, and jumps to wrong addresses.
iii) Test every logic phase.
iv) Employ the final checkout on the target computer.
e) Involve the customer:-
- It is important to involve the customer in a formal way, so that he is committed at points earlier than final delivery, by conducting reviews such as:
i) Preliminary software review during preliminary program design step.
ii) Critical software review during program design.
iii) Final software acceptance review following testing.
IN PRACTICE:-
- Despite the advice given by software developers and the theory behind the waterfall model, some software projects still practice the conventional software management approach.
Projects destined for trouble frequently exhibit the following symptoms:
i) Protracted (delayed) integration
- In the conventional model, the entire system was designed on paper, then implemented all at once, then integrated. Only at the end of this process was it possible to perform system testing to verify that the fundamental architecture was sound.
- It includes four distinct periods of risk exposure, where risk is defined as “the probability of missing a cost,
schedule, feature, or quality goal”.
Project Stakeholders :
v) Focus on Documents and Review Meetings
- The conventional process focused on various documents that attempted to describe the software product.
- Contractors produce literally tons of paper to meet milestones and demonstrate progress to stakeholders, rather
than spend their energy on tasks that would reduce risk and produce quality software.
- Most design reviews resulted in low engineering value and high cost in terms of the effort and schedule involved in their preparation and conduct.
1) Finding and fixing a software problem after delivery costs 100 times more than finding and fixing
the problem in early design phases.
2) You can compress software development schedules 25% of nominal, but no more.
4) Software development and maintenance costs are primarily a function of the number of source lines
of code.
5) Variations among people account for the biggest difference in software productivity.
6) The overall ratio of software to hardware costs is still growing. In 1955 it was 15:85; in 1985, 85:15.
8) Software systems and products typically cost 3 times as much per SLOC as individual software
programs. Software-system products cost 9 times as much.
9) Walkthroughs catch 60% of the errors.
- 80% of the software scrap and rework is caused by 20% of the errors.
Project Sizes :
The less software we write, the better it is for project management and for product quality.
- The cost of software is not just the cost of 'coding' alone; it also includes:
– Analysis of requirements
– Design
– Review of requirements, design, and code
– Test planning and preparation
– Testing
– Bug fixing
– Regression testing
- 'Coding' takes around 15% of the development cost.
- Clearly, if we cut 15 hours of coding, we can cut roughly 100 hours of total development effort, and also reduce the project team size accordingly!
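The arithmetic behind that claim can be sketched as a quick check; the 15% coding share is the figure stated in the notes above, and the function name is just illustrative:

```python
# Quick check of the claim above: if coding is ~15% of total development
# cost, each coding hour removed drags 1/0.15 ≈ 6.7 hours of total effort with it.
CODING_SHARE = 0.15  # fraction of total effort spent on coding (from the notes)

def total_effort_saved(coding_hours_saved, coding_share=CODING_SHARE):
    """Total effort implied by a given amount of coding effort removed."""
    return coding_hours_saved / coding_share

print(total_effort_saved(15))  # 15 coding hours ≈ 100.0 total hours
```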
Size reduction is defined in terms of human-generated source code. Most often, the computer-generated executable code will still be the same size or even larger.
- Without proper, well-documented case studies it is difficult to estimate the cost of the software.
- But the cost model vendors claim that their tools are well suited for estimating.
- Many software cost estimation models are available, such as COCOMO.
- COCOMO is one of the most open and well-documented cost estimation models.
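As an illustration of how open and well documented COCOMO is, the Basic COCOMO effort equation fits in a few lines. The (a, b) coefficients are the published Basic COCOMO values; the 32-KLOC project size is just an example figure:

```python
# Basic COCOMO (Boehm): effort in person-months = a * (KLOC ** b).
# The (a, b) pairs are the published coefficients for the three project modes.
COEFFICIENTS = {
    "organic":       (2.4, 1.05),  # small teams, familiar problems
    "semi-detached": (3.0, 1.12),  # intermediate size and experience
    "embedded":      (3.6, 1.20),  # tight hardware/operational constraints
}

def basic_cocomo_effort(kloc, mode="organic"):
    """Estimated effort in person-months for a project of `kloc` thousand SLOC."""
    a, b = COEFFICIENTS[mode]
    return a * (kloc ** b)

print(round(basic_cocomo_effort(32, "organic"), 1))  # ~91 person-months
```

Note how sensitive the estimate is to the mode: the same 32 KLOC in embedded mode yields well over twice the organic-mode effort.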
- Most software experts have argued that SLOC is a poor measure of size, but it has some value in the software industry.
- SLOC worked well in applications that were custom built, because it was easy to automate and instrument.
- Nowadays many automatic source code generators and many advanced higher-level languages are available, so SLOC has become an uncertain measure.
- The main advantage of function points is that this method is independent of the technology and is
therefore a much better primitive unit for comparisons among projects and organizations.
- The main disadvantage of function points is that the primitive definitions are abstract and
measurements are not easily derived directly from the evolving artifacts.
- Function points are a more accurate estimator in the early phases of a project life cycle. In later phases, SLOC becomes a more useful and precise measurement basis for various metrics perspectives.
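To make the function-point idea concrete, here is a minimal sketch of an unadjusted function point (UFP) count using the standard average-complexity weights for the five IFPUG element types. The element counts are hypothetical, and a full IFPUG count would also classify each element as simple/average/complex and apply a value adjustment factor:

```python
# Unadjusted function points with the standard "average" weights for the
# five IFPUG element types; the counts below are purely illustrative.
AVERAGE_WEIGHTS = {
    "external_inputs": 4,
    "external_outputs": 5,
    "external_inquiries": 4,
    "internal_logical_files": 10,
    "external_interface_files": 7,
}

def unadjusted_function_points(counts):
    """Weighted sum of the counted elements."""
    return sum(AVERAGE_WEIGHTS[kind] * n for kind, n in counts.items())

counts = {
    "external_inputs": 10,
    "external_outputs": 8,
    "external_inquiries": 6,
    "internal_logical_files": 4,
    "external_interface_files": 2,
}
print(unadjusted_function_points(counts))  # 40 + 40 + 24 + 40 + 14 = 158
```

Because the count is driven by externally visible functionality rather than code, it is independent of the implementation technology, which is exactly the advantage described above.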
- The most real-world use of cost models is bottom-up rather than top-down.
- The software project manager defines the target cost of the software, then manipulates the parameters and
sizing until the target cost can be justified.
- Software economics is not only hard to improve; it is also difficult to measure and validate.
- There are several aspects to improving software economics: size, process, personnel, environment, and quality.
- These parameters (aspects) are not independent. For example, tools enable size reduction and process improvements, size-reduction approaches lead to process changes, and process improvements drive tool requirements.
- GUI technology is a good example of tools enabling a new and different process. GUI builder tools permitted engineering teams to construct an executable user interface faster and at lower cost.
- Two decades ago, teams developing a user interface would spend extensive time analyzing factors, screen layout, and screen dynamics, all on paper. With GUI builders, the paper descriptions are no longer necessary.
Along with these five basic parameters another important factor that has influenced software
technology improvements across the board is the ever- increasing advances in hardware Performance.
- There was a widespread movement in the 1990s toward object-oriented technology.
- Some studies concluded that Object-Oriented programming languages appear to benefit both software
productivity and software quality. One of such Object-Oriented method is UML-Unified Modeling Language.
Booch described the following three reasons for the success of the projects that are using Object-
Oriented concepts:
1) An OO-model of the problem and its solution encourages a common vocabulary between the
end user of a system and its developers, thus creating a shared understanding of the problem being
solved.
2) The use of continuous integration creates opportunities to recognize risk early and make incremental corrections without weakening the entire development effort.
3) An OO architecture provides a clear separation among different elements of a system, creating firewalls that prevent a change in one part of the system from rippling through the entire architecture.
1) A ruthless focus on the development of a system that provides a well-understood collection of essential minimal characteristics.
2) The existence of a culture that is centered on results, encourages
communication, and yet is not afraid to fail.
3) The effective use of OO-modeling.
REUSE:
- They take ownership of improving product quality, adding new features and
transitioning to new technologies.
- They have a sufficiently broad customer base to be profitable.
COMMERCIAL COMPONENTS
- Macro process: The focus of the macroprocess is on organizational economics, long-term strategies, and software ROI.
- Micro process: The focus of the microprocess is on achieving an intermediate product baseline with sufficient functionality as economically and rapidly as practical.
The objective of process improvement is to maximize the allocation of resources to productive
activities and minimize the impact of overhead activities on resources such as personnel, computers, and
schedule.
- The COCOMO model suggests that the combined effects of personnel skill and experience can have an impact on productivity of as much as a factor of four over unskilled personnel.
- Balance and coverage are two of the most important features of excellent teams. Whenever a team is out of balance, it is vulnerable.
- It is the responsibility of the project manager to keep track of his teams, since teamwork is much more important than the sum of the individuals.
Boehm – staffing principles:
4) The principle of team balance: Select people who will complement and synchronize with
one another.
5) The principle of phase-out: Keeping a misfit on the team doesn't benefit anyone.
– If people are already available with required skill set, just take them
– If people are already available but do not have the required skills, re-train them
– If you are not able to recruit skilled people, recruit and train people
- Project Manager
- Software Architect
Hiring skills. Few decisions are as important as hiring decisions. Placing the right person in the right
job seems obvious but is surprisingly hard to achieve.
Customer-interface skill. Avoiding adversarial relationships among stakeholders is a prerequisite for success.
Decision-making skill. The jillion books written about management have failed to provide a
clear definition of this attribute. We all know a good leader when we run into one, and decision-
making skill seems obvious despite its intangible definition.
Team-building skill. Teamwork requires that a manager establish trust, motivate progress, exploit
eccentric prima donnas, transition average people into top performers, eliminate misfits, and
consolidate diverse opinions into a team direction.
Selling skill. Successful project managers must sell all stakeholders (including themselves) on decisions
and priorities, sell candidates on job positions, sell changes to the status quo in the face of resistance, and
sell achievements against objectives. In practice, selling requires continuous negotiation, compromise,
and empathy.
• Technical Skills: the most important skills for an architect. These must include skills in both the problem domain and the solution domain.
• People Management Skills: must ensure that all people understand and implement the architecture
in exactly the way he has conceptualized it. This calls for a lot of people management skills and
patience.
• Role Model: must be a role model for the software engineers – they would emulate all good (and
also all bad !) things that the architect does
IMPROVING AUTOMATION THROUGH SOFTWARE ENVIRONMENTS
The following are the some of the configuration management environments which provide the
foundation for executing and implementing the process:
Planning tools, Quality assurance and analysis tools, Test tools, and User interfaces provide crucial
automation support for evolving the software engineering artifacts.
PEER INSPECTIONS: A PRAGMATIC VIEW:
- Over the past two decades, conventional software development practices have been progressively re-engineered and replaced by advanced software engineering technologies.
- This transition was motivated by the growing demand for software and the need to reduce cost.
THE PRINCIPLES OF CONVENTIONAL SOFTWARE ENGINEERING
Based on many years of software development experience, the software industry has proposed many principles (Davis catalogs 201 of them). Davis's top 30 principles include:
1) Make quality #1: Quality must be quantified and mechanisms put into place to motivate its
achievement.
2) High-quality software is possible: To improve the quality of the product we need to involve the customer, use prototyping, simplify the design, conduct inspections, and hire the best people.
3) Give products to customers early: No matter how hard you try to learn users' needs during the requirements phase, the most effective way to determine real needs is to give users a product and let them play with it.
4) Determine the problem before writing the requirements: When faced with a problem, most engineers rush to offer a solution. Before you try to solve a problem, be sure to explore all the alternatives and don't be blinded by the obvious solution.
9) Get it right before you make it faster: It is very easy to make a working program run faster than it is
to make a fast program work. Don’t worry about optimization during initial coding.
10) Inspect the code: Examining the detailed design and code is a much better way to find errors than testing.
11) Good management is more important than good technology
12) People are the key to success: Highly skilled people with appropriate experience, talent, and training are key. The right people with insufficient tools, languages, and process will still succeed; the wrong people with appropriate tools, languages, and process will probably fail.
13) Follow with care: Just because everybody is doing something does not make it right for you. It may be right, but you must carefully assess its applicability to your environment.
14) Take responsibility: When a bridge collapses we ask, "What did the engineers do wrong?" When software fails, we should ask the same. The fact is, in every engineering discipline the best methods can be used to produce poor results, and the most antiquated methods can be used to produce elegant designs.
15) Understand the customer’s priorities. It is possible the customer would tolerate 90% of the
functionality delivered late if they could have 10% of it on time.
16) Plan to throw one away. One of the most important critical success factors is whether or not a product is entirely new. Brand-new applications, architectures, interfaces, or algorithms rarely work the first time.
17) Design for change. The architectures, components, and specification techniques
you use must accommodate change.
18) Design without documentation is not design. I have often heard software engineers say, “I have
finished the design. All that is left is the documentation.”
vi) Use tools, but be realistic. Software tools make their users more efficient.
viii) Encapsulate. Information-hiding is a simple, proven concept that results in software that is easier
to test and much easier to maintain.
ix) Use coupling and cohesion. Coupling and cohesion are the best ways to
measure software’s inherent maintainability and adaptability.
x) Use the McCabe complexity measure. Although there are many metrics available to report the inherent
complexity of software, none is as intuitive and easy to use as Tom McCabe’s.
xi) Don’t test your own software. Software developers should never be the primary testers of their own
software.
xii) Analyze causes for errors. It is far more cost-effective to reduce the effect of an error by preventing
it than it is to find and fix it. One way to do this is to analyze the causes of errors as they are
detected.
xiii) Realize that software’s entropy increases. Any software system that undergoes continuous
change will grow in complexity and become more and more disorganized.
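The McCabe measure named in principle x) above is simple enough to compute by hand from a control-flow graph: V(G) = E - N + 2P, where E is the number of edges, N the number of nodes, and P the number of connected components. A minimal sketch, with a hypothetical graph for a single if/else branch:

```python
# McCabe's cyclomatic complexity: V(G) = E - N + 2P, computed from a
# control-flow graph given as an edge list over numbered nodes.
def cyclomatic_complexity(edges, num_nodes, num_components=1):
    return len(edges) - num_nodes + 2 * num_components

# Control-flow graph of a function with one if/else branch:
# entry -> cond, cond -> then, cond -> else, then -> exit, else -> exit
edges = [(0, 1), (1, 2), (1, 3), (2, 4), (3, 4)]
print(cyclomatic_complexity(edges, num_nodes=5))  # 5 - 5 + 2 = 2
```

For structured code, V(G) equals the number of decision points plus one, which is why it is so easy to apply intuitively.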
With today's sophisticated systems, it is not possible to define the entire problem, design the entire solution, build the software, and then test the end product in sequence. Instead, an iterative process that refines the problem understanding, an effective solution, and an effective plan over several iterations encourages balanced treatment of all stakeholder objectives.
Major risks must be addressed early to increase predictability and avoid expensive downstream
scrap and rework.
4) Enhance change freedom through tools that support round-trip engineering: (The automation
element)
- A model-based approach supports the evolution of semantically rich graphical and textual design
notations.
- Visual modeling with rigorous notations and formal, machine-processable languages provides more objective measures than the traditional approach of human review and inspection of ad hoc design representations in paper documents.
6) Instrument the process for objective quality control and progress assessment:
- Life-cycle assessment of the progress and quality of all intermediate product must be integrated into
the process.
- The best assessment mechanisms are well-defined measures derived directly from the evolving
engineering artifacts and integrated into all activities and teams.
LIFE-CYCLE PHASES
- A software project is said to have a successful development process when there is a well-defined separation between "research and development" activities and "production" activities.
- Most unsuccessful projects exhibit one of the following characteristics:
1) An overemphasis on research and development.
2) An overemphasis on production.
ENGINEERING AND PRODUCTION STAGES :
To achieve economies of scale and higher return on investment, we must move toward a software manufacturing process driven by technological improvements in process automation and component-based development.
There are two stages in the software development process
1) The engineering stage: Less predictable but smaller teams doing design and production activities.
This stage is decomposed into two distinct phases inception and elaboration.
2) The production stage: More predictable but larger teams doing construction, test, and deployment
activities. This stage is also decomposed into two distinct phases construction and transition.
These four phases of the life-cycle process map loosely onto the conceptual framework of the spiral model, as shown in the following figure.
- In the figure, the size of the spiral corresponds to the inertia of the project with respect to the breadth and depth of the artifacts that have been developed.
- This inertia manifests itself in maintaining artifact consistency, regression testing, documentation, quality
analyses, and configuration control.
- Increased inertia may have little, or at least very straightforward, impact on changing any given discrete
component or activity.
- However, the reaction time for accommodating major architectural changes, major requirements changes,
major planning shifts, or major organizational perturbations clearly increases in subsequent phases.
1. INCEPTION PHASE:
The main goal of this phase is to achieve agreement among stakeholders on the life-cycle objectives for the
project.
PRIMARY OBJECTIVES
1) Establishing the project’s scope and boundary conditions
2) Distinguishing the critical use cases of the system and the primary scenarios of operation
3) Demonstrating at least one candidate architecture against some of the primary scenarios
4) Estimating cost and schedule for the entire project
5) Estimating potential risks
Transition: The main focus is on achieving consistency and completeness of the deployment set in the context of the other artifact sets. Residual defects are resolved, and feedback from alpha, beta, and system testing is incorporated.
MANAGEMENT ARTIFACTS:
• The structure of a WBS depends on product management style, organizational culture, customer preference, financial constraints, and several other project-specific parameters.
• The WBS is the architecture of the project plan. It must encapsulate change and evolve with an appropriate level of detail.
• A WBS is simply a hierarchy of elements that decomposes the project plan into discrete work tasks.
• A WBS provides the following information structure
- A delineation of all significant tasks.
- A clear task decomposition for assignment of responsibilities.
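The hierarchy described above can be sketched as a nested dictionary whose leaves are discrete work tasks; the element names and effort figures below are purely hypothetical, chosen only to show how estimates roll up the tree:

```python
# A hypothetical WBS modeled as a nested dictionary: inner dicts are
# WBS elements, and leaf values are illustrative effort estimates
# (person-days) attached to discrete work tasks.
wbs = {
    "Management": {"Planning": 10, "Tracking": 8},
    "Engineering": {
        "Requirements": 15,
        "Design": 20,
        "Implementation": {"Coding": 30, "Unit test": 12},
    },
}

def rollup(node):
    """Sum leaf estimates bottom-up through the hierarchy."""
    if isinstance(node, dict):
        return sum(rollup(child) for child in node.values())
    return node

print(rollup(wbs))  # 10+8+15+20+30+12 = 95 person-days
```

Each subtree corresponds to an assignable responsibility, which is exactly the "clear task decomposition" the WBS is meant to provide.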