
CHAPTER 8

SOFTWARE ENGINEERING MANAGEMENT

ACRONYM

PMBOK  Guide to the Project Management Body of Knowledge
SQA    Software Quality Assurance

INTRODUCTION

Software Engineering Management can be defined as the application of management activities—planning, coordinating, measuring, monitoring, controlling, and reporting—to ensure that the development and maintenance of software is systematic, disciplined, and quantified (IEEE610.12-90).

The Software Engineering Management KA therefore addresses the management and measurement of software engineering. While measurement is an important aspect of all KAs, it is here that the topic of measurement programs is presented.

While it is true to say that in one sense it should be possible to manage software engineering in the same way as any other (complex) process, there are aspects specific to software products and the software life cycle processes which complicate effective management—just a few of which are as follows:

• The perception of clients is such that there is often a lack of appreciation for the complexity inherent in software engineering, particularly in relation to the impact of changing requirements.

• It is almost inevitable that the software engineering processes themselves will generate the need for new or changed client requirements.

• As a result, software is often built in an iterative process rather than a sequence of closed tasks.

• Software engineering necessarily incorporates aspects of creativity and discipline—maintaining an appropriate balance between the two is often difficult.

• The degree of novelty and complexity of software is often extremely high.

• There is a rapid rate of change in the underlying technology.

With respect to software engineering, management activities occur at three levels: organizational and infrastructure management, project management, and measurement program planning and control. The last two are covered in detail in this KA description. However, this is not to diminish the importance of organizational management issues.

Since the link to the related disciplines—obviously management—is important, it will be described in more detail than in the other KA descriptions. Aspects of organizational management are important in terms of their impact on software engineering—on policy management, for instance: organizational policies and standards provide the framework in which software engineering is undertaken. These policies may need to be influenced by the requirements of effective software development and maintenance, and a number of software engineering-specific policies may need to be established for effective management of software engineering at an organizational level. For example, policies are usually necessary to establish specific organization-wide processes or procedures for such software engineering tasks as designing, implementing, estimating, tracking, and reporting. Such policies are essential to effective long-term software engineering management, by establishing a consistent basis on which to analyze past performance and implement improvements, for example.

Another important aspect of management is personnel management: policies and procedures for hiring, training, and motivating personnel and mentoring for career development are important not only at the project level but also to the longer-term success of an organization. Software engineering personnel may present unique training or personnel management challenges (for example, maintaining currency in a context where the underlying technology undergoes continuous and rapid change).

Communication management is also often mentioned as an overlooked but major aspect of the performance of individuals in a field where precise understanding of user needs and of complex requirements and designs is necessary. Finally, portfolio management, which is the capacity to have an overall vision not only of the set of software under development but also of the software already in use in an organization, is necessary. Furthermore, software reuse is a key factor in maintaining and improving productivity and competitiveness. Effective reuse requires a strategic vision that reflects the unique power and requirements of this technique.

In addition to understanding the aspects of management that are uniquely influenced by software, software engineers must have some knowledge of the more general aspects, even in the first four years after graduation that is targeted in the Guide.



Organizational culture and behavior, and functional enterprise management in terms of procurement, supply chain management, marketing, sales, and distribution, all have an influence, albeit indirectly, on an organization's software engineering process.

Relevant to this KA is the notion of project management, as "the construction of useful software artifacts" is normally managed in the form of (perhaps programs of) individual projects. In this regard, we find extensive support in the Guide to the Project Management Body of Knowledge (PMBOK) (PMI00), which itself includes the following project management KAs: project integration management, project scope management, project time management, project cost management, project quality management, project human resource management, and project communications management. Clearly, all these topics have direct relevance to the Software Engineering Management KA. To attempt to duplicate the content of the Guide to the PMBOK here would be both impossible and inappropriate. Instead, we suggest that the reader interested in project management beyond what is specific to software engineering projects consult the PMBOK itself. Project management is also found in the Related Disciplines of Software Engineering chapter.

The Software Engineering Management KA consists of both the software project management process, in its first five subareas, and software engineering measurement in the last subarea. While these two subjects are often regarded as being separate, and indeed they do possess many unique aspects, their close relationship has led to their combined treatment in this KA. Unfortunately, a common perception of the software industry is that it delivers products late, over budget, and of poor quality and uncertain functionality. Measurement-informed management — an assumed principle of any true engineering discipline — can help to turn this perception around. In essence, management without measurement, qualitative and quantitative, suggests a lack of rigor, and measurement without management suggests a lack of purpose or context. In the same way, however, management and measurement without expert knowledge is equally ineffectual, so we must be careful to avoid over-emphasizing the quantitative aspects of Software Engineering Management (SEM). Effective management requires a combination of both numbers and experience.

The following working definitions are adopted here:

• Management process refers to the activities that are undertaken in order to ensure that the software engineering processes are performed in a manner consistent with the organization's policies, goals, and standards.

• Measurement refers to the assignment of values and labels to aspects of software engineering (products, processes, and resources as defined by [Fen98]) and the models that are derived from them, whether these models are developed using statistical, expert knowledge, or other techniques.

The software engineering project management subareas make extensive use of the software engineering measurement subarea.

Not unexpectedly, this KA is closely related to others in the Guide to the SWEBOK, and reading the following KA descriptions in conjunction with this one would be particularly useful.

• Software Requirements, where some of the activities to be performed during the Initiation and Scope definition phase of the project are described

• Software Configuration Management, as this deals with the identification, control, status accounting, and audit of the software configuration along with software release management and delivery

• Software Engineering Process, because processes and projects are closely related (this KA also describes process and product measurement)

• Software Quality, as quality is constantly a goal of management and is an aim of many activities that must be managed

BREAKDOWN OF TOPICS FOR SOFTWARE ENGINEERING MANAGEMENT

As the Software Engineering Management KA is viewed here as an organizational process which incorporates the notion of process and project management, we have created a breakdown that is both topic-based and life cycle-based. However, the primary basis for the top-level breakdown is the process of managing a software engineering project. There are six major subareas. The first five subareas largely follow the IEEE/EIA 12207 Management Process. The six subareas are:

• Initiation and scope definition, which deals with the decision to initiate a software engineering project

• Software project planning, which addresses the activities undertaken to prepare for successful software engineering from a management perspective

• Software project enactment, which deals with generally accepted software engineering management activities that occur during software engineering

• Review and evaluation, which deal with assurance that the software is satisfactory

• Closure, which addresses the post-completion activities of a software engineering project

• Software engineering measurement, which deals with the effective development and implementation of measurement programs in software engineering organizations (IEEE12207.0-96)

The breakdown of topics for the Software Engineering Management KA is shown in Figure 1.



1. Initiation and Scope Definition

The focus of this set of activities is on the effective determination of software requirements via various elicitation methods and the assessment of the project's feasibility from a variety of standpoints. Once feasibility has been established, the remaining task within this process is the specification of requirements validation and change procedures (see also the Software Requirements KA).

1.1. Determination and Negotiation of Requirements

[Dor02: v2c4; Pfl01: c4; Pre04: c7; Som05: c5]

Software requirement methods for requirements elicitation (for example, observation), analysis (for example, data modeling, use-case modeling), specification, and validation (for example, prototyping) must be selected and applied, taking into account the various stakeholder perspectives. This leads to the determination of project scope, objectives, and constraints. This is always an important activity, as it sets the visible boundaries for the set of tasks being undertaken, and is particularly so where the novelty of the undertaking is high. Additional information can be found in the Software Requirements KA.



1.2. Feasibility Analysis (Technical, Operational, Financial, Social/Political)

[Pre04: c6; Som05: c6]

Software engineers must be assured that adequate capability and resources are available in the form of people, expertise, facilities, infrastructure, and support (either internally or externally) to ensure that the project can be successfully completed in a timely and cost-effective manner (using, for example, a requirement-capability matrix). This often requires some "ballpark" estimation of effort and cost based on appropriate methods (for example, expert-informed analogy techniques).

1.3. Process for the Review and Revision of Requirements

Given the inevitability of change, it is vital that agreement among stakeholders is reached at this early point as to the means by which scope and requirements are to be reviewed and revised (for example, via agreed change management procedures). This clearly implies that scope and requirements will not be "set in stone" but can and should be revisited at predetermined points as the process unfolds (for example, at design reviews, management reviews). If changes are accepted, then some form of traceability analysis and risk analysis (see topic 2.5 Risk Management) should be used to ascertain the impact of those changes. A managed-change approach should also be useful when it comes time to review the outcome of the project, as the scope and requirements should form the basis for the evaluation of success. [Som05: c6] See also the software configuration control subarea of the Software Configuration Management KA.

2. Software Project Planning

The iterative planning process is informed by the scope and requirements and by the establishment of feasibility. At this point, software life cycle processes are evaluated and the most appropriate (given the nature of the project, its degree of novelty, its functional and technical complexity, its quality requirements, and so on) is selected. Where relevant, the project itself is then planned in the form of a hierarchical decomposition of tasks, the associated deliverables of each task are specified and characterized in terms of quality and other attributes in line with stated requirements, and detailed effort, schedule, and cost estimation is undertaken. Resources are then allocated to tasks so as to optimize personnel productivity (at individual, team, and organizational levels), equipment and materials utilization, and adherence to schedule. Detailed risk management is undertaken, and the "risk profile" of the project is discussed among, and accepted by, all relevant stakeholders. Comprehensive software quality management processes are determined as part of the planning process in the form of procedures and responsibilities for software quality assurance, verification and validation, reviews, and audits (see the Software Quality KA). As an iterative process, it is vital that the processes and responsibilities for ongoing plan management, review, and revision are also clearly stated and agreed.

2.1. Process Planning

Selection of the appropriate software life cycle model (for example, spiral, evolutionary prototyping) and the adaptation and deployment of appropriate software life cycle processes are undertaken in light of the particular scope and requirements of the project. Relevant methods and tools are also selected. [Dor02: v1c6,v2c8; Pfl01: c2; Pre04: c2; Rei02: c1,c3,c5; Som05: c3; Tha97: c3] At the project level, appropriate methods and tools are used to decompose the project into tasks, with associated inputs, outputs, and completion conditions (for example, work breakdown structure). [Dor02: v2c7; Pfl01: c3; Pre04: c21; Rei02: c4,c5; Som05: c4; Tha97: c4,c6] This in turn influences decisions on the project's high-level schedule and organization structure.

2.2. Determine Deliverables

The product(s) of each task (for example, architectural design, inspection report) are specified and characterized. [Pfl01: c3; Pre04: c24; Tha97: c4] Opportunities to reuse software components from previous developments or to utilize off-the-shelf software products are evaluated. Use of third parties and procured software is planned, and suppliers are selected.

2.3. Effort, Schedule, and Cost Estimation

Based on the breakdown of tasks, inputs, and outputs, the expected effort range required for each task is determined using a calibrated estimation model based on historical size-effort data where available and relevant, or other methods such as expert judgment. Task dependencies are established, and potential bottlenecks are identified using suitable methods (for example, critical path analysis). Bottlenecks are resolved where possible, and the expected schedule of tasks with projected start times, durations, and end times is produced (for example, a PERT chart). Resource requirements (people, tools) are translated into cost estimates. [Dor02: v2c7; Fen98: c12; Pfl01: c3; Pre04: c23,c24; Rei02: c5,c6; Som05: c4,c23; Tha97: c5] This is a highly iterative activity which must be negotiated and revised until consensus is reached among affected stakeholders (primarily engineering and management).

2.4. Resource Allocation

[Pfl01: c3; Pre04: c24; Rei02: c8,c9; Som05: c4; Tha97: c6,c7]

Equipment, facilities, and people are associated with the scheduled tasks, including the allocation of responsibilities for completion (using, for example, a Gantt chart). This activity is informed and constrained by the availability of resources and their optimal use under these circumstances, as well as by issues relating to personnel (for example, productivity of individuals/teams, team dynamics, organizational and team structures).
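By way of illustration only, the following sketch shows the kind of computation that underlies topics 2.3 and 2.4: a toy size-based effort model standing in for a calibrated estimation model, and a forward pass over task dependencies of the sort used in critical path analysis and PERT charts. The task names, durations, and model coefficients are invented for the example and are not drawn from the Guide or its references.

```python
def estimate_effort(size_kloc, a=2.5, b=1.05):
    """Toy power-law effort model (person-months); in practice a and b are
    calibrated from the organization's historical size-effort data."""
    return a * (size_kloc ** b)

# Hypothetical work breakdown: task -> (estimated duration in weeks, predecessors).
tasks = {
    "requirements":     (3, []),
    "architecture":     (2, ["requirements"]),
    "implementation":   (6, ["architecture"]),
    "test_planning":    (2, ["requirements"]),
    "integration_test": (3, ["implementation", "test_planning"]),
}

def forward_pass(tasks):
    """Earliest start/finish per task: the forward pass of critical path analysis."""
    earliest = {}

    def finish(name):
        if name not in earliest:
            duration, preds = tasks[name]
            start = max((finish(p) for p in preds), default=0)
            earliest[name] = (start, start + duration)
        return earliest[name][1]

    for name in tasks:
        finish(name)
    return earliest

schedule = forward_pass(tasks)
for name, (start, end) in schedule.items():
    print(f"{name:<17} weeks {start:>2}-{end:>2}")
print("Minimum schedule length:", max(end for _, end in schedule.values()), "weeks")
print(f"Illustrative effort for 12 KLOC: {estimate_effort(12):.1f} person-months")
```

A complete plan would also compute latest start and finish times and slack to identify the critical path explicitly, and would translate the resulting resource requirements into cost estimates.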



2.5. Risk Management

Risk identification and analysis (what can go wrong, how and why, and what are the likely consequences), critical risk assessment (which are the most significant risks in terms of exposure, which can we do something about in terms of leverage), risk mitigation and contingency planning (formulating a strategy to deal with risks and to manage the risk profile) are all undertaken. Risk assessment methods (for example, decision trees and process simulations) should be used in order to highlight and evaluate risks. Project abandonment policies should also be determined at this point in discussion with all other stakeholders. [Dor02: v2c7; Pfl01: c3; Pre04: c25; Rei02: c11; Som05: c4; Tha97: c4] Software-unique aspects of risk, such as software engineers' tendency to add unwanted features or the risks attendant in software's intangible nature, must influence the project's risk management.
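As a purely illustrative aid to the terms used above, the sketch below ranks hypothetical risks by exposure (probability multiplied by impact) and estimates the leverage of a mitigation as the exposure reduction obtained per unit of mitigation cost; the risks and numbers are invented and are not prescribed by the Guide.

```python
# Hypothetical risk register (topic 2.5): probability, impact in cost units,
# mitigation cost, and residual probability after mitigation. Illustrative only.
risks = [
    ("requirements churn",  0.6, 100, 15, 0.3),
    ("key staff turnover",  0.2, 200, 10, 0.1),
    ("unproven technology", 0.4,  80, 25, 0.1),
]

def exposure(probability, impact):
    return probability * impact

def leverage(probability, impact, mitigation_cost, residual_probability):
    # Exposure reduction bought per unit of mitigation cost.
    return (exposure(probability, impact)
            - exposure(residual_probability, impact)) / mitigation_cost

print(f"{'risk':<22}{'exposure':>10}{'leverage':>10}")
for name, p, i, cost, rp in sorted(risks, key=lambda r: exposure(r[1], r[2]), reverse=True):
    print(f"{name:<22}{exposure(p, i):>10.1f}{leverage(p, i, cost, rp):>10.2f}")
```

Critical risk assessment would then concentrate attention on the items with the highest exposure, while mitigation budgets favor those with the highest leverage.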
2.6. Quality Management

[Dor02: v1c8,v2c3-c5; Pre04: c26; Rei02: c10; Som05: c24,c25; Tha97: c9,c10]

Quality is defined in terms of pertinent attributes of the specific project and any associated product(s), perhaps in both quantitative and qualitative terms. These quality characteristics will have been determined in the specification of detailed software requirements. See also the Software Requirements KA.

Thresholds for adherence to quality are set for each indicator as appropriate to stakeholder expectations for the software at hand. Procedures relating to ongoing SQA throughout the process and for product (deliverable) verification and validation are also specified at this stage (for example, technical reviews and inspections) (see also the Software Quality KA).

2.7. Plan Management

[Som05: c4; Tha97: c4]

How the project will be managed and how the plan will be managed must also be planned. Reporting, monitoring, and control of the project must fit the selected software engineering process and the realities of the project, and must be reflected in the various artifacts that will be used for managing it. But, in an environment where change is an expectation rather than a shock, it is vital that plans are themselves managed. This requires that adherence to plans be systematically directed, monitored, reviewed, reported, and, where appropriate, revised. Plans associated with other management-oriented support processes (for example, documentation, software configuration management, and problem resolution) also need to be managed in the same manner.

3. Software Project Enactment

The plans are then implemented, and the processes embodied in the plans are enacted. Throughout, there is a focus on adherence to the plans, with an overriding expectation that such adherence will lead to the successful satisfaction of stakeholder requirements and achievement of the project objectives. Fundamental to enactment are the ongoing management activities of measuring, monitoring, controlling, and reporting.

3.1. Implementation of Plans

[Pfl01: c3; Som05: c4]

The project is initiated and the project activities are undertaken according to the schedule. In the process, resources are utilized (for example, personnel effort, funding) and deliverables are produced (for example, architectural design documents, test cases).

3.2. Supplier Contract Management

[Som05: c4]

Prepare and execute agreements with suppliers, monitor supplier performance, and accept supplier products, incorporating them as appropriate.

3.3. Implementation of Measurement Process

[Fen98: c13,c14; Pre04: c22; Rei02: c10,c12; Tha97: c3,c10]

The measurement process is enacted alongside the software project, ensuring that relevant and useful data are collected (see also topics 6.2 Plan the Measurement Process and 6.3 Perform the Measurement Process).

3.4. Monitor Process

[Dor02: v1c8,v2c2-c5,c7; Rei02: c10; Som05: c25; Tha97: c3,c9]

Adherence to the various plans is assessed continually and at predetermined intervals. Outputs and completion conditions for each task are analyzed. Deliverables are evaluated in terms of their required characteristics (for example, via reviews and audits). Effort expenditure, schedule adherence, and costs to date are investigated, and resource usage is examined. The project risk profile is revisited, and adherence to quality requirements is evaluated.

Measurement data are modeled and analyzed. Variance analysis based on the deviation of actual from expected outcomes and values is undertaken. This may be in the form of cost overruns, schedule slippage, and the like. Outlier identification and analysis of quality and other measurement data are performed (for example, defect density analysis). Risk exposure and leverage are recalculated, and decision trees, simulations, and so on are rerun in the light of new data. These activities enable problem detection and exception identification based on exceeded thresholds. Outcomes are reported as needed and certainly where acceptable thresholds are surpassed.
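The following sketch is a minimal, hypothetical illustration of the variance analysis and threshold-based exception identification described in topic 3.4; the planned values, actuals, and thresholds are invented and would in practice come from the project plan and the measurement process.

```python
# Hypothetical monitoring snapshot (topic 3.4): planned versus actual values and
# simple relative-deviation thresholds; all figures are invented.
plan   = {"effort_pm": 40.0, "cost_k": 320.0, "elapsed_weeks": 10.0, "defects_per_kloc": 2.0}
actual = {"effort_pm": 47.0, "cost_k": 330.0, "elapsed_weeks": 12.0, "defects_per_kloc": 3.1}

# Acceptable relative deviation per indicator before an exception is raised.
thresholds = {"effort_pm": 0.10, "cost_k": 0.10, "elapsed_weeks": 0.15, "defects_per_kloc": 0.25}

for indicator, planned in plan.items():
    relative = (actual[indicator] - planned) / planned
    status = "EXCEPTION" if abs(relative) > thresholds[indicator] else "ok"
    print(f"{indicator:<18} planned={planned:<7} actual={actual[indicator]:<7} "
          f"deviation={relative:+.0%}  {status}")
```

Exceptions flagged in this way feed the control process described in topic 3.5, where corrective action, contingencies, plan revision, or, in extreme cases, project abandonment are decided.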
3.5. Control Process

[Dor02: v2c7; Rei02: c10; Tha97: c3,c9]

The outcomes of the process monitoring activities provide the basis on which action decisions are taken. Where appropriate, and where the impact and associated risks are modeled and managed, changes can be made to the project. This may take the form of corrective action (for example, retesting certain components), it may involve the incorporation of contingencies so that similar occurrences are avoided (for example, the decision to use prototyping to assist in software requirements validation), and/or it may entail the revision of the various plans and other project documents (for example, requirements specification) to accommodate the unexpected outcomes and their implications.

In some instances, it may lead to abandonment of the project. In all cases, change control and software configuration management procedures are adhered to (see also the Software Configuration Management KA), decisions are documented and communicated to all relevant parties, plans are revisited and revised where necessary, and relevant data are recorded in the central database (see also topic 6.3 Perform the Measurement Process).

3.6. Reporting

[Rei02: c10; Tha97: c3,c10]

At specified and agreed periods, adherence to the plans is reported, both within the organization (for example, to the project portfolio steering committee) and to external stakeholders (for example, clients, users). Reports of this nature should focus on overall adherence as opposed to the detailed reporting required frequently within the project team.

4. Review and Evaluation

At critical points in the project, overall progress towards achievement of the stated objectives and satisfaction of stakeholder requirements are evaluated. Similarly, assessments of the effectiveness of the overall process to date, the personnel involved, and the tools and methods employed are also undertaken at particular milestones.

4.1. Determining Satisfaction of Requirements

[Rei02: c10; Tha97: c3,c10]

Since attaining stakeholder (user and customer) satisfaction is one of our principal aims, it is important that progress towards this aim be formally and periodically assessed. This occurs on achievement of major project milestones (for example, confirmation of software design architecture, software integration technical review). Variances from expectations are identified and appropriate action is taken. As in the control process activity above (see topic 3.5 Control Process), in all cases change control and software configuration management procedures are adhered to (see the Software Configuration Management KA), decisions are documented and communicated to all relevant parties, plans are revisited and revised where necessary, and relevant data are recorded in the central database (see also topic 6.3 Perform the Measurement Process). More information can also be found in the Software Testing KA, in topic 2.2 Objectives of Testing, and in the Software Quality KA, in topic 2.3 Reviews and Audits.

4.2. Reviewing and Evaluating Performance

[Dor02: v1c8,v2c3,c5; Pfl01: c8,c9; Rei02: c10; Tha97: c3,c10]

Periodic performance reviews for project personnel provide insights as to the likelihood of adherence to plans as well as possible areas of difficulty (for example, team member conflicts). The various methods, tools, and techniques employed are evaluated for their effectiveness and appropriateness, and the process itself is systematically and periodically assessed for its relevance, utility, and efficacy in the project context. Where appropriate, changes are made and managed.

5. Closure

The project reaches closure when all the plans and embodied processes have been enacted and completed. At this stage, the criteria for project success are revisited. Once closure is established, archival, post mortem, and process improvement activities are performed.

5.1. Determining Closure

[Dor02: v1c8,v2c3,c5; Rei02: c10; Tha97: c3,c10]

The tasks as specified in the plans are complete, and satisfactory achievement of completion criteria is confirmed. All planned products have been delivered with acceptable characteristics. Requirements are checked off and confirmed as satisfied, and the objectives of the project have been achieved. These processes generally involve all stakeholders and result in the documentation of client acceptance and any remaining known problem reports.

5.2. Closure Activities

[Pfl01: c12; Som05: c4]

After closure has been confirmed, archival of project materials takes place in line with stakeholder-agreed methods, location, and duration. The organization's measurement database is updated with final project data, and post-project analyses are undertaken. A project post mortem is undertaken so that issues, problems, and opportunities encountered during the process (particularly via review and evaluation, see subarea 4 Review and Evaluation) are analyzed, and lessons are drawn from the process and fed into organizational learning and improvement endeavors (see also the Software Engineering Process KA).

6. Software Engineering Measurement

[ISO15939-02]

The importance of measurement and its role in better management practices is widely acknowledged, and so its importance can only increase in the coming years. Effective measurement has become one of the cornerstones of organizational maturity.



Key terms on software measures and measurement methods have been defined in [ISO15939-02] on the basis of the ISO international vocabulary of metrology [ISO93]. Nevertheless, readers will encounter terminology differences in the literature; for example, the term "metrics" is sometimes used in place of "measures."

This topic follows the international standard ISO/IEC 15939, which describes a process which defines the activities and tasks necessary to implement a software measurement process and includes, as well, a measurement information model.

6.1. Establish and Sustain Measurement Commitment

• Accept requirements for measurement. Each measurement endeavor should be guided by organizational objectives and driven by a set of measurement requirements established by the organization and the project. For example, an organizational objective might be "first-to-market with new products." [Fen98: c3,c13; Pre04: c22] This in turn might engender a requirement that factors contributing to this objective be measured so that projects might be managed to meet this objective.

  - Define scope of measurement. The organizational unit to which each measurement requirement is to be applied must be established. This may consist of a functional area, a single project, a single site, or even the whole enterprise. All subsequent measurement tasks related to this requirement should be within the defined scope. In addition, the stakeholders should be identified.

  - Commitment of management and staff to measurement. The commitment must be formally established, communicated, and supported by resources (see next item).

• Commit resources for measurement. The organization's commitment to measurement is an essential factor for success, as evidenced by assignment of resources for implementing the measurement process. Assigning resources includes allocation of responsibility for the various tasks of the measurement process (such as user, analyst, and librarian) and providing adequate funding, training, tools, and support to conduct the process in an enduring fashion.
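A small, hypothetical sketch may help connect the commitment activities above with the planning activities in topic 6.2 below: an organizational objective gives rise to an information need, and candidate measures are screened using simple selection criteria (strength of the link to the need, cost of collection, and process disruption). The objective, measures, weights, and scores are invented examples, not requirements of ISO/IEC 15939.

```python
# Hypothetical traceability from an organizational objective to an information
# need, then screening of candidate measures (topics 6.1 and 6.2).
objective = "first-to-market with new products"
information_need = "How long does it take to deliver a release, and where is time lost?"

# candidate measure -> (link to the need, cost of collection, process disruption), each 0..1
candidates = {
    "cycle time (request to release)": (0.9, 0.2, 0.1),
    "schedule variance per milestone": (0.7, 0.3, 0.2),
    "review and rework effort":        (0.5, 0.7, 0.6),
}

def score(link, cost, disruption):
    # Favor a strong link to the information need; penalize cost and disruption.
    return link - 0.5 * (cost + disruption)

print("Objective:", objective)
print("Information need:", information_need)
for measure, criteria in candidates.items():
    s = score(*criteria)
    print(f"  {measure:<33} score={s:+.2f} {'selected' if s > 0.3 else 'rejected'}")
```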
6.2. Plan the Measurement Process

• Characterize the organizational unit. The organizational unit provides the context for measurement, so it is important to make this context explicit and to articulate the assumptions that it embodies and the constraints that it imposes. Characterization can be in terms of organizational processes, application domains, technology, and organizational interfaces. An organizational process model is also typically an element of the organizational unit characterization [ISO15939-02: 5.2.1].

• Identify information needs. Information needs are based on the goals, constraints, risks, and problems of the organizational unit. They may be derived from business, organizational, regulatory, and/or product objectives. They must be identified and prioritized. Then, a subset to be addressed must be selected and the results documented, communicated, and reviewed by stakeholders [ISO15939-02: 5.2.2].

• Select measures. Candidate measures must be selected, with clear links to the information needs. Measures must then be selected based on the priorities of the information needs and other criteria such as cost of collection, degree of process disruption during collection, ease of analysis, ease of obtaining accurate, consistent data, and so on [ISO15939-02: 5.2.3 and Appendix C].

• Define data collection, analysis, and reporting procedures. This encompasses collection procedures and schedules, storage, verification, analysis, reporting, and configuration management of data [ISO15939-02: 5.2.4].

• Define criteria for evaluating the information products. Criteria for evaluation are influenced by the technical and business objectives of the organizational unit. Information products include those associated with the product being produced, as well as those associated with the processes being used to manage and measure the project [ISO15939-02: 5.2.5 and Appendices D, E].

• Review, approve, and provide resources for measurement tasks.

  - The measurement plan must be reviewed and approved by the appropriate stakeholders. This includes all data collection procedures, storage, analysis, and reporting procedures; evaluation criteria; schedules; and responsibilities. Criteria for reviewing these artifacts should have been established at the organizational unit level or higher and should be used as the basis for these reviews. Such criteria should take into consideration previous experience, availability of resources, and potential disruptions to projects when changes from current practices are proposed. Approval demonstrates commitment to the measurement process [ISO15939-02: 5.2.6.1 and Appendix F].

  - Resources should be made available for implementing the planned and approved measurement tasks. Resource availability may be staged in cases where changes are to be piloted before widespread deployment. Consideration should be paid to the resources necessary for successful deployment of new procedures or measures [ISO15939-02: 5.2.6.2].



• Acquire and deploy supporting technologies. This includes evaluation of available supporting technologies, selection of the most appropriate technologies, acquisition of those technologies, and deployment of those technologies [ISO15939-02: 5.2.7].

6.3. Perform the Measurement Process

• Integrate measurement procedures with relevant processes. The measurement procedures, such as data collection, must be integrated into the processes they are measuring. This may involve changing current processes to accommodate data collection or generation activities. It may also involve analysis of current processes to minimize additional effort and evaluation of the effect on employees to ensure that the measurement procedures will be accepted. Morale issues and other human factors need to be considered. In addition, the measurement procedures must be communicated to those providing the data, training may need to be provided, and support must typically be provided. Data analysis and reporting procedures must typically be integrated into organizational and/or project processes in a similar manner [ISO15939-02: 5.3.1].

• Collect data. The data must be collected, verified, and stored [ISO15939-02: 5.3.2].

• Analyze data and develop information products. Data may be aggregated, transformed, or recoded as part of the analysis process, using a degree of rigor appropriate to the nature of the data and the information needs. The results of this analysis are typically indicators such as graphs, numbers, or other indications that must be interpreted, resulting in initial conclusions to be presented to stakeholders. The results and conclusions must be reviewed, using a process defined by the organization (which may be formal or informal). Data providers and measurement users should participate in reviewing the data to ensure that they are meaningful and accurate, and that they can result in reasonable actions [ISO15939-02: 5.3.3 and Appendix G].

• Communicate results. Information products must be documented and communicated to users and stakeholders [ISO15939-02: 5.3.4].

6.4. Evaluate Measurement

• Evaluate information products. Evaluate information products against specified evaluation criteria and determine strengths and weaknesses of the information products. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].

• Evaluate the measurement process. Evaluate the measurement process against specified evaluation criteria and determine the strengths and weaknesses of the process. This may be performed by an internal process or an external audit and should include feedback from measurement users. Record lessons learned in an appropriate database [ISO15939-02: 5.4.1 and Appendix D].

• Identify potential improvements. Such improvements may be changes in the format of indicators, changes in units measured, or reclassification of categories. Determine the costs and benefits of potential improvements and select appropriate improvement actions. Communicate proposed improvements to the measurement process owner and stakeholders for review and approval. Also communicate lack of potential improvements if the analysis fails to identify improvements [ISO15939-02: 5.4.2].
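As a closing illustration of the "analyze data and develop information products" activity in topic 6.3, the sketch below aggregates hypothetical inspection data into a defect-density indicator and interprets it against an equally hypothetical evaluation criterion before it would be reported to stakeholders; none of the figures come from the standard.

```python
# Hypothetical raw data (topic 6.3): defects found in inspections and component
# sizes, aggregated into a defect-density indicator and interpreted against an
# invented evaluation criterion.
inspections = [
    ("parser",     14, 3.5),   # (component, defects found, size in KLOC)
    ("scheduler",   6, 2.0),
    ("reporting",  21, 4.2),
]

total_defects = sum(defects for _, defects, _ in inspections)
total_kloc = sum(size for _, _, size in inspections)
defect_density = total_defects / total_kloc          # indicator: defects per KLOC

criterion = 3.0  # hypothetical organizational threshold, defects per KLOC
verdict = "within expectations" if defect_density <= criterion else "above threshold"

print(f"Overall defect density: {defect_density:.2f} defects/KLOC ({verdict})")
for name, defects, size in sorted(inspections, key=lambda r: r[1] / r[2], reverse=True):
    print(f"  {name:<10} {defects / size:.2f} defects/KLOC")
```

Whether such an indicator actually serves its information need is then assessed under topic 6.4.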



MATRIX OF TOPICS VS. REFERENCE MATERIAL

1. Initiation and Scope Definition
  1.1 Determination and negotiation of requirements: [Dor02: v2c4; Pfl01: c4; Pre04: c7; Som05: c5]
  1.2 Feasibility analysis: [Pre04: c6; Som05: c6]
  1.3 Process for the review and revision of requirements: [Som05: c6]

2. Software Project Planning
  2.1 Process planning: [Dor02: v1c6,v2c7,v2c8; Pfl01: c2,c3; Pre04: c2,c21; Rei02: c1,c3,c5; Som05: c3,c4; Tha97: c3,c4,c6]
  2.2 Determine deliverables: [Pfl01: c3; Pre04: c24; Tha97: c4]
  2.3 Effort, schedule, and cost estimation: [Dor02: v2c7; Fen98: c12; Pfl01: c3; Pre04: c23,c24; Rei02: c5,c6; Som05: c4,c23; Tha97: c5]
  2.4 Resource allocation: [Pfl01: c3; Pre04: c24; Rei02: c8,c9; Som05: c4; Tha97: c6,c7]
  2.5 Risk management: [Dor02: v2c7; Pfl01: c3; Pre04: c25; Rei02: c11; Som05: c4; Tha97: c4]
  2.6 Quality management: [Dor02: v1c8,v2c3-c5; Pre04: c26; Rei02: c10; Som05: c24,c25; Tha97: c9,c10]
  2.7 Plan management: [Som05: c4; Tha97: c4]

3. Software Project Enactment
  3.1 Implementation of plans: [Pfl01: c3; Som05: c4]
  3.2 Supplier contract management: [Som05: c4]
  3.3 Implementation of measurement process: [Fen98: c13,c14; Pre04: c22; Rei02: c10,c12; Tha97: c3,c10]
  3.4 Monitor process: [Dor02: v1c8,v2c2-c5,c7; Rei02: c10; Som05: c25; Tha97: c3,c9]
  3.5 Control process: [Dor02: v2c7; Rei02: c10; Tha97: c3,c9]
  3.6 Reporting: [Rei02: c10; Tha97: c3,c10]

4. Review and Evaluation
  4.1 Determining satisfaction of requirements: [Rei02: c10; Tha97: c3,c10]
  4.2 Reviewing and evaluating performance: [Dor02: v1c8,v2c3,c5; Pfl01: c8,c9; Rei02: c10; Tha97: c3,c10]

5. Closure
  5.1 Determining closure: [Dor02: v1c8,v2c3,c5; Rei02: c10; Tha97: c3,c10]
  5.2 Closure activities: [Pfl01: c12; Som05: c4]

6. Software Engineering Measurement (subarea as a whole: [ISO15939-02])
  6.1 Establish and sustain measurement commitment: [Fen98: c3,c13; Pre04: c22]
  6.2 Plan the measurement process: [ISO15939-02: c5 and Appendices C, D, E, F]
  6.3 Perform the measurement process: [ISO15939-02: c5 and Appendix G]
  6.4 Evaluate measurement: [ISO15939-02: c5 and Appendix D]



RECOMMENDED REFERENCES FOR SOFTWARE ENGINEERING MANAGEMENT

[Dor02] M. Dorfman and R.H. Thayer, eds., Software Engineering, IEEE Computer Society Press, 2002, Vol. 1, Chap. 6, 8; Vol. 2, Chap. 3, 4, 5, 7, 8.
[Fen98] N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998, Chap. 1-14.
[ISO15939-02] ISO/IEC 15939:2002, Software Engineering — Software Measurement Process, ISO and IEC, 2002.
[Pfl01] S.L. Pfleeger, Software Engineering: Theory and Practice, second ed., Prentice Hall, 2001, Chap. 2-4, 8, 9, 12, 13.
[Pre04] R.S. Pressman, Software Engineering: A Practitioner's Approach, sixth ed., McGraw-Hill, 2004, Chap. 2, 6, 7, 22-26.
[Rei02] D.J. Reifer, ed., Software Management, IEEE Computer Society Press, 2002, Chap. 1-6, 7-12, 13.
[Som05] I. Sommerville, Software Engineering, seventh ed., Addison-Wesley, 2005, Chap. 3-6, 23-25.
[Tha97] R.H. Thayer, ed., Software Engineering Project Management, IEEE Computer Society Press, 1997, Chap. 1-10.



APPENDIX A. LIST OF FURTHER READINGS

(Adl99) T.R. Adler, J.G. Leonard, and R.K. Nordgren, "Improving Risk Management: Moving from Risk Elimination to Risk Avoidance," Information and Software Technology, vol. 41, 1999, pp. 29-34.
(Bai98) R. Baines, "Across Disciplines: Risk, Design, Method, Process, and Tools," IEEE Software, July/August 1998, pp. 61-64.
(Bin97) R.V. Binder, "Can a Manufacturing Quality Model Work for Software?" IEEE Software, September/October 1997, pp. 101-102, 105.
(Boe97) B.W. Boehm and T. DeMarco, "Software Risk Management," IEEE Software, May/June 1997, pp. 17-19.
(Bri96) L.C. Briand, S. Morasca, and V.R. Basili, "Property-Based Software Engineering Measurement," IEEE Transactions on Software Engineering, vol. 22, iss. 1, 1996, pp. 68-86.
(Bri96a) L. Briand, K.E. Emam, and S. Morasca, "On the Application of Measurement Theory in Software Engineering," Empirical Software Engineering, vol. 1, 1996, pp. 61-88.
(Bri97) L.C. Briand, S. Morasca, and V.R. Basili, "Response to: Comments on 'Property-Based Software Engineering Measurement: Refining the Additivity Properties,'" IEEE Transactions on Software Engineering, vol. 23, iss. 3, 1997, pp. 196-197.
(Bro87) F.P.J. Brooks, "No Silver Bullet: Essence and Accidents of Software Engineering," Computer, Apr. 1987, pp. 10-19.
(Cap96) J. Capers, Applied Software Measurement: Assuring Productivity and Quality, second ed., McGraw-Hill, 1996.
(Car97) M.J. Carr, "Risk Management May Not Be for Everyone," IEEE Software, May/June 1997, pp. 21-24.
(Cha96) R.N. Charette, "Large-Scale Project Management Is Risk Management," IEEE Software, July 1996, pp. 110-117.
(Cha97) R.N. Charette, K.M. Adams, and M.B. White, "Managing Risk in Software Maintenance," IEEE Software, May/June 1997, pp. 43-50.
(Col96) B. Collier, T. DeMarco, and P. Fearey, "A Defined Process for Project Postmortem Review," IEEE Software, July 1996, pp. 65-72.
(Con97) E.H. Conrow and P.S. Shishido, "Implementing Risk Management on Software Intensive Projects," IEEE Software, May/June 1997, pp. 83-89.
(Dav98) A.M. Davis, "Predictions and Farewells," IEEE Software, July/August 1998, pp. 6-9.
(Dem87) T. DeMarco and T. Lister, Peopleware: Productive Projects and Teams, Dorset House Publishing, 1987.
(Dem96) T. DeMarco and A. Miller, "Managing Large Software Projects," IEEE Software, July 1996, pp. 24-27.
(Fav98) J. Favaro and S.L. Pfleeger, "Making Software Development Investment Decisions," ACM SIGSoft Software Engineering Notes, vol. 23, iss. 5, 1998, pp. 69-74.
(Fay96) M.E. Fayad and M. Cline, "Managing Object-Oriented Software Development," Computer, September 1996, pp. 26-31.
(Fen98) N.E. Fenton and S.L. Pfleeger, Software Metrics: A Rigorous & Practical Approach, second ed., International Thomson Computer Press, 1998.
(Fle99) R. Fleming, "A Fresh Perspective on Old Problems," IEEE Software, January/February 1999, pp. 106-113.
(Fug98) A. Fuggetta et al., "Applying GQM in an Industrial Software Factory," ACM Transactions on Software Engineering and Methodology, vol. 7, iss. 4, 1998, pp. 411-448.
(Gar97) P.R. Garvey, D.J. Phair, and J.A. Wilson, "An Information Architecture for Risk Assessment and Management," IEEE Software, May/June 1997, pp. 25-34.
(Gem97) A. Gemmer, "Risk Management: Moving beyond Process," Computer, May 1997, pp. 33-43.
(Gla97) R.L. Glass, "The Ups and Downs of Programmer Stress," Communications of the ACM, vol. 40, iss. 4, 1997, pp. 17-19.
(Gla98) R.L. Glass, "Short-Term and Long-Term Remedies for Runaway Projects," Communications of the ACM, vol. 41, iss. 7, 1998, pp. 13-15.
(Gla98a) R.L. Glass, "How Not to Prepare for a Consulting Assignment, and Other Ugly Consultancy Truths," Communications of the ACM, vol. 41, iss. 12, 1998, pp. 11-13.
(Gla99) R.L. Glass, "The Realities of Software Technology Payoffs," Communications of the ACM, vol. 42, iss. 2, 1999, pp. 74-79.
(Gra99) R. Grable et al., "Metrics for Small Projects: Experiences at the SED," IEEE Software, March/April 1999, pp. 21-29.
(Gra87) R.B. Grady and D.L. Caswell, Software Metrics: Establishing a Company-Wide Program, Prentice Hall, 1987.
(Hal97) T. Hall and N. Fenton, "Implementing Effective Software Metrics Programs," IEEE Software, March/April 1997, pp. 55-64.
(Hen99) S.M. Henry and K.T. Stevens, "Using Belbin's Leadership Role to Improve Team Effectiveness: An Empirical Investigation," Journal of Systems and Software, vol. 44, 1999, pp. 241-250.
(Hoh99) L. Hohmann, "Coaching the Rookie Manager," IEEE Software, January/February 1999, pp. 16-19.
(Hsi96) P. Hsia, "Making Software Development Visible," IEEE Software, March 1996, pp. 23-26.
(Hum97) W.S. Humphrey, Managing Technical People: Innovation, Teamwork, and the Software Process, Addison-Wesley, 1997.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/IEC 12207:1995, Industry Implementation of Int. Std. ISO/IEC 12207:95, Standard for Information Technology — Software Life Cycle Processes, IEEE, 1996.
(Jac98) M. Jackman, "Homeopathic Remedies for Team Toxicity," IEEE Software, July/August 1998, pp. 43-45.
(Kan97) K. Kansala, "Integrating Risk Assessment with Cost Estimation," IEEE Software, May/June 1997, pp. 61-67.
(Kar97) J. Karlsson and K. Ryan, "A Cost-Value Approach for Prioritizing Requirements," IEEE Software, September/October 1997, pp. 67-74.
(Kar96) D.W. Karolak, Software Engineering Risk Management, IEEE Computer Society Press, 1996.
(Kau99) K. Kautz, "Making Sense of Measurement for Small Organizations," IEEE Software, March/April 1999, pp. 14-20.
(Kei98) M. Keil et al., "A Framework for Identifying Software Project Risks," Communications of the ACM, vol. 41, iss. 11, 1998, pp. 76-83.
(Ker99) B. Kernighan and R. Pike, "Finding Performance Improvements," IEEE Software, March/April 1999, pp. 61-65.
(Kit97) B. Kitchenham and S. Linkman, "Estimates, Uncertainty, and Risk," IEEE Software, May/June 1997, pp. 69-74.
(Lat98) F. v. Latum et al., "Adopting GQM-Based Measurement in an Industrial Environment," IEEE Software, January/February 1998, pp. 78-86.
(Leu96) H.K.N. Leung, "A Risk Index for Software Producers," Software Maintenance: Research and Practice, vol. 8, 1996, pp. 281-294.
(Lis97) T. Lister, "Risk Management Is Project Management for Adults," IEEE Software, May/June 1997, pp. 20-22.
(Mac96) K. Mackey, "Why Bad Things Happen to Good Projects," IEEE Software, May 1996, pp. 27-32.
(Mac98) K. Mackey, "Beyond Dilbert: Creating Cultures that Work," IEEE Software, January/February 1998, pp. 48-49.
(Mad97) R.J. Madachy, "Heuristic Risk Assessment Using Cost Factors," IEEE Software, May/June 1997, pp. 51-59.
(McC96) S.C. McConnell, Rapid Development: Taming Wild Software Schedules, Microsoft Press, 1996.
(McC97) S.C. McConnell, Software Project Survival Guide, Microsoft Press, 1997.
(McC99) S.C. McConnell, "Software Engineering Principles," IEEE Software, March/April 1999, pp. 6-8.
(Moy97) T. Moynihan, "How Experienced Project Managers Assess Risk," IEEE Software, May/June 1997, pp. 35-41.
(Nes98) P. Nesi, "Managing OO Projects Better," IEEE Software, July/August 1998, pp. 50-60.
(Nol99) A.J. Nolan, "Learning From Success," IEEE Software, January/February 1999, pp. 97-105.
(Off97) R.J. Offen and R. Jeffery, "Establishing Software Measurement Programs," IEEE Software, March/April 1997, pp. 45-53.
(Par96) K.V.C. Parris, "Implementing Accountability," IEEE Software, July/August 1996, pp. 83-93.
(Pfl97) S.L. Pfleeger, "Assessing Measurement (Guest Editor's Introduction)," IEEE Software, March/April 1997, pp. 25-26.
(Pfl97a) S.L. Pfleeger et al., "Status Report on Software Measurement," IEEE Software, March/April 1997, pp. 33-43.
(Put97) L.H. Putnam and W. Myers, Industrial Strength Software — Effective Management Using Measurement, IEEE Computer Society Press, 1997.
(Rob99) P.N. Robillard, "The Role of Knowledge in Software Development," Communications of the ACM, vol. 42, iss. 1, 1999, pp. 87-92.
(Rod97) A.G. Rodrigues and T.M. Williams, "System Dynamics in Software Project Management: Towards the Development of a Formal Integrated Framework," European Journal of Information Systems, vol. 6, 1997, pp. 51-66.
(Rop97) J. Ropponen and K. Lyytinen, "Can Software Risk Management Improve System Development: An Exploratory Study," European Journal of Information Systems, vol. 6, 1997, pp. 41-50.
(Sch99) C. Schmidt et al., "Disincentives for Communicating Risk: A Risk Paradox," Information and Software Technology, vol. 41, 1999, pp. 403-411.
(Sco92) R.L. v. Scoy, "Software Development Risk: Opportunity, Not Problem," Software Engineering Institute, Carnegie Mellon University, CMU/SEI-92-TR-30, 1992.
(Sla98) S.A. Slaughter, D.E. Harter, and M.S. Krishnan, "Evaluating the Cost of Software Quality," Communications of the ACM, vol. 41, iss. 8, 1998, pp. 67-73.
(Sol98) R. v. Solingen, R. Berghout, and F. v. Latum, "Interrupts: Just a Minute Never Is," IEEE Software, September/October 1998, pp. 97-103.
(Whi95) N. Whitten, Managing Software Development Projects: Formulas for Success, Wiley, 1995.
(Wil99) B. Wiley, Essential System Requirements: A Practical Guide to Event-Driven Methods, Addison-Wesley, 1999.
(Zel98) M.V. Zelkowitz and D.R. Wallace, "Experimental Models for Validating Technology," Computer, vol. 31, iss. 5, 1998, pp. 23-31.



APPENDIX B. LIST OF STANDARDS
(IEEE610.12-90) IEEE Std 610.12-1990 (R2002), IEEE
Standard Glossary of Software Engineering Terminology,
IEEE, 1990.
(IEEE12207.0-96) IEEE/EIA 12207.0-1996//ISO/
IEC12207:1995, Industry Implementation of Int. Std.
ISO/IEC 12207:95, Standard for Information Technology-
Software Life Cycle Processes, IEEE, 1996.
(ISO15939-02) ISO/IEC 15939:2002, Software
Engineering-Software Measurement Process, ISO and
IEC, 2002.
(PMI00) Project Management Institute Standards
Committee, A Guide to the Project Management Body of
Knowledge (PMBOK), Project Management Institute,
2000.

