Software Testing Concepts: Quality
J.Srinivasa Reddy
Quality
Quality is the degree to which a program possesses a desired combination of attributes that enable it to perform its specified end use. Quality is defined as conformance to the specifications of the customer; in other words, the satisfaction of all the requirements in the product that produces customer satisfaction. Alternatively, quality is defined as not only the presence of the requirements or the absence of defects, but also the presence of value.
What is software quality? Quality software is reasonably bug free, delivered on time, within budget, maintainable, and meets the requirements. An end user might define quality as user friendly and bug free.
Why is quality required? To satisfy the customer. To compete in the market. To retain customers. To get new customers. To reduce maintenance cost. To maintain consistency. To maintain the schedule.
Software Testing
Testing involves operating an application under specified conditions with the intention of finding bugs or evaluating results. Operating an application: giving inputs and actions to the application. Specified conditions: hardware configuration, operating systems, types of browser. Testing is a process to verify the software to detect defects or problem areas. Testing evaluates the product against specifications and requirements. Testing is applicable to all stages of product development.
Why do we need to test software? Defects can exist in the software, as it is developed by humans, who can make mistakes during development. However, it is the primary duty of the software vendor to ensure that the delivered software does not have defects and that the customer's day-to-day operations do not get affected; this can be achieved by regularly testing the software. The main aim of testing is to detect problems in the early stages of product/project development.
Why is software tested? To discover defects. To avoid users detecting problems. To prove that the software has no faults. To learn about the reliability of the software. To ensure that the product works as the user expects. To stay in business. To detect defects early, which helps in reducing the cost of defect fixing.
Why does software have bugs? Miscommunication or no communication, software complexity, changing requirements, programming errors, time pressures, poorly documented code, software development tools, and egos of developers.
How can bugs be reduced? Solid requirements: clear, complete, detailed, testable requirements. Realistic schedule: allow sufficient time for planning, designing, testing, bug fixing, re-testing, changes, and documentation. Adequate testing. Stick to the initial requirements. Communication.
Software Quality Assurance
SQA is defined as a planned and systematic approach to the evaluation of quality. SQA involves the entire software development process: monitoring and improving the process, making sure any agreed-upon standards and procedures are followed, and ensuring that problems are found and dealt with.
Activities of the SQA group: identifying the standards that are applicable for the project; preparation of process/procedural documents and templates; tracking deviation from the process and informing management of any non-conformance; reviewing the test reports and summarizing the test results.
Quality Control
Testing is the process of creating, implementing and evaluating tests. Testing measures software quality. Testing can find faults; when they are removed, software quality is improved. Testing involves operation of a system or application under controlled conditions and evaluating the results. The controlled conditions should include both normal and abnormal conditions. Testing should intentionally attempt to make things go wrong, to determine if things happen when they shouldn't or things don't happen when they should. It is oriented to detection.
What is Verification & Validation?
Verification: Are we building the product right?
Validation: Are we building the right product?
Verification typically involves reviews and meetings to evaluate documents, plans, code, requirements, and specifications. This can be done with checklists, issues lists, walkthroughs, and inspection meetings. Examples: requirement review, design review, code walkthrough, code inspection.
Validation typically involves actual testing. It looks at system correctness, i.e. it is the process of checking that what has been specified is what the user actually wanted. Examples: unit testing, integration testing, system testing, functional testing and structural testing, user acceptance testing.
Verification is un-conventional testing, typically done by the QA team; un-conventional testing is associated with the process. Validation is nothing but conventional testing, typically done by the QC team; conventional testing is associated with the product.
Product right: whether the product is being developed in the right way or not is un-conventional testing. Right product: whether the outcome of the project is the right product or not is conventional testing.
Quality Control deals with specific products and services, focuses on inspection, testing and removal of defects, is the responsibility of the worker, and is concerned with specific products.
Quality Factors
We can define software quality in several ways that could, ideally, all tie together. Some views of quality are:
Satisfaction level: the degree to which a software product meets a user's needs and expectations.
A software product's value relative to its various stakeholders and its competition.
The extent to which a software product exhibits desired properties.
The degree to which a software product works correctly in the environment it was designed for, without deviating from expected behavior.
The effectiveness and correctness of the process employed in developing the software product.
Quality means meeting requirements and customer needs. Quality means defect-free products (fit for use). Quality is an attribute of a product; services are a form of product. Management is responsible for quality. The producer must be involved in quality control. Quality is a journey, not a destination. The objective of quality is continuous improvement.
Preliminary Investigation:
When the request is made, the first systems activity, the preliminary investigation, begins. This activity has three parts: 1. Request clarification. 2. Feasibility study. 3. Request approval.
Request clarification: Many requests from employees and users in the organization are not clearly defined. Therefore, it becomes necessary that the project request be examined and clarified properly before considering a systems investigation.
Feasibility study: An important outcome of the preliminary investigation is the determination of whether the system requested is feasible. There are three aspects in the feasibility study portion of the preliminary investigation.
i. Technical feasibility: Can the work for the project be done with the current equipment, existing software technology and available personnel? If new technology is needed, what is the likelihood that it can be developed?
ii. Economic feasibility: Are there sufficient benefits in creating the system to make the cost acceptable? Or are the costs of not creating the system so great that it is advisable to undertake the project?
iii. Operational feasibility: Will the system be used if it is developed and implemented? Will there be resistance from users that will undermine the possible application benefits?
The feasibility study is carried out by a small group of people who are familiar with information system techniques, understand the parts of the business or organization that will be involved or affected by the project, and are skilled in the system analysis and design process.
The analyst conducts an initial study of the problem and asks whether a solution is feasible.
Request approval: Not all requested projects are desirable or feasible. Management decides which projects are most urgent and schedules them accordingly. After a project request is approved, its cost, priority, completion time, and personnel requirements are estimated and used to determine where to add it to the existing projects list. Later on, when the other projects have been completed, the proposed application development can be initiated.
Software Development Life Cycle (SDLC): all the stages, from start to finish, that take place when developing new software.
Analysis: What exactly is the system supposed to do? Determine and list out the details of the problem.
Design: How will the system solve the problem?
Coding
Testing: Does the system solve the problem? Have the requirements been satisfied? Does the system work properly in all situations?
Maintenance: Bug fixes.
The software life cycle is a description of the events that occur between the birth and death of a software project inclusively. SDLC is separated into phases. SDLC also determines the order of the phases, and the criteria for transitioning from phase to phase.
System Analysis:
System analysis and design is the process of investigating a business with a view to determining how best to manage the various procedures and information processing tasks it involves. The system analyst performs the investigation and might recommend the use of a computer to improve the efficiency of the information system being investigated. System analysis is done with the intention of determining how well a business copes with its current information processing needs, and whether it is possible to improve the procedures in order to make it more efficient or profitable.
The System Analysis Report consists of:
BRS: Business Requirement Specification
FRS: Functional Requirement Specification
SRS: System Requirement Specification
Use Cases: user actions and system responses
Note: The FRS contains inputs, outputs and process, but no fixed format. Use cases contain user actions and system responses in a fixed format.
System Design:
Planning the structure of the information system to be implemented is known as system design. System analysis determines what the system should do; system design determines how it should be done. System design can be further divided into architectural design (top-level design) and detailed design (logic design). In architectural design, the various sub-systems/modules are identified and their interconnections are worked out. In detailed design, each module's design is done and the algorithms and data structures are identified. The design documents are prepared by software architects.
System Design Report: design documents that consist of Architectural Design, Database Design, and Detailed Design.
Coding:
Coding is the process of translating the design into the actual system. It covers program development and unit testing by the development team. Programmers develop all the programs, functions and reports related to the system.
Testing:
Testing is the process of verifying the behavior of the developed system. The foremost thing in testing is to test the functionality of the software. In addition to functionality, the software has to be tested to ensure that it meets the performance requirements, the reliability requirements and other requirements, such as portability, usability, etc.
In the typical waterfall model, a project begins with feasibility analysis. On successfully demonstrating the feasibility of the project, the requirements analysis and project planning begin. The design starts after the requirements analysis is complete, and coding begins after the design is complete. Once the programming is completed, the code is integrated and testing is done. On successful completion of testing, the system is installed. After this, the regular operation and maintenance of the system take place.
Waterfall Model: System Feasibility -> Requirement Analysis & Project Planning -> Architectural Design -> Detailed Design -> Coding -> Testing & Integration -> Installation -> Operation & Maintenance
Advantages
Simple and easy to use. Easy to manage: due to the rigidity of the model, each phase has specific deliverables and a review process. Phases are processed and completed one at a time. Works well for smaller projects where requirements are very well understood.
Disadvantages
Adjusting scope during the life cycle can kill a project. No working software is produced until late in the life cycle. High amounts of risk and uncertainty. Poor model for complex and object-oriented projects. Poor model for long and ongoing projects. Poor model where requirements are at a moderate to high risk of changing.
Prototyping Model :
In software development, a prototype is a rudimentary working model of a product or information system, usually built for demonstration purposes or as part of the development process. The Prototyping Model is a systems development method (SDM) in which a prototype (an early approximation of a final system or product) is built, tested, and then reworked as necessary until an acceptable prototype is finally achieved, from which the complete system or product can then be developed. This model works best in scenarios where not all of the project requirements are known in detail ahead of time. It is an iterative, trial-and-error process that takes place between the developers and the users.
There are several steps in the Prototyping Model:
1. The new system requirements are defined in as much detail as possible. This usually involves interviewing a number of users representing all the departments or aspects of the existing system.
2. A preliminary design is created for the new system.
3. A first prototype of the new system is constructed from the preliminary design. This is usually a scaled-down system, and represents an approximation of the characteristics of the final product.
4. The users thoroughly evaluate the first prototype, noting its strengths and weaknesses, what needs to be added, and what should be removed.
5. The developer collects and analyzes the remarks from the users.
6. The first prototype is modified, based on the comments supplied by the users, and a second prototype of the new system is constructed.
7. The second prototype is evaluated in the same manner as the first prototype.
8. The preceding steps are iterated as many times as necessary, until the users are satisfied that the prototype represents the final product desired.
9. The final system is constructed, based on the final prototype.
10. The final system is thoroughly evaluated and tested. Routine maintenance is carried out on a continuing basis to prevent large-scale failures and to minimize downtime.
Spiral Model :
The spiral model is similar to the incremental model, with more emphasis placed on risk analysis. The spiral model has four phases: Planning, Risk Analysis, Engineering and Evaluation. A software project repeatedly passes through these phases in iterations (called spirals in this model). In the baseline spiral, starting in the planning phase, requirements are gathered and risk is assessed. Each subsequent spiral builds on the baseline spiral. Requirements are gathered during the planning phase. In the risk analysis phase, a process is undertaken to identify risks and alternate solutions. A prototype is produced at the end of the risk analysis phase. Software is produced in the engineering phase, along with testing at the end of the phase. The evaluation phase allows the customer to evaluate the output of the project to date before the project continues to the next spiral.
Advantages
High amount of risk analysis. Good for large and mission-critical projects. Software is produced early in the software life cycle.
Disadvantages
Can be a costly model to use. Risk analysis requires highly specific expertise. The project's success is highly dependent on the risk analysis phase. Doesn't work well for smaller projects.
Fish Model
(Diagram: the fish model shows development phases running through coding, system testing and maintenance, with QA (Verification) activities such as reviews applied to the documents and programs, and QC (Validation) activities such as white box testing and black box testing applied to the programs and the system.)
BRS stands for Business Requirements Specification.
SRS stands for Software Requirements Specification.
HLDD stands for High Level Design Document.
LLDD stands for Low Level Design Document.
Reviews:
A review is a document-level testing technique. During a review, the responsible people estimate the completeness and correctness of the corresponding document. There are three types of reviews: 1. Walkthrough: studying a document from the first line to the last line. 2. Inspection: searching for a specific issue in a document. 3. Peer review: comparing a document with another similar document.
Analysis :
A process to list out the details of the problem and the required functionalities. BRS: defines the requirements of the customer to be developed. SRS: defines the functional requirements to be developed and the system requirements to be used.
Reviews at Analysis: In general, the software development process starts with requirements gathering and analysis. In this phase, the Business Analyst category of people develop the BRS and SRS. They conduct reviews on these documents for completeness and correctness. The Business Analyst prepares questions such as the following on the BRS/SRS: Are they right requirements? Are they complete requirements? Are they achievable requirements? Are they reasonable requirements? Are they testable requirements?
Design: a pictorial representation of the project/software to be developed. HLD: The HLD document defines the overall architecture of the system.
(Diagram: example HLD, a tree with a Root module branching into leaf modules, for instance Mailing, Chatting and Logout, or Customer Accounts, Transactions, Reports and Exit.)
The above overall design is also known as Architectural Design / External Design.
LLD: The LLD documents define the internal structure of every module or functionality
(Diagram: example LLD of Login, a flow showing the Valid and Invalid outcomes of a login attempt between the login screen, the Main Window and the Database.)
Reviews at Designing: After completion of analysis and its reviews, the designer category of people develop the HLDD and LLDDs and conduct reviews on those documents for completeness and correctness. The designers prepare questions such as: Are they understandable designs? Are they meeting the right requirements? Are they complete designs? Are they followable designs? Are they handling errors?
Coding: Coding is the process of translating the design into the actual system. Programmers develop all the programs, functions and reports related to the system.
Program: A set of executable statements is called a program. Software consists of multiple programs; a program consists of multiple statements.
White Box Testing: a program-level testing technique. In this technique, the responsible people verify the internal structure of the corresponding program. White box testing techniques are also known as open box testing, glass box testing or clear box testing. Black Box Testing: a software-level testing technique. During this test, the responsible people validate the external functionality of the developed system.
V Model
V stands for Verification & Validation. This model defines the conceptual mapping between development stages and testing stages.
Requirements -> Acceptance Testing
Specification -> System Testing
Architectural Design -> Integration Testing
Detailed Design -> Unit Testing
Coding (base of the V)
In the above model, a separate testing team is available only for the system testing phase, because this phase is the bottleneck phase of the software development process. In the remaining stages of testing, the same development people are involved, to decrease project cost.
Requirements -> Acceptance Test Design -> Acceptance Test Execution; Specification -> System Test Design -> System Test Execution
Unit testing checks whether the code meets the detailed design.
Integration testing checks whether previously tested components fit together.
System testing checks whether the integrated system meets the specification.
Acceptance testing checks whether the product meets the final user requirements.
To be fair, users of the V model will often separate test design from test implementation. The test design is done as soon as the output of the corresponding development stage is ready.
1. Preliminary Investigation
Request clarification. Feasibility study. Request approval. Deliverables: Project Process, Project Plan.
2. Requirement Gathering
Collect information about business operations, the operations to be automated, and the business data. Deliverables: Acceptance Test Plan, Acceptance Test Design.
3. System Analysis
Develop BRS (Business Requirement Specification), FRS (Functional Requirement Specification), SRS (System Requirement Specification) and Use Cases (user actions and system responses).
4. System Design
Architectural Design (HLDD), Database Design (E-R Diagram), Interface Design (LLDD).
Reviews at Design. Deliverables: Integration Test Plan, Integration Test Design, Unit Test Plan, Unit Test Design.
5. Coding
Program development and unit testing by the development team. As soon as the smallest unit of an application/module is developed, one has to perform testing on it, to ensure it is working as per the requirements, using white box testing techniques. Since this testing is very much confined to the program level, this level of testing falls under the category of white box testing, and so mostly the developers are involved in unit testing.
White box testing is defined as the method of testing in which one performs testing on an application while having internal structural knowledge of it. White box testing takes into account the program code, code structure and internal design flow. In other words, the program of an application, that is, the structural part of the application, is tested. It aims to establish that the code works as designed, examines the internal structure and implementation of the program, targets specific paths through the program, and needs accurate knowledge of the design, implementation and code.
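As an illustration of developer-level unit testing, here is a minimal sketch using Python's unittest module; the apply_discount function and its expected behaviour are hypothetical, not taken from any particular project.

import unittest

def apply_discount(price, percent):
    # Hypothetical unit under test: returns the price reduced by percent.
    if percent < 0 or percent > 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (100 - percent) / 100, 2)

class ApplyDiscountUnitTest(unittest.TestCase):
    def test_normal_discount(self):
        self.assertEqual(apply_discount(200.0, 10), 180.0)

    def test_zero_discount_returns_original_price(self):
        self.assertEqual(apply_discount(250.0, 0), 250.0)

    def test_invalid_percent_is_rejected(self):
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

if __name__ == "__main__":
    unittest.main()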
There are 4 white box testing techniques: 1. Base Path Testing 2. Control Structure Testing 3. Program Technique Testing 4. Mutation Testing. These techniques are applicable only to programs.
Base Path Testing: During this test, the programmers concentrate on the execution of programs without any runtime errors. To conduct this test, the corresponding programmer follows the approach below:
Write a program with respect to the LLD (Low Level Design).
Draw a flow graph for that program.
Calculate the cyclomatic complexity.
Run the program more than once to cover all executable areas.
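A minimal sketch of the base path idea, assuming a hypothetical classify_marks program: count the simple decision points to get the cyclomatic complexity, then run the program once per independent path so every executable area is covered.

# Hypothetical program under test with two simple decision points, so its
# cyclomatic complexity is V(G) = number of decisions + 1 = 3, meaning at
# least 3 independent paths should be exercised.
def classify_marks(marks):
    if marks > 100:       # decision 1: invalid input path
        return "invalid"
    if marks >= 50:       # decision 2: pass path
        return "pass"
    return "fail"         # fall-through path

# One run per independent path, checking that no path raises a runtime error.
assert classify_marks(150) == "invalid"   # path 1
assert classify_marks(75) == "pass"       # path 2
assert classify_marks(30) == "fail"       # path 3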
Control Structure Testing: During this test, the corresponding programmer concentrates on the correctness of program execution. In this test, they verify every statement's input state and output state. Eg: debugging.
Program Technique Testing: During this test, the programmers concentrate on the execution speed of a program. If the execution speed is not reasonable, the programmers make changes in the structure of the program without disturbing its functionality. Eg: a swapping program:
i. c = a; a = b; b = c; (uses a temporary variable, i.e. more memory usage)
ii. a = a + b; b = a - b; a = a - b; (no temporary variable, i.e. low memory usage)
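The same two swapping techniques written as a small runnable Python sketch; the function names are illustrative only.

# Swap using a temporary variable (extra memory for the temp).
def swap_with_temp(a, b):
    c = a
    a = b
    b = c
    return a, b

# Swap using arithmetic only (no temporary variable).
def swap_without_temp(a, b):
    a = a + b
    b = a - b
    a = a - b
    return a, b

assert swap_with_temp(3, 8) == (8, 3)
assert swap_without_temp(3, 8) == (8, 3)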
Mutation Testing: Testing of software by introducing the bugs intentionally to verify the behavior. During this test, the corresponding programmers estimate completeness & correctness of a program.
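One common way to picture this: seed a deliberate bug into a copy of the program and check whether the existing test cases notice it (the mutant is "killed"). A minimal sketch, assuming a hypothetical is_adult function and a hand-written mutant:

# Original program.
def is_adult(age):
    return age >= 18

# Mutant: the same program with a deliberately introduced bug (>= changed to >).
def is_adult_mutant(age):
    return age > 18

def run_test_cases(func):
    # Returns True only if all test cases pass for the given implementation.
    return func(18) is True and func(17) is False

assert run_test_cases(is_adult) is True          # tests pass on the original
assert run_test_cases(is_adult_mutant) is False  # boundary test "kills" the mutant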
6. Integration Testing
Top-Down Integration: higher-level modules are integrated and tested first, moving downward through the module hierarchy.
(Diagram: a module hierarchy with a Root Module and sub-modules M1 to M6.)
(Diagram: top-down integration with stubs. The Root Module and the upper modules M1 and M2 are integrated first, while stubs stand in for the lower-level modules M3, M4, M5 and M6 that are not yet ready.)
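A minimal sketch of how a stub lets an upper-level module be integrated and tested before a lower-level module exists; the account_summary module and InterestStub below are hypothetical.

# Upper-level module under test: it depends on a lower-level
# interest-calculation module that is not yet developed.
def account_summary(balance, interest_module):
    interest = interest_module.monthly_interest(balance)
    return {"balance": balance, "interest": interest}

# Stub: a minimal stand-in for the missing lower-level module.
class InterestStub:
    def monthly_interest(self, balance):
        return 0.0   # fixed, simplified response

# Top-down integration test of the upper module using the stub.
result = account_summary(1000.0, InterestStub())
assert result == {"balance": 1000.0, "interest": 0.0}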
Bottom Up integration :
Major steps:
1. Low-level components are tested individually first.
2. A driver (a control program for testing) is written to coordinate test case input and output.
3. The driver is removed and integration moves upward in the program structure.
4. Repeat the process until all components are included in the test.
Advantage: compared with stubs, drivers are much easier to develop.
Disadvantage: major control and decision problems will be identified later in the testing process.
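A minimal sketch of such a driver, assuming two hypothetical low-level components (calculate_tax and calculate_total) that are tested together before the real upper-level module exists.

# Low-level components, tested individually first.
def calculate_tax(amount):
    return amount * 0.25

def calculate_total(amount, tax):
    return amount + tax

# Driver: a small control program that feeds test case input to the low-level
# components and checks their combined output.
def driver():
    test_cases = [(100.0, 125.0), (0.0, 0.0)]
    for amount, expected_total in test_cases:
        tax = calculate_tax(amount)
        assert calculate_total(amount, tax) == expected_total
    print("all driver test cases passed")

driver()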
(Diagram: bottom-up integration. Drivers coordinate test case input and output for the lower-level modules M3 to M6 before integration moves upward to M1, M2 and the Root Module.)
Hybrid Integration: It is defined as the integration approach in which the practices of both top-down integration and bottom-up integration are clubbed together.
(Diagram: hybrid integration of the module hierarchy, Root Module and M1 to M6, using both drivers and stubs.)
7. System Testing
Objective: to ensure that the system does what the customer wants it to do. Once the application is developed, it is deployed into the customer's specified environment, which is simulated in the testing lab. Once the application (software) is deployed into the environment (hardware), it becomes a system:
Application (Software) + Environment (Hardware) = System
If one performs testing on the entire system, it is known as system testing. The test engineer does system testing as black box testing. System testing is classified into 2 categories: 1. Functional Testing 2. Non-Functional Testing.
Types of Testing
Static Testing
Static testing is a form of software testing where the software isn't actually executed. It is generally not detailed testing, but checks mainly the code, algorithm or documents. The verification activities fall into the category of static testing. During static testing, you have a checklist to check whether the work you are doing is going as per the set standards of the organization. These standards can be for coding, integrating and deployment. Reviews, inspections and walkthroughs are static testing methodologies.
Dynamic Testing
Dynamic testing describes the testing of the dynamic behavior of code. Dynamic testing involves working with the software, giving input values and checking if the output is as expected. These are the validation activities. Unit tests, integration tests, system tests and acceptance tests are a few of the dynamic testing methodologies.
Functional Testing
Testing the functionalities expected from the software is called functional testing. It is performed using black box testing techniques. These are the functionalities for which the software is made. Functions are tested by feeding inputs and comparing the actual outputs with the expected outputs.
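A minimal black box sketch, assuming a hypothetical login function: only inputs and expected outputs are used, with no knowledge of the internal code.

# Hypothetical function under test; only its external behaviour matters here.
def login(username, password):
    registered = {"ramu": "pass@123"}
    return registered.get(username) == password

# Black box functional test: feed inputs, compare actual output with expected.
test_cases = [
    ("ramu", "pass@123", True),    # valid user, valid password
    ("ramu", "wrong", False),      # valid user, invalid password
    ("somu", "pass@123", False),   # unregistered user
]
for username, password, expected in test_cases:
    actual = login(username, password)
    assert actual == expected, f"login({username!r}, ...) returned {actual}, expected {expected}"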
Functionality Testing
During this test, the testing team concentrates on the correctness of every functionality with respect to the requirements. In this test, the testing team follows the below coverage:
GUI Coverage / Behavioral Coverage (changes in the properties of objects on the screen)
Error Handling Coverage (preventing incorrect operations)
Input Domain Coverage (taking the correct size and type of inputs; see the sketch after this list)
Manipulations Coverage (returning correct output)
Backend Coverage (the impact of front-end screen operations on backend tables)
Order of Functionalities Coverage
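For example, input domain coverage can be pictured as feeding a field every relevant size and type of input; the is_valid_age validator below is hypothetical.

# Hypothetical field validator: an age text box accepting 18 to 60 only.
def is_valid_age(value):
    if not value.isdigit():      # type check: digits only
        return False
    age = int(value)
    return 18 <= age <= 60       # size/range check

# Input domain coverage: valid values, boundary values, and inputs of the
# wrong size or type are all fed to the field.
assert is_valid_age("18") is True     # lower boundary
assert is_valid_age("60") is True     # upper boundary
assert is_valid_age("17") is False    # just below the range
assert is_valid_age("61") is False    # just above the range
assert is_valid_age("abc") is False   # wrong type of input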
Sanitation Testing
This is also known as Garbage Testing. During this test, the testing team identifies extra functionalities in the software build with respect to customer requirements.
Sanity Testing
Sanity testing is typically an initial testing effort to determine if a new software version is performing well enough to accept it for a major testing effort. This test should exercise the entire system from end to end; it does not have to be exhaustive, but it should be capable of exposing major problems. It is like smoke testing, i.e. conducted in a short span of time, to ensure everything is proper in terms of availability. Conceptually, smoke testing is the same as sanity testing, but in practice they differ from each other: smoke testing is always associated with negative testing, while sanity testing is always associated with positive testing.
Regression Testing
Regression testing is the renewed testing of an already tested program, or part of it, after modification, with the aim of ensuring that the modification has not created new faults. It means testing other parts of the application to check whether anything is affected by fixing a bug; in other words, checking that new modifications have not caused any unintended side effects.
Re-Testing
Retesting is the repeated execution of a test case which previously resulted in a fault, with the aim of confirming that the fault has been fixed. Re-testing is verifying whether the defects found by a particular test case, or set of test cases, have been properly fixed by the developer, and then executing the same test cases to verify whether they now pass. It is testing the same scenario in which we found the bug.
Retesting: to recheck whether the particular functionality is working properly or not after the modifications have been made.
Ad-hoc Testing:
It is also known as random testing. It is software testing performed without planning and documentation. The tests are intended to be run only once, unless a defect is discovered. Testing done without using any formal testing technique, and in an unplanned manner, is ad-hoc testing. This kind of testing does not have any process, test cases or test scenarios defined for it. Testers must have significant understanding of the software before testing it.
Testers try to break the system by randomly exercising its functionality; this can include negative testing as well.
a) Monkey Testing: Does not have any fixed methodology of testing. Just be a ready user of the system and try any random combination; it is destructive testing (a minimal sketch follows after the next item).
b) Buddy Testing: The management groups programmers and testers. The team members are identified as buddies; the buddies mutually help each other, with a common goal of identifying defects early and correcting them.
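A minimal sketch of monkey testing, assuming the same hypothetical login function used earlier: random input combinations are fired at it, and the only check is that it never crashes.

import random
import string

# Hypothetical function being monkey-tested.
def login(username, password):
    registered = {"ramu": "pass@123"}
    return registered.get(username) == password

# Monkey test: no fixed methodology, just random input combinations,
# checking only that the system never raises an exception.
random.seed(7)
alphabet = string.ascii_letters + string.digits + string.punctuation + " "
for _ in range(1000):
    username = "".join(random.choice(alphabet) for _ in range(random.randint(0, 20)))
    password = "".join(random.choice(alphabet) for _ in range(random.randint(0, 20)))
    result = login(username, password)
    assert result in (True, False)   # the call must not crash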
Usability Testing: Usability means that systems are easy and fast to learn, efficient to use, easy to remember, cause few operating errors, and offer a high degree of satisfaction for the user.
a) User Interface Testing
It focuses on verifying the user interface, covering testing of user actions with soft keys and keyboard operations.
In user interface testing, the software build is tested for:
i. Ease of use (understandability)
ii. Look and feel (attractiveness)
iii. Speed in interface (short navigations)
These are applied to every screen in the software build.
b) Manuals Support Testing
Also known as help documents testing. During this test, the testing team concentrates on the correctness and completeness of the help documents / user manuals. At the end of the testing process, the testing team concentrates on manuals support testing.
Other types of system testing include: Configuration Testing, Inter-system Testing, Installation Testing, Load Testing, Stress Testing, Data Volume Testing, and Parallel Testing.