01-Acceptance Test Preparation Procedure
Purpose
The testing process is a category of V&V (verification and validation) that takes place throughout the software life cycle. It can be represented in three phases: test preparation, test execution, and test analysis. These phases apply to all types of testing (unit, integration, system, and acceptance). Preparation and execution have separate procedures, while analysis is part of the releasing procedure.
Inputs
| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | Approved Software Development Plan | PD_Planning_Procedure |
| 2 | Baselined EA File and RTM | PD_Requirements_Validation_Procedure |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Identify the testing period on the client side. | Scrum Master & QC | Plan | Identified testing period in the implementation phase | MS Project |
| 2. | Prepare the client environment needs. | Scrum Master & QC | Plan | Identified client environment needs | TFS, MTM |
| 3. | Determine items to be tested in the acceptance test (activities 3-11 produce the plan sections sketched after this table). | QC | All procedure inputs | Identified items | TFS, MTM |
| 4. | Determine features to be tested in the acceptance test. | QC | All procedure inputs | Identified features | TFS, MTM |
| 5. | Determine the testing approach for the acceptance test. | QC | All procedure inputs | Identified approach | TFS, MTM |
| 6. | Determine pass and fail criteria for the acceptance test. | QC | All procedure inputs | Identified criteria | TFS, MTM |
| 7. | Determine suspension and resumption criteria for the acceptance test. | QC | All procedure inputs | Identified criteria | TFS, MTM |
| 8. | Determine test deliverables from the acceptance test. | QC | All procedure inputs | Identified deliverables | TFS, MTM |
| 9. | Determine testing tasks in the acceptance test (e.g., installing the system, creating the database, etc.). | QC | All procedure inputs | Identified tasks | TFS, MTM |
| 10. | Describe the testing environment for the acceptance test. | QC | All procedure inputs | Identified environment | TFS, MTM |
| 11. | Assign roles and responsibilities for the acceptance test. | QC | All procedure inputs | Identified roles and responsibilities | TFS, MTM |
| 12. | Develop all possible associated test procedures for the acceptance test. | QC | All procedure inputs | Identified procedures | TFS, MTM |
| 13. | Develop all possible associated test cases for the acceptance test. | QC | All procedure inputs | Identified cases | TFS, MTM |
| 14. | Plan the smart-device performance test using the mobile research document. | QC | All procedure inputs | Identified cases | TFS, MTM |
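Activities 3 through 11 each contribute one section of the acceptance test plan. The following is a minimal sketch, assuming C# and purely illustrative type and property names (not a TFS/MTM schema), of how those sections fit together as a single plan record:

```csharp
using System.Collections.Generic;

// Illustrative container for the plan sections produced by activities 3-11.
public class AcceptanceTestPlan
{
    public List<string> ItemsToTest { get; } = new();                // activity 3
    public List<string> FeaturesToTest { get; } = new();             // activity 4
    public string TestingApproach { get; set; } = "";                // activity 5
    public string PassFailCriteria { get; set; } = "";               // activity 6
    public string SuspensionResumptionCriteria { get; set; } = "";   // activity 7
    public List<string> Deliverables { get; } = new();               // activity 8
    public List<string> TestingTasks { get; } = new();               // activity 9 (e.g., install system, create database)
    public string EnvironmentDescription { get; set; } = "";         // activity 10
    public Dictionary<string, string> RolesAndResponsibilities { get; } = new(); // activity 11 (role -> responsibility)
}
```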
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Approved Acceptance Test Plan | PD_Acceptance_Test_Execution_Procedure |
| 2 | Approved Acceptance Test Procedures | PD_Acceptance_Test_Execution_Procedure |
| 3 | Approved Acceptance Test Cases | PD_Acceptance_Test_Execution_Procedure |
02-System Test Preparation Procedure
Purpose
The testing process is a category of V&V that takes place throughout the software life cycle. It can be represented in three phases: test preparation, test execution, and test analysis. These phases apply to all types of testing (unit, integration, system, and acceptance). Preparation and execution have separate procedures, while analysis is part of the releasing procedure.
Validation in CMMI is a Software Quality Control (SQC) process that addresses the question: are we building the correct product? This question can be asked at any time during the SDLC, but its essence is: when this software product is placed in its proper environment, does it fulfill the goals and needs of the end user, as expressed in the requirements?
Contrast this with verification, another SQC process, which makes sure a given work product (which could be a design specification or a software component) is written to a given standard or meets a particular specification[1].
Inputs (any one of the following)

| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | Baselined EA file | PD_Requirements_Validation_Procedure |
| 2 | PMP | The document that contains information about the target environment |
| 3 | System specifications | System specifications |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Determine items and features to be tested in the system test. | TT, TL | All procedure inputs | Identified items | TFS, MTM |
| 2. | Determine testing types for the system test. | TT, TL | All procedure inputs | Identified approach | TFS, MTM |
| 3. | Determine the testing environment, which includes: 1. database server configuration (version and setup variables); 2. application server configuration (version and setup variables); 3. modules configuration and setup (version and setup variables). | TT, TL | All procedure inputs | Identified approach | TFS, MTM |
| 4. | Develop all possible associated test cases for system testing. | TT | All procedure inputs | Identified cases | TFS, MTM |
| 5. | Determine pass and fail criteria for the system test. | TT | All procedure inputs | Identified criteria | TFS, MTM |
| 6. | Determine suspension and resumption criteria for the system test, i.e., when testing must stop because an observation requires further action before continuing. | TT | All procedure inputs | Identified criteria | TFS, MTM |
| 7. | Determine test deliverables from the system test. | TT | All procedure inputs | Identified deliverables | TFS, MTM |
| 8. | Determine testing tasks in the system test. | TT | All procedure inputs | Identified tasks | TFS, MTM |
| 9. | Review use cases in order to: 1. find missing scenarios; 2. find alternative scenarios; 3. find inconsistencies between scenarios; 4. check requirements completeness; 5. check requirements testability (clarity); 6. check requirements mapping to the persona; 7. check that requirements are collected from relevant sources. | TT | All procedure inputs | Identified tasks | TFS, MTM |
| 10. | Record test cases[2] for selected items with the following fields (see the sketch after this table): 1. use case; 2. scenario; 3. pre-conditions; 4. scenario steps; 5. scenario data (the key data required to execute the operation); 6. expected results; 7. execution time. | TT | All procedure inputs | Test cases | TFS, MTM |
| 11. | Describe the testing environment for the system test. | TT | All procedure inputs | Identified environment | TFS, MTM |
| 12. | Plan the smart-device performance test using the mobile research document. | TT | All procedure inputs | Identified cases | TFS, MTM |
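Activity 10 fixes the fields every recorded test case carries. Below is a minimal sketch, assuming C# and illustrative names; the real records live in TFS/MTM, so this only models the listed fields, not a tool schema:

```csharp
using System;
using System.Collections.Generic;

// Illustrative model of the test case fields listed in activity 10.
public record SystemTestCase(
    string UseCase,                       // 1. use case under test
    string Scenario,                      // 2. scenario name
    string PreConditions,                 // 3. state required before execution
    IReadOnlyList<string> ScenarioSteps,  // 4. ordered scenario steps
    string ScenarioData,                  // 5. key data required to execute the operation
    string ExpectedResults,               // 6. expected results (the pass criterion)
    TimeSpan ExecutionTime);              // 7. time needed to execute the case
```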
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Approved System Test Plan | PD_System_Test_Execution_Procedure |
| 2 | Approved System Test Procedures | PD_System_Test_Execution_Procedure |
| 3 | Approved System Test Cases | PD_System_Test_Execution_Procedure |
Implementation Guidelines
| S# | Guidelines |
| --- | --- |
| 1. | Items to be tested in the system test can be screens (UI) or functionality. |
| 2. | Testing types for the system test may include: UI testing; security testing; stress testing (maximum number of users); volume testing (amount of handled data); timing testing; functional testing (input/output); structural testing (detailed). |
| 3. | Pass criteria for the system test may be: 1. success of all functional test cases; 2. success of all test cases. |
| 4. | Example of a suspension criterion: the lookups for the Stock module do not open. |
| 5. | Example deliverables: 1. defects on BugAuto; 2. proposals for enhancements. |
| 6. | Example task sequence: 1. test knowledge base; 2. reporting; 3. test functionality; 4. reporting; 5. test security; 6. reporting. |
| 7. | The environment should be derived from the client environment in the PMP in order to prevent risks in the delivery phase (e.g., Windows 7 + SQL Server 2008). |
03-Unit Test Preparation Procedure
Purpose
The unit testing process can be represented in three phases: test preparation, test execution, and test analysis. These phases apply to all types of testing (unit, integration, system, and acceptance). Preparation and execution have separate procedures, while analysis is part of the releasing procedure.
As a developer, you cannot prove your work is finished without evidence of completeness; your unit tests are that evidence when the code coverage percentage meets the benchmarked value.
Integration testing is performed per module in order to assure collaboration between modules.
Unit and integration testing can be part of the VER (verification) process area in CMMI. Verification asks: are we building the product correctly?
Inputs
| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | Source code modules and classes | In case of FDD (feature-driven development) |
| 2 | Interfaces | In case of TDD (test-driven development) |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Determine the modules, namespaces, classes, and functions to be tested[3]: class testing for unit tests; multiple-component testing for integration, i.e., what to test[4]. | Dev/TT | Understood product | Identified items | TFS, MTM |
| 2. | Review the implementation of each module, class, and function to be tested in order to understand its behavior. | Dev/TT | Understood product | Identified features | TFS, MTM |
| 3. | Define each method's expected success and fail criteria, to be used in the unit test methods. | | | | |
| 4. | Plan for integration testing by generating a .NET unit test project for each module and implementing unit tests according to: the testing methodology (test-driven development (TDD) or behavior-driven development (BDD)); the testing projects defined within the CI build in TFS. (A sketch of such a test follows this table.) | Dev/TT | Understood product | Identified approach | TFS, MTM |
| 5. | Define the code coverage % to be used as a benchmark for testing coverage[5]. | Dev/TT | Understood product | Identified criteria | TFS, MTM |
| 6. | Determine pass and fail criteria for the test: coverage %; % of failed unit tests. | | | | TFS, MTM |
| 7. | Determine test deliverables from the unit test execution. | Dev/TT | Understood product | Identified deliverables | TFS, MTM |
| 8. | Describe the testing environment for the unit test. | Dev/TT | Understood product | Identified environment | TFS, MTM |
| 9. | Assign roles and responsibilities for the unit test. | Dev/TT | Understood product | Identified roles and responsibilities | TFS, MTM |
| 10. | Develop all possible associated test procedures for the unit test in order to meet or exceed the coverage %. | Dev/TT | Understood product | Identified procedures | TFS, MTM |
| 11. | Submit the unit test project(s) to the CM environment. | Dev/TT | Understood product | Identified cases | TFS, MTM |
| 12. | Define CI (continuous integration) info. | DM, Scrum Master | Technology selection | Updated TFS solution build definition | TFS Build |
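Activity 4 calls for a .NET unit test project per module. The block below is a minimal sketch of one such test, assuming the MSTest framework (Microsoft.VisualStudio.TestTools.UnitTesting); InvoiceCalculator is a hypothetical class under test, included only to keep the sketch self-contained, and the asserted values illustrate the per-method success and fail criteria from activity 3:

```csharp
using Microsoft.VisualStudio.TestTools.UnitTesting;

// Hypothetical class under test, included so the sketch compiles.
public class InvoiceCalculator
{
    private decimal _total;

    public void AddLine(int quantity, decimal unitPrice)
    {
        if (quantity < 0)
            throw new System.ArgumentOutOfRangeException(nameof(quantity));
        _total += quantity * unitPrice;
    }

    public decimal Total() => _total;
}

[TestClass]
public class InvoiceCalculatorTests
{
    [TestMethod]
    public void Total_SumsLineAmounts()
    {
        var calculator = new InvoiceCalculator();
        calculator.AddLine(quantity: 2, unitPrice: 5m); // line amount: 10
        calculator.AddLine(quantity: 1, unitPrice: 3m); // line amount: 3

        Assert.AreEqual(13m, calculator.Total()); // expected result = success criterion
    }

    [TestMethod]
    [ExpectedException(typeof(System.ArgumentOutOfRangeException))]
    public void AddLine_RejectsNegativeQuantity()
    {
        // Fail criterion for AddLine: a negative quantity must be rejected.
        new InvoiceCalculator().AddLine(quantity: -1, unitPrice: 5m);
    }
}
```

Tests like these run inside the CI build, which is how the testing projects defined in the build (activity 4) gate each integration.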
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Approved Unit Test Plan | PD_Unit_Test_Execution_Procedure |
| 2 | Approved Unit Test Procedures | PD_Unit_Test_Execution_Procedure |
| 3 | Approved Unit Test Cases | PD_Unit_Test_Execution_Procedure |
04-System Test Execution Procedure
Purpose
This is the procedure in which the actual test takes place. Concurrently with the testing activities, errors are corrected and must be re-tested. To coordinate development and testing activities effectively, a defined procedure for error reporting and tracking is needed. Based on a comparison of the actual results with the expected results, and according to the pass/fail rate, a determination shall be made and recorded in a defect log. Defect tracking enables reporting defects against discovered problems in the software and managing and tracking the status of defects from creation to closure.
Difference between system and acceptance testing
| Perspective | System Testing | Acceptance Testing |
| --- | --- | --- |
| Scope | | |
| Environment | | |
| Procedure | | |
Inputs
| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | Approved Integrated Product | PD_Integration_Test_Execution_Procedure |
| 2 | Approved System Test Plan | PD_System_Test_Preparation_Procedure |
| 3 | Approved System Test Procedures | PD_System_Test_Preparation_Procedure |
| 4 | Approved System Test Cases | PD_System_Test_Preparation_Procedure |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Prepare a suitable testing environment: the standard environment in case of system testing, or the same as the client environment in case of acceptance testing. | TT | Testing plan | Testing environment | TFS, MTM |
| 2. | Execute the system test plan, procedures, and test cases: all of them in case of system testing, or the issues and requirements selected by the client in case of acceptance testing. | TT | All procedure inputs | Discovered defects | TFS, MTM |
| 3. | Report discovered system defects and all their parameters in detail[6] (the defect lifecycle across activities 3-8 is sketched after this table). | TT | Understood system test | Reported defects | TFS, MTM |
| 4. | Assess the reported system defects to classify them and assign priorities. | TM | Reported defects | Assessed defects | TFS, MTM |
| 5. | Delegate the system defects that have to be corrected to a developer to solve. | TM | Assessed defects | Delegated defects | TFS, MTM |
| 6. | Correct the delegated system defects and report the type of each defect. | DEV | Delegated defects | Updated code and updated defect summary report | TFS, MTM |
| 7. | (Regression test) Retest the system defect, trying to reproduce it, and report any new defects found. This includes: testing issues from isolation perspectives, such as records or environment; testing the solution on the client's database; testing alternative scenarios for the main scenario. | TT | Updated code and updated defect summary report | Reported defects | TFS, MTM |
| 8. | Close the corrected system defect. | TM | Defect summary report | Updated defect summary report | TFS, MTM |
| 9. | Push the version to delivery. | TM and requirement provider | Defect summary report | Updated defect summary report | TFS, MTM |
| 10. | Obtain the customer's formal acceptance. | PM | Final delivered system | Formal acceptance | (Specific to each company) |
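Activities 3 through 8 move each defect through a fixed sequence of states. A minimal sketch in C#, with illustrative state names that mirror the activity outputs (not a TFS work-item schema):

```csharp
// Illustrative defect lifecycle implied by activities 3-8.
public enum DefectState
{
    Reported,   // activity 3: defect reported with all its parameters
    Assessed,   // activity 4: classified and prioritized
    Delegated,  // activity 5: assigned to a developer
    Corrected,  // activity 6: fixed, defect type reported
    Retested,   // activity 7: regression test attempts to reproduce it
    Closed      // activity 8: closed by the test manager
}
```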
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Defects on tracking system | |
| 2 | Approved System | |
Measures
| No | Measure | Description |
| --- | --- | --- |
| 1 | No. of bugs | |
| 2 | Rework % | |
| 3 | Rework effort | |
05-Unit Test Execution Procedure
Purpose
This is the procedure in which the actual test takes place. Concurrently with the testing activities, errors are corrected and must be re-tested. To coordinate development and testing activities effectively, a defined procedure for error reporting and tracking is needed. Based on a comparison of the actual results with the expected results, and according to the pass/fail rate, a determination shall be made and recorded in a defect log. Defect tracking enables reporting defects against discovered problems in the software, and managing and tracking the status of defects from creation to closure.
Inputs
| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | Approved Executable Product Components | PD_Implementation_Procedure |
| 2 | Approved Initial Integrated Product | PD_Implementation_Procedure |
| 3 | Approved Unit Test Plan | PD_Unit_Test_Preparation_Procedure |
| 4 | Approved Unit Test Procedures | PD_Unit_Test_Preparation_Procedure |
| 5 | Approved Unit Test Cases | PD_Unit_Test_Preparation_Procedure |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Execute the unit test plan, procedures, and test cases. | TT | All procedure inputs | Discovered defects | TFS, MTM |
| 2. | Confirm that unit test coverage meets the benchmarked % (a sketch of this check follows this table). | TT | Discovered defects | Reported defects | TFS, MTM |
| 3. | If the actual code coverage is less than predefined, update the unit tests in order to meet the predefined %. | TT | Discovered defects | Reported defects | TFS, MTM |
| 4. | Retest to assure coverage and find failed test cases. | TT | Discovered defects | Reported defects | TFS, MTM |
| 5. | Resolve unit testing notes and failed unit tests. | TT | Discovered defects | Reported defects | TFS, MTM |
| 6. | Retest all defects that resulted from the unit test, trying to reproduce them; check the resolution, report any new defects found, and reopen any unresolved defects. | TT | Updated code and updated defect status | Reported defects, solved defects | TFS, MTM |
| 7. | Analyze testing results and notify relevant stakeholders: notify DEV, QA, and process improvement to take corrective actions. | TT | Solved defects | Archived defects | TFS, MTM |
| 8. | Archive the corrected unit defects. | TT | Solved defects | Archived defects | TFS, MTM |
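Activities 2 through 4 form a loop: measure coverage, add or update tests while it is below the benchmark, and retest. A minimal sketch of that loop in C#; RunUnitTests() is a hypothetical stand-in for executing the tests and reading Code Coverage% from the resulting coverage report:

```csharp
using System;

class CoverageCheck
{
    // Predefined benchmark; 85% is the best practice cited in footnote [5].
    const double BenchmarkCoverage = 85.0;

    static void Main()
    {
        double actual = RunUnitTests();
        while (actual < BenchmarkCoverage)
        {
            Console.WriteLine($"Coverage {actual:F1}% is below the benchmark: update the unit tests, then retest.");
            actual = RunUnitTests(); // activity 4: retest after updating the tests
        }
        Console.WriteLine($"Coverage {actual:F1}% meets the benchmark.");
    }

    // Hypothetical stand-in: run the unit tests and return the measured coverage %.
    static double RunUnitTests() => 85.0;
}
```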
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Defects on BugAuto, TFS | PD_Integration_Test_Execution_Procedure, CM_Functional_Configuration_Audit_Procedure, PD_Product_Release_Procedure |
| 2 | Baselined Corrected Executable Product Components | PD_Integration_Test_Execution_Procedure |
Metrics
| No | Metric | Description |
| --- | --- | --- |
| 1 | Code coverage % | The percentage of source code covered by the unit testing process. For example, 85% means the unit tests passed through 85% of the source code blocks and did not enter the remaining 15%. |
| 2 | % of failed unit tests | The percentage of executed unit tests that failed (a worked sketch of both metrics follows this table). |
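Both metrics are simple ratios. A minimal worked sketch in C#, with hypothetical counts standing in for values that would come from the coverage and test-run reports:

```csharp
// Hypothetical counts from a coverage report and a test run.
int coveredBlocks = 850, totalBlocks = 1000;
int failedTests = 3, totalTests = 120;

double codeCoverage = 100.0 * coveredBlocks / totalBlocks; // 85.0 -> meets an 85% benchmark
double failedTestPct = 100.0 * failedTests / totalTests;   // 2.5

System.Console.WriteLine($"Code Coverage%: {codeCoverage:F1}");
System.Console.WriteLine($"% of failed unit tests: {failedTestPct:F1}");
```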
06-Product Release Procedure
Purpose
Software releasing refers to the creation and availability of a new version of a software product. Each time a software product has major changes, the project team should decide how to distribute the changes, or the changed system, to the customer. The release procedure is concerned with the compilation, assembly, and delivery of source code and any related documentation into finished products or other software components.
Inputs
| No | Input | Coming from/Output Of |
| --- | --- | --- |
| 1 | All system defects solved | PD_System_Test_Execution_Procedure |
| 2 | Approved System | PD_System_Test_Execution_Procedure |
Activities
| No | Activity | Responsible | Input | Output | Temp/Solution |
| --- | --- | --- | --- | --- | --- |
| 1. | Review finished work products against the released scope. | DEV team | Models and deliverables | Missing scope | |
| 2. | Communicate missing features/user stories/requirements to take corrective action. | DEV team | Models and deliverables | Complete tested solution | |
| 3. | If exceptions exceed 3% of bugs, halt execution of the releasing procedure (a sketch of this gate follows this table). | DEV team | Models and deliverables | Installation procedure | |
| 4. | Collect work product changes: source code (modules and components); database changes (DML and DDL); resource files (localization); solution content (documents, video, images, ...); configuration files; service packs (can be fixes); tools. | DEV team | Models and deliverables | Installation procedure | TFS/DevOps notes query |
| 5. | Conduct integration of the collected work products to confirm a successful build. | DEV team | Models and deliverables | | |
| 6. | Execute unit tests to confirm the quality level against the predefined code coverage, recalling the unit test execution procedure. | DEV team | Models and deliverables | Confidence in quality level | |
| 7. | Review release completeness by verifying the UC & UX and solution mapping. | PO, QC | Models and deliverables | Confidence in quality level | |
| 8. | Produce the release note for this release. | DEV team | Models and deliverables | Product release note | TFS/DevOps notes query |
| 9. | Build the package for one-click deployment. | DEV team | Models and deliverables | | |
| 10. | Baseline the software part of the system according to the CM_Baselining_Procedure. | CC | As per the process | As per the process | As per the process |
| 11. | Install and deliver a running system at the customer site. | PM, TL | Installation procedure and approved system | Running system | TFS/DevOps notes query |
| 12. | Send a notification message to relevant users/customers. | CC | Release | Message | Portal or mail, according to the application type |
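Activity 3 defines a numeric release gate. A minimal sketch of that check in C#, with hypothetical counts standing in for figures taken from the tracking system:

```csharp
// Hypothetical counts from the defect tracking system.
int openExceptions = 4, totalBugs = 200;

double exceptionRate = 100.0 * openExceptions / totalBugs; // 2.0%
bool releaseAllowed = exceptionRate <= 3.0;                // activity 3: halt above 3%

System.Console.WriteLine(releaseAllowed
    ? "Release procedure may proceed."
    : "Release blocked: exceptions exceed 3% of bugs.");
```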
Outputs
| No | Output | Going To/Input Of |
| --- | --- | --- |
| 1 | Internally Approved Delivered System | PD_Acceptance_Test_Execution_Procedure |
[1] - VAL and VER processes
[2] Expected test cases according to object type will be:

| | 1/1 Screen | 1/M Screen | Operation Screen | Lookup Screen | Report |
| --- | --- | --- | --- | --- | --- |
| Create New | Y | Y | X | X | X |
| Update | Y | Y | X | X | X |
| Delete | Y | Y | X | X | X |
| Copy | Y | Y | X | X | X |
| Paste | Y | Y | X | X | X |
| Export | Y | Y | X | X | X |
| Import | Y | Y | X | X | X |
| Operation 1 [like Approve] | Y | N | Y | X | X |
| Operation 2 [like Disapprove] | Y | N | Y | X | X |
| Operation 3 [like Post] | Y | N | Y | X | X |
| Filter | Y | Y | X | Y | X |
| Select | Y | Y | X | Y | X |
| Display | Y | Y | Y | Y | Y |
[3] - Best practice is to start with:
· Integration modules
· Data models
· Reusable components
· Daily operation modules
· Unstable modules
[4] By nature, each developer should test his own output; otherwise we cannot prove that the work is done.
[5] The greater the coverage, the better the job; best practice is 85% of the overall code.
[6] A bug report should include the following (a sketch of such a record follows the list):
1. A name that represents the issue.
2. A scenario that can reproduce the bug.
3. An issue description containing details about the issue.
4. A screenshot of the issue, if available.
5. The expected result versus the actual result.
6. The bug severity.
7. Links to the related test cases.
8. Repeated issues should be communicated once, in a single issue, listing the paths of all affected locations.
9. Issues are to be reviewed by the team leader.
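A minimal sketch, assuming C# and illustrative field names (not a tracking-system schema), of a record carrying the fields listed above:

```csharp
// Illustrative bug report record; numbers refer to the list above.
public record BugReport(
    string Name,                    // 1. name that represents the issue
    string ReproductionScenario,    // 2. scenario that reproduces the bug
    string Description,             // 3. details about the issue
    string? ScreenshotPath,         // 4. screenshot, if available
    string ExpectedResult,          // 5. expected result ...
    string ActualResult,            //    ... versus the actual result
    string Severity,                // 6. bug severity
    string[] LinkedTestCases,       // 7. links to the related test cases
    string[] RepeatedLocations);    // 8. paths where the issue repeats
```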