- 2.127.2  IT Testing Process and Procedures
	
- 2.127.2.1  Program Scope and Objectives
		
- 2.127.2.1.1 Background
 - 2.127.2.1.2 Authority
 - 2.127.2.1.3 Responsibility
 - 2.127.2.1.4 Program Management and Review
 - 2.127.2.1.5 Program Controls
 - 2.127.2.1.6 Terms and Acronyms
 - 2.127.2.1.7 Related Resources
 
 - 2.127.2.2  Test Overview
		
- 2.127.2.2.1 Test Types
 - 2.127.2.2.2  Testing Lifecycle and Procedures
			
- 2.127.2.2.2.1  Perform Planning
				
- 2.127.2.2.2.1.1  Assess Requirements
					
- 2.127.2.2.2.1.1.1 Review Requirements Documentation
 - 2.127.2.2.2.1.1.2 Verify Requirements Repository
 - 2.127.2.2.2.1.1.3 Review Prior Design Documentation, as applicable
 - 2.127.2.2.2.1.1.4 Conduct Requirements Analysis, List Constraints
 - 2.127.2.2.2.1.1.5 Determine Test Types
 - 2.127.2.2.2.1.1.6 Establish Incident/Problem Reporting
 - 2.127.2.2.2.1.1.7 Conduct Walkthrough Meeting to Approve Final Testable Requirements
 - 2.127.2.2.2.1.1.8 Brief Project Stakeholders on which Types of Tests will be Performed
 - 2.127.2.2.2.1.1.9 Develop Testing Schedule as Input to the Integrated Master Schedule
 - 2.127.2.2.2.1.1.10 Establish Project Folder(s)
 
 - 2.127.2.2.2.1.2  Establish Test Environment
					
- 2.127.2.2.2.1.2.1 Submit Request for Testing Services through UWR Process
 - 2.127.2.2.2.1.2.2 Identify Test Environment(s)
 - 2.127.2.2.2.1.2.3 Submit Request for Enterprise File Transfer Utility (EFTU) Support, as applicable
 - 2.127.2.2.2.1.2.4 Submit Request for Computer Services (RCS), as applicable
 - 2.127.2.2.2.1.2.5 Incorporate Privacy and Civil Liberties Impact Assessment (PCLIA), as applicable
 - 2.127.2.2.2.1.2.6 Submit Sensitive But Unclassified (SBU) Questionnaire(s), as applicable
 
 - 2.127.2.2.2.1.3 Train Test Team
 - 2.127.2.2.2.1.4  Develop Test Artifacts
					
- 2.127.2.2.2.1.4.1 Review Previous Lessons Learned and/or Post Implementation Review (PIR) Documents
 - 2.127.2.2.2.1.4.2 Develop Test Documentation (TSP, TP, Automated PFC, etc.)
 
 - 2.127.2.2.2.1.5 Planning Summary
 
 - 2.127.2.2.2.2  Perform Preparation
				
- 2.127.2.2.2.2.1  Verify Test Environment
					
- 2.127.2.2.2.2.1.1 Verify Type of Equipment
 - 2.127.2.2.2.2.1.2 Verify All Items Listed in the TP are Available
 - 2.127.2.2.2.2.1.3 Update Executive Control Language for Test Environment, as applicable
 - 2.127.2.2.2.2.1.4 Coordinate Interface Database Agreements/Files (Optional)
 
 - 2.127.2.2.2.2.2  Review Documentation
					
- 2.127.2.2.2.2.2.1 Developer Submits Functional Documentation that Represents the Requirements to Test Teams
 - 2.127.2.2.2.2.2.2 Review Approved Testable Requirements
 - 2.127.2.2.2.2.2.3 Review internal Documents and All Other Relevant Documents, as applicable
 - 2.127.2.2.2.2.2.4 Submit or Post All Relevant Documents, as applicable
 - 2.127.2.2.2.2.2.5 Document Results in Defect Repository, as applicable
 
 - 2.127.2.2.2.2.3  Prepare Test Cases/Scripts/Data
					
- 2.127.2.2.2.2.3.1 Review Prior External Documentation
 - 2.127.2.2.2.2.3.2 Create/Modify Test Cases
 - 2.127.2.2.2.2.3.3 Create/Modify Test Scripts
 - 2.127.2.2.2.2.3.4 Create Test Data
 - 2.127.2.2.2.2.3.5 Report Test Status
 - 2.127.2.2.2.2.3.6 Conduct Peer Review
 
 - 2.127.2.2.2.2.4  Conduct Test Readiness Review, as applicable
					
- 2.127.2.2.2.2.4.1 Identify TRR Participants
 - 2.127.2.2.2.2.4.2 Prepare TRR Checklist and Memorandum
 - 2.127.2.2.2.2.4.3 Conduct TRR meeting
 
 - 2.127.2.2.2.2.5 Preparation Summary
 
 - 2.127.2.2.2.3  Execute and Document Test
				
- 2.127.2.2.2.3.1  Execute Test Cases/Scripts
					
- 2.127.2.2.2.3.1.1 Verify Program/Code has been Transmitted to Test Environment
 - 2.127.2.2.2.3.1.2 Verify Program/Code Transmittal is Received by Test Team
 - 2.127.2.2.2.3.1.3 Validate Input Data
 - 2.127.2.2.2.3.1.4 Execute Test Cases/Scripts
 - 2.127.2.2.2.3.1.5 Determine Pass/Fail status
 
 - 2.127.2.2.2.3.2  Document Results
					
- 2.127.2.2.2.3.2.1 Document Results in Authorized Traceability Repository
 - 2.127.2.2.2.3.2.2 Document Results in Defect Repository (if test case/script failed)
 - 2.127.2.2.2.3.2.3 Prepare Test Case Deferral and Waiver form, if applicable
 - 2.127.2.2.2.3.2.4 Update ELM with Test Results
 
 - 2.127.2.2.2.3.3  Report Test Status
					
- 2.127.2.2.2.3.3.1 Conduct Peer Review
 - 2.127.2.2.2.3.3.2 Develop and Provide Status Reports
 
 - 2.127.2.2.2.3.4 Execute and Document Test Summary
 
 - 2.127.2.2.2.4  Closeout Test
				
- 2.127.2.2.2.4.1  Finalize Test Artifacts
					
- 2.127.2.2.2.4.1.1 Update Repository/Folder with Final Results Using Automated PFC, as applicable
 - 2.127.2.2.2.4.1.2 Update and Finalize Test Schedule with Actual Results
 - 2.127.2.2.2.4.1.3 Resolve Outstanding Issues
 - 2.127.2.2.2.4.1.4 Complete Work Products
 
 - 2.127.2.2.2.4.2  Issue End of Test Reports
					
- 2.127.2.2.2.4.2.1 Finalize End of Test Report for Each Test Type
 - 2.127.2.2.2.4.2.2 Submit End of Test Report for Approval and Concurrence
 - 2.127.2.2.2.4.2.3 Distribute End of Test Report Electronically to Stakeholders
 
 - 2.127.2.2.2.4.3  Conduct Closeout Meetings
					
- 2.127.2.2.2.4.3.1 Lessons Learned
 - 2.127.2.2.2.4.3.2 Post Implementation Review (PIR)
 
 - 2.127.2.2.2.4.4  Finalize Project Folder
					
- 2.127.2.2.2.4.4.1 Place Test Related Documents in Project Folder
 - 2.127.2.2.2.4.4.2 Finalize Automated PFC
 - 2.127.2.2.2.4.4.3 Records Retention
 
 - 2.127.2.2.2.4.5 Closeout Summary
 
 - 2.127.2.2.3  Roles and Responsibilities
			
- 2.127.2.2.3.1 Roles and Responsibilities Overview
 - 2.127.2.2.3.2  Roles and Responsibilities by Activities
				
- 2.127.2.2.3.2.1 Perform Planning
 - 2.127.2.2.3.2.2 Perform Preparation
 - 2.127.2.2.3.2.3 Execute and Document Test
 - 2.127.2.2.3.2.4 Closeout Test
 
 
 - 2.127.2.2.4  Test Delivery Model
			
- 2.127.2.2.4.1 Readiness State
 - 2.127.2.2.4.2 Execution State
 
 - 2.127.2.2.5  Test Governance and Compliance
			
- 2.127.2.2.5.1 Compliance Artifacts
 - 2.127.2.2.5.2 Testing Work Products
 - 2.127.2.2.5.3  Test Reporting
				
- 2.127.2.2.5.3.1 Weekly Status Reporting
 
 
 
 - 2.127.2.3 References
 - Exhibit 2.127.2-1 Terms and Definitions
 - Exhibit 2.127.2-2 Acronyms
 - Exhibit 2.127.2-3 Weekly TSR Snapshot
 - Exhibit 2.127.2-4 Examples of Requirements, Functional, Operational, Security, Privacy and Project Documentation
 - Exhibit 2.127.2-5 Activities and Steps
 
Part 2. Information Technology
Chapter 127. Testing Standards and Procedures
Section 2. IT Testing Process and Procedures
2.127.2 IT Testing Process and Procedures
Manual Transmittal
July 31, 2025
Purpose
(1) This transmits revised Internal Revenue Manual (IRM) 2.127.2, Testing Standards and Procedures, IT Testing Process and Procedures.
Material Changes
(1) Internal controls were added to comply with Internal Management Documents (IMD) requirements.
(2) IRM subsections were adjusted to fit the Internal Control format.
(3) IRM 2.127.2.2 Process Overview was replaced with Test Overview.
(4) IRM 2.127.2.3 Tailoring Guidelines was replaced with References.
(5) IRM 2.127.2.4 CMMI, ITIL, PMI Compliance was removed.
(6) IRM 2.127.2.5 Definition, References was removed.
(7) The following updates were made throughout the IRM:
- 
	
Enterprise Lifecycle (ELC) to One Solution Delivery Lifecycle (OneSDLC)
 - 
	
Rational Collaborative Lifecycle Management (CLM) to Engineering Lifecycle Management (ELM); Rational Quality Manager (RQM) to Engineering Test Management (ETM); Rational Team Concert (RTC) to Engineering Workflow Management (EWM)
 - 
	
Knowledge Incident Service Asset Management (KISAM) to Internal Revenue Workflow Optimization, Request, and Knowledge System (IRWorks) ServiceNow
 
(8) Titles, website addresses, and references were updated.
(9) Editorial changes were made throughout the IRM to improve clarity and consistency.
Effect on Other Documents
IRM 2.127.2, dated December 11, 2024, is superseded.
Audience
This process description (PD) is applicable to all Information Technology (IT) organizations, contractors, and stakeholders performing testing.
Effective Date
(07-31-2025)
Kaschit Pandya
Acting Chief Information Officer
- 
	
The scope of this IRM applies to all testing (i.e., software application, hardware, and infrastructure upgrade projects, as well as new and current (legacy) production system upgrade projects) within IT organizations following the One Solution Delivery Lifecycle (OneSDLC).
 - 
	
The purpose of this IRM is to establish guidelines, expectations, authority, and documentation responsibility for development and facilitation of testing standards.
 - 
	
The audience for this IRM is all organizations within IT responsible for testing.
 - 
	
The policy owner of this IRM is the Chief Information Officer (CIO).
 - 
	
The program owner of this IRM is Enterprise Systems Testing (EST).
 - 
	
The primary stakeholders of this IRM are IT organizations responsible for testing.
 - 
	
The goal of this IRM is to provide guidance and support to all IT organizations responsible for testing.
 
- 
	
EST serves as the Test Process Owner and supports the development, facilitation, and institutionalization of the test processes within IT. EST works in collaboration with other IT organizations and stakeholders for the successful promotion of product quality. This IRM has been created to centralize and establish practices for effective testing. It establishes guidelines for performing validation and verification activities throughout all phases of the testing lifecycle.
 
- 
	
This Directive establishes standards, expectations, authority, and documentation responsibility for the development and facilitation of testing standards. Approval of this document, including updates, rests with the CIO and Associate Chief Information Officer (ACIO) for Enterprise Services.
 
- 
	
EST is responsible for the development, implementation, and maintenance of this directive. All proposed changes to this directive must be submitted to EST. The CIO and ACIO for Enterprise Services are responsible for the approval of any changes to this directive.
 
- 
	
EST conducts reviews of the IRM and supporting documents internally and externally with our stakeholders at least once every two years.
 - 
	
EST must manage and evaluate the process based on the following mandates:
- 
		
EST has the authority and responsibility for developing IT Test Assets including Process Descriptions, Procedures, and related guidance materials.
 - 
		
EST has the authority to develop, facilitate and coordinate the appropriate use of IT Test Process Assets.
 - 
		
The planning, management, execution, and quality responsibilities of verification activities and validation activities explicitly belong to Project Managers or designated Project Leads. These methods must be defined, including any limitations, and outlined in the project's test plan.
 - 
		
Responsibility for all information system project management activities must be explicitly assigned by the applicable IT Executive.
 - 
		
All system, program, or test plans must include verification strategies addressing system integration, acceptance, regression, privacy, and security as required by Cybersecurity and Section 508 of the Rehabilitation Act of 1973, as currently amended.
 - 
		
Project generated test artifacts or work products, such as test plans, test scripts, test cases, test reports and measurements, must be recorded and maintained in an approved repository.
 - 
		
Measures collected and used by the projects to determine test status and/or produce resultant work products must be reviewed during program and project reviews.
 - 
		
All testing must have a plan that addresses verification activities and validation activities through all lifecycle phases. This is applicable to all test releases, formal or informal, whether testing is conducted by EST or is executed by individual projects, other testing components, or outside contractors. This applies to any and all approved IRS lifecycle development methodologies a project may choose to follow.
 
 
- 
	
This IRM complies with the Internal Revenue Service (IRS) Internal Management Documents (IMD) requirements to establish controls.
 - 
	
Any waivers or deviations of this Directive require written approval from the ACIO, Enterprise Services.
 
- 
	
See Exhibit 2.127.2-1 and Exhibit 2.127.2-2 for Terms and Acronyms.
 
- 
	
This section provides all applicable resources closely related to or referenced by this IRM.
- 
		
IRM 2.31.1 One Solution Delivery Lifecycle (OneSDLC) Guidance
 - 
		
IT Test Reference Documents
 - 
		
IT Security, Policy and Guidance
 - 
		
IT Program Governance Directive, Process Description and Procedures
 - 
		
Release Readiness Review Board Procedure
 - 
		
IRS Privacy Testing Guidance
 - 
		
Enterprise Organizational Readiness Directive
 
 
- 
	
This manual defines the Enterprise System Testing (EST) activities and test products, and documents the testing procedures to verify that software applications:
- 
		
Adhere to standards for systems development
 - 
		
Comply with approved processing requirements
 - 
		
Meet customer needs
 
 - 
	
Testing will be conducted according to the test procedures described below in IRM 2.127.2.2.2.
 - 
	
In general, testing:
- 
		
Verifies and validates the software requirements through all available systems documentation
 - 
		
Evaluates lifecycle deliverables to ensure they are clear, correct, complete and consistent
 - 
		
Validates the application software and ensures that interfaces with other systems function properly
 - 
		
Reviews specified deliverables for conformance to approved standards
 - 
		
Ensures deliverables are consistent with current IRS rules and regulations
 
 
- 
	
This section lists test types that are commonly conducted in IRS IT:
- 
		
Unit Testing
 - 
		
Compatibility Testing
 - 
		
Integration Testing
 - 
		
Development Integration Testing (DIT)
 - 
		
Development System Integration Testing (DSIT)
 - 
		
Final Integration Testing (FIT)
 - 
		
Systems Acceptability Testing (SAT)
 - 
		
Independent Systems Acceptability Testing (ISAT)
 - 
		
Regression Testing
 - 
		
Usability Testing/User Acceptability Testing (UAT)
 - 
		
Accessibility Testing (508 Compliance)
 - 
		
Performance Testing
 - 
		
Application Network Review
 - 
		
Security Testing
 - 
		
Web Testing
 - 
		
Automation Testing
 
 - 
	
The description and owner for each test type are listed below.
- Unit Testing (Owner: Application Development (AD)): Unit Testing is a procedure used to validate that individual units of source code are working properly.
 - Compatibility Testing (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Compatibility testing is conducted to determine whether the program correctly exchanges files with other software.
 - Integration Testing (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Integration Testing verifies that the system integrates properly and functions as required.
 - Development Integration Testing (DIT) (Owner: Application Development (AD)): After Unit Testing, Development Integration Testing (DIT) combines the parts of an application to determine if they function together correctly. In its simplest form, two units that have already been tested are combined into a component and the interface between them is tested.
 - Development System Integration Testing (DSIT) (Owner: Application Development (AD)): Development System Integration Testing (DSIT) is a logical extension of DIT. After the individual projects have been independently integrated into applications, tested in a DIT, and meet their respective requirements, they are integrated into a complete system and tested in DSIT.
 - External Trading Partners Testing (Owner: Enterprise Systems Testing (EST)): External Trading Partner Testing is special testing that is conducted on programs that exchange files with an entity outside the IRS.
 - Systems Acceptability Testing (SAT) (Owner: Enterprise Systems Testing (EST)): SAT assesses the quality of the application software by testing with controlled data to determine conformance to customer requirements and to aid the customer and developer in determining the system’s production readiness.
 - Independent Systems Acceptability Testing (ISAT) (Owner: Application Development (AD)): ISAT assesses the quality of the application software by testing with controlled data to determine conformance of the system to customer requirements and to aid the customer and developer in determining the system’s production readiness. ISAT is required if SAT is not performed by EST.
 - Final Integration Testing (FIT) (Owner: Enterprise Systems Testing (EST)): Final Integration Testing (FIT) is the integrated test of multiple systems that support the high-level business requirements of the IRS.
 - Regression Testing (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Regression testing is performed to determine whether changes to the application have adversely affected previously tested functionality. Regression Testing demonstrates system integrity after changes are made to the configuration or environment of a system.
 - Usability Testing/User Acceptability Testing (UAT) (Owner: Business): Usability Testing or User Acceptability Testing (UAT) analyzes the user experience through direct observation and evaluation to determine whether a system works from the users’ perspective or whether it requires rework.
 - Accessibility Testing (508 Compliance) (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Accessibility testing determines whether Electronic and Information Technology (EIT) satisfies federal law for accessibility to people with disabilities.
 - Performance Testing (Owner: Enterprise Systems Testing (EST)): Performance testing determines whether the system undergoing testing can effectively process transactions under expected normal and peak workload conditions, within acceptable response time thresholds.
 - Application Network Review (Owners: Application Development (AD), Enterprise Systems Testing (EST)): The Application Network Review (ANR) is performed to confirm deployed applications provide the intended performance levels, make efficient use of network infrastructure resources, and evaluate whether the enterprise network is efficiently configured to support the deployed application.
 - Security Testing (Owner: Cyber Security): Security Testing is conducted in the IRS production environment and consists of activities designed to ensure that the system’s security safeguards are in place and functioning as intended.
 - Web Testing (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Web testing is a computer-based test delivered via the internet and written in the "language" of the internet, HTML, and possibly enhanced by scripts.
 - Automation Testing (Owners: Application Development (AD), Enterprise Systems Testing (EST)): Automation Testing uses automation tools to maintain test data, execute tests, and analyze test results.
- 
	
Software applications that are developed and maintained within the IRS, or by contractors for the IRS, and are subject to testing will utilize these procedures to identify and resolve as many defects as possible prior to projects’ deployment.
 - 
	
The IT Testing Lifecycle comprises the following four phases/activities:
- 
		
Perform Planning: Initiate Project Folder, review previous Lessons Learned/Post Implementation Review (PIR), conduct Unified Work Request (UWR) reviews, begin Requirements analysis
 - 
		
Perform Preparation: Update test schedule, create and map data to test cases, confirm test environment is ready, update and issue Test Plan
 - 
		
Execute and Document Test: Conduct Test Processing, capture test results, provide status
 - 
		
Closeout Test: Conduct Lessons Learned/PIR, issue the End of Test Report (EOTR)/ End of Test Completion Report (EOTCR), close out the Project Folder
 
 
- 
	
Testing is one method used to verify that the work products have met their specified requirements. Testing can also validate that the final product will fulfill its intended purpose when placed in its intended environment. All testing begins with test planning.
 - 
	
In the IRS, testing is performed using a variety of test types. For a full list of test types and their definitions, see Section IRM 2.127.2.2.1 above and the IT Test Type Reference Guide in IRM 2.127.2.3 References.
 - 
	
Systems that are developed and maintained within the IRS or developed and/or maintained by IRS contractors will use these procedures to plan the verification and validation of the product components.
 - 
	
During Perform Planning, the Test Strategy and Plan (TSP) will address the scope and types of testing and describe the type of data required to complete planned test types. Test data creation/management/control remains the responsibility of each project team.
 - 
	
The Perform Planning procedures cover the activities listed below and are described in more detail in the following sections:
- 
		
Assess Requirements
 - 
		
Establish Test Environment
 - 
		
Train Test Team
 - 
		
Develop Test Artifacts
 
 
- 
	
Assess requirements as described below:
 
- 
	
Documentation Review is the process of confirming the translation of requirements into documentation associated with the system development effort. The review serves to ensure test case requirements are mapped to functional documentation processes. The review also confirms operations for the system prior to test execution.
 
- 
	
Once the requirements have been defined, the tester verifies that they are entered in the authorized repository within the ELM. Business requirements are stored in Engineering Requirements Management - DOORS Next Generation (ERM-DNG). Technical/decomposed requirements are stored in EWM as Epics and User Stories, if applicable.
 
- 
	
Test analysts should review any prior design documentation, such as Business System Reports (BSR), if applicable, to determine if the test was for a maintenance project. The document(s) are usually received from the development organization and referenced to create/update requirements. The document is stored in Documentum for Information Technology (DocIT), SharePoint, and/or other approved repositories.
 
- 
	
Analyzing requirements for testability is the cornerstone activity of test planning. Requirements should be well-formed and organized in the requirements repository where testing can be performed on an identified/predefined outcome or result. Additional information regarding analyzing requirements can be found in the Requirements Engineering Learning Center.
 - 
	
High-level requirements are usually provided via a UWR that is created within the Work Request Management System (WRMS). Organizations should follow the procedures for processing UWRs as outlined by the Business Planning and Requirements Management (BPRM) organization.
 - 
	
Reusable Program Level Requirements (RPLR) should be identified in conjunction with the respective Content Owner, in addition to the requirements that support system features and business needs. Any requirement and test-related artifacts should be captured in the ELM tools suite where trace relationships can be created to show the association among those artifacts.
 
- 
	
Most tests fall into the following test categories: unit, integration, functional, regression, and security. Additional information regarding test types is found in the Test Type Reference Guide.
 - 
	
The term “test types” describes tests based on the purpose or objective of the test. The types of tests generally performed in the IRS are discussed in IRM 2.127.2.2.1 of this IRM. For this guide, the phrase “type of test” has the same meaning as “test type”.
 - 
	
The Project Manager (PM) determines which test type(s) to perform and includes the list of those test type(s) in the project’s TSP.
 
- 
	
The purpose of this procedure is to outline the steps required to process a problem that is a deviation from the predetermined results. For this document, the term problem is used for incident, defect, or discrepancy. Problems should be reported during the earliest step of the Testing Lifecycle to ensure the earliest possible correction against requirements. Problems may be detected during the Planning phase, Preparation phase, or Execute and Document phase of testing. Examples may include problems found with design flow, non-adherence to documentation standards, system performance, code, etc.
- 
		
Create Test Project ID - The Test Project ID (a.k.a. Test ID) is a unique eight-character identifier used to track Problem Tickets for individual tests conducted by EST. Other organizations can define their own methods.
Position 1 - EST Branch
Position 2 - EST Section
Position 3 - The last digit of the production year
Position 4 - “A” for annual testing or “M” for mid-year testing
Position 5-8 - Unique project identifiers
If your project does not already have an established Test Project ID, access the Test Project ID Generator to create it. Once created, the Test Project ID needs to be entered into IRWorks ServiceNow the first time it is used to report problems. IRWorks ServiceNow will be referred to as IRWorks in the remainder of this document. A minimal sketch of the identifier format follows this list.
 - 
		
Create IRWorks Repository - Problem tickets will be reported in the authorized problem reporting repository, IRWorks. Projects may use other defect tools when prior approval has been granted by management. Additional information on Problem Reporting can be found in the Perform Planning and Execute and Document Test sections. Detailed instruction may be accessed at Problem Reporting Procedures in the EST Customer Corner SharePoint site.
 
 
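The sketch below illustrates how the eight-character Test Project ID format described above fits together. It is an illustration only; the branch, section, and project values are hypothetical placeholders, and the authoritative way to create an identifier remains the Test Project ID Generator referenced above.

```python
# Minimal sketch of the eight-character Test Project ID format described above.
# All values are hypothetical placeholders; use the Test Project ID Generator
# to create real identifiers.

def build_test_project_id(branch: str, section: str, year: int,
                          cycle: str, project: str) -> str:
    """Assemble a Test Project ID.

    branch  - one character for the EST Branch (position 1)
    section - one character for the EST Section (position 2)
    year    - production year; only the last digit is used (position 3)
    cycle   - "A" for annual testing or "M" for mid-year testing (position 4)
    project - four-character unique project identifier (positions 5-8)
    """
    if cycle not in ("A", "M"):
        raise ValueError("cycle must be 'A' (annual) or 'M' (mid-year)")
    if len(branch) != 1 or len(section) != 1 or len(project) != 4:
        raise ValueError("branch and section are 1 character; project is 4 characters")
    return f"{branch}{section}{year % 10}{cycle}{project}"

# Hypothetical example: branch "3", section "2", production year 2025,
# annual testing, project code "TX01" -> "325ATX01"
print(build_test_project_id("3", "2", 2025, "A", "TX01"))
```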
- 
	
The Business Unit’s respective manager, or in some cases the Program Management Office (PMO), will conduct a walkthrough meeting to approve the final testable requirements.
 
- 
	
During the planning phase, meetings are held to discuss the requirements, determine test types, identify constraints, etc. During these meetings, the stakeholders are informed as determinations are made based on the information gathered prior to and during the meeting.
 
- 
	
The Testing Schedule in the Integrated Master Schedule (IMS) is a mechanism used to aid in planning and monitoring each test through its life cycle. Microsoft Project is the standard software tool for planning test activities, and projects should use it to prepare the test schedule, if available. A license for downloading the software can be obtained by placing a request through IRS Service Central; search for Microsoft Project when placing the request. If the software is not available, Microsoft Excel can be used as an alternative tool.
 - 
	
Each project has steps and tasks that occur during the test. The Testing Schedule is used to document, schedule, and track activities performed during each test. The Testing Schedule provides the steps and significant tasks and contains the planned and actual dates noted below. An illustrative schedule-entry sketch follows this list.
- 
		
The four steps of the testing lifecycle (Planning, Preparation, Execution and Closeout) are designated as high-level tasks in the Testing Schedule.
 - 
		
Activities in the Testing Schedule are grouped chronologically and/or logically. The test steps identified above depict the minimum tasks required for a test.
 - 
		
Planned Start Date and Planned End Date can change throughout the lifecycle of the test. Actual Start Date and Actual End Date reflect the activity dates as they occur.
 
 
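To make the schedule structure above concrete, the sketch below lays out the four high-level lifecycle tasks with planned and actual dates. The task names come from this IRM; the dates and the use of a simple data structure are illustrative only, since the schedule itself is normally maintained in Microsoft Project (or Microsoft Excel as an alternative).

```python
# Illustrative Testing Schedule entries: the four high-level lifecycle tasks
# with planned dates (which may change) and actual dates (recorded as each
# activity occurs). All dates are placeholders.
from dataclasses import dataclass
from datetime import date
from typing import Optional

@dataclass
class ScheduleTask:
    name: str
    planned_start: date
    planned_end: date
    actual_start: Optional[date] = None  # filled in when the activity begins
    actual_end: Optional[date] = None    # filled in when the activity completes

schedule = [
    ScheduleTask("Perform Planning", date(2025, 1, 6), date(2025, 2, 7)),
    ScheduleTask("Perform Preparation", date(2025, 2, 10), date(2025, 3, 21)),
    ScheduleTask("Execute and Document Test", date(2025, 3, 24), date(2025, 5, 2)),
    ScheduleTask("Closeout Test", date(2025, 5, 5), date(2025, 5, 16)),
]

for task in schedule:
    print(f"{task.name}: planned {task.planned_start} through {task.planned_end}")
```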
- 
	
The Project Folder should contain a history of the specific test and is a useful source for auditing purposes. The folder can be used for future project planning, allocation of resources, and process improvement. All critical test documentation and work products should be included in the Project Folder.
 - 
	
The Project Folder is established in the project’s authorized repository in DocIT. However, electronic documentation such as requirements, test cases, etc. are stored in the ELM repository. The Project Folder(s) must be cross-referenced in the “Direct Link to DocIT Repository Location” column of the Automated Project Folder Checklist (PFC) template to expedite location and access. See Subsection IRM 2.127.2.2.2.1.4.2 (3) in this IRM for instructions on how to create a Project Folder Checklist.
 - 
	
The Project Folder may vary in volume depending on the size and/or complexity of the project. Information that cannot be incorporated into the project’s electronic repository, such as boxes of printed computer output files, interface tape(s), and/or file backups, should be stored using appropriate records management processes, and a reference should be included in the PFC indicating where the non-electronic documents can be located.
 - 
	
The Project Folder is a requirement for every test and must be retained in the designated repository for 365 calendar days beyond the scheduled program implementation date, or until there is no longer a business need. Beyond that date, Records Management retention procedures should be followed as outlined in Records Control Schedule (RCS).
 - 
	
Projects are required to use EWM to capture requirements and any other relevant test information provided by the Business Owner.
 
- 
	
Below are the processes followed to establish a test environment.
 
- 
	
IT products and requests for services are managed through WRMS via the UWR process. The process uses work requests to represent a contract between IT and the Business and Functional Operating Divisions (BODs/FODs). Organizations should follow the procedures for processing UWRs as outlined by the Business Planning and Requirements Management (BPRM) organization, and refer to IRM 2.22.1 for the processes associated with these requests.
 
- 
	
Confirm a testing environment is available and ready by coordinating with the following organizations: IT EOps (EOps_Central), which includes the Infrastructure Services Division (ISD), Enterprise Server Division (ESD), and Server & Support Services Division (SSSD); and IT Enterprise Services (ES) Support Services (ES S&S).
 - 
	
Determine the support resources and their assigned roles to ensure the environment is fully established and viable for testing. All participants are invited to any coordination meetings to ensure the accuracy of environment information and environment build-out scheduling.
 - 
	
Ensure the test environment emulates the production environment to the greatest degree possible. If the required test environment is in existence, verify that the environment is correct. Coordination of the testing environment will vary depending on the complexity of the changes, type(s) of testing required, practices of the customer, methodologies used to develop the system, and the impact on the existing system.
 - 
	
Identify the type of equipment, operating system (OS), software (SW), configuration tools, portal and all communications media used in the test environment.
 
- 
	
EFTU is a utility that moves data through the organization in a controlled, structured, and secured environment. EFTU expands the functionality initially provided by its predecessor system, Enterprise File Transfer Protocol Network Server (EFNS), to include:
- 
		
Improved authentication and authorization
 - 
		
Centralized control and logging of transfer activities
 - 
		
Greater automation for routine transfers
 - 
		
Increased monitoring and recovery features
 - 
		
Increased security features
 - 
		
Detailed audit trails for log files of all transfer attempts, within the IRS internal network
 
 - 
	
Detailed information on how to submit a request for EFTU support can be found in Section 15 in the Test Reference Guide (TRG).
 
- 
	
A Request for Computer Services (RCS) is the vehicle by which automation software and software training is formally requested. The requesting analyst gathers all record, file, and processing specifications that are needed for the development of an appropriate automation program. The analyst contacts EST Performance Assurance (PA) Branch Section 5 (PATB5) to assist in the submission of this request.
 - 
	
Detailed instructions on how to submit an RCS can be found in Section 11.2 of the TRG.
 
- 
	
The purpose of a PCLIA is to demonstrate that PMs, system owners, and developers have consciously incorporated privacy and civil liberties protections throughout the entire lifecycle of a system. This involves making certain that privacy and civil liberties protections are built into the system from the beginning when it is less costly and more effective.
 - 
	
The PCLIA process is an analysis of how information in an identifiable form is collected, stored, protected, shared, and managed. It also provides a means to assure compliance with all applicable laws and regulations governing taxpayer and employee privacy. The PCLIA preparer may be anyone designated by the System Owner (SO), including the SO, who completes the applicable form and works with the Privacy Analyst to identify and address any risks. The PM approves the PCLIA for submission to Privacy Review.
 - 
	
See PCLIA for more information.
 
- 
	
This section details the procedures for obtaining approval and documenting the use of SBU data sourced from Production in non-production environments. SBU Data Use Requests may be approved for a period of up to six (6) months unless the project spans at least three (3) years. For projects with a multi-year (three or more years) development plan or ongoing work, renewals may be requested on an annual basis. Below are the common forms used in determining whether the use of SBU data is valid.
 - 
	
Sensitive But Unclassified (SBU) Data Use Questionnaire - Form 14664 is designed to ensure the use of SBU data is necessary, to ensure analysis of whether synthetic or sanitized data can replace SBU data, and to determine whether an SBU Data Use Request is necessary.
 - 
	
The purposes of the Sensitive But Unclassified (SBU) Data Use Request - Form 14665 are to ensure thorough consideration is given to the privacy and security of SBU data prior to use in non-production environments, and that risk is adequately mitigated. It is also used to document the use of SBU data (or the replacement of SBU data with synthetic and/or sanitized data), to document the movement or replication of SBU data between environments, and to document the request to use SBU data. This gives the Information Owner (IO) and the Privacy, Compliance and Assurance (PCA) office the information necessary to determine whether to accept the risk of using SBU data, as applicable.
 - 
	
The purpose of SBU Data Use Recertification - Form 14659 is to document the continued use of SBU data (or replacement of SBU data with synthetic and/or sanitized data).
 - 
	
See IRM 10.5.8 for additional information regarding the process of using SBU data.
 
- 
	
Test teams must receive proper training on test processes and procedures, test environments, and test tools. Test tools may include the ELM Suite of Tools (ETM, ERM-DNG, EWM, etc.), and IRWorks. In addition, testers may need additional training on Test Schedule creation, submitting requests through EFTU and/or EWM, and requirements analysis.
 - 
	
PMs and/or test leads determine specific training needs and will notify respective team members as appropriate.
 
- 
	
Below are the processes followed to develop test artifacts.
 
- 
	
Prior to beginning your test, it would be helpful to review any documents used in prior tests (Lessons Learned, PIR documents, deferred/waived test cases, and any other documents that may be available). Identify any similarities in the test, key points, test data, test cases, etc. that can be used in the current test. Identifying these key points will help expedite the lifecycle deliverables going forward.
 
- 
	
Create Test Strategy and Plan (TSP)
- 
		
The Test Strategy and Plan (TSP) is a OneSDLC requirement for all delivery approaches. The TSP is a summarization of all testing for the project’s release and must be maintained in the organization’s authorized repository for audit purposes.
 - 
		
Instructions and a TSP Template are located in the Customer Corner on SharePoint. The purpose of the TSP template is to provide a standard artifact to summarize the complete test effort for the release. The TSP also allows the PM an opportunity to mitigate risks that may cause delays to project implementation.
 
 - 
	
Create Test Plan (TP)
- 
		
The Test Plan (TP) is a requirement for all testing. Each testing organization can create a TP for each test type or may combine their test types into one TP. The Test Plan containing the traceability of the project’s business/technical requirements to specific test cases can be used in lieu of Requirements Traceability Verification Matrix (RTVM). Detailed instruction on how to create RTVM may be accessed at RTVM Manual Process Instructions.
 - 
		
The purpose of the TP is to provide a standard artifact to summarize the complete test effort for the test type(s). The TP also allows the test manager an opportunity to mitigate risks that may cause delays to project implementation. Instructions are included in the TP Template and are located in the EST Customer Corner on SharePoint.
 - 
		
EST has approved that the Test Plan may now be created in the ELM environment. The ELM template contains the primary guidance for the required contents of the TP, and the instructions clarify how to create the corresponding materials within the ELM. Please check with your test manager to decide if the ELM TP is applicable to your project. The ELM Test Plan Instructions are located in the EST Customer Corner on SharePoint.
 
 - 
	
Create Automated Project Folder Checklist (PFC)
- 
		
The Automated PFC must be used to review the Project Folder contents for completeness and accuracy. The Automated PFC Template is located in the Customer Corner on SharePoint. The Automated PFC must be included in the Project Folder. When deficiencies are identified during the initial or subsequent reviews, follow-up reviews must be conducted to ensure corrective action was taken.
 - 
		
Each time a review is conducted, a copy of the revised checklist is uploaded in DocIT and added to the Project Folder. Each item on the checklist must have a Yes, No, or N/A in the appropriate box and a DocIT link, if applicable. For some items, specific information (such as location or target completion date) should be added to the comments section, as illustrated in the sketch after this list:
- 
			
Direct Link to DocIT Repository Location - Enter the corresponding DocIT link, if applicable. If the item cannot be stored electronically, mark the box as Yes and cross reference its location in the Comments section
 - 
			
Yes - Checklist item in question is represented in the folder. If item is not in the folder, cross reference its location in the Comments section
 - 
			
No - Checklist item in question is not represented in the folder, explanatory comment(s) is required
 - 
			
N/A - Checklist item in question is not represented in the folder, explanatory comment(s) are recommended
 
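The checklist review rules above can be summarized in a short validation sketch. This is a simplified illustration only; the field names and the example item are hypothetical, and the authoritative format is the Automated PFC Template in the Customer Corner on SharePoint.

```python
# Simplified sketch of the Automated PFC review rules described above: each item
# is marked Yes, No, or N/A; "No" requires an explanatory comment; "N/A" comments
# are recommended; and items that cannot be stored electronically are marked Yes
# with their location cross-referenced in the Comments section. Field names are
# illustrative, not the official template.

def review_pfc_item(status: str, comment: str = "", docit_link: str = "") -> list:
    """Return a list of findings for one checklist item."""
    findings = []
    if status not in ("Yes", "No", "N/A"):
        findings.append("Status must be Yes, No, or N/A")
    elif status == "No" and not comment:
        findings.append("An explanatory comment is required when the status is No")
    elif status == "N/A" and not comment:
        findings.append("An explanatory comment is recommended when the status is N/A")
    elif status == "Yes" and not docit_link and not comment:
        findings.append("Provide a DocIT link or cross-reference the item's location in Comments")
    return findings

# Hypothetical item that cannot be stored electronically: marked Yes with a
# location cross-reference in the Comments section.
print(review_pfc_item("Yes", comment="Printed output stored per records management process"))
```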
 - 
	
Conduct Peer Review
- 
		
Peer reviews are performed to ensure accuracy, completeness, and consistency throughout the testing lifecycle. Ideally, a test lead and/or test analyst should schedule and conduct a peer review of available test artifacts including TSP, TP, and PFC. Work product reviews are to be conducted as needed throughout the testing lifecycle. PMs have the authority to designate the work products that are formally reviewed for each project. Major steps involved in peer reviews include:
- 
			
Plan and coordinate review
 - 
			
Conduct review
 - 
			
Rework, if needed
 - 
			
Closeout review
 
 - 
		
Peer Review documentation should be kept in the project folder along with all other official documents. The Peer Review Procedures in the EST Customer Corner SharePoint site contains additional information regarding peer review.
 
 - 
	
Request Artifact Deferral, as applicable
- 
		
EST has the authority to defer EST artifacts upon acceptance of the justification for the deferral provided by a project. Approval is done on a ‘case by case’ basis.
 - 
		
An artifact deferral is required when the Project Management Plan (PMP) has been approved with the standard documentation tailoring code for the EST Artifact(s). In addition, during the project development process, the PM or designee may determine that the standard EST Artifact(s) will be produced in a different milestone within the release. Projects must complete a separate form for each artifact being deferred. To obtain an approved deferral, a project must contact EST and submit the required form with a valid justification.
 - 
		
The Test Artifact Deferral in the EST Customer Corner SharePoint site contains detailed instructions to complete the request.
 
 
- 
	
Common activities that should be performed in the planning phase are listed on the preceding pages to ensure successful test planning. Communication with the business and development organizations to understand the system requirements, and coordination with the appropriate organizations, such as EOps or AD, to establish the test environment, are essential to successful test planning. Testing tools that will be used throughout the test will need to be acquired before testing can begin. The test schedule should be determined. The Project Folder, based on the Automated Project Folder Checklist template, should be established. The Problem Reporting Procedures should be established and reviewed for accuracy before proceeding to the next step of the testing lifecycle.
 - 
	
In the next step of the testing lifecycle, we will identify the activities necessary to ensure successful test preparation.
 
- 
	
With the start of the initial activities and the agreement on the project scope, we are ready to proceed to the Perform Preparation phase. In this phase the major activities are as follows:
- 
		
Verify Test Environment
 - 
		
Review Documentation
 - 
		
Prepare Test Cases/Scripts/Data
 - 
		
Conduct Test Readiness Review (TRR), as applicable
 
 - 
	
While some high-level preparation and one-time activities in the Perform Planning step should be conducted in the Readiness state of OneSDLC prior to the Readiness Exit Review, such as setting up testing tools and environments and assessing high-level project scope and requirements, other activities can occur during the Readiness state or during the Product Planning phase of the Execution state, depending on the project's Delivery Approach. For example, because detailed requirements might be confirmed at the Readiness Exit Review for a Semi-annual project, test cases, scripts, and data could be prepared prior to the Readiness Exit Review within the OneSDLC model. However, detailed requirements might not be available for review until the Product Planning (including Mid-range Planning and/or Iteration Planning) phase of each cycle of Execution in an Agile or Frequent project. In that case, test cases, scripts, and data cannot be prepared until the Product Planning phase of the Execution state and might need to be prepared again prior to each cycle of execution. OneSDLC is described in more detail in IRM 2.127.2.2.4 of this IRM.
 
- 
	
The test environment should be functional and simulate the production environment. Use a set of verification tests to ensure the environment hardware, database management systems, other system-level software, and intra-suite and inter-suite communications are present and functioning. Together, the verification tests demonstrate that each of the systems within a test suite permits the application software to run and that data can be exchanged between and within the test suites.
 - 
	
Review the plan, environment requirements, Development, Integration and Testing Environment (DITE) Service Level Agreement (SLA), EOps Organizational Level Agreement (OLA), and support requirements. Ensure the test environment can support integration testing of each of the test items. Review the DITE SLA and EOps OLA to ensure service levels are met and products and services are obtained. Audit the test environment to make sure it contains what is needed. Identify and correct, if possible, any discrepancies between what is loaded into the test environment and what the test requires.
 - 
	
The description and requirements for the test environment and data are updated in the ELM. A generic environment verification sketch is shown below.
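As a generic illustration of the kind of verification run described above, the sketch below checks that a few assumed environment dependencies (host connectivity and required test data paths) are present before execution begins. The host names, ports, and paths are hypothetical placeholders; the actual verification tests depend on the project's environment and the applicable DITE SLA and EOps OLA.

```python
# Generic environment smoke-check sketch: confirm assumed hosts respond and
# required test data files exist before test execution begins. Host names,
# ports, and paths are hypothetical placeholders for a real project's environment.
import os
import socket

REQUIRED_HOSTS = [("test-db.example.invalid", 1521), ("test-app.example.invalid", 443)]
REQUIRED_PATHS = ["/testdata/cycle01/input.dat", "/testdata/cycle01/reference.dat"]

def host_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within the timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def verify_environment() -> list:
    """Collect any problems found with the assumed environment dependencies."""
    problems = []
    for host, port in REQUIRED_HOSTS:
        if not host_reachable(host, port):
            problems.append(f"Cannot reach {host}:{port}")
    for path in REQUIRED_PATHS:
        if not os.path.exists(path):
            problems.append(f"Missing test data file: {path}")
    return problems

if __name__ == "__main__":
    issues = verify_environment()
    print("Environment checks passed" if not issues else "\n".join(issues))
```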
 
- 
	
When preparing for testing, the tester will need to verify the type of equipment needed to perform the test, such as hardware, software, etc., to mimic a production environment. This includes assessing the readiness of the equipment (e.g., sufficient tapes and disk space), making sure the latest versions of system software have been installed, and ensuring operational staff has been identified and is in place. Perform verification runs that include acceptance testing of any test tools. Changes to a test tool after it has been used will require review of previous test results obtained through use of that tool and may require retesting. A description of the specific equipment should be included in the design documents.
 
- 
	
The test plan lists specific items needed to perform the test. It is the tester’s responsibility to verify that all items listed are available. Some of those items might include Job Control Language (JCL)/ Executive Control Language (ECL), test tools, and test data files.
 
- 
	
Executive control language will need to be modified to fit the specific test. Examples that may need to be updated include JCL, ECL, JavaScript, Active Server Pages, etc.
 
- 
	
When applicable, the test analyst should create interface database agreements with other EST test teams and projects. An Interface Database Agreement assists in tracking the transfer and coordination of the following:
- 
		
Data required to perform interface testing
 - 
		
Interface media by creating formal agreements within and between the EST branches
 
 - 
	
This procedure currently applies only to EST. For more detailed information regarding the creation of an interface database agreement, see Section 9 in the TRG.
 
- 
	
Review documentation as described below.
 
- 
	
The developer is responsible for submitting the functional documentation pertaining to the requirements that determine how the software functions. Documentation is submitted to the ELM Requirements Repository for review.
 - 
	
The primary documents to be reviewed are:
- 
		
Functional: Configuration Management Plan (CMP), Design Specification Report (DSR), Enterprise Organizational Readiness (EOR) Service Package, or equivalent functional design documents, for software supplied by IRS modernization projects
 
- 
		
Requirements: Business Systems Report (BSR), Statement of Work (SOW), Unified Work Request (UWR), or equivalent requirements documents, for software supplied by IRS modernization projects
 
- 
		
Operational: Computer Operator Handbooks (COHs), User Documentation and Training Materials, or equivalent computer operations documents, defining the JCL, ECL, Core Record Layouts (CRLs), etc.
 
 - 
	
The documents are stored in the approved ELM Requirements Repository and/or DocIT. If discrepancies are encountered in any of the documents, a Priority 3 IRWorks ticket should be created. Please see IRM 2.127.2.2.2.1.1.6, Establish Incident/Problem Reporting, in this IRM for information on creating a ticket.
 
- 
	
Requirements should be baselined during the Perform Preparation step and before test execution begins. Baselines are created at various points in time, based on project needs. The project team and external stakeholder groups commit to the agreed set of allocated requirements for the given effort as a baseline for the next increment of work. Testers should be included in commitments to allocations and use baselined information to conduct testing. More information concerning Requirements Baselines can be found on the Requirements Engineering Learning Center site.
 - 
	
As outlined by Requirements Engineering and Program Office (REPO), IT testing organizations should follow the procedures to determine costing and level of effort (LOE) to test requirements and complete the requirements analysis accordingly.
 - 
	
All high-level requirements provided via a UWR should be at a testable level by the end of the preparation step. The testable requirements should be documented in ELM as Requirement Collections and Items in DNG, and/or as User Stories in EWM.
 
- 
	
All internal documents received and created during the planning stage should be reviewed prior to executing the test. Examples of these internal documents include the TP, Interface Database Agreement, Test Schedule, PFC, and all other relevant documents, as applicable.
 
- 
	
All documents received that are relevant to the test should be posted in the authorized repository. In addition, any documents that require managerial signature should be submitted electronically to all stakeholders at least 30 days prior to the planned test start date or as mutually agreed upon by the key stakeholders involved.
 
- 
	
At this point in the testing process, you have reviewed all related design documentation and internal documents related to your test. If you discover any discrepancies in those documents, an incident report ticket should be opened in IRWorks. Tickets opened should reflect Priority 3 status for a documentation error. The Problem Reporting Procedure contains information regarding the prioritization of problems in IRWorks. Developers should report any defects they find in their repository of record, e.g. ELM EWM.
 
- 
	
Prepare test cases, test scripts and test data as described below.
 
- 
	
External documentation (previous test cases and/or scripts including any that had been deferred or waived) regarding prior testing should be reviewed to assist in creating/modifying test cases/scripts and/or data for the current test.
 
- 
	
Test cases are created/modified to specify and define the test condition(s) to be tested and to verify that system functionality meets customer requirements. Test cases can also be used to verify the negative expected functions to ensure that the system continues to perform as expected should an error be encountered during processing.
 - 
	
Each test case should reflect the fields identified in ELM.
 - 
	
Test the requirement as identified in the requirements repository to reference specific test data and its expected results associated with specific program criteria and/or UWR requirements. Match the documented expected output results to pass the test. Below are the minimum fields that the test analyst must include in a test case; an illustrative sketch follows this list:
- 
		
Description – Defines the test condition for the requirement being tested
 - 
		
Development Items – View the Configuration Management item that could be associated with a test case
 - 
		
Expected Results – Document what results must be achieved before you can consider the test case successful
 - 
		
Pre-Condition – Defines the terms that must occur before you can begin executing the test case
 - 
		
Requirement Links – View the requirements that are in the ELM
 
 
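To make the minimum test case content above concrete, the sketch below represents a single test case with those fields. The field values and the use of a simple data structure are illustrative only; in practice these fields are captured directly in ETM within the ELM suite.

```python
# Illustrative sketch of the minimum test case content listed above. The
# identifiers and values are hypothetical; real test cases are recorded in
# ETM within the ELM suite.
from dataclasses import dataclass, field

@dataclass
class TestCase:
    description: str       # test condition for the requirement being tested
    expected_results: str  # results that must be achieved for the case to pass
    pre_condition: str     # what must be in place before execution can begin
    requirement_links: list = field(default_factory=list)  # requirements in the ELM
    development_items: list = field(default_factory=list)  # associated Configuration Management items

example_case = TestCase(
    description="Verify that a submission with an invalid filing status is rejected",
    expected_results="The submission is rejected and the documented error code is returned",
    pre_condition="Test environment is loaded with the current cycle's reference data",
    requirement_links=["REQ-0420"],   # hypothetical requirement identifier
    development_items=["CM-1187"],    # hypothetical Configuration Management item
)
print(example_case.description)
```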
- 
	
Once test cases have been created/modified, test scripts should be created/modified as applicable and linked to the associated test case. A test script in software testing is a set of instructions that will be performed on the system under test to ensure that the system functions as expected. Each test script should reflect the fields identified in the ELM.
 - 
	
Below is the minimum information that the test analyst must include in the development of test scripts:
- 
		
Description
 - 
		
Expected Results
 - 
		
Attachments/Links
 
 
- 
	
Test data should be mapped to test cases by identifying the field, dataset, file, etc. in one of the mandatory test case fields within ETM. Data is created or modified and used to verify that all conditions are met. Prepare new data or modify existing data for valid and invalid test conditions. The data used should be analyzed for quality purposes to verify whether the programs will or will not produce the expected results. In addition, the test data used should be consistent with the requirements in the repository and should emulate the inputs as required by the modules, programs, applications, or systems. If live data is used, prepare a live data waiver and ensure all testers have completed live data training. Refer to IRM 2.127.2.2.2.1.2.6 (5) in this IRM for details. A minimal mapping sketch is shown below.
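A minimal sketch of mapping prepared data to test cases, covering both valid and invalid conditions, is shown below. The test case identifiers, file names, and conditions are hypothetical; the actual mapping is recorded in the mandatory test case fields within ETM.

```python
# Minimal sketch of mapping prepared test data to test cases for valid and
# invalid conditions. Identifiers and file names are hypothetical placeholders;
# the real mapping is captured in the mandatory test case fields in ETM.
test_data_map = [
    {"test_case": "TC-001", "dataset": "/testdata/valid_returns.dat", "condition": "valid"},
    {"test_case": "TC-002", "dataset": "/testdata/invalid_returns.dat", "condition": "invalid"},
]

for entry in test_data_map:
    print(f'{entry["test_case"]}: {entry["condition"]} data from {entry["dataset"]}')
```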
 
- 
	
Test status should be reported to management regarding test cases, test scripts, and test data to be created, and any mitigating circumstances and/or constraints identified. Jazz Reporting Service (JRS) will generate a report for the Test Plan/Test Project Work Item (TPWI).
 
- 
	
Peer reviews are conducted as needed throughout the testing lifecycle. PMs have the authority to designate the work products that are formally reviewed for each project. At this point in the test, a peer review should be conducted, as applicable. Peer reviews should cover the relevant documents that will be used for testing and any other documentation developed up to this point. The RTVM and Test Schedule should also be updated.
 - 
	
The Peer Review Procedure in EST Customer Corner SharePoint site contains detailed instructions.
 
- 
	
A Test Readiness Review (TRR) is a milestone signifying the transition of the product from the development organization to the testing organization. It can be held within the week prior to program delivery or as mutually agreed upon by the key stakeholders involved. It is the responsibility of the test organization to ensure that all stakeholders agree that requirements, design, software, environment, and work products are at a level of readiness to commence testing.
 - 
	
A TRR is a formal review by the test organization and is conducted in partnership with development, business customer, and any other stakeholders that may be involved in the test effort. Other stakeholders may include operations, networks, and infrastructure. Collectively, the group determines the readiness of the system to proceed with testing. The decision to proceed with testing is made based on a successful TRR outcome. If the TRR is not successful, follow-up meetings should be conducted to address all outstanding items and the revised dates should be updated in the test plan.
Note:
Formal TRR is not required under OneSDLC. The PM decides whether a TRR is necessary. It is recommended for the standing up of any new program or project in its initial release, or for projects following the semi-annual delivery path with a relatively long planning horizon. For projects following the Agile or Frequent delivery paths, regular Integrated Project Team (IPT) meetings and/or Daily Standup calls can be used as tools to achieve the objectives of a TRR, while promoting frequent communication and rapid resolution of any project issues impacting testing.
 
- 
	
Participants in the TRR include:
- 
		
Programmers
 - 
		
Test Analysts
 - 
		
Customers
 - 
		
Project Managers
 - 
		
Requirements Managers
 - 
		
Database/System Administrators
 - 
		
System Architects
 - 
		
Configuration Managers
 - 
		
External Organizations, Partners, and Stakeholders, as appropriate
 
 - 
	
The test manager identifies the appropriate points of contact required for the TRR meeting.
 
- 
	
The test manager or designee prepares the checklist to identify critical items and insert questions for project-specific concerns. The following documentation should be received by the test team prior to the TRR, including but not limited to:
- 
		
Inventory of documentation delivered
 - 
		
List of all known problems
 - 
		
Updated/Signed Test Plan
 - 
		
Design and Technical Documentation
 - 
		
Any updates to the testing schedule
 
 - 
	
TRR participants are to review all documentation prior to the TRR meeting.
 
- 
	
The main activities in a TRR meeting include:
- 
		
Record attendance
 - 
		
Ensure all open questions on the TRR Checklist are addressed by participants
 - 
		
Update the TRR Checklist and document any action items and/or recommendations
 - 
		
Document test readiness results in the TRR Checklist/Meeting Minutes, as appropriate
 - 
		
Decide whether or not the system is ready to proceed with testing
 - 
		
Distribute the updated TRR Checklist/Meeting Minutes to all TRR Meeting participants
 
 - 
	
If concurrence on test readiness is not obtained from all TRR participants or there is a TRR Checklist item that is critical for testing and not satisfied, the Test Manager schedules a follow-up TRR meeting(s) to review resolution of outstanding issues. Activity steps are repeated until all items on the TRR checklist critical for testing are satisfied.
 - 
	
Once concurrence is reached and all TRR Checklist items that are critical for testing are satisfied, the Test Manager will prepare a signed TRR Findings Memorandum – System is Ready and issue it via email within five workdays to all participants.
 - 
	
The Test Readiness Review in the EST Customer Corner SharePoint site contains detailed instructions on conducting the review.
 
- 
	
At the end of this phase, the activities started in the Perform Planning phase and continued through the Perform Preparation phase should be sufficiently complete for the next step in the testing lifecycle to begin. The test team has been selected. The Project Folder has been created. The requirements have been analyzed and dispositioned to a testable level. Test data, test cases, and test scripts have been developed. Problem reporting procedures have been established and the TRR meeting has been held. The test environment is ready.
 - 
	
In the next phase of the testing lifecycle, we will identify the activities necessary to ensure successful test execution and documentation.
 
- 
	
During Test Execution, the test cases and/or test scripts generated during the Perform Preparation phase are executed, using the test data created, to perform verification and/or validation against the system and confirm whether the product fulfills the customer’s requirements.
 - 
	
The Execute and Document Test procedure covers the activities listed below, which are described in more detail in the following sections:
- 
		
Execute Test Cases/Scripts
 - 
		
Document Results
 - 
		
Report Test Status
 
 - 
	
Activities in the Execute and Document Test phase are generally performed in the Execution State within the OneSDLC model, prior to the Product Review stage. These activities may be repeated in each cycle of execution for an Agile or Frequent project. See IRM 2.127.2.2.4.2 for more details.
 
- 
	
Execute test cases and test scripts as described below.
 
- 
	
Development sends the program/code to the environment and verifies that it has been successfully transmitted. A transmittal memo or a program/code deployment notification is then sent to notify the test team.
 
- 
	
After the transmittal is submitted, the test team should verify that the transmittal has been received. Verification is done through the consumer (e.g., EOps, EST analyst), who in turn installs/loads the code onto the designated computer.
 
- 
	
Perform the following steps to review input data:
- 
		
Examine the test cases to ensure all test cases have necessary input data
 - 
		
Review the input test data to ensure accuracy
 
 
- 
	
Test Execution should begin by validating that the test environment is ready before going forward with test execution activities. In some cases, a Pre-Test may be executed to ensure that the system is working properly before proceeding with extensive test processing. Once it has been verified that the test system is ready, begin processing the test cases developed and finalized for the test during the Perform Preparation step.
 - 
	
Process the test cases using the appropriate test tool (e.g., the ELM, LoadRunner), if applicable, and review the expected results and the output received to determine whether the test case will pass or fail.
 - 
	
Disposition all test cases by the end of the test and verify that the test cases were complete and consistent with the requirements. All test cases should be dispositioned with one of the following designations (an illustrative tracking sketch follows the list):
- 
		
Passed
 - 
		
Failed
 - 
		
Deferred
 - 
		
Waived
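The four designations above lend themselves to a simple completeness check before closeout. The following is a minimal, illustrative Python sketch only; the test case IDs are hypothetical and this is not part of the ELM or any EST tool.

```python
from enum import Enum

class Disposition(Enum):
    PASSED = "Passed"
    FAILED = "Failed"
    DEFERRED = "Deferred"
    WAIVED = "Waived"

# Hypothetical end-of-test check: every test case must carry exactly one
# of the four dispositions before the test can be closed out.
test_cases = {
    "TC-001": Disposition.PASSED,
    "TC-002": Disposition.FAILED,
    "TC-003": None,  # still open; blocks closeout
}

undispositioned = [tc for tc, d in test_cases.items() if d is None]
if undispositioned:
    print("Cannot close the test; undispositioned cases:", undispositioned)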
 
 
- 
	
Document results as described below.
 
- 
	
All test results should be documented in the designated authorized repository (DocIT and/or ELM). The Records Retention guidelines in IRM 2.127.2.2.2.4.4.3 should be followed.
 
- 
	
After completing a test, open a test incident ticket in IRWorks for any defects identified.
 - 
	
If expected results are not met:
- 
		
Research the problem
 - 
		
Define the problem
 - 
		
Prioritize the problem
 - 
		
Open a ticket for the problem
 
 - 
	
Once a correction has been retransmitted, verify that it addresses the reported problem. Close the Problem Ticket only if the solution corrects the original problem and is verified by regression testing, or the resolution satisfactorily explains why the problem occurs. Otherwise, reopen the Problem ticket.
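The ticket workflow described above (open, verify the retransmitted correction, then close or reopen) can be summarized in a small sketch. This is illustrative only and does not reflect the actual IRWorks interface, states, or fields, which are assumptions here.

```python
# Illustrative defect-ticket workflow; the states and method names are hypothetical.
class Ticket:
    def __init__(self, problem: str):
        self.problem = problem
        self.status = "Open"

    def verify_correction(self, regression_passed: bool, resolution_explained: bool = False) -> None:
        # Close only if regression testing verifies the fix, or the resolution
        # satisfactorily explains why the problem occurs; otherwise reopen.
        if regression_passed or resolution_explained:
            self.status = "Closed"
        else:
            self.status = "Reopened"

ticket = Ticket("Expected results not met for TC-002")
ticket.verify_correction(regression_passed=False)
print(ticket.status)  # Reopened -> retransmit the correction and verify again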
 
- 
	
Test Case Deferral and Waiver Requests are initiated within the test team and are approved by the PM, to request exemption from executing test cases planned for the test step. This procedure applies to all IT projects in which testing is performed. This procedure may not be used to amend requirements. Addition, modification, or deletion of requirements must follow the IT Change Request (CR) process.
 - 
	
A deferral requests that the verification of a test case be moved to another test release. A waiver requests that the obligation to execute a test case be eliminated, thereby departing from the course established in the test plan. The associated requirement must have been either approved for deletion or verified in another test case.
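The waiver rule above reduces to a single condition, shown in this illustrative sketch; the parameter names are hypothetical and are not the actual Deferral/Waiver form fields.

```python
# Illustrative check: a waiver is only valid when the associated requirement
# was approved for deletion or has been verified in another test case.
def waiver_is_valid(requirement_deleted: bool, verified_in_other_test_case: bool) -> bool:
    return requirement_deleted or verified_in_other_test_case

print(waiver_is_valid(requirement_deleted=False, verified_in_other_test_case=True))  # True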
 - 
	
The Test Case Deferral and Waiver Procedure in the EST Customer Corner SharePoint site contains information on preparing a request.
 
- 
	
The ELM suite of tools ensures that traceability is developed to confirm that all requirements can be mapped to specific business processes, systems development documents, test cases, and test results within the ELM. The ELM should be updated to show the final disposition of all test cases created to satisfy each of the baselined requirements. Once all test cases have been processed and dispositioned, as appropriate, and the end of the test has been reached, a final report that includes the results of the test can be issued.
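As an illustration of the traceability idea, the sketch below flags requirements with no linked test case. The ELM performs this mapping within the tool; the requirement and test case IDs here are hypothetical and this code is not part of the ELM.

```python
# Minimal traceability sketch (illustrative only).
requirements = {"REQ-1", "REQ-2", "REQ-3"}
test_case_links = {
    "TC-001": {"REQ-1"},
    "TC-002": {"REQ-2", "REQ-3"},
}

covered = set().union(*test_case_links.values())
uncovered = requirements - covered
print("Requirements without a linked test case:", sorted(uncovered))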
 - 
	
Other work products that may need to be updated are the PFC, Test Schedule, etc.
 
- 
	
Report test status as described below.
 
- 
	
Peer reviews continue to be conducted as needed throughout the testing lifecycle. PMs have the authority to designate the work products that are formally reviewed for each project.
 - 
	
Test cases should be refined and/or updated based on peer reviews when necessary and incorporated into the test plan. The ELM and the Test Schedule should be updated to reflect completion of the test cases.
 - 
	
The Peer Review Procedure in the EST Customer Corner SharePoint site contains detailed instructions on conducting the review.
 
- 
	
EOTR and EOTCR provide information regarding the test results to allow customers and developers the ability to assess the quality of the product. In describing the function of an end of test report here, there is no difference between the EOTCR and the EOTR. Both can be used to describe the outcome of the test. However, the EOTR is used to document the results of a specific test type while the EOTCR is used to document all testing efforts for a release. The general information required is essentially the same for both. The EOTR should address any issues that were highlighted in the Test Plan and any issues identified after the initial Test Plan was approved and issued. Likewise, the EOTCR should address any issues that were highlighted in the TSP and any issues identified after the initial TSP was approved and issued.
- 
		
Create End of Test Report
The EOTR addresses:
- 
			
Scope of the test
 - 
			
Test assumptions, constraints, and risks
 - 
			
Test results (Pass/Fail/Waive/Defer)
 - 
			
Requirements traceability (e.g. requirements linked to test cases)
 - 
			
Test schedule
 - 
			
Problem tickets and dispositions
 
A Template for the EOTR is found in the EST Customer Corner SharePoint site. A link to the template can also be found in the References section of this IRM.
EST has approved that the EOTR may now be created in the ELM. The ELM template contains the primary guidance for the required contents of the EOTR, and the instructions clarify how to create the corresponding materials within the ELM. Please check with your test manager to decide if the ELM EOTR is applicable to your project. The ELM End of Test Report Instruction in the EST Customer Corner SharePoint site contains detailed instructions.
 - 
		
Create End of Test Completion Report
The EOTCR is a OneSDLC requirement for all delivery approaches. The EOTCR is a summarization of all testing for the project’s release and must be maintained in the organization’s authorized repository for audit purposes.
The EOTCR also allows the PM an opportunity to mitigate risks that may cause delays to project implementation.
A Template for the EOTCR is found in the EST Customer Corner SharePoint site. The purpose of the EOTCR template is to provide a standard artifact to summarize the complete test effort for the release. A link to the template can also be found in the References section of this IRM.
Note:
If the PM decides to keep the TSP open for updates during test execution and to provide final approval at the end of testing (e.g., for an Agile or Frequent project), an EOTCR is not required. Otherwise, it is needed to document and/or confirm any deviations from the initial approved and issued plan.
 
 
- 
	
At the end of this phase, testing is completed. The ELM has been updated to reflect test cases have been processed and dispositioned. The End of Test Report has been created and the status of the test has been reported.
 - 
	
In the next step of the testing lifecycle, we will identify the activities necessary to ensure successful closure of testing activities.
 
- 
	
The Closeout phase signals the end of the test. This phase includes issuing the final test report (EOTR or EOTCR), conducting lessons learned, finalizing the Project Folder, dispositioning any outstanding issues, and preparing for the PIR.
 - 
	
The Closeout phase covers the activities listed below, which are described in more detail in the following sections:
- 
		
Finalize Test Artifacts
 - 
		
Issue End of Test Reports
 - 
		
Conduct Closeout Meetings
 - 
		
Finalize Project Folder
 
 - 
	
In general, activities of the Closeout Test step are conducted in the Product Review stage of the Execution State within the OneSDLC model. Some of these activities may need to be repeated for each cycle of execution during Mid-range Reviews and/or Iteration Reviews prior to the final Product Review. See IRM 2.127.2.2.4.2 for more details.
 
- 
	
Perform the following activities to finalize test artifacts.
 
- 
	
The Automated PFC should be completed with any outstanding documents/results. Using the completed PFC, update the authorized repository/project folder with all final results. The Automated PFC is then stored in DocIT.
 
- 
	
Disposition all outstanding issues and update the Test Schedule.
 
- 
	
Any issues that have not been completed or closed should be worked to a final disposition. Upon final resolution of any open issues that involved one of the systems development stakeholders (EOps, UNS, or AD), document the resolution and any outstanding issues in the summary section of the EOTR.
 
- 
	
Perform the following activities to issue end of test reports.
 
- 
	
The EOTR includes:
- 
		
A summary of the scope of the release including test type executed and dates of the test
 - 
		
A summary of all risk factors, issues or areas of concern, open defect tickets/reports, and/or test constraints and/or variances from the Test Plan
 - 
		
A clear statement of the overall testing assessment along with recommendations for proceeding to the next step of testing or release
 
 - 
	
An EOTCR should also be prepared as a summarization of all testing for the project’s release.
 
- 
	
Once the EOTR/EOTCR is complete, it must be submitted to the PM and test manager for approval and concurrence. The signature page can be tailored to add additional signatures, but must include, at a minimum, the Preparer, Test Type Manager, and Project/Release Manager for sign-off.
 
- 
	
The test type manager will approve the EOTR/EOTCR within five days after completion of the test(s), or as mutually agreed upon by the key stakeholders involved. After an EOTR/EOTCR is approved and posted to the authorized repository, it can be submitted electronically to all stakeholders.
 - 
	
A revised EOTR/EOTCR should be prepared if a project has reached its test completion date and the EOTR/EOTCR has been approved, but a significant number of Incident/Problem Tickets have been identified. A revised report will be issued upon completion of the retest.
 
- 
	
Closeout meetings are held to ensure that all requirements have been satisfied during testing, lessons learned are documented, and teams are ready for the PIR.
 
- 
	
A Lessons Learned (LL) document describes the knowledge and wisdom acquired from actual events or work practices. When projects do not go as planned, the reasons should be documented so that other projects can benefit from those experiences. If a project’s revised process or work product is an improvement over the standard, that process or product should be captured for future use.
 - 
	
Lessons Learned is the process of documenting events that have caused a success or shortfall on a project. The successes are repeatable processes that are used to aid and assist current and future projects. The shortfalls are documented so that current and future projects can learn from them and avoid repeating them.
 - 
	
Lessons Learned are a critical part of the testing process and a mechanism for:
- 
		
Discovering opportunities for improvement,
 - 
		
Identifying both strengths and weaknesses,
 - 
		
Communicating acquired knowledge more effectively, and
 - 
		
Ensuring beneficial information is factored into planning, work processes and current and future activities of the testing life cycle.
 
 - 
	
Furthermore, documenting lessons learned is a principal component of IRS’ culture committed to continuous process improvement.
 - 
	
For more detailed information regarding the completion of a Lessons Learned Report, refer to the IPE Lessons Learned Standard Procedure, the OneSDLC Lessons Learned Template Guidance, and the OneSDLC Lessons Learned Template on SharePoint.
 
- 
	
Post-Implementation Review (PIR) is a formal end-of-test meeting held with developers, customers, and EST representatives to review the test and identify process improvement areas, assess the test completion status, and document success factors. A PIR meeting will be held within 60 days following the completion of each test to allow sufficient time to assess the status of Production start-up.
 - 
	
Items to be discussed with customers, developers and EST Management include, but are not limited to:
- 
		
Deliverables (i.e., UWR, FSP/PRP/DSP, programs, design documents, etc.)
 - 
		
Testing issues
 - 
		
Communication
 - 
		
Lessons Learned
 - 
		
Upcoming tests
 
 - 
	
An open forum will be provided to discuss any other concerns or issues.
 - 
	
Two weeks prior to the scheduled PIR meeting, develop an agenda and checklist for the planned PIR meeting. The checklist identifies the items to be reviewed during the meeting. The initiator will then send an agenda and discussion sheet to all participants.
 - 
	
Perform the following PIR preparation activities:
- 
		
Distribute test results, test logs, and other materials for review by the meeting participants
 - 
		
Assign responsibilities for presenting testing materials at the meeting
 - 
		
Assign responsibilities for collecting meeting results
 - 
		
Perform other preparation activities as necessary to enable efficient conduct of the PIR meeting
 
 - 
	
Conduct the PIR meeting in accordance with the agenda/checklist developed prior to the meeting. Indicate which requirements were verified based on the requirements traceability matrices within each test case. During the meeting, any problems encountered by a team are reviewed for the purpose of identifying problems/changes missed during test execution. Other problems are collected and documented as lessons learned for the future.
 - 
	
The PIR Discussion Sheet will be used to document any findings and will be placed in the project folder. The PIR Agenda and Discussion Sheet Template is located under the Support Documents folder in the EST Customer Corner SharePoint site.
 - 
	
After all meetings have been conducted and documented, those documents should be posted to the authorized repository, and then submitted electronically to all stakeholders.
 - 
	
For more detailed information regarding the completion of a PIR Report, refer to IPE Post Implementation Reviews on SharePoint.
 
- 
	
Once the test is complete, the contents of the Project Folder should be verified for completeness and accuracy and must be approved by the appropriate management entity. The Project Folder should not contain any obsolete test data or other test materials. Any obsolete test data should be disposed of in accordance with the Records Retention policy.
 
- 
	
All completed critical test related documents must be placed in the appropriate project folder, either within the ELM or in DocIT. A copy of any Live Data used for the test should be stored in the ELM only. The ELM area where Live Data is stored should have access controls in place so that only authorized users can access it. Once testing is complete, go back to the ELM project folder and delete the live data. You must then contact the Rational Infrastructure Section (RIS) through an IRWorks ticket and request that the trash be deleted from the ELM.
 
- 
	
An Automated PFC has been created to use as a guide to determine what the contents of the Project Folder should be. The Automated PFC is merely a guide. Any test documents deemed important enough to be stored in the Project Folder by the PM, but not listed in the Automated PFC, can be annotated in the miscellaneous section of the checklist, under the appropriate documentation category.
 - 
	
The Automated PFC is a requirement for every test and must be stored in the Project’s authorized repository. The Automated PFC should be reviewed and approved by the PM, Section Chief, or Team member prior to the official close of the test. It must be retained for one year after the completion of the test.
 
- 
	
All final contents of the Project Folder should be retained and stored in accordance with IRS record keeping procedures. Specific procedures for records in conjunction with SAT testing can be found in Records Control Schedules Document 12990, Item 17.
 
- 
	
After all closeout activities are completed, the testing lifecycle of the project is completed. The PM should check with their OneSDLC Rep to schedule the Product Review (or Mid-range/Iteration Review and Retrospective) and ensure that all OneSDLC requirements for the project (or a cycle/iteration of the project) have been met. All project test work products and approved test deliverables should be placed in the authorized project repository. Any obsolete data should be deleted from the ELM.
 
- 
	
This section defines the roles and responsibilities. The responsibilities for each role may vary based upon project structure and development methodology.
 
- 
	
Roles involved in the IT Testing process are shown in the below table.
| Role | Definition of Responsibility |
|---|---|
| Business Lead | Create, communicate, coordinate, and interpret the business requirements; approve various artifacts |
| Requirements Manager | Ensure requirements are maintained in the authorized repository (thereby available for testing); ensure traceability to other artifacts such as business rules (calculations, business decisions, information routing, etc.) that need to get tested; facilitate validation of requirements with the business (so the "right" requirements are there in the first place) - see Requirements Engineering and Program Office (REPO) Guidance |
| Developer | Coordinate identified issues/problems/defects with other testing or project stakeholders or provide a workaround; document all coding; participate in peer reviews of coding and documentation; deliver working application; conduct testing such as Unit/Dev/Developer Integration Testing (DIT) on the created/changed code; notify Project Manager (PM) of testing status; provide appropriate artifacts to the next phase of testing/deployment; create, update, and maintain appropriate artifacts for testing phases |
| Project Manager | Ensure team understanding of the business requirements; develop high level strategies needed to support the development life cycle; ensure that Verification and Validation methods are planned, documented, and performed; ensure process activities are performed timely; ensure coordination activities are held; ensure issues/problems/defects not adequately addressed are raised to the appropriate level for resolution; ensure all milestones are met; approve various artifacts; ensure OneSDLC project deliverables and work products have been completed |
| Test Manager | Provide guidance on test strategy and scope, and approve TP(s) in accordance with standards and procedures; manage project schedule; manage test issues/problems/defects logged by testers; generate problem reports; ensure issues/problems/defects are assigned to the appropriate developer for resolution |
| Test Lead | Ensure that all work products are completed (TP, EOTR, etc.); ensure the verification and acceptance of all TP(s) and documentation; triage open testing problems, report problems, update problem status, and provide solutions or workarounds for test issues/problems/defects; create, update, and maintain appropriate artifacts for testing phases |
| Test Analyst | Create test related work products (test cases/scripts, test datasets, etc.); prepare required reporting documentation, if any, for the respective testing activities; execute and document test activities; manage testing requirements; create and duplicate test cases/scripts; identify and document testing problems; report testing status; analyze appropriate documentation to extract project requirements |

- 
	
Below are the roles and responsibilities that may be involved in each phase/activity.
 
- 
	
The project manager is responsible for assigning team responsibilities, facilitating team understanding of the business requirements, identifying and providing the OneSDLC testing artifacts at the Milestone Readiness Review, and developing the high-level strategies needed to deliver the solution.
 - 
	
The test manager is responsible for developing test strategies, monitoring workload, redistributing workload, reporting status to upper management, and other duties as assigned.
 - 
	
The business lead is responsible for defining business requirements, and other duties as assigned.
 - 
	
The test lead is responsible for analyzing documentation, creating the project folder, developing the TP, conducting peer reviews, verifying requirements are clear and testable, confirming that entrance and exit criteria are met, and other duties as assigned.
 - 
	
The test analyst is responsible for documentation review, and other duties as assigned.
 
- 
	
The test manager is responsible for providing guidance on test strategy and scope, approving the TP(s) in accordance with standards and procedures, ensuring the environment is established and the TRR is conducted (if required), and other duties as assigned.
 - 
	
The test lead is responsible for the test schedule, status reporting, ensuring that work products are complete and that the test environment is functional and ready, and other duties as assigned.
 - 
	
The test analyst is responsible for reviewing requirements, developing test data, creating test cases/scripts, identifying and documenting problems, participating in TRR (if required) and other duties as assigned.
 - 
	
The developer is responsible for providing functional documentation that represents the requirements.
 
- 
	
The test manager is responsible for monitoring workload, redistributing workload, reporting status to upper management, and other duties as assigned.
 - 
	
The test lead is responsible for developing and managing the TP and test schedule, conducting peer reviews, verifying requirements, validating entrance and exit criteria, reporting testing status, ensuring all test products are completed and documented, and other duties as assigned.
 - 
	
The test analyst is responsible for developing test data, creating and executing test cases/scripts, identifying and documenting problems, reporting testing status, developing test artifacts, and other duties as assigned.
 
- 
	
The project manager is responsible for ensuring the OneSDLC project deliverables and work products have been completed.
 - 
	
The test manager is responsible for ensuring all tests have been dispositioned, all required deliverables and work products (e.g., EOTR, PFC, etc.) have been completed, and other duties as assigned.
 - 
	
The test lead is responsible for developing the deliverables and work products, conducting close out meetings, and other duties as assigned.
 - 
	
The test analyst is responsible for ensuring all required test artifacts are in the project folder, and other duties as assigned.
 
- 
	
EST follows the delivery models established by OneSDLC.
 - 
	
OneSDLC supports all development paths within the IRS such as Waterfall, Planned Maintenance, COTS, Managed Services, Iterative, Agile, etc.
 - 
	
The OneSDLC delivery model consists of two states:
- 
		
Readiness - A one-time preparation state that gets new teams ready to execute funded work as soon as possible.
 - 
		
Execution - A continuous delivery state that empowers teams to plan, perform, and produce by building, testing, and delivering solutions through ongoing learning.
 
 
- 
	
The Readiness state includes a one-time sequence of activities that allows new initiatives to transition from planning to execution.
 - 
	
In this state:
- 
		
Initiatives are approved
 - 
		
Funding is available
 - 
		
Proposal Owner is informed
 - 
		
Product Manager is assigned
 - 
		
Environment is established
 - 
		
Testing team is formed
 
 
- 
	
The Execution State supports three delivery cycles which enable products to easily increase their delivery cadence (deliver more often).
- 
		
Agile delivery approach – Two-week Cycles
 - 
		
Frequent delivery approach – Two-Month Cycles
 - 
		
Semi-annual delivery approach – Six-Month Cycles
 
 - 
	
For new Agile Delivery, the Execution State starts after exiting the one-time Readiness State. Existing products can start from the Execution State. The Product Team executes a sequence of activities where they continuously refine, develop, and deliver the work in the backlog with the potential to ’Release on Demand’ in the future.
 - 
	
In each cycle, continuous planning, preparation, and execution happen as per the demands of the selected approach, which may include all the activities below:
- 
		
Perform Planning
 - 
		
Perform Preparation
 - 
		
Execute Test and Document
 - 
		
Closeout Test
 
 
- 
	
Test governance aligns with OneSDLC governance processes, allowing for structure around how the IRS aligns IT strategy with business strategy. This ensures the team stays on track to achieve its strategies and goals, and implements best practices to measure IT performance. Test governance also makes sure that stakeholders’ interests are considered and that processes provide measurable results.
 - 
	
Test compliance aligns with the OneSDLC compliance process. In OneSDLC, new initiatives complete compliance documentation in the Readiness Exit Review and in the Product Review for Agile projects. Compliance documents will be signed before entering the Readiness Exit Review. At the Readiness Exit Review, projects present their roadmap and get Governance Board approval to start executing. Once a new initiative exits the Readiness State, it will not return to it. Governance and compliance will continually occur in the Execution State. Reducing governance and compliance to two reviews for the first cycle and then a single review in execution reduces the time spent producing compliance documentation.
 
- 
	
Testing compliance artifacts for existing projects are produced in the Execution State, where the four activities continually happen.
 - 
	
The following is a list of artifacts for each activity:
- Perform Planning
  - Test Strategy and Plan
  - Test Plan
- Perform Preparation
  - Test cases/scripts/data
  - Peer Review
  - TRR (as applicable)
- Execute Test and Document
  - Problem Defects (as applicable)
  - Test Case Deferral and Waiver form (as applicable)
- Closeout Test
  - End of Test Reports (EOTR)
  - End of Test Completion Reports (EOTCR)

 - 
	
A detailed list of all artifacts is found in Section 2.127.2.2.5.2 below.
 
- 
	
This section lists the work products needed to execute the process (known as inputs) as well as those produced by the IT Testing process (known as outputs) as described in the table below.
| Artifact Name | Category | Process Owner | Required by OneSDLC | Available in OneSDLC State/Event | Required by EST | Needed for EST in Phase/Activity | Additional Information |
|---|---|---|---|---|---|---|---|
| Project Management Plan (PMP) | Other Artifacts | Agile Central, PMO | Yes | Readiness | Yes | Performing Planning | Prepared by Project Manager using Template owned by Agile Central. Replaced by ‘About Page’ in OneSDLC project site and/or ELM with access to all stakeholders |
| Project Charter | Enterprise Services | EA | Yes | Readiness | Yes | Performing Planning | IRS Standard |
| Engineering Lifecycle Management (ELM) Repository | Requirement Repository | REPO | Yes | Readiness | Yes | Performing Planning | ELM contains DNG, EWM, and ETM modules for Requirements, Development, and Test Management |
| Enterprise Organizational Readiness (EOR) | Other | EOR | Yes | Readiness | Optional | Performing Planning | N/A for Planned Maintenance |
| Lessons Learned (from previous product cycle) | Other Artifacts | IPE | Yes | Execution / Product Planning | Yes | Performing Preparation | Review remaining unresolved lessons learned from previous product cycle |
| Post Implementation Review (PIR) (from previous product cycle) | Other Artifacts | IPE | No | Execution / Product Planning | Optional | N/A | For projects following Agile path, it is replaced by Iteration and Mid-range Retrospective activities, and the Requirements Engineering Self-Certification and Lessons Learned process |
| Test Strategy and Plan (TSP) | Enterprise Services | EST, PMO | Yes | Readiness, Execution / Product Review | Yes | Perform Preparation, Closeout Test | Prepared by Project Manager using template owned by EST. Started in Readiness and completed in Execution Product Review. TSP is created to replace existing STP with more details |
| Test Plan (TP) | Other Artifacts | EST, etc. | Yes | Execution / Product Planning | Yes | Perform Preparation | As a supplement to the TSP, besides EST, each organization responsible for a specific test type is required to provide their own Test Plan and make available to all stakeholders. Test Plans are created electronically in ELM ETM |
| Systems Acceptability Testing (SAT) Plan | Other Artifacts | EST | No | Execution / Product Planning | Yes | Perform Preparation | A Test Plan is created by EST for the specific type of testing called SAT. Other teams may create their Test Plans with names matching their specific test types |
| Test Cases | Other Artifacts | EST, etc. | No | Execution / Product Planning | Yes | Perform Preparation | Besides EST, each organization responsible for a specific test type is required to develop their own Test Cases in support of their Test Plan |
| Test Data | Other Artifacts | EST, etc. | No | Execution / Product Planning | Yes | Perform Preparation | Besides EST, each organization responsible for a specific test type is required to create their own Test Data associated with their Test Cases |
| Business System Report (BSR) | Enterprise Services | EA | Yes | Execution / Product Planning | Yes | Perform Preparation | BSR is equivalent to VSA. Only one is needed for a project |
| Unified Work Request (UWR) | Requirement Documentation | Business | Yes | Execution / Product Planning | Yes | Perform Preparation | UWRs are usually developed into Business Requirements and documented in ELM DNG, and further decomposed as testable requirements, User Stories and Product Backlogs and documented in ELM EWM |
| Statement of Work (SOW) | Requirement Documentation | Business | No | N/A | Optional | Perform Preparation | Provides supplemental and background information on requirements if necessary |
| Change Request (CR) | Requirement Documentation | Business | Yes | Execution / Product Planning | Optional | Perform Preparation | Needed if applicable. May be reflected in updated User Stories and/or Product Backlogs |
| Configuration Management Plan (CMP) | Other Artifacts | CM | Yes | Readiness, Execution / Product Review | Optional | Perform Preparation, Closeout Test | N/A for Planned Maintenance. Started in Readiness and completed in Execution Product Review |
| Design Specification Report (DSR) | Enterprise Services | SE | Yes | Execution | Yes | Perform Preparation, Closeout Test | Some projects may only require a Simplified Design Specification Report (SDSR). Developed during Execution and signed off at Execution Product Review |
| Design Specification Package (DSP) | Other Artifacts | SE | Yes | Execution | Optional | Perform Preparation, Closeout Test | The information may be included in a DSR |
| Functional Specification Package (FSP) | Other Artifacts | SE | Yes | Execution | Optional | Perform Preparation, Closeout Test | The information may be included in a DSR |
| Interface Control Document (ICD) | Enterprise Services | SE | Yes | Execution | Yes | Perform Preparation, Closeout Test | Developed during Execution and signed off at Execution Product Review |
| Program Requirements Package (PRP) | Requirement Documentation | REPO | No | Execution / Product Planning | Optional | Perform Preparation, Closeout Test | The information may be included in a BSR |
| Vision, Scope and Architecture (VSA) | Enterprise Services | EA | Yes | Readiness, Execution / Product Review | Optional | Perform Preparation, Closeout Test | N/A for Planned Maintenance. Started in Readiness and completed in Execution Product Review. VSA is equivalent to BSR. Only one is needed for a project |
| Computer Operator Handbook (COH) | Computer Operator Handbook | ECC | Yes | Execution | Optional | Perform Preparation, Closeout Test | N/A for Managed Services. Developed during Execution and signed off at Execution Product Review |
| Computer Program Book (CPB) | Other Artifacts | ECC | No | Execution | Optional | Perform Preparation, Closeout Test | Needed if applicable. Equivalent of COH |
| User Documentation and Training Materials | Other Artifacts | Business | No | Execution | Optional | Perform Preparation, Closeout Test | Needed if available. Equivalent of User Guide |
| Interconnection Security Agreement (ISA) | Security | SRM | Yes | Readiness, Execution / Product Review | Optional | Perform Preparation, Closeout Test | Started in Readiness and completed for Initial Production Deployment (IPD). Updated for Product Review |
| Privacy and Civil Liberties Impact Assessment (PCLIA) | Privacy | PGLD | Yes | Readiness, Execution / Product Review | Yes | Perform Preparation, Closeout Test | Started in Readiness and completed in Execution Product Review |
| Test Readiness Review (TRR) | Other Artifacts | EST | No | Execution | Optional | Immediately Prior to Execute and Document Test | Including TRR Agenda, Checklist and Memorandum. Formal TRR meeting is not required under OneSDLC. The Project Manager decides whether a TRR is necessary. It is recommended for the standing up of any new program or project in its initial release, or projects following the Semi-annual delivery path with a relatively long planning horizon |
| End of Test Report (EOTR) | Other Artifacts | EST, etc. | No | Execution / Product Review | Optional | Closeout Test | EOTR is only needed when the testing effort of each test type has deviated from the established Test Plan. Besides EST, each organization responsible for a specific test type is required to create their own EOTR associated with their Test Plan, if necessary. EOTR can be produced electronically in ELM ETM |
| End of Test Completion Report (EOTCR) | Other Artifacts | EST, PMO | No | Execution / Product Review | Optional | Closeout Test | Prepared by Project Manager using Template provided by EST. EOTCR is only needed when the overall testing effort of all test types has deviated from the established TSP |
| Lessons Learned (for the current product cycle) | Other Artifacts | IPE | Yes | Execution / Product Review | Yes | Closeout Test | Also captured in Iteration and Mid-range Review and Retrospective processes |
| Post Implementation Review (PIR) (for the current product cycle) | Other Artifacts | IPE | No | Execution / Product Review | Optional | Closeout Test | For projects following Agile path, it is replaced by Iteration and Mid-range Retrospective activities, and the Requirements Engineering Self-Certification and Lessons Learned process |
| Project Folder Checklist (PFC) | Other Artifacts | EST | No | Execution / Product Review | Yes | Closeout Test | EST’s primary pre-audit process to ensure compliance with IT Testing Process and Procedures |

- 
	
The EST Test Report is an aggregated status report of all active projects in EST and is found in the EST SharePoint site.
 - 
	
Information is pulled from the Narrative tab of the Test Project Work Item (TPWI) in EWM to the Test Status Report (TSR).
 - 
	
The TSR for all EST projects includes start and end dates for test preparation and test execution, as well as the number of test cases completed, passed, failed, deferred, or waived.
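As an illustration, the TSR counts named above can be thought of as a simple aggregation over dispositioned test cases. The sketch below is hypothetical; the actual TSR is generated from the TPWI Narrative tab in EWM, not from code like this, and the field names and dates are assumptions.

```python
from collections import Counter

# Hypothetical aggregation of the TSR fields named above: dates plus a
# count of test cases by disposition.
executed = ["Passed", "Passed", "Failed", "Deferred", "Waived", "Passed"]
tsr = {
    "test_execution_start": "2024-01-08",
    "test_execution_end": "2024-02-02",
    "completed": len(executed),
    **Counter(executed),  # adds Passed/Failed/Deferred/Waived counts
}
print(tsr)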
 - 
	
The Status Reporting System web site may be accessed at: Enterprise Systems Testing Reporting - TSR Defect Trend Reports - All Documents
 - 
	
Other IT organizations may use their own reporting system.
 
- 
	
The following is a list of Reference Documents addressed in this IRM:
- 
		
Automated PFC
 - 
		
End of Test Completion Report (EOTCR)
 - 
		
End of Test Report (EOTR)
 - 
		
Peer Review Procedure
 - 
		
Problem Reporting Procedure
 - 
		
Test Artifact Deferral
 - 
		
Test Case Deferral and Waiver
 - 
		
Test Environment Checklist
 - 
		
Test Plan (TP)
 - 
		
Test Readiness Review (TRR) Guide
 - 
		
Test Reference Guide
 - 
		
Test Strategy and Plan (TSP)
 - 
		
TP Template
 - 
		
Test Type Reference Guide
 - 
		
TRR Agenda
 - 
		
TRR Memorandums
 
 - 
	
For your convenience, the Reference Documents can be viewed at the following sites:
- 
		
SharePoint - Enterprise Systems Testing Reporting - Enterprise Systems Testing Reporting - IRM 2.127 Reference Documents - All Documents
 - 
		
Document Management for Information Technology - DocIT
 
 - 
	
The following resources are either in this document or were used to create it.
- 
		
OneSDLC IRM 2.31.1
 - 
		
IRWorks Procedure
 - 
		
Sensitive But Unclassified (SBU) Data use Questionnaire - Form 14664,
 - 
		
Sensitive But Unclassified Data Use Request - Form 14665, as applicable.
 - 
		
Privacy Testing Guidance
 
 
Terms and definitions are listed in the below table.
| Terms | Definition | 
|---|---|
| Application | Collection of software programs that automates a business function. Each program may be part of more than one application and can run on one or more servers or other hardware. | 
| Baseline | A baseline is a specification or product that has been formally reviewed and agreed on and thereafter serves as the basis for further development. It is changed only through formal change procedures. | 
| Compliance | Compliance is ensuring that a standard or set of guidelines is followed, or that proper, consistent accounting or other practices are being employed. | 
| Deployment | The activity responsible for movement of new or changed hardware, software, documentation, process, etc. to the Live Environment (Production). Deployment is part of the Release and Deployment Management Process. | 
| End Of Test Completion Report (EOTCR) | This document can be used by systems classified as New Development or Planned Maintenance. The EOTCR is a testing requirement. The purpose of the EOTCR is to provide a standard artifact to summarize the complete test effort for the release. The EOTCR also allows the PM an opportunity to mitigate risks that may cause delays to project implementation. | 
| End Of Test Report (EOTR) | The EOTR is a requirement for all testing and may be used as a functional equivalent for the EOTCR. The purpose of the EOTR is to provide a standard artifact to summarize the complete test effort for the test type(s). The EOTR also allows the test managers an opportunity to mitigate risks that may cause delays to project implementation. | 
| IRWorks | The official reporting tool for problem management with all IRS developed applications, and shares information with the Enterprise Service Desk (ESD). | 
| OneSDLC | OneSDLC is a delivery model that positions our IT assets to respond to emergent needs as quickly as possible without impacting productivity, product quality, security, and/or employee morale. | 
| Privacy and Civil Liberties Impact Assessment (PCLIA) | The purpose of a PCLIA is to demonstrate that program/project managers, system owners, and developers have consciously incorporated privacy and civil liberties protections throughout the entire life cycle of a system. This involves making certain that privacy and civil liberties protections are built into the system from the beginning when it is less costly and more effective. The PCLIA process is an analysis of how information in an identifiable form is collected, stored, protected, shared, and managed and provides a means to assure compliance with all applicable laws and regulations governing taxpayer and employee privacy. | 
| Process | A structured set of activities designed to accomplish a specific objective. A process takes one or more defined inputs and turns them into defined outputs. A process may include any of the roles and responsibilities, tools, and management controls required to reliably deliver the output. A process may define policies, standards, guidelines, activities, and work instructions if they are needed. | 
| Process Owner | A role responsible for ensuring that a process is fit for its purpose. The Process Owner's responsibilities include sponsorship, design, change management and continual improvement of the process and its assets. | 
| Project Folder and Automated Project Folder Checklist (PFC) | The Project Folder is a requirement for every test and must be stored in an authorized repository. See Records Control Schedule (RCS) 17, Item 17 for IRS retention requirements. The Project Folder and Automated PFC provide a history of the test. It is a useful source document for auditing purposes, and can be used for future project planning, allocation of resources, and process improvement. The Project Folder and Automated PFC contain copies of all required items pertinent to the specific test, including all critical test documentation and work products. It is the responsibility of the Test Manager to review and approve the Project Folder from their team. | 
| Requirements | A requirement describes a condition or capability to which a system must conform; either derived directly from user needs, or stated in a contract, standard, specification, or other formally imposed document. A desired feature, property, or behavior of a system. | 
| ServiceNow | A cloud-based, software-as-a-service (SaaS) workflow automation platform. It provides the capability to process and catalog customer service tickets. At the IRS, it replaced multiple IT service management tools that collectively made up the KISAM system, which is now called IRWorks. | 
| Test Plan (TP) | The TP is a requirement for all testing and may be used as a functional equivalent for the STP. The purpose of the TP is to provide a standard artifact to summarize the complete test effort for the test type(s). The TP also allows the test managers an opportunity to mitigate risks that may cause delays to project implementation. | 
| Test Strategy and Plan (TSP) | The TSP is a summarization of the testing strategy and the overall plan of testing for any given program/project and must be maintained in the organization’s approved repository for audit purposes | 
| Test Types | A test type is a specific test name whose group of activities has the intention of checking the system in respect to a number of correlated quality characteristics. During testing, various quality characteristic types are reviewed. Quality characteristic examples include functionality, accessibility, performance, security, and continuity. | 
| Triage | A process in which things are ranked in terms of importance or priority to allocate scarce resources. It also applies to different types of business process or workflow situations. In an IT department, IT issues can be categorized by a predefined probability scale factoring in risks and business impacts. It is used in EST to manage allocation of resources to fix testing anomalies. | 
| Validation | Validation is the process whose purpose is to demonstrate that a product or product component fulfills its intended use when placed in its intended environment. In testing, validation is performed upon the completion of a given module, or even the completion of the entire application. | 
| Verification | Verification is the process for ensuring that selected work products meet their specified requirements. In testing, verification is the process performed at the end of a test cycle phase with the objective of ensuring that the requirements established have been met. It is an overall evaluation activity that includes reviewing, inspecting, testing, checking, and auditing. | 
Acronyms are listed in the below table.
| Acronyms | Definition | 
|---|---|
| ACIO | Associate Chief Information Officer | 
| BOD | Business Operating Division | 
| BPRM | Business Planning and Risk Management | 
| BSR | Business System Report | 
| CIO | Chief Information Officer | 
| CM | Configuration Management | 
| COH | Computer Operator Handbook | 
| CPB | Computer Program Book | 
| CR | Change Request | 
| CRL | Core Record Layout | 
| DIT | Development Integration Testing | 
| DITE | Development, Integration and Testing Environment | 
| DNG | Doors Next Generation | 
| DocIT | Document Management for Information Technology | 
| DSIT | Development System Integration Testing | 
| DSP | Design Specification Package | 
| DSR | Design Specification Report | 
| EA | Enterprise Architecture | 
| ECC | Enterprise Computing Center | 
| ECL | Executive Control Language | 
| EFNS | Enterprise File Transfer Protocol Network Server | 
| EFTU | Enterprise File Transfer Utility | 
| ELM | Engineering Lifecycle Management | 
| EOPs | Enterprise Operations | 
| EOR | Enterprise Organizational Readiness | 
| EOTCR | End Of Test Completion Report | 
| EOTR | End Of Test Report | 
| ES | Enterprise Services | 
| ESD | Enterprise Service Desk | 
| ESD | Enterprise Server Division | 
| EST | Enterprise Systems Testing | 
| ETM | Engineering Test Management | 
| EWM | Engineering Workflow Management | 
| FDS | Functional Design Specification | 
| FIT | Final Integration Test | 
| FOD | Functional Operating Division | 
| FSP | Functional Specification Package | 
| ICD | Interface Control Document | 
| IMD | Internal Management Documents | 
| IMS | Integrated Master Schedule | 
| IPE | Investment Portfolio Evaluation | 
| IPM | Integrated Process Management | 
| IPT | Integrated Project Team | 
| IRM | Internal Revenue Manual | 
| IRS | Internal Revenue Service | 
| IRWorks | Internal Revenue Workflow Optimization, Requests, and Knowledge System | 
| ISA | Interconnection Security Agreement | 
| ISD | Infrastructure Services Division | 
| IT | Information Technology | 
| JCL | Job Control Language | 
| JRS | Jazz Reporting Service | 
| LOE | Level of Effort | 
| OLA | Organizational Level Agreement | 
| OneSDLC | One Solution Delivery LifeCycle | 
| OS | Operating System | 
| PCLIA | Privacy and Civil Liberties Impact Assessment | 
| PD | Process Description | 
| PFC | Project Folder Checklist | 
| PGLD | Privacy Government Liaison Disclosure | 
| PIR | Post Implementation Review | 
| PM | Project Manager | 
| PMP | Project Management Plan | 
| PRP | Program Requirements Package | 
| RCS | Request for Computer Services | 
| REPO | Requirements Engineering Program Office | 
| RIS | Rational Infrastructure Section | 
| RPLR | Reusable Program Level Requirements | 
| SAT | Systems Acceptability Testing | 
| SBU | Sensitive But Unclassified | 
| SLA | Service Level Agreement | 
| SOW | Statement Of Work | 
| SRM | Security Risk Management | 
| SSSD | Server & Support Services Division | 
| SW | Software | 
| TP | Test Plan | 
| TSP | Test Strategy and Plan | 
| TRG | Test Reference Guide | 
| TRR | Test Readiness Review | 
| UWR | Unified Work Request | 
| VSA | Vision, Scope and Architecture | 
| WRMS | Work Request Management System | 
The below table lists requirement documentation(s).
| Requirements Documentation | 
|---|
| Business System Report (BSR) | 
| Change Request (CR) | 
| Statement of Work (SOW) | 
| Unified Work Request (UWR) | 
The below table lists functional documentation(s).
| Functional Documentation | 
|---|
| Configuration Management Plan (CMP) | 
| Design Specification Report (DSR) | 
| End of Test Completion Report (EOTCR) | 
| Enterprise Organizational Readiness (EOR) Service Request | 
| Interface Control Document (ICD) | 
The below table lists operational documentation(s).
| Operational Documentation | 
|---|
| Computer Operator Handbook (COH) | 
| User Documentation and Training Materials | 
The below table lists security documentation(s).
| Security Documentation | 
|---|
| Security Package (includes Interconnection Security Agreement (ISA)) | 
The below table lists privacy documentation(s).
| Privacy Documentation | 
|---|
| Sensitive But Unclassified (SBU) Data use Questionnaire - Form 14664, and Sensitive But Unclassified Data Use Request - Form 14665, as applicable. See IRM 10.5.8 Sensitive But Unclassified (SBU) Data Policy: Protecting SBU in Non-Production Environments | 
| Privacy and Civil Liberties Impact Assessment (PCLIA) | 
The below table lists project documentation(s).
| Project Documentation | 
|---|
| Project Charter | 
| Project Management Plan (PMP) | 
| Vision, Scope and Architecture (VSA) | 
This section delineates the activity steps for test projects, as needed per the selected test approach.
The below table lists activities during Perform Planning.
| A1: Perform Planning | |
|---|---|
| Steps | Roles |
| 1. Assess Requirements | Test Analyst, Test Lead, Test Manager, Project Manager, Business Lead |
| 2. Establish Test Environment | Test Analyst, Test Lead, Test Manager, Project Manager |
| 3. Train Test Team | Test Lead |
| 4. Develop Test Artifacts | Test Analyst, Test Lead |

The below table lists activities during Perform Preparation.
| A2: Perform Preparation | |
|---|---|
| Steps | Roles |
| 1. Verify Test Environment | Test Analyst, Test Lead, Test Manager |
| 2. Review Documentation | Test Analyst, Test Lead, Developer |
| 3. Prepare Test Cases/Scripts/Data | Test Analyst, Test Lead |
| 4. Conduct Test Readiness Review (TRR) (if applicable) | Test Analyst, Test Lead, Test Manager |

The below table lists activities during Execute and Document Test.
| A3: Execute and Document Test | |
|---|---|
| Steps | Roles |
| 1. Execute Test Cases/Scripts | Test Analyst, Test Lead |
| 2. Document Results | Test Analyst, Test Lead |
| 3. Report Test Status | Test Analyst, Test Lead, Test Manager |

The below table lists activities during Closeout Test.
| A4: Closeout Test | |
|---|---|
| Steps | Roles |
| 1. Finalize Test Artifacts | Test Lead, Test Manager |
| 2. Issue End of Test Reports | Test Lead, Test Manager, Project Manager |
| 3. Conduct Closeout Meetings | Test Lead, Test Manager, Project Manager |
| 4. Finalize Project Folder | Test Analyst, Test Lead, Test Manager |