Document Comparison

PCI-Secure-Software-ROV-Template-v1_2.pdf PCI-SecureSoftware-v2.0-ROV-Template.pdf
7% similar
165 → 104 Pages
50153 → 32761 Words
290 Content Changes

From Revision History

  • February 2026 | 1.0 | Initial release to support the PCI Software Security Framework – Secure Software Standard, v2.0.

Content Changes

290 content changes. 256 administrative changes (dates, page numbers) hidden.

Added p. 6
Terminology

Terminology used in this ROV Template can be found in the PCI Secure Software Standard v2.0 and the respective PCI Secure Software Program Guide. General terminology can be found at: https://www.pcisecuritystandards.org/glossary.

This document supports the PCI Secure Software Standard v2.0 (PCI Secure Software Standard, Secure Software Standard) and Program. A Software Product assessment involves thorough testing and assessment activities from which the assessor generates detailed workpapers for each security requirement and its associated test requirements. These workpapers contain comprehensive records of the assessment activities, including observations, configurations, process information, interview notes, documentation excerpts, references, and other evidence collected during the assessment. A completed Secure Software Report on Validation (ROV) acts as a comprehensive summary of the testing activities performed, the information that is collected during the assessment, and the findings and observations. The information contained in a completed ROV must provide sufficient detail and coverage to support the …
Added p. 7
Keyword | Test Requirement Methods | Reporting Instructions

Examine: The assessor critically evaluates evidence. Common examples include, but are not limited to, software design and architecture documents (electronic or physical), source code, configuration and metadata files, bug tracking data, log files, and security-testing results. The choice of evidence that may be used to meet an "examination" requirement is deliberately left open for the tester to determine.

Detail the documentation, evidence, or equivalent examined by the assessor that was used to verify the Security Requirement is satisfied as instructed and stated in the Test Requirement.

If the evidence examined is documented in a table in the ROV Template, provide the unique reference(s) from the table that corresponds to the Test Requirement.

Interview: The assessor converses with individual personnel. The purpose of interviews includes determining how an activity is performed, whether an activity is performed as defined, and whether personnel have particular knowledge or understanding of …
Added p. 8
Detail the activity performed that was used to verify the Security Requirement is satisfied as instructed and stated in the Test Requirement. If the activity performed is detailed in a table in the ROV Template, provide the reference(s) from the table that corresponds to the Test Requirement. Note: Static and dynamic analyses implicitly include the "Test" keyword and therefore also require satisfying the "Test" Reporting Instruction.

Test: The assessor evaluates the software to analyze its characteristics and behavior in various scenarios to assist in determining whether the associated security requirement is satisfied. Testing is generally carried out using static and/or dynamic analysis as specified in the test requirements. Testing generally includes both positive and negative test activities. Both static and dynamic analyses are considered types of test activities.

• Positive testing: Generally used to confirm information, attributes, and expected behavior based on vendor documentation.

• Negative testing: Generally used to identify undocumented …
Added p. 10
Finding | Description

In Place: The expected testing has been performed and the test requirements satisfied, which contribute to the verification that the associated Security Requirement is satisfied. Detailed testing must be performed and reporting provided that demonstrates how the assessor verified the In Place Finding.

Note: A Security Objective Finding is a result of the cumulative Findings of all its underlying Security Requirements. To achieve an In Place Finding for a Security Objective, either:

(a) all underlying Security Requirements for that Security Objective are verified to be "In Place", or
(b) there is a combination of "In Place" and "N/A" Findings for the underlying Security Requirements.

However, one or more "Not In Place" Findings must result in a Not In Place Finding for the overarching Security Objective.
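The aggregation rule above is simple enough to express in code. As an illustrative sketch only (the enum, the function name, and the handling of an all-N/A objective are assumptions, not something the template prescribes), a Security Objective's Finding could be derived from its Security Requirement Findings like this:

```python
from enum import Enum

class Finding(Enum):
    IN_PLACE = "In Place"
    NOT_IN_PLACE = "Not In Place"
    NOT_APPLICABLE = "N/A"

def objective_finding(requirement_findings):
    """Derive a Security Objective Finding from its underlying
    Security Requirement Findings: any 'Not In Place' forces
    'Not In Place'; otherwise a mix of 'In Place' and 'N/A'
    yields 'In Place'. (Treating an all-N/A objective as N/A
    is an assumption for illustration.)"""
    if Finding.NOT_IN_PLACE in requirement_findings:
        return Finding.NOT_IN_PLACE
    if all(f is Finding.NOT_APPLICABLE for f in requirement_findings):
        return Finding.NOT_APPLICABLE
    return Finding.IN_PLACE

# Example: a single Not In Place requirement forces the objective Not In Place.
findings = [Finding.IN_PLACE, Finding.NOT_APPLICABLE, Finding.NOT_IN_PLACE]
print(objective_finding(findings).value)  # -> Not In Place
```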

Not Applicable (N/A): "Not Applicable", or "N/A", is only acceptable as a Finding where the Security Requirement, through testing and review, is determined to not …
Added p. 11
Vendor Contact Name: Vendor Contact Phone:

Does the Vendor have a 'Qualified' (Non-Expired) PCI Secure SLC Listing? Yes
If Yes, provide the Vendor's SSLC Qualified Listing Reference #: (Not applicable if the SSLC Qualified Listing is Expired)
Indicate if the Vendor affirms their SSLC-qualified processes were used to develop this Software Product: Yes | No | N/A
Select 'N/A' if the Vendor either (a) does not have an SSLC-Qualified Listing, or (b) the SSLC Listing is Expired.

1.2 Software Product Identification
Software Product Name:
Software Product Version:

Is the Software Product currently (or was it previously) Listed on the PCI SSC List of Validated Secure Software?
No (The Software Product has never been Listed.)
Yes (Provide the Reference #) → If 'Yes', is this a Reassessment as per the PCI Secure Software Program Guide?
Note: The Listed status must be Validated (Not Expired) to qualify as a Reassessment. Otherwise, it will be regarded as …
Added p. 12
SSF Assessor Company Contact Name: Lead Assessor Phone:

SSF Assessor Company Contact Email: Lead Assessor Email:

SSF Assessor Company Contact Phone: Lead Assessor Secure Software Certificate Number:

Internal SSF Assessor Company QA Review
Affirm that internal QA was performed on the completed ROV.

Yes (Internal QA has been performed in accordance with the ROV instructions herein and with the PCI Secure Software Program Requirements)
Primary QA Reviewer Name:
Primary QA Reviewer Credentials:

Primary QA Reviewer Email: Primary QA Reviewer Phone:

Provide details for additional Secure Software Assessors involved with this assessment.
Added p. 12
Affirm the Independence requirements as stated in the SSF Qualification Requirements have been read and fully understood. → Affirm

Document any consultation services provided to the Software Vendor by the SSF Assessor Company as it relates to this Software Product and its assessment.

Disclose all other products or services provided by the SSF Assessor Company to the Software Vendor that were reviewed during this assessment or that could reasonably be viewed to affect assessment independence.
Added p. 13
Affirm the Subcontracting requirements as stated in the SSF Qualification Requirements have been read and fully understood. → Affirm

Affirm if subcontracting was utilized as it relates to this Software Product assessment → No, subcontracting was not utilized.

Yes, subcontracting was utilized. (Complete the remainder of the table)

If Yes, subcontracting was utilized for this Software Product assessment, affirm that per the SSF Qualification Requirements, prior written consent was obtained from PCI SSC. → Yes, prior consent was obtained.

Identify the companies and personnel that were subcontracted.
Added p. 14
Not Applicable
Security Objective 1: Software Architecture, Composition, and Versioning
Security Objective 2: Sensitive Asset Identification
Security Objective 3: Sensitive Asset Storage and Retention
Security Objective 4: Sensitive Modes of Operation
Security Objective 5: Sensitive Asset Protection Mechanisms
Security Objective 6: Sensitive Asset Output
Security Objective 7: Random Numbers
Security Objective 8: Key Management
Security Objective 9: Cryptography
Security Objective 10: Threats and Vulnerabilities
Security Objective 11: Secure Deployment and Management
Module A: Account Data Protection
Module B: POI Device Software
Module C: Publicly-accessible Software
Module D: Software Development Kits
Added p. 15
No, the assessment activity performed does not qualify as a Remote Assessment as described in the PCI Secure Software Program Guide.

Yes

• The "Addendum for ROC/ROV: Remote Assessments", as provided in Appendix A of the PCI SSC Remote Assessment Guidelines and Procedures, has been completed and is being provided as part of the associated submission.
Added p. 16
• There are four (4) categories of "Required Dependencies". All four tables below require input.

• Failure to properly and accurately document the Required Dependencies will result in the rejection of the associated submission.
- As a Required Dependency relates to the Software Product assessment, document in the respective Required Dependency tables below all Security Requirements in the Findings and Observations where a Required Dependency is factored in.
- Required Dependencies are a defined Listing Element.
- Affirm that the PCI Secure Software Program Guide regarding Required Dependencies and the relevant instructions herein have been read and fully understood.
- Upon Acceptance and Listing, the Validated Secure Software Product is required to use the Required Dependencies in the manner representative of the Software Product assessment. As such, any deployment, operation, or equivalent of the Validated Secure Software Product that is not in accordance with its Listing, including the use of the noted …
Added p. 16
Yes, there are Listed Secure Software Required Dependencies.

- A Full Assessment (New or Reassessment) Software Product submission will not be considered for Acceptance if it claims a Listed Secure Software Product that is Expired as a Required Dependency.
- If at any time prior to Acceptance of the Software Product submission, including during the PCI SSC AQM review process, the Listed Secure Software being claimed as a Required Dependency transitions from Validated to Expired, the Software Product submission will be rejected.

Listing Reference # | Secure Software Name | Version(s) | Related Security Reqts | Description of the reliance this Software Product under assessment has with the Listed and Validated Secure Software Required Dependency
Added p. 17
Yes, there are PTS HSM Required Dependencies.

Do NOT put FIPS HSMs in this table. Instead, use the 'Non-PCI-Listed Required Dependencies' table.
- A New Assessment Software Product submission will not be considered for Acceptance if it claims an HSM as a Required Dependency when the PTS Approval is Expired.
- If at any time prior to Acceptance of the Software Product submission, including during the PCI SSC AQM review process, the HSM being claimed as a Required Dependency transitions to Expired, the Software Product submission will be rejected.
- There may be Program allowances for the continued use of HSMs for an eligible Software Product Reassessment. Refer to the PCI Secure Software Program Guide and the PCI Secure Software Technical FAQs to determine if there are any relevant allowances for Reassessments.
- Entries such as "Any HSM Device" are not permissible.

PTS Approval # | Model Name & Number | Hardware #(s) | Firmware …
Added p. 18
Yes, there are PTS POI Device Required Dependencies.

• Assessments that include Module B must claim and document all PTS POI devices applicable to the assessment.

• While PTS POI devices can be claimed as a Required Dependency for Software Products where Module B is Not Applicable, the PTS POI devices claimed must satisfy the criteria and context of a Required Dependency as per the PCI Secure Software Program Guide.
- A New Assessment Software Product submission will not be considered for Acceptance if it claims a PTS POI device as a Required Dependency when the PTS Approval is Expired.
- If at any time prior to Acceptance of the Software Product submission, including during the PCI SSC AQM review process, the PTS POI device being claimed as a Required Dependency transitions to Expired, the Software Product submission will be rejected.
- There may be Secure Software Program allowances for the continued …
Added p. 19
If the PTS POI Device is associated with Module B, document "Module B" below. If the PTS POI Device is NOT related to Module B, document the Security Requirements and the reliance this Software Product has with the PTS POI Device Required Dependency.

PTS Approval # | Make / Mfr. | Model Name / Number | Hardware #(s) | Firmware #(s) | Applic #(s)
Added p. 20
Yes, there are Non-PCI-Listed Required Dependencies.

Do NOT put PTS HSMs, PTS POI Devices, or Listed Secure Software Required Dependencies in this table. FIPS Modules, which can include FIPS HSMs, can be entered here.
- A New Assessment Software Product submission will not be considered for Acceptance if it claims a FIPS module as a Required Dependency when the FIPS Certificate is Revoked.
- FIPS modules on the CMVP Historical Validation List may still be usable for a New Assessment; however, the Assessor must determine that the Historical Reason for the transition to the 'CMVP Historical Validation List' does not prevent the Software Product from satisfying the security requirements for which the FIPS module is relied upon. This analysis must be documented below in the far-right column.

Hardware, Software, or Combination | Product Type / Description | Mfr or Vendor | Hardware version(s) (if applicable) | Software version(s) (if applicable) | Cert # (e.g., FIPS) …
Added p. 21
Describe how the software is sold, distributed, or licensed to third parties (for example, licensed as software-as-a-service, stand-alone application, etc.). If the software is only used by the Software Vendor, indicate as such:

Describe how the software is designed (for example, as a standalone application, as a component or library, or as part of a suite of applications):
Added p. 21
Ensure diagrams are clearly visible (not blurry) and comprehensible.

- The associated PCI Secure Software Program Guide for v2.x contains details and criteria on versioning, including the use of wildcards and separators. Related Secure Software Technical FAQs may also have been published.
- Affirm the published Secure Software Program information regarding allowable versioning is fully understood.
Added p. 23
Affirm the most recent version of the PCI Secure Software Standard - Sensitive Asset Identification for v2.x has been read, fully understood, and used in the assessment activity as represented herein. → Affirm

6.2 Software Vendor Sensitive Asset Documentation
Check here if this information is being provided within the SAID Appendix A as part of the relevant Software Product submission. Yes

This table is provided to facilitate the required documentation for identifying and documenting sensitive assets, and to make it easier to reference that documentation within other respective tables and within the Findings and Observations.
Document ID: Self-assigned unique identifier used to allow for references to entries in this table.
Document Name: Name of the document.
Date / Version: The date and/or version number of the document.
Description: Describe the general content of the document and how it pertains to the Software Product assessment.

Document ID | Document Name | Date / …
Added p. 24
PCI Secure Software Requirements

• This table is used to capture required information pertaining to requirements: 2-1.8[.x] List all cryptographic key types used by the software that are associated with sensitive assets.

Key ID: Self-assigned unique identifier used for references to entries in this table.
Key Type: E.g., DEK, KEK, PEK, MAC, Public, Private, etc.
Algorithm: E.g., AES, RSA, DSA, SHA3, etc.
Key Mgmt: E.g., DUKPT, MK/SK, Fixed, One-time use, etc.
Key Length: Full length (include parity bits as applicable)
Key Generation: Generation method and origin
Key Destruction: List destruction methods for each storage method
Note: As cryptographic keys are sensitive data, additional attributes for keys are accounted for in Table 6.4, which includes their storage location.

ID | Key Type | Algorithm | Key Mgmt | Key Length (bits)
Fill out all the information below for each key type.
Description & Purpose:

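Vendors who keep this key inventory in machine-readable form can map the table's columns directly onto a record type. A minimal sketch, assuming nothing beyond the columns listed above (the class name, field names, and the example values are illustrative, not prescribed by the template):

```python
from dataclasses import dataclass

@dataclass
class KeyInventoryEntry:
    key_id: str           # self-assigned unique identifier (Key ID)
    key_type: str         # e.g., DEK, KEK, PEK, MAC, Public, Private
    algorithm: str        # e.g., AES, RSA, DSA, SHA3
    key_mgmt: str         # e.g., DUKPT, MK/SK, Fixed, One-time use
    key_length_bits: int  # full length, including parity bits as applicable
    generation: str       # generation method and origin
    destruction: str      # destruction method per storage location
    description: str      # description & purpose

# Example entry (illustrative values only):
dek = KeyInventoryEntry(
    key_id="K-01", key_type="DEK", algorithm="AES", key_mgmt="DUKPT",
    key_length_bits=128,
    generation="Derived per transaction from the injected BDK",
    destruction="Overwritten in memory after each transaction",
    description="Encrypts account data in transit",
)
print(dek.key_id, dek.key_length_bits)
```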
Added p. 25
PCI Secure Software Requirements

• This table is used to capture required information pertaining to requirements: 2-1, 2-1.1, 2-1.2

Sensitive Data ID: Self-assigned unique identifier used to allow for references to entries in this table.
Sensitive Data Type: The type of sensitive data. Refer to the "Sensitive Data Categories" in the SAID as a guide.
Sensitive Data Element: Specific sensitive data element in relation to the Sensitive Data Type.
Stored: Indicate with a 'Yes' or 'No' if the data element is stored.
Storage Locations: If stored, then document the location(s) where the data is stored persistently. Else if not stored, indicate 'N/A'.
Key ID: If the data element is a cryptographic key, populate the Key ID from Table 6.3. Else, indicate 'N/A'.
Doc ID: As applicable, references to entries in Table 6.2 herein.
Description / Use: Concisely describe the purpose/use of the sensitive data element within/by the software.

Sensitive Data ID | Sensitive …
Added p. 26
PCI Secure Software Requirements

• This table is used to capture required information pertaining to requirements: 2-3, 2-3.1, 2-3.2, 2-3.4, 2-3.5, 2-3.6

Sensitive Funct. ID: Self-assigned unique identifier used to allow for references to entries in this table.
Sensitive Funct. Name: Unique identifier (names) of the functionality in relation to the Sensitive Functionality Type, e.g., name of the function, process, etc.
Sensitive Funct. Cat.: The categorical type of functionality. Refer to the "Sensitive Functionality Categories" in the SAID as a guide.
Sensitive Data IDs: If sensitive data is associated with the functionality, enter the Sensitive Data IDs from Table 6.4. Else indicate 'N/A'.
Sensitive Resource IDs: If sensitive resources are associated with the functionality, enter the Sensitive Resource IDs from Table 6.5. Else indicate 'N/A'.
Externally Accessible: Indicate with a 'Yes' or 'No' if the functionality is externally accessible, e.g., as an API.
Sens. Mode of Op.: Indicate with a 'Yes' …
Added p. 27
PCI Secure Software Requirements

• This table is used to capture required information pertaining to requirements: 2-1.3[.x], 2-2.4[.x]

Sensitive Data ID: Reference to entries in the Sensitive Data Information table. Enter as many IDs as pertain to the same parameters for that row.
Sensitive Resource ID: Reference to entries in the Sensitive Resource Information table. Enter as many IDs as pertain to the same parameters for that row.
Configurable Retention: Indicate 'Yes' or 'No' if the retention period is configurable after the software is deployed (in use).
Retention Period: Document the retention period. If configurable, document whether there is a defined range of allowable periods/settings.
Sen. Funct. ID: If the retention and/or deletion mechanism pertains to Sensitive Functionality in Table 6.6, document the Sensitive Functionality ID.
Doc ID: As applicable, references to entries in Table 6.2.
Deletion Method: Describe the method used to securely delete the sensitive asset or otherwise render …
Added p. 28
Affirm that the use and expectations of the defined Reporting Instructions herein have been read, fully understood, and satisfied in the Findings and Observations. → Affirm

7.2 Sampling
The PCI Secure Software Standard contains information on sampling ("Use of Sampling"), and sampling of the Software Product is not permitted.

Affirm that sampling of the Software Product cannot be used in the Software Product assessment. → Affirm

7.3 Technical FAQs
The PCI Secure Software Standard and the PCI Secure Software Program Guide contain information regarding Technical FAQs, a separate document that is published and updated on an as-needed basis. As Technical FAQs are mandatory for consideration as part of a Software Product assessment, it is imperative to review and understand them as part of the assessment activity.

Affirm that the most recent PCI Secure Software Technical FAQs for v2.x were reviewed thoroughly as part of this assessment. → Affirm

Note: …
Added p. 30
ID # | Description | Scope | Objective / Purpose

7.7 Not Applicable Findings
Identify all Security Objectives, Security Requirements, and Modules marked as Not Applicable (N/A) in the Findings and Observations herein.
- Only Security Objectives and Security Requirements having the potential to be marked as N/A will have the N/A option available as a Finding.
- A "Not Applicable", or "N/A", finding is only acceptable where an appropriate degree of analysis and testing is used to determine the Finding.
- Mark entries below in the order they appear in the Findings and Observations herein.
- If an entire applicable Security Objective or Module is marked as N/A, do not list all underlying Security Requirements as N/A.
- Provide the analysis for the use of N/A for an entire Module at the beginning of that Module's section where indicated herein.

Affirm that the criteria and use regarding Not Applicable (N/A) as a Finding …
Added p. 31
Affirm the criteria and use regarding Technical Constraints is fully understood. → Affirm

Affirm if Technical Constraints are being claimed for this assessment. → No, Technical Constraints are not being claimed.

Yes, Technical Constraints are being claimed.

If Technical Constraints are being claimed, then for each row entry below:

1. Document the Security Requirements or Test Requirements in the Findings and Observations where a Technical Constraint is being claimed.
2. Affirm that the Findings and Observations match this list for all claims of a Technical Constraint, and that each respective Security Requirement and/or Test Requirement in the Findings and Observations contains a description, at a minimum, of the following:
   a. That a Technical Constraint is being claimed.
   b. The extent to which the security requirement or test requirement can be satisfied, if at all.
   c. The technical limitation of the implementation and why the limitation cannot be resolved or otherwise remediated.
   d. The …
Added p. 32
The use of this ROV Template is mandatory for submissions of Validated Secure Software Products to PCI SSC for consideration of Acceptance and Listing. The ROV Template must be completed as instructed and accurately reflect the exact submission it is being used for. Existing text must not be modified in any way. Only the determined Findings and documented Observations are to be populated by the Assessor.

• All Software: This section provides a minimum baseline set of security objectives and associated security requirements required for all software being assessed to this standard. The applicability of the modules, which contain additional requirements, to the software assessment does not supersede any requirements in this section.

Security Objective 1: Software Architecture, Composition and Versioning
The architecture, composition, and versioning schema of the software are documented.
Notes: This section encapsulates the entirety of the software intended to be assessed to this standard and subsequently represented by its potential …
Added p. 34
Implementation Notes: The software vendor can choose the format for the bill of materials.

In Place | Not In Place
1-2.a Examine evidence to verify the composition of the software, including its software and hardware dependencies, is accurately documented in a bill of materials.
<Assessor Response>

1-3 The software is designed and built in a manner that restricts its overall composition to only what is required for its intended functionality.
In Place | Not In Place
1-3.a Examine vendor documentation to verify the software is designed and built in a manner that facilitates its overall composition being restricted to only what is required for its intended functionality. Leverage information from 1-1 and 1-2.
<Assessor Response>
1-3.b Perform static analysis to verify the information from 1-3.a.
<Assessor Response>
1-3.c Examine evidence of the software-build process to verify the information from 1-2.
<Assessor Response>

1-3.1 This includes all third-party elements.
In Place | Not In Place
1-3.1.a Examine vendor documentation …
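Since the vendor may choose any format for the bill of materials, one common machine-readable choice is a CycloneDX-style component list. A hedged sketch of how an assessor-side check for 1-2.a might look (the component names and the exact structure are invented for illustration; this is not a format the template mandates):

```python
# Minimal illustrative bill of materials; field names loosely follow
# CycloneDX conventions, and the component data is invented.
bom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "components": [
        {"type": "library", "name": "openssl", "version": "3.0.13"},
        {"type": "library", "name": "zlib", "version": "1.3.1"},
    ],
}

# 1-2.a asks the assessor to verify each observed dependency of the
# software appears in the documented bill of materials.
documented = {(c["name"], c["version"]) for c in bom["components"]}
print(("openssl", "3.0.13") in documented)  # True
```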
Added p. 35
In Place | Not In Place
1-5.a Examine vendor documentation describing the versioning schema of the software and verify it is in accordance with the Program.
<Assessor Response>

1-5.1 If wildcards are being used, the wildcarding schema is explicitly documented and will be implemented per the Program and used only for non-security-impacting changes to the software.
In Place | N/A | Not In Place
1-5.1.a Examine vendor documentation describing the wildcarding schema of the software and verify it is in accordance with the Program and intended for only non-security-impacting changes to the software.
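A documented wildcarding schema is typically a fixed version pattern in which designated elements are allowed to vary, reserved for non-security-impacting changes. As an illustrative sketch only (the 'x' convention and the function are assumptions; the Program Guide defines the actual versioning and wildcard rules), a tool might verify that released version numbers stay within the documented wildcard pattern:

```python
import re

def matches_wildcard_schema(schema: str, version: str) -> bool:
    """Check a version string against a documented wildcard schema,
    where 'x' marks an element allowed to vary (reserved for
    non-security-impacting changes); all other elements are fixed."""
    parts = []
    for element in schema.split("."):
        parts.append(r"\d+" if element == "x" else re.escape(element))
    return re.fullmatch(r"\.".join(parts), version) is not None

print(matches_wildcard_schema("3.1.x", "3.1.42"))  # True: only the wildcard varies
print(matches_wildcard_schema("3.1.x", "3.2.0"))   # False: a fixed element changed
```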
Added p. 36
- Sensitive Asset Identification document for assistance in identifying and documenting sensitive assets.
2 - Accurate and complete identification and documentation for all sensitive assets is crucial; this information is relevant to, and required for, additional security objective sections and their associated security requirements in this standard.
3 - Software vendors are encouraged to identify the sensitive assets early and often in the design of their software to assist in the software being designed with intent to satisfy the security objectives and security requirements in this standard.
4 - If the software contains account data as defined by PCI DSS (which is considered sensitive data within this standard), refer to Module A - Account Data Protection in this standard for additional requirements and information. Module A - Account Data Protection does not circumvent any requirements in the "Core

Select the overall Finding for this Security Objective → In Place | Not In Place

Security Objective …
Added p. 37
<Assessor Response>

2-1.2 The storage, including storage locations, of all sensitive data where storage is permissible are documented.
In Place | Not In Place
2-1.2.a Examine vendor documentation to verify it contains details that describe whether or not each sensitive data element is, or can be, stored, in addition to the storage locations as applicable.
<Assessor Response>
2-1.2.b Verify the sensitive data elements denoted as being stored, or capable of being stored, are permissible for storage.
Testing Notes: The software is analyzed and tested in 3-1.1.
Testing Notes: The software is analyzed and tested in 3-1.2.
<Assessor Response>

2-1.3 The retention policies for all sensitive data are documented, which includes:
In Place | Not In Place
2-1.3.a Examine vendor documentation to verify it contains details that describe the retention policies attributed to, and implemented for, each sensitive data element that:
<Assessor Response>
ROV Instruction: Use the instruction in 2-1.3.a to test the criteria in 2-1.3.1.a and 2-1.3.2.a.

2-1.3.1 The retention …
Added p. 38
Implementation Notes: This applies regardless of the form, e.g., cleartext, encrypted, etc.

In Place | Not In Place
2-1.4.a Examine vendor documentation to verify it contains details that describe the methods implemented for each sensitive data element to render it unrecoverable once it is no longer required.
Testing Notes: The software is analyzed and tested in 3-1.3, 3-1.4, and 3-2.
<Assessor Response>

2-1.5 The protection classifications for all sensitive data, in accordance with the appropriate protection needs for each type and its use in order to facilitate mitigating their unauthorized access, disclosure, modification, and/or misuse, are documented.
In Place | Not In Place
2-1.5.a Examine vendor documentation to verify it contains details that describe the protection classification attributed to each sensitive data element.
<Assessor Response>
2-1.5.b Based on the evidence examined for 2-1.1 and in 2-1.5.a, verify the protection classification attribution for each sensitive data element facilitates mitigating its unauthorized access, disclosure, modification, and/or misuse.
<Assessor Response>

2-1.6 …
Added p. 39
<Assessor Response>

2-1.8 For all cryptographic keys associated with sensitive assets, document the following additional information, at a minimum:

Implementation Notes: All uses of cryptography to protect sensitive assets must satisfy the definition of strong cryptography.

<Assessor Response>
2-1.8.1 Key type
In Place | Not In Place
2-1.8.1.a Examine vendor documentation to verify it denotes the key type for each cryptographic key.
<Assessor Response>
2-1.8.2 Associated cryptographic algorithm
In Place | Not In Place
2-1.8.2.a Examine vendor documentation to verify it denotes the associated cryptographic algorithm for each cryptographic key.
<Assessor Response>
2-1.8.3 Associated key management schema
In Place | Not In Place
2-1.8.3.a Examine vendor documentation to verify it denotes the associated key management schema for each cryptographic key.
<Assessor Response>
2-1.8.4 Key length
In Place | Not In Place
2-1.8.4.a Examine vendor documentation to verify it denotes the associated key length for each cryptographic key.
<Assessor Response>
2-1.8.5 Generation Method & Origin
In Place | Not In Place
2-1.8.5.a …
Added p. 40
<Assessor Response>

2-1.8.8 All associations with sensitive resources, as applicable.
In Place | Not In Place
2-1.8.8.a Examine vendor documentation to verify it denotes, as applicable, all associations between each cryptographic key and sensitive resources. Leverage the information examined as part of 2-2.1.
<Assessor Response>
2-1.8.9 All associations with sensitive functionality, as applicable.
In Place | Not In Place
2-1.8.9.a Examine vendor documentation to verify it denotes, as applicable, all associations between each cryptographic key and specific sensitive functionality. Leverage the information examined as part of 2-3.1.
<Assessor Response>

2-2 The sensitive resources associated with the software are identified and documented, including the following details, at a minimum:
In Place | Not In Place
This requirement is tested via 2-2.1 through 2-2.8.

2-2.1 The description and use of all sensitive resources are documented.
In Place | Not In Place
2-2.1.a Examine vendor documentation to verify it contains details that describe each sensitive resource and its use.
<Assessor Response>
2-2.1.b …
Added p. 41
In Place | Not In Place
2-2.2.a Examine vendor documentation to verify it denotes, as applicable, all sensitive data associated with each sensitive resource. Leverage the information examined as part of 2-1.1.
<Assessor Response>

2-2.3 The storage, including storage locations, of all sensitive resources where storage is permissible are documented.
In Place | Not In Place
2-2.3.a Examine vendor documentation to verify it contains details that describe whether or not each sensitive resource is, or can be, stored, in addition to the storage locations as applicable.
<Assessor Response>
2-2.3.b Verify the sensitive resources that can be stored are permissible for storage.
<Assessor Response>
2-2.3.c Examine vendor documentation to determine if there are any sensitive data elements contained within the sensitive resource and verify those sensitive data elements are permitted for storage.
<Assessor Response>

2-2.4 The retention policies for all sensitive resources are documented, which includes:
In Place | Not In Place
2-2.4.a Examine vendor documentation to verify it …
Added p. 42
In Place | Not In Place
2-2.6.a Examine vendor documentation to verify it contains details that describe the protection classification attributed to each sensitive resource.
<Assessor Response>
2-2.6.b Based on the evidence examined for 2-2.1 and in 2-2.6.a, verify the protection classification attribution for each sensitive resource facilitates mitigating its unauthorized access, disclosure, modification, and/or misuse.
<Assessor Response>

2-2.7 The protection methods for all sensitive resources, to facilitate mitigating their unauthorized access, disclosure, modification, and/or misuse, are documented.

Implementation Notes: All uses of cryptography to protect sensitive resources must satisfy the definition of strong cryptography.

In Place | Not In Place
2-2.7.a Examine vendor documentation to verify it contains details that describe the protection methods attributed to, and implemented for, each sensitive resource.
<Assessor Response>
2-2.7.b Examine vendor documentation to verify the protection methods employed for each sensitive resource are in accordance with and satisfy the protection classification attributions verified in 2-2.6.
<Assessor Response>

2-2.8 The resource flows …
Added p. 43
<Assessor Response> 2-3.1.b Perform static analysis to verify the sensitive functionality identified in 2-3.1.a.

<Assessor Response> 2-3.1.c Perform static analysis to identify any functionality that satisfies the definition of sensitive functionality and was not previously identified as sensitive functionality. The analysis is expected to check for qualifying sensitive functionality that has been unaccounted for. Verify that all sensitive functionality is identified.

Testing Notes: The test requirement 2-3.1.c is intended as a means to corroborate the sensitive functionality claimed by the vendor against the source code of the software product under assessment. A reasonable analysis should check for obvious qualifying sensitive functionality that has been unaccounted for.

<Assessor Response> 2-3.2 The external accessibility of the sensitive functionality is documented. In Place N/A Not In Place 2-3.2.a Examine vendor documentation to verify it contains details that describe the capability of external accessibility for all applicable sensitive functionality.

<Assessor Response> 2-3.2.1 Sensitive functionality is only externally accessible …
Added p. 44
Implementation Notes:

This requirement is specific to sensitive functionality, whereas requirements in 5-1[.x] are specific to the overall security architecture of the software itself. There may be overlap that can be leveraged. All uses of cryptography to protect sensitive assets, which includes the software itself, must satisfy the definition of strong cryptography.

In Place Not In Place 2-3.3.a Examine vendor documentation to verify it contains details that describe how the software is designed and implemented to facilitate mitigating the unauthorized access, disclosure, modification, and/or misuse of all sensitive functionality, as appropriate. Leverage the information examined as part of 2-3.1 and 2-3.2.

<Assessor Response> 2-3.4 All sensitive data associated with the sensitive functionality is documented.

In Place Not In Place 2-3.4.a Examine vendor documentation to verify it denotes all sensitive data elements associated with the sensitive functionality. Leverage the information examined as part of 2-1.1 and 2-3.1.

In Place Not In Place 2-3.5.a Examine vendor documentation …
Added p. 45
<Assessor Response> 2-3.6.c Perform static analysis to identify any functionality that satisfies the definition of sensitive modes of operation and was not previously identified as a sensitive mode of operation. The analysis is expected to check for qualifying sensitive modes of operation that have been unaccounted for. Verify that all sensitive modes of operation are identified.
Added p. 46
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective 3: Sensitive Asset Storage and Retention Security Requirements and Test Requirements Assessor’s Findings and Observations 3-1 Sensitive data that is capable of being stored is: In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-1.1 results in determining the software product does not have any sensitive data and security requirement 2-1 is marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in each 2-1.1.a/b/c test requirement as it relates to the prescribed test activity), then 3-1 can be marked as ‘N/A’. For an ‘N/A’ finding for 3-1 via assessing 2-1.1, the remaining 3-1.x and 3-2 requirements can then be left blank. Note: An ‘N/A’ finding due to zero sensitive data being identified is considered to be significant and of low probability.

3-1.a Examine vendor documentation to …
Added p. 47
• Attempting to violate, bypass, or otherwise circumvent the defined and implemented protection methods.

<Assessor Response> 3-1.2.1 Strong cryptography is used where cryptography is implemented or required to protect sensitive data in storage.

In Place Not In Place 3-1.2.1.a Perform static analysis to verify that cryptography leveraged to protect sensitive data in storage satisfies the definition of strong cryptography.

<Assessor Response> 3-1.2.1.b Perform static and/or dynamic analysis as necessary to verify that the use of strong cryptography being leveraged to protect sensitive data in storage cannot be violated, bypassed, or otherwise circumvented.
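Where source code is available, part of the static analysis contemplated in 3-1.2.1.a can be assisted by a simple pattern pass over the code base. The following is an illustrative sketch only: the deny-list is a hypothetical placeholder (it does not even catch variants such as 3DES/TDEA) and does not define “strong cryptography” — that definition comes from the PCI SSC glossary.

```python
import re

# Illustrative deny-list only. What qualifies as "strong cryptography"
# is formally defined in the PCI SSC glossary, not by this sketch.
# (A real pass would also handle variants such as 3DES/TDEA.)
WEAK_ALGORITHMS = {
    "DES":   r"\bDES\b",
    "RC4":   r"\bRC4\b",
    "MD5":   r"\bMD5\b",
    "SHA-1": r"\bSHA-?1\b",
}

def flag_weak_crypto(line: str) -> list[str]:
    """Return the names of deprecated algorithms referenced on one source line."""
    return [name for name, pat in WEAK_ALGORITHMS.items()
            if re.search(pat, line, re.IGNORECASE)]

assert flag_weak_crypto("cipher = AES.new(key, AES.MODE_GCM, nonce)") == []
assert flag_weak_crypto("digest = MD5(record)") == ["MD5"]
```

Such a pass only narrows the search; the assessor still has to confirm key lengths, modes of operation, and key management by inspection.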

<Assessor Response> 3-1.3 Stored in accordance with the defined retention policies. In Place Not In Place 3-1.3.a Perform static analysis to verify the sensitive data is stored in accordance with the defined retention policies. Leverage the information from 2-1.3[.x].

<Assessor Response> 3-1.3.b Perform dynamic analysis to verify that sensitive data is retained in accordance with the defined retention policies. Testing includes but is …
Added p. 48
<Assessor Response> 3-1.4.b Perform dynamic analysis to verify the analysis and findings in 3-1.4.a. Testing should include, but is not limited to:

• Attempting to violate, bypass, or otherwise circumvent the methods employed to securely delete or render the sensitive data unrecoverable.

• Attempting to violate, bypass, or otherwise circumvent the methods employed to securely delete or render the sensitive data unrecoverable.
Added p. 48
• Attempting to recover sensitive data after being securely deleted or otherwise rendered unrecoverable.

• Attempting to recover sensitive data after being securely deleted or otherwise rendered unrecoverable.

<Assessor Response> 3-2 Sensitive data is only retained in non-persistent memory for the duration necessary, after which time it is securely deleted, else it is rendered unrecoverable.

In Place Not In Place 3-2.a Perform static analysis to verify the sensitive data is securely deleted once it is no longer necessary to retain. If this is not possible due to a legitimate and verified technical constraint, then verify the sensitive data is rendered unrecoverable.
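The secure-deletion behavior targeted by 3-2.a can be illustrated in miniature. This is a hedged sketch: it assumes the sensitive value is held in a mutable buffer, and it notes that in a managed runtime an in-place overwrite is best-effort, since interim copies made by the interpreter are not reachable this way.

```python
def zeroize(buf: bytearray) -> None:
    """Overwrite a mutable buffer in place. Best-effort in a managed
    runtime: interim copies the interpreter made elsewhere (e.g. via
    immutable bytes/str objects) are not reachable this way."""
    for i in range(len(buf)):
        buf[i] = 0

pan = bytearray(b"4111111111111111")  # hypothetical sensitive value
zeroize(pan)
assert pan == bytearray(len(pan))     # buffer now holds only zero bytes
```

This is one reason the static analysis looks for sensitive data being kept in mutable, deliberately managed buffers rather than in immutable objects scattered through the heap.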

<Assessor Response> 3-2.b Perform dynamic analysis to verify the analysis and findings in 3-2.a. Testing should include, but is not limited to:

<Assessor Response> 3-3 Sensitive resources that are capable of being stored are: In Place Not In Place 3-3.a Examine vendor documentation to verify the requirements in 3-3.1 through 3-3.4. Leverage the information …
Added p. 49
In Place Not In Place 3-3.1.a Perform static analysis to verify that all sensitive resources capable of being stored are only stored as necessary and are permissible to be stored.

<Assessor Response> 3-3.1.b Perform static analysis to identify all sensitive resources that are capable of being stored that were not previously identified as being stored. Verify at this point that all sensitive resources capable of being stored are accounted for. The intent here is to uncover any sensitive resources that have not been previously identified.

<Assessor Response> 3-3.1.c Perform dynamic analysis to verify that all sensitive resources capable of being stored are only stored as necessary and are permissible to be stored.

<Assessor Response> 3-3.1.d Perform dynamic analysis to verify that all sensitive resources that are not permissible to be stored are not capable of being stored.

In Place Not In Place 3-3.2.a Perform static analysis to verify that all sensitive resources capable of …
Added p. 50
<Assessor Response> 3-3.3 Stored in accordance with the defined retention policies. In Place Not In Place 3-3.3.a Perform static analysis to verify the sensitive resources are stored in accordance with the defined retention policies that are reviewed and verified from all applicable requirements in Security Objective 2, in particular 2-2.4[.x].

<Assessor Response> 3-3.3.b Perform dynamic analysis to verify that sensitive resources are retained in accordance with the defined retention policies. Testing includes, but is not limited to:

• Attempting to violate, bypass, or otherwise circumvent the defined retention policies.

Testing Notes The testing and analysis performed for 3-3.3 will be related to, and leveraged for, the testing and analysis in 3-3.4.

<Assessor Response> 3-3.4 Only stored until they are no longer necessary, at which time they are securely deleted, else they are rendered unrecoverable.

In Place Not In Place 3-3.4.a Perform static analysis to verify the sensitive resources are securely deleted once they are no …
Added p. 51
• Attempting to violate, bypass, or otherwise circumvent the methods employed to securely delete or render the sensitive resources unrecoverable.

• Attempting to violate, bypass, or otherwise circumvent the methods employed to securely delete or render the sensitive resources unrecoverable.

• Attempting to recover sensitive resources after being securely deleted or otherwise rendered unrecoverable.

• Attempting to recover sensitive resources after being securely deleted or otherwise rendered unrecoverable.

<Assessor Response> 3-4 Sensitive resources are only retained in non-persistent memory for the duration necessary, at which time they are securely deleted, else they are rendered unrecoverable.

In Place Not In Place 3-4.a Perform static analysis to verify the sensitive resources are securely deleted once they are no longer necessary to retain. If this is not possible due to a legitimate and verified technical constraint, then verify the sensitive resources are rendered unrecoverable.
Added p. 52
Select the overall Finding for this Security Objective → In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-3.6 results in determining the software does not implement a sensitive mode of operation and is therefore marked and documented appropriately and accurately as ‘N/A’, then Security Objective 4 can be marked as ‘N/A’, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remainder of this section and the 4-x requirements can then be left blank.

Security Objective 4: Sensitive Modes of Operation Security Requirements and Test Requirements Assessor’s Findings and Observations 4-1 Sensitive modes of operation are designed to facilitate mitigating their unauthorized access and minimizing their misuse during authorized access, which includes but is not limited to the following:

In Place Not In Place 4-1.a Examine vendor documentation to verify the requirements in 4-1.1 through 4-1.9. Leverage the information examined and verified …
Added p. 52
• Attempting to violate, bypass, or otherwise circumvent the implemented strong authentication mechanisms.

<Assessor Response> 4-1.2 Implement a defined maximum number of failed access attempts in a defined period of time.

In Place Not In Place 4-1.2.a Examine vendor documentation to verify a defined maximum number of failed access attempts within a defined period of time is implemented for each sensitive mode of operation.
Added p. 53
• Attempting to violate, bypass, or otherwise circumvent the threshold limits and/or duration of time it occurs in.

<Assessor Response> 4-1.3 Implement a defined lockout period, initiated upon the allowable maximum number of failed access attempts in a defined period of time being reached.

In Place Not In Place 4-1.3.a Examine vendor documentation to verify a defined lockout period is initiated upon the maximum number of failed access attempts within a defined period of time being reached for each sensitive mode of operation.

• Attempting to violate, bypass, or otherwise circumvent the defined lockout periods.
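The interplay of 4-1.2 (failed-attempt threshold within a defined period) and 4-1.3 (lockout initiated when that threshold is reached) can be sketched as below. The thresholds are hypothetical placeholders; the standard requires that they be defined and documented by the vendor, it does not prescribe these values.

```python
# Hypothetical placeholders; the vendor defines and documents the real values.
MAX_ATTEMPTS = 3         # allowed failed access attempts ...
WINDOW_SECONDS = 60.0    # ... within this defined period of time
LOCKOUT_SECONDS = 300.0  # lockout initiated when the threshold is reached

class AccessGate:
    def __init__(self) -> None:
        self.failures = []        # timestamps of recent failed attempts
        self.locked_until = 0.0

    def record_failure(self, now: float) -> None:
        # Keep only failures inside the sliding window, then count this one.
        self.failures = [t for t in self.failures if now - t < WINDOW_SECONDS]
        self.failures.append(now)
        if len(self.failures) >= MAX_ATTEMPTS:
            self.locked_until = now + LOCKOUT_SECONDS

    def is_locked(self, now: float) -> bool:
        return now < self.locked_until

gate = AccessGate()
for t in (0.0, 1.0, 2.0):
    gate.record_failure(t)
assert gate.is_locked(2.0)        # third failure in the window triggers lockout
assert not gate.is_locked(302.1)  # lockout period has elapsed
```

The dynamic-analysis bullets above amount to probing exactly this state machine: exceeding the threshold, racing the window, and trying to act during the lockout period.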

<Assessor Response> 4-1.4 Are designed in a manner that failed access attempts do not disclose information that can assist in gaining unauthorized access.

In Place Not In Place 4-1.4.a Examine vendor documentation to verify for each sensitive mode of operation that failed access attempts do not disclose information that can assist in gaining unauthorized access.

• Attempting to violate, bypass, or …
Added p. 54
• Attempting to violate, bypass, or otherwise circumvent the implemented inactivity timeouts.

<Assessor Response> 4-1.6 Implement a defined maximum duration of use timeout, upon which the software exits the sensitive mode of operation and effectively returns to normal operation.

In Place Not In Place 4-1.6.a Examine vendor documentation to verify for each sensitive mode of operation that a maximum duration of use timeout has been implemented, upon which the software exits the sensitive mode of operation and returns to normal operation.

• Attempting to violate, bypass, or otherwise circumvent the implemented maximum duration of use timeouts.

<Assessor Response> 4-1.7 The software is designed to retain, or facilitate the retention of, a record of all failed and successful access to sensitive modes of operation.

Implementation Notes The software can create the record or otherwise provide the required and pertinent information such that a record can be created. The event information is the essential aspect that needs …
Added p. 55
• Attempting to access each sensitive mode of operation using invalid information and verifying the subsequent record creation for the unique failed access attempt.

<Assessor Response> 4-1.7.2 The records include information that can uniquely identify each successful access event, including traceability to the entity that access was granted to.

In Place Not In Place 4-1.7.2.a Examine vendor documentation to verify for each sensitive mode of operation that records include information that can uniquely identify each successful access event, including the entity that access was granted to.

• Accessing each sensitive mode of operation using valid information and verifying the subsequent record creation for the unique successful access event.
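The record properties in 4-1.7.1/4-1.7.2 — unique identification of each access event plus traceability to the entity — can be sketched as a minimal record structure. The field names are illustrative assumptions, not mandated by the standard.

```python
import datetime
import json
import uuid

def access_record(entity: str, mode: str, success: bool) -> str:
    """Build one access record. Field names are illustrative, not mandated."""
    record = {
        "event_id": str(uuid.uuid4()),  # uniquely identifies this event
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "entity": entity,               # traceability to the accessing entity
        "mode_of_operation": mode,
        "outcome": "success" if success else "failure",
    }
    return json.dumps(record)

first = json.loads(access_record("admin-7", "key-loading", True))
second = json.loads(access_record("admin-7", "key-loading", True))
assert first["event_id"] != second["event_id"]  # each event stays distinguishable
```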

<Assessor Response> 4-1.7.3 The records include information that can uniquely identify the net effect, or change, resulting from access to the sensitive mode of operation.

In Place Not In Place 4-1.7.3.a Examine vendor documentation to verify for each sensitive mode of operation that records include information that …
Added p. 56
<Assessor Response> 4-1.7.4 The software is designed to protect these records from compromise using strong cryptography.

In Place Not In Place 4-1.7.4.a Examine vendor documentation to verify for each sensitive mode of operation that records are protected using strong cryptography.

• Attempting to bypass the protection mechanisms to gain access to cleartext records and/or the relative information.
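One common ingredient of the record protection required by 4-1.7.4 is an integrity tag, sketched below with HMAC-SHA-256. This covers tamper detection only; protecting records from disclosure would additionally require encryption, and the key here is generated ad hoc for the sketch rather than managed as a real implementation must.

```python
import hashlib
import hmac
import secrets

# Ad hoc key for the sketch only; a real implementation uses a managed
# key with its own protection and lifecycle.
KEY = secrets.token_bytes(32)

def seal(record: bytes) -> bytes:
    """Append an HMAC-SHA-256 tag so tampering with the record is detectable."""
    return record + hmac.new(KEY, record, hashlib.sha256).digest()

def verify(sealed: bytes) -> bool:
    record, tag = sealed[:-32], sealed[-32:]
    expected = hmac.new(KEY, record, hashlib.sha256).digest()
    return hmac.compare_digest(tag, expected)

sealed = seal(b'{"event_id": "...", "outcome": "failure"}')
assert verify(sealed)
tampered = sealed[:-1] + bytes([sealed[-1] ^ 0x01])
assert not verify(tampered)  # a single flipped bit invalidates the tag
```

The bypass attempts described in the test bullets correspond to forging or stripping such tags, or reaching the records before they are sealed.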

<Assessor Response> 4-1.7.5 The software is designed to require strong authentication to access these records.

In Place Not In Place 4-1.7.5.a Examine vendor documentation to verify for each sensitive mode of operation that strong authentication is required to access associated records.

• Attempting to bypass the authentication mechanisms to gain access to records and/or the relevant information.

<Assessor Response> 4-1.7.6 The records are retained for a defined retention period.

Implementation Notes The software is not required to retain the records on the same system on which the software resides. The records can be offloaded elsewhere, in either physical and/or logical form. However, …
Added p. 57
• Attempting to violate, bypass, or otherwise circumvent the implemented record-retention parameters.

<Assessor Response> 4-1.7.7 Records transmitted outside the software are protected in accordance with requirement 6-2.

In Place Not In Place 4-1.7.7.a Verify that records associated with sensitive modes of operation (requirements 4-1.7[.x]) have been accounted for in the assessment of requirement 6-2[.x].

<Assessor Response> 4-1.8 Sensitive modes of operation are designed in accordance with requirement 5-4.3.1.

<Assessor Response> 4-1.9 Sensitive modes of operation are designed in accordance with requirement 5-5.3.1.

In Place Not In Place 4-1.8.a Verify that each sensitive mode of operation has been accounted for in the assessment of requirement 5-4.3.1 regarding secure authorization.

In Place Not In Place 4-1.9.a Verify that each sensitive mode of operation has been accounted for in the assessment of requirement 5-5.3.1 regarding mitigating inadvertently disclosing, exposing, or otherwise leaking sensitive assets.
Added p. 58
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective 5: Sensitive Asset Protection Mechanisms Security Requirements and Test Requirements Assessor’s Findings and Observations 5-1 Platform-based security mechanisms relied upon by the software to facilitate protecting sensitive assets have been evaluated. Implementation Notes Leveraging underlying platform-based security mechanisms is not required; however, their use does not supersede or otherwise replace any security requirements in this standard.

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 5-1 results in determining the finding is ‘N/A’, then security requirement 5-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 5-1.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 5-1 test requirements can then be left blank.

5-1.a Examine vendor documentation to verify the platform mechanisms used, and to what extent …
Added p. 59
<Assessor Response> 5-2.b Perform static analysis to verify the information from 5-2.a. <Assessor Response> 5-2.c Perform dynamic analysis to verify the analysis and findings from 5-2.a/b. Testing should include, but is not limited to:

• Attempting to bypass or otherwise circumvent the implemented mechanisms.

<Assessor Response> 5-2.1 Facilitating the mitigation of anomalous behavior as a result of input from external sources.

In Place Not In Place 5-2.1.a Examine vendor documentation to verify how the software is designed to facilitate mitigating anomalous behavior as a result of input from external sources.
Added p. 59
• Attempting to purposefully manipulate inputs with intent to cause risk to sensitive assets. Testing Notes The testing strategies will be highly contingent on, and should be catered to: the type of software, the programming language(s), the specific design and interfaces, etc. The goal is to verify there is demonstrable evidence that mechanisms are in place and seemingly effective at satisfying the requirement.
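A concrete instance of the mitigation described in 5-2.1 is strict validation of external input before it reaches sensitive processing. The sketch below assumes a hypothetical input format (an amount as 1-12 ASCII digits, in cents); the point is the reject-by-default shape, not the specific rule.

```python
import re

def parse_amount(raw: str) -> int:
    """Accept only a strictly formatted amount (hypothetical format:
    1-12 ASCII digits, interpreted as cents) before the value reaches
    any sensitive processing; everything else is rejected outright."""
    if not re.fullmatch(r"[0-9]{1,12}", raw):
        raise ValueError("rejected: malformed external input")
    return int(raw)

assert parse_amount("1999") == 1999
for bad in ("", "-5", "19.99", "1e9", "9" * 13):
    try:
        parse_amount(bad)
        raise AssertionError("input should have been rejected")
    except ValueError:
        pass
```

The manipulated-input testing described above probes whether such validation exists at every external interface, not just the obvious ones.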

<Assessor Response> 5-2.2 Facilitating the mitigation of anomalous behavior as a result of error conditions.

In Place Not In Place 5-2.2.a Examine vendor documentation to verify how the software is designed to facilitate mitigating anomalous behavior as a result of expected error conditions.

<Assessor Response> 5-2.2.b Perform static analysis to verify the information from 5-2.2.a. <Assessor Response>
Added p. 60
• Attempting to violate, bypass, or otherwise circumvent the implemented mechanisms.

• Attempting to purposefully manipulate the software with intent to cause risk to sensitive assets due to unexpected consequences of errors. Testing Notes The testing strategies will be highly contingent on, and should be catered to: the type of software, the programming language(s), the specific design and architecture, etc. The goal is to verify there is demonstrable evidence that mechanisms are in place and seemingly effective at satisfying the requirement.

<Assessor Response> 5-2.3 Facilitating the mitigation of anomalous behavior as a result of retrieving or receiving externally-hosted third-party elements during runtime.

In Place Not In Place 5-2.3.a Examine vendor documentation to verify if the software is capable of retrieving or receiving externally-hosted third-party elements during runtime.

<Assessor Response> 5-2.3.b Perform static analysis to verify the information from 5-2.3.a. If it is determined that the software is capable of retrieving or receiving externally-hosted …
Added p. 61
In Place Not In Place 5-3.a Examine vendor documentation to verify how the software is designed to facilitate detecting suspected anomalous behavior in order to protect sensitive assets.

• Attempting to violate, bypass, or otherwise circumvent the implemented mechanisms. Testing Notes The testing strategies will be highly contingent on, and should be catered to: the type of software, the programming language(s), the specific design and architecture, etc. The goal is to verify there is demonstrable evidence that mechanisms are in place and seemingly effective at satisfying the requirement.

<Assessor Response> 5-3.1 The software is designed to facilitate mitigating, or at least minimizing, the impact of suspected anomalous behavior, or otherwise fails in a secure manner.

In Place Not In Place 5-3.1.a Examine vendor documentation to verify how the software is designed to facilitate mitigating, or at least minimizing, the impact of suspected anomalous behavior, or otherwise fails in a secure manner.
Added p. 62
• Attempting to violate, bypass, or otherwise circumvent the implemented mechanisms. Testing Notes The testing strategies will be highly contingent on, and should be catered to: the type of software, the programming language(s), the specific design and architecture, etc. The goal is to verify there is demonstrable evidence that mechanisms are in place and seemingly effective at satisfying the requirement.

<Assessor Response> 5-3.2 The software is designed to provide an immediate indication of suspected anomalous behavior.

In Place Not In Place 5-3.2.a Examine vendor documentation to verify how the software is designed to provide an immediate indication of suspected anomalous behavior.

<Assessor Response> 5-3.2.1 The mechanism used to provide an indication of suspected anomalous behavior is protected against compromise.

In Place Not In Place 5-3.2.1.a Examine vendor documentation to verify how the software is designed to protect the indication of suspected anomalous behavior against compromise.

Implementation Notes The software can create the record or otherwise …
Added p. 63
<Assessor Response> 5-3.3.1 The records include information that can uniquely identify the suspected anomalous behavior event.

In Place Not In Place 5-3.3.1.a Examine vendor documentation to verify that records of suspected anomalous behavior include information that can uniquely identify each event.

• Triggering, or simulating, an anomalous behavior event and verifying the subsequent record creation for the unique event.

In Place Not In Place 5-3.3.2.a Examine vendor documentation to verify that records are protected using strong cryptography.

• Attempting to bypass the protection mechanisms to gain access to cleartext records and/or the relevant information.

In Place Not In Place 5-3.3.3.a Examine vendor documentation to verify that strong authentication is required to access associated records.

• Attempting to bypass the authentication mechanisms to gain access to records and/or the relevant information.

• Attempting to violate, bypass, or otherwise circumvent the implemented record-retention parameters.

<Assessor Response> 5-3.3.5 Records transmitted outside the software are protected in accordance with requirement 6-2.
Added p. 64
<Assessor Response> 5-3.3.4 The records are retained for a defined retention period. Implementation Notes The software is not required to retain the records on the same system on which the software resides. The records can be offloaded elsewhere, in either physical and/or logical form. However, doing so still requires the records to be protected.

In Place Not In Place 5-3.3.4.a Examine vendor documentation to verify records of suspected anomalous behavior events are retained for a defined retention period.

In Place Not In Place 5-3.3.5.a Verify that records associated with anomalous behavior events (requirements 5-3[.x]) have been accounted for in the assessment of requirement 6-2[.x].

<Assessor Response> 5-4 The software is designed to facilitate securely implementing authorized access to sensitive assets, which includes access to:

In Place Not In Place 5-4.a Examine vendor documentation to verify that the software is designed to facilitate securely implementing authorized access to sensitive assets, including the avoidance of known authorization-based flaws. …
Added p. 65
• Attempting to violate, bypass, or otherwise circumvent the authorized access mechanisms to sensitive data.

<Assessor Response> 5-4.2 Sensitive resources In Place Not In Place 5-4.2.a Perform static analysis to verify the context of 5-4 in relation to sensitive resources.
Added p. 65
• Attempting to violate, bypass, or otherwise circumvent the authorized access mechanisms to sensitive resources.

<Assessor Response> 5-4.3 Sensitive functionality, which includes: In Place Not In Place 5-4.3.a Perform static analysis to verify the context of 5-4 in relation to sensitive functionality.

• Attempting to violate, bypass, or otherwise circumvent the authorized access mechanisms to sensitive functionality.

<Assessor Response> 5-4.3.1 Sensitive modes of operation In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-3.6 results in determining the software does not implement a sensitive mode of operation and is therefore marked and documented appropriately and accurately as ‘N/A’, then security requirement 5-4.3.1 can be marked as ‘N/A’ above, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The test requirements 5-4.3.1.x can then be left blank.

5-4.3.1.a Perform static analysis to verify the context of 5-4 in relation to sensitive modes of operation.

<Assessor Response> 5-4.3.1.b …
Added p. 66
In Place Not In Place 5-5.a Examine vendor documentation to verify the software is designed to facilitate mitigating inadvertently disclosing, exposing, or otherwise leaking sensitive assets. This information will be used to assist in the assessment for 5-5.1 through 5-5.3[.x]. Testing Notes The intent here is not to reconfirm testing/evidence from other requirements. This is not the same context as 5-4 regarding authorized access. This requirement 5-5 is regarding unexpected disclosure/exposure in unintended ways.

<Assessor Response> 5-5.1 Sensitive data In Place Not In Place 5-5.1.a Perform static analysis to verify the software is designed to facilitate mitigating inadvertently disclosing, exposing, or otherwise leaking sensitive data. Leverage, in part, the information from Security Objective 2 regarding sensitive data, including information regarding sensitive functionality related to sensitive data.

<Assessor Response> 5-5.2 Sensitive resources In Place Not In Place 5-5.2.a Perform static analysis to verify the software is designed to facilitate mitigating inadvertently disclosing, exposing, …
Added p. 67
<Assessor Response> 5-5.3.1 Sensitive modes of operation In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-3.6 results in determining the software does not implement a sensitive mode of operation and is therefore marked and documented appropriately and accurately as ‘N/A’, then security requirement 5-5.3.1 can be marked as ‘N/A’ above, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The test requirements 5-5.3.1.x can then be left blank.

<Assessor Response> 5-5.3.1.b Perform dynamic analysis to verify the analysis and findings from 5-5.3.1.a. Testing should include, but is not limited to:

• Attempting to find unexpected ways to gain access to sensitive resources.

<Assessor Response> 5-5.3 Sensitive functionality, which includes: In Place Not In Place 5-5.3.a Perform static analysis to verify the software is designed to facilitate mitigating inadvertently disclosing, exposing, or otherwise leaking sensitive functionality. Leverage, in part, the information from Security Objective …
Added p. 68
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective 6: Sensitive Asset Output Security Requirements and Test Requirements Assessor’s Findings and Observations 6-1 All forms of sensitive assets that are capable of being output from the software are identified and documented. Implementation Notes While the flow diagrams in 2-1.7 and 2-2.8 are used to document all sensitive data and sensitive resources being output from the software, they are confirmed here in the context of Security Objective 6. This requirement is in regard to any output: cleartext, encrypted, truncated, hashed, and/or any other form.

In Place Not In Place 6-1.a Leverage the information from 2-1[.x] and 2-2[.x], in addition to the flow information from 2-1.7 and 2-2.8 and perform static analysis to verify all sensitive assets that are capable of being output from the software are identified and documented, including all forms of potential output …
Added p. 69
In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 6-2.1 results in determining the finding is ‘N/A’, then security requirement 6-2.1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 6-2.1.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 6-2.1 test requirements can then be left blank.

6-2.1.a Examine vendor documentation to verify the cryptographic algorithms and associated key lengths are documented and satisfy the use of strong cryptography.

<Assessor Response> 6-2.1.b Perform static analysis to verify the software is designed to encrypt the applicable cleartext sensitive assets identified in 6-1 using strong cryptography.

<Assessor Response> 6-2.1.c Perform dynamic analysis to verify the analysis and findings from 6-2.1.a/b. Testing should include, but is not limited to:

• Attempting to obtain applicable sensitive assets being output in cleartext from the software.

<Assessor Response> 6-2.2 If the software …
Added p. 70
<Assessor Response> 6-2.2.3 The root of trust used for each secure channel is documented. In Place Not In Place 6-2.2.3.a Examine vendor documentation to verify the root of trust of each secure channel is identified and documented.

<Assessor Response> 6-2.2.4 The cryptography supported for each secure channel is documented and satisfies the use of strong cryptography.

In Place Not In Place 6-2.2.4.a Examine vendor documentation to verify the cryptography supported for each secure channel is identified, documented, and satisfies the use of strong cryptography.

<Assessor Response> 6-2.2.5 The establishment of each secure channel and how mutual authentication is guaranteed is documented.

In Place Not In Place 6-2.2.5.a Examine vendor documentation to verify the mutual authentication details of each secure channel are identified and documented.

<Assessor Response> 6-2.2.6 Secret or private cryptographic keys used to establish and maintain secure channels are unique per session, except by chance.

In Place Not In Place 6-2.2.6.a Examine vendor documentation to …
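The “unique per session, except by chance” property in 6-2.2.6 reduces to generating each session key fresh from a CSPRNG rather than deriving it from reused material. A minimal sketch, assuming Python's OS-backed CSPRNG stands in for whatever key-establishment mechanism the secure channel actually uses:

```python
import secrets

def new_session_key() -> bytes:
    """Fresh 256-bit key from the OS CSPRNG for every session; nothing is
    derived from or reused across prior sessions."""
    return secrets.token_bytes(32)

# Repeated keys can occur only by (negligible) chance.
keys = {new_session_key() for _ in range(1000)}
assert len(keys) == 1000
```

In practice the assessor verifies this property in the key-agreement design (e.g. ephemeral key exchange), not by sampling outputs; the sketch only makes the uniqueness claim concrete.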
Added p. 72
Select the overall Finding for this Security Objective → In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 7-1 results in determining the software does not implement random values associated with sensitive assets, Security Objective 7 and security requirement 7-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 7-1.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 7-x requirements can then be left blank.

Security Objective 7: Random Numbers Security Requirements and Test Requirements Assessor’s Findings and Observations 7-1 The sensitive assets associated with random numbers are identified and documented.

In Place N/A Not In Place 7-1.a Examine vendor documentation to verify whether the software uses random values and, if so, all correlations with the sensitive assets identified and verified in Security Objective 2.

<Assessor Response> 7-1.b Perform static analysis and verify the …
Added p. 73
In Place Not In Place 7-1.2.a Examine vendor documentation to verify the justification for the RNG implementation used and its appropriateness in relation to sensitive assets is documented and explained.

<Assessor Response> 7-1.3 If the software is designed with its own random-number generator implementation, then the following applies:

Implementation Notes By definition, this is a deterministic random-number generator (DRNG) and therefore requires a sufficient entropy source. For software being assessed to Module B, refer to B2-7.

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 7-1.3 results in determining the finding is ‘N/A’, then security requirement 7-1.3 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 7-1.3.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 7-1.3[.x] security and test requirements can then be left blank.

7-1.3.a Examine vendor documentation to verify if the software design …
Added p. 74
In Place Not In Place 7-1.3.3.a Examine vendor documentation to verify the seeding period. <Assessor Response> 7-1.3.4 The seed values are protected from disclosure and modification using strong cryptography.

In Place Not In Place 7-1.3.4.a Examine vendor documentation to verify the mechanisms implemented to protect the seed values from disclosure and modification.

<Assessor Response> 7-1.3.4.b Perform static analysis to verify the analysis and findings from 7-1.3.4.a, in the context of the software under assessment, regarding the protection of seed values from their initial existence within the software up to their use in the RNG and their subsequent secure deletion.
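To make the 7-1.3 requirements concrete: a software-implemented RNG is by definition a deterministic RNG (DRNG), so its security rests entirely on the seed. The toy sketch below (Python, my own construction, not part of the template and not a vetted design) shows the two properties an assessor would verify: the seed comes from a sufficient entropy source, and seeds below a minimum length are rejected. Real software should use a vetted DRBG such as those in NIST SP 800-90A.

```python
import hashlib
import os

class TinyHashDrbg:
    """Toy deterministic RNG seeded from the OS entropy pool.

    Illustrative only: it demonstrates seed-length enforcement and
    deterministic output, not a production-grade DRBG.
    """
    def __init__(self, seed: bytes):
        if len(seed) < 32:               # require >= 256 bits of seed material
            raise ValueError("insufficient seed entropy")
        self._state = hashlib.sha256(seed).digest()

    def random_bytes(self, n: int) -> bytes:
        out = b""
        while len(out) < n:
            # Ratchet the state forward, then derive output from it.
            self._state = hashlib.sha256(self._state + b"next").digest()
            out += hashlib.sha256(self._state + b"out").digest()
        return out[:n]

seed = os.urandom(32)        # sufficient entropy source, as 7-1.3 expects
drbg = TinyHashDrbg(seed)
sample = drbg.random_bytes(16)
```

Note that Python offers no reliable way to securely erase the seed from memory, which is exactly the kind of constraint 7-1.3.4.b asks the assessor to examine in the actual implementation language.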

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 8-1 results in determining the finding is ‘N/A’, then security requirement 8-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 8-1.a), provided the criteria for the use of ‘N/A’ …
Added p. 75
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective 8: Key Management Security Requirements and Test Requirements Assessor’s Findings and Observations 8-1 Cryptographic keys that are generated by the software use an entropy source as input that is at least equal to the intended effective strength of the key being generated.

Implementation Notes The software is not required to generate its own cryptographic keys; however, if it does, then the generation process must be sufficient, including the sourced entropy.

8-1.a Examine vendor documentation to verify if the software implements cryptographic key generation.

<Assessor Response> 8-1.b Examine vendor documentation to verify the source and expected entropy used in the key generation process.

<Assessor Response> 8-1.c Perform static analysis to verify the analysis and findings from 8-1.a/b.

<Assessor Response> 8-1.d Examine vendor documentation and necessary evidence/testing to verify the expected entropy used in the key generation process is at …
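A minimal sketch of the 8-1 criterion, assuming a table mapping key types to their intended effective strength (the table and function name are hypothetical): the entropy fed into key generation must be at least the key's intended effective strength, which `os.urandom` satisfies by drawing from the OS entropy pool.

```python
import os

# Hypothetical mapping of key types to intended effective strength (bits).
# 8-1 requires the input entropy to be at least this much.
EFFECTIVE_STRENGTH_BITS = {"AES-128": 128, "AES-256": 256, "HMAC-SHA-256": 256}

def generate_key(algorithm: str) -> bytes:
    bits = EFFECTIVE_STRENGTH_BITS[algorithm]
    # os.urandom draws from the OS CSPRNG, so each byte contributes a
    # full 8 bits toward the required entropy.
    return os.urandom(bits // 8)

aes_key = generate_key("AES-256")   # 32 bytes backed by >= 256 bits of entropy
```

The failure mode an assessor looks for is the opposite: a 256-bit key derived from a short passphrase or a timestamp, whose real entropy is far below the key's nominal strength.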
Added p. 76
In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-1.8[.x] results in determining the software does not utilize secret or private keys in association with sensitive assets, then security requirement 8-2 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 8-2.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 8-2.1 security and test requirements can then be left blank.

8-2.a Perform static analysis to verify the confidentiality of cryptographic keys is protected.

<Assessor Response> 8-2.1 Cleartext cryptographic keys are not stored in non-volatile memory. In Place Not In Place 8-2.1.a Perform static analysis to verify cleartext cryptographic keys are not stored in non-volatile memory.

<Assessor Response> 8-3 Cryptographic keys are only used for a single, predetermined purpose. In Place Not In Place 8-3.a Perform static analysis to verify cryptographic keys are only used for …
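One common way to satisfy the single-purpose rule in 8-3 is to derive purpose-bound subkeys from a master secret with distinct labels, so an encryption key and a MAC key can never coincide. The sketch below is a simplified, HKDF-like expansion using stdlib HMAC; the labels and function name are illustrative, not prescribed by the standard.

```python
import hashlib
import hmac

def derive_purpose_key(master_key: bytes, purpose: str) -> bytes:
    """Derive a purpose-specific subkey from one master secret.

    Binding each subkey to a distinct label ensures each derived key is
    used for a single, predetermined purpose.
    """
    return hmac.new(master_key, purpose.encode(), hashlib.sha256).digest()

master = b"\x01" * 32
enc_key = derive_purpose_key(master, "encrypt-stored-assets")
mac_key = derive_purpose_key(master, "authenticate-messages")
```

Static analysis under 8-3.a would then confirm that `enc_key` is never passed to MAC routines and vice versa.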
Added p. 77
<Assessor Response> 8-5 Public keys are protected for integrity and authenticity and are authenticated before they are used.

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement 2-1.8[.x] results in determining the software does not utilize public keys in association with sensitive assets, then security requirement 8-5 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in 8-5.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining 8-5 test requirements can then be left blank.

8-5.a Leverage the information from Security Objective 2 regarding cryptographic keys. Perform static analysis to verify public keys are protected for integrity and authenticity and are authenticated before they are used.

<Assessor Response> 8-5.b Perform dynamic analysis to verify the analysis and findings from 8-5.a. Attempt to use public keys in a manner that is not in accordance with requirement …
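One pattern an assessor may encounter for 8-5 is fingerprint pinning: a received public key is authenticated against a pinned digest before any use. The sketch below assumes a hypothetical scenario where the pinned fingerprint is distributed through a trusted channel (e.g., embedded at build time or delivered in a signed manifest); the key bytes here are placeholder data, not a real key encoding.

```python
import hashlib
import hmac

def fingerprint(public_key: bytes) -> str:
    return hashlib.sha256(public_key).hexdigest()

def authenticate_public_key(public_key: bytes, pinned_fingerprint: str) -> bool:
    """Check a received public key against a pinned fingerprint before use.

    compare_digest avoids timing side channels in the comparison.
    """
    return hmac.compare_digest(fingerprint(public_key), pinned_fingerprint)

genuine = b"example-public-key-bytes"      # placeholder, not a real key
pin = fingerprint(genuine)
```

Dynamic analysis under 8-5.b would then attempt to substitute a tampered key and confirm the software refuses it.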
Added p. 78
<Assessor Response> 8-7.b Perform static analysis to verify the analysis and findings from 8-7.a. <Assessor Response>
Added p. 79
Select the overall Finding for this Security Objective → In Place N/A Not In Place ROV Instruction: All 9-1.x test requirements are expected to be conducted. If the assessment of security requirement 9-1 results in determining the finding is ‘N/A’, then security requirement 9-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in each 9-1.x test requirement as it relates to the prescribed test activity), provided the criteria for the use of ‘N/A’ as described herein is satisfied. That is, even for an ‘N/A’ finding, each test requirement is expected to have been conducted and include an appropriate assessor response. If 9-1 is determined to be ‘N/A’, then Security Objective 9 can be marked as ‘N/A’ herein.

Security Objective 9: Cryptography Security Requirements and Test Requirements Assessor’s Findings and Observations 9-1 The use of cryptography related to sensitive assets that is not already accounted for …
Added p. 80
In Place Not In Place 10-1.a Examine vendor documentation to verify the software is designed to mitigate known threats and vulnerabilities that could pose risks to sensitive assets.

<Assessor Response> 10-1.b Examine vendor documentation to verify that processes exist to update the software as needed to account for newly-discovered relevant threats and vulnerabilities.

<Assessor Response> 10-1.1 Known security issues and vulnerabilities are accounted for in the following, at a minimum:

In Place Not In Place 10-1.1.a Leverage the information from 10-1 to assess 10-1.1.[1-3]. <Assessor Response> 10-1.1.1 The programming languages used to develop the software.

Implementation Notes The term “programming language” is being used generically. This includes all constructs that generally consist of defined syntax and associated semantics being used to “create/implement” the software.

In Place Not In Place 10-1.1.1.a Examine vendor documentation to verify the software is designed with an awareness of avoiding known security issues based on the underlying programming languages used.

<Assessor Response> …
Added p. 81
In Place Not In Place 10-1.1.2.a Examine vendor documentation to verify the software is designed with an awareness of avoiding known security issues in all third- party elements being used by, or within, the software. Leverage information from Security Objective 1 regarding the composition of the software, as well as noted dependencies, including the documented provenance information.

<Assessor Response> 10-1.1.2.b Perform static analysis to verify the software is designed in accordance with requirement 10-1.1.2 and the information from 10-1.1.2.a.

<Assessor Response> 10-1.1.3 Protocols Implementation Notes It is possible this might overlap with 10-1.1.1 and/or 10-1.1.2. However, the context of protocols and their potential risk warrants being accounted for explicitly.

In Place Not In Place 10-1.1.3.a Examine vendor documentation to verify the software is designed with an awareness of avoiding known security issues in all utilized protocols.

<Assessor Response> 10-1.1.3.b Perform static analysis, and research as needed, to determine if there are known security-relevant issues …
Added p. 82
<Assessor Response> 10-2.b Examine vendor documentation to verify the potential risk relative to the use of the unsupported dependencies is documented and accounted for.

<Assessor Response> 10-2.c Perform static analysis to verify the findings and analysis from 10-2.b are appropriate and seemingly effective in accounting for the potential risk of the use of the documented unsupported dependencies.
Added p. 83
In Place Not In Place 11-1.a Examine vendor documentation to verify the processes involving the release and delivery of the software are intended to facilitate maintaining the security and integrity of the software.

<Assessor Response> 11-2 The processes to provide security-relevant software updates facilitate prompt deployment to all affected customers.

In Place Not In Place 11-2.a Examine vendor documentation to verify the processes involving the release and delivery of security-relevant software updates facilitate prompt deployment to affected customers.

<Assessor Response> 11-3 The software vendor maintains and provides current secure implementation guidance to customers, which includes, at a minimum, instructions regarding performing the following procedures in a secure manner:

In Place Not In Place 11-3.a Examine vendor documentation, including the implementation guidance, to verify the requirements in 11-3.1 through 11-3.7.

<Assessor Response> 11-3.b Leveraging the information from 11-3.a, perform the functions in 11-3.1 through 11-3.7 by following the implementation guidance. Verify the accuracy of the implementation …
Added p. 84
In Place Not In Place 11-4.a Examine vendor documentation to verify the mechanisms that are implemented to verify the integrity of the software as part of the initial installation and for updates.

<Assessor Response> 11-4.b Perform dynamic analysis to verify the information from 11-4.a. Testing should include attempts to bypass or circumvent the integrity verification mechanisms.
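As an illustrative sketch of the integrity-verification mechanism 11-4 refers to (not the template's prescribed method), an installer can compare the SHA-256 digest of a received package against an expected value. A digest alone only proves integrity; authenticity additionally requires that the expected digest arrives over an authenticated channel, such as a signed manifest.

```python
import hashlib
import hmac

def verify_update(package: bytes, expected_sha256: str) -> bool:
    """Verify a software package against a published SHA-256 digest."""
    actual = hashlib.sha256(package).hexdigest()
    return hmac.compare_digest(actual, expected_sha256)

# Hypothetical package contents for demonstration.
pkg = b"firmware-update contents"
good_digest = hashlib.sha256(pkg).hexdigest()
```

The bypass attempts called for in 11-4.b amount to confirming that a package failing this check is never installed.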

<Assessor Response> 11-5 The software provides a mechanism to provide its software version information.

In Place Not In Place 11-5.a Leverage the information from 11-3.7 and verify that the software provides a mechanism to provide its software version information.

<Assessor Response> 11-6 The software is designed to force changing all default authentication values/credentials/accounts that facilitate access to sensitive assets upon software installation, initialization, or otherwise before business use.

In Place Not In Place 11-6.a Leverage the information from 11-3.[x] and perform static analysis to verify the software is designed to force changing all default authentication values/credentials/accounts that facilitate access …
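The enforcement 11-6 describes can be as simple as refusing business use while a factory-default credential remains in place. The sketch below is a hypothetical check (the default value and function name are invented for illustration); real software would integrate this into installation or first-login flows.

```python
import hashlib

# Hypothetical factory-default credential shipped with the software.
DEFAULT_HASH = hashlib.sha256(b"admin:admin").hexdigest()

def must_change_credentials(current_credential: bytes) -> bool:
    """Return True while the account still uses the factory default.

    The software should block business use and force a change whenever
    this returns True.
    """
    return hashlib.sha256(current_credential).hexdigest() == DEFAULT_HASH
```

Static analysis under 11-6.a would verify there is no code path that grants access to sensitive assets while this condition holds.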
Added p. 85
ROV Instruction: If Module A is being indicated as ‘N/A’ in tables 3.1 and 7.7, mark ‘N/A’ here with the appropriate assessor response justifying the ‘N/A’ finding, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remainder of this Module A section can then be left blank.

Module A is N/A <Assessor Response> Security Objective A1: Securing Account Data Sensitive Authentication Data (SAD) and Primary Account Numbers (PANs) are handled in accordance with PCI DSS requirements. Notes: The handling of PAN and SAD by the software is accounted for generically in relevant requirements regarding sensitive data in the “Core – All Software” requirements. Refer to the Sensitive Asset Identification document for additional assistance regarding PAN and SAD. The requirements in “Module A – Account-Data Protection” are intended to facilitate explicitly satisfying respective requirements regarding PAN and SAD in PCI DSS. Refer to the latest version of PCI DSS for complete information and expectations regarding …
Added p. 86
<Assessor Response> A1-2.c Based on evidence from A1-2.a/b, perform static and/or dynamic analysis as necessary to verify the software manages PANs in accordance with the latest version of PCI DSS.
Added p. 87
ROV Instruction: If Module B is being indicated as ‘N/A’ in tables 3.1 and 7.7, mark ‘N/A’ here with the appropriate assessor response justifying the ‘N/A’ finding, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remainder of this Module B section can then be left blank.

Module B is N/A <Assessor Response> Security Objective B1: PTS Approval The software is designed for the secure integration and operation on an approved PTS POI device. Notes: Refer to the PCI SSC’s List of Approved PTS Devices at: https://listings.pcisecuritystandards.org/assessors_and_solutions/pin_transaction_devices Select the overall Finding for this Security Objective → In Place Not In Place Security Objective B1: PTS Approval Security Requirements and Test Requirements Assessor’s Findings and Observations B1-1 The software is intended for deployment and operation on devices that have been evaluated and approved per the PCI PTS POI Standard and its Program.

In Place Not In Place B1-1.a …
Added p. 88
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective B2: Approved POI Device Functionality Security Requirements and Test Requirements Assessor’s Findings and Observations If the POI device's HW/FW is not approved to SRED:

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement B2-1 results in determining that all supported POI device HW and FW is approved to SRED, then security requirement B2-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in B2-1.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining B2-1 test requirements can then be left blank.

B2-1.a Examine the software vendor’s documentation to verify all mechanisms used to protect account data, both from the underlying POI device platform and within the software itself. Leverage all information/testing from relevant Core – All Software requirements.

<Assessor Response> B2-1.b …
Added p. 89
Testing Notes Testing needs to determine the functionality of the underlying POI HW/FW that is being leveraged, in addition to all relevant functionality implemented in the software being assessed. The analysis and testing required will depend on the unique instance of the implementation. The intent is not to reassess the POI device’s firmware. All firmware functionality being called on or otherwise leveraged by the software can be noted and confirmed. However, all relevant functionality being implemented in the software directly needs to be properly assessed.

<Assessor Response> If the POI device's HW/FW is approved to SRED:

B2-2 The software does not bypass or circumvent the PTS-approved SRED- related functions of the POI device.

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement B2-2 results in determining that none of the supported POI device HW and FW is approved to SRED, then security requirement B2-2 can be marked as …
Added p. 90
Testing Notes If all the HW/FW in a unique PTS approval is approved to SRED, only one HW/FW combination for that PTS POI device from the PTS approval needs to be tested in the following requirements.

If some of the HW/FW in a unique PTS approval is approved to SRED and some of the HW/FW is not, only one HW/FW combination for that PTS POI device from the PTS approval needs to be tested in the following requirements.

<Assessor Response> B2-2.1 If the software requires account data related encryption functionality that the POI device’s firmware and SRED-related functionality do not provide, then:

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement B2-2.1 results in a finding of ‘N/A’, then security requirement B2-2.1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in B2-2.1.a), provided the criteria for the use of ‘N/A’ as …
Added p. 91
In Place Not In Place B2-3.a Examine the software vendor’s documentation to verify the software does not contain any functionality to share cleartext account data with any other software other than the PTS POI device’s firmware.
Added p. 91
<Assessor Response> B2-4 The software does not output cleartext account data outside of the POI device, which also includes:

In Place Not In Place B2-4.a Examine the software vendor’s documentation to verify the software does not contain any functionality that is capable of outputting cleartext account data from the PTS POI device.

<Assessor Response> B2-4.b Perform static analysis to verify B2-4.a. <Assessor Response> B2-4.c Perform dynamic analysis to verify B2-4.a/b. <Assessor Response> B2-4.1 The software does not facilitate the visual presentation of cleartext account data.

In Place Not In Place B2-4.1.a Examine the software vendor’s documentation to verify the software does not contain any functionality that is capable of facilitating the visual presentation of cleartext account data.

<Assessor Response> B2-4.1.b Perform static analysis to verify B2-4.1.a. <Assessor Response> B2-4.1.c Perform dynamic analysis to verify B2-4.1.a/b. <Assessor Response> B2-4.2 The software does not facilitate the audible presentation of cleartext account data.

In Place Not In Place …
Added p. 92
In Place Not In Place B2-5.a Examine the software vendor’s documentation to verify the software does not implement its own functionality that is considered an “Open Protocol” as defined in the PCI PTS POI Standard. It is acceptable for the software to use the approved Open Protocols of the PTS POI device’s firmware.

In Place Not In Place B2-6.a Examine the software vendor’s documentation to verify the software facilitates the management or use of shared platform resources in a secure manner and in accordance with applicable POI-device guidance.

<Assessor Response> B2-7 The software does not implement its own random-number generator (RNG) and, if needed, only uses the available RNG functions of the POI device's firmware.

In Place Not In Place B2-7.a Examine the software vendor’s documentation to verify the software does not implement its own random-number generator (RNG) and, if needed, only uses the available RNG functions of the POI device's firmware.
Added p. 94
B3-1 The software vendor includes additional details in the secure implementation guidance for the process to facilitate the authentication of the files/content by the POI device’s firmware.

In Place N/A Not In Place ROV Instruction: If the assessment of security requirement B3-1 results in a finding of ‘N/A’, then security requirement B3-1 can be marked as ‘N/A’ (with the appropriate assessor response justifying the ‘N/A’ finding documented in B3-1.a), provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remaining B3-1 test requirements can then be left blank.

B3-1.a Examine the software vendor’s documentation to verify if the software is capable of facilitating loading additional files/content into the POI device.

<Assessor Response> B3-1.b Perform static analysis to verify B3-1.a. <Assessor Response> B3-1.c Perform dynamic analysis to verify B3-1.a/b. <Assessor Response> B3-1.d Examine the implementation documentation from Security Objective 11 and verify it includes secure implementation guidance for the processes …
Added p. 95
ROV Instruction: If Module C is being indicated as ‘N/A’ in tables 3.1 and 7.7, mark ‘N/A’ here with the appropriate assessor response justifying the ‘N/A’ finding, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remainder of this Module C section can then be left blank.

Module C is N/A <Assessor Response> Security Objective C1: HTTP Headers The software securely implements HTTP Headers. Notes: There is significant potential risk related to HTTP headers. However, there are also dedicated informative resources available online to assist in utilizing and implementing HTTP headers in a [more] secure manner. As this information is subject to change based on known issues/threats, proactive research is prudent.

Select the overall Finding for this Security Objective → In Place Not In Place Security Objective C1: HTTP Headers Security Requirements and Test Requirements Assessor’s Findings and Observations C1-1 HTTP security-related headers are used and configured …
Added p. 96
In Place Not In Place C1-2.a Perform research as needed to determine the current HTTP headers that have known security-impacting concerns. Leverage this information in the subsequent test requirements.

<Assessor Response> C1-2.b Examine vendor documentation to verify that HTTP headers that are known to be vulnerable or otherwise could negatively impact the security of the software are avoided. Where that isn’t being accommodated, assess C1-2.1.

<Assessor Response> C1-2.1 Where the use of a header or its configuration satisfying this condition cannot be avoided, vendor documentation explains the constraint.

In Place Not In Place C1-2.1.a Leverage information from C1-2 to verify that, where the use of HTTP headers that are known to be vulnerable or could otherwise negatively impact the security of the software cannot be avoided, vendor documentation explains the constraint.
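To ground Security Objective C1: a snapshot of commonly recommended security headers can be applied centrally to every response. The header set below is a reasonable starting point as of this writing, but recommendations change over time, so treat the values as illustrative rather than a policy the standard prescribes; the function name is hypothetical.

```python
# Commonly recommended security headers (illustrative snapshot; consult
# current guidance such as OWASP resources before adopting values).
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "Content-Security-Policy": "default-src 'self'",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "no-referrer",
}

def apply_security_headers(response_headers: dict) -> dict:
    """Merge the security baseline into a response's headers, without
    overriding values the application set deliberately."""
    merged = dict(SECURITY_HEADERS)
    merged.update(response_headers)
    return merged

out = apply_security_headers({"Content-Type": "text/html"})
```

Letting the application's own headers win the merge is a design choice: it keeps deliberate per-route overrides (verified under C1-2.1) visible in code rather than silently clobbered.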
Added p. 97
Select the overall Finding for this Security Objective → In Place Not In Place Security Objective C2: Input Protection Mechanisms Security Requirements and Test Requirements Assessor’s Findings and Observations C2-1 The software is designed to facilitate mitigating injection attacks. In Place Not In Place C2-1.a Perform research as needed to determine the current recommendations/options regarding mitigating injection attacks as it pertains to the unique software implementation being assessed (e.g., the programming languages used, the API interface for inputs, etc.). Leverage this information in the subsequent test requirements.

<Assessor Response> C2-1.b Examine vendor documentation to verify that the software is designed to facilitate mitigating injection attacks.

<Assessor Response> C2-1.c Perform static analysis to verify the information and analysis from C2-1.b.

<Assessor Response> C2-2.c Perform static analysis to verify the information and analysis from C2-2.b.

<Assessor Response> C2-1.d Perform dynamic analysis to verify the information and analysis from C2-1.b/c. Testing should include …
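The canonical injection mitigation C2-1 points at is parameter binding: untrusted input is passed as data, never spliced into the statement text. A minimal stdlib demonstration with `sqlite3` (the table and payload are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT)")

def add_user(name: str) -> None:
    # The "?" placeholder binds the input as data; it can never become SQL.
    conn.execute("INSERT INTO users (name) VALUES (?)", (name,))

add_user("alice")
add_user("x'); DROP TABLE users; --")   # classic payload, stored inertly

names = {row[0] for row in conn.execute("SELECT name FROM users")}
```

The dynamic testing described in C2-1.d mirrors this: feed injection payloads and confirm they are stored or rejected as data, with the schema intact.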
Added p. 98
<Assessor Response> C2-3 The software is designed to securely implement and configure the use of parser and interpreter functionality.

In Place Not In Place C2-3.a Perform research as needed to determine the current recommendations/options regarding secure parser/interpreter implementations and configurations that apply to the unique implementation of the software being assessed. Leverage this information in the subsequent test requirements.

<Assessor Response> C2-3.b Examine vendor documentation to verify that the software is designed to securely implement and configure the use of parser and interpreter functionality.

<Assessor Response> C2-3.c Perform static analysis to verify the information and analysis from C2-3.b.

<Assessor Response> C2-3.d Perform dynamic analysis to verify the information and analysis from C2-3.b/c. Testing should include attempts to exploit the parser/interpreter functionality.
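An example of the parser/interpreter hardening C2-3 targets, in Python terms: parsing untrusted literal payloads with `ast.literal_eval` instead of `eval`. `literal_eval` accepts only literal structures (strings, numbers, tuples, lists, dicts, sets, booleans, `None`) and raises `ValueError` on anything executable. The wrapper name is hypothetical.

```python
import ast

def parse_untrusted_literal(text: str):
    """Safely parse a Python-literal payload from an untrusted source.

    Unlike eval(), ast.literal_eval rejects arbitrary expressions such
    as function calls instead of executing them.
    """
    return ast.literal_eval(text)

config = parse_untrusted_literal("{'retries': 3, 'verbose': True}")
```

The exploit attempts in C2-3.d correspond to confirming that payloads like `__import__('os').system(...)` are rejected rather than interpreted.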

<Assessor Response> C2-4 The software is designed to facilitate mitigating anomalous behavior from content being uploaded.

In Place Not In Place C2-4.a Examine vendor documentation to verify that the software is designed to facilitate mitigating …
Added p. 99
<Assessor Response> C2-5.c Perform dynamic analysis to verify the information and analysis from C2-5.a/b. Testing should include attempts to exploit the implemented resource starvation mitigation mechanisms.
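A simple instance of the resource-starvation mitigation C2-5 exercises: bounding how much of an untrusted stream the software will buffer. The cap value and function name below are hypothetical.

```python
import io

MAX_UPLOAD_BYTES = 1024 * 1024   # hypothetical 1 MiB cap

def read_bounded(stream, limit: int = MAX_UPLOAD_BYTES) -> bytes:
    """Read at most `limit` bytes; reject oversized streams instead of
    buffering them, mitigating memory-exhaustion attempts."""
    data = stream.read(limit + 1)
    if len(data) > limit:
        raise ValueError("upload exceeds size limit")
    return data

ok = read_bounded(io.BytesIO(b"a" * 100), limit=1024)
```

Reading `limit + 1` bytes is the key trick: it detects overflow without ever consuming the attacker-controlled remainder of the stream.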
Added p. 100
In Place Not In Place C3-1.a Perform research as needed to determine mechanisms and strategies regarding session management (including all contexts from C3-1.[x]) that apply to the unique implementation of the software being assessed. Leverage this information in the subsequent test requirements.

<Assessor Response> C3-1.b Examine vendor documentation to verify that the software is designed to securely implement and manage sessions.

<Assessor Response> C3-1.1 Session identifier token exchange mechanisms In Place Not In Place C3-1.1.a Examine vendor documentation to verify that the software is designed to securely manage and implement session identifier token exchange mechanisms.

<Assessor Response> C3-1.1.b Perform static analysis to verify the information and analysis from C3-1.1.a.

<Assessor Response> C3-1.1.c Perform dynamic analysis to verify the information and analysis from C3-1.1.a/b.

<Assessor Response> C3-1.2 Session-identifier attributes In Place Not In Place C3-1.2.a Examine vendor documentation to verify that the software is designed to securely manage and implement session-identifier attributes.
Added p. 100
<Assessor Response> C3-1.3 Session timeouts In Place Not In Place C3-1.3.a Examine vendor documentation to verify that the software is designed to securely manage and implement session timeouts.
Added p. 101
<Assessor Response> C3-1.4 Session termination and re-instantiation In Place Not In Place C3-1.4.a Examine vendor documentation to verify that the software is designed to securely manage and implement session termination and re- instantiation.

<Assessor Response> C3-1.5 Concurrent sessions In Place Not In Place C3-1.5.a Examine vendor documentation to verify that the software is designed to securely manage and implement concurrent sessions.

<Assessor Response> C3-1.6 Use of secure session-management implementations In Place Not In Place C3-1.6.a Examine vendor documentation to verify that the software is designed to utilize known-good session-management implementations.
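Several of the C3-1.x contexts can be illustrated in one minimal in-memory session store: unpredictable identifiers (C3-1.1), timeouts (C3-1.3), and termination on expiry (C3-1.4). This is an assumption-laden sketch for illustration only; the TTL, class, and method names are invented, and C3-1.6 would in practice favor a known-good session-management implementation over custom code like this.

```python
import secrets
import time

SESSION_TTL_SECONDS = 900        # hypothetical 15-minute idle timeout

class SessionStore:
    """Minimal in-memory session store with unpredictable identifiers
    and absolute timeouts."""
    def __init__(self, ttl: float = SESSION_TTL_SECONDS):
        self._ttl = ttl
        self._sessions = {}      # token -> expiry timestamp

    def create(self) -> str:
        token = secrets.token_urlsafe(32)   # ~256 bits, infeasible to guess
        self._sessions[token] = time.monotonic() + self._ttl
        return token

    def is_valid(self, token: str) -> bool:
        expiry = self._sessions.get(token)
        if expiry is None or time.monotonic() > expiry:
            self._sessions.pop(token, None)  # terminate expired sessions
            return False
        return True

store = SessionStore(ttl=0.05)   # short TTL so expiry is observable
tok = store.create()
```

Dynamic testing against the real software would mirror these checks: forged tokens are rejected, and expired sessions cannot be reused.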
Added p. 102
<Assessor Response> C4-1.b Perform static analysis to verify the information and analysis from C4-1.a.

<Assessor Response> C4-1.c Perform dynamic analysis to verify the information and analysis from C4-1.a/b. Testing should include ensuring non-authorized users cannot be authenticated or otherwise provided access. Leverage information and testing from 5-4.
Added p. 103
ROV Instruction: If Module D is being indicated as ‘N/A’ in tables 3.1 and 7.7, mark ‘N/A’ here with the appropriate assessor response justifying the ‘N/A’ finding, provided the criteria for the use of ‘N/A’ as described herein is satisfied. The remainder of this Module D section can then be left blank.

Module D is N/A <Assessor Response> Security Objective D1: SDK Integrity The SDK is designed and delivered in a manner that facilitates its integrity. Notes: [No Notes] Select the overall Finding for this Security Objective → In Place Not In Place Security Objective D1: SDK Integrity Security Requirements and Test Requirements Assessor’s Findings and Observations D1-1 The SDK is designed to mitigate the tampering of its execution and the compromise of its sensitive assets.

In Place Not In Place D1-1.a Examine the software vendor’s documentation to verify the SDK is designed to mitigate the tampering of its execution and the …
Removed p. 5
• Secure Software Requirements and Assessment Procedures (Secure Software Standard) Version 1.2.

It is the mandatory template for Secure Software Assessors completing a Secure Software Assessment.

A Secure Software Assessment involves thorough testing and assessment activities from which the assessor generates detailed workpapers for each control objective and its associated test requirements. These workpapers contain records of the assessment activities, including observations, configurations, process information, interview notes, documentation excerpts, references, screenshots, and other evidence collected during the assessment. The Secure Software Report on Validation (ROV) acts as a comprehensive summary of the testing activities performed and the information that is collected during the Secure Software Assessment. The information contained in a Secure Software ROV must provide enough detail and coverage to support the assessor’s opinion that the validated software has met all control objectives within the PCI Secure Software Standard.

Using this Document The PCI Secure Software Report on Validation Template provides reporting …
Modified p. 5 → 6
Secure Software Template for Report on Validation (Secure Software ROV Template) is for use with the PCI Software Security Framework
• Report on Validation Template (ROV Template) is provided by PCI SSC to support assessments of Software Products in accordance with the PCI Software Security Framework
Removed p. 6
• Secure Software Requirements and Assessment Procedures (Secure Software Standard).

• Secure Software Program Guide (Secure Software Program Guide)

• Secure Software Attestation of Validation (Secure Software AOV)

• Glossary of Terms, Abbreviations, and Acronyms (SSF Glossary)

• Qualification Requirements for SSF Assessors (SSF Assessor Qualification Requirements) Documenting the Assessment Findings and Observations The results of the Secure Software Assessment are documented within the Detailed Findings and Observations section of the Secure Software ROV Template. An example layout of the Detailed Findings and Observations section is provided in Table 1.
Removed p. 7
Table 1. Detailed Findings and Observations Control Objectives / Test Requirements Reporting Instructions Assessor’s Findings Control Objective 1: Control Objective Title Parent Control Objective Summary In Place Not in Place N/A 1.1 Child Control Objective Summary In Place Not in Place N/A 1.1.a Test Requirement R1 Reporting Instruction R2 Reporting Instruction For the Summary of Assessment Findings, there are three possible results – In Place, Not in Place, and N/A (Not Applicable). Only one selection is to be made for each control objective. Table 2 provides a helpful representation when considering which selection to make. Reporting details and results should be consistent throughout the ROV, as well as consistent with other related reporting materials, such as the Attestation of Validation (AOV).

Table 2. Selecting the Appropriate Validation Result Response When to use this response:

In Place The expected testing has been performed and all elements of the control objective have been met. Detailed testing …
Removed p. 8
To provide consistency in how Secure Software Assessors document their findings, the reporting instructions use standardized terms. Those terms and the context in which they should be interpreted are provided in Table 3.

Table 3. Reporting Instruction Terms and Response Formats Reporting Instruction Example Usage Description of Response Describe Describe each of the software tests performed to identify the transaction types and card data elements supported by the software.

The response would include a detailed description of the item or activity in question – for example, details of how evidence examined and/or individuals interviewed demonstrate a control objective was met, or how the assessor concluded an implemented security control is fit-for-purpose. The response should be of sufficient detail to provide the reader with a comprehensive understanding of the item or activity being described.

Identify Identify the evidence obtained that details all configuration options provided by the software.

The response would be a brief overview or …
Removed p. 9
• Complete all sections in the order specified, with concise detail.
Modified p. 9
• Read and understand the intent of each control objective and test requirement.
• Read and understand the intent of each security requirement and test requirement.
Modified p. 9
• Provide a response for every reporting instruction.
• Provide a response for every reporting instruction, unless explicitly instructed otherwise.
Modified p. 9
• Describe how a control objective was verified as the reporting instruction directs, not just that it was verified.
• Describe how a security requirement was verified as the reporting instruction directs, not just that it was verified.
Modified p. 9
• Provide full dates where dates are required, using the “dd-mm-yyyy” format consistently throughout the document.
• Provide full dates where dates are required.
Modified p. 9
• Do not simply repeat or echo the test requirements in the response.
• Do not repeat or echo the test requirements in the response.
Removed p. 10
Using the Appendices

The Secure Software ROV Reporting Template includes two appendices:

• Appendix A, Additional Information Worksheet

• Appendix B, Testing Environment Configuration for Secure Software Assessments

Appendix A is optional and may be used to add extra information to support the assessment findings if the information is too large to fit in the Assessor’s Findings column within the Detailed Findings and Observations section. Examples of information that may be added in Appendix A include diagrams, flowcharts, or tables that support the Secure Software Assessor’s findings. Any information recorded in Appendix A should reference back to the applicable Secure Software Standard control objectives and test requirements.

Appendix B is mandatory and must be used to confirm that the environment used by the assessor to conduct the Secure Software Assessment was configured in accordance with Section 4.6.1 of the Secure Software Program Guide. This confirmation must be submitted to PCI SSC along with …
Removed p. 12
DBA (doing business as):

Company main website:

Contact phone number:
Modified p. 12 → 11
Contact email address:
Vendor Contact Email: Vendor Company Mailing Address:
Removed p. 13
Lead Assessor phone number:

Lead Assessor email address:

Lead Assessor PCI credentials:
Removed p. 13
QA reviewer phone number:

QA reviewer email address:

QA reviewer PCI credentials:
Removed p. 13
Assessor Name:
Assessor PCI Credentials:
Assessor Role or Function During the Assessment:

Section 2.2 of the Qualification Requirements for SSF Assessors specifies the independence requirements that the Assessor Company must adhere to at all times when conducting Secure Software Assessments. Assessors are encouraged to review the independence requirements prior to completing the following table:
Removed p. 14
Confirmation of Consultation Services Provided:
Removed p. 14
Product or Service Name:
Product Description:
Modified p. 14 → 24
Description of Services Provided:
Description & Purpose:
Removed p. 16
Software name tested:
Software version(s) tested:

Already listed on PCI SSC website?  Yes / No
PCI identifier (if applicable):
Removed p. 16
(01) POS Suite/General
(02) Payment Middleware
(03) Payment Gateway/Switch
(04) Payment Back Office
(05) POS Admin
(06) POS Specialized
(07) POS Kiosk
(08) POS Face-to-Face/POI
(09) Shopping Cart / Store Front
(10) Card-Not-Present
(11) Automated Fuel Dispenser
(12) Payment Component

Describe the general software function and purpose (for example, the types of transactions performed, the payment acceptance channels supported, etc.).
Modified p. 17 → 21
Describe a typical implementation of the software (for example, how it is configured in the execution environment, how it typically interacts with other components or services, where those components or services reside, and who is responsible for maintaining them).
Describe a typical implementation of the software (for example, how it is configured in the execution environment or how it typically interacts with other systems or components).
Modified p. 17 → 21
<Insert payment software architecture diagram(s) here>
<Insert Architectural Diagrams Here>
Removed p. 19
Required dependencies are those that, if unavailable, would render the assessed payment software inoperable or useless. Such dependencies typically involve hardware, software, or services that must be purchased, licensed, and/or maintained separately by an implementing entity (such as a merchant or other type of entity). These types of dependencies do not include hardware, software, or services that are packaged and distributed with the assessed software.
Removed p. 19
• Provider / Supplier: The manufacturer and/or supplier of the device. Also referred to as “device vendor.”

• Make / Model #: The device name and/or model number.

• Version(s) Supported: The version(s) of the device that is/are supported by the assessed payment software.

• Version(s) Tested: The version(s) of the device that was/were used during the software assessment.

• PTS Approval #: The PTS approval number for the associated PCI-approved PTS device(s), where applicable. This information need only be specified if the required device(s) has been approved by PCI SSC under the PCI PIN Transaction Security (PTS) Point-of-Interaction (POI) device validation program.

Note 1: POI device approval listings that appear similar or identical on the PCI SSC List of Approved PTS Devices may be associated with different versions of the PTS POI Standard. Be sure the correct listing is referenced and used during the assessment. The most recent device approvals should be referenced.

Note 2: …
Removed p. 20
• Provider / Supplier: The manufacturer and/or supplier of the software.

• Software Description: A brief description of the software type and/or intended function (e.g., database, web server, etc.)

• Version(s) Supported: The version(s) of the software supported by the assessed payment software.

• Version(s) Tested: The version(s) of the software that was/were used during the software assessment.

Provider / Supplier | Software Name | Software Description | Version(s) Supported | Version(s) Tested
Modified p. 20 → 36
Software Name: The name of the software.
All Software” section in this standard.
Removed p. 21
• Sensitive Data Type: The type of data deemed sensitive. Examples include Account Data, authentication credentials, cryptographic keys, etc.

• Sensitive Data Elements: The names of the individual data elements in relation to the Sensitive Data Type. Examples include PAN/SAD, username/password, etc.

• Protection Requirements: Indicates whether the data requires confidentiality protection, integrity protection, or both.

• Storage Locations: The locations where sensitive data is stored persistently. Examples include file [name], table [name], etc.

Sensitive Data Type | Sensitive Data Elements | Protection Requirements | Storage Location(s)

2.4.2 Sensitive Data Flows

Provide high-level data flow diagrams that show the details of all sensitive data flows, including:

• All flows and locations of encrypted sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment).

• All flows and locations of clear-text sensitive data (including all sensitive data inputs/outputs both within and outside the execution environment).

For each data flow, identify the following:

• How and where sensitive data …
Modified p. 21 → 27
Note: Specify all types of sensitive data flows, including any output to hardcopy, paper, or other external media. Sensitive data flows must also denote locations where sensitive data crosses trust boundaries and where it is passed to other applications or services that were not included in the assessment.
• All components involved in the storage, processing, and/or transmission of the sensitive assets above. Specify all types of sensitive asset flows, including any output to hardcopy, paper, or other external media. Sensitive asset flows must also indicate locations where sensitive assets cross trust boundaries and where they are passed to other applications or services that were not included in the assessment. Ensure diagrams are clearly visible (not blurry) and comprehensible.
Modified p. 21 → 44
All sensitive functions and resources associated with the sensitive data flow.
<Assessor Response> 2-3.5 All sensitive resources associated with the sensitive functionality are documented.
Modified p. 22 → 27
<Insert data flow diagram(s) here>
<Insert Flow Diagrams Here>
Removed p. 23
Assessment start date:
Removed p. 24
Assessed Modules | Justification (if excluded from the software assessment)
Core Requirements | N/A
Module A (Account Data Protection Requirements) |
Module B (Terminal Software Requirements) |
Module C (Web Software Requirements) |

3.2.2 Requirements Deemed Not Applicable

Identify any control objectives and test requirements that were determined to be “Not Applicable” to the assessed software or the assessed software vendor. List applicable control objectives and test requirements in the order they appear in Section 4, “Detailed Findings and Observations” (adding additional rows as needed).

Important Note: A “Not Applicable” finding is only acceptable where the control objective has been verified to be not applicable to the assessed software through an appropriate degree of testing. All “Not Applicable” responses MUST be tested, and details MUST be provided to describe how it was determined that a control objective does not apply to the assessed software.

Control Objective or Test Requirement #:

Describe how it was determined that the requirement …
Removed p. 25
• Reference #: A reference number used to uniquely identify the documentation or evidence obtained during the assessment. Generic values, such as “Doc-1,” “Doc-2,” and so on, may be used in lieu of formal reference numbers.

• Document Name: The title given to the documentation or evidence obtained. Document Names may be formal or informal and should include any relevant version information (if applicable).

• Document Description / Purpose: A brief description of the contents and/or purpose of the documentation or evidence obtained.

• Date Created: The date the documentation or evidence was last generated or updated.

• Date Reviewed: The date the documentation or evidence was last retrieved. In certain cases, the Date Reviewed may be the same as the Date Created (for example, where the documentation or evidence is generated by the Assessor during testing).

• Source: The entity who created and/or generated the documentation or evidence. Documentation and evidence are typically the …
Removed p. 26
• Reference #: A reference number used to uniquely identify each distinct interview. Generic values such as “Int-1,” “Int-2,” and so on may be used in lieu of formal reference numbers.

• Interviewee(s): The name of the individual(s) who participated in the interviews.

• Job Title: The job title or job function of the interviewee(s).

• Organization: The organization(s) represented by the interviewee(s).

• Topics Covered: A high-level summary of the topics covered during each interview.

• Interview Notes Ref#: The notes and/or audio files generated by the Assessor to record the interview results. Values in this column should include references to the appropriate documentation or evidence recorded in Section 3.3.1, “Documentation and Evidence Obtained.”

Reference # | Interviewee(s) | Job Title | Organization | Topics Covered | Interview Notes Ref#
Removed p. 27
Tests may be grouped together if performed as part of a common test goal or objective. However, the details provided in each row should be sufficient to differentiate tests where variations are necessary to validate different Secure Software Requirements.

The required attributes that must be specified for each test are described below:

• Reference #: A reference number used to uniquely identify each distinct test. Generic values such as “Test-1,” “Test-2,” and so on may be used in lieu of formal reference numbers.

• Test Description: A brief description of the types of testing performed (static source code analysis, dynamic analysis), the tools or methods used, etc.

• Test Scope: The specific features, functions, or components assessed. Examples could include module names (user authentication module, payment module, etc.), webpages (transaction summary page, payment page), etc.

• Test Objective / Purpose: The primary purpose of the test(s). Example objectives include “determining how the payment software responds …
Removed p. 28
Where sampling is used, samples must be representative of the total population of possible items. The sample size must be sufficiently large and diverse to provide assurance that the selected sample accurately reflects the overall population.

The required attributes that must be specified for each sample set are described below:

• Reference #: A reference number used to uniquely identify each sample set. Generic values such as “Set-1,” “Set-2,” and so on may be used in lieu of formal reference numbers.

• Sample Description: A brief description of the items sampled. For example, “a sample of software updates” or “a sample of user IDs.”

• Total Sampled: The number of items included in the sample set. This could also be expressed in other relevant terms, such as lines of code (if applicable).

• Total Population: The total number of possible items available for testing.

• Sample Justification: The Assessor’s justification for why the Total Sampled is …
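Where a quantitative starting point helps document the Sample Justification, a simple random draw can be sketched as below. The sizing heuristic, function name, and parameters are illustrative assumptions; the Standard leaves sample-size justification to the Assessor.

```python
import math
import random

def draw_sample(population, fraction=0.10, floor=5, seed=None):
    """Draw a simple random sample sized as the larger of a fixed floor
    and a fraction of the population, capped at the population size.

    Illustrative heuristic only; it does not replace the Assessor's
    documented justification that the sample is representative."""
    size = min(len(population), max(floor, math.ceil(len(population) * fraction)))
    return random.Random(seed).sample(population, size)
```

For example, a population of 100 software updates yields a sample of 10, while a population of only 3 user IDs is sampled in full.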
Removed p. 29
Validated: All applicable control objectives are marked “In Place”; Secure Software Name(s) and Version(s) has therefore achieved full validation against the PCI Secure Software Standard.

The ROV was completed according to the PCI Secure Software Standard Version 1.2, in adherence with the instructions therein.

All information within this ROV represents this Secure Software Assessment in all material aspects.
Removed p. 30
This assessment was conducted in a manner intended to preserve at all times the professional judgment, integrity, impartiality, and professional skepticism of the SSF Assessor Company.

This Report on Validation accurately identifies, describes, represents, and characterizes all factual evidence that the SSF Assessor Company and its Assessor Employees gathered, generated, discovered, reviewed and/or determined in their sole discretion to be relevant to this assessment while performing the assessment.

The judgments, conclusions, and findings contained in this Report on Validation (a) accurately reflect and are based solely upon the factual evidence described immediately above, (b) reflect the independent judgments, findings and conclusions of the SSF Assessor Company and its Assessor Employees only, acting in their sole discretion, and (c) were not in any manner influenced, directed, controlled, modified, provided or subjected to any prior approval by the assessed Vendor, any contractor, representative, professional advisor, agent or affiliate thereof, or any other person or …
Modified p. 31 → 12
Lead Assessor Name: SSF Assessor Company Name:
Secure Software Assessor Name: Secure Software Assessor Email:
Removed p. 32
In Place / Not in Place / N/A

1.1 All sensitive data stored, processed, or transmitted by the software is identified.
In Place / Not in Place / N/A

1.1.a The assessor shall examine evidence to confirm that information is maintained that details all sensitive data that is stored, processed, and/or transmitted by the software. At a minimum, this shall include all payment data; authentication credentials; cryptographic keys and related data (such as IVs and seed data for random number generators); and system configuration data (such as registry entries, platform environment variables, prompts for plaintext data in software allowing for the entry of PIN data, or configuration scripts).

R1 Identify the evidence obtained that details the sensitive data that is stored, processed, and transmitted by the assessed software.

1.1.b The assessor shall examine evidence to confirm that information is maintained that describes where sensitive data is stored. This includes the storage of sensitive data in temporary …
Modified p. 32 → 13
R2 Describe any other assessment activities performed and/or findings for this test requirement.
Describe the assessment-related activity performed by the subcontractors.
Removed p. 33
R1 Identify the evidence obtained that details the security controls that are implemented to protect sensitive data during storage, processing, and transmission.

1.1.d The assessor shall test the software to validate the evidence obtained in Test Requirements 1.1.a through 1.1.c.

R1 Describe each of the tests performed, including the tool(s) and/or method(s) used, to confirm that the evidence obtained in Test Requirement 1.1.a accurately reflects the sensitive data stored, processed, and transmitted by the assessed software.

R2 Describe each of the software tests performed, including the tool(s) and/or method(s) used and the scope of each test, to confirm that the evidence obtained in Test Requirement 1.1.b accurately reflects the locations where sensitive data is stored.

R3 Describe each of the software tests performed, including the tool(s) and/or method(s) used and the scope of each test, to confirm that the evidence obtained in Test Requirement 1.1.c accurately reflects the software security controls implemented to protect …
Removed p. 33
1.1.e The assessor shall examine evidence and test the software to identify the transaction types and/or card data elements that are supported by the software, and to confirm that the data for all of these is supported by the evidence examined in Test Requirements 1.1.a through 1.1.c.

R1 Identify the evidence obtained that details the transaction types supported, and the associated card data elements stored, processed, and transmitted by the assessed software.
Removed p. 34
R1 Identify the evidence obtained that details the cryptographic implementations supported by the assessed software and whether they are implemented by the software itself, through third- party software, or as functions of the execution environment.

1.1.g The assessor shall examine evidence and test the software to identify the accounts and authentication credentials supported by the software (including both default and user-created accounts) and to confirm that these accounts and credentials are supported by the evidence examined in Test Requirements 1.1.a through 1.1.c.

R1 Identify the evidence obtained that details the types of authentication methods and credentials supported by the assessed software.

1.1.h The assessor shall examine evidence and test the software to identify the configuration options provided by the software that can impact sensitive data (including those provided through separate files or scripts, internal functions, or menus and options), and to confirm that these are supported by the evidence examined in Test Requirements …
Removed p. 35
R1 Identify the evidence obtained that details the sensitive functions and sensitive resources provided or relied upon by the assessed software.

1.2.b The assessor shall examine evidence to confirm that information is maintained that clearly describes how and where the sensitive data associated with these functions and resources is stored. This includes the storage of sensitive data in temporary storage (such as volatile memory), semi-permanent storage (such as RAM disks), and non-volatile storage (such as magnetic and flash storage media). The assessor shall confirm that this information is supported by the evidence examined in Test Requirement 1.1.a through 1.1.c.

R1 Identify the evidence obtained that details the locations where sensitive data that is associated with sensitive functions and sensitive resources are stored.

1.2.c Where the sensitive functions or sensitive resources are provided by third-party software or systems, the assessor shall examine evidence and test the software to confirm that the software correctly follows …
Removed p. 36
• The software vendor defines criteria for classifying critical assets in accordance with the confidentiality, integrity, and resiliency requirements for each critical asset.

• An inventory of all critical assets with appropriate classifications is maintained.

R1 Identify the evidence obtained that details the confidentiality, integrity, and resiliency requirements for each critical asset.
Removed p. 37
In Place / Not in Place / N/A

2.1 All functions exposed by the software are enabled by default only when and where it is a documented and justified part of the software architecture.

In Place / Not in Place / N/A

2.1.a The assessor shall examine evidence and test the software to identify any software APIs or other interfaces that are provided or exposed by default upon installation, initialization, or first use. For each of these interfaces, the assessor shall confirm that the vendor has documented and justified its use as part of the software architecture. Testing shall include methods to reveal any exposed interfaces or other software functionality (such as scanning for listening services where applicable).

Note: This includes functions that are auto-enabled as required during operation of the software.

R1 Identify the evidence obtained that details all interfaces (user interfaces, APIs, etc.) that are accessible or that can be made accessible (through user input …
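The scan for listening services mentioned in Test Requirement 2.1.a can be approximated with a minimal TCP connect scan. This is a sketch only; real assessments typically use purpose-built tools such as nmap, and the function name and defaults here are assumptions, not part of the Standard.

```python
import socket

def scan_listening_ports(host="127.0.0.1", ports=range(1, 1025), timeout=0.2):
    """Attempt a TCP connection to each port; a successful connect
    indicates a listening service that should be documented and
    justified as part of the software architecture."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:
                open_ports.append(port)
    return open_ports
```

Running such a scan before and after installation of the assessed software is one way to reveal interfaces that are exposed by default.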
Removed p. 37
2.1.b The assessor shall test the software to determine whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication. Where such resources are relied upon, the assessor shall examine evidence to confirm that methods are implemented to ensure that proper authentication remains in place and that these methods are included in the assessment of other applicable requirements in this standard.

R1 Indicate whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication, such as those that are provided by the execution environment or that reside outside of the execution environment.

R2 If R1 is “No,” then describe what the assessor observed that confirms that none of the interfaces identified in Test Requirement 2.1.a rely on external resources for authentication.

R3 If R1 is “Yes,” then describe the methods that are implemented to ensure that proper authentication always remains in place during …
Removed p. 38
2.1.c The assessor shall test the software to determine whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data during transmission. Where such resources are relied upon, the assessor shall examine evidence to confirm that methods are implemented to ensure proper protection remains in place and that these methods are included in the assessment of other applicable requirements in this standard.

R1 Indicate whether any of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data.

R2 If R1 is “No,” then describe what the assessor observed that confirms that none of the interfaces identified in Test Requirement 2.1.a rely on external resources for the protection of sensitive data.

R3 If R1 is “Yes,” then describe the methods that are implemented to ensure that the protection of sensitive data always remains in place during software operation.

2.1.d …
Removed p. 39
• Methods are implemented to mitigate the exploitation of these weaknesses.

• The risks posed by the use of known vulnerable protocols, functions, or ports are documented.

• Clear and sufficient guidance on how to correctly implement sufficient security to meet applicable control objectives in this standard is provided to stakeholders in accordance with Control Objective 12.1.

Note: The assessor should reference the vendor threat information defined in Control Objective 4.1 for this item.

R1 Identify the evidence obtained that details the software vendor’s analysis of the risks of using known vulnerable protocols, functions, and ports.

R2 Describe the protection methods that are implemented to mitigate the exploitation of known vulnerable protocols, functions, and ports.

2.1.f The assessor shall examine evidence to identify any third-party modules used by the software and to confirm that any such functions exposed by each module are disabled, unable to be accessed through mitigation methods implemented by the software, or formally …
Removed p. 40
Note: Specific software security controls required to protect the integrity and confidentiality of sensitive data, sensitive functions, and sensitive resources are captured in the Software Protection Mechanisms section.

In Place / Not in Place / N/A

2.2.a The assessor shall examine evidence and test the software to identify all software security controls, features and functions relied upon by the software for the protection of critical assets and to confirm that all are enabled upon installation, initialization, or first use of the software.

R1 Identify the evidence obtained that details the software security controls, features, and functions relied upon by the software for the protection of critical assets.

R2 Describe what the assessor observed that confirms that all software security controls, features, and functions relied upon by the software for the protection of critical assets are enabled (or can be enabled) upon software installation, initialization, or first use.

2.2.b Where any software security controls, features and functions …
Removed p. 41
R1 Indicate whether user input or interaction is required to enable any software security controls, features, or functions after installation, initialization, or first use.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance for configuring all configurable software security controls, features, or functions.
Removed p. 41
In Place / Not in Place / N/A

2.3.a The assessor shall examine evidence to identify the default credentials, keys, certificates, and other critical assets used for authentication by the software.

Note: The assessor should refer to evidence obtained in the testing of Control Objectives 1, 5, and 7 to determine the authentication and access control mechanisms, keys, and other critical assets used for authentication.

R1 Identify the evidence obtained that details the credentials, keys, certificates, and other data relied upon by the software for authentication purposes.
Removed p. 42
Note: It is expected that this analysis will include, but not necessarily be limited to, the use of entropy analysis tools to look for hardcoded cryptographic keys; searches for common cryptographic function calls and structures, such as S-Boxes and big-number library functions (and tracing these functions backwards to search for hardcoded keys); as well as checking for strings containing common user account names or password values.

R1 Describe the tests performed, including the tool(s) and/or method(s) used and the scope of each test, to confirm that the evidence obtained in Test Requirement 2.3.a accurately represents the credentials, keys, certificates, and other data relied upon by the software for authentication purposes.
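The entropy-analysis technique referenced in the note above can be sketched as follows. The function names, window size, and threshold are illustrative assumptions, not values prescribed by the Standard: regions of a binary whose byte entropy approaches 8 bits/byte are candidates for random key material rather than code or text.

```python
import math
from collections import Counter

def shannon_entropy(data: bytes) -> float:
    """Shannon entropy in bits per byte of the given buffer."""
    if not data:
        return 0.0
    total = len(data)
    return -sum(
        (count / total) * math.log2(count / total)
        for count in Counter(data).values()
    )

def find_high_entropy_regions(blob: bytes, window: int = 256, threshold: float = 7.5):
    """Slide a fixed-size window across a binary (non-overlapping here,
    for brevity) and flag offsets whose byte entropy approaches the
    8 bits/byte characteristic of random key material."""
    hits = []
    for offset in range(0, len(blob) - window + 1, window):
        entropy = shannon_entropy(blob[offset:offset + window])
        if entropy >= threshold:
            hits.append((offset, round(entropy, 2)))
    return hits
```

Flagged offsets would then be inspected manually to distinguish hardcoded keys from other high-entropy content such as compressed data.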

2.3.c Where user input or interaction is required to disable or change any authentication credentials or keys for built-in accounts, the assessor shall examine evidence to confirm that guidance on configuring these options is provided to stakeholders in accordance with Control …
Removed p. 43
Note: The assessor should refer to evidence obtained in the testing of Control Objective 6 to determine the software security controls implemented to protect sensitive data.

R1 Identify the evidence obtained that details the purposes for which cryptographic keys are used by the software.

R2 Describe what the assessor observed that confirms that cryptographic keys used for authentication are not also used for other purposes.
Removed p. 43
In Place / Not in Place / N/A

2.4.a The assessor shall examine evidence to identify the privileges and resources required by the software and to confirm that information is maintained that describes and reasonably justifies all privileges and resources required, including explicit permissions for access to resources, such as cameras, contacts, etc.

R1 Identify the evidence obtained that details the resources and access privileges required by the software from the execution environment, and the software vendor’s justification for why such resources and access privileges are necessary.

2.4.b Where limiting access is not possible due to the architecture of the solution or the execution environment in which the software is executed, the assessor shall examine evidence to identify all mechanisms implemented by the software to prevent unauthorized access, exposure, or modification of critical assets, and to confirm that guidance on properly implementing and configuring these mechanisms is provided to stakeholders in accordance with Control …
Removed p. 44
R5 Describe any other assessment activities and/or findings for this test requirement.

2.4.c The assessor shall test the software to confirm that access permissions and privileges are assigned according to the evidence examined in Test Requirement 2.4.a. The assessor shall, where possible, use suitable tools for the platform on which the software is installed to review the permissions and privileges of the software itself, as well as the permissions and privileges of any resources, files, or additional elements generated or loaded by the software during use.

Note: Where the above testing is not possible, the assessor shall justify why this is the case and confirm that the testing that has been performed is sufficient.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to verify that the evidence obtained in Test Requirement 2.4.a accurately reflects the execution environment resources and access privileges required …
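As one example of tool-assisted permission review on a Unix-like platform, files created or installed by the software can be flagged when they grant write access beyond the owner. The function name and the specific check are assumptions for illustration; the Standard only requires that suitable platform tools be used.

```python
import os
import stat

def report_group_or_world_writable(paths):
    """Flag files whose mode grants write access to group or other,
    a common starting point when reviewing the permissions of files
    generated or loaded by the assessed software (illustrative only)."""
    findings = []
    for path in paths:
        mode = os.stat(path).st_mode
        if mode & (stat.S_IWGRP | stat.S_IWOTH):
            findings.append((path, stat.filemode(mode)))
    return findings
```

Any flagged entries would be compared against the vendor's documented justification of required privileges from Test Requirement 2.4.a.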
Removed p. 45
In Place / Not in Place / N/A

2.5.a The assessor shall examine evidence to identify all default accounts provided by the software and to confirm that the privileges assigned to these accounts are justified and reasonable.

R1 Identify the evidence obtained that details the default privileges assigned to built-in accounts and the software vendor’s justification for assigning such privileges by default.

2.5.b The assessor shall test the software to confirm that all default accounts provided or used by the software are supported by the evidence examined in Test Requirement 2.5.a.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to verify that the evidence obtained in Test Requirement 2.5.a accurately reflects the built-in accounts and default privileges assigned to those accounts.

2.5.c The assessor shall examine evidence and test the software to confirm that exposed interfaces, such as APIs, are protected from attempts by …
Removed p. 46
In Place / Not in Place / N/A

3.1 The software only retains the sensitive data absolutely necessary for the software to provide its intended functionality.

In Place / Not in Place / N/A

3.1.a The assessor shall examine evidence to identify the sensitive data that is collected by the software for use beyond any one transaction, the default time period for which it is retained, and whether the retention period is user-configurable, and to confirm that the purpose for retaining the sensitive data in this manner is justified and reasonable.

Note: The assessor should refer to evidence obtained in the testing of Control Objective 1.1 to determine the sensitive data retained by the software.

R1 Identify the evidence obtained that details the sensitive data collected and retained by the software beyond a single transaction, the default time period for which it is retained, and whether the retention period is user configurable.

3.1.b The assessor shall test the …
Removed p. 47
R1 Indicate whether the software facilitates the persistent storage of sensitive data for the purposes of debugging, error finding, or system testing.

R2 If R1 is “No,” then describe what the assessor observed that confirms that sensitive data is not retained persistently for this purpose.

R3 If R1 is “Yes,” then describe the methods implemented to protect sensitive data when retained for this purpose.

R4 If R1 is “Yes,” then describe how the software handles sensitive data retained for this purpose when debugging, error finding, and/or testing functions are terminated.
Removed p. 47
3.1.d Where user input or interaction is required to configure the retention period of sensitive data, the assessor shall examine vendor evidence to confirm that there is clear and sufficient guidance on this process provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

R1 Indicate whether the software requires or otherwise enables users to configure the retention period for persistently stored sensitive data.

R2 If R1 is “Yes,” identify the evidence obtained that details the options available to users to configure the retention periods for this data.
Removed p. 48
In Place Not in Place N/A 3.2.a Using information obtained in Test Requirement 1.1.a, the assessor shall examine evidence to identify all sensitive data that is retained by the software for transient use, what triggers the secure deletion of this data, and to confirm that the purposes for retaining the data are justified and reasonable. This includes data that is stored only in memory during the operation of the software.

R1 Identify the evidence obtained that details the sensitive data that is retained by the software for transient use.

R2 Describe the mechanisms used or relied upon by the software to securely delete sensitive data from transient storage facilities once the purpose for retaining this data has been fulfilled.

3.2.b Using information obtained in Test Requirement 1.2.a, the assessor shall test the software to confirm that all available functions or services that retain transient sensitive data are supported by evidence examined in Test …
Removed p. 49
R1 Indicate whether the software facilitates the storage of sensitive data in temporary storage facilities for the purposes of debugging, error finding, or system testing.

R2 If R1 is “Yes,” then describe the methods implemented to protect this data when retained for this purpose.

R3 If R1 is “Yes,” then describe how the software handles this data when debugging, error finding, and/or testing functions are terminated.

3.2.d Where users can configure retention of transient sensitive data, the assessor shall examine vendor evidence to confirm that clear and sufficient guidance on this process is provided in the software vendor’s implementation guidance made available to stakeholders per Control Objective 12.1.

R1 Indicate whether the software requires or otherwise enables users to configure the retention periods for sensitive data stored in temporary storage facilities.

R2 If R1 is “Yes,” then identify the evidence obtained that details the options available to users to configure the retention periods for this …
Removed p. 49
Note: The Software Protection Mechanisms section includes several specific software security controls that are required to be implemented to protect sensitive data during storage, processing, or transmission. Those software security controls should be analyzed to determine their applicability to the types of sensitive data retained by the software.
Removed p. 50
R1 Identify the evidence examined that details the methods implemented and/or relied upon to protect sensitive data (both transient and persistent) during retention.

3.3.b Where sensitive data is stored outside of temporary variables within the code itself, the assessor shall test the software to confirm that sensitive data is protected using either strong cryptography or other methods that provide an equivalent level of security.

R1 Indicate whether the software stores any sensitive data within the code itself (e.g., is ‘hardcoded’).

R2 If R1 is “No,” describe how the assessor confirmed that no sensitive data is stored within the code.

R3 If R1 is “Yes,” then identify the evidence obtained that details the locations within the code where this data is stored.

R4 If R1 is “Yes,” then describe the methods implemented to protect this data from unauthorized disclosure and/or modification (as applicable).
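
Where the assessor needs to confirm that no sensitive data is stored within the code (R2), a source scan along these lines is one possible starting point. This is an illustrative sketch only; the patterns are hypothetical examples and no substitute for a dedicated secret-scanning tool or manual code review:

```python
import re

# Hypothetical patterns for common hardcoded-secret shapes; a real review
# would use the vendor's codebase and a purpose-built scanner.
SECRET_PATTERNS = [
    re.compile(r'(?i)(password|passwd|secret|api[_-]?key)\s*=\s*["\'][^"\']+["\']'),
    re.compile(r'-----BEGIN (?:RSA |EC )?PRIVATE KEY-----'),
]

def find_hardcoded_secrets(source: str) -> list[str]:
    """Return source lines that match any secret pattern."""
    hits = []
    for line in source.splitlines():
        if any(p.search(line) for p in SECRET_PATTERNS):
            hits.append(line.strip())
    return hits

sample = 'api_key = "ABC123"\ntimeout = 30\n'
assert find_hardcoded_secrets(sample) == ['api_key = "ABC123"']
```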

3.3.c Where protection methods use cryptography, the assessor shall examine evidence and test the …
Removed p. 51
R2 If R1 is “Yes,” then describe the protection methods used or relied upon by the software for this purpose.

R3 If R1 is “Yes,” then describe the methods implemented to ensure that these protection methods are present in all environments where the software is designed to be executed.

3.3.e Where user input or interaction is required to configure protection methods, the assessor shall examine evidence to confirm that guidance on configuring these options is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether the software requires or otherwise enables users to configure methods to protect sensitive data in storage facilities (transient or persistent).

R2 If R1 is “Yes,” then identify the evidence obtained that details the options available to users for configuring protection methods.
Removed p. 52
R1 Identify the evidence obtained that details the methods implemented to render persistent sensitive data irretrievable when no longer needed.

3.4.b The assessor shall examine evidence and test the software to identify any platform- or implementation-level issues that complicate the secure deletion of non-transient sensitive data and to confirm that any non-transient sensitive data is securely deleted using a method that ensures that the data is rendered unrecoverable. Methods may include (but are not necessarily limited to) overwriting the data, deletion of cryptographic keys (of sufficient strength) that have been used to encrypt the data, or platform-specific functions that provide for secure deletion. Methods must account for platform-specific issues, such as flash wear-levelling algorithms or SSD over-provisioning, which may complicate simple overwriting methods.
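
The overwrite-then-delete method mentioned above can be sketched as follows (Python; the file contents are hypothetical). Note the caveat in the requirement itself: on wear-levelled flash or over-provisioned SSDs, in-place overwriting is unreliable, and destroying the encryption key (crypto-erase) is the safer approach:

```python
import os
import tempfile

def overwrite_and_delete(path: str, passes: int = 1) -> None:
    """Overwrite file contents in place, flush to disk, then unlink.

    Caveat: wear-levelling and SSD over-provisioning may leave stale
    copies on the medium; on such platforms, crypto-erase (destroying
    the key that encrypted the data) is preferred over overwriting."""
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(os.urandom(size))   # replace contents with random bytes
            f.flush()
            os.fsync(f.fileno())        # force the overwrite to stable storage
    os.remove(path)

# Hypothetical sensitive record standing in for retained cardholder data.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(b"PAN=4111111111111111")
    path = f.name

overwrite_and_delete(path)
assert not os.path.exists(path)
```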

R1 Indicate whether known platform or implementation-level issues exist that complicate the secure deletion of sensitive data from persistent data stores.

R2 If R1 is “Yes,” then …
Removed p. 53
Note: Where forensic testing of some or all aspects of the platform is not possible, the assessor should examine additional evidence to confirm secure deletion of sensitive data. Such evidence may include (but is not necessarily limited to) memory and storage dumps from development systems, evidence from memory traces from emulated systems, or evidence from physical extraction of data performed on-site by the software vendor.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that sensitive data stored in persistent data stores is rendered irretrievable upon secure deletion.
Removed p. 53
In Place Not in Place N/A 3.5.a The assessor shall examine evidence to identify the methods implemented to render transient sensitive data irretrievable and to confirm that sensitive data is unrecoverable after the process is complete.

Note: This includes data which may be stored only temporarily in program memory / variables during operation of the software.

R1 Identify the evidence obtained that details the methods implemented to render sensitive data stored in transient data stores irretrievable upon secure deletion.

R2 If R1 is “Yes,” then describe the methods implemented to mitigate the risks associated with such complications.

Note: Where forensic testing of some or all aspects of the platform is not possible, the assessor should examine additional evidence to confirm secure deletion of sensitive data. Such evidence may include (but is not necessarily limited to) memory and storage dumps from development systems, evidence from memory traces from emulated systems, or evidence from physical extraction …
Removed p. 54
R1 Indicate whether known platform or implementation-level issues were discovered that complicate the secure deletion of sensitive data from transient data stores.

3.5.c The assessor shall test the software to identify any sensitive data residue in the execution environment and to confirm that the implemented methods are applied correctly and enforced for all transient sensitive data. This analysis should account for the data structures and methods used to store the sensitive data (for example, by examining file systems at the allocation level and translating data formats to identify sensitive data elements) and cover all transient sensitive data types.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that sensitive data retained in transient data stores is rendered irretrievable upon secure deletion.
Removed p. 55
• Error messages, error logs, or memory dumps.

• Execution environments that may be vulnerable to remote side-channel attacks to expose sensitive data, such as attacks that exploit cache timing or branch prediction within the platform processor.

• Automatic storage or exposure of sensitive data by the underlying execution environment, such as through swap files, system error logging, keyboard spell-checking, and auto-correct features.

• Sensors or services provided by the execution environment that may be used to extract or leak sensitive data, such as through use of an accelerometer to capture input of a passphrase to be used as a seed for a cryptographic key, or through capture of sensitive data through use of cameras or near-field communication (NFC) interfaces.

R1 Identify the evidence obtained that details the software vendor’s sensitive data disclosure attack vector analysis.

R2 Describe what the assessor observed in the evidence obtained that confirms the software vendor’s analysis accounts for the attack …
Removed p. 56
R1 Identify the evidence obtained that details the sensitive data stored, processed, or transmitted by the software that requires confidentiality protection.

R2 Describe the methods implemented to protect this data from unauthorized disclosure through the vectors identified in Test Requirement 3.6.a.

3.6.c Where protection methods require user input or interaction, the assessor shall examine evidence to confirm that guidance on the proper configuration and use of such methods is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether the software requires or otherwise enables users to configure any of the protection methods identified in Test Requirement 3.6.b.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure these protection methods.

3.6.d The assessor shall test the software to identify any sensitive data residue in the execution environment, and to confirm that protection methods are implemented correctly and the software does not expose …
Removed p. 58
In Place Not in Place N/A 4.1 Attack scenarios applicable to the software are identified.

Note: This control objective is an extension of Control Objective 10.1. Validation of both control objectives should be performed at the same time.

In Place Not in Place N/A 4.1.a The assessor shall examine evidence to confirm that the software vendor has identified and documented relevant attack scenarios for the software.

R1 Identify the evidence obtained that details the software vendor’s analysis of potential threats and attack scenarios applicable to the assessed software.

R2 Identify the date when the software vendor’s threat analysis was last performed or updated.

4.1.b The assessor shall examine evidence to determine whether any specific industry-standard methods or guidelines were used to identify relevant attack scenarios.

Where such industry standards are not used, the assessor shall confirm that the methodology used provides equivalent coverage for the attack scenarios applicable to the software under evaluation.

R1 Identify the evidence …
Removed p. 59
• A formal owner of the software is assigned. This may be a role for a specific individual or a specific name, but evidence must clearly show an individual who is accountable for the security of the software.

• A methodology is defined for measuring the likelihood and impact for any exploit of the system.

• Generic threat methods and types that may be applicable to the software are documented.

• All critical assets managed, and all sensitive resources used by the system are documented.

• All entry and egress points for sensitive data, as well as the authentication and trust model applied to each of these entry/egress points, are documented.

• All data flows, network segments, and authentication/privilege boundaries are documented.

• All static IPs, domains, URLs, or ports required by the software for operation are documented.

• Considerations for cryptography elements like cipher modes, and protecting against relevant attacks such as timing attacks, padding oracles, …
Removed p. 60
• Execution environment implementation specifics or assumptions, such as network configurations and operating system security configurations, are documented.

• Considerations for the software execution environment, the size of the install base, and the attack surfaces that must be mitigated are documented. Examples of such attack surfaces may include insecure user prompts or protocol stacks, or the storage of sensitive data post authorization or using insecure methods.
Removed p. 60
R1 Identify the evidence obtained that details the methods implemented to mitigate each of the threats identified in Test Requirement 4.1.a.

R2 Identify the evidence obtained that details the software vendor’s justification(s) for any threats identified in Test Requirement 4.1.a that were not mitigated.
Removed p. 61
R1 Indicate whether any of the mitigations identified in Test Requirement 4.2.a rely on settings within the software.

R2 If R1 is “Yes,” then describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that all such settings are applied upon software installation, initialization, or first use.

4.2.c Where user input or interaction can disable, remove, or bypass any such mitigations, the assessor shall examine evidence and test the software to confirm that such action requires authentication and authorization and that guidance on the risk of such actions is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether users can disable, remove, or bypass any of the settings identified in Test Requirement 4.2.b.

R2 If R1 is “Yes,” then describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm …
Removed p. 62
R1 Indicate whether the software relies on APIs provided by the execution environment to query the status of software security controls.

R2 If R1 is “Yes,” then describe each of the software test(s) performed, including the tool(s) or method(s) used and the scope of each test, to confirm that checks are implemented to ensure that these security controls are in place and active upon and throughout software execution.
Removed p. 63
In Place Not in Place N/A 5.1 Access to critical assets is authenticated. In Place Not in Place N/A 5.1.a The assessor shall examine evidence to confirm that authentication requirements are defined (i.e., type and number of factors) for all roles based on critical asset classification, the type of access (e.g., local, non-console, remote) and level of privilege.

Note: The assessor should refer to evidence obtained in the testing of Control Objective 1.3 to determine the classifications for all critical assets.

R1 Identify the evidence obtained that details all authentication methods relied upon by the software.

R2 Describe the software vendor’s methodology for defining authentication requirements and whether authentication methods differ based on the types of critical assets accessed and the access privileges required.

5.1.b The assessor shall examine evidence and test the software to confirm that access to critical assets is authenticated and authentication mechanisms are implemented correctly.

R1 Describe the assessment activities performed …
Removed p. 64
R1 Indicate whether the software relies on or supports the use of external authentication mechanisms.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the software to securely use such authentication mechanisms.

5.1.d The assessor shall examine evidence to confirm that sensitive data associated with authentication credentials, including public keys, is identified as a critical asset.

R1 Identify the evidence obtained that confirms that all data associated with authentication credentials, including public keys, is appropriately protected.
Removed p. 64
R1 Describe the assessment activities performed and what the assessor observed in the evidence obtained that confirms that all authentication mechanisms relied upon by the software require unique user identification.
Removed p. 65
R1 Indicate whether the software provides or otherwise facilitates automated API access to critical assets.

R2 If R1 is “Yes,” then identify the evidence obtained that confirms that unique identification is required for each different program and system accessing these APIs.

5.2.c Where identification is supplied across a non-console interface, the assessor shall test the software to confirm that authentication credentials are protected from attacks that attempt to intercept them in transit.

R1 Indicate whether any authentication credentials (user, API, etc.) are supplied across a non-console interface.

R2 If R1 is “Yes,” then describe the methods implemented within or by the software to protect these authentication credentials from attempts to intercept them in transit.

5.2.d The assessor shall examine evidence to confirm that the guidance provided to stakeholders per Control Objective 12.1 specifically notes that identification and authentication parameters must not be shared between individuals, programs, or in any way that prevents the unique …
Removed p. 66
R1 Identify the evidence obtained that demonstrates that all authentication methods implemented by the software are evaluated to determine whether they contain known vulnerabilities.

R2 Describe how the implementation of these authentication methods mitigates vulnerabilities common to those methods.

5.3.b The assessor shall examine evidence to confirm that the implemented authentication methods are robust, and that the robustness of the authentication methods was evaluated using industry-accepted methods.

Note: The vendor assessment and robustness justification include consideration of the full path of the user credentials, from any input source (such as a Human Machine Interface or other program), through transition to the execution environment of the software (including any switched/network transmissions and traversal through the execution environment’s software stack before being processed by the software itself).

R1 Identify the evidence obtained that details the software vendor’s analysis of the implemented authentication methods and their ability to resist attacks common to such methods.

R2 Describe how the …
Removed p. 67
In Place Not in Place N/A 5.4.a The assessor shall examine evidence to confirm that information is maintained that identifies and justifies the required access for all critical assets.

R1 Identify the evidence obtained that details the access privileges granted to critical assets by default, and the software vendor’s justification for granting such access.

5.4.b The assessor shall examine evidence and test the software to identify the level of access that is provided to critical assets and to confirm that such access correlates with the evidence examined in Test Requirement 5.4.a. Testing to confirm access to critical assets is properly restricted should include attempts to access critical assets through user accounts, roles, or services which should not have the required privileges.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that the access privileges granted to critical assets by default …
Removed p. 68
In Place Not in Place N/A 6.1 Sensitive data is secured anywhere it is stored. In Place Not in Place N/A 6.1.a The assessor shall examine evidence to confirm that protection requirements for all sensitive data are defined, including requirements for rendering sensitive data with confidentiality considerations unreadable anywhere it is stored persistently.

R1 Identify the evidence obtained that details the integrity and confidentiality protection requirements for all sensitive data identified in Test Requirement 1.1.a.

R2 Describe any other testing activities performed and/or findings for this test requirement.

6.1.b The assessor shall examine evidence and test the software to confirm that security controls are implemented to protect sensitive data during storage and that they address all defined protection requirements and identified attack scenarios.

Note: The assessor should refer to evidence obtained in the testing of Control Objective 1.1 to determine all sensitive data retained by the software, and Control Objective 4.1 to identify all …
Modified p. 68 → 26
R1 Identify the evidence obtained that details the security controls implemented to protect sensitive data during storage.
ID Doc ID Description of security mechanism(s) implemented to protect the Sensitive Data / Resource
Removed p. 69
R2 If R1 is “Yes,” then identify the evidence obtained that details the cryptographic algorithms and cipher modes used or relied upon by the software.

R3 Describe what the assessor observed in the evidence obtained that confirms that each of the cryptographic algorithms and cipher modes used or relied upon complies with all applicable requirements within Control Objective 7.

6.1.d Where index tokens are used for securing sensitive data, the assessor shall examine evidence and test the software to confirm that these are generated in a way that ensures there is no correlation between the value and the sensitive data that is being referenced (without access to the software to perform the correlation as part of a formally defined and assessed feature of that software, such as “de-tokenization”).

R1 Indicate whether index tokens are used or otherwise relied upon to protect stored sensitive data.

R2 If R1 is “Yes,” then describe how the index …
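
A minimal sketch of the index-token property being tested in 6.1.d: tokens drawn from a secure random source have no mathematical correlation with the referenced sensitive value, so reversing the mapping requires access to the software's lookup table, e.g. through a defined "de-tokenization" feature (illustrative Python; the class and method names are hypothetical):

```python
import secrets

class TokenVault:
    """Toy index-token store. Tokens are random, so they carry no
    mathematical relationship to the sensitive value they reference."""

    def __init__(self):
        self._table = {}

    def tokenize(self, sensitive_value: str) -> str:
        token = secrets.token_hex(16)       # 128 bits from the OS CSPRNG
        self._table[token] = sensitive_value
        return token

    def detokenize(self, token: str) -> str:
        # Only software holding the lookup table can reverse the mapping.
        return self._table[token]

vault = TokenVault()
tok = vault.tokenize("4111111111111111")    # hypothetical sensitive value
assert tok != "4111111111111111"
assert vault.detokenize(tok) == "4111111111111111"
```
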
Modified p. 69 → 88
R1 Indicate whether the software relies on cryptography to protect stored sensitive data.
B2-1 The software uses strong cryptography to protect account data.
Removed p. 70
6.1.f Where protection methods rely on the security properties of third-party software, the assessor shall examine evidence and test the software to confirm that there are no unmitigated vulnerabilities or issues with the software providing the security properties.

R1 Indicate whether implemented software protection methods rely on the security properties of third-party software to protect stored sensitive data.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms there are no unmitigated vulnerabilities in the third-party software.
Removed p. 70
R1 Identify the evidence obtained that details the locations within the software where sensitive data is transmitted outside of the physical execution environment.

R2 Identify the evidence obtained that details the protection requirements for sensitive data transmitted outside of the physical execution environment.
Removed p. 71
Note: The assessor should refer to evidence obtained in the testing of Control Objective 1.1 to determine the sensitive data stored, processed, or transmitted by the software.

R1 Describe what the assessor observed in the evidence obtained that confirms that sensitive data transmitted outside of the execution environment is encrypted using strong cryptography.

6.2.c Where third-party or execution-environment features are relied upon for the security of the transmitted data, the assessor shall examine evidence to confirm that guidance on how to configure such features is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether the software relies on third- party software or features of the execution environment to protect sensitive data during transmission.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the assessed software to use these features in a secure manner.

6.2.d Where transport layer encryption is used to …
Removed p. 72
R1 Indicate whether the encryption methods implemented to protect sensitive data during transmission allow for the use of different types of cryptography or cryptography with different effective key strengths.

R2 If R1 is “Yes,” then describe the method(s) used or relied upon by the software to ensure that strong cryptography is always enforced where sensitive data is transmitted outside of the execution environment.
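
One common way to ensure strong cryptography is always enforced for data in transit, as contemplated in R2, is to pin a floor on the negotiable protocol version so weaker versions cannot be selected. A sketch using Python's standard ssl module (client-side; TLS 1.2 as the policy floor is an assumption for illustration):

```python
import ssl

# Client-side context: refuse anything below TLS 1.2 so a downgrade to
# weaker protocol versions cannot be negotiated with a malicious peer.
ctx = ssl.create_default_context()
ctx.minimum_version = ssl.TLSVersion.TLSv1_2

assert ctx.minimum_version == ssl.TLSVersion.TLSv1_2
assert ctx.verify_mode == ssl.CERT_REQUIRED   # server certificates validated
assert ctx.check_hostname                     # hostname matching enforced
```

`create_default_context()` already enables certificate and hostname validation; the sketch only adds the version floor that 6.2.d-style testing would look for.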
Removed p. 72
Note: The assessor should refer to Control Objective 7 to identify all requirements for appropriate and correct implementation of cryptography.

R1 Indicate whether the software relies on cryptography for the protection of sensitive data during storage, processing, or transmission.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that all cryptography relied upon for the protection of sensitive data during storage, processing, or transmission complies (or can be configured to comply) with all applicable sections of Control Objective 7.
Removed p. 73
R1 Indicate whether the software relies on any cryptographic methods provided by third-party software or the execution environment to protect sensitive data.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the assessed software to use these methods in a secure manner.

6.3.c Where asymmetric cryptography such as RSA or ECC is used for protecting the confidentiality of sensitive data, the assessor shall examine evidence and test the software to confirm that private keys are not used for providing confidentiality protection to the data.

R1 Indicate whether the software relies on asymmetric cryptography to encrypt sensitive data during storage, transmission, or processing.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained through examination and testing that confirms private keys are not used to protect the confidentiality of sensitive data during storage, transmission, or processing.
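
The asymmetry being tested in 6.3.c can be illustrated with textbook-RSA toy numbers (p = 61, q = 53; illustration only, with no padding and far-too-small parameters): confidentiality comes from encrypting with the public key, while "encrypting" with the private key (a signature primitive) provides none, since anyone holding the public key can reverse it:

```python
# Textbook-RSA toy parameters purely for illustration; real implementations
# must use vetted libraries, proper padding (e.g., OAEP), and key sizes
# of at least 2048 bits.
n, e, d = 3233, 17, 2753   # public modulus, public exponent, private exponent

plaintext = 65
ciphertext = pow(plaintext, e, n)   # encrypt with the PUBLIC key
recovered = pow(ciphertext, d, n)   # only the private key can decrypt
assert recovered == plaintext

# Applying the private exponent first gives no confidentiality: the public
# exponent, which anyone may hold, undoes it.
leaked = pow(pow(plaintext, d, n), e, n)
assert leaked == plaintext
```
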
Removed p. 74
In Place Not in Place N/A 7.1 Industry-standard cryptographic algorithms and methods are used for securing critical assets. Industry-standard cryptographic algorithms and methods are those recognized by industry-accepted standards bodies such as NIST, ANSI, ISO, and EMVCo. Cryptographic algorithms and parameters that are known to be vulnerable are not used.

In Place Not in Place N/A 7.1.a The assessor shall examine evidence to determine how cryptography is used for the protection of critical assets and to confirm that:

• Industry-standard cryptographic algorithms and modes of operation are used.

• The use of any other algorithms is in conjunction with industry-standard algorithms.

• The implementation of non-standard algorithms does not reduce the equivalent cryptographic key strength provided by the industry-standard algorithms.

R1 Identify the evidence obtained that details the cryptographic algorithms and cipher modes relied upon by the software for the protection of sensitive data.

R2 Indicate whether any of these algorithms or cipher modes are …
Removed p. 75
• Only documented cryptographic algorithms and modes of operation are used in the software.

• Protection methods are implemented to mitigate common attacks on cryptographic implementations (for example, the use of the software as a decryption oracle, brute-force or dictionary attacks against the input domain of the sensitive data, the re-use of security parameters such as IVs, or the re-encryption of multiple datasets using linearly applied key values, such as XOR’d key values in stream ciphers or one-time pads).

R1 Identify the evidence obtained that confirms that only documented cryptographic algorithms and cipher modes are relied upon by the software for the protection of sensitive data.

R2 Identify the evidence obtained that confirms that protection methods are implemented to mitigate threats to the cryptographic algorithms and cipher modes relied upon for the protection of sensitive data.

7.1.c Where cryptographic implementations require a unique value per encryption operation or session, the assessor shall …
Removed p. 76
R1 Indicate whether the software relies upon padding methods prior to or during encryption operations involving sensitive data.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that only industry-standard padding methods are used.

7.1.e Where hash functions are used to protect sensitive data, the assessor shall examine evidence and test the software to confirm that:

• Only approved, collision-resistant hash algorithms and methods are used for this purpose, and

• A salt value of appropriate strength that is generated using a secure random number generator is used to ensure the resultant hash has sufficient entropy.

Note: The assessor should refer to Control Objective 7.3 for more information on secure random number generators.

R1 Indicate whether the software relies upon hash functions for the protection of sensitive data.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that only industry-standard, collision-resistant …
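
The salted-hash property described in 7.1.e can be sketched with Python's standard hashlib and secrets modules. The iteration count shown is an illustrative assumption, not a figure mandated by the standard:

```python
import hashlib
import secrets

def salted_hash(sensitive: bytes) -> tuple[bytes, bytes]:
    """Hash a value with a fresh random salt so the result cannot be
    reversed via precomputed tables. PBKDF2 adds work-factor iterations
    against brute force of a small input domain (e.g., PANs)."""
    salt = secrets.token_bytes(16)                        # 128-bit random salt
    digest = hashlib.pbkdf2_hmac("sha256", sensitive, salt, 600_000)
    return salt, digest

salt, digest = salted_hash(b"4111111111111111")    # hypothetical input
salt2, digest2 = salted_hash(b"4111111111111111")

# Same input, fresh salt -> different digest: no cross-record correlation.
assert digest != digest2
assert len(salt) == 16 and len(digest) == 32
```
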
Removed p. 77
In Place Not in Place N/A 7.2.a The assessor shall examine evidence to confirm that information is maintained that describes the following for each key specified in the inventory:

• Key generation method/algorithm used

• Key length

R1 Identify the evidence obtained that details the characteristics of each cryptographic key relied upon by the software for the protection of sensitive data.
Removed p. 78
• All cryptographic keys that are used for providing security to critical assets (confidentiality, integrity, and authenticity) and other security services to the software have a unique purpose, and that no key is used for both encryption and authentication operations.

• All keys have defined generation methods, and no secret or private cryptographic keys relied upon for security of critical assets are shared between software instances, except when a common secret or private key is used for securing the storage of other cryptographic keys that are generated during the installation, initialization, or first use of the software (for example, white-box cryptography).

• All cryptographic keys have an equivalent bit strength of at least 128 bits in accordance with industry standards.

• All keys have a defined cryptoperiod aligned with industry standards, and methods are implemented to retire and/or update each key at the end of the defined cryptoperiod.

R1 Describe what the assessor observed …
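The "unique purpose per key" expectation above can be sketched by deriving separate 128-bit keys from one master key with an HKDF-expand-style HMAC step, so that no single key serves both encryption and authentication. The function and label names are hypothetical.

```python
import hashlib
import hmac

def derive_purpose_keys(master_key: bytes) -> dict[str, bytes]:
    # One HMAC invocation per purpose label (HKDF-expand style), so the
    # encryption key and the authentication key are distinct values and
    # no key is used for both operations.
    def expand(label: bytes) -> bytes:
        return hmac.new(master_key, label + b"\x01", hashlib.sha256).digest()[:16]  # 128-bit key

    return {"encryption": expand(b"enc"), "authentication": expand(b"auth")}
```

The 16-byte truncation matches the "equivalent bit strength of at least 128 bits" floor cited in the requirement.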
Removed p. 80
R1 Indicate whether the software relies on public keys for the protection of sensitive data.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that the authenticity of these public keys is preserved.

7.2.e Where public or white-box keys are not unique per software instantiation, the assessor shall examine evidence to confirm that methods and procedures to revoke and/or replace such keys (or key pairs) exist.

R1 Indicate whether the software relies upon public or white-box keys that are not unique to each software instance.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that methods are implemented to enable all such keys to be revoked and/or replaced with a unique key per instance.

7.2.f Where the software relies upon external files or other data elements for key material, such as for public TLS certificates, the assessor shall examine …
Removed p. 81
R1 Indicate whether the software uses or relies upon public keys that are manually loaded or used as root keys.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms these keys are installed and stored in a manner that provides for dual control, or that protects the keys from unauthorized substitution or modification where dual control is infeasible.
Removed p. 81
In Place Not in Place N/A 7.3.a The assessor shall examine evidence and test the software to identify all random number generators used by the software and to confirm that all random number generation methods:

• Use at least 128 bits of entropy prior to the output of any random numbers.

• Ensure it is not possible for the system to provide or produce reduced entropy upon start-up or entry of other predictable states of the system.

R1 Identify the evidence obtained that details all locations within the software where random numbers are required.

R2 Describe what the assessor observed in the evidence obtained that confirms that all random number generation methods implemented use at least 128 bits of entropy prior to the output of any random numbers from the random number generator.

R3 Describe what the assessor observed in the evidence obtained that confirms that sufficient entropy (at least 128 bits) is always provided …
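A minimal sketch of the expectation in 7.3.a: drawing random values from the operating system CSPRNG rather than a hand-rolled or time-seeded PRNG. The function name is illustrative.

```python
import secrets

def fresh_iv() -> bytes:
    # secrets reads from the OS CSPRNG, which is seeded before any
    # output is produced, avoiding the reduced-entropy start-up states
    # that a time-seeded user-space PRNG could exhibit.
    return secrets.token_bytes(16)  # 128-bit random value
```
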
Removed p. 82
R1 Indicate whether the software relies upon third-party software, platforms, or libraries for all or part of the random number generation process.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that the third-party software, platforms, or libraries do not contain or otherwise expose any known vulnerabilities that would compromise their ability to securely generate random numbers.

7.3.c Where the software vendor relies on a previous assessment of the random number generator or source of initial entropy, the assessor shall examine evidence (such as the approval records of the previous assessment) to confirm that this scheme and specific approval include the correct areas of the software in the scope of its assessment, and that the vendor claims do not exceed the scope of the evaluation or approval of that software. For example, some cryptographic implementations approved under FIPS 140-2 or 140-3 require seeding …
Removed p. 83
Note: The assessor can use the NIST Statistical Test Suite to identify statistical correlation in the random number generation implementation.

R1 Indicate whether the software relies upon any random number generators that have NOT been previously assessed to ensure they comply with industry-accepted standards for random number generation.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms that there is a lack of statistical correlation in the output from these random number generators.
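The statistical-correlation check referenced in the note can be illustrated with the simplest test from the NIST suite, the frequency (monobit) test. This is a sketch of one test only, not a substitute for the full NIST Statistical Test Suite.

```python
import math

def monobit_pass(sample: bytes, alpha: float = 0.01) -> bool:
    # Frequency (monobit) test from NIST SP 800-22: a strongly
    # unbalanced count of 0 and 1 bits indicates statistical bias
    # in the generator's output.
    n = len(sample) * 8
    ones = sum(bin(byte).count("1") for byte in sample)
    s_obs = abs(2 * ones - n) / math.sqrt(n)
    p_value = math.erfc(s_obs / math.sqrt(2))
    return p_value >= alpha  # True: no bias detected at level alpha
```

A perfectly balanced bit stream passes; an all-zero stream fails, as expected.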
Removed p. 83
In Place Not in Place N/A 7.4.a The assessor shall examine evidence and test the software to confirm that the methods used for the generation of all cryptographic keys and other material (such as IVs, “k” values for digital signatures, and so on) have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.

R1 Describe what the assessor observed in the evidence obtained that confirms that all methods used to generate cryptographic keys and other material have entropy that meets the minimum effective strength requirements of the cryptographic primitives and keys.
Removed p. 84
• Methods used for generating keys directly from a password/passphrase enforce an input domain that is able to provide sufficient entropy, such that the total possible inputs are at least equal to that of the equivalent bit strength of the key being generated (for example, a 32-hex-digit input field for an AES128 key).

• Passphrases are passed through an industry-standard key-derivation function, such as PBKDF2 or bcrypt, which extends the work factor for any attempt to brute-force a passphrase value. The assessor shall confirm that a work factor of at least 10,000 is applied to any such implementation.

• Guidance is provided to stakeholders in accordance with Control Objective 12.1 that includes instructions that any passphrase used must:

- Be randomly generated itself using a valid and secure random process, and that an online random number generator must not be used for this purpose.

- Not be implemented by a single person, …
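The PBKDF2 requirement in the bullets above can be sketched with the standard library; the iteration count shown is well above the 10,000-iteration floor, and the function name is illustrative.

```python
import hashlib

def key_from_passphrase(passphrase: str, salt: bytes) -> bytes:
    # PBKDF2-HMAC-SHA-256 with a work factor far above the 10,000
    # floor cited in the test requirement; dklen=16 yields a 128-bit
    # key (e.g., for AES-128). The salt should come from a CSPRNG.
    return hashlib.pbkdf2_hmac("sha256", passphrase.encode(), salt, 600_000, dklen=16)
```

The derivation is deterministic for a given passphrase and salt, while a different salt yields an unrelated key.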
Removed p. 85
In Place Not in Place N/A 8.1 All access attempts and usage of critical assets are tracked and traceable to a unique user.

Note: This Secure Software Standard recognizes that some execution environments cannot support the detailed logging requirements in other PCI standards. Therefore, the term “activity tracking” is used here to differentiate the expectations of this standard with regards to logging from similar requirements in other PCI standards.

In Place Not in Place N/A 8.1 The assessor shall examine evidence and test the software to confirm that all access attempts and usage of critical assets are tracked and traceable to a unique individual, system, or entity.

R1 Identify the evidence obtained that details the mechanisms implemented to track all user interactions with the software involving critical assets.

R2 Describe the different types of user accounts supported by the software (e.g., individual-level, system-level, entity-level, etc.) and the types of software assets each type of …
Removed p. 86
In Place Not in Place N/A 8.2.a The assessor shall examine evidence and test the software to confirm that the tracking method(s) implemented capture specific activity performed, including:

• Enablement of any privileged modes of operation.

• Exporting of sensitive data to other systems or processes.

• Failed authentication attempts.

• Disabling or deleting a security control or altering security functions.

R1 Identify the evidence obtained that details the specific activities involving critical assets that are captured by activity tracking mechanisms and confirms that the activities specified in this test requirement are covered.

R2 Indicate whether there are any limitations that complicate the software’s ability to capture the activities specified in this test requirement.

R3 If R2 is “Yes,” then describe the technical constraints that complicate the software’s ability to capture the activities specified in this test requirement.

8.2.b The assessor shall examine evidence and test the software to confirm that the tracking method(s) implemented provide the following:

• …
Modified p. 86 → 27
Decryption of sensitive data.
The sensitive data involved.
Modified p. 86 → 66
Disabling of encryption of sensitive data.
Attempting to find unexpected ways to gain access to sensitive data.
Removed p. 87
R1 Describe how the assessor confirmed that clear-text confidential data (i.e., sensitive data with confidentiality protection requirements) is not directly recorded in the output from activity tracking mechanisms.
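One common way software keeps clear-text confidential data out of tracking output is to scrub messages before any handler writes them. This sketch uses Python's `logging` filter hook; the class name and the PAN-like regex are illustrative assumptions.

```python
import logging
import re

PAN_RE = re.compile(r"\b\d{13,19}\b")  # PAN-like digit runs (illustrative)

class RedactingFilter(logging.Filter):
    # Rewrites the message before any handler emits it, so clear-text
    # confidential data never reaches the tracking records.
    def filter(self, record: logging.LogRecord) -> bool:
        record.msg = PAN_RE.sub("[REDACTED]", str(record.msg))
        return True
```
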
Removed p. 87
R1 Indicate whether the software maintains its own activity tracking records (even if only temporarily).

R2 If R1 is “Yes,” then describe the methods implemented by the software to ensure the completeness, accuracy, and integrity of its activity tracking records.

8.3.b Where the software utilizes external or third-party systems for the maintenance of tracking data, such as a log server, the assessor shall examine evidence to confirm that guidance on the correct and complete setup and/or integration of the software with the external or third-party system(s) is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether the software relies upon or supports the use of external and/or third-party activity tracking mechanisms.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the software to use these activity tracking mechanisms in a secure manner.
Removed p. 88
R1 Describe what the assessor observed in the evidence obtained that confirms that the integrity of activity tracking data and records is always maintained.
Removed p. 88
In Place Not in Place N/A 8.4.a The assessor shall examine evidence and test the software to confirm that the failure of the activity-tracking mechanism(s) does not violate the integrity of existing records by confirming that:

• The software does not overwrite existing tracking data upon a restart of the software. Each new start shall only append to existing datasets or shall create a new tracking dataset.

• Where unique dataset names are relied upon for maintaining integrity between execution instances, the implementation ensures that other software (including another instance of the same software) cannot overwrite or render invalid existing datasets.

R1 Describe the protection methods implemented to prevent existing activity tracking records and data from being overwritten or corrupted when activity tracking mechanisms fail.

• The software applies, where possible, suitable file privileges to assist with maintaining the integrity of the tracking dataset (such as applying an append-only access control to a …
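The append-only behavior described in 8.4.a can be sketched with POSIX open flags: `O_APPEND` forces each write to the end of the file, and `O_CREAT` without `O_TRUNC` creates a new dataset without ever truncating an existing one. The function name is illustrative.

```python
import os

def append_tracking_record(path: str, line: str) -> None:
    # O_APPEND makes every write land at end-of-file, and O_CREAT
    # without O_TRUNC creates the dataset if absent but never
    # overwrites existing records on a software restart.
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_APPEND, 0o600)
    try:
        os.write(fd, (line + "\n").encode())
    finally:
        os.close(fd)
```
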
Removed p. 91
In Place Not in Place N/A 9.1 The software detects and alerts upon detection of anomalous behavior, such as changes in post-deployment configurations or obvious attack behavior.

In Place Not in Place N/A 9.1.a The assessor shall examine evidence and test the software to confirm that methods are implemented to validate the integrity of software executables and any configuration options, files, and datasets that the software relies upon for operation such that unauthorized post-deployment changes are detected.

Where the execution environment prevents this, the assessor shall examine evidence (including publicly available literature on the platform and associated technologies) to confirm that there are indeed no methods for validating authenticity, and that additional security controls are implemented to minimize the associated risk.

R1 Describe the methods that are implemented or relied upon to validate the integrity of the software’s execution and configuration files.

R2 Indicate whether there are any constraints that complicate the implementation …
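The integrity-validation method in 9.1.a is commonly implemented by comparing current file digests against a protected baseline. This sketch assumes a hypothetical JSON manifest mapping each file path to its expected SHA-256 hex digest; the manifest format and function name are not from the standard.

```python
import hashlib
import json
from pathlib import Path

def changed_files(manifest_path: str) -> list[str]:
    # The manifest (hypothetical format: JSON of file path -> expected
    # SHA-256 hex digest) is the post-deployment baseline; any mismatch
    # flags an unauthorized change to executables or configuration.
    manifest = json.loads(Path(manifest_path).read_text())
    return [
        path
        for path, expected in manifest.items()
        if hashlib.sha256(Path(path).read_bytes()).hexdigest() != expected
    ]
```

In practice the manifest itself must be integrity-protected (for example, signed), or the check can be trivially bypassed.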
Removed p. 93
R1 Indicate whether the software relies on third-party tools or services to provide attack detection capabilities.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure the software to use the third-party tools or services in a manner that meets applicable security requirements within this standard.
Removed p. 94
In Place Not in Place N/A 10.1 Software threats and vulnerabilities are identified, assessed, and addressed. In Place Not in Place N/A 10.1.a Using information obtained in Test Requirement 4.1.a, the assessor shall examine evidence to confirm that common attack methods against the software are identified. This may include platform-level, protocol-level, and/or language-level attacks.

R1 Describe the methods used by the software vendor to identify and/or detect vulnerabilities in the software that could be exploited by an attacker.

R2 Identify the evidence obtained that details the software vendor’s most recent analysis of potential vulnerabilities within the software.

R3 Describe any other assessment activities performed and/or findings from this test requirement.

10.1.b The assessor shall examine evidence to confirm that the identified attacks are valid for the software and shall note where this does not include common attack methods detailed in industry-standard references such as OWASP and CWE lists.

R1 Describe the methods used by …
Removed p. 95
R1 Describe the methods used and the frequency with which testing is performed to ensure new and evolving vulnerabilities are detected and that the existing mitigations remain effective.

10.2.b The assessor shall examine evidence including documented testing processes and output of several instances of the testing to confirm that the testing process:

• Includes, at a minimum, the use of automated tools capable of detecting vulnerabilities both in software code and during software execution.

• Includes the use of security testing tools that are suitable for the software architecture, development languages, and frameworks used in the development of the software.

• Accounts for the entire code base and detects vulnerabilities in third-party, open-source, or shared components and libraries.

• Accounts for common vulnerabilities and attack methods.

• Demonstrates a history of finding software vulnerabilities and remediating them prior to software release.

R1 Describe how the software vendor uses automated tools to detect vulnerabilities in both the …
Removed p. 96
• An industry-standard vulnerability-ranking system (such as CVSS) is used to classify/categorize vulnerabilities.

• A remediation plan is maintained for all detected vulnerabilities that ensures vulnerabilities do not remain unmitigated for an indefinite period.

R1 Describe the software vendor’s methodology for classifying and/or categorizing software vulnerabilities.

R2 Describe how the software vendor ensures that known vulnerabilities do not remain unmitigated indefinitely.
Removed p. 97
In Place Not in Place N/A 11.1 Software updates to fix known vulnerabilities are made available to stakeholders in a timely manner. In Place Not in Place N/A 11.1.a The assessor shall examine evidence to confirm that:

• Reasonable criteria are defined for releasing software updates to fix security vulnerabilities.

• Security updates are made available to stakeholders in accordance with the defined criteria.

R1 Describe the software vendor’s process for determining when a software update is required to address security vulnerabilities, and how this relates to the software vendor’s methodology for classifying and/or categorizing software vulnerabilities.

R2 Describe the methods used and the frequency (if applicable) with which the software vendor makes security updates available to stakeholders.

11.1.b The assessor shall examine evidence, including update-specific security-testing results and details, to confirm that security updates are made available to stakeholders in accordance with the defined criteria. Where updates are not provided in accordance with the …
Removed p. 98
In Place Not in Place N/A 11.2.a The assessor shall examine evidence to confirm that the method(s) by which the vendor releases software updates maintain the integrity of the software code during transmission and installation.

R1 Describe the methods used by the software vendor to ensure that the integrity of the software code is maintained throughout distribution and installation of software updates.

11.2.b Where user input or interaction is required to validate the integrity of the software code, the assessor shall examine evidence to confirm that guidance on this process is provided to stakeholders in accordance with Control Objective 12.1.

R1 Indicate whether the software requires user input or interaction to validate the integrity of software updates prior to implementation.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to validate the integrity of software updates.

11.2.c Where the integrity method implemented is not cryptographically secure, …
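A cryptographically secure integrity check for a software update, as contemplated by 11.2.a–c, can be sketched as a digest comparison against a vendor-published value obtained over a trusted channel. The function name is illustrative; real distribution schemes typically use digital signatures rather than bare digests.

```python
import hashlib
import hmac
from pathlib import Path

def update_digest_ok(update_path: str, published_hex: str) -> bool:
    # Compare the downloaded package against a vendor-published SHA-256
    # digest obtained over a trusted channel; compare_digest avoids
    # timing side channels in the comparison itself.
    actual = hashlib.sha256(Path(update_path).read_bytes()).hexdigest()
    return hmac.compare_digest(actual, published_hex)
```
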
Removed p. 99
R1 Describe the methods used by the software vendor to notify stakeholders of the availability of software updates.

R2 Describe what the assessor observed in the evidence obtained that confirms guidance is provided to stakeholders on how to implement software updates.

11.2.e The assessor shall examine evidence to confirm that stakeholders are notified when known vulnerabilities are detected in software that has not yet been updated with a fix. This includes vulnerabilities that may exist in third-party software and libraries used by the software. The assessor shall confirm that this process includes providing the stakeholders with suggested mitigations for any such vulnerabilities.

R1 Describe the software vendor’s criteria and process for notifying stakeholders of known vulnerabilities in the assessed software for which a fix is not yet available.

R2 Describe what the assessor observed in the evidence obtained that confirms the software vendor provides stakeholders with suggested mitigations to address known vulnerabilities in the …
Removed p. 100
In Place Not in Place N/A 12.1 The software vendor provides stakeholders with clear and thorough guidance on the secure implementation, configuration, and operation of its payment software.

In Place Not in Place N/A 12.1.a The assessor shall examine evidence to confirm that the vendor creates and provides stakeholders with clear and sufficient guidance to allow for the secure installation, configuration, and use of the software.

R1 Identify the evidence obtained that details the software vendor’s guidance for stakeholders on how to install and/or configure the security features of the software.

12.1.b The assessor shall examine evidence to confirm that the guidance:

• Includes details on how to securely and correctly install any third-party software that is required for the operation of the vendor software.

• Provides instructions on the correct configuration of the platform(s) on which the software is to be executed, including setting security parameters and installation of any data elements (such as certificates).

• …
Removed p. 102
In Place Not in Place N/A A.1.1 The software does not store sensitive authentication data after authorization (even if encrypted) unless the software is intended only for use by issuers or organizations that support issuing services.

In Place Not in Place N/A A.1.1 Using information obtained in Test Requirement 1.1.a in the Core Requirements section, the assessor shall examine evidence and test the software to identify all potential storage locations for Sensitive Authentication Data, and to confirm that the software does not store such data after transaction authorization is complete. This includes storage of SAD in temporary storage (such as volatile memory), semi-permanent storage (such as RAM disks), and non-volatile storage (such as magnetic and flash storage media).

Where Sensitive Authentication Data is stored after authorization, the assessor shall examine evidence to confirm that the software is designed explicitly for issuing purposes or for use by issuers or organizations that support issuing …
Removed p. 103
In Place Not in Place N/A A.2.1 The software vendor provides guidance to stakeholders regarding secure deletion of cardholder data after expiration of defined retention period(s).

In Place Not in Place N/A A.2.1 The assessor shall examine evidence to confirm that guidance is provided to stakeholders in accordance with Control Objective 12.1 that details:

• All locations where the software stores cardholder data.

• How to securely delete cardholder data stored by the payment software, including cardholder data stored on underlying software or systems (such as in OS files or in databases).

• How to configure the underlying software or systems to prevent the inadvertent capture or retention of cardholder data (for example, by system backup or restore points).

R1 Identify the evidence obtained that details the software vendor’s guidance on handling cardholder data.

R2 Describe what the assessor observed in the evidence obtained that confirms all locations where sensitive data is stored in the assessed …
Removed p. 104
In Place Not in Place N/A A.2.2.a The assessor shall examine evidence to confirm that the software provides features that enable responsible parties to restrict or otherwise mask the display of PAN to the minimum number of digits required to meet a defined business need.

R1 Describe the options available within the software to restrict the display of PAN.

A.2.2.b The assessor shall examine evidence to confirm that all displays of PAN are completely masked by default, and that explicit authorization is required to display any digits of the PAN.

R1 Describe the default masking settings for all PAN displays within the software.

R2 Describe the process to enable and authorize the display of PAN for users.

A.2.2.c Where user input or interaction is required to configure PAN masking features and options, the assessor shall examine evidence to confirm that guidance on how to configure these features/options is provided to stakeholders in accordance with Control …
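The masking behavior required by A.2.2.a–b (fully masked by default, digits shown only on explicit authorization) can be sketched as follows; the function name and the first-six/last-four reveal format are illustrative assumptions.

```python
def display_pan(pan: str, authorized: bool = False) -> str:
    # Default behavior masks every digit; only an explicitly
    # authorized caller sees the customary first-six/last-four digits.
    if not authorized:
        return "*" * len(pan)
    return pan[:6] + "*" * (len(pan) - 10) + pan[-4:]
```
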
Removed p. 105
• Truncation (hashing cannot be used to replace the truncated segment of PAN).

• Index tokens and pads (pads must be securely stored).

• Strong cryptography with associated key-management processes and procedures.

In Place Not in Place N/A A.2.3.a The assessor shall examine evidence and test the software to confirm that methods are implemented to render PAN unreadable anywhere it is stored using the following methods:

• Index tokens and pads, with the pads being securely stored.

• Strong cryptography, with associated key- management processes and procedures.

Note: The assessor should examine several tables, files, log files, and any other resources created or generated by the software to verify the PAN is rendered unreadable.

R1 Describe the methods relied upon by the software to render PAN unreadable anywhere it is stored.

R2 Describe what the assessor observed in the evidence obtained that confirms hashing is not used to render PAN unreadable.

R6 Describe any other assessment activities performed and/or …
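Of the A.2.3 methods listed above, index tokens can be sketched as a random token that stands in for the stored PAN, with the PAN itself held only in a separately secured vault. The class is purely illustrative; the in-memory dict stands in for a vault that in practice would itself be protected with strong cryptography and access controls.

```python
import secrets

class TokenVault:
    # Index-token sketch: stored records keep only a random token;
    # the PAN lives solely inside a separately secured vault, modeled
    # here by an in-memory dict for illustration only.
    def __init__(self) -> None:
        self._vault: dict[str, str] = {}

    def tokenize(self, pan: str) -> str:
        token = secrets.token_urlsafe(16)  # random, not derived from the PAN
        self._vault[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        return self._vault[token]
```

Because the token is random rather than derived from the PAN, it reveals nothing about the PAN, unlike a hash, which A.2.3 explicitly disallows for this purpose.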
Removed p. 106
• Details of any configurable options for each method used to render cardholder data unreadable, and instructions on how to configure each method for all locations where cardholder data is stored.

• A list of all instances where cardholder data may be output for storage outside of the payment application, and instructions that the implementing entity is responsible for rendering the PAN unreadable in all such instances.

• Instruction that if debugging logs are ever enabled (for troubleshooting purposes) and they contain PAN, they must be protected, that debugging must be disabled as soon as troubleshooting is complete, and that debugging logs must be securely deleted when no longer needed.

R1 Indicate whether user input or interaction is required to configure methods to render PAN unreadable where stored.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on rendering PAN unreadable where stored.

R3 If R1 is “Yes,” …
Removed p. 108
In Place Not in Place N/A B.1.1 The software vendor maintains documentation that describes all software components, interfaces, and services provided or used by the software.

In Place Not in Place N/A B.1.1 The assessor shall examine evidence to confirm that documentation is maintained that describes the software’s overall design and function including, but not limited to, the following:

• All third-party and open-source components, external services, and Application Programming Interfaces (APIs) used by the software.

• All User Interfaces (UI) and APIs provided or made accessible by the software.

R1 Identify the evidence obtained that details the software’s overall design and function.

R2 Describe what the assessor observed that confirms the software design documentation covers all third-party and open-source components, external services, and APIs used by the software.

R3 Describe what the assessor observed that confirms the software design documentation covers all interfaces and APIs provided or made accessible by the software.
Removed p. 109
Note: This control objective is an extension of Control Objectives 1.1 and 1.2. Validation of these control objectives should be performed at the same time.

In Place Not in Place N/A B.1.2.a The assessor shall examine evidence to confirm that documentation is maintained that describes all sensitive data flows including, but not limited to, the following:

• All sensitive data stored, processed, or transmitted by the software.

• All locations where sensitive data is stored, including both temporary and persistent storage locations.

• How sensitive data is securely deleted from storage (both temporary and persistent) when no longer needed.

R1 Identify the evidence obtained that details all data flows involving sensitive data.

R2 Describe what the assessor observed in the evidence obtained that confirms it details all sensitive data stored, processed, and transmitted by the software.

R3 Describe what the assessor observed in the evidence obtained that confirms it details all locations where sensitive data is stored, …
Removed p. 110
• All inputs, outputs, and possible error conditions for each function that handles sensitive data.

• All cryptographic algorithms, modes of operation, and associated key management practices for all functions that employ cryptography for the protection of sensitive data.

R1 Identify the evidence obtained that details all software functions that handle sensitive data.

R2 Describe what the assessor observed in the evidence obtained that confirms it details all inputs, outputs, and possible error conditions for each function that handles sensitive data.

R3 Describe what the assessor observed in the evidence obtained that confirms it details all cryptographic algorithms, modes of operation, and associated key management practices for all functions that employ cryptography for the protection of sensitive data.
Removed p. 111
In Place Not in Place N/A B.1.3 The assessor shall examine evidence to confirm that documentation is maintained that describes all configurable options provided or made available by the software that can impact the security of sensitive data including, but not limited to, the following:

• All configurable options that could allow access to sensitive data.

• All configurable options that could enable modification of any mechanisms used to protect sensitive data.

• All remote access features, functions, and parameters provided or made available by the software.

• All remote update features, functions, and parameters provided or made available by the software.

• The default settings for each configurable option.

R1 Identify the evidence obtained that details all configurable options available that can impact the security of sensitive data.

R2 Describe what the assessor observed in the evidence obtained that confirms it details all configurable options that facilitate access to sensitive data.

R3 Describe what the assessor observed …
Removed p. 112
In Place Not in Place N/A B.2.1 The software is intended for deployment and operation on payment terminals (PCI-approved POI devices).

In Place Not in Place N/A B.2.1 The assessor shall examine evidence to determine the payment terminals upon which the software is to be deployed. For each of the payment terminals identified and included in the software assessment, the assessor shall examine the payment terminal’s device characteristics and compare them with the following characteristics specified in the PCI SSC’s List of Approved PTS Devices to confirm they match:

• Firmware version number(s)

R1 Identify the evidence obtained that details the PCI PTS POI devices supported by the software.

R2 Describe what the assessor observed in the evidence obtained that confirms the devices included in the software assessment match the characteristics of those same devices on the PCI SSC’s List of Approved PTS Devices.

B.2.2 The software uses only the external communication methods included …
Modified p. 112 → 87
• PTS approval number
• PTS approval number(s)
Modified p. 112 → 87
• Hardware version number
• Hardware Version Number(s)
Removed p. 113
B.2.2.b Where the software supports external communications, the assessor shall examine all relevant payment terminal documentation (including the payment terminal vendor’s security guidance/policy) to determine which external communication methods were included in the payment terminal’s PTS device evaluation.

R1 Identify the evidence obtained that details the terminal vendor’s security guidance or policy for the PCI PTS POI devices included in the software assessment.

R2 Identify the communication methods included in the PTS device evaluation for each of the PCI PTS POI devices included in the software assessment.

B.2.2.c The assessor shall examine evidence (including source code) to confirm that the software uses only the external communication methods included in the payment terminal’s PTS device evaluation and does not implement its own external communication methods or IP stack.

R1 Describe what the assessor observed in the evidence obtained that confirms the software uses only the external communication methods included in the PTS device evaluations for …
Removed p. 115
In Place Not in Place N/A B.2.3.a The assessor shall examine evidence (including source code) to determine whether the software provides encryption of sensitive data. Where the software does provide such a function, the assessor shall confirm the software does not bypass or render ineffective any encryption methods or account data security methods implemented by the payment terminal as follows:

R1 Indicate whether the software provides its own methods to facilitate the encryption of sensitive data.

R2 If R1 is “Yes,” then describe the methods provided by the software to facilitate sensitive data encryption.

B.2.3.b The assessor shall examine all relevant payment terminal documentation (including payment terminal vendor security guidance/policy) to determine which encryption methods are provided by the payment terminal.

R1 Indicate whether the PCI PTS POI devices included in the software assessment provide methods to facilitate the encryption of sensitive data.

R2 If R1 is “Yes,” then describe the encryption methods provided by …
Removed p. 116
R1 Indicate whether the device approvals for the PCI PTS POI devices included in the software assessment require that their own encryption methods be used.

R2 If R1 is “No,” then describe what the assessor observed in the evidence obtained that confirms the methods provided by the software to encrypt sensitive data provide for strong cryptography.

B.2.4 The software uses only the random number generation function(s) included in the payment terminal’s PTS device evaluation for all cryptographic operations involving sensitive data or sensitive functions where random values are required and does not implement its own random number generation function(s).

In Place Not in Place N/A B.2.4.a The assessor shall examine evidence (including source code) to determine whether the software requires random values to be generated for any cryptographic operations involving sensitive data or sensitive functions.

R1 Indicate whether the software relies on random values to be generated for cryptographic operations involving sensitive data or …
Removed p. 117
R1 Describe what the assessor observed in the evidence obtained that confirms the software uses only those random number generation functions included in the device approvals for the PCI PTS POI devices included in the software assessment.

B.2.5 The software does not provide, through its own logical interface(s), the sharing of clear-text account data directly with other software.

Note: The software is allowed to share clear-text account data directly with the payment terminal’s firmware.

In Place Not in Place N/A B.2.5.a The assessor shall examine evidence (including source code) to determine all logical interfaces of the software, including:

• All logical interfaces and the purpose and function of each.

• The logical interfaces intended for sharing clear-text account data, such as those used to pass clear-text account data back to the approved firmware of the payment terminal.

• The logical interfaces not intended for sharing of clear-text account data, such as those for communication with other …
Removed p. 118
R1 Describe each of the tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that the software does not allow the sharing of clear-text account data directly with other software through its own logical interfaces.
Removed p. 119
In Place Not in Place N/A B.2.6.a The assessor shall examine evidence (including source code) to determine whether and how the software connects to and/or uses any shared resources provided by the payment terminal, and to confirm that:

• The guidance required in Control Objectives 12.1 and B.5.1 includes detailed instructions for how to configure the software to ensure secure integration with shared resources.

• The required guidance for secure integration with shared resources is in accordance with the payment terminal vendor’s security guidance/policy.

R1 Indicate whether the software relies on any shared resources provided by the PCI PTS POI devices that are included in the software evaluation.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance for how to configure the software to securely integrate with the shared resources.

B.2.6.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives …
Modified p. 119 → 92
R3 If R1 is “Yes,” describe what the assessor observed in the evidence obtained that confirms the software integrates the shared resources securely in accordance with the applicable PCI PTS POI device guidance/policy.
<Assessor Response> B2-6 The software facilitates the management or use of shared platform resources in a secure manner and in accordance with applicable POI device guidance.
Removed p. 120
In Place Not in Place N/A B.2.7.a The assessor shall examine all relevant payment terminal documentation (including the payment terminal vendor’s security guidance/policy) to determine whether and how application segregation is enforced by the payment terminal.

R1 Identify the evidence obtained to support this test requirement.

B.2.7.b The assessor shall examine evidence (including source code) to confirm that the software does not introduce any function(s) that would allow it to bypass or defeat any device-level application segregation controls.

R1 Indicate whether any of the PCI PTS POI devices included in the software assessment provide for or enforce application segregation within the device.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms the software adheres to all application segregation provided or enforced by applicable PCI PTS POI devices.

B.2.8 All software files are cryptographically signed to enable cryptographic authentication of the software files by the payment terminal …
Removed p. 121
R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that all software files can be cryptographically signed in a manner that supports the cryptographic authentication of those files by the applicable PCI PTS POI devices.

R2 Identify the evidence obtained that confirms that all software files can be cryptographically signed in a manner that supports the cryptographic authentication of those files by the applicable PCI PTS POI devices.

B.2.8.c Where the software supports the loading of files outside of the base software package(s), the assessor shall examine evidence and test the software to determine whether each of those files is cryptographically signed in a manner that enables the cryptographic authentication of those files by the payment terminal. For any files that cannot be cryptographically signed, the assessor shall justify why the inability to cryptographically sign such files does not …
Removed p. 122
R1 Indicate whether the software supports EMV® payment transactions.

R2 If R1 is “Yes,” then identify evidence obtained that demonstrates that all EMV Certification Authority Public Keys can be cryptographically signed in a manner that enables the cryptographic authentication of those files by applicable PCI PTS POI devices.

B.2.9 The integrity of software prompt files is protected in accordance with Control Objective B.2.8. In Place Not in Place N/A B.2.9.a The assessor shall examine evidence (including source code) to determine whether the software supports the use of data entry prompts and/or prompt files. Where the software supports such features, the assessor shall confirm the software protects the integrity of those prompts as defined in Test Requirements B.2.9.b through B.2.9.c.

R1 Indicate whether the software supports the use of data entry prompts or prompt files.

R2 Identify the evidence obtained to support these findings.

B.2.9.b The assessor shall examine the guidance required in Control Objectives 12.1 …
Removed p. 123
R1 If applicable, describe each of the software tests performed, including the tool(s) and/or method(s) used and the scope of each test, to confirm that all prompt files are cryptographically signed in a manner that enables the cryptographic authentication of those files by the payment terminal in accordance with B.2.8.

R2 Identify the evidence obtained that confirms that all prompt files are cryptographically signed in a manner that enables the cryptographic authentication of those files by the payment terminal in accordance with B.2.8.
Removed p. 124
In Place Not in Place N/A B.3.1 The software validates all user and other external inputs.

Note: Control Objectives B.3.1 through B.3.3 are extensions of Control Objective 4.2. Validation of these control objectives should be performed at the same time.

In Place Not in Place N/A B.3.1.a The assessor shall examine evidence (including source code) to identify all locations where the software accepts input data from untrusted sources. For each instance, the assessor shall confirm that input data is required to conform to a list of expected characteristics and that all input that does not conform to the list of expected characteristics is rejected by the software or otherwise handled securely.

R1 Identify the evidence obtained that details all locations within the software where input data from external or untrusted sources is accepted.

R2 Describe the method(s) used or relied upon by the software to ensure that input data conforms to a set of …
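The allowlist approach that B.3.1 describes — requiring input to conform to a list of expected characteristics and rejecting everything else — can be sketched as follows. This is an illustrative example only; the field names, patterns, and length limits are hypothetical, not taken from the standard.

```python
import re

# Hypothetical allowlist: each input field maps to its expected
# characteristics (an anchored pattern and a maximum length).
EXPECTED = {
    "invoice_id": (re.compile(r"^[A-Z0-9]{1,12}$"), 12),
    "amount":     (re.compile(r"^\d{1,7}(\.\d{2})?$"), 10),
}

def validate(field: str, value: str) -> str:
    """Return the value only if it conforms to the expected
    characteristics; otherwise reject it so the caller can handle
    the failure securely."""
    pattern, max_len = EXPECTED[field]
    if len(value) > max_len or not pattern.fullmatch(value):
        raise ValueError(f"input for {field!r} rejected")
    return value
```

The key design point the test requirement looks for is that conformance is defined positively (what is allowed) rather than by filtering known-bad values.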
Removed p. 125
R1 Identify the evidence obtained that details all locations within the software where string values from external or untrusted sources are accepted as inputs.

R2 Describe the method(s) used or relied upon by the software to prevent input data from external or untrusted sources from being interpreted as a command.

B.3.1.1.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods, the assessor shall test the software by attempting to supply each of the identified functions with data that includes commands to confirm that the software either rejects such inputs or otherwise handles such inputs securely.

R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm the findings in Test Requirement B.3.1.1.a.

R2 Identify the evidence obtained that demonstrates that the …
Removed p. 126
In Place Not in Place N/A B.3.1.2.a The assessor shall examine evidence (including source code) to identify all software functions that handle buffers and process data supplied from untrusted sources. For each of the noted functions, the assessor shall confirm that each of the identified functions:

• Uses only unsigned variables to define buffer sizes.

• Conducts checks to confirm that buffers are sized appropriately for the data they are intended to handle, including consideration for underflows and overflows.

• Rejects or otherwise securely handles any inputs that violate buffer size or other memory allocation thresholds.

R1 Identify the evidence obtained that details all the software functions that handle buffers and accept data from external or untrusted sources.

R2 Identify the evidence obtained that demonstrates that only unsigned variables are used to define buffer sizes.

R3 Describe how the software ensures that buffers are sized appropriately for the data they are intended to store.

R4 Describe how …
Removed p. 127
• Checks return values for the presence of sensitive data.

• Processes the return values in a way that does not inadvertently “leak” sensitive data.

R1 Describe the methods used by the software to check return values for the presence of sensitive data.

R2 Describe the protection methods implemented to ensure return values are processed in a way that does not inadvertently “leak” sensitive data.

B.3.2.b The assessor shall install and configure the software in accordance with the guidance required in Control Objectives 12.1 and B.5.1. Using an appropriate “test platform” and suitable forensic tools and/or methods, the assessor shall test each software function that handles sensitive data by attempting to manipulate the software in a manner that generates an unhandled exception to confirm that error conditions do not expose sensitive data.

R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the …
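B.3.2 requires that error conditions and return values not leak sensitive data. One common pattern — masking sensitive values before logging and suppressing the original exception chain so it cannot surface clear-text data — is sketched below. The function names and the downstream `_authorize` call are hypothetical stand-ins.

```python
import logging

log = logging.getLogger("payments")

def mask(pan: str) -> str:
    """Keep only the last four digits of a PAN for log output."""
    return "*" * (len(pan) - 4) + pan[-4:]

def charge(pan: str, amount: str) -> None:
    try:
        _authorize(pan, amount)   # hypothetical downstream call
    except Exception:
        # The original exception may embed the clear-text PAN, so log a
        # masked form and raise a generic error with the chain suppressed
        # ("from None") so the sensitive message cannot propagate.
        log.error("authorization failed for PAN %s", mask(pan))
        raise RuntimeError("authorization failed") from None

def _authorize(pan, amount):
    # Simulates a failure whose message would leak the PAN if propagated.
    raise ConnectionError(f"host rejected {pan}")
```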
Removed p. 129
In Place Not in Place N/A B.4.1 A documented process is maintained and followed for testing software for vulnerabilities prior to each update or release.

Note: This control objective is an extension of Control Objective 10.2. Validation of these control objectives should be performed at the same time.

In Place Not in Place N/A B.4.1.a The assessor shall examine evidence to confirm that the software vendor maintains a documented process in accordance with Control Objective 10.2 for testing the software for vulnerabilities prior to each update or release, and that the documented process includes detailed descriptions of how the vendor tests for the following:

• The presence or use of any unnecessary ports and protocols.

• The unintended storage, transmission, or output of any clear-text account data.

• The presence of any default user accounts with default or static access credentials.

• The presence of any hard-coded authentication credentials in code or in configuration files.

• The …
Removed p. 130
• The presence of any default user accounts with static access credentials.

R1 Identify the evidence obtained that confirms the findings for this test requirement.

Note: This control objective is an extension of Control Objective 12.1. Validation of these control objectives should be performed at the same time.
Removed p. 131
In Place Not in Place N/A B.5.1 The software vendor provides implementation guidance on how to implement and operate the software securely for the payment terminals on which it is to be deployed.

In Place Not in Place N/A B.5.1 The assessor shall examine evidence to confirm that guidance on how to securely implement and operate the software for all applicable payment terminals is provided to stakeholders in accordance with Control Objective 12.1.

R1 Identify the evidence obtained that details the software vendor’s guidance on the implementation and operation of the software for applicable payment terminals.

B.5.1.1 Implementation guidance includes detailed instructions for how to configure all available security options and parameters of the software.

In Place Not in Place N/A B.5.1.1 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions on how to configure all available security options and parameters of the software in accordance with Control Objective …
Removed p. 132
R1 Describe what the assessor observed in the evidence obtained that confirms vendor guidance includes instructions on configuring the software to use the security features and functions of applicable PCI PTS POI devices.

B.5.1.3 Implementation guidance includes detailed instructions for how to configure the software to securely integrate or use any shared resources provided by the payment terminal.

In Place Not in Place N/A B.5.1.3 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions on how to configure the software to securely integrate or use any shared resources provided by the payment terminal in accordance with Control Objective B.2.6.

R1 Describe what the assessor observed in the evidence obtained that confirms vendor guidance includes instructions on how to configure the software to use shared resources provided by applicable PCI PTS POI devices.

B.5.1.4 Implementation guidance includes detailed instructions on how to cryptographically sign …
Removed p. 133
In Place Not in Place N/A B.5.1.5 The assessor shall examine evidence to confirm that the required guidance includes detailed instructions for stakeholders to cryptographically sign all prompt files in accordance with Control Objective B.2.9.

R1 Describe what the assessor observed in the evidence obtained that confirms that vendor guidance includes instructions on how to cryptographically sign prompt files.

B.5.2 Implementation guidance adheres to payment terminal vendor guidance on the secure configuration of the payment terminal.

In Place Not in Place N/A B.5.2 The assessor shall examine evidence (including the payment terminal vendor’s security guidance/policy and the guidance required in Control Objective B.5.1) to confirm that the guidance aligns with the payment terminal vendor’s security guidance/policy.

R1 Describe what the assessor observed in the evidence obtained that confirms the software vendor’s guidance does not conflict with the payment terminal vendors’ security guidance for the PCI PTS POI devices included in the software assessment.
Removed p. 134
In Place Not in Place N/A C.1.1 All software components and services are documented or otherwise cataloged in a software bill of materials (SBOM).

In Place Not in Place N/A C.1.1 The assessor shall examine evidence to confirm that information is maintained that describes all software components and services comprising the software solution, including:

• All proprietary software libraries, packages, modules, and/or code packaged in a manner that enables them to be tracked as a freestanding unit of software.

• All third-party and open-source frameworks, libraries, and code embedded in or used by the software during operation.

• All third-party software dependencies, APIs, and services called by the software during operation.

R1 Identify the evidence obtained that details the assessed software’s bill of materials (SBOM).

C.1.2 The SBOM describes each of the primary components and services in use, as well as their secondary transitive component relationships and dependencies to the greatest extent feasible.

In Place Not in …
Removed p. 135
R1 Identify each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to validate the evidence obtained in Test Requirement C.1.2.a.

R2 Indicate whether software testing identified any components or services used during software operation that were not reflected in the SBOM.

R3 If R2 is “Yes,” then describe why the assessor considers it acceptable for these components or services to be excluded from the SBOM.

C.1.3 Where the software is provided “as a service,” the SBOM includes information describing the software dependencies present in the production software execution environment to the greatest extent feasible.

In Place Not in Place N/A C.1.3.a The assessor shall examine evidence to confirm that the SBOM describes all dependencies present in the production software execution environment that the software relies upon for operation or to satisfy security requirements in this standard.

R1 Indicate whether the software is provided "as a service".

R2 If …
Removed p. 136
R3 If R2 is “Yes,” then describe why the assessor considers it acceptable for these dependencies to be excluded from the SBOM.
Removed p. 137
In Place Not in Place N/A C.1.4.a The assessor shall examine evidence to confirm that information is maintained in the SBOM that describes the following for each component and service in use, including secondary component relationships and dependencies:

• The original source/supplier of the component or service.

• The name of the component or service as defined by the original supplier.

• A description of the relationship(s) between the component and service and other components/services embedded in or used by the software.

• The version of the component or service as defined by the original supplier to differentiate it from previous or other versions.

• The name of the author who designed/developed the component or service.

• Any other identifiers provided by the original supplier to uniquely identify the component or service.

R1 Describe the overall structure of the SBOM, the nomenclature and attributes used, and how the SBOM accounts for components and services.

C.1.4.b The assessor shall …
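The component attributes C.1.4 requires (original supplier, component name, author, version, relationships, and other unique identifiers) map directly onto established SBOM formats. The fragment below builds a minimal, hypothetical entry in a CycloneDX-like shape; the field names, the purl-style identifier, and all values are illustrative, not mandated by the standard.

```python
import json

# A minimal, hypothetical SBOM entry carrying the C.1.4 attributes.
sbom = {
    "components": [
        {
            "supplier": "Example Crypto Ltd.",          # original source/supplier
            "name": "libexamplecrypto",                  # supplier-defined name
            "version": "3.4.1",                          # supplier-defined version
            "author": "Example Crypto Ltd.",             # author of the component
            "purl": "pkg:generic/libexamplecrypto@3.4.1",  # unique identifier
            "dependsOn": ["pkg:generic/libexampletls@1.2.0"],  # relationships
        }
    ]
}
print(json.dumps(sbom, indent=2))
```

Using a machine-readable structure like this is what later enables C.1.6's vulnerability monitoring to be driven from the SBOM.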
Removed p. 138
R1 Describe the software vendor’s processes for generating SBOMs and how it ensures one is generated for each new software release.

C.1.6 Vulnerabilities in third-party components and services are monitored and managed in accordance with Control Objective 10.

In Place Not in Place N/A C.1.6.a The assessor shall examine evidence to confirm that third-party components and services present in and/or in use by the software are regularly monitored for vulnerabilities in accordance with Control Objective 10.1.

R1 Describe how the software vendor leverages the SBOM to monitor and manage vulnerabilities in third-party components and services.

C.1.6.b The assessor shall examine evidence to confirm that vulnerabilities in third-party components and services are identified and are patched or otherwise mitigated in a timely manner in accordance with Control Objective 10.2.

R1 Identify the evidence obtained that confirms that vulnerabilities in third-party components are patched or mitigated in a timely manner.

C.1.7 Where software components and/or resources are hosted …
Removed p. 140
In Place Not in Place N/A C.2.1 User access to sensitive functions and sensitive resources exposed through Internet-accessible interfaces is authenticated.

In Place Not in Place N/A C.2.1 Using information obtained in Test Requirements 1.2.a and 2.1.a in the Core Requirements, the assessor shall examine evidence to identify all sensitive functions and sensitive resources exposed through Internet-accessible interfaces.

R1 Identify the evidence obtained that details all sensitive functions and sensitive resources that are exposed, or that may be exposed, through Internet-accessible interfaces.

C.2.1.1 The methods implemented to authenticate user access to sensitive functions and sensitive resources use industry-standard mechanisms.

R1 Describe the method(s) relied upon by the software to authenticate access to the sensitive functions and sensitive resources identified in Test Requirement C.2.1.

C.2.1.1.b The assessor shall examine evidence to confirm that the implemented methods use industry-standard mechanisms that are:

• Provided by well-known and industry-accepted third-party suppliers; or

• Designed and implemented in …
Modified p. 140 → 102
In Place Not in Place N/A C.2.1.1.a The assessor shall examine evidence to identify all methods implemented by the software to authenticate access to sensitive functions and sensitive resources.
In Place Not In Place C4-1.a Examine vendor documentation to verify that the software authenticates authorized user access to sensitive assets via publicly accessible interfaces.
Removed p. 141
C.2.1.1.c Where sessions are used to authenticate user access to sensitive functions and sensitive resources, the assessor shall examine evidence to confirm that the sessions are handled in accordance with industry-recognized standards and best practices for secure session management.

R1 Indicate whether the software relies on “sessions” to authenticate user access to the sensitive functions and sensitive resources identified in Test Requirement C.2.1.

R2 If R1 is “Yes,” then describe what the assessor observed in the evidence obtained that confirms the use of “sessions” follows industry-recognized standards and best practices for secure session management.

C.2.1.1.d Where tokens (for example, access tokens and refresh tokens) are used to authenticate user access to sensitive functions and sensitive resources, the assessor shall examine evidence to confirm that the tokens are handled in accordance with industry-recognized standards and best practices for secure token management.

R1 Indicate whether the software relies on tokens to authenticate user access to the sensitive …
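C.2.1.1.c/d call for sessions and tokens to be handled per industry-recognized practice. Two of the properties an assessor typically looks for — constant-time signature comparison and enforced expiry — are sketched below with a simplified HMAC-signed token. This is a hand-rolled illustration under stated assumptions (the key, claim names, and TTL are hypothetical); production software would normally use an established token library and format rather than this sketch.

```python
import base64, hashlib, hmac, json, time

SECRET = b"hypothetical-server-side-key"   # illustration only, never hard-code

def issue_token(user: str, ttl: int = 900) -> str:
    """Issue a token as base64(claims) + '.' + hex(HMAC-SHA256)."""
    body = base64.urlsafe_b64encode(
        json.dumps({"sub": user, "exp": int(time.time()) + ttl}).encode())
    sig = hmac.new(SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token: str) -> str:
    """Verify signature in constant time, then enforce expiry."""
    body_b64, sig = token.rsplit(".", 1)
    expected = hmac.new(SECRET, body_b64.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):   # constant-time check
        raise PermissionError("bad signature")
    claims = json.loads(base64.urlsafe_b64decode(body_b64))
    if claims["exp"] < time.time():              # reject expired tokens
        raise PermissionError("token expired")
    return claims["sub"]
```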
Removed p. 142
In Place Not in Place N/A C.2.1.2 Using information obtained in Test Requirement C.2.1.1.a, the assessor shall examine evidence to confirm that the authentication methods implemented are sufficiently strong and robust to protect authentication credentials in accordance with Control Objective 5.3 in the Core Requirements section.

R1 Describe how the methods implemented to authenticate user access to the sensitive functions and sensitive resources identified in Test Requirement C.2.1 mitigate the likelihood of authentication credentials being forged, spoofed, guessed, or otherwise compromised by an unauthorized entity.
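One way credentials are protected against guessing and offline compromise, in the spirit of Control Objective 5.3, is a slow, salted key-derivation function with constant-time verification. The sketch below uses stdlib PBKDF2; the iteration count and parameters are illustrative assumptions, not values prescribed by the standard.

```python
import hashlib, hmac, os

ITERATIONS = 210_000  # illustrative work factor; tune to current guidance

def hash_credential(password: str) -> tuple[bytes, bytes]:
    """Derive a salted, slow hash so stored credentials resist guessing."""
    salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def check_credential(password: str, salt: bytes, digest: bytes) -> bool:
    """Recompute and compare in constant time to avoid timing leaks."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return hmac.compare_digest(candidate, digest)
```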

C.2.1.3 Authentication decisions are enforced within a secure area of the software. In Place Not in Place N/A C.2.1.3.a The assessor shall examine evidence to identify where within the software architecture authentication decisions are enforced.

R1 Describe the locations within the software architecture where authentication decisions are enforced.

R2 Identify the evidence obtained that supports this finding.

C.2.1.3.b The assessor shall examine evidence to confirm that all authentication decisions are …
Removed p. 143
R1 Indicate whether client-side or browser-based functions, scripts, or data are used for authenticating access to software interfaces.

R2 If R1 is “Yes,” then describe how the software uses these functions, scripts, and data for authenticating access to software interfaces.

R3 Describe the methods implemented to protect these functions, scripts, and data from compromise or manipulation by an unauthorized entity.

C.2.2 Access to all Internet-accessible interfaces is restricted to explicitly authorized users only. In Place Not in Place N/A C.2.2.a Using information obtained in Test Requirement 2.1.a in the Core Requirements section, the assessor shall examine evidence to identify all software interfaces that are exposed to the Internet or that can be configured in a way that exposes them to the Internet.

R1 Identify the evidence obtained that details the software interfaces that are exposed, or that could be configured in a way to expose them, to the Internet.

C.2.2.b The assessor shall examine evidence …
Removed p. 144
• implemented correctly;

• appropriate for the types of users expected to use the interface; and

• does not expose known vulnerabilities.

R1 Describe what the assessor observed in the evidence obtained that confirms that the methods identified in Test Requirement C.2.2.b to restrict access to software interfaces are appropriate for the type(s) of interface provided and do not expose known vulnerabilities.

C.2.2.d Where the methods used to authorize access to Internet-accessible interfaces are user configurable, or otherwise require user input or interaction, the assessor shall examine evidence to confirm that appropriate guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes the configurable options available and how to configure each method securely.

R1 Indicate whether any of the methods identified in Test Requirement C.2.2.b requires or enables users to configure those methods.

R2 If R1 is “Yes,” then identify the evidence obtained that details the configurable options available and the software …
Removed p. 145
C.2.3 Access to all software functions and resources exposed through Internet-accessible interfaces is restricted to explicitly authorized users only.

In Place Not in Place N/A C.2.3 Using information obtained in Test Requirement C.2.2.a, the assessor shall examine evidence to identify all software functions and resources that are exposed, or that can be configured in a way that exposes them, through Internet-accessible interfaces.

R1 Identify the evidence obtained that details all software functions or resources that are exposed, or that can be exposed, through APIs or other interfaces.

C.2.3.1 The software ensures the enforcement of access control rules at both the function level and resource level with fine-grained access control capabilities.

In Place Not in Place N/A C.2.3.1.a Using information obtained in Test Requirement C.2.3, the assessor shall examine evidence to determine how the software controls access to individual functions and resources exposed (or potentially exposed) through Internet-accessible interfaces.

R1 Describe the methods relied upon …
Removed p. 146
• appropriate for the type of function(s) and resource(s) provided; and

R1 Describe what the assessor observed in the evidence obtained that confirms the methods described in Test Requirement C.2.3.1.a are implemented correctly and do not expose known vulnerabilities.

C.2.3.1.c Where the methods used to authorize access to the functions and resources exposed (or potentially exposed) through Internet-accessible interfaces are user configurable or otherwise require user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes the mechanisms and configurable options available to restrict access to the functions and resources exposed through these interfaces, and how to configure such mechanisms.

R1 Indicate whether any of the methods described in Test Requirement C.2.3.1.a are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on how to configure such methods securely.

C.2.3.1.d Where …
Removed p. 147
R1 Describe each of the software tests performed, including the tool(s) or method(s) used and the scope of each test, to confirm that access to functions and resources exposed through APIs or other interfaces requires users to be explicitly authorized before access is granted.

C.2.3.2 Authorization rules are enforced upon each user request to access software functions and resources through Internet-accessible interfaces.

In Place Not in Place N/A C.2.3.2.a Using information obtained in Test Requirement C.2.3.1.a, the assessor shall examine evidence to confirm that authorization checks are performed each time users request access to a function or resource exposed (or potentially exposed) through Internet-accessible interfaces to verify they are authorized for the function, resource, and type of access requested.

R1 Describe how the software verifies whether users are authorized to access functions or resources exposed through APIs or other interfaces.

C.2.3.2.b The assessor shall examine evidence and test the software to confirm that access …
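C.2.3.2 requires that authorization be checked on every request to a function or resource, verifying the user, the action, and the type of access, rather than only once at login. A minimal sketch of such per-request, fine-grained enforcement is below; the grant store, user names, and actions are hypothetical.

```python
from functools import wraps

# Hypothetical fine-grained grant store: (user, action, resource) tuples.
GRANTS = {("alice", "refund", "orders"): True}

def authorize(action: str, resource: str):
    """Decorator that re-checks the caller's grant on every request,
    at both the function level and the resource level."""
    def decorator(fn):
        @wraps(fn)
        def wrapper(user, *args, **kwargs):
            if not GRANTS.get((user, action, resource)):
                raise PermissionError(f"{user} may not {action} {resource}")
            return fn(user, *args, **kwargs)
        return wrapper
    return decorator

@authorize("refund", "orders")
def refund_order(user, order_id):
    return f"order {order_id} refunded"
```

Because the check runs inside the decorated function's call path, it cannot be skipped by reaching the endpoint directly after authentication, which is the failure mode this control objective targets.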
Removed p. 148
R1 Identify the evidence obtained that details the locations within the software architecture where authorization and access control decisions are enforced.

C.2.3.3.b The assessor shall examine evidence to confirm that all access control decisions are enforced within a secure area of the software architecture.

R1 Describe what the assessor observed in the evidence obtained that confirms all authorization and access control decisions are enforced within a secure area of the software architecture.

C.2.3.3.c The assessor shall examine evidence and test the software to confirm that client-side or browser-based functions, scripts, and data are never solely relied upon for access control purposes.

R1 Indicate whether client-side or browser-based functions, scripts, or data are relied upon for access control purposes.

R2 If R1 is “Yes,” then describe how the software uses these functions, scripts, or data for access control.

R3 If R1 is “Yes,” then describe the methods implemented to ensure the compromise of client- side or browser-based …
Removed p. 149
In Place Not in Place N/A C.3.1 The software enforces or otherwise supports the use of the latest HTTP Security Headers to protect Internet-accessible interfaces from attacks.

In Place Not in Place N/A C.3.1.a The assessor shall examine evidence to confirm the software supports the use of the latest HTTP Security Headers, and to determine the options available and how such settings are configured.

R1 Identify the evidence obtained that details the primary set of HTTP Security Headers and configuration options that are supported by the software.

C.3.1.b Where HTTP Security Headers are configured and controlled by the software provider, the assessor shall examine evidence to confirm that the software is configured to use the latest available HTTP Security Headers and that the configuration settings are reasonable and justified.

R1 Indicate whether HTTP Security Headers are configured and controlled by the assessed software or entity.

R2 If R1 is “Yes,” then describe what the …
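Where the software provider controls the headers (C.3.1.b), a common implementation is a baseline set merged into every response. The header names and values below are a widely recommended example set, not a list taken from the standard; the current set should come from maintained guidance such as the OWASP Secure Headers Project.

```python
# Illustrative baseline of commonly recommended HTTP security headers.
SECURITY_HEADERS = {
    "Strict-Transport-Security": "max-age=31536000; includeSubDomains",
    "Content-Security-Policy": "default-src 'self'",
    "X-Content-Type-Options": "nosniff",
    "X-Frame-Options": "DENY",
    "Referrer-Policy": "no-referrer",
}

def apply_security_headers(response_headers: dict) -> dict:
    """Merge the baseline into a response without overriding headers the
    application has already set deliberately."""
    merged = dict(SECURITY_HEADERS)
    merged.update(response_headers)
    return merged
```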
Removed p. 150
In Place Not in Place N/A C.3.2.a Using information obtained in Test Requirement C.2.1.a, the assessor shall examine evidence to identify all interfaces that accept data input from untrusted sources.

R1 Identify the evidence obtained that details all APIs and other interfaces that accept input data from untrusted sources.

C.3.2.b Where the software accepts input from untrusted sources, the assessor shall examine evidence to identify the data format(s) expected by the software for each input field and the parsers and interpreters involved in processing the input data.

R1 Identify the evidence obtained that details the data format(s) expected for each of the input fields identified in Test Requirement C.3.2.a and the parsers or interpreters involved in the processing of the input data.

C.3.2.c Using information obtained in Test Requirement 4.1.a in the Core Requirements section, the assessor shall examine evidence to determine whether attacks that target all such parsers and interpreters are acknowledged in …
Removed p. 151
R1 Indicate whether any of the security controls described in Test Requirement C.3.2.d are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on configuring these security controls securely.

C.3.2.1 Industry-standard methods are used to protect software inputs from attacks that attempt to exploit vulnerabilities through the manipulation of input data.

In Place Not in Place N/A C.3.2.1.a Using information obtained in Test Requirement 4.2.a in the Core Requirements section, the assessor shall examine evidence to identify all software security controls implemented to mitigate attacks that attempt to exploit vulnerabilities through the manipulation of input data.

R1 Describe the methods relied upon by the software to mitigate attempts to exploit vulnerabilities in parsers and interpreters through the manipulation of input data.
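One industry-standard method of the kind C.3.2.1 contemplates is allowlist ("positive") input validation, applied before data reaches a parser or interpreter. A minimal sketch, in which the field name and accepted format are illustrative assumptions:

```python
import re

# Allowlist validation: only input matching a strict expected format is
# accepted; everything else is rejected before parsing. The field name
# and pattern below are hypothetical examples.
MERCHANT_ID = re.compile(r"[A-Z0-9]{8,16}")

def validate_merchant_id(value):
    """Reject any value that does not match the expected format exactly."""
    if not MERCHANT_ID.fullmatch(value):
        raise ValueError("rejected: input does not match expected format")
    return value
```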

C.3.2.1.b The assessor shall examine evidence to confirm that the methods implemented to protect against such attacks use industry-standard mechanisms and/or techniques that are:

R1 Describe …
Removed p. 152
• Are implemented correctly in accordance with available guidance, and

• Do not expose any vulnerabilities.

R1 Describe what the assessor observed in the evidence obtained that confirms the methods implemented to protect against attempts to exploit vulnerabilities in parsers and interpreters through the manipulation of input data are implemented correctly and do not expose vulnerabilities.

C.3.2.2 Parsers and interpreters are configured with the most restrictive configuration feasible.

In Place Not in Place N/A C.3.2.2.a Using information obtained in Test Requirement C.3.2.b, the assessor shall examine evidence to identify the configurations for each parser or interpreter used to process untrusted input data.

R1 Identify the evidence obtained that details the (default) configurations for each parser or interpreter used to process untrusted input data.
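As one concrete example of a restrictive parser configuration, Python's standard-library SAX parser can be built with external entity resolution disabled, reducing exposure to XXE-style attacks. This is an illustrative sketch, not the only acceptable configuration:

```python
from xml.sax import make_parser
from xml.sax.handler import feature_external_ges, feature_external_pes

def make_restrictive_xml_parser():
    """Build a SAX parser with external entity resolution disabled,
    one example of the 'most restrictive configuration feasible'."""
    parser = make_parser()
    parser.setFeature(feature_external_ges, False)  # no external general entities
    parser.setFeature(feature_external_pes, False)  # no external parameter entities
    return parser
```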

C.3.2.2.b For each of the parsers/interpreters and the configurations identified, the assessor shall examine evidence to confirm that parsers and interpreters are configured with the most restrictive set of capabilities feasible and …
Removed p. 153
In Place Not in Place N/A C.3.3.a Using information obtained in Test Requirements C.2.1.a and C.2.2, the assessor shall examine evidence to identify all Internet accessible interfaces and the functions and resources exposed (or potentially exposed) through those interfaces, and to determine where such interfaces, functions, and resources may be susceptible to resource starvation attacks.

R1 Identify the evidence obtained that details the interfaces, functions, and resources potentially susceptible to resource starvation attacks.

C.3.3.b Where such interfaces, functions, and resources are potentially susceptible to resource starvation attacks, the assessor shall examine evidence to confirm that:

• The threat of such attacks is documented in accordance with Control Objective 4.1, and
Removed p. 153
R1 Describe what the assessor observed in the evidence obtained that confirms that the threats related to resource starvation attacks are documented in the software vendor’s threat analysis.

R2 Describe the security controls implemented to protect against resource starvation attacks.
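One common control against request-flooding and similar resource starvation attacks is request rate limiting. A minimal token-bucket sketch follows; the rate and burst values a real deployment would use are illustrative assumptions:

```python
import time

class TokenBucket:
    """Minimal token-bucket rate limiter: one illustrative control
    against request-flooding / resource starvation attacks."""

    def __init__(self, rate, capacity):
        self.rate = rate              # tokens replenished per second
        self.capacity = capacity      # burst ceiling
        self.tokens = float(capacity)
        self.updated = time.monotonic()

    def allow(self):
        """Return True if one request may proceed, consuming a token."""
        now = time.monotonic()
        self.tokens = min(self.capacity,
                          self.tokens + (now - self.updated) * self.rate)
        self.updated = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False
```

In production such limiting is typically enforced at a gateway or load balancer rather than in application code; the principle is the same.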

C.3.3.c The assessor shall examine evidence to confirm that the software security controls implemented to mitigate resource starvation and other similar attacks on Internet accessible interfaces are designed and implemented in accordance with applicable industry standards and best practices.

R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to protect against resource starvation attacks are aligned with industry standards and best practices regarding such protections.
Removed p. 154
R1 Indicate whether software security controls to protect against resource starvation attacks are user configurable.

R2 If R1 is “Yes,” then identify the evidence obtained that details the software vendor’s guidance on configuring such methods.

C.3.4 Software security controls are implemented to protect Internet accessible interfaces from malicious file content.

In Place Not in Place N/A C.3.4.a Using information obtained in Test Requirement C.2.1.a, the assessor shall examine evidence to identify all Internet accessible interfaces that accept file uploads and the file types permitted.

R1 Identify the evidence obtained that details the software interfaces that accept file uploads, and file types supported.

C.3.4.b Where the software accepts file uploads over Internet accessible interfaces, the assessor shall examine evidence to confirm that:

• The threat of attacks on file upload mechanisms is documented in accordance with Control Objective 4.1, and

R1 Describe what the assessor observed in the evidence obtained that confirms that threats to interfaces that accept …
Removed p. 155
R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to mitigate attacks on file upload mechanisms are designed in accordance with applicable industry-standard methods.

C.3.4.d The assessor shall examine evidence to confirm that the software security controls implemented to mitigate attacks on file upload mechanisms include methods to restrict the file types permitted by the file upload mechanisms.

R1 Describe the methods implemented by the software to restrict the types of files permitted and the types of files permitted by default.

C.3.4.e The assessor shall examine evidence to confirm that the software security controls implemented to mitigate attacks on file upload mechanisms include methods to restrict the maximum number and size of files permitted for upload.

R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to overwhelm or exploit file parsing mechanisms using excessive file sizes or excessive file uploads.

C.3.4.f …
Removed p. 156
R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to remotely execute malicious code through direct calls to uploaded files.

C.3.4.h The assessor shall examine evidence to confirm that the use of file-parsing mechanisms does not rely on file names or file extensions for security purposes.

R1 Describe the methods and/or mechanisms implemented by the software to mitigate attacks that attempt to trick the software into interpreting files of one type as another type.
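A common way to avoid relying on file names or extensions, as C.3.4.h requires, is to identify uploads by their leading "magic" bytes against an allowlist of permitted content types. A sketch, in which the PNG/PDF allowlist is an illustrative assumption:

```python
# Identify an uploaded file by its leading bytes rather than trusting the
# client-supplied name or extension. The allowlist below is an example.
MAGIC_ALLOWLIST = {
    b"\x89PNG\r\n\x1a\n": "image/png",
    b"%PDF-": "application/pdf",
}

def sniff_type(data):
    """Return the detected MIME type, or None for unrecognized content,
    which is rejected regardless of what its extension claims."""
    for magic, mime in MAGIC_ALLOWLIST.items():
        if data.startswith(magic):
            return mime
    return None
```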

C.3.4.i Where the implementation of software security controls is user configurable or otherwise requires user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes how to configure such mechanisms.

R1 Indicate whether any of the software security controls implemented to protect against attacks on file parsing mechanisms are user configurable or require user input or interaction to be …
Removed p. 157
• The threat of hostile object creation and data tampering attacks is documented in accordance with Control Objective 4.1, and

R1 Describe what the assessor observed in the evidence obtained that confirms that threats to interfaces that accept and process data objects as inputs are documented.

R2 Describe the software security controls implemented to mitigate common attacks on these types of interfaces and functions.

C.3.5.c The assessor shall examine evidence to confirm that the software security controls implemented to mitigate hostile object creation and data tampering attacks are implemented in accordance with applicable industry standards and best practices.

R1 Describe what the assessor observed in the evidence obtained that confirms that software security controls implemented to mitigate hostile object creation and data tampering attacks are designed in accordance with applicable industry-standard methods.

C.3.5.d The assessor shall examine evidence to confirm that the software security controls implemented to mitigate hostile object creation and data tampering …
Removed p. 158
R1 Describe what the assessor observed in the evidence obtained that confirms that file-parsing mechanisms do not contain or otherwise expose vulnerabilities.

C.3.5.g Where the software accepts serialized objects as inputs, the assessor shall examine evidence to confirm that software security controls are implemented to protect against deserialization attacks and that such security controls adhere to applicable industry standards and best practices.

R1 Indicate whether the software accepts serialized objects as inputs.

R2 If R1 is “Yes,” then describe the software security controls implemented to protect against deserialization attacks.
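A widely recommended protection against deserialization attacks is to accept serialized objects only in a data-only format (such as JSON) with strict structure checks, rather than a native object format (such as pickle) whose deserialization can execute attacker-controlled code. A sketch, with hypothetical field names:

```python
import json

# Accept serialized input only as JSON and only with the exact expected
# structure. Field names below are illustrative assumptions.
EXPECTED_FIELDS = {"order_id", "amount"}

def load_order(raw):
    """Deserialize an order record, rejecting unexpected structure."""
    obj = json.loads(raw)
    if not isinstance(obj, dict) or set(obj) != EXPECTED_FIELDS:
        raise ValueError("rejected: unexpected structure")
    return obj
```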

C.3.5.h Where the software security controls implemented to protect against hostile object creation and data tampering are user configurable or otherwise require user input or interaction, the assessor shall examine evidence to confirm that guidance is made available to stakeholders in accordance with Control Objective 12.1 that describes how to configure such mechanisms.

R1 Indicate whether any of the software security controls are user configurable or …
Removed p. 159
In Place Not in Place N/A C.3.6.a The assessor shall examine evidence to determine if and/or how the software supports cross-origin access to Internet accessible interfaces, and to confirm that access to software APIs and resources from browser-based scripts is disabled by default.

R1 Indicate whether the software supports cross-origin access to software interfaces.

R2 If R1 is “Yes,” then describe the mechanisms implemented to restrict access to API endpoints and resources from browser-based scripts.

C.3.6.b Where cross-origin access is enabled, the assessor shall examine evidence to confirm that the reasons for enabling cross-origin access are reasonable and justified, and that access is restricted to the minimum number of origins feasible.

R1 Describe what the assessor observed in the evidence obtained that confirms access is restricted to the minimum number of origins feasible.
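The restriction C.3.6.b describes is typically implemented by echoing the `Access-Control-Allow-Origin` header only for an explicit allowlist of origins, never a wildcard. A minimal sketch, in which the origin list is an illustrative assumption:

```python
# CORS origin allowlisting: the Access-Control-Allow-Origin header is set
# only for approved origins. The allowlist below is a hypothetical example.
ALLOWED_ORIGINS = {"https://app.example.com"}

def cors_headers(request_origin):
    """Return CORS response headers for an approved origin, else nothing,
    so the browser blocks cross-origin reads by default."""
    if request_origin in ALLOWED_ORIGINS:
        return {"Access-Control-Allow-Origin": request_origin,
                "Vary": "Origin"}
    return {}
```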

C.3.6.c The assessor shall test the software to confirm that the claims made by the assessed entity regarding cross-origin access are …
Removed p. 161
In Place Not in Place N/A C.4.1 Sensitive data transmissions are encrypted in accordance with Control Objectives 6.2 and 6.3.

In Place Not in Place N/A C.4.1.a Using information obtained in Test Requirement 6.2.a, the assessor shall examine evidence to determine how communications are handled by the software, including those between separate systems in the overall software architecture.

R1 Identify the evidence obtained that details the full architecture of the assessed software, including all components that reside both within and outside the physical execution environment.

C.4.1.b Where the software allows or otherwise supports the transmission of sensitive data between users and systems in different security contexts, the assessor shall examine evidence to confirm that all such communications are encrypted using strong cryptography in accordance with Control Objectives 6.2 and 6.3.

R1 Describe what the assessor observed in the evidence obtained that confirms that communications between components in different security contexts are encrypted using …
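On the client side, the kind of "strong cryptography" configuration C.4.1.b asks the assessor to confirm can be sketched with Python's standard library. The TLS 1.2 floor shown is an illustrative baseline; the protocols and cipher strengths actually required are governed by Control Objectives 6.2 and 6.3:

```python
import ssl

def make_tls_client_context():
    """Client-side TLS context enforcing certificate and hostname
    validation with a modern protocol floor (illustrative baseline)."""
    ctx = ssl.create_default_context()           # validates certs + hostname
    ctx.minimum_version = ssl.TLSVersion.TLSv1_2 # refuse older protocols
    return ctx
```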
Removed p. 162
Control Objective | Test Requirement | Additional Information
Ex: 3.2 | 3.2.b | A table containing an inventory of all open-source components used by the vendor’s software is attached to this ROV.
Removed p. 163
B.1 Secure Software Assessor Company Testing Environment

Describe the Secure Software Assessor Company’s Test Environment(s) used by the Assessor for this assessment (adding rows as needed).

Identify the organization(s) responsible for configuring the lab/test environment used for this assessment (select all that apply):

Secure Software Assessor Company Software Vendor Third Party (please specify):

Identify the address(es) and/or location(s) of the lab/test environment(s) used for this assessment. If the lab/test environment is virtual, then identify the platform(s) used and the geographic region(s) and/or availability zone(s) where the lab/test environment resides.

Describe each of the lab/test environments used for this assessment including how they are configured. Where more than one lab/test environment is used, be clear which lab/test environment is being described and who was responsible for configuring the test environment.

Describe methods implemented to prevent test environment tampering to ensure the integrity of the software assessment.

B.2 Confirmation of Testing Environment Used

Indicate whether the Secure Software …
Removed p. 164
Note: If any of the questions below are determined to be “not applicable,” select “No” for the response and provide a detailed explanation as to why the questions are not applicable in B.4 where prompted.

All testing of the Payment Software occurred in a pristine computing environment, free from potentially conflicting applications, network traffic, security and/or access controls, software versions, and artifacts or “orphaned” components left behind from other software installations.

The testing environment simulated the “real world” use of the Payment Software.

The Payment Software was installed and/or configured in accordance with the Vendor’s installation manual, training materials, and Security Guidance.

All implementations of the Payment Software, including region/country specific versions, intended to be listed on the PCI SSC website were tested.

All Payment Software versions and platforms, including all necessary system components and dependencies, intended to be listed on the PCI SSC website were tested.

All critical payment software functionalities were …
Removed p. 165
If any of the items in B.3 were marked as “No,” describe why those items could not be confirmed and why the circumstances surrounding the lack of confirmation are acceptable.

Specify any other details or comments related to the testing environment that the Secure Software Assessor would like to note.