Automated Application Security Testing
AIS-05: Does the testing strategy outline criteria to accept new information systems, upgrades, and new versions while ensuring application security, compliance adherence, and organizational speed of delivery goals?
Note: The implementation guidelines of AIS-05 should be interpreted as further guidance in addition to what is specified in AIS-03 and AIS-04.

Automation of security testing should be implemented to reduce risks and errors and enable the scaling of security practices to meet organizational demands. Multiple test types and integration points will likely be needed to provide the appropriate level of assurance throughout the SDLC. Criteria should be developed for use when assessing the automation required by an application, as not all systems will benefit equally.

Strategy:
a. Identify the goals and requirements of the automation implementation.

Example goals:
• Security requirements are not relaxed to improve speed.
• All developers can leverage tools to detect security weaknesses while developing software.
• All third-party libraries are scanned for known vulnerabilities.
• All authentication and authorization functions “pass” abuse case unit tests before deployment.
• All website security headers are verified to meet security requirements when deployed.
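The last goal above can be made concrete as a deploy-time check. The sketch below is illustrative only: the required header set is an assumption, not a standard, and `missing_security_headers` is a hypothetical helper operating on a response's header map.

```python
# Minimal sketch of the "security headers" goal: verify at deployment that a
# response carries the headers the security requirements call for.
# REQUIRED_HEADERS is an assumed policy, not a normative list.
REQUIRED_HEADERS = {
    "Strict-Transport-Security",
    "Content-Security-Policy",
    "X-Content-Type-Options",
}

def missing_security_headers(response_headers: dict) -> set:
    """Return the required headers absent from a response's header map.

    Header names are compared case-insensitively, since HTTP header
    capitalization varies between servers.
    """
    present = {name.title() for name in response_headers}
    return REQUIRED_HEADERS - present

# Example: a response that sets only one of the required headers.
headers = {
    "content-type": "text/html",
    "strict-transport-security": "max-age=31536000",
}
print(sorted(missing_security_headers(headers)))
```

A check like this can gate an automated deployment: a non-empty result fails the pipeline stage.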

Example requirements:
• Applicable programming languages should be supported by static analysis tools.
• Python and C# should be supported by select static analysis tools.
• Automation should not require infrastructure support.
• All automation tools should offer an application programming interface (API).

Strategy can also include, but is not limited to:
b. Security testing for unintentional side effects and behaviors that are not specified in the test plan or design.
c. Security testing for incident response procedures, such as simulating breaches.
d. Determining which portfolio applications warrant investment in automation, prioritizing the adoption order by criticality.
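Item (d) can be approached with a simple scoring model. The sketch below is an illustration only: the attributes (`data_sensitivity`, `exposure`, `change_rate`) and their weights are assumptions an organization would replace with its own risk criteria.

```python
# Illustrative criticality ranking for prioritizing automation investment.
# Attribute names and weights are assumed, not prescribed by the guideline.
apps = [
    {"name": "payments-api",    "data_sensitivity": 3, "exposure": 3, "change_rate": 2},
    {"name": "intranet-wiki",   "data_sensitivity": 1, "exposure": 1, "change_rate": 1},
    {"name": "customer-portal", "data_sensitivity": 2, "exposure": 3, "change_rate": 3},
]

def criticality(app: dict) -> int:
    # Weight internet exposure and data sensitivity above rate of change.
    return 3 * app["exposure"] + 3 * app["data_sensitivity"] + app["change_rate"]

# Highest-criticality applications adopt security test automation first.
for app in sorted(apps, key=criticality, reverse=True):
    print(app["name"], criticality(app))
```

The point is not the specific formula but that the adoption order is derived from explicit, reviewable criteria rather than ad hoc choice.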

Considerations:
e. Security requirements
f. Risk, business, and compliance requirements
g. Development methodology
h. Lifecycle
i. Metrics establishment

Example metrics:
• Count or percentage of applications adopting a given test type (SAST, DAST, SCA, etc.) among those that require it.
• Count or percentage of false positives produced by test automation.
• Count or percentage of execution-time SLA breaches by test automation.
• Soft measures, including satisfaction levels with usability.
• Change in the number of security weaknesses discovered after release.
• Percentage coverage of automated test cases for exposed APIs and SDK functions by service (i.e., the number of automated test cases for API/SDK functions divided by the total number of API/SDK functions, per service).
• Evaluate test types to determine which is best suited for different categories of applications based on the attributes of those in the prioritized inventory.
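Two of the metrics above can be sketched directly. The functions below are hypothetical helpers with assumed inputs; the sample numbers are illustrative.

```python
# Sketch of two metrics from the list: false-positive rate of test automation
# and automated-test coverage of exposed API/SDK functions per service.
def false_positive_rate(false_positives: int, total_findings: int) -> float:
    """Fraction of automation findings that turned out to be noise."""
    return false_positives / total_findings if total_findings else 0.0

def api_test_coverage(automated_cases: int, total_api_functions: int) -> float:
    """Percent of a service's exposed API/SDK functions with automated tests."""
    return 100.0 * automated_cases / total_api_functions if total_api_functions else 0.0

print(round(false_positive_rate(12, 80), 3))
print(round(api_test_coverage(45, 60), 1))
```

Tracked over time, these feed the feedback loops described under Execution: a rising false-positive rate signals tuning work, falling coverage signals a gap in test creation.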

Considerations:
• Development and security team sizes
• Platform and operating systems
• Maturity of build automation
• Language support

Execution:
a. Avoid approaches that cause unreasonable delays to builds and deployments or require significant process modification or resource commitment from development teams.
b. Seek to adopt automation at multiple SDLC integration points.

Example integration points:
   1. A plugin in the developer's integrated development environment (IDE).
   2. Abuse case unit and integration tests, created and maintained by developers, executed during development and build cycles.
   3. Static application security testing (SAST) or software composition analysis (SCA) scans executed during automated builds.
   4. Dynamic application security testing (DAST) scans executed during automated deployment.
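Integration point 2 can be illustrated with a small abuse-case test. Everything here is a stand-in: `login` is a toy function, not a real API, and the checks only show the pattern of encoding what must fail, not just what must succeed.

```python
# Sketch of an abuse-case unit test kept alongside the code it exercises.
# login() is a toy stand-in for a real authentication function.
def login(username: str, password: str) -> bool:
    # Reject empty or oversized input before any credential lookup.
    if not username or not password or len(username) > 64:
        return False
    return (username, password) == ("alice", "s3cret")

# Abuse cases: the suite asserts rejection paths, not only the happy path.
assert login("alice", "s3cret") is True      # happy path
assert login("alice", "wrong") is False      # bad credential
assert login("", "") is False                # empty-input abuse case
assert login("a" * 1000, "x") is False       # oversized-input abuse case
print("abuse-case checks passed")
```

Run during development and build cycles, such tests catch regressions in authentication and authorization behavior before deployment.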

c. Automate patching when possible.
d. Use metrics to drive feedback loops and continuous improvement.

   1. Maintain an accurate count of license utilization.
   2. Tune automation to reduce false positives and false negatives.
   3. Analyze gaps in support and trends for response and planning.

e. Continue to leverage manual testing for scenarios not easily tested with automation.
Description:

Implement a testing strategy, including criteria for acceptance of new information systems, upgrades, and new versions, that provides application security assurance and maintains compliance while enabling organizational speed-of-delivery goals. Automate when applicable and possible.

Automated Application Security Testing
AIS-05: Is testing automated when applicable and possible?
Note: The implementation guidelines of AIS-05 should be interpreted as further guidance in addition to what is specified in AIS-03 and AIS-04.