
ABCI Consultants

Guidance for NIST 800-171 Assessment & Compliance


APPENDIX I: ONGOING ASSESSMENT AND AUTOMATION


Using Automated Techniques To Achieve More Efficient Assessments

Ongoing security assessment is the continuous evaluation of the effectiveness of security control implementation. It is an essential subset of Information Security Continuous Monitoring (ISCM) activities. Ongoing assessment encompasses ISCM Steps 3 and 4 and is initiated as part of ISCM Step 3, Implement, when the collection of security-related information begins in accordance with the organization-defined frequencies. Ongoing assessment continues as the security-related information generated as part of ISCM Step 3 is correlated, analyzed, and reported to senior leaders as part of ISCM Step 4. As noted in Special Publication 800-137, security-related information is generated, correlated, analyzed, and reported using automated tools to the extent that it is possible and practical to do so. When it is not possible or practical to use automated tools, security-related information is generated, correlated, analyzed, and reported using manual or procedural methods. In this way, senior leaders are provided with the security-related information necessary to make credible, risk-based decisions regarding information security risk to the mission/business.

Automating assessments is a fundamental element in helping organizations manage information security risks. Evolving threats create a challenge for organizations that design, implement, and operate complex information systems that contain many hardware, firmware, and software components. The ability to assess all implemented security controls as frequently as needed using manual or procedural methods has become impractical for most organizations due to the size, complexity, and scope of their information technology infrastructures.

One strategy for increasing the number of security controls whose assessment and monitoring can be automated is to define a desired state specification and express it in a form that can be compared automatically with the actual state. The desired state is a defined value or specification against which the actual state value can be compared. A mismatch between the two values indicates a defect in the effectiveness of one or more security controls. For example, an organizational policy may state that user accounts will be locked after three unsuccessful logon attempts. The desired state specification would be that applicable devices are configured to lock accounts after three unsuccessful logon attempts. If, during automated assessment, the security-related information collected indicates that a specific device is configured to lock accounts only after five unsuccessful logon attempts, a mismatch between the desired state (three attempts allowed before lockout) and the actual state (five attempts allowed before lockout) is identified. This mismatch may reflect a problem with the effectiveness of Special Publication 800-53 security controls AC-7, Unsuccessful Logon Attempts; AC-2, Account Management; and CM-2, Baseline Configuration. When such a strategy is employed, security-related information generated from ISCM activities is equivalent to security control assessment results.
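To make the comparison concrete, the following Python sketch mirrors the lockout example: it compares a desired state specification against the actual state collected from devices and records each mismatch as a defect. The device record fields, inventory format, and threshold values are illustrative assumptions, not part of the NIST guidance.

# Minimal sketch of a desired-state/actual-state comparison for the
# account-lockout example above. The field names and device inventory
# format are illustrative assumptions, not a defined standard.

# Desired state specification: lock accounts after three failed logon attempts.
DESIRED_LOCKOUT_THRESHOLD = 3

# Actual state collected from devices (for example, by an automated scan).
collected_devices = [
    {"device_id": "srv-01", "lockout_threshold": 3},
    {"device_id": "srv-02", "lockout_threshold": 5},  # mismatch -> defect
]

def find_lockout_defects(devices, desired=DESIRED_LOCKOUT_THRESHOLD):
    """Return a defect record for each device whose actual state
    does not match the desired state specification."""
    defects = []
    for device in devices:
        actual = device["lockout_threshold"]
        if actual != desired:
            defects.append({
                "device_id": device["device_id"],
                "desired_state": desired,
                "actual_state": actual,
                # Controls potentially affected, per the example above.
                "related_controls": ["AC-7", "AC-2", "CM-2"],
            })
    return defects

if __name__ == "__main__":
    for defect in find_lockout_defects(collected_devices):
        print(defect)

Running the sketch flags srv-02, whose actual state (five attempts allowed) does not match the desired state (three attempts allowed).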

To automate security control assessments effectively using the desired state specification strategy, the following prerequisites must be met:

Automated actual state/behavior specifications are defined;

Data-based desired state specifications (comparable to the actual state) are defined; and

A method to compute/identify defects (differences between desired and actual state/behavior) is defined.

When these prerequisites are met, the assessment system can automatically compute where differences between the desired state and the actual state (defects) occur, use that information to create security assessment reports, and deliver those reports to designated personnel via a security management console (dashboard).
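A minimal sketch of that defect-computation and reporting step is shown below. The data structures and report fields are assumptions chosen for illustration; an operational assessment system would define its own schemas and dashboard interface.

# Illustrative sketch of computing defects and assembling an assessment
# report for delivery to a security management console (dashboard).
from collections import defaultdict
from datetime import date

def compute_defects(desired_state, actual_state):
    """Compare desired and actual state values, keyed by (device, setting),
    and return the differences as defect records."""
    defects = []
    for (device, setting), desired_value in desired_state.items():
        actual_value = actual_state.get((device, setting))
        if actual_value != desired_value:
            defects.append({
                "device": device,
                "setting": setting,
                "desired": desired_value,
                "actual": actual_value,
            })
    return defects

def build_assessment_report(defects):
    """Summarize defects per device for designated personnel."""
    per_device = defaultdict(list)
    for defect in defects:
        per_device[defect["device"]].append(defect)
    return {
        "report_date": date.today().isoformat(),
        "total_defects": len(defects),
        "defects_by_device": dict(per_device),
    }

In practice, the desired state would be drawn from organizational policy and configuration baselines, and the actual state from the automated data collection described above.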

When automated tools are used to conduct assessments, the test assessment method is used. The organization determines and documents: (i) the specific capabilities or security controls that are being assessed by the automated tool; (ii) the frequency with which the tool will assess the capabilities or controls; and (iii) the analysis and reporting requirements for the capabilities or controls.
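As an illustration only, the three documentation items above could be captured in a simple machine-readable record such as the Python sketch below; the tool name, frequency, and field names are hypothetical.

# Hypothetical record of what the organization determines and documents
# for an automated assessment tool; all values are illustrative.
automated_assessment_plan = {
    "tool": "example-config-scanner",               # assumed tool name
    "controls_assessed": ["AC-7", "AC-2", "CM-2"],  # (i) capabilities/controls assessed
    "assessment_frequency": "every 72 hours",       # (ii) organization-defined frequency
    "analysis_and_reporting": {                     # (iii) analysis and reporting requirements
        "analysis": "compare collected settings against the desired state",
        "report_to": "security management console (dashboard)",
        "report_frequency": "weekly summary to designated personnel",
    },
}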

To help automate ongoing assessment, NIST and the Department of Homeland Security (DHS) have collaborated on the development of a process that leverages the test assessment method and is consistent with the Risk Management Framework described in Special Publication 800-37 and the ISCM guidance in Special Publication 800-137. DHS facilitates the automation of the test method for security assessments through a new service known as the Continuous Diagnostics and Mitigation (CDM) program.

The transition from manual to automated assessments requires time to implement the data collection system that supports automated assessments and the security management console that presents assessment results. It also requires time and effort to modify and update the assessment process. More information on automation support for ongoing assessments, and on how the DHS CDM program facilitates ongoing assessment, is provided in Draft NIST Interagency Report 8011, Automation Support for Ongoing Assessment (projected for publication in FY2015).
