Published on: 2/3/2026
3 minute read
What is data quality and why is it important?
Data quality (DQ) ensures that data collected into the Centers for Medicare & Medicaid Services (CMS) ecosystem is accurate and has the required structure for the subject matter. DQ activities include establishing DQ metrics and thresholds, as well as cleansing and correcting data as needed.
Good data quality ensures that data serves its intended purpose for the mission of the system and its users. It helps CMS make sound decisions that reduce risks to its environments.
Data quality and CDM
Data quality is an integral part of onboarding FISMA systems at CMS to the Continuous Diagnostics and Mitigation (CDM) program. CDM uses automated scanning tools to identify and prioritize cyber risks and support rapid response. It ensures compliance with OMB and FISMA mandates and enhances enterprise-wide visibility.
For systems using AWS Cloud, CMS Hybrid Cloud is the pathway to getting started with CDM – and that requires teams to ensure their systems are reporting data correctly. There are steps taken at both onboarding and offboarding to ensure good data quality.
Data validation and onboarding
As part of CMS Hybrid Cloud onboarding, data fields and flows are verified to confirm that they capture and report data accurately. Data is validated before data flow is synchronized within the cyber risk reporting tools. Teams should complete these tasks before beginning the onboarding process:
| Task | Details |
| --- | --- |
| Confirm the TLC status phase | For onboarding to continue, the TLC status phase must be listed as "Operate", not "Initiate". |
| Review AWS Cloud Accounts | Talk to the CMS Cloud team that manages the environment and review all AWS Cloud Accounts before onboarding, so the CAMP DB team has the correct accounts aligned to the right systems. AWS applications and platforms must adhere to Gold Image requirements. |
| Check hosting coordinator | Verify that all platforms have been updated properly. |
| Submit a service ticket | Submit a ticket to begin the onboarding process. |
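The kind of pre-sync validation described above can be sketched in a few lines. This is a minimal illustration, not the actual CDM tooling; the field names (`fisma_id`, `aws_account_id`, `tlc_phase`) are assumptions for the example:

```python
# Hypothetical sketch: verify that required fields are present and
# non-empty, and that the TLC phase allows onboarding, before a
# system's data is synced into cyber risk reporting tools.
REQUIRED_FIELDS = ["fisma_id", "aws_account_id", "tlc_phase"]  # assumed names

def validate_record(record: dict) -> list[str]:
    """Return a list of problems found in one reported record."""
    problems = []
    for field in REQUIRED_FIELDS:
        value = record.get(field)
        if value is None or str(value).strip() == "":
            problems.append(f"missing or empty field: {field}")
    # Onboarding requires the TLC phase to be "Operate", not "Initiate".
    if record.get("tlc_phase") == "Initiate":
        problems.append('TLC phase is "Initiate"; must be "Operate" to onboard')
    return problems

record = {"fisma_id": "ABC-123", "aws_account_id": "", "tlc_phase": "Initiate"}
print(validate_record(record))
```

Running a check like this before submitting the onboarding ticket surfaces data gaps while they are still cheap to fix.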
Data decommissioning and offboarding
Decommissioning data affects end-user access and may trigger reclassification of FISMA systems. Proper decommissioning activities – including the management of offboarding roles – will reduce the number of access errors along the way.
To ensure smooth decommissioning, teams should:
- Develop a lifecycle plan for the assets that includes maintenance and decommissions.
- Coordinate access removal for downstream data consumers and remove permissions from consumer roles as needed.
- Update the configuration lifecycle to ensure user lists reflect accurate access rights at both group and individual levels.
- Ensure that all on-premises assets (applications and servers) are up to date with vulnerability and firmware updates.
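The access-removal step above amounts to answering one question: for each consumer role, which grants point at assets being retired? A minimal sketch, assuming a simple role-to-assets mapping (the data model and names here are illustrative, not CMS tooling):

```python
# Hypothetical sketch: when assets are decommissioned, compute which
# consumer-role permissions should be revoked so downstream users
# lose access cleanly.
def permissions_to_revoke(role_grants: dict[str, set[str]],
                          decommissioned_assets: set[str]) -> dict[str, set[str]]:
    """Map each consumer role to the asset permissions it should lose."""
    revocations = {}
    for role, assets in role_grants.items():
        stale = assets & decommissioned_assets  # grants on retired assets
        if stale:
            revocations[role] = stale
    return revocations

grants = {
    "analytics-readers": {"db-legacy", "db-current"},
    "report-viewers": {"db-current"},
}
print(permissions_to_revoke(grants, {"db-legacy"}))
# {'analytics-readers': {'db-legacy'}}
```

Working from an explicit mapping like this, rather than removing access ad hoc, is what keeps group- and individual-level user lists accurate during decommissioning.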
Conclusion
Strong data quality depends on consistent onboarding practices, accurate validation, and coordinated decommissioning efforts. These steps help reduce reporting issues, support cleaner data flows, and ensure systems are reflected correctly within the CDM and Cyber Risk Management (CRM) programs. These practices strengthen visibility and help teams make informed decisions – keeping their environment aligned with mission and security requirements.
For more resources and to stay informed:
- Check out Risk Management and Reporting on CyberGeek
- Sign up for the CMS Cloud Weekly Change Newsletter
About the author
Vanessa Thomas (ASSYST) is a Project Manager with an M.S. and an ITIL Intermediate certification in Service Transition, specializing in enterprise systems security and release management. She contributes to ongoing data quality and system management efforts within CMS.