SOC 2 Common Criteria CC1-CC9: What Each Category Actually Tests
CC6 has more sub-criteria than any other Common Criteria category. It’s also where first-time SOC 2 audits show exceptions most often. That’s not a coincidence: the criteria are weighted toward the controls auditors test most thoroughly. Knowing the structure of CC1 through CC9 — not just what each category covers, but how many sub-criteria each contains and what auditors specifically test — tells you where to spend remediation effort first.
This reference walks through all 33 SOC 2 Common Criteria, organized by the nine CC categories. Use it to scope your readiness assessment, prioritize remediation, and predict where your audit will spend its time.
How the Common Criteria Are Structured
The Common Criteria (CC1–CC9) are the baseline controls evaluated in every SOC 2 examination, regardless of which Trust Service Criteria you include. Together they make up the Security category, which is mandatory for every SOC 2 report. The structure derives directly from the COSO Internal Control — Integrated Framework, the same framework used in financial-statement auditing. SOC 2’s 17 underlying COSO principles are mapped across CC1 through CC5; the technical and operational categories (CC6–CC9) extend COSO with security-specific controls.
| Category | Sub-criteria | What it covers | Common failure rank |
|---|---|---|---|
| CC1 Control Environment | 5 | Governance, ethics, board oversight | #2 |
| CC2 Communication and Information | 3 | Internal/external communication of policies | Low |
| CC3 Risk Assessment | 4 | Risk identification, fraud risk, change risk | Medium |
| CC4 Monitoring Activities | 2 | Ongoing and periodic control evaluation | Medium |
| CC5 Control Activities | 3 | Selection and deployment of specific controls | Low |
| CC6 Logical and Physical Access | 8 | Authentication, authorization, deprovisioning | #1 |
| CC7 System Operations | 5 | Vulnerability management, incident response | #3 |
| CC8 Change Management | 1 | Change authorization, testing, segregation of duties | #4 |
| CC9 Risk Mitigation | 2 | Vendor management, business risk acceptance | Medium |
Total: 33 sub-criteria across nine categories. Add Availability (3 criteria), Confidentiality (2), and Processing Integrity (5) to reach 43; Privacy adds 18 more under the 2017 Trust Services Criteria, for 61 in total. Scoping trade-offs are covered in our guide to selecting Trust Service Criteria.
CC1 — Control Environment (5 sub-criteria)
Maps to COSO Principles 1–5. CC1 is about the tone at the top — governance structures, ethical values, board oversight, organizational accountability. It’s the most non-technical category and often the most overlooked by engineering-led companies pursuing SOC 2 for sales reasons.
What auditors test: Code of conduct documentation. Board or management oversight of security (meeting minutes are common evidence). HR practices — background checks, onboarding/offboarding, role definitions. Defined organizational structure with accountability for security. Performance evaluations that include compliance responsibility.
Why it fails: Engineering-driven companies often have no formal board, no documented organizational structure, and no compliance accountability tied to roles. The auditor needs evidence the controls exist as governance, not just as Slack messages and tribal knowledge.
CC2 — Communication and Information (3 sub-criteria)
Maps to COSO Principles 13–15. CC2 covers how information about controls and risks is communicated — both internally to employees and externally to users and stakeholders. Includes the system description that becomes Section 3 of your SOC 2 report.
What auditors test: Internal communication of security policies (training records, intranet documentation, onboarding materials). External communication to users about system commitments and responsibilities (terms of service, security pages, status pages). Accuracy and completeness of the system description.
Why it usually passes: Most companies have something here — a security page, an employee handbook, a policy on the intranet. Auditors look for evidence of distribution, not just existence, but the bar is lower than other categories.
CC3 — Risk Assessment (4 sub-criteria)
Maps to COSO Principles 6–9. CC3 requires formal, documented risk assessment processes that identify and analyze risks to security objectives. Must include consideration of fraud risk and changes in the environment.
What auditors test: A documented risk assessment methodology. Evidence the assessment runs at a defined cadence (annual minimum). A risk register with identified threats, likelihood, impact, and treatment decisions. Evidence that fraud risk specifically was considered. Evidence that changes (org changes, infrastructure changes, regulatory changes) trigger reassessment.
Why it fails: Most teams informally know their top risks but never write them down in an auditor-readable format. The fix is concrete: a quarterly or annual risk-register review with documented outputs. This is often a CC1-adjacent failure — if there’s no governance forum, the risk assessment never happens formally.
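An auditor-readable register doesn’t need tooling; a structured record with scoring, treatment decisions, and owners is enough. Here’s a minimal sketch in Python, assuming a simple likelihood-times-impact model (the field names, 1–5 scales, and sample entries are illustrative, not mandated by the TSC):

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class RiskEntry:
    """One row of a risk register: the fields auditors expect written down."""
    threat: str
    likelihood: int   # 1 (rare) to 5 (almost certain)
    impact: int       # 1 (negligible) to 5 (severe)
    treatment: str    # accept / mitigate / transfer / avoid
    owner: str
    last_reviewed: date

    @property
    def score(self) -> int:
        return self.likelihood * self.impact

def prioritize(register: list[RiskEntry]) -> list[RiskEntry]:
    """Highest-scoring risks first, so remediation effort follows risk."""
    return sorted(register, key=lambda r: r.score, reverse=True)

register = [
    RiskEntry("Insider fraud in billing", 2, 5, "mitigate", "CFO", date(2025, 1, 15)),
    RiskEntry("Unpatched public-facing service", 4, 4, "mitigate", "Head of Eng", date(2025, 1, 15)),
    RiskEntry("Laptop theft", 3, 2, "accept", "IT", date(2025, 1, 15)),
]

for risk in prioritize(register):
    print(f"{risk.score:>2}  {risk.threat} ({risk.treatment}, owner: {risk.owner})")
```

Sorting by score produces the remediation-priority output a governance forum can review at its defined cadence and sign off on, which also closes the CC1-adjacent gap noted above.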
CC4 — Monitoring Activities (2 sub-criteria)
Maps to COSO Principles 16–17. CC4 requires ongoing and periodic evaluation of whether controls are operating, plus communication of deficiencies to those responsible for taking corrective action.
What auditors test: Evidence of ongoing monitoring — automated alerts, control dashboards, periodic management review. A defined process for reporting and tracking control deficiencies through to resolution. Evidence that identified deficiencies are remediated within a reasonable timeframe.
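The “tracked through to resolution” requirement reduces to one question: is anything still open past its deadline? A minimal sketch, assuming a 30-day remediation SLA (your policy sets the real number) and an illustrative log format:

```python
from datetime import date, timedelta

# Assumed policy: deficiencies must be remediated within 30 days.
SLA = timedelta(days=30)

# Illustrative deficiency log; real entries come from your tracking system.
deficiencies = [
    {"id": "DEF-1", "opened": date(2025, 3, 1), "closed": date(2025, 3, 20)},
    {"id": "DEF-2", "opened": date(2025, 2, 1), "closed": None},
    {"id": "DEF-3", "opened": date(2025, 4, 10), "closed": None},
]

def overdue(items, today):
    """Deficiency IDs still open past the SLA: the list CC4 evidence
    should show being escalated to whoever owns corrective action."""
    return [d["id"] for d in items
            if d["closed"] is None and today - d["opened"] > SLA]

print(overdue(deficiencies, today=date(2025, 4, 15)))
```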
Why it’s important: CC4 is the category that compliance automation platforms (Vanta, Drata, Secureframe, Sprinto) directly help with — their continuous monitoring features generate the evidence auditors want.
CC5 — Control Activities (3 sub-criteria)
Maps to COSO Principles 10–12. CC5 is about selecting, developing, and deploying specific controls to address risks. It’s the “you have controls, and they’re the right controls” category.
What auditors test: Evidence that controls are selected based on the risk assessment (CC3), not arbitrarily. Documentation of policies and procedures that operationalize the controls. Evidence the controls cover technology general controls (logical access, change management, operations).
Why it usually passes: If CC1–CC3 are in order, CC5 follows naturally. Companies with weak governance and no risk assessment hit CC5 issues because their controls aren’t demonstrably tied to anything.
CC6 — Logical and Physical Access Controls (8 sub-criteria)
The largest category and the most commonly failed. CC6 covers identity management, authentication, authorization, monitoring of access, deprovisioning, and physical security. Eight sub-criteria, each independently tested.
What auditors test:
- User provisioning workflows with role-based access
- Multi-factor authentication, especially for privileged accounts and remote access
- Periodic access reviews (typically quarterly) with documented evidence of review
- Timely deprovisioning of terminated employees (samples will be pulled)
- Encryption of data in transit and at rest
- Network segmentation and firewall rules
- Physical access controls for any facilities in scope
- Logging and monitoring of access events
Why it fails: CC6 is the area where the gap between “we have a policy” and “we follow the policy” shows up most clearly. The auditor doesn’t ask if you have an offboarding policy — they pull a list of terminated employees from HR and check whether their access was actually removed within SLA. They don’t ask if you do access reviews — they ask for the records of the last four quarters. Failures here are typically operational, not policy-level.
Where to start: SSO + MFA on every system possible, automated deprovisioning on termination, quarterly access reviews with managed evidence (a ticket per system, not a single spreadsheet). Our readiness assessment checklist covers the CC6 evidence inventory in detail.
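The deprovisioning test is mechanical enough to automate. A hedged sketch, assuming you can export a termination list from your HRIS and an active-account list from your identity provider; the data shapes and the 24-hour SLA here are assumptions, not requirements:

```python
from datetime import datetime, timedelta

# Assumed inputs: in practice these come from your HRIS and SSO provider.
terminations = {
    "alice@example.com": datetime(2025, 3, 1, 17, 0),
    "bob@example.com": datetime(2025, 3, 10, 9, 0),
}
active_accounts = {"bob@example.com", "carol@example.com"}

DEPROVISION_SLA = timedelta(hours=24)  # assumed policy SLA

def deprovisioning_exceptions(terminations, active_accounts, now):
    """Terminated employees whose access outlived the SLA: the same
    sample-based test an auditor runs against CC6."""
    return sorted(
        email for email, term_time in terminations.items()
        if email in active_accounts and now - term_time > DEPROVISION_SLA
    )

print(deprovisioning_exceptions(terminations, active_accounts,
                                now=datetime(2025, 3, 12, 9, 0)))
```

Running a check like this on a schedule and filing the output turns the auditor’s sample test into evidence you generated first.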
CC7 — System Operations (5 sub-criteria)
CC7 covers the operational security disciplines: vulnerability management, security event detection, incident response, business continuity. Five sub-criteria, including some of the most evidence-intensive controls in SOC 2.
What auditors test:
- Vulnerability scanning at a defined cadence (typically continuous or weekly), with evidence of remediation tracking
- Annual penetration testing, with reports retained and findings remediated
- Security event detection — SIEM, alerting, log aggregation
- Documented incident response plan with evidence the plan has been tested (tabletop exercise records)
- Business continuity and disaster recovery procedures with test evidence
Why it fails: Plans exist on paper but are never tested. An incident response plan without a tabletop exercise record is a CC7 deficiency. Auditors specifically ask: “Show me when you last tested this.”
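Remediation tracking, the piece most often missing alongside raw scan output, can be sketched as an SLA check per severity. The SLA values and finding format below are illustrative; your vulnerability management policy defines the real ones:

```python
from datetime import date

# Assumed per-severity remediation SLAs in days.
SLA_DAYS = {"critical": 7, "high": 30, "medium": 90}

# Illustrative findings; real entries come from your scanner's export.
findings = [
    {"id": "VULN-A", "severity": "critical", "opened": date(2025, 4, 1), "fixed": None},
    {"id": "VULN-B", "severity": "high", "opened": date(2025, 3, 1), "fixed": date(2025, 3, 20)},
    {"id": "VULN-C", "severity": "medium", "opened": date(2025, 2, 1), "fixed": None},
]

def sla_breaches(findings, today):
    """Open findings past their severity SLA: the remediation-tracking
    evidence CC7 expects alongside the raw scan output."""
    return [f["id"] for f in findings
            if f["fixed"] is None
            and (today - f["opened"]).days > SLA_DAYS[f["severity"]]]

print(sla_breaches(findings, today=date(2025, 4, 20)))
```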
CC8 — Change Management (1 sub-criterion)
One sub-criterion but heavily weighted. CC8 covers authorized, tested, and approved changes to the production environment, including segregation of duties between development, testing, and deployment.
What auditors test: Change authorization workflow (a ticket or PR review system that requires approval before deploy). Evidence of testing before production deployment. Segregation of duties — the same engineer who writes code should not unilaterally deploy it to production. Documented rollback procedures. Sample of changes pulled and traced through the workflow.
Why it fails: Small engineering teams often have one engineer who writes, reviews, and deploys their own code. That’s a CC8 segregation-of-duties failure. The fix doesn’t require hiring: mandate code review with approval before merge, even if the reviewer is the only other engineer; for solo deploys, log the deployment with a documented rollback path. Evidence-of-process beats evidence-of-headcount.
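The three checks an auditor applies to each sampled change (an approval exists, the approver isn’t the author, testing happened) can be expressed directly. A sketch over an illustrative change-record format, not any particular ticketing system’s schema:

```python
def change_exceptions(changes):
    """Flag changes missing approval, where the author approved their own
    change, or missing test evidence: the three per-change CC8 checks."""
    problems = []
    for c in changes:
        if not c.get("approved_by"):
            problems.append((c["id"], "no approval"))
        elif c["approved_by"] == c["author"]:
            problems.append((c["id"], "author approved own change"))
        if not c.get("tested"):
            problems.append((c["id"], "no test evidence"))
    return problems

# Illustrative sample of pulled changes.
sample = [
    {"id": "PR-101", "author": "dana", "approved_by": "eli", "tested": True},
    {"id": "PR-102", "author": "dana", "approved_by": "dana", "tested": True},
    {"id": "PR-103", "author": "eli", "approved_by": None, "tested": False},
]

for change_id, issue in change_exceptions(sample):
    print(change_id, issue)
```

If your PR tooling already enforces approval-before-merge, a script like this only needs to confirm the rule held for the sampled period.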
CC9 — Risk Mitigation (2 sub-criteria)
CC9 covers two distinct topics in two sub-criteria: business risk mitigation strategies (CC9.1) and vendor and third-party risk management (CC9.2). Small in count, broad in scope.
What auditors test: A vendor inventory with risk classification. Evidence of vendor security review at onboarding (security questionnaires, SOC 2 reports of subservice organizations). Ongoing vendor monitoring — not just one-time onboarding. Documentation of business risk mitigation strategies (insurance, contractual provisions).
Why it usually passes: Most companies can produce a vendor list. Where it fails: ongoing monitoring beyond onboarding. Auditors look for evidence the vendor list is reviewed periodically, not just maintained.
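Periodic review is easy to evidence if the vendor inventory records a last-review date per risk tier. A sketch with assumed cadences (annual for high-risk vendors, biennial for medium); the tiers, intervals, and vendor names are illustrative:

```python
from datetime import date

# Assumed review cadences by risk tier, in days.
REVIEW_INTERVAL_DAYS = {"high": 365, "medium": 730}

vendors = [
    {"name": "CloudHost", "tier": "high", "last_review": date(2023, 1, 10)},
    {"name": "EmailTool", "tier": "medium", "last_review": date(2024, 6, 1)},
]

def stale_reviews(vendors, today):
    """Vendors whose last security review is older than their tier's cadence:
    the 'ongoing monitoring' gap most CC9.2 findings come from."""
    return [v["name"] for v in vendors
            if (today - v["last_review"]).days > REVIEW_INTERVAL_DAYS[v["tier"]]]

print(stale_reviews(vendors, today=date(2025, 4, 1)))
```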
Where Audits Actually Fail
Aggregating across published audit findings, the top categories where first-time SOC 2 audits show exceptions:
- CC6 (Access Controls) — deprovisioning lag, missing access review evidence, MFA gaps on legacy systems
- CC1 (Control Environment) — no formal governance, missing org structure documentation, no documented accountability
- CC7 (System Operations) — untested incident response, gaps in vulnerability management evidence, no tabletop exercise records
- CC8 (Change Management) — no segregation of duties, undocumented production changes, no testing evidence
- Policy staleness across categories — policies copied from templates, never reviewed, don’t match actual practice
Rough rule of thumb: spend two-thirds of your remediation effort on CC6, CC1, CC7, and CC8. The remaining categories typically pass with reasonable preparation.
Where to Start
Before you remediate, inventory which controls you already have. Most companies pursuing SOC 2 already have 40–60% of CC1–CC9 in some form — the gap analysis is what tells you where to invest. Our free SOC 2 gap assessment walks through all 33 Common Criteria with weighted scoring and a remediation timeline. The AICPA’s SOC 2 reporting guidance is the canonical source for the criteria themselves.
This article is general guidance, not legal or audit advice. The Trust Service Criteria are defined by AICPA TSP Section 100; engage a licensed CPA firm to evaluate your specific control environment.