
Rule Audit Checklist (RAC)

Purpose

The Rule Audit Checklist (RAC) provides a structured way to evaluate how well a project or delivery system adheres to the Stable Rules Layer (SRL).
It serves as a diagnostic and learning instrument, helping Project Leads, Product Managers, Architects, and Delivery Directors identify where the SDLC system is coherent — and where it may be drifting.

RAC turns the abstract SRL principles into practical, observable checks that can be applied across any delivery model, whether fixed-bid, time-and-material, or co-delivery.
It can be used for self-assessment, peer review, or formal audit, depending on project maturity and organizational needs.

RAC in the 3SF Structure

| Layer | Purpose |
| --- | --- |
| SRL (Stable Rules Layer) | Defines the universal principles that keep delivery coherent. |
| RAC (Rule Audit Checklist) | Translates those principles into assessable questions. |
| CRC (Contextual Rules Catalog) | Suggests contextual adaptations when rules need to flex. |

The RAC’s job is not compliance policing — it is delivery sense-making.
It helps leaders interpret project behavior and choose improvement focus areas based on systemic signals.

Using the RAC

  1. Select a project or program — ideally after one full delivery iteration (e.g., quarter or milestone).
  2. Interview or survey participants from both Client and Vendor sides (Product Manager, Project Lead, Tech Lead, QA, Architect, Stakeholder).
  3. Score each rule from 1–4 based on observed behavior.
  4. Discuss evidence and next actions — RAC is most effective as a conversation, not a checklist.
  5. Summarize results in a visual radar or heatmap (rules vs. maturity) for transparency; a minimal scoring sketch follows this list.
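Steps 3–5 lend themselves to light automation. The sketch below is illustrative only; the names `META_GROUPS`, `rule_averages`, and `heatmap` are hypothetical, and the R1–R4 / R5–R8 / R9–R12 family grouping is the one used later in this document. It averages participant scores per rule and renders a crude text heatmap.

```python
from statistics import mean

# Rule families as used throughout this document:
# R1-R4 Alignment, R5-R8 Flow, R9-R12 Learning.
META_GROUPS = {
    "Alignment": ["R1", "R2", "R3", "R4"],
    "Flow": ["R5", "R6", "R7", "R8"],
    "Learning": ["R9", "R10", "R11", "R12"],
}

def rule_averages(scores: dict[str, dict[str, int]]) -> dict[str, float]:
    """Average the 1-4 scores given by all participants for each rule."""
    return {rule: mean(by_person.values()) for rule, by_person in scores.items()}

def heatmap(avg: dict[str, float]) -> None:
    """Render a crude text 'heatmap': one bar per rule, grouped by family."""
    for group, rules in META_GROUPS.items():
        print(f"{group}:")
        for rule in rules:
            score = avg.get(rule, 0.0)
            bar = "#" * round(score * 2)
            print(f"  {rule:>3}  {score:.1f}  {bar}")

# Example: two participants scoring two of the twelve rules.
heatmap(rule_averages({"R1": {"PM": 3, "TechLead": 4}, "R5": {"PM": 2, "QA": 2}}))
```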

Scoring Scale

| Score | Meaning | Interpretation |
| --- | --- | --- |
| 1 – Reactive | Rule often violated or ignored. | Delivery unstable, reactive management. |
| 2 – Defined | Rule recognized but inconsistently applied. | Awareness exists, needs reinforcement. |
| 3 – Embedded | Rule consistently applied and measured. | Predictable delivery, learning in motion. |
| 4 – Adaptive | Rule instinctively followed and improved. | Self-correcting system; maturity embedded. |
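Where RAC data lives in a shared tool, this scale maps naturally onto a small enumeration. A minimal sketch (the class name `RacScore` is hypothetical):

```python
from enum import IntEnum

class RacScore(IntEnum):
    """The four RAC maturity levels from the scoring scale above."""
    REACTIVE = 1   # rule often violated or ignored
    DEFINED = 2    # recognized but inconsistently applied
    EMBEDDED = 3   # consistently applied and measured
    ADAPTIVE = 4   # instinctively followed and improved

print(RacScore(3).name)  # -> EMBEDDED
```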

RAC Template

| Rule ID | Rule Name | Audit Question(s) | Observed Evidence / Notes | Score (1–4) | Next Action / Owner |
| --- | --- | --- | --- | --- | --- |
| R1 | Clarity before Commitment | Do all team members understand project goals, value, and success criteria before work begins? | | | |
| R2 | One System of Truth | Is there a single shared source of information (scope, status, risks) between Client and Vendor? | | | |
| R3 | Decisions are Transparent and Reversible | Are key decisions visible, documented, and revisited when assumptions change? | | | |
| R4 | Progress is Measured by Outcomes, not Output | Are outcomes and KPIs used to measure progress instead of just deliverables? | | | |
| R5 | Quality is Built In, not Inspected In | Are testing and validation integrated into development rather than deferred? | | | |
| R6 | Flow Requires Limits | Are WIP, dependencies, and task switching actively managed to maintain throughput? | | | |
| R7 | Risk Shared is Risk Reduced | Are risks logged, owned jointly, and addressed collaboratively? | | | |
| R8 | Change is a Continuous Signal, not an Exception | Are scope or plan changes processed through adaptive governance, not escalation? | | | |
| R9 | Feedback Completes the Flow | Are feedback loops between users, systems, and teams continuous and acted upon? | | | |
| R10 | Governance Enables, not Controls | Does governance accelerate decision-making and empower teams rather than slow them down? | | | |
| R11 | Transparency Scales Trust | Are delivery data, metrics, and plans visible to all stakeholders? | | | |
| R12 | Learning is the Only Sustainable Advantage | Are lessons learned documented and transformed into actions or system changes? | | | |

(Recommended: keep this table as a shared document or dashboard updated quarterly.)
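One way to keep the template as a shared, dashboard-friendly artifact is a machine-readable form. A minimal sketch, with hypothetical field names and only the first two rules spelled out:

```python
# Minimal machine-readable RAC template (field names are hypothetical).
RAC_TEMPLATE = [
    {
        "id": "R1",
        "name": "Clarity before Commitment",
        "question": ("Do all team members understand project goals, value, "
                     "and success criteria before work begins?"),
        "evidence": "",  # observed evidence / notes, filled in during the audit
        "score": None,   # 1-4 once assessed
        "owner": None,   # next action / owner
    },
    {
        "id": "R2",
        "name": "One System of Truth",
        "question": ("Is there a single shared source of information "
                     "(scope, status, risks) between Client and Vendor?"),
        "evidence": "",
        "score": None,
        "owner": None,
    },
    # ...R3-R12 follow the same shape.
]
```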

Interpreting RAC Results

1. Identify Systemic Weaknesses

  • Low scores across several rules in the same meta-group (e.g., Flow or Learning) indicate systemic imbalance.
  • For example, strong Alignment but weak Flow may suggest bureaucracy or slow adaptation.

2. Analyze Cross-Role Perspectives

  • Compare Client vs. Vendor perceptions — maturity gaps often reveal trust or communication issues.
  • Different scores for the same rule are valuable, not errors: they highlight where transparency or expectations differ (a gap-report sketch follows this list).
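A minimal sketch of such a gap report, assuming hypothetical per-side score dictionaries and a `gap_report` helper:

```python
def gap_report(client: dict[str, float], vendor: dict[str, float],
               threshold: float = 1.0) -> list[str]:
    """Flag rules where Client and Vendor scores diverge by at least `threshold`."""
    flagged = []
    for rule in sorted(client.keys() & vendor.keys()):
        gap = abs(client[rule] - vendor[rule])
        if gap >= threshold:
            flagged.append(f"{rule}: client={client[rule]}, vendor={vendor[rule]} (gap {gap:.1f})")
    return flagged

# Example: R2 shows a perception gap worth raising in the workshop.
print(gap_report({"R1": 3, "R2": 2}, {"R1": 3, "R2": 4}))
# -> ['R2: client=2, vendor=4 (gap 2.0)']
```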

3. Track Maturity Over Time

  • Conduct RAC every 3–6 months.
  • Visualize progress using radar charts grouped by Alignment / Flow / Learning rule families.
  • Celebrate upward trends; investigate stagnation or regression (a minimal trend sketch follows this list).
  • Each low-scoring rule becomes an improvement opportunity.
  • Select 2–3 rules per cycle for focused improvement, linking them to measurable actions in the delivery plan.
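Tracking maturity over cycles can be as simple as keeping per-family averages per audit and classifying the latest movement. A sketch, assuming a hypothetical `trend` helper and an in-memory audit history:

```python
def trend(history: list[dict[str, float]], family: str,
          epsilon: float = 0.1) -> str:
    """Classify the last two audits for one rule family (Alignment/Flow/Learning)."""
    if len(history) < 2:
        return "insufficient data"
    prev, last = history[-2][family], history[-1][family]
    if last - prev > epsilon:
        return "improving"
    if prev - last > epsilon:
        return "regressing: investigate"
    return "stagnant: check for measurement without learning"

# Example: two quarterly audits; Flow is flat, which warrants a closer look.
audits = [{"Alignment": 2.8, "Flow": 2.3}, {"Alignment": 3.0, "Flow": 2.3}]
print(trend(audits, "Flow"))  # -> stagnant: check for measurement without learning
```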

RAC Example Summary Report

| Meta-Group | Rules | Average Score | Interpretation | Suggested Focus |
| --- | --- | --- | --- | --- |
| Alignment | R1–R4 | 3.0 | Clear purpose and measurable outcomes. | Maintain stakeholder alignment. |
| Flow | R5–R8 | 2.3 | Quality and risk processes under stress. | Strengthen automation and WIP control. |
| Learning | R9–R12 | 1.8 | Feedback loops and improvement culture weak. | Prioritize retrospectives and shared metrics. |
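A report like this can be derived directly from the per-rule averages. A minimal sketch reusing the family grouping from the earlier scoring sketch; the per-rule scores below are invented purely to reproduce the example figures:

```python
from statistics import mean

def family_summary(avg_by_rule: dict[str, float],
                   groups: dict[str, list[str]]) -> dict[str, float]:
    """Collapse per-rule averages into one score per meta-group."""
    return {group: round(mean(avg_by_rule[r] for r in rules), 1)
            for group, rules in groups.items()}

print(family_summary(
    {"R1": 3, "R2": 3, "R3": 3, "R4": 3,        # Alignment
     "R5": 2, "R6": 2, "R7": 3, "R8": 2.2,      # Flow
     "R9": 2, "R10": 2, "R11": 1.5, "R12": 1.7},  # Learning
    {"Alignment": ["R1", "R2", "R3", "R4"],
     "Flow": ["R5", "R6", "R7", "R8"],
     "Learning": ["R9", "R10", "R11", "R12"]},
))
# -> {'Alignment': 3.0, 'Flow': 2.3, 'Learning': 1.8}
```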

RAC Application in Practice

Use the RAC to:

  • Facilitate quarterly health checks for ongoing projects.
  • Support retrospectives at delivery milestones.
  • Align Client–Vendor steering discussions around facts, not perceptions.
  • Provide input to the Contextual Rules Catalog (CRC) for tailored playbooks.
  • Serve as a baseline measurement before initiating major process or relationship changes.

RAC Facilitation Tips

  • Conduct as a facilitated workshop, not a survey — dialogue matters more than scores.
  • Keep each rule discussion time-boxed (~10–15 minutes).
  • Encourage evidence sharing (artifacts, dashboards, meeting notes).
  • End with 3 actionable insights per meta-group (Alignment, Flow, Learning).
  • Visualize results — transparency itself reinforces trust (R11).

RAC and Relationship Maturity

Rule adherence mirrors relationship maturity:

  • Transactional (1–2): Rules applied inconsistently; focus on compliance and reporting.
  • Aligned Autonomy (2–3): Rules accepted and measured; governance collaborative.
  • Strategic Partnership (3–4): Rules instinctive; system self-corrects via feedback and trust.

Thus, the RAC not only measures system health — it measures partnership evolution.
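Because the bands above overlap at their boundaries (1–2, 2–3, 3–4), any automated mapping has to pick cutoffs. The thresholds in this sketch, like the function name `relationship_stage`, are assumptions rather than part of the RAC itself:

```python
def relationship_stage(overall_avg: float) -> str:
    """Map an overall RAC average (1-4) onto the maturity bands above.
    The exact cutoffs are an assumption; the published bands overlap."""
    if overall_avg >= 3.0:
        return "Strategic Partnership"
    if overall_avg >= 2.0:
        return "Aligned Autonomy"
    return "Transactional"

print(relationship_stage(2.4))  # -> Aligned Autonomy
```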

Systemic Failure: The Maturity Mirage

A common risk in mature organizations is the illusion of progress — performing RAC reviews, retrospectives, and rule discussions as rituals while the underlying mindset remains unchanged.
This maturity mirage creates dashboards that look healthy but conceal disengagement, blame avoidance, or shallow trust.
True maturity is evidenced by visible change in behavior and decision quality, not by consistently high scores.
When results stay static despite visible activity, it signals measurement without learning — the system has optimized for appearance, not evolution.

Summary

  • The Rule Audit Checklist (RAC) operationalizes the 12 Stable Rules into measurable behaviors.
  • It enables reflective, evidence-based dialogue about delivery health and maturity.
  • RAC results guide improvement focus, governance calibration, and contextual rule selection (CRC).
  • Used regularly, RAC turns principles into practice — ensuring the delivery system remains transparent, adaptive, and trusted.