
Field Note #5 - The importance of defining "Done"

  • Writer: Paul Wieser


In justice modernization, the definition and management of “done” is where projects pivot toward a good path - or run the risk of turning into theater. If “done” is vague, it is not only difficult to know what and how much work is left to do; we also inadvertently give teams license to ship mere activity and focus on noise (without necessarily getting closer to the end of the job). At the same time, we encourage leaders to report progress against a plan that is very likely inadequate.


“Done” has to be defined, versioned, constantly evaluated and enforced, and re-aligned as reality becomes known during the project. This starts in procurement, goes through the implementation, and extends all the way into operations.


The pattern we see in courts and justice agencies

It seems to me that people in justice modernization teams generally work hard. At the same time, projects fail because completion is defined in ways that do not match what the organization actually needs. Just consider:


  • Vendors deliver to contracted, documented scope, not to operational intent (!)

  • Agencies accept against schedule and maybe features, not against outcomes (since those are not well defined)

  • Teams declare done when unit and some degree of integration testing have passed, not when the system as a whole has been exercised by appropriate user proxies with the agency's data, under load, and in the environment it is intended to be used in

  • Status reports celebrate and glorify what moved, not what is truly finished


When the definition of done is weak, every downstream discipline is compromised: testing, training, cutover, support, SLAs, data quality, security, auditability, and with that, continued critical thinking and shared trust.


Why this matters to court leaders

Courts operate under statutory constraints, public scrutiny, time-critical proceedings, and institutional memory. If “done” is mis-defined:


  • You get silent scope gaps (“we assumed…”)

  • You get compliance on paper only (“the feature exists”) without reliable execution or planned usability (and yes, how about defining our specific meaning of seamless, easy-to-use, fast, etc.?)

  • You get late surprises that look like “resistance,” but are actually unmet expectations that never made it into the definition of done (OCM failure)

  • You inherit operational risk in the form of manual tracking, duplicated entry - or no entry at all since paper notes come back


What “done” actually is (in justice work)

Done = usable capability with evidence.

That means “done” is not a date, a demo, a document, or a sign-off meeting. It is a claim that must be backed by proof across these seven dimensions:

  1. Functional behavior (does the workflow do what the law and practice requires, what users expect (see my earlier field note #4 on change management), and what other stakeholders find useful, e.g. justice partners, parties and their families?)

  2. Data (is it correct, migrated, validated, and reportable?)

  3. Controls (security, permissions, logging, retention, CJIS/HIPAA where applicable)

  4. Performance & resilience (time to complete a step, process, or event - including the number of clicks, drill-downs, and screen changes required; uptime; recovery; performance under load during peak periods at the point of use)

  5. Operational fit (roles, handoffs, exception handling, backlogs, supervision)

  6. Adoption readiness (training, job aids, policy updates, coaching, staffing plan)

  7. Supportability (monitoring, tickets, defect SLAs, release cadence, rollback plan)


If any one of these is missing, you do not have “done.” You have "something".


The Do’s: how to establish “done” that sets up procurement, implementation, and operations for success


Do #1: Define “done” as a tiered progression model, not just a single statement

Use multiple levels; I recommend at least four (you can name them however you want):

  • Built (implemented - with version control and a re-use check, e.g. ensure that the same or similar configurations use the same configuration objects across different parts of the system)

  • Verified (tested with evidence - this needs both integration and regression testing)

  • Adoptable (sufficient staff is trained and willing + procedures are updated + staffing/resources are accounted for)

  • Operational (the system is monitored + supported + measured in real conditions; don't forget your performance definitions)


This instantly fixes a common trap: teams stop calling something done simply because it came out of a sprint - a key trap for fans of sprint-based progress reporting - or because it is shown as “working”.
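To make the tier idea concrete, here is a minimal sketch in Python. The tier names follow the four levels above; the capability names and the `remaining` helper are illustrative assumptions, not a prescribed tool:

```python
from enum import IntEnum

# Illustrative tier names; rename to match your own progression model.
class Tier(IntEnum):
    BUILT = 1        # implemented, under version control, re-use checked
    VERIFIED = 2     # tested with evidence (integration + regression)
    ADOPTABLE = 3    # staff trained and willing, procedures updated
    OPERATIONAL = 4  # monitored, supported, measured in real conditions

def is_done(tier: Tier) -> bool:
    """Only the highest tier counts as 'done'."""
    return tier == Tier.OPERATIONAL

# Hypothetical backlog: capability -> highest tier reached so far.
backlog = {
    "e-filing intake": Tier.VERIFIED,
    "warrant issuance": Tier.BUILT,
    "hearing scheduling": Tier.OPERATIONAL,
}

def remaining(backlog: dict) -> dict:
    """Everything below Operational is, by definition, still open."""
    return {name: tier.name for name, tier in backlog.items()
            if not is_done(tier)}
```

Reporting `remaining(backlog)` rather than a single done/not-done flag is what keeps a sprint demo from being mistaken for completion.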


Do #2: Make “done” object-level, not module-level

In justice organizations such as courts, value is usually delivered by objects that require a lead-up/progression/preparation and carry intent and obligations (orders, warrants, bonds, cases, hearings, parties, service events, deadlines). Define done per object and lifecycle, not just “minute taking now works.”


Do #3: Use procurement to force shared definitions

In your RFP and contract, require:

  • A Definition of Done framework (tiered!) proposed by vendor, accepted by agency

  • Acceptance criteria templates (per feature / workflow / object - a features sign-off sheet is not an acceptance criteria template)

  • A Requirements Traceability Matrix (RTM) that maps law/policy → workflow → test → training → acceptance

  • Explicit separation of configuration complete vs acceptance vs go-live readiness

  • A change-control mechanism that recognizes “unknowns becoming known” without throwing the project into chaos or blinding governance committees to the overall state and progression towards the outcome of the project.


This is “clarity before commitment” elevated to a contractual element.
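As a sketch of what an RTM can look like in data form, each row traces one requirement end to end, and any missing link is a gap you would otherwise accept blindly. All requirement IDs, workflow names, and test IDs below are hypothetical:

```python
# Minimal Requirements Traceability Matrix sketch (illustrative rows only).
# Each row maps law/policy -> workflow -> test -> training -> acceptance.
rtm = [
    {"requirement": "REQ-001 service-of-process deadline",
     "workflow": "service-event entry",
     "test": "TC-014",
     "training": "job-aid-07",
     "acceptance": "accepted"},
    {"requirement": "REQ-002 sealed-case access control",
     "workflow": "case security",
     "test": None,          # gap: no test evidence yet
     "training": "job-aid-02",
     "acceptance": None},   # gap: never formally accepted
]

def untraced(rtm: list) -> list:
    """Requirements missing any link in the chain - these are blind spots."""
    links = ("workflow", "test", "training", "acceptance")
    return [row["requirement"] for row in rtm
            if any(row[link] is None for link in links)]
```

A governance committee that sees `untraced(rtm)` shrink toward empty is watching acceptance evidence accumulate, not just activity.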


Do #4: Bake “done” into SLAs in a way that matches justice reality

SLAs should not only measure ticket response time. Also include:

  • Time-to-stabilize after releases

  • Defect escape rate (issues found after major milestones, e.g. post-go-live, that should have been caught earlier)

  • Backlog health (age distribution of incidents/problems)

  • Data correction cycle time

  • Reporting accuracy for statutory / leadership reporting


Done in operations means “measurably stable.”
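The arithmetic behind two of these metrics is simple enough to sketch. The milestone labels, incident dates, and the fixed "today" below are illustrative assumptions:

```python
from datetime import date

# Hypothetical defect log: where in the lifecycle each defect surfaced.
defects = [
    {"id": "D-01", "found": "pre-go-live"},
    {"id": "D-02", "found": "pre-go-live"},
    {"id": "D-03", "found": "post-go-live"},
    {"id": "D-04", "found": "pre-go-live"},
]

def defect_escape_rate(defects: list) -> float:
    """Share of defects that escaped past the milestone meant to catch them."""
    escaped = sum(1 for d in defects if d["found"] == "post-go-live")
    return escaped / len(defects)

# Hypothetical open-incident dates for backlog health.
open_incidents = [date(2024, 1, 5), date(2024, 2, 20), date(2024, 3, 1)]

def backlog_age_days(incidents: list, today: date = date(2024, 3, 11)) -> list:
    """Age distribution of open incidents in days, oldest first."""
    return sorted(((today - opened).days for opened in incidents), reverse=True)
```

A backlog whose age distribution keeps growing at the old end, or an escape rate that rises after each release, are both signals that an earlier “done” was not.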


Do #5: Make “what is left” the center of status reporting

Every status report should answer:

  • What is not yet done, by tier (Built / Verified / Adoptable / Operational, or whatever your tiers are)

  • What is blocked, and by whom

  • What is at risk, and what decision is needed

  • What changed in understanding (new facts, new constraints), who agrees/disagrees and why, and what that does to scope/timeline


One more thing - provide quick feedback channels that are not interpreted as delay tactics or immediately escalated to contractual proceedings. Insist on feedback and make giving it easy!


Do #6: Treat “done” as a living agreement as details emerge

At any time during contracting and, in particular, during implementation, reality shows up: local rules, judge preferences, legacy data quirks, partner interfaces, staffing constraints, security constraints.


Your governance stance should be:

  • “We will discover those details.” Meaning we are prepared to listen and are open to items that may have an impact on the project.

  • “When we discover details, we will update the definition of done and update commitments accordingly.”

  • “We won’t sweep discovery under the rug and pretend the plan didn’t change.”


Organizational Change Management (OCM) becomes real that way. People lose trust very quickly when leadership pretends nothing changed or does not acknowledge the potential for change.


Do #7: Create a “Done Authority” for acceptance

Acceptance should not be a vague committee moment. Assign a small accountable group (business, technical, operations) that owns:

  • Acceptance evidence standards

  • Final acceptance decisions

  • Any dispute resolution

  • The power of saying “no, not yet”


That is how you prevent premature sign-off.


The Don’ts: the failure modes we see over and over

I will be brief here.


Don’t #1: Don’t treat change discovery as “scope creep” by default

In justice modernization projects, many “changes” actually turn out to be previously unrecognized and unmodeled realities. Calling them “scope creep” is not only wrong; it is how you punish honesty and incentivize hiding problems.


Don’t #2: Don’t let “done” mean “we demoed/showed it”

A demo is not evidence. Whether we like it or not, a demo is still a performance.


Don’t #3: Don’t accept “done” without test artifacts and traceability

If you cannot trace a requirement (legal, policy, operational, etc.) to test results and acceptance, you are accepting risk blindly.


Don’t #4: Don’t define "done" only in technical terms

“Code/configuration complete” is irrelevant if clerks cannot process exceptions or do a cost bill on the fly, judges cannot trust calendars or quickly see which party from yesterday's proceedings has submitted their proposed order, and supervisors cannot reconcile reports or effectively plan for the deployment of the resources needed to support next week's dockets.


“Code/configuration complete” is a popular milestone on many project plans, particularly if the underlying system is meant to be configurable. I am sure you can come up with similar business scenarios for non-courts, e.g. prosecutors, defenders, law enforcement, etc.


Don’t #5: Don’t let the schedule become the acceptance criterion

Dates do matter. But accepting incomplete capability to “stay on schedule” creates the illusion of control and guarantees operational debt. Also, always assume that an additional iteration inserted to complete something WILL have consequences for what is left to do to get to done.


Don’t #6: Don’t let “green” status hide unfinished tiers

A feature can be “green” at Built and “red” at Adoptable. If you only show one color, you’re lying to yourself.


Don’t #7: Don’t separate operations from, or leave it out of, the definition of done

If training, policy updates, staffing adjustments, and supervisory routines are not part of or reflected in “done,” then adoption relies on a degree of wishful thinking and heroic efforts.


How to use “done” across the lifecycle

In procurement (RFP, contract, SLAs)

  • Require tiered done + evidence standards

  • Require RTM + acceptance templates

  • Tie payment milestones to Verified/Adoptable, not just Built

  • Design SLAs to include stability, defect escape, and reporting accuracy


In implementation

  • Track each backlog item by tier (not just “pending / in dev / done”)

  • Use “definition of done drift” as a standing agenda item: what did we learn that changes acceptance?

  • Run governance on what’s left, not what was performed this week


In operations

  • Redefine “done” as stable and supportable

  • Treat incidents/problems as signals that prior “done” was incomplete

  • Use release readiness gates aligned to tiers (especially for Verified and Operational)


What to look for (field indicators)

You are in trouble when you hear:

  • “It’s basically done.”

  • “Users will adapt.”

  • “We will fix that after go-live.”

  • “We met the requirement” (but can’t show how it performs end-to-end)

  • “Change management will handle that.”


... also when "we" is never used to designate the whole implementation team - vendor + agency (!)


You’re in good hands when you hear:

  • “It’s Built, but not Verified yet.”

  • "The feature is there but we need to review whether it's practical to use."

  • “Here’s the evidence, here’s the traceability.”

  • “Adoption readiness is the constraint on this item — here’s what we need to do.”

  • “Our understanding changed, and we agree/disagree; and we need a decision.”


The executive perspective shift

I do not necessarily think leaders need more dashboards. They need a different mental model:

  • Done is not a point in time. It’s a standard of proof.

  • Unplanned discovery is not failure. Pretending discovery does not exist (outside of designated discovery windows in the plan) or did not happen is failure.

  • OCM is not “after.” It is part of acceptance.

  • Status is not “what happened.” Status is “what remains, what is blocked, and what decisions are needed to get to the end of the job.”


Now, jot down a few thoughts and share them with your teams, and if you can - also with me. I am glad to provide additional nuance or advice on practical, situational measures that help.

 
 
 
