R.I.V.E.R. Reference / V0.1 / April 2026

The vocabulary, defined once.

The canonical definitions for R.I.V.E.R. terminology. Every other piece of R.I.V.E.R. content may use these terms without redefining them inline.

01 / Foundational vocabulary

The shared baseline.

The terms a R.I.V.E.R. document uses without elaboration. Most are not novel to the framework; the framework asserts how they are scoped and combined.

Cohort

The group of users to whom a release is exposed. In R.I.V.E.R., adoption and impact are always measured within the exposed cohort, not the total user base, because the exposed cohort is the only honest baseline in a progressive-release world. Cohorts may be defined by segment, percentage, tier, geography, customer class, or experimental arm.

Deploy

The act of placing built code on production infrastructure. Deploy makes code reachable; it does not, on its own, make a feature visible to users. R.I.V.E.R. treats the separation of deploy from release as foundational.

DORA

The longitudinal research program behind the annual State of DevOps reports, begun in collaboration with Puppet and now run at Google Cloud, that produced the four metrics (deployment frequency, lead time for changes, change failure rate, time to restore service) used to measure the commit-to-deploy segment of the value chain. R.I.V.E.R. adopts the DORA metrics as-is for the segment they cover; there is no R.I.V.E.R. metric that replaces a DORA metric.

Feature flag

A runtime control that determines whether a piece of deployed code is active for a given user or cohort. Feature flags are the primary mechanism by which deploy and release became separable events. R.I.V.E.R. is tool-neutral about the platform that manages flags.

Hypothesis

In R.I.V.E.R., a specific claim made before a release ships about what the release will cause to happen when users encounter it. A hypothesis is part of the release intent and is evaluated against the declared success signal after the release runs. It is not a wish or a goal; it is a falsifiable prediction recorded in advance.

Outcome attribution

The act of tying observed business or user-behavior movement to a specific release. In R.I.V.E.R., outcome attribution is enabled by the release intent's declared success signal, target cohort, and time horizon. The fraction of releases that can be tied to measured outcome movement after the fact is itself a R.I.V.E.R. metric (Outcome Attribution Rate).

Philosophy-coupled

A property of R.I.V.E.R.: the framework is bound to a specific way of practicing release (progressive, targeted, reversible, experiment-aware, cross-functionally owned). An organization that rejects these commitments is not a R.I.V.E.R. candidate. The framework's philosophy-coupling is deliberate; it is what defines R.I.V.E.R.'s scope honestly.

Progressive release

A release model in which functionality is exposed to users in stages, across cohorts, over time, with explicit control at each stage. Contrasted with binary release, where a feature is shipped to all users in a single event.

Release

The act of making deployed code visible to a user or cohort. In modern release practice, release is distinct from deploy and is the unit at which exposure, adoption, and impact are measured. R.I.V.E.R.'s seven metric families are scoped to release events, not deploy events.

Reversible release

A release that can be pulled back in seconds, without a redeploy, by changing flag state or targeting rules. Reversibility changes the definition of release failure: a failed release is a release that degraded a metric and had to be reversed, not a bad deploy that had to be rolled back.

R.I.V.E.R.

A framework for measuring and operating the full value chain of software delivery, from the idea that prompted the work to the impact the work produces. R.I.V.E.R. encompasses DORA as the reference instrumentation for the commit-to-deploy segment and extends measurement and discipline across release, adoption, and impact. It is built for organizations that practice progressive and reversible release.

Success signal

The metric, direction, magnitude, and time window declared in advance that will tell the team whether a release's hypothesis held. The success signal is a component of the release intent. It is fixed at declaration and cannot be revised after the release runs.
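
How a declared signal might be checked once its window closes can be sketched as below. The direction-plus-magnitude encoding is an assumption for illustration; R.I.V.E.R. prescribes no schema.

```python
def signal_held(direction: str, magnitude: float, observed_delta: float) -> bool:
    """Return True if the observed metric movement satisfies a success signal
    declared in advance as a direction plus a minimum magnitude.
    (Illustrative encoding; the framework is tool-neutral about the schema.)"""
    if direction == "up":
        return observed_delta >= magnitude   # e.g. +2 points declared, +3 observed
    if direction == "down":
        return observed_delta <= -magnitude  # e.g. ticket volume must fall by 2 points
    raise ValueError(f"unknown direction: {direction!r}")
```

The point of the fixed declaration is that this check has exactly one honest answer after the window: the inputs were committed before the release ran.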

Tool-neutral

A property of R.I.V.E.R.: the framework can be instrumented on any combination of platforms and internal systems that produce the required data. Some platforms make instrumentation materially easier than others; none is required. Tool-neutrality is modeled on DORA's portability across CI systems, deployment tools, and clouds.

Value chain

The six-stage spine of software delivery as R.I.V.E.R. names it: idea, commit, deploy, release, adoption, impact. DORA measures commit-to-deploy. R.I.V.E.R. extends measurement to release, adoption, and impact. The value chain appears as a flow-line on every R.I.V.E.R. artifact.

Value-realization

The deepest of the three adoption layers. Value-realization asks whether a user has completed the action the feature was built to enable, defined per-intent by the declared success signal. It is the primary R.I.V.E.R. adoption metric and is measured within the exposed cohort.

02 / Release intent — thesis VII

The unit of analysis.

The central abstraction of R.I.V.E.R. The artifact that does the operating work. Defined once, here.

Release intent

A declared, structured artifact created before a release begins. A release intent is the unit of analysis in R.I.V.E.R.: every metric is scoped to an intent, and team-level, product-line-level, and organization-level rollups aggregate over intents, not over deploys, flags, tickets, or story points.

A release intent answers five questions, recorded before the work is exposed and not revised after the fact:

Type: What kind of release is this?
Hypothesis: What do we believe will happen when users encounter this?
Success signal: What metric, direction, magnitude, and window will tell us the hypothesis held?
Target cohort: Who is this for, and how will exposure progress through that cohort?
Horizon: By when do we expect to know?
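
As an illustration, the five questions can be recorded as a declared, immutable artifact. All names and field shapes below are hypothetical; the framework prescribes no schema or tooling.

```python
from dataclasses import dataclass
from enum import Enum

class IntentType(Enum):
    # The six intent types defined in section 04.
    GROWTH = "growth"
    RETENTION_ENGAGEMENT = "retention_engagement"
    MONETIZATION = "monetization"
    EXPERIENCE_QUALITY = "experience_quality"
    PLATFORM_ENABLEMENT = "platform_enablement"
    RISK_REDUCTION = "risk_reduction"

@dataclass(frozen=True)  # frozen: a declared signal is fixed, not revised after the fact
class SuccessSignal:
    metric: str        # which metric will move
    direction: str     # "up" or "down"
    magnitude: float   # how much movement counts as the hypothesis holding
    window_days: int   # the evaluation window

@dataclass(frozen=True)  # frozen: an intent is recorded before exposure, then fixed
class ReleaseIntent:
    intent_type: IntentType        # 1. what kind of release is this?
    hypothesis: str                # 2. what do we believe will happen?
    success_signal: SuccessSignal  # 3. what will tell us the hypothesis held?
    target_cohort: str             # 4. who is this for?
    horizon_days: int              # 5. by when do we expect to know?
```

The frozen dataclasses encode the "not revised after the fact" rule directly: once declared, an intent cannot be mutated.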
03 / The five commitments — thesis VI

The worldview R.I.V.E.R. is built on.

The five claims about how release should be practiced. Descriptive of where the industry's most capable organizations already operate; prescriptive for organizations moving toward that practice. Not negotiable within the framework.

01
Release is progressive and controlled.
A release is a staged exposure of functionality across cohorts over time, with explicit control at each stage. The question is not whether a release happened, but where it has reached and what it has produced there.
02
Targeting is the unit of control.
Release is to a segment, a percentage, a tier, a geography, a customer class, or an experimental arm. "Production" is a destination; it is not the unit at which release decisions are made. Targeting rules are explicit, versioned, and recoverable.
03
Reversibility is native.
A release can be pulled back in seconds without a redeploy. The cost of trying a release is approximately zero; the cost of being wrong is the time spent in the degraded state.
04
Experimentation is intrinsic to release.
The same mechanism that exposes a feature can measure whether it worked. Experimentation is not a separate discipline layered on top of release; it is a capability of release itself.
05
Release is a product decision, not just an engineering one.
Targeting rules, success criteria, and cohort definitions are co-owned by product and engineering, with operations accountable for the guardrails. R.I.V.E.R. formalizes that agreement into a declared artifact.
04 / The six intent types — thesis VIII

Each measured against its own standard.

R.I.V.E.R. types each release at declaration time into one of six categories. A Platform/Enablement release that makes the next five Growth releases thirty percent faster is a good release; the framework has to be able to say so without distorting itself.

Type | What it pursues | Success looks like
Growth | Acquire, activate, or convert users. | Movement in funnel metrics within the exposed cohort.
Retention / Engagement | Deepen usage among existing users. | Frequency, depth, or stickiness in the exposed cohort.
Monetization | Expand revenue per user or unlock new revenue. | Revenue metrics in the exposed cohort.
Experience / Quality | Improve the existing experience. | Satisfaction, ticket reduction, task completion.
Platform / Enablement | Make future work faster, safer, or more reliable. The user is another engineer or team. | Downstream velocity, incident rate, internal adoption.
Risk Reduction | Compliance, security, resilience. | Reduction in exposure, incident rate, or audit findings.
05 / The seven metric families — thesis X

Mapped to the value chain.

R.I.V.E.R. organizes its metrics into seven families that correspond to stages of the value chain. The asymmetry with DORA's four is intentional. The named representative metrics are version-one candidates, subject to empirical refinement.

Family | What it measures | Representative metric
Exposure | How long deployed code sits unreleased, and how quickly it moves from deploy to first user exposure. | Feature Dark Time
Cohort Progression | How exposure moves through its target cohort, and whether cohorts advance smoothly or are blocked. | Rollout Velocity, Graduation Smoothness
Reversal | How often releases are reversed through kill-switches, flag-offs, or targeting rollbacks. The change-failure analog DORA misses: to DORA, a clean deploy followed by a kill-switched release looks like a success. | Release Reversal Rate
Guardrail | How often automated guardrails paused or reversed a release in response to a monitored metric. A maturity marker. | Guarded Release Activation Rate
Experiment-Linked | How systematically releases are tied to declared hypotheses and to outcome movement after the fact. These metrics measure the adoption of R.I.V.E.R. itself. | Hypothesis Attachment Rate, Outcome Attribution Rate
Adoption | First-use, sustained-use, and value-realization within the exposed cohort. Defined per-intent. | Value-Realization Rate
Impact | Whether adopted features moved the business metrics they were built to move. The framework's terminal measurement. | Release-to-KPI Lead Time, Outcome Realization Rate
06 / The five maturity levels — thesis XI

Teams climb into them.

The ladder is a language, not a ranking. An organization that can say "we are at Control on most teams and Declare on two pilot teams" has a more tractable conversation about what to invest in next than one that can only say "we are working on our metrics."

# | Level | Tag | Description
1 | Deploy | Delivery hygiene | The team ships reliably on a DORA foundation. Deploy and release may still happen together. Level 1 is the foundation of a good release practice, not a deficient state to be escaped.
2 | Control | Deploy ≠ release | Deploy and release are separate events. Rollouts are progressive. Cohort targeting is used. A release can be reversed in seconds without a redeploy. Most organizations with feature-flag platforms land here without deliberate work beyond tool installation.
3 | Declare | Intent, ahead of ship | Before a release ships, the team states what it expects to happen: hypothesis, success signal, target cohort, time horizon. The hardest transition in the ladder, because it is where measurement stops being instrumentation and starts being ceremony.
4 | Prove | Systematic attribution | Outcome attribution is systematic. Most releases carry a declared success signal and are evaluated against it. Not every hypothesis will hold; the record's credibility depends on its honesty.
5 | Learn | The compounding loop | Evidence from realized and unrealized intents feeds the next planning cycle. Predictions about future releases sharpen over time, not just measurements of past ones. Where the Evolution in R.I.V.E.R. lives. Rare in current industry practice.
07 / The three adoption layers — thesis IX

Adoption is three signals, not one.

All three are measured within the exposed cohort, not the total user base, because that is the only honest baseline in a progressive-release world.

First-use / fast, weak signal
Whether exposed users have touched the feature at all. Useful mainly for catching discoverability problems. A high first-use rate with low sustained-use means users found the feature and walked away.
Sustained-use / workflow signal
Whether the feature is used repeatedly over weeks. Tells the team that the feature has a place in real workflows, not only that it was discovered once.
Value-realization / primary adoption metric
Whether the user has completed the action the feature was built to enable, defined per-intent by the declared success signal. Value-realization is the primary R.I.V.E.R. adoption metric; first-use and sustained-use exist mainly to contextualize it.
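
Under the assumption that each layer can be reduced to a set of user ids (an illustrative shape, since the qualifying events are defined per-intent), the three rates over the exposed cohort can be sketched as:

```python
def adoption_layers(exposed: set, first_used: set, sustained: set, value_realized: set) -> dict:
    """Three adoption rates, each measured against the exposed cohort only,
    never the total user base. Inputs are sets of user ids (hypothetical shape)."""
    n = len(exposed)
    if n == 0:
        return {"first_use": 0.0, "sustained_use": 0.0, "value_realization": 0.0}
    return {
        # intersect with exposed so users outside the cohort can never inflate a rate
        "first_use": len(first_used & exposed) / n,
        "sustained_use": len(sustained & exposed) / n,
        "value_realization": len(value_realized & exposed) / n,
    }
```

A high first-use rate with a low value-realization rate is the "found it and walked away" pattern the layered view exists to expose.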
08 / Representative metrics

Named metrics, defined.

The representative metrics named across the seven families, listed alphabetically. Specific thresholds and benchmarks are products of the research program, not the framework definition.

Feature Dark Time

Family: Exposure. The duration between a deploy and the first user exposure of the feature it delivered. Reveals the gap between commit-to-deploy completion and the beginning of release that DORA cannot see.
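
A minimal sketch of the computation, assuming timestamps are available from the deploy pipeline and from a flag platform's first-exposure event (both sources hypothetical; the framework is tool-neutral):

```python
from datetime import datetime, timedelta

def feature_dark_time(deployed_at: datetime, first_exposed_at: datetime) -> timedelta:
    """Duration a feature sat deployed but unreleased: the gap DORA's
    commit-to-deploy metrics stop short of."""
    if first_exposed_at < deployed_at:
        raise ValueError("first exposure cannot precede deploy")
    return first_exposed_at - deployed_at
```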

Graduation Smoothness

Family: Cohort Progression. Whether cohorts advance through their rollout stages without manual intervention or are blocked by recurring guardrail triggers. A signal of operational maturity in addition to release outcome.

Guarded Release Activation Rate

Family: Guardrail. The fraction of releases for which automated guardrails paused or reversed a rollout in response to a monitored metric. Measures the organization's ability to respond to release-level signals without human intervention.

Hypothesis Attachment Rate

Family: Experiment-Linked. The fraction of releases that carry a declared hypothesis and success signal. A leading indicator of progress from Control to Declare on the maturity ladder.

Outcome Attribution Rate

Family: Experiment-Linked. The fraction of releases that can be tied to measurable outcome movement after the fact. Measures whether the data the release produced is sufficient to evaluate the release's hypothesis.

Outcome Realization Rate

Family: Impact. The fraction of releases of a given intent type that hit their declared success signal. The metric most directly responsive to the central business question R.I.V.E.R. is built to answer.

Release Reversal Rate

Family: Reversal. The fraction of releases reversed through kill-switches, flag-offs, or targeting rollbacks. R.I.V.E.R.'s change-failure analog at the release layer rather than the deploy layer; it measures a failure mode DORA misses entirely, because to DORA a clean deploy followed by a kill-switched release looks like a success.
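
Assuming each release record carries a boolean "reversed" field (a hypothetical shape), the rate is a plain fraction over release events, not deploy events:

```python
def release_reversal_rate(releases) -> float:
    """Fraction of releases pulled back via kill-switch, flag-off, or
    targeting rollback. `releases` is an iterable of dicts with a
    boolean 'reversed' key (illustrative record shape)."""
    releases = list(releases)
    if not releases:
        return 0.0
    return sum(1 for r in releases if r["reversed"]) / len(releases)
```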

Release-to-KPI Lead Time

Family: Impact. The time from a release's first user exposure to measurable movement in the business metric the release was built to move. The temporal counterpart to Outcome Realization Rate.

Rollout Velocity

Family: Cohort Progression. The time from first exposure to target-cohort saturation. Distinct from deploy-side speed metrics; measures how quickly a release reaches its intended audience, not how quickly the code reached production.

VALUE CHAIN / framework-wide
Idea → Commit → Deploy → Release → Adoption → Impact