2.5 The "spec first" habit that prevents spaghetti

This section covers what a spec-first workflow looks like in practice and why a small, testable spec is the cheapest defense against tangled code.

What “spec first” actually means

Spec first means you define the intended behavior before you ask the model to implement it. The spec doesn’t need to be long. It needs to be clear, testable, and bounded.

In vibe coding, specs are not bureaucracy. They are how you:

  • Prevent the model from guessing.
  • Keep diffs small and coherent.
  • Create a verification target so you can decide if the output is correct.

A practical definition

A spec is “the smallest set of constraints that makes correctness measurable.”

Why specs prevent spaghetti (especially with AI)

LLMs are excellent at producing plausible code quickly. Without a spec, “plausible” turns into:

  • Extra features you didn’t ask for.
  • Hidden assumptions about data, errors, and edge cases.
  • Architecture drift (“it seemed convenient to put logic here…”).
  • Large rewrites that are hard to review.

A spec counters those failure modes by acting as a stable anchor. It’s what you use to say “no” to the model when it overreaches.

The “spaghetti multiplier” effect

AI increases throughput. If your requirements are vague, you can generate spaghetti faster than ever. Specs are the throttle.

The “minimum viable spec” (MVS)

You do not need a long document. You need a minimum viable spec—something you can write in 5–15 minutes for most tasks.

A good MVS fits in a single screen and answers:

  • What is the goal?
  • What are the constraints?
  • What does “done” look like?
  • What are the key edge cases?
  • What files are in scope?

If you’re stuck, shrink the spec

If you can’t write a spec, the task is too big or too ambiguous. Cut scope until you can describe it clearly.

Spec components that matter most

Not all spec details are equally valuable. These components buy the most correctness per minute:

1) Goal (one sentence)

Define the outcome, not the implementation:

  • Good: “Parse this input format into a structured object.”
  • Bad: “Use regex and recursion to do X” (this locks in an implementation too early).

2) Scope and non-scope

Explicitly state what you are not doing. This prevents feature creep:

  • “We are not adding auth.”
  • “We are not changing the database schema.”
  • “We are not optimizing performance yet.”

3) Constraints

Constraints control output consistency:

  • Runtime/language versions
  • Libraries allowed / forbidden
  • Style rules (formatting, lint expectations)
  • File boundaries (“only change these files”)
  • Security constraints (“no secrets in logs/prompts”)

4) Evidence (when debugging)

For bug fixes, the best “spec” is evidence (see the sketch after this list):

  • Reproduction steps
  • Failing test output
  • Logs / stack traces
  • Expected vs actual behavior
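
A failing regression test is the most compact form of all four: it encodes the repro, the expected behavior, and the actual behavior in one runnable artifact. A minimal sketch in Python, where slugify is a hypothetical stand-in that still contains the reported bug:

def slugify(text: str) -> str:
    # Hypothetical stand-in for the real helper; it still contains the bug.
    return text.strip().lower().replace(" ", "-")

def test_slugify_collapses_repeated_spaces():
    # Expected: "a-b". Actual today: "a--b". This test fails until the fix lands.
    assert slugify("a  b") == "a-b"

Handing the model this failing test plus the constraint “make it pass without changing the test” is usually a stronger prompt than prose alone.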

Acceptance criteria (how to define “done”)

Acceptance criteria are the most important part of a spec because they create an objective target.

What good acceptance criteria look like

  • Observable: “Given X, returns Y” or “this test passes.”
  • Complete enough: includes success and failure behavior.
  • Minimal: only what you need for this iteration.

Examples

  • “If the input is empty, return a validation error, not an exception.”
  • “Output must be valid JSON matching this schema.”
  • “All existing tests pass, and one new regression test is added.”

How to use acceptance criteria with AI

Paste acceptance criteria at the top of the prompt and require the model to explicitly confirm how the implementation satisfies each item.
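
The same criteria become even harder to misread when encoded as tests before any implementation exists. A minimal Python sketch of the “empty input” and “valid JSON” criteria above; render_report is a hypothetical stand-in for whatever function the spec targets, stubbed inline so the sketch runs:

import json

def render_report(rows: list[dict]) -> str:
    # Hypothetical stub so the sketch runs; the real body is the model's task.
    if not rows:
        return json.dumps({"error": "validation", "detail": "empty input"})
    return json.dumps({"rows": rows})

def test_empty_input_returns_validation_error():
    # Criterion: empty input yields a validation error, not an exception.
    payload = json.loads(render_report([]))
    assert payload["error"] == "validation"

def test_output_is_valid_json_with_agreed_keys():
    # Criterion: output parses as JSON and carries the agreed fields.
    payload = json.loads(render_report([{"id": 1, "total": 9.5}]))
    assert set(payload["rows"][0]) == {"id", "total"}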

Examples-as-tests (inputs/outputs)

Examples are mini-tests. They constrain the model more effectively than abstract wording.

A good example set includes

  • Happy path: typical valid input.
  • Edge case: empty, missing fields, unusual formatting.
  • Failure case: invalid input with expected error behavior.

Example format you can reuse

Examples:
1) Input: ...
   Output: ...
2) Input: ...
   Output: ...
3) Input: ...
   Error: ...
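
That three-example shape maps directly onto a parametrized test. A minimal sketch, assuming a hypothetical normalize_date function (the inline body is a stand-in so the sketch runs):

import pytest

def normalize_date(raw: str) -> str:
    # Hypothetical stand-in for the function under spec.
    y, m, d = (int(part) for part in raw.strip().split("-"))
    return f"{y:04d}-{m:02d}-{d:02d}"

@pytest.mark.parametrize("raw, expected", [
    ("2024-01-05", "2024-01-05"),   # 1) happy path
    (" 2024-1-5 ", "2024-01-05"),   # 2) edge case: whitespace, no zero-padding
])
def test_normalize_date(raw, expected):
    assert normalize_date(raw) == expected

def test_invalid_input_raises():
    # 3) failure case: invalid input with expected error behavior
    with pytest.raises(ValueError):
        normalize_date("not-a-date")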

If the model keeps drifting, add one more example

One well-chosen example often fixes a whole category of misunderstandings.

Interfaces and boundaries

Specs aren’t just about behavior—they’re also about keeping code organized. Even a small spec should mention boundaries:

  • Where does the logic live? (module/function)
  • What is the interface? (function signature / schema)
  • What stays stable? (public APIs)
  • What is off-limits? (files, dependencies, behaviors)

When boundaries are unclear, the model will “helpfully” mix concerns—creating spaghetti.
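
One lightweight way to pin those boundaries is to hand the model a stub that fixes the signature and schema up front. A minimal sketch; ReportRow and build_report are illustrative names, not part of any real codebase:

from dataclasses import dataclass

@dataclass
class ReportRow:
    # The schema other modules depend on: part of the stable interface.
    user_id: int
    total: float

def build_report(rows: list[dict]) -> list[ReportRow]:
    """Single public entry point. Pure function: no I/O, no DB, no logging."""
    raise NotImplementedError  # the body is the model's task; the boundary is not

The model can fill in the body, but the signature, the schema, and the “pure function” rule stay fixed: that is the boundary doing its job.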

Spaghetti is mostly boundary failure

Spaghetti code happens when responsibilities leak across modules. Specs that define boundaries prevent that leak.

Spec → plan → diff workflow

The simplest reliable workflow is:

  1. Write MVS: goal, constraints, acceptance criteria, examples.
  2. Ask for plan: 3–7 steps + risks and unknowns.
  3. Ask for minimal diff: within allowed files, no rewrites.
  4. Verify: run tests or a defined check.
  5. Update spec: incorporate what you learned and iterate.

This keeps the model aligned to a stable target and keeps you aligned to reality.

When a loop fails

Don’t “try again” with the same prompt. Add evidence (the failing output) and refine the spec (tighten acceptance criteria or examples).

Spec anti-patterns (and fixes)

Anti-pattern: vague goals

  • Symptom: the model adds features you didn’t want.
  • Fix: write scope + non-scope and add examples.

Anti-pattern: implementation-first specs

  • Symptom: you lock into a bad approach early.
  • Fix: specify behavior and constraints first; ask for options before committing.

Anti-pattern: no acceptance criteria

  • Symptom: you can’t tell if you’re done; you keep iterating forever.
  • Fix: write 3–7 bullets that define success and failure behavior.

Anti-pattern: giant “perfect” spec

  • Symptom: you spend time writing docs instead of shipping.
  • Fix: write an MVS, ship a slice, then expand the spec as needed.

Specs are living documents

The point is not to be perfect at the start. The point is to keep a stable target as you iterate.

Copy-paste spec templates

Template: Minimum Viable Spec

Goal (one sentence):
...

Scope:
- ...

Non-scope:
- ...

Constraints:
- Runtime/language:
- Libraries:
- Allowed files:
- Forbidden files:

Acceptance criteria:
- ...

Examples:
1) Input: ...
   Output: ...
2) Input: ...
   Error: ...

Template: Bug fix spec

Bug:
...

Repro:
1) ...

Expected:
...

Actual:
...

Evidence:
- Error/log:
...

Constraints:
- Diff-only
- Add regression test
- Do not change: ...

Template: Spec-first implementation prompt

Using the spec below:
1) Ask clarifying questions if needed.
2) Propose a plan (3–7 steps).
3) Implement as a minimal diff.
4) Provide verification commands.

Spec:
...

Upgrade path

Later in the guide, you’ll turn these specs into structured schemas and eval sets—so quality becomes measurable at scale.
