0.1 What "vibe coding" means (and what it doesn't)
Definition (practical, not mystical)
Vibe coding is a software development style where you collaborate with a model in tight iteration loops: you describe intent and constraints, the model produces a concrete draft (code, tests, docs, diffs), and you immediately validate it in reality (run, test, inspect), then iterate.
The “vibe” is the speed and flow you get when the model handles a lot of the typing and first-pass structure, while you keep the project coherent and correct.
The model is a high-throughput generator of plausible code. Your job is to turn “plausible” into “correct and shippable” by steering, constraining, and verifying.
The vibe loop
You’ll repeat a small loop many times. The loop is the method—not the prompt.
- Specify: define the smallest next outcome (feature, fix, refactor) and the constraints.
- Generate: have the model propose a plan and a small diff (or a short implementation step).
- Integrate: apply changes in your repo with reviewable scope.
- Verify: run the app/tests, reproduce the bug, validate outputs, check logs.
- Refine: feed back the real results and tighten the spec.
If you can’t say what you’ll verify after the model’s next output, the step is too big.
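To make the loop concrete, here is one pass in miniature. This is a hedged sketch: the spec, the slugify helper, and its tests are all invented for illustration, not taken from any particular codebase.

```python
# One pass through the loop, in miniature (all names hypothetical).
# Specify: "add slugify(text): lowercase, collapse runs of non-alphanumerics
# to '-', strip leading/trailing dashes."
import re

def slugify(text: str) -> str:
    # Generate/Integrate: the model's proposed implementation,
    # applied as a small, reviewable diff.
    slug = re.sub(r"[^a-z0-9]+", "-", text.lower())
    return slug.strip("-")

# Verify: tests you run immediately; failures feed the next Refine step.
def test_basic():
    assert slugify("Hello, World!") == "hello-world"

def test_edges():
    assert slugify("  --  ") == ""          # separators only -> empty string
    assert slugify("Déjà vu") == "d-j-vu"   # non-ASCII is dropped (a limitation)

if __name__ == "__main__":
    test_basic()
    test_edges()
    print("ok")
```

The last assertion is the Refine step waiting to happen: if dropping non-ASCII characters is wrong for your product, that is real feedback to fold into a tighter spec on the next pass.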
What it isn’t
Vibe coding is easy to misunderstand. It’s not:
- Autopilot engineering: you don’t “accept all” and hope.
- Prompt-only building: text that sounds right is not a substitute for running code.
- One giant mega-prompt: large prompts tend to create large, fragile rewrites.
- Skipping design: you still need architecture boundaries, interfaces, and tradeoffs.
- Skipping correctness: tests, checks, and real usage are still the truth source.
- A replacement for fundamentals: it amplifies good engineering habits; it punishes sloppy ones.
The classic trap is confusing “the model produced code quickly” with “the problem is solved.” A solution exists only after it has been verified in your environment.
Your job vs the model’s job
Think of vibe coding as a division of labor.
You control
- Architecture: modules, data flow, boundaries, and what goes where.
- Constraints: language/runtime, libraries, style, security rules, “don’t touch” areas.
- Acceptance criteria: what “done” means, including edge cases and failure behavior.
- Risk: what must be reviewed carefully, tested, or gated.
- Reality checks: running the code, inspecting logs, profiling, validating outputs.
The model helps
- Scaffolding: initial structure, boilerplate, wiring, and consistent patterns.
- Translation: turning intent into code, and code into explanations.
- Refactors: mechanical transformations when you provide the boundaries and tests (see the sketch after this list).
- Debug support: hypothesis generation and “what to check next” (you still confirm).
- Documentation: drafts that you validate against the codebase.
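Here is what “provide the boundaries and tests” can look like for a refactor: a characterization test that pins current behavior before the model touches anything. This is a sketch; parse_price and its cents-based contract are invented for illustration.

```python
# Hypothetical characterization test written BEFORE asking the model to
# refactor parse_price(). The model may restructure the internals however
# it likes; this test defines the boundary it must not cross.

def parse_price(raw: str) -> int:
    # Current implementation (pre-refactor): returns the price in cents.
    value = raw.strip().lstrip("$")
    dollars, _, cents = value.partition(".")
    return int(dollars) * 100 + int(cents or 0)

def test_parse_price_behavior_is_pinned():
    assert parse_price("$12.50") == 1250
    assert parse_price("3") == 300
    assert parse_price(" $0.05 ") == 5
```

With the test in place, “refactor parse_price and keep this test green” is a mechanical, verifiable request instead of an open-ended one.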
Where the speed actually comes from
The speed is not “the model is always right.” The speed is:
- Lower setup cost: you can go from idea → runnable skeleton fast.
- Smaller iteration steps: you can ask for a narrow change, get a narrow change.
- Parallel thinking: you ask for options and tradeoffs quickly, then choose.
- Less busywork: the model can generate repetitive code, tests, and glue reliably when constrained.
- Faster recovery: when something breaks, you can loop: error → hypothesis → patch → rerun.
Get to runnable, then get to correct. Early execution creates feedback. Feedback creates speed.
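For instance, the first thing you might ask for is not the feature but a runnable skeleton like the sketch below; the CLI, its argument, and the convert stub are all invented for illustration. Once it runs end to end, every later change has something to execute against.

```python
# Hypothetical first-pass skeleton: runnable immediately, correct later.
# The real logic is stubbed so the pipeline runs end to end from day one.
import argparse

def convert(path: str) -> str:
    # TODO: real conversion; stubbed so there is always something to run.
    return f"converted:{path}"

def main() -> None:
    parser = argparse.ArgumentParser(description="Convert input files (skeleton).")
    parser.add_argument("path", help="File to convert")
    args = parser.parse_args()
    print(convert(args.path))

if __name__ == "__main__":
    main()
```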
Signals you’re doing it right
- You frequently run the code and treat output as provisional until verified.
- Your changes are small and reviewable (diffs are comprehensible).
- You have a clear “done” definition and you can tell when you’ve hit it.
- You accumulate tests, checks, or eval cases that prevent regressions over time.
- You can stop at a “ship point” with a working product slice, then iterate.
A starter prompt template
Use this as a baseline. Keep it short; tighten it as you learn what you need.
Task: [one sentence outcome]
Context:
- Repo / files: [what the model should consider]
- Runtime: [node/python/etc]
- Constraints: [libraries allowed, style, “don’t change” areas]
Definition of done:
- [acceptance criteria 1]
- [acceptance criteria 2]
- [edge cases / error behavior]
Process:
1) Ask clarifying questions if needed.
2) Propose a plan in 3–7 steps.
3) Implement as a small diff (no rewrites).
4) Tell me exactly what to run to verify.
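For reference, here is the template filled in for a hypothetical task; every file name, runtime, and criterion below is invented for illustration.

```
Task: Add retry with exponential backoff to the HTTP client wrapper.
Context:
- Repo / files: src/http_client.py, tests/test_http_client.py
- Runtime: Python 3.12
- Constraints: stdlib only; don't change the public send() signature
Definition of done:
- Retries up to 3 times on connection errors and 5xx responses
- 4xx responses are returned immediately, never retried
- Backoff of roughly 0.5s, 1s, 2s between attempts
Process:
1) Ask clarifying questions if needed.
2) Propose a plan in 3–7 steps.
3) Implement as a small diff (no rewrites).
4) Tell me exactly what to run to verify.
```

Note how the definition of done is checkable: each line maps to a test or an observation you can make after running the code.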