Jbee - codehooks.io

I Vibe Coded a Full-Stack React App in 90 Seconds — Here's Why the Stack Matters

I opened a terminal, described what I wanted to an AI agent, and 90 seconds later a full-stack todo app was live on the internet. React frontend, REST API, database, Swagger docs — all deployed to a single URL.

No boilerplate repos cloned. No config files hand-written. I just talked to Claude Code, and it built the whole thing.

But here's what I learned: vibe coding is only as fast as the stack allows. The agent itself was quick; what made the result instant was a platform that eliminated the usual friction points.

What is vibe coding?

You describe what you want in natural language. An AI agent writes the code, runs the commands, tests the result, and iterates. You steer — it executes.

My prompt was simple:

"Build a todo app. React frontend, Codehooks backend with a CRUD API. Deploy everything."

The agent planned the approach, generated 6 files, installed dependencies, built the frontend, and deployed — all without me writing a single line of code. I watched it happen in real time.

One thing that made this seamless: I'm using the Codehooks plugin for Claude Code with the codehooks-backend skill. The plugin gives the agent full knowledge of the Codehooks.io API — routing, database operations, static hosting, queues, cron jobs — so it generates correct, deployable code on the first try without hallucinating APIs that don't exist.

But the interesting part isn't the AI. It's what happens when you pair vibe coding with a stack that has almost zero ceremony.

The friction that kills vibe coding

Most full-stack setups fight against the speed that vibe coding enables:

  • Separate frontend/backend repos — the agent has to context-switch between projects
  • CORS configuration — the agent generates code, deploys it, hits a CORS wall, debugs, redeploys
  • Docker + CI/CD — every deploy takes minutes, breaking the flow
  • Infrastructure config — Terraform, nginx, environment variables pointing services at each other

The AI can handle all of this. But every layer of ceremony adds time, and time is the enemy of the vibe. You want the agent to spend its cycles on your app, not on gluing infrastructure together.

What zero-friction looks like

I used Codehooks.io — frontend and backend deploy together, same origin, same command. Here's the entire backend the agent generated:

import { app } from 'codehooks-js';
import { z } from 'zod';

// The Zod schema is the single source of truth for validation, routes, and docs
const TodoSchema = z.object({
  title: z.string().min(1),
  completed: z.boolean().default(false)
});

// Auto-generates six REST endpoints for /api/todos from the schema
app.crudlify({ todos: TodoSchema }, { prefix: '/api' });

// Passthrough auth: every route is public (fine for a demo)
app.auth('/*', (req, res, next) => next());

// Serve the React build; the notFound fallback enables client-side routing
app.static({ route: '/', directory: '/static',
  default: 'index.html', notFound: '/index.html' });

export default app.init();

From that Zod schema, crudlify() auto-generates six REST endpoints with validation. app.static() serves the React build. One deploy command puts both live on the same URL.
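To make those six endpoints concrete, here's a hypothetical route map a client could use — the paths follow from the `/api` prefix and the `todos` collection name, but this helper itself is illustrative, not part of codehooks-js:

```javascript
// Hypothetical map of the six REST endpoints crudlify() exposes
// for a `todos` collection under the `/api` prefix.
const base = '/api/todos';

const todoRoutes = {
  list:   () =>   ({ method: 'GET',    path: base }),            // query all todos
  read:   (id) => ({ method: 'GET',    path: `${base}/${id}` }), // fetch one by _id
  create: () =>   ({ method: 'POST',   path: base }),            // validated by TodoSchema
  update: (id) => ({ method: 'PUT',    path: `${base}/${id}` }), // full replace
  patch:  (id) => ({ method: 'PATCH',  path: `${base}/${id}` }), // partial update
  remove: (id) => ({ method: 'DELETE', path: `${base}/${id}` }), // delete one
};

console.log(todoRoutes.read('abc123').path); // → /api/todos/abc123
```

Because all six share one base path and one schema, the agent never has to invent URL conventions per endpoint.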

The agent didn't have to:

  • Configure CORS (same origin — the problem doesn't exist)
  • Set up a reverse proxy (static files and API share one server)
  • Write a Dockerfile (serverless — there's nothing to containerize)
  • Create a CI pipeline (one coho deploy, 5 seconds to live)

That's why 90 seconds was enough. The agent spent all its time on the app, not the platform.

The vibe coding session, in real time

Here's how the prompts flowed:

Prompt 1: "Build a todo app with React and Codehooks"
The agent planned, generated the backend (Zod schema + crudlify), built the React frontend (add, toggle, delete todos), and deployed. First deploy: under 90 seconds.

Prompt 2: "Add OpenAPI docs and Swagger"
One line added — app.openapi(). Since the API was already schema-driven, Swagger docs were auto-generated. Deployed in 5 seconds.

Prompt 3: "Add a cron job that empties todos every hour"
A few lines (note the Datastore import from codehooks-js):

import { app, Datastore } from 'codehooks-js';

// Runs at the top of every hour and clears the todos collection
app.job('0 * * * *', async () => {
  const conn = await Datastore.open();
  await conn.removeMany('todos', {});
});

No external scheduler. No crontab. Just deploy.

Prompt 4: "Add a link to the docs as a footer"
The agent edited the React component, rebuilt, redeployed. Done.

Each iteration followed the same loop: prompt → generate → deploy → verify. The deploy step never took more than 5 seconds, so the feedback loop stayed tight.

Why the single-origin architecture matters for vibe coding

When the AI agent generates a fetch('/api/todos') call in React, it just works — in local dev (Vite proxy) and in production (same origin). There's no moment where the agent has to stop and think about which URL to use, whether CORS is configured, or whether the frontend and backend are on the same version.
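The local-dev half of that story is a one-time proxy entry. A minimal sketch of what a `vite.config.js` for this setup could look like — the plugin and the backend port are assumptions, not taken from the article:

```javascript
// vite.config.js — forward /api/* to the local backend dev server so
// fetch('/api/todos') behaves the same in dev as on the deployed same-origin URL.
import { defineConfig } from 'vite';
import react from '@vitejs/plugin-react';

export default defineConfig({
  plugins: [react()],
  server: {
    proxy: {
      // Port 3000 is an assumption; point it at wherever the local backend listens
      '/api': 'http://localhost:3000',
    },
  },
});
```

With this in place, the frontend code never contains an absolute backend URL, so nothing changes between environments.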

This sounds minor. It's not. Every eliminated decision point means fewer failure modes for the agent, which means fewer retry loops, which means faster results.

The same principle applies to schema-driven APIs. When the agent defines a Zod schema, it gets validation, REST endpoints, and API docs from that single source of truth. There's no drift between what the code does and what the docs say, because they're generated from the same place.
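To make "one source of truth" concrete, here's a dependency-free sketch of the contract `TodoSchema` enforces on a POST body — plain JavaScript standing in for Zod, not the actual generated code:

```javascript
// Mimics TodoSchema: title must be a non-empty string,
// completed must be a boolean and defaults to false when omitted.
function validateTodo(input) {
  if (typeof input.title !== 'string' || input.title.length < 1) {
    return { ok: false, error: 'title must be a non-empty string' };
  }
  if (input.completed !== undefined && typeof input.completed !== 'boolean') {
    return { ok: false, error: 'completed must be a boolean' };
  }
  return { ok: true, value: { title: input.title, completed: input.completed ?? false } };
}

console.log(validateTodo({ title: 'Buy milk' }));
// → { ok: true, value: { title: 'Buy milk', completed: false } }
console.log(validateTodo({ title: '' }).ok); // → false
```

The same rules that reject a bad request body are what the OpenAPI docs describe, which is why the docs can never drift from the implementation.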

The final result

~200 lines of code. 6 files. One URL.

The takeaway

Vibe coding doesn't just change how you write code. It changes what you should optimize for in your stack. The old trade-offs — "this framework has more features" vs. "this one is simpler" — shift when an AI is doing the typing.

What matters now:

  1. Low ceremony — fewer config files means fewer places the agent can get stuck
  2. Fast deploys — tight feedback loops keep the vibe alive
  3. Same origin — eliminates an entire class of bugs the agent would otherwise have to debug
  4. Schema-driven — one definition, multiple outputs (API, validation, docs)

The best stack for vibe coding isn't the one with the most features. It's the one with the least friction between "I want this" and "it's live."

