Ever wanted to record yourself looking like you're on a Mars mission, running a cyber operation, or filming a found-footage VHS tape? That's exactly what we built with LogMaker — a client-side web app that composites sci-fi HUD overlays on top of your webcam feed in real time and records the result as a WebM video. No backend, no uploads, everything runs in your browser.
In this article, I'll walk you through the entire architecture: how we capture the camera, draw HUD elements on a Canvas, and record the composited output — all with vanilla browser APIs.
Architecture Overview
Here's the full data pipeline:
Camera (getUserMedia)
↓
Hidden <video> element
↓
Canvas 2D (drawImage + HUD overlay @ 60fps)
↓
canvas.captureStream(30) + audio tracks
↓
MediaRecorder (WebM VP9/VP8)
↓
Blob → Download / Web Share API
The key insight: the canvas is both the preview and the recording source. The user sees the composited result in real time, and captureStream() captures exactly what's on the canvas — HUD included.
Tech stack: Next.js 14 (static export), React 18, Tailwind CSS, and zero external runtime dependencies. The HUD drawing system is pure Canvas 2D — no React, no DOM manipulation.
Step 1: Accessing the Camera
First, we need the webcam feed. We use getUserMedia with specific constraints:
const buildConstraints = (videoDeviceId, audioDeviceId) => {
const video = videoDeviceId
? { deviceId: { exact: videoDeviceId }, width: { ideal: 1280 }, height: { ideal: 720 } }
: { width: { ideal: 1280 }, height: { ideal: 720 }, facingMode: "user" };
const audio = audioDeviceId
? { deviceId: { exact: audioDeviceId } }
: true;
return { video, audio };
};
If the user has selected a specific device (e.g., an external webcam), we use exact to pin it. Otherwise, we default to facingMode: "user" for mobile front cameras and ideal: 1280x720 for desktop.
Stream Acquisition
const acquireStream = async (videoDeviceId, audioDeviceId) => {
const constraints = buildConstraints(videoDeviceId, audioDeviceId);
const stream = await navigator.mediaDevices.getUserMedia(constraints);
// Wire to hidden video element
const video = videoRef.current;
video.srcObject = stream;
// Wait for metadata to load
await new Promise((resolve) => {
video.onloadedmetadata = () => {
video.play().then(resolve).catch(resolve);
};
});
// Match canvas size to actual video resolution
const canvas = canvasRef.current;
canvas.width = video.videoWidth || 1280;
canvas.height = video.videoHeight || 720;
return stream;
};
Critical detail: we set the canvas dimensions to match the actual video resolution, not the ideal. The camera might give us 1920x1080 or 640x480 depending on the device. The canvas must match exactly.
The Hidden Video Element
The <video> element is invisible — it's only a bridge for getting frames into the canvas:
<video
ref={videoRef}
autoPlay
muted
playsInline
className="absolute inset-0 w-0 h-0 opacity-0 pointer-events-none"
/>
muted and playsInline are critical for autoplay to work on mobile. Without playsInline, iOS Safari will try to go fullscreen.
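One thing the acquisition code above glosses over: getUserMedia rejects with a DOMException when something goes wrong. A minimal sketch of mapping the common error names to user-facing hints — the error names are the standard DOMException names, but the messages (and the helper itself) are our own, not LogMaker's actual code:

```javascript
// Map a getUserMedia rejection to a user-facing message.
// Error names are standard DOMException names; the wording is ours.
function cameraErrorMessage(err) {
  switch (err.name) {
    case "NotAllowedError":
      return "Camera permission was denied.";
    case "NotFoundError":
      return "No camera was found on this device.";
    case "NotReadableError":
      return "The camera is already in use by another app.";
    case "OverconstrainedError":
      return "The selected device is unavailable.";
    default:
      return `Camera error: ${err.name}`;
  }
}
```

Wrapping `acquireStream()` in a try/catch that feeds this helper lets you show a retry UI instead of a blank canvas.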
Step 2: The Draw Loop
This is where the magic happens. A requestAnimationFrame loop runs at ~60fps, drawing the camera frame and HUD overlay on every tick:
const startDrawLoop = (theme) => {
const canvas = canvasRef.current;
const video = videoRef.current;
const ctx = canvas.getContext("2d");
let frameCount = 0;
function draw() {
const { width: w, height: h } = canvas;
// 1. Draw the camera frame
if (video.readyState >= video.HAVE_CURRENT_DATA) {
ctx.drawImage(video, 0, 0, w, h);
}
// 2. Build HUD state
const isRec = statusRef.current === "recording";
const now = Date.now();
const hudState = {
isRecording: isRec,
elapsedMs: isRec ? now - startTimeRef.current : 0,
blinkPhase: Math.sin(now / 500) * 0.5 + 0.5, // smooth 0→1→0 pulse
frameCount: frameCount++,
solBase: solBaseRef.current,
};
// 3. Draw HUD overlay on top
drawHud(ctx, w, h, theme, hudState);
// 4. Schedule next frame
requestAnimationFrame(draw);
}
requestAnimationFrame(draw);
};
Why statusRef Instead of status State?
Notice we read statusRef.current instead of the React state status. This is intentional — the draw function is created once and runs in a requestAnimationFrame loop. If we read the state variable directly, it would be a stale closure. The ref always has the latest value:
const statusRef = useRef("idle");
useEffect(() => {
statusRef.current = status;
}, [status]);
The Blink Phase
The blinkPhase is a smooth oscillation between 0 and 1 using a sine wave:
blinkPhase: Math.sin(Date.now() / 500) * 0.5 + 0.5
This drives the pulsing glow on corner brackets, the REC indicator, and other animated elements. A full cycle takes ~3.14 seconds: the period of sin(t / 500) is 2π × 500 ms ≈ 3142 ms.
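Pulled out of the draw loop as a standalone helper (for illustration only), the oscillator is easy to sanity-check:

```javascript
// 0 → 1 → 0 pulse; period is 2π × 500 ms, roughly 3.14 s.
const blinkPhase = (t) => Math.sin(t / 500) * 0.5 + 0.5;

blinkPhase(0);                 // 0.5: the midpoint
blinkPhase(500 * Math.PI / 2); // ~1: the peak, a quarter cycle in
blinkPhase(1000 * Math.PI);    // ~0.5: back to midpoint, one full cycle later
```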
Step 3: The HUD Drawing System
The entire HUD is drawn with Canvas 2D — no DOM, no React. This is important for two reasons:
- Performance: DOM manipulation during a 60fps loop would be brutal. Canvas is GPU-accelerated.
- Recording: captureStream() captures the canvas pixels. DOM overlays wouldn't be recorded.
Responsive Scaling
Every dimension in the HUD is scaled relative to a 1280px baseline:
/** Scale factor: 1.0 at 1280px width, min 0.5 */
function S(w) {
return Math.max(w / 1280, 0.5);
}
And a safe margin keeps elements away from the edges:
/** Safe margin: 4% of the smaller dimension */
function safeMargin(w, h) {
return Math.round(Math.min(w, h) * 0.04);
}
Every pixel value in the HUD is multiplied by S(w):
const s = S(w);
const m = safeMargin(w, h);
// Title font size: 22px at 1280w, scales proportionally
ctx.font = `bold ${Math.round(22 * s)}px monospace`;
// Position anchored to top-left corner with safe margin
const titleX = m + Math.round(18 * s);
const titleY = m + Math.round(16 * s);
This means the HUD looks correct whether the canvas is 640px or 1920px wide.
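To make that concrete, here is what the two helpers above produce at a few common widths for the 22px title font:

```javascript
const S = (w) => Math.max(w / 1280, 0.5);
const safeMargin = (w, h) => Math.round(Math.min(w, h) * 0.04);

Math.round(22 * S(1280)); // 22: the baseline
Math.round(22 * S(1920)); // 33: scaled up for full HD
Math.round(22 * S(480));  // 11: clamped at the 0.5 floor
safeMargin(1280, 720);    // 29: 4% of the 720px dimension
```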
The Composition Hierarchy
Each theme follows a layered composition model. Think of it like Photoshop layers:
Layer 1: Tone overlay (dust, color tint)
Layer 2: Vignette (darkened edges)
Layer 3: Film grain / noise
Layer 4: Scanline sweep (animated horizontal line)
Layer 5: CRT scanlines (static horizontal lines)
Layer 6: Corner brackets (cinematic framing)
PRIMARY: Title + timestamp (most important info)
SECONDARY: Telemetry data (supporting info)
TERTIARY: Scrolling terminal log (ambient detail)
Theme Dispatcher
A simple switch routes to the correct drawing function:
export const THEMES = [
{ id: "mars_log", name: "MARS LOG", accent: "#ff6b35" },
{ id: "cyber_ops", name: "CYBER OPS", accent: "#39ff14" },
{ id: "vhs_archive", name: "VHS ARCHIVE", accent: "#e0e0e0" },
];
export function drawHud(ctx, w, h, theme, state) {
ctx.save();
switch (theme) {
case "mars_log": drawMarsLog(ctx, w, h, state); break;
case "cyber_ops": drawCyberOps(ctx, w, h, state); break;
case "vhs_archive": drawVhsArchive(ctx, w, h, state); break;
default: drawMarsLog(ctx, w, h, state);
}
ctx.restore();
}
The ctx.save() / ctx.restore() pair is essential — each theme can modify globalCompositeOperation, shadows, etc. without leaking state.
Step 4: Drawing the HUD Elements
Let's look at the shared building blocks every theme uses.
Vignette Effect
A radial gradient that darkens the edges, creating a cinematic framing:
function drawVignette(ctx, w, h, r, g, b, strength) {
const a = strength || 0.65;
const grad = ctx.createRadialGradient(
w / 2, h / 2, w * 0.22, // inner circle (transparent)
w / 2, h / 2, w * 0.85 // outer circle (colored)
);
grad.addColorStop(0, `rgba(${r},${g},${b},0)`);
grad.addColorStop(0.65, `rgba(${r},${g},${b},${a * 0.25})`);
grad.addColorStop(1, `rgba(${r},${g},${b},${a})`);
ctx.fillStyle = grad;
ctx.fillRect(0, 0, w, h);
}
Each theme tints the vignette differently:
- Mars Log: warm red/orange (35, 8, 0) — dusty Martian atmosphere
- Cyber Ops: green (0, 12, 0) — night-vision / hacker terminal feel
- VHS Archive: pure black (0, 0, 0) — old TV edges
CRT Scanlines
Horizontal lines drawn at regular intervals to mimic a CRT monitor:
function drawCrtScanlines(ctx, w, h, alpha, gap) {
ctx.fillStyle = `rgba(0,0,0,${alpha})`;
for (let y = 0; y < h; y += gap) {
ctx.fillRect(0, y, w, 1);
}
}
At a gap of 3-4 pixels with an alpha of 0.06-0.12, this creates a subtle but effective CRT look. We keep the alpha low to avoid making the video unwatchable.
Film Grain / Noise
Random dots scattered across the frame to simulate film grain:
function drawNoise(ctx, w, h, count, tint, alphaMin, alphaMax, frameCount) {
// Only render every 3rd frame for performance
if (frameCount % 3 !== 0) return;
const maxDots = Math.min(count, Math.round(w * h * 0.0003));
for (let i = 0; i < maxDots; i++) {
const x = Math.random() * w;
const y = Math.random() * h;
const size = Math.random() * 1.5 + 0.5;
const a = alphaMin + Math.random() * (alphaMax - alphaMin);
const v = Math.floor(Math.random() * 180 + 60); // grayscale fallback value
ctx.fillStyle = tint
? `rgba(${tint[0]},${tint[1]},${tint[2]},${a})`
: `rgba(${v},${v},${v},${a})`;
ctx.fillRect(x, y, size, size);
}
}
Two performance tricks here:
- Frame skipping: Only drawing every 3rd frame. The persistence of the previous frame's noise gives a natural flicker effect.
- Count capping: maxDots is capped relative to canvas area (w * h * 0.0003), so a 4K canvas doesn't try to draw thousands of dots.
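To see what the cap works out to, here is the same formula (extracted into a helper of our own naming) at two canvas sizes:

```javascript
// The area-based cap from drawNoise, as a standalone helper.
const dotCap = (w, h, count) => Math.min(count, Math.round(w * h * 0.0003));

dotCap(1280, 720, 500);  // 276: the area cap binds at 720p
dotCap(3840, 2160, 500); // 500: the requested count binds at 4K
```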
Animated Scanline Sweep
A glowing horizontal line that sweeps down the screen:
function drawScanlineSweep(ctx, w, h, speed, color, alpha, thickness) {
const now = Date.now();
const y = (now / speed) % h; // position cycles through canvas height
const s = S(w);
// Sharp line
ctx.fillStyle = `rgba(${color},${alpha})`;
ctx.fillRect(0, y, w, thickness * s);
// Soft glow around the line
const glowH = 25 * s;
const grad = ctx.createLinearGradient(0, y - glowH, 0, y + glowH);
grad.addColorStop(0, `rgba(${color},0)`);
grad.addColorStop(0.5, `rgba(${color},${alpha * 0.2})`);
grad.addColorStop(1, `rgba(${color},0)`);
ctx.fillStyle = grad;
ctx.fillRect(0, y - glowH, w, glowH * 2);
}
The speed parameter controls how fast the line moves. A speed of 22 (Cyber Ops) is faster than 55 (VHS Archive), matching each theme's energy level.
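Since y = (now / speed) % h, one full sweep takes h × speed milliseconds, so counterintuitively a smaller speed value means a faster sweep. For a 720px-tall canvas:

```javascript
// Duration of one full top-to-bottom sweep, in milliseconds.
const sweepDurationMs = (h, speed) => h * speed;

sweepDurationMs(720, 22); // 15840: Cyber Ops, roughly 16 s per sweep
sweepDurationMs(720, 55); // 39600: VHS Archive, roughly 40 s per sweep
```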
Corner Brackets
The cinematic framing brackets in each corner, with a glow effect:
function drawCornerBrackets(ctx, w, h, offset, len, thickness, color, glowSize) {
ctx.strokeStyle = color;
ctx.lineWidth = thickness;
ctx.lineCap = "square";
ctx.shadowColor = color;
ctx.shadowBlur = glowSize;
// Top-left bracket
ctx.beginPath();
ctx.moveTo(offset, offset + len);
ctx.lineTo(offset, offset);
ctx.lineTo(offset + len, offset);
ctx.stroke();
// Top-right bracket
ctx.beginPath();
ctx.moveTo(w - offset - len, offset);
ctx.lineTo(w - offset, offset);
ctx.lineTo(w - offset, offset + len);
ctx.stroke();
// Bottom-left and bottom-right follow the same pattern...
// ...
ctx.shadowBlur = 0; // Reset shadow to avoid bleeding
}
The shadowBlur creates the neon glow. Each theme uses its accent color — orange for Mars, green for Cyber, and a subtle white for VHS.
Blinking REC Indicator
A cinematic recording indicator with a pulsing red dot:
function drawRecIndicator(ctx, x, y, s, blinkPhase, elapsedMs) {
const alpha = 0.4 + blinkPhase * 0.6; // pulsates between 0.4 and 1.0
const text = `REC ${formatTime(elapsedMs)}`;
// Background pill
ctx.fillStyle = `rgba(120,10,10,${0.3 + blinkPhase * 0.12})`;
roundRect(ctx, boxX, boxY, totalW, totalH, r);
ctx.fill();
// Pulsing red dot with glow
ctx.fillStyle = `rgba(255,40,40,${alpha})`;
ctx.shadowColor = "#ff2828";
ctx.shadowBlur = 12 * s * blinkPhase;
ctx.beginPath();
ctx.arc(dotX, dotY, dotR, 0, Math.PI * 2);
ctx.fill();
ctx.shadowBlur = 0;
// Text: "REC 00:01:23"
ctx.fillStyle = `rgba(255,220,220,${alpha})`;
ctx.fillText(text, dotX + dotR + 10 * s, y + 1);
}
The blinkPhase (0→1→0 sine wave) drives both the dot opacity and glow radius, creating a smooth pulsing effect.
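The formatTime helper referenced above isn't shown in the article; here's a plausible sketch, assuming an HH:MM:SS format to match the "REC 00:01:23" example:

```javascript
// Format elapsed milliseconds as HH:MM:SS (assumed format, not LogMaker's
// actual implementation).
function formatTime(ms) {
  const totalSec = Math.floor(ms / 1000);
  const h = String(Math.floor(totalSec / 3600)).padStart(2, "0");
  const m = String(Math.floor((totalSec % 3600) / 60)).padStart(2, "0");
  const s = String(totalSec % 60).padStart(2, "0");
  return `${h}:${m}:${s}`;
}

formatTime(83000); // "00:01:23"
```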
Scrolling Terminal Log
An ambient scrolling log that adds "life" to the HUD:
// Module-level state — persists across frames
const terminalState = {
mars_log: { lines: [], lastPush: 0 },
cyber_ops: { lines: [], lastPush: 0 },
vhs_archive: { lines: [], lastPush: 0 },
};
function updateTerminalLog(themeId, pool, interval, maxLines) {
const ts = terminalState[themeId];
const now = Date.now();
if (now - ts.lastPush > interval) {
// Pick a random line from the pool
const line = pool[Math.floor(Math.random() * pool.length)];
ts.lines.push({ text: line, time: now });
// Keep only the last N lines
if (ts.lines.length > maxLines) ts.lines.shift();
ts.lastPush = now;
}
return ts.lines;
}
Each theme has its own message pool:
const TERMINAL_LINES_MARS = [
"SYS: SIGNAL LOCKED",
"NAV: ORBITAL POSITION LOCKED",
"COM: UPLINK ACTIVE",
"SYS: DUST FILTER ENGAGED",
"HAB: PRESSURE NOMINAL",
"SYS: SOLAR ARRAY 98%",
"MED: BIOMETRICS SYNCED",
// ...
];
const TERMINAL_LINES_CYBER = [
"NET: PACKET STREAM OK",
"ENC: AES-256 LOCKED",
"NET: PROXY CHAIN VALID",
"OPS: SURVEILLANCE ACTIVE",
"NET: VPN TUNNEL ACTIVE",
// ...
];
const TERMINAL_LINES_VHS = [
"TAPE: HEAD ALIGNED",
"SYS: TRACKING OK",
"REC: SIGNAL STABLE",
"SYS: CHROMA LOCK",
// ...
];
The log is drawn with a fade effect — older lines are more transparent:
function drawTerminalLog(ctx, x, y, s, lines, color, maxAge, maxVisible) {
// Semi-transparent background panel
ctx.fillStyle = "rgba(0,0,0,0.22)";
roundRect(ctx, panelX, panelY, panelW, panelH, r);
ctx.fill();
for (let i = 0; i < lines.length; i++) {
// Newer lines are brighter (bottom = newest)
const posRatio = lines.length > 1 ? i / (lines.length - 1) : 1;
const alpha = 0.5 + posRatio * 0.4;
ctx.fillStyle = `rgba(${color},${alpha})`;
ctx.shadowColor = `rgba(${color},${alpha * 0.3})`;
ctx.shadowBlur = 4 * s;
ctx.fillText(lines[i].text, x, y + i * lineH);
}
ctx.shadowBlur = 0;
}
Step 5: Theme Deep Dive — Mars Log
Let me show you how these building blocks compose into a complete theme. Here's the Mars Log theme, the most feature-rich one:
function drawMarsLog(ctx, w, h, state) {
const s = S(w);
const { isRecording, elapsedMs, blinkPhase, frameCount, solBase } = state;
const accent = "215,95,45"; // warm orange
const m = safeMargin(w, h);
// ── Layer 1: Dusty warm tint ──
ctx.globalCompositeOperation = "multiply";
ctx.fillStyle = "rgba(180,120,70,0.08)";
ctx.fillRect(0, 0, w, h);
ctx.globalCompositeOperation = "source-over";
// ── Layer 2: Dark vignette with red/orange tint ──
drawVignette(ctx, w, h, 35, 8, 0, 0.6);
// ── Layer 3: Warm-tinted film grain ──
drawNoise(ctx, w, h, 30, [200, 130, 60], 0.02, 0.05, frameCount);
// ── Layer 4: Orange scanline sweep ──
drawScanlineSweep(ctx, w, h, 35, accent, 0.08, 2);
// ── Layer 5: CRT scanlines + tracking distortion ──
drawCrtScanlines(ctx, w, h, 0.06, 4);
// Tracking distortion — a line that drifts across the screen
const trackCycle = 5000;
const trackPhase = (Date.now() % trackCycle) / trackCycle;
const trackY = trackPhase * h;
ctx.fillStyle = `rgba(${accent},0.05)`;
ctx.fillRect(0, trackY, w, 2 * s);
// ── Layer 6: Corner brackets with orange glow ──
drawCornerBrackets(
ctx, w, h, m,
Math.round(80 * s), // arm length
Math.round(7 * s), // thickness
`rgba(${accent},${0.4 + blinkPhase * 0.2})`, // pulses with blink
Math.round(14 * s) // glow size
);
// ── PRIMARY: Title "MARS LOG" (top-left) ──
ctx.font = `bold ${Math.round(22 * s)}px monospace`;
ctx.shadowColor = "#d75f2d";
ctx.shadowBlur = 12 * s;
ctx.fillStyle = `rgba(${accent},0.95)`;
ctx.fillText("MARS LOG", m + 18 * s, m + 16 * s);
// Mission ID below title
ctx.font = `${Math.round(11 * s)}px monospace`;
ctx.fillStyle = `rgba(${accent},0.6)`;
ctx.fillText(getMissionId(), m + 18 * s, m + 46 * s);
// ── PRIMARY: UTC timestamp (top-right) ──
drawBigTimestamp(ctx, w - m - 18*s, m + 16*s, utcTimestamp(), 26*s,
`rgba(${accent},0.85)`, "right", "30,12,4");
// ── SECONDARY: SOL counter (bottom-left) ──
const solNum = solBase + Math.floor(elapsedMs / 1000);
const solText = `SOL ${solNum}`;
// ... draws with background panel and mini brackets ...
// ── SECONDARY: ENV DATA (bottom-right) ──
// Temperature, pressure, radiation — all slowly drifting
const env = updateEnvData();
ctx.fillText(`TEMP: ${env.temp.toFixed(1)}°C`, brX, brBaseY);
ctx.fillText(`PRES: ${env.pres} Pa`, brX, brBaseY - envLineH);
ctx.fillText(`RAD: ${env.rad.toFixed(2)} mSv`, brX, brBaseY - 2*envLineH);
// ── TERTIARY: Terminal log + center crosshair ──
const logLines = updateTerminalLog("mars_log", TERMINAL_LINES_MARS, 900, 8);
drawTerminalLog(ctx, m + 18*s, m + 100*s, s, logLines, "255,160,80", 7000, 8);
// ── REC indicator (bottom-center, only during recording) ──
if (isRecording) {
drawRecIndicator(ctx, w/2, h - m - 22*s, s, blinkPhase, elapsedMs);
}
}
Drifting Telemetry Data
The environment data (temp, pressure, radiation) slowly drifts using a random walk:
const envData = { temp: -42.3, pres: 610, rad: 0.24, lastUpdate: 0 };
const ENV_BASE = { temp: -42.3, pres: 610, rad: 0.24 };
const ENV_RANGE = { temp: 1.5, pres: 5, rad: 0.06 };
function updateEnvData() {
const now = Date.now();
const interval = 3000 + Math.random() * 2000; // 3-5 seconds
if (now - envData.lastUpdate < interval) return envData;
envData.lastUpdate = now;
// Tiny random delta
const dTemp = (Math.random() - 0.5) * 0.3;
const dPres = Math.round((Math.random() - 0.5) * 2);
const dRad = (Math.random() - 0.5) * 0.02;
// Apply + clamp to range
envData.temp = Math.max(
ENV_BASE.temp - ENV_RANGE.temp,
Math.min(ENV_BASE.temp + ENV_RANGE.temp, envData.temp + dTemp)
);
// ... same for pres and rad ...
return envData;
}
This creates a realistic "live telemetry" feel — the numbers change, but only slightly and slowly. They stay within a realistic range around the base values.
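The pattern generalizes to any telemetry value: nudge by a small random delta, then clamp to base ± range. A generic sketch of that step (our own helper, not LogMaker's code):

```javascript
// One step of a clamped random walk: small random nudge, then clamp the
// result into [base - range, base + range].
function walkClamped(value, base, range, maxDelta) {
  const next = value + (Math.random() - 0.5) * 2 * maxDelta;
  return Math.max(base - range, Math.min(base + range, next));
}

// e.g. temperature: walkClamped(temp, -42.3, 1.5, 0.15)
// always stays within [-43.8, -40.8].
```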
Step 6: VHS-Specific Effects
The VHS Archive theme has unique effects that deserve their own section.
Color Bleed (Chromatic Aberration)
The title text is drawn twice — once with a red offset to simulate analog color bleeding:
// Red ghost — shifted right
ctx.fillStyle = `rgba(200,140,140,0.18)`;
ctx.fillText("ARCHIVE LOG", titleX + 2 * s + jX, titleY + jY);
// Main text on top
ctx.fillStyle = `rgba(${accent},0.85)`;
ctx.fillText("ARCHIVE LOG", titleX + jX, titleY + jY);
Jitter
The position offset jX and jY add random micro-movements:
const jX = (Math.random() - 0.5) * 1.5;
const jY = (Math.random() - 0.5) * 1.5;
Since this runs every frame at 60fps, the text appears to vibrate slightly — just like a paused VHS tape.
Edge Wobble
The top and bottom edges wobble with a sine wave to simulate tape head instability:
const edgeH = Math.round(h * 0.02);
const wobble = Math.sin(Date.now() / 200) * 3 * s;
ctx.fillStyle = "rgba(0,0,0,0.3)";
ctx.fillRect(wobble, 0, w + 10, edgeH);
ctx.fillRect(-wobble, h - edgeH, w + 10, edgeH);
Tracking Line
A bright horizontal line that periodically sweeps down the screen, like a VHS tracking error:
const trackCycle = 4000;
const trackPhase = (Date.now() % trackCycle) / trackCycle;
if (trackPhase < 0.1) { // only visible 10% of the time
const trackY = (trackPhase / 0.1) * h;
ctx.fillStyle = "rgba(255,255,255,0.1)";
ctx.fillRect(0, trackY - 1, w, 2.5 * s);
}
Step 7: Recording the Composited Output
This is where captureStream() and MediaRecorder come together:
const handleStartRecording = () => {
const canvas = canvasRef.current;
// 1. Capture the canvas at 30fps
const canvasStream = canvas.captureStream(30);
// 2. Add audio tracks from the original camera stream
const audioTracks = streamRef.current?.getAudioTracks() || [];
audioTracks.forEach((track) => canvasStream.addTrack(track));
// 3. Pick the best codec
const mimeType = pickMimeType();
const recorder = new MediaRecorder(canvasStream, { mimeType });
// 4. Collect chunks
const chunks = [];
recorder.ondataavailable = (e) => {
if (e.data && e.data.size > 0) chunks.push(e.data);
};
// 5. On stop, create a downloadable blob
recorder.onstop = () => {
const blob = new Blob(chunks, { type: mimeType });
const url = URL.createObjectURL(blob);
setDownloadUrl(url);
};
// 6. Start recording with 100ms timeslice
recorder.start(100);
};
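The matching stop handler isn't shown above. Calling recorder.stop() flushes one final dataavailable event and then fires onstop; stopping an already-inactive recorder throws an InvalidStateError, so it's worth guarding. A minimal sketch, written as a pure function for testability:

```javascript
// Stop a MediaRecorder safely. Returns true if a stop was actually issued.
function stopRecording(recorder) {
  if (recorder && recorder.state !== "inactive") {
    recorder.stop(); // flushes the last chunk, then fires onstop
    return true;
  }
  return false;
}
```

In the component this would read the recorder from a ref stored at start time; the article's handleStartRecording doesn't show keeping a reference, so assume something like a recorderRef.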
Codec Selection
We try VP9 first (better quality), then fall back:
function pickMimeType() {
const candidates = [
"video/webm;codecs=vp9,opus",
"video/webm;codecs=vp8,opus",
"video/webm;codecs=vp9",
"video/webm;codecs=vp8",
"video/webm",
];
for (const mime of candidates) {
if (MediaRecorder.isTypeSupported(mime)) return mime;
}
return "video/webm";
}
Why 30fps Capture but 60fps Draw?
The draw loop runs at 60fps for a smooth preview, but captureStream(30) captures at 30fps. This is intentional:
- 60fps draw: Smooth visual feedback for the user
- 30fps capture: Smaller file size, still looks great for video
Download and Share
const handleDownload = () => {
const a = document.createElement("a");
a.href = downloadUrl;
a.download = "logmaker-recording.webm";
document.body.appendChild(a);
a.click();
document.body.removeChild(a);
};
// Web Share API (mobile)
const handleShare = async () => {
const file = new File([blob], "logmaker-recording.webm", {
type: "video/webm",
});
await navigator.share({
files: [file],
title: "LogMaker Recording",
});
};
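navigator.share with files isn't supported everywhere (notably on some desktop browsers), so it's worth feature-detecting with the real navigator.canShare API before showing a share button. A sketch that takes navigator as a parameter so it's testable outside the browser:

```javascript
// True only if the Web Share API is present and reports it can share
// this particular file.
function canShareFile(nav, file) {
  return !!(nav &&
    typeof nav.canShare === "function" &&
    nav.canShare({ files: [file] }));
}
```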
Step 8: Device Management
Users might have multiple cameras and microphones. We enumerate them and let them switch:
const enumerateDevices = async () => {
const devices = await navigator.mediaDevices.enumerateDevices();
const vDevices = devices.filter((d) => d.kind === "videoinput");
const aDevices = devices.filter((d) => d.kind === "audioinput");
setVideoDevices(vDevices);
setAudioDevices(aDevices);
// Restore saved selections from localStorage
const savedVideo = localStorage.getItem("logmaker_videoDevice");
if (savedVideo && vDevices.some((d) => d.deviceId === savedVideo)) {
setSelectedVideoDevice(savedVideo);
}
};
// Listen for device changes (plug/unplug)
useEffect(() => {
navigator.mediaDevices.addEventListener("devicechange", enumerateDevices);
return () => navigator.mediaDevices.removeEventListener("devicechange", enumerateDevices);
}, []);
Device labels are only available after permission is granted (a browser security measure), so we enumerate again after the first getUserMedia call succeeds.
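A simple way to detect the pre-permission state is to check whether any labels came back non-empty (the helper is our own sketch, not LogMaker's actual code):

```javascript
// Before permission is granted, enumerateDevices() returns devices with
// empty label strings. A non-empty label means we can show a real picker.
const hasDeviceLabels = (devices) =>
  devices.some((d) => d.label && d.label.length > 0);
```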
Performance Considerations
A few things we do to keep this running at 60fps:
- Frame-skipped noise: Grain only renders every 3rd frame
- Throttled React updates: UI elapsed time updates at ~2Hz, not 60Hz
- Capped dot counts: Noise dots are proportional to canvas area
- Module-level state: Terminal log state lives outside React to avoid re-renders
- No DOM manipulation in the loop: Everything is pure canvas operations
- Ref-based status reads: Avoids stale closures in the draw loop
// Throttled UI update — only every 15th frame (~2Hz at 30fps)
if (isRec && frameCountRef.current % 15 === 0) {
setElapsedTime(elapsed);
}
Putting It All Together
The React component structure is minimal:
export default function RecorderClient() {
const [status, setStatus] = useState("idle");
// idle → camera_on → recording → stopped
return (
<div className="flex flex-col flex-1 min-h-0">
{/* Stage: canvas fills available space */}
<div className="relative flex-1 bg-black">
<video ref={videoRef} autoPlay muted playsInline
className="w-0 h-0 opacity-0" />
<canvas ref={canvasRef} width={1280} height={720}
className="w-full h-full object-contain" />
</div>
{/* Control bar: theme selector + action buttons */}
<div className="flex items-center gap-2 px-3 py-2 bg-gray-950">
<select value={selectedTheme} onChange={...}>
{THEMES.map(t => <option key={t.id} value={t.id}>{t.name}</option>)}
</select>
{/* Buttons change based on status */}
</div>
</div>
);
}
The canvas uses object-contain CSS so it maintains aspect ratio while filling the available space — no letterboxing logic needed.
Key Takeaways
- Canvas is your compositing layer. Use drawImage() for the video frame and draw everything else on top. captureStream() captures the final result.
- Keep HUD drawing pure. No React, no DOM. Just a function that takes (ctx, width, height, state) and draws. This is fast and testable.
- Use module-level state for animation. Terminal logs, environment data, and mission IDs live outside React. They persist across frames without causing re-renders.
- Scale everything relative to canvas width. A single S(w) function makes the entire HUD responsive.
- Use refs for the draw loop. React state creates stale closures in requestAnimationFrame. Refs always have the current value.
- Layer your effects. Tone → vignette → grain → scanlines → brackets → text. Each layer adds depth. Keep alpha values low — subtlety is cinematic.
Try It Out
logmaker.app — try it right now in your browser. No signups, no backend, no data collection.
The HUD system is under 1000 lines of pure Canvas 2D, and the entire app has zero runtime dependencies beyond React and Next.js.
If you build your own HUD theme, I'd love to see it. Drop a comment below!
What sci-fi HUD would you want to see next? Night-vision? Submarine sonar? Mech cockpit? Let me know in the comments.
