
Yitao

Next.js 15 + Supabase: I Accidentally Blew Past My Quota by 1000% (and How “Local‑First” Saved It)

TL;DR: My “perfectly reasonable” real-time architecture for an online party game turned into a billing horror story — first with Supabase Realtime broadcasts, then with Redis + polling. The fix wasn’t “optimize the server.” It was stop needing the server for the common use case.


1. The morning my dashboard tried to jump-scare me

One morning I woke up to warning emails from Supabase and Vercel:

“Your project has significantly exceeded its Realtime message quota.”

I opened the dashboard and had to re-check that I was looking at the right project.

In just a 10-day span, Supabase Realtime (Broadcast + Presence) had processed roughly 50,000,000 messages (the bill line item showed 54,557,731 Realtime Messages).

That wasn’t “a little over.” It was 1000%+ over the included quota.

For context: I run Imposter Game — a browser-based party game (think “Liar Game” / social deduction) that works with 3 to 99 players. No installs, no logins — just open a URL and play.

User growth is great… until your side project starts throwing punches at your wallet.


2. Technical post-mortem: two failures back-to-back

The stack

  • Framework: Next.js 15 (App Router)
  • Database: Supabase (Postgres)
  • Realtime: Supabase Realtime (Broadcast + Presence)
  • State: Upstash Redis (Vercel KV)

Failure 1: Broadcast everything (Supabase Realtime)

My first approach was the classic “real-time multiplayer” instinct:

  • Any state change? Broadcast it immediately.
  • Timer? Send updates every second.
  • Presence? Track joins/leaves live.
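
To make the failure mode concrete, here is a rough sketch of that pattern using the supabase-js broadcast API (the function and event names are illustrative, not my production code):

// Hypothetical host-side timer broadcast (sketch, not the real game code)
import { createClient } from "@supabase/supabase-js";

const supabase = createClient(
    process.env.NEXT_PUBLIC_SUPABASE_URL!,
    process.env.NEXT_PUBLIC_SUPABASE_ANON_KEY!
);

function startTimerBroadcast(roomId: string, durationSec: number) {
    // Every player in the room subscribes to this same channel
    const channel = supabase.channel(`room:${roomId}`);
    channel.subscribe();

    let timeLeft = durationSec;
    const interval = setInterval(() => {
        // One send per second; each send fans out to every subscriber,
        // so 1 tick × 50 players = 50 billable messages
        channel.send({ type: "broadcast", event: "timer-tick", payload: { timeLeft } });
        if (--timeLeft <= 0) clearInterval(interval);
    }, 1000);
}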

Here’s the core math that bit me: Supabase charges on egress messages — effectively:

1 event × number of subscribers in the room (N)

So with N = 50 players:

  • Every second: 1 timer tick × 50 recipients = 50 messages/sec
  • One 15-minute round (900 sec): 50 × 900 = 45,000 messages
  • Add votes, reactions, and Presence traffic…and the number explodes.

Result: ~50M messages in about 10 days, quota obliterated.

Failure 2: “Fine, I’ll use Redis + polling”

My next thought was also extremely common:

“Realtime is expensive. Let’s store state in Redis and have clients poll.”

So I turned off the broadcast approach and switched to:

  • State stored in Upstash Redis
  • Client polls GET /api/game-state once per second
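
Roughly, the server side of the polling path looked like this (route path and key names are illustrative assumptions, not the exact production code):

// app/api/game-state/route.ts (sketch with hypothetical key names)
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv();

export async function GET(request: Request) {
    const roomId = new URL(request.url).searchParams.get("roomId");

    // Three separate reads per poll: room, round, players
    const room = await redis.get(`room:${roomId}`);
    const round = await redis.get(`round:${roomId}`);
    const players = await redis.get(`players:${roomId}`);

    return Response.json({ room, round, players });
}

Each client then hit that endpoint from a one-second setInterval.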

This looked “cheaper” in my head. It wasn’t.

If each poll triggers ~3 Redis commands (Room/Round/Player):

10 concurrent users × 1 poll/sec × 3 commands = 30 commands/sec ≈ 1,800 commands/min

And Upstash's free monthly quota (500k commands)?

It evaporated in less than half a day (roughly 4.6 hours at that rate).

I ended up adding a credit card for Pay As You Go just to keep the app alive.

At that point I had to admit it:

“Congrats. I just wrote my own DDoS script.”

(Also: Vercel and Upstash being in different regions increased RTT and made the whole thing feel even worse.)


3. The real realization: I was solving the wrong problem

My initial “solutions” were all server-side optimizations:

  • batch Redis reads (MGET)
  • reduce timer update frequency (1s → 5s)
  • compress payloads
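
For reference, the batching idea just collapses the three reads from the polling sketch above into one command (same hypothetical keys). It shaves a constant factor, but every client still polls every second:

// Batched variant of the hypothetical route handler: 1 MGET instead of 3 GETs
import { Redis } from "@upstash/redis";

const redis = Redis.fromEnv();

export async function GET(request: Request) {
    const roomId = new URL(request.url).searchParams.get("roomId");

    const [room, round, players] = await redis.mget(
        `room:${roomId}`,
        `round:${roomId}`,
        `players:${roomId}`
    );

    return Response.json({ room, round, players });
}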

Then I paused and pictured the real-world usage.

Most people play party games… in the same room, around the same table.

So why were 10 friends at a campsite burning LTE data and battery life, constantly syncing with a server across the planet?

The problem wasn’t “how do I scale my server cheaper?”

It was:

“How do I remove the server from the default experience?”


4. The pivot: Local‑First, client‑only (“no server” mode)

I made a bold call:

For in-person play, don’t use the network at all.

Not “serverless.” Not “edge.”

Just 0 API calls.

New architecture: client-only pass-and-play

One phone acts as the host. Players pass the device around to confirm roles (“pass and play”), then play together locally.

The Local Mode component is a Next.js client component ("use client"), but internally it behaves like a little state machine.
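
For a sense of what that state machine holds, here's a minimal sketch of the shape (field and role names are illustrative, not the real app's types):

// Illustrative local game state; the real component keeps something similar in useState
type Phase = "lobby" | "roleReveal" | "discussion" | "voting" | "result";

interface Player {
    id: string;
    name: string;
    role: "imposter" | "civilian";
}

interface LocalGameState {
    phase: Phase;
    players: Player[];
    votes: Record<string, string>; // voterId -> targetId
    winner?: "imposter" | "civilian";
}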

Local timer (no drift, no server)

Instead of a server-driven setInterval, I use requestAnimationFrame + Date.now() to compute the time left deterministically.

// useGameTimer.ts (simplified)
import { useEffect, useState } from "react";

export function useGameTimer(duration: number) {
    const [timeLeft, setTimeLeft] = useState(duration);

    useEffect(() => {
        const startTime = Date.now();
        let animationFrameId: number;

        const tick = () => {
            // Derive the remaining time from wall-clock time, so dropped frames can't cause drift
            const elapsed = Math.floor((Date.now() - startTime) / 1000);
            const newTimeLeft = Math.max(duration - elapsed, 0);

            setTimeLeft(newTimeLeft);

            if (newTimeLeft > 0) {
                animationFrameId = requestAnimationFrame(tick);
            }
        };

        animationFrameId = requestAnimationFrame(tick);
        return () => cancelAnimationFrame(animationFrameId);
    }, [duration]);

    return timeLeft;
}

(Yes, background tabs have constraints — but this is optimized for in-person local play, where the app stays in the foreground.)

State transitions happen in memory

Role assignment (3–99 players), voting, win conditions — everything runs in browser memory.

const nextPhase = () => {
    setGame(prev => {
        if (prev.phase === "voting") {
            const result = calculateWinner(prev.votes); // local compute
            return { ...prev, phase: "result", winner: result };
        }
        // ...other phase transitions elided
        return prev; // never return undefined from a state updater
    });
};

No network round trip means phase transitions feel instant.

INP optimization for 99 players

Rendering and updating a 99-player list can get janky fast.

React 18’s useTransition helped keep heavy updates non-blocking:

const [isPending, startTransition] = useTransition();

const addPlayer = () => {
    startTransition(() => {
        setGame(prev => {
            const newPlayers = [...prev.players, createNewPlayer()];
            return { ...prev, players: balanceRoles(newPlayers) };
        });
    });
};

Security note (because someone will ask)

Online mode still requires server-side validation.

But in Local Mode, the person holding the phone is effectively authenticated by physics.

Your friends’ eyeballs are the anti-cheat.


Result: costs down, UX up

I restructured the site:

  1. Local Game (Offline Mode) became the main CTA on the homepage.
  2. Online Game stayed as a backup feature (“Remote Mode”).

What changed

  • Realtime messages: ~50M → near zero (because most sessions moved to Local Mode)
  • Redis usage: easily kept within free/low tiers
  • Reliability: no more games dying due to disconnects (even in basements, mountains, and spotty areas)

The best part: users preferred the version with no installs, no logins, and no dependency on good internet.


Takeaway

As developers, we’re often drawn to “real-time,” “websockets,” and “edge everything.”

But the best scaling strategy I’ve learned recently is:

Don’t optimize the server — make the server unnecessary.

Sometimes an Array.map beats a Redis cluster.


Try it

👉 Play Imposter Game

It’s a real demo of a smooth 99-player local party game built with React — no app install required.
