WebSocket vs SSE vs Long Polling: Real-Time JSON Guide 2026

Users expect data to update without refreshing. Stock prices, chat messages, live dashboards, sports scores, AI streaming responses — real-time data is no longer a feature, it is a baseline expectation. But the three main approaches to delivering real-time JSON — WebSocket, Server-Sent Events, and Long Polling — are genuinely different technologies with different trade-offs. Picking the wrong one means either over-engineering a simple feed or under-engineering a bidirectional system. This guide covers every dimension: protocol, direction, browser support, scalability, implementation code, and a decision framework that tells you exactly which to choose.

3: real-time data technologies — WebSocket, SSE, Long Polling — each genuinely different

95%: global browser support for both WebSocket and SSE in 2026 — no polyfills needed

50ms: typical WebSocket message latency vs 100–500ms for long-polling round-trips

6: max concurrent SSE connections per domain in HTTP/1.1 (limit effectively removed by HTTP/2 multiplexing)

1. Definition: What Are WebSocket, SSE, and Long Polling?

Three different answers to: how does the server push data to the client?

Standard HTTP is request-response — the client asks, the server answers, the connection closes. All three real-time technologies solve the same problem differently: keeping data flowing from server to client without constant client polling. They differ fundamentally in directionality, protocol, connection lifecycle, and complexity.

WebSocket — Full-duplex persistent connection

WebSocket starts as an HTTP request, then upgrades the connection to the WebSocket protocol (ws:// or wss://). After the handshake, a single TCP connection stays open and both sides can send frames at any time — simultaneously, with no request-response overhead. Ideal for chat, multiplayer games, collaborative editing, and any scenario requiring truly bidirectional communication.

Server-Sent Events (SSE) — One-way server push over HTTP

SSE is an HTTP endpoint that never closes — the server keeps the connection open and pushes text/event-stream formatted data whenever new information is available. It is strictly server → client (unidirectional). Built on plain HTTP, so it works through proxies and load balancers without configuration. Browsers implement automatic reconnection with the EventSource API. Ideal for live feeds, notifications, dashboards, and AI streaming responses.

Long Polling — Simulated push using repeated HTTP requests

Long polling is a technique, not a protocol. The client sends an HTTP request, the server holds it open (typically up to 30 seconds) until new data is available, then responds. The client immediately sends a new request. This simulates server push using standard HTTP. It is the oldest pattern and works everywhere, but adds per-message overhead from repeated request setup. Use it only for legacy compatibility.

2. How Each Protocol Works — Connection Flow Diagram

WebSocket Connection Lifecycle

HTTP Upgrade Request

GET /ws | Connection: Upgrade | Upgrade: websocket | Sec-WebSocket-Key: <base64>

Server responds 101 Switching Protocols

HTTP 101 Switching Protocols. The same TCP connection is kept; the protocol switches from HTTP to WebSocket framing.

Persistent WS Connection

Both sides can now send binary or text frames at any time. Zero request-response overhead.

Message Exchange

Client → Server: {"type":"chat","text":"Hello"} | Server → Client: {"type":"broadcast","from":"Bob","text":"Hi"}

Close Handshake

Either side sends a close frame. Connection tears down gracefully. Client reconnects if needed.
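The 101 response above also carries a Sec-WebSocket-Accept header derived from the client's key. A minimal sketch of the RFC 6455 computation in Node.js (the key shown is the example from the RFC itself; libraries like ws do this for you):

```javascript
import { createHash } from 'node:crypto';

// RFC 6455: append the fixed GUID to the client's Sec-WebSocket-Key,
// SHA-1 hash the result, and base64-encode the digest.
const WS_GUID = '258EAFA5-E914-47DA-95CA-C5AB0DC85B11';

function computeAcceptKey(secWebSocketKey) {
  return createHash('sha1')
    .update(secWebSocketKey + WS_GUID)
    .digest('base64');
}

// Example handshake key from the RFC:
computeAcceptKey('dGhlIHNhbXBsZSBub25jZQ==');
// → 's3pPLMBiTxaQ9kYGzzhZRbK+xOo='
```

This is shown only to demystify the handshake: the client verifies the echoed value to confirm the server actually speaks WebSocket rather than being a confused HTTP proxy.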

SSE (Server-Sent Events) Lifecycle

GET /api/events

Standard HTTP GET. Client: Accept: text/event-stream. Server does NOT close the connection.

Headers flushed

Content-Type: text/event-stream | Cache-Control: no-cache | Connection: keep-alive. Body starts streaming.

Events pushed as text

data: {"price":142.50,"symbol":"AAPL"}\n\n — double newline terminates each event. Server pushes whenever ready.

EventSource auto-reconnects

If connection drops, the browser automatically reconnects and sends Last-Event-ID header. No client code needed.

Close when done

Client calls eventSource.close() or navigates away. Server detects closed connection and stops sending.
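The framing above is simple enough to parse by hand. This sketch mirrors what EventSource does internally, splitting the stream on blank lines and collecting field: value pairs (parseSSE is a hypothetical helper; a real parser also handles retry: fields and partial chunks):

```javascript
// Split a complete text/event-stream string into event objects.
// Simplified sketch of what the browser's EventSource does internally.
function parseSSE(raw) {
  return raw
    .split('\n\n') // a blank line terminates each event
    .filter((block) => block.trim() !== '')
    .map((block) => {
      const event = { event: 'message', data: '', id: null };
      for (const line of block.split('\n')) {
        if (line.startsWith(':')) continue; // comment line (e.g. heartbeat)
        const sep = line.indexOf(':');
        const field = sep === -1 ? line : line.slice(0, sep);
        const value = sep === -1 ? '' : line.slice(sep + 1).trimStart();
        if (field === 'event') event.event = value;
        if (field === 'id') event.id = value;
        if (field === 'data') event.data += (event.data ? '\n' : '') + value; // multi-line data joins with \n
      }
      return event;
    })
    .filter((event) => event.data !== ''); // only events carrying data are dispatched
}

// parseSSE('event: price\ndata: {"price":142.5}\n\n: heartbeat\n\n')
// → [{ event: 'price', data: '{"price":142.5}', id: null }]
```

Note how the heartbeat comment vanishes silently: that is why the server in section 4 can send `: heartbeat` lines without any client-side handling.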

Long Polling Lifecycle

Client sends GET /poll

Standard HTTP GET with optional since= timestamp. Server holds the connection open.

Server waits (holds)

Server checks for new data every 500ms for up to 30 seconds. If data arrives, respond immediately.

Response with data

New data found: 200 OK with JSON body. Timeout with no data: 200 OK with empty array. Client processes.

Client reconnects

Immediately after receiving a response, client sends a new GET /poll. Cycle repeats forever.

Each message costs extra round-trips

Delivering one message takes the held response plus a fresh request to re-arm the poll, adding latency and per-request overhead vs persistent connections.

3. When to Use Each — Full Comparison Chart

| Factor | WebSocket | SSE (Server-Sent Events) | Long Polling |
| --- | --- | --- | --- |
| Direction | Bidirectional (full-duplex) | Server → Client only | Server → Client (simulated) |
| Protocol | ws:// or wss:// (RFC 6455) | HTTP with text/event-stream | Plain HTTP requests |
| Connection | Single persistent TCP socket | Single persistent HTTP connection | Repeated HTTP request-response |
| Latency | ~50ms — lowest possible | ~100ms — near real-time | ~200–500ms — slower per message |
| Auto-reconnect | ❌ Must implement manually | ✅ EventSource handles it natively | ✅ Client loops on response |
| Proxy/firewall | ⚠️ Some proxies block upgrades | ✅ Plain HTTP — works everywhere | ✅ Plain HTTP — works everywhere |
| HTTP/2 multiplex | ❌ Separate WS protocol | ✅ Multiplexed over H2 streams | ✅ Benefits from H2 connection reuse |
| Browser support | ✅ All modern browsers | ✅ All modern browsers (not IE) | ✅ Every browser, including IE |
| Server load | Low — one connection per client | Low — one connection per client | High — a new request per message |
| Binary data | ✅ Native binary frames | ❌ Text only (base64-encode binary) | ❌ Text only |
| Complexity | High — needs a WS server library | Low — plain HTTP endpoint | Medium — hold-and-poll logic |
| Best for | Chat, games, collaborative editing | Feeds, notifications, AI streaming | Legacy browsers, simple updates |
4. How to Implement All Three — Production Node.js Code

WebSocket server — Node.js with the ws library (JavaScript)
import { WebSocketServer } from 'ws';
import http from 'http';

const server = http.createServer();
const wss    = new WebSocketServer({ server });

// Track connected clients with metadata
const clients = new Map(); // ws → { userId, roomId }

// Resolve the user from the upgrade request (a ?userId= query param here, for brevity)
const extractUserId = (req) =>
  new URL(req.url, `http://${req.headers.host}`).searchParams.get('userId') ?? 'anonymous';

wss.on('connection', (ws, req) => {
  const userId = extractUserId(req);
  clients.set(ws, { userId });

  // Send welcome message immediately
  ws.send(JSON.stringify({ type: 'connected', userId, timestamp: new Date().toISOString() }));

  ws.on('message', (rawData) => {
    try {
      const message = JSON.parse(rawData.toString());

      if (message.type === 'chat') {
        // Broadcast to all clients in the same room
        const sender = clients.get(ws);
        const payload = JSON.stringify({
          type:      'chat',
          from:      sender.userId,
          text:      message.text,
          timestamp: new Date().toISOString(),
        });

        wss.clients.forEach((client) => {
          if (client !== ws && client.readyState === 1 /* OPEN */) {
            client.send(payload);
          }
        });
      }
    } catch {
      ws.send(JSON.stringify({ type: 'error', message: 'Invalid JSON message' }));
    }
  });

  ws.on('close', () => clients.delete(ws));
  ws.on('error', (err) => console.error('WS error:', err));
});

server.listen(3001, () => console.log('WebSocket server on :3001'));
WebSocket React client — with reconnection logic (JavaScript)
import { useEffect, useRef, useCallback, useState } from 'react';

function useWebSocket(url) {
  const wsRef        = useRef(null);
  const [messages, setMessages] = useState([]);
  const [status, setStatus]     = useState('connecting');

  const connect = useCallback(() => {
    const ws = new WebSocket(url);
    wsRef.current = ws;

    ws.onopen    = ()   => setStatus('connected');
    ws.onclose   = ()   => {
      setStatus('disconnected');
      setTimeout(connect, 3000); // reconnect after 3s
    };
    ws.onerror   = ()   => setStatus('error');
    ws.onmessage = (e)  => {
      const data = JSON.parse(e.data);
      setMessages((prev) => [...prev, data]);
    };
  }, [url]);

  useEffect(() => {
    connect();
    return () => {
      const ws = wsRef.current;
      if (ws) ws.onclose = null; // stop the reconnect timer from firing after unmount
      ws?.close();
    };
  }, [connect]);

  const send = useCallback((data) => {
    if (wsRef.current?.readyState === WebSocket.OPEN) {
      wsRef.current.send(JSON.stringify(data));
    }
  }, []);

  return { messages, status, send };
}
SSE server — Express endpoint, simpler than WebSocket (JavaScript)
import express from 'express';
const app = express();

// In-memory pub/sub (use Redis in production for multi-instance)
const subscribers = new Set();

// ── SSE endpoint — one persistent HTTP connection per client ───────────────
app.get('/api/events', (req, res) => {
  // Required headers for SSE
  res.setHeader('Content-Type',  'text/event-stream');
  res.setHeader('Cache-Control', 'no-cache');
  res.setHeader('Connection',    'keep-alive');
  res.setHeader('X-Accel-Buffering', 'no'); // disable Nginx buffering
  res.flushHeaders(); // send headers immediately, keep connection open

  // Send a heartbeat every 30s to prevent proxy timeouts
  const heartbeat = setInterval(() => {
    res.write(': heartbeat\n\n'); // SSE comment — ignored by EventSource
  }, 30_000);

  // Register this client
  const send = (event, data) => {
    res.write(`event: ${event}\n`);
    res.write(`data: ${JSON.stringify(data)}\n\n`);
  };

  subscribers.add(send);

  // Clean up when client disconnects
  req.on('close', () => {
    clearInterval(heartbeat);
    subscribers.delete(send);
  });
});

// ── Publish — call this whenever you have new data ─────────────────────────
function broadcast(event, data) {
  subscribers.forEach((send) => send(event, data));
}

// Example: push stock price every second
setInterval(() => {
  broadcast('price', { symbol: 'AAPL', price: (140 + Math.random() * 10).toFixed(2) });
}, 1000);

app.listen(3002);
SSE React hook — useLivePrices with typed events (TypeScript)
import { useEffect, useState } from 'react';

interface PriceEvent { symbol: string; price: string; }

function useLivePrices(symbol: string) {
  const [price, setPrice]   = useState<string | null>(null);
  const [status, setStatus] = useState<'connecting' | 'open' | 'error'>('connecting');

  useEffect(() => {
    const es = new EventSource(`/api/events?symbol=${symbol}`);

    es.addEventListener('price', (e: MessageEvent) => {
      const data: PriceEvent = JSON.parse(e.data);
      setPrice(data.price);
      setStatus('open');
    });

    es.onerror = () => {
      setStatus('error');
      // EventSource automatically reconnects — no manual logic needed
    };

    return () => es.close();
  }, [symbol]);

  return { price, status };
}

// Usage:
// const { price, status } = useLivePrices('AAPL');
Long Polling — server and client, legacy compatibility (JavaScript)
// ── Server ─────────────────────────────────────────────────────────────────
app.get('/api/poll', async (req, res) => {
  const since   = req.query.since ? new Date(req.query.since) : new Date(0);
  const timeout = 28_000; // 28s — leave 2s buffer before proxy/load-balancer timeout
  const interval = 500;   // check every 500ms
  const start   = Date.now();

  while (Date.now() - start < timeout) {
    const updates = await db.query(
      'SELECT * FROM events WHERE created_at > ? ORDER BY created_at ASC LIMIT 50',
      [since]
    );

    if (updates.length > 0) {
      return res.json({
        data:      updates,
        timestamp: new Date().toISOString(),
        hasMore:   updates.length === 50,
      });
    }

    await new Promise((resolve) => setTimeout(resolve, interval));
  }

  // Timeout — return empty, client reconnects immediately
  res.json({ data: [], timestamp: new Date().toISOString(), hasMore: false });
});

// ── Client ─────────────────────────────────────────────────────────────────
let lastTimestamp = null;

async function longPoll() {
  try {
    const url = lastTimestamp ? `/api/poll?since=${lastTimestamp}` : '/api/poll';
    const res  = await fetch(url, { signal: AbortSignal.timeout(35_000) });
    const body = await res.json();

    if (body.data.length > 0) {
      processUpdates(body.data);
      lastTimestamp = body.timestamp;
    }
  } catch (err) {
    if (err.name !== 'AbortError') await sleep(2000); // back off on real errors before retrying
  }

  longPoll(); // reconnect; the recursive call runs after awaits, so the stack never grows
}

const sleep = (ms) => new Promise((resolve) => setTimeout(resolve, ms));

longPoll(); // start the loop
5. Why the Right Choice Matters — Performance and Scalability

Each technology has a fundamentally different cost model at scale. The wrong choice for your use case does not show up in development with 10 concurrent users — it shows up in production with 10,000.

1. Is communication bidirectional? → WebSocket

If the client must send data to the server at any time — chat messages, game inputs, collaborative edits, cursor positions — WebSocket is the only technology that handles this without a second HTTP connection for client-to-server messages. SSE is receive-only.

2. Is it server-to-client only, and you want simplicity? → SSE

Live dashboards, stock tickers, notification feeds, AI streaming text, live sports scores — all are server-to-client. SSE delivers all of these with less code than WebSocket, better proxy compatibility, and automatic reconnection baked into the browser EventSource API. HTTP/2 multiplexes SSE streams over a single connection, lifting HTTP/1.1's 6-connection-per-domain limit (servers typically allow around 100 concurrent streams by default).

3. Do you need to support very old browsers or corporate proxies that block WS? → SSE or Long Polling

Some enterprise networks block WebSocket upgrades at the proxy level. SSE runs over plain HTTP and passes through every proxy. Long polling is the nuclear option: it works in every browser, through every proxy, on every network — at the cost of higher latency and server connection count.

4. Is this a Next.js or edge-deployed app? → SSE via Route Handlers

WebSocket servers need a long-lived process, which typical serverless and edge platforms do not provide. SSE works in Next.js App Router route handlers using ReadableStream, and long polling works too. For Next.js apps, SSE is almost always the right real-time choice.
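A sketch of that pattern, assuming a route file such as app/api/events/route.js (the one-second AAPL ticker is illustrative):

```javascript
// Sketch of app/api/events/route.js (hypothetical path). In Next.js this
// function would be `export function GET()`; it runs standalone in Node 18+
// because Response, ReadableStream, and TextEncoder are globals there.
function GET() {
  const encoder = new TextEncoder();
  let timer;

  const stream = new ReadableStream({
    start(controller) {
      // SSE comment line so the client knows the stream is live.
      controller.enqueue(encoder.encode(': connected\n\n'));

      // Push one SSE-framed event per second; real code would subscribe
      // to a data source (Redis, a DB change feed, etc.) instead.
      timer = setInterval(() => {
        const payload = JSON.stringify({
          symbol: 'AAPL',
          price: (140 + Math.random() * 10).toFixed(2),
        });
        controller.enqueue(encoder.encode(`data: ${payload}\n\n`));
      }, 1000);
      timer.unref?.(); // Node only: don't let the demo timer block process exit
    },
    cancel() {
      clearInterval(timer); // client disconnected; stop pushing
    },
  });

  return new Response(stream, {
    headers: {
      'Content-Type': 'text/event-stream',
      'Cache-Control': 'no-cache, no-transform',
      Connection: 'keep-alive',
    },
  });
}
```

The client side is unchanged: the same EventSource-based hook from section 4 works whether the server is Express or a route handler.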

5. Are you streaming AI responses? → SSE or NDJSON over fetch

OpenAI, Anthropic, and Gemini all stream responses as Server-Sent Events or NDJSON over HTTP. For displaying AI responses token by token in your UI, SSE on your API proxy endpoint is exactly the right tool: simple, compatible, and a direct map to the upstream streaming format.
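On the client, fetch plus a stream reader consumes such a stream without EventSource, which matters because EventSource cannot send POST bodies. This sketch (readSSEData is a hypothetical helper) yields each JSON data: payload and stops at OpenAI's data: [DONE] sentinel; the input can be any async iterable of bytes, such as response.body in Node 18+:

```javascript
// Consume an SSE byte stream and yield each parsed JSON `data:` payload.
// `byteStream` is any async iterable of Uint8Array chunks (e.g. response.body).
async function* readSSEData(byteStream) {
  const decoder = new TextDecoder();
  let buffer = '';

  for await (const chunk of byteStream) {
    buffer += decoder.decode(chunk, { stream: true });

    // Events end with a blank line; keep any trailing partial event buffered.
    const blocks = buffer.split('\n\n');
    buffer = blocks.pop();

    for (const block of blocks) {
      for (const line of block.split('\n')) {
        if (!line.startsWith('data: ')) continue;
        const data = line.slice(6);
        if (data === '[DONE]') return; // OpenAI-style end-of-stream sentinel
        yield JSON.parse(data);
      }
    }
  }
}
```

Usage sketch: `const res = await fetch('/api/chat', { method: 'POST', body });` then `for await (const delta of readSSEData(res.body)) { /* append token */ }`. The buffering step is the important part: network chunks do not align with event boundaries, so a naive line-by-line reader will corrupt JSON payloads.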

Quick-pick decision table — 30-second read

Chat app / multiplayer game / collaborative doc editor → WebSocket (bidirectional required).
Live dashboard / stock ticker / notification feed / AI streaming → SSE (server push, simple, works everywhere).
Next.js / Vercel / edge deployment → SSE via a ReadableStream route handler.
Legacy IE support / extreme corporate firewall → Long Polling.
Binary data (audio, video frames) → WebSocket (SSE is text-only).

Malformed SSE event data, invalid WebSocket JSON frames, or broken long-poll responses — paste any JSON into our AI Error Explainer and get an instant diagnosis with plain-English explanations and auto-fix.

Fix My Real-Time JSON →

Frequently Asked Questions