HAR to cURL Converter — Complete Guide to Capturing and Replaying HTTP Requests

HAR (HTTP Archive) files record every network request your browser makes — including all headers, cookies, request bodies, and timing data. Converting them to cURL commands lets you replay, debug, and automate those exact requests from the terminal or in API testing tools. This guide explains the HAR format, how to export from all major browsers, how to convert to cURL with a Python script, how to sanitize sensitive data before sharing, and real-world use cases for API debugging, automation testing, and security analysis.

HAR: HTTP Archive, a JSON recording of all browser requests
All browsers: Chrome, Firefox, Safari, and Edge all export HAR
cURL: replays exact requests with all headers and cookies
Debugging: essential for API debugging, automation, and load testing

1. What is a HAR File?

HAR format overview

HAR (HTTP Archive) is a JSON-format specification for logging all network interactions between a web browser and a server. It captures: request URLs, HTTP methods, request/response headers, cookies, request bodies, response status codes, response bodies, and detailed timing data (DNS lookup, TCP connection, SSL handshake, time to first byte, content download). It's the complete forensic record of what your browser sent and received.
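Because HAR is plain JSON, you can inspect a capture with nothing but the standard library. A minimal sketch (the filename session.har is a placeholder):

```python
import json

def summarize_har(path: str) -> list[tuple[str, str, int]]:
    """Return (method, url, status) for every entry in a HAR file."""
    with open(path, encoding='utf-8') as f:
        har = json.load(f)
    return [
        (e['request']['method'], e['request']['url'], e['response']['status'])
        for e in har['log']['entries']
    ]

# for method, url, status in summarize_har('session.har'):
#     print(status, method, url)
```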

2. How to Export a HAR File from Any Browser

Step 1: Open browser DevTools

Chrome/Edge: F12 or Ctrl+Shift+I. Firefox: F12. Safari: Cmd+Option+I (must enable Developer menu first in Safari Preferences). Navigate to the Network tab.

Step 2: Enable "Preserve log"

Check the "Preserve log" checkbox to keep requests across page navigations. This is critical for capturing login flows or multi-page processes. Also enable "Disable cache" to ensure fresh requests.

Step 3: Reproduce the action

Refresh the page or perform the action you want to capture — login, API call, form submit, file upload. All network requests will appear in the Network tab in real time.

Step 4: Export the HAR file

Chrome/Edge: right-click any request → "Save all as HAR with content." Or click the download icon (⬇) in the toolbar. Firefox: gear icon → "Save All As HAR." Safari: Export button. Saves a .har JSON file.

Step 5: Sanitize before sharing

IMPORTANT: Before sharing the HAR file with anyone, remove or redact sensitive data: auth tokens, session cookies, passwords, API keys. Use a text editor to find and replace sensitive header values.

Step 6: Convert to cURL

Upload to unblockdevs.com/har-to-curl or use the Python script below to extract individual cURL commands from the HAR entries.
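The sanitization step can be scripted rather than done by hand. A sketch of a sanitizer, assuming the usual secret-bearing header names (extend SENSITIVE_HEADERS for your app):

```python
import json

SENSITIVE_HEADERS = {'authorization', 'cookie', 'set-cookie',
                     'x-api-key', 'x-auth-token'}  # extend for your app

def sanitize_har(in_path: str, out_path: str) -> None:
    """Redact auth headers and cookies, and drop response bodies."""
    with open(in_path, encoding='utf-8') as f:
        har = json.load(f)

    for entry in har['log']['entries']:
        for side in ('request', 'response'):
            part = entry.get(side, {})
            for header in part.get('headers', []):
                if header['name'].lower() in SENSITIVE_HEADERS:
                    header['value'] = 'REDACTED'
            for cookie in part.get('cookies', []):
                cookie['value'] = 'REDACTED'
        # Response bodies often echo tokens and PII, so drop them entirely
        entry.get('response', {}).get('content', {}).pop('text', None)

    with open(out_path, 'w', encoding='utf-8') as f:
        json.dump(har, f, indent=2)
```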

3. HAR File Format — What's Inside

HAR file format — annotated structure (JSON, with inline comments for explanation)
{
  "log": {
    "version": "1.2",
    "creator": { "name": "Chrome DevTools", "version": "120.0.6099.71" },
    "browser": { "name": "Chrome", "version": "120.0.6099.71" },
    "pages": [
      {
        "startedDateTime": "2024-01-15T10:30:00.000Z",
        "id": "page_1",
        "title": "Dashboard — My App",
        "pageTimings": { "onLoad": 1250 }
      }
    ],
    "entries": [
      {
        "startedDateTime": "2024-01-15T10:30:00.000Z",
        "time": 49,  // total request time in ms
        "request": {
          "method": "POST",
          "url": "https://api.example.com/auth/login",
          "httpVersion": "HTTP/2.0",
          "headers": [
            { "name": "Content-Type", "value": "application/json" },
            { "name": "Accept", "value": "application/json" },
            { "name": "Authorization", "value": "Bearer eyJhbGci..." }  // ← redact before sharing
          ],
          "cookies": [
            { "name": "session_id", "value": "abc123" }  // ← redact before sharing
          ],
          "queryString": [
            { "name": "version", "value": "v2" }
          ],
          "postData": {
            "mimeType": "application/json",
            "text": "{\"email\":\"user@example.com\",\"password\":\"REDACTED\"}"
          },
          "bodySize": 47,
          "headersSize": 350
        },
        "response": {
          "status": 200,
          "statusText": "OK",
          "headers": [
            { "name": "content-type", "value": "application/json" },
            { "name": "x-request-id", "value": "req_abc123" }
          ],
          "content": {
            "size": 156,
            "mimeType": "application/json",
            "text": "{\"token\":\"eyJhbGciOiJIUzI1NiIsInR5cCI6IkpXVCJ9...\"}"
          },
          "redirectURL": "",
          "headersSize": 180,
          "bodySize": 156
        },
        "timings": {
          "dns": 0,         // DNS lookup time
          "connect": 0,     // TCP connection time
          "ssl": 12,        // TLS handshake time
          "send": 1,        // Time to send request
          "wait": 32,       // Time to first byte (TTFB)
          "receive": 4      // Time to download response body
        }
      }
    ]
  }
}

4. Converting HAR to cURL — Python Script

har_to_curl.py — complete converter (Python)
import json
import sys
import shlex
from pathlib import Path

def har_entry_to_curl(entry: dict) -> str:
    """Convert a single HAR log entry to a cURL command string."""
    req = entry['request']
    method = req['method']
    url = req['url']

    parts = ['curl']

    # Set HTTP method (GET is default, so skip for GET)
    if method != 'GET':
        parts += ['-X', method]

    # URL (quoted for safety)
    parts.append(shlex.quote(url))

    # Headers — skip HTTP/2 pseudo-headers and headers curl manages.
    # The cookie header is also skipped: cookies are emitted via --cookie below.
    skip = {':method', ':path', ':scheme', ':authority', ':protocol',
            'content-length', 'connection', 'host', 'transfer-encoding',
            'cookie'}

    for header in req.get('headers', []):
        name = header['name']
        value = header['value']
        if name.lower() not in skip and not name.startswith(':'):
            parts += ['-H', shlex.quote(f'{name}: {value}')]

    # Cookies (separate from headers for clarity)
    cookies = req.get('cookies', [])
    if cookies:
        cookie_str = '; '.join(f"{c['name']}={c['value']}" for c in cookies)
        parts += ['--cookie', shlex.quote(cookie_str)]

    # Request body
    post_data = req.get('postData', {})
    if post_data:
        text = post_data.get('text', '')
        params = post_data.get('params', [])
        mime = post_data.get('mimeType', '')

        if text:
            parts += ['--data-raw', shlex.quote(text)]
        elif params and mime.startswith('multipart/'):
            # Multipart form data
            for param in params:
                parts += ['-F', shlex.quote(f"{param['name']}={param['value']}")]
        elif params:
            # URL-encoded form data (-F would wrongly switch to multipart)
            for param in params:
                parts += ['--data-urlencode',
                          shlex.quote(f"{param['name']}={param['value']}")]

    # Useful flags
    parts += ['--compressed']  # Accept gzip/deflate
    parts += ['-s']            # Silent (no progress bar)

    # Join with line continuation for readability
    return ' \\\n  '.join(parts)


def har_to_curl(har_path: str, url_filter: str | None = None,
                status_filter: int | None = None, redact_auth: bool = True) -> list[dict]:
    """
    Convert all entries in a HAR file to cURL commands.

    Args:
        har_path: Path to .har file
        url_filter: Only include entries whose URL contains this string
        status_filter: Only include entries with this response status
        redact_auth: Replace auth header values with REDACTED (default: True)

    Returns:
        List of dicts with 'url', 'method', 'status', 'curl' keys
    """
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)

    entries = har['log']['entries']

    # Optional filtering
    if url_filter:
        entries = [e for e in entries if url_filter in e['request']['url']]
    if status_filter:
        entries = [e for e in entries if e['response']['status'] == status_filter]

    results = []
    for entry in entries:
        # Redact sensitive auth headers before converting
        if redact_auth:
            for header in entry['request'].get('headers', []):
                if header['name'].lower() in ('authorization', 'x-api-key', 'x-auth-token'):
                    header['value'] = 'REDACTED'

        curl_cmd = har_entry_to_curl(entry)
        results.append({
            'method': entry['request']['method'],
            'url': entry['request']['url'],
            'status': entry['response']['status'],
            'time_ms': entry.get('time', 0),
            'curl': curl_cmd,
        })

    return results


if __name__ == '__main__':
    if len(sys.argv) < 2:
        print("Usage: python har_to_curl.py <file.har> [url_filter] [status_code]")
        sys.exit(1)

    har_file = sys.argv[1]
    filter_url = sys.argv[2] if len(sys.argv) > 2 else None
    filter_status = int(sys.argv[3]) if len(sys.argv) > 3 else None

    results = har_to_curl(har_file, filter_url, filter_status)

    for i, r in enumerate(results, 1):
        print(f"# ── Request {i}: {r['method']} {r['url'][:80]}")
        print(f"# Status: {r['status']} | Time: {r['time_ms']:.0f}ms")
        print(r['curl'])
        print()

    print(f"# Total: {len(results)} request(s) converted")

5. HAR vs Other API Capture Methods

HAR export (Browser DevTools): captures full browser traffic, including auth cookies and exact headers. When to use: debugging web app API calls, reproducing bugs that only occur in browser context.

Copy as cURL (DevTools): right-click a single request and choose "Copy as cURL" in Chrome or Firefox. When to use: the quickest option for one-off request debugging, with no conversion needed.

Charles Proxy / mitmproxy: intercepts all HTTP/HTTPS traffic, including from mobile apps. When to use: mobile app debugging, HTTPS inspection, modifying requests in-flight.

Wireshark: captures all network traffic at the packet level. When to use: deep protocol analysis, non-HTTP protocols, network-level debugging.

API client recording (Postman): records requests directly in Postman via its interceptor. When to use: building a test collection from real usage.
6. Practical Use Cases

API debugging and reproduction

Capture the exact failing request with all original headers and cookies. Share the cURL command with the backend team for precise reproduction without setting up the same browser session or authentication state.
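A small filter makes pulling out just the failing calls easy; each returned entry can then be fed to har_entry_to_curl() from the script above. A sketch:

```python
import json

def failing_entries(har_path: str, min_status: int = 400) -> list[dict]:
    """Return HAR entries whose response status indicates failure."""
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)
    return [e for e in har['log']['entries']
            if e['response']['status'] >= min_status]
```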

Automation and load testing

Capture a complete user flow in HAR, convert to cURL commands, then wrap in k6 or Locust for load testing. Ensures your load test uses realistic browser-generated requests, not simplified ones that miss important headers.
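The bridge from a HAR capture to a load tool is a flat list of request specs. A sketch of that extraction step (the spec fields chosen here are an assumption; adapt them to what your k6 or Locust scripts expect):

```python
import json

def har_to_requests(har_path: str) -> list[dict]:
    """Flatten HAR entries into request specs for a load-testing script."""
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)
    specs = []
    for e in har['log']['entries']:
        req = e['request']
        specs.append({
            'method': req['method'],
            'url': req['url'],
            # Drop HTTP/2 pseudo-headers; keep the browser's real headers
            'headers': {h['name']: h['value'] for h in req.get('headers', [])
                        if not h['name'].startswith(':')},
            'body': req.get('postData', {}).get('text'),
        })
    return specs
```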

Performance analysis

HAR timing data (dns, connect, ssl, send, wait, receive) shows exactly how long each phase of each request took. Use Chrome HAR Analyzer or WebPageTest to visualize waterfall charts and identify bottlenecks.
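The same timing fields can be triaged from the command line. A sketch that ranks entries by total time, keeping TTFB (the wait phase) alongside for context:

```python
import json

def slowest_requests(har_path: str, top: int = 5) -> list[tuple[float, float, str]]:
    """Return (total_ms, ttfb_ms, url) for the slowest entries, descending."""
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)
    rows = [
        (e.get('time', 0), e['timings'].get('wait', 0), e['request']['url'])
        for e in har['log']['entries']
    ]
    return sorted(rows, reverse=True)[:top]
```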

Integration with Postman/Insomnia

Both Postman and Insomnia support direct HAR import. Postman: File → Import → select .har file → creates a full collection. Insomnia: File → Import. Converts your browser session into a persistent, shareable API collection.

Security: HAR files contain extremely sensitive data

HAR files capture authentication tokens, session cookies, API keys, passwords in request bodies, and all of your response data, including PII. Never share raw HAR files from production sessions containing real user data. Before sharing: replace Authorization header values with "Bearer REDACTED", replace Cookie header values with "session=REDACTED", and remove any response bodies containing sensitive data. The redact_auth=True option in the script above handles the auth headers for you.
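As a final check before sharing, scanning beats trusting memory. A sketch that reports entries still carrying likely secrets (the header list is an assumption; add any app-specific names):

```python
import json

SECRET_HEADERS = {'authorization', 'cookie', 'x-api-key', 'x-auth-token'}

def find_secrets(har_path: str) -> list[str]:
    """List 'METHOD url (header)' for entries still carrying secret headers."""
    with open(har_path, encoding='utf-8') as f:
        har = json.load(f)
    hits = []
    for e in har['log']['entries']:
        req = e['request']
        for h in req.get('headers', []):
            if h['name'].lower() in SECRET_HEADERS and h['value'] not in ('', 'REDACTED'):
                hits.append(f"{req['method']} {req['url']} ({h['name']})")
    return hits
```

An empty result means the obvious header-based leaks are gone; it says nothing about secrets embedded in request or response bodies, so review those separately.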

Frequently Asked Questions