cURL is the de facto command-line HTTP client. First released by Daniel Stenberg in 1997, it now ships with macOS, Windows 10 and later, and every major Linux distribution, and appears in millions of API documentation pages and bug reports as the canonical way to describe a request. The problem is that the language you actually ship code in - JavaScript, Python, PHP, Go - has its own HTTP idioms, and translating a cURL one-liner into idiomatic code by hand is mechanical and error-prone.
This converter parses cURL commands with a small hand-written tokenizer (parseCurl in the source above). It walks the argument list one token at a time, extracts the request method (-X / --request), each header (-H / --header), the request body (-d / --data / --data-raw / --data-binary), and the URL, and ignores flags that don't change the request itself, such as -s (silent) and -L (follow redirects). The tokenizer recognizes single- and double-quoted strings as single tokens, so headers and JSON bodies survive their internal whitespace correctly.
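The quote-aware splitting is the part that trips up naive `split(" ")` approaches. Here is a minimal sketch of how such a tokenizer can work (the actual parseCurl implementation is not shown in this section, so the function name and details below are illustrative):

```javascript
// Minimal sketch of a quote-aware cURL tokenizer: splits on whitespace
// but keeps single- and double-quoted runs together as one token.
function tokenize(cmd) {
  const tokens = [];
  let current = "";
  let quote = null; // the active quote character, or null outside quotes
  for (const ch of cmd.trim()) {
    if (quote) {
      if (ch === quote) quote = null; // closing quote ends the quoted run
      else current += ch;
    } else if (ch === "'" || ch === '"') {
      quote = ch; // opening quote starts a quoted run
    } else if (/\s/.test(ch)) {
      if (current) { tokens.push(current); current = ""; }
    } else {
      current += ch;
    }
  }
  if (current) tokens.push(current);
  return tokens;
}
```

With this, a header like `'Content-Type: application/json'` comes out as a single token despite its internal space, which is what lets the later argument walk pair `-H` with its full value.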
Output goes to four target dialects: native browser fetch (no dependency, runs in any modern JavaScript runtime), Python with the requests library (the most widely used third-party HTTP library in the Python ecosystem), Axios (a popular HTTP client for Node.js and React projects, with built-in JSON handling and interceptors), and classic PHP cURL (still ubiquitous in WordPress and Laravel codebases). Each emitter is a pure function that takes the parsed AST and produces a runnable snippet.
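To make "pure function from parsed AST to snippet" concrete, here is a sketch of what the fetch emitter might look like. The parsed-request shape `{ method, headers, body, url }` is an assumption for illustration, not the tool's actual internal representation:

```javascript
// Sketch of a fetch emitter: a pure function from a parsed request
// object (shape assumed here) to a runnable JavaScript snippet string.
function emitFetch({ method = "GET", headers = {}, body = null, url }) {
  const opts = { method };
  if (Object.keys(headers).length) opts.headers = headers;
  if (body !== null) opts.body = body;
  return `fetch(${JSON.stringify(url)}, ${JSON.stringify(opts, null, 2)})\n` +
         `  .then(res => res.json())\n` +
         `  .then(console.log);`;
}
```

Because each emitter only reads the parsed object and returns a string, adding a fifth target language is a matter of writing one more function against the same input shape.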
Body type detection is automatic. If the parsed Content-Type contains "json", or the body string starts with { or [, the JSON path is taken: fetch wraps it in JSON.stringify, requests passes it via the json= keyword, Axios sets data: directly, and PHP keeps it as a JSON-encoded string. Otherwise the body is treated as a raw string, which keeps form-encoded and plain-text bodies intact.
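The heuristic described above is small enough to show in full. This is a sketch of the detection logic, not the tool's exact code (the function name and headers-object shape are assumptions):

```javascript
// Sketch of the body-type heuristic: treat the body as JSON when the
// Content-Type mentions "json" or the payload looks like a JSON
// object/array; otherwise pass it through as a raw string.
function detectBodyType(headers, body) {
  const ct = (headers["Content-Type"] || headers["content-type"] || "").toLowerCase();
  if (ct.includes("json")) return "json";
  const trimmed = (body || "").trim();
  if (trimmed.startsWith("{") || trimmed.startsWith("[")) return "json";
  return "raw";
}
```

The leading-character check is what lets a bare `-d '{"a":1}'` with no Content-Type header still route through the JSON path, while `a=1&b=2` form bodies fall through to raw.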
Known limitations of the parser. It handles the common Chrome "Copy as cURL (bash)" format but does not yet support multipart file uploads (-F / --form), --user for Basic auth, --cookie, the -A / --user-agent shorthand, or PowerShell's curl alias (which is actually Invoke-WebRequest, with completely different syntax). Output assumes the body passed to -d is already valid JSON; a malformed body is passed through untouched, and your runtime will throw at parse time.
A practical tip when grabbing requests from browser DevTools: right-click the network entry, then choose "Copy > Copy as cURL (bash)" on macOS/Linux or "Copy as cURL (cmd)" on Windows. The Windows variant uses ^ for line continuation and quotes strings differently - paste the bash variant into this tool for the cleanest parse. The DevTools version includes every header your browser sent, including cookies; strip Cookie: and any Authorization: tokens before pasting them into a public tool or sharing the output.
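Stripping those headers is easy to automate once the command is parsed. A hypothetical helper (the header list and function name are illustrative, not part of the tool):

```javascript
// Hypothetical helper: drop credential-bearing headers from a parsed
// request's headers object before sharing the generated snippet.
const SENSITIVE = new Set(["cookie", "authorization", "x-api-key"]);

function stripSensitiveHeaders(headers) {
  return Object.fromEntries(
    Object.entries(headers).filter(([name]) => !SENSITIVE.has(name.toLowerCase()))
  );
}
```

Matching case-insensitively matters here: DevTools may emit `cookie:` lowercase in HTTP/2 captures and `Cookie:` in HTTP/1.1 ones.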
Privacy: cURL commands frequently contain Authorization tokens, API keys, or session cookies. This tool runs entirely in the browser - parser, emitters, and download all happen in your tab with no network requests. Even so, treat any URL you paste as if it were public and rotate credentials that have ever appeared in a screenshot, gist, or third-party tool.