cURL to Code

Processed locally

Convert cURL commands to JavaScript fetch, Python requests, Axios, or PHP. Paste a curl command and get ready-to-use code.

cURL Command
JS Fetch Output
Runs entirely in your browser — nothing is uploaded

What Is a cURL Converter?

cURL is the de facto command-line HTTP client - first released by Daniel Stenberg in 1997, now bundled with macOS, Windows, and every major Linux distribution, and used across countless API documentation pages and bug reports as the canonical way to describe a request. The problem is that the language you actually ship code in - JavaScript, Python, PHP, Go - has its own HTTP idioms, and translating a cURL one-liner into idiomatic code by hand is mechanical and error-prone.
This converter parses cURL commands with a small hand-written tokenizer (parseCurl in the source above). It walks the argument list one token at a time, extracts the request method (-X / --request), each header (-H / --header), the request body (-d / --data / --data-raw / --data-binary), and the URL, and silently ignores cosmetic flags like -s and -L. The tokenizer recognizes single- and double-quoted strings as single tokens, so headers and JSON bodies survive their internal whitespace correctly.
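The quote-aware tokenizing step can be sketched in a few lines. This is an illustrative stand-in, not the tool's actual parseCurl:

```javascript
// Illustrative quote-aware tokenizer - a stand-in for the real parseCurl.
function tokenize(command) {
  const tokens = [];
  let current = "";
  let quote = null; // quote char we are currently inside, or null
  for (const ch of command) {
    if (quote) {
      if (ch === quote) quote = null; // closing quote ends the quoted run
      else current += ch;
    } else if (ch === "'" || ch === '"') {
      quote = ch; // opening quote: whitespace inside stays in this token
    } else if (/\s/.test(ch)) {
      if (current) { tokens.push(current); current = ""; }
    } else {
      current += ch;
    }
  }
  if (current) tokens.push(current);
  return tokens;
}

tokenize("curl -H 'Content-Type: application/json' https://api.example.com");
// → ["curl", "-H", "Content-Type: application/json", "https://api.example.com"]
```

Because the quote characters open and close a run rather than split it, a header like Content-Type: application/json comes out as one token despite the internal space.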
Output goes to four target dialects: native browser fetch (no dependency, runs in any modern JavaScript runtime), Python with the requests library (by far the most widely used third-party HTTP client for Python), Axios (a hugely popular HTTP client in Node.js and React codebases, with built-in JSON handling and interceptors), and classic PHP cURL (still ubiquitous in WordPress and Laravel codebases). Each emitter is a pure function that takes the parsed AST and produces a runnable snippet.
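As a sketch of the emitter idea, here is a minimal pure function that turns a parsed request into a fetch snippet. The parsed-object shape (method, url, headers, body) is an assumption for illustration, not the tool's real AST:

```javascript
// Illustrative emitter: parsed request in, fetch snippet out.
// The parsed-object shape here is an assumption, not the tool's real AST.
function emitFetch({ method, url, headers = {}, body = null }) {
  const opts = { method };
  if (Object.keys(headers).length > 0) opts.headers = headers;
  if (body !== null) opts.body = body;
  return [
    `fetch(${JSON.stringify(url)}, ${JSON.stringify(opts, null, 2)})`,
    "  .then(res => res.json())",
    "  .then(console.log);",
  ].join("\n");
}

const snippet = emitFetch({
  method: "POST",
  url: "https://api.example.com/v1/items",
  headers: { "Content-Type": "application/json" },
  body: '{"name":"demo"}',
});
console.log(snippet); // a runnable fetch call for the parsed request
```

Keeping each emitter a pure function of the parsed request makes new target dialects cheap to add: same input object, different string out.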
Body type detection is automatic. If the parsed Content-Type contains "json", or the body string starts with { or [, the JSON path is taken: fetch wraps it in JSON.stringify, requests passes it via the json= keyword, Axios sets data: directly, and PHP keeps it as a JSON-encoded string. Otherwise the body is treated as a raw string, which keeps form-encoded and plain-text bodies intact.
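The detection heuristic described above amounts to a few lines. This is an assumed reconstruction, not the tool's exact code:

```javascript
// Assumed reconstruction of the JSON-body heuristic described above:
// the Content-Type header wins; otherwise the body's first character decides.
function isJsonBody(contentType, body) {
  if (contentType && contentType.toLowerCase().includes("json")) return true;
  const trimmed = (body || "").trim();
  return trimmed.startsWith("{") || trimmed.startsWith("[");
}

isJsonBody("application/json", "name=demo"); // → true  (header wins)
isJsonBody(null, '{"name":"demo"}');         // → true  (body shape)
isJsonBody("text/plain", "name=demo&qty=2"); // → false (raw-string path)
```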
Known limitations of the parser. It handles the common Chrome "Copy as cURL (bash)" format but does not yet support multipart file uploads (-F / --form), --user for Basic auth, --cookie, --user-agent shorthand, or PowerShell's curl alias (which is actually Invoke-WebRequest with completely different syntax). Output assumes the JSON body in -d is already valid JSON; if you have a malformed body it'll be passed through untouched and your runtime will throw on parse.
A practical tip when grabbing requests from browser DevTools: right-click the network entry and choose "Copy > Copy as cURL (bash)" on macOS/Linux or "Copy as cURL (cmd)" on Windows. The Windows variant uses ^ for line continuation and handles quoting differently - paste the bash variant into this tool for the cleanest parse. The DevTools version includes every header your browser sent, including cookies; strip Cookie: and any Authorization: tokens before pasting them into a public tool or sharing the output.
Privacy: cURL commands frequently contain Authorization tokens, API keys, or session cookies. This tool runs entirely in the browser - parser, emitters, and download all happen in your tab with no network requests. Even so, treat any URL you paste as if it were public and rotate credentials that have ever appeared in a screenshot, gist, or third-party tool.

Common Use Cases

01

API documentation translation

Drop a cURL example from a vendor's API docs into the tool and get the equivalent fetch or requests call ready to paste into your app.

02

Reproducing a DevTools network call

Copy a failing request from Chrome DevTools as cURL and convert to runnable code that reproduces the bug in a minimal test case.

03

Postman export to source code

Export a Postman request as cURL, paste it here, and skip the friction of Postman's built-in (and often verbose) code generators.

04

Backend SDK scaffolding

Bootstrap the first call against a new third-party API by translating their cURL example into your stack's native client.

Frequently Asked Questions

Which cURL flags does the parser support?
-X / --request (method), -H / --header (headers), -d / --data / --data-raw / --data-binary (body), and the cosmetic -s, --silent, -L, and --location, which are ignored. Backslash line continuations across multiple lines are normalized before parsing.

Does it handle authentication?
Yes, for Authorization headers - the standard pattern -H 'Authorization: Bearer <token>' round-trips into every output dialect. Basic auth via the --user shorthand (--user user:pass) is not parsed yet; for now, base64-encode the credentials manually and include them as Authorization: Basic <encoded>.
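Until --user is parsed, the Basic header can be built by hand like this ("user:pass" is a placeholder for the value after curl --user):

```javascript
// Hand-building the Basic auth header the parser doesn't emit yet.
// "user:pass" is a placeholder for the value after curl --user.
const credentials = "user:pass";
const encoded = typeof btoa === "function"
  ? btoa(credentials)                            // browsers
  : Buffer.from(credentials).toString("base64"); // older Node.js
const headers = { Authorization: `Basic ${encoded}` };
// headers.Authorization → "Basic dXNlcjpwYXNz"
```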

Does it support multipart file uploads?
Not yet. -F / --form arguments aren't parsed, so file uploads need to be coded by hand in the target language. For fetch, build a FormData object; for requests, use the files= keyword; for Axios, send a FormData instance from the form-data package on Node.js.
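A hand-written equivalent of curl -F file=@report.pdf might look like this (URL, field name, and filename are placeholders; FormData and Blob are built into browsers and Node 18+):

```javascript
// Sketch: the multipart upload that `curl -F file=@report.pdf URL` performs,
// written by hand. FormData and Blob are built into browsers and Node 18+.
// URL, field name, and filename are placeholders.
function buildUpload(fileBlob) {
  const form = new FormData();
  form.append("file", fileBlob, "report.pdf");
  return form;
}

const form = buildUpload(new Blob(["example bytes"], { type: "application/pdf" }));
// Then send it - do NOT set Content-Type yourself; the runtime adds the
// multipart boundary automatically:
// fetch("https://api.example.com/upload", { method: "POST", body: form });
```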

Why does the output wrap my JSON body in JSON.stringify?
The fetch and Axios emitters detect JSON bodies (Content-Type contains "json" or the body starts with { or [) and call JSON.stringify on the parsed object. If your body is already a JSON string, the call yields the same string with the same content - just delete the extra wrapper if you prefer.

Why doesn't my Windows "Copy as cURL (cmd)" command parse correctly?
The parser expects bash-style quoting. Chrome's "Copy as cURL (cmd)" option uses ^ for line continuation and slightly different quote rules - the parser may misread it. On Windows, switch your DevTools to bash output via the dropdown in the Copy menu.

Which HTTP version does the generated code use?
Output uses the target language's default. fetch and Axios negotiate HTTP/2 transparently in modern browsers and Node 18+; requests uses HTTP/1.1 unless you switch to httpx; PHP cURL uses HTTP/1.1 by default. The original cURL command's version flags (--http2, --http1.1) are ignored.

What does the Python output look like?
The emitter uses the most idiomatic shape: requests.post(url, headers=headers, json=data) for JSON, requests.post(url, headers=headers, data=data) for raw strings. If your codebase wraps requests in a custom session, paste only the call signature and adapt the surrounding boilerplate by hand.

Can I run the generated snippets as-is?
Mostly yes - the snippets are self-contained, including imports. You may need to add error handling, await semantics in non-async contexts (await only works inside async functions), and convert the ESM-style imports to CommonJS if you're on an older Node.js version.

Is my cURL command sent to a server?
No. The parser and emitters all run as JavaScript inside your tab. The original cURL request and the generated code never leave your browser - which matters because cURL commands often contain bearer tokens and API keys.

How do I handle query parameters with spaces or special characters?
URL-encode the query string before pasting (the converter doesn't encode it for you). cURL accepts both -G with --data-urlencode and a literal URL with %20 substituted; the parser only handles the literal form, so paste your URL with spaces already replaced by %20.
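One way to pre-encode the query string before pasting, using the built-in URLSearchParams (base URL and parameters are placeholders):

```javascript
// Pre-encoding a query string with the built-in URLSearchParams.
// Base URL and parameters are placeholders.
const base = "https://api.example.com/search";
const params = new URLSearchParams({ q: "hello world", lang: "en" });
const url = `${base}?${params}`;
// → "https://api.example.com/search?q=hello+world&lang=en"
// URLSearchParams encodes spaces as "+"; use encodeURIComponent("hello world")
// if the endpoint insists on "%20" instead.
```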
