32blog by Studio Mitsu

curl Complete Guide: API Calls, Debugging, and Beyond

Learn curl from basics to advanced: API requests, headers, authentication, file transfers, scripting, and when to use wget instead.

by omitsu · 13 min read

curl is the go-to command-line tool for HTTP communication. curl URL sends a GET request, -X POST -d '{...}' sends data to an API, -v shows the full request/response trace — virtually any HTTP operation in a single command.

Need to test an API endpoint from the terminal? Debug a redirect chain? Fire a webhook from a CI script?

curl handles all of that and more. It comes pre-installed on Linux and macOS, and works on Windows through WSL or Git Bash.

This guide covers everything from the basics to real-world scripting, with ready-to-run examples throughout.

What is curl?

curl (short for "Client URL") is a command-line tool for transferring data using URLs. It supports over 20 protocols including HTTP, HTTPS, FTP, SFTP, and SCP. Originally created by Daniel Stenberg in 1998, it remains one of the most actively maintained open-source projects.

Key characteristics:

  • HTTP Swiss Army knife: supports GET, POST, PUT, DELETE, PATCH, and every other method
  • Flexible output: grab just headers, just the body, measure response times, or capture everything
  • libcurl: also available as a C library, used by Python, PHP, Ruby, and many other languages under the hood
  • Cross-platform: runs on Linux, macOS, and Windows

How it compares to wget: wget is built for downloading files — it excels at recursive downloads and site mirroring. curl is the general-purpose HTTP tool — it shines at API calls, response inspection, and complex request crafting.

For a detailed comparison, see the wget Complete Guide.

Basic usage

Installation check

Verify curl is available on your system.

bash
curl --version

If you're using WSL (Windows Subsystem for Linux), the Linux version of curl is available, and Git Bash also ships with curl. Note that in Windows PowerShell, curl is an alias for Invoke-WebRequest, not the same tool; call curl.exe explicitly to get the real curl, which ships with Windows 10 1803 and later.

bash
# WSL (Ubuntu)
sudo apt install curl

GET request

bash
curl https://example.com

The response body prints to stdout. To save to a file:

bash
# Save with a custom filename
curl -o output.html https://example.com

# Save using the filename from the URL
curl -O https://example.com/file.zip

Inspect response headers

bash
# Headers only (HEAD request)
curl -I https://example.com

# Headers + body together
curl -i https://example.com

Verbose mode (debugging)

bash
curl -v https://example.com

This shows the full conversation: TLS handshake, request headers, response headers, and body. Essential for diagnosing connection issues.
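Because the trace marks outgoing request lines with > and incoming response lines with <, you can filter it down to just the headers:

```bash
# Keep only the request (>) and response (<) header lines from the -v trace
curl -v -s -o /dev/null https://example.com 2>&1 | grep '^[<>]'
```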

Silent mode

bash
# Suppress progress bar and error messages
curl -s https://example.com

# Show progress as # characters (useful for large files)
curl -# -O https://example.com/file.zip
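One caveat: -s also suppresses error messages. Pairing it with -S (--show-error) keeps the progress bar quiet while still reporting failures, which is the usual combination in scripts:

```bash
# Quiet progress output, but still print an error message if the request fails
curl -sS -o /dev/null https://example.com
```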

Common options reference

Option                   What it does
-o FILE                  Save output to FILE
-O                       Save using the URL's filename
-I / --head              Fetch headers only (HEAD request)
-i                       Include response headers in output
-v / --verbose           Show full request/response (including TLS)
-s / --silent            Suppress progress and errors
-L / --location          Follow redirects
-X METHOD                Specify HTTP method (POST, PUT, DELETE, etc.)
-H "Header: Value"       Add custom header
-d "data"                Send POST data
-u user:pass             Basic authentication
-k / --insecure          Skip SSL certificate verification
-c FILE                  Save cookies to FILE
-b FILE                  Send cookies from FILE
--limit-rate RATE        Throttle transfer speed (e.g. --limit-rate 1M)
-w "format"              Custom output format for response info
--connect-timeout SEC    Connection timeout in seconds
-C -                     Resume interrupted download
-F "key=@file"           Upload file as form data
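These options combine freely. A typical one-shot download, with the URL as a placeholder:

```bash
# Follow redirects, stay quiet but show errors, save under the URL's filename
curl -sSLO --connect-timeout 10 https://example.com/file.zip
```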

Real-world use cases

REST API calls (GET / POST / PUT / DELETE)

The patterns you'll use most when testing APIs.

POST JSON data:

bash
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d '{"name": "Alice", "email": "alice@example.com"}' \
  https://api.example.com/users

Authenticate with a Bearer token:

bash
curl -s \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  https://api.example.com/me

Update a resource with PUT:

bash
curl -s -X PUT \
  -H "Content-Type: application/json" \
  -d '{"name": "Alice Updated"}' \
  https://api.example.com/users/123

Delete a resource:

bash
curl -s -X DELETE \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  https://api.example.com/users/123

Pipe to jq for formatted output:

bash
curl -s https://api.example.com/users | jq '.data[] | {id, name}'

For more on processing JSON output, see the jq Complete Guide.

Debugging headers and redirects

Inspect the full TLS handshake and headers:

bash
curl -v https://example.com 2>&1 | head -30

Verbose output goes to stderr, so 2>&1 merges it with stdout for piping to head.
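You can also keep the body on stdout and send the trace elsewhere, or use --trace-ascii to write a more detailed log straight to a file:

```bash
# Body goes to body.html, the verbose trace goes to trace.log
curl -v -s -o body.html https://example.com 2> trace.log

# Or capture a detailed ASCII dump of the whole exchange
curl --trace-ascii trace.txt -s -o body.html https://example.com
```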

Measure response time breakdown:

bash
curl -o /dev/null -s -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTLS: %{time_appconnect}s\nTotal: %{time_total}s\n" https://example.com

This shows DNS resolution, TCP connection, TLS handshake, and total time separately — great for pinpointing performance bottlenecks.

Trace the redirect chain:

bash
curl -s -L -o /dev/null -w "Final URL: %{url_effective}\nRedirects: %{num_redirects}\n" https://example.com

File uploads

Upload a file as form data:

bash
curl -F "file=@photo.jpg" https://api.example.com/upload

Upload multiple files at once:

bash
curl -F "file1=@document.pdf" \
     -F "file2=@image.png" \
     -F "description=Monthly report" \
     https://api.example.com/upload

Specify the MIME type explicitly:

bash
curl -F "file=@data.csv;type=text/csv" https://api.example.com/import

Cookie and session handling

Maintain sessions across requests, such as staying logged in after authentication.

bash
# Step 1: Log in and save cookies
curl -c cookies.txt \
  -d "username=admin&password=secret" \
  https://example.com/login

# Step 2: Use saved cookies for subsequent requests
curl -b cookies.txt https://example.com/dashboard

# Step 3: Send existing cookies and save any new ones
curl -b cookies.txt -c cookies.txt https://example.com/profile

Resuming interrupted downloads

When a large download gets cut off:

bash
# Original download (interrupted by Ctrl+C, network drop, etc.)
curl -O https://example.com/largefile.iso

# Resume from where it left off
curl -C - -O https://example.com/largefile.iso

The -C - flag tells curl to auto-detect the current file size and request only the remaining bytes.
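Under the hood, resuming is just an HTTP range request. The -r option makes the same kind of request explicitly; the sketch below runs against a local file:// URL so it works offline:

```bash
# The same mechanism -C - uses: an explicit byte-range request with -r
printf 'Hello, curl!' > /tmp/demo.txt
curl -s -r 7- file:///tmp/demo.txt   # prints "curl!"
```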

Advanced techniques

Simplify JSON API calls with --json

The --json flag (curl 7.82+) condenses the standard JSON API pattern into a single option.

bash
# Traditional approach (three options needed)
curl -s -X POST \
  -H "Content-Type: application/json" \
  -H "Accept: application/json" \
  -d '{"name": "Alice"}' \
  https://api.example.com/users

# --json does it all in one
curl -s --json '{"name": "Alice"}' https://api.example.com/users

--json internally sets -d (POST), Content-Type: application/json, and Accept: application/json. It also reads from files and stdin.

bash
# Send JSON from a file
curl --json @payload.json https://api.example.com/users

# Pipe JSON in
echo '{"status": "active"}' | curl --json @- https://api.example.com/users/123

Built-in retry with --retry

No need to write retry loops in scripts — curl has --retry built in.

bash
# Retry up to 3 times on transient errors (automatic backoff)
curl --retry 3 --retry-delay 2 https://api.example.com/data

# Retry on all errors including 4xx/5xx
curl --retry 3 --retry-all-errors https://api.example.com/data

# Cap total retry time
curl --retry 5 --retry-max-time 60 https://api.example.com/data

By default, --retry treats connection timeouts and HTTP 408, 429, 500, 502, 503, and 504 responses as transient and retries only those. Add --retry-all-errors to retry on every failure, including other HTTP error codes.

Parallel requests (--parallel)

curl 7.66+ supports --parallel (-Z) for concurrent requests to multiple URLs.

bash
# Request 4 APIs in parallel
curl -Z -s \
  -o users.json "https://api.example.com/users" \
  -o posts.json "https://api.example.com/posts" \
  -o comments.json "https://api.example.com/comments" \
  -o tags.json "https://api.example.com/tags"
bash
# Limit max concurrent connections
curl -Z --parallel-max 3 -s \
  -o file1.zip "https://example.com/file1.zip" \
  -o file2.zip "https://example.com/file2.zip" \
  -o file3.zip "https://example.com/file3.zip" \
  -o file4.zip "https://example.com/file4.zip" \
  -o file5.zip "https://example.com/file5.zip"

The default --parallel-max is 50. Adjust it to match API rate limits.
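For larger batches, curl's config-file syntax (the url and output directives) pairs well with -Z: generate the pairs with a loop, then hand the file to -K. The URLs below are placeholders:

```bash
# Generate url/output pairs into a config file, then fetch them all in parallel
for i in 1 2 3 4 5; do
  printf 'url = "https://example.com/file%d.zip"\noutput = "file%d.zip"\n' "$i" "$i"
done > urls.cfg

curl -Z -s -K urls.cfg
```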

Reusable request configs with -K

Instead of typing long option strings every time, save them in a config file.

text
# ~/.curl/api-config
header = "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..."
header = "Content-Type: application/json"
header = "Accept: application/json"
silent
location
connect-timeout = 10
max-time = 30
bash
# Use the config file
curl -K ~/.curl/api-config https://api.example.com/users

# Config file + additional options
curl -K ~/.curl/api-config --json '{"name": "Alice"}' https://api.example.com/users

Share config files across a team to standardize API testing workflows. To run health checks on a schedule, pair this with a cron job or a systemd timer.
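For scheduled checks, a crontab entry can reuse the same config file (paths here are hypothetical). Note that the % character is special in crontab lines and must be escaped as \%:

```text
# m h dom mon dow  command
*/5 * * * * curl -K /home/ops/.curl/api-config -o /dev/null -w "\%{http_code}\n" https://api.example.com/health >> /var/log/healthcheck.log 2>&1
```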

Practical -w variables

-w (--write-out) has useful variables beyond response timing.

bash
# Display detailed response info
curl -s -o /dev/null -w "\
HTTP/%{http_version} %{http_code}\n\
Size: %{size_download} bytes\n\
Content-Type: %{content_type}\n\
Connections: %{num_connects}\n\
IP: %{remote_ip}:%{remote_port}\n\
" https://example.com
text
HTTP/2 200
Size: 12345 bytes
Content-Type: text/html; charset=utf-8
Connections: 1
IP: 93.184.215.14:443
bash
# Output as JSON (combine with jq)
curl -s -o /dev/null -w '{
  "status": %{http_code},
  "time_total": %{time_total},
  "size": %{size_download},
  "ip": "%{remote_ip}"
}' https://example.com | jq '.'
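On curl 7.70 and later, the built-in %{json} variable emits every write-out value as a single JSON object, so no hand-built template is needed:

```bash
# Dump all write-out variables as JSON and pick a few fields with jq
curl -s -o /dev/null -w '%{json}' https://example.com | jq '{http_code, time_total, remote_ip}'
```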

Scripting examples

curl pairs naturally with shell scripts. For shell scripting fundamentals, see the Shell Script Practical Guide.

API health checker

A script that checks the status of multiple endpoints and reports failures.

bash
#!/bin/bash

ENDPOINTS=(
  "https://api.example.com/health"
  "https://api.example.com/v2/status"
  "https://cdn.example.com/ping"
  "https://auth.example.com/health"
)

echo "=== API Health Check ==="
echo ""

FAILED=0

for URL in "${ENDPOINTS[@]}"; do
  STATUS=$(curl -s -o /dev/null -w "%{http_code}" --connect-timeout 5 --max-time 10 "$URL")

  if [ "$STATUS" -ge 200 ] && [ "$STATUS" -lt 300 ]; then
    echo "[OK]   $STATUS  $URL"
  else
    echo "[FAIL] $STATUS  $URL"
    FAILED=$((FAILED + 1))
  fi
done

echo ""
echo "Results: $((${#ENDPOINTS[@]} - FAILED))/${#ENDPOINTS[@]} passed"

if [ "$FAILED" -gt 0 ]; then
  exit 1
fi

Batch downloader with retries

bash
#!/bin/bash

URLS=(
  "https://example.com/data/january.csv"
  "https://example.com/data/february.csv"
  "https://example.com/data/march.csv"
)

DEST_DIR="./downloads"
MAX_RETRIES=3

mkdir -p "$DEST_DIR"

for URL in "${URLS[@]}"; do
  FILENAME=$(basename "$URL")
  ATTEMPT=1

  while [ "$ATTEMPT" -le "$MAX_RETRIES" ]; do
    echo "Downloading $FILENAME (attempt $ATTEMPT/$MAX_RETRIES)..."

    if curl -s -f -L -o "${DEST_DIR}/${FILENAME}" --connect-timeout 10 --max-time 120 "$URL"; then
      echo "  OK: $FILENAME"
      break
    else
      echo "  FAILED: $FILENAME"
      ATTEMPT=$((ATTEMPT + 1))

      if [ "$ATTEMPT" -le "$MAX_RETRIES" ]; then
        echo "  Retrying in 5 seconds..."
        sleep 5
      fi
    fi
  done

  if [ "$ATTEMPT" -gt "$MAX_RETRIES" ]; then
    echo "  GAVE UP: $FILENAME after $MAX_RETRIES attempts"
  fi
done

echo "Done."

Security considerations

curl security updates

curl 8.19.0 (released March 11, 2026) is the latest release. The previous 8.18.0 (January 7, 2026) included 6 security fixes.

curl's bug bounty program ended in late January 2026. The flood of AI-generated reports drove the confirmation rate below 5%, consuming disproportionate triage resources. Over its ~6-year run (since April 2019), the program confirmed 87 vulnerabilities and paid out over $100,000 in bounties.

Never use -k in production

bash
# For test environments with self-signed certs ONLY
curl -k https://self-signed.example.com/api

-k (--insecure) disables SSL certificate verification. It's fine for testing against self-signed certificates, but using it in production or automated scripts opens you up to man-in-the-middle attacks.
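A safer alternative for internal services is to trust the specific CA rather than disabling verification; the bundle path below is hypothetical:

```bash
# Verify against a private CA bundle instead of disabling verification
curl --cacert ./internal-ca.pem https://internal.example.com/api
```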

Keep credentials out of commands

bash
# Bad: password visible in shell history
curl -u admin:P@ssw0rd https://api.example.com/admin

# Good: use .netrc
echo "machine api.example.com login admin password P@ssw0rd" > ~/.netrc
chmod 600 ~/.netrc
curl -n https://api.example.com/admin

Always set .netrc permissions to 600 so other users can't read it.
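-n reads ~/.netrc by default. The --netrc-file option points curl at a different file, which is handy for per-project credentials (path below is hypothetical):

```bash
# Use a project-local netrc instead of ~/.netrc
curl --netrc-file ./secrets/netrc https://api.example.com/admin
```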

Environment variables work too:

bash
curl -u "admin:${API_PASSWORD}" https://api.example.com/admin

Frequently Asked Questions

When should I use curl vs wget?

curl excels at API calls, HTTP debugging, and crafting complex requests. wget excels at recursive downloads and site mirroring. For API testing, use curl. For batch file downloads, use wget. See the wget Complete Guide for a detailed comparison.

What's the simplest way to POST JSON with curl?

On curl 7.82+, use curl --json '{"key": "value"}' URL. The --json flag sets Content-Type, Accept, and -d all at once. On older versions, you need -X POST -H "Content-Type: application/json" -d '{...}'.

How do I diagnose slow curl responses?

Use curl -o /dev/null -s -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTLS: %{time_appconnect}s\nTotal: %{time_total}s\n" URL to measure DNS resolution, TCP connection, TLS handshake, and total time individually. This pinpoints which phase is the bottleneck.

Is it safe to use curl -k (--insecure) in production?

No. -k skips SSL certificate verification, which exposes you to man-in-the-middle attacks. Only use it temporarily in test environments with self-signed certificates. In production, use proper certificates or specify a CA cert with --cacert.

How do I upload files with curl?

Use curl -F "file=@filename" URL for form-data uploads. Specify MIME types with curl -F "file=@data.csv;type=text/csv" URL. Add multiple -F flags for multiple files.

How do I keep passwords out of shell history?

Store credentials in a .netrc file and use curl -n URL. Set .netrc permissions to chmod 600. Environment variables also work: curl -u "user:${PASSWORD}" URL.

How do I add retries to curl?

Use curl --retry 3 --retry-delay 2 URL for automatic retries on transient errors. Add --retry-all-errors to retry on 4xx/5xx responses too. No need to write retry loops in your scripts.

How do I check my curl version?

Run curl --version. The latest release is available on the curl official download page. As of March 2026, the latest is 8.19.0.

Wrapping Up

curl is the Swiss Army knife of HTTP communication. Whether you're testing APIs, debugging connections, uploading files, or scripting automated workflows, it's the go-to tool.

Quick reference for the most useful patterns:

  • Basic request: curl URL. Save to file: -o or -O
  • API calls: --json '{...}' (7.82+) is the simplest approach
  • Debugging: -v for full trace, -w for response timing
  • In scripts: use -sS -f to silence progress output, keep error messages, and fail on HTTP errors
  • Credentials: use .netrc or environment variables, never hardcode in commands
  • Stay updated: check curl.se regularly for new releases

If your primary need is downloading files, wget is simpler for that task. For API calls, response inspection, and complex HTTP work, curl is the clear choice. Use both — each has its strengths.