32blog by StudioMitsu

curl Complete Guide: API Calls, Debugging, and Beyond

Learn curl from basics to advanced: API requests, headers, authentication, file transfers, scripting, and when to use wget instead.


Need to test an API endpoint from the terminal? Debug a redirect chain? Fire a webhook from a CI script?

curl is the tool for all of that. It handles virtually every HTTP operation you can think of — GET, POST, file uploads, cookie management, authentication, response timing — all from a single command. It comes pre-installed on macOS and most Linux distributions, and works on Windows through WSL or Git Bash.

This guide covers everything from the basics to real-world scripting, with ready-to-run examples throughout.

What is curl?

curl (short for "Client URL") is a command-line tool for transferring data using URLs. It supports over 20 protocols including HTTP, HTTPS, FTP, SFTP, and SCP.

Key characteristics:

  • HTTP Swiss Army knife: supports GET, POST, PUT, DELETE, PATCH, and every other method
  • Flexible output: grab just headers, just the body, measure response times, or capture everything
  • libcurl: also available as a C library, with bindings used by PHP, Ruby, Python (pycurl), and many other languages under the hood
  • Cross-platform: runs on Linux, macOS, and Windows

How it compares to wget: wget is built for downloading files — it excels at recursive downloads and site mirroring. curl is the general-purpose HTTP tool — it shines at API calls, response inspection, and complex request crafting.

For a detailed comparison, see the wget Complete Guide.

Basic usage

Installation check

Verify curl is available on your system.

bash
curl --version

If you're using WSL (Windows Subsystem for Linux), the Linux version of curl is available, and Git Bash also ships with curl. Recent Windows 10 and 11 builds include a native curl.exe as well. Note that in Windows PowerShell, curl is an alias for Invoke-WebRequest, which is not the same tool; call curl.exe explicitly to get the real binary.

bash
# WSL (Ubuntu)
sudo apt install curl

GET request

bash
curl https://example.com

The response body prints to stdout. To save to a file:

bash
# Save with a custom filename
curl -o output.html https://example.com

# Save using the filename from the URL
curl -O https://example.com/file.zip

Inspect response headers

bash
# Headers only (HEAD request)
curl -I https://example.com

# Headers + body together
curl -i https://example.com

Verbose mode (debugging)

bash
curl -v https://example.com

This shows the full conversation: TLS handshake, request headers, response headers, and body. Essential for diagnosing connection issues.

Silent mode

bash
# Suppress progress bar and error messages
curl -s https://example.com

# Show progress as # characters (useful for large files)
curl -# -O https://example.com/file.zip

Common options reference

Option                     What it does
-o FILE                    Save output to FILE
-O                         Save using the URL's filename
-I / --head                Fetch headers only (HEAD request)
-i                         Include response headers in output
-v / --verbose             Show full request/response (including TLS)
-s / --silent              Suppress progress and errors
-L / --location            Follow redirects
-X METHOD                  Specify HTTP method (POST, PUT, DELETE, etc.)
-H "Header: Value"         Add custom header
-d "data"                  Send POST data
-u user:pass               Basic authentication
-k / --insecure            Skip SSL certificate verification
-c FILE                    Save cookies to FILE
-b FILE                    Send cookies from FILE
--limit-rate RATE          Throttle transfer speed (e.g. --limit-rate 1M)
-w "format"                Custom output format for response info
--connect-timeout SEC      Connection timeout in seconds
-C -                       Resume interrupted download
-F "key=@file"             Upload file as form data

Real-world use cases

REST API calls (GET / POST / PUT / DELETE)

The patterns you'll use most when testing APIs.

POST JSON data:

bash
curl -s -X POST \
  -H "Content-Type: application/json" \
  -d '{"name": "Alice", "email": "alice@example.com"}' \
  https://api.example.com/users

Authenticate with a Bearer token:

bash
curl -s \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  https://api.example.com/me

Update a resource with PUT:

bash
curl -s -X PUT \
  -H "Content-Type: application/json" \
  -d '{"name": "Alice Updated"}' \
  https://api.example.com/users/123

Delete a resource:

bash
curl -s -X DELETE \
  -H "Authorization: Bearer eyJhbGciOiJIUzI1NiIs..." \
  https://api.example.com/users/123

Pipe to jq for formatted output:

bash
curl -s https://api.example.com/users | jq '.data[] | {id, name}'

For more on processing JSON output, see the jq Complete Guide.

Debugging headers and redirects

Inspect the full TLS handshake and headers:

bash
curl -v https://example.com 2>&1 | head -30

Verbose output goes to stderr, so 2>&1 merges it with stdout for piping to head.
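When -v still isn't detailed enough, --trace-ascii records the entire exchange to a file for later inspection (the filename here is arbitrary):

```bash
# Write a full plain-text dump of the exchange (request, response,
# and connection chatter) to trace.txt; more detail than -v
curl --trace-ascii trace.txt -s -o /dev/null https://example.com

# The dump is plain text, so grep/head/less all work on it
head -20 trace.txt
```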

Measure response time breakdown:

bash
curl -o /dev/null -s -w "DNS: %{time_namelookup}s\nConnect: %{time_connect}s\nTLS: %{time_appconnect}s\nTotal: %{time_total}s\n" https://example.com

This shows DNS resolution, TCP connection, TLS handshake, and total time separately — great for pinpointing performance bottlenecks.

Trace the redirect chain:

bash
curl -s -L -o /dev/null -w "Final URL: %{url_effective}\nRedirects: %{num_redirects}\n" https://example.com

File uploads

Upload a file as form data:

bash
curl -F "file=@photo.jpg" https://api.example.com/upload

Upload multiple files at once:

bash
curl -F "file1=@document.pdf" \
     -F "file2=@image.png" \
     -F "description=Monthly report" \
     https://api.example.com/upload

Specify the MIME type explicitly:

bash
curl -F "file=@data.csv;type=text/csv" https://api.example.com/import

Cookie management

Maintain a session across requests, such as after logging in.

bash
# Step 1: Log in and save cookies
curl -c cookies.txt \
  -d "username=admin&password=secret" \
  https://example.com/login

# Step 2: Use saved cookies for subsequent requests
curl -b cookies.txt https://example.com/dashboard

# Step 3: Send existing cookies and save any new ones
curl -b cookies.txt -c cookies.txt https://example.com/profile
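-b also accepts a literal name=value string instead of a file, which is handy for one-off requests when you already have a session token:

```bash
# Send cookies inline without a cookie jar file
# (sessionid and theme are example values)
curl -b "sessionid=abc123; theme=dark" https://example.com/dashboard
```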

Resuming interrupted downloads

When a large download gets cut off:

bash
# Original download (interrupted by Ctrl+C, network drop, etc.)
curl -O https://example.com/largefile.iso

# Resume from where it left off
curl -C - -O https://example.com/largefile.iso

The -C - flag tells curl to auto-detect the current file size and request only the remaining bytes.
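Resuming only works if the server supports byte ranges; a quick way to check is to look for an Accept-Ranges header in a HEAD response (the URL below is illustrative):

```bash
# A server that supports resuming advertises "Accept-Ranges: bytes"
curl -sI https://example.com/largefile.iso | grep -i 'accept-ranges' \
  || echo "no range support advertised"
```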

Scripting examples

API health checker

A script that checks the status of multiple endpoints and reports failures.

bash
#!/bin/bash

ENDPOINTS=(
  "https://api.example.com/health"
  "https://api.example.com/v2/status"
  "https://cdn.example.com/ping"
  "https://auth.example.com/health"
)

echo "=== API Health Check ==="
echo ""

FAILED=0

for URL in "${ENDPOINTS[@]}"; do
  STATUS=$(curl -s -o /dev/null -w "%{http_code}" --connect-timeout 5 --max-time 10 "$URL")

  if [ "$STATUS" -ge 200 ] && [ "$STATUS" -lt 300 ]; then
    echo "[OK]   $STATUS  $URL"
  else
    echo "[FAIL] $STATUS  $URL"
    FAILED=$((FAILED + 1))
  fi
done

echo ""
echo "Results: $((${#ENDPOINTS[@]} - FAILED))/${#ENDPOINTS[@]} passed"

if [ "$FAILED" -gt 0 ]; then
  exit 1
fi

Batch downloader with retries

bash
#!/bin/bash

URLS=(
  "https://example.com/data/january.csv"
  "https://example.com/data/february.csv"
  "https://example.com/data/march.csv"
)

DEST_DIR="./downloads"
MAX_RETRIES=3

mkdir -p "$DEST_DIR"

for URL in "${URLS[@]}"; do
  FILENAME=$(basename "$URL")
  ATTEMPT=1

  while [ "$ATTEMPT" -le "$MAX_RETRIES" ]; do
    echo "Downloading $FILENAME (attempt $ATTEMPT/$MAX_RETRIES)..."

    if curl -s -f -L -o "${DEST_DIR}/${FILENAME}" --connect-timeout 10 --max-time 120 "$URL"; then
      echo "  OK: $FILENAME"
      break
    else
      echo "  FAILED: $FILENAME"
      ATTEMPT=$((ATTEMPT + 1))

      if [ "$ATTEMPT" -le "$MAX_RETRIES" ]; then
        echo "  Retrying in 5 seconds..."
        sleep 5
      fi
    fi
  done

  if [ "$ATTEMPT" -gt "$MAX_RETRIES" ]; then
    echo "  GAVE UP: $FILENAME after $MAX_RETRIES attempts"
  fi
done

echo "Done."

Security considerations

curl 8.18.0 security fixes

curl 8.18.0 (released January 7, 2026) included 6 security fixes, 3 of them found with AI tooling, the first time the curl project has officially credited AI-assisted vulnerability discovery.

Separately, curl's bug bounty program ended in late January 2026. The program was overwhelmed by a flood of low-quality AI-generated reports that consumed disproportionate triage resources.

Never use -k in production

bash
# For test environments with self-signed certs ONLY
curl -k https://self-signed.example.com/api

-k (--insecure) disables SSL certificate verification. It's fine for testing against self-signed certificates, but using it in production or automated scripts opens you up to man-in-the-middle attacks.
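A safer alternative for self-signed setups is to trust the specific certificate rather than disabling verification entirely; here self-signed.pem and the hostname are placeholders for your own cert and server:

```bash
# Point curl at the server's own certificate instead of using -k
# (self-signed.pem and the host are hypothetical placeholders)
curl --cacert self-signed.pem https://self-signed.example.com/api
```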

Keep credentials out of commands

bash
# Bad: password visible in shell history and in the process list
curl -u admin:P@ssw0rd https://api.example.com/admin

# Good: use .netrc
echo "machine api.example.com login admin password P@ssw0rd" > ~/.netrc
chmod 600 ~/.netrc
curl -n https://api.example.com/admin

Always set .netrc permissions to 600 so other users can't read it.

Environment variables work too:

bash
curl -u "admin:${API_PASSWORD}" https://api.example.com/admin

Wrapping Up

curl is the Swiss Army knife of HTTP communication. Whether you're testing APIs, debugging connections, uploading files, or scripting automated workflows, it's the go-to tool.

Quick reference for the most useful patterns:

  • Basic request: curl URL. Save to file: -o or -O
  • API calls: -X POST -H "Content-Type: application/json" -d '{...}'
  • Debugging: -v for full trace, -w for response timing
  • In scripts: use -sS -f (silent, but errors still print, and HTTP 4xx/5xx responses exit non-zero)
  • Credentials: use .netrc or environment variables, never hardcode in commands

If your primary need is downloading files, wget is simpler for that task. For API calls, response inspection, and complex HTTP work, curl is the clear choice. Use both — each has its strengths.

For more CLI tools and how they work together, see the CLI Toolkit.