32blog by StudioMitsu

Shell Scripting Practical Guide: Write Scripts That Work

Learn to write reliable bash shell scripts from scratch. Covers shebang, variables, conditionals, loops, functions, error handling with set -euo pipefail, and real-world automation patterns.

14 min read

A shell script is a text file containing a sequence of commands that the shell executes in order. Instead of typing the same commands every day, you write them once and run the file.

In short: shell scripts automate repetitive terminal tasks. Start every script with #!/usr/bin/env bash and set -euo pipefail for safety, use functions to organize logic, and always quote your variables. That's 80% of writing reliable scripts.

Why Shell Scripts Still Matter

Python exists. Go exists. Why would anyone write a shell script in 2026?

Because shell scripts operate at the OS level with zero dependencies. No runtime to install, no package manager to configure. If the machine has a terminal, it can run your script. That's a huge advantage for:

  • Server provisioning — setting up a fresh VPS with users, packages, and firewall rules
  • CI/CD pipelines — the glue between build steps, test runners, and deployment targets
  • Cron jobs — log rotation, database backups, cleanup tasks that run at 3 AM
  • Developer tooling — project setup scripts, git hooks, local dev environment bootstrapping

I run 32blog's deployment checks with a shell script. It validates the build, runs lints, and checks for broken links before pushing to Vercel. Could I write it in TypeScript? Sure. But the shell script is 40 lines, has no dependencies, and hasn't needed a single update in months.

The sweet spot: if your task is "run some commands in sequence with a bit of logic," reach for a shell script. If you need data structures, HTTP clients, or complex error recovery, reach for a real programming language.

Script Anatomy: Shebang, Permissions, and Execution

Every script starts the same way. Here's the minimum viable script:

bash
#!/usr/bin/env bash
set -euo pipefail

echo "Hello from $(hostname) at $(date)"

The shebang line

The first line #!/usr/bin/env bash tells the OS which interpreter to use. You might see #!/bin/bash in older scripts, but #!/usr/bin/env bash is more portable — it finds bash wherever it's installed on the system rather than hardcoding a path.

The safety net: set -euo pipefail

This single line prevents an entire class of bugs:

bash
set -euo pipefail

Here's what each flag does:

| Flag | Effect |
|------|--------|
| -e (errexit) | Exit immediately if any command fails (non-zero exit code) |
| -u (nounset) | Treat unset variables as errors instead of empty strings |
| -o pipefail | A pipeline fails if any command in it fails, not just the last one |

Without these flags, a script will happily continue after errors. I once had a backup script that silently failed at the tar step but still ran the "delete old backups" step. Lost a week of data. set -euo pipefail would have caught it on line one.
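Here's a minimal sketch of that failure mode. A subshell stands in for the backup script: false plays the failing tar step, and the echo plays the destructive delete step.

```shell
#!/usr/bin/env bash
set -euo pipefail

# "false" stands in for the failing tar step; the echo stands in
# for the destructive "delete old backups" step.
output=$(bash -c 'set -euo pipefail; false; echo "deleting old backups"' 2>&1 || true)

if [[ -z "$output" ]]; then
    echo "set -e stopped the script before the delete step ran"
fi
```

Drop the inner set line and the delete step runs anyway, which is exactly the silent failure the flags prevent.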

File permissions and execution

Scripts need execute permission:

bash
# Make the script executable
chmod +x deploy-check.sh

# Run it
./deploy-check.sh

You can also run a script without execute permission by calling the interpreter directly:

bash
bash deploy-check.sh

But the chmod +x approach is standard — it lets you run the script like any other command, and the shebang line ensures the correct interpreter is used.

File naming

There's no strict requirement, but .sh is the convention. Some teams use no extension at all for scripts that live in $PATH. The shebang line, not the extension, determines how the script runs.

Variables, Arguments, and User Input

Variable basics

bash
#!/usr/bin/env bash
set -euo pipefail

# Assignment — no spaces around the equals sign
project_name="32blog"
build_dir="./dist"
max_retries=3

# Usage — always quote variables
echo "Building ${project_name} into ${build_dir}"

The ${} syntax is optional for simple cases ($project_name works too), but it's safer — ${project_name}_backup is unambiguous, while $project_name_backup looks for a variable called project_name_backup.
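A quick sketch of the difference, using a throwaway project_name variable:

```shell
#!/usr/bin/env bash
set -euo pipefail

project_name="32blog"

# Braces mark exactly where the variable name ends
suffixed="${project_name}_backup"
echo "$suffixed"   # 32blog_backup

# Without braces, bash would look up a variable named project_name_backup.
# Under set -u that's an "unbound variable" error, so it stays commented out:
# echo "$project_name_backup"
```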

Always quote your variables. This is the single most common source of bugs in shell scripts:

bash
# BAD — unquoted expansion word-splits on spaces
file="my report.txt"
cat $file  # runs: cat my report.txt — two nonexistent files

# GOOD — handles any filename
file="my report.txt"
cat "$file"

Command substitution

Capture command output into a variable:

bash
current_date=$(date +%Y-%m-%d)
git_branch=$(git rev-parse --abbrev-ref HEAD)
file_count=$(find . -name "*.mdx" | wc -l)

echo "Branch ${git_branch} has ${file_count} MDX files as of ${current_date}"

Use $() instead of backticks. Backticks can't nest and are harder to read:

bash
# Modern — clear and nestable
files=$(find "$(pwd)" -name "*.log")

# Legacy — avoid this
files=`find \`pwd\` -name "*.log"`

Script arguments

Arguments are accessed via positional parameters:

bash
#!/usr/bin/env bash
set -euo pipefail

# $0 = script name, $1 = first arg, $2 = second arg
# $# = number of arguments, $@ = all arguments

if [[ $# -lt 1 ]]; then
    echo "Usage: $0 <environment>" >&2
    exit 1
fi

environment="$1"
echo "Deploying to ${environment}"

For scripts with multiple options, use getopts or a simple case parser:

bash
#!/usr/bin/env bash
set -euo pipefail

verbose=false
output_dir="./build"

while [[ $# -gt 0 ]]; do
    case "$1" in
        -v|--verbose)
            verbose=true
            shift
            ;;
        -o|--output)
            output_dir="$2"
            shift 2
            ;;
        -h|--help)
            echo "Usage: $0 [-v] [-o dir]"
            exit 0
            ;;
        *)
            echo "Unknown option: $1" >&2
            exit 1
            ;;
    esac
done

echo "Verbose: ${verbose}, Output: ${output_dir}"

Reading user input

bash
read -rp "Enter project name: " project_name
echo "Creating project: ${project_name}"

The -r flag prevents backslash interpretation (you almost always want it). The -p flag shows a prompt.
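To see what -r changes, feed the same backslash-laden line to read with and without it:

```shell
#!/usr/bin/env bash
set -euo pipefail

# A line containing literal backslashes, e.g. a Windows-style path
with_r=$(printf 'C:\\temp\\file\n' | { IFS= read -r line; echo "$line"; })
without_r=$(printf 'C:\\temp\\file\n' | { IFS= read line; echo "$line"; })

echo "with -r:    ${with_r}"     # C:\temp\file  (backslashes preserved)
echo "without -r: ${without_r}"  # C:tempfile    (backslashes eaten as escapes)
```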

Control Flow: Conditionals and Loops

if statements

Bash has two test syntaxes: [ ] (POSIX) and [[ ]] (Bash-specific). Use [[ ]] — it handles word splitting and globbing more safely:

bash
#!/usr/bin/env bash
set -euo pipefail

file="content/cli/en/cli-shell-script.mdx"

# File tests
if [[ -f "$file" ]]; then
    echo "File exists"
fi

if [[ ! -d "./dist" ]]; then
    echo "Build directory missing, creating..."
    mkdir -p ./dist
fi

# String comparison
environment="${1:-}"
if [[ "$environment" == "production" ]]; then
    echo "Production deploy — running extra checks"
elif [[ "$environment" == "staging" ]]; then
    echo "Staging deploy"
else
    echo "Unknown environment: ${environment}" >&2
    exit 1
fi

# Numeric comparison
file_count=$(find . -name "*.mdx" | wc -l)
if [[ "$file_count" -gt 100 ]]; then
    echo "That's a lot of articles"
fi

Common test operators:

| Operator | Meaning |
|----------|---------|
| -f file | File exists and is a regular file |
| -d dir | Directory exists |
| -z "$var" | String is empty |
| -n "$var" | String is not empty |
| ==, != | String equality/inequality |
| -eq, -ne, -lt, -gt | Numeric comparisons |

for loops

bash
# Loop over a list
for env in staging production; do
    echo "Deploying to ${env}"
done

# Loop over files (use glob, never parse ls)
for file in content/cli/en/*.mdx; do
    echo "Processing: ${file}"
done

# Loop over command output (word-splits on whitespace: fine for branch names, risky for filenames)
for branch in $(git branch --format='%(refname:short)'); do
    echo "Branch: ${branch}"
done

# C-style for loop
for ((i = 1; i <= 5; i++)); do
    echo "Attempt ${i}"
done

while loops

bash
# Read a file line by line
while IFS= read -r line; do
    echo "Line: ${line}"
done < config.txt

# Process command output line by line
# (the loop body runs in a subshell here, so variables set
#  inside it won't persist after the done)
git log --oneline -10 | while IFS= read -r line; do
    echo "Commit: ${line}"
done

# Wait for a condition
retries=0
max_retries=5
until curl -sf http://localhost:3000/health > /dev/null 2>&1; do
    retries=$((retries + 1))  # not ((retries++)): that returns 1 when retries is 0 and trips set -e
    if [[ "$retries" -ge "$max_retries" ]]; then
        echo "Server failed to start after ${max_retries} attempts" >&2
        exit 1
    fi
    echo "Waiting for server... (attempt ${retries}/${max_retries})"
    sleep 2
done
echo "Server is up"

case statements

A case statement gives you glob-style pattern matching — cleaner than chained if/elif for string matching:

bash
case "$1" in
    start)
        echo "Starting service..."
        ;;
    stop)
        echo "Stopping service..."
        ;;
    restart)
        echo "Restarting service..."
        ;;
    status)
        echo "Checking status..."
        ;;
    *)
        echo "Usage: $0 {start|stop|restart|status}" >&2
        exit 1
        ;;
esac

Functions and Error Handling

Functions

Functions keep scripts organized and reusable:

bash
#!/usr/bin/env bash
set -euo pipefail

log_info() {
    echo "[INFO] $(date +%H:%M:%S) $*"
}

log_error() {
    echo "[ERROR] $(date +%H:%M:%S) $*" >&2
}

check_dependency() {
    local cmd="$1"
    if ! command -v "$cmd" &> /dev/null; then
        log_error "${cmd} is not installed"
        return 1
    fi
    log_info "${cmd} found: $(command -v "$cmd")"
}

# Usage
check_dependency "node"
check_dependency "git"
log_info "All dependencies satisfied"

Key points:

  • Use local for variables inside functions to avoid polluting the global scope
  • $* expands all arguments as a single string; "$@" preserves individual arguments
  • Functions return exit codes (0 = success, 1-255 = failure), not values. Use echo and command substitution to "return" data
bash
get_version() {
    local package_json="$1"
    jq -r '.version' "$package_json"
}

version=$(get_version "package.json")
echo "Current version: ${version}"

Error handling patterns

Beyond set -euo pipefail, here are patterns for handling errors gracefully:

Trap for cleanup:

bash
#!/usr/bin/env bash
set -euo pipefail

tmpdir=$(mktemp -d)

cleanup() {
    rm -rf "$tmpdir"
    echo "Cleaned up temp directory"
}

trap cleanup EXIT

# Script logic — temp files go in $tmpdir
# cleanup runs automatically when the script exits, even on error
cp important-data.txt "$tmpdir/backup.txt"

Conditional execution:

bash
# Run command, handle failure explicitly
if ! npm run build; then
    log_error "Build failed"
    exit 1
fi

# OR pattern — default value if command fails
git_hash=$(git rev-parse --short HEAD 2>/dev/null || echo "unknown")

Custom error messages with line numbers:

bash
trap 'echo "Error on line $LINENO. Exit code: $?" >&2' ERR

Real-World Script Patterns

Pattern 1: Project setup script

bash
#!/usr/bin/env bash
set -euo pipefail

# 32blog local development setup
log() { echo "[setup] $*"; }

log "Checking dependencies..."
for cmd in node npm git; do
    if ! command -v "$cmd" &> /dev/null; then
        echo "Missing: ${cmd}. Install it and try again." >&2
        exit 1
    fi
done

node_version=$(node -v | tr -d 'v')
required_version="20"
if [[ "${node_version%%.*}" -lt "$required_version" ]]; then
    echo "Node.js ${required_version}+ required, found ${node_version}" >&2
    exit 1
fi

log "Installing dependencies..."
npm ci

if [[ ! -f .env.local ]]; then
    log "Creating .env.local from template..."
    cp .env.example .env.local
    echo "Edit .env.local with your API keys before running dev server"
fi

log "Setup complete. Run 'npm run dev' to start."

Pattern 2: Backup script with rotation

bash
#!/usr/bin/env bash
set -euo pipefail

backup_dir="/var/backups/32blog"
source_dir="/var/www/32blog/content"
max_backups=7
timestamp=$(date +%Y%m%d_%H%M%S)
backup_file="${backup_dir}/content_${timestamp}.tar.gz"

mkdir -p "$backup_dir"

echo "Creating backup: ${backup_file}"
tar -czf "$backup_file" -C "$(dirname "$source_dir")" "$(basename "$source_dir")"

backup_size=$(du -h "$backup_file" | cut -f1)
echo "Backup created: ${backup_size}"

# Rotate — keep only the most recent backups
backup_count=$(find "$backup_dir" -name "content_*.tar.gz" | wc -l)
if [[ "$backup_count" -gt "$max_backups" ]]; then
    remove_count=$((backup_count - max_backups))
    find "$backup_dir" -name "content_*.tar.gz" -printf '%T@ %p\n' \
        | sort -n \
        | head -n "$remove_count" \
        | cut -d' ' -f2- \
        | xargs rm -f
    echo "Rotated: removed ${remove_count} old backup(s)"
fi

Pattern 3: Pre-deploy validation

bash
#!/usr/bin/env bash
set -euo pipefail

errors=0

check() {
    local description="$1"
    shift
    if "$@" > /dev/null 2>&1; then
        echo "✓ ${description}"
    else
        echo "✗ ${description}" >&2
        ((errors++)) || true
    fi
}

echo "=== Pre-deploy checks ==="

check "Lint passes" npm run lint
check "Build succeeds" npm run build
check "No uncommitted changes" git diff --quiet
check "On main branch" test "$(git branch --show-current)" = "main"
check "Up to date with remote" git diff --quiet origin/main..HEAD

if [[ "$errors" -gt 0 ]]; then
    echo ""
    echo "${errors} check(s) failed. Fix them before deploying." >&2
    exit 1
fi

echo ""
echo "All checks passed. Safe to deploy."

Pattern 4: Batch file processing

bash
#!/usr/bin/env bash
set -euo pipefail

# Convert all PNG images to WebP in a directory
input_dir="${1:-.}"
converted=0
skipped=0

for png in "${input_dir}"/*.png; do
    [[ -f "$png" ]] || continue  # skip if glob doesn't match

    webp="${png%.png}.webp"

    if [[ -f "$webp" ]] && [[ "$webp" -nt "$png" ]]; then
        skipped=$((skipped + 1))  # ((skipped++)) would trip set -e when the count is 0
        continue
    fi

    cwebp -q 80 "$png" -o "$webp" 2>/dev/null
    converted=$((converted + 1))
    echo "Converted: $(basename "$png")"
done

echo "Done: ${converted} converted, ${skipped} skipped (already up to date)"

FAQ

Should I use bash or sh for my scripts?

Use bash. Plain sh (POSIX shell) lacks arrays, [[ ]], substring expansion, and many other features you'll want. The only reason to use sh is if you're writing for a minimal environment like Alpine Docker images where bash isn't installed — and even then, you can apk add bash. Write #!/usr/bin/env bash, not #!/bin/sh.

What does set -euo pipefail actually do?

It enables three safety flags: -e exits on any error, -u treats unset variables as errors, and -o pipefail makes pipelines fail if any command in them fails. Together they catch the most common class of silent shell script failures. Put this line right after your shebang in every script.
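The pipefail piece is easiest to see with a pipeline whose first command fails:

```shell
#!/usr/bin/env bash
# (no set -e here, since the point is to inspect failing exit codes)

# Without pipefail, a pipeline's status is the LAST command's status,
# so the failure of "false" is invisible:
bash -c 'false | true'
without_pipefail=$?

# With pipefail, any failing stage fails the whole pipeline:
bash -c 'set -o pipefail; false | true'
with_pipefail=$?

echo "without pipefail: ${without_pipefail}"  # 0
echo "with pipefail:    ${with_pipefail}"     # 1
```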

How do I debug a shell script?

Add set -x to print each command before execution. For targeted debugging, wrap a section with set -x and set +x. You can also run the script with bash -x script.sh without modifying the file. For more detail, PS4='+ ${BASH_SOURCE}:${LINENO}: ' shows file and line numbers.
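A small sketch of targeted tracing: everything between set -x and set +x is printed to stderr before it runs, prefixed by PS4.

```shell
#!/usr/bin/env bash
set -euo pipefail

# Prefix each traced line with file and line number
# (the :-stdin default guards the expansion when there is no source file)
PS4='+ ${BASH_SOURCE:-stdin}:${LINENO}: '

set -x                 # tracing on
name="world"
greeting="hello ${name}"
set +x                 # tracing off

echo "$greeting"
```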

When should I use a shell script vs Python?

Shell scripts excel at file operations, command orchestration, and system tasks — anything you'd type in a terminal. Switch to Python when you need complex data structures, HTTP requests, JSON parsing beyond what jq handles, or error handling that's more nuanced than "exit on failure." If your script grows past 200 lines, it's probably time to rewrite.

How do I handle filenames with spaces?

Always quote your variables: "$file", not $file. Use "$@" to forward arguments. When looping over files, use globs (for f in *.txt) instead of parsing ls. With find and xargs, use -print0 and -0 respectively for null-delimited processing.
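A self-contained sketch of the null-delimited pipeline, using a throwaway temp directory:

```shell
#!/usr/bin/env bash
set -euo pipefail

tmpdir=$(mktemp -d)
trap 'rm -rf "$tmpdir"' EXIT

# A filename with a space: the classic word-splitting trap
touch "${tmpdir}/my report.log"

# -print0 emits NUL-separated names; xargs -0 splits on NUL,
# so the space in the filename survives intact
found=$(find "$tmpdir" -name '*.log' -print0 | xargs -0 -n1 basename)
echo "found: ${found}"   # found: my report.log
```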

What's the difference between $@ and $*?

"$@" expands to each argument as a separate word (what you almost always want). "$*" expands to all arguments as a single string. In a for loop, for arg in "$@" iterates over each argument correctly even if they contain spaces. Always use "$@" unless you specifically want concatenation.
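The difference shows up as soon as an argument contains a space:

```shell
#!/usr/bin/env bash
set -euo pipefail

show_args() {
    echo "via \"\$@\":"
    for arg in "$@"; do echo "  [${arg}]"; done

    echo "via \"\$*\":"
    for arg in "$*"; do echo "  [${arg}]"; done
}

show_args "one two" three
# via "$@":
#   [one two]
#   [three]
# via "$*":
#   [one two three]
```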

How do I make a script portable across Linux and macOS?

Use #!/usr/bin/env bash for the shebang. Avoid GNU-specific flags (like sed -i '' vs sed -i — they differ between macOS and Linux). Test with both GNU and BSD versions of core utilities. For maximum portability, stick to POSIX features or explicitly require bash in your documentation.

Is shellcheck worth using?

Absolutely. ShellCheck catches quoting issues, common pitfalls, and portability problems that are easy to miss. Install it with your package manager (apt install shellcheck, brew install shellcheck) and run it in CI. It's caught bugs in scripts I was confident were correct — more than once.

Wrapping Up

Shell scripting isn't glamorous, but it's one of those skills that compounds. Every script you write saves time the next time you need it — and the time after that, and the time after that.

The essentials fit in your head: start with #!/usr/bin/env bash and set -euo pipefail, quote your variables, use functions for organization, and clean up with trap. That's enough to write scripts that actually hold up in production.

If you want to go further: run ShellCheck on your scripts, read through the Bash Pitfalls wiki for edge cases, and study the GNU Bash manual for the full reference. The Advanced Bash-Scripting Guide is also worth bookmarking for deeper dives.

Combine shell scripts with tools covered in other guides — cron for scheduling, find and xargs for file processing, sed and awk for text transformation — and you have a powerful automation toolkit with zero external dependencies.