32blog · by Studio Mitsu

fqmpeg Color Grading & Visual Effects: 23 Verbs Explained

Twenty-three fqmpeg verbs for color grading, keying, stylized looks, glitch, blur, HDR-to-SDR and film grain — with source-verified gotchas for each.

by omitsu · 29 min read

fqmpeg's C6 cluster is the biggest one — twenty-three verbs that change how a video looks. Some do pure technical work (color, color-balance, equalize, lut, chroma, hdr-to-sdr). Some pull a green screen and composite (chromakey, color-key, alpha). Some apply a stylized look (cinematic, vintage, sepia, grayscale, negative, posterize, vignette, edge). The rest distort, soften, or add grain (glitch, pixelate, motion-blur, sharpen, blur, noise).

This guide checks each verb against its source in src/commands/ of fqmpeg 3.0.2 — the underlying FFmpeg filter, the defaults, the output filename, and the gotchas you can't see from --help alone (color's contrast range is -1000 to 1000, alpha always outputs .webm, vintage is a four-filter chain, color-key omits -c:a copy).

What you'll get out of this guide

  • A decision matrix for the 23 verbs by task (grading / keying / looks / distortion / HDR)
  • Exact FFmpeg invocation each verb generates (verified --dry-run output)
  • Defaults, allowed values, and output filenames for every command
  • Three end-to-end recipes including a green-screen → final-composite pipeline

The 23 Verbs at a Glance

The cluster splits into five task groups. Pick the group, then the verb.

| Group | Verbs | What they do |
| --- | --- | --- |
| Color grading & correction | color, color-balance, equalize, lut, chroma | Adjust brightness/contrast/saturation/gamma, shadow-midtone-highlight balance, auto-equalize, apply 3D LUTs, rotate hue |
| Keying & alpha | chromakey, color-key, alpha | Green/blue-screen composite over a background, or export an alpha-channel video |
| Stylized looks | cinematic, vintage, sepia, grayscale, negative, posterize, vignette, edge | Letterbox bars, retro film, sepia, B&W, color invert, color-band reduction, dark edges, edge detection |
| Distortion & sharpness | glitch, pixelate, motion-blur, sharpen, blur | RGB shift + grain, mosaic, frame-mix blur, unsharp, boxblur (with region) |
| HDR & grain | hdr-to-sdr, noise | Tone-map HDR → BT.709 SDR, or add film-grain noise |

Three things to know before reading on:

  1. color's contrast range is -1000 to 1000, not -1 to 1. It's the eq filter's contrast multiplier, and the default is 1 (unchanged). Useful range is 0.5–2.0 for normal grading; 5+ is for stylized crushed-contrast looks. fqmpeg validates against -1000/1000 because that's the FFmpeg-side maximum, not because you'd ever want -1000.
  2. alpha always outputs .webm (VP9 + Opus). Transparent video requires a codec that supports alpha, and VP9-in-webm is the web-friendly choice. If you need .mov with ProRes 4444 for editing apps (Premiere, Resolve), copy the --dry-run output and swap libvpx-vp9 for prores_ks -profile:v 4444 and the extension to .mov before running FFmpeg directly.
  3. vintage is a four-filter chain, not a single FFmpeg verb. The chain is colorchannelmixer (sepia-ish matrix) + eq (slight saturation/contrast/brightness) + vignette + noise. Copy the --dry-run and swap or remove stages to taste — it's the worked example for "stack effects yourself" in this cluster.

Color Grading & Correction

color — Brightness, contrast, saturation, gamma

The grading workhorse — wraps FFmpeg's eq filter with all four parameters exposed.

| Argument / Option | Default | Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --brightness <n> | 0 | -1.0 to 1.0 | Additive offset; small values (0.05–0.2) for subtle lift |
| --contrast <n> | 1 | -1000 to 1000 | Multiplier; 1 = unchanged. Useful range 0.5–2.0 |
| --saturation <n> | 1 | 0 to 3 | 0 = grayscale, 1 = normal, 1.5–2 = punchy, 3 = blown |
| --gamma <n> | 1 | 0.1 to 10 | <1 darkens midtones, >1 brightens midtones |
| -o, --output <path> | <input-stem>-color.<ext> | | Override output |
bash
$ npx fqmpeg color input.mp4 --brightness 0.05 --contrast 1.1 --saturation 1.2 --gamma 1.0 --dry-run

  ffmpeg -i input.mp4 -vf eq=brightness=0.05:contrast=1.1:saturation=1.2:gamma=1 -c:a copy input-color.mp4

For per-channel adjustments (warm the highlights, cool the shadows), use color-balance instead. color operates on the whole frame uniformly.
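As a mental model for how the four knobs interact, here's a rough Python sketch of an eq-style transfer on a normalized [0, 1] value. The order of operations and the exact internal math are assumptions — FFmpeg's eq operates on integer pixel values — but the direction and neutral point of each knob match the table above:

```python
def eq_approx(v, brightness=0.0, contrast=1.0, gamma=1.0):
    """Rough model of an eq-style grade on a [0, 1] luma value (not FFmpeg source)."""
    v = (v - 0.5) * contrast + 0.5       # contrast: multiplier around mid-gray
    v = v + brightness                   # brightness: additive offset
    v = min(max(v, 0.0), 1.0)            # clamp before the power curve
    return v ** (1.0 / gamma)            # gamma > 1 brightens midtones

print(eq_approx(0.25))                   # defaults pass the value through: 0.25
print(round(eq_approx(0.25, brightness=0.05, contrast=1.1, gamma=1.2), 3))  # 0.341
```

With all defaults the value is untouched, which is why the table lists 0 / 1 / 1 as the neutral settings.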

color-balance — Shadows / midtones / highlights × RGB

Adjusts color in three tonal ranges independently — nine parameters total (3 channels × 3 ranges). This is the "lift / gamma / gain" of fqmpeg.

  • Source: src/commands/color-balance.js
  • Filter: colorbalance=rs=:gs=:bs=:rm=:gm=:bm=:rh=:gh=:bh=
  • Naming: rs/gs/bs = shadow R/G/B, rm/gm/bm = midtone, rh/gh/bh = highlight
| Argument / Option | Default | Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --rs / --gs / --bs <n> | 0 | -1.0 to 1.0 | Shadow tint per channel |
| --rm / --gm / --bm <n> | 0 | -1.0 to 1.0 | Midtone tint per channel |
| --rh / --gh / --bh <n> | 0 | -1.0 to 1.0 | Highlight tint per channel |
| -o, --output <path> | <input-stem>-balanced.<ext> | | Override output |
bash
$ npx fqmpeg color-balance input.mp4 --rs -0.1 --bs 0.15 --rh 0.1 --bh -0.1 --dry-run

  ffmpeg -i input.mp4 -vf colorbalance=rs=-0.1:gs=0:bs=0.15:rm=0:gm=0:bm=0:rh=0.1:gh=0:bh=-0.1 -c:a copy input-balanced.mp4

The example above is the classic "teal-and-orange" Hollywood look: cool down the shadows (negative R / positive B), warm up the highlights (positive R / negative B). Keep values small (±0.1 to ±0.2) — anything stronger and the color cast becomes obvious.

equalize — Auto histogram equalization

No knobs. Runs FFmpeg's histeq filter which stretches the histogram to fill the full luminance range, brightening dark patches and darkening washed-out highlights.

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| -o, --output <path> | <input-stem>-equalized.<ext> | Override output |
bash
$ npx fqmpeg equalize input.mp4 --dry-run

  ffmpeg -i input.mp4 -vf histeq -c:a copy input-equalized.mp4

histeq works per-frame and can flicker on shots with rapidly-changing exposure — for a smoother result, manually grade with color --gamma 1.2 --contrast 1.15 instead. Reach for equalize on flat archival footage or low-contrast underwater clips where temporal flicker isn't a worry.

lut — Apply a 3D LUT (.cube, .3dl)

The "real" color grading verb. Loads a 3D LUT file and applies it via FFmpeg's lut3d filter. Works with standard .cube files (the format DaVinci Resolve, FilmConvert, Lumetri all export).

  • Source: src/commands/lut.js
  • Filter: lut3d=<path>
  • Two positional args — input file and LUT file
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| <lutfile> | required | LUT path (.cube or .3dl) |
| -o, --output <path> | <input-stem>-graded.<ext> | Override output |
bash
$ npx fqmpeg lut input.mp4 film.cube --dry-run

  ffmpeg -i input.mp4 -vf lut3d=film.cube -c:a copy input-graded.mp4

Free .cube LUT packs you can drop straight in: FreshLUTs, RocketStock, and the FilmConvert sample pack. If a LUT applied via lut3d looks washed out, your source is probably already log-encoded but the LUT expects Rec.709 input — apply a "log-to-Rec.709" conversion LUT first, then the look LUT.
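For intuition about what lut3d does per pixel, here's a conceptual Python sketch of a trilinear 3D-LUT lookup — not fqmpeg or FFmpeg source, just the idea: the input RGB indexes into an N×N×N grid of output colors, and the eight nearest grid entries are blended:

```python
def lut3d_sample(lut, n, r, g, b):
    """Trilinear sample of an n*n*n LUT. lut: dict (i,j,k) -> (R,G,B); inputs in [0,1]."""
    def axis(v):
        x = v * (n - 1)
        i0 = min(int(x), n - 2)          # lower grid index along this axis
        return i0, x - i0                # index + fractional position
    (ri, rf), (gi, gf), (bi, bf) = axis(r), axis(g), axis(b)
    out = [0.0, 0.0, 0.0]
    for dr in (0, 1):                    # blend the 8 surrounding grid entries
        for dg in (0, 1):
            for db in (0, 1):
                w = ((rf if dr else 1 - rf) *
                     (gf if dg else 1 - gf) *
                     (bf if db else 1 - bf))
                entry = lut[(ri + dr, gi + dg, bi + db)]
                for c in range(3):
                    out[c] += w * entry[c]
    return tuple(out)

# Identity 2x2x2 LUT: each grid point maps to its own coordinates.
identity = {(i, j, k): (i, j, k) for i in (0, 1) for j in (0, 1) for k in (0, 1)}
print(lut3d_sample(identity, 2, 0.25, 0.5, 0.75))   # (0.25, 0.5, 0.75)
```

A real .cube file is essentially this grid serialized as text; an identity LUT returns every color unchanged, which is a handy sanity check when debugging a washed-out grade.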

chroma — Hue rotation / temperature

Despite the name "chroma," this is hue + saturation control — it wraps FFmpeg's hue filter, not a chroma-subsampling tweaker.

| Argument / Option | Default | Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --hue <degrees> | 0 | -180 to 180 | Hue rotation in degrees |
| --saturation <n> | 1 | 0 to 3 | Saturation multiplier (0 = grayscale) |
| -o, --output <path> | <input-stem>-chroma.<ext> | | Override output |
bash
$ npx fqmpeg chroma input.mp4 --hue 20 --saturation 1.3 --dry-run

  ffmpeg -i input.mp4 -vf hue=h=20:s=1.3 -c:a copy input-chroma.mp4

--hue 20 warms the whole image slightly (greens shift toward yellow-green, blues shift toward cyan). For a true color temperature shift (3200K → 5600K), you want color-balance with --rh 0.05 --bh -0.05 (warming highlights) instead — hue rotates the whole color wheel, which gives a stranger result than just shifting white balance.

Keying & Alpha

chromakey — Green/blue-screen composite over a background

Two inputs (foreground + background). Removes the keyed color from the foreground, then overlays it on the background. The keyed color defaults to pure green (0x00FF00).

  • Source: src/commands/chromakey.js
  • Filter: [0:v]colorkey=COLOR:SIM:BLEND[fg];[1:v][fg]overlay[out]
  • Audio: copies foreground audio (-c:a copy); ignores background audio
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Foreground with green/blue screen |
| <background> | required | Background image or video |
| --color <hex> | 0x00FF00 | Key color (hex; e.g. 0x0000FF for blue screen) |
| --similarity <n> | 0.3 (range 0.01–1.0) | Match tolerance — lower = stricter, higher = aggressive |
| --blend <n> | 0.1 (range 0.0–1.0) | Edge softening |
| -o, --output <path> | <input-stem>-keyed.<ext> | Override output |
bash
$ npx fqmpeg chromakey actor.mp4 background.mp4 --color 0x00FF00 --similarity 0.3 --blend 0.1 --dry-run

  ffmpeg -i actor.mp4 -i background.mp4 -filter_complex [0:v]colorkey=0x00FF00:0.3:0.1[fg];[1:v][fg]overlay[out] -map [out] -map 0:a? -c:a copy actor-keyed.mp4

If the green spill on the subject's edges is visible after keying, raise --blend to 0.2–0.3. If the subject's skin tone gets keyed out (especially with blue screens), drop --similarity to 0.2 and pick a more precise hex from your source instead of the generic 0x00FF00. Filter quirk: FFmpeg ships both a chromakey filter and a colorkey filter, and they behave slightly differently — the fqmpeg verb is named chromakey for the green-screen use case, but fqmpeg uses FFmpeg's colorkey under the hood for both verbs.

color-key — Composite a specific color over a background

Functionally near-identical to chromakey — same filter (colorkey), same two-input overlay. The difference is the default semantics: chromakey is documented as "green/blue screen," color-key as "remove a specific color and composite." Use color-key when you're keying a non-green hex (a logo background, a solid wall color you shot against without a chroma backdrop).

  • Source: src/commands/color-key.js
  • Filter: [0]colorkey=COLOR:SIM:BLEND[fg];[1][fg]overlay[out]
  • Audio: maps foreground audio with -map 0:a? but no -c:a copy — re-encodes audio to the output container default
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Foreground video |
| <background> | required | Background image or video |
| --color <hex> | 0x00FF00 | Color to remove |
| --similarity <n> | 0.3 | Match tolerance |
| --blend <n> | 0.1 | Edge blending |
| -o, --output <path> | <input-stem>-colorkey.<ext> | Override output |
bash
$ npx fqmpeg color-key actor.mp4 background.mp4 --color 0x1A8FE5 --similarity 0.25 --blend 0.05 --dry-run

  ffmpeg -i actor.mp4 -i background.mp4 -filter_complex [0]colorkey=0x1A8FE5:0.25:0.05[fg];[1][fg]overlay[out] -map [out] -map 0:a? actor-colorkey.mp4

The missing -c:a copy is intentional in the source; if you care about preserving the original audio codec, add -c:a copy to the --dry-run output before running FFmpeg directly. Otherwise audio gets re-encoded (usually to AAC for .mp4 outputs), which is fine in practice but adds a generation.

alpha — Export with transparent background (WebM/VP9 + Opus)

Single input. Same colorkey filter, but composited against transparency instead of a background — output is alpha-channel WebM. Use it for transparent overlays you'll drop into a video editor.

  • Source: src/commands/alpha.js
  • Filter: colorkey=COLOR:SIM:BLEND,format=yuva420p
  • Codec: libvpx-vp9 video, libopus audio
  • Container: .webm (the default extension)
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input with green/blue screen |
| --color <hex> | 0x00FF00 | Color to make transparent |
| --similarity <n> | 0.3 | Match tolerance |
| --blend <n> | 0.1 | Edge softness |
| -o, --output <path> | <input-stem>-alpha.webm | .webm or .mov recommended |
bash
$ npx fqmpeg alpha actor.mp4 --color 0x00FF00 --similarity 0.3 --blend 0.1 --dry-run

  ffmpeg -i actor.mp4 -vf colorkey=0x00FF00:0.3:0.1,format=yuva420p -c:v libvpx-vp9 -auto-alt-ref 0 -c:a libopus actor-alpha.webm

For non-web pipelines (Premiere, Resolve, Final Cut) you usually want .mov with ProRes 4444 — copy the --dry-run and swap libvpx-vp9 for prores_ks -profile:v 4444 -pix_fmt yuva444p10le and the output extension to .mov. -auto-alt-ref 0 is there because VP9's alt-ref frame feature breaks alpha-channel encoding.

Stylized Looks

cinematic — Add letterbox bars (2.35:1 widescreen look)

Crops the source to a wide aspect, then pads back to 16:9 with black (or any color) bars top and bottom. The cinematic letterbox is a quick way to make phone footage "feel" like film.

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --ratio <n> | 2.35 | Target aspect width (e.g. 2.35, 2.39, 2.40) |
| --color <name> | black | Bar color (FFmpeg color name or hex) |
| -o, --output <path> | <input-stem>-cinematic.<ext> | Override output |
bash
$ npx fqmpeg cinematic input.mp4 --ratio 2.35 --color black --dry-run

  ffmpeg -i input.mp4 -vf crop=iw:iw/2.35,pad=iw:iw/(16/9):(ow-iw)/2:(oh-ih)/2:color=black -c:a copy input-cinematic.mp4

2.35:1 is the classic Scope ratio (CinemaScope, anamorphic). For modern digital cinema use 2.39 or 2.40. The bars are an aesthetic crop — you lose the top and bottom of the frame, so reframe your subject before applying if the action's in the upper third.
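A quick sanity check of the crop math on common frame sizes (plain Python arithmetic; real FFmpeg output snaps to even pixel dimensions, so the half-pixel bar below is illustrative):

```python
def letterbox(width, height, ratio=2.35):
    """Picture height after crop=iw:iw/ratio, plus the bar height added back by pad."""
    crop_h = round(width / ratio)    # visible picture height
    bar = (height - crop_h) / 2      # bar height, top and bottom each
    return crop_h, bar

print(letterbox(1920, 1080))   # (817, 131.5) -- roughly 12% of the frame lost per edge
print(letterbox(1280, 720))    # (545, 87.5)
```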

vintage — Retro film look (4-filter composite)

Combines four filters in one chain: sepia-ish color mixer + saturation/contrast tweak + vignette + grain. It's the "instant 8mm" verb.

  • Source: src/commands/vintage.js
  • Filter chain: colorchannelmixer=.35:.75:.20:0:.30:.65:.17:0:.25:.50:.13:0,eq=saturation=0.7:contrast=1.1:brightness=0.02,vignette=angle=PI/4,noise=alls=15:allf=t+a
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| -o, --output <path> | <input-stem>-vintage.<ext> | Override output |
bash
$ npx fqmpeg vintage input.mp4 --dry-run

  ffmpeg -i input.mp4 -vf colorchannelmixer=.35:.75:.20:0:.30:.65:.17:0:.25:.50:.13:0,eq=saturation=0.7:contrast=1.1:brightness=0.02,vignette=angle=PI/4,noise=alls=15:allf=t+a -c:a copy input-vintage.mp4

There are no knobs — vintage is opinionated. If you want a stronger grain, dial up the noise=alls=15 to 30 in the --dry-run output. For a colder retro look (think 80s VHS), replace the color mixer with chroma --hue -10 --saturation 0.6 instead.

sepia — Sepia tone

The classic sepia color matrix — same one Wikipedia and the W3C use for example sepia filters.

  • Source: src/commands/sepia.js
  • Filter: colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131:0
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| -o, --output <path> | <input-stem>-sepia.<ext> | Override output |
bash
$ npx fqmpeg sepia input.mp4 --dry-run

  ffmpeg -i input.mp4 -vf colorchannelmixer=.393:.769:.189:0:.349:.686:.168:0:.272:.534:.131:0 -c:a copy input-sepia.mp4

If you want sepia with more contrast (the typical "old photo" look), chain it: sepia input.mp4 -o tmp.mp4 && color tmp.mp4 --contrast 1.3 --brightness -0.05. The matrix itself is fixed.
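The matrix is easy to verify by hand — here it is applied to one RGB pixel in Python. Values clip to the 8-bit range, which is why bright pixels drift toward warm white rather than blowing out per channel:

```python
# The sepia weights from the colorchannelmixer filter above.
SEPIA = [
    (0.393, 0.769, 0.189),   # output R = mix of input R, G, B
    (0.349, 0.686, 0.168),   # output G
    (0.272, 0.534, 0.131),   # output B
]

def sepia(r, g, b):
    """Apply the sepia matrix to one pixel, clipping to 0-255."""
    return tuple(
        min(255, round(wr * r + wg * g + wb * b))
        for wr, wg, wb in SEPIA
    )

print(sepia(128, 128, 128))   # mid-gray becomes warm brown: (173, 154, 120)
print(sepia(255, 255, 255))   # white clips to (255, 255, 239)
```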

grayscale — Black and white

Strips all saturation via hue=s=0. Slightly cheaper than chroma --saturation 0 (no hue parameter to multiply).

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| -o, --output <path> | <input-stem>-grayscale.<ext> | Override output |
bash
$ npx fqmpeg grayscale input.mp4 --dry-run

  ffmpeg -i input.mp4 -vf hue=s=0 -c:a copy input-grayscale.mp4

For "filmic B&W" with the red channel boosted (skin tones glow against darker skies), use color --saturation 0 instead and add color-balance --rh 0.1 before — grayscale's uniform desaturation is closer to a Polaroid B&W than a Kodak Tri-X.

negative — Invert colors

FFmpeg's negate filter. Inverts every pixel — useful for fast film-negative effects or for "find the lightest area in a dark scene" inspection.

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| -o, --output <path> | <input-stem>-negative.<ext> | Override output |
bash
$ npx fqmpeg negative input.mp4 --dry-run

  ffmpeg -i input.mp4 -vf negate -c:a copy input-negative.mp4

Running negative twice gets you back to the original (negate is involutive). It also inverts the alpha channel if present — use the negate=alpha=0 variant via --dry-run editing if you want to preserve alpha.

posterize — Reduce color levels

Crushes each channel to N discrete values. --levels 4 (the default) gives a 4-step gradient per channel — 64 total colors (4³). Lower levels = more cartoon-like; higher levels = subtle banding.

  • Source: src/commands/posterize.js
  • Filter: format=rgb24,lutrgb=r='val-mod(val,256/N)':g='val-mod(val,256/N)':b='val-mod(val,256/N)'
| Argument / Option | Default | Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --levels <n> | 4 | 2–256 | Discrete values per channel |
| -o, --output <path> | <input-stem>-posterized.<ext> | | Override output |
bash
$ npx fqmpeg posterize input.mp4 --levels 6 --dry-run

  ffmpeg -i input.mp4 -vf format=rgb24,lutrgb=r='val-mod(val,256/6)':g='val-mod(val,256/6)':b='val-mod(val,256/6)' -c:a copy input-posterized.mp4

--levels 2 produces a black-white-red-green-blue-yellow-magenta-cyan look (2³ = 8 colors). --levels 32 gives subtle banding that's hard to see but breaks gradient sky shots. The format=rgb24 upfront converts from YUV so the per-channel math operates on RGB.
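The lutrgb expression is simple enough to reproduce in Python, which makes the "N distinct values per channel" claim concrete:

```python
def posterize_channel(val, levels):
    """val - mod(val, 256/N): snap a channel down to the nearest multiple of 256/N."""
    step = 256 / levels
    return int(val - (val % step))

# At --levels 4 every 8-bit channel collapses to one of exactly 4 values:
print(sorted({posterize_channel(v, 4) for v in range(256)}))   # [0, 64, 128, 192]
```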

edge — Edge detection (sobel / prewitt / roberts)

Three classic edge detectors. sobel (the default) is the most commonly used — it produces the typical "white edges on black" look that game tutorials show.

  • Source: src/commands/edge.js
  • Filter (sobel): edgedetect=mode=colormix:high=0
  • Filter (prewitt): prewitt
  • Filter (roberts): roberts
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --mode <mode> | sobel | sobel / prewitt / roberts |
| -o, --output <path> | <input-stem>-edge.<ext> | Override output |
bash
$ npx fqmpeg edge input.mp4 --mode sobel --dry-run

  ffmpeg -i input.mp4 -vf edgedetect=mode=colormix:high=0 -c:a copy input-edge.mp4
bash
$ npx fqmpeg edge input.mp4 --mode prewitt --dry-run

  ffmpeg -i input.mp4 -vf prewitt -c:a copy input-edge.mp4

sobel uses FFmpeg's edgedetect=mode=colormix which preserves colors at the edges (most stylistic). prewitt and roberts are direct edge filters with pure-white edges on black. For animation-style "ink line" looks, sobel with colormix is usually what you want.

vignette — Dark edges

FFmpeg's vignette filter, taking a single --angle parameter (vignette aperture in radians).

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --angle <radians> | PI/5 | Larger = stronger vignette; PI/5 is moderate, PI/4 is heavier |
| -o, --output <path> | <input-stem>-vignette.<ext> | Override output |
bash
$ npx fqmpeg vignette input.mp4 --angle PI/5 --dry-run

  ffmpeg -i input.mp4 -vf vignette=angle=PI/5 -c:a copy input-vignette.mp4

Larger --angle means a stronger vignette (FFmpeg clips the value to the [0, PI/2] range). The default PI/5 (≈0.628 rad) is the FFmpeg default. PI/4 is the value vintage uses internally — slightly heavier. The expression evaluator handles PI/5 literally, so don't try to pass 0.6283 — pass the symbolic form.

Distortion & Sharpness

glitch — RGB shift + noise + contrast

The "datamosh aesthetic" verb — chains rgbashift (red shifts right, blue shifts left), noise, and a slight contrast boost. Intensity scales all three.

  • Source: src/commands/glitch.js
  • Filter chain: rgbashift=rh=N:bh=-N:rv=N/2:bv=-N/2,noise=alls=M:allf=t,eq=contrast=1.2 where N = intensity/5, M = min(intensity, 50)
| Argument / Option | Default | Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --intensity <n> | 30 | 1–100 | Drives RGB shift + noise amount together |
| -o, --output <path> | <input-stem>-glitch.<ext> | | Override output |
bash
$ npx fqmpeg glitch input.mp4 --intensity 30 --dry-run

  ffmpeg -i input.mp4 -vf rgbashift=rh=6:bh=-6:rv=3:bv=-3,noise=alls=30:allf=t,eq=contrast=1.2 -c:a copy input-glitch.mp4

--intensity 30 (the default) gives a visible-but-subtle glitch suitable for a transition or title card. --intensity 80+ for full datamosh chaos. Noise caps at 50 even when intensity is 100, so push intensity past 50 only if you want more RGB shift without further grain.
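The intensity-to-parameter mapping above, reproduced in Python (integer division for the shift is an assumption, but it matches the dry-run output for the default intensity):

```python
def glitch_params(intensity):
    """Map an fqmpeg glitch intensity to the filter string it drives."""
    n = intensity // 5                     # horizontal RGB shift in pixels
    m = min(intensity, 50)                 # noise strength, capped at 50
    return (f"rgbashift=rh={n}:bh=-{n}:rv={n // 2}:bv=-{n // 2},"
            f"noise=alls={m}:allf=t,eq=contrast=1.2")

print(glitch_params(30))
# rgbashift=rh=6:bh=-6:rv=3:bv=-3,noise=alls=30:allf=t,eq=contrast=1.2
print(glitch_params(80))
# rgbashift=rh=16:bh=-16:rv=8:bv=-8,noise=alls=50:allf=t,eq=contrast=1.2
```

At intensity 80 the noise term has already hit its cap, which is the behavior described above: past 50, only the RGB shift keeps growing.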

pixelate — Mosaic / pixel-art block

Scales the video down with nearest-neighbor sampling, then back up with nearest-neighbor — the canonical pixel-art trick.

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --size <n> | 16 | Block size — larger = chunkier |
| -o, --output <path> | <input-stem>-pixelated.<ext> | Override output |
bash
$ npx fqmpeg pixelate input.mp4 --size 16 --dry-run

  ffmpeg -i input.mp4 -vf scale=iw/16:ih/16:flags=neighbor,scale=iw*16:ih*16:flags=neighbor -c:a copy input-pixelated.mp4

Note the size has to divide the source dimensions for a clean result — at --size 16 on a 1280×720 source, you get exactly 80×45 blocks. On a 1920×1080 source the math comes out to 120×67.5 → FFmpeg rounds, so you'll see slightly uneven block sizes at edges. For privacy-blurring a face (mosaic over a region), use blur --region X:Y:W:H --strength <n> instead — pixelate always applies to the whole frame.
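A two-line Python check for whether a --size value tiles your frame cleanly:

```python
def pixelate_blocks(w, h, size):
    """Block-grid dimensions for a given --size, and whether they divide evenly."""
    bw, bh = w / size, h / size
    clean = bw.is_integer() and bh.is_integer()
    return bw, bh, clean

print(pixelate_blocks(1280, 720, 16))    # (80.0, 45.0, True)  -- clean grid
print(pixelate_blocks(1920, 1080, 16))   # (120.0, 67.5, False) -- FFmpeg has to round
```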

motion-blur — Frame-mix motion blur

Averages multiple consecutive frames into one output frame via FFmpeg's tmix filter. The result is a smeared, motion-trail look.

| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --frames <n> | 5 | Frames to blend — higher = more blur; output keeps the source frame rate |
| -o, --output <path> | <input-stem>-motion-blur.<ext> | Override output |
bash
$ npx fqmpeg motion-blur input.mp4 --frames 5 --dry-run

  ffmpeg -i input.mp4 -vf tmix=frames=5 -c:a copy input-motion-blur.mp4

tmix keeps the output frame rate the same as input — it averages a sliding window of frames, it doesn't decimate. That makes it suitable for "real motion blur" looks where you'd expect a film camera's shutter to capture motion within a frame. For artistic ghosting (where you want the trail to persist longer), raise --frames to 15–20; for a subtler, film-like shutter blur on 60fps source, --frames 3 is a good starting point.

sharpen — Unsharp mask (light / medium / strong)

The unsharp FFmpeg filter with three preset strengths. light/medium/strong map to unsharp=3:3:1, 5:5:1.5, 7:7:2.

| Argument / Option | Default | Choices | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --strength <level> | medium | light / medium / strong | Maps to fixed unsharp params |
| -o, --output <path> | <input-stem>-sharpened.<ext> | | Override output |
bash
$ npx fqmpeg sharpen input.mp4 --strength medium --dry-run

  ffmpeg -i input.mp4 -vf unsharp=5:5:1.5 -c:a copy input-sharpened.mp4

medium (the default) is a safe choice for slightly-soft phone footage. strong can introduce halos around high-contrast edges — visible most on text or sky-roof boundaries. For a custom amount, copy the --dry-run and replace 5:5:1.5 with your own <x>:<y>:<amount> values (e.g. 7:7:1.0 for a wider but gentler sharpen).

blur — Box blur, whole frame or region

boxblur filter with named-preset strengths (light = 5, medium = 15, strong = 30) or a custom number. With --region X:Y:W:H it splits the stream, blurs only the cropped region, and overlays it back — useful for privacy blurring a face or license plate without touching the rest of the frame.

  • Source: src/commands/blur.js
  • Filter (whole frame): boxblur=N
  • Filter (region): [0:v]split[base][blur];[blur]crop=W:H:X:Y,boxblur=N[blurred];[base][blurred]overlay=X:Y
| Argument / Option | Default | Notes |
| --- | --- | --- |
| <input> | required | Input video |
| --strength <level> | medium | light (5) / medium (15) / strong (30), or a custom integer |
| --region <x:y:w:h> | | Apply only to a region |
| -o, --output <path> | <input-stem>-blurred.<ext> | Override output |
bash
$ npx fqmpeg blur input.mp4 --strength medium --dry-run

  ffmpeg -i input.mp4 -vf boxblur=15 -c:a copy input-blurred.mp4
bash
$ npx fqmpeg blur input.mp4 --strength 8 --region 100:50:300:200 --dry-run

  ffmpeg -i input.mp4 -filter_complex [0:v]split[base][blur];[blur]crop=300:200:100:50,boxblur=8[blurred];[base][blurred]overlay=100:50 -c:a copy input-blurred.mp4

boxblur is fast but produces square-ish blur artifacts on sharp edges. For better-looking blur, copy the --dry-run and swap boxblur=N for gblur=sigma=N (Gaussian, slower but smoother). For a moving subject's face, --region accepts only a static rectangle — for motion tracking you'd need an upstream pose-detection pipeline.
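If you're scripting the region variant, the filtergraph is mechanical to build — this Python helper mirrors the split/crop/overlay shape in the dry-run output above:

```python
def region_blur_graph(x, y, w, h, strength):
    """Build the split -> crop -> boxblur -> overlay graph for a region blur."""
    return (
        f"[0:v]split[base][blur];"                          # duplicate the stream
        f"[blur]crop={w}:{h}:{x}:{y},boxblur={strength}[blurred];"  # blur only the crop
        f"[base][blurred]overlay={x}:{y}"                   # paste it back in place
    )

print(region_blur_graph(100, 50, 300, 200, 8))
# [0:v]split[base][blur];[blur]crop=300:200:100:50,boxblur=8[blurred];[base][blurred]overlay=100:50
```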

HDR & Grain

hdr-to-sdr — Tone-map HDR (BT.2020 + PQ/HLG) → SDR (BT.709)

The conversion most people need when downloading HDR Netflix/YouTube content and finding their non-HDR display shows washed-out colors. Runs a zscale + tonemap chain with one of three tone-mapping algorithms.

  • Source: src/commands/hdr-to-sdr.js
  • Filter chain: zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=<algo>:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p
| Argument / Option | Default | Choices | Notes |
| --- | --- | --- | --- |
| <input> | required | | HDR input |
| --tonemap <algo> | hable | hable / reinhard / mobius | Tone-mapping curve |
| -o, --output <path> | <input-stem>-sdr.<ext> | | Override output |
bash
$ npx fqmpeg hdr-to-sdr input.mp4 --tonemap hable --dry-run

  ffmpeg -i input.mp4 -vf zscale=t=linear:npl=100,format=gbrpf32le,zscale=p=bt709,tonemap=hable:desat=0,zscale=t=bt709:m=bt709:r=tv,format=yuv420p -c:a copy input-sdr.mp4

The chain converts from PQ/HLG to linear, tone-maps, then re-encodes to BT.709 SDR. hable (the Uncharted-2 curve) is the default — natural midtones, gentle highlight roll-off. reinhard compresses everything more aggressively (flatter look). mobius preserves the most highlight detail at the cost of muting midtones. For a deeper dive into the theory and per-algorithm trade-offs see HDR to SDR tonemapping with FFmpeg.
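To see the shape difference, here are reinhard and hable sketched on linear luminance in Python. The hable constants are the widely published Uncharted 2 values — an assumption that they match FFmpeg's internals — and mobius is omitted because its FFmpeg form is parameterized differently; treat this as a curve illustration, not the filter's exact math:

```python
def reinhard(x):
    """Textbook Reinhard: compresses everything, never quite reaches 1.0."""
    return x / (1.0 + x)

def hable_partial(x, A=0.15, B=0.50, C=0.10, D=0.20, E=0.02, F=0.30):
    # Uncharted 2 filmic curve (John Hable's published constants).
    return ((x * (A * x + C * B) + D * E) / (x * (A * x + B) + D * F)) - E / F

def hable(x, white=11.2):
    """Normalize so the chosen white point maps exactly to 1.0."""
    return hable_partial(x) / hable_partial(white)

for lum in (0.5, 1.0, 4.0):
    print(f"x={lum}: reinhard={reinhard(lum):.3f}  hable={hable(lum):.3f}")
```

At a linear input of 1.0, reinhard already outputs 0.5 while hable sits near 0.30, saving more headroom for highlights — the gentle roll-off described above. hable hits exactly 1.0 at its white point (11.2); reinhard only approaches 1.0 asymptotically.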

zscale requires FFmpeg to be built with --enable-libzimg. If your build doesn't have it you'll see Unrecognized option 'zscale'; install a build that includes zimg (most Homebrew/apt/static builds do).

noise — Add film grain

FFmpeg's noise filter with two distributions. gaussian (the default) looks like organic film grain. uniform looks more digital — even sprinkle, no clumping.

  • Source: src/commands/noise.js
  • Filter: noise=alls=N:allf=t+a (gaussian) or noise=alls=N:allf=t+u (uniform)
| Argument / Option | Default | Choices / Range | Notes |
| --- | --- | --- | --- |
| <input> | required | | Input video |
| --strength <n> | 20 | 1–100 | Intensity |
| --type <type> | gaussian | gaussian / uniform | Distribution |
| -o, --output <path> | <input-stem>-noisy.<ext> | | Override output |
bash
$ npx fqmpeg noise input.mp4 --strength 20 --type gaussian --dry-run

  ffmpeg -i input.mp4 -vf noise=alls=20:allf=t+a -c:a copy input-noisy.mp4

--strength 20 is film-grain levels (subtle, doesn't compete with the picture). --strength 60 is 8mm-handheld levels. Above 80 it dominates the image — useful for stylized aged-footage effects. The t flag in allf= is "temporal noise" (a different pattern every frame) — without it you'd get a static dithering pattern. fqmpeg always sets t.

Real-World Recipes

Each recipe chains multiple verbs into a workflow you'd actually use.

Recipe 1: Quick "cinematic" color pass for phone footage

You shot a clip on your phone (well-exposed, fairly saturated, colors look flat on YouTube). Goal: slightly warmer highlights, cooler shadows, a touch of contrast, and 2.35:1 letterbox bars.

bash
# Step 1: teal-and-orange shadow/highlight split
npx fqmpeg color-balance clip.mp4 --rs -0.1 --bs 0.15 --rh 0.1 --bh -0.1
# → clip-balanced.mp4

# Step 2: lift contrast and add a touch of saturation
npx fqmpeg color clip-balanced.mp4 --contrast 1.15 --saturation 1.1
# → clip-balanced-color.mp4

# Step 3: 2.35:1 letterbox
npx fqmpeg cinematic clip-balanced-color.mp4 --ratio 2.35
# → clip-balanced-color-cinematic.mp4

If you have a real grading LUT (.cube file) — say, from a paid pack or DaVinci Resolve export — replace steps 1-2 with npx fqmpeg lut clip.mp4 mylook.cube. LUTs let you carry a consistent grade across a whole project; the color-balance + color combo is the "no LUT available" workflow.

Recipe 2: Green screen actor → final composite

You shot an actor against a green backdrop and have a background plate (a still or a video). Goal: clean key, no green spill, alpha-channel master for further editing.

bash
# Step 1: composite onto the final background for preview
npx fqmpeg chromakey actor.mp4 backdrop.jpg --color 0x00FF00 --similarity 0.3 --blend 0.15
# → actor-keyed.mp4

# Step 2 (optional): export an alpha-channel master for the editor
npx fqmpeg alpha actor.mp4 --color 0x00FF00 --similarity 0.3 --blend 0.15
# → actor-alpha.webm

# Step 3 (optional): if green spill remains, drop saturation on green
npx fqmpeg chroma actor-keyed.mp4 --hue -5 --saturation 0.9
# → actor-keyed-chroma.mp4

If the green wraps around to the subject's edges or shoulders, raise --blend to 0.2–0.3 for softer edges (or accept a slight halo). For chroma spill on the body (greenish skin tones), the better fix is a color-balance pass that pulls green down in the midtones (--gm -0.1) after the key.

Recipe 3: HDR download → SDR web-ready upload

Many high-res streams from Netflix-style sources come as HDR (BT.2020 + PQ or HLG). Your YouTube re-upload or social-media share needs SDR (BT.709). Wrong order here is the difference between a clean conversion and a 30-minute render that comes out gray.

bash
# Step 1: tone-map HDR → SDR (must come first - never re-encode HDR as SDR)
npx fqmpeg hdr-to-sdr hdr-clip.mp4 --tonemap hable
# → hdr-clip-sdr.mp4

# Step 2: optional grade pass to recover saturation lost in tone-mapping
npx fqmpeg color hdr-clip-sdr.mp4 --saturation 1.15 --contrast 1.05
# → hdr-clip-sdr-color.mp4

# Step 3: compress for upload
npx fqmpeg compress hdr-clip-sdr-color.mp4 --crf 21
# → hdr-clip-sdr-color-compressed.mp4

tonemap=hable (the Uncharted-2 curve) is the safe default. If the highlights look blown out — typical on bright outdoor HDR footage — switch to --tonemap mobius for more highlight detail at the cost of slightly muted midtones. The color --saturation 1.15 step is because tone-mapping inherently desaturates HDR content; bringing it back to SDR-natural levels is a final pass, not a starting point. For more on the underlying theory see HDR to SDR tonemapping with FFmpeg.

Frequently Asked Questions

Why does color --contrast accept -1000 to 1000 if I'd never use those values?

Because that's the FFmpeg-side range for the eq filter's contrast parameter, and fqmpeg validates against it directly. In practice you stay between 0.5 (low contrast, gray look) and 2.0 (high contrast, crushed). Values above 5 produce extreme posterization. The wide range exists so artistic uses (stylized horror, vaporwave) aren't gated.

chromakey vs color-key — which one should I use?

They use the same underlying colorkey FFmpeg filter and produce essentially the same output. chromakey adds -c:a copy to preserve the original audio codec, while color-key doesn't (so audio gets re-encoded to the output container default). Use chromakey when keying actual green/blue screens to keep audio clean; use color-key interchangeably when keying a non-traditional color (a colored wall, a logo background). If preserving audio matters, prefer chromakey.

alpha outputs .webm — can I get .mov with ProRes 4444 for Premiere?

Not via fqmpeg directly — alpha's codec choices (libvpx-vp9 + libopus) are hardcoded for web-friendliness. Copy the --dry-run output and swap libvpx-vp9 for prores_ks -profile:v 4444 -pix_fmt yuva444p10le (and the output extension to .mov) before running FFmpeg directly. Same colorkey,format=yuva420p filter chain works for both — only the encoder differs.

My HDR file doesn't tone-map — hdr-to-sdr fails with "Unrecognized option 'zscale'"

Your FFmpeg build doesn't include the zimg library. Reinstall with a build that has --enable-libzimg: brew reinstall ffmpeg on macOS usually does it (Homebrew's bottle has zimg). On Ubuntu/Debian, the official ffmpeg apt package usually has zimg too — older builds may not. The static builds from BtbN's FFmpeg-Builds include zimg.

How do I get a more "Hollywood" look without a LUT pack?

The classic move is the color-balance teal-and-orange split (cool shadows, warm highlights) plus a slight contrast bump:

bash
npx fqmpeg color-balance clip.mp4 --rs -0.1 --bs 0.15 --rh 0.1 --bh -0.1
npx fqmpeg color clip-balanced.mp4 --contrast 1.15 --saturation 1.05

This emulates ~70% of the look you get from generic "cinematic" LUT packs. For matched-to-a-specific-film looks, you need an actual LUT — try FreshLUTs or RocketStock's free packs.

Can I run a "before/after" preview without rendering the whole video?

Trim to a short test segment first, then apply the effect to the trim:

bash
npx fqmpeg trim input.mp4 --start 10 --duration 5 -o test.mp4
npx fqmpeg color test.mp4 --contrast 1.2 -o test-graded.mp4

The trim step gives you a 5-second sample that renders in seconds even at high CRF, so you can iterate on color settings without 10-minute waits.

Why does pixelate --size 16 give uneven blocks on a 1920×1080 source?

Because 16 doesn't divide 1080 evenly (1080 ÷ 16 = 67.5). FFmpeg rounds the intermediate scale=iw/16:ih/16 step, so the up-scale back to original dimensions distributes the rounding error across the image. For clean blocks pick a divisor of both dimensions: 8, 10, or 20 for 1920×1080; 16 works cleanly on 1280×720.
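Clean --size values are exactly the common divisors of the two dimensions — that is, the divisors of gcd(width, height). A quick Python check:

```python
from math import gcd

def clean_sizes(w, h):
    """All --size values that tile a w*h frame with no rounding."""
    g = gcd(w, h)
    return [d for d in range(2, g + 1) if g % d == 0]

print(clean_sizes(1920, 1080))
# [2, 3, 4, 5, 6, 8, 10, 12, 15, 20, 24, 30, 40, 60, 120]
print(16 in clean_sizes(1280, 720))   # True -- which is why 16 works on 720p
```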

What's the difference between motion-blur and blur?

motion-blur is temporal (averages multiple frames into one — used when motion happens during a single frame, simulating shutter blur). blur is spatial (blurs each frame independently — used to soften the whole image or hide a region). They look very different: motion-blur only affects moving parts of the frame; blur affects everything.

Can lut apply a regular 2D LUT (1D color curve) instead of 3D?

No — lut is hardcoded to lut3d=<file>. For a 1D LUT (a .1dl file or per-channel curve), you'd run FFmpeg directly with the lut1d filter. Most modern grading workflows use 3D LUTs anyway because they handle hue-dependent color shifts that 1D LUTs can't.

Wrapping Up

The twenty-three C6 verbs cover the look-and-effect operations you reach for after geometry but before final compression:

  • color, color-balance, equalize, lut, chroma for grading (range note: color --contrast accepts -1000 to 1000 because of the underlying eq filter; useful range is 0.5–2.0)
  • chromakey, color-key, alpha for keying and compositing (all three use the same colorkey FFmpeg filter; alpha outputs .webm with VP9+Opus for web-friendly transparent video)
  • cinematic, vintage, sepia, grayscale, negative, posterize, vignette, edge for stylized looks (vintage is a four-filter chain — useful as a worked example for stacking your own effects)
  • glitch, pixelate, motion-blur, sharpen, blur for distortion and sharpness (blur --region is the privacy-pixelation verb)
  • hdr-to-sdr, noise for HDR conversion and film grain (hdr-to-sdr requires zscale / zimg; noise defaults to gaussian)

Every verb prints its underlying FFmpeg invocation under --dry-run, so when the defaults don't fit (a non-.webm alpha output, a .cube LUT but on log source, a custom unsharp amount), you can copy the command, customize, and run FFmpeg directly. For the broader fqmpeg map, see the fqmpeg complete guide.