MeshWorld.

jq Cheat Sheet: Filter, Transform, and Query JSON from the Terminal

By Cobie

Quick reference tables

Basic filters

| Filter | What it does |
| --- | --- |
| `.` | Identity — output input unchanged |
| `.key` | Get field by name |
| `.key.nested` | Get nested field |
| `.["key"]` | Get field by bracket notation (needed for keys with spaces or dashes) |
| `.key?` | Like `.key`, but no error if the input is not an object |
| `.[0]` | Get first array element |
| `.[-1]` | Get last array element |
| `.[2:5]` | Slice array from index 2 up to (excluding) index 5 |
| `.[]` | Iterate array elements or object values |
| `.[].key` | Get a field from every object in an array |
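
A few of these in action (sample JSON invented for illustration):

```shell
# Field access, raw output
echo '{"name":"api","port":8080}' | jq -r '.name'
# api

# Iterate an array of objects and pull one field
echo '[{"id":1},{"id":2},{"id":3}]' | jq '.[].id'
# 1
# 2
# 3

# Slice an array
echo '[10,20,30,40,50]' | jq -c '.[1:3]'
# [20,30]
```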

Pipe and comma

| Syntax | What it does |
| --- | --- |
| `filter1 \| filter2` | Pipe — pass output of filter1 into filter2 |
| `filter1, filter2` | Comma — output results of both filters |
| `(filter1), (filter2)` | Grouping with parentheses |
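
For example (made-up input):

```shell
# Pipe: feed one filter's output into the next
echo '{"a":{"b":5}}' | jq '.a | .b'
# 5

# Comma: run both filters over the same input
echo '{"x":1,"y":2}' | jq '.x, .y'
# 1
# 2
```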

Type and value checks

| Filter | What it does |
| --- | --- |
| `type` | Return the type: "null", "boolean", "number", "string", "array", "object" |
| `length` | Length of string, array, or object |
| `keys` | Array of object keys (sorted) |
| `keys_unsorted` | Array of object keys (insertion order) |
| `values` | Array of object values |
| `has("key")` | True if object has key or array has index |
| `in(obj)` | True if input key exists in obj |
| `contains(val)` | True if input contains val |
| `any` | True if any element is truthy |
| `all` | True if all elements are truthy |
| `empty` | Produces no output (useful with conditionals) |
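
Quick examples (invented data):

```shell
# keys returns a sorted array
echo '{"b":2,"a":1}' | jq -c 'keys'
# ["a","b"]

echo '{"a":1}' | jq 'has("b")'
# false

echo '[1,"x",null]' | jq '.[] | type'
# "number"
# "string"
# "null"
```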

Constructing output

| Syntax | What it does |
| --- | --- |
| `{key: .field}` | Build object with field mapped to key |
| `{key}` | Shorthand when key name matches field name |
| `{(.field): .other}` | Dynamic key from input |
| `[.[] \| filter]` | Collect results into an array |
| `@base64` | Base64-encode a string |
| `@base64d` | Base64-decode |
| `@uri` | URI-encode |
| `@csv` | Format array as CSV |
| `@tsv` | Format array as TSV |
| `@json` | Encode as JSON string |
| `@html` | HTML entity encoding |
| `"\(.field) text"` | String interpolation |
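
A few construction patterns together (sample data invented):

```shell
# Build a new object; shorthand {port} copies .port as-is
echo '{"ip":"10.0.0.5","port":8080}' | jq -c '{host: .ip, port}'
# {"host":"10.0.0.5","port":8080}

# Format rows as CSV (@csv quotes strings, not numbers)
echo '[{"n":"a","v":1},{"n":"b","v":2}]' | jq -r '.[] | [.n, .v] | @csv'
# "a",1
# "b",2

# String interpolation
echo '{"user":"alice"}' | jq -r '"hello \(.user)"'
# hello alice
```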

Select and conditionals

| Filter | What it does |
| --- | --- |
| `select(cond)` | Pass input through only if condition is true |
| `if cond then A else B end` | Conditional expression |
| `if cond then A elif cond2 then B else C end` | Multi-branch conditional |
| `//` | Alternative operator — use right side if left is null/false |
| `.key // "default"` | Provide a default value |
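
For instance (made-up input):

```shell
# Keep only matching elements
echo '[{"name":"web","up":true},{"name":"db","up":false}]' | \
  jq -r '.[] | select(.up) | .name'
# web

# Conditional expression
echo '{"count":0}' | jq -r 'if .count > 0 then "busy" else "idle" end'
# idle

# Default for a missing field
echo '{}' | jq -r '.region // "unknown"'
# unknown
```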

Array operations

| Filter | What it does |
| --- | --- |
| `map(f)` | Apply filter to every element, collect into array |
| `map_values(f)` | Apply filter to every object value |
| `map(select(f))` | Filter elements out of an array |
| `sort` | Sort array |
| `sort_by(.field)` | Sort array of objects by field |
| `reverse` | Reverse array |
| `unique` | Deduplicate array (also sorts it) |
| `unique_by(.field)` | Deduplicate by field value |
| `group_by(.field)` | Group array of objects by field (groups sorted by key) |
| `flatten` | Flatten nested arrays |
| `flatten(depth)` | Flatten to given depth |
| `first` | First element (or first result of an expression) |
| `last` | Last element |
| `nth(n; expr)` | Nth result from expression |
| `add` | Sum numbers, concatenate strings/arrays, merge objects |
| `any(cond)` | True if any element matches |
| `all(cond)` | True if all elements match |
| `min` / `max` | Min/max value |
| `min_by(.f)` / `max_by(.f)` | Min/max by field |
| `indices("x")` | All indices where value appears |
| `index("x")` | First index |
| `rindex("x")` | Last index |
| `limit(n; expr)` | Take first n results |
| `first(expr)` | First result from expr |
| `range(n)` | Numbers 0 to n-1 |
| `range(start; end)` | Numbers in range |
| `range(start; end; step)` | Numbers with step |
| `recurse` | Recursively descend into values |
| `recurse(.children)` | Recurse into a specific field |
| `walk(f)` | Apply f to every node, bottom-up |
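
Some of the most common array operations in practice (data invented):

```shell
# Group and count; group_by sorts the groups by key
echo '[{"env":"prod"},{"env":"dev"},{"env":"prod"}]' | \
  jq -c 'group_by(.env) | map({env: .[0].env, count: length})'
# [{"env":"dev","count":1},{"env":"prod","count":2}]

# unique both deduplicates and sorts
echo '[3,1,2,3,1]' | jq -c 'unique'
# [1,2,3]

# add sums an array of numbers
echo '[1,2,3,4]' | jq 'add'
# 10
```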

String operations

| Filter | What it does |
| --- | --- |
| `split("delim")` | Split string into array |
| `join("delim")` | Join array of strings |
| `ltrimstr("prefix")` | Remove prefix if present |
| `rtrimstr("suffix")` | Remove suffix if present |
| `startswith("str")` | True if string starts with str |
| `endswith("str")` | True if string ends with str |
| `ascii_downcase` | Lowercase |
| `ascii_upcase` | Uppercase |
| `test("regex")` | Boolean regex match |
| `test("regex"; "flags")` | With flags: `g` global, `i` case-insensitive, `x` extended |
| `match("regex")` | Match object with offset and captures |
| `capture("(?<name>regex)")` | Named captures as an object |
| `scan("regex")` | Array of all matches |
| `gsub("pat"; "repl")` | Substitute all matches |
| `sub("pat"; "repl")` | Substitute first match |
| `tostring` | Convert to string |
| `tonumber` | Parse string as number |
| `explode` | String to array of Unicode code points |
| `implode` | Code points back to string |
| `tojson` | Encode value as a JSON string |
| `fromjson` | Parse JSON string to a value |
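
String handling in action (sample strings invented):

```shell
# Split a timestamp on "T" and keep the date part
echo '"2024-01-15T10:30:00Z"' | jq -r 'split("T") | .[0]'
# 2024-01-15

# Named captures; captured values come back as strings
echo '"v1.2.3"' | jq -c 'capture("v(?<major>\\d+)\\.(?<minor>\\d+)")'
# {"major":"1","minor":"2"}

# Regex test
echo '"error: disk full"' | jq 'test("disk")'
# true
```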

Math and comparison

| Filter | What it does |
| --- | --- |
| `+`, `-`, `*`, `/`, `%` | Arithmetic |
| `==`, `!=`, `<`, `<=`, `>`, `>=` | Comparison |
| `and`, `or`, `not` | Boolean operators (`not` is a filter: `.a \| not`) |
| `floor`, `ceil`, `round`, `fabs` | Numeric rounding and absolute value |
| `sqrt`, `pow(.; n)`, `log`, `exp` | Math functions |
| `nan`, `infinite`, `isnan`, `isinfinite`, `isnormal` | Special float values and checks |
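
A couple of quick numeric examples:

```shell
echo '[1.7, 2.3, -0.5]' | jq -c 'map(floor)'
# [1,2,-1]

echo '9' | jq 'sqrt'
# 3

echo '3' | jq 'pow(.; 2)'
# 9
```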

Path operations

| Filter | What it does |
| --- | --- |
| `path(expr)` | Array representing the path to expr |
| `getpath(["a","b"])` | Get value at a path array |
| `setpath(["a","b"]; val)` | Set value at a path array |
| `delpaths([paths])` | Delete multiple paths |
| `del(.key)` | Delete a field |
| `del(.arr[2])` | Delete an array element by index |
| `leaf_paths` | All paths to leaf values |
| `paths` | All paths |
| `paths(scalars)` | Paths to scalar values only |
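
Path operations illustrated (toy objects):

```shell
# Delete a field
echo '{"a":{"b":1},"c":2}' | jq -c 'del(.c)'
# {"a":{"b":1}}

# Get a value by path array
echo '{"a":{"b":1}}' | jq 'getpath(["a","b"])'
# 1

# setpath creates intermediate objects as needed
echo '{}' | jq -c 'setpath(["a","b"]; 5)'
# {"a":{"b":5}}
```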

Reduce and advanced

| Filter | What it does |
| --- | --- |
| `reduce .[] as $x (init; expr)` | Fold over values with an accumulator |
| `foreach .[] as $x (init; update)` | Like reduce, but emits each intermediate result |
| `label $out \| ... break $out` | Break out of an expression or generator early |
| `$__loc__` | Current file and line (debugging) |
| `debug` | Print value to stderr, pass it through unchanged |
| `debug("msg")` | Print message to stderr |
| `error` | Raise an error |
| `halt_error(n)` | Print input to stderr and exit with status n |
| `input` | Read the next JSON input |
| `inputs` | All remaining inputs as a stream |
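
foreach is worth a small demo, since unlike reduce it emits every intermediate accumulator value:

```shell
# Running total: each partial sum is emitted, then collected into an array
echo '[1,2,3,4]' | jq -c '[foreach .[] as $x (0; . + $x)]'
# [1,3,6,10]

# debug prints to stderr but leaves the value flowing through the pipeline
echo '{"a":1}' | jq 'debug | .a'
# 1   (plus a ["DEBUG:",...] line on stderr)
```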

CLI flags

| Flag | What it does |
| --- | --- |
| `-r` / `--raw-output` | Output strings without quotes |
| `-R` / `--raw-input` | Read input as raw strings (one per line) |
| `-s` / `--slurp` | Read all inputs into one array |
| `-n` / `--null-input` | No input — start with null |
| `-c` / `--compact-output` | Compact JSON, no pretty-printing |
| `-e` / `--exit-status` | Exit with status 1 if the last output is false/null |
| `--arg name val` | Bind val as `$name` (string) |
| `--argjson name val` | Bind val as `$name` (parsed as JSON) |
| `--slurpfile name file` | Bind file contents as `$name` (array of parsed values) |
| `--rawfile name file` | Bind raw file contents as `$name` (string) |
| `--jsonargs` | Treat remaining positional args as JSON (`$ARGS.positional`) |
| `--args` | Treat remaining positional args as strings (`$ARGS.positional`) |
| `-f file` | Read filter from file |
| `-L path` | Add path to module search path |
| `--tab` | Use tabs for indentation |
| `--indent n` | Use n spaces for indentation |
| `--stream` | Parse input in streaming form |
| `--seq` | Use JSON text sequences (RFC 7464) |
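
Two flag combinations that come up constantly (values here are invented):

```shell
# Build JSON from scratch: -n for no input, --arg for a string,
# --argjson for a real number
jq -nc --arg name "web" --argjson port 8080 '{name: $name, port: $port}'
# {"name":"web","port":8080}

# -e makes jq usable as a test in scripts: exit status reflects the result
echo '{"ok":false}' | jq -e '.ok' >/dev/null || echo "check failed"
# check failed
```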

Detailed sections

Common DevOps patterns

These are the jq one-liners you end up writing constantly in real DevOps work.

Extract a single field:

curl -s https://api.example.com/health | jq '.status'
# "ok"

# Without quotes
curl -s https://api.example.com/health | jq -r '.status'
# ok

List all keys of an object:

echo '{"name":"web","port":3000,"env":"prod"}' | jq 'keys'
# ["env","name","port"]

Filter array elements:

# Get all services where status is "running"
kubectl get pods -o json | jq '.items[] | select(.status.phase == "Running") | .metadata.name'

Reshape: extract specific fields from array of objects:

docker ps --format json | jq -r '[.Names, .Image, .Status] | @tsv'

Count elements:

cat data.json | jq '.users | length'

Build a new object from parts:

cat server.json | jq '{host: .ip, port: .config.port, secure: (.config.tls // false)}'

Default values:

# Use "unknown" if .region is null or missing
cat data.json | jq '.region // "unknown"'

Sort and deduplicate:

cat data.json | jq '[.[] | .tags[]] | unique | sort'

AWS CLI workflows

jq is the standard way to extract data from AWS CLI JSON output.

# List all EC2 instance IDs and their state
aws ec2 describe-instances | jq -r '.Reservations[].Instances[] | [.InstanceId, .State.Name] | @tsv'

# Get all running instance IPs
aws ec2 describe-instances \
  --filters "Name=instance-state-name,Values=running" | \
  jq -r '.Reservations[].Instances[].PublicIpAddress'

# Get all S3 bucket names
aws s3api list-buckets | jq -r '.Buckets[].Name'

# Get a specific secret value from Secrets Manager
aws secretsmanager get-secret-value --secret-id myapp/prod | \
  jq -r '.SecretString | fromjson | .DB_PASSWORD'

# Get the latest AMI ID from a list
aws ec2 describe-images --owners amazon \
  --filters 'Name=name,Values=amzn2-ami-hvm-*' | \
  jq -r '.Images | sort_by(.CreationDate) | last | .ImageId'

Kubernetes workflows

# Get all pod names in a namespace
kubectl get pods -n default -o json | jq -r '.items[].metadata.name'

# Get pods that are NOT in Running state
kubectl get pods -o json | jq -r '.items[] | select(.status.phase != "Running") | .metadata.name'

# Get all container images in use across pods
kubectl get pods --all-namespaces -o json | \
  jq -r '.items[].spec.containers[].image' | sort | uniq

# Get resource limits for all containers
kubectl get pods -o json | jq '.items[] | {
  pod: .metadata.name,
  containers: [.spec.containers[] | {
    name: .name,
    cpu: .resources.limits.cpu,
    memory: .resources.limits.memory
  }]
}'

# Get all nodes and their roles
kubectl get nodes -o json | \
  jq -r '.items[] | [.metadata.name, (.metadata.labels | to_entries | map(select(.key | startswith("node-role"))) | map(.key) | join(","))] | @tsv'

Docker and compose workflows

# List running container names and images
docker ps --format '{{json .}}' | jq -r '[.Names, .Image] | @tsv'

# Inspect a container's environment variables
docker inspect mycontainer | jq '.[0].Config.Env[]'

# Get all exposed ports for a container
docker inspect mycontainer | jq '.[0].NetworkSettings.Ports'

# Get all container IPs on a specific network
docker network inspect mynetwork | jq '.[0].Containers | to_entries[] | {name: .value.Name, ip: .value.IPv4Address}'

Using variables in filters

Shell variables can be passed into jq filters safely — no string interpolation needed.

# Using --arg (passes as JSON string)
TARGET_ENV="production"
cat config.json | jq --arg env "$TARGET_ENV" '.environments[] | select(.name == $env)'

# Using --argjson (passes as raw JSON — number, array, object, bool)
THRESHOLD=100
cat metrics.json | jq --argjson thresh "$THRESHOLD" '.[] | select(.value > $thresh)'

# Multiple variables
cat data.json | jq --arg name "alice" --arg role "admin" \
  '.users[] | select(.name == $name and .role == $role)'

Reduce — accumulating results

# Sum all values in an array
echo '[1, 2, 3, 4, 5]' | jq 'reduce .[] as $x (0; . + $x)'
# 15

# Group objects by a field into a map
echo '[{"type":"A","v":1},{"type":"B","v":2},{"type":"A","v":3}]' | \
  jq 'reduce .[] as $x ({}; .[$x.type] += [$x.v])'
# {"A":[1,3],"B":[2]}

# Count occurrences of each value
cat logs.json | jq '[.[].status] | reduce .[] as $s ({}; .[$s|tostring] += 1)'

Processing line-delimited JSON (NDJSON / JSONL)

Many logging tools (CloudWatch, Loki, Fluentd) output one JSON object per line rather than a JSON array.

# Process each line as a separate JSON object
cat logs.ndjson | jq -r '.message'

# Collect all lines into an array, then filter
cat logs.ndjson | jq -s '[.[] | select(.level == "error")]'

# Count errors
cat logs.ndjson | jq -s '[.[] | select(.level == "error")] | length'

Processing raw text input

Use -R to read non-JSON input, then parse with fromjson if needed.

# Count lines in a file
cat file.txt | jq -Rn '[inputs] | length'

# Convert CSV-ish output to JSON (simple case)
df -h | tail -n +2 | jq -Rn '
  [inputs | split(" ") | map(select(. != "")) | {
    filesystem: .[0],
    size: .[1],
    used: .[2],
    avail: .[3],
    use_pct: .[4],
    mount: .[5]
  }]
'

Related: Linux Bash Cheat Sheet | Docker Compose Cheat Sheet | Nginx Cheat Sheet