Why Bash Is Ideal for Small Linux Tools
Bash is available on almost every Linux system, starts instantly, and glues together existing commands. Bash-based tools shine when:

- You’re orchestrating existing CLI programs (`grep`, `awk`, `curl`, etc.)
- Performance is “good enough,” and startup time matters more than raw speed
- You need quick automation that can later be rewritten in another language if necessary
In this chapter we focus on building robust, reusable Bash-based tools rather than simple, one-off scripts.
Designing Bash Tools as Real Commands
A “tool” should behave like a normal Unix command:

- Has a clear name and lives somewhere on `$PATH`
- Accepts arguments and options
- Prints useful output to stdout and errors to stderr
- Returns meaningful exit codes
- Has at least a `-h`/`--help` option
Installing Your Tool on `$PATH`
For a single user, place executables in `~/bin` or `~/.local/bin` and ensure that directory is in `$PATH`:

```bash
mkdir -p ~/.local/bin
echo 'export PATH="$HOME/.local/bin:$PATH"' >> ~/.bashrc
. ~/.bashrc
```
Now any script you copy or symlink into ~/.local/bin can be run by name.
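To see the mechanism end to end, here is a throwaway sketch: it installs a trivial script (`hello-tool` is a made-up name) into a fresh directory, puts that directory on `$PATH`, and runs it by name:

```shell
# Create a private bin directory and "install" a one-line tool into it
bin_dir=$(mktemp -d)
cat > "$bin_dir/hello-tool" <<'EOF'
#!/usr/bin/env bash
echo "hello from $0"
EOF
chmod +x "$bin_dir/hello-tool"

# Once the directory is on PATH, the script runs like any other command
PATH="$bin_dir:$PATH"
hello-tool
```

The same applies whether you copy the file or symlink it; only the executable bit and the `$PATH` entry matter.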
Script Skeleton for a Bash Tool
A solid starting template:
```bash
#!/usr/bin/env bash
set -euo pipefail

# Default values
verbose=0

usage() {
    cat <<EOF
Usage: ${0##*/} [OPTIONS] ARG...

Description:
  Short description of what this tool does.

Options:
  -v, --verbose   Increase verbosity
  -h, --help      Show this help and exit
EOF
}

log() {
    # log to stderr
    echo "[$(date +'%F %T')] $*" >&2
}

die() {
    echo "Error: $*" >&2
    exit 1
}

main() {
    # Parse arguments
    local args=()
    while [[ $# -gt 0 ]]; do
        case "$1" in
            -v|--verbose)
                verbose=$((verbose+1))
                shift
                ;;
            -h|--help)
                usage
                exit 0
                ;;
            --) # end of options
                shift
                break
                ;;
            -*)
                die "Unknown option: $1"
                ;;
            *)
                args+=("$1")
                shift
                ;;
        esac
    done

    # Restore positional arguments
    set -- "${args[@]}" "$@"

    # Example: require at least one argument
    [[ $# -ge 1 ]] || { usage >&2; exit 2; }

    # Your core logic here
    if (( verbose > 0 )); then
        log "Running with $# arguments: $*"
    fi
    # ...
}

main "$@"
```

Key ideas:

- `set -euo pipefail` for safer defaults (covered more deeply in an advanced scripting chapter, but used here as standard practice)
- `${0##*/}` for the program’s basename
- A `usage` function for `-h`/`--help`
- `die` and `log` for consistent error and logging behavior
Input, Output, and Exit Codes in Tools
Well-behaved tools treat:

- stdout as the main data channel
- stderr as diagnostics and progress output
- Exit status (`$?`) as a machine-readable success/fail indicator
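These conventions can be sketched in one small function (`check_host` is a made-up example):

```shell
# Data goes to stdout, progress goes to stderr, and the return value
# signals success or failure.
check_host() {
    local host=$1
    echo "checking $host..." >&2    # diagnostics: stderr
    if [[ $host == *.example.com ]]; then
        echo "$host"                # data: stdout
        return 0
    fi
    return 1
}

# Callers can capture the data, silence the progress, and branch on status
if result=$(check_host web1.example.com 2>/dev/null); then
    echo "ok: $result"
fi
```

Because the three channels are kept separate, the same function works interactively, in pipelines, and in `if`/`&&` chains without modification.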
Distinguishing stdout and stderr
```bash
list_big_files() {
    local dir=${1:-.}
    [[ -d "$dir" ]] || die "Not a directory: $dir"

    # Progress message on stderr
    echo "Scanning directory: $dir" >&2

    # Data to stdout
    find "$dir" -type f -size +100M -print
}
```

Usage:
```bash
# See only results
mytool /var 2>/dev/null | sort > big_files.txt

# See only progress/errors
mytool /var >/dev/null
```

Returning Meaningful Exit Codes
Common conventions:
- `0` — success
- `1` — generic error
- `2` — command-line usage error (bad arguments)
- Other nonzero values — specific error conditions your tool defines
Example:
```bash
if ! some_command; then
    die "some_command failed"   # exits with 1
fi
```

Or, for specific codes:
```bash
readonly E_USAGE=2
readonly E_NOT_FOUND=3

[[ $# -ge 1 ]] || { usage >&2; exit "$E_USAGE"; }

if [[ ! -f "$1" ]]; then
    echo "Not found: $1" >&2
    exit "$E_NOT_FOUND"
fi
```

Robust Argument Parsing Patterns
For sophisticated tools, option parsing matters.
Manual Parsing with `while` Loop
The skeleton above used:
```bash
while [[ $# -gt 0 ]]; do
    case "$1" in
        -o|--option)
            opt_value=$2
            shift 2
            ;;
        *)
            args+=("$1")
            shift
            ;;
    esac
done
```

Pattern for options that require values:
```bash
-o|--output)
    [[ $# -ge 2 ]] || die "Missing value for $1"
    output=$2
    shift 2
    ;;
```

Using `getopts` for Short Options
`getopts` is useful when you only need short options (like `-a -b -c value`):

```bash
#!/usr/bin/env bash
set -euo pipefail

verbose=0
output=""

usage() {
    echo "Usage: ${0##*/} [-v] [-o FILE] ARG..."
}

while getopts ":vo:" opt; do
    case "$opt" in
        v) verbose=$((verbose+1)) ;;
        o) output=$OPTARG ;;
        \?) echo "Unknown option: -$OPTARG" >&2; usage >&2; exit 2 ;;
        :) echo "Option -$OPTARG requires an argument." >&2; usage >&2; exit 2 ;;
    esac
done
shift $((OPTIND - 1))

# Now $@ contains non-option arguments
```
`getopts` does not handle `--long-options`; for that, the manual `while` loop is usually clearer.
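If you want a couple of long options without giving up `getopts`, one middle-ground sketch is to rewrite known long options into their short equivalents before parsing (note that it does not handle the `--output=FILE` form):

```shell
# Normalize known long options to short ones, then parse with getopts.
# Sets globals: verbose, output, remaining.
parse_args() {
    local -a normalized=()
    local arg opt

    for arg in "$@"; do
        case "$arg" in
            --verbose) normalized+=(-v) ;;
            --output)  normalized+=(-o) ;;
            *)         normalized+=("$arg") ;;
        esac
    done
    set -- "${normalized[@]}"

    verbose=0
    output=""
    local OPTIND=1   # reset so the function is re-entrant
    while getopts ":vo:" opt; do
        case "$opt" in
            v) verbose=$((verbose+1)) ;;
            o) output=$OPTARG ;;
            *) return 2 ;;
        esac
    done
    shift $((OPTIND - 1))
    remaining=("$@")   # leftover positional arguments
}
```

For example, `parse_args --verbose --output out.txt file1` leaves `verbose=1`, `output=out.txt`, and `file1` in `remaining`.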
Safe Data Handling and Defensive Practices
Writing tools means your script will be used in unpredictable environments. You need to handle:
- Spaces and special characters in filenames
- Errors in pipelines
- Temporary files and cleanup
- Signals (for graceful interruption)
Handling Filenames Safely
Always quote variables and prefer arrays:
- Bad (word splitting, globbing):

```bash
rm $files
for f in $files; do ...
```

- Good:

```bash
rm -- "${files[@]}"
for f in "${files[@]}"; do ...
```

To read lines safely (including spaces):
```bash
mapfile -t lines < <(command producing lines)
for line in "${lines[@]}"; do
    printf '%s\n' "$line"
done
```

Or using `while`:

```bash
while IFS= read -r line; do
    printf '%s\n' "$line"
done < <(command)
```

Making Pipelines Fail on Error
With `set -o pipefail`, a pipeline fails if any command in it fails, not just the last one; combined with `set -e`, that failure aborts your tool:
```bash
set -euo pipefail

# If grep fails (e.g. no matches), the whole pipe fails:
grep -r "pattern" /some/dir | sed 's/pattern/FOUND/g'
```

You might intentionally tolerate failures; in that case, explicitly handle them:
```bash
if ! grep -q "pattern" file; then
    echo "pattern not found, continuing..." >&2
fi
```

Temporary Files and Cleanup
Use `mktemp` and `trap`:

```bash
tmpfile=$(mktemp)

cleanup() {
    rm -f "$tmpfile"
}
trap cleanup EXIT

# Use "$tmpfile"
```

For multiple resources:
```bash
tmpdir=$(mktemp -d)

cleanup() {
    rm -rf "$tmpdir"
}
trap cleanup EXIT INT TERM
```

Structuring Bash Tools as Libraries + Frontends
For more complex tools, split reusable logic (a library) from the CLI “frontend”.
Simple Library Pattern
`libfs.sh`:

```bash
#!/usr/bin/env bash

fs_list_big() {
    local dir=${1:-.}
    find "$dir" -type f -size +100M -print
}
```
`bigfiles` tool:

```bash
#!/usr/bin/env bash
set -euo pipefail

# shellcheck source=libfs.sh
. "/path/to/libfs.sh"

usage() {
    echo "Usage: ${0##*/} [DIR]"
}

main() {
    local dir=${1:-.}
    fs_list_big "$dir"
}

main "$@"
```

This allows multiple tools to reuse the same functions.
Namespacing Functions
To avoid naming collisions in larger toolsets, use prefixes:
- `fs_` for filesystem utilities
- `net_` for networking utilities
- `pkg_` for package-related functions
Example:
```bash
backup_create_archive() { ... }
backup_list_archives() { ... }
backup_restore_archive() { ... }
```

Building Multi-Command “Tool Suites” (Subcommands)
Many serious Bash tools emulate the `git`/`docker` style:

```bash
mytool add ...
mytool list ...
mytool remove ...
```

Simple Subcommand Dispatcher
```bash
#!/usr/bin/env bash
set -euo pipefail

usage() {
    cat <<EOF
Usage: ${0##*/} <command> [options]

Commands:
  add      Add a new item
  list     List items
  remove   Remove an item
EOF
}

cmd_add() {
    # implementation of "add"
    echo "Adding: $*"
}

cmd_list() {
    # implementation of "list"
    echo "Listing items"
}

cmd_remove() {
    # implementation of "remove"
    echo "Removing: $*"
}

main() {
    local cmd=${1-}
    shift || true

    case "$cmd" in
        add)    cmd_add "$@" ;;
        list)   cmd_list "$@" ;;
        remove) cmd_remove "$@" ;;
        -h|--help|"")
            usage
            ;;
        *)
            echo "Unknown command: $cmd" >&2
            usage >&2
            exit 2
            ;;
    esac
}

main "$@"
```

This approach scales well and keeps each command’s code localized.
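A fixed `case` statement must be edited for every new command. `git`-style suites often also fall back to external executables named `<tool>-<command>` found on `$PATH`, so the suite can grow without touching the dispatcher. A sketch (the `mytool-` prefix is illustrative):

```shell
# Fallback dispatch: run an external executable named "mytool-<cmd>"
# if one exists on PATH, otherwise fail with a usage-style exit code.
run_external() {
    local cmd=$1; shift
    local handler="mytool-$cmd"

    if command -v "$handler" >/dev/null 2>&1; then
        "$handler" "$@"
    else
        echo "Unknown command: $cmd" >&2
        return 2
    fi
}
```

In `main`, the `*)` branch would call `run_external "$cmd" "$@"` instead of failing immediately.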
Interfacing with Other Tools
The strength of Bash-based tools lies in composing existing commands.
Wrapping External Commands
Sometimes your “tool” is just a strict, opinionated wrapper:
```bash
#!/usr/bin/env bash
set -euo pipefail

# Example: safe-rm that moves files to a trash directory
TRASH_DIR=${TRASH_DIR:-"$HOME/.local/share/trash"}
mkdir -p "$TRASH_DIR"

usage() {
    echo "Usage: ${0##*/} FILE..."
}

main() {
    [[ $# -ge 1 ]] || { usage >&2; exit 2; }

    local f dest
    for f in "$@"; do
        [[ -e "$f" ]] || { echo "No such file: $f" >&2; continue; }
        dest="$TRASH_DIR/$(date +%s)_${f##*/}"
        mv -- "$f" "$dest"
        echo "Moved $f -> $dest"
    done
}

main "$@"
```

Using `grep`, `awk`, `sed` in a Maintainable Way
Keep transformations isolated and clear:
```bash
# Bad: complex one-liner, hard to maintain
ps aux | grep myproc | grep -v grep | awk '{print $2}' | xargs kill

# Better: separated stages and clear logic
pids=$(ps -C myproc -o pid=)   # if possible, use targeted options
if [[ -n "$pids" ]]; then
    echo "$pids" | xargs kill
fi
```

When you must use pipelines:
```bash
ps aux |
    awk '$11 ~ /myproc/ {print $2}' |
    xargs -r kill
```
Note the use of `xargs -r` to avoid running `kill` with no arguments.
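The difference is easy to demonstrate. Without `-r` (a GNU extension), `xargs` runs its command once even when it receives no input at all:

```shell
# No input: plain xargs still runs "echo ran" once and prints "ran"...
printf '' | xargs echo ran

# ...while -r skips the command entirely when stdin is empty
printf '' | xargs -r echo ran
```

For commands like `kill` or `rm`, that one spurious invocation is the difference between a harmless no-op and an error (or worse).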
Configuration Handling for Bash Tools
Non-trivial tools often need user configuration.
Config Files with Defaults
Pattern:
- Set hard-coded defaults
- Optionally load a config file if present
- Allow command-line options to override config
Example:
```bash
#!/usr/bin/env bash
set -euo pipefail

# 1. Hard-coded defaults
log_level="info"
endpoint="https://api.example.com"

# 2. Optional config
config_file="${XDG_CONFIG_HOME:-$HOME/.config}/mytool/config"
if [[ -f "$config_file" ]]; then
    # shellcheck source=/dev/null
    . "$config_file"
fi

# 3. Command-line overrides
while [[ $# -gt 0 ]]; do
    case "$1" in
        --log-level)
            log_level=$2; shift 2 ;;
        --endpoint)
            endpoint=$2; shift 2 ;;
        # ...
        *)
            break ;;
    esac
done

# Now use $log_level, $endpoint
```
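Note that sourcing executes the config file as arbitrary shell code. When the file may not be trustworthy, a stricter alternative is to parse it yourself and whitelist known keys. A sketch (it handles only plain `key=value` lines and does not strip quotes or surrounding whitespace):

```shell
# Read simple key=value lines; only known keys are accepted.
load_config() {
    local file=$1 key value
    [[ -f "$file" ]] || return 0

    while IFS='=' read -r key value; do
        case "$key" in
            log_level) log_level=$value ;;
            endpoint)  endpoint=$value ;;
            ''|'#'*)   ;;  # skip blank lines and comments
            *) echo "Ignoring unknown config key: $key" >&2 ;;
        esac
    done < "$file"
}
```

This would replace the `.`-source step above: `load_config "$config_file"`.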
Config file example (`~/.config/mytool/config`):

```bash
log_level="debug"
endpoint="https://staging-api.example.com"
```

Testing and Linting Bash Tools
Robust tools need at least basic automated checks.
Using `shellcheck`
`shellcheck` is a static analyzer for shell scripts:

```bash
shellcheck mytool
```

It catches:

- Missing quotes
- Wrong use of `[`/`test`
- Dangerous `for f in $(ls)` patterns
- Many subtle bugs
You can run it in CI or as a pre-commit hook.
Simple Test Patterns
For small tools, basic tests can be simple scripts:
```bash
#!/usr/bin/env bash
set -euo pipefail

# Example regression test
output=$(./mytool --version)
if [[ "$output" != "mytool 1.0"* ]]; then
    echo "Version output mismatch: $output" >&2
    exit 1
fi
```
Or use `bats` (Bash Automated Testing System) if you want a more structured test framework.
Performance Considerations for Bash Tools
Bash isn’t for heavy computation, but you can keep tools snappy:
- Avoid unnecessary subshells (`$(...)`) and external commands
- Use Bash’s built-in features where reasonable
- Minimize `fork`/`exec` in tight loops
Prefer Built-ins When It’s Clear
Examples:
Arithmetic: use `(( ))` instead of `expr`:

```bash
(( count++ ))
```

String length: `${#var}` instead of `wc -c`:

```bash
len=${#var}
```

Pattern matching: use `[[ ]]` with `=~` or globbing instead of `grep` for very simple checks:

```bash
if [[ $var == foo* ]]; then
    ...
fi
```

Avoid Forks in Loops
Bad:
```bash
for f in *; do
    size=$(stat -c '%s' "$f")
    echo "$f $size"
done
```

Better, where possible:
```bash
stat -c '%n %s' *   # single stat invocation
```

Or at least limit commands in the inner loop.
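For example, file sizes can be collected with one `stat` call instead of one fork per file. A sketch assuming GNU coreutils `stat` (filenames containing newlines would still confuse the line-based parsing):

```shell
# Create a couple of demo files, then collect "name -> size" with a
# single stat invocation instead of one fork per file.
demo_dir=$(mktemp -d)
printf 'abc'   > "$demo_dir/a.txt"
printf 'hello' > "$demo_dir/b.txt"

declare -A size_of
while read -r size name; do
    size_of[${name##*/}]=$size   # strip the directory part
done < <(stat -c '%s %n' "$demo_dir"/*)
```

Putting the size first (`%s %n`) lets `read` assign the remainder of the line, spaces included, to `name`.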
Packaging and Distribution of Bash Tools
Once a Bash tool is stable, you may want to distribute it.
Simple: Single Script Download
Host your script somewhere, then users can:
```bash
curl -fsSL https://example.com/mytool -o ~/.local/bin/mytool
chmod +x ~/.local/bin/mytool
```
Optionally add a `--version` option and a self-update mechanism (carefully).
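If you also publish a checksum, users can verify the download before installing it. The sketch below simulates the downloaded files locally; in practice both would come from your release URL:

```shell
# Simulate a "downloaded" script plus its published checksum
workdir=$(mktemp -d)
printf '#!/usr/bin/env bash\necho mytool 1.0\n' > "$workdir/mytool"
( cd "$workdir" && sha256sum mytool > mytool.sha256 )

# User side: refuse to install unless the checksum verifies
if ( cd "$workdir" && sha256sum -c --quiet mytool.sha256 ); then
    install -Dm755 "$workdir/mytool" "$workdir/bin/mytool"
fi
```

The same `sha256sum -c` check works after a real `curl` download; only the file paths change.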
Slightly More Advanced: Tarball or Git Repo
- Put your scripts in `bin/`
- Provide an `install.sh` that copies them into a prefix (e.g., `/usr/local` or `~/.local`)
Example `install.sh`:

```bash
#!/usr/bin/env bash
set -euo pipefail

prefix=${1:-/usr/local}
install -Dm755 bin/mytool "$prefix/bin/mytool"
echo "Installed to $prefix/bin/mytool"
```
Distribution as proper system packages (deb, rpm) is usually done with other tooling and is beyond the scope of this chapter.
Example: A Realistic Bash-Based Tool
To tie it together, here’s a simplified log-filtering tool.
Requirements:
- Read from a file or stdin
- Filter by log level (`INFO`, `WARN`, `ERROR`)
- Options: `-l`/`--level`, `-h`/`--help`
```bash
#!/usr/bin/env bash
set -euo pipefail

level=""
input_file=""

usage() {
    cat <<EOF
Usage: ${0##*/} [OPTIONS] [FILE]

Filter log lines by level.

Options:
  -l, --level LEVEL   Log level to match (e.g., INFO, WARN, ERROR)
  -h, --help          Show this help and exit

If FILE is omitted, read from standard input.
EOF
}

die() {
    echo "Error: $*" >&2
    exit 1
}

main() {
    # Parse options
    while [[ $# -gt 0 ]]; do
        case "$1" in
            -l|--level)
                [[ $# -ge 2 ]] || die "Missing value for $1"
                level=$2
                shift 2
                ;;
            -h|--help)
                usage
                exit 0
                ;;
            --)
                shift
                break
                ;;
            -*)
                die "Unknown option: $1"
                ;;
            *)
                input_file=$1
                shift
                ;;
        esac
    done

    [[ -n "$level" ]] || die "Log level is required (use -l LEVEL)"

    # Use grep with a precise pattern, reading from the file or stdin.
    # Example log format: "2025-01-01 12:00:00 [LEVEL] message"
    if [[ -n "$input_file" ]]; then
        [[ -r "$input_file" ]] || die "Cannot read: $input_file"
        grep -- "\[$level\]" "$input_file"
    else
        grep -- "\[$level\]" -
    fi
}

main "$@"
```

This tool:
- Behaves like a standard command
- Has help text and usage errors
- Cleanly parses options
- Reads from a file or stdin
- Uses existing tools (`grep`) for its core work
You can extend it with subcommands, configuration, or more complex filtering as needed.
Bash-based tools are at their best when they:
- Orchestrate existing commands
- Respect Unix conventions (stdout/stderr/exit codes)
- Are written defensively and portably
- Can be tested and maintained like any other piece of software
The patterns in this chapter are directly reusable for building your own serious Bash command-line tools.