Why Bash Is Ideal For Small Tools
Bash is already present on virtually every Linux system, which makes it an excellent language for small command line tools. It starts quickly, is good at running other programs, and is very strong at working with text, files, and the shell environment.
You will generally choose Bash tools when you want to glue existing commands together, automate repetitive shell tasks, or provide a small command that behaves like a normal Unix program. For more complex logic, heavy computation, or network APIs, other languages are often better, but Bash remains the most convenient first step for many utilities.
Bash is best used as “glue” around other commands, not as a general purpose language for large applications.
Designing Simple Command Line Interfaces in Bash
A Bash based tool should feel like any other Unix command. It should read from standard input where appropriate, write plain text to standard output, and print errors to standard error. It should accept arguments and options in a predictable way.
The simplest model is:
tool [options] [arguments]
For a first tool, you can write something like:
#!/usr/bin/env bash
name=$1
echo "Hello, $name"
Here the script reads its first argument from $1 and writes a greeting to standard output. Even in simple tools it is better to check the number of arguments and print a short usage message when they are missing, as in the sketch below. This makes your tool friendlier and safer to use.
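For example, a version of the greeting tool with such a check might look like this (a minimal sketch):
#!/usr/bin/env bash
# Require exactly one argument; otherwise print usage and fail
if [ "$#" -ne 1 ]; then
    echo "Usage: $0 NAME" >&2
    exit 1
fi
echo "Hello, $1"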
Handling Arguments and Options
Bash exposes positional arguments as $1, $2, and so on, and $# holds the number of arguments. You can build simple tools using these directly, but for options starting with - or -- it is better to use getopts or a manual loop.
A minimal pattern using positional arguments might be:
#!/usr/bin/env bash
if [ "$#" -lt 1 ]; then
echo "Usage: $0 FILE..." >&2
exit 1
fi
for file in "$@"; do
echo "Processing: $file"
# do something with "$file"
done
In this example, $@ expands to all arguments as separate words and preserves spaces correctly when quoted. This pattern is critical when creating tools that operate on multiple files or values.
To parse short options like -v or -f, you can use getopts:
#!/usr/bin/env bash
verbose=0
file=""
while getopts ":vf:" opt; do
    case "$opt" in
        v) verbose=1 ;;
        f) file=$OPTARG ;;
        \?)
            echo "Unknown option: -$OPTARG" >&2
            exit 1
            ;;
        :)
            echo "Option -$OPTARG requires an argument." >&2
            exit 1
            ;;
    esac
done
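# Discard the parsed options so "$1" now refers to the first positional argument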
shift $((OPTIND - 1))
[ "$verbose" -eq 1 ] && echo "Verbose mode on"
[ -n "$file" ] && echo "Using file: $file"
getopts works only with short options like -v and -f. For more advanced parsing or long options, many tools implement a while loop with a case statement on "$1", as sketched below, but for beginners getopts is a good starting point.
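For illustration, a minimal sketch of that manual pattern, using hypothetical -v/--verbose and -f/--file options:
verbose=0
file=""
while [ "$#" -gt 0 ]; do
    case "$1" in
        -v|--verbose)
            verbose=1
            ;;
        -f|--file)
            file=$2      # assumes a value follows the option
            shift
            ;;
        --)              # explicit end of options
            shift
            break
            ;;
        -*)
            echo "Unknown option: $1" >&2
            exit 1
            ;;
        *)               # first non option argument
            break
            ;;
    esac
    shift
done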
Always quote "$@" and "$1" style parameters to preserve spaces and avoid word splitting.
Writing Robust Bash Scripts
Turning a quick Bash one liner into a reliable tool requires a few extra practices:
First, include a proper shebang line, for example #!/usr/bin/env bash, so the system knows which interpreter to use when your script is executed directly.
Second, consider enabling some safety options to catch errors early. A commonly recommended set is:
#!/usr/bin/env bash
set -euo pipefail
set -e makes the script exit when a command fails (with some exceptions, such as commands tested in an if or while condition), set -u treats expansion of an unset variable as an error, and set -o pipefail makes a pipeline return failure if any command in it fails, not only the last one.
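A quick way to see what pipefail changes:
# Without pipefail a pipeline reports the status of its last command
bash -c 'false | true; echo $?'                    # prints 0
# With pipefail the failing command fails the whole pipeline
bash -c 'set -o pipefail; false | true; echo $?'   # prints 1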
You should also define a simple structure that separates configuration, functions, and main logic. For example:
#!/usr/bin/env bash
set -euo pipefail
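# ${0##*/} strips everything up to the last /, leaving the bare script name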
script_name=${0##*/}
usage() {
    echo "Usage: $script_name [options] FILE..." >&2
    exit 1
}

main() {
    if [ "$#" -lt 1 ]; then
        usage
    fi
    for file in "$@"; do
        echo "Handling file: $file"
    done
}
main "$@"This pattern keeps your code easier to read and test, and prevents accidental use of global state in large scripts.
Input and Output Patterns for Tools
A good Bash tool should participate naturally in Unix pipelines. Instead of always requiring filenames, you can read from standard input, so that other commands can feed data to your tool.
To read line by line from standard input:
while IFS= read -r line; do
    # process "$line"
    echo "Saw: $line"
done
Setting IFS to an empty value preserves leading and trailing whitespace, and -r prevents read from treating backslashes as escape characters.
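As a small self contained use of this loop, here is a sketch of a line numbering filter:
#!/usr/bin/env bash
# Number each line of standard input without altering the text
n=0
while IFS= read -r line; do
    n=$((n + 1))
    printf '%6d  %s\n' "$n" "$line"
done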
You can provide a dual mode that accepts files as arguments but falls back to standard input when no file is specified:
process_stream() {
    while IFS= read -r line; do
        # ${line^^} upper-cases the line with a Bash parameter
        # expansion, avoiding a new process for every line
        printf '%s\n' "${line^^}"
    done
}

if [ "$#" -gt 0 ]; then
    for file in "$@"; do
        process_stream < "$file"
    done
else
    process_stream
fi
Your tool should write normal results to standard output and only errors or warnings to standard error. To send a message to standard error, redirect with >&2:
echo "Error: file not found" >&2This separation lets users redirect output to files without capturing error messages.
Reusing Commands and Building Pipelines
Bash based tools are most powerful when they reuse existing commands. Rather than reimplement searching, counting, or text formatting, let your script orchestrate tools like grep, awk, sed, sort, or system utilities.
For instance, you might write a simple tool that shows the top 5 largest files in a directory:
#!/usr/bin/env bash
set -euo pipefail
dir=${1:-.}
find "$dir" -type f -printf '%s\t%p\n' \
| sort -nr \
| head -n 5
Here the primary value is in combining find, sort, and head in a reproducible way, and giving the user a name for that combination. Note that the -printf action is a GNU find extension, so this tool will not work unchanged on systems with a non GNU find.
When building pipelines, keep them readable by formatting them over multiple lines and aligning the pipe characters. Long one liners are hard to maintain when turned into tools.
Error Handling and Exit Codes
Every Unix command returns an integer status code, available in Bash as $? immediately after the command runs. A code of 0 means success and any nonzero value means an error.
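You can observe this directly in a shell session:
# grep exits 0 when a match is found and 1 when it is not
grep -q root /etc/passwd
echo "status: $?"    # 0 on a typical system
grep -q nosuchuser99 /etc/passwd
echo "status: $?"    # 1, no match found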
Your tools should follow this convention:
- Return 0 on success.
- Return a nonzero exit status on errors.
- Use different error codes for different failure types when it helps.
You can set an exit status explicitly with exit N at the end of your script or when you detect an error. For instance:
if [ ! -f "$file" ]; then
echo "Error: $file does not exist" >&2
exit 2
fi
Even when using set -e, it is often useful to check conditions manually and print clear messages before exiting. This gives the user a better understanding of what went wrong.
Always ensure that your Bash tool exits with a nonzero code when it fails so that scripts and automation can detect the failure.
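As an illustration, a calling script might distinguish failure types by status, here reusing the hypothetical mytool name and the exit code 2 from the example above:
mytool input.txt
status=$?
case "$status" in
    0) echo "mytool succeeded" ;;
    2) echo "input file missing" >&2 ;;
    *) echo "mytool failed with status $status" >&2 ;;
esac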
Portability and Shell Features
If you intend your Bash based tool to work on many systems, you should be careful about which features you use. Some features belong to Bash specifically, others to more generic POSIX shells.
Bash arrays, associative arrays, and certain parameter expansions are not available in strictly POSIX shells such as sh. Since this chapter focuses on Bash based tools, it is acceptable to use Bash features, but you should keep the shebang pointing to Bash and avoid relying on external behavior that may differ between Linux distributions.
For example, indexed arrays are often useful in tools:
files=()
for arg in "$@"; do
    files+=("$arg")
done
echo "Number of files: ${#files[@]}"

If you only need simple parameter substitution and loops, you may choose to stick to more portable constructs, but for typical Linux focused tools, Bash is a reasonable baseline.
Organizing and Distributing Bash Tools
Once you have written a Bash tool, you should make it easy to install and run. First give the script a short descriptive name with no spaces, such as logtail, cleanup-tmp, or backup-home.
Set the executable bit:
chmod +x mytool
Then place it in a directory that is part of your PATH, for example ~/bin or /usr/local/bin if you have the required permissions. After that you can run the tool by name from any directory.
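For example, assuming the tool is named mytool and ~/bin is used:
mkdir -p ~/bin
mv mytool ~/bin/
# Make sure ~/bin is on PATH; add this line to ~/.bashrc if it is not
export PATH="$HOME/bin:$PATH"
# The tool can now be run by name from any directory
mytool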
It is also helpful to include a brief usage message and a --help option:
usage() {
cat <<EOF
Usage: $0 [options] FILE...
Options:
-h, --help Show this help message
EOF
}
if [ "${1-}" = "-h" ] || [ "${1-}" = "--help" ]; then
usage
exit 0
fiFor shared environments, consider placing common tools in a version controlled repository so others can review and improve them. This is a simple but effective way to build a library of small, dependable Bash based tools across a team or organization.