Overview
This chapter is about building your own Linux command‑line tools, not about using existing ones. You’ll see patterns, interfaces, and conventions that make a tool feel “native” to Linux, regardless of whether you implement it in Bash, Python, C, or another language.
The next three subchapters go into Bash, Python, and C specifically. Here we’ll focus on:
- What makes something a “Linux tool”
- Designing good command‑line interfaces (CLI)
- Unix philosophy and composability
- Interacting with the OS (files, processes, environment)
- Packaging, installation, and distribution basics
- Testing, documenting, and maintaining tools
Use this chapter as a conceptual framework; the language-specific chapters focus on the “how”.
What Makes a “Linux Tool”?
A “Linux tool” is typically:
- A command‑line program
- Invoked by name from a shell: `mytool [options] [args]`
- Communicating via:
- Arguments and options (input on the command line)
- Standard streams: stdin, stdout, stderr
- Exit status: success/failure and sometimes richer status
- Working well in pipelines with other commands
- Respecting filesystem and environment conventions
If your program:
- Reads input from stdin (unless files are specified)
- Writes machine‑parseable output to stdout
- Writes diagnostics to stderr
- Returns meaningful exit codes
…then it can be easily scripted, automated, and combined with the rest of the Unix toolbox.
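Put together, a minimal Python filter that follows these conventions might look like the sketch below (the tool name `upperlines` and its behavior are invented for illustration):

```python
import sys
from typing import Iterable, Iterator


def transform(lines: Iterable[str]) -> Iterator[str]:
    """Core logic, kept free of I/O so it is easy to test and reuse."""
    for line in lines:
        yield line.upper()


def main() -> int:
    try:
        for out in transform(sys.stdin):   # read stdin by default
            sys.stdout.write(out)          # machine-parseable output -> stdout
    except OSError as exc:
        print(f"upperlines: {exc}", file=sys.stderr)  # diagnostics -> stderr
        return 1                           # meaningful exit code
    return 0

# The entry point would be: sys.exit(main())
```

Because the transformation is a plain function over lines, the tool composes naturally in pipelines and the logic can be tested without touching the streams at all.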
Unix Philosophy and Design Principles
Linux tool design is strongly influenced by classic Unix ideas:
- Do one thing well
  Avoid “Swiss Army knife” tools that try to do everything. Smaller tools are easier to test and compose.
- Text is a universal interface
  Prefer plain text or line‑oriented output unless there’s a strong reason for binary formats.
- Compose tools
  Design your tool so it works nicely with `|`, redirection, and other tools:
  - Take input from stdin when appropriate
  - Allow output to be filtered (`grep`, `awk`, `jq`, etc.)
- Be predictable and scriptable
  Avoid interactivity unless specifically requested (`-i`, `--interactive`); non‑interactive mode should be stable.
- Don’t surprise the user
  Follow common conventions (`-h`/`--help`, `-v`/`--version`, quiet vs verbose modes, exit codes, etc.).
CLI Design Fundamentals
Command Format
Typical structure:
`tool [global-options] [subcommand] [subcommand-options] [arguments]`
Examples:
- Simple: `mylogfilter -i /var/log/syslog -l warn`
- Git‑style: `mytool config set key value`
Subcommands are useful when your tool has distinct modes that each feel like a separate command: `git commit`, `docker run`, etc.
Options and Arguments
General conventions:
- Short options: single dash, single letter: `-h`, `-v`, `-f`
- Long options: double dash, descriptive words: `--help`, `--verbose`, `--file`
- Short options can often be combined: `-avz`
- Arguments (non‑options) typically specify files, targets, or other primary inputs
Patterns:
- `-h`, `--help`: usage summary
- `-V`, `--version`: version information
- `-q`, `--quiet`: less output
- `-v`, `--verbose`: more output; can be repeated (`-vv`)
Decide which arguments are required and enforce that clearly with good error messages.
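In Python, `argparse` handles these conventions almost for free (`-h`/`--help` is added automatically, short options combine, and counting flags supports `-vv`). A sketch, using a hypothetical `mylogfilter` tool:

```python
import argparse


def build_parser() -> argparse.ArgumentParser:
    """Build a CLI following the short/long option conventions above."""
    parser = argparse.ArgumentParser(prog="mylogfilter")
    parser.add_argument("-V", "--version", action="version",
                        version="mylogfilter 1.0")
    parser.add_argument("-q", "--quiet", action="store_true",
                        help="less output")
    parser.add_argument("-v", "--verbose", action="count", default=0,
                        help="more output; repeatable (-vv)")
    parser.add_argument("-l", "--level", default="info",
                        help="minimum log level to keep")
    # Non-option arguments: the primary inputs
    parser.add_argument("files", nargs="*", help="input files (default: stdin)")
    return parser
```

Invoking `mylogfilter -vv -l warn app.log` would then yield `verbose == 2`, `level == "warn"`, and `files == ["app.log"]`, and invalid options produce a usage error automatically.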
Exit Codes
The shell stores the last command’s exit status in the special parameter `$?`; you can inspect it with `echo $?`.
Conventions:
- `0`: success
- `1`: generic error / failure
- `2`: misuse of shell builtins or CLI syntax error (used by some tools for invalid options)
- `126`: command found but not executable
- `127`: command not found
For your own tools, define and document your own mapping, e.g.:
- `0`: success
- `1`: runtime failure (file not found, network error, etc.)
- `2`: invalid arguments / usage error
- `3`: configuration problem
- `4`: external dependency failure
The key is consistency and documentation so scripts can react programmatically.
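One way to keep such a mapping consistent is to define it once and have the core logic return codes instead of calling `sys.exit` deep inside functions. A sketch (the constants and the `run` helper are hypothetical):

```python
import sys

# Documented exit-code mapping, defined in one place.
EXIT_OK = 0        # success
EXIT_RUNTIME = 1   # runtime failure (file not found, etc.)
EXIT_USAGE = 2     # invalid arguments / usage error
EXIT_CONFIG = 3    # configuration problem


def run(path: str) -> int:
    """Return an exit code rather than exiting; only the entry point exits."""
    if not path:
        print("mytool: missing input path", file=sys.stderr)
        return EXIT_USAGE
    try:
        with open(path) as fh:
            fh.read()
    except OSError as exc:
        print(f"mytool: {exc}", file=sys.stderr)
        return EXIT_RUNTIME
    return EXIT_OK

# Entry point would be: sys.exit(run(path_from_args))
```

Scripts calling the tool can then branch on `$?` reliably, e.g. treating `2` as “fix the invocation” and `1` as “retry later”.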
Standard Streams
Tools should use the three standard streams as intended:
- stdin (file descriptor 0)
  Default input source. Use when:
  - No input files are specified
  - The tool is part of a pipeline
- stdout (fd 1)
  Normal/programmatic output. Anything another program may parse should go here.
- stderr (fd 2)
  Error messages, warnings, progress output. This allows users to redirect them separately, e.g.:
  `mytool data.txt >out.txt 2>errors.log`

Guidelines:
- Do not mix extra “user‑friendly” chatter into stdout if you expect scripts to parse it.
- If you show progress bars or spinners, send them to stderr.
Working with the Shell and Environment
CLI tools interact heavily with the process environment.
Environment Variables
Environment variables are a common way to configure tools non‑interactively:
- Basic reading: `getenv("HOME")` (C), `os.environ["HOME"]` (Python), `$HOME` in shell.
- Patterns:
  - `TOOL_CONFIG=/path/to/config mytool`
  - `HTTP_PROXY`, `NO_PROXY`, etc.
  - `LC_ALL`, `LANG` for locale behavior
Best practices:
- Provide sensible defaults when an env var is absent.
- Allow environment configuration but don’t rely exclusively on it:
- Options > env vars > built‑in defaults
- Document the variables your tool respects.
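The “options > env vars > built‑in defaults” precedence can be captured in a small helper; here is a sketch where `TOOL_CONFIG` and the default path are hypothetical:

```python
import os


def effective_config(cli_value, env=os.environ):
    """Resolve a setting with the precedence: CLI option > env var > default."""
    if cli_value is not None:            # 1. an explicit option always wins
        return cli_value
    if "TOOL_CONFIG" in env:             # 2. then the environment
        return env["TOOL_CONFIG"]
    return "/etc/mytool/config"          # 3. finally a sensible built-in default
```

Passing `env` as a parameter (defaulting to `os.environ`) also makes the resolution trivially testable without mutating the real environment.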
Signals and Process Behavior
Well‑behaved tools:
- Respond to `SIGINT` (Ctrl‑C) for clean interruption.
- Clean up temporary files/directories on exit (including on signal if possible).
- Avoid ignoring `SIGTERM` unless there’s a specific reason; that breaks graceful shutdown.
For long‑running tools (daemons, servers) this becomes even more important, but even short CLI tools benefit from handling interruption cleanly.
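A sketch of this pattern in Python, combining `atexit` for normal exit with signal handlers for `SIGINT`/`SIGTERM` (the `setup_cleanup` helper is illustrative):

```python
import atexit
import os
import signal
import sys
import tempfile


def setup_cleanup() -> str:
    """Create a temp file and arrange for its removal on exit or signal."""
    fd, path = tempfile.mkstemp(prefix="mytool.")
    os.close(fd)

    def cleanup():
        try:
            os.unlink(path)
        except FileNotFoundError:
            pass  # already removed; cleanup must be idempotent

    def on_signal(signum, frame):
        cleanup()
        sys.exit(128 + signum)  # conventional "terminated by signal" code

    atexit.register(cleanup)                      # normal exit
    signal.signal(signal.SIGINT, on_signal)       # Ctrl-C
    signal.signal(signal.SIGTERM, on_signal)      # graceful shutdown request
    return path
```

Exiting with `128 + signum` mirrors what shells report for signal-killed processes, so scripts observing `$?` see a familiar value.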
Input and Output Patterns
Supporting Both Files and stdin
A classic Unix pattern: if no files are supplied, read stdin; if files are supplied, process them; a single - argument often means “stdin”:
- `mytool`: read from stdin
- `mytool file1 file2`: read those files
- `mytool -`: read stdin explicitly
This makes your tool more flexible in pipelines and scripts.
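Python’s standard `fileinput` module implements exactly this pattern, including treating a bare `-` as stdin. A minimal sketch:

```python
import fileinput
from typing import Iterator, List


def read_lines(paths: List[str]) -> Iterator[str]:
    """Yield lines from the given files, or from stdin if none are given.

    fileinput already maps '-' to stdin, so all three invocation styles
    (no args, file args, explicit '-') work with one code path.
    """
    with fileinput.input(files=paths or ("-",)) as stream:
        yield from stream
```

With this in place, `mytool`, `mytool file1 file2`, and `mytool -` all funnel into the same line iterator.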
Output Format
You’ll often choose between:
- Human‑oriented output:
- Aligned columns, colors, explanatory text
- Machine‑oriented output:
- Plain lines, CSV/TSV, JSON/YAML, etc.
Consider providing a switch, e.g.:
- `--json` for structured output
- `--no-color` or `--plain` for uncolored output
- `--quiet` for minimal or no normal output (just exit codes)
When supporting colors:
- Use ANSI escape codes or a helper library.
- Provide `--no-color`, and also automatically disable colors if stdout is not a TTY (commonly detected via `isatty(1)`).
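A sketch of that decision in Python; the helper names are illustrative, and the check for the `NO_COLOR` environment variable is an additional common convention:

```python
import os
import sys


def use_color(no_color_flag: bool = False, stream=sys.stdout) -> bool:
    """Color only when not disabled and when writing to a real terminal."""
    if no_color_flag or os.environ.get("NO_COLOR"):
        return False
    return hasattr(stream, "isatty") and stream.isatty()


def red(text: str, enabled: bool) -> str:
    """Wrap text in an ANSI red escape sequence when color is enabled."""
    return f"\033[31m{text}\033[0m" if enabled else text
```

Because the TTY check keys off the actual output stream, `mytool | less` and `mytool > file` automatically get plain text while interactive runs stay colored.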
Filesystem and Path Conventions
Where to Install Tools
Common locations (depending on privilege and packaging):
- System‑wide:
  - `/usr/bin`: primary executables for users
  - `/usr/local/bin`: locally installed tools not managed by the distro
- Per‑user:
  - `~/bin` or `~/.local/bin` for user‑level tools
Ensure that the directory where you install the tool is in `PATH`. For user tools, you often instruct the user to add:
`export PATH="$HOME/.local/bin:$PATH"`
Config Files and State
Follow standard locations; for single‑user tools:
- Config (XDG spec, if applicable):
  `$XDG_CONFIG_HOME/mytool/config` or `~/.config/mytool/config`
- State files:
  `$XDG_STATE_HOME/mytool/` or `~/.local/state/mytool/`
- Cache:
  `$XDG_CACHE_HOME/mytool/` or `~/.cache/mytool/`
Avoid writing config or caches directly into $HOME unless you have to, and definitely don’t clutter the working directory unexpectedly.
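Resolving these locations follows a simple rule: use the `XDG_*` variable when set, else fall back to the spec’s default under `$HOME`. A sketch for the config directory (the tool name is hypothetical):

```python
import os
from pathlib import Path


def config_dir(tool: str, env=os.environ) -> Path:
    """Per-tool config directory following the XDG Base Directory spec.

    Falls back to ~/.config when XDG_CONFIG_HOME is unset, as the spec requires.
    """
    base = env.get("XDG_CONFIG_HOME") or os.path.join(env.get("HOME", ""), ".config")
    return Path(base) / tool
```

The same pattern applies to `XDG_STATE_HOME` (default `~/.local/state`) and `XDG_CACHE_HOME` (default `~/.cache`).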
Using Temporary Files
Use appropriate temp directories:
- System‑wide: `/tmp`, sometimes `/var/tmp`
- Respect `$TMPDIR` if set
- Use unique names (suffix with PID or use secure library functions)
Always ensure that temp files are cleaned up on normal exit, and attempt cleanup on interrupts.
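In Python, the `tempfile` module covers all three points: it consults `$TMPDIR`, generates unique names securely, and `NamedTemporaryFile` deletes the file when closed. A sketch:

```python
import tempfile


def write_scratch(data: bytes) -> bytes:
    """Round-trip data through a temp file that cleans itself up.

    tempfile honors $TMPDIR for the location and picks a unique, securely
    created name; the file is removed automatically when the block exits.
    """
    with tempfile.NamedTemporaryFile(prefix="mytool.") as tmp:
        tmp.write(data)
        tmp.flush()
        tmp.seek(0)
        return tmp.read()
```

For cleanup on interrupts, combine this with signal handling as shown earlier in the chapter; the context manager alone only covers normal control flow and exceptions.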
Packaging and Distribution Overview
Language‑specific details belong in later subchapters; here are general strategies.
Simple Script Tools
For small shell or Python tools:
- Put the script in `~/bin` or `~/.local/bin`
- Start with a proper shebang, e.g. `#!/usr/bin/env bash` or `#!/usr/bin/env python3`
- Make it executable: `chmod +x mytool`
This is enough for personal use or simple shared environments.
System Packages
For wider distribution, you usually want distro‑native packages:
- `deb` for Debian/Ubuntu
- `rpm` for Fedora/RHEL/openSUSE
- `pkg.tar.zst` for Arch (via PKGBUILD)
Even if you won’t become a packaging expert, designing your tool to:
- Install into standard directories
- Not require absolute paths hardcoded inside
- Defer to system paths and environment variables
…makes it easier to package later.
Language‑Specific Packaging
Depending on the language:
- Python: publish on PyPI (`pip install mytool`), use console scripts entry points.
- Go/Rust: produce a static binary to distribute; optionally use language package registries too.
- C: install via `make install`, optionally create distro packages.
The key design aspect: don’t assume the project’s source directory layout at runtime. Tools should work when installed system‑wide.
Testing Linux Tools
Automated Tests
Unit and integration tests make tools reliable and safer to change. Common patterns:
- Unit tests: test core logic with in‑memory data, no filesystem or network
- Integration tests:
- Run the tool in a temporary directory
- Provide sample files
- Capture stdout, stderr, and exit status
- Assert on output and side effects
Make your program test‑friendly by:
- Separating core logic from CLI parsing and I/O glue.
- Having functions that accept strings or file‑like objects rather than always reading from global state.
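For instance, a log filter whose core accepts plain strings can be unit-tested with no filesystem, subprocess, or stdin at all. A sketch (the level names and line format are illustrative):

```python
from typing import Iterable, List

# Illustrative severity ordering for a hypothetical "level: message" format.
LEVELS = {"debug": 0, "info": 1, "warn": 2, "error": 3}


def filter_level(lines: Iterable[str], min_level: str) -> List[str]:
    """Keep lines whose leading 'level:' tag is at or above min_level.

    Takes and returns plain strings, so tests pass lists in directly;
    the CLI layer is responsible for wiring stdin/files to this function.
    """
    threshold = LEVELS[min_level]
    return [ln for ln in lines
            if LEVELS.get(ln.split(":", 1)[0], 0) >= threshold]
```

The CLI glue then shrinks to argument parsing plus one call into this function, which is exactly the part integration tests exercise separately.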
Golden Files
For text‑processing tools, it’s common to use golden files:
- Prepare input sample files and expected output files.
- Tests run your tool, compare actual output to expected output.
- When behavior legitimately changes, update the golden files deliberately.
This is especially useful for complex output formatting or transforms.
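A golden-file check can be a few lines around `subprocess.run`; this sketch assumes the tool is invocable as a command list and that stdout is the output under test:

```python
import pathlib
import subprocess


def check_golden(tool_cmd, input_path, golden_path):
    """Run the tool on a sample input and compare stdout to the golden file."""
    result = subprocess.run(
        tool_cmd + [str(input_path)],
        capture_output=True, text=True, check=True,  # fail loudly on nonzero exit
    )
    expected = pathlib.Path(golden_path).read_text()
    assert result.stdout == expected, "output drifted from golden file"
```

Updating a golden file is then a deliberate act: rerun the tool, inspect the diff, and overwrite the expected file only when the change is intended.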
Documentation and Discoverability
Inline Help (`-h` / `--help`)
Good tools provide a concise, readable help message that shows:
- Usage line(s)
- Options and their descriptions
- Subcommands (if any)
- Examples
Patterns:
- `tool -h` → short usage
- `tool --help` → longer, detailed help (sometimes with multiple sections)
Keep it synchronized with reality; ideally, generate it or centralize it so changes in options don’t drift from documentation.
Man Pages and `--version`
For tools intended for serious use on Linux:
- Provide a `tool(1)` man page describing:
  - Synopsis
  - Description
  - Options
  - Exit codes
  - Files used
  - Environment variables
- Implement `--version` with at least:
  - Version number
  - Build info or commit hash (for debugging)
  - Maybe OS/architecture if appropriate
This makes your tool feel native to the Linux ecosystem.
Patterns for “Good Citizen” Tools
Some concrete design patterns that make tools more robust and Linux‑friendly:
- Idempotence for mutating tools:
  - Running the same command twice in a row should not cause unexpected side effects.
- Dry‑run modes (`--dry-run`):
  - Show what you would do without actually doing it (especially for destructive operations).
- Confirmation prompts for destruction:
  - Either `--force`/`-f` to skip prompts or an interactive `--interactive` mode.
- Safe default behavior:
  - For example, refuse to overwrite important files unless explicitly told, or create backups.
- Locale and encoding awareness:
  - Know how your tool behaves when `LANG` and related variables change.
  - Avoid assuming ASCII; handle UTF‑8 sensibly whenever possible.
Choosing the Right Implementation Approach
The next subchapters are language‑specific. From a design perspective:
- Use shell scripts when:
- You’re orchestrating other commands
- Performance requirements are low
- Portability across distros is key
- Use Python when:
- You need complex logic, parsing, HTTP, etc.
- You want fast development and good readability
- Use C (or a compiled language) when:
- You need maximum performance
- You require low‑level system access
- You want minimal runtime dependencies
Architecturally, try to:
- Keep the CLI, parsing, and I/O at the edges
- Put the core logic in reusable functions/libraries
- Make it possible in the future to:
- Wrap the tool with another interface (e.g. a GUI or an API)
- Use the core logic from other programs
Putting It All Together
When you design a new Linux tool, ask:
- How is it invoked, and does the syntax follow common CLI patterns?
- How does it behave in pipelines and with redirections?
- Are exit codes meaningful and documented?
- Are stdout and stderr used appropriately?
- Are environment variables and configuration handled cleanly?
- Is installation and path handling consistent with Linux conventions?
- Is it testable and documented in a way Linux users expect?
The rest of this section (Bash, Python, C) will focus on the specifics of implementing such tools. Use this chapter as your checklist to ensure whatever you build feels at home on a Linux system.