Overview
Part II is your transition from being a graphical-desktop Linux user to someone who can actually drive the system through the command line.
In this part, you’ll learn how to:
- Open and use a terminal and shell confidently.
- Navigate the filesystem and work with files without a file manager.
- Inspect and edit text files (configs, logs, scripts) entirely from the terminal.
- Manage basic security through users and permissions.
- Install and update software using package managers.
- View and control running programs (processes and jobs).
- Automate repetitive tasks with simple shell scripts.
Each of these topics has its own chapter, and they build on each other. This chapter explains how they fit together and what mindset to adopt as you go through them.
Why the Command Line is Central in Linux
The command line is not just a “power user extra” in Linux. It is:
- The primary way to work on servers (often no GUI at all).
- The most flexible way to administer desktop systems.
- The main interface for many development tools, automation, and DevOps workflows.
Some practical advantages:
- Precision: Commands are explicit, scriptable, and repeatable.
- Reach: You can do almost anything from a shell, even when no graphical tools are available.
- Automation: Anything you can type, you can usually put into a script.
- Remote access: Tools like SSH let you run shell commands on other machines easily.
You don’t need to become a shell guru overnight. Part II’s goal is to make the command line feel natural and useful, not intimidating.
What This Part Assumes
Part II assumes that you:
- Have a Linux system installed (real or virtual).
- Can log in and reach a desktop or basic console.
- Understand at a high level what Linux is and where it’s used (from Part I).
- Have seen the Linux filesystem basics conceptually (from the Filesystem chapters in Part I).
Here, you’ll focus on doing things—actually using those ideas through commands.
How the Chapters in Part II Fit Together
Each chapter addresses one layer of command-line skills. You’ll revisit some tools in later parts with more depth, but you’ll get a practical foundation here.
Terminal and Shell Basics
You start by getting comfortable with the basic environment:
- Opening a terminal and understanding what it shows (prompt, current directory, user).
- Understanding the shell as the program that reads your commands.
- Seeing the differences between common shells (Bash, Zsh, Fish), but only as much as you need to pick one and use it.
- Learning command structure:
- Command name
- Options/switches (like -l or --help)
- Arguments (like file names and paths)
- Using environment variables (like PATH, HOME) to influence command behavior.
- Getting help directly from the command line (man, --help, info).
- Using history and autocomplete to save time and avoid retyping.
Everything else in Part II assumes you’re comfortable typing commands, repeating them from history, and looking up their usage.
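To make this concrete, here is a small sketch of a typical command line and the help tools around it (ls is used purely as an example command):

```bash
# Command structure: name, then options, then arguments.
ls -l --human-readable /var/log   # ls = command, -l / --human-readable = options, /var/log = argument

# Built-in help, in roughly increasing depth:
ls --help    # short usage summary printed by the command itself
man ls       # full manual page

# Environment variables influence behavior; PATH lists where the shell looks for commands:
echo "$PATH"
```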
Files and Directories
Once you know how to enter commands, you apply that to the filesystem:
- Listing files in different levels of detail.
- Moving around with cd using both absolute and relative paths.
- Creating, copying, moving, renaming, and deleting files and directories.
- Using wildcards (globbing) such as *, ?, and character ranges to select multiple files.
- Searching for files and directories from the command line.
This chapter is about replacing “clicking around in a file manager” with text-based navigation and manipulation—an essential base for nearly everything else.
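As a taste of what this looks like in practice, here is a short, safe sequence you could run in a throwaway directory (all file and directory names are placeholders):

```bash
# Create a disposable practice directory in your home directory and move into it:
mkdir -p ~/practice && cd ~/practice

# Create, copy, rename, and delete files:
touch notes.txt
cp notes.txt backup.txt
mv backup.txt old-notes.txt
rm old-notes.txt

# Wildcards (globbing) select multiple files at once:
touch report1.txt report2.txt summary.log
ls report*.txt        # matches report1.txt and report2.txt

# Search by name below the current directory:
find . -name '*.log'
```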
Viewing and Editing Text Files
Almost all important Linux artifacts are plain text:
- Configuration files
- Log files
- Scripts
- Many data files
Here you learn to:
- View file contents without changing them (for quick inspection).
- Edit files with a simple terminal editor (nano) suitable for beginners.
- Get a basic feel for vim, so you’re not lost when it appears on a server.
- Use input/output redirection and pipelines:
- Send command output to files.
- Read input from files.
- Chain commands so the output of one becomes the input of another.
Redirection and pipelines are where the “Lego brick” design of Unix-like systems first becomes visible: you combine small tools to solve bigger problems.
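For example, a few minimal redirection and pipeline sketches (listing.txt is an invented file name):

```bash
# Send command output to a file (> overwrites, >> appends):
ls -l /etc > listing.txt
echo "one more line" >> listing.txt

# Read input from a file:
wc -l < listing.txt

# Chain commands with a pipe: entries in /etc whose names contain "conf", counted
ls /etc | grep conf | wc -l

# Quick read-only inspection without opening an editor:
head -n 5 listing.txt
```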
Users, Permissions, and Ownership
Here you see how Linux enforces basic security and separation between users through the command line:
- How your user and group identities affect what you can access.
- How permissions (r, w, x) control reading, writing, and executing.
- Changing permissions (chmod) and ownership (chown) when appropriate.
- Using sudo to run specific commands with elevated privileges instead of always being root.
This chapter is very practical: you’ll understand why “Permission denied” appears and how to solve it correctly without simply trying to “force it.”
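A brief sketch of these commands in action (myscript.sh and the user name alice are placeholders; /etc/shadow is used only because regular users normally cannot read it):

```bash
# Inspect permissions and ownership:
touch myscript.sh
ls -l myscript.sh

# Make the file executable for its owner (u = user, +x = add execute permission):
chmod u+x myscript.sh

# Reading a root-only file as a normal user fails...
cat /etc/shadow        # -> "Permission denied"
# ...but succeeds with elevated privileges:
sudo cat /etc/shadow

# Changing ownership usually requires elevated privileges; "alice" is an example user:
sudo chown alice myscript.sh
```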
Package Management
This chapter is about software installation and updates from the terminal:
- How your distribution uses repositories of signed packages.
- Using:
- APT on Debian/Ubuntu-based systems,
- DNF on Fedora/RHEL-based systems,
- Pacman on Arch-based systems.
- Installing and removing applications and libraries.
- Updating packages and performing system upgrades.
- Getting introduced to Snap, Flatpak, and AppImage as alternative packaging methods.
You’ll see the core patterns are similar across tools: search, install, remove, update, and list installed packages.
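For example, installing and removing a small utility (htop, chosen purely as an example package) looks very similar across the three families:

```bash
# Debian/Ubuntu (APT)
sudo apt update          # refresh package lists
sudo apt install htop
sudo apt remove htop

# Fedora/RHEL (DNF)
sudo dnf install htop
sudo dnf remove htop

# Arch (Pacman)
sudo pacman -Syu         # sync databases and update the system
sudo pacman -S htop
sudo pacman -R htop
```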
Working with Processes
Here you connect the idea of “running programs” with how they appear to the system:
- Seeing processes with tools that show IDs, resource usage, and hierarchy.
- Stopping or killing misbehaving processes.
- Running jobs in the foreground and background, and switching between them.
- Understanding, at a basic level, long-running background processes (services and daemons) and how to check that they’re alive.
These skills help you deal with hung programs and high CPU usage, and verify that something is actually running.
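A rough sketch of these tools (the PID 12345 and the service name cron are placeholders that will differ on your system):

```bash
# Snapshot of running processes, filtered by name:
ps aux | grep firefox

# Ask a process to terminate, or force it as a last resort:
kill 12345        # sends SIGTERM (polite request to exit)
kill -9 12345     # sends SIGKILL (forceful, cannot be ignored)

# Job control: start a command in the background, list jobs, bring one to the foreground:
sleep 300 &
jobs
fg %1

# Check whether a background service is running (service names vary by distribution):
systemctl status cron
```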
Introduction to Shell Scripting
Finally, you take your interactive skills and turn them into automation:
- Writing your first shell scripts and making them executable.
- Using shell variables to avoid hard-coding values.
- Adding conditionals (if/else) so scripts react to different situations.
- Adding loops to repeat tasks over lists of items or ranges.
- Grouping operations into simple functions for reuse.
The emphasis here is on small, practical scripts you can understand and adjust, not on building large software projects. This prepares you for more advanced scripting later while already making your daily work easier.
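To give a feel for the scale involved, here is a minimal sketch of such a script; the file names and paths are invented for the example:

```bash
#!/usr/bin/env bash
# backup-notes.sh -- copy all .txt notes into a backup directory (illustrative paths)

src="$HOME/notes"
dest="$HOME/notes-backup"

# A function groups a few related commands for reuse.
backup_file() {
    cp "$1" "$dest/"
    echo "Backed up: $1"
}

# Conditional: only continue if the source directory exists.
if [ -d "$src" ]; then
    mkdir -p "$dest"
    # Loop over every .txt file in the source directory.
    for file in "$src"/*.txt; do
        [ -e "$file" ] || continue    # skip if the glob matched nothing
        backup_file "$file"
    done
else
    echo "No $src directory found." >&2
    exit 1
fi
```

You would make it executable with chmod +x backup-notes.sh and run it as ./backup-notes.sh; the scripting chapter builds up to exactly this kind of script step by step.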
How to Work Through Part II
To get real value from this part:
- Follow along on a real system (physical or virtual). Reading alone is not enough.
- Type commands yourself instead of copy-pasting. It builds muscle memory and understanding.
- Experiment in safe areas:
- Use your home directory for file experiments.
- Create disposable test files.
- Break things (safely) and fix them:
- Run commands, see what happens if you omit an argument or option.
- Use the error messages as a learning tool.
- Use built-in help constantly:
- Try command --help and man command whenever you see a new tool.
- Skim the examples and options; you don’t have to memorize everything.
What You Should Be Able to Do After Part II
After finishing this part, you should be able to:
- Log into a Linux system and work in a terminal without relying on the GUI for routine tasks.
- Navigate directories and manage files efficiently with commands.
- Read and edit basic configuration files and logs.
- Install, update, and remove software using your distribution’s package tools.
- Check which programs are running, stop them if needed, and understand job control.
- Write and run simple shell scripts that automate multi-step tasks.
From here, Part III will build on these skills, introducing more systematic administration tasks (services, storage, networking), but your core way of interacting with Linux will remain the command line you learned in this part.