Pipes and Chaining Commands
Pipes connect commands together, allowing the output of one command to become the input of another. This is one of Linux's most powerful features.
The Pipe Operator: |
command1 | command2
The output of command1 becomes the input of command2.
How Pipes Work
command1 → output → pipe → input → command2
Example:
cat file.txt | grep "error" | wc -l
- cat outputs the file contents
- grep receives them and filters for lines containing "error"
- wc -l counts the filtered lines
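The same three-stage flow can be tried without creating a file, using printf (assumed here purely to generate sample input):

```shell
# printf emits three lines; grep keeps the two containing "error";
# wc -l counts those matching lines
printf "error: disk full\nok\nerror: timeout\n" | grep "error" | wc -l
```

This prints 2 (some wc implementations pad the number with leading spaces).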
Common Pipe Patterns
Filter Output
ls -la | grep "\.txt"
ps aux | grep nginx
Sort and Unique
cat file.txt | sort | uniq
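The reason sort comes before uniq: uniq only removes *adjacent* duplicates, so unsorted input can leave repeats behind. A quick demonstration with generated input:

```shell
# "beta" appears twice but not adjacently; sort groups the duplicates
# so uniq can collapse them, leaving each value once
printf "beta\nalpha\nbeta\n" | sort | uniq
```

This prints alpha and beta, one per line.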
Count Results
ls | wc -l
grep "error" log.txt | wc -l
Page Through Output
cat large_file.txt | less
ps aux | less
Exercise: Use Pipes
Combine ls and grep to find text files:
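One possible answer (assuming the files of interest end in .txt):

```shell
# "." is a regex wildcard in grep, so escape it and anchor the match
# to the end of the line to catch only names ending in .txt
ls | grep "\.txt$"
```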
Multiple Pipes
Chain many commands:
cat access.log | grep "404" | sort | uniq -c | sort -rn | head -10
This pipeline:
- Reads the log file
- Filters for 404 errors
- Sorts the lines
- Counts unique occurrences
- Sorts by count (descending)
- Shows top 10
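The same pipeline can be tried without a real access.log by generating a few stand-in lines with printf (an assumption made here only for demonstration):

```shell
# Four fake log lines stand in for access.log; the pipeline filters the
# 404s, counts each distinct line, and ranks them most-frequent first
printf "GET /a 404\nGET /b 404\nGET /a 404\nGET /c 200\n" \
  | grep "404" | sort | uniq -c | sort -rn | head -10
```

The top line of the output is "GET /a 404" with a count of 2.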
Useful Pipe Combinations
Find Large Files
ls -lS | head -10
Count File Types
ls | grep "\.txt" | wc -l
Search in Output
history | grep "git"
env | grep PATH
Monitor Processes
ps aux | grep python | grep -v grep
tee - Save and Pass Through
tee saves output to a file AND passes it on:
ls | tee files.txt | wc -l
This saves the file list AND counts it.
xargs - Build Commands
Convert input to arguments:
find . -name "*.txt" | xargs cat
find . -name "*.tmp" | xargs rm
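A caveat worth noting: filenames containing spaces or newlines can break a plain find | xargs pipeline, because xargs splits its input on whitespace. GNU find and xargs (and the BSD versions) support NUL-delimited names, which is safer:

```shell
# -print0 terminates each name with a NUL byte, and -0 tells xargs to
# split on NUL, so filenames with spaces survive intact
find . -name "*.tmp" -print0 | xargs -0 rm -f
```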
Exercise: Chain Multiple Commands
Count the number of processes running:
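One possible answer. Note that ps aux prints a header line, so drop it before counting:

```shell
# tail -n +2 skips the header row; wc -l counts the remaining
# lines, one per running process
ps aux | tail -n +2 | wc -l
```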
Best Practices
- Build incrementally: Test each step
- Keep pipelines readable: Not too long
- Use intermediate files: For complex operations
- Consider performance: Many commands can read files directly (e.g. grep "error" log.txt), so an initial cat is often unnecessary
Key Takeaways
- Pipes (|) connect command output to input
- Build complex operations from simple commands
- Use grep to filter, sort to order, wc to count
- tee saves output while passing it on
- Pipelines are the Unix way of combining tools

