Diving into the depths of GNU/Linux
What separates an average developer from an expert isn’t the IDE or the language—it’s how well they master their environment.
If you’ve followed the Linux series so far, you already know the basic commands and understand how permissions work. Now it’s time to level up: these 10 commands will transform your productivity and turn you into that developer who solves complex problems with just a few lines in the terminal.
This isn’t about memorizing syntax—it’s about changing how you think. Each command we’ll explore solves real day-to-day problems: finding lost files, processing logs, automating repetitive tasks, or debugging applications. These are the tools that separate those who “survive” in Linux from those who truly master it.
find — The file detective
Locates files and directories based on specific criteria. Much more powerful than any GUI search tool.
# Find all JavaScript files modified in the last 24 hours
find . -name "*.js" -mtime -1
# Search for config files containing 'docker' in the name
find /etc -name "*docker*" 2>/dev/null
# Find large files (>100MB) taking up space
find . -size +100M -type f
# Locate files with specific permissions (potential security issues)
find . -perm 777 -type f
# Find and delete node_modules folders to free up space
find . -name "node_modules" -type d -prune -exec rm -rf {} +
find parameters:
. — Current directory (starting point)
-name "pattern" — Search by filename
-mtime -1 — Modified in the last day (-1 = less than 1 day)
-size +100M — Files larger than 100MB (+ = greater than)
-type f/d — Type: f=file, d=directory
-perm 777 — Files with specific permissions
-exec command {} + — Execute a command on found results
2>/dev/null — Redirect errors to avoid noise

In large projects, manually searching for files is impractical. find lets you locate exactly what you need using complex criteria.
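Tests in find are AND-ed together by default; -o gives you OR, and \( \) groups tests so the OR applies where you want it. A self-contained sketch you can paste into any shell (the file names are invented for the demo):

```shell
# Throwaway sandbox to show how find tests combine
tmp=$(mktemp -d)
touch "$tmp/app.js" "$tmp/style.css" "$tmp/readme.md"

# Tests are AND-ed by default: a regular file AND named *.js
find "$tmp" -type f -name "*.js"

# -o is OR; \( \) groups the two -name tests together
find "$tmp" -type f \( -name "*.js" -o -name "*.css" \)

rm -rf "$tmp"
```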
grep — The content excavator
Searches for text patterns within files. Like Ctrl+F but infinitely more powerful.
# Find all functions containing 'async' in TypeScript files
grep -r "async" --include="*.ts" .
# Search for errors in logs excluding warnings
grep -E "ERROR|FATAL" /var/log/app.log | grep -v "WARNING"
# Find environment variables used in your project
grep -r "process\.env\." --include="*.js" src/
# Locate all TODOs in the code
grep -rn "TODO\|FIXME\|HACK" src/
# Search for IPs in configuration files
grep -E "([0-9]{1,3}\.){3}[0-9]{1,3}" config/
grep parameters:
-r — Recursive search in subdirectories
-E — Extended regular expressions (allows |, +, ?)
-v — Inverts the search (excludes matching lines)
-n — Shows line numbers
-l — Only shows filenames containing matches
-i — Ignores case
--include="pattern" — Only searches files matching the pattern
\| — Logical OR in basic regex patterns (with -E, write | unescaped)
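Two flags not listed above are also handy in day-to-day work: -c counts matching lines, and -o prints only the matched text. A quick self-contained example (the data is made up):

```shell
f=$(mktemp)
printf 'user=alice\nuser=bob\nrole=admin\n' > "$f"

grep -c "user=" "$f"           # counts matching lines: prints 2
grep -oE "user=[a-z]+" "$f"    # prints user=alice and user=bob, one per line

rm "$f"
```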
# Find Python files that import a specific library
find . -name "*.py" -exec grep -l "import pandas" {} \;
# Search for database configurations in the project
find . -name "*.yml" -o -name "*.yaml" | xargs grep -i "database"
awk — The data processing powerhouse
Processes and manipulates structured text. Ideal for extracting specific information from logs, CSVs, or command outputs.
# Extract only IPs from an Apache log
awk '{print $1}' /var/log/apache2/access.log | sort | uniq -c
# Calculate total space used by files in the current directory
ls -la | awk '{total += $5} END {print "Total:", total/1024/1024, "MB"}'
# Process a CSV and extract specific columns
awk -F',' '{print $1, $3}' data.csv
# Find processes consuming more memory
ps aux | awk '$4 > 5.0 {print $2, $4, $11}'
# Analyze logs by HTTP status codes
awk '{print $9}' access.log | sort | uniq -c | sort -nr
awk syntax:
$1, $2, $3... — Columns/fields (separated by spaces by default)
-F',' — Defines the field separator (here: comma for CSV)
{print $1} — Action: print the first column
$4 > 5.0 — Condition: filter lines where column 4 > 5.0
END {command} — Execute a command at the end of processing
total += $5 — Sum values from column 5
substr($1,1,13) — Extract a substring: 13 characters starting at position 1
# Count errors per hour in timestamped logs
awk '/ERROR/ {print substr($1,1,13)}' app.log | uniq -c
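To see fields, accumulation, and the END block working together without a real log, you can feed awk inline data; the numbers below are invented:

```shell
# NR holds the number of records read, so sum/NR is the average
printf 'a 10\nb 20\nc 30\n' |
  awk '{sum += $2} END {print "sum:", sum, "avg:", sum/NR}'
# prints: sum: 60 avg: 20
```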
sed — The text editing ninja
Modifies text on the fly without opening files. Perfect for massive refactoring or data cleanup.
# Replace API URLs in all configuration files
sed -i 's/api\.dev\.com/api\.prod\.com/g' config/*.json
# Extract only lines between two patterns
sed -n '/START/,/END/p' log.txt
# Remove empty lines and comments
sed '/^#/d; /^$/d' config.conf
# Add a prefix to each line
sed 's/^/LOG: /' error.txt
# Convert date format
echo "2025-08-17" | sed 's/-/\//g'
sed commands:
-i — In-place editing, modifies the original file
s/pattern/replacement/g — Substitute: s=substitute, g=global (all occurrences)
-n — Quiet mode, only prints what it is explicitly told to
/START/,/END/p — Print lines from the START pattern to the END pattern
/^#/d — Delete lines starting with # (d=delete)
/^$/d — Delete empty lines (^$ = line start and end with no content)
s/^/prefix/ — Add text at the beginning of each line (^ = line start)
# Change old imports to new ones throughout the project
find src/ -name "*.js" -exec sed -i 's/old-library/new-library/g' {} \;
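Since -i rewrites files in place, it pays to keep a backup on mass edits. GNU sed accepts a backup suffix glued directly to the flag (on BSD/macOS sed the suffix is a separate argument). A sketch on a throwaway file:

```shell
f=$(mktemp)
printf 'old-library rocks\n' > "$f"

# GNU sed: -i.bak edits the file in place AND keeps the original as $f.bak
sed -i.bak 's/old-library/new-library/g' "$f"

cat "$f"       # new-library rocks
cat "$f.bak"   # old-library rocks

rm -f "$f" "$f.bak"
```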
xargs — The command multiplier
Converts output from one command into arguments for another. It’s the glue that connects commands elegantly.
# Find and delete backup files
find . -name "*.bak" | xargs rm
# Search text in multiple found files
find . -name "*.log" | xargs grep "ERROR"
# Download multiple URLs from a file
cat urls.txt | xargs -I {} curl -O {}
# Change permissions on multiple files
ls *.sh | xargs chmod +x
# Process files in parallel (4 simultaneous jobs)
find . -name "*.jpg" | xargs -P 4 -I {} convert {} {}.webp
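One caveat about the pipelines above: plain find | xargs splits its input on whitespace, so a file named "report final.bak" would be passed to the command as two separate arguments. Pairing find's -print0 with xargs -0 switches to NUL delimiters and handles any filename; a sketch on throwaway files:

```shell
tmp=$(mktemp -d)
touch "$tmp/report final.bak" "$tmp/notes.bak"

# NUL-delimited handoff: safe even with spaces or newlines in names
find "$tmp" -name "*.bak" -print0 | xargs -0 rm

ls -A "$tmp"   # prints nothing: both files were removed
rm -rf "$tmp"
```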
xargs options:
-I {} — Defines a placeholder for each input element
-P 4 — Execute up to 4 processes in parallel
-n 1 — Process one argument per command (useful for batches)
-d '\n' — Define a custom delimiter (default: spaces and newlines)
-r — Don’t run the command if there is no input (avoids errors)

curl — The universal communicator
Makes HTTP requests from the terminal. Essential for testing APIs, downloading files, or automating web interactions.
Create this curl-format.txt file to format the data received in the request below:
total_time: %{time_total}s
dns_time: %{time_namelookup}s
connect_time: %{time_connect}s
status_code: %{http_code}
# Test an API with JSON data
curl -X POST -H "Content-Type: application/json" \
-d '{"user":"test","pass":"123"}' \
https://api.example.com/login
# Download files with progress bar
curl -L -o file.zip https://github.com/user/repo/archive/main.zip
# Test API response time
curl -w "@curl-format.txt" -o /dev/null -s https://api.example.com/health
# Follow redirects and save cookies
curl -L -c cookies.txt -b cookies.txt https://site.com/login
# Inspect response headers (curl verifies the SSL certificate by default)
curl -I https://your-site.com
curl parameters:
-X POST/GET/PUT/DELETE — Specify the HTTP method
-H "Header: value" — Add custom headers
-d "data" — Send data in the request body
-L — Follow redirects automatically
-o file — Save the response to a file
-O — Save with the original filename from the server
-s — Silent mode (no progress)
-w "format" — Define a custom output format
-c file — Save cookies to a file
-b file — Use cookies from a file
-I — Headers only (HEAD method)

jq — The definitive JSON processor
Manipulates and extracts JSON data elegantly. Indispensable in the REST API era.
sudo apt install jq # Ubuntu/Debian
brew install jq # macOS
# Extract specific field from API response
curl -s https://api.github.com/users/octocat | jq '.name'
# Filter arrays by conditions
jq '.[] | select(.age > 18)' users.json
# Transform data structure
jq '.users[] | {name: .name, email: .email}' data.json
# Count elements in an array
jq '.items | length' response.json
# Search in nested structures
jq '.data.users[] | select(.status == "active") | .email' api_response.json
# Check status of services in Kubernetes
kubectl get pods -o json | jq '.items[] | select(.status.phase != "Running") | .metadata.name'
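If you want to experiment with jq without hitting a real API, you can pipe it a literal document; everything below is invented data:

```shell
# -r prints raw strings (no JSON quotes); select() filters by a condition
echo '{"users":[{"name":"ana","active":true},{"name":"luis","active":false}]}' |
  jq -r '.users[] | select(.active) | .name'
# prints: ana
```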
ss — The connection monitor
Shows active network connections. Replaces the obsolete netstat with better performance.
# Check what process is using a specific port
ss -tulpn | grep :3000
# List all active TCP connections
ss -t -a
# Show connections by state
ss -t state established
# Find connections to a specific IP
ss -t dst 192.168.1.100
# Monitor listening ports
ss -tlnp | grep LISTEN
ss options:
-t — TCP connections only
-u — UDP connections only
-l — Listening ports only
-a — Show all connections (active + listening)
-n — Show port numbers instead of service names
-p — Show the processes using each connection
state established — Filter by connection state
dst IP — Filter by destination IP
src IP — Filter by source IP

rsync — The intelligent synchronizer
Synchronizes files and directories efficiently. Only transfers changes, not complete files.
# Sync local project with remote server
rsync -avz --exclude node_modules/ ./ user@server:/var/www/app/
# Incremental backup with progress
rsync -av --progress backup/ destination/
# Make destination an exact mirror of source (use with caution)
rsync -av --delete source/ destination/
# Copy only recently modified files
rsync -av --update source/ destination/
# Dry-run to see what will be synced
rsync -avn source/ destination/
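The trailing slash on the source path trips everyone up at least once: source/ copies the directory's contents, while source copies the directory itself. A throwaway demonstration (rsync must be installed; all paths are invented):

```shell
tmp=$(mktemp -d)
mkdir -p "$tmp/src" "$tmp/a" "$tmp/b"
touch "$tmp/src/file.txt"

rsync -a "$tmp/src/" "$tmp/a/"   # trailing slash: contents -> a/file.txt
rsync -a "$tmp/src"  "$tmp/b/"   # no slash: the dir itself -> b/src/file.txt

ls "$tmp/a"   # file.txt
ls "$tmp/b"   # src
rm -rf "$tmp"
```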
rsync flags:
-a — Archive mode (preserves permissions, times, links)
-v — Verbose (shows processed files)
-z — Compress during transfer (useful for slow connections)
-n — Dry-run (simulates without making real changes)
--progress — Show a progress bar
--delete — Delete files in the destination that don’t exist in the source
--update — Only update newer files
--exclude pattern — Exclude files/directories matching the pattern

Note: the trailing slash (/) matters in paths (it affects whether rsync copies the directory itself or its contents).

tmux — The terminal multiplexer
Creates persistent terminal sessions. You can disconnect and reconnect without losing your work.
sudo apt install tmux # Ubuntu/Debian
# Basic configuration in ~/.tmux.conf
set -g prefix C-a
bind-key C-a send-prefix
unbind C-b
set -g mouse on
# Create named session (short equivalent: tmux new -s development)
tmux new-session -s development
# List active sessions (alias: tmux ls)
tmux list-sessions
# Detach without closing anything: Prefix + d
# (Prefix = Ctrl+a because we redefined it; if you didn't change config it's Ctrl+b)
# Reconnect to existing session
tmux attach-session -t development
# Close pane/window/session cleanly
# Pane: exit (exits shell) or Prefix + x (confirm)
# Window: Prefix + & (confirm)
# Session: tmux kill-session -t development
Assuming you applied the prefix change to Ctrl+a in ~/.tmux.conf. If not, replace “Ctrl+a” with “Ctrl+b”.
# 1. Create a new work session
tmux new -s web
# 2. Create a new window (e.g., for the server)
# Press: Ctrl+a c
# 3. Rename current window
# Press: Ctrl+a , (type: server and Enter)
# 4. Switch to previous or next window
# Next: Ctrl+a n
# Previous: Ctrl+a p
# 5. Go directly to a window by number (windows start at 0)
# Example: Ctrl+a 0 (goes to first) / Ctrl+a 1
# 6. Split window into panes
# Vertical (left/right): Ctrl+a %
# Horizontal (top/bottom): Ctrl+a "
# 7. Move between panes
# Ctrl+a (arrows) or Ctrl+a o (rotate)
# 8. Resize (quick optional)
# Hold Ctrl+a then repeatedly press: Alt + ←/→/↑/↓ (depending on terminal)
# 9. Detach from session (leave everything running)
# Ctrl+a d
# 10. Return later
tmux attach -t web
# 11. Close everything when done
# Exit each pane with: exit
# Or kill the entire session:
tmux kill-session -t web
With this, you can work with multiple processes (editor, server, logs) without opening dozens of separate terminals.
man — Your best ally
Never memorize parameters. The system has integrated documentation waiting for you. The man (manual) command is your inseparable companion:
# Complete command documentation
man find
# Quick search within the manual (once inside)
/pattern # Search forward
?pattern # Search backward
n # Next match
q # Exit manual
# Search commands by functionality
man -k "search files" # Find commands related to searching
# Short version with main options
find --help # Works with most commands
# Quick example with explanation
tldr find # If you have tldr installed (apt install tldr)
# Forgot how to use grep with regular expressions?
man grep
# Search within manual: /regex
# Don't remember rsync options?
rsync --help | grep -E "^\s*-[a-z]" # Only main options
# What exactly does the -z flag do in tar?
man tar
# Search within: /-z
Use man command whenever you need more options.
# Find unique errors in logs, sorted by frequency
find /var/log -name "*.log" | \
xargs grep -h "ERROR" | \
awk '{for(i=3;i<=NF;i++) printf "%s ", $i; print ""}' | \
sort | uniq -c | sort -nr | head -10
# Combine tail, grep and awk for live monitoring
tail -f app.log | grep --line-buffered "ERROR\|WARN" | \
awk '{print strftime("%H:%M:%S"), $0; fflush()}' | \
while read line; do echo -e "\033[31m$line\033[0m"; done
# Script to clean unnecessary files (NUL delimiters survive spaces in names)
find . -name "node_modules" -type d -prune -print0 | xargs -0 rm -rf
find . -name "*.log" -mtime +7 -print0 | xargs -0 rm
find . -name ".DS_Store" -print0 | xargs -0 rm
# Add to ~/.bashrc or ~/.zshrc
alias logwatch='tail -f /var/log/app.log | grep --color=always ERROR'
alias findjs='find . -name "*.js" -not -path "./node_modules/*"'
alias gitclean='git branch --merged | grep -v "\*\|main\|develop" | xargs git branch -d'
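Aliases only cover fixed commands; when you need arguments, a small shell function in the same file works better. The name codegrep below is invented for the example:

```shell
# Hypothetical helper for ~/.bashrc: search source files, skipping node_modules
codegrep() {
  grep -rn --include="*.js" --include="*.ts" \
       --exclude-dir=node_modules "$1" "${2:-.}"
}

# Usage: codegrep "TODO" src/
```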
# Search previous commands with pattern
history | grep "find.*\.js"
#!/bin/bash
# deploy.sh - Simple deployment script
echo "🚀 Starting deployment..."
rsync -av --exclude node_modules/ ./ server:/app/
ssh server "cd /app && npm install && pm2 restart all"
echo "✅ Deployment completed"
These 10 commands aren’t just tools; they’re superpowers that will transform your experience as a Linux developer. The difference between knowing them superficially and mastering them lies in constant practice.
True Linux mastery doesn’t come from memorizing flags and options, but from understanding how to combine these tools to solve real problems. Each command is a piece of a larger puzzle: your optimized workflow.
Remember: the terminal isn’t intimidating when you have documentation just a man
away.
Ready to become that developer who works magic in the terminal?
Happy Coding!