Why Every Developer Should Learn Basic Automation
As developers, we often find ourselves doing the same tasks repeatedly: cleaning up build artifacts, scaffolding new projects, renaming batches of files, resizing and uploading images, and backing up databases.
These tasks take 5-10 minutes each. Do them 10 times a week and that is one to two hours lost. A simple script can reduce each task to a single command.
In this guide, I will show you practical scripts in both Bash and Python that you can use immediately.
Part 1: Bash Scripts for Quick Automation
Script 1: Clean Up Development Folders
This script removes common junk files from your projects:
```bash
#!/bin/bash
# cleanup.sh - Remove junk files from development folders

echo "Cleaning up development folders..."

# Remove .DS_Store files (macOS)
find . -name ".DS_Store" -type f -delete
echo "Removed .DS_Store files"

# Remove node_modules folders
find . -name "node_modules" -type d -prune -exec rm -rf {} + 2>/dev/null
echo "Removed node_modules folders"

# Remove Python cache
find . -name "__pycache__" -type d -prune -exec rm -rf {} + 2>/dev/null
find . -name "*.pyc" -type f -delete
echo "Removed Python cache"

# Remove .next build folders
find . -name ".next" -type d -prune -exec rm -rf {} + 2>/dev/null
echo "Removed .next build folders"

# Remove log files
find . -name "*.log" -type f -delete
echo "Removed log files"

echo "Cleanup complete!"
```

Save this as `cleanup.sh`, make it executable with `chmod +x cleanup.sh`, and run it in any directory.
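Since the script deletes recursively, it can help to see what would go before running it for real. A dry-run variant (my addition, not part of the script above) simply swaps `-delete` and `-exec rm` for `-print`:

```bash
#!/bin/bash
# cleanup_dryrun.sh - List what cleanup.sh would delete, without deleting
preview_junk() {
    local dir="${1:-.}"
    # Files that cleanup.sh would delete
    find "$dir" -type f \( -name ".DS_Store" -o -name "*.pyc" -o -name "*.log" \) -print
    # Directories that cleanup.sh would remove wholesale
    find "$dir" \( -name "node_modules" -o -name "__pycache__" -o -name ".next" \) -type d -prune -print
}

preview_junk "$@"
```

Run it in the same directory first; if the list looks right, run `cleanup.sh`.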
Script 2: Create Project Structure
Automatically scaffold a new project with your preferred structure:
```bash
#!/bin/bash
# newproject.sh - Create a new project structure

if [ -z "$1" ]; then
    echo "Usage: newproject.sh <project-name>"
    exit 1
fi

PROJECT_NAME=$1

# Create main directory tree
mkdir -p "$PROJECT_NAME"/{src/{components,hooks,utils,styles},public/images,tests,docs}

# Create base files
touch "$PROJECT_NAME/src/index.ts"
touch "$PROJECT_NAME/src/styles/globals.css"

# Add content to .gitignore
cat > "$PROJECT_NAME/.gitignore" << EOF
node_modules/
.next/
.env.local
.env
*.log
.DS_Store
dist/
build/
EOF

# Add content to README
cat > "$PROJECT_NAME/README.md" << EOF
# $PROJECT_NAME

## Getting Started

npm install
npm run dev

## Structure

- src/components - React components
- src/hooks - Custom hooks
- src/utils - Utility functions
- public - Static assets
- tests - Test files
EOF

echo "Project $PROJECT_NAME created successfully!"
echo "Structure:"
tree "$PROJECT_NAME" 2>/dev/null || find "$PROJECT_NAME" -type f
```

Now you can create a new project with `./newproject.sh my-app`.
Script 3: Batch Rename Files
```bash
#!/bin/bash
# rename.sh - Batch rename files with pattern

if [ -z "$1" ] || [ -z "$2" ]; then
    echo "Usage: rename.sh <search-pattern> <replace-pattern>"
    echo "Example: rename.sh 'IMG_' 'photo_'"
    exit 1
fi

SEARCH=$1
REPLACE=$2
COUNT=0

for file in *"$SEARCH"*; do
    if [ -f "$file" ]; then
        # Use parameter expansion rather than sed, so characters like /
        # in the pattern do not break the substitution
        newname="${file//"$SEARCH"/$REPLACE}"
        mv -- "$file" "$newname"
        echo "Renamed: $file -> $newname"
        ((COUNT++))
    fi
done

echo "Renamed $COUNT files."
```

Part 2: Python Scripts for Complex Automation
Python is better for tasks that need error handling, third-party libraries, or more complex logic: API uploads, image processing, and database backups.
Script 4: Upload Images to Cloudinary
```python
#!/usr/bin/env python3
"""upload_images.py - Upload images to Cloudinary with optimization"""
import os
import sys
from pathlib import Path

try:
    import cloudinary
    import cloudinary.uploader
except ImportError:
    print("Install cloudinary: pip install cloudinary")
    sys.exit(1)

# Configure Cloudinary (use env vars in production)
cloudinary.config(
    cloud_name=os.environ.get("CLOUDINARY_CLOUD_NAME"),
    api_key=os.environ.get("CLOUDINARY_API_KEY"),
    api_secret=os.environ.get("CLOUDINARY_API_SECRET"),
)

def upload_image(file_path: str, folder: str = "uploads") -> dict:
    """Upload a single image with optimization."""
    result = cloudinary.uploader.upload(
        file_path,
        folder=folder,
        transformation=[
            {"quality": "auto:good"},
            {"fetch_format": "auto"},
        ],
        resource_type="image",
    )
    return result

def upload_folder(folder_path: str, cloud_folder: str = "uploads"):
    """Upload all images in a folder."""
    image_extensions = {".jpg", ".jpeg", ".png", ".gif", ".webp"}
    folder = Path(folder_path)
    if not folder.exists():
        print(f"Folder not found: {folder_path}")
        return
    images = [f for f in folder.iterdir()
              if f.suffix.lower() in image_extensions]
    print(f"Found {len(images)} images to upload")
    for i, image in enumerate(images, 1):
        try:
            result = upload_image(str(image), cloud_folder)
            print(f"[{i}/{len(images)}] Uploaded: {image.name}")
            print(f"  URL: {result['secure_url']}")
        except Exception as e:
            print(f"[{i}/{len(images)}] Failed: {image.name} - {e}")

if __name__ == "__main__":
    if len(sys.argv) < 2:
        print("Usage: python upload_images.py <folder-path> [cloud-folder]")
        sys.exit(1)
    folder_path = sys.argv[1]
    cloud_folder = sys.argv[2] if len(sys.argv) > 2 else "uploads"
    upload_folder(folder_path, cloud_folder)
```

Script 5: Resize Images for Web
```python
#!/usr/bin/env python3
"""resize_images.py - Batch resize images for web"""
import sys
from pathlib import Path

try:
    from PIL import Image
except ImportError:
    print("Install Pillow: pip install Pillow")
    sys.exit(1)

def resize_image(input_path: str, output_path: str, max_width: int = 1200):
    """Resize an image, maintaining aspect ratio."""
    with Image.open(input_path) as img:
        # Scale down only if wider than max_width
        if img.width > max_width:
            ratio = max_width / img.width
            new_height = int(img.height * ratio)
            img = img.resize((max_width, new_height), Image.LANCZOS)
        # Convert to RGB if necessary (JPEG does not support alpha)
        if img.mode in ("RGBA", "P"):
            img = img.convert("RGB")
        # Save with optimization
        img.save(output_path, "JPEG", quality=85, optimize=True)

def process_folder(input_folder: str, output_folder: str, max_width: int = 1200):
    """Process all images in a folder."""
    input_path = Path(input_folder)
    output_path = Path(output_folder)
    output_path.mkdir(parents=True, exist_ok=True)
    image_extensions = {".jpg", ".jpeg", ".png", ".gif", ".webp"}
    images = [f for f in input_path.iterdir()
              if f.suffix.lower() in image_extensions]
    print(f"Processing {len(images)} images...")
    for i, img_file in enumerate(images, 1):
        output_file = output_path / f"{img_file.stem}.jpg"
        try:
            resize_image(str(img_file), str(output_file), max_width)
            # Show size reduction
            original_size = img_file.stat().st_size / 1024
            new_size = output_file.stat().st_size / 1024
            reduction = ((original_size - new_size) / original_size) * 100
            print(f"[{i}/{len(images)}] {img_file.name}: "
                  f"{original_size:.1f}KB -> {new_size:.1f}KB "
                  f"({reduction:.1f}% smaller)")
        except Exception as e:
            print(f"[{i}/{len(images)}] Failed: {img_file.name} - {e}")

if __name__ == "__main__":
    if len(sys.argv) < 3:
        print("Usage: python resize_images.py <input-folder> <output-folder> [max-width]")
        sys.exit(1)
    input_folder = sys.argv[1]
    output_folder = sys.argv[2]
    max_width = int(sys.argv[3]) if len(sys.argv) > 3 else 1200
    process_folder(input_folder, output_folder, max_width)
```

Script 6: Database Backup
```python
#!/usr/bin/env python3
"""db_backup.py - Backup PostgreSQL database to file"""
import os
import subprocess
import sys
from datetime import datetime
from pathlib import Path

def backup_postgres(database_url: str, output_dir: str = "./backups"):
    """Create a PostgreSQL backup."""
    output_path = Path(output_dir)
    output_path.mkdir(parents=True, exist_ok=True)
    timestamp = datetime.now().strftime("%Y%m%d_%H%M%S")
    backup_file = output_path / f"backup_{timestamp}.sql"
    print("Starting backup...")
    try:
        result = subprocess.run(
            ["pg_dump", database_url, "-f", str(backup_file)],
            capture_output=True,
            text=True,
        )
        if result.returncode == 0:
            size = backup_file.stat().st_size / 1024 / 1024
            print(f"Backup successful: {backup_file}")
            print(f"Size: {size:.2f} MB")
            # Keep only the last 7 backups
            cleanup_old_backups(output_path, keep=7)
        else:
            print(f"Backup failed: {result.stderr}")
    except FileNotFoundError:
        print("pg_dump not found. Install PostgreSQL client tools.")

def cleanup_old_backups(backup_dir: Path, keep: int = 7):
    """Remove old backups, keeping only the most recent ones."""
    backups = sorted(backup_dir.glob("backup_*.sql"), reverse=True)
    for old_backup in backups[keep:]:
        old_backup.unlink()
        print(f"Removed old backup: {old_backup.name}")

if __name__ == "__main__":
    database_url = os.environ.get("DATABASE_URL")
    if not database_url:
        print("Set DATABASE_URL environment variable")
        sys.exit(1)
    output_dir = sys.argv[1] if len(sys.argv) > 1 else "./backups"
    backup_postgres(database_url, output_dir)
```

Part 3: Making Scripts Easy to Use
Add Scripts to Your PATH
Create a `~/scripts` folder and add it to your PATH:
```bash
# Add to ~/.bashrc or ~/.zshrc
export PATH="$HOME/scripts:$PATH"
```

Now you can run your scripts from anywhere.
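The one-time setup might look like this (a sketch; it assumes the Bash scripts from Part 1 sit in your current directory):

```bash
# Create the scripts folder and install any of the Part 1 scripts into it
mkdir -p "$HOME/scripts"
for f in cleanup.sh newproject.sh rename.sh; do
    if [ -f "$f" ]; then
        cp "$f" "$HOME/scripts/"
        chmod +x "$HOME/scripts/$f"
    fi
done
echo "Scripts installed in $HOME/scripts"
```

Open a new terminal (or `source` your shell config) and the commands are available everywhere.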
Create Aliases for Common Tasks
```bash
# Add to ~/.bashrc or ~/.zshrc
alias cleanup="~/scripts/cleanup.sh"
alias newproj="~/scripts/newproject.sh"
alias resize="python ~/scripts/resize_images.py"
alias backup="python ~/scripts/db_backup.py"
```

Schedule Scripts with Cron
For tasks like daily backups:
```bash
# Edit crontab
crontab -e

# Add daily backup at 2 AM
0 2 * * * /usr/bin/python3 ~/scripts/db_backup.py >> ~/logs/backup.log 2>&1

# Weekly cleanup on Sunday at 3 AM
0 3 * * 0 ~/scripts/cleanup.sh >> ~/logs/cleanup.log 2>&1
```

Note that the redirects assume `~/logs` exists; create it once with `mkdir -p ~/logs`.

Bonus: Use AI to Write Scripts
Not sure how to write a specific script? Use AI:
> Prompt: "Write a Bash script that finds all PNG files larger than 1MB in a directory, compresses them with pngquant, and logs the size reduction."
AI tools like GitHub Copilot or ChatGPT can generate working scripts in seconds. Then you just need to:
1. Read and understand the code
2. Test it on sample data
3. Adjust as needed
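For the prompt above, the generated script might look something like this (a sketch you would still want to read and test yourself; it assumes `pngquant` is installed and on your PATH):

```bash
#!/bin/bash
# compress_pngs.sh - Compress PNGs over 1MB with pngquant and log the savings
compress_large_pngs() {
    local dir="${1:-.}"
    find "$dir" -name "*.png" -type f -size +1M | while read -r f; do
        before=$(wc -c < "$f")
        # --force --ext .png overwrites the original file in place
        pngquant --force --ext .png "$f"
        after=$(wc -c < "$f")
        echo "$f: $before -> $after bytes"
    done
}

compress_large_pngs "$@"
```

Treating AI output like this as a starting point, then testing it on sample data, is exactly the workflow in the three steps above.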
Conclusion
Automation is not about being lazy -- it is about being efficient. The 30 minutes you spend writing a script today can save you hours over the coming months.
Start small:
1. Identify a task you do repeatedly
2. Write a simple script to automate it
3. Refine as you use it
Over time, you will build a personal toolkit of scripts that make you significantly more productive.