Posts
All the articles I've posted.
MySQL CLI: From Connection to Maintenance
Master MySQL from the command line: connect, query databases, manage users, repair tables, optimize—everything you keep Googling, one reference.
Updated · 5 min read
Three ways to upload ISOs to Proxmox
Web UI, wget, or direct file copy — pick your ISO upload method based on what you’re doing.
Updated · 4 min read
Recursively delete all empty subdirectories
Clean up empty directories with find and rmdir — safely prune orphaned dirs left after file migrations.
Updated · 5 min read
Remove all old installed but unused kernels
Old kernels pile up in /boot and eat disk space — safely remove unused kernels on Ubuntu and Debian with apt and dpkg.
Updated · 5 min read
Bulk rename files in bash
Remove spaces and special characters from filenames using bash loops, rename, find, and parameter expansion tricks.
Updated · 5 min read
When systemd swallows your service logs
Service restart not showing output? systemd captures stdout into the journal. Learn journalctl, systemctl status, and debugging workflows for silent systemd failures.
Updated · 5 min read
Sed 101
sed is the stream editor for making text substitutions, deletions, and insertions in files — the patterns you'll use 90% of the time.
Updated · 5 min read
SSH keys and secure file copy
Generate SSH keys, set up passwordless auth, configure SSH, and transfer files securely with SCP — the foundation of headless Linux work.
Updated · 4 min read
Why You Should Switch to the Z shell (zsh)
Switch your default shell to zsh on Linux — why zsh beats bash, installation, Oh My Zsh setup, essential plugins, and the gotcha that still catches everyone.
Updated · 5 min read
Beyond RAG: When a Virtual Filesystem Works Better
RAG is the default answer for giving LLMs access to documents. But chunking, embedding, and retrieval introduce failure modes that a virtual filesystem sidesteps entirely.
6 min read
Running Gemma 4 Locally with Ollama
Google's Gemma 4 is the best open model they've shipped yet. Here's how to pull it, run it, and actually use it for real work with Ollama on your own hardware.
6 min read
1-Bit LLMs: The Quantization Endgame
1-bit models store weights as -1, 0, or 1. That sounds insane until you see a 100B-parameter model run on a laptop CPU. Here's what's actually happening.
6 min read