Tag: self-hosting
All articles tagged "self-hosting".
-
Nextcloud Advanced: Federation, Backups, and Making It Actually Performant
Nextcloud advanced configuration: Redis caching, federation setup, automated backups, occ command deep dive, LDAP, external storage, and PHP-FPM tuning.
-
BookStack vs Wiki.js: Picking Your Self-Hosted Documentation Platform
BookStack vs Wiki.js: different philosophies, same goal. Compare features, Docker setup, editors, SSO, and which one fits your team or homelab.
-
Paperless-ngx: Scan It, Forget It, Find It Instantly
Paperless-ngx Docker setup with OCR, auto-tagging, email ingestion, mobile scanning, and a backup strategy for going fully digital with your documents.
-
MinIO + Nextcloud: S3-Compatible Storage That's Actually Yours
Replace Nextcloud's local filesystem with MinIO as an S3-compatible backend. Full Docker setup, bucket policies, performance tuning, and why this scales better.
-
Portainer vs Dockge: Managing Containers Without the Terminal
Portainer vs Dockge: two Docker GUIs for managing containers without touching the terminal. We compare features, setup, and which one fits your self-hosting style.
-
Wiki.js with GitSync: Documentation That Lives in Version Control Like It Should
Set up Wiki.js GitSync with GitHub or Gitea for docs-as-code. Version-controlled wikis, PR workflows, automated updates, and sane branch strategies.
-
Vaultwarden vs Bitwarden: Own Your Passwords Before Someone Else Does
Why trust a cloud with your passwords? Compare Vaultwarden and Bitwarden self-hosted — lightweight vs full-stack, Docker setup, backups, and which one to actually run.
-
How to Securely Deploy Cloudflare Tunnels
Cloudflare Tunnels expose local services to the internet without open ports — secure setup with zero-trust access controls.
-
Uptime Monitoring with Uptime Kuma
Uptime Kuma monitors your services and sends alerts when they go down — a beautiful self-hosted alternative to UptimeRobot.
-
Ollama: Powerful Language Models on Your Own Machine
Ollama makes running local LLMs dead simple — pull a model, start the server, and get a private ChatGPT running on your own hardware.
-
Unleash the Power of LLMs with LocalAI
LocalAI is a self-hosted OpenAI-compatible API — run any GGUF model and connect existing tools without changing a line of client code.
-
Supercharge Your Homelab Monitoring with Zabbix
Zabbix is enterprise-grade monitoring that you can self-host — agents, templates, triggers, and dashboards for your entire homelab.