Home cloud "writeup"

Due to a thoroughly hilarious hard unplug of my RPi, I recently lost all of my Shaarli/Bookstack data, along with the docker-compose.yml declaring my configuration. What a nice occasion to redo all the work while documenting it, and to make sure some backup utility prevents me from losing everything all over again.

Hardware setup

My home setup consists of:

I'd like to someday add an RPi-sized board with a storage option other than SD cards, given the reliability issues I've had. I'm looking at some Hardkernel products (like the XU4, N2, C2, H2...), as well as the AtomicPi. The ARM architecture adds complexity regarding Docker image availability, which is why I'm also looking at some Intel boards.

Software setup

After trying YunoHost and running into problems while installing numerous services, I gave up and decided to use Docker for everything. I was using balenaOS before the great crash, and its lack of docker-compose support always irritated me, as I needed another Linux machine to launch/edit my stack. Right now I'm running vanilla Raspbian like everyone else, but Docker-specific distributions remain an interesting option to consider someday; I'd like to try out CoreOS, for example.

My Raspbian install has docker and docker-compose, along with git for syncing my files.
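For reference, a provisioning sketch for a fresh Raspbian could look like the following. The exact commands are assumptions (the get.docker.com convenience script, and pip for docker-compose, which was the easiest route on armhf), not a record of what I originally ran:

```shell
#!/bin/sh
# Sketch: install docker, docker-compose and git on a fresh Raspbian.
# Assumes the Docker convenience script and pip are acceptable choices.
set -eu

if ! command -v docker >/dev/null 2>&1; then
    # Official Docker convenience script (supports armhf/Raspbian).
    curl -fsSL https://get.docker.com | sh
    # Let the current user run docker without sudo (re-login required).
    sudo usermod -aG docker "$USER"
fi

if ! command -v docker-compose >/dev/null 2>&1; then
    # On armhf, installing docker-compose via pip is the simplest option.
    sudo apt-get install -y python3-pip git
    sudo pip3 install docker-compose
fi
```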

I use Traefik as a reverse proxy, routing queries to two main services/apps:

- Shaarli, a web-based link sharing site. While I'm not a fan of the way it looks, it gets the job done and is very simple. The Shaarli images on Docker Hub are amd64 only, even though the repository itself contains an armhf Dockerfile; my friend agurato hosts an armhf image that I use.
- Bookstack, a wiki-like platform. I like the frontend, which I find clear and concise. The page/book/shelf metaphor is clever and makes it easy to organize things. The linuxserver project hosts an armhf image.
- A third service (added since the crash) that handles the backup part. It consists of an image that mounts the volumes where Bookstack and Shaarli store their data, then daily dumps the relevant parts of that data into a borg repository. The repository itself is then labelled with a date and sent to my NAS via NFS. I have yet to add a Google Drive backup in case of a NAS problem.
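A minimal docker-compose sketch of this layout could look like the following. The hostnames, the agurato image tag, and the volume layout are made up for illustration, the label syntax assumes Traefik v2, and BookStack's required database service and environment variables are omitted for brevity:

```yaml
version: "3"

services:
  traefik:
    image: traefik:v2.4
    command:
      - --providers.docker=true
      - --entrypoints.web.address=:80
    ports:
      - "80:80"
    volumes:
      # Traefik watches the Docker socket to discover the labels below.
      - /var/run/docker.sock:/var/run/docker.sock:ro

  shaarli:
    image: agurato/shaarli-armhf   # hypothetical tag for the armhf build
    volumes:
      - shaarli-data:/var/www/shaarli/data
    labels:
      - traefik.http.routers.shaarli.rule=Host(`links.example.lan`)

  bookstack:
    image: linuxserver/bookstack   # database and env vars omitted here
    volumes:
      - bookstack-data:/config
    labels:
      - traefik.http.routers.bookstack.rule=Host(`wiki.example.lan`)

volumes:
  shaarli-data:
  bookstack-data:
```

The named volumes are what the backup service mounts read-only to get at both apps' data.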

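The daily job inside the backup image can be sketched as below. All paths, the retention policy, and the unencrypted repository are assumptions for illustration; the borg/rsync calls only run when borg and the backup directory actually exist:

```shell
#!/bin/sh
# Sketch of the daily backup: dump app data into a borg repository,
# then ship a dated copy of the repository to the NFS-mounted NAS.
set -eu

# Hypothetical locations; adjust to where the compose volumes are mounted.
DATA_DIRS="/srv/shaarli/data /srv/bookstack/config"
REPO=/srv/backup/borg-repo
NAS_MOUNT=/mnt/nas/backups
LABEL="$(date +%F)"   # one archive per day, named after the date

if command -v borg >/dev/null 2>&1 && [ -d /srv/backup ]; then
    # Initialise the repository on first run only.
    [ -d "$REPO" ] || borg init --encryption=none "$REPO"
    # Dump today's data into a dated archive.
    borg create "$REPO::$LABEL" $DATA_DIRS
    # Keep the repository from growing forever.
    borg prune --keep-daily 7 --keep-weekly 4 "$REPO"
    # Send a dated copy of the whole repository to the NAS.
    rsync -a "$REPO/" "$NAS_MOUNT/borg-repo-$LABEL/"
fi
echo "backup label: $LABEL"
```

Run from cron (or as the container's entrypoint with a sleep loop), this gives one recoverable archive per day.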
What's left to do

While I'm quite happy to now have a way of recovering my setup, I still have some more work to do.
