Backing up my Obsidian store and open-sourcing my website
Restic on local and Cloudflare R2
TL;DR: Used restic to set up de-duplicated backups locally and on Cloudflare R2, and wrote a script to run them.
My Obsidian store used to be tracked in my website's git repository, but when I decided to open-source the website I couldn't leave all my notes visible. A large chunk of my website is generated from notes and clippings in my Obsidian store, which is synced to my phone using Syncthing. That part of the process still works well enough for me; however, now that the notes were no longer tracked in a git repo, I needed persistent backups of my store and a different build story.
Build
Initially, my website was linked to Cloudflare Pages via my GitHub repository, so the site was rebuilt automatically whenever I pushed to main. Later, I stored the built website in the repository itself, which seemed wasteful storage-wise but made builds faster on my local machine.
Since I now wanted neither of the above, I chose to build locally and push to Cloudflare Pages using wrangler.
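A hedged sketch of that deploy step, assuming the generator writes the site into a local `dist/` directory and a Pages project named `my-website` already exists (both names are placeholders, not my actual setup):

```sh
# Build the site locally (the exact build command depends on the generator)
npm run build

# Upload the static output to an existing Cloudflare Pages project
npx wrangler pages deploy dist --project-name=my-website
```

`wrangler pages deploy` takes a directory of static files and pushes it as a new deployment, so no GitHub integration is needed.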
Backup
The first tool I tried was Kopia. It was recommended by a colleague and seemed mature enough to be trusted. Setting it up was easy and straightforward, but the UI kept crashing and I eventually gave up on it. The next tool I tried was restic. It works much like Kopia but doesn't offer scheduling out of the box; granted, as the restic documentation suggests, backups can be scheduled with systemd or cron.
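For illustration, an unattended setup could pair restic with a crontab entry; a hypothetical line (all paths are placeholders), using a password file so no interactive prompt is needed:

```
# m h dom mon dow  command — run a restic backup every day at 02:00
0 2 * * * RESTIC_PASSWORD_FILE=/home/me/.restic-pass restic -r /backups/obsidian backup /home/me/vault
```

I didn't end up scheduling it this way, since I'd rather type the password than keep it on disk, but it's the route the restic docs point to.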
First, I set up the local repository, then created a copy of it on R2 using the suggested method. Now all I need to do is run the Node script periodically. The script is written so that it prompts for the repository password, which I'm more comfortable with than keeping S3 secrets and the password in my environment variables.