
My personal wiki setup

Feb 18 2023

When doing any kind of software-related work you need a number of tools (such as git, ssh, an editor, Docker…) that each require a lot of knowledge and good mental models to get things done. Apart from using multiple tools, some of which have documentation the size of a Boeing 747 flight manual, you might also set up complicated environments that require domain knowledge to run and maintain. Countless problems will show up that need specialized solutions and workarounds. To handle this amount of knowledge you definitely need some kind of tool that helps you organize and search through your accumulated data. In a business context, a company usually has a knowledge base such as Confluence or a similar slow tool for documentation. However, few people have such a tool for their personal use. I strongly recommend keeping a personal knowledge base to document everything you will need to remember later, and I think this goes hand in hand with a good backup solution, which should not be left out. Many applications have been released in recent years to address personal knowledge management. In this blog post I am going to show you the setup I use, which fits my personal needs very well. It is a bit of a do-it-yourself solution, and I will probably customize it further as time goes on.

Local access and plain markdown files

I use a markdown-based solution because I like having plain text files for documentation, which also makes it easy to sync them with git. With any kind of online tool I usually feel that my data is not really in my possession (which it isn't). Having the files locally means I can back them up using regular Unix tools. This matches my workflow quite well: I usually work in a shell and can easily edit these files with my preferred vim-based editor, Neovim. If I need to find keywords or files, I do so in the terminal with find and grep. Backup and sync is done by pushing to a git repo I use for that purpose. Using plain text also means my wiki will still be accessible a decade from now, unlike some proprietary file format.

When working with markdown files I usually use the markdown editor Obsidian. It presents you with a preview of the rendered markdown, which is quite handy. Obsidian also allows you to link to other pages within your wiki using [[your_linked_file.md]]. Linking notes this way can be part of the Zettelkasten system for managing your knowledge. One part of my wiki is organized as a Zettelkasten with short notes and links between them, and another part is just a regular wiki documenting servers, setup scripts, and similar things.
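Because the links are plain text, they are easy to process with scripts too. A small sketch of extracting [[...]] link targets from a note (the regex and helper name are my own, not an Obsidian API):

```python
import re

# capture the link target, stopping before an optional |alias or #heading part
WIKI_LINK = re.compile(r"\[\[([^\[\]|#]+)")

def extract_links(markdown: str) -> list:
    """Return the targets of all [[...]] wiki links in a markdown note."""
    return [m.strip() for m in WIKI_LINK.findall(markdown)]
```

This kind of one-off script is another benefit of the plain-text approach: building a link graph of the Zettelkasten needs nothing more than the standard library.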

Online access

Because I am often not at my personal computer, or on a system where my wiki isn't present, I quickly decided that I also wanted online access to my wiki. Luckily, a great number of web wikis work with plain markdown files. At the time I set up my wiki (~4 years ago in 2019) I decided on Wiki.js. Even though its UI is a bit clunky and, in my opinion, requires more UI actions for creating and editing than should be necessary, it does its job quite well. Wiki.js has its own authentication mechanism, but because I do not feel comfortable exposing such a service directly, I also put it behind a reverse proxy with basic auth.
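As a sketch of what that proxy layer can look like, assuming nginx as the reverse proxy (the hostname and file paths are placeholders, and Wiki.js listens on port 3000 by default):

```nginx
server {
    listen 443 ssl;
    server_name wiki.example.com;            # placeholder hostname

    location / {
        auth_basic           "wiki";
        auth_basic_user_file /etc/nginx/.htpasswd;  # created with htpasswd
        proxy_pass           http://127.0.0.1:3000; # Wiki.js default port
        proxy_set_header     Host $host;
        proxy_set_header     Upgrade $http_upgrade; # Wiki.js uses websockets
        proxy_set_header     Connection "upgrade";
    }
}
```

This way the wiki's own login only matters once the basic-auth layer has already been passed.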

Auto sync

Because manually syncing with git can become annoying if you make regular changes, I added an auto-sync mechanism. Tools like Obsidian now also offer auto-syncing, but because I want my wiki to be tool agnostic, I decided to do this myself. The solution is really pragmatic and works for my use case: a Python script on my personal computer creates a commit every few minutes if there are any changes and pushes them to the remote. It also pulls and merges automatically. Wiki.js always creates a commit when saving changes. This setup works quite well as long as I do not make changes simultaneously on my computer and in the web version, which is very seldom the case. If there are merge conflicts I just resolve them manually. The sync script is set up with systemd to start automatically on boot, so it just runs in the background. If necessary I can still resolve changes manually and commit by hand. Because a huge number of commits accumulates over time, I reset the git history yearly by force-pushing to the repo and checking that Wiki.js is synced with the overwritten repo again.
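The systemd side can be a simple user unit; a minimal sketch (the unit name and script path are hypothetical):

```ini
# ~/.config/systemd/user/wiki-sync.service
[Unit]
Description=Auto-sync the personal wiki repository
After=network-online.target

[Service]
ExecStart=/usr/bin/python3 /home/laeri/zettelkasten/scripts/sync.py
Restart=on-failure

[Install]
WantedBy=default.target
```

Enabled with `systemctl --user enable --now wiki-sync.service`, the script then restarts on failure and survives reboots without any manual intervention.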

The Python script that syncs the repo, very quickly cobbled together:

import datetime
import os
import time
import uuid

import git

# use the dedicated deploy key for the wiki repo
if os.name == 'nt':
    git_ssh_cmd = "ssh -i C:/Users/laeri/.ssh/zettelkasten"
else:
    git_ssh_cmd = "ssh -i ~/.ssh/zettelkasten"
os.environ['GIT_SSH_COMMAND'] = git_ssh_cmd

# the repo root is the parent directory of this script
repo_base_path = os.path.abspath(os.path.join(os.path.dirname(os.path.abspath(__file__)), '..'))
repo = git.Repo(repo_base_path)

update_period = 3 * 60  # commit and push every 3 minutes if there are changes

while True:
    try:
        # pull and merge remote changes first
        repo.remotes.origin.pull()
        # modified tracked files plus untracked files
        changed_files = [d.a_path for d in repo.index.diff(None)] + repo.untracked_files
        print(str(datetime.datetime.now()) + " ::::: " + str(changed_files))
        if len(changed_files) > 0:
            repo.git.add(all=True)
            repo.index.commit('zettelkasten auto commit: ' + str(changed_files) + ' ' + str(uuid.uuid1()))
            repo.remotes.origin.push()
        time.sleep(update_period)
    except Exception as e:
        print("There was an exception: " + str(e))
        time.sleep(update_period)

Because a personal wiki will probably contain sensitive data at some point, it should probably also have an encryption mechanism. Tokens or ssh keys should never go into a wiki in my opinion, but some data should probably still be encrypted for safety. Here I do not have a good solution yet. Some tools such as Trilium allow encryption, which I will research when deciding on one. However, I would still want a tool-agnostic setup, so maybe I will keep a separate encrypted part just to be safe. This is something I will have a better look at this year.

All in all, this is a low-effort solution that fits my use case perfectly, and my wiki has come in handy very frequently over the past years. It has also become a habit to extensively document and manage any kind of information I come across, which I believe is a very good habit to have.