Experimenting with SQLite for FrontierNav
SQLite was in the news a bit this week, and it reminded me of an experiment I planned years ago: migrating FrontierNav to it.
Currently FrontierNav's data is one big in-memory JSON blob. To modify it, I use actions and reducers which I wrote myself. As mentioned in the previous FrontierNav Weekly, having to re-implement database commands is a bit tedious and I'd rather use an existing solution if one exists. One option is a WASM-compiled SQLite build operating on an in-memory SQLite database.
Not a Graph Database
While FrontierNav uses a graph model, its storage format is pretty relational. Entities live in one table, and relationships in another. There are separate lists for indexes. The querying is where the "graph" aspect comes in, as it's recursive. SQL can express recursive queries through recursive common table expressions (WITH RECURSIVE), which SQLite supports, though dedicated graph databases like Neo4j make that kind of traversal far more ergonomic. Either way, I can't find a good standard graph database that works in-browser, so I don't have much choice there.
It's pretty trivial to write a set of recursive functions to traverse an in-memory JSON graph. Adding SQL to that just replaces some function calls with database queries. Those queries introduce a performance hit. It's hard to tell how big that hit is, which is why I wanted to experiment.
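As a sketch of what the traversal could look like on the SQL side, here's a recursive common table expression over an entities/relationships pair of tables. The schema and column names are my assumptions for illustration, not FrontierNav's actual schema, and I'm using Python's built-in sqlite3 as a stand-in for the WASM build:

```python
import sqlite3

# In-memory database shaped like the storage format described above:
# entities in one table, relationships in another. Names are hypothetical.
db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE entities (id TEXT PRIMARY KEY, name TEXT);
    CREATE TABLE relationships (source TEXT, target TEXT);
    INSERT INTO entities VALUES ('a', 'Root'), ('b', 'Child'), ('c', 'Grandchild');
    INSERT INTO relationships VALUES ('a', 'b'), ('b', 'c');
""")

# Recursive CTE: collect every entity reachable from 'a', however deep,
# in a single query instead of a set of recursive function calls.
rows = db.execute("""
    WITH RECURSIVE reachable(id) AS (
        SELECT 'a'
        UNION
        SELECT r.target
        FROM relationships r
        JOIN reachable ON r.source = reachable.id
    )
    SELECT e.id, e.name
    FROM entities e
    JOIN reachable USING (id)
    ORDER BY e.id
""").fetchall()

print(rows)  # [('a', 'Root'), ('b', 'Child'), ('c', 'Grandchild')]
```

The `UNION` (rather than `UNION ALL`) also deduplicates visited nodes, which guards against cycles in the graph.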
Benefits of SQLite
The benefit of using SQLite is mainly around having a standard database that can be easily duplicated and modified locally. It would allow other apps to work with the data without having to use FrontierNav's custom APIs. Hopefully, it would also reduce memory usage as I'd assume it's more optimised than an ad-hoc JSON solution.
The other major benefit is disk access. When I eventually work on a desktop app for FrontierNav, it would be nice if everything wasn't in-memory. Memory isn't infinite. SQLite would pretty much give that for free: it reads the database file in fixed-size pages on demand and keeps a bounded page cache in memory, rather than loading the entire database up front like I'd need to for a JSON solution.
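A minimal sketch of that difference, again using Python's sqlite3 as a stand-in for the desktop case; the filename and schema are hypothetical:

```python
import os
import sqlite3
import tempfile

# Create a file-backed database. With a JSON file, the whole thing would
# have to be parsed into memory before any lookup is possible.
path = os.path.join(tempfile.mkdtemp(), "frontiernav.db")  # hypothetical name
with sqlite3.connect(path) as db:
    db.execute("CREATE TABLE entities (id TEXT PRIMARY KEY, name TEXT)")
    db.executemany("INSERT INTO entities VALUES (?, ?)",
                   [(str(i), f"Entity {i}") for i in range(10_000)])

# Re-opening the file does not pull all 10,000 rows into memory: SQLite
# reads only the pages it needs, here via the primary-key index.
db = sqlite3.connect(path)
row = db.execute("SELECT name FROM entities WHERE id = ?", ("4242",)).fetchone()
print(row)  # ('Entity 4242',)
```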
So, the first step was to get the data into an SQLite database and see how large it is. If it's smaller, I have more incentive to switch to an in-memory SQLite database. I used Xenoblade 2's data as it's the largest, and I got... unimpressive results.
A 21.6MB JSON database is around 18.3MB in SQLite. However, GZIP plays an important role since most of the data is repetitive. After compression, the JSON database is 1.9MB and SQLite is 2.0MB.
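The comparison itself is easy to reproduce on synthetic data. This sketch builds some repetitive records (many rows sharing the same field names, as in game data), stores them as JSON and as SQLite, and measures both raw and gzipped sizes; the schema and data are made up:

```python
import gzip
import json
import os
import sqlite3
import tempfile

# Repetitive records: field names repeat in JSON, which gzip exploits.
records = [{"id": i, "type": "Collectible", "map": "Gormott"}
           for i in range(5000)]

# JSON representation, as bytes.
json_bytes = json.dumps(records).encode()

# Equivalent SQLite database, written to disk and read back as bytes.
path = os.path.join(tempfile.mkdtemp(), "data.db")
db = sqlite3.connect(path)
db.execute("CREATE TABLE entities (id INTEGER PRIMARY KEY, type TEXT, map TEXT)")
db.executemany("INSERT INTO entities VALUES (?, ?, ?)",
               [(r["id"], r["type"], r["map"]) for r in records])
db.commit()
db.close()
with open(path, "rb") as f:
    sqlite_bytes = f.read()

# Compare raw and compressed sizes of the two representations.
for label, raw in (("json", json_bytes), ("sqlite", sqlite_bytes)):
    print(label, len(raw), len(gzip.compress(raw)))
```

As in the Xenoblade 2 measurement, the gap between the two formats mostly disappears once gzip removes the repetition.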
Next I'll be testing querying performance. This will require a lot of refactoring so that I can easily swap the JSON and SQLite querying interfaces, something that's useful regardless of this experiment. That refactoring will be done gradually while I work on other features since it touches a lot of places.
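The swappable-interface idea could look something like this: a shared querying interface with one implementation over the JSON blob and one over SQLite, so benchmarks (and eventually the app) can switch backends without touching callers. Everything here (the `EntityStore` protocol, the `get_entity` method, the schema) is a hypothetical sketch, not FrontierNav's actual API:

```python
import sqlite3
from typing import Optional, Protocol


class EntityStore(Protocol):
    """Shared querying interface, so backends can be swapped freely."""
    def get_entity(self, entity_id: str) -> Optional[dict]: ...


class JsonStore:
    """Backed by the current in-memory JSON blob."""
    def __init__(self, blob: dict):
        self.blob = blob

    def get_entity(self, entity_id: str) -> Optional[dict]:
        return self.blob["entities"].get(entity_id)


class SqliteStore:
    """Backed by an (in-memory or file-backed) SQLite database."""
    def __init__(self, db: sqlite3.Connection):
        self.db = db

    def get_entity(self, entity_id: str) -> Optional[dict]:
        row = self.db.execute(
            "SELECT id, name FROM entities WHERE id = ?", (entity_id,)
        ).fetchone()
        return {"id": row[0], "name": row[1]} if row else None


# Both backends answer the same call, so callers never know which is active.
json_store = JsonStore({"entities": {"a": {"id": "a", "name": "Root"}}})

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE entities (id TEXT PRIMARY KEY, name TEXT)")
db.execute("INSERT INTO entities VALUES ('a', 'Root')")
sqlite_store = SqliteStore(db)

for store in (json_store, sqlite_store):
    print(store.get_entity("a"))  # {'id': 'a', 'name': 'Root'}
```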
After that test, I can start working on a desktop client for FrontierNav and test disk-access performance.
Thanks for reading.