Atoms

Bite-sized notes and articles that don't fit elsewhere. Also available as an RSS feed.

#

Keeping packages up to date

brew upgrade is a convenient way to update your packages to their latest versions. The issue is that, sooner or later, not all of your packages will be installed through brew.

To keep my system up to date I am using a tiny fish function that updates binaries installed through my most commonly used package managers, tools, and languages:

function update --description "Update binaries"
    brew update; and brew upgrade  # Homebrew formulae and casks
    fisher update                  # fish plugins
    gup update                     # binaries installed with go install
    npm update -g                  # global npm packages
    rustup update                  # Rust toolchains
    cargo install-update -a       # binaries installed with cargo install
end

Every now and then I run update and bring the whole system to the latest versions of my tools. I tried topgrade before, but it’s heavyweight for my use case and often failed for reasons unclear to me. The update function, on the other hand, is less than 10 lines long and makes it easy to see which commands run under the hood in case anything fails.

Published on
#

Reading the Azerothcore codebase

While on sabbatical in New Zealand I have been reading about game development and nostalgically remembering the years when video games took up most of my free time. My fondest memories are of World of Warcraft and the endless hours of leveling, raiding, and completing daily quests. A couple of years later I am completing the journey from the other side, reading through the codebase of Azerothcore, an open-source WoW server for the Wrath of the Lich King expansion, which is at this point 16 years old.

Getting a glimpse of the machinery behind one of the greatest MMOs of all time is a surprisingly fun endeavor. The codebase is exceptionally well organized and easy to navigate even for a complete novice. For example, many of the game entities are implemented as standalone scripts that hook into the main event loop, which keeps the core classes clean while providing the customization needed for a rich game world.

I have no takeaway to wrap this up with, except that writing game servers, after all, doesn’t seem that different from building large monolithic APIs. Clear naming and good abstractions go a long way.

Published on
#

Pictures from New Zealand

Look to my coming at first light on the 5th day. At dawn, look to the East.

Peter Jackson’s choice of New Zealand as the filming location for the original Lord of the Rings trilogy would have been the right one even if it weren’t his homeland. New Zealand’s nature is majestic in every aspect, and the only thing I was missing was Howard Shore’s The Ride of the Rohirrim cueing in whenever I entered a coffee shop.

Verify my claim in the Photos section.

Published on
#

Photos

I added a Photos section to the website, intending to make it a curated gallery of all the photos I keep collecting and sporadically sharing on Instagram, arguably one of the worst platforms for actually sharing photography.

Initially, I wanted to leverage the same setup I already use for other images on this website - Git LFS. This proved to be problematic though, as fetching the repository results in all pictures being downloaded as well. Furthermore, GitHub restricts LFS storage to 2 GB for free accounts, a limit I am afraid I would hit quite early.

After some research I opted for storing all the photos in S3-compatible object storage (Cloudflare R2, to be precise) under my own domain. The process of adding a picture has a few steps:

Pick a picture and a name. Picking a name proved to be harder than I expected; I would like a consistent naming scheme, but… what? I tried using location names, but my camera doesn’t add longitude and latitude to the picture unless it is connected to my phone, which it rarely is, and figuring out the locations by hand is a tedious task.

Using the activity or subject of the photo seemed like a good approach, but I need to generate unique slugs for the file names. One could use an incrementing counter (e.g. cycling_0005), the only issue being that I am not adding the photos in order, at least for now while backfilling.

In the end I decided to give up on consistency for now and see if I find a better approach in the future.

The next step is to resize the picture and generate thumbnails. This involves using ImageMagick to produce 3 different sizes (512, 1024, and 2048 px) in 2 different formats (jpg and webp).
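In shell terms, and using the rocks_bosnia slug from the metadata example further down, this step boils down to something like the following (my actual tool may use different flags and quality settings):

slug=rocks_bosnia
for size in 512 1024 2048; do
    # ImageMagick 7's magick (convert on older versions); '>' only ever shrinks,
    # and the output format is inferred from the file extension
    magick "$slug.jpg" -resize "${size}x${size}>" "${slug}_${size}.jpg"
    magick "$slug.jpg" -resize "${size}x${size}>" "${slug}_${size}.webp"
done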

Upload images to R2 using rclone.
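Assuming an rclone remote named r2 pointing at the bucket and the resized files sitting in a local ./resized directory (all of these names are placeholders), that’s a single command:

rclone copy ./resized r2:photos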

Last but not least, I store the image metadata in the matoous/dzxcz repository (the source of this website). The metadata contains the image slug, the date the photo was taken, the names of the original file and the thumbnails, plus all available EXIF metadata as recorded by the camera, e.g.:

---
title: Rocks (Bosnia)
date: '2023-10-02 12:53:27'
params:
  iso: '200'
  focal_length: '26'
  f_number: '5.6'
  ev: '1'
  exposure: '1/500'
  model: RICOH GR IIIx
  make: RICOH IMAGING COMPANY, LTD.
  srcset:
    jpg512: rocks_bosnia_512.jpg
    jpg1024: rocks_bosnia_1024.jpg
    jpg2048: rocks_bosnia_2048.jpg
    webp512: rocks_bosnia_512.webp
    webp1024: rocks_bosnia_1024.webp
    webp2048: rocks_bosnia_2048.webp
    original: rocks_bosnia.jpg
---

To make all of this easier I wrote a little CLI tool that wraps all of these steps in a single command:

dzx --name "Valencia 2023 - Sunset (Spain)" "/Volumes/LaCie 01/2023-12 Valencia/Edited/R0001551.jpg"

The Photos archive is displayed using a CSS grid with masonry layout, a new experimental CSS feature that is currently not supported by default in almost any browser. If you want to test it out and have the page display properly, you should be able to enable masonry layout in your browser; in Firefox, for example, under about:config via the layout.css.grid-template-masonry-value.enabled flag.

Published on
#

Typst

This weekend I have been updating and rewriting my CV, in Typst. The official description of Typst is:

A new markup-based typesetting system that is powerful and easy to learn.

which is true and maybe even an understatement. Typst is easy to learn, so much so that after understanding a few basic principles one can guess how the rest works (I am exaggerating here, but the language is indeed neatly designed and structured). It’s LaTeX for human beings. Typst supports all the important things out of the box: tables, bibliography, figures, you name it. Furthermore, there’s the Typst Universe with an ever-growing number of packages; to name a few, there’s an IEEE-style paper template, codly for code snippets, drafting for comments and margin notes, and much more, all easily searchable.
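The tooling is just as simple. Assuming the typst CLI is installed (it’s a single binary, available e.g. through Homebrew) and a hypothetical cv.typ as the source file:

typst compile cv.typ   # renders cv.pdf next to the source
typst watch cv.typ     # recompiles on every change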

I wish Typst had existed back in my university years. LaTeX was a love-hate relationship, especially when it came to installing packages and running everything locally.

Published on
#

Race Nutrition

Ultra-trail Snowdonia is just around the corner, with less than 8 days to go. Learning from my past mistakes, I’ve done some research on race nutrition, documenting everything in the Wiki. Race nutrition boils down to the same best practices one would apply during training - do what you have tried and tested and what works for you. The same way you wouldn’t line up for a 100 km race in new shoes, you don’t want to change your diet the week before the race. A few notes that summarize what I’ve learned:

  • Eat what you are used to and don’t change that too much before the race
  • 2-3 days before the start, cut out fiber. Fiber slows down digestion, which is desirable in a regular regime, but during the race you don’t want to feel full and what you eat needs to be absorbed fast.
  • Avoid fats; while they are a great source of energy, they take a long time to process.
  • The day before the race, don’t overeat. You want to feel slightly hungry and start the race with a relatively empty stomach.
  • During the race, the rule of thumb is to eat something small every 45 minutes to 1 hour. Your body can absorb 150-300 kcal per hour, which is far below the 600-1000 kcal you will be burning (for reference, one 40 g gel is around 100 kcal, so the absorbable intake tops out at roughly 2-3 gels per hour). A deficit is unavoidable, so don’t try to stuff yourself and upset your stomach.
  • In long races you want some protein intake as well. The recommendation is 2.5 g to 10 g per hour, which helps counter the onset of muscle breakdown.

Theory sorted; now let’s see if I can put this into practice and finally have a race where stomach issues won’t be the limiting factor for performance.

Published on
#

From logs to traces

At SumUp we rely heavily on OpenTelemetry tracing and Honeycomb for observability. Traces are, in my opinion, a far superior tool for debugging and analysis than logs. Yet there’s a time and place for everything, and sometimes you need to log (e.g. gracefully ignored errors). In such cases it’s helpful to be able to move between logs and traces. With slog (and probably any other structured logging library that supports context.Context) this becomes a matter of 20 lines of code:

import (
	"context"
	"log/slog"

	"go.opentelemetry.io/otel/trace"
)

type otelLogHandler struct {
	slog.Handler
}

// WithOTEL wraps the slog handler with OTEL handler that extracts and populates trace and span information
// on the log.
func WithOTEL(h slog.Handler) slog.Handler {
	return &otelLogHandler{h}
}

func (h *otelLogHandler) Handle(ctx context.Context, r slog.Record) error {
	spanCtx := trace.SpanContextFromContext(ctx)

	if spanCtx.IsValid() {
		r.AddAttrs(
			slog.String("trace.trace_id", spanCtx.TraceID().String()),
			slog.String("trace.span_id", spanCtx.SpanID().String()),
		)
	}

	return h.Handler.Handle(ctx, r)
}

Then, when initializing your logger, wrap the handler:

logger := slog.New(WithOTEL(slog.NewJSONHandler(os.Stdout, &slog.HandlerOptions{
	AddSource: true,
	Level:     logLevel,
})))

and you are good to go, as long as you use slog.ErrorContext (and its equivalents for the other verbosity levels) so that the context carrying the span information is passed to the handler.

Published on
#

Wiki redesign

The Wiki now has a fresh new look. I migrated the whole website to a custom framework - mwp - to be able to customize it as much as I please. mwp is built in Rust on top of actix (web framework), maud (a macro for writing HTML), and tantivy (search). The whole server is then packaged into a Docker container and deployed on fly.io.

The main goal of the rewrite was to add search functionality that allows me (and anyone else) to search all links in the Wiki in a way that takes the content of the linked websites into consideration. That’s done by scraping the links, putting the content into an SQLite database that lives alongside the repository, and indexing everything with tantivy at startup. There’s plenty of room for improvement; at the moment the scraping needs to be triggered manually, and while the index building is fast, it unnecessarily delays the application startup, which is annoying during local development.

Published on
#

Running Zephyr on ESP32

Two weeks ago I started toying around with a development version of the SumUp Solo card reader and learning more about embedded development and hardware in general. Recently we launched a new reader, Solo Lite, which runs on Zephyr and seemed like an easier entry point. I don’t have a development Solo Lite at hand, but I found a couple of unused ESP32 microcontrollers at home. The task for the weekend was simple: get Zephyr up and running on an ESP32.

First, I followed the official Getting started guide. I only diverged in the dependency installation step, using Nix instead of Homebrew. In my home-manager config I added the necessary packages:

{ inputs
, outputs
, lib
, config
, pkgs
, unstable
, bleeding-edge
, ...
}:
let
  username = "matousdzivjak";
  homeDir = "/Users/${username}";
in
{
  ...
  home.packages = [
    pkgs.minicom # Modem control and terminal emulation program
    pkgs.wget
    pkgs.python3
    pkgs.ninja # Small build system with a focus on speed
    pkgs.gperf
    pkgs.ccache # Compiler cache
    pkgs.dtc # Device Tree Compiler
  ];
}

(I switched to nix + home-manager quite recently. If you are interested in my whole config, check out github.com/matoous/nix-home.)

Next, I fetched the binary blobs needed for the ESP32:

west blobs fetch hal_espressif

With that, the development setup is done and all that’s left is getting something running on the ESP32, and what’s easier than a hello world? Zephyr comes with a hello world sample application, so we don’t even have to write anything ourselves. The sample app can be built using:

west build -p always -b esp32_devkitc_wroom zephyr/samples/hello_world

This results in output ending with:

Generating files from /Users/matousdzivjak/code/github.com/matoous/esp32-zephyr/build/zephyr/zephyr.elf for board: esp32_devkitc_wroom
esptool.py v4.5
Creating esp32 image...
Merged 5 ELF sections
Successfully created esp32 image.

I then flashed the ESP32, which was as simple as connecting it to the laptop with a USB-C cable and running:

west flash

The ESP32 blinks a few times and the flashing is done in a couple of seconds. Theoretically, we now have the hello world sample app up and running, but better to confirm. For that, I connected to the ESP32 over the serial port to check the logs. First, minicom needed tweaking, as the last device I worked with had a different setup:

sudo minicom -s

This pops up the configuration menu:

+-----[configuration]------+
| Filenames and paths      |
| File transfer protocols  |
| Serial port setup        |
| Modem and dialing        |
| Screen                   |
| Keyboard and Misc        |
| Save setup as dfl        |
| Save setup as..          |
| Exit                     |
| Exit from Minicom        |
+--------------------------+

Here one needs to configure the Serial port setup:

+-----------------------------------------------------------------------+
| A -    Serial Device      : /dev/modem                                |
| B - Lockfile Location     : /var/lock                                 |
| C -   Callin Program      :                                           |
| D -  Callout Program      :                                           |
| E -    Bps/Par/Bits       : 115200 8N1                                |
| F - Hardware Flow Control : No                                        |
| G - Software Flow Control : No                                        |
| H -     RS485 Enable      : No                                        |
| I -   RS485 Rts On Send   : No                                        |
| J -  RS485 Rts After Send : No                                        |
| K -  RS485 Rx During Tx   : No                                        |
| L -  RS485 Terminate Bus  : No                                        |
| M - RS485 Delay Rts Before: 0                                         |
| N - RS485 Delay Rts After : 0                                         |
|                                                                       |
|    Change which setting?                                              |
+-----------------------------------------------------------------------+

First, press A to modify the Serial Device. We need to specify the tty of the connected ESP32; for me this was /dev/tty.usbserial-0001. This might differ for others, so to quickly check which serial device to use, one can run ls /dev | grep 'tty.*usb'; unless there are multiple serial devices connected via USB, there should be exactly one result.

Once that’s done, Exit, and here it is:

Welcome to minicom 2.9

OPTIONS: I18n
Compiled on Jan  1 1980, 00:00:00.
Port /dev/tty.usbserial-0001, 22:07:30

Press CTRL-A Z for help on special keys

0020 vaddr=00000020 size=0001ch (    28)
I (113) esp_image: segment 1: paddr=00010044 vaddr=3ffb0000 size=00104h (   260) load
I (122) esp_image: segment 2: paddr=00010150 vaddr=3�NG early entropy source...
*** Booting Zephyr OS build v3.6.0-1966-g3f218c6cdae0 ***
Hello World! esp32_devkitc_wroom/esp32/procpu  

A beautiful, rewarding, and underwhelming Hello World!.

Next I think I will try to run Hello World! with Hubris, which upon initial investigation seems to be way more complex, as the ESP32 board isn’t supported out of the box. Alternatively, I might try to get Zephyr up and running on a Solo instead. Either way, enough for one evening and more to come.

Published on
#

Notes on Nix

Over the weekend I switched my macOS setup in large part from Homebrew to Nix. You can find the new home at github.com/matoous/nix-home. To be frank, I know little about Nix, which is famous for its steep learning curve, so I ended up copying and stitching together code from Google and people I follow.

Nix allows multiple versions of the same binary to be installed at the same time, which helps avoid collisions and allows different tools to be updated independently. One can do something similar with Homebrew, e.g. installing Qt version 5 using brew install qt@5, but to my understanding this is much more limited because of the dependency chain.

  • To go over historical versions of a specific package, see nix-versions.
  • Install the package using nix-env -iA, e.g. nix-env -iA nodejs_20 -f https://github.com/NixOS/nixpkgs/archive/9957cd48326fe8dbd52fdc50dd2502307f188b0d.tar.gz

Previously, Nix used multiple different commands for different things: nix-env for installing packages into the environment, nix-shell to start a Nix shell, etc. Nowadays, all of these are available under the unified nix command, which you need to enable via ~/.config/nix/nix.conf:

experimental-features = nix-command flakes
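For illustration, a few of the old commands and their rough new-style equivalents (the package names are just examples):

nix-env -iA nixpkgs.ripgrep          # old: install into the user environment
nix profile install nixpkgs#ripgrep  # new

nix-shell -p python3                 # old: ad-hoc shell with extra packages
nix shell nixpkgs#python3            # new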

There are a few promising things about Nix that I want to explore:

  • With Nix it is easy to provision a developer environment per repository. One can do so by adding a flake that configures all the tools required for development (see the sketch after this list).
  • There’s NixOS, which allows the whole OS to be declaratively configured using Nix.
  • You can build packages with Nix, which builds them in isolation using pinned versions of dependencies, ensuring a consistent result.
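For the per-repository environments, the workflow I have in mind is roughly the following, assuming the repository ships a flake.nix that defines a devShell:

cd ~/code/some-project   # hypothetical repository with a flake.nix
nix develop              # drop into a shell with the tools the flake declares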

That’s it for now; I will keep toying around with Nix and see what more there is to it. For the time being, I manage my dotfiles and tools using home-manager and have started experimenting with using Nix to cross-compile software for the Solo.

Published on
#

CRDTs

In a recent side-project attempt to build an RFC management tool, I ventured into the topic of CRDTs (Conflict-free Replicated Data Types) and web-based text editors. Here is a loose collection of links to various tools and articles on the topic:

Implementation

Editors

Published on
#

Inception

This is the inception of Atoms - a short-form stream of notes and thoughts inspired by Tom MacWright’s Micro and Brandur Leach’s Atoms and Fragments. Atoms will fill the void between notes in the Wiki (currently undergoing a rewrite) and longer posts.

You can subscribe to atoms using the RSS Feed.

Published on