Basti's Scratchpad on the Internet
29 Mar 2024

Books of 2023

Even though I did read some fiction last year, none of it really stuck with me. It appears that I am more interested in non-fiction these days. Strange how these things go.

Quest for Performance

book cover for Quest for Performance

Quest for Performance: The Evolution of Modern Aircraft, by Laurence K. Loftin

I have searched for a book like this for a long time: a history of airplane technology. The book details technological milestones and archetypes from the Wright flyer to the mid-1980s, with an emphasis on the two world wars and interwar years. It sometimes veers too close to a mere list of models and performances, but by and large still manages to tie it all into a comprehensible narrative. I guess you need to be a bit of an airplane nerd to appreciate this, but I found it fascinating!

And it is free to download, too.

The Soul of a New Machine

book cover for Soul of a New Machine

The Soul of a New Machine, by Tracy Kidder

The book retells the development of a computer during the interstitial years, after the big bang of computing in the first half of the century, but before the home computer revolution. This is a bit of a gap in the common computing lore, and one I hadn't known much about.

This happened before standardized CPU architectures, so we get a glimpse into CPU hardware design, the user-land software side of things, and the micro-code in between. This is quite an unusual perspective today, reliant on common abstractions as we are.

A fascinating read if you're interested in computing history, without requiring a Computer Science degree for the broader story.

Die großen Zeppeline

book cover for Die Großen Zeppeline

Die großen Zeppeline: Die Geschichte des Luftschiffbaus, by Peter Kleinheins

Half the book is reprints of technical reports of the original lead engineers who worked on the German Zeppelins. The other half is a retrospective view of Zeppelins in Germany and elsewhere.

There are myriad fascinating details about Zeppelin construction, like how their gas bags were made from animal intestines, or how they reclaimed water from engine exhaust to prevent losing weight while burning fuel. And it's especially fascinating to read about these things from people to whom this was the pinnacle of technology, and juxtapose our modern perspective.

This is another book I've been searching for many years. I found both this and Quest for Performance on Library Genesis, which is a terrific resource for researching books.

Tags: books

🪦 Emacs 2011-2023

For the last dozen years, I used Emacs as my text editor and development environment. But that era has now ended. In this post, I outline how I went from using Emacs as a cornerstone of my digital life to abandoning it.

In an ironic twist of history, it was Visual Studio that drove me to Emacs in the first place, and Visual Studio that ultimately pulled me away from it: In 2011, I was working on the firmware of a digital mixing console. The code was edited in Visual Studio, compiled with a proprietary embedded compiler, and source-controlled with command-line Git. It was ultimately Emacs that allowed me to tie this hodgepodge of idiosyncratic C++1, Git, and the proprietary compiler into a somewhat sane development environment.

Over the years, my Emacs config grew, I learned Elisp, I published my own Emacs packages, and developed my own Emacs theme. I went back to university, did my PhD, worked on both OSS and commercial projects, and almost all of this was done in Emacs. As particular standouts beyond traditional text editing, I used Emacs' Git client Magit every single day, and my own org-journal was absolutely vital as my research/work journal.

My monochrome Emacs theme
My custom Emacs theme, all monochrome, with varying fonts instead of colors

In 2023, however, I started a new job, once again with a Visual Studio codebase. This time, the code base and build system were tightly woven into the Visual Studio IDE, and only really navigable and editable therein. It thus made no sense to edit this code in Emacs, so I didn't. Perhaps I also needed a break.

And as my Emacs usage waned, its ancient keyboard shortcuts started to become a liability. I started mis-typing Emacs shortcuts in Visual Studio, and hitting Windows shortcuts in Emacs. Friction began to arise. At the same time, I started noticing how poorly Emacs runs on Windows. Startup takes many seconds, it does not integrate well into the task bar2, it doesn't handle resolution changes gracefully, and it's best I don't start talking about its horrendously broken mouse scrolling. And of course it can't scroll point out of the window3.

My last use case for Emacs was org-journal. I ended up porting a basic version of it to Visual Studio Code. Having thus written a text editor plugin for both editors, I have to be blunt: both the anachronistic bone-headedness of Elisp and the utter insanity of TypeScript's Node APIs make for terrible plugin-writing environments. A few years ago I did the same exercise in Sublime Text's Python API, which was a beautiful, simple, quick affair. But I do enjoy a programming puzzle, so here we are.

The final nail in Emacs' coffin came from an unexpected corner: For all my professional life, I had been a solo coder. My Emacs was proudly black-and-white (different fonts instead of different colors!), and my keyboard shortcuts were idiosyncratically my own. I did not merely use Emacs. I had built MY OWN Emacs. I like to think this built character, and API design experience. But it was of course a complete non-starter for pair programming. After having tasted Visual Studio (± Code) Live Share, there was simply no going back.

And thus, I am saddened to see that I haven't started Emacs in several weeks. I guess this is goodbye. This blog is still rendered by Emacs, and I still maintain various Emacs modules. My journal is still written in org-mode. But it is now edited in Visual Studio Code.

Footnotes:

1

An eclectic subset of C++, intersected with the limitations of the embedded compiler. This was decidedly pre-"modern" C++, and probably less than the sum of its parts.

2

Usually, the program's taskbar button starts the program, and represents it while running. Emacs spawns a new button instead.

3

This is Emacs-speak for "it can't scroll the cursor outside the viewport".

Tags: emacs

Two Years with Legacy Code

From January 2021 to the beginning of 2023, I worked on a legacy code base at Fraunhofer IDMT in Oldenburg. My task was the maintenance and development of a DNN-based speech recognition engine that had become terra incognita when its original developer had left the company a year before I started. The code had all the hallmarks of severe technical debt, with layers of half-used abstractions, many unused branches of unknown utility, and the handwriting of several concurrent programmers at odds with each other.

The code had evidently been written in a mad dash to bring the product to market. And, not to discredit its developers: it had been in production for several years, with a core of robust algorithms surrounded by helper scripts that had allowed the company to keep building on it, even after the original developers had left.

It was my job to clean it up. Having just spent six years on my PhD, I welcomed the calmer waters of 'just' programming for a bit. This blog post is a summary of the sorts of challenges I faced during this time, and the kinds of techniques that helped me overcome them.

The lay of the land

I approached the task from the outside, sorting through the build scripts first. Evidently, at least three authors were involved: one old-school Unix geek who wrote an outdated dialect of CMake, one high-level Python scripter, and one shell scripter who deeply believed in abstraction-by-information-hiding. The result of this was… interesting.

For a good few weeks I "disassembled" these scripts by tracing their execution manually through their many layers, and writing down the necessary steps that were actually executed. My favorite piece of code was a Makefile that called a shell script that ran a Python program, which instantiated a few classes and data structures, which ultimately executed "configure; make; make install" on another underlying Makefile. I derived great satisfaction from cutting out all of these middle-men, and consolidating several directories of scripts into a single Makefile.

Similar simplifications were implemented at the same time across several code bases by my colleagues. In due time, this concerted effort enabled us to implement continuous integration, automated benchmarking, and automated builds, but more on that later.

Data refactoring

The speech recognition software implemented a sort of interpreter for the DNN layers, originally encoded as a custom binary blob. Apparently, a custom binary approach had been taken to avoid dependencies on external parsing libraries. Yet the data had become so convoluted that both its compilation and its parsing were now considered unchangeable black boxes that impeded further development.

Again, I traced through the execution of the compiling code, noted down the pieces of data it recorded, and rewrote the compiler to produce a MsgPack file. On the parsing side, I wrote a custom MsgPack parser in C. Looking back, every job I've had involved writing at least a couple of data dumpers/parsers, yet many developers seem intimidated by such tasks. But why write such a thing yourself instead of using an off-the-shelf solution? In an unrelated code review later in the year, one colleague used the cJSON library for parsing JSON; in the event, cJSON was several orders of magnitude bigger and more complex than the code base it was serving, which is clearly absurd. Our job as developers is to manage complexity, including that of our dependencies. In cases such as these, I often find a simple, fit-for-purpose solution preferable to more generalized external libraries.
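
To give a flavor of what "fit-for-purpose" means here, below is a minimal sketch of a MsgPack-subset reader in C. It is an illustration under my own assumptions, not the actual Fraunhofer code: it deliberately understands only a handful of MsgPack types such a blob might use (fixarray, array32, positive fixint, uint32) and rejects everything else.

#include <stddef.h>
#include <stdint.h>
#include <stdio.h>

/* Cursor over an in-memory MsgPack buffer. */
typedef struct {
    const uint8_t *data;
    size_t len;
    size_t pos;
} mp_reader;

static int mp_read_byte(mp_reader *r, uint8_t *out) {
    if (r->pos >= r->len) return -1;
    *out = r->data[r->pos++];
    return 0;
}

/* MsgPack multi-byte integers are big-endian. */
static int mp_read_be32(mp_reader *r, uint32_t *out) {
    if (r->len - r->pos < 4) return -1;
    *out = ((uint32_t)r->data[r->pos]     << 24) |
           ((uint32_t)r->data[r->pos + 1] << 16) |
           ((uint32_t)r->data[r->pos + 2] <<  8) |
            (uint32_t)r->data[r->pos + 3];
    r->pos += 4;
    return 0;
}

/* Array header: fixarray (0x90..0x9f) or array32 (0xdd). */
static int mp_read_array(mp_reader *r, uint32_t *count) {
    uint8_t tag;
    if (mp_read_byte(r, &tag)) return -1;
    if ((tag & 0xf0) == 0x90) { *count = tag & 0x0f; return 0; }
    if (tag == 0xdd) return mp_read_be32(r, count);
    return -1;   /* anything else is deliberately unsupported */
}

/* Unsigned integer: positive fixint (0x00..0x7f) or uint32 (0xce). */
static int mp_read_uint(mp_reader *r, uint32_t *out) {
    uint8_t tag;
    if (mp_read_byte(r, &tag)) return -1;
    if (tag <= 0x7f) { *out = tag; return 0; }
    if (tag == 0xce) return mp_read_be32(r, out);
    return -1;
}

int main(void) {
    /* fixarray of two elements: 1 and 256 */
    const uint8_t blob[] = { 0x92, 0x01, 0xce, 0x00, 0x00, 0x01, 0x00 };
    mp_reader r = { blob, sizeof blob, 0 };
    uint32_t n, a, b;
    if (mp_read_array(&r, &n) || n != 2 ||
        mp_read_uint(&r, &a) || mp_read_uint(&r, &b))
        return 1;
    printf("array of %u: %u, %u\n", (unsigned)n, (unsigned)a, (unsigned)b);
    return 0;
}

The point of such a sketch is exactly the argument above: a page of C that does precisely what the data needs tends to be easier to own than a general-purpose dependency.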

A part of the DNN data came from the output of a training program. This output, however, was eternally unstable, often breaking unpredictably between versions and requiring complex workarounds to accommodate different versions of the program. The previous solution to this was a deeply nested decision tree for the various permutations the data could take. I simplified this code tremendously by calling directly into the other program's libraries, instead of trying to make sense of its output. This is another technique I had to rely on several times, hooking into C/C++ libraries from various Python scripts to bridge between data in a polyglot environment.

Doing these deep dives into data structures often revealed unintended entanglements: in order to assemble one data structure, you had to grab pieces from multiple different data sources. Interestingly, once data structures were cleaned up to no longer have such entanglements, algorithms seemed to fall into place effortlessly. However, this was not a one-step process, but instead an ongoing struggle to keep data structures minimal and orthogonal. While algorithms and functions often feel easier to refactor than data structures, I have learned from this that it is often the changes to data structures that have the greatest effect, and should therefore receive the greatest scrutiny.

Code refactoring

My predecessor had left me a few screencasts by way of documentation. While the core program was reasonably well-structured, it was embedded in an architectural curiosity that told the tale of a frustrated high-level programmer forced to do low-level gruntwork. There were poor-man's classes implemented as C structs with function pointers, there were do-while-with-goto loops for exception handling, there were sort-of-dynamically-typed data containers, accompanied by angry comments decrying the stupidity of C.

Now I like my high-level programming as much as the next guy, but forcing C to be something it isn't is not my idea of fun. So over a few months I slowly removed most of these abstractions. Somewhat to my surprise, most of them turned out to be pure overhead that could simply be removed. Where a replacement was needed, I reverted to native C constructs: tagged unions instead of casting, variable-length arrays instead of dynamic arrays, structs treated as values instead of references. This alone reduced the entire code base by a good 10%. The harder part was sorting out the jumble of headers and dependencies that had evidently built up over time. Together with the removal of dead code paths, the overall code base shrank by almost half. There are few things more satisfying than excising and deleting unnecessary code.
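
To illustrate the tagged-union replacement (a sketch with made-up names, not the project's actual code): where the old containers passed void pointers around and cast them at every use site, a native tagged union keeps the tag and the storage together, and the compiler knows every representable case. Passing such structs by value then falls out naturally.

#include <stdio.h>

typedef enum { VAL_INT, VAL_FLOAT, VAL_STRING } value_kind;

/* A native-C "dynamically typed" value: a tag plus a union. */
typedef struct {
    value_kind kind;
    union {
        int         i;
        float       f;
        const char *s;
    } as;
} value;

/* Structs passed as values, not as pointers into heap allocations. */
static void value_print(value v) {
    switch (v.kind) {
    case VAL_INT:    printf("%d\n", v.as.i); break;
    case VAL_FLOAT:  printf("%g\n", v.as.f); break;
    case VAL_STRING: printf("%s\n", v.as.s); break;
    }
}

int main(void) {
    value a = { .kind = VAL_INT,    .as.i = 42 };
    value b = { .kind = VAL_STRING, .as.s = "forty-two" };
    value_print(a);
    value_print(b);
    return 0;
}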

I stumbled upon one particularly interesting problem when trying to integrate another code base into ours. Within our own software, build times were short enough to make logging and printf-debugging easier than an interactive debugger such as GDB. The other code base, however, was too complex to recompile on a whim, and a different solution had to be found. Now I am a weird person who likes to touch the raw command line instead of an IDE. And in this case that turned out to be a huge blessing, as I found that GDB can not only be used interactively, but can also be scripted! So instead of putting logging into the other library, I wrote GDB scripts that augmented breakpoints with a little call printf(...) or print/d X. These could get surprisingly complicated, where one breakpoint might enable or disable other breakpoints conditionally, and breakpoint conditions could call functions on their own. It took some learning, but these debugging scripts were incredibly powerful, and a technique I will definitely reach for again in the future.

When adding new features to the software, I often found it impossible to work the required data flow into the existing program code without snowballing complexity. I usually took these situations as code smells that called for a refactoring. Invariably, each cleanup of program flow or data structures inched the program closer and closer to allowing my feature addition. After a while, this became an established modus operandi: independently clean the code until feature additions become easy and obvious, then do the obvious thing. Thus every task I finished also left the surrounding code in a better state. In the end, about 80% of the code base had gotten this treatment, and I strongly believe that this has left the project in a much better state than it was before. To say nothing of the added documentation and tests, of course.

More velocity makes bigger craters

As I slowly shifted from cleanup work to new features, change management became a pressing issue. New features had to be evaluated, existing features had to be tested, and changes had to be documented and downstreamed. Fascinatingly, the continuous integration and evaluation tools we built for this purpose soon unearthed a number of hidden problems in other parts of the product that we had not been aware of (including that the main task I had been hired to do was less worthwhile than thought, LOL). That taught us all a valuable lesson about testing, and about proving our assertions. That said, I never found bottom-level unit tests all that useful for our purposes; the truly useful tests were invariably higher-level integration tests.

Eventually, my feature additions led to downstream changes by several other developers. While I took great care to present a stable API and to document all changes and behavior appropriately, at the end of the day my changes still amounted to a sizeable chunk of work for others. This was a particularly stark contrast to the previous years of perfect stagnation while nobody had maintained the library. My main objective at this point was to avoid the mess I had started out with, where changes had evidently piled on changes until the whole lot had become unmaintainable.

Thus a balance had to be struck between moving fast (and breaking things), and projecting stability and dependability. One crucial tool for this job turned out to be code reviews. By involving team members directly with the code in question, they could be made more aware of its constraints and edge cases. It took a few months to truly establish the practice, but by the end of a year everyone had clearly found great value in code reviews as a tool for communication.

Conclusions

There is a lot more to be said about my time at Fraunhofer. The deep dive into the world of DNN engines was truly fascinating, as were the varied challenges of implementing these things on platforms as diverse as high-performance CPU servers, laptops, Raspberry Pis, and embedded DSPs. I learned to value the automation of developer tasks, and the importance of interface stability and documentation for developer productivity.

But most of all, I learned to appreciate legacy code. It would have been easy to call it a "mess", and advocate to rewrite it from scratch. But I found it much more interesting to try to understand the code's heritage, and tease out the algorithmic core from the abstractions and architectural supports. There were many gems to be found this way, and a lot to be learned from the programmers before you. I often felt a strange connection to my predecessor, as if we were talking to each other through this code base. And no doubt my successor feels the same way about my code now.

Tags: programming

OS Customization and MacOS

I was an Apple fanboy some years ago. Back then, whenever something was odd on my computer, I was surely just using it wrong. Nowadays, I see things the other way around: We're not "holding it wrong", the computer is just defective. Computers should do our bidding, not vice versa. So here's a bunch of things that I do to my computers to mold them to my way of working.

Keyboard Layout

I switch constantly between a German and English keyboard layout, and regularly between various machines. My physical keyboards are German, and my fingers are used to the Windows-default German and (international) US keyboard layout. These are available by default on Windows and Linux, but MacOS goes its own way.

However, keyboard layouts on MacOS are saved in relatively simple text files, and can be modified with relative ease. The process goes like this: Download Ukelele (free) to create a new keyboard layout bundle for your base layout1. Inside that bundle there's a *.keylayout file, an XML file that defines the characters each key-modifier combination produces. I changed that to create a Windows-like US keyboard layout. I also replaced the keyboard icon with something sane (not "A") by creating a 256x256 pixel PNG, opening it in Preview, and holding alt while saving to select the ICNS format. Save the keyboard bundle to ~/Library/Keyboard Layouts and reboot. Finally, I removed the unremovable default ("A") German layout by selecting another layout, running plutil -convert xml1 ~/Library/Preferences/com.apple.HIToolbox.plist, deleting the entry from AppleEnabledInputSources, and rebooting once more. Almost easy. Almost.

On the one hand, this was quite the ordeal. On the other, I have tried to do this sort of thing on Windows and Linux before, and for the life of me could not do it. So I actually think this is great!

Keyboard Shortcuts

My main text editor is Emacs, and I am very used to its keyboard shortcuts. Of particular note are Ctrl-A/E for going to the beginning/end of a line, and Alt-B/F for navigating backwards/forwards by word. I have long wanted to use these shortcuts not just in Emacs and readline-enabled terminal applications, but everywhere else, too. And with MacOS, this is finally possible: Install BetterTouchTool ($22), and create keyboard shortcuts that map, e.g., Alt-B/F to Alt-←/→. Ideally, put these in a new activation group that excludes Emacs. It may be necessary to remove the keyboard character for Alt-B/F from your keyboard layout before this works. I've spent an embarrassing number of hours trying to get this to work on Windows and Linux, and really got nowhere2. Actually, most readline shortcuts such as Ctrl-A/E/B/F/K/Y already work out of the box on MacOS!

Mouse Configuration

I generally use a trackpad, and occasionally a traditional mouse for image editing, and have used a trackball. I find that any one specific device will lead to wrist pain if used constantly, so I switch it up every now and then. The trackpad and trackball, however, need configuration to be usable.

After experimenting with many a trackpad, I have found Apple touchpads to be the best on the market3. On MacOS, however, they lack a middle mouse click. So I created a trackpad shortcut in the aforementioned BetterTouchTool ($22) to map a three-finger tap to a middle click (the same can be had for free with MiddleClick (OSS)). For Windows, Magic Utilities ($17/y) provides a wonderful third-party driver for Apple devices that also supports the three-finger tap. I have not gotten the Apple touchpad to pair reliably on Linux, and have generally found Linux's touchpad driver libinput a bit lacking.

My trackball is a Kensington SlimBlade. To scroll, you rotate the ball around its vertical axis. This is tedious for longer scroll distances, however. But there's an alternative scrolling method called "button scrolling", where you hold one button on the trackball, and move the ball to scroll. You need to install the Kensington driver to enable this on Windows and MacOS. Button scrolling is available on Linux as well using xinput4, but I haven't gotten the setting to stick across reboots or sleep cycles. So I wrote a background task that checks xinput every five seconds, which does the trick.

Window Management

Frankly, Windows does window management correctly. Win-←/→ moves windows to the left and right edge of the screen, as does dragging the window to the screen border. A further Win-←/→ then moves the window to the next half-screen in that direction, even across display boundaries. KDE does this correctly out of the box as well; Gnome does not do the latter, and it drives me mad. MacOS doesn't do any of these things. But Rectangle (OSS) does. Easy fix. (BetterTouchTool can do it, too, but Rectangle is prettier.)

Furthermore, I want Alt-Tab to switch between windows. Again, MacOS is the odd one out: it uses Cmd-Tab to switch between apps, not windows. And then there's another shortcut for switching between windows of the same app, but that shortcut really doesn't work at all on a German keyboard. Who came up with this nonsense? Anyway, Witch ($14) implements window switching properly.

Application Management

On Windows and Linux, I hit the Windows key and start typing to select and start an app. On MacOS, this is usually Cmd-Space, but BetterTouchTool can map it to a single short press of Cmd, if you prefer.

More annoying are the various docks and task bars. I always shove the dock off to the right edge of the screen, where it stays out of the way. Windows 10 had a sane dock, but then 11 came and forced it to the bottom of the screen. Dear OS makers, every modern screen has plenty of horizontal space. But vertical space is somewhat limited. So why on earth would you make a rarely used menu such as the dock consume that precious vertical space by default? And Microsoft, specifically, why not make it movable? Thankfully, there's StartAllBack ($5), which replaces the Windows task bar with something sensible, and additionally cleans up the start menu if you so desire. On KDE, I fractionally prefer Latte (OSS) over KDE's native dock. The MacOS dock is uniquely dumb, offering no start menu, and allowing no window selections. But it's unobtrusive and can be moved to the right edge, so it's not much of a bother.

File Management

One of the most crucial tasks in computer work in general is file management. I am not satisfied with most file managers. Dolphin on KDE works pretty well: it has tabs, can bulk-rename files, can display large directories without crashing, and updates in real time when new files are added to the current directory. Gnome's Nautilus is so bad it is the main reason I switched to KDE on my Linux machines. Finder on MacOS is passable, I suppose, although the left sidebar is unnecessarily restrictive (why can't I add a shortcut to a network drive?). Windows Explorer is really rather terrible, lacking a bulk-rename tool and, crucially, tabs. In Windows 10, these can be added with Groupy ($12) (set it to only apply to explorer.exe). Windows 11 has very recently added native tabs, which work OK, but can't be detached from the window.

The sad thing is that there are plenty of very good file manager replacements out there, but none of the OSs have a mechanism for replacing their native file manager in a consistent way, so we're mostly stuck with the defaults.

Oh, and I always remove the iCloud/OneDrive sidebar entries, which is surprisingly tedious on Windows.

Hardware Control

On laptops, you can control screen brightness from your keyboard. On desktops, you cannot. However, some clever hackers have put together BetterDisplay (OSS for screen brightness), which adds this capability to MacOS. That's a capability I have wanted for quite a while, and apparently it is only available on MacOS. Great stuff!

Less great is that MacOS does not allow volume control on external sound cards. SoundSource ($47) adds this rather crucial functionality back, once you go through the unnecessarily excruciating process of enabling custom kernel extensions. Windows and Linux of course natively support this.

Another necessary functionality for me is access to a non-sucky (i.e. not FAT) cross-platform file system. At the moment, the most portable file system seems to be NTFS, of all things. Regrettably, MacOS only supports reading NTFS, not writing it. Paragon NTFS (€20) adds this with another kernel extension, and promptly kernel-panicked my computer. Oh joy. At least it only panics for file transfers initiated by DigiKam, which I can work around. Paragon Support says they're working on it. I'm not holding my breath. Windows and Linux of course natively support NTFS.

System Management

I have learned from experience not to trust graphical backup programs. Time Machine in particular has eaten my backups many times already, and cannot be trusted. But I have used Borg (OSS) for years, and it has so far performed flawlessly. Even more impressive, my Borg backups have a continuous history despite moving operating systems several times. It truly is wonderful software!

On Windows, I run Borg inside the WSL, and schedule its backups with the Windows Task Scheduler. On Linux, I schedule them with systemd units. On MacOS, I install Borg with Homebrew (OSS) and schedule the backups with launchd tasks. It's all pretty equivalent. One nice thing about launchd, however, is how the OS immediately pops up a notification if there's a new task file added, and adds the task to the graphical system settings.

I have to emphasize what a game-changer the WSL is on Windows. Where previously such simple automations were a pain in the neck to do reliably, they're now the same simple shell scripts as on other OSes. And it integrates perfectly with Windows programs as well, including passing pipes between Linux and Windows programs. It's truly amazing! At the moment, I'd rate Windows a better Unix system than MacOS for this reason. Homebrew is a passable package manager on MacOS, but the way it's ill-integrated into the main system (not in the system PATH) is a bit off-putting.

App Compatibility

I generally use my computer for three tasks: General document stuff, photo editing, and video games.

One major downside of Apple computers is that most video games aren't available. This has become less of a problem for me since I bought a Steam Deck, which has taken over gaming duties from my main PC. Absolutely astonishingly, the Steam Deck runs Windows games on Linux through the Proton compatibility layer, which works almost flawlessly, making video games no longer a Windows-only proposition.

What doesn't work well on Linux are commercial applications. Wine generally does not play well with them, and, frustratingly for my photo editing, neither VMWare Workstation Player (free) nor VirtualBox (OSS) supports hardware-accelerated VMs on an up-to-date Linux5. So where MacOS lacks games, Linux lacks Photoshop. Desktop applications in general tend to be unnecessarily cumbersome to manage and update on Linux. Flatpak is helping in this regard, by installing user-facing applications outside of the OS package managers, but it remains more work than on Windows or MacOS. The occasional scanner driver or camera interface app can also be troublesome on Linux, but that's easily handled with a VirtualBox VM (with the proprietary Extension Pack for USB2 support), and hasn't really bothered me too much.

Luckily for me, my most-used apps are generally OSS tools such as Darktable (OSS) and DigiKam (OSS), or cross-platform programs like Fish (OSS), Git (OSS), and Emacs (OSS). This is, however, where Windows has a bit of a sore spot, as these programs tend to perform noticeably worse on Windows than on other platforms. Emacs and Git in particular are just terribly slow on Windows, taking several seconds for routine operations that are instant elsewhere. That's probably because Windows' rather slow file system and malware scanner cope poorly with the many-small-files style of file access that these Unix tools rely on. It is very annoying.

Conclusions

So there's just no perfect solution. MacOS can't do games, Linux can't run commercial applications, and Windows is annoyingly slow for OSS applications. Regardless, I regularly use all three systems productively. My job is mostly done on Windows, my home computer runs MacOS, and my Steam Deck and automations run Linux.

Overall, I currently prefer MacOS as my desktop OS. It is surprisingly flexible, more scriptable than I thought, and in some ways actually more functional than Linux or Windows. The integrated calendar and contacts apps are nice, too, and not nearly as terrible as their Windows/Linux counterparts. To say nothing of the amazing M1 hardware with its minuscule power draw and total silence, all while maintaining astonishing performance.

Linux is where I prefer to program, due to its sane command line and tremendously good compiler/debugger/library infrastructure. As a desktop OS, it does have some rough edges, however, especially for its lack of access to commercial applications. While Linux should be the most customizable of these three, I find things tend to break too easily, and customizations are often scattered widely between many different subsystems, making them very hard to get right.

Theoretically, Windows is the most capable OS, supporting commercial apps, OSS, and games. But it also feels the most user-hostile of the three, and the least performant. And then there are the intrusive ads everywhere, the spying on my every move, and at work there's inevitably a heavy-handed administrator security setup that gets in the way of productivity. It's honestly fine on my home computer, at least since they introduced the WSL. But using it for work every day is quite enough, so I don't want to use it at home, too.

Footnotes:

1

If there's an easier way to create a keyboard layout bundle, please let me know. I didn't use Ukelele for anything but the bundle creation.

2

It occurs to me that it might actually be possible to do something like this on Windows with AutoHotkey (free). I'll have to try that!

3

The Logitech T650 is actually not bad, but the drivers are a travesty. Some Wacom tablets include touch, too, but their palm rejection is abysmal and they don't support gestures.

4

xinput set-prop "Kensington Slimblade Trackball" "libinput Scroll Method Enabled" 0, 0, 1 # button scrolling
xinput set-prop "Kensington Slimblade Trackball" "libinput Button Scrolling Button" 8 # top-right button

5

VirtualBox does not have a good accelerated driver, and VMWare does not support recent kernels. Qemu should be able to solve this problem, but I couldn't get it to work reliably.

Tags: computers linux macos windows

Media of 2022

Around this time of the year, I usually write a blog post about my favorite books of the last year. But this time, not enough of them really stood out. So instead, here are my favorite pieces of media that I consumed in the last year:

Firepower

book cover for Firepower

This is the story of how gunpowder changed the world. We get to see the familiar history of central Europe, but told unusually, from the bottom up: how seemingly small inventions change the course of peoples and nations. The book is in essence a history of warfare in the last few hundred years, but this time the movers and shakers are not Great Men, but chemistry and engineering. A fascinating perspective, to say the least.

Along the way, the book unraveled countless tangents and quirks I had previously stumbled upon, yet never understood. How castles went from tall walled structures to flat earthworks in the 18th century due to the disruptive invention of cannons. How wooden galleons of 1850 were obsoleted by turreted iron warships practically overnight. How rifling and shells bled dry the coffers of Central Europe and made conflict inevitable. This book contextualized many a story like this in the most riveting manner!

Steam Deck

steam deck controller

Ever since we had our second child, I haven't really found the time for video games any more, much to my regret. So when the Steam Deck was announced, essentially a portable gaming computer, it didn't feel like a worthwhile investment.

But boy, was I wrong about that. The genius of the Steam Deck is that it is instantly on and instantly off, like a video game console, allowing me to play in short bursts of time that would otherwise not be available for gaming. But in contrast to a console, the Deck lets me play games without blocking the living room, and away from the computer screen I'm working at all day anyway.

It has reignited my video gaming, and surprisingly not just for newer titles but, thanks to EmuDeck, for emulated retro games as well. I had a blast playing Banjo-Kazooie with my daughter, and Chorus, Ace Combat, Guardians of the Galaxy, and Death Stranding on my own. To me, this is a revolutionary device, and I haven't touched my gaming PC or any of my video game consoles since!