Jelle Hermsen's corner


The joy of handwriting CSS

Today I came across PurgeCSS: a tool that rids your website of unused stylesheet rules. It seems very useful if you’re knee-deep in external CSS dependencies, brought in by the various frameworks and plugins used in websites these days. But preferably you don’t need a tool like this at all. The need for it is typical of how modern web development is often conducted: you slapdash-glue a bunch of tools and libraries together, mix in your CSS framework of choice, say Bootstrap or Tailwind, and after a bit of development you end up with a big ball of stylesheets, often 100kB or more. And that is when you reach for CSS preprocessors like Less, minifiers and, yeah, PurgeCSS.

CSS was a godsend when I first started using it in the early noughties. Before CSS I would design web pages using tables, cutting up a design document into small bits and pieces and carefully fitting them inside the table’s cells and rows. Having CSS and a fixed idea of the size of people’s monitors allowed me to move much faster. I assumed a maximum screen width of 980 pixels and I just flew, easily recreating pretty much any Photoshop design that was thrown at me.

Unfortunately, or fortunately perhaps, the web grew, and so did the number of devices and screen sizes I had to support. At the advent of mobile browsers I could get by with simply making a wholly separate website running under an “m.” subdomain, fitting everything inside a much smaller 320-pixel box. This approach was of course not sustainable at all, since mobile devices started getting higher and higher resolutions. That’s where the new CSS feature of media queries came in handy. It allowed me to have one single “responsive” web design that fit any browser, and things were well for a while.
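A single media query is enough to make a fixed-width layout behave on small screens. A minimal sketch (the class name and breakpoint are just illustrative):

```css
/* Desktop-era fixed width, relaxed on small screens */
.page {
  max-width: 980px;
  margin: 0 auto;
}

@media (max-width: 480px) {
  .page {
    max-width: 100%;
    padding: 0 1em;
  }
}
```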

After some time, web designs started getting even more complex. Clients wanted grid layouts, animations and adaptive designs that offered different layouts depending on screen size and rotation. Fitting all those needs into the previously handwritten CSS was hard and often required some JavaScript to make sure things went smoothly. The pressure caused by this evolving web was even harder on teams. Getting all the wood behind one arrow was tough: you had to have meetings laying down rules and approaches for structuring your stylesheets. Thankfully, CSS libraries such as Bootstrap started emerging. They reset browser defaults and contained a large set of standard design elements. Instead of building everything from the ground up, you could simply reach for Bootstrap and customize the bits you wanted.

The web, however, did not stop evolving. I needed to start supporting parallax effects, nifty sliders with all kinds of effects, and widgets that pop in when you scroll down the page. I started pulling in even more dependencies to do these things: slider libraries for the sliders, parallax libraries for the parallax effects, and so on. They all came with their own bits of JavaScript and their own CSS. To fit this all into the overall design, more things had to be customized. In the meantime the typical “Bootstrap” look of websites started getting old. You could smell a Bootstrappy site from miles away, and without major modifications they could feel a bit bland.

The effect of all the extra dependencies was that load times of web pages started to go up. Not immediately noticeable on broadband connections, but surely a problem on your smartphone or in other places with lower connection speeds. This created the need for minifiers. I needed to reduce the number of requests a page makes, which meant fusing all the CSS files into one big file and minifying it, making it unreadable for humans but much nicer to send across a network.

On another level, web developers started using CSS preprocessors like Less, Sass and Stylus. These allowed for easier development, adding variables and a better separation of design parts. What used to be a matter of simply adding your own styling sauce to a web page evolved into a rather complex sort of declarative programming.

The problem with this grown complexity is that CSS wasn’t originally designed for it. I don’t blame the designers or the browser vendors, since they simply adapted it to quickly changing times, but CSS has grown into a very complex language. I wouldn’t like to imagine being a junior frontend developer staring at the W3C specifications describing the positioning rules, flexible box layouts, transforms and so on. I can only imagine this to be a daunting, almost impossible task if you haven’t lived through the changes one by one.

The complexity of CSS lies in its originally simple design. It allows you to select HTML elements and states to which certain styling rules apply. You can append to earlier rules or override them. Which rules take precedence is defined by the cascading order. This cascade starts with browser defaults, layers user browser settings on top, and moves all the way up to the mighty hammer of !important, overriding most things in its path. The problem with this cascade is that it’s pretty hard to keep all the rulesets in your head. Frankly, with most modern websites, this is utterly impossible and you have to resort to debugger-driven development: using the “inspect element” tool of your web browser, you figure out which rules apply to which element and try to find out how you could add your own modifications.
For a software developer this practice would be unacceptable. Code needs to be understandable to a large extent. You have to have a pretty good idea about the effects of your work, otherwise it would drive you mad.
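To make the cascade concrete, here is a tiny sketch (the selectors are made up) of rules overriding each other in order of precedence:

```css
p      { color: black; }            /* element selector: lowest specificity */
.note  { color: blue; }             /* a class wins over an element */
#intro { color: green; }            /* an id wins over a class */
p      { color: red !important; }   /* !important tramples all of the above */
```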

As I see it, tools like PurgeCSS are a kind of admission of defeat. There’s so much CSS floating around in our websites that we don’t even know which rules apply anymore. Most of it is probably brought in by the handy libraries we used, and there’s no way of understanding their specific construction. I really understand why you would reach for such a tool, and I have certainly wanted to use it in the past. Shocked by the large amount of CSS used by a website and the poor scores on PageSpeed Insights, any sane person would try to alleviate this.
However, there’s also another way. You can scratch the need for external CSS bloat altogether if you simply refuse those dependencies and start writing your own code again. The development of CSS hasn’t stopped, and in recent years it has amassed features that were previously either a fool’s errand to implement yourself or cried out for libraries like Bootstrap. Grids, flexible layouts, parallax effects, animations, you name it: pretty much all of them are supported in CSS3, and you don’t have to write hundreds of lines to get them into your website. I feel that thanks to the quickly evolving nature of web technology we can get all the design niceties our clients desire without ending up with big balls of minified, glued-together stylesheets. That’s exactly the approach I’m going for these days.
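For instance, the responsive grid that once justified pulling in Bootstrap is nowadays a handful of lines of plain CSS (the class name is my own):

```css
/* A responsive grid without any framework: columns are added and
   removed automatically as the viewport grows and shrinks */
.gallery {
  display: grid;
  grid-template-columns: repeat(auto-fill, minmax(200px, 1fr));
  gap: 1rem;
}
```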

One of the many plus sides of being a freelance web programmer is that I often have a large amount of freedom in picking the tools and languages I want. I’m usually hired to make most tech decisions myself, without having to fit into the steady drum beat of prescribed systems and frameworks or dodging stimulating Scrum sessions with colleagues. Whenever I can, I like to use this freedom to handwrite the CSS for the websites I’m working on, and this is actually kind of fun.

Calling my own stylesheet shots gives me a very nice and orderly feeling. I can get the design just right without first having to go rounds fighting the assumptions of a big framework. I can also keep the complexity in check, making sure that I actually understand what the CSS is doing and where I can find all the bits I need. Sure, cross-browser compatibility can be a pain, especially when everything has to show up nicely in Internet Explorer, but it’s much easier to fix the problems you come across when you actually own the problems.

I like to use semantic HTML tags and pair those with clear sectioning in my CSS file. Combined with all the nifty tricks you can pull, you can often reach your goals in far fewer lines of code. Bootstrap was nice to have around when building responsive grids yourself was like pulling teeth, but when flying solo these days, you don’t really need it anymore.
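A sketch of what I mean: with semantic tags you can often style the elements directly, no utility classes required (these rules are illustrative, not a real site’s stylesheet):

```css
main > article { max-width: 65ch; margin: 0 auto; }
nav ul         { display: flex; gap: 1rem; list-style: none; }
figure img     { max-width: 100%; height: auto; }
```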

CSS can be quite maddening and frustrating, and its evolutionary design brings a ton of unwanted baggage. Often it seems like an ill fit for the types of websites we’re making. However, since an alternative styling language isn’t in the cards and I don’t really believe in moving all your styling into JavaScript logic, we have to make do with the tools we have if we want to stay in web development. Instead of giving up and surrendering to the likes of Bootstrap, Tailwind or Foundation, I believe it’s much nicer to travel this road alone.

So next time you’re building a website, put on your favorite trainers, leave the marching bands of external CSS libraries on the side of the road, and you’ll find it a much nicer journey.

The kind optimism of Games Done Quick

Last Sunday marked the end of another Awesome Games Done Quick: a weeklong speedrunning fundraising extravaganza benefiting charity. I absolutely loved it! If you don’t know GDQ, I’ll summarize it for you: GDQ is a round-the-clock, seven-day event streamed on Twitch, featuring a cast of great, funny gamers and commentators celebrating both old and new games while trying to finish them as quickly as possible. In the meantime people can donate, and there are a number of sweepstakes and incentives to generate as much money as possible for charity. There are two main versions of GDQ: AGDQ in the winter supports the Prevent Cancer Foundation, and a summer version, aptly named Summer Games Done Quick, helps Médecins Sans Frontières. I have been watching both AGDQ and SGDQ for years and donate each time.

All nice and great of course; who doesn’t love a good fundraising event? The thing that strikes me most, however, is the unique ambiance of GDQ. Online events can quickly turn sour and a bit nasty. They are incredibly hard to make nice and warm, but the speedrunning community is really something special, and paired with the great folks behind GDQ they have made something truly remarkable. The runners, the commentators on the couch behind them and the announcers on the mic are all very sweet, kind and polite, and you can see that GDQ goes to great lengths to be as inclusive as possible. The crowd you see in the back reflects that, and so do the speedrunners. You see a broad slice of geek culture in the crowd and I adore it.

This year I especially enjoyed Lizstar’s run of the horrible DOS game Mega Man 3: The Robots are Revolting, in appalling but nostalgic CGA colors :-). It was part of the Awful Games Done Quick block, which is my favorite. This is often the most hilarious part of the whole marathon, yet you can’t help but notice the sincere love many of the speedrunners in this block feel for these games, however horrible they may be. I guess playing a game for so many hours makes you grow fond of it, no matter how bad it is.

Another great run in the awful block was the 3DO survival horror game Doctor Hauzer. If you’re into making games like me and worry about frame drops, this slideshow of a game will definitely put things into perspective :-).

I also really enjoyed the Final Fantasy VIII run. I liked it so much that I bought the game immediately afterwards. The run was also timed perfectly for my work day: I had a relatively light day planned, with a lot of server maintenance and small feature requests, so while updating my servers I could watch this 8+ hour co-op relay run in the background. The immense expertise of the runners and their in-depth knowledge of every nook and cranny of the game and its lore was wonderful.

The closing game was Super Metroid, which is pretty common for an AGDQ, but this time it featured a gruesome ROM hack called Super Metroid Impossible, and I can tell you that it deserves its name. It was pretty amazing to see the runner Oatsngoats actually finish it, concluding AGDQ 2020 with more than 3 million dollars collected for the PCF. Yay!

Moved this blog to SDF

I have moved my weblog to the MetaArray at SDF. I will be using jelle.xyz as the site for my job as a freelance programmer, so I needed a new location for my blog. I’m very fond of SDF and love their stance on preserving old tech that deserves preserving. They are a big lot of *nix minded folks with great affection for terminals, NetBSD, Gopher and old school bulletin boards.

In the next couple of months I want to move all my old blog posts to this place. There are some old weblogs floating about, and some archives I have backed up, and I want to bring that all here. After that I might write about the stuff I’m doing these days and whatever sparks my interest.

Join our network


Microserviced my dismay
Loosely coupled
Sound design
Proven properties
Prove each time
This web is good
And I am fine

Reach my progressive API
And scriptures on your client
Show working servers, idle time
Grand and unreliant

Mass-approval crossing networks
Cached in storage localized
Your feelings are not even yours
My feelings are not mine

We view this microserviced corpse
We pass it thumbs and grime
But do not worry, we are here
And we are doing fine

Undesigned my blog

I recently read this blog post by Rick Carlino about fabulous text only websites and decided to redesign my blog, or more precisely: undesign my blog.

I have ripped out most of the CSS and used the styles from bettermotherfuckingwebsite.com as a starting point. HasClunk already outputs pretty minimal HTML5, so that’s good.

I’m quite happy with the result. In Lynx there’s not that much of a difference, but I really enjoy the lightweight look in the browser.

While I was at it, I also removed all the embedded music players from SoundCloud and Jamendo. They added all kinds of nasty JavaScript and tracking stuff to my site, and I don’t want to expose all my non-existing visitors to that kind of thing :-).

Exhumed but not dead

A couple of weeks ago I did some digging around in the attic (literally :-) and came across a burned CD with a bunch of old tracks I made in the ’90s, using FastTracker II on my trusty 486 running DOS.

Long story short: I took a couple of songs, mixed them into a new up- and downtempo track and shoved them onto the interwebs.

You can listen to the album on Jamendo and SoundCloud.

Skef: a language for developer lab notes

For years I have been carrying around a small black notebook containing all my work notes. This is all nice and dandy, but it’s not very convenient, which is why I decided to move my main notetaking to the PC. There’s already a big bunch of applications, systems and cloud thingamajigs that provide all kinds of note-taking and categorization goodness, but none of it tickles my particular terminal-loving fancy.

When I take notes I really don’t want to deal with a lot of subwindows and having to move the mouse around; even the excellent Org-mode feels too complicated for this - previously - simple task. It’s a typical phenomenon where something simple in the flesh translates to a big collection of complex digital counterparts.

Cue Skef

That’s why I decided to create a small language that allows me to take my notes in my favorite text editor, with all the neat features like tagging, todos, time tracking etc., while maintaining the simple workflow of editing just one document.

Let me give you a quick example of an entry:

[Fri Sep 02 2016 - 2016/09/02]
    This is a line of text that is just a general little untaxonomized line of
    text that will be filed under September 2nd.

    [project]
        This text is filed under "project". I will be able to run queries that
        gather all my entries filed under this tag. 
        
        [subproject]
            You can even nest projects, nifty wifty.

        [todo mow the grass]

        As you can see above here, we're able to setup todos. The basic idea is
        that you'll be able to run the Skef commandline utility that retrieves
        all todos for a project, or simply all of them in one go. It outputs
        these todos on the commandline and you'll be able to setup your favorite
        text editor in such a way that it's piped into a new editor
        buffer.

        [todo bft]
            Sometimes you'll need more space to explain what to do. You can
            create a whole lot of text under one todo if you like, but you
            probably won't, because this means a lot of things that will require
            taking your lazy bum of the couch.

        [time 4.5h pondering about mowing the grass]

        We can have entries for tracking time. Skef will be able to retrieve all
        these time entries, output them as a CSV, or on the commandline for easy
        viewing.

        [done noticed the relatively high grass length]

        After you're done with a "todo", you can simply replace "todo" with
        "done" and it'll be marked as done :-). You can also use "skip" or
        "postpone", but the Getting Things Done philosophy says you shouldn't
        (which we don't care about of course, silly busy people).

As you can see, the Skef language has significant whitespace. This prevents a lot of hassle with closing blocks, and wears down your keyboard just a little bit less.

I’m already using Skef as my work journal, but I’m going to add a couple of tools to make it more useful. They will allow you to search for certain projects, get all your current todos and generate a CSV with your time entries. I also want to add VIM support so I’ll get to have all the pretty colors in my fav editor :-).
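To give an idea of how simple the tooling can stay, here is a rough Python sketch of the todo-gathering part (my own illustration, not the actual Skef utility; it only scans for todo tags and ignores nesting depth):

```python
import re


def extract_todos(text):
    """Collect every [todo ...] tag from a Skef document."""
    todos = []
    for line in text.splitlines():
        # A todo tag is a bracketed line starting with the word "todo"
        m = re.match(r"\s*\[todo\s+(.+)\]\s*$", line)
        if m:
            todos.append(m.group(1))
    return todos


entry = """\
[Fri Sep 02 2016 - 2016/09/02]
    [project]
        [todo mow the grass]
        [done noticed the relatively high grass length]
        [todo bft]
"""

print(extract_todos(entry))  # → ['mow the grass', 'bft']
```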

Raspberry Pi 2 as my main desktop

A couple of weeks ago I started using a Raspberry Pi 2 as my main desktop. It was an experiment, suggested by some folks on the #aardvark channel at irc.aard.xyz, and to my amazement it works great. I rarely have to boot my monster PC anymore, which saves a lot on my power bill, and using lighter apps has me finding new ways of increasing my productivity at the same time. In this blog post I will describe my current hardware and software setup and give some tips on overclocking this single-board computer.

Hardware

The RPi 2 is a neat little board with 1GB of RAM and a 900MHz quad-core ARM Cortex-A7 CPU. It is connected to my router through the on-board Ethernet port, and I have attached an external 3.5" HDD case housing a relatively slow Crucial BX100 250GB SSD. I chose a 3.5" case because it has its own power supply; most 2.5" cases take their power straight from the USB port, and since the Pi hasn’t got power in abundance, I thought this the smarter choice. Since I’m connecting the SSD through a slow USB 2.0 port I could also have gone with a regular HDD without loss of performance, but I kind of like the silence and lower energy consumption of the SSD.

The Pi gets its juice from a 2.1-ampere iPad adapter that came with the first iPad. I have used the shortest USB cable I could find, to make sure all the juice gets to the board without too much loss. You really want at least 2 amps to power your Pi, because you’re probably attaching a bunch of USB peripherals, and you might want to overclock. You can tell your Pi isn’t getting enough power when the “rainbow square” appears in the top right of your screen.

To enable my overclocking endeavors I have bought a small copper heat-sink, and attached it to the CPU. The Pi case I got has small air holes on the top and enough room inside to accommodate the CPU heat-sink.

Overclocking

I have moderately overclocked my Pi, using Hayden James’ excellent tutorial on the subject. There’s no silver bullet though, because build quality varies from Pi to Pi, so you’ll just have to test and figure it out yourself. It’s best to start conservatively and work your way up. A good way to maximize the load on your CPU and RAM is to install ‘stress’ and run this command:

stress -c 4 --vm-bytes $(awk '/MemFree/{printf "%d\n", $2 * 0.9;}' < /proc/meminfo)k --vm-keep -m 1

This will stress test all your cores and RAM. While doing this you can monitor the system temperature with:

vcgencmd measure_temp
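While the stress test runs, I keep a second terminal open with a small loop that prints the temperature periodically (my own snippet; the 5-second interval is arbitrary):

```shell
# Print the SoC temperature every 5 seconds; stop with Ctrl-C
while true; do
    vcgencmd measure_temp
    sleep 5
done
```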

You can overclock the Pi by simply modifying /boot/config.txt. My current setup:

arm_freq=1000
sdram_freq=450
core_freq=450
over_voltage=8
force_turbo=1

This configuration will void your warranty, because I have enabled force_turbo to make sure the scaling governor won’t throttle down when the Pi is at rest. I wouldn’t mind the default throttling, but it causes a little lag that’s especially annoying when you load a web page: the governor usually ramps up just after the site has finished loading, which rather defeats the purpose.

I managed to run the CPU at 1250MHz and both the SDRAM and core at 550, but this wasn’t too stable when working on the Pi for the entire day. You don’t want to overclock the RAM too much, since the chip’s specification shows that it was designed to run at 400MHz. All in all I think that forcing turbo is the biggest benefit here.

With these settings my Pi runs perfectly stable and rarely gets hotter than 50°C.

Software

I installed vanilla Raspbian on my Pi using the NOOBS network installer from the main Raspberry Pi website. I quickly swapped the default desktop for xmonad, because it’s a lot lighter (and I’m a big Haskell fan-girl). I realized I needed to change some of my computing habits, so I switched to Mutt for e-mail and Newsbeuter for all my RSS-feeds.

When you start using the Pi full-time you will develop a keen eye for how demanding all the processes are, and one thing that will immediately jump out at you is how poorly most browsers perform. This is the reason the Raspberry Pi Foundation made modifications to Epiphany to make it a bit snappier. Still, you’ll want to avoid JavaScript-heavy websites like youtube.com, and you’ll get adept at finding alternatives: youtube-dl for downloading YouTube videos, for example.
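For example, fetching a video up front instead of fighting the site in the browser (the URL below is just a placeholder):

```shell
# Download a video for offline viewing; -f best picks the best single-file format
youtube-dl -f best 'https://www.youtube.com/watch?v=VIDEO_ID'
```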

I tried porting Fennec (Mozilla’s mobile browser) to the Pi, since this ARM browser runs great on my cheap Android tablet. However, Mozilla seems to have switched Fennec’s focus to Android, so I had to roll back 100k Mercurial commits to get to a version that still had Linux desktop support. Long story short: I couldn’t get it to compile and didn’t want to turn getting it to run into a knight’s quest. Instead I switched to Dillo for quickly looking things up, and I’m using Epiphany and Chromium for web development. Chromium suffices nicely for debugging the web apps I build for work, and after overclocking I’m actually quite happy with its performance.

It’s a bit odd that browsers don’t run faster, since browsing works fine on cheap ARM-based tablets. I guess those have a lot of special ARM optimizations, and it also helps that they are more closely tied to the GPU for displaying the latest CSS3 transforms and JavaScript-based trickery. It would be really nice if someone ported the Android WebView, or Fennec, to the Pi, utilizing OpenGL ES directly for rendering.

Speaking of OpenGL: you will notice that some games in the default Raspbian repo run extremely slowly. This is because they were written for regular OpenGL and need to be ported to OpenGL ES for acceleration. There are a couple of projects that can help you port without overhauling the entire graphics stack; check out Regal and GLShim if you’re interested.

I’m trying to port Armagetron to the Pi, because I have a Tron server daemon running on my VPS and would love to continue playing it. That aside, there’s a lot of fun and play to be had on the Pi. There are many emulators available, and I recommend using RetroPie to install standalone versions of the available emulators. Especially RetroArch is pretty neat.

Quake 1 through 3 also run great on the Pi, as does DOSBox, so there’s more than enough out there for your leisurely Pi usage. Personally I tend to avoid the official Pi Store for software, since it seems to be riddled with GPL-violating packages.

Software freedom is actually the one thing that bugs me about the Pi. Its firmware and drivers aren’t all free and open source, and although you can decide for yourself how long your freedom beard grows over this subject, it doesn’t help Linux and BSD support for the Pi. Personally I would love to run NetBSD on this machine, but without hardware-accelerated graphics I don’t see the point. If freedom is really important to you, you might want to wait for the 9-dollar C.H.I.P. by Next Thing Co.

One thing that amazed me is that LibreOffice runs way better than AbiWord. I rather expected the opposite, and I don’t know whether this is caused by my specific setup, but LibreOffice runs nice and snappy. Anyway, since my PC swap fired up the minimizer in me, I’m considering moving all my LibreOffice templates to LaTeX instead.

Conclusion

The Pi is a nifty bit of kit and more than powerful enough to replace my desktop for most of my computing. Critics regard it as a mere toy, but it’s quite a capable toy at that. Fine-tuning the Pi and finding new and better apps to replace the bloated ones you were using is pretty sweet and good for your productivity. I intend to keep using the Pi for most of my work and hope to someday upgrade it to an 8GB octa-core Pi 3.

Rise of the videogame zinesters

Last week I read Anna Anthropy’s Rise of the Videogame Zinesters and it left a big impression on me. Having read a lot of books on video games, I can say this one is something completely different. No programmer demigods this time (like in Masters of Doom or Hackers), no tough stories about manly men working 100-hour weeks in crunch mode (like in Jacked). No: this book is about human beings making games, everybody, not just the prototypical hetero cis men who have dominated gamer culture for so long.

Anthropy paints a bright picture in which games are created like ezines. She makes a very good point that video games need more diverse, personal voices, and that you don’t need to be a coder to let yours be heard. Mentioning a couple of tools you can use to make games without prior programming experience (like Twine), and discussing various examples, she convincingly attempts to liberate the art form from a prison in which men shooting men in the face is the norm.


This book appealed to me in so many ways it’s hard to summarize. Triple-A games and gamer culture create an ecosystem that is often weird and off-putting for queer folk like me, so Anthropy’s inclusive stance is heartwarming. Creating a game yourself seems daunting, even to me, and I am, nota bene, a coder. Anthropy emphasizes that you should just go and create, build stuff and not worry if it sucks. The result will always be personal, and that’s something worthy and special in its own right! This is such a beautiful thought that I actually read the ending in tears.

I still love the books on games and game developers I have on my shelves and can never get tired of reading how John Carmack and John Romero did their magic at id Software, but this book resonated with me in a way I never thought possible on this subject. If you want to create games like me, or you would like to read how this art form is opening up to a wider range of voices, I really can’t recommend it enough!

Herding Cats

I have just started reading Rise of the Videogame Zinesters by Anna Anthropy and while I was looking at the games she made I found this little puzzle gem in her list: Herding Cats.

Herding Cats is a top down puzzler in the style of Sokoban/Paganitzu/Adventures of Lolo with a very neat twist. Instead of pushing boxes around and blocking baddies, you have to herd cats. The mechanic is as simple as it is tricky, because the cats stick to you once you get next to their block and they will remain by your side for the entire session. Thankfully you can undo moves, and - trust me - you will need this. You can easily end up with too many sticky cats, meaning you can’t get past a narrow part.

I love the way the cats’ furs change color each time you reload a level and I’m quite envious of the player’s pink hair. Tried to do that once, but my hair’s too dark to get properly blonde for coloring.

Herding Cats is pretty neat and difficult, but (thankfully) easy enough to finish in your average coffee break. The perfect kind of game for days like these, when I’m surfing a deadline’s waves like I never managed to pull off in California Games.