Unix

I was first introduced to Unix in 1996, in my first year of university. I was studying for a B.Sc. in Computer Science at The Robert Gordon University in Aberdeen, and the syllabus contained a module on algorithms and data structures. A fact unremarkable in any way, except that for the first time in the course’s history it was being taught in Java. At the time I was vaguely aware of what Java was, but looking back now, I’m impressed that my course tutor had managed to put the course together using the language at all, given that Java 1.0 had been released only about 9 months before the course was first taught.

Being Java 1.0, it was only available for Solaris, and so we programmed it on SPARC-based workstations in the lab at the university. I remember falling in love with CDE at the time. I can’t really explain why, but I look at CDE screenshots now and I love how it looks.

But we didn’t really spend much time in CDE for the module, because Sun hadn’t invented an IDE for Java yet. Instead, we were introduced to the shell (ksh!) and to Emacs as the text editor. I remember a lot of my classmates wondering what was going on, having spent their entire lives inside GUI computing environments before then. I was mostly the same, having used either Amigas (an Amiga 600, loved that machine!) or x86 PCs running Windows 3.11 or 95 in the early-to-mid 90s.

But being taught how the shell and command line worked on a production Unix of the time was a real eye opener for me. I understood that while GUI environments were fine, there was still real power to be had in the shell. We were taught how to script ksh to make it easier to invoke javac, the Java compiler, on our programs. Tooling around Java was rudimentary at the time, and I’m incredibly thankful for that.
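
For flavour, here’s a minimal sketch of the sort of ksh wrapper we wrote back then (the real scripts are long lost, and the class name below is just a placeholder):

    #!/bin/ksh
    # Compile every Java source file in the current directory, stopping at the first error.
    for src in *.java; do
        print "Compiling $src"
        javac "$src" || exit 1
    done
    # Run the class that holds main(); "Main" is a made-up name for illustration.
    java Main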

So after the first few classes working on Solaris on SPARCstation 5s, I wanted that same environment at home. Not just to work on Java for my coursework, but because scripting had led me to work with the Solaris userland, and from there to figuring out the Unix philosophy and building chains of powerful programs using plain text as the interchange format as much as possible. I loved that environment. Of course, Solaris wasn’t available for PCs at the time, but Linux was. I knew about Linux, but had mostly ignored it until then because Windows was where my games were, and I didn’t really do much else with computers back in the early 90s.

From then on, I’ve always spent time with Linux and Unix. There hasn’t really been a time in my computing life when I’ve been able to ignore Windows entirely, because that’s not how it works in real-time graphics land, but almost all of my personal computing and enthusiasm for PCs has happened against a backdrop of working in a shell with great userland tooling and text editing.

These days I’m a Mac person, and the underlying BSD and NeXT Unix heritages of the kernel, userland and GUI programs upon which OS X is built have a resonance with me that makes it my favourite modern computing environment. But between 1996 and late 2008, when I switched to Macs as my personal computer of choice, I almost always dual-booted Linux and Windows, with Windows mostly taking a back seat in terms of how often it was booted and used. Working on Unix-like OSes just clicks with me for personal computer use.

Gentoo Linux deserves some special credit. I’ve kept a spotty history.txt of the bigger things I want to remember from my computing history, and there’s an entry in there from 2003 that says “Research Gentoo”. Research it I did. Gentoo is different to most modern Linux distributions in that it really encourages doing away with binary packages, pre-built kernels and versioned releases. Instead, it leads you strongly down a path of building everything yourself from source, including the kernel.
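
For anyone who never tried it, the day-to-day tool is Gentoo’s Portage, and from memory the workflow went roughly like this (the package name is just an example):

    # Update the local tree of build recipes (ebuilds), then build a package from source.
    emerge --sync
    emerge --ask app-editors/vim
    # Portage fetches the source, applies the USE flags and CFLAGS from make.conf,
    # compiles the package on your machine and installs the result.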

That experience, of learning how to configure a working kernel for your system architecture and then successively layering a base system and applications on top of it, taught me an incredible amount. Even if you never write a line of it yourself, if you’re savvy enough to understand the basics of how computer programs are put together, watching code compile gives you a really great overview of effectively everything going on with a computer, from the hardware up through the operating system.

If you watch the compiler output from a Linux kernel build you’ll see that it’s modular in its subsystems and necessarily has to be built in a certain order, that it supports standalone kernel modules which you can build in statically or make optional and dynamically loadable, and that it has a special relationship with the rest of the system via the system init. You can see it figure out what processor architecture it’s going to support, and the wider system architecture on top of that. Peripheral I/O, bus interconnects, storage system support and more can all be learned about if you configure a Linux kernel and watch it being built.
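
From memory, the kernel half of that loop looked roughly like this (the exact make targets varied between kernel versions, so treat it as a sketch):

    # Configure and build a kernel from source, 2.6-era style.
    cd /usr/src/linux
    make menuconfig        # choose the processor architecture, drivers, and built-in (=y) vs module (=m) options
    make                   # build the kernel image and the =m modules
    make modules_install   # install the loadable modules under /lib/modules/<version>
    # Then copy the new image into /boot and point the bootloader at it.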

Gentoo then taught me how the userland part really works. When you’re literally building the entire thing yourself from source code, including the compiler you’re going to use to compile everything else in your system, you can’t help but figure out a bunch of important things: that there’s a processor instruction set architecture, and the kernel doesn’t just support all instruction sets at once; that there are binary interfaces built on top of that, called ABIs, which the kernel has to understand to be able to execute userland programs; and that the compiler just produces object files, with a separate step later on, the linker, that knows how to build final programs out of those compiled objects. You also get to understand that there’s a runtime dynamic linker and loader, for loading shared library code into your programs, built against those ABIs and run by the kernel.
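
You can see those separate stages for yourself with nothing more than a toy C file; hello.c below is purely illustrative:

    # Create a throwaway source file, just to have something to compile.
    printf '#include <stdio.h>\nint main(void) { puts("hello"); return 0; }\n' > hello.c

    gcc -c hello.c -o hello.o   # compile: turn source into an object file, nothing runnable yet
    gcc hello.o -o hello        # link: the linker combines object files into the final executable
    file hello                  # reports the target architecture and binary format, e.g. an ELF executable
    ldd ./hello                 # lists the shared libraries the runtime dynamic linker will load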

It feels strange to say it, but a lot of my understanding of how today’s computers work was solidified and built up by working with from-source Gentoo systems. I don’t run it now, but when I’ve finished this blog post I plan to look it up again and see what it’s like today. I hope it’s just the same, and that other Linux users get the same systems-glue education that I did, which you just can’t get from observing how a Windows machine goes about its business. That transparency in Linux is something I owe a lot to. These days my Linux usage is still daily, but mostly limited to server work.

All of the knowledge gained, which started with poking around the filesystem of a Solaris machine at university and ended up as real-world experience of how Unix and Linux systems work and are assembled, lets me get through all the systems administration work for Beyond3D and all of my other side projects without needing any help. Beyond3D is a bunch of networked Linux and Unix systems, spread out geographically, that I build and maintain myself, and that’s something else I’m thankful for as a result of the education I got, starting with coding some Java nearly 20 years ago now.