List member James L. replied to my last post and talked about a problem with build machines:

At some point we should talk about the problem of the “build machine” mentality. That situation where you have this one machine that’s been around for years and it’s the only one that people trust to create software builds. Unfortunately, Microsoft just dropped support of Windows 7, so the security [folks] will come knocking soon enough :-)

I couldn’t agree more.

I imagine you’re familiar with the phrase: “well, it runs on my machine”.

Setting up a dedicated machine that is responsible for compiling and building executables and other artifacts is a good idea, but if not done correctly, you’re just kicking the can down the road.

A dedicated build machine has the following advantages:

  • for very large, CPU-intensive builds, it offloads work from developer machines and can often perform the builds faster.
  • it serves as the source of truth for the correctness of the software checked into version control – if the code doesn’t build on the build machine, the build is broken by definition.
  • it enables a build pipeline with automated testing.

However, if the build machine is treated like a special snowflake and its configuration isn’t carefully tracked, then you end up with the problem that James is talking about.

The solution is the concept of infrastructure-as-code.

The configuration of the machine required to perform your software build can be written down. The OS, applications, libraries, and packages, along with their versions, can be expressed in a text file – a script, basically. If you have the right tools, that text file can be executed, and the build environment can be created automatically.

This can be done in a virtual machine, automated with a tool like Vagrant, or in a container with a tool like Docker.
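
As a rough sketch, here’s what that text file might look like as a minimal Dockerfile. The base image, package names, and versions below are placeholders – substitute whatever your build actually needs:

    # Build environment as code: every tool and version is written down here,
    # so any machine can reproduce the same environment.
    FROM ubuntu:22.04

    # Install the pinned toolchain the build depends on (placeholder packages).
    RUN apt-get update && apt-get install -y \
        gcc \
        make \
        git \
        && rm -rf /var/lib/apt/lists/*

    # The source tree gets mounted or copied here at build time.
    WORKDIR /src

    # The build itself runs inside the container, not on a special snowflake machine.
    CMD ["make", "all"]

Anyone with Docker installed can run `docker build` against this file and get the same environment the build machine uses – no years-old, hand-configured box required.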

More on this next time.

Happy developing!