
Containers on a Budget: A Docker Dive in 2013


Spring 2013. I remember it well: Docker had just been open-sourced at PyCon that March and was starting to make waves, and the microservices mantra was beginning its ascent. It was a time when we were all still figuring out how to manage our software as a set of loosely coupled services rather than as monolithic beasts. Back then, I had a small project that was begging for some containerization love.

The Project: A Simple REST API

At the time, I was working on a simple REST API using Node.js and Express. It was a straightforward app with a few endpoints to handle CRUD operations for user data. The database? MySQL running locally. Deployments were a mess—a mix of manual git push, npm install, and then hoping that everything came up as expected.

Enter Docker

Docker promised a more predictable way of deploying applications by ensuring that the environment stayed consistent across different machines. The idea was enticing, but I had no real experience with it beyond some basic tutorials.

I set out to containerize my little REST API. After fumbling through the setup for a few hours (it’s not easy when your Docker-fu is weak), I finally got something working locally. The first step was creating a Dockerfile—a simple text file that described how to build an image:

FROM ubuntu:12.04

# No official Node images existed yet in 2013, so install Node from the distro packages
RUN apt-get update && apt-get install -y nodejs npm

WORKDIR /usr/src/app

# COPY hadn't been added to Docker yet; ADD was how you got files into the image
ADD package.json /usr/src/app/
RUN npm install --silent

ADD . /usr/src/app

EXPOSE 3000

CMD ["npm", "start"]

With the Dockerfile in place, I created a Docker image and ran my app inside it. The consistency between development and production environments was a breath of fresh air. No more “It works on my machine” excuses!
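Building and running it came down to two commands (the image name here is a placeholder):

```shell
# Build an image from the Dockerfile in the current directory
docker build -t rest-api .

# Run it detached, mapping container port 3000 to the host
docker run -d -p 3000:3000 rest-api
```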

Deployment Challenges

But as I started to think about how to deploy this containerized application, I hit a wall. Orchestration tools as we know them didn't exist yet; Kubernetes was still more than a year away from Google's announcement. My company ran almost nothing in the cloud, and we didn't want to invest in expensive infrastructure just for containers.

We did, however, have a small AWS account with a few EC2 instances. ECR was years away, so I set up a simple script that pushed the Docker image to a private registry hosted on one of those instances, then SSHed in to pull and run the container. It was kludgy, but it worked.
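The script itself is long gone, but it looked roughly like this sketch. The registry address, image name, and SSH target are placeholders, and the exact flags are from memory:

```shell
#!/bin/sh
set -e

# Hypothetical values; the real hostnames are long forgotten
REGISTRY="registry.internal:5000"
IMAGE="rest-api"
TARGET="deploy@app-server"

# Build, tag for the private registry, and push
docker build -t "$IMAGE" .
docker tag "$IMAGE" "$REGISTRY/$IMAGE"
docker push "$REGISTRY/$IMAGE"

# Pull and (re)start the container on the target instance
ssh "$TARGET" "docker pull $REGISTRY/$IMAGE && \
  docker rm -f $IMAGE 2>/dev/null; \
  docker run -d -p 3000:3000 --name $IMAGE $REGISTRY/$IMAGE"
```

One script, zero orchestration: if the instance died, so did the app until someone reran it.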

Learning and Growing

This little project taught me a lot about containers and their potential. It wasn’t just about packaging an application; it was about creating an environment that ensured consistency from development to production. I learned how to write better Dockerfiles, understand Docker’s networking, and appreciate the power of a containerized world.

In retrospect, 2013 was a year when the industry was still finding its footing with Docker and microservices. It felt like we were all exploring uncharted territory together. Despite the challenges, it was exhilarating to be part of that early movement toward more modular, efficient software architectures.

And as for Aaron Swartz, his tragic passing on January 11th of that year serves as a stark reminder of the personal costs of our technological advancements. It’s a story that continues to haunt us and should remind us to build with responsibility and care.

So here’s to Docker—a tool that would come to transform my world in ways I never could have imagined back then.