$ cat post/the-deploy-pipeline-/-a-system-i-built-by-hand-/-uptime-was-the-proof.md

the deploy pipeline / a system I built by hand / uptime was the proof


June 3, 2013 - Docker Fever and the Siren Song of Microservices


June 3, 2013. Another day at the office, and it was a good one so far. The team had just finished deploying our latest application update without any major hiccups, which always feels like a win in ops land. But today felt different. Today, I found myself caught up in the hype of Docker.

You see, Docker was all the rage that spring. Solomon Hykes had open-sourced it at PyCon just a few months earlier, and it was starting to feel like everyone on the internet, from startups to big enterprises, was talking about containers as if they were some kind of technological revelation.

I sat at my desk sipping coffee when I noticed an article on Hacker News titled “This is a Web Page” with over 1,300 points, a short meditation on how little it takes to publish words on the web. It wasn’t about Docker, but the front page that month might as well have been: every other thread seemed to circle back to containers.

I clicked into one of those Docker threads and read on. The enthusiasm was infectious; people wrote about how simple yet powerful containers were for development environments. “It’s like having a chroot, but better,” I thought to myself as I scrolled down, feeling a twinge of FOMO (Fear Of Missing Out).

Meanwhile, the news was about to get a lot more interesting. Within days, the first Snowden stories would hit the mainstream media, and Project Loon would be announced later that month. I wasn’t surprised by the surveillance revelations when they came, but they added another layer of complexity to our work. We were already dealing with regulatory compliance for data handling, and now we needed to think about how our systems might be affected if someone was listening in.

Back at my desk, I started toying around with Docker on a personal project. It took less than an hour to set up and get running. The ease of deploying an application within a container felt like magic—everything was self-contained, isolated, and ready for action without the headaches of environment configuration or dependency conflicts.
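For flavor, here is roughly what that first experiment might have looked like. This is my sketch, written in today’s Dockerfile syntax rather than the rougher 2013 version; the base image, app, and port are all hypothetical:

```dockerfile
# Hypothetical first experiment: wrap a tiny Python-served app in a
# container so it runs the same on any machine with Docker installed.
FROM ubuntu:12.04

# Install the one runtime dependency inside the image, not on the host.
RUN apt-get update && apt-get install -y python

# Copy the project in and serve it on a fixed port.
ADD . /app
EXPOSE 8000
CMD cd /app && python -m SimpleHTTPServer 8000
```

Building and running it is two commands, `docker build -t hello .` and `docker run -p 8000:8000 hello`, and that was the whole trick: nothing about the host environment mattered except the Docker daemon itself.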

But as I played more with Docker, I found myself thinking about microservices architecture. The term had been floating around in tech circles, and now it seemed that everyone wanted to jump on this bandwagon too. My colleagues were starting to toss around ideas about breaking down our monolithic application into smaller, independently deployable services wrapped up in Docker containers.

I pushed back a bit. “Wait, why are we doing this? Isn’t this just making everything more complicated for the sake of it?” They had some good points: better scaling, easier testing, and increased fault isolation are all valid reasons to consider microservices. But my gut told me that maybe we shouldn’t rush into something just because everyone else was.

I decided to take a step back. Instead of jumping straight into microservices, I wanted us to focus on using Docker for simplifying our deployment process within the existing architecture. We needed to understand how containers worked and integrate them properly before thinking about architectural changes.
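In practice, “Docker for deployment, same architecture” amounts to a very small script: build one image for the whole monolith, tag it with the commit, and swap the running container. A minimal sketch, using modern CLI flags; the service name, ports, and single-host setup are assumptions of mine, not a record of what we actually ran:

```shell
#!/bin/sh
# Hypothetical single-host deploy step: one image for the whole
# monolith, tagged by commit, swapped in place of the old container.
set -eu

SHA=$(git rev-parse --short HEAD)

# Build the entire application into a single image.
docker build -t myapp:"$SHA" .

# Remove the previous container if one is running.
docker rm -f myapp-live 2>/dev/null || true

# Start the new version; rolling back is just re-running this
# script against an older tag.
docker run -d --name myapp-live -p 80:8000 myapp:"$SHA"
```

The appeal was that this changed nothing about the application itself: same monolith, same code paths, just a repeatable, self-contained deploy artifact instead of a hand-configured host.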

By the end of the day, my mind was still churning with thoughts about Docker and microservices. It felt like everyone else had already found their religion, but I wasn’t quite ready to convert just yet. There’s a difference between keeping up with the latest trends and truly understanding how they can benefit your work.

As I closed my laptop, I realized that sometimes the best course of action is to take a moment to pause and reflect on what makes sense for you, rather than jumping headfirst into the next big thing.


That’s where I was back in 2013—leaning towards embracing Docker but still cautious about diving deep into microservices. Reflecting now, it’s interesting how much has changed since then. Containers have become a standard part of most operations workflows, and microservices are widely adopted. But the lessons from that time still resonate: always approach new technologies with a critical eye, understand their implications fully before implementation, and make sure they align with your broader goals.