Hacker News

This is amazing! Thanks so much for writing it out. It's so good to read about the same story from a different perspective. I'm only going to respond to one theme in it; I hope that doesn't come across as me not appreciating or not agreeing with the rest of what you wrote.

Docker's core innovation from a _technical_ perspective may well be packaging, but I think that really misrepresents the history. Docker, like Heroku, made ordinary developers powerful. It did that through _culture_, by speaking the right language, by knowing what developers want and what they're capable of doing.

I want to talk about Jeff Lindsay[1], the co-author of Docker. He's a prolific open source contributor, a vocal critic of VC funding (and, I suspect, of HN), and a passionate advocate of the basic building blocks of software. He created Dokku, introduced Alpine Linux to Docker, and founded Flynn and Glider Labs, which pioneered things like service discovery and distributed logging for Docker.

Jeff lives and breathes hacker culture. His hallmark is written all over Docker: embrace open source, provide basic building blocks, and encourage innovation. He's one of us, a hacker; he builds things for the sheer joy of it. It's obvious he's not in it for the big enterprise money. He's not even in it for the reputation: hardly anybody knows his name, and everyone just thinks Solomon Hykes created Docker.

These are the people that need to be mentioned in the story, in that documentary. The people who put things into the world because the world needed them. I don't know the name of the person who came up with The 12 Factors, but it's the same thing: someone knew the world needed them and just gave them to us.

Of course the story of Omega and Borg is crucial, but they were always motivated by the private interests of an unimaginably large corporation, a corporation that removed its own "Don't Be Evil" aspiration. Jeff Lindsay has a power that Google doesn't have: the love of software for software's sake. That is a critical part of why we have Docker, and why it became so popular. And Docker's open source status and developer community are an inescapable basis for the world which Kubernetes now rules over.

1. https://dev.to/progrium/hi-im-jeff-lindsay



Now, that's a theme much more interesting from my PoV :)

I don't actually recall that, possibly because I was heavily on the Ops side (sysadmining used to be reasonably easy money, and definitely easier than finding an AI job that wasn't ML). From my PoV, Docker caused an earthshaking change in the development process, but that was largely because it papered over all the craziness between different languages, and of course the infamous "it works on my machine" / "then we're shipping your laptop" theme.

I suspect I didn't see this aspect because for me it happened years earlier, when (from my PoV) interest in Rails helped break the monolithic LAMP domination, slowly paving over the divide between those who could afford a dedicated server and those who were stuck with shared LAMP hosting.

By the time Docker landed we had a quite extensive suite of tools for maintaining complex setups: Puppet was a well-known name, 12-factor approaches were big in the cloud space, and we'd had 4 years of heavy evangelism to tear down the wall between Dev and Ops, stopping Dev from throwing packages over it and saying "done" while Ops cried, and getting Ops to work hand in hand with devs. In a way, Docker pushed beyond being a "development-time only" tool felt like a step back, especially given how early on it was considered pretty normal to run a container, exec into it, modify it, then save the result as a new image.
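That mutate-and-commit workflow looked roughly like this (a sketch; the image, container, and package names are illustrative, and `docker exec` itself only arrived later, in Docker 1.3, with `nsenter` and similar tools filling the gap before that):

```shell
# Start a long-running container from a base image
docker run -d --name scratchpad ubuntu:14.04 sleep infinity

# "Exec in" and hand-modify the filesystem, like a tiny VM
docker exec scratchpad bash -c 'apt-get update && apt-get install -y curl'

# Freeze the mutated container as a brand-new image
docker commit scratchpad myteam/ubuntu-with-curl

# Clean up the scratch container
docker rm -f scratchpad
```

This is exactly the step-back part: the resulting image's contents are undocumented state rather than a reproducible build, which is the problem a Dockerfile later solved.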

Meanwhile, devs tended not to need k8s. Usually they could provision enough locally to run what they needed (possibly in Docker containers, making life much easier), but echoes of that thinking still reverberate, IMO, in the regular "you don't need k8s because you're not Google" message.

Meanwhile, on the operations side, we had an unholy mess that was in no way lessened by Docker, unless you were lucky and could push your apps into one of those early PaaSes (hell, I recall a demo written in Bash, I believe) built on the simplicity of Docker. Some developers, and fewer Ops, might have been mollified by that. We had wildly heterogeneous systems to wrestle together, at times with heavily disparate workloads. Docker looked like a component, but there was always the question of how much work it took to make it generic enough (and I mentioned a few posts ago that running Docker in prod was painful, right?).

So there was slow acceptance of Docker images as a way to make packaging and distribution easier, and I recall a simple PaaS being demoed by a student literally within days of Docker's announcement (emulating Heroku and calling it so by name, IIRC?), but still, nothing much on the systems side. In 2013-2014 we were trying to use Docker as a simpler VM (especially given how hairy ephemeral LXC could be at the time) for running tests of what we would do with automation on "normal" hosts, and reusing Docker to deal with the packaging/runtime problem in systems like Mesos (but also Hadoop/YARN) was becoming more common. It's from this angle, I feel, that Kubernetes arrived on the stage, and this is why I don't necessarily see the same involvement you describe - but that's good, because I also learnt a thing :)

I just don't necessarily feel it was an aspect driving Kubernetes, but it definitely feels like it was an aspect driving Docker among developers - and with k8s we (more on the ops side ^_-) could deliver such environments for devs in a more reliable way.


Sorry for the late reply. I really, really appreciated your in-depth replies and your different perspective. Thank you so much for that, I learnt a lot of things too :)



