14

Generally a developer cares about satisfying business requirements. He/she might have expertise in a particular stack or framework. But should he/she make an effort to learn Docker and its various deployment methods (Swarm, Kubernetes, Mesos, etc.)?

Simply put, why should a developer care about Docker?

PS: The parent question to this post is Implication of introducing docker to the development team

Abhay Pai
  • 303
  • 2
  • 5

7 Answers

8

Probably not the answer that you're looking for, but an answer nonetheless :)

Learning about Docker and its deployment methods could actually be included in the business requirements by making it part of the project or team development environment, just like the programming language(s), version control system, compilers, test infrastructure, etc. To work on that team or project, one needs to know about and use all of these; in most cases you can't "bring your own".

Things get a bit more complicated if by "a developer" you actually mean the majority of, or even the entire, development team. Pushing a tool into the development environment without any of the developers supporting it will be really tough. Spend the time to create one such supporter first, from the team's technical leadership.

Side note: it might also not be necessary for each and every developer on the team to become a Docker expert. Pre-established usage recipes, wrapped in simple, cheatsheet-ready commands, often allow developers to use Docker-based solutions without knowing much about their inner workings, which can be perfectly acceptable, especially in large teams. Just like being able to contribute code without knowing all the details of how the end product is built.
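As an illustration of such a recipe, a team might ship a tiny wrapper script so developers only memorize a few verbs. This is a sketch; the script name, image name, and port are hypothetical, not from the answer:

```shell
#!/bin/sh
# dev.sh -- hypothetical cheatsheet wrapper; "myteam/app:dev" and port 8080 are made up.
IMAGE="myteam/app:dev"

dev() {
  case "$1" in
    build) docker build -t "$IMAGE" . ;;              # build the project image
    run)   docker run --rm -p 8080:8080 "$IMAGE" ;;   # run it locally
    shell) docker run --rm -it "$IMAGE" /bin/sh ;;    # poke around inside the container
    *)     echo "usage: dev {build|run|shell}" ;;
  esac
}

dev "$@"
```

A developer can type `./dev.sh run` without knowing a single docker flag; the one Docker-savvy teammate maintains the script.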

Dan Cornilescu
  • 6,780
  • 2
  • 21
  • 45
7

I'll give you my perspective. Developers should care about Docker because other developers are willing to use it and have already built expertise in it. They are willing to take on the role of a DevOps engineer alongside being a developer, so the Ops part of DevOps is where they are now building expertise.

These days, you'll find more and more people who can develop, orchestrate, automate tests and jobs, build monitoring tools, and take this complete package into production single-handedly. These are the people pushing Docker and other tools in the developer community.

Also, the tide of the market is toward virtualization, auto-scaling, automation and machine learning, and Docker fits into all of these, so using it has become almost imperative. Businesses are willing to pay 2x for a single person who takes on all these responsibilities, and where there is demand, supply will follow. That is the employee-employer point of view.

Technically, in the organisations I have worked in, there are separate development and DevOps teams, although they work very closely on deliveries. The DevOps engineers and developers share a large part of their skill sets, so duties are sometimes negotiated between them.

The bare minimum a developer can do is share his binaries, but he needs to understand that those binaries will run inside a Docker container, and for that he needs to understand how Docker works. As for Kubernetes, Swarm, Mesos, etc., the developer may not even care which is used, but the basics of Docker should be well understood, and there should be a mindset from the beginning of building the application loosely coupled for reuse as microservices. If the application is built with that mindset (which requires the basics of Docker), the DevOps engineers can take it from there: auto-scale, orchestrate, test, deploy and monitor.
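Those basics are genuinely small. A minimal Dockerfile wrapping a developer's binary can be as short as this sketch (the binary name, base image, and port are assumptions, not from the answer):

```dockerfile
# Hypothetical: package a statically linked binary "myapp" into an image.
FROM alpine:3.19
COPY myapp /usr/local/bin/myapp
EXPOSE 8080
ENTRYPOINT ["/usr/local/bin/myapp"]
```

`docker build -t myapp .` produces the image; everything downstream (scaling, orchestration, monitoring) consumes that artifact.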

Also, most of the time there is no one-size-fits-all arrangement. A developer does not clearly know how to build a Docker-friendly app, and a DevOps engineer, quite rightly, does not know the internals of the app-building process. Hence organisations often prefer to give both tasks to the same person to speed things up. If the tasks are kept separate, a continuous feedback mechanism is required from the DevOps team to the dev team to keep the apps docker/cloud/scaling ready.

Pierre.Vriens
  • 7,225
  • 14
  • 39
  • 84
lakshayk
  • 636
  • 1
  • 4
  • 8
7

It is not about Docker or any other containerisation technology out there.

Containers like Docker, rkt, etc. are just a way of delivering your application, similar to a static binary. You build your deployment so that it contains everything it needs inside, and the end user doesn't need anything more than the runtime.

These solutions are similar to fat JARs in Java, where (in theory) all you need is a preinstalled runtime (JRE) and everything Just Works™.
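To make the fat-JAR analogy concrete, the container equivalent is a Dockerfile that bakes the JAR and its runtime into one artifact (the base image tag and JAR path below are illustrative assumptions):

```dockerfile
# Hypothetical: ship a fat JAR exactly like a static binary.
FROM eclipse-temurin:21-jre
COPY target/app-all.jar /app.jar
ENTRYPOINT ["java", "-jar", "/app.jar"]
```

Anyone with a container runtime can run the resulting image; nothing needs to be preinstalled on the host, not even the JRE.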


The reason developers need to understand orchestration tools (they do not need to learn how to operate them, only why they are needed) is that orchestration gives you some advantages over "traditional" deployment.

Cattle, not pets

EngineYard has written a good article about this. The whole point is that when a server dies, you shrug and wait for a new one to appear. You treat servers as cattle: you have tens, hundreds, thousands of them running, and when one goes down neither you nor your clients should ever be aware of it.

Orchestration tools achieve this by monitoring the status of all applications (pods, jobs, whatever) in the cluster; when one of the servers stops responding (goes down), they automatically move the applications that were running on it somewhere else.

Better resource utilisation

Thanks to orchestration you can run multiple applications on one server, and the orchestrator will track resources for you and rearrange applications when required.

Immutable infrastructure

Thanks to the automatic failover handling in orchestrators, you can run your custom images in the cloud as-is. When you need an update, you just build a new image, point your Launch Configuration at it, and roll. Everything is handled for you:

  1. Create a new server with the new configuration.
  2. Kill one running server.
  3. Your orchestrator moves everything to the other machines (including the new one).
  4. If any old servers are left, go to 1.
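With Kubernetes as the orchestrator, for instance, that whole loop collapses into a rolling update. This is only a sketch of the commands involved; the deployment name, container name, and registry are assumptions, not from the answer:

```shell
# Hypothetical rolling update: replace every replica of "myapp", one batch at a time.
kubectl set image deployment/myapp myapp=registry.example.com/myapp:v2
kubectl rollout status deployment/myapp    # blocks until the new pods are healthy
```

The create/kill/move cycle described above happens behind these two commands.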

Simpler operations

  • Not enough resources? Add new machine to cluster.
  • Need more application instances? Increase number and go on.
  • Monitoring? Done.
  • Log management? Done.
  • Secrets? Guess what.
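Under Kubernetes, for example, the "more instances" bullet is a single command (the deployment name is illustrative; adding machines to the cluster itself is cloud-provider specific):

```shell
# Hypothetical: scale out to ten application instances.
kubectl scale deployment/myapp --replicas=10
```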

TL;DR The whole point isn't Docker but orchestration. Docker is just an extended version of a tarball/fat JAR, which is what proper orchestration requires.

Hauleth
  • 456
  • 2
  • 9
4

Here are, for example, some arguments from a blog post published back in 2014, with a title that matches your question quite well:

  • Much more flexible injection of new technologies into the environment
  • there is still a massive pain point between committing the final tested code and then getting it running on the final production servers. Docker vastly simplifies this final step
  • Docker makes it trivial to keep legacy OS, no matter what flavor of Linux you are running

From: https://thenewstack.io/why-you-should-care-about-docker/

Ta Mu
  • 6,792
  • 5
  • 43
  • 83
4

If you are running your production in Docker containers, it's crucial that those containers are built by the same developers who built the app running in them. Who else is better placed to know which external dependencies are needed, and so on?

Also, a pipeline can fail at any step during CD, particularly at the Docker image build step; sometimes it's a missing file or a lib that's needed.

At work we've introduced all devs to Docker, explaining the basics of writing a Dockerfile to serve their app. We also made the pipeline simple enough that one only has to add a name and a Dockerfile, and the app will automatically be built on the next push, regardless of the tech it runs on.

The Docker quickstart is a great introduction for this; afterwards the DevOps team guides the devs in their choice of distro (many of them don't know about things like Alpine).

Our job is to give them easy access to the tools; they do the rest, so they can fix things themselves when something goes wrong. Docker is really part of the development process, and the DevOps team provides Docker images that match our needs and are simple enough that it takes only a couple of minutes to create a new app and deploy it without assistance.

Juub
  • 156
  • 4
2

Docker gets lots of press and blog mentions which leads to developers getting interested in using it. For some people it is the interest in playing with a new technology or understanding how things work. For others it is a desire to add keywords to their resume. Either way, the more developers know about how things work and how they get deployed the less surprised they will be later. From what I've seen there's a decent amount of pre-existing interest in this so it shouldn't be that hard to encourage it further.

chicks
  • 1,911
  • 1
  • 13
  • 29
0

Well, if you have ever used VMs for testing, you may want to try containers; Docker is actually great for testing, and it's much simpler to use than LXC :)

Y V
  • 31
  • 1