I've seen a lot of articles lately suggesting how to use Docker for development. I haven't seen one yet that does it correctly.
Obviously, "correctly" is subjective, but I'd like to compare the typical wisdom with how I usually approach the problem.
The Conventional Wisdom
In many tutorials, the first thing introduced is the Dockerfile.
"At the foundation of any Dockerized application, you will find a Dockerfile" - https://blog.codeship.com/using-docker-compose-for-nodejs-development/
It, apparently, is the foundation.
The first several results on Google all suggest the first thing you need is a Dockerfile, as well.
After all, how can you have a Docker environment without creating a Dockerfile?
I'm here to tell you that while this is true for production, it's the wrong approach for development. You do not need to create your own.
A Dockerfile is a way to package your application. You don't need to package your application for development, and honestly, you really shouldn't.
Development and production are not the same environment.
When you develop on your MacBook you install different tools than you use running in production. Just because "it runs the same way everywhere" doesn't mean it should.
Your app runs differently in development.
Packaging it at a time when it is meant to be flexible and malleable is why many engineers have come to the conclusion that Docker isn't for development.
You lose the flexibility of development by needing to build a new container whenever your dependencies change, for example.
Sure, you could exec into the container and perform some commands, install some libs, but is it really less effort at this point?
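For reference, that workflow looks something like this (the container and library names here are just placeholders):

# attach a shell to the running container (name is hypothetical)
docker exec -it my_app_container bash
# then install the missing library inside it (also hypothetical)
npm install some-lib

Every one of those manual changes evaporates the next time the container is recreated, which is rather the point.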
Now, some of the above articles got this more right than others, but if you're using a Dockerfile for development, you've probably already gone too far. There are situations where you will want one, but probably not in the manner you think.
Hint: If your Dockerfile contains an npm install, you've gone too far.
The docker-compose builder pattern
Let's talk about what Docker is for a moment.
Docker is a way to package your code. This is the typical context for using Docker.
Docker is also a way to create an isolated environment which is capable of executing certain types of applications.
Docker allows you to package environments that are capable of running your code.
When you use Docker for production you are using the most specialized Docker containers you can make. They are customized and specifically built for your application, packaged in the way you built it. For this purpose, creating a Dockerfile makes sense.
When you set up your computer for development, that's not what you do. You instead install the tools that you need for development. You just need to create an environment that your code can run in.
This means you can use a more generalized Dockerfile. Usually, these generalized Dockerfiles you need for development already exist.
For example, when developing a Node.js application, you need node installed on your machine. Thatâs it.
You don't need Alpine Linux. You don't need to package your node_modules into an immutable build. You don't need little containers to exec into to make significant changes. You just need to be able to execute node and npm.
Therefore, in a container, that's all you need as well, meaning the official node image on Docker Hub will do just fine.
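To illustrate, the stock image already gives you a working runtime with no Dockerfile involved. A quick sanity check (purely illustrative):

# run node and npm straight from the official image
docker run --rm -it node:11 node --version
docker run --rm -it node:11 npm --version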
Without further ado, my approach to development with Docker.
In my last article I showed how to use Parcel for development and production. Let's keep that rolling, and build on top of that.
I think it's a good example because Hot Module Reloading is essential for developing React apps efficiently.
Step One
First, we need a docker-compose file. In it, we need our development environment. Seeing as we are making a Node app, the official node image is probably a safe bet.
Let's add a file docker-compose.yml:
version: '3'
services:
  dev:
    image: node:11
Next, we need our code to be in the environment, but we don't want it to be baked into the image. If we are using this for development, when our files change, the files in the container also need to change.
To accomplish this we can use a volume. We will mount our current directory . to /usr/src/service in the container. We will also need to tell Docker where our "working directory" is. Meaning: what directory did we put the code in?
version: '3'
services:
  dev:
    image: node:11
    volumes:
      - .:/usr/src/service
    working_dir: /usr/src/service
Now, every time we make a change on our local machine, the same file changes will be reflected in /usr/src/service.
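If you want to convince yourself the mount is working, a one-off command against the dev service will show your local files inside the container (illustrative):

# list the mounted source from inside the container
docker-compose run --rm dev ls /usr/src/service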
Next, we need to execute the command npm run dev. This is easily accomplished with the command option. We also want to access it locally on port 1234.
Finally, hot module reloading with Parcel happens on a random port by default, which won't work for us, as we need to map the HMR port as well.
Modify the dev script in package.json to include the option --hmr-port 1235.
"dev": "npm run generate-imported-components && parcel app/index.html --hmr-port 1235",
And with that in place, let's update the docker-compose file to map the ports on our local machine to the same ports on our container.
version: '3'
services:
  dev:
    image: node:11
    volumes:
      - .:/usr/src/service
    working_dir: /usr/src/service
    command: npm run dev
    ports:
      - 1234:1234
      - 1235:1235
If you've done enough Node development, you'll notice we have a problem. You can't just run a node app without installing dependencies.
Also, you can't just install your node modules locally on Mac or Windows and expect them to work in the Linux container.
In some cases, when you run an install, libraries compile native code (node-sass and bcrypt are common examples), and the resulting artifact only works on the operating system it was built on!
As a first attempt, you may be tempted to just chain npm install and npm run dev in a single command, and sure enough that would work, but it's not quite what we want. It would require running an install every time we started development mode with the container.
Also, beyond needing an install, some services might need a build step. In our case this isn't needed for developing the client, because parcel or nodemon handles it, but not all apps were built in the past week with the latest tech.
For educational purposes, the way to chain commands is to use bash or ash to execute them. If you try
command: npm install && npm run dev
you will learn that it doesn't work. Instead, you could use:
command: bash -c "npm install && npm run dev"
This would in fact work, but is not the optimal solution we are looking for.
Which brings us to Step Two.
Step Two
Let's create another docker-compose file, this time named docker-compose.builder.yml.
We will need to use version: 2 this time, to make use of a docker-compose feature (extends) that isn't available in the version 3 specification.
Version 3 is more suited towards use in production than version 2, which has more development friendly features.
The first thing we want to define in docker-compose.builder.yml is a base image.
version: '2'
services:
  base:
    image: node:11
    volumes:
      - .:/usr/src/service
    working_dir: /usr/src/service
This should look pretty familiar. It's the same base we use in our docker-compose.yml file.
Now, we can extend the base to execute a whole bunch of different commands. For example:
version: '2'
services:
  base:
    image: node:11
    volumes:
      - .:/usr/src/service/
    working_dir: /usr/src/service/
  install:
    extends:
      service: base
    command: npm i
  build:
    extends:
      service: base
    command: npm run build
  create-bundles:
    extends:
      service: base
    command: npm run create-bundles
Now, to install dependencies using a node:11 image which matches our development service in docker-compose.yml, we can run:
docker-compose -f docker-compose.builder.yml run --rm install
This installs the versions of any binaries needed for the container's Linux environment.
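The other services defined above work exactly the same way. For example, to run the build step:

docker-compose -f docker-compose.builder.yml run --rm build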
Pro Tip: Admittedly, docker-compose -f docker-compose.builder.yml run --rm install doesn't really "roll off the tongue", does it? I usually put this in a Makefile so I can just run make install, etc.
After running the install, docker-compose up will bring up our development environment, which works exactly the same as it would on your local machine.
➜ docker-compose up
Creating stream-all-the-things_dev_1 ... done
Attaching to stream-all-the-things_dev_1
dev_1 |
dev_1 | > stream-all-the-things@1.0.0 dev /usr/src/service
dev_1 | > npm run generate-imported-components && parcel app/index.html
dev_1 |
dev_1 |
dev_1 | > stream-all-the-things@1.0.0 generate-imported-components /usr/src/service
dev_1 | > imported-components app app/imported.js
dev_1 |
dev_1 | scanning app for imports...
dev_1 | 1 imports found, saving to app/imported.js
dev_1 | Server running at http://localhost:1234
And when we make a change, hot code reloading works as expected!
All with no Dockerfile!
Bonus
I just wanted to quickly add an example Makefile that will make the commands easier to remember and use.
Create a file called Makefile:
install:
	docker-compose -f docker-compose.builder.yml run --rm install

dev:
	docker-compose up
Makefiles use tabs, but Medium Engineering's editor won't allow me to type tabs or even paste them in. Makefiles will not work with spaces.
Now you can run make install and make dev.
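If you like, the Makefile can grow a target per builder service, along these lines (same builder file as above; remember the tab indentation):

build:
	docker-compose -f docker-compose.builder.yml run --rm build

create-bundles:
	docker-compose -f docker-compose.builder.yml run --rm create-bundles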
Conclusion
You don't always need to make a Dockerfile to use Docker! Oftentimes, for development, you can just use someone else's!
I hope I've been able to show you an easy way to get up and running quickly with Docker and docker-compose for development.
If you found this helpful, please give me some claps or a share, and make sure to follow me!
To learn about how to create a multi-stage build for production, in CI pipelines, or how to use docker-compose to run staging tests, check out my article: I have a confession to make… I commit to master.
Best,
Patrick Lee Scott