I want to create a complete Node.js environment for developing any kind of application (script, API service, website, etc.), also using different services (e.g. MySQL, Redis, MongoDB). I want to use Docker to do it in order to have a portable, multi-OS environment.
I've created a Dockerfile for the container in which Node.js is installed:
FROM node:8-slim
WORKDIR /app
COPY . /app
RUN yarn install
EXPOSE 80
CMD [ "yarn", "start" ]
And a docker-compose.yml file where I add the services that I need to use:
version: "3"
services:
app:
build: ./
volumes:
- "./app:/app"
- "/app/node_modules"
ports:
- "8080:80"
networks:
- webnet
mysql:
...
redis:
...
networks:
webnet:
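
For illustration only, the omitted service definitions could look something like the following; the images, credentials, and database name are placeholders rather than my actual configuration:

  mysql:
    image: mysql:5.7                  # placeholder image/version
    environment:
      MYSQL_ROOT_PASSWORD: example    # placeholder credentials
      MYSQL_DATABASE: app             # placeholder database name
    networks:
      - webnet
  redis:
    image: redis:alpine               # placeholder image
    networks:
      - webnet

The whole stack can then be started with docker-compose up --build -d.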
I would like to ask you what the best patterns are to achieve these goals:
1. Having the whole working directory shared between the host and the Docker container, in order to edit the files and see the changes from both sides.
2. Having the node_modules directory visible on both the host and the Docker container, so that it is also debuggable from an IDE on the host.
3. Since I want a development environment suitable for every project, I would like a container where, once it has started, I can log in using a command like docker-compose exec app bash. So I'm trying to find another way to keep the container alive instead of running a Node.js server or using the trick of CMD ["tail", "-f", "/dev/null"] (a minimal sketch of that workaround follows this list).
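
To make point 3 concrete, the workaround I would like to avoid looks roughly like this in the compose file (the idea is the same if it is put in the Dockerfile's CMD instead): the command does nothing but block, so the container stays up and I can attach to it at any time with docker-compose exec app bash.

  app:
    build: ./
    # No real server: this command only blocks so that the container stays alive
    command: ["tail", "-f", "/dev/null"]

I am looking for a cleaner pattern than this.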
Thank you in advance!