I have multiple web applications (app1, app2, app3... appN) that interact with each other over HTTP. Each web app will run in a separate docker container. Each app is installed by cloning its code from git, setting up the database defaults, and running some initialization. All of the apps will run on a single host (each developer will have their own copy of all the apps on their system).
The owner of app3 (for example) won't know or care about the other apps' settings and will be happy to just take their defaults. The QA team likewise has no knowledge of setting up the apps and relies on the default configuration.
Note: the git repositories are password-authenticated.
The requirements are:
- Docker containers for each app, with default configurations, so that all the apps (brought up via docker-compose) are pre-configured and start without any additional effort.
- The ability to pull the latest code changes into the docker containers.
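To make the intent concrete, here is a rough sketch of the docker-compose file I have in mind. Service names, image names, and ports are placeholders, not our actual setup:

```yaml
# docker-compose.yml -- hypothetical service/image names
version: "2"
services:
  app1:
    image: mycompany/app1:latest   # pre-built image, code and defaults baked in
    ports:
      - "8001:80"
  app2:
    image: mycompany/app2:latest
    ports:
      - "8002:80"
    depends_on:
      - app1                       # the apps talk to each other over HTTP
```

The point is that `docker-compose up` alone should give a developer or QA person a fully working set of apps.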
Docker is a great fit and covers most of this. However, the following issues remain:
During the docker build process there is no support for environment variables (and it seems unlikely to be added), and we cannot hard-code the git repository credentials in the Dockerfile. We need a pre-configured app setup, but I am unable to figure out an easy way to get the code from git into the docker image.
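One workaround I considered is deferring the clone to container start, so credentials are passed as runtime environment variables (`-e`) rather than baked into the image at build time. A minimal sketch, where `GIT_USER`, `GIT_TOKEN`, the repository URL, and `/app` are all placeholder assumptions:

```shell
#!/bin/sh
# entrypoint.sh -- clone on first start using credentials supplied at `docker run`
# GIT_USER and GIT_TOKEN come from `-e` flags and are never written into the image.
set -e
if [ ! -d /app/.git ]; then
    git clone "https://${GIT_USER}:${GIT_TOKEN}@git.example.com/team/app1.git" /app
fi
exec "$@"
```

This keeps the image credential-free, but it means the image no longer ships pre-populated with code, which conflicts with the "works out of the box" requirement above.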
For simple changes in the git code base, there is no need to build a new docker image. Instead, it should be fine to run "git pull" from within the container. (A developer of that app could also "git push" from within the container, making dev testing easier.) For all of this, the git credentials need to be those of the particular developer using it.
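Roughly, the per-developer workflow I am picturing (container name and in-container path assumed):

```shell
# Pull the latest changes into the running container instead of rebuilding the image
docker exec app1 git -C /app pull

# A developer of app1 could equally push their changes from inside the container
docker exec -it app1 git -C /app push origin master
```

The open question is how each developer's own credentials get into the container without being shared or baked into the image.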
It would be too clumsy to have the git repositories cloned on the host and mounted into the containers using docker volumes (-v). [If a developer wishes to do so, they could still do that, but I would not want it to be the default.]
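For a developer who does opt into host checkouts, I imagine it would look something like this (paths hypothetical), kept out of the default compose file:

```yaml
# docker-compose.override.yml -- opt-in per developer, not the default
services:
  app1:
    volumes:
      - ./app1:/app    # host clone mounted over the image's baked-in copy
```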
Any suggestions welcome!