
I've dockerised my application and use docker-compose to run it locally. I'd like to use the same docker-compose commands I use locally to build and test my application on the CI runner, but I can't seem to find any documentation on how to do that.

I'm using gitlab.com, and their documentation says you should just use the docker image. Only docker-compose doesn't seem to come with the standard image any more...

What's the best approach to use docker-compose with GitLab CI?

EDIT: Use case

.gitlab-ci.yml

image: docker:latest

variables:
  DOCKER_DRIVER: overlay
  WORKER_TEST_IMAGE: registry.gitlab.com/org/project/worker:$CI_COMMIT_REF_NAME
  WORKER_RELEASE_IMAGE: registry.gitlab.com/org/project/worker:latest

services:
  - postgres:9.6.3
  - docker:dind

stages:
  - build
  - test
  - release
  - deploy

before_script:
  - docker login -u gitlab-ci-token -p $CI_JOB_TOKEN $CI_REGISTRY

build_worker:
  stage: build
  script:
    - docker build --pull -t $WORKER_TEST_IMAGE .
    - docker push $WORKER_TEST_IMAGE

test_worker:
  stage: test
  script:
    - docker pull $WORKER_TEST_IMAGE
    # I need a way to connect the postgres service to the image I'm 
    # trying to run. Which doesn't seem possible?
    # - docker-compose run worker dockerize -wait tcp://postgres:5432 nosetests
    - docker run $WORKER_TEST_IMAGE dockerize -wait tcp://postgres:5432 nosetests
...

I feel like GitLab CI is making me reimplement docker-compose because they don't support it?

1 Answer


GitLab CI has its own Docker syntax and no built-in support for docker-compose. So docker-compose will only work if you give the job access to a Docker daemon, either through the docker:dind (Docker-in-Docker) service or by binding the host system's Docker socket into your CI runners, and then install docker-compose yourself. Sooner or later you will discover that this approach has serious limitations, may require extra runner configuration, and is poorly documented.
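If you do decide to go down that road anyway, a job along these lines should work (a minimal, untested sketch: it assumes an Alpine-based docker image where docker-compose can be installed via pip, and a docker-compose.yml that defines the worker and postgres services):

compose_test:
  image: docker:latest
  services:
    - docker:dind
  variables:
    # point the docker client and docker-compose at the dind service
    DOCKER_HOST: tcp://docker:2375
  before_script:
    # the docker image ships without docker-compose, so install it in the job
    - apk add --no-cache py-pip
    - pip install docker-compose
  script:
    # postgres comes from docker-compose.yml here, not from the CI services block
    - docker-compose up -d
    - docker-compose run worker dockerize -wait tcp://postgres:5432 nosetests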

Although possible, you really should stick to the official GitLab way. Carefully read https://docs.gitlab.com/ce/ci/docker/using_docker_images.html and you will be able to use multiple Docker containers in much the same way as docker-compose.
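For the use case above, that could look roughly like this (again only a sketch; it assumes your runner expands CI variables in image: and can pull the image that the build stage pushed to the registry):

test_worker:
  stage: test
  # run the job inside the image built in the previous stage,
  # so the tests execute in the same environment you ship
  image: $WORKER_TEST_IMAGE
  services:
    # the postgres service is reachable from the job by its hostname
    - postgres:9.6.3
  script:
    - dockerize -wait tcp://postgres:5432 nosetests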


4 Comments

Thanks for the answer. Their way seems to assume you are deploying directly to a VM and Docker is just used to run the backing services. I'm deploying containers; I want to build the container and run my unit tests inside the container that was just built, ensuring environment parity, a major benefit of containers. I have Docker running via docker:dind, and I was hoping to use docker-compose to bring up my environment and execute my tests. But it looks like I'll have to work out the raw docker commands and toss them in make or something. Just frustrating.
That's true, you need to execute "raw" docker commands, like docker build and docker push (a rough sketch follows these comments). I think the main purpose of docker-compose is a dev environment: easy mounting, easy composing, one file. On a CI server, however, you want to build, push, test and release your containers; that's a different scenario, hence different tooling. For production most people nowadays use Kubernetes, with a focus on scaling, managing, monitoring, and so on. That's the reason why docker-compose is not supported for CI or production / cloud usage.
By the way, I had the very same thought of wanting to use docker-compose everywhere. It took me quite some time to discover why that's not as great an idea as it sounds, mainly due to the different requirements.
Thank you, but I feel this deep frustration at having to figure out what docker-compose does with docker. Especially the networking, which seems to be in a constant state of flux; I've completely lost track of what is going on...
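For reference, those raw docker commands might look roughly like this for the test job (a sketch only; it starts postgres inside the dind daemon on a user-defined network, since containers started with docker run inside dind do not see the job-level postgres service by hostname):

test_worker:
  stage: test
  script:
    - docker pull $WORKER_TEST_IMAGE
    # create a network inside dind and start postgres on it, so the
    # hostname "postgres" resolves from the container under test
    - docker network create testnet
    - docker run -d --network testnet --name postgres postgres:9.6.3
    - docker run --network testnet $WORKER_TEST_IMAGE dockerize -wait tcp://postgres:5432 nosetests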
