
I have a Python application using Django and Celery, and I'm trying to run it with Docker and docker-compose because I'm also using Redis and DynamoDB.

The problem is the following:

I'm not able to run both services, WSGI and Celery, because only the first instruction in the command works.

version: '3.3'

services:
  redis:
    image: redis:3.2-alpine
    volumes:
      - redis_data:/data
    ports:
      - "6379:6379"
  dynamodb:
    image: dwmkerr/dynamodb
    ports:
      - "3000:8000"
    volumes:
      - dynamodb_data:/data
  jobs:
    build:
      context: nubo-async-cfe-seces
      dockerfile: Dockerfile
    environment:
      - REDIS_HOST=redisrvi
      - PYTHONUNBUFFERED=0
      - CC_DYNAMODB_NAMESPACE=None
      - CC_DYNAMODB_ACCESS_KEY_ID=anything
      - CC_DYNAMODB_SECRET_ACCESS_KEY=anything
      - CC_DYNAMODB_HOST=dynamodb
      - CC_DYNAMODB_PORT=8000
      - CC_DYNAMODB_IS_SECURE=False

    command: >
      bash -c "celery worker -A tasks.async_service -Q dynamo-queue -E --loglevel=ERROR &&
               uwsgi --socket 0.0.0.0:8080 --protocol=http --wsgi-file nubo_async/wsgi.py"
    depends_on:
      - redis
      - dynamodb
    volumes:
      - .:/jobs
    ports:
      - "9090:8080"
volumes:
  redis_data:
  dynamodb_data:

Has anyone had the same problem?

  • It's more of an architectural note, but I believe you need to make separate services for the jobs and web app parts: in jobs you run the celery worker, and in the web service uwsgi. Right now you don't follow the "1 process - 1 container" rule. Btw, do you see anything in docker-compose logs -f jobs? Commented Oct 30, 2018 at 12:28
  • @Satevg The logs show only Celery process info, nothing about Django. Commented Oct 30, 2018 at 12:38
  • You can try using a semicolon instead of &&. See unix.stackexchange.com/a/187148 Maybe celery does not return a success code, but that's another problem. Commented Oct 30, 2018 at 12:49
  • @Satevg Your first comment gave me the answer. Here: ruddra.com/docker-do-stuff-using-celery-using-redis-as-broker is an example of what you mentioned; I need to separate the services. Thanks for the tip! Commented Oct 30, 2018 at 12:55
  • There's no direct dependency between the celery and uwsgi processes, so a semicolon is OK; you can give it a try inside the same container. But it's better to make these services separate, for sure. Commented Oct 30, 2018 at 12:57
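As the comments point out, celery worker stays in the foreground and never exits, so the shell never gets past the && to start uwsgi. A minimal single-container stopgap, if you keep the combined jobs service, is to background Celery with a single & (note this still breaks the "1 process - 1 container" rule, and if the worker dies the container stays up; the two-service split in the answers below is the better fix):

    command: >
      bash -c "celery worker -A tasks.async_service -Q dynamo-queue -E --loglevel=ERROR &
               uwsgi --socket 0.0.0.0:8080 --protocol=http --wsgi-file nubo_async/wsgi.py"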

2 Answers


You may refer to the docker-compose setup of the Saleor project. I would suggest letting Celery run as its own service, depending only on redis as the broker. See the configuration in the docker-compose.yml file:

services:
  web:
    build:
      context: .
      dockerfile: ./Dockerfile
      args:
        STATIC_URL: '/static/'
    restart: unless-stopped
    networks:
      - saleor-backend-tier
    env_file: common.env
    depends_on:
      - db
      - redis

  celery:
    build:
      context: .
      dockerfile: ./Dockerfile
      args:
        STATIC_URL: '/static/'
    command: celery -A saleor worker --app=saleor.celeryconf:app --loglevel=info
    restart: unless-stopped
    networks:
      - saleor-backend-tier
    env_file: common.env
    depends_on:
      - redis

Note also that the connections from both services to redis are configured separately through environment variables, as shown in the common.env file:

CACHE_URL=redis://redis:6379/0
CELERY_BROKER_URL=redis://redis:6379/1
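The excerpt above assumes a redis service and the saleor-backend-tier network are defined elsewhere in the same compose file; a minimal sketch of those pieces, with the service and network names taken from the excerpt (the image tag is an assumption, any recent Redis image works):

  redis:
    image: redis:5.0-alpine
    restart: unless-stopped
    networks:
      - saleor-backend-tier

networks:
  saleor-backend-tier:
    driver: bridge

Using different Redis database numbers (/0 for the cache, /1 for the broker) keeps a cache flush from discarding queued Celery tasks.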



Here's the docker-compose file as suggested by @Satevg, which runs the Django and Celery applications in separate containers. Works fine!

version: '3.3'

services:
  redis:
    image: redis:3.2-alpine
    volumes:
      - redis_data:/data
    ports:
      - "6379:6379"
  dynamodb:
    image: dwmkerr/dynamodb
    ports:
      - "3000:8000"
    volumes:
      - dynamodb_data:/data
  jobs:
    build:
      context: nubo-async-cfe-services
      dockerfile: Dockerfile
    environment:
      - REDIS_HOST=redis
      - PYTHONUNBUFFERED=0
      - CC_DYNAMODB_NAMESPACE=None
      - CC_DYNAMODB_ACCESS_KEY_ID=anything
      - CC_DYNAMODB_SECRET_ACCESS_KEY=anything
      - CC_DYNAMODB_HOST=dynamodb
      - CC_DYNAMODB_PORT=8000
      - CC_DYNAMODB_IS_SECURE=False
    command: bash -c "uwsgi --socket 0.0.0.0:8080 --protocol=http --wsgi-file nubo_async/wsgi.py"
    depends_on:
      - redis
      - dynamodb
    volumes:
      - .:/jobs
    ports:
      - "9090:8080"
  celery:
    build:
      context: nubo-async-cfe-services
      dockerfile: Dockerfile
    environment:
      - REDIS_HOST=redis
      - PYTHONUNBUFFERED=0
      - CC_DYNAMODB_NAMESPACE=None
      - CC_DYNAMODB_ACCESS_KEY_ID=anything
      - CC_DYNAMODB_SECRET_ACCESS_KEY=anything
      - CC_DYNAMODB_HOST=dynamodb
      - CC_DYNAMODB_PORT=8000
      - CC_DYNAMODB_IS_SECURE=False
    command: celery worker -A tasks.async_service -Q dynamo-queue -E --loglevel=ERROR
    depends_on:
      - redis
      - dynamodb
    volumes:
      - .:/jobs
volumes:
  redis_data:
  dynamodb_data:

1 Comment

I refactored your config a little bit (DRY ;)): pastebin.com/Mbz72QyD. See the jobs service definition.
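The pastebin isn't reproduced here, but a typical DRY refactoring of the compose file above shares the duplicated build, environment, depends_on, and volumes blocks between the two services with a YAML anchor. A sketch under that assumption (the x-app-base name is illustrative, and extension fields require compose file format 3.4 or later):

version: '3.4'

x-app-base: &app-base
  build:
    context: nubo-async-cfe-services
    dockerfile: Dockerfile
  environment:
    - REDIS_HOST=redis
    - PYTHONUNBUFFERED=0
    - CC_DYNAMODB_NAMESPACE=None
    - CC_DYNAMODB_ACCESS_KEY_ID=anything
    - CC_DYNAMODB_SECRET_ACCESS_KEY=anything
    - CC_DYNAMODB_HOST=dynamodb
    - CC_DYNAMODB_PORT=8000
    - CC_DYNAMODB_IS_SECURE=False
  depends_on:
    - redis
    - dynamodb
  volumes:
    - .:/jobs

services:
  jobs:
    <<: *app-base
    command: uwsgi --socket 0.0.0.0:8080 --protocol=http --wsgi-file nubo_async/wsgi.py
    ports:
      - "9090:8080"
  celery:
    <<: *app-base
    command: celery worker -A tasks.async_service -Q dynamo-queue -E --loglevel=ERROR

The redis and dynamodb services and the named volumes stay exactly as in the answer above; only the two application services change.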
