I would like to run Python scripts in various stages of Jenkins (pipeline) jobs, across a wide range of agents. I want the same Python environment on all of them, so I'm considering using Docker for this purpose.
Specifically, I'm thinking of building an image that contains the Python environment (installed packages, etc.) and can then run an external Python script passed as an argument:
docker run my_image my_python_file.py
My question is: how should the infrastructure look? I see that the official Python Docker image is 688 MB, and transferring it to every agent would surely add overhead. However, they are all on the same network, so perhaps it isn't a big issue.
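For the Jenkins side, one common pattern is to run the stage's steps inside a container from that image, so each agent pulls the image once and caches it for later builds. A minimal Declarative Pipeline sketch, assuming the Docker Pipeline plugin is installed and the image is published as `my-app` to a registry the agents can reach:

```groovy
pipeline {
    agent any
    stages {
        stage('Run script') {
            steps {
                script {
                    // docker.image(...).inside runs the enclosed steps in a
                    // container from the image; the Jenkins workspace is
                    // mounted into the container automatically.
                    docker.image('my-app').inside {
                        sh 'python3 my_python_file.py'
                    }
                }
            }
        }
    }
}
```

With this pattern the image is only an environment; the script itself stays in the workspace, so it doesn't have to be baked into the image.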
Update: my Dockerfile now looks like this:
FROM python:3.6-slim-jessie

# Install the pinned dependencies into the image
COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

# Default to an interactive Python shell when no command is given
CMD ["python3"]
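To get the `docker run my_image my_python_file.py` invocation from the top of the question working, an ENTRYPOINT can fix the interpreter while leaving the script name as an argument. A variant of the Dockerfile above (a sketch, assuming the script is mounted or copied into the container at run time rather than built into the image):

```dockerfile
FROM python:3.6-slim-jessie

COPY requirements.txt ./
RUN pip install --no-cache-dir -r requirements.txt

# python3 is always the executable; whatever follows the image name
# on the docker run command line is passed to it as arguments.
ENTRYPOINT ["python3"]
```

With this, mounting the current directory makes the original invocation work:

> docker run --rm -v "$PWD":/work -w /work my-app my_python_file.py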
Then I build the image using
> docker build -t my-app .
which successfully builds the image and installs my requirements. Then I want to start a container from the image in detached mode using
> docker run -dit my-app
Then I execute my script inside the running container using

> docker exec -d {CONTAINER_ID} python3 my-script.py

(the script has to be run through the interpreter rather than invoked directly, and it must already exist inside the container, e.g. copied in with docker cp or mounted as a volume).
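An alternative that avoids keeping a long-running container around: since the script lives in the Jenkins workspace rather than in the image, each stage can mount the workspace and run the script in a throwaway container. A sketch, assuming the script sits in the current directory and the image is named `my-app` as above:

```shell
# Run the script in a one-shot container; --rm removes it afterwards.
# The current directory (the Jenkins workspace) is mounted at /work so
# the container sees my-script.py without it being baked into the image.
docker run --rm \
  -v "$PWD":/work \
  -w /work \
  my-app \
  python3 my-script.py
```

This keeps the image purely as an environment, and the script's exit code becomes the exit code of `docker run`, which Jenkins can use to fail the step.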