
I'm starting to use Google Cloud Functions and I've seen that there's an option for automated deploys from Bitbucket. I have multiple functions to deploy. Should I have one repo per function, or can I have a single repo divided into directories, or something else?

This is what I'm referring to: Deploying from Source Control

Thanks.

1 Answer

You can have multiple functions in a single repo. A common structure would be as follows:

.
├── common
│   ├── module1.py
│   └── module2.py
├── main.py
└── requirements.txt

Where main.py contains both functions:

from common import module1, module2

def cloudfunction1(request):
    ...

def cloudfunction2(request):
    ...
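For illustration, one of the shared modules under `common/` could look like this. The module and function names here are just placeholders matching the tree above, not anything prescribed by Cloud Functions:

```python
# common/module1.py -- a hypothetical shared helper module.
# Both cloudfunction1 and cloudfunction2 in main.py can import
# and reuse it, so the logic lives in one place.

def format_greeting(name: str) -> str:
    """Build a greeting string shared by both functions."""
    return f"Hello, {name}!"
```

A handler in `main.py` would then just call `module1.format_greeting(...)` and return the result.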

And you deploy those functions either directly by name:

$ gcloud functions deploy cloudfunction1 --runtime python37 --trigger-http --source https://source.developers.google.com/...
$ gcloud functions deploy cloudfunction2 --runtime python37 --trigger-http --source https://source.developers.google.com/...

Or by entrypoint:

$ gcloud functions deploy foo --runtime python37 --entry-point cloudfunction1 --trigger-http --source https://source.developers.google.com/...
$ gcloud functions deploy bar --runtime python37 --entry-point cloudfunction2 --trigger-http --source https://source.developers.google.com/...

2 Comments

Thanks for the answer.
Sadly main.py can get really big, loading many modules and affecting cold start. It would be better if we could specify a different main file for each function.
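One common mitigation for this (a general Python pattern, not something from the answer above) is to defer heavy imports into the handler body, so each function's cold start only pays for the modules it actually uses. Here `json` stands in for a genuinely heavy dependency:

```python
def cloudfunction1(request):
    # Import the heavy dependency lazily, inside the handler,
    # so instances serving cloudfunction2 never load it.
    import json  # stand-in for a heavy library such as pandas
    return json.dumps({"ok": True})

def cloudfunction2(request):
    # This handler has no heavy imports at all, so its cold
    # start is unaffected by cloudfunction1's dependencies.
    return "hello"
```

The trade-off is that the first request to each instance pays the import cost instead of the deploy-time module load, but unrelated functions in the same `main.py` no longer slow each other down.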
