
I am using an AWS CodePipeline that fetches the source from my Git repository, uses a CodeBuild buildspec to build and save the output artifacts to an S3 bucket, and ultimately deploys them to Elastic Beanstalk (Node.js environment).

Everything works fine, but I need the pipeline to copy one particular file from one of my AWS S3 buckets and add it to the output artifacts before deploying them to Elastic Beanstalk.

Can this be done using the buildspec?

artifacts:
  files:
    - '**/*'
    # - How to add a file from S3 to the artifacts?

1 Answer

My recommendation is, as part of the build or post_build phase, to copy the required file from S3 into your build directory.

build:
  commands:
    - echo "Build commands"
    - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"

Then you will have the file copied from S3, available in your build directory, and it can be included in the artifacts output.
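
For instance, a minimal buildspec sketch along these lines (reusing the placeholder region, bucket path, and local file name from the command above) would be:

version: 0.2

phases:
  build:
    commands:
      - echo "Build commands"
      # Copy the required file from S3 into the build directory
      - aws s3 cp --region=xx-xxxx-x "s3://file/in/s3" "local-file-instance-to-include"

artifacts:
  files:
    - '**/*'   # the copied file now sits in the build directory, so it is included here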


3 Comments

Tried adding the above command and now I am getting this error: COMMAND_EXECUTION_ERROR: Error while executing command: aws s3 cp ...... Reason: exit status 1
The issue was with the permissions. The CodeBuild role didn't have permission to access S3 buckets. Thanks.
Worked perfectly, but since I am loading a private key, I don't want to leave the bucket open to the public. Do you know how to apply policies in this approach?
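
One way to handle this, as a sketch (the role name, policy name, and object ARN below are placeholders, not values from this thread), is to keep the bucket private and attach an inline policy to the CodeBuild service role that allows s3:GetObject on just that key:

aws iam put-role-policy \
  --role-name MyCodeBuildServiceRole \
  --policy-name AllowPrivateKeyRead \
  --policy-document '{
    "Version": "2012-10-17",
    "Statement": [
      {
        "Effect": "Allow",
        "Action": "s3:GetObject",
        "Resource": "arn:aws:s3:::my-bucket/path/to/private-key"
      }
    ]
  }'

With the bucket blocked from public access, the aws s3 cp command in the buildspec then works because the build runs under that role, not as an anonymous caller.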
