I am using CodePipeline to deploy a CloudFormation template. The problem is that this template has some nested stacks, and the nested stack templates need to be in an S3 bucket. So before triggering the master (parent) CloudFormation template, I need to upload the nested stack templates to S3.

I haven't found a way to do that using CodePipeline.

Any suggestions?

2 Answers

One approach is to use Git hooks to copy the nested stack templates to S3, e.g. a post-receive hook.
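
For illustration, here is a minimal sketch of a script such a hook could run after checking out the templates; the bucket name, key prefix, and the template directory passed as the first argument are assumptions you would replace with your own:

#!/usr/bin/env python3
# Sketch of a script a post-receive hook could call after checking out the repo.
# Bucket, prefix, and the template directory argument are placeholders.
import pathlib
import sys

import boto3

BUCKET = "my-nested-stacks-bucket"        # assumed bucket name
PREFIX = "nested-stacks"                  # assumed key prefix

s3 = boto3.client("s3")
template_dir = pathlib.Path(sys.argv[1])  # directory containing the nested stack templates

for template in sorted(template_dir.glob("*.yml")):
    key = "{}/{}".format(PREFIX, template.name)
    # The parent template can then point its TemplateURL properties at these S3 objects.
    s3.upload_file(str(template), BUCKET, key)
    print("Uploaded {} to s3://{}/{}".format(template.name, BUCKET, key))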

Another is to add a stage to the pipeline that invokes a Lambda function. You can follow this article to configure that step. When you set the "input artifacts" field, CodePipeline passes the S3 location of the artifacts zip file as part of the event. The Lambda function then extracts the zip file and uploads your nested stack templates to your bucket.

Below is sample Python code that downloads and extracts the artifacts to /tmp:

import zipfile

import boto3
from botocore.exceptions import ClientError

def lambda_handler(event, context):
    s3 = boto3.resource('s3')
    codepipeline = boto3.client('codepipeline')

    # CodePipeline passes the S3 location of the input artifact zip in the job event
    artifacts_location = event["CodePipeline.job"]["data"]["inputArtifacts"][0]["location"]["s3Location"]
    job_id = event["CodePipeline.job"]["id"]

    try:
        print("Downloading artifacts")
        s3.Bucket(artifacts_location["bucketName"]).download_file(artifacts_location["objectKey"], '/tmp/artifacts.zip')
        with zipfile.ZipFile('/tmp/artifacts.zip', 'r') as zip_ref:
            zip_ref.extractall('/tmp')
    except ClientError as e:
        print("Cannot process the artifacts: {}".format(str(e)))
        # Report the failure so the pipeline action doesn't hang until it times out
        codepipeline.put_job_failure_result(
            jobId=job_id,
            failureDetails={"type": 'JobFailed', "message": str(e)}
        )
        return

    # Perform the steps to copy your files from the /tmp folder, then report success.
    codepipeline.put_job_success_result(jobId=job_id)
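
The copy step at the end is only a comment; a minimal sketch of what it might look like is below. The upload_nested_stacks helper, the /tmp/nested-stacks directory, and the bucket/prefix defaults are assumptions you would adapt to your artifact layout:

import os

import boto3

def upload_nested_stacks(bucket, prefix="nested-stacks", source_dir="/tmp/nested-stacks"):
    # Walk the extracted artifact directory and upload every nested stack template,
    # preserving the relative path under the given key prefix.
    s3 = boto3.client("s3")
    for root, _, files in os.walk(source_dir):
        for name in files:
            local_path = os.path.join(root, name)
            key = "{}/{}".format(prefix, os.path.relpath(local_path, source_dir))
            s3.upload_file(local_path, bucket, key)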

We solved this by adding a CodeBuild action which uses aws cloudformation package to upload the nested stack templates to S3.

Here is a sample buildspec.yml for CodeBuild that shows how this is done:

version: 0.2

phases:
  build:
    commands:
      - CLOUDFORMATION_SRC_DIR="$CODEBUILD_SRC_DIR/cloudformation"
      - CFN_TMP=`mktemp` && aws cloudformation package --template-file "$CLOUDFORMATION_SRC_DIR/template.yml" --s3-bucket "my-s3-bucket" --s3-prefix "cfn_package/$CODEBUILD_BUILD_ID" --output-template-file "$CFN_TMP" && mv "$CFN_TMP" "$CLOUDFORMATION_SRC_DIR/template.yml"

artifacts:
  secondary-artifacts:
    cloudformation:
      base-directory: $CLOUDFORMATION_SRC_DIR
      files:
        - '**/*'
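
For context, aws cloudformation package uploads any local templates referenced by AWS::CloudFormation::Stack resources (and other local artifacts) to the given bucket and prefix, then rewrites those references to S3 URLs in the output template. The packaged template emitted as the cloudformation secondary artifact can therefore be passed straight to a CloudFormation deploy action later in the pipeline.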
