Matt Martz
martzcodes

CDK Pipelines Crash Course

Photo by Conor Samuel on Unsplash

Part 1 of a 2-part series on CDK Pipelines

Feb 9, 2022 · 7 min read

Table of contents

  • Crashing into CDK Pipelines
  • Automating the Pull Request Checks
  • Recap

CDK Pipelines is an opinionated construct library. It is purpose-built to deploy one or more copies of your CDK applications using CloudFormation with a minimal amount of effort on your part. In this article we'll go through a crash course to get up and running with CDK Pipelines and CodeBuild Pull Request checks for your code stored on GitHub.

This is Part 1 of a two-part series. Part 2 will convert the code from this article into CDK Constructs and make deploying via a pipeline optional to improve Developer Experience.

If you're new to CDK... you might be getting a bit ahead of yourself. Maybe start with my freeCodeCamp CDK Crash Course instead? šŸ˜‰

The code for this is located at https://github.com/martzcodes/blog-pipeline

Crashing into CDK Pipelines

If you're trying this yourself, make sure your AWS Account is Bootstrapped to CDK 2.0: npx cdk@2.x bootstrap

Quick Start

Since the focus of this post is CDK Pipelines, not "how to create a CDK project," we're going to stick with the standard SQS Queue from cdk init...

$ mkdir blog-pipeline    
$ cd blog-pipeline
$ npx cdk@2.x init --language typescript

Naming things is hard, but I'm going to rename lib/blog-pipeline-stack.ts to lib/queue-stack.ts and uncomment the SQS queue within. This is the stack we want to deploy to ALL the places! šŸ’Ŗ

For the crash course portion of this article we'll be working in the bin/blog-pipeline.ts file, but since this is a two-part article I've archived it as bin/simple-pipeline.ts. We'll be doing some refactoring in part 2 but I wanted to preserve the original.

GitHub Credentials

In AWS CodePipeline, there are two supported versions of the GitHub source action:

  • The GitHub version 2 action uses GitHub App-based authentication backed by a CodeStarSourceConnection resource (the connection type also covers Bitbucket and GitHub Enterprise Server). It installs an AWS CodeStar Connections application into your GitHub organization so that you can manage access in GitHub.
  • The GitHub version 1 action uses OAuth tokens to authenticate with GitHub and uses a separate webhook to detect changes. This is no longer the recommended method (by AWS).

The GitHub version 2 Source Action is WOEFULLY underpowered. It works great if you only want your pipeline to run off a single branch all the time! But if you want to do any webhook-like things (automated Pull Request checks, anyone?) you have to use v1.

We want to do PR stuff later, so v1 it is! šŸ¤¬

That means we need an access token. In GitHub you can create a personal access token at https://github.com/settings/tokens and store it in Secrets Manager:

aws secretsmanager create-secret \
    --name BlogPipelineGitHubToken \
    --description "The GitHub access token for the blog-pipeline post." \
    --secret-string "{\"access-token\":\"šŸ¤«\"}"

In the response, you'll get back the arn for the secret.

{
    "ARN": "arn:aws:secretsmanager:{region}:{account}:secret:BlogPipelineGitHubToken-gwKanq",
    "Name": "BlogPipelineGitHubToken",
    "VersionId": "0b9d022c-7b20-4f7c-b883-c5a09ef58870"
}
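The secret string itself is a JSON object, which is why the jsonField option shows up later in the pipeline code: it selects a single key out of that JSON. Conceptually (with a placeholder token value):

```typescript
// Illustrative only: what the { jsonField: 'access-token' } option selects
// from the secret string created above. The token value is a placeholder.
const secretString = '{"access-token":"ghp_placeholder"}';
const accessToken = JSON.parse(secretString)['access-token'];
console.log(accessToken); // ghp_placeholder
```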

Creating the Pipeline

With that out of the way we can create the actual pipeline.

import { App, SecretValue, Stack, StackProps } from 'aws-cdk-lib';
import { BuildSpec } from 'aws-cdk-lib/aws-codebuild';
import {
    CodeBuildStep,
    CodePipeline,
    CodePipelineSource,
} from 'aws-cdk-lib/pipelines';
import { Construct } from 'constructs';

class BlogPipeline extends Stack {
    constructor(scope: Construct, id: string, props: StackProps) {
        super(scope, id, props);
        const owner = 'martzcodes';
        const repo = 'blog-pipeline';
        const branch = 'main';
        // this.region / this.account resolve from the env passed to the stack
        const secretArn = `arn:aws:secretsmanager:${this.region}:${this.account}:secret:BlogPipelineGitHubToken-gwKanq`;

        // Pipeline spec has to run synth at the end! That's how it knows what to deploy.
        const pipelineSpec = BuildSpec.fromObject({
            version: 0.2,
            phases: {
                install: {
                    commands: ['n latest', 'node -v', 'npm ci'],
                },
                build: {
                    commands: ['npx cdk synth'],
                },
            },
        });
        const synthAction = new CodeBuildStep(`Synth`, {
            input: CodePipelineSource.gitHub(`${owner}/${repo}`, branch, {
                authentication: SecretValue.secretsManager(secretArn, {
                    jsonField: 'access-token',
                }),
            }),
            partialBuildSpec: pipelineSpec,
            // commands is required, but the buildspec above already supplies them
            commands: [],
        });
        const pipeline = new CodePipeline(this, `Pipeline`, {
            synth: synthAction,
            dockerEnabledForSynth: true,
            // need this if you're actually deploying to multiple accounts
            // crossAccountKeys: true,
        });
    }
}

const app = new App();
new BlogPipeline(app, `BlogPipeline`, {});

There isn't a lot going on in this code. Our build is defined by a BuildSpec which runs in two phases: install and build. Those ultimately just install the npm dependencies and then run the cdk synth.

The BuildSpec is attached to a CodeBuildStep that we're going to call "Synth" via the partialBuildSpec prop (which is why the required commands array is left empty). This CodeBuildStep also includes the GitHub access token (stored in Secrets Manager) as the authentication for the Source Action of the builds. Any time a commit shows up on the specified branch, the pipeline will run.

From there we create the actual Pipeline and specify the synthAction.

Setting the Stage

A Stage is a CDK Construct that's like an Environment-Specific App. It's what you want to deploy into a specific environment (AWS Account and Region) and can consist of multiple stacks. A Stage is NOT a Stack... You can't add a Queue to a Stage, you have to add it to a Stack.

Our Stage will simply consist of the previous QueueStack.

class BlogPipelineStage extends Stage {
    constructor(scope: Construct, id: string, props: StageProps) {
        super(scope, id, props);
        new QueueStack(this, 'QueueStack', {});
    }
}

And we'll add it to the BlogPipeline stack by instantiating the Stage and adding it to the pipeline:

class BlogPipeline extends Stack {
    constructor(scope: Construct, id: string, props: StackProps) {
        super(scope, id, props);
        // ...
        const stage = new BlogPipelineStage(this, 'BlogPipelineStage', {});
        pipeline.addStage(stage);
    }
}

Deploying the Code

Now we can push our code to GitHub and then run npx cdk deploy. This deploys our CodePipeline via CloudFormation; the pipeline then pulls the code from GitHub and self-mutates to make sure it's up to date. From that point on, any commit to the main branch causes the pipeline to update itself. After the self-mutation, the Synthesis step of the pipeline will cdk synth all the stages to generate the CloudFormation templates and then deploy those to each environment. Later, if we add a new stage and commit it, the pipeline will deploy that too.

For a first deployment, it's important to PUSH FIRST then deploy because the pipeline will try to self-mutate based on the code that is stored in GitHub (which wouldn't be there if you didn't push).

Once that's done, we'll end up with two CloudFormation Stacks: one for the pipeline and one for the QueueStack (here with the original name before I renamed it... naming things is hard).

āš ļø Note: CDK Pipelines does NOT destroy stacks it deploys in this way. If you cdk destroy the pipeline it will NOT destroy the stacks that were deployed as part of the pipeline. Some would say that's a feature not a bug. šŸ˜…

Automating the Pull Request Checks

Deployments are great, but so are PR checks. To do that we can add some things to our Pipeline Stack.

First we need to "register" the GitHubSourceCredentials for CodeBuild. Oddly, this doesn't get passed in anywhere; you're just expected to do it. 👀

// GitHubSourceCredentials comes from aws-cdk-lib/aws-codebuild
new GitHubSourceCredentials(this, 'GitHubCreds', {
  accessToken: SecretValue.secretsManager(
    `arn:aws:secretsmanager:${this.region}:${this.account}:secret:BlogPipelineGitHubToken-gwKanq`,
    { jsonField: 'access-token' },
  ),
});

From there we add another build spec that runs our test command...

const prSpec = BuildSpec.fromObject({
  version: 0.2,
  phases: {
    install: {
      commands: ['n latest', 'node -v', 'npm ci'],
    },
    build: {
      commands: ['npm run test']
    }
  }
});

And define the source and project. In this case the source is set up as a webhook that only triggers the CodeBuild Project on Pull Request events coming from branches other than main. reportBuildStatus: true is what reports the success/failure back to GitHub.

const source = Source.gitHub({
  owner: owner,
  repo: repo,
  webhook: true,
  webhookFilters: [
    FilterGroup.inEventOf(
      EventAction.PULL_REQUEST_CREATED,
      EventAction.PULL_REQUEST_UPDATED,
      EventAction.PULL_REQUEST_REOPENED,
    ).andBranchIsNot('main'),
  ],
  reportBuildStatus: true,
});

new Project(this, 'PullRequestProject', {
  source,
  buildSpec: prSpec,
  concurrentBuildLimit: 1,
});

Normally pipeline modifications don't require an npx cdk deploy, but this isn't a pipeline modification... this adds a separate CodeBuild Project to the stack, so you DO need to deploy.

Checking our Work

With everything deployed we can make a Pull Request with a purposely failing test and see that CodeBuild automatically runs and our pull request fails (as intended) āœ…


All of the above code is in bin/simple-pipeline.ts

Recap

In part 1 of my two-part series on CDK Pipelines we wrote around 100 lines of code to create a Pipeline and CodeBuild project for doing PR checks... but none of it is reusable.

In part 2 we'll fix that and turn the code above into reusable CDK Constructs that you could publish in an npm library and use within your organization. We'll also improve Developer Experience by making the pipeline optional, enabling developers to deploy their own code to their own sandbox environments.
