Release with GitHub Actions to S3


Did you ever want to release your CDK construct as a synthesized CloudFormation template from GitHub Actions to an S3 bucket? Here are the steps that are needed, and my opinionated solution!

Questions that I had before:

  • How do I create a public S3 bucket via CDK or the command line?
  • How do I extend projen?
  • How do I synthesize the CDK construct as a CloudFormation template?
  • How do I publish to S3 buckets from GitHub Actions? And how do I create a role with the appropriate permissions?

Let’s go 🏃

The Challenge

  1. Synthesize a CDK construct to a CloudFormation template and release it to an S3 bucket
  2. All in projen
    • extending the workflow
    • and having multiline GitHub Actions run commands
  3. Split into asset buckets per region (this link explains why) and one release bucket for the main template
  4. Everything via GitHub Actions with OIDC
    • see here how to set it up
    • and here how to harden it so you don't get pwned via pull requests, as described here.

The solution

This section describes the solutions, covering all the questions asked at the beginning.

S3 Bucket creation

First, we create a publicly read-accessible S3 bucket. Not via ClickOps but the command line:

export MY_PUBLISH_BUCKET="foobucket-12232929"

# Step 1: Create the bucket
aws s3 mb s3://$MY_PUBLISH_BUCKET --region eu-central-1

# Step 2: change the public access block
aws s3api put-public-access-block \
  --bucket $MY_PUBLISH_BUCKET \
  --public-access-block-configuration "BlockPublicAcls=false,IgnorePublicAcls=false,BlockPublicPolicy=false,RestrictPublicBuckets=false"

# Step 3: Apply a bucket policy granting public read access
aws s3api put-bucket-policy --bucket $MY_PUBLISH_BUCKET --policy '{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AddPublicReadAccess",
            "Effect": "Allow",
            "Principal": "*",
            "Action": "s3:GetObject",
            "Resource": "arn:aws:s3:::'"$MY_PUBLISH_BUCKET"'/*"
        }
    ]
}'

Via CDK, however, it is easier. See the following snippet:

const fooBucket = new s3.Bucket(stackUnderTest, 'fooBucket', {
  bucketName: 'foobucket-12232929', // DNS compatible and unique
  versioned: false,
  blockPublicAccess: new s3.BlockPublicAccess({
    blockPublicAcls: false,
    blockPublicPolicy: false,
    ignorePublicAcls: false,
    restrictPublicBuckets: false,
  }),
  publicReadAccess: true,
});

// grant put to the upload role (more on its creation later in the text)
fooBucket.grantPut(myUploadRole, 'myObjectKeyPattern/*');

You might think it's now enough to be able to upload/put data into the created bucket, right? In GitHub Actions I wanted to use it as follows:

jobs:
  upload:
    name: Upload to Amazon S3
    runs-on: ubuntu-latest
    permissions:
      id-token: write # needed to interact with GitHub's OIDC Token endpoint
      contents: read
    steps:
      - name: Checkout
        uses: actions/checkout@v4

      - name: Configure AWS credentials
        uses: aws-actions/configure-aws-credentials@v4
        with:
          role-to-assume: arn:aws:iam::123456789012:role/myUploadRole
          #role-session-name: MySessionName # Optional
          aws-region: eu-central-1

      - name: Sync files to S3
        run: |
          aws s3 sync . s3://foobucket-12232929

Unfortunately, the following error occurred:

error  : [100%] fail: Bucket named 'foobucket-12232929' exists, but not in account 123456789012. Wrong account?
Failure: Error: Bucket named 'foobucket-12232929' exists, but not in account 123456789012. Wrong account?

Hmm 🤔 but the bucket exists in the mentioned account. How is this possible? The solution is an additional permission on the role for determining the bucket location. See the comment in the issue here.

So we added the permission as follows:

new iam.PolicyStatement({
  actions: ['s3:GetBucketLocation'],
  resources: [fooBucket.bucketArn],
});

Now that the bucket and policies are set up, let's move on to extending the projen workflow.

Extending projen workflows

After researching, I found some snippets on how it can be extended. The relevant part of .projenrc.ts, which is a bit lengthy, comes now with explanations inline:

const buildWorkflow = project.github!.workflows.find(w => w.name === 'build');

// general
const releaseRetries = '20'; // needs to be a string and will be parsed
// comma separated list of regions
const releaseRegions = 'eu-central-1,eu-west-1';
const releasePrefix = 'rootmail'; // will be used as prefix for the S3 path
// dev
const devS3PublishBucket = 'mvc-tmp-dev-releases';
const devS3FileAssetsBucketPrefix = 'mvc-tmp-dev-assets'; // will get '${AWS::Region}' appended
// only yarn is supported atm
const packageManager = project.package.packageManager;
switch (packageManager) {
  case 'yarn':
  case 'yarn2':
  case 'yarn_classic':
  case 'yarn_berry':
    // only yarn is supported atm
    break;
  case 'npm':
  case 'pnpm':
  case 'bun':
  default:
    throw new Error(`Unsupported package manager atm: ${packageManager}`);
}

// for multi-line strings in YAML
const installToolDependenciesSteps = [
  'pip install cfn-flip && cfn-flip --version',
  'yarn global add aws-cdk',
];

const buildAndPublishAssetsSteps = [
  'export RELEASE_VERSION=$(cat $GITHUB_WORKSPACE/dist/releasetag.txt)',
  'echo "Releasing ${CI_REPOSITORY_NAME} with prefix ${RELEASE_PREFIX} and version ${RELEASE_VERSION} to S3 bucket ${S3_PUBLISH_BUCKET} and file assets bucket prefix ${S3_FILE_ASSETS_BUCKET_PREFIX}"',
  'yarn install',
  // 1️⃣ uses the src/index-cli-synth.ts file (explained later)
  'yarn synth',
  // 2️⃣ publishes the assets to the defined s3 asset buckets (1 per region)
  'yarn publish-assets',
  // 3️⃣ puts the templates to s3 publish bucket
  'aws s3 cp cdk.out/${RELEASE_NAME}.template.json s3://${S3_PUBLISH_BUCKET}/${RELEASE_PREFIX}/${RELEASE_VERSION}/',
  'cfn-flip cdk.out/${RELEASE_NAME}.template.json cdk.out/${RELEASE_NAME}.template.yaml',
  'aws s3 cp cdk.out/${RELEASE_NAME}.template.yaml s3://${S3_PUBLISH_BUCKET}/${RELEASE_PREFIX}/${RELEASE_VERSION}/',
];

// if there is a buildWorkflow (which we know 😎) we extend it
if (buildWorkflow) {
  buildWorkflow.addJobs({
    release_s3_dev: {
      name: 'Release to S3 (dev)',
      runsOn: ['ubuntu-latest'],
      needs: ['build'],
      // self-mutation did not happen and the PR is from the same repo
      if: '!(needs.build.outputs.self_mutation_happened) && !(github.event.pull_request.head.repo.full_name != github.repository)',
      permissions: {
        idToken: JobPermission.WRITE,
        contents: JobPermission.READ,
      },
      steps: [
        {
          name: 'Checkout',
          uses: 'actions/checkout@v4',
        },
        // NOTE: using a secret so the role and account are not exposed.
        // Additionally it's tied to the owner and repo
        {
          name: 'Configure AWS credentials',
          uses: 'aws-actions/configure-aws-credentials@v4',
          with: {
            'aws-region': 'eu-west-1',
            'role-to-assume': '${{ secrets.DEV_RELEASE_ROLE }}',
          },
        },
        {
          name: 'Setup Node.js',
          uses: 'actions/setup-node@v4',
          with: {
            'node-version': '18.x',
          },
        },
        {
          name: 'Install Build and publish assets dependencies',
          run: installToolDependenciesSteps.join('\n'),
        },
        // we need this to be able to synth in a later step
        {
          name: 'Additional install, build and synth',
          run: 'yarn install && yarn build',
        },
        // so we can use envs such as CI_HEAD_REF_SLUG
        {
          name: 'Inject environment variables',
          uses: 'FranzDiebold/github-env-vars-action@v2',
        },
        // we create a sortable version
        {
          name: 'Prepare version for branch',
          run: 'mkdir -p $GITHUB_WORKSPACE/dist && echo "0.0.0-${CI_HEAD_REF_SLUG}-$(date -u +\'%Y%m%d-%H%M%S\')-${GITHUB_SHA::8}" > $GITHUB_WORKSPACE/dist/releasetag.txt && cat $GITHUB_WORKSPACE/dist/releasetag.txt',
        },
        {
          name: 'Build and publish assets',
          run: buildAndPublishAssetsSteps.join('\n'),
          // used by the scripts in the commands
          env: {
            S3_PUBLISH_BUCKET: devS3PublishBucket,
            S3_FILE_ASSETS_BUCKET_PREFIX: devS3FileAssetsBucketPrefix,
            RELEASE_RETRIES: releaseRetries,
            RELEASE_REGIONS: releaseRegions,
            RELEASE_PREFIX: releasePrefix,
          },
        },
      ],
    },
  });
}

// we define the scripts
project.package.setScript('synth', 'npx cdk synth -q');
project.package.setScript('publish-assets', 'npx ts-node -P tsconfig.json --prefer-ts-exts src/scripts/publish-assets.ts');
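A side note on the 'Prepare version for branch' step above: because the date and time in the generated tag are zero-padded and ordered from most to least significant, plain lexicographic sorting orders the tags chronologically. A quick sketch with two hypothetical tags:

```shell
# two hypothetical dev release tags, generated one minute apart
T1="0.0.0-my-branch-20240102-130455-abcdef12"
T2="0.0.0-my-branch-20240102-130555-12345678"
# lexicographic sort puts the earlier tag first
printf '%s\n%s\n' "$T2" "$T1" | sort
```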

What is missing is the src/scripts/publish-assets.ts file, which comes now:

import { execSync } from 'child_process';
import * as path from 'path';
import * as retry from 'async-retry';

// envs from the GitHub workflow
if (!process.env.RELEASE_NAME || process.env.RELEASE_NAME === '') {
  throw new Error('RELEASE_NAME environment variable must be set');
}

if (!process.env.RELEASE_RETRIES || process.env.RELEASE_RETRIES === '') {
  throw new Error('RELEASE_RETRIES environment variable must be set');
}

if (!process.env.RELEASE_REGIONS || process.env.RELEASE_REGIONS.length === 0) {
  throw new Error('RELEASE_REGIONS environment variable must be set and not empty');
}

const releaseName = process.env.RELEASE_NAME;
const releaseRetries = parseInt(process.env.RELEASE_RETRIES, 10);
const releaseRegions = process.env.RELEASE_REGIONS.split(',');
console.log(`Publishing assets for release ${releaseName} with ${releaseRetries} retries to regions: ${releaseRegions}`);

const main = async () => {
  const assetManifestPath = path.resolve(__dirname, '..', '..', 'cdk.out', `${releaseName}.assets.json`);
  for (const region of releaseRegions) {
    // this command actually publishes the assets 👇🏽
    const command = `AWS_REGION=${region} yarn cdk-assets publish -p ${assetManifestPath}`;
    await retry(async (_: any, attempt: number) => {
      console.log(`Attempt ${attempt} of ${releaseRetries} in region ${region}`);
      execSync(command, { stdio: 'inherit' });
    }, {
      retries: releaseRetries,
      factor: 2,
      minTimeout: 1000,
      maxTimeout: 30000,
    });
  }
};

// the usual async main wrapper with error handling
(async () => { await main(); })().catch(e => {
  console.error(e);
  process.exit(1);
});
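For intuition on the retry options in the script: async-retry uses exponential backoff, where the wait before attempt n is roughly minTimeout * factor^(n-1), capped at maxTimeout (the real library additionally randomizes the delays). A minimal sketch, not the library's actual implementation:

```typescript
// sketch: delays produced by exponential backoff without jitter
function backoffDelays(retries: number, factor: number, minTimeout: number, maxTimeout: number): number[] {
  const delays: number[] = [];
  for (let attempt = 1; attempt <= retries; attempt++) {
    // grow geometrically, but never exceed the cap
    delays.push(Math.min(minTimeout * Math.pow(factor, attempt - 1), maxTimeout));
  }
  return delays;
}

// with the settings used above (factor 2, min 1s, max 30s):
console.log(backoffDelays(6, 2, 1000, 30000)); // [1000, 2000, 4000, 8000, 16000, 30000]
```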

You might wonder why we release to multiple AWS Regions. This is because, for custom resources, the code of the Lambda function is hosted in an S3 bucket, namely the bucket you release the synthesized construct to. This bucket has to be in the same AWS Region as the Lambda function; see here for details. This is why we have to distribute the code across all Regions you want to deploy the stack in.
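To make this concrete, in the synthesized template the Lambda code reference ends up pointing at the regional assets bucket, roughly like the following fragment (illustrative only; the bucket name reuses the dev prefix from above and the key is hypothetical):

```json
{
  "Code": {
    "S3Bucket": { "Fn::Sub": "mvc-tmp-dev-assets-${AWS::Region}" },
    "S3Key": "rootmail/0.0.1/0123456789abcdef.zip"
  }
}
```

Since ${AWS::Region} only resolves at deploy time, the asset object must already exist in every Region the stack may be deployed to.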

How to synthesize to cloudformation

Now that we have clarified how to release, we need to put the final pieces together.

We need an additional cdk.json file in the root of the construct to be able to synthesize it.

{
  "app": "npx ts-node -P tsconfig.json --prefer-ts-exts src/index-cli-synth.ts",
  "output": "cdk.out",
  "watch": {
    "include": ["**"],
    "exclude": ["README.md", "cdk*.json", "**/*.d.ts", "**/*.js", "tsconfig.json", "package*.json", "yarn.lock", "node_modules"]
  }
}

This is only possible if we wrap the construct into a Stack. Therefore, we need the src/index-cli-synth.ts file, where we do exactly that to create the CloudFormation template:

class RootmailStack extends Stack {
  constructor(scope: Construct, id: string, props: RootmailStackProps) {
    super(scope, id, props);

    const domain = new CfnParameter(this, 'Domain', {
      type: 'String',
      description: 'Domain used for root mail feature.',
    });

    // other input parameters ...

    // here we define the 👇🏽 actual construct within the stack
    new Rootmail(this, 'Rootmail', {
      domain: domain.valueAsString,
      subdomain: subdomain.valueAsString,
      totalTimeToWireDNS: Duration.minutes(totalTimeToWireDNS.valueAsNumber),
      wireDNSToHostedZoneID: wireDNSToHostedZoneID.valueAsString.trim(),
    });
  }
}

const app = new App();

let releaseVersion = process.env.RELEASE_VERSION;
if (!process.env.RELEASE_VERSION || process.env.RELEASE_VERSION === '') {
  console.log('RELEASE_VERSION is not set. Using default \'0.0.0-DEVELOPMENT\'');
  releaseVersion = '0.0.0-DEVELOPMENT';
}

if (!process.env.RELEASE_NAME || process.env.RELEASE_NAME === '') {
  throw new Error('RELEASE_NAME environment variable must be set');
}
const releaseName = process.env.RELEASE_NAME;

if (!process.env.RELEASE_PREFIX || process.env.RELEASE_PREFIX === '') {
  throw new Error('RELEASE_PREFIX environment variable must be set');
}
const releasePrefix = process.env.RELEASE_PREFIX;

if (!process.env.S3_FILE_ASSETS_BUCKET_PREFIX || process.env.S3_FILE_ASSETS_BUCKET_PREFIX === '') {
  throw new Error('S3_FILE_ASSETS_BUCKET_PREFIX environment variable must be set');
}
const s3FileAssetsBucketPrefix = process.env.S3_FILE_ASSETS_BUCKET_PREFIX;

console.log(`Using RELEASE_NAME: '${releaseName}' RELEASE_VERSION: '${releaseVersion}' with prefix: '${releasePrefix}'`);
new RootmailStack(app, releaseName, {
  version: releaseVersion,
  synthesizer: new CliCredentialsStackSynthesizer({
    fileAssetsBucketName: `${s3FileAssetsBucketPrefix}-\${AWS::Region}`,
    bucketPrefix: `${releasePrefix}/${releaseVersion}/`,
  }),
});

Now that we have plugged everything together locally and in GitHub Actions, the missing piece is the appropriate permissions for the role that publishes. This comes now.

How to create the IAM role with appropriate permissions

The role used in the GitHub Actions workflow needs to be created upfront. After researching, I came across the aws-cdk-github-oidc construct, which does the job.

It is best to have a separate CDK app project in which you create the public S3 buckets, the roles, and the permission associations.

Here comes a small snippet:

import { GithubActionsIdentityProvider, GithubActionsRole } from 'aws-cdk-github-oidc';
// ...
const uploadRole = new GithubActionsRole(this, roleName, {
  maxSessionDuration: Duration.hours(1),
  provider: provider,
  owner: owner,
  repo: repo,
  filter: filter,
});

releaseBucket.grantPut(uploadRole, releaseObjectKeysPattern);

// additionally allow determining the bucket location (see the error above)
uploadRole.addToPolicy(new iam.PolicyStatement({
  actions: ['s3:GetBucketLocation'],
  resources: [releaseBucket.bucketArn],
}));

For now you can see the code in the PR.


It was a journey to dig through all the little details, but now we have the boilerplate to release our constructs to S3 in an automated way.

What surprised me was that GitHub Actions does not have SLUG variables, such as CI_HEAD_REF_SLUG 🤔 Using FranzDiebold/github-env-vars-action@v2 solved it.
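For illustration, a slug is essentially the branch name lowercased with unsafe characters replaced, which is what the action computes for you. A simplified sketch (not the action's exact slugification rules):

```shell
# hypothetical branch name
BRANCH="feat/My-Feature"
# lowercase, then replace everything outside [a-z0-9] with '-'
SLUG=$(echo "$BRANCH" | tr '[:upper:]' '[:lower:]' | sed 's/[^a-z0-9]/-/g')
echo "$SLUG" # feat-my-feature
```

The resulting slug is safe to embed in S3 keys and version strings such as the release tag above.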

Furthermore, we realized that in projen we can add jobs to a workflow, but not add steps to an existing job.

As said, I will create a separate project on GitHub with the boilerplate code. Stay tuned!

UPDATE: A separate project on GitHub with the code wrapped for clarity is here: 🏃

Like what you read? You can hire me, or drop me a message to see which services 💻 may help you 👇