Configuring a BitBucket Pipeline with Sigma CLI

BitBucket Pipelines enables you to create custom software development life cycle (SDLC) workflows directly in your BitBucket repository. Therefore, if you are using BitBucket as the version control system for your Sigma project, you can configure a pipeline that uses the Sigma CLI to automate the build and/or the deployment of your serverless application.

In this article, we discuss how to configure such a pipeline with the Sigma CLI, so that your project is built and deployed on each update to the master branch of your BitBucket repository.

Configuring the AWS Keys

Since the Sigma CLI needs access to our AWS account, we have to provide an AWS key pair in the pipeline configuration. However, we should not specify the keys in plain text, because anyone with read access to the repository could then read them, which is a major security concern. Moreover, if AWS detects that a key pair has been exposed in this manner, it automatically invalidates that key pair, so it becomes useless anyway.

So the best, and the BitBucket-recommended, approach is to configure the AWS access key and the access secret as secured variables in your project. We can configure these secrets either at the workspace/account level, at the repository level, or even at the deployment level. For this article, we’ll configure them at the repository level.

For that,

  • Click on the Repository Settings button on the left panel of your BitBucket repository dashboard.
  • Then scroll down to the PIPELINES section and click on the Repository Variables button.
  • Define a new variable with a suitable name such as AWS_ACCESS_KEY, and specify your AWS access key as the value. Make sure to tick the Secured checkbox before clicking the Add button.
  • In the same manner, add another secured variable for the AWS access secret, with a suitable name such as AWS_SECRET_KEY.
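
Once added, BitBucket injects these secured variables into every pipeline step as environment variables, and masks their values in the build logs. As a minimal illustrative sketch (a hypothetical step, not part of the pipeline we build below), a script can reference them with standard shell syntax and fail fast if they are missing:

pipelines:
  branches:
    master:
      - step:
          script:
            # Abort the step early if the secured variables are not configured.
            - test -n "$AWS_ACCESS_KEY" || (echo "AWS_ACCESS_KEY is not set" && exit 1)
            - test -n "$AWS_SECRET_KEY" || (echo "AWS_SECRET_KEY is not set" && exit 1)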

Creating a pipeline file

The next step is to create a Pipeline configuration file for the project. For that,

  • Click on the Pipelines button on the left panel of your BitBucket repository dashboard.
  • Then scroll down to the Choose a language template section and select Other from the drop-down.
  • A YAML file named bitbucket-pipelines.yml will then open in an editor, pre-populated with a sample pipeline configuration.

Configuring the pipeline

Now let’s build up our pipeline configuration in this YAML file, step by step.

Execution environment

First, we have to specify a Docker image that will be used to run each step of the pipeline. This image should have NodeJS version 10 or newer installed for the Sigma CLI, and it also needs Python 3 if your project contains Python Lambda functions. Therefore, we are going to use the nikolaik/python-nodejs:python3.8-nodejs14 Docker image, which has NodeJS 14 and Python 3.8 pre-installed. You can use any other suitable Docker image as well.

image: nikolaik/python-nodejs:python3.8-nodejs14

Triggers

Next, we need to define which events will trigger this pipeline. Since we want the pipeline to run automatically on each push to the master branch, we can configure it as follows.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:

Build project

After setting up the execution environment and the triggers, let’s get started on the pipeline steps. First, let’s configure the build step with the name Build Project. Since each pipeline step is executed in a fresh Docker container, we need to install the Sigma CLI first. For that, we can use the npm install command to install the slappforge-sigma-cli module globally in the execution environment.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g

The AWS keys we configured earlier as secured variables are available as environment variables inside the Docker container. Therefore, we can now invoke the sigma aws build command, passing these environment variables along with the other necessary parameters.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g
            - sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY

Save deployment package URL to an artifact file

The deployment step needs the S3 URL of the deployment package generated by this build step. The above build command writes that S3 URL to stdout, so we have to pass that value on to the deployment step somehow. Since each step is executed in a fresh Docker container, environment variables cannot be shared between steps. However, if we write the S3 URL to a file and configure that file as a step artifact, the deployment step will have access to that file.

So we are going to redirect the output of the build command to a file named S3-Deployment-Package-URL.txt and configure that file as an artifact.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g
            - sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY > ./S3-Deployment-Package-URL.txt
          artifacts:
            - S3-Deployment-Package-URL.txt
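
Assuming the build succeeds, the artifact file will contain the S3 URL of the generated deployment package, along the lines of the following (a hypothetical example; the actual bucket, prefix, and package name depend on your build parameters and project):

s3://deployment.packages.bucket/auto_builds/<generated-package-name>.zip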

Deploy Project

As the last step, we are going to deploy the project with the sigma aws deploy command. Since each step is executed in a fresh Docker container, we need to install the Sigma CLI again, just as we did in the build step.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g
            - sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY > ./S3-Deployment-Package-URL.txt
          artifacts:
            - S3-Deployment-Package-URL.txt

      - step:
          name: Deploy Project
          script:
            - npm i slappforge-sigma-cli -g

After that, we need to extract the previously saved deployment package S3 URL from the S3-Deployment-Package-URL.txt file and assign it to an environment variable named DEPLOYMENT_PACKAGE. In addition, the AWS keys we configured earlier as secured variables are again available as environment variables. We can then invoke the deploy command with those environment variables, the DEPLOYMENT_PACKAGE variable we just created, and the other necessary parameters. We should also make sure to set the --autoDepMode parameter to true, so that the deployment proceeds without waiting for any user confirmations.

image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g
            - sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY > ./S3-Deployment-Package-URL.txt
          artifacts:
            - S3-Deployment-Package-URL.txt

      - step:
          name: Deploy Project
          script:
            - npm i slappforge-sigma-cli -g
            - DEPLOYMENT_PACKAGE=$(cat ./S3-Deployment-Package-URL.txt)
            - sigma aws deploy --depPackage $DEPLOYMENT_PACKAGE --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY --autoDepMode true
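
If you want to verify the commands before committing the pipeline file, you can run the same sequence locally. A rough sketch, assuming NodeJS 10 or newer and npm are installed, and using placeholder credentials:

# Export the AWS credentials as environment variables
# (placeholders shown; substitute your own key pair).
export AWS_ACCESS_KEY=<your-access-key>
export AWS_SECRET_KEY=<your-secret-key>

# Install the Sigma CLI globally, then run the same build command
# used in the pipeline's Build Project step.
npm i slappforge-sigma-cli -g
sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY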


That’s it! We have now configured a CI/CD pipeline for our Sigma project repository. You can push a change to the master branch of the repository and verify that the pipeline runs successfully in the Pipelines section of the BitBucket repository console.

You can see the full pipeline configuration below.
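
image: nikolaik/python-nodejs:python3.8-nodejs14

pipelines:
  branches:
    master:
      - step:
          name: Build Project
          script:
            - npm i slappforge-sigma-cli -g
            - sigma aws build --s3Bucket deployment.packages.bucket --s3Prefix auto_builds --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY > ./S3-Deployment-Package-URL.txt
          artifacts:
            - S3-Deployment-Package-URL.txt

      - step:
          name: Deploy Project
          script:
            - npm i slappforge-sigma-cli -g
            - DEPLOYMENT_PACKAGE=$(cat ./S3-Deployment-Package-URL.txt)
            - sigma aws deploy --depPackage $DEPLOYMENT_PACKAGE --awsKey $AWS_ACCESS_KEY --awsSecret $AWS_SECRET_KEY --autoDepMode true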