
Quick Start

Summary by Example

  • clone repo, copy files, and commit code

export GITHUB_NICKNAME=<github-username>
git clone git@github.com:$GITHUB_NICKNAME/digital-ocean.git
cd digital-ocean
rm -rf execgroups/_config0_configs/ec2_server
rm -rf execgroups/_config0_configs/template/_chrootfiles/var/tmp/terraform
rm -rf stacks/_config0_configs/aws_ec2_server
rm -rf stacks/_config0_configs/template

mv sample/doks/OpenTofu execgroups/_config0_configs/template/_chrootfiles/var/tmp/terraform
mv execgroups/_config0_configs/template execgroups/_config0_configs/doks
mv sample/doks/Config0/stack stacks/_config0_configs/doks

sed -i "s/config0-publish:::do::doks/$GITHUB_NICKNAME:::digital-ocean::doks/g" stacks/_config0_configs/doks/_main/run.py

git add .
git commit -a -m "testing and adding doks terraform sample workflow on config0"
git push origin main
  • register the repository with Config0 and trigger the upload
    • On the Config0 SaaS UI:
      -> Add Stack
      -> Register Repo
      -> Update Stack
# Terraform/OpenTofu execgroup that contains "imported" or "glued" code
https://api-app.config0.com/web_api/v1.0/exec/groups/$GITHUB_NICKNAME/digital-ocean/doks

# Terraform/OpenTofu immutable workflow/stack
https://api-app.config0.com/web_api/v1.0/stacks/$GITHUB_NICKNAME/doks

For example:
https://api-app.config0.com/web_api/v1.0/exec/groups/williaumwu/digital-ocean/doks
https://api-app.config0.com/web_api/v1.0/stacks/williaumwu/doks

  • create config0/config0.yml to launch the workflow

For example, with “williaumwu” as the repo owner:

global:
   arguments: 
     do_region: lon1
   metadata:   
     labels:
        general: 
          environment: dev
          purpose: testing
          provider: doks
        doks: 
          platform: kubernetes
          component: managed_k8
infrastructure:
   doks:
     stack_name: williaumwu:::doks
     arguments:
        doks_cluster_name: config0-authoring-walkthru
        doks_cluster_version: 1.29.1-do.0
        doks_cluster_pool_size: s-1vcpu-2gb-amd
        doks_cluster_autoscale_max: 4
        doks_cluster_autoscale_min: 1
     metadata:
        labels:
          - general
          - doks
        credentials:
          - reference: do-token
            orchestration: true

OpenTofu Integration Details By Example

By leveraging the Config0 helpers, you connect your existing OpenTofu code to the platform and create an immutable OpenTofu-based workflow with a single entry point. This entry point can be launched directly or called by other Config0 automation stacks.
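
For illustration, here is a minimal sketch of how another stack might call the doks workflow built in this walkthrough through that single entry point. It simply mirrors the substack pattern used in run.py below (Sections 3, 4, and 11); the stack name williaumwu:::doks, the stack.doks attribute, and the keyword arguments passed to insert are assumptions for illustration, not a documented Config0 API.

    def run(stackargs):

        # newStack is provided by the Config0 stack runtime (see run.py below)
        stack = newStack(stackargs)

        # reference the doks workflow as a substack by its fully qualified
        # name (hypothetical repo owner "williaumwu")
        stack.add_substack("williaumwu:::doks")

        # initialize stack attributes
        stack.init_variables()
        stack.init_substacks()

        # launch the workflow through its single entry point; the stack.doks
        # attribute and the keyword arguments mirror the tf_executor pattern
        # shown in run.py and are assumptions for illustration
        stack.doks.insert(display=True,
                          doks_cluster_name="config0-authoring-walkthru")

        return stack.get_results()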

The process can be reduced to:

  • Copy your existing OpenTofu-based code to a repository registered with Config0.
  • Create a Config0 stack file that references the OpenTofu-based code.
  • Trigger Config0 to pick up and version the code changes.

The example below demonstrates the process of converting existing OpenTofu code for DigitalOcean Kubernetes Service (DOKS) into an OpenTofu-based workflow.

0. Prereq - Fork Starter Repo

  • Fork the config0 contribution repository and name it “digital-ocean”. This repository will serve as the destination for your OpenTofu code and the source from which Config0 picks up code changes.
  • Register the repository with the Config0 platform
  • Clone this newly forked repository
    git clone https://github.com/<your username>/digital-ocean.git
    

I. Copy OpenTofu Code

  • rename execgroup “template” to “doks”

    mv digital-ocean/execgroups/_config0_configs/template digital-ocean/execgroups/_config0_configs/doks
    

  • copy the sample doks code into the execgroup (see note 2)

    rm -rf digital-ocean/execgroups/_config0_configs/doks/_chrootfiles/var/tmp/terraform
    cp -rp digital-ocean/sample/doks/OpenTofu digital-ocean/execgroups/_config0_configs/doks/_chrootfiles/var/tmp/terraform
    

II. Create Config0 Stack

  • rename the stack “template” to “doks” and copy in the sample Config0 stack files

    mv digital-ocean/stacks/_config0_configs/template digital-ocean/stacks/_config0_configs/doks
    cp -rp digital-ocean/sample/doks/Config0/stack/_documentation/README.md digital-ocean/stacks/_config0_configs/doks/_documentation/README.md
    cp -rp digital-ocean/sample/doks/Config0/stack/_documentation/metadata.yml digital-ocean/stacks/_config0_configs/doks/_documentation/metadata.yml
    cp -rp digital-ocean/sample/doks/Config0/stack/_main/run.py digital-ocean/stacks/_config0_configs/doks/_main/run.py
  • The run.py file serves as the stack’s entry point and workflow file
  • Open digital-ocean/stacks/_config0_configs/doks/_main/run.py.
    • Section 1: This section declares variables for the stack, which mostly correspond to OpenTofu variables (e.g., variables.tf).
    • Section 2: Here, you specify the execgroup from above by changing the execgroup to <repo_owner>:::digital-ocean::doks.
    • Section 3: No changes required. This section references the stack responsible for creating inputs/outputs and executing OpenTofu code.
    • Section 4: No changes required. It initializes all stack attributes.
    • Section 5: Optional. This section allows you to specify values to upload as secrets to the AWS SSM Parameter Store, which automatically expires the values. Typically, these values are exposed as environment variables in the OpenTofu executor environment, such as AWS CodeBuild or a Lambda function.
    • Section 6: Set the timeout for the OpenTofu execution. Timeouts above 600 seconds run on AWS CodeBuild; shorter runs use a Lambda function, which starts faster.
    • Section 7: Initializes the OpenTofu helper with specific parameters:
      • The provider is set to “do” for DigitalOcean.
      • The ssm_obj is from Section 5.
      • resource_name represents the instance name of the infrastructure.
      • resource_type represents the user-defined category, such as server, database, security_group, or in this case, doks.
      • terraform_type is the OpenTofu/Terraform resource type.
    • Section 8: The OpenTofu helper parses the OpenTofu state file and adds keys from the terraform_type mentioned above for querying the resource in the Config0 database. Example keys could include: arn, region, endpoint.
    • Section 9: Similar to Section 8, this section maps and adds keys. For example, the field region could be mapped as an additional field with the value of the field do_region, and the field id could be mapped as an additional field with the value of the field urn (similar to arn).
    • Section 10: Specifies keys to display on the SaaS UI. For example, you may want to display the DOKS instance ID and endpoint on the SaaS UI.
    • Sections 11 and 12: No changes required. They finalize the tf_executor inputs and return the stack results.
    • Full Example
      from config0_publisher.terraform import TFConstructor
      
      def run(stackargs):
      
          # instantiate authoring stack
          stack = newStack(stackargs)
      
          # Section 1:
          # Add variables for the stack (many fetched from OpenTofu variables)
          stack.parse.add_required(key="doks_cluster_name",
                                   tags="tfvar,db",
                                   types="str")
      
          stack.parse.add_required(key="do_region",
                                   tags="tfvar,db",
                                   types="str",
                                   default="lon1")
      
          stack.parse.add_optional(key="doks_cluster_version",
                                   tags="tfvar,db",
                                   types="str",
                                   default="1.29.1-do.0")
      
          stack.parse.add_optional(key="doks_cluster_pool_size",
                                   tags="tfvar",
                                   types="str",
                                   default="s-1vcpu-2gb-amd")
      
          stack.parse.add_optional(key="doks_cluster_pool_node_count",
                                   tags="tfvar",
                                   types="int",
                                   default="1")
      
          stack.parse.add_optional(key="doks_cluster_autoscale_min",
                                   tags="tfvar",
                                   types="int",
                                   default="1")
      
          stack.parse.add_optional(key="doks_cluster_autoscale_max",
                                   tags="tfvar",
                                   types="int",
                                   default="3")
      
          # Section 2:
          # Declare execution groups - for simplicity we alias "tf_execgroup"
          # the execgroup must be fully qualified <repo_owner>:::<repo_name>::<execgroup_name>
          stack.add_execgroup("config0-publish:::do::doks",
                              "tf_execgroup")
      
          # Section 3:
          # Add substack - for OpenTofu it will almost always be config0-publish:::tf_executor
          stack.add_substack("config0-publish:::tf_executor")
      
          # Section 4:
          # Initialize Variables in stack
          stack.init_variables()
          stack.init_execgroups()
          stack.init_substacks()
      
          # Section 5:
          # sensitive values uploaded to the ssm parameter store, which automatically expires/removes them
          ssm_obj = {
              "DIGITALOCEAN_TOKEN":stack.inputvars["DO_TOKEN"],
              "DIGITALOCEAN_ACCESS_TOKEN":stack.inputvars["DO_TOKEN"]
          }
      
          # Section 6:
          # if the timeout exceeds 600 seconds, codebuild is used to execute tf;
          # otherwise a lambda function is used, which is faster because lambda
          # cold starts are shorter than codebuild startup times
          stack.set_variable("timeout",600)
      
          # Section 7:
          # use the terraform constructor (helper)
          # but this is optional
          tf = TFConstructor(stack=stack,
                             execgroup_name=stack.tf_execgroup.name,
                             provider="do",
                             ssm_obj=ssm_obj,
                             resource_name=stack.doks_cluster_name,
                             resource_type="doks",
                             terraform_type="digitalocean_kubernetes_cluster")
      
          # Section 8:
          # keys to include in db fields
          # from the terraform resource type
          tf.include(keys=["name",
                           "service_subnet",
                           "id",
                           "urn",
                           "endpoint",
                           "kube_config",
                           "vpc_uuid"])
      
          # Section 9:
          # keys to map and include in db fields
          tf.include(maps={"cluster_id": "id",
                           "doks_version": "version"})
      
          # Section 10:
          # keys to publish and display in SaaS UI
          tf.output(keys=["doks_version",
                          "do_region",
                          "service_subnet",
                          "urn",
                          "vpc_uuid",
                          "endpoint"])
      
          # Section 11:
          # Finalize the tf_executor
          stack.tf_executor.insert(display=True,
                                   **tf.get())
      
          # Section 12:
          # return results
          return stack.get_results()
      
Create Config0 Docs
  • Create a README file for the stack that includes a detailed description of the stack and its input variables.
  • Generate metadata for the stack, specifying a release version to track changes and adding relevant tags for easy searching and categorization of stacks.

III. Trigger Update on Config0

Follow these steps to check in your code and trigger Config0 to upload and automatically version any changes to the infrastructure as code (IaC) or workflows:

  • Check in the infrastructure as code and corresponding stack (workflow) to the IaC repository.
  • Trigger Config0 to upload the repository to the Config0 platform, as shown below.
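
For example, reusing the commands and SaaS UI steps from the Quick Start above:

    git add .
    git commit -a -m "testing and adding doks terraform sample workflow on config0"
    git push origin main

    # then, on the Config0 SaaS UI: Add Stack -> Register Repo -> Update Stack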

Notes:

  1. OpenTofu and Terraform integration
  2. Please ensure that you remove any existing backend configuration from the copied OpenTofu code, as the helpers will automatically generate the backend.tf file.