helpers


authoring imports.

The authoring imports section describes two types of helpers for stack authoring: METHOD HELPER and CLASS HELPER.

  • METHOD HELPER

    This helper contains a schedule with a single job. It is typically used for simple one-time automation or for stacks meant to be consumed by other, higher-level stacks. An example of using the METHOD HELPER is shown below:
    def run(stackargs):
        # instantiate the stack from the provided stack arguments
        stack = newStack(stackargs)
    
  • CLASS HELPER

    This helper expects an explicit schedule of jobs. It is commonly used for automation that involves multiple jobs. An example of using the CLASS HELPER is shown below:
    class Main(newSchedStack):
    
        def __init__(self, stackargs):
            # initialize the base schedule stack with the stack arguments
            newSchedStack.__init__(self, stackargs)
    
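The "explicit schedule of jobs" idea behind the CLASS HELPER can be illustrated with a minimal standalone mock. Note that the class and method names below are illustrative only and are not the real Config0 `newSchedStack` API:

```python
class MockSchedStack:
    """Toy stand-in for a schedule-of-jobs helper (NOT the real newSchedStack API)."""

    def __init__(self, stackargs):
        self.stackargs = stackargs
        self.jobs = []  # explicit, ordered schedule of jobs

    def add_job(self, name, func):
        # register a named job to be run later, in order
        self.jobs.append((name, func))

    def run(self):
        # execute every scheduled job, collecting results by job name
        return {name: func(self.stackargs) for name, func in self.jobs}


stack = MockSchedStack({"env": "dev"})
stack.add_job("setup", lambda args: f"setup:{args['env']}")
stack.add_job("deploy", lambda args: f"deploy:{args['env']}")
print(stack.run())  # {'setup': 'setup:dev', 'deploy': 'deploy:dev'}
```

The METHOD HELPER corresponds to the degenerate case of this pattern: a schedule containing exactly one job.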

helper_publisher package.

The config0_publisher Python package provides tooling for writing scripts and shellouts: a set of helpers for script development and for executing shell commands.

The ResourceCmdHelper class, which is part of the config0_publisher package, is designed to execute resource-creating commands and capture the resulting resource data. It offers convenient methods and attributes for interacting with resources, particularly those created by OpenTofu and Terraform runs.

ResourceCmdHelper

usage example

#!/usr/bin/env python

import os
from config0_publisher.loggerly import Config0Logger
from config0_publisher.resource.manage import ResourceCmdHelper

class CodebuildSrcFile(ResourceCmdHelper):

    def __init__(self):

        self.classname = 'CodebuildSrcFile'

        self.logger = Config0Logger(
            self.classname,
            logcategory="cloudprovider"
        )

        ResourceCmdHelper.__init__(
            self,
            main_env_var_key="CONFIG0_BUILDPARMS_HASH",
            app_name=os.environ["APP_NAME"],
            app_dir=os.environ["APP_DIR"],
            set_must_exists=[
                "tmp_bucket",
                "upload_bucket",
                "log_bucket"],
            set_default_values={
                "build_image":"aws/codebuild/standard:4.0",
                "build_timeout":500,
                "compute_type":"BUILD_GENERAL1_SMALL",
                "image_type":"LINUX_CONTAINER",
                "remote_stateful_bucket":None,
                "upload_bucket":None,
                "stateful_id":None,
                "buildspec_file":"buildspec.yml"
            }
        )
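The `set_must_exists` and `set_default_values` parameters express a common settings-resolution pattern: fill in defaults, let externally supplied values override them, and fail fast if a required key never receives a value. The sketch below illustrates that pattern only; the function name and the environment-variable lookup are assumptions for illustration, not the real ResourceCmdHelper logic:

```python
def resolve_settings(env, must_exist, defaults):
    # toy sketch of the must-exist / default-values pattern;
    # NOT the real ResourceCmdHelper implementation
    settings = dict(defaults)

    # values from the environment (upper-cased keys) override the defaults
    for key in set(list(defaults) + must_exist):
        env_val = env.get(key.upper())
        if env_val is not None:
            settings[key] = env_val

    # fail fast if any required setting never received a value
    missing = [k for k in must_exist if settings.get(k) is None]
    if missing:
        raise RuntimeError(f"missing required settings: {missing}")

    return settings


env = {"TMP_BUCKET": "bucket-a", "UPLOAD_BUCKET": "bucket-b", "LOG_BUCKET": "bucket-c"}
settings = resolve_settings(
    env,
    must_exist=["tmp_bucket", "upload_bucket", "log_bucket"],
    defaults={"build_timeout": 500, "compute_type": "BUILD_GENERAL1_SMALL"},
)
print(settings["tmp_bucket"], settings["build_timeout"])  # bucket-a 500
```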

| class attribute | description | default |
|---|---|---|
| (stack) share_dir | the directory shared among all the workers | /var/tmp/share |
| (stack) stateful_id | the unique id for storing state info for the execution | \<random\> |
| (stack) run_share_dir | share directory + stateful_id (directory) | \<share_dir\>/\<stateful_id\> |
| (stack) app_dir | relative app directory | var/tmp/\<app_name\> [1] |
| (stack) working_subdir | relative working subdirectory | |
| (stack) exec_dir | execution directory | current dir or \<run_share_dir\>/\<app_dir\> |
| (stack) vars_dir | vars directory | \<share_dir\>/config0/variables/\<schedule_id\> [2] |
| (stack) creds_dir | credentials directory | \<vars_dir\>/credentials |
| (stack) inputvars_dir | inputvars directory | \<vars_dir\>/inputvars |
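How these directories compose can be sketched in plain Python. The literal values below are illustrative, taken from the defaults in the table; the real helper sets these attributes itself:

```python
import os

# illustrative values matching the table defaults
share_dir = "/var/tmp/share"
stateful_id = "abc123"                         # normally a random unique id
app_name = "terraform"
app_dir = os.path.join("var/tmp", app_name)    # note: a relative path

# run_share_dir = share directory + stateful_id
run_share_dir = os.path.join(share_dir, stateful_id)

# exec_dir = run_share_dir + app_dir (when not simply the current dir)
exec_dir = os.path.join(run_share_dir, app_dir)

print(run_share_dir)  # /var/tmp/share/abc123
print(exec_dir)       # /var/tmp/share/abc123/var/tmp/terraform
```

Because app_dir is relative, joining it under run_share_dir yields the nested execution path shown in the opentofu example that follows.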

example opentofu

#!/usr/bin/env python

'''
# terraform (opentofu)

stateful_id => abc123
app_dir => var/tmp/terraform
exec_dir => /var/tmp/share/abc123/var/tmp/terraform
'''

import os

# fragment from within a ResourceCmdHelper subclass, where
# self.run_share_dir and self.share_dir are set by the helper
cmd = 'docker run -e METHOD="create" --rm -v ' \
      + f'{self.run_share_dir}:{self.share_dir} config0/terraform-run-env:1.3.7'

os.system(cmd)
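The same invocation can be sketched with subprocess instead of os.system. The helper attributes are stubbed here with the example values from the docstring above; the actual `subprocess.run` call is commented out since it requires a Docker daemon:

```python
import subprocess  # used for the (commented-out) real invocation below

# stubbed values; in the real helper these come from ResourceCmdHelper attributes
run_share_dir = "/var/tmp/share/abc123"
share_dir = "/var/tmp/share"

cmd = [
    "docker", "run",
    "-e", "METHOD=create",
    "--rm",
    "-v", f"{run_share_dir}:{share_dir}",
    "config0/terraform-run-env:1.3.7",
]

# passing a list avoids shell-quoting pitfalls, and check=True raises
# CalledProcessError on a non-zero container exit code
# subprocess.run(cmd, check=True)
print(" ".join(cmd))
```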

  1. for app_name = terraform (opentofu), app_dir defaults to var/tmp/terraform

  2. SCHEDULE_ID is the identifier shared by all the jobs in an instance of a stack; it is set automatically by Config0