Pipelines for Applications

Pipelines can be installed and operated entirely in an AWS environment.

The following graphic depicts the setup and services required.


Preparing for Your AWS Pipelines On-Premises Install


First, ensure you have a VPC to contain all the Pipelines resources. This can be an existing VPC, or you can use the Pipelines AWS CloudFormation scripts to create one.

Ensure the VPC exists in the AWS region you have chosen.

Security Groups

Pipelines recommends creating two security groups:

SG1 - outside:

This security group represents the “outside” or “front door” in front of Pipelines. The three Elastic Load Balancers will sit in this security group.

Network ACLs for this security group include:

Type | Protocol | Port | IP CIDR

SG2 - inside:

This security group is where the Pipelines instance(s) will reside.

Type | Protocol | Port | IP CIDR

If necessary, you can create these security groups with the Pipelines AWS CloudFormation scripts.

S3 Bucket

An S3 bucket must exist in the region where you wish to run Pipelines. The Pipelines IAM Policy/Role must have permission to access it.

If necessary, you can create this S3 bucket with the Pipelines AWS CloudFormation scripts.

IAM Policy/Role

The IAM Policy/Role is applied to the Pipelines EC2 instances. This IAM Policy/Role needs the following permissions.

The IAM role must have Allow access to the S3 bucket in its region.


Here is an example IAM policy. In this example the bucket is named distelli-onprem (the specific S3 actions shown are representative assumptions; grant whichever actions your install requires):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Effect": "Allow",
                "Action": [
                    "s3:ListBucket"
                ],
                "Resource": "arn:aws:s3:::distelli-onprem"
            },
            {
                "Effect": "Allow",
                "Action": [
                    "s3:GetObject",
                    "s3:PutObject",
                    "s3:DeleteObject"
                ],
                "Resource": "arn:aws:s3:::distelli-onprem/*"
            }
        ]
    }

The IAM role must also have access to DynamoDB in its region.

During install, you can specify a prefix for your DynamoDB tables. This provides a mechanism to lock down Pipelines' access to only the DynamoDB tables with that prefix. Here is an example policy; note the PREFIX followed by an asterisk in the Resource ARN (the dynamodb:* action is a representative assumption; scope it down as needed):

    {
        "Version": "2012-10-17",
        "Statement": [
            {
                "Sid": "Stmt1465427217000",
                "Effect": "Allow",
                "Action": [
                    "dynamodb:*"
                ],
                "Resource": [
                    "arn:aws:dynamodb:*:*:table/PREFIX*"
                ]
            }
        ]
    }

EC2 Instance

Pipelines runs on one or more EC2 instances created in AWS.

Pipelines supports redundant instances, so standing up two or more is recommended. Having two or more instances allows Pipelines to be updated live.

The recommended minimum specification for the Pipelines server instance is an AWS m4.large:

  • 2 CPUs
  • 8 GB RAM
  • 50 GB EBS volume

The following AWS resources will be needed to provision the instance(s).

Security Group | From above, this instance should be assigned to SG2.
Subnet | This subnet must allow Pipelines to talk off-net so that Pipelines can communicate with the many Pipelines Agents, SCM repositories, Docker registries, and more.
Role | The IAM Instance Profile the Pipelines instance will use.
Instance Type | Pipelines recommends an r3.large or better EC2 instance.
SSH Key | An SSH key for accessing the host to install the Pipelines software.

Elastic Load Balancers

After the Pipelines instances are provisioned, the ELBs can be configured.


ELB1 serves the Pipelines web UI. This ELB is associated with the www DNS entry.

LB Protocol | LB Port | Instance Protocol | Instance Port

Health Check
Timeout: 5 seconds
Interval: 30 seconds
Unhealthy threshold: 2
Healthy threshold: 2
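With the values above (30-second interval, unhealthy threshold of 2), a failing instance is taken out of service after two consecutive failed checks; a small sketch of that arithmetic:

```python
def seconds_until_unhealthy(interval_s, unhealthy_threshold):
    """An instance is marked unhealthy after `unhealthy_threshold`
    consecutive failed checks, one every `interval_s` seconds."""
    return interval_s * unhealthy_threshold

# With the health-check settings above, detection takes about a minute:
detection = seconds_until_unhealthy(30, 2)
```

The same reasoning applies to the healthy threshold when an instance recovers.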


ELB2 represents various traffic to Pipelines. These include:

  • Pipelines Agent communication
  • Repository Webhooks
  • Pipelines API calls
  • more...

This ELB is associated with the login DNS entry.

LB Protocol | LB Port | Instance Protocol | Instance Port

Health Check
Timeout: 5 seconds
Interval: 30 seconds
Unhealthy threshold: 2
Healthy threshold: 2


ELB3 is specifically for Pipelines Agent communication. This ELB is associated with the agent DNS entry.

LB Protocol | LB Port | Instance Protocol | Instance Port

Health Check
Timeout: 5 seconds
Interval: 30 seconds
Unhealthy threshold: 2
Healthy threshold: 2

DNS Records

Before you can install Pipelines, you must determine your Pipelines DNS naming schema.

Pipelines uses DNS names for the following endpoints. Ideally, there should be fully qualified domain names (FQDNs) for each of these:

  • www
  • login
  • agent
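For example, given a base domain (example.com below is a placeholder for your own domain), the three endpoint FQDNs can be composed as:

```python
def pipelines_endpoints(base_domain):
    """Build the FQDNs for the three Pipelines endpoints.
    `base_domain` is a placeholder for your own domain."""
    return {name: "%s.%s" % (name, base_domain)
            for name in ("www", "login", "agent")}

endpoints = pipelines_endpoints("example.com")
# endpoints maps "www" -> "www.example.com", and so on.
```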

DNS and ELB mapping Table

The DNS records associate with (point to) an ELB:

DNS Record | ELB
www | ELB1
login | ELB2
agent | ELB3

DynamoDB Tables

The Pipelines install script will create the DynamoDB tables as necessary. You can specify a unique prefix for the tables; the table names also include the stage as a suffix. For the stage beta, the table names without any prefix or suffix are as follows:

Note: Table names are subject to change.

  • access
  • adeployments
  • af-tag-index
  • afidentry
  • app-branch-link
  • app-groups
  • app-pipelines
  • app-repo-link
  • app-settings
  • app-updates
  • appinstances
  • apps
  • archived-servers
  • artifacts
  • auth
  • aws-keys
  • batch-jobs
  • billing
  • deidentry
  • dinstances
  • dmq
  • docker-settings
  • domains
  • ec2-configs
  • endpoints
  • enterprise-data
  • env-branch-link
  • env-groups
  • envelope-keys
  • group-members
  • groups
  • instance-history
  • instance-images
  • instance-templates
  • instance-types
  • invoice-index
  • logins
  • master-keys
  • member-teams
  • metadata
  • news-feed
  • nftargets
  • oauth
  • pipelines
  • public-keys
  • pubsub-subjects
  • repos
  • roles
  • sequences
  • server-capability
  • server-history
  • server-inventory
  • server-launch-events
  • server-usage-2013-11
  • server-usage-2015-12
  • server-usage-2016-01
  • server-usage-2016-02
  • servers
  • snapshots
  • ssh-keys
  • stage-members
  • stage-settings
  • stages
  • tag-servers
  • tasks
  • tax-report
  • team-members
  • teams
  • tqueue
  • wh-results
  • workflow-items
  • workflows
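A sketch of how a full table name combines the optional prefix, the base name from the list above, and the stage suffix (the "-" delimiter is an assumption; verify against your install):

```python
def full_table_name(base_name, prefix="", stage="beta"):
    """Join optional prefix, base table name, and stage suffix.
    The '-' delimiter is an assumption, not confirmed by the docs."""
    return "-".join(p for p in (prefix, base_name, stage) if p)

# With a prefix, IAM can be locked down to PREFIX* as shown earlier:
name = full_table_name("apps", prefix="mycorp")
```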

Puppet Pipelines AWS CloudFormation Scripts

Pipelines AWS CloudFormation scripts can provide automated setup of AWS resources for an on premises install of Pipelines.

This process is broken up into several dependent scripts. They are as follows:

Note: These steps must be done in order.

  • VPC Creation
  • Security Group Creation
  • S3 Bucket Creation
  • IAM Policy/Role Creation
  • Instance Creation
  • ELB Creation
  • DynamoDB Creation

Any feature marked as [*under development] may not be available.

The Scripts and Usage

VPC Creation

An AWS VPC supports the Pipelines AWS security groups, subnets, gateways, DHCP options, routes, and more. The VPC must be created before moving on to the next step and creating the Pipelines security groups.

Pipelines vpc-only.json AWS CloudFormation script.

Description: A minimal setup that creates the AWS networking elements for a Pipelines on-premises install.
distelli-rtb | 10.0.0.0/16 ->
distelli-rtb(2) | ->
distelli-igw | 10.0.0.0/16 -> Internet
Comments: This creates all the needed underlying network infrastructure in AWS. Typically, this script is not necessary, as your operations team may already have an appropriate VPC for you to use. If so, move on to Security Group Creation.

Security Group Creation

Pipelines security-groups.json AWS CloudFormation script.

Description: Creation of two security groups to support a Pipelines AWS on-premises installation: one for the ELBs and one for the Pipelines instance.
distelli-vpc | The VPC in which to create the two security groups.
sgdistellielb | A security group for the Pipelines Elastic Load Balancers (ELBs): ports 80, 443, 444.
sgdistelliinstance | A security group for the Pipelines instance (server): ports 22, 7999, 8000, 8090.
Comments: This creates the two security groups required for Pipelines and the Pipelines ELBs, including the ACLs listed above.

S3 Bucket Creation

Pipelines s3.json AWS CloudFormation script.

Description: Creation of an S3 bucket.
s3name | The name of the S3 bucket to create for Pipelines.
Outputs: An S3 bucket.
Comments: This creates an S3 bucket in the region the script is run in. Complete this before creating the Pipelines IAM role.

IAM Policy Creation

Pipelines iam.json AWS CloudFormation script.

Description: Create the appropriate IAM policy/role/profile for the Pipelines instance.
distelli-role | The Pipelines role.
distelli-profile | The Pipelines profile applied to the Pipelines instance.
Comments: This creates an IAM profile for use by the Pipelines EC2 instance.
You will be required to authorize the creation of IAM resources before creating this stack.

Instance Creation

Pipelines instance.json AWS CloudFormation script.

Description: Create an EC2 instance for running Pipelines. This requires an appropriate IAM role that includes access to DynamoDB and S3.
sgdistelliinstance | The security group ID assigned to the Pipelines instance.
subnetdistelli | The subnet assigned to the Pipelines instance.
distellirole | The IAM Instance Profile the Pipelines instance will use.
InstanceType | The EC2 instance type [defaults to r3.large].
keyname | The SSH key to access the host.
Comments: This creates an EC2 instance for hosting Pipelines and assigns it the appropriate security group, subnet, SSH keypair, and IAM role.

Note: The distellirole value is the distelli-profile output from the IAM Policy/Role Creation stack above.

ELB Creation

Pipelines elb.json AWS CloudFormation script.

Pipelines *elbssl.json AWS CloudFormation script. [*under development]

Description: Create three AWS Elastic Load Balancers and attach them to the Pipelines EC2 instance.
sgdistellielb | The security group ID assigned to the Pipelines ELBs.
subnetdistelli | The subnet assigned to the Pipelines ELBs.
DistelliInstanceID | The AWS EC2 instance ID of the Pipelines instance.
ELB (web UI) | HTTP(S):// -> Distelli:8080
ELB (Proxy: app, agent, sys, login) | HTTP(S):// -> Distelli:8000
ELB (Agent) | TCP:// -> Distelli:7999
Comments: This creates three ELBs with health checks and assigns them to the Pipelines instance.

DynamoDB Table Creation

Pipelines *DistelliDDB_MASTER.json AWS CloudFormation script. [*under development]

Description: Creates the necessary DynamoDB tables for Pipelines. *Not necessary, as the Pipelines bootstrap script currently does this.
TableNamePrefix | The prefix for the table names.
Region | The AWS region to create the DynamoDB tables in.
StageName | The "environment" Pipelines is manifesting in (i.e. Beta, Gamma, or Prod).
Outputs: Table Name | Description
Comments: This creates the necessary DynamoDB tables for Pipelines.

How to Run the CloudFormation Scripts

  1. Log in to the Amazon Web Services Console.
  2. Navigate to the CloudFormation Console.
  3. Click Create Stack
  4. Click the Browse... button
  5. Browse to the Pipelines CloudFormation Script file you wish to run and Open
  6. Click Next
  7. You will find yourself on the Specify Details Page.

  8. Create a unique meaningful Stack name
  9. Enter the appropriate Parameters
  10. Click Next
  11. You will find yourself on the Options page.

  12. Enter any appropriate Tags for the stack
  13. Click Next
  14. Review your settings and when ready, click Create
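The Parameters entered in step 9 are the same key/value pairs that CloudFormation's CreateStack API accepts as ParameterKey/ParameterValue pairs; a small sketch of that shape (the example values mirror the instance.json parameters above):

```python
def to_cfn_parameters(values):
    """Convert a plain dict into the ParameterKey/ParameterValue
    list shape that CloudFormation's CreateStack call expects."""
    return [{"ParameterKey": k, "ParameterValue": v}
            for k, v in values.items()]

params = to_cfn_parameters({"InstanceType": "r3.large",
                            "keyname": "my-ssh-key"})
```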

Removing the Pipelines Stack

You must remove the Pipelines stacks in reverse order, ensuring each removal is complete before continuing to the next.

Caution: Removing the stack will remove all the AWS resources created by the stack.

Note: Before removing the S3 bucket, you must first remove its contents manually.
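The removal order is simply the creation order from the script list above, reversed:

```python
# Stack creation order from "The Scripts and Usage" above.
CREATE_ORDER = [
    "VPC Creation",
    "Security Group Creation",
    "S3 Bucket Creation",
    "IAM Policy/Role Creation",
    "Instance Creation",
    "ELB Creation",
    "DynamoDB Creation",
]

# Removal proceeds in the reverse order, one completed stack at a time.
REMOVE_ORDER = list(reversed(CREATE_ORDER))
```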

