Workaround: CodePipeline for GitHub Enterprise

Andreas Wittig – 21 Aug 2020

There is no question that AWS has a strong focus on customer obsession. However, sometimes it takes forever until popular feature requests get implemented. A good example: CodePipeline - the continuous delivery service - supports all kinds of source code repositories: CodeCommit, GitHub, Bitbucket, S3, and ECR. However, a very popular option is missing: GitHub Enterprise.

Luckily, there is a decent workaround to connect GitHub Enterprise with CodePipeline.

  1. A webhook from GitHub Enterprise triggers CodeBuild.
  2. CodeBuild fetches the latest changes (Git over HTTPS or SSH), bundles them into a ZIP file, and uploads the archive to S3.
  3. The S3 bucket with versioning enabled stores the latest version of the repository.
  4. A CloudWatch event rule triggers the pipeline whenever the CodeBuild project succeeds.
  5. The source action of CodePipeline downloads the ZIP file, unpacks the archive, and hands over the source code to the next stage.
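In practice, CodeBuild's artifact packaging handles steps 2 and 3 for you, but the mechanics are easy to sketch in Python: bundle the working tree into a ZIP archive and upload it to the versioned bucket. The bucket name below is a placeholder, and the upload itself is left as a comment so the sketch runs without AWS credentials.

```python
import zipfile
from pathlib import Path


def bundle_repository(repo_dir: str, archive_path: str) -> int:
    """Zip every file under repo_dir, mirroring CodeBuild's ZIP packaging.

    Returns the number of files added to the archive.
    """
    repo = Path(repo_dir)
    count = 0
    with zipfile.ZipFile(archive_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for path in sorted(repo.rglob("*")):
            if path.is_file():
                # Store paths relative to the repository root
                zf.write(path, path.relative_to(repo))
                count += 1
    return count


# Uploading the archive to the versioned bucket would then be a single call:
# boto3.client("s3").upload_file("source.zip", "<ARTIFACTS_BUCKET>", "source.zip")
```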

The solution: GitHub Enterprise, CodeBuild, S3, and CodePipeline.


Code Example

In the following, I will use Terraform to set up all the needed resources. First of all, we need to get Terraform up and running.

Make sure to fill in the following placeholders:

  • <GITHUB_ACCESS_TOKEN> a personal access token to access the GitHub Enterprise API. Please note that the admin:repo_hook scope is required.
  • <GITHUB_ORGANIZATION> the name of your GitHub Enterprise organization (e.g., myorg).
  • <GITHUB_REPOSITORY_NAME> the name of your GitHub Enterprise repository (e.g., myrepo).
  • <GITHUB_REPOSITORY_URL> the URL of your GitHub Enterprise repository (e.g., https://git.example.com/myorg/myrepo.git).
  • <GITHUB_API_URL> the URL of the GitHub Enterprise API (e.g., https://git.example.com/api/).
terraform {
  required_version = ">= 0.12"
}

provider "aws" {
  region  = "eu-central-1"
  version = "~> 3.0"
}

provider "github" {
  token        = "<GITHUB_ACCESS_TOKEN>"
  organization = "<GITHUB_ORGANIZATION>"
  base_url     = "<GITHUB_API_URL>"
}

data "aws_kms_alias" "s3" {
  name = "alias/aws/s3"
}

Next, we configure the CodeBuild project that will fetch the source code from GitHub Enterprise and upload it to S3.

resource "aws_codebuild_project" "github_enterprise" {
  name          = "github-enterprise"
  build_timeout = "5"
  service_role  = aws_iam_role.github_enterprise.arn

  artifacts {
    type      = "S3"
    name      = "source.zip"
    location  = aws_s3_bucket.artifacts.id
    packaging = "ZIP"
  }

  cache {
    type  = "LOCAL"
    modes = ["LOCAL_SOURCE_CACHE"]
  }

  environment {
    compute_type    = "BUILD_GENERAL1_LARGE"
    image           = "aws/codebuild/standard:2.0"
    type            = "LINUX_CONTAINER"
    privileged_mode = false
  }

  source {
    type      = "GITHUB_ENTERPRISE"
    location  = "<GITHUB_REPOSITORY_URL>"
    buildspec = "github-enterprise/buildspec.yml"
  }
}

resource "aws_codebuild_source_credential" "github_enterprise" {
  auth_type   = "PERSONAL_ACCESS_TOKEN"
  server_type = "GITHUB_ENTERPRISE"
  token       = "<GITHUB_ACCESS_TOKEN>"
}

Next, we configure a webhook between GitHub Enterprise and CodeBuild.

resource "aws_codebuild_webhook" "github_enterprise" {
  project_name = aws_codebuild_project.github_enterprise.name

  filter_group {
    filter {
      type    = "EVENT"
      pattern = "PUSH"
    }

    filter {
      type    = "HEAD_REF"
      pattern = "master"
    }
  }
}

resource "github_repository_webhook" "github_enterprise" {
  active     = true
  events     = ["push"]
  repository = "<GITHUB_REPOSITORY_NAME>"

  configuration {
    url          = aws_codebuild_webhook.github_enterprise.payload_url
    secret       = aws_codebuild_webhook.github_enterprise.secret
    content_type = "json"
    insecure_ssl = false
  }
}
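A note on the secret passed to the webhook configuration: GitHub Enterprise signs each delivery with an HMAC of this shared secret and sends the digest in the X-Hub-Signature-256 header, which is how the receiver authenticates incoming payloads. CodeBuild performs this check for you; the following sketch only illustrates the mechanism.

```python
import hashlib
import hmac


def verify_signature(secret: str, payload: bytes, signature_header: str) -> bool:
    """Check a GitHub webhook delivery against its X-Hub-Signature-256 header."""
    expected = "sha256=" + hmac.new(secret.encode(), payload, hashlib.sha256).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, signature_header)
```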

Next, we create an S3 bucket to store the source code artifact - a ZIP file named source.zip. I’m using KMS encryption with the default key here.

resource "aws_s3_bucket" "artifacts" {
  versioning {
    enabled = true
  }

  server_side_encryption_configuration {
    rule {
      apply_server_side_encryption_by_default {
        sse_algorithm = "aws:kms"
      }
    }
  }
}

resource "aws_s3_bucket_policy" "artifacts" {
  bucket = aws_s3_bucket.artifacts.id

  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Deny",
      "Principal": "*",
      "Action": "s3:PutObject",
      "Resource": "${aws_s3_bucket.artifacts.arn}/*",
      "Condition": {
        "StringNotEquals": {
          "s3:x-amz-server-side-encryption": "aws:kms",
          "s3:x-amz-server-side-encryption-aws-kms-key-id": "${data.aws_kms_alias.s3.target_key_arn}"
        }
      }
    }
  ]
}
POLICY
}
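To make the deny statement's behavior concrete, here is a sketch of how IAM evaluates a StringNotEquals condition with two context keys for a PutObject request. Per IAM's evaluation rules, a negated operator evaluates to true when the context key is missing from the request, and multiple keys under one operator are combined with AND. The expected values are parameters here; the key ARN in the test is a made-up example.

```python
def deny_applies(headers: dict, expected_sse: str, expected_key_arn: str) -> bool:
    """Evaluate the bucket policy's StringNotEquals condition for a PutObject
    request, following IAM's rules for negated operators."""

    def not_equals(key: str, expected: str) -> bool:
        # A negated operator is satisfied when the key is absent or mismatched
        return key not in headers or headers[key] != expected

    # Both context keys under one operator must be satisfied (logical AND)
    return (not_equals("x-amz-server-side-encryption", expected_sse)
            and not_equals("x-amz-server-side-encryption-aws-kms-key-id", expected_key_arn))
```

One consequence of the AND: a request that sets the right encryption algorithm but references a different key is not denied, because the first key's StringNotEquals already evaluates to false. A stricter policy would split the condition into two separate deny statements.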

Nothing works without IAM. Let’s create the IAM role for the CodeBuild project to grant access to S3 and CloudWatch Logs.

resource "aws_iam_role" "github_enterprise" {
  assume_role_policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codebuild.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
POLICY
}

resource "aws_iam_role_policy" "github_enterprise" {
  role = aws_iam_role.github_enterprise.name

  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Resource": "*",
      "Action": [
        "logs:CreateLogGroup",
        "logs:CreateLogStream",
        "logs:PutLogEvents"
      ]
    },
    {
      "Effect": "Allow",
      "Resource": "${aws_s3_bucket.artifacts.arn}/*",
      "Action": [
        "s3:GetObject",
        "s3:PutObject"
      ]
    }
  ]
}
POLICY
}

The event rule triggers the pipeline whenever the CodeBuild project finishes fetching and uploading the latest source code.

resource "aws_cloudwatch_event_rule" "artifacts" {
  event_pattern = <<PATTERN
{
  "source": [
    "aws.codebuild"
  ],
  "detail-type": [
    "CodeBuild Build State Change"
  ],
  "detail": {
    "build-status": [
      "SUCCEEDED"
    ],
    "project-name": [
      "${aws_codebuild_project.github_enterprise.name}"
    ]
  }
}
PATTERN
}

resource "aws_cloudwatch_event_target" "artifacts" {
  rule     = aws_cloudwatch_event_rule.artifacts.name
  arn      = aws_codepipeline.pipeline.arn
  role_arn = aws_iam_role.events.arn
}
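The matching logic of the event pattern above can be sketched as a simple predicate. CloudWatch Events evaluates patterns natively, of course; the project name is hardcoded here to the example's github-enterprise.

```python
def event_matches(event: dict) -> bool:
    """Mimic the event pattern: the event matches when every field listed in
    the pattern is present with one of the allowed values."""
    detail = event.get("detail", {})
    return (event.get("source") == "aws.codebuild"
            and event.get("detail-type") == "CodeBuild Build State Change"
            and detail.get("build-status") == "SUCCEEDED"
            and detail.get("project-name") == "github-enterprise")
```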

One more IAM role is needed: it allows the CloudWatch event rule to trigger the pipeline.

resource "aws_iam_role" "events" {
  assume_role_policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "events.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
POLICY
}

resource "aws_iam_role_policy" "events" {
  role = aws_iam_role.events.id

  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": "codepipeline:StartPipelineExecution",
      "Resource": "${aws_codepipeline.pipeline.arn}"
    }
  ]
}
POLICY
}

And finally, the pipeline itself.

resource "aws_codepipeline" "pipeline" {
  name     = "mypipeline"
  role_arn = aws_iam_role.codepipeline.arn

  artifact_store {
    location = aws_s3_bucket.artifacts.bucket
    type     = "S3"

    encryption_key {
      id   = data.aws_kms_alias.s3.target_key_arn
      type = "KMS"
    }
  }

  stage {
    name = "Source"

    action {
      name             = "Source"
      category         = "Source"
      owner            = "AWS"
      provider         = "S3"
      version          = "1"
      output_artifacts = ["source"]

      configuration = {
        S3Bucket             = aws_s3_bucket.artifacts.id
        PollForSourceChanges = "false"
        S3ObjectKey          = "source.zip"
      }
    }
  }

  stage {
    name = "Deploy"

    action {
      run_order = "1"
      name      = "Approval"
      category  = "Approval"
      owner     = "AWS"
      provider  = "Manual"
      version   = "1"
    }
  }
}

Last but not least, we create an IAM role for CodePipeline, which grants access to the artifacts bucket as well.

resource "aws_iam_role" "codepipeline" {
  assume_role_policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Principal": {
        "Service": "codepipeline.amazonaws.com"
      },
      "Action": "sts:AssumeRole"
    }
  ]
}
POLICY
}

resource "aws_iam_role_policy" "codepipeline" {
  role = aws_iam_role.codepipeline.id

  policy = <<POLICY
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetObject",
        "s3:GetObjectVersion",
        "s3:PutObject"
      ],
      "Resource": [
        "${aws_s3_bucket.artifacts.arn}/*"
      ]
    },
    {
      "Effect": "Allow",
      "Action": [
        "s3:GetBucketVersioning"
      ],
      "Resource": [
        "${aws_s3_bucket.artifacts.arn}"
      ]
    }
  ]
}
POLICY
}

One more thing. You need to add the file github-enterprise/buildspec.yml to your repository. The file should contain the following configuration.

---
version: 0.2
phases:
  build:
    commands:
      - 'echo "${CODEBUILD_RESOLVED_SOURCE_VERSION}" > SOURCE_VERSION'
artifacts:
  files:
    - '**/*'
  name: source

That’s it. You are ready to run terraform apply to set up CodePipeline for GitHub Enterprise.

Limitations

  • The code example only works when GitHub Enterprise is reachable over the Internet. In theory, it is also possible to access a GitHub Enterprise instance that is only available on a private network; doing so requires configuring a VPC network interface for the CodeBuild project.
  • CodePipeline does not know about the commit hash. It shows the version of the source.zip S3 object instead. However, the example adds a file SOURCE_VERSION to the source.zip archive, which contains the original commit hash.
  • Copying the source code to S3 adds additional latency (about 1-3 minutes) to your deployment pipeline.
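Regarding the second limitation: a later pipeline stage can recover the original commit hash by reading the SOURCE_VERSION file that the buildspec adds to the archive. A minimal sketch:

```python
import zipfile


def commit_hash_from_artifact(archive_path: str) -> str:
    """Read the commit hash that the buildspec stored in SOURCE_VERSION."""
    with zipfile.ZipFile(archive_path) as zf:
        return zf.read("SOURCE_VERSION").decode().strip()
```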

Summary

Unfortunately, CodePipeline does not support GitHub Enterprise yet. Using CodeBuild and S3 is a decent workaround to get CodePipeline running for your GitHub Enterprise repository.

Andreas Wittig

I’m an independent consultant, technical writer, and programming founder. All these activities have to do with AWS. I’m writing this blog and all other projects together with my brother Michael.

In 2009, we joined the same company as software developers. Three years later, we were looking for a way to deploy our software—an online banking platform—in an agile way. We got excited about the possibilities in the cloud and the DevOps movement. It’s no wonder we ended up migrating the whole infrastructure of Tullius Walden Bank to AWS. This was a first in the finance industry, at least in Germany! Since 2015, we have accelerated the cloud journeys of startups, mid-sized companies, and enterprises. We have penned books like Amazon Web Services in Action and Rapid Docker on AWS, we regularly update our blog, and we are contributing to the Open Source community. Besides running a 2-headed consultancy, we are entrepreneurs building Software-as-a-Service products.

We are available for projects.

You can contact me via Email, Twitter, and LinkedIn.
