This repo contains a reference deployment that is intended to be copy/pasted for a mostly prod-ready Day Zero deployment. We intend for it to be as close to production-ready as possible, given the constraints of a reference deployment.
- Help establish a common pattern for how we consume and extend DUBBD.
- Give a good starting point for delivery engineers to start from when deploying a mission app.
- Help stakeholders understand what tools/technologies/patterns are being used by our team. If we don't talk about something, it's probably not being used.
The following is a list of everything that goes into this deployment. Each item is separately deployed in the order listed.
- Zarf Init Package
- Zarf Package for MetalLB
- Zarf Package for DUBBD
- Zarf Package for IDAM (Keycloak)
- Zarf Package for SSO (AuthService)
- Zarf Package for the mission app. In this case, the simple Podinfo app is used as a stand-in for the mission app.
This reference deployment uses several components that are still in the early stages of development. As such, this reference deployment should be considered to be at the "Experimental" maturity level.
| Component | Maturity Level | Notes |
| --- | --- | --- |
| Zarf | Late-Stage Beta | We are comfortable using Zarf in production, despite its v0.X status. |
| MetalLB | Experimental | The Zarf Package for MetalLB is very new and does not yet meet our qualifications for production use. |
| DUBBD | Mid-Stage Beta | We intend to use DUBBD in production at some point, but anticipate a lot of churn, which will likely cause some pain. |
| IDAM | Early-Stage Beta | We intend to use the IDAM package in production at some point, but anticipate a lot of churn, which will likely cause some pain. |
| SSO | Early-Stage Beta | We intend to use the SSO package in production at some point, but anticipate a lot of churn, which will likely cause some pain. |
NOTE: The prerequisites assume you have an empty Linux server. If you already have a Kubernetes cluster that you want to deploy to, instead of running `make zarf-init` as referenced below, you should run `zarf init --components=git-server --confirm`.

The Kubernetes cluster must meet the following criteria:
- Uses the amd64 architecture (sorry ARM, the Big Bang people still haven't gotten around to you yet).
- Has no existing ingress controllers or ServiceLB. If you are using K3s, Traefik and ServiceLB must be disabled.
- Runs a modern, supported version of Kubernetes that is not EOL (End of Life).
- Has enough CPU and RAM available (exact numbers TBD).
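A quick way to sanity-check an existing machine against these criteria is a small preflight script. This is a sketch, not part of the repo; K3s users can additionally start the server with `--disable traefik --disable servicelb` to satisfy the ingress/ServiceLB bullet.

```shell
#!/bin/sh
# Preflight sketch: check the host against this reference deployment's criteria.
arch="$(uname -m)"
if [ "$arch" = "x86_64" ]; then
  echo "arch ok: $arch (amd64)"
else
  echo "arch NOT supported: $arch (amd64 required)"
fi

# Only query the cluster version if kubectl is installed and can reach a cluster.
if command -v kubectl >/dev/null 2>&1; then
  kubectl version 2>/dev/null || echo "kubectl found, but no reachable cluster"
else
  echo "kubectl not found on PATH; skipping cluster version check"
fi
```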
- A Linux server
- The following tools installed locally and available on the `$PATH`:
- A file called `zarf-config.yaml` in this directory that has YOUR values configured in it. See `zarf-config.example.yaml` for an example.
- A file called `tls.cert` in this directory that has your TLS cert in it. See `zarf-config.example.yaml` for more details.
- A file called `tls.key` in this directory that has your TLS key in it. See `zarf-config.example.yaml` for more details.

NOTE: The `zarf-config.yaml`, `tls.cert`, and `tls.key` files should be treated as secrets, since they will have sensitive environment-specific data in them. They should not be checked into source control.
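One way to enforce that note is to make sure all three files are git-ignored. A minimal, idempotent sketch, run from the repo root:

```shell
# Append the secret files to .gitignore (only if not already listed),
# so they cannot be committed by accident.
for f in zarf-config.yaml tls.cert tls.key; do
  grep -qxF "$f" .gitignore 2>/dev/null || printf '%s\n' "$f" >> .gitignore
done
```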
- All of the above
- AWS creds configured and available using the standard environment variables
- Docker
- AWS CLI Session Manager Plugin
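A small script can confirm these workstation prerequisites up front. This is a sketch; the exact binary names (especially `zarf` and `session-manager-plugin`) are assumptions about how the tools appear on your `$PATH`.

```shell
# Check that the required CLIs are installed.
for tool in zarf docker aws session-manager-plugin; do
  if command -v "$tool" >/dev/null 2>&1; then
    echo "found:   $tool"
  else
    echo "MISSING: $tool"
  fi
done

# Check that the standard AWS credential environment variables are exported.
for var in AWS_ACCESS_KEY_ID AWS_SECRET_ACCESS_KEY; do
  if [ -n "$(printenv "$var")" ]; then
    echo "set:   $var"
  else
    echo "UNSET: $var"
  fi
done
```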
To have the mission app deployed and integrated with DUBBD and Keycloak in the single-node K3s cluster on the server, and to be properly redirected to Keycloak for authentication when accessing the mission app from the local machine.
- Clone this repo locally.
- Configure `zarf-config.yaml`, `tls.cert`, and `tls.key` with your environment-specific values. See the section above for more details.
- Create the Kubernetes cluster and initialize Zarf with `make zarf-init`
- Deploy the platform with `make platform-up`
- Deploy the mission app with `make mission-app-up`
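Putting the steps together, the end-to-end flow looks like this. The sketch only runs the targets when the repo's Makefile is present, so pasting it elsewhere just prints the plan.

```shell
# End-to-end flow for this reference deployment (run from the repo root).
plan="make zarf-init && make platform-up && make mission-app-up"
if [ -f Makefile ] && command -v make >/dev/null 2>&1; then
  make zarf-init        # create the K3s cluster and initialize Zarf
  make platform-up      # MetalLB, DUBBD, IDAM (Keycloak), SSO (AuthService)
  make mission-app-up   # deploy the Podinfo stand-in
else
  echo "run from the repo root: $plan"
fi
```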
Run `make help` to see all possible targets.
TODO: write this section
TODO: write this section