Concourse CI
Concourse is a CI (Continuous Integration) system built out of various pieces of the Cloud Foundry Linux container platform.
It allows users to define pipelines composed of multiple resources and jobs organized in a [DAG (directed acyclic graph)](en.wikipedia.org/wiki/Directed_acyclic_graph).
Jobs can both consume and produce resources. For example, a quintessential job might clone a git repository, perform a build or compilation step, and then publish the finished, packaged executable application to a storage server as a new resource. A test job may then download the application and run it through an automated test.
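As a rough sketch, a minimal pipeline for a flow like that might look something like the following. The resource name, repository URI, and task file here are made up for illustration; they are not an actual cyberia pipeline.

```yaml
# hypothetical pipeline sketch -- names and URIs are illustrative
resources:
  - name: app-source
    type: git
    source:
      uri: https://example.com/our-app.git
      branch: main

jobs:
  - name: build
    plan:
      - get: app-source
        trigger: true            # run whenever a new commit appears
      - task: compile
        file: app-source/ci/compile.yml
```

A pipeline like this would be uploaded with the set-pipeline command shown later in this document.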
Concourse has a hub-and-spoke architecture similar to most Continuous Integration systems: there is a central node which runs the Concourse web and API server, called [atc (Air Traffic Control)](github.com/concourse/concourse/tree/master/atc).
Then there is a Concourse worker which runs on one or more "build agent" computers or virtual machines, staying in constant contact with the hub server. Currently we have installed the build agent on an old tower known as dredd.
⚠️ user experience warning ⚠️
Concourse has a lot of great features, but it can feel a bit rough around the edges in the usability department.
In order to use Concourse, you have to use both the web application and the CLI (Command Line Interface) at the same time.
The web interface acts as a dashboard, allowing you to quickly view an overview of the status of your pipelines and their jobs, as well as drill down and figure out why a particular job failed.
You can also use the web interface to retry a failed build.
However, everything else you do with Concourse, like browsing the teams, users, and build agents or managing your pipelines, must be done with the CLI. Conversely, you can't use the CLI to monitor the status of your build; you have to use the web app for that.
Using the web app
Go to concourse.cyberia.club/
Using the CLI
You may download the fly CLI application from concourse-ci.org or from the latest stable GitHub release: github.com/concourse/concourse/releases
You will have to either add the fly executable binary to your PATH variable or run fly directly by its file path, like /Users/slenderman/Desktop/fly ... or cd ~/Desktop; ./fly ...
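The PATH approach can be sketched like this. The install location ~/.local/bin is just an example, and a stub file stands in for the real downloaded binary so the sketch runs anywhere; substitute wherever you actually downloaded fly.

```shell
# sketch of putting fly on your PATH; the stub below stands in for the
# real downloaded binary -- substitute your actual download
mkdir -p "$HOME/Desktop" "$HOME/.local/bin"
printf '#!/bin/sh\necho "fly stub"\n' > "$HOME/Desktop/fly"  # stand-in for the download
mv "$HOME/Desktop/fly" "$HOME/.local/bin/fly"
chmod +x "$HOME/.local/bin/fly"                # make sure it is executable
export PATH="$HOME/.local/bin:$PATH"           # add to PATH for this shell
command -v fly                                 # prints the resolved path, proving PATH lookup works
```

To make the PATH change permanent you would add the export line to your shell's startup file (for example ~/.bashrc).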
First, you have to log into the Concourse server using the CLI. When you log in, you must provide the URL of the server you are logging into, as well as a "tag", a short name you will use to refer to that server in every command after you log in.
fly -t cyberia login -c https://concourse.cyberia.club/
This will create a file named ~/.flyrc.
⚠️ This file will contain an auth token that can be used to execute arbitrary code on our servers! ⚠️
Luckily, it should be readable and writable only by the user who invoked the fly login command. If you wish to be a paranoid perry, you can further restrict access to this file by making sure it is written to some sort of encrypted volume and running the fly command as a separate user.
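If you want to double-check, you can verify and tighten the permissions yourself. This is a small sketch; chmod 600 is the owner-read/write-only mode, and the touch is only there so the demo works even before your first login.

```shell
# check and tighten permissions on the token file
touch "$HOME/.flyrc"        # ensure the file exists for this demo
chmod 600 "$HOME/.flyrc"    # owner read/write only
ls -l "$HOME/.flyrc"        # expect: -rw-------
```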
Once you have logged in, you can perform various actions with the CLI:
# print a list of all the available commands
fly -t cyberia
# list the workers
fly -t cyberia workers
# view a pipeline's configuration
fly -t cyberia get-pipeline -p example-pipeline
# trigger a job to run
fly -t cyberia trigger-job -j example-pipeline/example-job
# create or update a pipeline's configuration
fly -t cyberia set-pipeline -p example-pipeline -c example-pipeline.yml
Secrets and Vault
We set up Vault to handle Concourse pipeline secrets. Our Vault installation is deliberately simple, with the minimum operational complexity required to integrate with Concourse. Vault unseals itself on startup and has one policy and one AppRole set up for Concourse. Administrators may log into rosewater.cyberia.club, the server that hosts Concourse, and manage the secrets.
See the [vault readme in the ops-handbook repository](git.cyberia.club/services/ops-handbook/tree/ansible/roles/concourse-vault/files/README?h=vault-wip) for more information on how to add secrets and use them in your pipelines.
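For illustration, a pipeline references a secret with Concourse's ((...)) interpolation syntax, which is resolved from the credential manager (Vault, in our case) when the pipeline runs. The secret name and resource below are hypothetical.

```yaml
# hypothetical fragment -- the secret name and repository are illustrative
resources:
  - name: app-source
    type: git
    source:
      uri: git@example.com:our-app.git
      private_key: ((app-deploy-key))   # looked up in vault at run time
```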