Sep 16, 2020

Using Conftest to Validate Configuration Files

Conftest is a utility within the Open Policy Agent ecosystem that helps simplify writing validation tests against configuration files. In a previous blog post, I wrote about using the Open Policy Agent utility directly to validate Terraform plan files. Conftest enables users to easily parse and validate many different formats of structured data. The project recently joined the CNCF and is in the late phase of its 0.x releases, so it appears to be maturing nicely into the broader ecosystem of the cloud native community.

Why Conftest?

Conftest extends the Open Policy Agent capabilities into a broad set of structured data formats. Most of our modern infrastructure and application configuration is written in one of these structured formats that conftest parses:

  • YAML
  • JSON
  • HCL
  • Dockerfile
  • TOML
  • INI
  • XML
  • Jsonnet

What does conftest do that makes it so useful against these formats? It translates the data in these files into a structure that Open Policy Agent can evaluate as an input document. For example, this is the only tool that can take a Dockerfile and structure it as input to OPA. Given the following Dockerfile:

FROM alpine:latest
RUN apk update
RUN apk add curl

Conftest then parses the Dockerfile into a JSON structure for OPA evaluation (simplified here to the fields used in the rules below):

[
  {
    "Cmd": "from",
    "Value": ["alpine:latest"]
  },
  {
    "Cmd": "run",
    "Value": ["apk update"]
  },
  {
    "Cmd": "run",
    "Value": ["apk add curl"]
  }
]

Then, we can write OPA policies against this data and share those policies across our organization.
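If you want to inspect this intermediate representation for your own files, conftest exposes it through its parse subcommand, for example:

conftest parse Dockerfile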

Conftest also enables CI/CD pipelines to utilize centralized policy validation by allowing policies to be bundled into an OCI compliant image format, using oras, and stored in an OCI compliant container registry. Most of the cloud container registries, as well as Artifactory and GitLab registries, support the OCI image specification. This means that we can develop policies in a central repository and store the bundles in one place for validation within CI/CD pipelines.

Finally, these kinds of pipelines always run into issues with exceptions to the policy statements. Creating these exceptions can often mean ignoring failing tests or skipping validation for other components in the pipeline. Fortunately, conftest allows for creating exceptions within OPA validations. Conftest expects a specific naming structure for its rules, prefixing each rule name with one of:

  • deny
  • violation
  • warn

To make a rule exceptable, name it with one of these prefixes followed by an identifying suffix, for example:

# Base images we never want to build from
denylist = [
  "centos",
  "fedora"
]

# Deny any FROM instruction whose image matches the denylist
deny_from_centos_fedora[msg] {
  input[i].Cmd == "from"
  val := input[i].Value
  contains(val[_], denylist[_])

  msg = sprintf("unallowed from image %s", [val])
}

Then, we create an exception rule whose body defines when the exception applies and whose rules list names the excepted rule with its prefix removed:

# Images labeled team=fedora-ninjas are exempt from the rule above
exception[rules] {
  input[i].Cmd == "label"
  input[i].Value[_] == "team=fedora-ninjas"

  rules = ["from_centos_fedora"]
}

Now, our policy exceptions are stored in the same repository where the policies themselves are written. This means they can be validated and tested against new policy changes, and exception pull requests can be discussed between the team that requires the exception and the policy maintainers.
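As a sketch of that validation, assuming the rules above live in package main, a unit test can hand-craft input that mimics the parsed Dockerfile structure and assert that the exception fires:

package main

# Hypothetical test: the team label grants the from_centos_fedora exception
test_team_label_grants_exception {
  exception[["from_centos_fedora"]] with input as [
    {"Cmd": "from", "Value": ["fedora:32"]},
    {"Cmd": "label", "Value": ["team=fedora-ninjas"]}
  ]
}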

Policy Maintenance

How do we go about creating this kind of centralized repository for policy maintenance and ensure that teams can easily use these policies within automated pipelines? We can start by creating a repository and populating it with Rego files. The conftest project comes with a set of example policies that you can use as a template for your own policy statements.

Then, we need teams to be able to use our policies within their pipelines. The policy repository itself can have its own CI/CD pipeline that both validates the policies and then pushes them into a central store after our tests pass.
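As a minimal sketch, that pipeline could be a short script (the registry URL is hypothetical, conftest is assumed to be on the PATH, and docker login is assumed to have already run):

#!/bin/sh
set -e

# Run the policy unit tests before publishing anything
conftest verify

# Bundle the policy directory and push it to the central registry
conftest push myregistry.mycorp.com/policy/docker:latest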

The way to structure this repository is to create a directory called policy that will store the policy files, which makes the defaults work for the conftest commands. Then, underneath this, we can create policies for each input format, for example policy/kubernetes, as sketched below.
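A minimal layout (the individual file names here are hypothetical) might look like:

policy/
├── docker/
│   └── images.rego
├── kubernetes/
│   └── deployments.rego
└── terraform/
    └── providers.rego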

Testing Policies

Open Policy Agent comes with a policy testing tool, and conftest extends that to its own formats and binary. We write Rego files with a _test suffix and then write rules that are prefixed with test. These rules look similar to our existing deny, violation, and warn rules, but they are only processed during the verify step. Conftest will run all the tests within the policy directory when you run conftest verify.
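For example, a hypothetical policy/docker/images_test.rego could exercise the deny rule from earlier; since verify cannot parse a Dockerfile for us, the input is hand-written to mimic conftest's parsed structure:

package main

# The denylisted base image should produce a deny message
test_deny_centos_base_image {
  count(deny_from_centos_fedora) > 0 with input as [
    {"Cmd": "from", "Value": ["centos:8"]}
  ]
}

# An allowed base image should produce no deny messages
test_allow_alpine_base_image {
  count(deny_from_centos_fedora) == 0 with input as [
    {"Cmd": "from", "Value": ["alpine:latest"]}
  ]
}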

Currently, the test framework cannot parse a raw configuration file into input for a test (we have to hand-write the parsed structure, as above), but conftest verify is going through some refactoring, so we can hope to see further capabilities around testing pre-parsed formats in the future.

Push Policies

This is the simplest part of the process. Given our existing directory structure, from the root of the repository we can run conftest push myregistry.mycorp.com/policy/format:latest and the policies will be bundled into an OCI compliant image and pushed to this container registry. Conftest uses the docker login credentials to authorize pushing images, so ensure that you are logged into the registry when running this command.

Consuming Policies

Finally, we need to consume policies in CI/CD pipelines. Given the different formats in our structured data, it is best to process each format separately. In a pipeline that builds a Docker image, manages some Terraform code, and then deploys to Kubernetes, we can write validations for each of our file types:

conftest pull myregistry.mycorp.com/policy/format:latest
# Dockerfile
conftest test --input dockerfile build/Dockerfile
# Kubernetes manifests
find manifests -name "*.yaml" | xargs -L1 conftest test --input yaml
# Terraform
find terraform -type f -name "*.tf" | xargs -L1 conftest test --input hcl

This allows for very fast validation across a number of files, simplifying the feedback loop between development teams and security/policy owners.

Getting Started

The simplest way to get started is to write a policy! Your organization probably has statements around Kubernetes manifests or Terraform files that you can turn into these types of validations today. Go through the examples library and check out what can be done against each file type. This project is still maturing, so it would be great to see a community build up around use cases here.
