Kafka Testing using YAML

Integration testing distributed systems can be difficult. I’ve found this to be true when testing Kafka to ensure data is produced correctly. I tried using a consumer in my tests to verify that data made it into the topic. Unfortunately, this suffered from the Observer Effect: adding a test consumer triggers Kafka’s rebalancing of partitions between consumers, which made the tests unstable.

I found that a Producer Interceptor solved my testing problem in a way that is less invasive than a test consumer. (Note that for unit tests, Kafka provides the MockProducer and MockConsumer classes, which are quite powerful and should be used when applicable.)

Kafka Producer Interceptor

Producers can have interceptors, which are given the opportunity to inspect records and optionally modify them. We’ll use an interceptor that logs each record to a file in YAML format. YAML lets us append elements to the file while keeping it valid at every point; with JSON or XML we’d need closing syntax, which makes it harder to keep an appended file well-formed.
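Here is a minimal sketch of such an interceptor, assuming string-friendly keys and values; the quoting is naive and for illustration only, and the implementation details are not from the original post:

    import org.apache.kafka.clients.producer.ProducerInterceptor;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.clients.producer.RecordMetadata;

    import java.io.IOException;
    import java.io.UncheckedIOException;
    import java.nio.charset.StandardCharsets;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.nio.file.Paths;
    import java.nio.file.StandardOpenOption;
    import java.util.Map;
    import java.util.stream.Collectors;

    public class LoggingProducerInterceptor implements ProducerInterceptor<Object, Object> {

        private Path file;

        @Override
        public void configure(Map<String, ?> configs) {
            // Kafka hands interceptors the full producer config, so our custom
            // property (see the configuration below) is available here.
            file = Paths.get(String.valueOf(
                    configs.get("interceptor.LoggingProducerInterceptor.file")));
            // First entry: the producer configuration, as one YAML flow mapping.
            // Naive quoting; assumes values contain no quotes.
            append(configs.entrySet().stream()
                    .map(e -> "\"" + e.getKey() + "\": \"" + e.getValue() + "\"")
                    .collect(Collectors.joining(", ", "- {", "}")));
        }

        @Override
        public ProducerRecord<Object, Object> onSend(ProducerRecord<Object, Object> record) {
            // One YAML list element per record. Flow (JSON-style) mappings keep
            // the file valid YAML after every append.
            append(String.format("- {thread: \"%s\", key: \"%s\", value: \"%s\"}",
                    Thread.currentThread().getName(), record.key(), record.value()));
            return record; // pass the record through unmodified
        }

        @Override
        public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
            // Not needed; we only record what was sent.
        }

        @Override
        public void close() {
        }

        private synchronized void append(String line) {
            try {
                Files.write(file, (line + System.lineSeparator()).getBytes(StandardCharsets.UTF_8),
                        StandardOpenOption.CREATE, StandardOpenOption.APPEND);
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }
    }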

The interceptor is configured using the properties given to the KafkaProducer constructor:

  • interceptor.classes = LoggingProducerInterceptor
  • interceptor.LoggingProducerInterceptor.file = /tmp/producer.log
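Wiring this up when constructing the producer might look like the following sketch; the broker address and serializers are illustrative:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.common.serialization.StringSerializer;

    import java.util.Properties;

    public class ProducerSetup {
        public static KafkaProducer<String, String> createProducer() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            // Register the interceptor and tell it where to write the YAML log.
            props.put("interceptor.classes", LoggingProducerInterceptor.class.getName());
            props.put("interceptor.LoggingProducerInterceptor.file", "/tmp/producer.log");
            return new KafkaProducer<>(props);
        }
    }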

Output Format

The YAML file is an array of JSON-style objects (valid YAML, since YAML is a superset of JSON). The first entry is the configuration of the producer. Each entry after that records the producing thread’s name along with the record key and value. In a test, the thread can be used for querying in addition to the key and/or value.
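For example, after a producer sends two records, the file might look something like this (all keys and values below are hypothetical):

    # First entry: the producer configuration (abbreviated)
    - {"bootstrap.servers": "localhost:9092", "interceptor.classes": "LoggingProducerInterceptor"}
    # One entry per record sent
    - {thread: "Test worker", key: "order-1", value: "{\"status\": \"NEW\"}"}
    - {thread: "Test worker", key: "order-2", value: "{\"status\": \"SHIPPED\"}"}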

Writing the Test

Verification is done by reading the YAML file and finding the record you expect. With Groovy Iterable extensions and YAML/JSON, this is straightforward and readable.

There are two common patterns I found in my tests. First, limiting queries to the current test thread, which assumes the producer interceptor is called on the same thread as the test. Second, ignoring records that existed before the test started.

I wrote a helper class that exposes the YAML file as a collection, with optional filtering by thread and/or ignoring of pre-existing records.
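Here is a sketch of such a helper in Java, assuming the SnakeYAML library; the post’s original is Groovy, and the method names here are illustrative:

    import org.yaml.snakeyaml.Yaml;

    import java.io.IOException;
    import java.io.Reader;
    import java.io.UncheckedIOException;
    import java.nio.file.Files;
    import java.nio.file.Path;
    import java.util.Collections;
    import java.util.List;
    import java.util.Map;
    import java.util.stream.Stream;

    public class LoggingProducerOutput {

        private final Path file;
        private final int existingEntries;

        public LoggingProducerOutput(Path file) {
            this.file = file;
            // Snapshot the current size so tests can ignore pre-existing records.
            this.existingEntries = entries().size();
        }

        /** All entries in the YAML file, including the leading config entry. */
        @SuppressWarnings("unchecked")
        public List<Map<String, Object>> entries() {
            if (!Files.exists(file)) {
                return Collections.emptyList();
            }
            try (Reader reader = Files.newBufferedReader(file)) {
                List<Map<String, Object>> loaded = new Yaml().load(reader);
                return loaded == null ? Collections.emptyList() : loaded;
            } catch (IOException e) {
                throw new UncheckedIOException(e);
            }
        }

        /** Entries appended since this helper was created, on the current thread. */
        public Stream<Map<String, Object>> newRecordsOnCurrentThread() {
            String thread = Thread.currentThread().getName();
            return entries().stream()
                    .skip(existingEntries)
                    .filter(e -> thread.equals(e.get("thread")));
        }
    }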

Here’s an example of a test using LoggingProducerOutput:
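A minimal JUnit sketch under the assumptions above; the post’s actual tests are Groovy, and the topic, key, and value names here are hypothetical:

    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;
    import org.junit.jupiter.api.Test;

    import java.nio.file.Paths;
    import java.util.Properties;

    import static org.junit.jupiter.api.Assertions.assertTrue;

    class OrderProducerTest {

        @Test
        void producedRecordIsLogged() {
            // Snapshot the log first so pre-existing records are ignored.
            LoggingProducerOutput output =
                    new LoggingProducerOutput(Paths.get("/tmp/producer.log"));

            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", StringSerializer.class.getName());
            props.put("value.serializer", StringSerializer.class.getName());
            props.put("interceptor.classes", LoggingProducerInterceptor.class.getName());
            props.put("interceptor.LoggingProducerInterceptor.file", "/tmp/producer.log");

            // In a real test, the send would happen inside the code under test.
            try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
                producer.send(new ProducerRecord<>("orders", "order-1", "{\"status\": \"NEW\"}"));
            }

            // onSend runs on the sending thread, so filtering by the current
            // thread isolates this test's records.
            assertTrue(output.newRecordsOnCurrentThread()
                    .anyMatch(e -> "order-1".equals(e.get("key"))));
        }
    }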

The example is simple, but your verification doesn’t have to be: the complete record value is available to assert against.

Conclusion

I had unstable integration tests that failed about 10% of the time. Recording producer activity to a YAML file removed the instability when checking for correctly produced records, and the approach can be applied to technologies beyond Kafka.

Happy Testing!

About the Author


Patrick Double

Principal Technologist

I have been coding since 6th grade, circa 1986, and professionally since 1998, when I graduated from the University of Nebraska-Lincoln. Most of my career has been in web applications using JEE. I work the entire stack from user interface to database. I especially like solving application security and high availability problems.

