Kafka Testing using YAML

Integration testing can be difficult for distributed systems. I’ve found this to be especially true when testing that data is produced into Kafka correctly. I first tried adding a consumer in my tests to verify data made it into the topic. Unfortunately, this suffered from the Observer Effect: Kafka rebalances partitions among the consumers in a group, so the test consumer changed the behavior of the system under test and made the tests unstable.

I found that a Producer Interceptor solved my testing problem in a less invasive way. (Note that for unit tests, Kafka provides the MockProducer and MockConsumer classes, which are quite powerful and should be used when applicable.)
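For example, a minimal unit-test sketch with MockProducer; the topic name and values here are hypothetical:

```java
import org.apache.kafka.clients.producer.MockProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class MockProducerExample {
    public static void main(String[] args) {
        // MockProducer captures sends in memory; no broker is needed.
        MockProducer<String, String> producer =
                new MockProducer<>(true, new StringSerializer(), new StringSerializer());

        producer.send(new ProducerRecord<>("orders", "key-1", "value-1"));

        // history() returns every record "sent" so far, in order.
        ProducerRecord<String, String> sent = producer.history().get(0);
        System.out.println(sent.key() + " -> " + sent.value());
    }
}
```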

Kafka Producer Interceptor

Producers can have interceptors that are given the opportunity to process records and optionally modify them. We’ll use an interceptor that logs each record to a file. The file format is YAML, because YAML lets us append entries while the file remains valid at every point. With JSON or XML we’d need closing syntax, which makes it much harder to keep an append-only file well-formed.
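Here’s a minimal sketch of such an interceptor. The class and property names match the configuration below; the naive single-quoting is illustrative only, and a real implementation would serialize keys and values with a YAML or JSON library:

```java
import java.io.FileWriter;
import java.io.IOException;
import java.io.UncheckedIOException;
import java.util.Map;

import org.apache.kafka.clients.producer.ProducerInterceptor;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.clients.producer.RecordMetadata;

public class LoggingProducerInterceptor implements ProducerInterceptor<Object, Object> {

    public static final String FILE_CONFIG = "interceptor.LoggingProducerInterceptor.file";

    private String file;

    @Override
    public void configure(Map<String, ?> configs) {
        file = String.valueOf(configs.get(FILE_CONFIG));
        // First entry: the producer configuration (naively quoted here).
        append("- {config: '" + configs + "'}");
    }

    @Override
    public ProducerRecord<Object, Object> onSend(ProducerRecord<Object, Object> record) {
        // Append one YAML sequence entry in JSON flow style; the file is
        // valid YAML after every append.
        append("- {thread: '" + Thread.currentThread().getName()
                + "', key: '" + record.key()
                + "', value: '" + record.value() + "'}");
        return record; // pass the record through unmodified
    }

    @Override
    public void onAcknowledgement(RecordMetadata metadata, Exception exception) {
        // Not needed; we only log sends.
    }

    @Override
    public void close() {
    }

    private synchronized void append(String entry) {
        try (FileWriter out = new FileWriter(file, true)) { // append mode
            out.write(entry + System.lineSeparator());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```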

The interceptor is configured using the properties given to the KafkaProducer constructor, as wired up in the sketch after this list:

  • interceptor.classes = LoggingProducerInterceptor
  • interceptor.LoggingProducerInterceptor.file = /tmp/producer.log
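A sketch of that wiring, assuming a hypothetical ProducerSetup helper and a local broker at localhost:9092; custom keys are passed through the producer configuration to the interceptor’s configure() method:

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.common.serialization.StringSerializer;

public class ProducerSetup {

    public static KafkaProducer<String, String> create() {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        // "interceptor.classes" is ProducerConfig.INTERCEPTOR_CLASSES_CONFIG.
        props.put(ProducerConfig.INTERCEPTOR_CLASSES_CONFIG,
                LoggingProducerInterceptor.class.getName());
        // Our custom property, read by the interceptor in configure().
        props.put(LoggingProducerInterceptor.FILE_CONFIG, "/tmp/producer.log");

        return new KafkaProducer<>(props);
    }
}
```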

Output Format

The YAML file is a sequence of entries written as JSON objects (JSON flow style is valid YAML). The first entry is the configuration of the producer. Each entry after that records the producing thread name, key, and value of the record. The thread name can be used, in addition to the key and/or value, for querying in the test.
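For illustration, assuming the interceptor sketch above, the file might look like this (the thread name and records are hypothetical):

```yaml
- {config: '{bootstrap.servers=localhost:9092, interceptor.classes=LoggingProducerInterceptor}'}
- {thread: 'Test worker', key: 'order-123', value: '{"total": 10.00}'}
- {thread: 'Test worker', key: 'order-124', value: '{"total": 12.50}'}
```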

Writing the Test

Verification is done by reading the YAML file and finding the record you expect. With Groovy’s Iterable extensions and a YAML/JSON parser, this is straightforward and readable.

There are two common patterns I found in my tests. First, limiting the query to the current test thread; this assumes producer.send() is called on the same thread as the test, since the interceptor’s onSend() runs on the calling thread. Second, ignoring records that already existed in the file before the test started.

I wrote a helper class that exposes the YAML file as a collection, with optional filtering by thread and/or ignoring of pre-existing records.
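A sketch of what such a helper could look like, parsing the file with SnakeYAML; the real class’s API may differ:

```java
import java.io.FileReader;
import java.io.IOException;
import java.io.Reader;
import java.io.UncheckedIOException;
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

import org.yaml.snakeyaml.Yaml;

public class LoggingProducerOutput {

    private final String file;
    private final int existingRecords; // entries present before the test started

    // Assumes the interceptor has already written the configuration entry.
    public LoggingProducerOutput(String file) {
        this.file = file;
        this.existingRecords = readAll().size();
    }

    /** Records appended since this helper was created, optionally limited to one thread. */
    public List<Map<String, Object>> newRecords(String thread) {
        List<Map<String, Object>> all = readAll();
        return all.subList(Math.min(existingRecords, all.size()), all.size()).stream()
                .filter(e -> thread == null || thread.equals(e.get("thread")))
                .collect(Collectors.toList());
    }

    @SuppressWarnings("unchecked")
    private List<Map<String, Object>> readAll() {
        try (Reader reader = new FileReader(file)) {
            List<Map<String, Object>> entries = new Yaml().load(reader);
            // The first entry is the producer configuration; skip it.
            return entries == null ? List.of() : entries.subList(1, entries.size());
        } catch (IOException e) {
            throw new UncheckedIOException(e);
        }
    }
}
```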

Here’s an example of a test using LoggingProducerOutput:
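A minimal JUnit sketch, building on the hypothetical helpers above:

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.junit.jupiter.api.Test;

class LoggingProducerOutputTest {

    @Test
    void recordIsProduced() {
        // Snapshot the file first so pre-existing records are ignored.
        LoggingProducerOutput output = new LoggingProducerOutput("/tmp/producer.log");

        // Exercise the code under test; for brevity we produce directly here.
        try (KafkaProducer<String, String> producer = ProducerSetup.create()) {
            // onSend() runs on this thread, inside send(), so the record is
            // logged before send() returns.
            producer.send(new ProducerRecord<>("orders", "order-123", "{\"total\": 10.00}"));
        }

        // Limit the query to this thread and to records produced during the test.
        assertTrue(output.newRecords(Thread.currentThread().getName()).stream()
                .anyMatch(e -> "order-123".equals(e.get("key"))));
    }
}
```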

The example is simple, but your verification doesn’t have to be: the complete value of the record is available for testing.

Conclusion

I had unstable integration tests that failed about 10% of the time. Using a YAML file to record producer activity removed the instability when checking for correctly produced records. The approach can also be applied to technologies beyond Kafka.

Happy Testing!

About the Author


Patrick Double

Principal Technologist

I have been coding since 6th grade, circa 1986, and professionally since 1998, when I graduated from the University of Nebraska-Lincoln. Most of my career has been in web applications using JEE. I work the entire stack from user interface to database. I especially like solving application security and high availability problems.

