Testing with Spring Kafka and MockSchemaRegistryClient

Spring Kafka provides a variety of testing utilities to make writing integration tests easier. Most notably, the @EmbeddedKafka annotation spins up an embedded broker (and zookeeper) available for tests. The address of the broker is set to the ${spring.embedded.kafka.brokers} property so that you can configure your consumers and producers appropriately. If you aren’t familiar with Spring Kafka’s testing package, go check out the documentation.
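As a minimal sketch of what an embedded-broker test class can look like (the topic name and partition count here are placeholders, not from the original post):

```java
import org.junit.jupiter.api.Test;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

// @EmbeddedKafka starts an in-memory broker before the context loads and
// publishes its address to the spring.embedded.kafka.brokers property,
// which the test properties below reference for bootstrap-servers.
@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = {"sender.t"})
class SenderIntegrationTest {

    @Test
    void contextLoads() {
        // producer/consumer assertions against the embedded broker go here
    }
}
```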

EmbeddedKafka is a great tool for many tests, but it falls flat when dealing with Avro data because it has no Schema Registry support. However, there are a few options I have explored to get around this.

  1. Run the entire Kafka stack in a docker environment and execute tests against it.
  2. Port the EmbeddedSingleNodeKafkaCluster that Confluent has engineered for their testing samples. As of writing this, it sounds like there are plans to publish a core-test-utils artifact, so keep your eyes out for that. This is a nice embedded solution that includes a broker, zookeeper, AND schema registry, along with some other useful features.
  3. Utilize the MockSchemaRegistryClient that Confluent has made available in place of the CachedSchemaRegistryClient that is used by default.

Since I didn’t need a full-fledged docker environment and wasn’t keen on porting a bunch of code, I implemented option #3. Here is the configuration I came up with so that my integration tests use an embedded Kafka broker and MockSchemaRegistryClient.

Test Properties

Spring Kafka exposes a set of properties that can be used to configure producer, consumer, and admin Kafka clients. These same properties come in handy when setting up a test environment.

The main thing to note in the properties shown below is that bootstrap-servers is set to ${spring.embedded.kafka.brokers} so that clients created for tests use the embedded broker. The schema.registry.url property is required but can be any value since the MockSchemaRegistryClient won’t use it.
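A sketch of such a test application.yml is below, assuming String keys and Avro values; the group id, topic-related settings, and serializer class choices are illustrative placeholders to adapt to your project:

```yaml
# src/test/resources/application.yml (sketch)
spring:
  kafka:
    # Resolved at runtime to the embedded broker's address
    bootstrap-servers: ${spring.embedded.kafka.brokers}
    consumer:
      group-id: test-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        # Required by the Avro (de)serializers but never contacted,
        # since the MockSchemaRegistryClient is injected in tests
        schema.registry.url: http://not-used
        specific.avro.reader: true
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      properties:
        schema.registry.url: http://not-used
```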

MockSchemaRegistryClient Configuration

To enable the MockSchemaRegistryClient in our serialization and deserialization, there are a few beans that have to be defined in the test project. The comments throughout the gist below explain what each bean is and why it’s needed.
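A sketch of that kind of test configuration follows, assuming String keys and Avro values; the class name and bean wiring are illustrative, and the key point is that a single MockSchemaRegistryClient instance backs both the serializer and deserializer so they see the same registered schemas:

```java
import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.apache.kafka.common.serialization.StringDeserializer;
import org.apache.kafka.common.serialization.StringSerializer;
import org.springframework.boot.autoconfigure.kafka.KafkaProperties;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaConsumerFactory;
import org.springframework.kafka.core.DefaultKafkaProducerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.core.ProducerFactory;

@TestConfiguration
public class MockSchemaRegistryConfig {

    // One in-memory registry shared by producer and consumer so that a
    // schema registered on send can be found on receive.
    @Bean
    public SchemaRegistryClient schemaRegistryClient() {
        return new MockSchemaRegistryClient();
    }

    // Avro serializer built around the mock client instead of the
    // CachedSchemaRegistryClient it would otherwise create from
    // schema.registry.url.
    @Bean
    public KafkaAvroSerializer kafkaAvroSerializer(SchemaRegistryClient client) {
        return new KafkaAvroSerializer(client);
    }

    @Bean
    public KafkaAvroDeserializer kafkaAvroDeserializer(
            SchemaRegistryClient client, KafkaProperties properties) {
        return new KafkaAvroDeserializer(client, properties.buildConsumerProperties());
    }

    // Factories must be given the (de)serializer instances directly;
    // otherwise Spring instantiates fresh ones from the class names in
    // the properties, bypassing the mock client.
    @Bean
    public ProducerFactory<String, Object> producerFactory(
            KafkaProperties properties, KafkaAvroSerializer serializer) {
        return new DefaultKafkaProducerFactory<>(
                properties.buildProducerProperties(), new StringSerializer(), serializer);
    }

    @Bean
    public ConsumerFactory<String, Object> consumerFactory(
            KafkaProperties properties, KafkaAvroDeserializer deserializer) {
        return new DefaultKafkaConsumerFactory<>(
                properties.buildConsumerProperties(), new StringDeserializer(), deserializer);
    }

    @Bean
    public KafkaTemplate<String, Object> kafkaTemplate(ProducerFactory<String, Object> pf) {
        return new KafkaTemplate<>(pf);
    }
}
```

Importing this class into a test (for example via @Import on the test class) is what answers the HTTP-endpoint error in the comment below: no registry server is needed because the mock client never makes a network call.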

Dependencies

To start using the Spring Kafka embedded broker alongside a MockSchemaRegistryClient, the dependencies in the snippet below should be added to your existing build.gradle. To pull any io.confluent packages, you will have to add Confluent’s Maven repository.
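A sketch of the build.gradle additions is below; the version numbers are illustrative and should be aligned with the Spring Boot and Confluent platform versions your project already uses:

```groovy
repositories {
    // Required to resolve the io.confluent artifacts
    maven { url 'https://packages.confluent.io/maven/' }
}

dependencies {
    // Embedded broker and @EmbeddedKafka support
    testImplementation 'org.springframework.kafka:spring-kafka-test'

    // Avro (de)serializers and the MockSchemaRegistryClient
    implementation 'io.confluent:kafka-avro-serializer:5.3.0'
    implementation 'io.confluent:kafka-schema-registry-client:5.3.0'
}
```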

Conclusion

With spring-kafka-test in the mix and a few additional bean configurations, you can start adding valuable test coverage to any Kafka client application that relies on Avro and the Schema Registry!


Full Sample Project

About the Author

Matt Schroeder

Director, Modern API

A wide range of professional experience and a Master’s Degree in Software Engineering have become the foundation that enables Matt to lead teams to the best solution for every problem.

One thought on “Testing with Spring Kafka and MockSchemaRegistryClient”

  1. Terry says:

    Hi Matt,

    Thanks for your good article. I am trying to use the MockSchemaRegistryClient provided by Confluent to hook up in my Kafka producer project.

    I am wondering how you solve the problem of confluent registry restful mock server? Since Spring Kafka only provide embedded Kafka but not a schema registry server.

    When I try to run the test, I always got error like:
    Failed to send HTTP request to endpoint: http://127.0.0.1:8081/subjects/sender.t-value/versions

    Thanks.

