Testing with Spring Kafka and MockSchemaRegistryClient
Spring Kafka provides a variety of testing utilities to make writing integration tests easier. Most notably, the @EmbeddedKafka annotation spins up an embedded broker (and zookeeper) available for tests. The address of the broker is set to the ${spring.embedded.kafka.brokers} property so that you can configure your consumers and producers appropriately. If you aren’t familiar with Spring Kafka’s testing package, go check out the documentation.
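If you haven't used it before, a minimal `@EmbeddedKafka` test might look like the following sketch (the class name, topic, and partition count are illustrative):

```java
import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.kafka.test.context.EmbeddedKafka;

import static org.junit.jupiter.api.Assertions.assertNotNull;

@SpringBootTest
@EmbeddedKafka(partitions = 1, topics = {"sender.t"})
class EmbeddedKafkaSmokeTest {

    // Spring Kafka sets spring.embedded.kafka.brokers to the embedded broker's
    // address, so any client configuration can reference it via a placeholder.
    @Value("${spring.embedded.kafka.brokers}")
    private String brokerAddresses;

    @Test
    void embeddedBrokerIsAvailable() {
        assertNotNull(brokerAddresses);
    }
}
```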
EmbeddedKafka is a great tool for many tests, but it falls flat when dealing with Avro data because it has no Schema Registry support. However, there are a few options I have explored to get around this.
- Run the entire Kafka stack in a Docker environment and execute tests against it.
- Port the EmbeddedSingleNodeKafkaCluster that Confluent has engineered for their testing samples. As of this writing, it sounds like there are plans to publish a core-test-utils artifact, so keep an eye out for that. This is a nice embedded solution that includes a broker, zookeeper, AND a Schema Registry instance, along with some other useful features.
- Utilize the MockSchemaRegistryClient that Confluent has made available in place of the CachedSchemaRegistryClient that is used by default.
Since I didn’t need a full-fledged Docker environment and wasn’t keen on porting a bunch of code, I implemented option #3. Here is the configuration I came up with so that my integration tests use an embedded Kafka broker and a MockSchemaRegistryClient.
Test Properties
Spring Kafka exposes a set of properties that can be used to configure producer, consumer, and admin Kafka clients. These same properties come in handy when setting up a test environment.
The main thing to note in the properties shown below is that bootstrap-servers is set to ${spring.embedded.kafka.brokers} so that clients created for tests use the embedded broker. The schema.registry.url property is required but can be any value since the MockSchemaRegistryClient won’t use it.
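A sketch of what those test properties might look like (group id, topic, and the `not-used` placeholder are illustrative; the property names follow Spring Boot's `spring.kafka.*` conventions):

```yaml
# src/test/resources/application.yml
spring:
  kafka:
    # Point all clients at the broker started by @EmbeddedKafka.
    bootstrap-servers: ${spring.embedded.kafka.brokers}
    consumer:
      group-id: test-group
      auto-offset-reset: earliest
      key-deserializer: org.apache.kafka.common.serialization.StringDeserializer
      value-deserializer: io.confluent.kafka.serializers.KafkaAvroDeserializer
      properties:
        specific.avro.reader: true
        # Required by the Avro deserializer's config validation,
        # but never contacted when the mock client is in place.
        schema.registry.url: not-used
    producer:
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
      value-serializer: io.confluent.kafka.serializers.KafkaAvroSerializer
      properties:
        schema.registry.url: not-used
```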
MockSchemaRegistryClient Configuration
To enable the MockSchemaRegistryClient in our serialization and deserialization, a few beans have to be defined in the test project. The comments throughout the gist below explain what each bean is and why it’s needed.
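The configuration might look roughly like this sketch (the class and bean names are illustrative; the key idea is that Confluent's KafkaAvroSerializer and KafkaAvroDeserializer both accept a SchemaRegistryClient constructor argument, which bypasses the default CachedSchemaRegistryClient):

```java
import io.confluent.kafka.schemaregistry.client.MockSchemaRegistryClient;
import io.confluent.kafka.schemaregistry.client.SchemaRegistryClient;
import io.confluent.kafka.serializers.KafkaAvroDeserializer;
import io.confluent.kafka.serializers.KafkaAvroSerializer;
import org.springframework.boot.test.context.TestConfiguration;
import org.springframework.context.annotation.Bean;

@TestConfiguration
public class MockSchemaRegistryConfig {

    // A single shared mock client must back both the serializer and the
    // deserializer, so that schemas registered while producing can be
    // looked up while consuming.
    @Bean
    public SchemaRegistryClient schemaRegistryClient() {
        return new MockSchemaRegistryClient();
    }

    // Constructed with the mock client instead of being instantiated
    // from a class name in the properties file.
    @Bean
    public KafkaAvroSerializer kafkaAvroSerializer(SchemaRegistryClient client) {
        return new KafkaAvroSerializer(client);
    }

    @Bean
    public KafkaAvroDeserializer kafkaAvroDeserializer(SchemaRegistryClient client) {
        return new KafkaAvroDeserializer(client);
    }
}
```

Note that for these instances to actually be used, the producer and consumer factories have to be built with them (for example, via the DefaultKafkaProducerFactory constructor that takes serializer instances) rather than letting Kafka instantiate the serializer classes named in the properties.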
Dependencies
To start using the Spring Kafka embedded broker alongside a MockSchemaRegistryClient, the dependencies in the snippet below should be added to your existing build.gradle. To pull any io.confluent packages you will have to add Confluent’s maven repository.
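A sketch of the additions (the Confluent version shown is illustrative; align it with the Kafka and Confluent Platform versions your project already uses):

```groovy
// build.gradle
repositories {
    // Required to resolve io.confluent artifacts.
    maven {
        url "https://packages.confluent.io/maven/"
    }
}

dependencies {
    testImplementation "org.springframework.kafka:spring-kafka-test"
    testImplementation "io.confluent:kafka-avro-serializer:5.3.0"
    testImplementation "io.confluent:kafka-schema-registry-client:5.3.0"
}
```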
Conclusion
With spring-kafka-test in the mix and a few additional bean configurations, you can start adding valuable test coverage to any Kafka client application that relies on Avro and the Schema Registry!
Hi Matt,
Thanks for your good article. I am trying to use the MockSchemaRegistryClient provided by Confluent in my Kafka producer project.
I am wondering how you solved the problem of mocking the Confluent registry's REST endpoint, since Spring Kafka only provides an embedded Kafka broker but not a schema registry server.
When I try to run the test, I always got error like:
Failed to send HTTP request to endpoint: http://127.0.0.1:8081/subjects/sender.t-value/versions
Thanks.