This step-by-step guide will help you set up a messaging feature using Apache Kafka for your Spring Boot micro-service application.

Introduction

Providing feedback to users is of tremendous importance in a modern Web application, and e-mail remains the best way to reach a user who is not active on the platform. Sending an e-mail, however, can take time and may fail. In a modern micro-service architecture, the need to notify users can arise in any of many services. A robust design therefore calls for a dedicated messaging service.

I faced this issue on an ambitious project for one of our customers. The customer had an Apache Kafka infrastructure on site. I figured it could be the ideal tool to asynchronously send e-mail from various parts of our software stack.

Apache Kafka: what is it, and why use it?


As described on its website, "Apache Kafka is an open-source distributed event streaming platform used by thousands of companies for high-performance data pipelines, streaming analytics, data integration, and mission-critical applications." In other words, Kafka is message queue software on steroids. Its flexibility makes it a good fit for both simple and complex projects, and it has therefore gained vast popularity in the IT departments of influential companies.

The goal of this article is not to explain how Kafka works; that is a topic for a whole book by itself. I recommend "Kafka: The Definitive Guide" by Narkhede, Shapira, and Palino. If you have no experience with Kafka, here are a few definitions of its core concepts:

  • A topic is a queue on which messages can be pushed. Each topic has several partitions that allow parallel processing of messages (see the sketch after this list).
  • A producer is a client that pushes messages to a topic on a group of servers called brokers.
  • A consumer is a client that processes messages from a topic and is assigned partitions. Each message is processed by one consumer in each consumer group.
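
To make these concepts concrete, here is a minimal sketch of how a topic with several partitions could be declared in a Spring Boot application with spring-kafka's TopicBuilder (requires spring-kafka 2.3+). The topic name and the partition and replica counts are illustrative assumptions, not values taken from the project.


import org.apache.kafka.clients.admin.NewTopic;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.TopicBuilder;

@Configuration
public class TopicConfiguration {

    // Hypothetical topic declaration: three partitions allow up to three consumers
    // of the same consumer group to process messages in parallel. Spring Boot's
    // KafkaAdmin creates the topic on the broker at startup if it does not exist.
    @Bean
    public NewTopic projectStatusChangedTopic() {
        return TopicBuilder.name("project-status-changed")
                .partitions(3)
                .replicas(1)
                .build();
    }
}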

The messaging micro-service

[Diagram: overall architecture]

Our application is composed of several micro-services, each with its own responsibility: managing permissions, customer data, products, and so on. The strength of Kafka here is that it lets any of these micro-services send a notification by pushing a message to a single Kafka topic. On the other end of the queue, a single Spring Boot application is responsible for handling the e-mail requests of our whole application.

Now, let's set up the project

The distributed messaging system is composed of the following elements:

  • The reverse proxy Nginx
  • A Spring Boot micro-service, called messaging-api
  • Kafka, and its dependency Zookeeper
  • Mailhog: a simple mail catcher that is quite handy for debugging our e-mailing application.

I combine readily available Docker images to write a ready-to-use Docker Compose file:


[...]

  # Our messaging micro-service
  messaging-api:
    build:
      context: ./messaging-api/
      dockerfile: ./docker/Dockerfile
    depends_on:
      - nginx
      - kafka
    command: mvn spring-boot:run
    environment:
      - JAVA_TOOL_OPTIONS="-Xmx512m"
    volumes:
      - ./configuration:/tmp/configuration:delegated
      - ./messaging-api:/tmp/app:delegated
      - ~/.m2:/home/deploy/.m2:cached
    networks:
      - default

  # Zookeeper: required by Kafka
  zookeeper:
    image: 'bitnami/zookeeper:3'
    ports:
      - '2181:2181'
    volumes:
      - 'zookeeper_data:/bitnami'
    environment:
      - ALLOW_ANONYMOUS_LOGIN=yes

  # Kafka itself
  kafka:
    image: 'bitnami/kafka:2'
    ports:
      - '9092:9092'
    volumes:
      - 'kafka_data:/bitnami'
    environment:
      - KAFKA_CFG_ZOOKEEPER_CONNECT=zookeeper:2181
      - ALLOW_PLAINTEXT_LISTENER=yes
      - KAFKA_CFG_AUTO_CREATE_TOPICS_ENABLE=true
    depends_on:
      - zookeeper

  # Mailhog: mail catcher for local debugging
  mailhog:
    image: mailhog/mailhog
    ports:
      - 1025:1025 # SMTP server port
      - 8025:8025 # Web UI port

[...]

As for the Spring Boot application itself, I generated a pom.xml with Spring Initializr (https://start.spring.io/), including the Kafka, mail, and Thymeleaf dependencies. Alternatively, you can add the following to the dependencies section:


<dependency>
	<groupId>org.springframework.kafka</groupId>
	<artifactId>spring-kafka</artifactId>
</dependency>
<dependency>
	<groupId>org.springframework.kafka</groupId>
	<artifactId>spring-kafka-test</artifactId>
	<scope>test</scope>
</dependency>
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-mail</artifactId>
</dependency>
<dependency>
	<groupId>org.springframework.boot</groupId>
	<artifactId>spring-boot-starter-thymeleaf</artifactId>
</dependency>

A trivial producer for the demonstration

[Diagram: the producer side]

In an actual application using a micro-service architecture, the request for an e-mail notification could come from any service. In this example, I set up a simple Spring Boot controller directly in the messaging service. Using Kafka in such a situation is, of course, overkill, but it will serve my demonstration purposes.

To create a Kafka producer, I inject an instance of the KafkaTemplate class and use its send method. In this case, I call it from a trivial HTTP route:


@RestController
@RequestMapping
@AllArgsConstructor
@Slf4j
public class DemoController {
    private final KafkaTemplate<String, ProjectStatusChangeDto> kafkaProducer;
    // Custom properties class exposing the topic names (see the sketch below)
    private final KafkaProperties kafkaProperties;

    @PostMapping
    @ResponseStatus(HttpStatus.NO_CONTENT)
    public void sendProjectStatusEmail(@RequestBody ProjectStatusChangeDto statusChange) {
        log.info("Sending mailing request: " + statusChange.toString());
        kafkaProducer.send(kafkaProperties.getTopics().getProjectStatusChanged(), statusChange);
    }
}
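
The controller relies on two classes that the snippet does not show: the ProjectStatusChangeDto payload and a custom KafkaProperties bean exposing the topic names defined in the application file (not to be confused with Spring Boot's own KafkaProperties). Their exact content is not part of this article; the following is only a plausible sketch, with field names chosen for illustration:


// File: ProjectStatusChangeDto.java
import lombok.Data;

// Hypothetical payload pushed to the Kafka topic; adjust the fields to your needs.
@Data
public class ProjectStatusChangeDto {
    private String authorEmailAddress;
    private String projectName;
    private String newStatus;
}

// File: KafkaProperties.java
import lombok.Data;
import org.springframework.boot.context.properties.ConfigurationProperties;
import org.springframework.stereotype.Component;

// Hypothetical properties bean bound to the kafka section of application.yml,
// e.g. kafka.topics.project-status-changed.
@Data
@Component
@ConfigurationProperties(prefix = "kafka")
public class KafkaProperties {

    private Topics topics = new Topics();

    @Data
    public static class Topics {
        private String projectStatusChanged;
    }
}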

The Kafka consumer

[Diagram: the consumer side]

Configuring our application with the application file

The Spring Kafka library is configured through the application file. The spring.kafka section can be set up as follows:


spring:
  kafka:
    properties:
      security.protocol: 'PLAINTEXT'
    bootstrap-servers: kafka:9092
    producer:
      value-serializer: org.springframework.kafka.support.serializer.JsonSerializer
      key-serializer: org.apache.kafka.common.serialization.StringSerializer
    consumer:
      group-id: messaging_api
      auto-offset-reset: earliest
      key-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      value-deserializer: org.springframework.kafka.support.serializer.ErrorHandlingDeserializer
      properties:
        spring.json.trusted.packages: '*'
        spring.deserializer.key.delegate.class: org.apache.kafka.common.serialization.StringDeserializer
        spring.deserializer.value.delegate.class: org.springframework.kafka.support.serializer.JsonDeserializer
    listener:
      missing-topics-fatal: false

Note: I used the YAML format for this project, but it works just as well with a properties file. In the local environment, I use a plain-text security protocol. In production, of course, it is advisable to use SSL and a proper configuration of security keys.

The serialization formats are set in the spring.kafka.producer section. I use simple string keys and JSON for the body of the messages. For deserialization, we must set the matching formats. Please note that instead of using the StringDeserializer and JsonDeserializer directly, we use the ErrorHandlingDeserializer class and configure the actual deserializers as its delegates in the dedicated properties section. This is extremely effective at avoiding poison pill situations: if a corrupted message is pushed to the Kafka topic, deserialization will fail with an exception, the consumer will retry with the same result, and it will end up stuck on that record. The ErrorHandlingDeserializer handles deserialization errors for us. More detail about poison pills and deserialization error management can be found in the following article.

Configuring e-mail settings

The Spring SMTP client can also be configured through the application.yml file. For local development, we configure it to send e-mails to the mail catcher and deactivate authentication and TLS. In production, those parameters are of course overridden.

Mailhog can be accessed in the Web browser at http://localhost:8025.


spring:
  [...]
  mail:
    host: mailhog
    port: 1025
    properties:
      mail.smtp.auth: false
      mail.smtp.starttls.enable: false
    addresses:
      from: dev@sipios.com
      replyTo: no-reply@sipios.com

The Kafka listener

The @KafkaListener annotation is particularly handy for creating a new Kafka consumer.


@KafkaListener(topicPattern = "${kafka.topics.project-status-changed}", autoStartup = "${kafka.enabled}")
public void listenToProjectStatusChange(ConsumerRecord<String, ProjectStatusChangeDto> record) {
    log.info("Request for project status change received: " + record.toString());

    ProjectStatusChangeDto payload = record.value();

    try {
        emailService.sendEmail(
            payload.getAuthorEmailAddress(),
            "Votre demande",
            templateService.generateProjectStatusChangeEmail(payload)
        );
    } catch (MailException e) {
        log.error("Could not send e-mail", e);
    }
}

The e-mailing service

Using the e-mail library from the Spring framework, developing a service that sends e-mails takes only a few lines of code:


public void sendEmail(String recipient, String subject, EmailContentDto content) throws MailException {
    MimeMessagePreparator messagePreparator = mimeMessage -> {
        MimeMessageHelper messageHelper = new MimeMessageHelper(mimeMessage, true);
        messageHelper.setFrom(emailFromAddress);
        messageHelper.setReplyTo(emailReplyToAddress);
        messageHelper.setTo(recipient);
        messageHelper.setSubject(subject);
        messageHelper.setText(content.getText(), content.getHtml());
    };
    emailSender.send(messagePreparator);
}
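
For completeness, here is a sketch of how the enclosing service could be wired. The JavaMailSender is auto-configured by spring-boot-starter-mail from the spring.mail section above; the from and reply-to property paths are assumptions matching the configuration shown earlier, not code taken from the project:


import org.springframework.beans.factory.annotation.Value;
import org.springframework.mail.javamail.JavaMailSender;
import org.springframework.stereotype.Service;

// Hypothetical enclosing service for the sendEmail method shown above.
@Service
public class EmailService {

    private final JavaMailSender emailSender;
    private final String emailFromAddress;
    private final String emailReplyToAddress;

    public EmailService(JavaMailSender emailSender,
                        @Value("${spring.mail.addresses.from}") String emailFromAddress,
                        @Value("${spring.mail.addresses.replyTo}") String emailReplyToAddress) {
        this.emailSender = emailSender;
        this.emailFromAddress = emailFromAddress;
        this.emailReplyToAddress = emailReplyToAddress;
    }

    // sendEmail(...) as shown above
}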

A word on templating

The generation of the e-mail content is out of the scope of this article. The Thymeleaf templating engine, which integrates natively with Spring, provides a practical way of generating both the text and HTML versions of an e-mail.
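
As an illustration only, such a template service could render both versions with the SpringTemplateEngine auto-configured by spring-boot-starter-thymeleaf. The template names, the EmailContentDto constructor, and the presence of a plain-text template resolver alongside the default HTML one are assumptions:


import lombok.AllArgsConstructor;
import org.springframework.stereotype.Service;
import org.thymeleaf.context.Context;
import org.thymeleaf.spring5.SpringTemplateEngine;

// Hypothetical template service used by the Kafka listener shown earlier.
@Service
@AllArgsConstructor
public class TemplateService {

    private final SpringTemplateEngine templateEngine;

    public EmailContentDto generateProjectStatusChangeEmail(ProjectStatusChangeDto payload) {
        Context context = new Context();
        context.setVariable("statusChange", payload);

        // Render a plain-text and an HTML version of the same e-mail
        // (assumes template resolvers are configured for both modes).
        String text = templateEngine.process("project-status-change.txt", context);
        String html = templateEngine.process("project-status-change.html", context);
        return new EmailContentDto(text, html);
    }
}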

A word on error management

The attentive reader will have noted that, besides deserialization errors, I do not manage application errors in this example. A more suitable design would require handling thrown exceptions and potentially implementing a retry strategy (a possible starting point is sketched below). This is outside the scope of this article, but I recommend the following article if you want to dive further into the topic of error management.
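
As a possible starting point, here is a sketch of a retry strategy with spring-kafka 2.3+, using a SeekToCurrentErrorHandler that retries a failed record a couple of times and then publishes it to a dead-letter topic. This is my own suggestion, not part of the project's code:


import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
import org.springframework.kafka.core.ConsumerFactory;
import org.springframework.kafka.core.KafkaTemplate;
import org.springframework.kafka.listener.DeadLetterPublishingRecoverer;
import org.springframework.kafka.listener.SeekToCurrentErrorHandler;
import org.springframework.util.backoff.FixedBackOff;

@Configuration
public class KafkaErrorHandlingConfiguration {

    @Bean
    public ConcurrentKafkaListenerContainerFactory<Object, Object> kafkaListenerContainerFactory(
            ConsumerFactory<Object, Object> consumerFactory,
            KafkaTemplate<Object, Object> kafkaTemplate) {
        ConcurrentKafkaListenerContainerFactory<Object, Object> factory =
                new ConcurrentKafkaListenerContainerFactory<>();
        factory.setConsumerFactory(consumerFactory);
        // Retry each failed record twice with a one-second pause, then publish it
        // to a dead-letter topic (<original topic>.DLT by default).
        factory.setErrorHandler(new SeekToCurrentErrorHandler(
                new DeadLetterPublishingRecoverer(kafkaTemplate), new FixedBackOff(1000L, 2L)));
        return factory;
    }
}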

Conclusion

In this article, I explained how to set up a simple e-mailing micro-service built with Spring Boot and running on a Kafka infrastructure. It allows applications in a micro-service architecture to asynchronously send e-mails to their users. If you want more details on the technical implementation, or want to bootstrap your own micro-service, the code for this project is hosted on my GitHub page.