Monthly Archives: January 2019

All about Kafka Streaming

Lately, I have been hearing a lot about Kafka streaming. Even though I have worked on microservices, I haven’t really tackled heavy-data applications. In my previous experience, we did deal with heavy data for health insurance benefits, but that was very different.

Now, with Netflix and Amazon, data streaming has become a major focus. With growing technology and information, it has become even more important to handle the growing data. In simple terms, all web applications should be able to process large data sets with good performance. Data set size should not deter application usage.

What is Kafka Data Streaming?

We used to process large data in batches, but batch processing is not continuous, and it often doesn’t work for real-time application scenarios. For an application like Netflix, batch processing will never work. What’s the alternative? Data streaming. Data streaming is the process of sending data sets continuously, and it is the backbone of applications like Netflix and Amazon. For the growing social network platforms too, data streaming is at the heart of handling large data.

The streamed data is often used for real-time aggregation and correlation, filtering, or sampling. One of the major benefits of data streaming is that it allows us to view and analyze data in real-time.
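As a toy illustration in plain Java (not a streaming framework — `ClickEvent`, the method name, and the 100 ms threshold are all made up for this sketch), filtering and aggregating a stream of events might look like this:

```java
import java.util.List;

public class StreamDemo {
    // A hypothetical click event with a user and a duration in milliseconds.
    record ClickEvent(String user, long durationMs) {}

    // Drop very short clicks (filtering), then average the rest (aggregation).
    static double averageLongClickDuration(List<ClickEvent> events) {
        return events.stream()
                .filter(e -> e.durationMs() > 100)  // filtering step
                .mapToLong(ClickEvent::durationMs)
                .average()                          // aggregation step
                .orElse(0.0);
    }

    public static void main(String[] args) {
        List<ClickEvent> events = List.of(
                new ClickEvent("alice", 250),
                new ClickEvent("bob", 50),    // removed by the filter
                new ClickEvent("alice", 350));
        System.out.println(averageLongClickDuration(events)); // prints 300.0
    }
}
```

A real streaming platform applies the same kind of pipeline continuously over unbounded data rather than over an in-memory list.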

A few challenges that data streaming often faces:

  1. Scalability
  2. Durability of data
  3. Fault Tolerance

Tools for Data Streaming

There are a bunch of tools available for data streaming. Amazon offers Kinesis, and Apache has a few open-source tools of its own: Kafka, Storm, and Flink. In future posts, I will talk more about Apache Kafka and its usage; here I am just giving a brief idea of it.

Apache Kafka Streaming

Apache Kafka is a real-time distributed streaming platform. Basically, it allows us to publish and subscribe to streams of records.

Apache Kafka has two main use cases:

  1. Building data pipelines where records are streamed continuously.
  2. Building applications that can consume data pipelines and react accordingly.

Above all, the basic idea that Kafka has adopted comes from Hadoop. It runs as a cluster of one or more servers that can span multiple data centers, and these clusters store data streams. Each record in a stream comprises a key, a value, and a timestamp. Kafka provides four main APIs: Producer, Consumer, Streams, and Connector.
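To make the record structure concrete, here is a minimal sketch in plain Java (`StreamRecord` is a hypothetical stand-in for illustration, not Kafka’s actual record class) of the key/value/timestamp shape each record has:

```java
public class RecordDemo {
    // A simplified stand-in for a Kafka record: a key, a value, and a timestamp.
    record StreamRecord<K, V>(K key, V value, long timestamp) {}

    public static void main(String[] args) {
        StreamRecord<String, String> record =
                new StreamRecord<>("user-42", "page-view", System.currentTimeMillis());
        System.out.println(record.key() + " -> " + record.value()); // prints user-42 -> page-view
    }
}
```

In Kafka itself, the key determines which partition a record lands on, which is how related records stay ordered.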

In future posts, I will break down these APIs in detail, with their usage in a sample application.



How to send email with Spring Boot

Scenario

In this post, I show how to use email configuration inside a Spring Boot application. The use case I want to discuss here: you have a Contact Us page on your web application where you allow users to send email to your sales or support team. How do you accomplish sending an email from that Contact Us form using Spring Boot?

What you will need

  • Java 8
  • IntelliJ
  • A Spring boot based web application

Use case solution

As part of this post, I will not describe how to build a Spring Boot based web application. You can visit some of my older posts: SaaS application, Web application with Spring Boot security, or Spring Boot application with Docker. Even though none of these applications has a Contact Us page, I recommend adding that page with a simple form like the one below:

Contact Us

I have used a Bootstrap template to build this form. This form is outside the web application, but I have a similar form inside the web application for users to contact the sales or support team. In this case, a user who wants to sign up for the application can contact my sales team.

Now, to use the Spring-provided facility for sending email from the application to your designated address, we will add the following library:

compile('it.ozimov:spring-boot-email-core:0.6.3')

This library provides an EmailService that wraps the spring-boot-starter-mail library, so we do not have to write the boilerplate code for sending email ourselves. In this example, I will show how this EmailService can be used to write a simple method that sends an email.

First, we need to enable the email tools by adding the @EnableEmailTools annotation to our main Spring application class. Once that is done, we can write a simple method to send email. This method looks like this:
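For reference, a minimal sketch of that step, assuming a hypothetical application class name (and note the @EnableEmailTools import path may differ between versions of spring-boot-email-core — check the version you pulled in):

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
// From the spring-boot-email-core jar; verify the package for your version.
import it.ozimov.springboot.mail.configuration.EnableEmailTools;

@SpringBootApplication
@EnableEmailTools // registers the EmailService bean from spring-boot-email-core
public class DemoApplication {
    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```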

@Autowired
private EmailService emailService;

private void sendEmail(String emailAddress, String message, String phoneno)
        throws UnsupportedEncodingException, AddressException
{
    // The recipient: our sales/support mailbox.
    ArrayList<InternetAddress> emails = new ArrayList<>();
    emails.add(new InternetAddress("betterjavacode.com@gmail.com"));
    // Build the email, using the visitor's address as the from address.
    final Email email = DefaultEmail.builder()
            .from(new InternetAddress(emailAddress))
            .to(emails)
            .subject("Sales Support")
            .body(message + "\n" + phoneno)
            .encoding("UTF-8").build();

    // EmailService (from spring-boot-email-core) performs the SMTP send.
    emailService.send(email);
}

Now, to make this email service work, we still have to provide SMTP server properties along with the sending account's email address and password. In the example above, betterjavacode.com@gmail.com is the Gmail account used to send the email.

Adding the following properties in application.properties will set up our SMTP host for sending the email:

spring.mail.host = smtp.gmail.com
spring.mail.port = 587
spring.mail.username = betterjavacode.com@gmail.com
spring.mail.password =*****************
spring.mail.properties.mail.smtp.starttls.enable=true
spring.mail.properties.mail.smtp.starttls.required=true
spring.mail.properties.mail.smtp.auth=true
spring.mail.properties.mail.smtp.connectiontimeout=5000
spring.mail.properties.mail.smtp.timeout=5000
spring.mail.properties.mail.smtp.writetimeout=5000

Conclusion

In this post, I showed how to send email using the Spring Boot email configuration feature.