
Connect Spring Boot Application with AWS DynamoDB

In this post, I show how to connect a Spring Boot application with AWS DynamoDB. I will also cover some fundamentals of AWS DynamoDB, which is a NoSQL database.

AWS DynamoDB

As per the Amazon documentation, DynamoDB is a NoSQL key-value and document database. Alternatives include Cassandra (key-value) and MongoDB (document).

DynamoDB offers:

  • reliable, scalable performance
  • a simple API for key-value access

DynamoDB is usually a great fit for applications with the following requirements:

  1. Large amounts of data with low-latency access requirements
  2. Data sets for recommendation systems
  3. Serverless applications with AWS Lambda

Key Concepts

Before we can use DynamoDB, it is important to understand some key concepts about this database.

  • Tables, Items, and Attributes – These three are the fundamental building blocks of DynamoDB. A table is a grouping of data records. An item is a single data record in a table; each item in a table is uniquely identified by its primary key. Attributes are the pieces of data in a single item.
  • Primary Keys – DynamoDB tables are schemaless; we only need to define a primary key when creating the table. There are two types of primary keys: a simple primary key (partition key only) and a composite primary key (partition key plus sort key).
  • Secondary Indexes – Sometimes primary keys are not enough to access data efficiently. Secondary indexes enable additional access patterns in DynamoDB. There are two types of indexes: local secondary indexes and global secondary indexes. A local secondary index uses the same partition key as the underlying table but a different sort key. A global secondary index uses a partition key and sort key different from those of the underlying table.
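To make the primary key concept concrete, here is a plain-Java sketch (deliberately not the DynamoDB API; the class and method names are illustrative only) of how a composite primary key, partition key plus sort key, uniquely identifies an item:

```java
import java.util.HashMap;
import java.util.Map;

// Toy illustration of a composite primary key: the partition key and
// sort key together identify exactly one item.
public class CompositeKeyDemo {
    private static final Map<String, String> ITEMS = new HashMap<>();

    static String compositeKey(String partitionKey, String sortKey) {
        // DynamoDB stores these separately; joining them here just
        // mimics "both parts are needed to locate an item".
        return partitionKey + "#" + sortKey;
    }

    static void put(String partitionKey, String sortKey, String value) {
        ITEMS.put(compositeKey(partitionKey, sortKey), value);
    }

    static String get(String partitionKey, String sortKey) {
        return ITEMS.get(compositeKey(partitionKey, sortKey));
    }

    public static void main(String[] args) {
        put("customer-1", "2021-01-05", "order A");
        put("customer-1", "2021-02-10", "order B");
        // Both key parts are required for an exact lookup.
        System.out.println(get("customer-1", "2021-01-05"));
    }
}
```

A simple primary key works the same way, except the item is located by the partition key alone.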

Applications with DynamoDB

There is one difference between DynamoDB and other SQL or NoSQL databases: we interact with DynamoDB through HTTP-based API calls. We do not need JDBC connection protocols, where applications must maintain consistent connections.

There are two ways we can connect applications to Dynamo DB.

  1. Use Spring Data Library with Dynamo DB
  2. Use an AWS SDK provided client

Spring Boot Application

As part of this demo, we will create some data model classes that depict entity relationships. Subsequently, the application will provide a simple REST API for CRUD operations, and it will store the data in DynamoDB.

So let’s start with adding the required dependencies in our application:


dependencies {
	implementation 'org.springframework.boot:spring-boot-starter-web'
	implementation 'io.github.boostchicken:spring-data-dynamodb:5.2.5'
	testImplementation 'junit:junit:4.13.1'
	testImplementation 'org.springframework.boot:spring-boot-starter-test'
}

The spring-data-dynamodb dependency allows us to represent DynamoDB tables as model classes and to create repositories for those tables.
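The post never shows the repository that the service layer autowires; a minimal sketch, assuming the spring-data-dynamodb library's @EnableScan annotation and Spring Data's CrudRepository, might look like this:

```java
package com.betterjavacode.dynamodbdemo.repositories;

import com.betterjavacode.dynamodbdemo.models.Company;
import org.socialsignin.spring.data.dynamodb.repository.EnableScan;
import org.springframework.data.repository.CrudRepository;

// @EnableScan allows findAll() to run as a DynamoDB scan.
@EnableScan
public interface CompanyRepository extends CrudRepository<Company, String>
{
}
```

CrudRepository supplies the save, findById, and findAll methods used later in the service class.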

We will create our model class Company as follows:


package com.betterjavacode.dynamodbdemo.models;

import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAutoGeneratedKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

@DynamoDBTable(tableName = "Company")
public class Company
{
    private String companyId;

    private String name;

    private String type;

    @DynamoDBHashKey(attributeName = "CompanyId")
    @DynamoDBAutoGeneratedKey
    public String getCompanyId ()
    {
        return companyId;
    }


    public void setCompanyId (String companyId)
    {
        this.companyId = companyId;
    }

    @DynamoDBAttribute(attributeName = "Name")
    public String getName ()
    {
        return name;
    }

    public void setName (String name)
    {
        this.name = name;
    }

    @DynamoDBAttribute(attributeName = "Type")
    public String getType ()
    {
        return type;
    }

    public void setType (String type)
    {
        this.type = type;
    }
}

So this class Company maps to the DynamoDB table of the same name. The annotation DynamoDBTable establishes this mapping. Similarly, DynamoDBHashKey marks the partition key of the table, and DynamoDBAttribute marks the remaining attributes.

We will create a REST Controller and a Service class that will allow us to call the CRUD APIs for this object.


package com.betterjavacode.dynamodbdemo.controllers;

import com.betterjavacode.dynamodbdemo.models.Company;
import com.betterjavacode.dynamodbdemo.services.CompanyService;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.http.HttpStatus;
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.*;

@RestController
@RequestMapping("v1/betterjavacode/companies")
public class CompanyController
{
    @Autowired
    private CompanyService companyService;

    @GetMapping(value = "/{id}", produces = "application/json")
    public ResponseEntity<Company> getCompany(@PathVariable("id") String id)
    {

        Company company = companyService.getCompany(id);

        if(company == null)
        {
            return new ResponseEntity<>(HttpStatus.BAD_REQUEST);
        }
        else
        {
            return new ResponseEntity<>(company, HttpStatus.OK);
        }
    }

    @PostMapping()
    public Company createCompany(@RequestBody Company company)
    {
        Company companyCreated = companyService.createCompany(company);

        return companyCreated;
    }
}



So we have two methods: one to fetch the company data and another one to create a company.


package com.betterjavacode.dynamodbdemo.services;

import com.betterjavacode.dynamodbdemo.models.Company;
import com.betterjavacode.dynamodbdemo.repositories.CompanyRepository;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Service;

import java.util.List;
import java.util.Optional;

@Service
public class CompanyService
{
    @Autowired
    private CompanyRepository companyRepository;

    public Company createCompany(final Company company)
    {
        Company createdCompany = companyRepository.save(company);
        return createdCompany;
    }

    public List<Company> getAllCompanies()
    {
        return (List<Company>) companyRepository.findAll();
    }

    public Company getCompany(String companyId)
    {
        Optional<Company> companyOptional = companyRepository.findById(companyId);

        if(companyOptional.isPresent())
        {
            return companyOptional.get();
        }
        else
        {
            return null;
        }
    }
}

Connect Spring Boot Application with AWS DynamoDB

So far, we have built parts of the application. But we still have an important step left: connecting our application to the DynamoDB service in AWS.

Log in to the AWS Console and open DynamoDB.

Create a new table in Dynamo DB.

Spring Boot Connect AWS Dynamo DB - Table

Assuming you chose CompanyId as the primary key, we should be fine here. Remember, that’s the partition key we defined in our model class.

Now back to the Spring Boot application. Create a new configuration class, ApplicationConfig, to define the DynamoDB configuration.

package com.betterjavacode.dynamodbdemo.config;

import com.amazonaws.auth.AWSCredentials;
import com.amazonaws.auth.AWSCredentialsProvider;
import com.amazonaws.auth.AWSStaticCredentialsProvider;
import com.amazonaws.auth.BasicAWSCredentials;
import com.amazonaws.regions.Regions;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import org.socialsignin.spring.data.dynamodb.repository.config.EnableDynamoDBRepositories;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
@EnableDynamoDBRepositories(basePackages = "com.betterjavacode.dynamodbdemo.repositories")
public class ApplicationConfig
{
    @Value("${amazon.aws.accesskey}")
    private String amazonAccessKey;

    @Value("${amazon.aws.secretkey}")
    private String amazonSecretKey;

    public AWSCredentialsProvider awsCredentialsProvider()
    {
        return new AWSStaticCredentialsProvider(amazonAWSCredentials());
    }

    @Bean
    public AWSCredentials amazonAWSCredentials()
    {
        return new BasicAWSCredentials(amazonAccessKey, amazonSecretKey);
    }

    @Bean
    public AmazonDynamoDB amazonDynamoDB()
    {
        return AmazonDynamoDBClientBuilder.standard().withCredentials(awsCredentialsProvider()).withRegion(Regions.US_EAST_1).build();
    }
}

We will need to pass the accessKey and secretKey in application.properties. Importantly, we are creating an AmazonDynamoDB bean here.
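For reference, the corresponding entries in application.properties might look like the following (the property names match the @Value expressions above; the values are placeholders, and real credentials should never be committed to source control):

```
amazon.aws.accesskey=<your-access-key>
amazon.aws.secretkey=<your-secret-key>
```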

Now, let’s start our application, and we will see a log line showing that it has created a connection with the DynamoDB table Company.

Once the application has started, we can use Postman to call the REST API.

Conclusion

Code for this demo is available in my GitHub repository.

In this post, we showed how to use DynamoDB, a NoSQL database, in a Spring Boot application.

  • We went over Dynamo DB concepts.
  • And we created a Spring Boot application.
  • We created a Dynamo DB table in AWS.
  • We connected the Spring Boot application to the AWS Dynamo DB table.


Introduction to Serverless Architecture Patterns

In this post, I will cover serverless architecture patterns. With multiple cloud providers available, maintaining your own on-premises infrastructure is increasingly uncommon. By a simple definition, serverless would mean the absence of a server. But is that true? Not really. To start, we will look at serverless architecture basics, and then its benefits and drawbacks.

What is Serverless Architecture?

Lately, serverless architecture has become more of a trend. The developer still writes the server-side code, but it runs in stateless compute containers instead of on traditional server architecture. An event triggers this code, and a third-party service (like AWS Lambda) manages it. Basically, this is Function as a Service (FaaS). AWS Lambda is the most popular form of FaaS.

So the definition of Serverless Architecture –

Serverless Architecture is a design pattern where applications are hosted by a third-party service, eliminating the need for server software and hardware.

In traditional architecture, a user performs an action on the UI, which sends a request to the server, where the server-side code performs some database transaction. With this architecture, the client has no idea what is happening, as most of the logic lives on the server side.

With Serverless Architecture, we will have multiple functions (lambdas) for individual services and the client UI will call them through API-Gateway.

So in the above architecture

  1. A Client UI when accessed, authenticates the user through an authentication function that will interact with the user database.
  2. Similarly, once the user logs in, he can purchase or search for products using purchase and search functions.

In traditional server-based architecture, there was a central piece that was managing flow, control, and security. In Serverless architecture, there is no central piece. The drawback of serverless architecture is that we then rely on the underlying platform for security.

Why Serverless Architecture?

With traditional architecture, you used to own a server, and then you would configure the web server and the application. Then came the cloud revolution, and now everyone wants to be on the cloud. But even with multiple cloud providers, we still need to manage the operating system and the web server.

What if there were a way to focus solely on the code while a service manages the server and the web server? AWS provides Lambda and Microsoft Azure provides Functions to take care of the physical hardware, virtual operating systems, and web server. This is how serverless architecture reduces complexity, by letting developers focus only on code.

Function as a Service (FaaS)

We have covered some ground with serverless architecture. But this pattern is possible only with Function as a Service. AWS Lambda is one implementation of FaaS. Basically, FaaS is about running backend code without managing any servers or server applications.

One key advantage of a FaaS offering like AWS Lambda is that you can write code in any supported programming language (such as Java, JavaScript, Python, or Ruby), and the Lambda infrastructure will take care of setting up the environment for that language.

Another advantage of FaaS is horizontal scaling: it is mostly automatic and elastic, and the cloud provider handles it.
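As a rough sketch of the FaaS programming model in plain Java (the class and method names here are illustrative assumptions; a real AWS Lambda handler would implement the RequestHandler interface from the aws-lambda-java-core library), the unit of deployment is just a function that the platform invokes once per event:

```java
import java.util.Map;

// Sketch of a FaaS-style function: no server, no request loop; the
// platform calls handleRequest once per incoming event.
public class GreetFunction {
    public String handleRequest(Map<String, String> event) {
        String name = event.getOrDefault("name", "world");
        return "Hello, " + name;
    }
}
```

The platform scales horizontally by running as many concurrent copies of this function as the event volume demands.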

Tools

There are a number of tools available for building serverless functions. The Serverless Framework, in particular, makes building them an easy process.

Benefits

  1. The key benefit of using Serverless Architecture is reduced operational cost. Once a cloud provider takes care of infrastructure, you do not have to focus on infrastructure.
  2. Faster Deployment, great flexibility – Speed helps with innovation. With faster deployment, serverless makes it easier to change functions and test the changes.
  3. Reduced scaling cost and time. With infrastructure providers handling most of the horizontal scaling, the application owner doesn’t have to worry about scaling.
  4. Focus more on UX – With the infrastructure concerns handled, developers can focus more on user experience (UX).

Drawbacks

As with any architectural design, there are tradeoffs.

  1. Vendor Control – By using an infrastructure vendor, we are giving away the control for backend service. We have to rely on their security infrastructure instead of designing our own.
  2. Running long-lived or high-volume workloads can be more costly on serverless than on dedicated servers.
  3. Repetition of logic – Database migration means repetition of code and coordination for the new database.
  4. Startup latency issues – When an initial request comes in, the platform must start the required resources before it can serve the request, which causes cold-start latency.

Conclusion

In this post, I discussed Serverless Architecture Patterns and what makes it possible to use this pattern. I also discussed the benefits and drawbacks. If you have more questions about Serverless Architecture, please post your comment and I will be happy to answer. You can subscribe to my blog here.

References

  1. Serverless Architecture – Serverless

How To Use Spring Security With SAML Protocol Binding

In this post, I will show how we can use Spring Security with SAML protocol binding to integrate with the Keycloak identity provider. If you want to read about how to use Keycloak, you can read here.

What is SAML?

SAML stands for Security Assertion Markup Language. It is an open standard for exchanging authentication and authorization data between a service provider (SP) and an identity provider (IdP).

Identity Provider – performs authentication, validates the user’s identity for authorization, and passes that information to the service provider.

Service Provider – trusts the identity provider and grants the user access to the service based on the authorization.

SAML Authentication Flow

As part of this flow, we will build a simple To-Do List application. A user will access the application and will be redirected for authentication.

SAML Authentication User Flow:

  1. User accesses Service Provider (SP) ToDo List Application.
  2. Application redirects user to Keycloak login screen. During this redirect, the application sends an AuthnRequest to Keycloak IDP.
  3. Keycloak IDP validates the request if it is coming from the right relying party/service provider. It checks for issuer and redirect URI (ACS URL).
  4. Keycloak IDP sends a SAML response back to Service Provider.
  5. Service Provider validates the signed response with provided IDP public certificate.
  6. If the response is valid, we will extract the attribute NameID from the assertion and log the user in.

Note: The Spring Security SAML extension was a separate library that used to provide SAML support. After 2018, the Spring Security team moved that project, and SAML 2.0 authentication is now supported as part of core Spring Security.

Use Spring Security with SAML Protocol Binding

Once you create a Spring Boot project, you will need to import the following dependencies.


dependencies {
	implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
	implementation 'org.springframework.boot:spring-boot-starter-jdbc'
	implementation 'org.springframework.boot:spring-boot-starter-thymeleaf'
	implementation 'org.springframework.boot:spring-boot-starter-web'
	/*
	 * Spring Security
	 */
	implementation 'org.springframework.boot:spring-boot-starter-security'
	runtimeOnly 'mysql:mysql-connector-java'
	providedRuntime 'org.springframework.boot:spring-boot-starter-tomcat'
	implementation 'org.springframework.security:spring-security-saml2-service-provider:5.3.5.RELEASE'

	/*
	 * Keycloak
	 */
	implementation 'org.keycloak:keycloak-spring-boot-starter:11.0.3'
	testImplementation('org.springframework.boot:spring-boot-starter-test') {
		exclude group: 'org.junit.vintage', module: 'junit-vintage-engine'
	}
}

The dependency spring-security-saml2-service-provider allows us to add a relying party registration. It also helps with identity provider registration.

Now, we will add this registration in our SecurityConfig as below:


    @Bean
    public RelyingPartyRegistrationRepository relyingPartyRegistrationRepository() throws CertificateException
    {
        final String idpEntityId = "http://localhost:8180/auth/realms/ToDoListSAMLApp";
        final String webSSOEndpoint = "http://localhost:8180/auth/realms/ToDoListSAMLApp/protocol/saml";
        final String registrationId = "keycloak";
        final String localEntityIdTemplate = "{baseUrl}/saml2/service-provider-metadata" +
                "/{registrationId}";
        final String acsUrlTemplate = "{baseUrl}/login/saml2/sso/{registrationId}";


        Saml2X509Credential idpVerificationCertificate;
        try (InputStream pub = new ClassPathResource("credentials/idp.cer").getInputStream())
        {
            X509Certificate c = (X509Certificate) CertificateFactory.getInstance("X.509").generateCertificate(pub);
            idpVerificationCertificate = new Saml2X509Credential(c, VERIFICATION);
        }
        catch (Exception e)
        {
            throw new RuntimeException(e);
        }

        RelyingPartyRegistration relyingPartyRegistration = RelyingPartyRegistration
                .withRegistrationId(registrationId)
                .providerDetails(config -> config.entityId(idpEntityId))
                .providerDetails(config -> config.webSsoUrl(webSSOEndpoint))
                .providerDetails(config -> config.signAuthNRequest(false))
                .credentials(c -> c.add(idpVerificationCertificate))
                .assertionConsumerServiceUrlTemplate(acsUrlTemplate)
                .build();

        return new InMemoryRelyingPartyRegistrationRepository(relyingPartyRegistration);
    }

Our login will also change with HttpSecurity as follows:

httpSecurity.authorizeRequests()
        .antMatchers("/js/**", "/css/**", "/img/**").permitAll()
        .antMatchers("/signup", "/forgotpassword").permitAll()
        .antMatchers("/saml/**").permitAll()
        .anyRequest().authenticated()
        .and()
        .formLogin()
        .loginPage("/login").permitAll()
        .and()
        .saml2Login(Customizer.withDefaults())
        .exceptionHandling(exception -> exception.authenticationEntryPoint(entryPoint()))
        .logout()
        .logoutUrl("/logout")
        .logoutSuccessHandler(logoutSuccessHandler)
        .deleteCookies("JSESSIONID")
        .permitAll();

We are now using saml2Login. By default, if you access the application, it redirects to the identity provider. We want to show our custom login page before redirecting to the identity provider (Keycloak). That’s why we configure authenticationEntryPoint, which allows us to set our custom login page. So now, if we access our application at https://localhost:8743/login, we will see the login page below:
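The entryPoint() method referenced in the configuration is not shown in the post; a minimal sketch, assuming Spring Security's LoginUrlAuthenticationEntryPoint, could be:

```java
import org.springframework.security.web.AuthenticationEntryPoint;
import org.springframework.security.web.authentication.LoginUrlAuthenticationEntryPoint;

// Send unauthenticated requests to our custom login page instead of
// redirecting them straight to the identity provider.
@Bean
public AuthenticationEntryPoint entryPoint()
{
    return new LoginUrlAuthenticationEntryPoint("/login");
}
```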

So once you select the option Login with Keycloak SAML, the application sends an AuthnRequest to Keycloak. This request is unsigned. Keycloak will send back a signed response. A controller will receive this signed response and decode the NameID attribute.


@GetMapping(value="/index")
public String getHomePage(Model model, @AuthenticationPrincipal Saml2AuthenticatedPrincipal saml2AuthenticatedPrincipal)
{
   String principal = saml2AuthenticatedPrincipal.getName();
   model.addAttribute("username", principal);
   return "index";
}

Once the NameID is retrieved, the user is logged in.

Configuration on Keycloak

We will have to configure our application in the Keycloak administration console.

  • Create a REALM for your application.
  • Select Endpoints – SAML 2.0 IdP Metadata
  • In Clients – Add the service provider.
  • For your client, configure the Root URL and the SAML Processing URL (https://localhost:8743/saml2/service-provider-metadata/keycloak).
  • You can also adjust the other settings like – signing assertions, including AuthnStatement.
  • Finally, configure the ACS URL in the “Fine Grain SAML Endpoint Configuration” section.

Code Repository

The code for this project is available in my GitHub repository. I also cover this in more detail in my book Simplifying Spring Security. To learn more, you can buy my book here.

Conclusion

In this post, I showed how to use Spring Security with the SAML protocol. Over the years, there have been a lot of improvements in Spring Security, and it can now easily be used with different protocols like OAuth and OIDC. If you enjoyed this post, subscribe to my blog here.

What Makes a Good Junior Developer

What makes a good junior developer? I will talk about some qualities every junior developer should develop to do better in this role. Junior developer is a broad term; it can include associate software engineers, software engineers, or developers.

I was a junior developer once too. Now I am in a senior role, but that does not mean I am not still junior to other, more senior engineers. I wish there had been some kind of guide to help junior developers succeed in their roles.

Qualities that will help you succeed as a Junior Developer

  1. Be open-minded and take up challenges – One quality I really appreciate about the junior developers I have worked with so far is their willingness to take up a challenge. In the initial days, you want to learn as much as you can. Sometimes it can be overwhelming; other times it can be boring. But keep learning and reading.
  2. Take ownership of the task you work on – If you get a task to work on, take ownership of that task till its completion. You can create trust with your peers by taking ownership of the task. If you get stuck with the task, ask questions about it to your seniors. Senior Developers are meant to help you.
  3. Ask questions – As a senior developer, I really appreciate developers who ask questions. Even if those questions can be easy to answer. If you are stuck or don’t know, ask the question. Even in meetings, if you ask a question, everybody should appreciate it. A question brings a different perspective. And every perspective matters.
  4. Help others – One way to build your career at any organization is by helping others as much as you can. So, help others. You solved a problem that many other developers are facing, share that knowledge. If you built an automation tool, share the tool. Other junior developers come to you, help them. Be so good that everyone comes to you.

How to understand the system as a Junior Developer

Everyone has their own way to learn any skill or a complex system. But there are a few tried and tested ideas that can help you understand the system. This is definitely helpful in your initial days at any organization as a junior developer.

  1. Read and take notes. If there is documentation for the system architecture, read as much as you can. This will help you get the bigger picture of the system. If there is no documentation, then you are up against a challenge. Try to identify the components of the system that communicate with each other. Start creating documentation yourself so the future junior developers can thank you.
  2. Take up a task. Once you get a bigger picture, take up a task and work on it. Do a microanalysis of part of the system that you work on. Now this will provide you both a short distance view as well as a long-distance view of your system. Also, remember understanding any complex system takes time, so don’t be discouraged if you do not understand everything in a month or two.
  3. Read about system design in general. One way to build up your knowledge about any complex system is by reading about system design. Another way is to read the engineering blog of your company. A lot of passionate developers write these engineering blogs to share their knowledge worldwide.

Tips to succeed as a Junior Developer

  1. Read code – Read the code of the system you are working on. This will make you comfortable with code as well. Create your questions based on reading the code that you can ask senior developers.
  2. Participate in Code Review – From code reviews, you will see how senior developers help improve code quality, so you can adapt to that standard. You will also learn to review others’ code.
  3. Write unit test cases – For whatever code you write, write unit test cases. This will set you apart from other developers.
  4. Start understanding where your team is struggling – Become a person who can identify a problem and find a solution. High-agency developers are rare, but you can build up that skill. See where your team is struggling and figure out how you can help. Every team struggles with something. If, as a junior developer, you have time, help your team by building a tool or creating the documentation of the system that has been neglected.
  5. Do not compare – Do not compare yourself with fellow junior or senior developers. Everyone’s path is different. Focus on what you can and cannot do, and improve your skills.
  6. Ask for feedback – If there is no way to get feedback from a manager or senior developers, check in with your seniors once a month to get feedback. Feedback helps to improve. Find your own weaknesses and work on them.

Conclusion

In this post, I shared some tips and skills that a Junior Developer can use to be successful. If you enjoyed this post, comment on this post telling me what skill you think can make a Junior Developer stand out.

If you are still looking for my book Simplifying Spring Security, it is here to buy.

Integration Testing in Spring Boot Application

In this post, I will show how we can add integration testing to a Spring Boot application.

Integration tests play a key role in ensuring the quality of the application. With a framework like Spring Boot, it is even easier to integrate such tests. Importantly, integration tests let us verify the application without deploying it to an application server.

Integration tests can help to test the data access layer of your application, to exercise multiple units together, and to verify exception handling. For a Spring Boot application, we need to run the application in an ApplicationContext to be able to run such tests.

Spring Boot Application

For this demo, we will build a simple Spring Boot application with REST APIs. We will be using the H2 In-Memory database for storing the data. Eventually, I will show how to write an integration test. This application reads a JSON file of vulnerabilities from the National Vulnerability Database and stores it in the H2 database. REST APIs allow a user to fetch that data in a more readable format.

Dependencies

First, since we want to build integration tests in this application, we will need to include the dependency spring-boot-starter-test.


dependencies {
	implementation 'org.springframework.boot:spring-boot-starter-data-jpa'
	implementation 'org.springframework.boot:spring-boot-starter-web'
	testImplementation 'junit:junit:4.13.1'
	runtimeOnly 'com.h2database:h2:1.4.200'
	testImplementation 'org.springframework.boot:spring-boot-starter-test'
}

This spring-boot-starter-test dependency allows us to use the testing-related annotations that we will see soon.

REST API

Now, as mentioned previously, we will have a REST API to fetch National Vulnerability Database data. We will create a REST controller with two APIs: one to fetch a list of vulnerabilities and one to fetch a vulnerability by CVE ID.


@RestController
@RequestMapping("/v1/beacon23/vulnerabilities")
public class CveController
{

    @Autowired
    private CveService cveService;

    @GetMapping("/list")
    public List<CveDTO> getAllCveItems(@RequestParam(required = false, name = "fromDate") String fromDate,
                                       @RequestParam(required = false, name = "toDate") String toDate)
    {
        List<CveDTO> cveDTOList = cveService.getCveItems(fromDate, toDate);

        if(cveDTOList == null || cveDTOList.isEmpty())
        {
            return new ArrayList<>();
        }
        else
        {
            return cveDTOList;
        }
    }

    @GetMapping
    public ResponseEntity<CveDTO> getCveItemById(@RequestParam("cveId") String cveId)
    {
        CveDTO cveDTO = cveService.getCveItemByCveId(cveId);

        if(cveDTO != null)
        {
            return new ResponseEntity<>(cveDTO, HttpStatus.OK);
        }
        else
        {
            return new ResponseEntity<>(HttpStatus.NO_CONTENT);
        }
    }

}

So we have

  • /v1/beacon23/vulnerabilities/list – to fetch a list of vulnerabilities
  • /v1/beacon23/vulnerabilities?cveId=value – to fetch vulnerability by CVE id.

Service

Now, most of the business logic and validation happens in the service class. As we saw in our API, we use CveService to fetch the required data.

    @Autowired
    public CveDataDao cveDataDao;

    public List<CveDTO> getCveItems(String from, String to)
    {
        LOGGER.debug("The date range values are from = {} and to = {}", from, to);
        List<CveData> cveDataList = cveDataDao.findAll();
        List<CveDTO> cveDTOList = new ArrayList<>();

        for(CveData cveData : cveDataList)
        {
            List<CveItem> cveList = cveData.getCveItems();
            for(CveItem cveItem: cveList)
            {
                Date fromDate;
                Date toDate;

                if(!isNullOrEmpty(from) && !isNullOrEmpty(to))
                {
                    fromDate = DateUtil.formatDate(from);
                    toDate = DateUtil.formatDate(to);

                    Date publishedDate = DateUtil.formatDate(cveItem.getPublishedDate());

                    if(publishedDate.after(toDate) || publishedDate.before(fromDate))
                    {
                        continue;
                    }
                }
                CveDTO cveDTO = convertCveItemToCveDTO(cveItem);
                cveDTOList.add(cveDTO);
            }
        }
        return cveDTOList;
    }

    private boolean isNullOrEmpty (String str)
    {
        return (str == null || str.isEmpty());
    }

    private String buildDescription (List descriptionDataList)
    {
        if(descriptionDataList == null || descriptionDataList.isEmpty())
        {
            return EMPTY_STRING;
        }
        else
        {
            return descriptionDataList.get(0).getValue();
        }
    }

    private List buildReferenceUrls (List referenceDataList)
    {
        return referenceDataList.stream().map(it -> it.getUrl()).collect(Collectors.toList());
    }

    public CveDTO getCveItemByCveId(String cveId)
    {
        List<CveData> cveDataList = cveDataDao.findAll();
        CveDTO cveDTO = null;

        for(CveData cveData : cveDataList)
        {
            List<CveItem> cveItems = cveData.getCveItems();

            Optional<CveItem> optionalCveItem =
                    cveItems.stream().filter(ci -> ci.getCve().getCveMetadata().getCveId().equals(cveId)).findAny();
            CveItem cveItem = null;
            if(optionalCveItem.isPresent())
            {
                cveItem = optionalCveItem.get();
            }
            else
            {
                return cveDTO;
            }
            cveDTO = convertCveItemToCveDTO(cveItem);
        }

        return cveDTO;
    }

Usage of @SpringBootTest

Spring Boot provides the annotation @SpringBootTest for integration tests. With this annotation, a test starts the full application context, containing all the beans the application needs to run.

Integration tests provide an almost production-like scenario to test our code. Tests annotated with @SpringBootTest create the application context through the application class annotated with @SpringBootConfiguration.

These tests start an embedded server, create a web environment, and then run @Test methods to do integration testing. We need to set a few attributes to make sure the web environment starts while using @SpringBootTest.

  • Attribute webEnvironment – controls how the web environment is created. It supports MOCK (the default), RANDOM_PORT, DEFINED_PORT, and NONE.

We can also pass properties to use for tests through an active profile. Usually, we use profiles for different environments, but we can also define a special profile just for tests. Alongside application-dev.yml and application-prod.yml, we can create application-test.yml and activate it with the annotation @ActiveProfiles("test") on our test class.
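For instance, a hypothetical application-test.yml might look like the following. The property values here are illustrative assumptions for demonstration, not taken from the project:

```yaml
# application-test.yml - illustrative test profile (assumed values)
spring:
  main:
    banner-mode: "off"   # quiet the startup banner in test runs
logging:
  level:
    com.betterjavacode: DEBUG   # more verbose logging while testing
```

Because the file follows the application-{profile}.yml naming convention, Spring Boot picks it up automatically whenever the test profile is active.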

Example of Integration Test

For our REST API, we will create an integration test that tests our controller. We will also use TestRestTemplate to fetch data. The integration test looks like this:


package com.betterjavacode.beacon23.tests;

import org.junit.Test;
import org.junit.runner.RunWith;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.boot.test.web.client.TestRestTemplate;
import org.springframework.boot.web.server.LocalServerPort;
import org.springframework.http.HttpEntity;
import org.springframework.http.HttpHeaders;
import org.springframework.http.HttpMethod;
import org.springframework.http.ResponseEntity;
import org.springframework.test.context.junit4.SpringRunner;

import static org.junit.Assert.assertNotNull;

@RunWith(SpringRunner.class)
@SpringBootTest(webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
public class CveControllerTest
{
    @LocalServerPort
    private int port;

    private TestRestTemplate testRestTemplate = new TestRestTemplate();

    private HttpHeaders headers = new HttpHeaders();

    @Test
    public void testGetAllCveItems()
    {
        HttpEntity<String> entity = new HttpEntity<>(null, headers);

        ResponseEntity<String> responseEntity = testRestTemplate.exchange(createURLWithPort(
                "/v1/beacon23/vulnerabilities/list"), HttpMethod.GET, entity, String.class);

        assertNotNull(responseEntity);

    }


    private String createURLWithPort(String uri)
    {
        return "http://localhost:" + port + uri;
    }
}

We use the @SpringBootTest annotation on our test class and set webEnvironment to RANDOM_PORT so the embedded server starts on a random free port. The @LocalServerPort annotation then injects that actual port into the test, so we can build the request URLs.

TestRestTemplate allows us to simulate a client calling our API. Once we run this test (either through gradle build or through IntelliJ), we will see the Spring Boot application context start up and the application run on a random port.

One disadvantage of integration tests with @SpringBootTest is that they slow down your build. In most enterprise environments, tests run as part of continuous integration and continuous deployment, and a large suite of integration tests slows down both.

Conclusion

Whether you should use integration testing in your Spring Boot application ultimately depends on the application itself. Despite the drawback, it is always useful to have integration tests that exercise multiple units at a time. @SpringBootTest is a handy annotation for setting up an application context, allowing us to run tests close to a production environment.

References

  1. Integration Testing with Spring Boot – Integration Testing