Project Reactor [Part 3] Error Handling

This is the third article in a series on Project Reactor. In the previous article we discussed different operators in Project Reactor. In this article, I’ll show you how to handle errors in Reactor. We’ll do this through examples.

In reactive programming, errors are also terminal events. When an error occurs, it is delivered to the onError method of the Subscriber. Before we start looking at how to handle errors, keep in mind that any error in a reactive sequence is a terminal event: even when an error-handling operator is used, the original sequence does not continue. Instead, the operator converts the onError signal into the start of a new, fallback sequence, which replaces the terminated sequence upstream of it.

The code for this post is available on my Github account here

OnError

In the code block below, the subscriber prints the values 1 through 5, and then the onError block executes. Note that 6 is never printed, because an error is a terminal event. The “subscriber Completed” line is not printed either, since onComplete is never emitted when an error occurs.

@Test
public void testErrorFlowFlux() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .concatWith(Flux.just(6));
    fluxFromJust.subscribe(
            it -> System.out.println("Number is " + it),      // onNext
            e -> e.printStackTrace(),                         // onError
            () -> System.out.println("subscriber Completed")  // onComplete
    );
    // To unit test this code
    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 3, 4, 5)
            .expectError(RuntimeException.class)
            .verify();
}

onErrorResume

If you want to log or report the exception and then continue with some fallback values, you can use onErrorResume.

@Test
public void testOnErrorResume() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .concatWith(Flux.just(6))
            .onErrorResume(e -> {
                System.out.println("Exception occurred: " + e.getMessage());
                // Return some fallback values
                return Flux.just(7, 8);
            });

    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 3, 4, 5)
            .expectNext(7, 8)
            .verifyComplete();
}

onErrorReturn

onErrorReturn can be used if you just want to emit a single fallback value when an error occurs.

@Test
public void testOnErrorReturn() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .concatWith(Flux.just(6))
            .onErrorReturn(99);

    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 3, 4, 5)
            .expectNext(99)
            .verifyComplete();
}

OnErrorContinue

If you want to ignore an error produced by an operator, onErrorContinue can be used: it drops the offending element and continues the sequence. In the example below, number 3 causes an exception; onErrorContinue simply skips it.

@Test
public void testOnErrorContinue() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .map(i -> mapSomeValue(i))
            .onErrorContinue((e, i) ->
                    System.out.println("Error for item " + i));

    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 4, 5)
            .verifyComplete();
}

private int mapSomeValue(Integer i) {
    if (i == 3)
        throw new RuntimeException("Exception From Map");
    return i;
}

OnErrorMap

onErrorMap lets you map an exception to a custom exception.

@Test
public void testOnErrorMap() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .concatWith(Flux.just(6))
            .map(i -> i * 2)
            .onErrorMap(e -> new CustomeException(e));

    StepVerifier
            .create(fluxFromJust)
            .expectNext(2, 4, 6, 8, 10)
            .expectError(CustomeException.class)
            .verify();
}

retry

You can add a retry on error, but keep in mind that retry works by resubscribing to the source, so the sequence restarts from the first element.

@Test
public void testOnRetry() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3)
            .map(i -> i * 2)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .onErrorMap(e -> new CustomeException(e))
            .retry(2);

    StepVerifier
            .create(fluxFromJust)
            .expectNext(2, 4, 6)  // original attempt
            .expectNext(2, 4, 6)  // first retry
            .expectNext(2, 4, 6)  // second retry
            .expectError(CustomeException.class)
            .verify();
}
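If a fixed retry count that resubscribes immediately is too blunt, Reactor also offers retryWhen, which takes a Retry specification (reactor.util.retry.Retry, available in recent reactor-core versions). A small sketch, not from the original post, in which a deferred source fails twice before succeeding:

```java
import java.time.Duration;
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

import reactor.core.publisher.Flux;
import reactor.util.retry.Retry;

public class RetryBackoffDemo {

    public static List<Integer> run() {
        AtomicInteger attempts = new AtomicInteger();

        // Flux.defer builds a fresh source per subscription,
        // so every retry re-evaluates this lambda
        Flux<Integer> flux = Flux.defer(() -> {
            if (attempts.incrementAndGet() < 3) {
                return Flux.error(new RuntimeException("transient failure"));
            }
            return Flux.just(1, 2, 3);
        }).retryWhen(Retry.backoff(5, Duration.ofMillis(10))); // at most 5 retries, exponential backoff

        return flux.collectList().block();
    }

    public static void main(String[] args) {
        System.out.println(run()); // succeeds on the third attempt
    }
}
```

Unlike retry(2) above, retryWhen lets you cap the number of attempts and space them out with a growing delay instead of resubscribing immediately.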

onErrorStop

onErrorStop reverts to the default behavior in which an error stops the sequence, even if an onErrorContinue operator is used downstream.

@Test
public void testOnErrorStop() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .concatWith(Flux.just(6))
            .map(i -> doubleValue(i))
            .onErrorStop();

    StepVerifier
            .create(fluxFromJust)
            .expectNext(2, 4, 6)
            .verifyError();
}

private Integer doubleValue(Integer i) {
    System.out.println("Doubling " + i);
    return i * 2;
}

doOnError

If you want to execute a side effect on error, use doOnError.

@Test
public void testDoOnError() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .doOnError(e -> System.out.println("Run some side effect!!"));

    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 3, 4, 5)
            .expectError()
            .verify();
}

doFinally

doFinally is similar to the finally block of a try/catch: it runs after the sequence terminates, whatever the termination signal.

@Test
public void testDoFinally() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5)
            .concatWith(Flux.error(new RuntimeException("Test")))
            .doFinally(signalType -> {
                if (SignalType.ON_ERROR.equals(signalType)) {
                    System.out.println("Completed with Error");
                }
                if (SignalType.ON_COMPLETE.equals(signalType)) {
                    System.out.println("Completed without Error");
                }
            });

    StepVerifier
            .create(fluxFromJust)
            .expectNext(1, 2, 3, 4, 5)
            .expectError()
            .verify();
}


The code for this post is available on my Github account here


Project Reactor [Part 2] Exploring Operators in Flux & Mono


This is the second article in a series on Project Reactor. In the previous article we discussed the basics of Flux and Mono. In this article, I’ll show you how to use operators to modify and transform a Flux. We’ll do this through examples.
The code for this post is available on my Github account here

Filter

This is similar to the Java 8 Stream filter: it takes a predicate, and only elements that satisfy the predicate are passed through.

@Test
public void testFilteringFlux() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10).log();
    Flux<Integer> filter = fluxFromJust.filter(i -> i % 2 == 0); // keep only even numbers
    StepVerifier
            .create(filter)
            .expectNext(2, 4, 6, 8, 10)
            .verifyComplete();
}

Distinct

It filters out duplicate elements.

@Test
public void distinct() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5, 1, 2, 3, 4, 5).log();
    Flux<Integer> distinct = fluxFromJust.distinct();
    StepVerifier
            .create(distinct)
            .expectNext(1, 2, 3, 4, 5)
            .verifyComplete();
}

takeWhile

Relays values from the Flux while the predicate returns true for them, then completes as soon as it returns false.

@Test
public void takeWhile() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10).log();
    Flux<Integer> takeWhile = fluxFromJust.takeWhile(i -> i <= 5);
    StepVerifier
            .create(takeWhile)
            .expectNext(1, 2, 3, 4, 5)
            .verifyComplete();
}

skipWhile

Skips elements while the predicate returns true for them, then emits all remaining elements.

@Test
public void skipWhile() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3, 4, 5, 6, 7, 8, 9, 10).log();
    Flux<Integer> skipWhile = fluxFromJust.skipWhile(i -> i <= 5);
    StepVerifier
            .create(skipWhile)
            .expectNext(6, 7, 8, 9, 10)
            .verifyComplete();
}

map

The map operation is similar to the Java 8 Stream map operation: it transforms each element from one type to another.

Convert String Flux To Integer Flux
@Test
public void testMapOperationFlux() {
    Flux<String> fluxFromJust = Flux.just("RandomString", "SecondString", "XCDFRG").log();
    Flux<Integer> lengths = fluxFromJust.map(s -> s.length());
    StepVerifier
            .create(lengths)
            .expectNext(12, 12, 6)
            .verifyComplete();
}

flatMap

flatMap transforms the elements emitted by this Flux asynchronously into Publishers, then flattens these inner publishers into a single Flux through merging, which allows them to interleave.

@Test
public void testFlatMapFlux() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3).log();
    // getSomeFlux returns a Flux of a 10-element range;
    // flatMap merges all the inner Flux instances into a single Flux
    Flux<Integer> integerFlux = fluxFromJust.flatMap(i -> getSomeFlux(i));

    StepVerifier
            .create(integerFlux)
            .expectNextCount(30)
            .verifyComplete();
}

private Flux<Integer> getSomeFlux(Integer i) {
    return Flux.range(i, 10);
}

Index

Keeps track of the order in which source values were received by pairing each one with a 0-based, incrementing long index.

@Test
public void maintainIndex() {
    Flux<Tuple2<Long, String>> index = Flux
            .just("First", "Second", "Third")
            .index();
    StepVerifier.create(index)
            .expectNext(Tuples.of(0L, "First"))
            .expectNext(Tuples.of(1L, "Second"))
            .expectNext(Tuples.of(2L, "Third"))
            .verifyComplete();
}

flatMapMany

This operator is very useful if you want to convert a Mono into a Flux. flatMapMany transforms the signals emitted by this Mono into signal-specific Publishers, then forwards the applicable Publisher’s emissions into the returned Flux.

@Test
public void flatMapManyTest() {
    Mono<List<Integer>> just = Mono.just(Arrays.asList(1, 2, 3));
    Flux<Integer> integerFlux = just.flatMapMany(it -> Flux.fromIterable(it));
    StepVerifier
            .create(integerFlux)
            .expectNext(1, 2, 3)
            .verifyComplete();
}

startWith

startWith prepends the given values before those emitted by the Flux.
@Test
public void startWith() {
    Flux<Integer> just = Flux.just(1, 2, 3);
    Flux<Integer> integerFlux = just.startWith(0);
    StepVerifier.create(integerFlux)
            .expectNext(0, 1, 2, 3)
            .verifyComplete();
}

concatWith

The concatWith method concatenates two Flux sequences sequentially: it subscribes to the first Flux, waits for it to complete, and then subscribes to the next.

@Test
public void concatWith() {
    Flux<Integer> just = Flux.just(1, 2, 3);
    Flux<Integer> integerFlux = just.concatWith(Flux.just(4, 5));
    StepVerifier.create(integerFlux)
            .expectNext(1, 2, 3, 4, 5)
            .verifyComplete();
}

merge

Merges data from the given Publisher sequences into an interleaved merged sequence. Unlike concat, the sources are subscribed to eagerly.

@Test
public void merge() throws InterruptedException {
    Flux<Integer> firstFlux = Flux.just(1, 2, 3, 4, 5).delayElements(Duration.ofSeconds(1));
    Flux<Integer> secondFlux = Flux.just(10, 20, 30, 40).delayElements(Duration.ofSeconds(1));
    firstFlux.mergeWith(secondFlux)
            .subscribe(System.out::println);
    // Prints the numbers received from firstFlux and secondFlux in interleaved, non-deterministic order
    TimeUnit.SECONDS.sleep(11);
}

CollectList

Collect all elements emitted by this Flux into a List that is emitted by the resulting Mono when this sequence completes.

@Test
public void collectList() {
    Mono<List<Integer>> listMono = Flux
            .just(1, 2, 3)
            .collectList();
    StepVerifier.create(listMono)
            .expectNext(Arrays.asList(1, 2, 3))
            .verifyComplete();
}
CollectSortedList
@Test
public void collectSortedList() {
    Mono<List<Integer>> listMono = Flux
            .just(1, 2, 3, 9, 8)
            .collectSortedList();
    StepVerifier.create(listMono)
            .expectNext(Arrays.asList(1, 2, 3, 8, 9))
            .verifyComplete();
}

zip

Zip multiple sources together, that is to say wait for all the sources to emit one element and combine these elements once into an output value (constructed by the provided combinator). The operator will continue doing so until any of the sources completes. Errors will immediately be forwarded. This “Step-Merge” processing is especially useful in Scatter-Gather scenarios.

@Test
public void zip() {
    Flux<Integer> firstFlux = Flux.just(1, 2, 3);
    Flux<Integer> secondFlux = Flux.just(10, 20, 30, 40);
    Flux<Integer> zip = Flux.zip(firstFlux, secondFlux, (num1, num2) -> num1 + num2);
    StepVerifier
            .create(zip)
            .expectNext(11, 22, 33)
            .verifyComplete();
}

buffer

Collects incoming values into List buffers that are emitted by the returned Flux; with buffer(n), each emitted list contains up to n elements.

@Test
public void bufferTest() {
    Flux<List<Integer>> buffer = Flux
            .just(1, 2, 3, 4, 5, 6, 7)
            .buffer(2);

    StepVerifier
            .create(buffer)
            .expectNext(Arrays.asList(1, 2))
            .expectNext(Arrays.asList(3, 4))
            .expectNext(Arrays.asList(5, 6))
            .expectNext(Arrays.asList(7))
            .verifyComplete();
}

You can check all available operators here

In the next article, I’ll show you how to handle errors while processing data in Mono and Flux.

The code for this post is available on my Github account here


Project Reactor [Part 1] - Playing With Flux & Mono


What is Reactor

Project Reactor implements the reactive programming model. It implements the Reactive Streams specification, a standard for building reactive applications.
Reactor integrates directly with the Java 8 functional APIs, notably CompletableFuture, Stream, and Duration. It offers composable asynchronous sequence APIs: Flux (for [N] elements) and Mono (for [0|1] elements).

Key Components of Reactive Manifesto and Reactive Streams Specification

Reactive Streams is an initiative to provide a standard for asynchronous stream processing with non-blocking back pressure. This encompasses efforts aimed at runtime environments (JVM and JavaScript) as well as network protocols. The Reactive Streams specification is based on the Reactive Manifesto.

The Reactive Streams specification defines the following key contracts:
Publisher: represents a source of data, also called an Observable.
Subscriber: listens to the Publisher; a Subscriber subscribes to a Publisher.
Subscription: the Publisher creates a Subscription for every Subscriber that subscribes to it.
Processor: can act as both a Publisher and a Subscriber. Processors are used for data transformation: they sit between the two to modify and compose the data.
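The same four contracts also exist in the JDK itself since Java 9, as java.util.concurrent.Flow. Here is a minimal sketch (the class and method names are mine, not from the specification) showing a Publisher, a Subscriber, and the Subscription handshake, including back pressure via request, using the built-in SubmissionPublisher:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowDemo {

    // Subscribes to the publisher, requests elements one at a time (back pressure),
    // and returns the received elements once the stream terminates.
    public static List<Integer> collect(SubmissionPublisher<Integer> publisher) throws InterruptedException {
        List<Integer> received = new ArrayList<>();
        CountDownLatch done = new CountDownLatch(1);

        publisher.subscribe(new Flow.Subscriber<Integer>() {
            private Flow.Subscription subscription; // the per-subscriber contract

            @Override public void onSubscribe(Flow.Subscription s) {
                this.subscription = s;
                s.request(1); // back pressure: ask for one element at a time
            }
            @Override public void onNext(Integer item) {
                received.add(item);
                subscription.request(1);
            }
            @Override public void onError(Throwable t) { done.countDown(); }
            @Override public void onComplete() { done.countDown(); }
        });

        for (int i = 1; i <= 3; i++) publisher.submit(i);
        publisher.close(); // triggers onComplete after pending deliveries
        done.await();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(collect(new SubmissionPublisher<>())); // [1, 2, 3]
    }
}
```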

Objective Of this post

The objective of this series of posts is to show how to use Flux and Mono, how to subscribe to them to consume data, and the different operations we can perform on the data while consuming it. I will be using Spring WebFlux, which internally uses Project Reactor.

Let’s get started. We need spring-boot-starter-webflux. Project Reactor also provides a very handy library, reactor-test, for unit testing.

The code for this post is available on my Github account here

Pom File
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
    <groupId>io.projectreactor</groupId>
    <artifactId>reactor-test</artifactId>
    <scope>test</scope>
</dependency>
Creating Flux and Mono

Flux: a Reactive Streams Publisher with rx operators that emits 0 to N elements and then completes (successfully or with an error).

A few examples of creating a Flux
Flux<Integer> fluxFromJust = Flux.just(1, 2, 3); // Emits the provided elements and then completes
Flux<Integer> integerFlux = Flux.fromIterable(Arrays.asList(1, 2, 3)); // Emits the items of the provided Iterable
Flux<Integer> streamFlux = Flux.fromStream(Arrays.asList(1, 2, 3).stream()); // Emits the items of the provided Stream
Integer[] nums = {1, 2, 3, 4, 5};
Flux<Integer> integerFluxFromArray = Flux.fromArray(nums); // Emits the items of the provided array
Flux.generate(...); // Programmatically create a Flux by generating signals one-by-one via a consumer callback
Flux.create(...);   // Programmatically create a Flux that can emit multiple elements, synchronously or asynchronously, through the FluxSink API

Mono: a Reactive Streams Publisher with basic rx operators that completes successfully by emitting at most one element, or terminates with an error.

A few examples of creating a Mono
Mono<Integer> just = Mono.just(1);  // Emits the provided element and then completes
Mono<Object> empty = Mono.empty();  // Completes without emitting any item
Mono.create(...);                   // Programmatically create a Mono through the MonoSink API
Mono.from(publisher);               // Expose the given Publisher as a Mono, ensuring it emits 0 or 1 item

// There are also factories for creating a Mono from a Callable, CompletionStage, CompletableFuture, etc.
Mono.fromCallable(...);

Flux & Mono are lazy

Lazy, in the context of reactive programming, means that no matter how many operations you chain on the stream, none of them execute until you consume it. Flux and Mono start emitting values only when a subscriber is attached.
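A small sketch of this laziness (the counter is mine, added just to make the point visible): the map operator below runs zero times until subscribe is called.

```java
import java.util.concurrent.atomic.AtomicInteger;

import reactor.core.publisher.Flux;

public class LazyDemo {

    public static void main(String[] args) {
        AtomicInteger mapCalls = new AtomicInteger();

        Flux<Integer> flux = Flux.just(1, 2, 3)
                .map(i -> {
                    mapCalls.incrementAndGet(); // count how many times map actually runs
                    return i * 2;
                });

        // Assembling the pipeline does no work yet
        System.out.println("before subscribe: " + mapCalls.get()); // 0

        flux.subscribe(i -> { });

        // Only subscribing made the values flow through map
        System.out.println("after subscribe: " + mapCalls.get()); // 3
    }
}
```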

Creating Subscriber and consuming values

The subscribe method has multiple overloads; let’s look at a few of them.

Subscriber with onNext

Subscriber with consumer (onNext)
Flux<Integer> fluxFromJust = Flux.just(1, 2, 3);
fluxFromJust.subscribe(i -> System.out.println(i)); // prints 1, 2, 3

A subscriber with onNext and onError: onError is called in case of an error.
Flux has a handy concatWith operator that we can use to append an error signal for testing.

Subscriber with consumer and error handler (onError)
Flux<Integer> fluxFromJust =
        Flux.just(1, 2, 3)
            .concatWith(Flux.error(new RuntimeException("Test Exception")));
fluxFromJust.subscribe(
        i -> System.out.println(i),                                  // onNext
        e -> System.out.println("In Error Block " + e.getMessage())  // onError
);

A subscriber with onNext, onError, and onComplete: on successful completion, the onComplete callback gets called.
Subscriber with consumer, error handler and onComplete
Flux<Integer> fluxFromJust =
        Flux.just(1, 2, 3)
            .concatWith(Flux.error(new RuntimeException("Test Exception")));
fluxFromJust.subscribe(
        i -> System.out.println(i),                                   // onNext, prints 1, 2, 3
        e -> System.out.println("In Error Block " + e.getMessage()),  // onError
        () -> System.out.println("Process Completed")                 // onComplete (not called here, since the flux errors)
);

To log the activity on a Flux or Mono, we can use the log() method, which logs all events.

Log Events
Flux<Integer> fluxFromJust =
        Flux.just(1, 2, 3)
            .log();
Let’s write some unit tests using reactor-test.

StepVerifier can be used as below to validate expectations.

StepVerifier
@Test
public void testCreateFluxAndSubscribe() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3).log();
    StepVerifier
            .create(fluxFromJust)
            .expectNext(1)
            .expectNext(2)
            .expectNext(3)
            .verifyComplete();
}


If you only want to verify the count, use expectNextCount.
StepVerifier
@Test
public void testCreateFluxAndSubscribeVerifyCount() {
    Flux<Integer> fluxFromJust = Flux.just(1, 2, 3).log();
    StepVerifier
            .create(fluxFromJust)
            .expectNextCount(3)
            .verifyComplete();
}

To assert on an error:
StepVerifier
@Test
public void testCreateFluxAndSubscribeVerifyError() {
    Flux<Integer> fluxFromJust = Flux.just(1).concatWith(Flux.error(new RuntimeException("Test")));
    StepVerifier
            .create(fluxFromJust)
            .expectNextCount(1)
            .verifyError(RuntimeException.class);
}

In the next article, I’ll show you how to process and transform data in Mono and Flux.
The code for this post is available on my Github account here


Testcontainers With Spring Boot For Integration Testing

Take your Integration Tests to next level using Testcontainers

Nowadays, modern applications interact with multiple systems: databases, microservices within the system, external APIs, middleware, and so on. A good integration test strategy has become critical to a project’s success.
In-memory databases like H2 and HSQLDB are very popular for integration testing of the persistence layer, but they are not close to the production environment. If the application depends on Docker containers, it becomes even more difficult to test that code in an integration environment. Testcontainers helps us handle these challenges.

What is Testcontainers?

Testcontainers is a Java library that supports JUnit tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.

The code for this post is available on my Github account here

Let’s write an integration test using Testcontainers for a Spring Boot app

In a previous post we created a simple Spring Boot application that uses a (containerized) MongoDB database; let’s write an integration test for it.

It’s easy to add Testcontainers to your project - let’s walk through a quick example to see how.

Add Testcontainers to the project

Pom File
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>testcontainers</artifactId>
    <version>1.12.3</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.12.3</version>
    <scope>test</scope>
</dependency>

Creating a generic container based on an image

For MongoDB there is no dedicated Testcontainers module, but we can create one by extending GenericContainer.

MongoDbContainer
public class MongoDbContainer extends GenericContainer<MongoDbContainer> {

    public static final int MONGODB_PORT = 27017;
    public static final String DEFAULT_IMAGE_AND_TAG = "mongo:3.2.4";

    public MongoDbContainer() {
        this(DEFAULT_IMAGE_AND_TAG);
    }

    public MongoDbContainer(@NotNull String image) {
        super(image);
        addExposedPort(MONGODB_PORT);
    }

    @NotNull
    public Integer getPort() {
        return getMappedPort(MONGODB_PORT);
    }
}

You can also create a generic container using a @ClassRule:
@ClassRule
public static GenericContainer mongo = new GenericContainer("mongo:3.2.4")
        .withExposedPorts(27017);

Starting the container and using it in a test

MongoDbContainerTest
@SpringBootTest
@AutoConfigureMockMvc
@ContextConfiguration(initializers = MongoDbContainerTest.MongoDbInitializer.class)
@Slf4j
public class MongoDbContainerTest {

    @Autowired
    private MockMvc mockMvc;
    @Autowired
    private ObjectMapper objectMapper;
    @Autowired
    private FruitRepository fruitRepository;

    private static MongoDbContainer mongoDbContainer;

    @BeforeAll
    public static void startContainer() {
        mongoDbContainer = new MongoDbContainer();
        mongoDbContainer.start();
    }

    @Test
    public void containerStartsAndPublicPortIsAvailable() throws Exception {
        FruitModel build = FruitModel.builder().color("Red").name("banana").build();
        mockMvc.perform(post("/fruits")
                .contentType("application/json")
                .content(objectMapper.writeValueAsString(build)))
                .andExpect(status().isCreated());
        Assert.assertEquals(1, fruitRepository.findAll().size());
    }

    public static class MongoDbInitializer implements ApplicationContextInitializer<ConfigurableApplicationContext> {
        @Override
        public void initialize(ConfigurableApplicationContext configurableApplicationContext) {
            log.info("Overriding Spring properties for mongodb");
            TestPropertyValues values = TestPropertyValues.of(
                    "spring.data.mongodb.host=" + mongoDbContainer.getContainerIpAddress(),
                    "spring.data.mongodb.port=" + mongoDbContainer.getPort()
            );
            values.applyTo(configurableApplicationContext);
        }
    }
}

Let’s see what we are doing in the test, step by step.
@SpringBootTest: since we want an integration test, we use @SpringBootTest, which tells Spring to load the complete application context.

@AutoConfigureMockMvc enables auto-configuration of MockMvc, as we want to test
controller -> service -> repository -> database

@ContextConfiguration overrides Spring properties. We need to override the host and port on which the MongoDB test container has started.

We start the MongoDB container in a @BeforeAll hook so that the database instance is available during the test.
Then, in the test, we simply call the POST endpoint on the controller and check whether the data was actually inserted into the database.

Using a predefined Testcontainers module

Some predefined Testcontainers modules are available, and they are very useful and easy to use,
e.g. MySQLContainer for the MySQL database. Let’s see how to use it.

MySQLContainerTest
@Testcontainers
public class MySQLContainerTest {

    @Container
    private final MySQLContainer mySQLContainer = new MySQLContainer();

    @Test
    @DisplayName("Should start the container")
    public void test() {
        Assert.assertTrue(mySQLContainer.isRunning());
    }
}

To use the Testcontainers extension annotate your test class with @Testcontainers.

Note: you need to think about how you want to use the container:

  1. Containers that are restarted for every test method
  2. Containers that are shared between all methods of a test class

If you define the container as an instance field,
@Container private MySQLContainer mySQLContainer = new MySQLContainer();, a new instance is created for each test case. If you make it static,
@Container private static MySQLContainer mySQLContainer = new MySQLContainer();, the same instance is shared across all tests.

The code for this post is available on my Github account here


Spring Boot + Mongodb + Docker Compose

In this post we will discuss how to use Docker Compose to define and run multi-container Docker applications.
The code for this post is available on my Github account here

Prerequisites
To follow along with this post, basic knowledge of Docker, containers, and Spring Boot is needed. docker and docker-compose should be installed on your system.

Docker Compose
Docker Compose is a tool for defining and running multi-container Docker applications. We define and configure all the services used in the application in a single file called docker-compose.yml; more details about Docker Compose can be found in the documentation. Docker Compose removes a lot of the overhead of managing apps that depend on multiple containers.
It also significantly improves productivity, as we can run the complete application stack with a single command. By default, Docker Compose runs all containers on a single host.

Docker Compose in Action
I will create a simple hypothetical application that exposes a REST endpoint to manage fruit information. The application is built from two containers, and I will use Docker Compose to run this multi-container application.

Spring Boot APP
Create a simple Spring Boot application using Spring Initializr with the following dependencies: spring-boot-starter-web, spring-boot-starter-actuator, and lombok. You should be able to run the application and check that http://localhost:8080/actuator/health returns status “UP”.

Dockerizing the Spring Boot app
Dockerizing a Spring Boot app is very straightforward; below is a sample Dockerfile.

FROM openjdk:8-jdk-alpine
VOLUME /tmp
ARG JAR_FILE=target/*.jar
COPY ${JAR_FILE} app.jar
ENTRYPOINT ["java","-Djava.security.egd=file:/dev/./urandom","-jar","/app.jar"]

Creating Docker Image
docker build -t api-docker-image .
Run the command above to create the Docker image. There are Maven plugins available for creating the Docker image during the Maven build; for simplicity I am using the plain docker command.

Running Docker Image
docker run -d -p 9090:8080 api-docker-image
We map container port 8080 to host port 9090, which means the application will be available on the host machine on port 9090. The Spring Boot application is now running in Docker and will be available at
http://localhost:9090/actuator/health

Now let’s add MongoDB
Add the dependency below to the pom file.

Pom File
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-data-mongodb</artifactId>
</dependency>

Rest endpoints to save and get Fruit information.

Rest Controller
@RestController
@Slf4j
public class FruitController {

    private final FruitService fruitService;

    public FruitController(FruitService fruitService) {
        this.fruitService = fruitService;
    }

    @PostMapping("/fruits")
    public ResponseEntity addFruit(@RequestBody FruitRequest fruitRequest) {
        log.info("Request : {}", fruitRequest);
        fruitService.saveFruit(fruitRequest.toFruitModel());
        return ResponseEntity.status(HttpStatus.CREATED).build();
    }

    @GetMapping("/fruits")
    public List<FruitModel> getAllFruit() {
        return fruitService.findAll();
    }
}

A simple Spring Data MongoRepository for saving data to and reading data from MongoDB.

Mongo Repository & Fruit Model
public interface FruitRepository extends MongoRepository<FruitModel, String> {
}

@Document
@Data
@Builder
public class FruitModel {

    @Id
    private String id;
    private String name;
    private String color;
}

Now we want to run the MongoDB database and the Spring Boot app as two separate containers. We could do this manually with docker commands, but that is a tedious task and requires a lot of configuration for the containers to talk to each other. Docker Compose simplifies these things for us.

Define services in a Compose file
We create a file called docker-compose.yml and define all the containers the application needs as services.
In the compose file below we define two services, one for the database and one for the REST API, and we do all the needed configuration in a single place. Since our Spring Boot app (api) depends on the database, we declare that as a link. There is a lot more configuration you can do in a compose file.

docker-compose
version: "3"
services:
  api-database:
    image: mongo:3.2.4
    container_name: "api-database"
    ports:
      - 27017:27017
    command: --smallfiles
  api:
    image: api-docker-image
    ports:
      - 9091:8080
    links:
      - api-database

Configure the MongoDB host name via a Spring config property, using the service name defined in docker-compose.

application.properties
spring.data.mongodb.host=api-database

Running Docker Compose
A single command, docker-compose up, starts the application. It creates one container for the database and one for the Spring Boot app, as defined in the docker-compose file.

Summary

Using Compose is basically a three-step process:

  1. Define your app’s environment with a Dockerfile so it can be reproduced anywhere.

  2. Define the services that make up your app in docker-compose.yml so they can be run together in an isolated environment.

  3. Run docker-compose up and Compose starts and runs your entire app.

The code for this post is available on my Github account here


Creating Custom Spring Boot Starter To Implement Cross-Cutting Concerns

Nowadays, Spring Boot has become the de facto standard for many enterprise web projects. Spring Boot improves developer productivity by implementing a lot of cross-cutting concerns as starter projects. We just add these dependencies to our project or configure a few properties, and Spring Boot does the magic for us through autoconfiguration. The starter projects automatically configure a lot of things, which helps us get started quickly.
However, with so much magic happening in the background, it’s very important to know how things work.

The code for this post is available for download here.

How Spring Boot’s Starter Works

On startup, Spring Boot checks for the spring.factories file, located in the META-INF directory.

org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
ns.aop.LogMethodExecutionTimeAutoConfiguration

All classes annotated with @Configuration should be listed under the EnableAutoConfiguration key in the spring.factories file.
Spring will create beans based on the configuration and the conditions defined in those configuration classes. We will see this in detail with an example.

Let’s Create a Library That Logs Method Execution Time

Let's imagine we want to log the method execution time for a few methods in our project, and we should be able to enable/disable this feature based on some property.

Creating Our Custom Spring Boot Starter Project

Create a Spring Boot project with the below dependencies.

Pom File
<dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-dependencies</artifactId>
<version>${spring.boot.version}</version>
<type>pom</type>
<scope>import</scope>
</dependency>
</dependencies>
</dependencyManagement>
<dependencies>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-autoconfigure</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-aop</artifactId>
</dependency>
<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-configuration-processor</artifactId>
<optional>true</optional>
</dependency>
</dependencies>

spring-boot-dependencies allows us to use any Spring dependency.
spring-boot-autoconfigure is needed for the autoconfigure feature.
spring-boot-configuration-processor generates metadata for our configuration properties, so IDEs can give us autocomplete.

Use AOP to log method execution time

Create a simple annotation, LogMethodExecutionTime, to be used on methods, and an aspect to log the time.

Annotation & Aspect
@Retention(RetentionPolicy.RUNTIME)
@Target(ElementType.METHOD)
public @interface LogMethodExecutionTime {
}

@Aspect
@Slf4j
public class LogMethodExecutionTimeAspect {
@Around("@annotation(LogMethodExecutionTime)")
public Object logExecutionTime(ProceedingJoinPoint joinPoint) throws Throwable {
final long start = System.currentTimeMillis();
final Object proceed = joinPoint.proceed();
final long executionTime = System.currentTimeMillis() - start;
log.info(joinPoint.getSignature() + " executed in " + executionTime + "ms");
return proceed;
}
}
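The essence of the @Around advice above (time the call, log, return the original result) can be shown without Spring at all. Below is a minimal plain-Java sketch using a JDK dynamic proxy; the Greeter interface and TimingProxyDemo class are made up for illustration, not part of the starter:

```java
import java.lang.reflect.InvocationHandler;
import java.lang.reflect.Proxy;

public class TimingProxyDemo {
    public interface Greeter { String greet(String name); }

    // Wraps a Greeter so every call is timed, mirroring the @Around advice
    public static Greeter withTiming(Greeter target) {
        InvocationHandler advice = (proxy, method, args) -> {
            long start = System.currentTimeMillis();
            Object result = method.invoke(target, args); // like joinPoint.proceed()
            long elapsed = System.currentTimeMillis() - start;
            System.out.println(method.getName() + " executed in " + elapsed + "ms");
            return result;
        };
        return (Greeter) Proxy.newProxyInstance(
                Greeter.class.getClassLoader(), new Class<?>[]{Greeter.class}, advice);
    }

    public static void main(String[] args) {
        Greeter timed = withTiming(name -> "Hello " + name);
        System.out.println(timed.greet("World"));
    }
}
```

Spring AOP does essentially this (plus CGLIB subclassing for classes), which is why the annotated method's behavior is unchanged apart from the logging side effect.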

Creating Our Own Autoconfiguration

We want the method execution time logging to be enabled based on certain properties. Spring provides a lot of @Conditional annotations; based on different conditions, we can control the configuration of starter projects. For this example we can use @ConditionalOnProperty.
Our starter will be activated if the logging.api.enabled property has the value true, or if the property is missing (because matchIfMissing is true).

Configuration For our Starter
@Configuration
@ConditionalOnProperty(name = "logging.api.enabled", havingValue = "true", matchIfMissing = true)
public class LogMethodExecutionTimeAutoConfiguration {
@Bean
public LogMethodExecutionTimeAspect getLogMethodExecutionTimeAspect(){
return new LogMethodExecutionTimeAspect();
}
}
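The matchIfMissing = true behavior above is worth spelling out. This is a plain-Java approximation of the condition's semantics, not Spring's actual implementation; the ConditionSketch class and its matches method are hypothetical names:

```java
import java.util.Map;

public class ConditionSketch {
    // Approximation of @ConditionalOnProperty(name, havingValue, matchIfMissing):
    // match when the property equals havingValue, or is absent and matchIfMissing is true
    public static boolean matches(Map<String, String> props, String name,
                                  String havingValue, boolean matchIfMissing) {
        String value = props.get(name);
        if (value == null) {
            return matchIfMissing;
        }
        return value.equalsIgnoreCase(havingValue);
    }

    public static void main(String[] args) {
        String prop = "logging.api.enabled";
        System.out.println(matches(Map.of(prop, "true"), prop, "true", true));  // true
        System.out.println(matches(Map.of(), prop, "true", true));              // true (missing)
        System.out.println(matches(Map.of(prop, "false"), prop, "true", true)); // false
    }
}
```

So to switch the aspect off, a client has to set logging.api.enabled=false explicitly; leaving the property out keeps logging on.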

spring.factories File

We need to create a special file called spring.factories so that our custom starter is picked up by Spring Boot. This file should be placed in the src/main/resources/META-INF folder, and it lists all classes annotated with @Configuration.

org.springframework.boot.autoconfigure.EnableAutoConfiguration=\
ns.aop.LogMethodExecutionTimeAutoConfiguration

Note Naming Convention
You should make sure to provide a proper namespace for your starter. Do not start your module names with spring-boot, even if you use a different Maven groupId.
As a rule of thumb, you should name a combined module after the starter. For example, assume that you are creating a starter for “acme” and that you name the auto-configure module acme-spring-boot-autoconfigure and the starter acme-spring-boot-starter. If you only have one module that combines the two, name it acme-spring-boot-starter.

Using the custom starter

Let’s create a sample Spring Boot application client to use our custom starter.

Client
<dependency>
<groupId>ns</groupId>
<artifactId>method-execution-time-logging-spring-boot-starter</artifactId>
<version>0.0.1-SNAPSHOT</version>
</dependency>

// define below property
logging.api.enabled=true
//Main Class
@SpringBootApplication
public class ClientApplication {

public static void main(String[] args) {
SpringApplication.run(ClientApplication.class, args);
}
@Bean
ApplicationRunner init(TestClass testClass) {
return (ApplicationArguments args) -> dataSetup(testClass);
}
private void dataSetup(TestClass testClass) throws InterruptedException {
testClass.run();
}
}
//Test Class
@Component
public class TestClass {
@LogMethodExecutionTime
public void run() throws InterruptedException {
Thread.sleep(3000);
}
}

The code for this post is available for download here.

Share Comments

My AWS Solutions Architect Associate Certification Preparation Guide

unsplash-logoBen White

I recently (September 2019) completed the AWS Solutions Architect Associate certification in my first attempt with a score of 89%. I want to share my experience and exam preparation tips with you.

My Background about AWS and Cloud Technologies

I had no prior AWS experience or knowledge, but I had previous work experience on another IaaS platform. I was well aware of most of the cloud concepts covered in the exam, so it was easy for me to relate to the problem and pain point any AWS service was trying to solve. I think this is very important for understanding any AWS service and the motivation behind creating it. It helped me see the bigger picture.

Preparation Timeline

AWS suggests you should have a minimum of one year of working experience on the AWS platform, but it's not mandatory. When I did my research, most people recommended at least three to four months of study (2-3 hours every day). Obviously we cannot generalize this; it depends on the individual.

Please take your time to study and do not rush to book the exam. Give yourself enough time: the platform has more than 140 different services, and new services are launched every day. Trying to understand the services can be very overwhelming and time consuming. The Architect Associate exam does not cover all services, but it still covers a lot.
I personally made the mistake of booking the exam in advance, giving myself only three weeks of preparation alongside 8-9 hours of office work. For the last few days I had to push myself to cover all the services.

Preparation Resources

I use the Udemy platform for my regular study; for AWS I took the below courses from Udemy.

AWS Certified Solutions Architect - Associate 2019. By Ryan Kroonenburg, Faye Ellis
This is one of the most popular courses for the CSAA certification. The content is very good and up to date, and I would surely recommend it.
I have rated this course 4.5 stars on Udemy.

Ultimate AWS Certified Solutions Architect Associate 2019 by Stephane Maarek
This is also a very good course. It provides more details and hands-on work compared to Ryan's course, and it was my primary source of preparation.
I would definitely recommend this course. I have rated it 5 stars on Udemy.

I would suggest not relying on any single course. Complete at least two courses of your choice.

Practice Exams
Practice exams are equally important. I used the below two sets of practice exams. All exam questions are scenario based, so it becomes very important to prepare yourself for such questions. Both of the below practice exam courses include 6 practice tests, and the quality of the questions is good. The real exam also has similar (but not exactly the same) questions. I have rated both of these courses 5 stars on Udemy.

You might not do well in these practice exams at first. Read the explanations of the answers and repeat the tests. Due to limited time, I was not able to repeat them.

AWS Certified Solutions Architect Associate Practice Exams by Neal Davis, Digital Cloud Training
My first-attempt scores in these tests were 46%, 49%, 60%, 76%, 53%, 63%. I was able to pass only 1 test on my first attempt.

AWS Certified Solutions Architect Associate Practice Exams By by Jon Bonso, Tutorials Dojo
My first-attempt scores in these tests were 75%, 75%, 68%, 80%, 86%, 75%. I took these tests last.

FAQ & Whitepaper
It is suggested that you go through the FAQs and whitepapers before appearing for the exam. I did not get time to read any whitepapers or FAQs.

Preparation And Exam Tips

These are not magical tips, and you probably know most of them already. Just make sure you check all the boxes.

Hands-on using the Free Tier: Do not start your preparation with only the exam in mind. Do hands-on work whenever possible, and approach it as if you were implementing it for an enterprise-level application. The exam will test you on exactly such scenarios.

Taking notes: Start taking your notes as soon as possible. The exam covers a lot of topics, and it's difficult to remember all the material. Video tutorials take a lot of time, and you might not be able to repeat them.

Read questions carefully and take your time to understand them: Most of the questions are descriptive and up to 3 lines long. Try to find keywords like cost-effective, managed service, minimum configuration, availability, durability, etc. Keywords will help you select the most appropriate option. Try to eliminate wrong answers first rather than hunting for the correct one.

VPC is Very Important
VPC is very important from the exam point of view, and I got a lot of questions about it. You need to understand which services should be in a public subnet versus a private subnet, and how services deployed in private subnets can interact with the Internet and other services.

Disaster recovery
Disaster recovery also covers a good portion of the exam. Try to understand and memorize how each service behaves in case of failure, and how to configure services for disaster recovery.
E.g., does the service back up data, where does it copy it, is it automatic or do we need to configure it, how does failover work, etc.

Reality Check After Certification

Does this mean I am now a great cloud architect after certification? Of course not :)
Real-world enterprise applications bring a lot of complexity and unknown challenges, and no exam can test those scenarios.

It's very important to keep ourselves updated with what's happening in the technology space. A certification can give me an edge over other candidates when looking for a job, or help me get a better salary, but it does not guarantee that I will be able to solve the problem.

I want to become a good software architect, and getting such a certification is just one step forward.

Please let me know about your AWS exam experience and any tips you want to share.

Share Comments

Securing Spring Boot Microservices Using JWT Token

unsplash-logoJR Korpa
In the last few articles, we discussed different aspects of microservices. In this post we will add security to our REST services using JSON Web Tokens and Spring Security.
There are multiple options available for adding authentication and authorization; today we will focus on JSON Web Tokens, or JWT for short.

What Are JSON Web Tokens (JWT)?

JSON Web Tokens are an open, industry-standard (RFC 7519) method for representing claims securely between two parties. A JWT has three parts: header, payload & signature.

The header of the JWT contains information about how the JWT signature should be computed: the token type and the hashing algorithm used.

Header
{
"alg": "HS256",
"typ": "JWT"
}

The payload contains the data, also referred to as claims. The Internet drafts define certain standard fields ("claims") that can be used inside a JWT claim set, such as issuer, subject, issued-at, JWT ID, etc. You can find the full list here. We can also pass other needed information using private claims.
Payload
{
"sub": "1234567890",
"name": "John Doe",
"iat": 1516239022,
"Roles":"Admin"
}

The signature makes sure that the token is not changed on the way. It is created by taking the header and payload together and passing them through the specified algorithm along with a known secret.
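The signing step can be sketched in a few lines of plain Java using the JDK's built-in HMAC support. This mirrors the HS256 formula signature = base64url(HMAC-SHA256(header + "." + payload, secret)); the Hs256Sketch class is illustrative only, not code from the sample application:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class Hs256Sketch {
    // signature = base64url( HMAC-SHA256( encodedHeader + "." + encodedPayload, secret ) )
    public static String sign(String encodedHeader, String encodedPayload, String secret) {
        try {
            Mac mac = Mac.getInstance("HmacSHA256");
            mac.init(new SecretKeySpec(secret.getBytes(StandardCharsets.UTF_8), "HmacSHA256"));
            byte[] sig = mac.doFinal(
                    (encodedHeader + "." + encodedPayload).getBytes(StandardCharsets.UTF_8));
            return Base64.getUrlEncoder().withoutPadding().encodeToString(sig);
        } catch (Exception e) {
            throw new IllegalStateException(e);
        }
    }

    public static void main(String[] args) {
        Base64.Encoder b64 = Base64.getUrlEncoder().withoutPadding();
        String header = b64.encodeToString(
                "{\"alg\":\"HS256\",\"typ\":\"JWT\"}".getBytes(StandardCharsets.UTF_8));
        String payload = b64.encodeToString(
                "{\"sub\":\"1234567890\"}".getBytes(StandardCharsets.UTF_8));
        // header.payload.signature, the same shape as a real JWT
        System.out.println(header + "." + payload + "." + sign(header, payload, "secret"));
    }
}
```

Because the secret is mixed into the hash, anyone tampering with the header or payload cannot produce a matching signature without knowing the secret.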

For More Details about JWT Please check jwt.io

The code for this post is available for download here.

Sample Application Using JWT And Spring Security

An overview of the security mechanism that we will be using in our sample application:

The client calls the authenticate endpoint, providing a valid username and password, to get the token.
The client then sends this JWT token in the Authorization header of every request to access protected resources. For valid tokens, access is granted.
In case of a missing or invalid token, response code 401 Unauthorized is returned.

We will also add role-based authorization. The Student endpoint will be accessible to users having the ADMIN or USER role, and the Subject endpoint will be accessible to users having the ADMIN role.

This will be a long post, so I have divided it into multiple steps. The Github repo has a branch corresponding to each step.

Step 1: Create Spring Boot Rest Endpoints

Create two simple REST endpoints for our Student and Subject domain objects. User login details are saved in the User table using the User entity.

Setting Up Things for Playing with JWT

//Student Resource
@RestController
@RequestMapping("/Student")
public class StudentResource {
private final StudentRepository studentRepository;
public StudentResource(StudentRepository studentRepository) {
this.studentRepository=studentRepository;
}
@GetMapping
public List<Student> getAllStudents(){
return studentRepository.findAll();
}
}
//Subject Resource
@RestController
@RequestMapping("/Subject")
public class SubjectResource {
private final SubjectRepository subjectRepository;

public SubjectResource(SubjectRepository subjectRepository) {
this.subjectRepository = subjectRepository;
}
@GetMapping
public List<Subject> getAllSubjects(){
return subjectRepository.findAll();
}
}

Add a User entity to store the user ID & password in the in-memory database. Passwords are stored in BCrypt-encrypted format.

Setting Up Things for Playing with JWT
@Entity
@Data
@AllArgsConstructor
@NoArgsConstructor
@ToString
public class User {
@Id
private String email;
private String password;
private String role;
}

Populate the database during application startup using an ApplicationRunner.

Setting Up Things for Playing with JWT
@SpringBootApplication
public class GwtTokenApplication {

public static void main(String[] args) {
SpringApplication.run(GwtTokenApplication.class, args);
}
@Bean
ApplicationRunner init(UserRepository userRepository, StudentRepository studentRepository, SubjectRepository subjectRepository) {
return (ApplicationArguments args) -> dataSetup(userRepository,studentRepository,subjectRepository);
}

private void dataSetup(UserRepository userRepository, StudentRepository studentRepository, SubjectRepository subjectRepository) {
User niraj = new User("niraj.sonawane@gmail.com", "$2a$10$yRxRYK/s8vZCp.bgmZsD/uXmHjekuPU/duM0iPZw04ddt1ID9H7kK", "Admin");
User test = new User("test@gmail.com", "$2a$10$YWDqYU0XJwwBogVycbfPFOnzU7vsG/XvAyQlrN34G/oA1SbhRW.W.", "User");
userRepository.save(niraj);
userRepository.save(test);

Student student1 = new Student(1L,"Ram");
Student student2 = new Student(2L,"Sham");
studentRepository.save(student1);
studentRepository.save(student2);

Subject math = new Subject(1l,"Math");
Subject science = new Subject(2l,"Science");
subjectRepository.save(math);
subjectRepository.save(science);
}
}

At this point, we have set up two REST endpoints and a User table to store user IDs and passwords.

Step 2: Add Authentication Endpoint To Return JWT Token and Secure All Other Endpoint

The Student and Subject endpoints should be accessible only if a valid token is provided. The authenticate endpoint should be accessible to everyone.

SecurityConfig
@Configuration
@EnableWebSecurity
public class SecurityConfig extends WebSecurityConfigurerAdapter{
@Autowired
private UserAuthDetailsService userAuthDetailsService;
@Autowired
private InvalidLoginAttemptHandler invalidLoginAttemptHandler;

@Override
public void configure(AuthenticationManagerBuilder authenticationManagerBuilder) throws Exception {
authenticationManagerBuilder.userDetailsService(userAuthDetailsService)
.passwordEncoder(passwordEncoder());

}
@Bean
public PasswordEncoder passwordEncoder() {
return new BCryptPasswordEncoder();
}
@Bean
@Override
public AuthenticationManager authenticationManagerBean() throws Exception {
return super.authenticationManagerBean();
}
@Override
protected void configure(HttpSecurity http) throws Exception {

http
.cors().and().csrf().disable()
.exceptionHandling().authenticationEntryPoint(invalidLoginAttemptHandler)
.and()
.authorizeRequests()
.antMatchers("/authenticate/**")
.permitAll()
.anyRequest().authenticated();
}
}

Let's discuss what configuration we have done here.

@EnableWebSecurity This is the main Spring annotation used to enable web security in a project.

WebSecurityConfigurerAdapter This class provides default security configurations and allows other classes to extend it and customize the security configurations by overriding its methods.

UserAuthDetailsService This class is needed by Spring Security to get user details from a database, LDAP, or another user store. Our UserAuthDetailsService loads the user details from our User table.

AuthenticationManager This class uses UserAuthDetailsService to authenticate users.

InvalidLoginAttemptHandler This class is responsible for taking action if Authentication fails.

.antMatchers("/authenticate/**").permitAll().anyRequest().authenticated() This configuration allows any user to access our authenticate endpoint; all other requests need to be authenticated.

Authenticate User and Create JWT Token

AuthResource
@RestController
@RequestMapping("/authenticate")
@Slf4j
public class AuthResource {

@Autowired
private AuthenticationManager authenticationManager;
@Autowired
private JWTTokenProvider jwtTokenProvider;
@PostMapping
public ResponseEntity authenticateUser(@RequestBody AuthenticateRequest authenticateRequest) {
Authentication authentication = authenticationManager.authenticate(new UsernamePasswordAuthenticationToken(authenticateRequest.getUserName(), authenticateRequest.getPassword()));
String token =jwtTokenProvider.generateToken((UserPrincipal)authentication.getPrincipal());
log.info("Token Created {}",token);
return ResponseEntity.ok(new JwtAuthenticationResponse(token));
}
}

JWTTokenProvider
@Component
public class JWTTokenProvider {

@Value("${jwt.secret}")
private String jwtSecret;

@Value("${jwt.expirationInMs}")
private int jwtExpirationInMs;

public String generateToken(UserPrincipal userPrincipal){
List<String> roles = userPrincipal
.getAuthorities()
.stream()
.map(GrantedAuthority::getAuthority)
.collect(Collectors.toList());
return Jwts.builder().setIssuer("Demo App")
.setIssuedAt(new Date())
.setExpiration(new Date(new Date().getTime() + jwtExpirationInMs))
.claim("Roles",roles)
.signWith(SignatureAlgorithm.HS512,jwtSecret).compact();
}
}

After successful authentication of the user, we create the JWT token using the jsonwebtoken library, which provides a fluent API for building tokens.
We are adding the roles in a claim; we can use these roles for role-based authorization.
In case authentication fails, the InvalidLoginAttemptHandler will be called, which we configured in the exceptionHandling section of our SecurityConfig.
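The expiry handling in generateToken boils down to simple arithmetic: a token is valid until its issued-at time plus jwt.expirationInMs. A tiny sketch of that check (ExpirySketch is a made-up name, not part of the sample project):

```java
public class ExpirySketch {
    // Same arithmetic as setExpiration(new Date(now + jwtExpirationInMs)):
    // the token is expired once "now" passes issuedAt + expirationInMs
    public static boolean isExpired(long issuedAtMillis, long expirationInMs, long nowMillis) {
        return nowMillis > issuedAtMillis + expirationInMs;
    }

    public static void main(String[] args) {
        long issuedAt = 1_000_000L;
        System.out.println(isExpired(issuedAt, 50_000, issuedAt + 10_000)); // false: still valid
        System.out.println(isExpired(issuedAt, 50_000, issuedAt + 60_000)); // true: expired
    }
}
```

The jsonwebtoken parser performs this comparison for us against the exp claim when we validate the token later.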

Now if we call the authenticate endpoint with a valid user ID and password, a JWT token will be sent back in the response.

curl -X POST http://localhost:8080/authenticate -H "Content-Type:application/json" -d "{\"userName\":\"niraj.sonawane@gmail.com\",\"password\":\"test\"}"

Step 3: Add AuthenticationFilter To Get JWT token from the request and Validate It

After receiving the JWT token, clients need to pass it in the Authorization header to access protected resources, in our case the Student or Subject resource.
A sample curl for the same:

curl
curl -X GET http://localhost:8080/Subject -H "Authorization: Bearer eyJhbGciOiJIUzUxMiJ9.eyJpc3MiOiJEZW1vIEFwcCIsInN1YiI6InRlc3RAZ21haWwuY29tIiwiaWF0IjoxNTYzMTAwODk2LCJleHAiOjE1NjMxNTA4OTYsIlJvbGVzIjpbIlJPTEVfVVNFUiJdfQ.XSUpdBhRkXL0b1U5gD0y-siCrSMMzQaJupV4bJOTnAA7txYmDNTZ8O18ueCG72K7XdwueLZGXWX5C2NtCWghaA"

The JwtAuthenticationFilter will be called once per request and will validate the provided token. After successful validation of the token, we need to pass a UsernamePasswordAuthenticationToken on to the filter chain using SecurityContextHolder.

JwtAuthenticationFilter
public class JwtAuthenticationFilter extends OncePerRequestFilter {
@Autowired
private JWTTokenProvider tokenProvider;
@Autowired
private UserAuthDetailsService userDetailsService;
@Override
protected void doFilterInternal(HttpServletRequest request, HttpServletResponse response, FilterChain filterChain) throws ServletException, IOException {
logger.info("Validating Token!!!!!");
try {
String jwt = getJwtFromRequest(request);
if (StringUtils.hasText(jwt) && tokenProvider.validateToken(jwt)) {
logger.info("Token is Valid ");
String userNameFromToken = tokenProvider.getUserNameFromToken(jwt);
UserPrincipal userDetails = userDetailsService.loadUserByUsername(userNameFromToken);
UsernamePasswordAuthenticationToken authentication = new UsernamePasswordAuthenticationToken(userDetails, null, userDetails.getAuthorities());
SecurityContextHolder.getContext().setAuthentication(authentication);
}
} catch (Exception ex) {
logger.error("Could not set user authentication in security context", ex);
}
filterChain.doFilter(request, response);
}
private String getJwtFromRequest(HttpServletRequest request) {
String bearerToken = request.getHeader("Authorization");
if (StringUtils.hasText(bearerToken) && bearerToken.startsWith("Bearer ")) {
return bearerToken.substring(7, bearerToken.length());
}
return null;
}
}

The token can be validated like this:
Jwts.parser().setSigningKey(jwtSecret).parseClaimsJws(jwt)

At this point, we are able to create JWT tokens for valid users and grant access to protected resources based on the token.

Let's go one step further and use roles to grant access at a granular level.

Step 4: Role Based Access

Let's say we want the Student endpoint to be accessible to users having the ADMIN or USER role, and the Subject endpoint to be accessible to users having the ADMIN role.
In order to use method-level security, we need to enable it in the security configuration using @EnableGlobalMethodSecurity.

Configuration
@Configuration
@EnableGlobalMethodSecurity(
prePostEnabled = true,
securedEnabled = true,
jsr250Enabled = true)
public class MethodSecurityConfig extends GlobalMethodSecurityConfiguration {

}

The annotations @PreAuthorize and @PostAuthorize support Spring Expression Language (SpEL) and provide expression-based access control.
Note: The roles in the database need to be saved with the prefix ROLE_.
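The ROLE_ prefix rule can be illustrated with a one-line check; this is roughly what hasRole('ADMIN') evaluates against the stored authorities (RoleSketch is a hypothetical helper for illustration, not Spring Security code):

```java
import java.util.List;

public class RoleSketch {
    // hasRole('ADMIN') in SpEL matches a granted authority stored as "ROLE_ADMIN"
    public static boolean hasRole(List<String> authorities, String role) {
        return authorities.contains("ROLE_" + role);
    }

    public static void main(String[] args) {
        List<String> auths = List.of("ROLE_USER");
        System.out.println(hasRole(auths, "USER"));  // true
        System.out.println(hasRole(auths, "ADMIN")); // false
    }
}
```

This is why a user saved with the plain role "Admin" would not satisfy hasRole('ADMIN'): the stored authority must be "ROLE_ADMIN".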

StudentResource
@RestController
@RequestMapping("/Student")
public class StudentResource {

private final StudentRepository studentRepository;

public StudentResource(StudentRepository studentRepository) {
this.studentRepository=studentRepository;
}

@GetMapping
@PreAuthorize("hasRole('ADMIN') OR hasRole('USER')")
public List<Student> getAllStudents(){
return studentRepository.findAll();
}
}

SubjectResource
@RestController
@RequestMapping("/Subject")
public class SubjectResource {

private final SubjectRepository subjectRepository;
public SubjectResource(SubjectRepository subjectRepository) {
this.subjectRepository = subjectRepository;
}
@GetMapping
@PreAuthorize("hasRole('ADMIN')")
public List<Subject> getAllSubjects(){
return subjectRepository.findAll();
}
}

The code for this post is available for download here.

Share Comments

Setup Monitoring System for your Spring Boot applications Using Spring Boot Admin

unsplash-logoZdeněk Macháček

In the past few microservices articles, we discussed different Spring Cloud features. In a microservices architecture we have a lot of services doing small tasks, so monitoring all these applications becomes a very critical and integral part of your technology stack.
Spring Boot Admin is a community project that provides an admin interface for Spring Boot applications.
Let's check how we can set up and use Spring Boot Admin.

The code for this post is available for download here.

Setting Up Spring Boot Admin Server

To add the admin server to a project, we need to add the below Spring Boot starter dependency.

spring boot admin starter server
  <dependency>
<groupId>de.codecentric</groupId>
<artifactId>spring-boot-admin-starter-server</artifactId>
</dependency>

Add @EnableAdminServer on Main Application.

Admin Main Class
@SpringBootApplication
@EnableAdminServer
public class SpringBootAdminServerApplication {

public static void main(String[] args) {
SpringApplication.run(SpringBootAdminServerApplication.class, args);
}
}

Now if we start the application, the admin UI should be available.

Register clients with admin Server

Each application that wants to register has to include the Spring Boot Admin Client dependency.

Client
  <dependency>
<groupId>de.codecentric</groupId>
<artifactId>spring-boot-admin-starter-client</artifactId>
</dependency>

And add the admin client URL in application.properties:
spring.boot.admin.client.url=http://localhost:8080/
Spring Boot Admin will use the spring.application.name property to display the name of the application.

Our application is now registered on the Admin console, but apart from the status no other information is available.

Magic of Spring Boot Actuator and Spring Boot Admin Server

Spring Boot Admin depends on Actuator endpoints to provide information, so let's enable them.
Put the below property in the client application:
management.endpoints.web.exposure.include=*
Note: in production, expose only the endpoints you need.

Tips and tricks

Display Build Info
To display build info, simply add the build-info goal to the spring-boot-maven-plugin.

Build Info
  <build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
</plugins>
</build>

Display Git Commit info
To know which commit is deployed, we can simply use the git-commit-id-plugin; the git information will be available in the info endpoint and displayed on the admin UI.

Git Commit Info
  <build>
<plugins>
<plugin>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-maven-plugin</artifactId>
<executions>
<execution>
<goals>
<goal>build-info</goal>
</goals>
</execution>
</executions>
</plugin>
<plugin>
<groupId>pl.project13.maven</groupId>
<artifactId>git-commit-id-plugin</artifactId>
</plugin>
</plugins>
</build>

Logs
By default the log file is not accessible via Actuator endpoints and therefore not visible in Spring Boot Admin. In order to enable the logfile Actuator endpoint, you need to configure Spring Boot to write a log file, either by setting logging.path or logging.file.
Add the below property in the client application:
logging.file=target/sample-boot-application.log
At runtime we can also change the log levels, which is useful in lower environments.
Multiple Instances
Admin will automatically group applications based on application name and show a consolidated view.
Tags
Tags are a way to add visual markers per instance; they appear in the application list as well as in the instance view.
spring.boot.admin.client.instance.metadata.tags.environment=QA

Notification

By default, Spring Boot Admin sends notifications when registered applications change to DOWN or OFFLINE. We can do a lot of customization for notifications.
Below is the configuration for sending emails using Google's SMTP server.
Add the spring-boot-starter-mail dependency to the admin server.

Email Config in Admin
spring.mail.host=smtp.gmail.com
spring.mail.port=587
spring.mail.username=XYZ@gmail.com
spring.mail.password=XYZ
spring.boot.admin.notify.mail.to=XYZ
spring.mail.properties.mail.smtp.starttls.enable=true

Discovery Clients

The Spring Boot Admin Server can use Spring Cloud's DiscoveryClient to discover applications. The advantage is that the clients don't have to include the spring-boot-admin-starter-client; you just have to add a DiscoveryClient implementation to your admin server, and everything else is done by autoconfiguration.

Security

As the Admin Server is a Spring Boot application, securing it is the same as securing a normal Spring Boot application.

Reference

Documentation
The code for this post is available for download here.

Share Comments

Service Discovery and Registration Using Spring Cloud Eureka

unsplash-logoFo Fa

In the past few microservices articles, we discussed different Spring Cloud features like Config Server, OpenFeign, and Ribbon. In this post, let's talk about Spring Cloud Netflix Eureka: why we need it, how to configure the Eureka server, and how to register a service with the registry server.

In a cloud-based microservices environment, servers come and go, unlike the traditional architecture that works with servers with well-known IP addresses and host names. On a cloud platform, it is not possible to know the IPs and host names of services in advance. Containers also use dynamic IPs for autoscaling, and load balancing requires much more sophistication in registering and de-registering servers.
So we cannot have tight coupling between services based on IPs or host names. Spring Cloud Netflix Eureka helps to solve that problem.

The code for this post is available for download here.

In a typical Eureka setup we have:
Eureka Server: acts as the service registry.
Eureka Clients: REST services which register themselves at the registry.

Let's set up a Eureka server & client for our demo application.

Demo application: our simple PricingService calculates the price for a product. The pricing service gets some dynamic discount details from DiscountService.

Setting Up Eureka Server

To add a Eureka server to the project, we need to use the spring-cloud-starter-netflix-eureka-server artifact and put @EnableEurekaServer on the main class.
A Eureka server can also register itself as a Eureka client; for the demo we disable that using the registerWithEureka flag. A Eureka client normally also fetches the registry from the server, which we disable too, using the fetchRegistry flag. (The Eureka server can additionally be configured to read its configuration from the Config Server that we discussed here.)

Eureka Server
@SpringBootApplication
@EnableEurekaServer
public class EurekaServerApplication {
public static void main(String[] args) {
SpringApplication.run(EurekaServerApplication.class, args);
}
}

Eureka Server Properties
#Server Specifics
server:
port: 8761
spring:
application:
name: eureka-server
#Eureka Specifics
eureka:
instance:
hostname: localhost
client:
registerWithEureka: false
fetchRegistry: false
serviceUrl:
defaultZone: http://${eureka.instance.hostname}:${server.port}/eureka/

Start the EurekaServerApplication and you should be able to see the Eureka dashboard at http://localhost:8761/

Setting Up Eureka Clients - Discount Service

Let's set up DiscountService to register with the Eureka Server. To include the Eureka client in the project we need to use the spring-cloud-starter-netflix-eureka-client artifact ID. Spring Boot will detect the client jar on the classpath and try to register with the server. We can also use @EnableDiscoveryClient to do that explicitly. We need to provide information about the Eureka Server; this can be done using the serviceUrl property.
DiscountService has a simple endpoint that returns the discount for a product.
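The corresponding client-side dependency might look like this in a Maven pom.xml (again a sketch, with the version assumed to come from the Spring Cloud BOM):

```xml
<!-- Eureka client starter; version managed by the Spring Cloud BOM -->
<dependency>
    <groupId>org.springframework.cloud</groupId>
    <artifactId>spring-cloud-starter-netflix-eureka-client</artifactId>
</dependency>
```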

Discount Service
@SpringBootApplication
@EnableDiscoveryClient
@Slf4j
public class DiscountServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(DiscountServiceApplication.class, args);
    }
}

@RestController
@RequestMapping("/discount/{product}")
@Slf4j
class DiscountController {
    @GetMapping
    public int getDiscountPercentage(@PathVariable("product") String product) {
        log.info("Getting Discount for Product {}", product);
        return 50;
    }
}

Discount Service properties
spring:
  application:
    name: discount-service

server:
  port: 8081

eureka:
  client:
    healthcheck:
      enabled: true
    serviceUrl:
      defaultZone: http://localhost:8761/eureka/

Start the DiscountServiceApplication and you should see DiscountService registered on the Eureka dashboard at http://localhost:8761/
Note: the application name is very important, as it is used by the server for lookups.

Magic of Registry: Calling Service using Registry & Feign

Now let's call DiscountService to calculate the price. Since DiscountService is registered with the registry, we don't need to know its host details. I have already discussed Feign in detail, and how to use it to call a service, in this post. Feign works with the Eureka client, so we can call a service using its application name. In the code snippet below, DiscountServiceClient is a FeignClient that calls the service using its registered application name.

Pricing Service
@SpringBootApplication
@EnableDiscoveryClient
@EnableFeignClients
public class PricingServiceApplication {
    public static void main(String[] args) {
        SpringApplication.run(PricingServiceApplication.class, args);
    }
}

@RestController
@Slf4j
class ServiceInstanceRestController {
    @Autowired
    private DiscountServiceClient discountServiceClient;

    @GetMapping("/price/{product}")
    public int getPriceForProduct(@PathVariable("product") String product) {
        int discount = discountServiceClient.getDiscountPercentage(product);
        int price = 100;
        log.info("Discount is {}", discount);
        return ((100 - discount) * price) / 100;
    }
}

@FeignClient("discount-service")
interface DiscountServiceClient {
    @RequestMapping(method = RequestMethod.GET, value = "/discount/{product}")
    int getDiscountPercentage(@PathVariable("product") String product);
}
Pricing Service properties
spring:
  application:
    name: pricing-service

server:
  port: 8080

eureka:
  client:
    healthcheck:
      enabled: true
    serviceUrl:
      defaultZone: http://localhost:8761/eureka/

What about Load Balancing ?

I have already discussed how to set up and use Ribbon for client-side load balancing in this post. Feign already uses Ribbon, so if you use @FeignClient, Ribbon is used along with it. When Eureka is used in conjunction with Ribbon (that is, both are on the classpath), the ribbonServerList is overridden with an extension of DiscoveryEnabledNIWSServerList, which populates the list of servers from Eureka.
If we start multiple instances of the discount service, Ribbon will route requests across them using a round-robin algorithm.
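Conceptually, Ribbon's default behavior is a round-robin selection over the instance list it fetches from Eureka. The sketch below is not Ribbon's actual implementation, just an illustration of the idea over a fixed server list (RoundRobinChooser is a made-up name):

```java
import java.util.List;
import java.util.concurrent.atomic.AtomicInteger;

// Illustrative round-robin selection over a fixed server list;
// Ribbon periodically refreshes its list from Eureka, which is omitted here.
public class RoundRobinChooser {
    private final List<String> servers;
    private final AtomicInteger counter = new AtomicInteger();

    public RoundRobinChooser(List<String> servers) {
        this.servers = servers;
    }

    // Returns the next server, cycling through the list.
    public String next() {
        int idx = Math.floorMod(counter.getAndIncrement(), servers.size());
        return servers.get(idx);
    }

    public static void main(String[] args) {
        RoundRobinChooser chooser =
                new RoundRobinChooser(List.of("discount-1:8081", "discount-2:8082"));
        // Requests alternate between the two instances.
        System.out.println(chooser.next()); // discount-1:8081
        System.out.println(chooser.next()); // discount-2:8082
        System.out.println(chooser.next()); // discount-1:8081
    }
}
```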

The final architecture of the application will look like this:

The code for this post is available for download here.
