Building Serverless APIs with Spring Boot, AWS Lambda and API Gateway

This post demonstrates how to expose a RESTful API implemented with Spring MVC in a Spring Boot application as a Lambda function deployed behind AWS API Gateway. We will be using the aws-serverless-java-container package, which natively supports API Gateway's proxy integration model for requests and responses.

Project Setup

Create a new Spring Boot project, e.g. using the Spring Initializr, or modify an existing project to include the aws-serverless-java-container dependency:

<dependency> 
      <groupId>com.amazonaws.serverless</groupId> 
      <artifactId>aws-serverless-java-container-spring</artifactId> 
      <version>1.1</version> 
</dependency>

We can remove the Spring Boot Maven Plugin from the pom file. Instead, add the Maven Shade Plugin and exclude the embedded Tomcat from the deployed package:

<plugins> 
   <plugin> 
      <groupId>org.apache.maven.plugins</groupId>
      <artifactId>maven-shade-plugin</artifactId>
      <configuration> 
        <createDependencyReducedPom>false</createDependencyReducedPom> 
      </configuration> 
      <executions> 
        <execution> 
          <phase>package</phase>
          <goals> 
            <goal>shade</goal> 
          </goals> 
          <configuration> 
             <artifactSet> 
                <excludes> 
                   <exclude>org.apache.tomcat.embed:*</exclude>
                </excludes> 
             </artifactSet>
          </configuration> 
        </execution>
      </executions> 
   </plugin>
</plugins>

Serverless API

1. HelloController

Implement RESTful APIs using Spring MVC as usual. For example:

package com.madman.lambda;
...
@RestController
public class HelloController {
 
     @RequestMapping(path = "/greeting", method = RequestMethod.GET) 
     public GreetingDto sayHello(@RequestParam String name) { 
          String message = "Hello " + name; 
          GreetingDto dto = new GreetingDto();
          dto.setMessage(message);
          return dto;
     }

     ...
}
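
The GreetingDto class is not shown in this post; a minimal sketch consistent with the controller above and the test below (a simple DTO with a single message field) would be:

public class GreetingDto {

     private String message;

     public String getMessage() {
          return message;
     }

     public void setMessage(String message) {
          this.message = message;
     }
}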

2. StreamLambdaHandler

To run Java code as an AWS Lambda function, we need a class that implements the Lambda handler interface RequestStreamHandler. The aws-serverless-java-container library makes this rather straightforward:

...
public class StreamLambdaHandler implements RequestStreamHandler {
    private static Logger logger = LoggerFactory.getLogger(StreamLambdaHandler.class);     

    public static final SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
 
    static { 
       try { 
           handler = SpringBootLambdaContainerHandler.getAwsProxyHandler(HelloLambdaApplication.class);
       } catch (ContainerInitializationException e) { 
           // if we fail here, we re-throw the exception to force another cold start
           String errMsg = "Could not initialize Spring Boot application";
           logger.error(errMsg, e);
           throw new RuntimeException(errMsg, e);
       } 
    }

    @Override 
    public void handleRequest(InputStream inputStream, OutputStream outputStream, Context context) throws IOException {
        handler.proxyStream(inputStream, outputStream, context);
        // just in case it wasn't closed 
        outputStream.close(); 
    }
}

The StreamLambdaHandler class implements AWS Lambda's predefined handler interface RequestStreamHandler for handling events.

Note that the handling of Lambda events is delegated to the SpringBootLambdaContainerHandler class.

3. HelloLambdaApplication

Note that the SpringBootLambdaContainerHandler.getAwsProxyHandler method is given a Spring web application initializer, which the main Spring Boot application class provides by extending SpringBootServletInitializer:

@SpringBootApplication
@ComponentScan(basePackages = "com.madman.lambda.controller")
public class HelloLambdaApplication extends SpringBootServletInitializer {
 
     public static void main(String[] args) { 
           SpringApplication.run(HelloLambdaApplication.class, args);
     }
}

4. HelloControllerTest

The aws-serverless-java-container library also supports integration testing of the proxy API. Below is an integration test for HelloController:

@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = { HelloLambdaApplication.class })
@WebAppConfiguration
public class HelloControllerTest {
    private MockLambdaContext lambdaContext;
    private SpringBootLambdaContainerHandler<AwsProxyRequest, AwsProxyResponse> handler;
    
    @Autowired 
    private ObjectMapper mapper;
 
    public HelloControllerTest() { 
       lambdaContext = new MockLambdaContext(); 
       this.handler = StreamLambdaHandler.handler; 
    }

    @Test
    public void testGreetingApi() throws JsonParseException, JsonMappingException, IOException {
       AwsProxyRequest request = new AwsProxyRequestBuilder("/greeting", "GET").queryString("name", "John").build(); 
       AwsProxyResponse response = handler.proxy(request, lambdaContext);
      
       assertThat(response.getStatusCode(), equalTo(200)); 
       GreetingDto responseBody = mapper.readValue(response.getBody(), GreetingDto.class);
       assertThat(responseBody.getMessage(), equalTo("Hello John"));
    }
}

Deploying to AWS

The full source code can be found on GitHub here. Run Maven to build the jar file and deploy it as a Lambda function. I use the AWS Toolkit for Eclipse to deploy the jar package to AWS. Refer to the AWS documentation for more options and information on deploying Lambda applications.

To set up AWS API Gateway as the trigger for the Lambda function:

  1. Create a New API
  2. Create Resource
    1. Configure as proxy resource
    2. Resource Name: greeting
  3. Create Method
    1. Get
    2. Integration type: Lambda Function
    3. Use Lambda Proxy Integration: true
    4. Lambda Function: <Name of Lambda function>

You should then be able to test the API from the AWS console (see the screenshot below).

[Screenshot: testing the greeting API in the AWS API Gateway console]

A couple of things to note:

  1. Cold start – the Java container takes a good few seconds to start. The latency is OK once it's warmed up.
  2. Fat jar – the Spring Boot jar in this example is around 13MB, which is still fine for Lambda (50MB limit).

Spring for Apache Kafka Quick Start

In this blog, I set up a basic Spring Boot project for developing a Kafka-based messaging system using Spring for Apache Kafka. The project also includes the basic Spring configuration required for publishing and listening to messages from a Kafka broker.

Project Setup

The following tools and versions are used here:

  1. Maven 3.x
  2. Spring Kafka 1.3.2 (current release version)
  3. Kafka client 0.11.0.2
  4. Spring Boot 1.5.9

The current Spring Boot release version (1.5.9) manages Spring Kafka at version 1.1.7. I have to override this to use 1.3.2. My Maven pom file fragment is below:

 <parent>
      <groupId>org.springframework.boot</groupId>
      <artifactId>spring-boot-starter-parent</artifactId>
      <version>1.5.9.RELEASE</version>
      <relativePath /> <!-- lookup parent from repository -->
 </parent>

 <dependencies>
      <dependency>
           <groupId>org.springframework.kafka</groupId>
           <artifactId>spring-kafka</artifactId>
           <version>1.3.2.RELEASE</version>
      </dependency>
      
      <dependency>
           <groupId>org.springframework.kafka</groupId>
           <artifactId>spring-kafka-test</artifactId>
           <version>1.3.2.RELEASE</version>
           <scope>test</scope>
      </dependency>

      <dependency>
           <groupId>org.apache.kafka</groupId>
           <artifactId>kafka-clients</artifactId>
           <version>0.11.0.2</version>
      </dependency>
 </dependencies>

Producer Config

Spring Boot provides auto-configuration for connecting to Kafka, but I find it useful to set up the beans myself. Spring Kafka adopts the same approach as Spring's support for other message brokers such as ActiveMQ: to publish messages, a template (KafkaTemplate) has to be configured, just as a JmsTemplate is for ActiveMQ.

The following is my Java config for a KafkaTemplate that publishes messages to the Kafka broker:

@Configuration
public class KafkaProducerConfig {

     @Value("${spring.kafka.bootstrap-servers}") // (1)
     private String brokerAsString;
 
     @Bean
     public ProducerFactory<Integer, String> producerFactory() {
          return new DefaultKafkaProducerFactory<>(producerConfigs());
     }

     @Bean
     public Map<String, Object> producerConfigs() {
          Map<String, Object> props = new HashMap<>();
          props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerAsString);
          props.put(ProducerConfig.RETRIES_CONFIG, 0);
          props.put(ProducerConfig.BATCH_SIZE_CONFIG, 16384);
          props.put(ProducerConfig.LINGER_MS_CONFIG, 1);
          props.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 33554432);
          props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, IntegerSerializer.class);
          props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class);
          return props;
      }

     @Bean
     public KafkaTemplate<Integer, String> kafkaTemplate() {
          return new KafkaTemplate<Integer, String>(producerFactory());
     }
}

Note:

  1. The broker address is set using the property spring.kafka.bootstrap-servers defined in the application.properties (or yml) file. For example:
# application.properties
spring.kafka.bootstrap-servers=localhost:9092
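
With the template configured, publishing a message is a single call on KafkaTemplate. Below is a minimal sketch of a sender service; the class name and the "greetings" topic are assumptions for illustration (the topic matches the listener shown in the next section):

@Service
public class GreetingsSender {

     private final KafkaTemplate<Integer, String> kafkaTemplate;

     public GreetingsSender(KafkaTemplate<Integer, String> kafkaTemplate) {
          this.kafkaTemplate = kafkaTemplate;
     }

     public void send(String message) {
          // send() is asynchronous; the message is published to the "greetings" topic
          kafkaTemplate.send("greetings", message);
     }
}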

Consumer Config

Consuming messages from Kafka using Spring Kafka is similar to consuming messages from ActiveMQ using Spring's JMS support. We need to define a listener container factory and a message listener. Below is my Java config for the listener container factory.

@Configuration
public class KafkaConsumerConfig {
 
     @Value("${spring.kafka.bootstrap-servers}")
     private String brokerAsString;

     @Value("${spring.kafka.consumer.group-id}")
     private String groupId;
 
     @Value("${spring.kafka.consumer.auto-offset-reset}")
     private String autoOffsetReset;
 
     @Bean
     ConcurrentKafkaListenerContainerFactory<Integer, String> kafkaListenerContainerFactory() {
          ConcurrentKafkaListenerContainerFactory<Integer, String> factory = new ConcurrentKafkaListenerContainerFactory<>();
          factory.setConsumerFactory(consumerFactory());
          return factory;
     }

     @Bean
     public ConsumerFactory<Integer, String> consumerFactory() {
         return new DefaultKafkaConsumerFactory<>(consumerConfigs());
     }

     @Bean
     public Map<String, Object> consumerConfigs() {
         Map<String, Object> props = new HashMap<>();
         props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, brokerAsString);
         props.put(ConsumerConfig.GROUP_ID_CONFIG, groupId);
         props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, autoOffsetReset);
         props.put(ConsumerConfig.ENABLE_AUTO_COMMIT_CONFIG, true);
         props.put(ConsumerConfig.AUTO_COMMIT_INTERVAL_MS_CONFIG, "100");
         props.put(ConsumerConfig.SESSION_TIMEOUT_MS_CONFIG, "15000");
         props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, IntegerDeserializer.class);
         props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class);
         return props;
     }
}
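
The consumer configuration above reads the broker address, group id and offset reset policy from properties. An example application.properties fragment (the group id and offset reset values here are illustrative):

# application.properties
spring.kafka.bootstrap-servers=localhost:9092
spring.kafka.consumer.group-id=my-group
spring.kafka.consumer.auto-offset-reset=earliest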

Now we can listen to a Kafka topic by using the @KafkaListener annotation. For example:

@Service
public class GreetingsTopicListener {

 private Logger logger = LoggerFactory.getLogger(getClass());
 
 @KafkaListener(topics = "greetings")
 public void listen(ConsumerRecord<?,?> cr) throws Exception {
      logger.info(cr.toString());
 }
}

@KafkaListener will use the default listener container factory defined in the KafkaConsumerConfig class above to create the message listener. It is also possible to override this by setting the containerFactory attribute in the annotation. See the javadoc for more details.
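
For example, a listener method that names the factory bean explicitly (the method name is made up for illustration and would sit in a listener class such as GreetingsTopicListener above):

 @KafkaListener(topics = "greetings", containerFactory = "kafkaListenerContainerFactory")
 public void listenWithExplicitFactory(ConsumerRecord<?, ?> cr) throws Exception {
      logger.info(cr.toString());
 }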

Creating Topics

It is also possible to add topics to the broker automatically by defining NewTopic @Beans together with a KafkaAdmin bean (which uses the 0.11.0.x client library's AdminClient under the covers), as in the Spring Kafka reference documentation:

@Configuration
public class KafkaTopicConfig {
 
 @Value("${spring.kafka.bootstrap-servers}")
 private String brokerAsString;
 
 @Bean
 public KafkaAdmin admin() {
   Map<String, Object> configs = new HashMap<>();
   configs.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, brokerAsString);
   return new KafkaAdmin(configs);
 }

 @Bean
 public NewTopic topic1() {
   return new NewTopic("foo", 10, (short) 2);
 }

 @Bean
 public NewTopic topic2() {
   return new NewTopic("bar", 10, (short) 2);
 }
}

That’s about it. The code included in this blog should be sufficient for setting up a Spring Boot project for a messaging system using Spring Kafka.

Paging with Spring Data

This blog shows by example how to use the paging support in Spring Data. Let's start with the following repository without paging:

 

public interface ProductRepository extends JpaRepository<Product, Long> {

 @Query("SELECT p FROM Product p WHERE p.category = :category")
 List<Product> findByCategory(@Param("category") String category);

}

 

The above query will return all products with a matching category. To change it to return only the products required to fill a single page, the method is updated to:

import org.springframework.data.domain.Page;
import org.springframework.data.domain.Pageable;
...
public interface ProductRepository extends JpaRepository<Product, Long> {

 @Query("SELECT p FROM Product p WHERE p.category = :category")
 Page<Product> findByCategory(@Param("category") String category, 
     Pageable pageable);

}

Note:

  1. The method now returns an object of type Page instead of a List
  2. A new argument of type Pageable is added

The Pageable argument allows you to specify a few things, including which page to return, the number of items in each page and the properties used to sort the results. For example, to return the 2nd page where the size of each page is 10, the method can be called with the Pageable below:

Pageable pageable = new PageRequest(1, 10);

Class PageRequest is a concrete implementation of Pageable provided by Spring Data. Note that the page number is 0-indexed.

To define how the results should be ordered, a Sort object can be supplied as the 3rd argument of the PageRequest constructor. For example, to return the list of products sorted by price from highest to lowest and then by name in alphabetical order:

Sort sort = new Sort(new Order(Direction.DESC, "price"), 
                     new Order(Direction.ASC, "name"));

Pageable pageable = new PageRequest(1, 10, sort);

Finally, note that the Page object returned from the method contains not just the list of products but also other values useful for implementing pagination, e.g. the total number of results and the number of pages, as in the sketch below. See the javadoc here for more detail.
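
Putting it together, below is a minimal sketch of calling code; the ProductService class and the "book" category are illustrative only and not part of the original example:

@Service
public class ProductService {

     @Autowired
     private ProductRepository productRepository;

     public List<Product> secondPageOfBooks() {
          Sort sort = new Sort(new Order(Direction.DESC, "price"),
                               new Order(Direction.ASC, "name"));
          Pageable pageable = new PageRequest(1, 10, sort); // 2nd page, 10 items per page

          Page<Product> page = productRepository.findByCategory("book", pageable);

          long totalResults = page.getTotalElements(); // total number of matching products
          int totalPages = page.getTotalPages();       // total number of pages

          return page.getContent();                    // the products on this page
     }
}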

 

 

Securing Multiple Resources in OAuth2 Resource Server using Spring Security

In my previous blog post I demonstrated how to set up an OAuth2 authorization server using Spring Security. In this post, I will demonstrate how to set up the security configurations in a resource server to secure multiple resources.

The example here uses Spring Boot 1.2.7 and is a standalone OAuth2 resource server which secures multiple resources, each with its own id and access rules. To do that, instead of using @EnableResourceServer, we have to define a ResourceServerConfiguration bean for each resource to be secured, as shown below:

@Configuration
@EnableOAuth2Resource
public class OAuth2ServerConfig {

     @Bean
     protected ResourceServerConfiguration stockResources() {
          ResourceServerConfiguration resource = new ResourceServerConfiguration() {
               // Switch off the Spring Boot @Autowired configurers
               public void setConfigurers(List<ResourceServerConfigurer> configurers) {
                    super.setConfigurers(configurers);
               }

               @Override
               public int getOrder() {
                    return 30;
               }
          };

          resource.setConfigurers(Arrays.<ResourceServerConfigurer> asList(new ResourceServerConfigurerAdapter() {
               @Override
               public void configure(ResourceServerSecurityConfigurer resources) throws Exception {
                    resources.resourceId("stock");
               }

               @Override
               public void configure(HttpSecurity http) throws Exception {
                    http.antMatcher("/stock/**").authorizeRequests().anyRequest().access("#oauth2.hasScope('read')");
               }
          }));
          return resource;
     }
}

To secure the resource “stock”, we first create a ResourceServerConfiguration bean, overriding setConfigurers() to switch off the Spring Boot @Autowired configurers (it simply delegates to super.setConfigurers()) and setting the order by overriding the getOrder() method. The security configuration for the resource is then set up on the returned ResourceServerConfiguration object (the variable resource) by supplying an anonymous ResourceServerConfigurerAdapter.

To secure another resource, just define another ResourceServerConfiguration bean similar to the one above, with a different resource id and its own OAuth2 access rules, as sketched below. Also, the order value has to be unique.
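
For instance, a second bean inside the same OAuth2ServerConfig class for a hypothetical "product" resource could look like the following (note the different resource id, URL pattern and order):

     @Bean
     protected ResourceServerConfiguration productResources() {
          ResourceServerConfiguration resource = new ResourceServerConfiguration() {
               // Switch off the Spring Boot @Autowired configurers
               public void setConfigurers(List<ResourceServerConfigurer> configurers) {
                    super.setConfigurers(configurers);
               }

               @Override
               public int getOrder() {
                    return 40;
               }
          };

          resource.setConfigurers(Arrays.<ResourceServerConfigurer> asList(new ResourceServerConfigurerAdapter() {
               @Override
               public void configure(ResourceServerSecurityConfigurer resources) throws Exception {
                    resources.resourceId("product");
               }

               @Override
               public void configure(HttpSecurity http) throws Exception {
                    http.antMatcher("/product/**").authorizeRequests().anyRequest().access("#oauth2.hasScope('read')");
               }
          }));
          return resource;
     }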

Note that I have to override the getOrder() method in the ResourceServerConfiguration here, rather than calling the setters as in

     ResourceServerConfiguration resource = ...
     resource.setOrder(30)
     resource.setConfigurers(...

or Spring Security will throw an exception like the one below:

Caused by: java.lang.IllegalStateException: @Order on WebSecurityConfigurers must be unique. Order of 2147483626 was already used, so it cannot be used on . OAuth2ServerConfig$1@1a40489f too.

 

Writing parameterized Tests with Spring and JUnit 4

This blog will demonstrate how to write parameterized tests with JUnit 4 and Spring.

Let's say we have the following interface:

public interface Logic {
     boolean xor(boolean a, boolean b);
}

and the corresponding implementation annotated as a Spring service component:

@Service
public class LogicImpl implements Logic {
      @Override
      public boolean xor(boolean a, boolean b) {
            return a ^ b;
      }
}

To test the above class with different input combinations of arguments a and b, we can write the following parameterized test in JUnit:

@RunWith(Parameterized.class) // Note 1
@SpringApplicationConfiguration(classes = BlogApplication.class)
public class LogicImplTest {

     @Autowired
     private LogicImpl logic;

     // Manually configure the Spring test context since the Parameterized runner is used
     private TestContextManager testContextManager;

     @Parameter(value = 0) // Note 3i
     public boolean a;

     @Parameter(value = 1) // Note 3ii
     public boolean b;

     @Parameter(value = 2) // Note 3iii
     public boolean expected;

     @Parameters // Note 4
     public static Collection<Object[]> data() {
          Collection<Object[]> params = new ArrayList<>();
          params.add(new Object[] { true, true, false});
          params.add(new Object[] { true, false, true});
          params.add(new Object[] { false, true, true});
          params.add(new Object[] { false, false, false});

          return params;
      }

     @Before // Note 2
     public void setUp() throws Exception {
          this.testContextManager = new TestContextManager(getClass());
          this.testContextManager.prepareTestInstance(this);
     }

     @Test // Note 5
     public void testXor() {
          assertThat(logic.xor(a, b), equalTo(expected));
     }

}

A few things to note here:

  1. The test class is to be run with the Parameterized runner, instead of the normal SpringJUnit4ClassRunner class.
  2. We need to manually configure the test context manager as in the @Before method. This is typically done automatically by Spring.
  3. Parameters are defined as public members of the class (as in 3i to 3iii) with the @Parameter annotation. Since we have more than one parameter, it is also necessary to set the value attribute, which defines the index of the parameter to use.
  4. Parameter values are set by implementing a static method and annotating it with @Parameters. In our example, the data() method returns a list of object arrays. Each element of the list params corresponds to one set of parameter values.
  5. Tests can now use the parameter values.

Running the test class in Eclipse will give you something like this:

[Screenshot: JUnit view in Eclipse showing testXor() run four times]

Note that testXor() is run 4 times, each time using one set of the parameter values defined in the list returned by the data() method.

 

Setup Spring RestTemplate to accept Self Signed Cert

This is strictly for testing only, but it may be useful if you need to perform integration tests. For example, the system you develop needs to access another internal or 3rd-party test server via https where the server's certificate is self-signed.

PKIX path building failed

By default, if you try to access a server via https with a self-signed certificate, for example with the following code

RestTemplate template = new TestRestTemplate();
template.getForObject("https://<some server>/", String.class);

you will get the following exception:

org.springframework.web.client.ResourceAccessException: I/O error on GET request for "https://<some server>":
sun.security.validator.ValidatorException: PKIX path building failed: 
sun.security.provider.certpath.SunCertPathBuilderException:
unable to find valid certification path to requested target;

HttpClient

To fix the above, update the RestTemplate with a custom HttpClient that accepts self-signed certificates:

      SSLConnectionSocketFactory socketFactory = new SSLConnectionSocketFactory(
              new SSLContextBuilder().loadTrustMaterial(null, new TrustSelfSignedStrategy()).build());

      HttpClient httpClient = HttpClients.custom().setSSLSocketFactory(socketFactory).build();

      RestTemplate template = new TestRestTemplate();
      ((HttpComponentsClientHttpRequestFactory) template.getRequestFactory()).setHttpClient(httpClient);

and add the following dependency if needed:

      <dependency>
           <groupId>org.apache.httpcomponents</groupId>
           <artifactId>httpclient</artifactId>
           <scope>test</scope>
      </dependency>

Again, this should be used for testing purposes only.