07/06/2018

Deploying Microservices on AWS

There are three main ways to host microservices on AWS.

  • Load Balance the microservices across a set of EC2 instances.
  • Load Balance a set of Docker containers across a set of EC2 instances.
  • Employ AWS’s API Gateway and Lambda architecture.

We explore each of these in a little more detail below:

1) Load Balance the microservices across a set of EC2 instances

Simply install the microservice application binaries on an EC2 instance. For performance and fault tolerance, load balance across a set of EC2 instances, each with the application binaries installed and placed into an auto scaling group, and use Elastic Load Balancing to distribute traffic across the instances. We’d recommend using Elastic Beanstalk to set up such an arrangement automatically, including the deployment onto multiple EC2 instances, auto scaling and load balancing. This works fine, but setting up the services on each machine can be labour-intensive and difficult to maintain without employing containers, which leads us to option 2.
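As a sketch of what each instance would be running, here is a minimal Python microservice exposing a /health endpoint of the kind an Elastic Load Balancer health check would poll. The port, paths and response shape are illustrative assumptions, not part of any AWS setup:

```python
# Minimal microservice sketch: an HTTP service with a /health endpoint
# that a load balancer's health check could poll. Port and paths are
# illustrative assumptions.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer


class MicroserviceHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            status, body = 200, json.dumps({"status": "ok"}).encode()
        else:
            status, body = 404, json.dumps({"error": "not found"}).encode()
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, fmt, *args):
        pass  # suppress per-request logging noise


def serve(port=8080):
    # On an instance this would listen on all interfaces behind the
    # load balancer; 8080 is an assumed port.
    HTTPServer(("0.0.0.0", port), MicroserviceHandler).serve_forever()
```

Each EC2 instance in the auto scaling group would run a service like this, and the load balancer would mark an instance unhealthy when /health stops returning 200.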

2) Load Balance a set of Docker containers across a set of EC2 instances

Use containers to deploy the microservices. ECS (Elastic Container Service) schedules and deploys the containers across a set of EC2 instances, and brings with it the Amazon container registry (ECR), Identity and Access Management (IAM) control, auto scaling and an optimal placement strategy. This is our standard architectural approach at the moment, because it leverages a lot of the work we’ve done recently to move our work over to container-based microservices. We still have to set up and own the EC2 instances, though, and it would be nice to remove that – which leads us to option 3.
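As a sketch, the image for one such microservice might be built from a Dockerfile like the one below, pushed to ECR, and then referenced from an ECS task definition. The base image, file names and port are illustrative assumptions:

```dockerfile
# Hypothetical Dockerfile for a Python-based microservice (names and
# port are assumptions for illustration).
FROM python:3.6-slim
WORKDIR /app
COPY requirements.txt .
RUN pip install -r requirements.txt
COPY . .
EXPOSE 8080
CMD ["python", "service.py"]
```

ECS then schedules containers built from this image across the EC2 instances in the cluster, according to the configured placement strategy.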

3) Employ AWS’s API Gateway and Lambda architecture

This is Amazon Web Services’ serverless architecture. It removes the need for EC2 instances completely by replacing the containerised services with Lambda functions: units of functionality that can be invoked from another AWS service, a web service trigger or a mobile application trigger. We’re currently exploring triggering them through a REST API. Apart from removing the need for EC2 instances, another cool aspect is that Lambda scales automatically and you only pay for the compute you use (no EC2 instances!). The usual architecture employs an API Gateway pattern to help spread and throttle the incoming API calls, provide authentication and authorisation support, handle DDoS protection, monitor activity and log interactions.
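To make the Lambda side concrete, here is a minimal Python handler in the shape that API Gateway’s Lambda proxy integration passes events and expects responses. The greeting logic is purely an illustrative assumption; the statusCode/headers/body response shape follows the proxy-integration convention:

```python
import json


def handler(event, context):
    """Minimal Lambda handler for an API Gateway Lambda proxy integration.

    API Gateway delivers the HTTP request as the `event` dict and
    expects back a dict with statusCode, headers and a string body.
    """
    # queryStringParameters is None when the request has no query string.
    params = event.get("queryStringParameters") or {}
    name = params.get("name", "world")  # illustrative parameter
    return {
        "statusCode": 200,
        "headers": {"Content-Type": "application/json"},
        "body": json.dumps({"message": "Hello, %s!" % name}),
    }
```

Invoking the handler locally with a sample event, e.g. `handler({"queryStringParameters": {"name": "AWS"}}, None)`, returns a 200 response whose body is a JSON greeting; deployed behind API Gateway, each REST call triggers the function with no instances to manage.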