AWS: Framework to Build and Deploy Applications using Web Services
Problem Statement:
With growing competition across industries, the need to launch new products or enhance existing ones faster has never been greater, and a company's ability to scale with the business determines whether it stays relevant with its peers or falls behind and loses business. Hence, companies, especially in the IT industry, are seeking ways to optimize processes that bring development and operations teams together to allow faster development cycles and scalability.
Below is a simple breakdown of how companies are spending time and money in various phases of a product/application launch.
As shown in red in the pie chart above, companies are not only investing almost half of their budget in infrastructure setup and maintenance but are also losing time and competitive ground to it. Hence the need to optimize and streamline these steps.
Objective: To create a platform/framework that can be used to build, test, deploy and operate Docker applications at scale using AWS.
Proposed Solution:
The proposal is to create a platform that can be used to build, test, deploy and operate Docker applications at scale using AWS. The solution offers a framework that simplifies building infrastructure at the click of a button and automates maintenance and scaling based on volume.
Using Docker and Amazon Web Services, we intend to create a powerful framework and toolset that can be used for building, deploying, testing and operating any application.
High level steps involved are –
Step 1 – Building a sample application (we used an open source microservice application to implement this framework).
Build a web application using a microservices-based architecture and deploy it to a staging environment using Docker images and native AWS CI/CD tools/services.
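In the project itself this stage runs through the native AWS CI/CD services; purely as an illustration of what the image build-and-push step does, here is a minimal Python sketch using boto3 and the Docker CLI. The repository name, tag and the presence of a local Dockerfile are assumptions, not part of the original pipeline.

```python
# Illustrative sketch of a CI build step: log in to Amazon ECR, build the
# local Dockerfile and push the image. Repository name and tag are assumed.
import base64
import subprocess

import boto3


def build_and_push(repo_name: str, tag: str) -> None:
    ecr = boto3.client("ecr")

    # Obtain a temporary registry credential from ECR.
    auth = ecr.get_authorization_token()["authorizationData"][0]
    user, password = base64.b64decode(auth["authorizationToken"]).decode().split(":")
    registry = auth["proxyEndpoint"].replace("https://", "")

    image = f"{registry}/{repo_name}:{tag}"

    # Log in, build the image from the local Dockerfile, and push it.
    subprocess.run(["docker", "login", "--username", user, "--password-stdin", registry],
                   input=password.encode(), check=True)
    subprocess.run(["docker", "build", "-t", image, "."], check=True)
    subprocess.run(["docker", "push", image], check=True)


if __name__ == "__main__":
    build_and_push("microtrader-quote", "staging")  # hypothetical repository/tag
```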
Step 2 – Deploying required foundational resources
Next, we will use this framework to deploy foundational resources in your AWS account: an Elastic Container Registry (ECR), Virtual Private Cloud (VPC) networking resources, and an HTTP proxy service that secures outbound communication for your applications.
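In the framework these foundational resources are created from CloudFormation templates; the short boto3 sketch below only illustrates what gets provisioned. The repository name, CIDR ranges and tag values are placeholders, not the project's actual configuration.

```python
# Hedged sketch of the foundational resources: an ECR repository for the
# application's images and a small VPC for the staging environment.
import boto3

ecr = boto3.client("ecr")
ec2 = boto3.client("ec2")

# Container registry for the application's Docker images.
ecr.create_repository(repositoryName="microtrader")

# A VPC with a single subnet for the staging environment.
vpc = ec2.create_vpc(CidrBlock="10.0.0.0/16")["Vpc"]
ec2.create_tags(Resources=[vpc["VpcId"]],
                Tags=[{"Key": "Name", "Value": "microtrader-staging"}])
ec2.create_subnet(VpcId=vpc["VpcId"], CidrBlock="10.0.1.0/24")
```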
Step 3 – Building the Infrastructure
Ansible and CloudFormation will be leveraged to create a generic toolchain for deploying any Docker application or cloud service using a fully automated, infrastructure-as-code approach.
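The framework drives CloudFormation through Ansible playbooks; as a rough equivalent of what one playbook task does, the sketch below launches a single parameterised stack with boto3. The template path, stack name and parameter keys are assumptions used only to show the idea of one template reused per environment.

```python
# Minimal infrastructure-as-code sketch: create one CloudFormation stack per
# environment from the same template, passing the environment as a parameter.
import boto3


def deploy_stack(environment: str) -> None:
    cfn = boto3.client("cloudformation")
    with open("templates/microtrader.yml") as f:  # assumed template path
        template_body = f.read()

    stack_name = f"microtrader-{environment}"
    cfn.create_stack(
        StackName=stack_name,
        TemplateBody=template_body,
        Parameters=[
            {"ParameterKey": "Environment", "ParameterValue": environment},
            {"ParameterKey": "DesiredCount", "ParameterValue": "2"},
        ],
        Capabilities=["CAPABILITY_NAMED_IAM"],
    )
    # Block until the stack has finished creating.
    cfn.get_waiter("stack_create_complete").wait(StackName=stack_name)


deploy_stack("dev")
```

The same template is then promoted unchanged through QA and Production by calling it with a different environment value.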
Step 4 – Operations
Finally, we will extend CloudFormation to perform custom provisioning tasks using AWS Lambda functions, and securely manage and inject secrets into Docker applications and their supporting resources.
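As a hedged sketch of what such a Lambda-backed custom resource can look like, the handler below looks up a secret in AWS Secrets Manager and hands its ARN back to the stack, which can then reference it from the Docker application's task definition. The SecretName property and the overall wiring are assumptions, not the project's exact code.

```python
# Sketch of a CloudFormation custom resource handler (assumed design): on
# Create/Update it resolves a named secret and returns its ARN to the stack.
import json
import urllib.request

import boto3


def handler(event, context):
    status, data = "SUCCESS", {}
    try:
        if event["RequestType"] in ("Create", "Update"):
            # Resolve the secret named in the resource properties.
            name = event["ResourceProperties"]["SecretName"]
            secret = boto3.client("secretsmanager").get_secret_value(SecretId=name)
            data = {"SecretArn": secret["ARN"]}
    except Exception:
        status = "FAILED"

    # Custom resources must report their result to the pre-signed URL that
    # CloudFormation supplies in the event.
    body = json.dumps({
        "Status": status,
        "PhysicalResourceId": event.get("PhysicalResourceId", context.log_stream_name),
        "StackId": event["StackId"],
        "RequestId": event["RequestId"],
        "LogicalResourceId": event["LogicalResourceId"],
        "Data": data,
    }).encode()
    req = urllib.request.Request(event["ResponseURL"], data=body, method="PUT",
                                 headers={"Content-Type": ""})
    urllib.request.urlopen(req)
```

Returning only the ARN (rather than the secret value) keeps the secret out of the stack's outputs; the container retrieves the value at run time.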
Architecture:
Role of Cloud:
- Open Source Application – We used an open source application called “Microtrader” to create this framework; it is a microservices-based application built on the Vert.x framework. The application consists of microservices such as
- Quote Generator
- User Dashboard
- Audit services
- Portfolio services
- Identity and Access Management – This AWS service was used to set up
- Admin and User roles to administer and access various AWS services
- Set up roles for AWS services so that they can access each other
- Set up MFA for all users so that all access is secure.
- CloudWatch – The CloudWatch service was used to monitor all the services and ensure there are no security breaches.
- CloudFormation – All the AWS services deployed in this project have been deployed using CloudFormation templates; the templates have configurable values for the various environments, ranging from Dev and QA to Production.
- Load Balancers – Load balancers were set up both for user load across the application and for load on internal services.
- Route 53 – The Route 53 service has been used to resolve DNS requests between microservices within the application/entities.
- Amazon ECR – The application used for this platform was Dockerized, and the container registry made it easy for us to store, manage, and deploy images.
- Lambda – A Lambda function was developed to manage the capacity of our containers; based on capacity, the Lambda function spins up new EC2 instances (see the sketch after this list).
- AWS CodeBuild and CodePipeline – We used these AWS services for
- Continuous integration and delivery
- To increase speed and quality
- For greater productivity
- Overall cost reduction
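A minimal sketch of the capacity-management Lambda referred to above: it sums the CPU still schedulable on the ECS cluster and, when too little remains, grows the Auto Scaling group by one instance. The cluster name, group name and threshold are illustrative assumptions, not the project's actual values.

```python
# Assumed capacity-management Lambda: check remaining ECS cluster CPU and add
# an EC2 instance to the Auto Scaling group when capacity runs low.
import boto3

ECS_CLUSTER = "microtrader-cluster"   # placeholder names
ASG_NAME = "microtrader-asg"
MIN_FREE_CPU = 512                    # CPU units that must stay free for new containers


def handler(event, context):
    ecs = boto3.client("ecs")
    autoscaling = boto3.client("autoscaling")

    # Sum the remaining CPU across all container instances in the cluster.
    arns = ecs.list_container_instances(cluster=ECS_CLUSTER)["containerInstanceArns"]
    free_cpu = 0
    if arns:
        instances = ecs.describe_container_instances(
            cluster=ECS_CLUSTER, containerInstances=arns)["containerInstances"]
        for instance in instances:
            for resource in instance["remainingResources"]:
                if resource["name"] == "CPU":
                    free_cpu += resource["integerValue"]

    # If capacity is running out, add one more EC2 instance to the group.
    if free_cpu < MIN_FREE_CPU:
        group = autoscaling.describe_auto_scaling_groups(
            AutoScalingGroupNames=[ASG_NAME])["AutoScalingGroups"][0]
        autoscaling.set_desired_capacity(
            AutoScalingGroupName=ASG_NAME,
            DesiredCapacity=group["DesiredCapacity"] + 1)
```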
Learning:
- The most challenging aspect was the choice of services to use. The selection should be based on various parameters like time to market, cost, long-term cost implication, portability to other cloud vendors, etc.
- If time to market is of primary concern, managed services provided by the cloud vendor should be preferred over popular/open-source technologies.
- Estimating costs accurately, based on projections of business growth, may be a challenge.
Cloud-Specific Learning:
- Using AWS IAM, it is very easy to manage access to AWS services and resources securely.
- Using Ansible and CloudFormation, it is very easy to manage the stacks. We can easily automate the creation of infrastructure by managing it through code, and we create immutable resources through CloudFormation execution.
- We can define all the policies and configuration per environment and then apply them in the templates. It is very easy to manage environment-specific deployments using Ansible.
- We have divided the logic and responsibility of the whole application among various AWS services. Each AWS service is highly scalable and reliable, and it is moreover very easy to manage the services individually.
- CodePipeline works like joining multiple building blocks: we divided the deployment into multiple stages and then automated them using CodePipeline.
- Auto Scaling policies add scalability by launching more instances at run time, which gives the application a performance boost on busy, high-volume days, as sketched below.
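As one assumed example of such a policy, the snippet below attaches a target-tracking scaling policy to the EC2 Auto Scaling group so that new instances are launched when average CPU utilisation climbs; the group name and target value are placeholders.

```python
# Illustrative target-tracking policy on the EC2 Auto Scaling group: keep
# average CPU around 60% by adding or removing instances automatically.
import boto3

autoscaling = boto3.client("autoscaling")

autoscaling.put_scaling_policy(
    AutoScalingGroupName="microtrader-asg",   # placeholder group name
    PolicyName="cpu-target-tracking",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 60.0,
    },
)
```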
Project Done by:
Animesh Kumar Bhadra, Sr. Tech Lead, Synamedia.
15 years of experience in embedded systems for STBs and next-gen cloud-enabled video solutions.
Shobhit Agrawal, Advisory System Analyst with IBM.
15 years of experience in software development.
Ramesh Maddheshiya, Sr. Tech Lead – DevOps, PwC, Bangalore.
11 years of experience in DevOps technologies, with an interest in cloud computing and blockchain.
Rohit Shah, Principal Systems Analyst.
14 years of experience working in BFSI.