Episode 2 – Cloud security overview
Cloud computing is a dynamic platform in which on-demand resources are continuously provisioned and de-provisioned based on utility and consumption. This has caused considerable confusion about how cloud architectures differ from, and resemble, conventional ones, and what that means for information and application security from a technical standpoint.
Cloud security controls should be thought of not only from the perspective of the “physical” location of the resources but also of who manages them and who consumes them. Cloud security follows what is known as the shared responsibility model, in which responsibility is split between the customer and the cloud provider.
But how do you know which responsibilities belong to you and which belong to the cloud provider? A good rule of thumb is to first break security down into its dimensions:
- Applications: This includes access control, application firewalls, and transactional security
- Information: This covers database encryption and monitoring
- Management: This includes patch management, configuration management, and monitoring
- Network: This covers firewalls and anti-DDoS measures
- Compute and Storage: The focus here is on host-based firewalls, integrity monitoring, and file/log management
- Physical: This includes physical data centre security
Once you have identified these aspects for your application, map them to your cloud provider's offering to check which controls exist and which do not. How responsibility for each dimension is divided depends on how your cloud implementation is classified under the “SPI model”, i.e. SaaS, PaaS or IaaS.
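To make this mapping concrete, here is a minimal sketch of such a responsibility matrix as a small Python structure. The provider/customer/shared split shown is illustrative only and is not taken from any particular provider; the real division depends on your contract and service.

```python
# Minimal, illustrative shared-responsibility matrix for the SPI model.
# The exact split varies by provider and contract; treat these entries
# as defaults to be confirmed, not as an authoritative mapping.

MODELS = ("saas", "paas", "iaas")

RESPONSIBILITY_MATRIX = {
    # dimension:        (SaaS,        PaaS,        IaaS)
    "applications":     ("provider",  "customer",  "customer"),
    "information":      ("customer",  "customer",  "customer"),
    "management":       ("provider",  "shared",    "customer"),
    "network":          ("provider",  "provider",  "shared"),
    "compute_storage":  ("provider",  "provider",  "shared"),
    "physical":         ("provider",  "provider",  "provider"),
}

def who_is_responsible(dimension: str, model: str) -> str:
    """Return 'provider', 'customer', or 'shared' for a dimension/model pair."""
    return RESPONSIBILITY_MATRIX[dimension][MODELS.index(model)]

print(who_is_responsible("network", "iaas"))   # -> shared
print(who_is_responsible("physical", "saas"))  # -> provider
```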
Read Episode 1: Migrating to the cloud
SPI Model
Software as a Service (SaaS) – In this model, the customer is given the use of software or an application that the provider deploys and manages on cloud infrastructure. The customer cannot control or manage the underlying stack, apart from limited customization and configuration options to meet specific requirements.
The customer is responsible for managing access to the application and for defining policies on who can reach which resources. For example, an employee on the sales team may only have access to data from the CRM application, while someone on the academic team may only have access to the LMS. The rest of the cloud stack, including the platform and infrastructure, is the responsibility of the cloud provider.
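As an illustration of the access-control side of this, here is a minimal sketch of a customer-side role-to-application mapping in Python. The role names and application identifiers are hypothetical; in practice this mapping usually lives in your identity provider or SSO groups rather than in application code.

```python
# Minimal sketch of customer-side access policy for SaaS applications.
# Role names and app identifiers below are hypothetical; in practice this
# mapping usually lives in your identity provider (SSO / IAM groups).

ROLE_APP_ACCESS = {
    "sales":    {"crm"},
    "academic": {"lms"},
    "it_admin": {"crm", "lms"},
}

def can_access(role: str, app: str) -> bool:
    """Return True if the given role is allowed to use the given SaaS app."""
    return app in ROLE_APP_ACCESS.get(role, set())

assert can_access("sales", "crm")
assert not can_access("sales", "lms")   # sales staff only see the CRM
```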
Platform as a Service (PaaS) – This enables the customer to build, deploy and manage applications in the cloud using programming languages and tools supplied by the cloud provider. The organization can deploy applications without having to manage the underlying hardware and hosting capabilities.
The cloud provider is responsible for securing the platform and every layer below it. The customer is responsible for securing the applications it develops and all access to them. It is also recommended that customers encrypt application data before storing it on the cloud provider's platform, and plan for load balancing across different providers or geographical regions in case of an outage.
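As a sketch of the encryption recommendation above, the snippet below encrypts a record client-side before it is handed to the provider, using the third-party Python cryptography package; the storage call is a hypothetical placeholder for your provider's SDK.

```python
# Minimal sketch of client-side encryption before data is stored on a
# provider's platform, using the third-party "cryptography" package
# (pip install cryptography). The storage call is a placeholder.

from cryptography.fernet import Fernet

# Generate the key once and keep it in your own key store or HSM,
# never alongside the encrypted data.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b'{"customer_id": 42, "email": "jane@example.com"}'
ciphertext = fernet.encrypt(record)

# store_in_cloud(ciphertext)  # hypothetical call to your provider's SDK

# Later, after fetching the ciphertext back:
assert fernet.decrypt(ciphertext) == record
```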
Infrastructure as a Service (IaaS) – The cloud provider delivers computing infrastructure, along with storage and networking, via platform virtualization. The customer can then deploy and run applications and software on that infrastructure as needed.
Responsibility for the underlying hardware, along with all storage and networking resources in use, falls to the cloud provider. The customer is responsible for putting controls in place over how virtual machines are created and who has access to them, both to keep costs in check and to reduce wastage of resources.
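A minimal sketch of one such control is shown below, assuming AWS and the boto3 SDK: it lists running virtual machines that are missing a required ownership tag so they can be reviewed or stopped. The tag name is a hypothetical in-house convention, not an AWS requirement.

```python
# Minimal sketch of a VM-governance check, assuming AWS and boto3
# (pip install boto3). The required "owner" tag is a hypothetical
# in-house convention; adjust to your own tagging policy.

import boto3

ec2 = boto3.client("ec2")

def untagged_running_instances(required_tag: str = "owner") -> list:
    """Return IDs of running instances that are missing the required tag."""
    offenders = []
    pages = ec2.get_paginator("describe_instances").paginate(
        Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
    )
    for page in pages:
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"] for t in instance.get("Tags", [])}
                if required_tag not in tags:
                    offenders.append(instance["InstanceId"])
    return offenders

if __name__ == "__main__":
    ids = untagged_running_instances()
    if ids:
        # Stop (or at least flag) machines nobody has claimed ownership of.
        ec2.stop_instances(InstanceIds=ids)
```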
Recommended practices
– Encrypt data before migrating: Your cloud provider will do everything it can to keep the data you upload secure; however, the service itself is not infallible. If the data contains private information that should never reach a third party, encrypt it before storing or uploading it (the client-side encryption sketch in the PaaS section above applies here as well).
– Take care of data security (at rest): This primarily falls under the following categories:
– Encrypt your data: All cloud providers will have some encryption systems in place to protect your data from rogue usage. Make sure these systems are in accordance with your organization’s policies. For security reasons, you may also want to manage the encryption keys yourself rather than let your provider do it; check whether this option is available (see the first sketch after this list).
– Protect your keys: Some providers will allow you to handle encryption keys yourself, backed by hardware security modules (HSMs). This places the responsibility of managing the keys on the customer but allows for better control. You will also be issued SSH and API keys for access to various cloud services; store these securely and protect them against unauthorized access. Remember, if the keys are compromised, there is likely nothing your provider can do to help you!
– Data that is deleted stays deleted: Redundancy systems used by cloud providers often replicate data to maintain persistence. As a result, sensitive data can find its way into logging systems, backups, and management tools. It is highly recommended to be familiar with your provider's deployment and backup systems so you can track where copies of your data may have ended up.
– Secure your data in transit: Firewalls, network access control solutions, and organizational policies should be in place to keep your data safe from malware and intrusions. For example, policies should automatically encrypt or block sensitive data when it is attached to an email or moved to another cloud storage service or an external drive. Categorizing and classifying all company data, no matter where it resides, makes this kind of access control easier. It also helps to reject unencrypted transport at the storage layer itself (see the second sketch after this list).
– Prevent unauthorized cloud usage: Strict policies need to be in place to ensure that employees can only access the resources they should. Similar measures are needed to regulate the number of virtual machines being run and to make sure those machines are spun down when not in use.
Every cloud provider has its own governance services for managing resource usage; it is also highly recommended to put an in-house cloud governance framework in place.
– Keep an audit trail: Cloud ecosystems run on a pay-as-you-go basis and can rack up huge bills and considerable wastage when not used properly, so tracking the use of cloud resources is very important. Your cloud provider will likely have a system in place to generate audit trails, but if your cloud implementation is spread across multiple providers, an independent in-house audit trail becomes important. A Cloud Service Broker solution can assist you here by monitoring resource usage and identifying vulnerabilities and rogue users, which brings us to the next point.
– Ask your provider: Your cloud provider will have numerous manuals and whitepapers describing best practices to follow for various implementations. Make sure to take advantage of them!
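To tie back to the “Encrypt your data” and “Protect your keys” items above, here is a minimal sketch of writing an object with server-side encryption under a customer-managed key, assuming AWS, the boto3 SDK, and an existing KMS key; the bucket name and key alias are hypothetical placeholders.

```python
# Minimal sketch of at-rest encryption with a customer-managed key,
# assuming AWS and boto3. Bucket name and KMS key alias are hypothetical
# placeholders; create and manage the key under your own key policy.

import boto3

s3 = boto3.client("s3")

s3.put_object(
    Bucket="example-sensitive-data",       # hypothetical bucket
    Key="reports/q3.csv",
    Body=b"...",                           # your (ideally pre-encrypted) data
    ServerSideEncryption="aws:kms",
    SSEKMSKeyId="alias/inhouse-data-key",  # customer-managed key alias
)
```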
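And for the “Secure your data in transit” item, here is a minimal sketch of rejecting any storage request that does not arrive over TLS, again assuming AWS and boto3; the bucket name is a hypothetical placeholder, and firewalls, email, and DLP policies still need to be configured separately.

```python
# Minimal sketch of enforcing TLS for data in transit at the storage layer,
# assuming AWS and boto3. The bucket name is a hypothetical placeholder.

import json
import boto3

BUCKET = "example-sensitive-data"

deny_plain_http = {
    "Version": "2012-10-17",
    "Statement": [{
        "Sid": "DenyInsecureTransport",
        "Effect": "Deny",
        "Principal": "*",
        "Action": "s3:*",
        "Resource": [
            f"arn:aws:s3:::{BUCKET}",
            f"arn:aws:s3:::{BUCKET}/*",
        ],
        # Any request that is not made over HTTPS is rejected.
        "Condition": {"Bool": {"aws:SecureTransport": "false"}},
    }],
}

boto3.client("s3").put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(deny_plain_http))
```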
Cloud security is a tricky jungle to navigate, but by following some simple guidelines and best practices, you can keep your organization’s data and applications safe and rest easy.
Experts Talk Series is a repository of articles written and published by cloud experts. Here we talk in-depth about cloud concepts, applications, and implementation practices.