
Managing Your Cloud Workloads Under the Shared Responsibility Model

Long ago and far away, network and security teams could worry about securing their data center. Singular. After some time, cloud computing came along and teams had to figure out strategies and tools for securing data placed into cloud environments while fretting over whether the cloud provider was doing its own due diligence.

Today, various studies suggest that enterprise cloud usage is in the 80-90% range, and the major cloud providers like AWS, Azure, and Google are beefing up their security capabilities and promoting these features as a competitive advantage. As a result, everyone wins. Cloud providers employ skilled teams whose job is to make their environments as secure as possible. The fewer security incidents they have to report, the more attractive their solution is to that 80-90% of companies using cloud.

That said, even with all of the skill and attention to detail, cloud service providers (CSPs) are only responsible for certain parts of security; cloud consumers retain responsibility for their data. In some cases, responsibility extends down through applications to the operating system (OS). Depending on the type of model a company chooses—Infrastructure-as-a-Service, Platform-as-a-Service, or Software-as-a-Service—requirements change.



So far, this is all pretty straightforward. The type of cloud model in use determines the balance of responsibility shared with the CSP. Complexity starts to mount when different cloud models are in use, and even more so when the organization has a multi-cloud deployment. According to a recent report by Forrester Research on behalf of Virtustream, 86% of companies say they have a multi-cloud strategy. That's a lot of ground to cover in terms of determining who does what for which deployment. If the initial goals of moving to the cloud included efficiency and lower management costs, business leaders likely didn't take into account when and where those benefits would be offset.


Securing your cloud deployment

By any definition, outsourcing data and services to the cloud requires, minimally, governance. In a SaaS model, the only thing cloud consumers have to worry about is the data. But that, in and of itself, is a huge responsibility. It's generally the data that attackers are after. Encryption helps, as do regular backups, and companies must be vigilant about who is authorized to use the SaaS application and how those users' credentials are managed. Outside of a few controls, though, responsibility for security is largely outside of the cloud consumers' purview.

Moving through the deployment models, data and applications remain at the core. There is obviously increased responsibility for processes like patching, testing, endpoint security, and network security, but that increased responsibility doesn't change the fundamental need to secure data and applications communicating on the network. It's simply that for IaaS or PaaS, cloud consumers need to think more about the environment in which their data and applications are communicating, and how they're communicating: which data paths are in use, which services talk to which other hosts, and how services and programs (not just users and devices) are authenticated and authorized. Cloud workloads are simply an application vehicle, therefore what else is "on the road" affects the security of those workloads.

But microsegmentation is hard

Microsegmentation has become a common mechanism by which organizations can improve security within IaaS. Increasing the number of gateways through which traffic must be authenticated and authorized decreases the risk of malicious traffic, unauthorized access, and lateral movement within the data center. However, microsegmentation is often conflated with firewalls or other network-centric technologies. The problems with address- and packet-based tools in a cloud model include the ever-changing nature of the cloud (IP addresses and ports can be used for different purposes each time they're assigned, sometimes during a single session), the amount of traffic that must be inspected (potentially causing performance issues), an overabundance of rules/policies to manage, and the dependencies that can break if policies are changed.
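The "ever-changing nature of the cloud" problem is easy to see in miniature. This sketch (all addresses and workload names are made up) shows an allow-list keyed on IP addresses, which does what was intended only as long as each address keeps pointing at the same workload:

```python
# An address-based segmentation rule: allow src -> dst on a port.
# The rule encodes an address, but the *intent* was about a workload.
allow_rules = {("10.0.1.5", "10.0.2.9", 5432)}  # billing app -> database

def allowed(src_ip, dst_ip, port):
    return (src_ip, dst_ip, port) in allow_rules

# Day 1: 10.0.1.5 is the billing app, so the rule matches the intent.
assert allowed("10.0.1.5", "10.0.2.9", 5432)

# Day 2: the app is redeployed and gets 10.0.1.7, while an unrelated
# scratch VM is assigned the old address 10.0.1.5. Nobody changed the
# rule, yet its effect has silently inverted:
assert allowed("10.0.1.5", "10.0.2.9", 5432)      # scratch VM can reach the DB
assert not allowed("10.0.1.7", "10.0.2.9", 5432)  # the real app is blocked
```

Multiply this by thousands of rules and constant redeployment, and the management and dependency problems described above follow directly.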

Issues with deploying and maintaining an address-centric microsegmented network have caused many companies to abandon ship — microsegmentation in its traditional form is just too complex and costly to administer. This doesn't mean that creating secure zones within a network, especially one that is ever-changing and in which cloud-native services are constantly deployed, isn't a good idea. In fact, the more cloud-native applications and services an organization builds and deploys, the more crucial it is for some sort of granular segmentation to be present, since these workloads never pass through a traditional perimeter.

Simplifying segmentation and shared responsibility

The surprising news to some security and networking practitioners is that microsegmentation does not need to rely on network constructs. Policies can be applied based on the data, applications, or services running in the cloud. In this way, when network changes occur, protection of core assets remains. When new workloads are deployed, security policy is immediately applied, and policy travels with the asset, regardless of the environment. Therefore, if the business needs to move data from one cloud to another, it can do so without hassle.

Providers and third-party services change all the time, but identity does not. It's been said that "identity is the new perimeter." If this is the case, cloud consumers need to rely more heavily on identity as a security control, keeping in mind that identity does not only apply to humans and their computing devices. Workloads, too, have identities, and companies can tie those identities to network permissions. This way, cloud consumers can microfocus data protection around core assets and be confident that the computing environment doesn't affect security status. The burden of shared responsibility is lifted because protection remains a constant across deployments and cloud models. This is where security and networking teams can realize the greater efficiency and lower management costs non-technical teams have been touting for years.
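The contrast with address-based rules can be sketched in a few lines. Here the policy is keyed on workload identities rather than addresses; the identity names are hypothetical, and in a real system each identity would be cryptographically verified (for example, via a signed workload attestation) before the check runs:

```python
# Identity-centric segmentation: authorize by who is talking, not by
# what address they happen to have today.
policy = {("billing-app", "customer-db", "postgres")}  # src id, dst id, service

def allowed(src_identity, dst_identity, service):
    """Authorize a connection by verified workload identity."""
    return (src_identity, dst_identity, service) in policy

# The same check holds whether billing-app runs at 10.0.1.5 in one cloud
# today or at 172.16.4.2 in another tomorrow -- the identity decides.
assert allowed("billing-app", "customer-db", "postgres")
assert not allowed("scratch-vm", "customer-db", "postgres")
```

Because the policy references identities, redeployments, address reassignments, and even moves between clouds don't change what it permits, which is exactly the "policy travels with the asset" property described above.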

Written by Katherine Teitler, Director of Content

Katherine Teitler leads content strategy and development for Edgewise Networks. In her role as Director of Content she is a storyteller; a translator; and liaison between sales, marketing, and the customer. Prior to Edgewise, Katherine was the Director of Content for MISTI, a global training and events company, where she was in charge of digital content strategy and programming for the company's cybersecurity events, and the Director of Content at IANS, where she built, managed, and contributed to the company's research portal.