Ask many of today’s enterprise businesses what the most important factors are to remain competitive in their industry, and you’re likely to get an answer that includes both speed and innovation. There’s always another competitor snapping at your heels, and there aren’t enough hours in the day to get through your to-do lists. The faster you can go live with new features and updates, the better.
For many, this comes at a steep price – security. If speed and innovation are the top items on the agenda, how can you balance them with keeping your sensitive information and critical assets safe? Of course, pushing security onto the back burner is never a solution, as increased risk, along with compliance and internal governance mandates, will continually remind us.
Fellow cybersecurity evangelist Tricia Howard and I discussed this conundrum a while back. She came up with a terrific visual representation of this dilemma, which can be seen in the Penrose Triangle, below. This diagram, also known as the ‘impossible triangle’, is an optical illusion. In this drawing, the two bottom points, speed and innovation, make the top point, security, seem like it’s further away – but it’s not.
First, let’s look at how organizations are achieving the speed and innovation corners of this triangle, and then we can see why securing our IT environments has become a greater challenge, yet still an achievable one.
Understanding the Cloud and DevOps Best Practices
There are two key elements to the DevOps process as we know it today. The first is simplifying management by decoupling it from the underlying platforms. Instead of managing each system or platform separately, DevOps and cloud best practices seek solutions that provide an abstraction layer. Using this layer, enterprises can work across all systems, from legacy to future-focused, without impediment. This streamlining has become essential in today’s enterprises, which run everything from legacy, end-of-life operating systems and platforms to modern virtualized environments, clouds and containers.
Secondly, DevOps and cloud best practices utilize automated provisioning, management and autoscaling of workloads, allowing teams to work faster and smarter. These are implemented through playbooks and scripts, using tools such as Chef, Puppet and Ansible, to name a few.
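The core idea behind tools like Chef, Puppet and Ansible is declarative, desired-state automation: you describe what the environment should look like, and a reconciliation step works out the changes needed to get there. The sketch below illustrates that model in plain Python; the host and service names are invented for the example, and no real configuration-management API is used.

```python
# Minimal sketch of the desired-state (declarative) model behind
# configuration-management tools: describe WHAT you want, and a
# reconciliation step works out HOW to get there.

desired = {                      # the "playbook": target state per host
    "web-01": {"nginx": "running", "postgres": "absent"},
    "web-02": {"nginx": "running"},
}

actual = {                       # what the hosts currently look like
    "web-01": {"nginx": "stopped", "postgres": "running"},
    "web-02": {"nginx": "running"},
}

def reconcile(desired, actual):
    """Return only the changes needed to move `actual` to `desired`."""
    changes = []
    for host, services in desired.items():
        current = actual.get(host, {})
        for svc, want in services.items():
            have = current.get(svc, "absent")
            if have != want:
                changes.append((host, svc, have, want))
    return changes

for host, svc, have, want in reconcile(desired, actual):
    print(f"{host}: {svc} {have} -> {want}")
```

Because only the differences are applied, running the same playbook twice is safe: once actual state matches desired state, `reconcile` returns an empty list and nothing is touched. That idempotence is what makes this style of automation practical at the scale of thousands of servers.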
Sounds Great, but not for Traditional Segmentation Tools
These new best practices allow enterprises to push out new features quickly, remain competitive, and act at the speed of today’s fast-paced world. However, securing these environments with traditional security methods is all but impossible.
Historically, organizations would use firewalls, VLANs and ACLs for on-premises systems, and then virtualized firewalls and Security Groups in their cloud environments. Without an established external perimeter, with so many advanced cyberattacks, and with dynamic change happening all the time, these have now become yesterday’s solution. Here are just some of the problems:
- Complex to manage: Using firewalls, VLANs and ACLs on-premises and security groups in the cloud, for example, means juggling multiple systems, which adds management complexity, is resource intensive and does not provide the seamless visibility required. The rule sets vary and can even contradict one another, and you don’t know whether gaps are leaving you open to unnecessary risk.
- Increased maintenance: Changes to these systems must be carried out manually, and nothing short of automation is enough for today’s complex IT environments. With tens of thousands of servers or communication flows to handle, it’s impossible to keep up by hand.
- Low visibility: For strong security, your business needs to see down to the process level, including user identity and domain name information, across all systems and assets. Without even basic visibility, your IT teams cannot understand application and user workflows or behavior, and any simple change could cause an outage or a problem that slows down business as usual.
- Platform-specific: VLANs, for example, do not work in the cloud, and security groups won’t help on-premises. To ensure wide coverage, you need a security solution that can visualize and control everything, from the most legacy infrastructure or bare-metal servers all the way through to clouds, containers and serverless computing.
- Coarse controls: The most common traditional segmentation tools are port- and IP-based, even though today’s attackers target processes, users and workloads. Firewalls are innately perimeter controls, so they cannot be placed between most internal traffic points. Companies attempt to fix this by re-engineering traffic flows, but that is a huge effort which can become a serious bottleneck.
Introducing Software-Defined Segmentation: An Approach That Works with DevOps From the Start
With these challenges in mind, there are security solutions that take advantage of DevOps and cloud best practices, allowing us to build an abstraction layer that simplifies visibility and control across our environment in a seamless, streamlined fashion, and to harness DevOps and cloud automation to gain speed as well.
Software-defined segmentation is built from the start to address the challenges traditional tools face in the hybrid cloud and modern data center. Just as with cloud or DevOps processes, visibility and policy management are decoupled from the underlying platforms, working on an abstraction layer across all environments and operating systems. On a single platform, organizations gain deep visibility and control over their entire IT ecosystem, from legacy systems through to the most future-focused technology. The insight is far more granular than with any traditional segmentation tool: you can see at a glance the dependencies among applications, users and workloads, making it simple to define and enforce the right policy for your business needs. These policies can be enforced by process, user identity and FQDN, rather than relying on ports and IPs that do little to thwart today’s advanced threats.
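To make the contrast with port- and IP-based rules concrete, here is a simplified sketch of what a policy matched on process, user identity and FQDN might look like. The field names and matching logic are illustrative assumptions for this example, not any vendor’s actual policy engine.

```python
from dataclasses import dataclass

@dataclass
class Flow:
    """One observed connection, enriched with context beyond IP and port."""
    process: str      # binary that opened the connection
    user: str         # identity on the source workload
    dest_fqdn: str    # resolved destination name

@dataclass
class Rule:
    # Hypothetical rule shape: "*" acts as a wildcard in any field.
    process: str
    user: str
    dest_fqdn: str
    action: str       # "allow" or "block"

def evaluate(flow, rules, default="block"):
    """First matching rule wins; unmatched flows fall back to the default."""
    for r in rules:
        if all(pat in ("*", val) for pat, val in [
            (r.process, flow.process),
            (r.user, flow.user),
            (r.dest_fqdn, flow.dest_fqdn),
        ]):
            return r.action
    return default

# Only the database process, running as the service account, may reach the DB.
rules = [
    Rule("sqlservr.exe", "svc-app", "db.internal.example.com", "allow"),
    Rule("*", "*", "db.internal.example.com", "block"),
]

print(evaluate(Flow("sqlservr.exe", "svc-app", "db.internal.example.com"), rules))
print(evaluate(Flow("powershell.exe", "jdoe", "db.internal.example.com"), rules))
```

Note what a port/IP rule could never express here: both flows target the same destination, yet the first is allowed and the second blocked, purely on the basis of which process and user opened the connection.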
Software-defined segmentation follows the DevOps mindset in more ways than one. It incorporates the same techniques for efficiency, innovation and speed, such as automated provisioning, management and autoscaling. Developers can continue to embrace a ‘done once, done right’ attitude, using playbooks and scripts with tools such as Chef, Puppet and Ansible to speed up the process end to end and automate, rather than relying on manual moves, adds, changes and deletes.
Embrace the New, but Cover the Old
Software-defined segmentation ushers in a new age for cybersecurity, providing a faster, more granular way for enterprises to protect their critical assets. Projects that in the past may have spanned years can now be completed in a matter of weeks with this new approach, quickly reducing risk and validating compliance.
If your segmentation solution is stuck in the past, you’re leaving yourself open to risk, making it far easier for hackers to launch an attack, and you’re unlikely to be living up to the necessary compliance mandates for your industry.
Instead, think about a new approach that, just like your DevOps practices, is decoupled from any particular infrastructure, and is both automatable and auto-scalable. On top of this, make sure that it provides equal visibility and control across the board in a granular way, so that speed and innovation can thrive, with security an equal partner in the triangle of success.
Securing modern data centers and clouds needs a whole new approach to segmentation. To learn more about it, check out our white paper.