
Guardicore Raises $60 Million; Funding Fuels Company Growth and Continued Disruption

Today I am excited to share that we have secured a Series C funding round of $60 million, bringing our total funding to more than $110 million. The latest round was led by Qumra Capital and was joined by other new investors DTCP, Partech, and ClalTech. Existing investors Battery Ventures, 83North, TPG Growth, and Greenfield Partners also participated in the round.

Since we launched the company in 2015, Guardicore has been focused on a single vision for providing a new, innovative way to protect critical assets in the cloud and data center. Our focus, and our incredible team, has earned the trust of some of the world’s most respected brands by helping them protect what matters most to their business. As the confidence our customers have in us has grown, so has our business, which has demonstrated consistent year-over-year growth for the past three years.

Our growth is due to our ability to deliver on a new approach to securing data centers and clouds using distributed, software-defined segmentation. This approach aligns with the transformation of the modern data center, driven by cloud, hybrid cloud, and PaaS adoption. As a result, we have delivered a solution that redefines the role of firewalls and helps implement Zero Trust security frameworks. More dynamic, agile, and practical security techniques are required to complement or even replace next-generation firewall technologies. We are delivering this, giving our customers the ability to innovate rapidly with the confidence that their security posture can keep up with the pace of change.

Continued Innovation

The movement of critical workloads into virtualized, hybrid cloud environments, industry compliance requirements, and the increase in data center breaches all demand a new approach to security, one that moves away from legacy firewalls and other perimeter-based security products toward a software-defined approach. This movement continues to inspire our innovations and to ensure that our customers have a simpler, faster way to guarantee persistent, consistent security for any application, in any IT environment.

Our innovation is evident in several areas of the company. First, we have been able to quickly add innovative new technology to our Centra solution, working in close partnership with our customers. For example, we deliver expansive coverage of data center, cloud infrastructure, and operating environments, along with simpler, more intuitive ways to define application dependencies and segmentation policies. This gives our customers the right level of protection for critical applications and workloads in virtually any environment.

Second, our Guardicore Labs global research team continues to provide deep insights into the latest exploits and vulnerabilities that matter to the data center. It also equips the industry with open-source tools like Infection Monkey, and with Cyber Threat Intelligence (CTI) that allows security teams to track potential threats in real time.

We have also continued to build out other areas of our business, such as our partner ecosystem, which has earned a five-star partner program rating from CRN since its inception two years ago, as well as our technology alliances, which include relationships with leading cloud/IaaS infrastructure players such as AWS, Azure, and Nutanix.

Looking Ahead

We are proud of our past, but even more excited about our future. While there is always more work to do, we are in a unique position to lead the market with not only great technology, but a strong roster of customers, partners and, most importantly, a team of Guardicorians that challenge the status quo every single day to deliver the most innovative solutions to meet the new requirements of a cloud-centric era. I truly believe that we have the best team in the business.

Finally, as we celebrate this important milestone, I want to say thank you to our customers who have made Guardicore their trusted security partner. It is our mission to continue to earn your trust by ensuring you maximize the value of your security investments beyond your goals and expectations.

Understanding and Avoiding Security Misconfiguration

Security misconfiguration is simply defined as failing to implement all of the security controls for a server or web application, or implementing the security controls but doing so with errors. What a company thought of as a safe environment actually has dangerous gaps or mistakes that leave the organization open to risk. According to the OWASP Top 10, this type of misconfiguration is number 6 on the list of critical web application security risks.

How Do I Know if I Have a Security Misconfiguration, and What Could It Be?

The truth is, you probably do have misconfigurations in your security, as this is a widespread problem that can occur at any level of the application stack. Some of the most common misconfigurations in traditional data centers include default configurations that have never been changed and remain insecure, incomplete configurations that were intended to be temporary, and wrong assumptions about an application's expected network behavior and connectivity requirements.

In today’s hybrid data centers and cloud environments, and with the complexity of applications, operating systems, frameworks, and workloads, this challenge is growing. These environments are technologically diverse and rapidly changing, making it difficult to understand and introduce the right controls for secure configuration. Without the right level of visibility, security misconfiguration opens new risks in heterogeneous environments. These include:

  • Unnecessary administration ports that are open for an application. These expose the application to remote attacks.
  • Outbound connections to various internet services. These could reveal unwanted behavior of the application in a critical environment.
  • Legacy applications that are trying to communicate with applications that do not exist anymore. Attackers could mimic these applications to establish a connection.
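The first of these risks lends itself to a simple automated check. The sketch below probes a host for administration ports that should not be reachable and compares the result against an allowlist; the hostnames, port numbers, and allowlist here are hypothetical illustrations of the idea, not a Guardicore feature.

```python
import socket

# Ports this application is expected to expose (hypothetical allowlist).
EXPECTED_PORTS = {443}

# Common administration ports that should not face the network.
ADMIN_PORTS = {22, 3389, 5900, 8080}

def open_ports(host, ports, timeout=0.5):
    """Return the subset of `ports` that accept a TCP connection on `host`."""
    found = set()
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as s:
            s.settimeout(timeout)
            if s.connect_ex((host, port)) == 0:
                found.add(port)
    return found

def audit(host):
    """Flag reachable admin ports that are not on the expected allowlist."""
    exposed = open_ports(host, ADMIN_PORTS | EXPECTED_PORTS)
    return sorted(exposed - EXPECTED_PORTS)
```

Run against a server still carrying a default configuration, a check like this would surface ports such as 22 or 8080 that were never meant to be exposed.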

The Enhanced Risk of Misconfiguration in a Hybrid-Cloud Environment

While security misconfiguration in traditional data centers puts companies at risk of unauthorized access to application resources, data exposure, and in-organization threats, the advent of the cloud has expanded the threat landscape exponentially. It comes as no surprise that “2017 saw an incredible 424 percent increase in records breached through misconfigurations in cloud servers,” according to a recent report by IBM. This kind of cloud security misconfiguration accounted for almost 70% of the overall compromised data records that year.

One element to consider in a hybrid environment is the use of public cloud services, third-party services, and applications hosted on different infrastructure. Unauthorized application access, whether from external sources, internal applications, or legacy applications, can open a business up to a large amount of risk.

Firewalls can often suffer from misconfiguration, with policies left dangerously loose and permissive, providing a large amount of exposure to the network. In many cases, production environments are not firewalled from development environments, or firewalls are not used to enforce least privilege where it could be most beneficial.

Private servers running third-party vendor software can lack visibility or a clear understanding of shared responsibility, often resulting in misconfiguration. One example is the 2018 Exactis breach, in which 340 million records were exposed, affecting more than 21 million companies. Exactis was responsible for its data, even though it used the standard, widely deployed Elasticsearch as its database. Critically, it failed to implement any access control to manage this shared responsibility.

With so much complexity in a heterogeneous environment, and human error often responsible for misconfiguration that may well be outside of your control, how can you demystify errors and keep your business safe?

Learning about Application Behavior to Mitigate the Risk of Misconfiguration

Visibility is your new best friend when it comes to fighting security misconfiguration in a hybrid cloud environment. Your business needs to learn the behavior of its applications, focusing in on each critical asset and its behavior. To do this, you need an accurate, real-time map of your entire ecosystem, which shows you communication and flows across your data center environment, whether that’s on premises, bare metal, hybrid cloud, or using containers and microservices.

This visibility not only helps you learn more about expected application behaviors, it also allows you to identify potential misconfigurations at a glance. One example could be revealing repeated connection failures from a specific application. On exploration, you may discover that it is attempting to connect to a legacy application that is no longer in use. Without a real-time map of communications and flows, this could well have led to a breach, with malware imitating the abandoned application to extract data or expose application behaviors. With foundational visibility, you can use this information to remove any disused or unnecessary applications or features.
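As a rough sketch of how such a signal might be pulled out of flow data, the snippet below counts repeated connection failures per source/destination pair. The flow records, application names, and threshold are hypothetical; a real flow map would be far richer.

```python
from collections import Counter

# Hypothetical flow records: (source_app, dest_app, outcome).
flows = [
    ("billing", "ledger", "ok"),
    ("billing", "legacy-reports", "refused"),
    ("billing", "legacy-reports", "refused"),
    ("billing", "legacy-reports", "refused"),
    ("crm", "ledger", "ok"),
]

def repeated_failures(flows, threshold=3):
    """Return (source, dest) pairs with at least `threshold` failed connections."""
    fails = Counter((src, dst) for src, dst, outcome in flows if outcome != "ok")
    return [pair for pair, count in fails.items() if count >= threshold]

print(repeated_failures(flows))  # [('billing', 'legacy-reports')]
```

Here the billing application keeps trying to reach a decommissioned reporting service, exactly the kind of pattern worth investigating before an attacker notices it.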

Once you gain visibility and have a thorough understanding of your entire environment, the best way to manage risk is to lock down the most critical infrastructure, allowing only desired behavior, in a manner similar to a zero-trust model. Any communication that is not necessary for an application should be blocked. This is what OWASP calls a ‘segmented application architecture’, and it is their recommendation for protecting yourself against security misconfiguration.

Micro-segmentation is an effective way to make this happen. Strict policy protects communication to the most sensitive applications, and therefore their information, so that even if a breach happens due to security misconfiguration, attackers cannot pivot to the most critical areas.
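In spirit, such a policy is just a default-deny allowlist. The toy sketch below shows the idea: a flow is permitted only if an explicit allow rule covers it. The tier names and ports are hypothetical, and this is not Centra's policy syntax.

```python
# Hypothetical segmentation policy: explicit allow rules; everything
# else is denied by default, in the spirit of a zero-trust model.
ALLOW = {
    ("web", "app", 8443),   # web tier may call the application tier
    ("app", "db", 5432),    # application tier may query the database
}

def is_allowed(src, dst, port):
    """Default-deny check: permit a flow only if an allow rule covers it."""
    return (src, dst, port) in ALLOW

assert is_allowed("app", "db", 5432)      # expected database traffic
assert not is_allowed("web", "db", 5432)  # web tier may not reach the db
```

Even if a misconfigured web server is compromised, the policy leaves the attacker no direct path to the database.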

Visibility and Smart Policy Limit the Risk of Security Misconfiguration

The chances are, your business is already plagued by security misconfiguration. Complex and dynamic data centers are only increasing the risk of human error, as we add third-party services, external vendors, and public cloud management to our business ecosystems.

Guardicore Centra provides an accurate and detailed map of your hybrid-cloud data center as an important first step, enabling you to automatically identify unusual behavior and remove or mitigate unpatched features and applications, as well as identify anomalies in communication.

Once you’ve revealed your critical assets, you can then use micro-segmentation policy to ensure you are protected in case of a breach, limiting the attack surface if misconfigurations go unresolved or if patch management is delayed on-premises or by external vendors. This all-in-one solution of visibility, breach detection, and response is a powerful tool for protecting your hybrid-cloud environment against security misconfiguration, and for strengthening your security posture as a whole.

Want to hear more about Guardicore Centra and micro-segmentation? Get in touch

Looking for a Micro-segmentation Technology That Works? Think Overlay Model

Gartner’s Four Models for Micro-Segmentation

Gartner recently updated its micro-segmentation evaluation factors document, “How to Use Evaluation Factors to Select the Best Micro-Segmentation Model” (refreshed 5 November 2018).

The report details four different models for micro-segmentation but does not make a clear recommendation on which is best. Finding the answer means looking at the limitations of each model and recognizing what the future looks like for dynamic hybrid-cloud data centers. I recommend reading the report and evaluating the different capabilities; for us at Guardicore, however, it is clear that one model stands above the others, and it should come as no surprise that vendors that previously used other models are now changing their technology to adopt it: Overlay.

But first, let me explain why other models are not adequate for most enterprise customers.

The Inflexibility of Native-Cloud Controls

The native model uses the tools that are provided with a virtualization platform, hypervisor, or infrastructure. This model is inherently limited and inflexible. Even for businesses using a single hypervisor provider, this model ties them into one service, as micro-segmentation policy cannot simply be moved when you switch providers. In addition, while businesses might think they are working with one IaaS provider or hypervisor, parts of the organization may be running servers elsewhere too, a phenomenon known as Shadow IT. Tellingly, vendors that used to offer native controls for micro-segmentation have realized that customers are transforming and have had to develop new Overlay-based products.

More commonly, enterprises know that they are working with multiple cloud providers and services, and need a micro-segmentation strategy that can work seamlessly across this heterogeneous environment.

The Inconsistency of Third-Party Firewalls

This model is based on virtual firewalls offered by third-party vendors. Enterprises using this model are often subject to network layer design limitations, and therefore forced to change their networking topology. They can be prevented from gaining visibility due to proprietary applications, encryption, or invisible and uncontrolled traffic on the same VLAN.

A known issue with this approach is the creation of bottlenecks due to reliance on additional third-party infrastructure. Essentially, this model is not a consistent solution across different architectures, and can’t be used to control the container layer.

The Complexity of a Hybrid Model

A combination of the two models above, the hybrid model is an attempt to limit the downsides of each on its own. For more flexibility than native controls allow, enterprises usually use third-party firewalls for north-south traffic; inside the data center, where multi-cloud support is not a concern, native controls handle east-west traffic.

However, as discussed, both of these solutions, even in tandem, are limited at best. A hybrid approach also adds the extra problems of complex, arduous setup and maintenance. Visibility and control under a hybrid model are unsustainable in a future-focused IT ecosystem where workloads and applications are spun up, automated, auto-scaled, and migrated across multiple environments. Enterprises need one solution that works well, not two that are sub-par individually and limited together.

Understanding the Overlay Model – the Only Solution Built for Future Focused Micro-Segmentation

Rather than a patched-together hybrid of imperfect models, Overlay is built from the ground up to be a more robust and future-proof solution. Gartner describes the Overlay model as one in which a host agent or software runs on the workload itself, and agent-to-agent communication is used rather than network zoning.

One of the drawbacks of third-party firewalls is that they are inherently hard to scale. In contrast, agents have no choke points to constrain them, so they can scale with your needs.

With Overlay, your business has the best possible visibility across a complex and dynamic environment, with insight and control down to the process layer, including for future-focused architecture like container technology. The only model that can address infrastructure differences, Overlay is agnostic to any operational or infrastructure environment, which means an enterprise has support for anything from bare metal and cloud to virtual machines or microservices, or whatever technology comes next. Without an Overlay model, your business can’t be sure of supporting future use cases and remaining competitive.

Not all Overlay Models are Created Equal

It’s clear that Overlay is the strongest technology model, and the only future-focused solution for micro-segmentation. This is true for traditional access-list style micro-segmentation as well as for implementing deeper security capabilities that include support for layer 7 and application-level controls.

Unfortunately, not every vendor provides the best version of Overlay or delivers the full functionality it is capable of. Utilizing the inherent benefits of an Overlay solution means you can put agents in the right places, setting communication policy that works in a granular way. With the right vendor, you can make intelligent choices about where to place agents, using context and process-level visibility all the way to Layer 7. Your vendor should also be able to provide extra functionality, such as enforcement by account, user, or hash, all within the same agent.

Remember that protecting the infrastructure requires more than micro-segmentation and you will have to deploy additional solutions that will allow you to reduce risk and meet security and compliance requirements.

Micro-segmentation has moved from being an exciting new buzzword in cyber-security to an essential risk-reduction strategy for any forward-thinking enterprise. If it’s on your to-do list for 2019, make sure you do it right, and don’t fall victim to the limitations of an agentless model. Guardicore Centra provides an all-in-one solution for risk reduction, with a powerful Overlay model that supports a deep and flexible approach to workload security in any environment.

Want to learn more about the differences between agent and agentless micro-segmentation? Check out our recent white paper.

Read More

What is Micro-Segmentation?

Micro-segmentation is an emerging security best practice that offers a number of advantages over more established approaches like network segmentation and application segmentation. The added granularity that micro-segmentation offers is essential at a time when many organizations are adopting cloud services and new deployment options like containers that make traditional perimeter security less relevant.

Micro-Segmentation Methods

The best way for organizations to get started with micro-segmentation is to identify the methods that best align with their security and policy objectives, start with focused policies, and gradually layer additional micro-segmentation techniques over time through step-by-step iteration.

Harness the Benefits of Micro-Segmentation

One of the major benefits of micro-segmentation is that it provides shared visibility into the assets and activities in an environment without slowing development and innovation. Implementing micro-segmentation greatly reduces the attack surface in environments with a diverse set of deployment models and a high rate of change.

Time to Transform Data Centre Security?

Digital transformation is, by its very definition, redefining how data centres are designed and how services are managed and deployed. In fact, much like the long-maligned ‘perimeter’ security model, many once datacentre-centric workloads are evaporating and re-forming as more agile, elastic cloud-based operational models.

Read more

Security Features of the Hybrid Cloud (OpenStack and AWS)

Everyone knows about the many benefits of the cloud: it is infinitely scalable, developer-friendly, and easy to use. However, we often avoid addressing the reality that the cloud is not perfect. The truth is that, despite the cloud’s many merits, it presents a significant challenge from a security standpoint. Security concerns might make you hesitate to deploy your workloads in any cloud, be it public or private – and understandably so.

Read more

Why Security Teams Need Visibility into Container Networks

Containers and orchestration systems use numerous technical abstractions to support auto-scaling and distributed applications that obfuscate visibility into application communication flows. Security teams lose visibility into application communication flows, rendering traditional tools useless and exposing the application to risk.
Read more

Avoiding Micro-Segmentation Pitfalls: A Phased Approach to Implementation

Micro-segmentation is very achievable. While it can feel daunting, you can succeed by proactively being aware of and avoiding these roadblocks.

Read more