In recent years, the “Edge” has taken on a vital role in cloud computing. Edge computing extends the cloud model by placing workloads, compute, storage, applications, and data closer to the point of action.
Cloud edge computing moves processing closer to the users and IoT devices where data is generated and consumed. By performing computation and data compression right at the point of action, it minimizes latency and maximizes usable bandwidth across highly distributed edge sites. Edge computing can even address compliance requirements, which vary between states and countries.
The Edge is decentralizing the cloud itself and creating a better model to support emerging use cases like self-driving cars, augmented reality (AR) and virtual reality (VR), connected homes and offices, 5G and more.
Guardicore is excited to partner and work together with NVIDIA to leverage their high-performance, cloud-native NVIDIA EGX Edge AI platform to deliver AI, IoT and 5G-based services efficiently, powerfully, and securely.
Many verticals can benefit from edge computing. Here are just two examples:
- Healthcare organizations can run machine learning and analytics models on their health management platforms, especially where low-latency processing requirements dictate that they remain on-premises. Because the data is stored locally, it is also quick to retrieve.
- Financial services firms can leverage edge computing to handle the real-time processing of data that must reside within the confines of local data requirements.
Decentralizing the cloud has many benefits, but it also creates and amplifies security challenges that are already present in the cloud. The distributed cloud edge creates a larger attack surface, spread across diverse IoT technologies and multiple unprotected physical locations, giving attackers more opportunities to penetrate the organization and achieve their malicious goals.
Edge-related security challenges are compounded by the accelerating pace of infrastructure change and the more dynamic application deployment models required to support the Edge (but that is a topic for a different blog post).
In other words, the security of the cloud, which has always been a top priority, becomes even more important at the Edge.
To address these unique challenges, security must be built into the edge to ensure quality and transparent operations across the entire extended organization: at the core data center, public cloud, and the Edge.
Weaving security into workloads, compute, storage, critical applications, and data across every environment and platform is a huge challenge.
Fortunately, micro-segmentation is now available, and when implemented correctly, it addresses the security challenges inherent in the distributed, decentralized nature of the Edge. Gartner recently named micro-segmentation one of its top 10 security initiatives, citing its ability to reduce risk and protect the critical assets and information that matter most to the business.
Gartner also described micro-segmentation as being well suited for thwarting “the spread of data center attacks in both on-premises and cloud environments.”
Micro-segmentation is a granular way to create secure zones in data center and cloud deployments, allowing workload isolation and protection. Since legacy perimeter protection is painfully inadequate, micro-segmentation is an essential technology to implement a zero-trust security model. Furthermore, it provides both real-time and historical visibility to understand application dependencies and then easily create network and application security policies based on various business owner contexts.
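To make the zero-trust idea concrete, here is a minimal, self-contained sketch of how label-based segmentation policies work: traffic is denied by default, and only flows matching an explicit allow rule are permitted. All names, labels, and rules below are hypothetical illustrations, not any vendor's actual policy model or API.

```python
# Toy illustration of deny-by-default, label-based micro-segmentation.
# The label scheme and rule format are invented for this example.
from dataclasses import dataclass


@dataclass(frozen=True)
class Workload:
    name: str
    env: str   # e.g. "prod", "dev"
    app: str   # e.g. "billing", "frontend"


# Allow-list rules: (source env, dest env, dest app, dest port).
# Anything not matched by a rule is denied -- the zero-trust default.
ALLOW_RULES = [
    ("prod", "prod", "billing", 5432),   # prod workloads may reach the prod billing DB
    ("prod", "prod", "analytics", 443),  # prod workloads may call the analytics API
]


def is_allowed(src: Workload, dst: Workload, port: int) -> bool:
    """Deny by default; permit only traffic matching an explicit rule."""
    for src_env, dst_env, dst_app, allowed_port in ALLOW_RULES:
        if (src.env, dst.env, dst.app, port) == (src_env, dst_env, dst_app, allowed_port):
            return True
    return False


web = Workload("web-1", env="prod", app="frontend")
db = Workload("db-1", env="prod", app="billing")
dev = Workload("dev-1", env="dev", app="frontend")

print(is_allowed(web, db, 5432))  # True: explicitly allowed
print(is_allowed(dev, db, 5432))  # False: dev is segmented away from prod
print(is_allowed(web, db, 22))    # False: no rule for SSH, denied by default
```

The key design choice this sketch illustrates is that policies are expressed in business terms (environment, application) rather than IP addresses, which is what lets the same rules follow workloads across data center, cloud, and edge deployments.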
The cloud killed the enterprise’s legacy perimeter and the Edge is killing the cloud’s perimeter, making micro-segmentation more important for securing the distributed, hybrid cloud that includes an Edge component.
Micro-segmentation, when well executed, provides benefits at the earliest stages of deployment. Many enterprises start with easily implemented, achievable projects that eliminate the most fundamental risks first. Whether separating development environments from production, isolating compliance-driven infrastructure or applications from non-compliant ones, or simply segmenting the most critical applications first, these early-stage projects provide the enterprise with immediate value and measurable gains.
It’s important to select a micro-segmentation approach that works consistently across multiple cloud providers. By decoupling security from the cloud infrastructure provider, organizations can prevent vendor lock-in from driving costs up and avoid unnecessary complexity when mergers and acquisitions create mixed cloud environments.
Our solutions address both the security and performance requirements by taking advantage of the advanced hardware capabilities of NVIDIA Mellanox BlueField and NVIDIA Mellanox ConnectX SmartNIC technology, including dynamically reconfigurable firewall offloads in hardware, encryption offloads, and the ASAP2 flow engine for virtual switch offloading. We are excited to see secure NVIDIA Mellanox ConnectX adapters integrated into the new NVIDIA EGX Edge AI platform, and we look forward to the benefits that secure, accelerated computing will bring to the edge.