The Edge isn’t just the Cloud in a different location

Cloud-native approaches to applications will inevitably fall short when it comes to security and manageability at the Edge.


Make no mistake: cloud computing will remain an integral component of most technology stacks for the foreseeable future. But for all the virtues of the cloud (abundant high-powered compute and storage, scalability, relative ease of application development, and more), it is fundamentally inadequate for delivering secure and high-performing services in edge environments. Business leaders need to understand the "why," given the tremendous security and total cost of ownership (TCO) ramifications.

This is why Intel is collaborating with companies like Spectro Cloud on initiatives such as the Secure Edge-Native Architecture (SENA), which will pave the way for more secure and easily manageable edge services.

Going from Cloud Native to Edge Native

The edge is a young, fractured market. Solutions are offered by a small but diverse cohort of vendors, from cloud and communications service providers, silicon manufacturers, and OEMs to start-ups, all without unified standards. One clear difference emerging in the market is the architectural approach to the edge: whether solution design is cloud native (i.e., "cloud-out" to the edge) or edge native (i.e., "edge-in" to the cloud).

While the industry is driving toward a “cloud native” experience for the edge, the cloud computing model by itself cannot unlock the promise of the edge. 

The cloud was born from technology companies renting out unused server capacity to host third-party data and applications. The tools and technologies that arose alongside the maturing cloud infrastructure model have, no doubt, ushered in an innovation revolution driven by developers. The centrality of storage and compute, along with tools that abstract away hardware complexity, has allowed developers to focus on solving business problems and scaling solutions without much concern for hardware, lifecycle management, or physical security.

The cloud model has fundamentally reshaped software development and service monetization. But the maturation of cloud computing has exposed critical limitations of centralized computing:


  1. Speed—it is too slow to send every piece of data to the cloud to determine whether it has value. Data is getting bigger while the window for making critical, actionable decisions is shrinking.
  2. Cost—it is too costly to send all data to the cloud or to run many applications at scale in the public cloud, especially for data-intensive workloads.
  3. Data control—organizations either cannot, or do not want to, move data because of sovereignty, security, or regulatory requirements that demand strict, local data control and insight.

These limitations make it imperative to build and optimize edge platforms and solutions that are “edge-in” to the cloud or edge-native.
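To make the cost limitation concrete, here is a back-of-the-envelope sketch in Python. The camera count, per-camera bit rate, transfer price, and filtering ratio are all illustrative assumptions, not measured figures or vendor pricing:

```python
# Back-of-the-envelope comparison of backhauling raw data to the cloud
# versus filtering it at the edge first. All numbers are illustrative
# assumptions, not vendor pricing.

def monthly_egress_gb(cameras: int, mbps_per_camera: float) -> float:
    """Raw data volume generated per month, in gigabytes."""
    seconds_per_month = 30 * 24 * 3600
    bits = cameras * mbps_per_camera * 1e6 * seconds_per_month
    return bits / 8 / 1e9

def transfer_cost(gb: float, usd_per_gb: float = 0.05) -> float:
    """Cost of moving that volume over a metered link (assumed rate)."""
    return gb * usd_per_gb

raw_gb = monthly_egress_gb(cameras=100, mbps_per_camera=4.0)
# Suppose edge-side inference keeps only frames with detections (~2%).
filtered_gb = raw_gb * 0.02

print(f"raw backhaul:  {raw_gb:,.0f} GB -> ${transfer_cost(raw_gb):,.0f}/month")
print(f"edge-filtered: {filtered_gb:,.0f} GB -> ${transfer_cost(filtered_gb):,.0f}/month")
```

Even with generous assumptions, the arithmetic shows why backhauling everything rarely pencils out for data-intensive workloads.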

Edge devices (e.g., smartphones, smart cameras, and others) generate enormous amounts of data that could be turned into valuable business insights and unlock process automation, were it not for the cost and time associated with backhauling data for processing in the data center. Bringing compute out of the data center and closer to the point of data generation at the edge is one way to reduce cost and latency. However, decentralizing compute creates a host of new challenges foreign to cloud environments.

Instead of homogeneous, high-powered servers protected behind the secure walls of a data center, an edge node will likely operate in a constrained environment: hanging from a lamppost in a major city, baking in the sun at a vineyard, deployed in emergency response scenarios with degraded infrastructure, or even off-planet. All of this leads to complex day-2 operations, with expensive truck rolls potentially required for solution lifecycle management. Nodes may differ in architecture given variable workload needs across the solution stack and power budgets, with data flowing from edge nodes to private networks and public clouds. Data also flows among edge nodes, creating a very different operational workflow at the edge. In addition, Kubernetes and the open-source ecosystem around it dramatically increase complexity for IT, platform engineering, and DevOps teams, with multiple software layers of heterogeneous components that must be kept up to date. And the number of edge nodes can vary from a few to tens of thousands, often requiring administration responsibilities to be shared with less technical or non-technical personnel.

Put another way, a single edge solution can represent thousands of opportunities for failure, unauthorized access, and unending maintenance—potentially threatening operational continuity, brand reputation and bottom lines.
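One practical consequence of designing for failure at the edge is that applications buffer data locally and reconcile with upstream systems when connectivity returns. Below is a minimal store-and-forward sketch; the `upload` callable and the drop-oldest buffering policy are illustrative assumptions, not a prescribed pattern:

```python
import collections
from typing import Callable

class StoreAndForward:
    """Buffer readings locally; flush upstream when the link is available.

    Oldest readings are dropped once the buffer is full, on the assumption
    that recent data is the most actionable -- a policy choice, not a rule.
    """

    def __init__(self, upload: Callable[[dict], bool], max_buffered: int = 10_000):
        self._upload = upload          # returns True on successful delivery
        self._buffer = collections.deque(maxlen=max_buffered)

    def record(self, reading: dict) -> None:
        """Accept a new reading and opportunistically try to drain."""
        self._buffer.append(reading)
        self.flush()

    def flush(self) -> int:
        """Drain the buffer in order; stop at the first failure. Returns count sent."""
        sent = 0
        while self._buffer:
            if not self._upload(self._buffer[0]):
                break                  # link is down; keep data for later
            self._buffer.popleft()
            sent += 1
        return sent
```

The key design choice is that the node, not the app developer's retry logic in the cloud, owns resiliency: data survives an outage on the device and ordering is preserved on replay.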

Why Edge Native Matters

Success at the edge requires a fundamentally different approach to hardware, software, and management. The table below outlines how significant this difference is:[1]




|   | Cloud Native | Edge Native |
| --- | --- | --- |
| App model | Microservices/container based, built for horizontal scaling, often stateless | Container or VM based, monolithic nodes that are often stateful |
|   | App self-orchestration in a horizontal way | Infrastructure and edge orchestration across edges in a hierarchical way |
|   | Rapid spin-up and spin-down | Limited elasticity |
|   | Horizontal and unlimited scaling | Scale out to the edge or scale back to the cloud |
|   | Cloud fabric that never fails; resiliency to zone and region failures is built into apps | Edge is expected to fail; the infrastructure architecture itself manages resiliency |
|   | Centralized model to process and store | Caching, streaming, real-time, and distributed models |
|   | Highly standardized and abstracted | Hardware variety, low abstraction, hostile environments, location awareness |
|   | High speeds and rich capabilities | Varied speeds and capabilities, including mobile and RAN |
|   | Centralized management and automation | Remote centralized management; zero-touch provisioning of hardware and software |
|   | Trusted fabric in secure cloud facilities | Zero-trust environments in physically insecure locations |
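The orchestration contrast above (apps self-orchestrating horizontally in the cloud versus hierarchical infrastructure orchestration across edges) can be pictured as a tiered reconciliation loop: a central control plane holds desired state, a regional tier fans it out, and each node applies only the diff. A minimal sketch, with all names and the desired-state schema as illustrative assumptions:

```python
# Sketch of hierarchical ("edge-in") orchestration: central desired state,
# regional fan-out, per-node reconciliation. Names and the state schema
# are illustrative assumptions, not any product's actual API.

from dataclasses import dataclass, field

@dataclass
class Node:
    name: str
    actual: dict = field(default_factory=dict)

    def reconcile(self, desired: dict) -> list:
        """Apply only the diff between desired and actual state."""
        changes = [k for k, v in desired.items() if self.actual.get(k) != v]
        self.actual.update(desired)
        return changes

@dataclass
class Region:
    name: str
    nodes: list

    def rollout(self, desired: dict) -> dict:
        # A regional tier lets thousands of nodes be updated without every
        # node talking to the central control plane directly.
        return {n.name: n.reconcile(desired) for n in self.nodes}

central_desired = {"os_image": "v2.1", "k8s": "1.29", "policy": "zero-trust"}
region = Region("west", [Node("lamppost-01"), Node("vineyard-07")])
print(region.rollout(central_desired))
```

Reconciling diffs rather than pushing full images is what keeps rollouts tractable when the fleet is heterogeneous and links are slow or intermittent.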

The Secure Edge-Native Architecture (SENA)

At Intel, we understand the edge requires an entirely new paradigm for developing and administering applications on distributed networks. That's why we're eager to collaborate with innovative companies like Spectro Cloud that are working to solve some of the most complex challenges of the distributed edge.

Spectro Cloud announced the Secure Edge-Native Architecture (SENA), a comprehensive solution architecture that outlines the tools and practices for secure, modern edge infrastructure. SENA combines Intel hardware and edge software with Spectro Cloud's Kubernetes management platform Palette, its sponsored open-source project Kairos, and other technology to address edge security as well as overall lifecycle management, i.e., deploying, provisioning, and operating the platform.

SENA draws on Intel's deeply rooted silicon security features, such as Intel® Active Management Technology (AMT) and Intel® Software Guard Extensions (SGX), and the zero-trust management features of Intel edge software (Intel® Smart Edge), enabling true edge-native capabilities for deploying, provisioning, and managing a solution at scale.

This includes:

  • Zero-touch provisioning of the entire edge stack at scale
  • Support for scaling to thousands of locations without performance degradation, based on a decentralized architecture with local policy enforcement
  • Enhanced hardware encryption to statically measure boot and seal user data while dynamically assessing device runtime state
  • A zero-trust access model across the management plane and locations, with granular role-based access control (RBAC)
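The last bullet can be pictured as a per-request check: access is denied unless an explicit role binding grants the action at that location, with no implicit trust granted by network position. The sketch below is generic; the role names, actions, and scopes are hypothetical, not SENA's actual schema:

```python
# Minimal scoped-RBAC check: every request is evaluated against explicit
# role bindings; absence of a binding means deny (the zero-trust default).
# Role names, actions, and scopes are hypothetical, not SENA's schema.

ROLE_PERMISSIONS = {
    "edge-operator": {"node:read", "node:reboot"},
    "edge-viewer":   {"node:read"},
}

# (user, role, scope) bindings; a scope limits the role to certain locations.
BINDINGS = [
    ("alice", "edge-operator", {"store-42"}),
    ("bob",   "edge-viewer",   {"store-42", "store-99"}),
]

def is_allowed(user: str, action: str, location: str) -> bool:
    """Deny unless some binding grants `action` at `location`."""
    for bound_user, role, scope in BINDINGS:
        if bound_user == user and location in scope:
            if action in ROLE_PERMISSIONS.get(role, set()):
                return True
    return False

print(is_allowed("alice", "node:reboot", "store-42"))   # True
print(is_allowed("bob",   "node:reboot", "store-42"))   # False: role lacks it
print(is_allowed("alice", "node:read",   "store-99"))   # False: out of scope
```

Scoping bindings to locations matters at the edge because administration is often delegated to local, less technical personnel who should see only their own sites.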

"As we talk with organizations every day representing a variety of industries and geographies, we continue to be impressed with the consistent interest in delivering and managing new Kubernetes-based applications at the edge," said Tenry Fu, Spectro Cloud CEO and co-founder. "At the same time, a new set of deployment and management challenges is surfacing, which requires us to rethink security and the need to tightly couple and coordinate security capabilities that span hardware and software, from the silicon to the app."

You can learn more about SENA in this whitepaper and webinar.

Building the Distributed Edge with Intel

Enterprises are racing to adopt AI automation in the face of hypercompetitive markets, merging IT and OT technologies, often on legacy infrastructure, to deploy business-critical solutions. Understanding the limitations of the cloud computing model, and optimizing edge platforms and solutions using "edge-in" or edge-native technologies, can make the difference between hitting business goals and cost overruns for early adopters.

Intel, along with our ecosystem partners, is focused on developing the software-defined infrastructure, tools, and standards needed for cloud-like agility at the edge.



[1] Table adapted from the blog Cloud-Native Isn’t Edge-Native (Gartner 2020)


About the Author
Renu N. Navale is vice president in the Data Platforms Group and general manager of the Smart Edge Platforms Division at Intel Corporation. She is responsible for overall strategy, software and platforms that strengthen Intel’s presence in the network edge computing business sector. She also leads ecosystem enabling for the Network Platforms Group (NPG) to accelerate network and edge transformation, and she fosters scale and industrywide collaboration through open source initiatives. Navale joined Intel in 2004 with extensive experience in networking and software. During her Intel career, she has held positions in strategic planning, product management, software marketing and engineering, and in 2015, she earned an Intel Achievement Award, the company’s highest recognition. In 2019, Navale was a finalist for the Edge Computing Woman of Year industry award from Edge Computing World and State of Edge. She also serves on the board of LFEdge, an industry initiative focused on creating an open source community for Edge. Before assuming her current role in 2017, Navale served as technical assistant and chief of staff to the vice president and general manager of NPG. Earlier in her Intel career, she oversaw the creation and launch of the Intel Network Builders program. Her tenure at Intel also includes managing the strategic planning team in the Internet of Things Group’s automotive division, where she led autonomous driving and connected car strategies. Before joining Intel, Navale worked on wireless, VoIP, and network management technologies at Nokia Networks. Her achievements were recognized twice with the company’s highest award for software design and innovation during her eight-year tenure.
Navale holds a bachelor’s degree in computer science and engineering from Bangalore University in India, a master’s degree in computer science and engineering from Arizona State University, and an MBA degree from the Thunderbird School of Global Management. She has published multiple papers on topics related to wireless technologies. An active champion of STEM education, Navale volunteers as a programming coach and science fair mentor and judge. She is currently on the advisory board for the SciTech Institute and the Girls Innovation Academy to promote STEM education. She also champions diversity and inclusion efforts for NPG as part of the People First initiative.