
My Journey to the Clouds... Becoming a Cloud Native

Chris_Norman

The life of a technical marketing specialist has always been hands-on, translating deep technical topics into relatable conversations with customers and users. I once spent a day in Fry's Electronics in the Bay Area, reassuring customers that the Intel Pentium Processor flaw would not impact them.

Now I’m tackling a new challenge. 

As an evangelist on the Open Ecosystem Team, my role is evolving to focus on all aspects of cloud native technologies. I'll be providing deep technical dives into specific programs and projects to illustrate the use cases they address and the problems they solve, documenting concepts and terminology, and sharing insights into how Intel works with cloud service providers (CSPs) to get the optimizations and feature improvements Intel generates into the hands of users and top-of-stack developers.

Cloudy, With the Certainty of Open

My first encounter with open source dates back to 1998, the year I discovered Linux* by using a text-based install to put Red Hat* 5.2 on my trusty 33 MHz Intel® 80486DX computer. Hooked on the power of open, I also contributed to Synfig*, an open source 2D animation program, for several years.

That seems a long way from ‘optimizing from edge to cloud’ -- Intel's motto for cloud native -- but one of the benefits of working here is a continued emphasis on learning, education, and growth despite the inevitable twists and turns of the tech sector.

What started as a hobby turned into a career pivot. In 2011, I talked my way into Intel's Open Source Technology Center (OTC) to spread the word about the benefits of open source on Intel architecture.  It was a big change from my previous positions as a Technical Marketing Engineer in the Network Division (if you ever waited three seconds during boot for the Intel Boot Agent to time out -- sorry!) and in the Customer Quality Network, generating and troubleshooting test patterns for CPU caches. 

In OTC, I championed MeeGo*, Tizen* and Android* as these operating systems were developed for Intel CPUs, working with customers and engineers to resolve issues and implement new features. Around 2016, I started working on Clear Linux* (where you'll still find me moderating the Clear Linux community forum). Clear Linux OS is an open source, rolling-release Linux distribution optimized for performance and security from cloud to edge, and designed for customization and manageability.

Clear Containers* was one of the best things to come out of those efforts. At the time, there was a weighty tradeoff between virtual machines, which were relatively secure but slow and hard to manage, and containers, which were much lighter and faster but less secure. Clear Containers offered users the best of both worlds. In 2017, it merged with Hyper's runV project, resulting in Kata Containers*, a de facto standard for secure containers.
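To make that tradeoff concrete: Kata Containers keeps the familiar container workflow but runs the workload inside a lightweight virtual machine. Here's a minimal sketch, assuming a Kubernetes cluster where Kata is already installed and exposed through a RuntimeClass named "kata" (the pod name, image, and namespace are illustrative), using the official Kubernetes Python client:

```python
# Minimal sketch: schedule a pod whose containers run under a Kata (VM-isolated) runtime.
# Assumes a cluster with a RuntimeClass named "kata" already configured; names are illustrative.
from kubernetes import client, config

config.load_kube_config()  # use the local kubeconfig for cluster access

pod = client.V1Pod(
    metadata=client.V1ObjectMeta(name="kata-demo"),
    spec=client.V1PodSpec(
        runtime_class_name="kata",  # opt this pod into the VM-backed Kata runtime
        containers=[
            client.V1Container(name="app", image="nginx:alpine"),
        ],
    ),
)

client.CoreV1Api().create_namespaced_pod(namespace="default", body=pod)
```

The container image and tooling stay the same; only the isolation boundary underneath the pod changes, which is the "best of both worlds" the project was after.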

While I focused on various open source operating systems, other teams in OTC worked on solving problems around low-level hardware enabling, the Linux kernel, web technologies, and data center software stacks. We had teams working on OpenStack*, Ceph*, Data Plane Development Kit* (DPDK), Storage Performance Development Kit (SPDK) and Kubernetes*. More recently, we've been contributing to technologies involved in software-defined networking and edge computing: Istio*, Envoy*, service mesh, containerd*, CRI-O* and confidential computing. I'll be exploring these projects and technologies in more detail in upcoming months.

Fast forward to 2021, when I dedicated time to working with various cloud workload and language runtime teams as they experimented with different approaches to optimizing configurations and code, then pushed those patches upstream.

All this to say: If not exactly a native in the cloud, I've clocked a lot of hours with cloud native engineering teams and absorbed a lot of information. This is my chance to study the subject from the ground up. My favorite analogy? My knowledge of cloud native concepts is like a closet full of clothes. There's a big pile of clothes pooling on the bottom of my closet (just like the first step in the KonMari Method* – but imagine that all of it sparks joy!). Now it's time to make space, grab the hangers, put up the dividing poles and get organized -- formally research the subject and build a framework that I can slot these concepts into, with a foundation to build on.

Follow Along 

As I document my learnings, I'll focus on open source projects, especially ones that Intel has either initiated or contributed to, highlighting options based on Intel architecture for each of the features and technologies. The spotlight will land on examples of the code (features, patches, and bug fixes) Intel has contributed that have been adopted by upstream projects, then filtered down to implementations in Docker* images and CSPs, before being adopted by customers.

As I ramp up, I'll cover a lot of the basics, but since no learning journey is linear, there will also be more advanced topics, which I'll cover by talking to experts.

I've been encouraged to focus not just on how developers can take advantage of the features and optimizations that Intel has developed, but also on how they can be used in real-world CSPs.

Update

August 2023: Well, life is never quite as predictable as we expect. After 34 years of working here at Intel, I'm going to be retiring this month. I will continue to stay involved in open source, specifically in and around the Clear Linux* project, but I will no longer be part of the Open Ecosystem team here at Intel. I'm sure they will continue to provide many more interesting articles in the future, and I look forward to reading them!

About the Author

Chris Norman is an Open Source Advocate who has promoted the use of open source ecosystems for over a decade. You can find him as pixelgeek on Twitter, Mastodon, IRC and GitHub.

 

Cover photo by CHUTTERSNAP on Unsplash