Infrastructure Considerations for Containers and Kubernetes

  • Updated on August 8, 2023
  • By Alex Lesser

    Experienced and dedicated evangelist for integrated hardware solutions and effective HPC platform deployments for more than 30 years.



    Containers and Kubernetes are at the heart of a broad industry shift in which applications and services are built on a microservices architecture. Microservices are being rapidly adopted as a way to build and modernize distributed applications, making them more scalable, flexible, resilient, and easier to build.

    Instead of building self-contained, monolithic applications, a microservices approach breaks applications into modular, independent components that can be dynamically integrated with one another using application programming interfaces (APIs).

    Increasingly, companies are using containers to power their microservices application architectures. Containers encapsulate a lightweight runtime environment for an application: they provide finer-grained execution environments and isolate applications from one another while keeping overhead low.

    Furthermore, containers include everything an application needs to run, such as code, dependencies, libraries, and binaries. Today, Docker is the most popular choice for building and running containers.
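    As a rough sketch of what that packaging looks like, the Dockerfile below describes a hypothetical small Python web service; the base image, file names, port, and start command are illustrative assumptions rather than details from any particular application.

```dockerfile
# Hypothetical example: package a small Python web service with its dependencies
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so this layer is cached between builds
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the application code itself
COPY . .

# Document the listening port and define the start command
EXPOSE 8080
CMD ["python", "app.py"]
```

    Building the image with docker build -t web-frontend:1.0 . and starting it with docker run -p 8080:8080 web-frontend:1.0 yields a self-contained unit that runs the same way on a laptop, a bare-metal server, or a cloud VM.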

    Compared to virtual machines, containers share the host OS kernel instead of each carrying a full copy of an operating system, so they take up less space. Because they avoid the OS boot time associated with a VM, containers start in seconds or even milliseconds, far faster than VMs. As such, containers deliver performance characteristics that match the needs of a microservices architecture; in particular, quick instantiation maps well to the unpredictable workload patterns associated with microservices.

    The growing adoption of containers was validated in a 2019 industry container usage survey, which found that the median number of containers per host doubled to 30 between 2018 and 2019, and that the maximum per-node density reached 250 containers, a 38% increase from 2018.

    Managing Your Containers

    With such explosive growth in the use of containers, companies need a way to oversee and manage them at scale. That’s where Kubernetes comes in.

    Kubernetes is an open-source container orchestration system for automating the deployment, scaling, and management of application containers across clusters of hosts. Originally designed by Google, it is now maintained by the Cloud Native Computing Foundation. Kubernetes works with a range of container tools, including Docker, and groups the containers that make up an application into logical units for easy management and discovery.
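    To make that concrete, the manifest below is a minimal sketch of a Kubernetes Deployment; the names, labels, and image are hypothetical, and it assumes the image is reachable from the cluster.

```yaml
# Minimal sketch of a Deployment; names, labels, and image are hypothetical
apiVersion: apps/v1
kind: Deployment
metadata:
  name: web-frontend
spec:
  replicas: 3                 # keep three copies of the Pod running
  selector:
    matchLabels:
      app: web-frontend       # the label that groups these Pods into one logical unit
  template:
    metadata:
      labels:
        app: web-frontend
    spec:
      containers:
        - name: web
          image: registry.example.com/web-frontend:1.0
          ports:
            - containerPort: 8080
```

    Applying the manifest with kubectl apply -f deployment.yaml hands the desired state to Kubernetes, which schedules the Pods across the cluster and keeps them at the requested count.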

    Kubernetes provides a framework for running distributed systems resiliently, taking care of scaling and failover for an application. For example, in a production environment, Kubernetes can start a new container if one goes down, helping to ensure there is no application downtime. Additionally, Kubernetes provides service discovery and load balancing, storage orchestration, automated rollouts and rollbacks, self-healing, configuration management, and more.
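    As a sketch of how service discovery and load balancing are expressed, the Service below (continuing the hypothetical web-frontend example above) gives the Pods a stable name and spreads traffic across whichever replicas are currently healthy.

```yaml
# Sketch of a Service fronting the Pods created by the Deployment above
apiVersion: v1
kind: Service
metadata:
  name: web-frontend
spec:
  selector:
    app: web-frontend         # route to any Pod carrying this label
  ports:
    - port: 80                # port other workloads connect to
      targetPort: 8080        # port the container actually listens on
```

    Other workloads in the cluster can then reach the application through the cluster-internal DNS name web-frontend instead of tracking individual Pod IP addresses, which change as Pods are rescheduled.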

    While there are a handful of container orchestrators available today, Kubernetes dominates the market. In addition to the widely used open-source variant, some commercial offerings, such as Red Hat OpenShift, are built on Kubernetes and add enterprise features and support.

    Kubernetes can be deployed on a bare-metal cluster or on a cluster of virtual machines, and it can likewise orchestrate the containers it manages directly on bare metal or on VMs. Most Kubernetes instances today run on VMs, whether on-premises or in the cloud.

    Bare-metal instances are not as common. However, there are use cases where they offer advantages. For example, a network edge application might be too latency-sensitive to tolerate the overhead created by a VM. Or an application (such as machine learning) might need to run on GPUs or other hardware accelerators, which do not lend themselves to VMs.
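    As an illustration of the accelerator case, the Pod sketch below requests a GPU through the extended resource name nvidia.com/gpu; this assumes a cluster whose GPU nodes run the NVIDIA device plugin, and the image name is hypothetical.

```yaml
# Sketch of a Pod requesting one GPU; assumes the NVIDIA device plugin is installed
apiVersion: v1
kind: Pod
metadata:
  name: training-job
spec:
  restartPolicy: Never
  containers:
    - name: trainer
      image: registry.example.com/ml-trainer:1.0   # hypothetical training image
      resources:
        limits:
          nvidia.com/gpu: 1   # schedule onto a node that exposes a GPU
```

    On bare metal the container works against the physical device directly, which is one reason accelerator-heavy workloads are a common bare-metal use case.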

    The company is ideally suited to deliver NZO Cloud HPC and big data platforms for government agencies across sectors from defense to public health. In conjunction with a close IT solutions partner, we worked to anticipate the technology needs of our government and created a proposal focused on a weather data ingestion and display platform built on our hardware with SAS software, a business intelligence, statistics, and analytics software stack. Among several hundred submissions, ours was just one in six given the chance to pitch at the U.S. Space Force Pitch Day.

    NZO Cloud chose to partner with two other prominent organizations to offer Space Force a solution for weather data ingestion. The stated goal is to deliver a single dashboard to help operators monitor and anticipate launch conditions, accomplished through high-performance computing instances, artificial intelligence, and graphical tools that simplify interpreting the data. Space Force awarded the contract to deliver the solution immediately, stating that NZO Cloud’s record of successful deployments to the 45th Space Wing played a big role in the decision. Mr. Lesser explains the significance of the award: “We recognize that this award is an opportunity to show the newest branch of the DOD that NZO Cloud is a partner they can rely on to solve really complex problems with very, very limited risk.”

    The platform is expected to be delivered within the next few weeks, depending on base regulations tied to COVID-19 protocols. Once deployed, it will help the 45th Space Wing better understand launch conditions and act in support of the mission and public safety.

    About NZO Cloud

    For technology-powered visionaries with a passion for challenging the status quo, NZO Cloud is the answer for hand-crafted HPC and Big Data computing solutions that deliver relentless performance with the absolute lowest total cost of ownership. We are true innovators offering high-performance computing solutions to solve the world’s most demanding problems. For 25+ years, organizations of all sizes and from a variety of sectors have relied on NZO Cloud’s computing instances. We are proud to support many departments within the United States government, Fortune 500 companies, and small and medium-sized businesses.

    All products are designed and built at the company’s headquarters in Lake Forest, California.
