
3 Forces Reinventing the Cloud


Even though the public cloud sparked a great deal of innovation in scalable infrastructure, it did not transform or reinvent how infrastructure and applications are built. We still deal with the same old constructs: virtual machines, complex networking topologies, and the plumbing that connects the different layers together.

We are now at a very interesting point in the history of computing, where infrastructure and applications are being redefined. This is sparking a new generation of cloud infrastructure.

Emergence of machine learning

Machine learning in its different forms, such as probabilistic systems, neural networks, and linear regression, has long been used in a variety of applications. However, we are now rediscovering what AI can do for us. The rule of thumb is simple: wherever there is more data and a need to derive more value from it, AI becomes an interesting dimension to add. The field recently became hot again thanks to significantly cheaper and faster compute and an abundance of data. More insights and services can now be derived and learned with these models. The question is how and where to apply them, and it gets especially interesting when we apply them to cloud infrastructure and the applications running on top of it.

IT infrastructure, and the cloud in particular, has no shortage of data. What it lacks are nimble and meaningful ways to control infrastructure and applications so they work in harmony. This is where we can get significantly higher value. Infrastructure is currently controlled with hard-coded rules based on predefined conditions. For example: if my CPU utilization stays above 80% for 10 minutes, spin up a new virtual machine. A question: how frequently do we need to revise these rules? Usually as fast as the application evolves. If we are now evolving applications faster than before and expect that evolution to accelerate, do we have enough engineering hours to review and do the tedious analysis needed to pick the right rules, parameters, components, and so on? Do we really want to spend our time there?
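The hard-coded rule above can be sketched in a few lines. This is a minimal illustration, not a real autoscaler; the 80% threshold, the 10-minute window, and the scale-out action are the hypothetical parameters from the example:

```python
CPU_THRESHOLD = 0.80   # hypothetical: scale out above 80% CPU utilization
WINDOW_MINUTES = 10    # hypothetical: only if sustained for 10 minutes

def should_scale_out(cpu_samples_per_minute):
    """Return True if the whole recent window exceeds the threshold.

    cpu_samples_per_minute: per-minute CPU utilization readings (0.0-1.0),
    most recent sample last.
    """
    window = cpu_samples_per_minute[-WINDOW_MINUTES:]
    return (len(window) == WINDOW_MINUTES
            and min(window) > CPU_THRESHOLD)

# Ten minutes of sustained high load triggers the rule.
samples = [0.85, 0.90, 0.88, 0.91, 0.87, 0.86, 0.90, 0.92, 0.89, 0.90]
if should_scale_out(samples):
    print("rule fired: spin up a new virtual machine")
```

Every one of those magic numbers has to be revisited by hand as the application evolves, which is exactly the tedious analysis an AI-powered controller could absorb.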

Now, if we have AI systems that can tell us what to do, how do we give them enough control to keep our applications and services meeting their key performance indicators (KPIs)? If such control becomes AI-powered, economically sound, and smart enough to learn quickly, a new era of smart infrastructure will emerge.

It is like finding a cheap and easy way to make all current cars driverless!

Containers as the building blocks of applications

Containers emerged two years ago as a great way to package software so it behaves consistently from one environment to another. They have since evolved rapidly and are now considered the building blocks of modern applications, providing a simple, unified, and consistent view of applications running on different infrastructures. Containers offer the opportunity to redefine computing: rather than thinking of compute as objects you rent, such as virtual machines, we can think of it as a utility that applications and services draw on at any capacity and with very high precision. An application running inside a container does not care what is outside that container, as long as it has the resources it needs, such as compute, memory, and the I/O to connect with its other components.

Think of it as making compute as accessible as electricity!

Ubiquity of public cloud infrastructure

The big players in the public cloud are creating a new abundance of public infrastructure that will keep driving prices down and enriching the services offered on top. However, even with those richer services, basic compute (i.e., IaaS) will continue to be the biggest chunk of the cloud, since it is the common denominator for all applications and engineers. It is also apparent that, despite the abundance of basic compute capacity, we are still in the early days of consuming the cloud as a utility.

To offer compute as a utility, we need to normalize it and decompose it into its basic elements. Compute should be thought of like electricity: we consume electricity the same way, with the same effects, regardless of the provider, and we pay for it with very high precision, based on what our devices actually consumed, without approximation or rounding.
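To see what "very high precision" means in practice, compare usage rounded up to whole hours, as traditional VM billing does, with per-second metering. The rate here is made up purely for illustration:

```python
import math

RATE_PER_HOUR = 0.10  # hypothetical price for one unit of compute

def billed_hourly(seconds_used):
    """Traditional VM billing: round usage up to whole hours."""
    return math.ceil(seconds_used / 3600) * RATE_PER_HOUR

def billed_per_second(seconds_used):
    """Utility-style billing: pay for exactly what was consumed."""
    return seconds_used / 3600 * RATE_PER_HOUR

used = 90 * 60  # a job that ran for 90 minutes
print(round(billed_hourly(used), 2))      # 0.2  -- charged for two full hours
print(round(billed_per_second(used), 2))  # 0.15 -- charged for exactly 1.5 hours
```

The gap between the two numbers is the approximation and rounding the article argues a true compute utility should eliminate.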

Imagine compute being fed directly into applications, just like electricity, without affecting how applications are developed or where they run. The opportunities are easy to see if compute can be offered in such a fluid form.

Hint: the solution includes containers!

Stay tuned for what Magalix is up to! Register now to get updates and early access to a new way of running your applications on the cloud!
