Blog

Repost: Why Enterprises Want Containers Now — And Why You Should Too

Originally Posted by TheNewStack

 

About Solinea

Solinea services help enterprises build step-by-step modernization plans to evolve from legacy infrastructure and processes to modern cloud and open source infrastructure driven by DevOps and Agile processes.

Better processes and tools equal better customer (and employee) satisfaction, lower IT costs, and easier recruiting, with fewer legacy headaches.


Historically, large organizations have been slow to adopt new technologies. For example, when virtualization became available, it was a quantum leap from the mindset of one physical server, one operating system, one workload. But it took enterprises a decade or more to embrace virtualization.

And virtualization was a relatively easy sell. After all, it offered energy savings, a reduced data center footprint, faster provisioning, increased uptime and more. With virtualization, cost compression was the name of the game, and no one played the game better than VMware.

Later, when the public cloud (AWS, GCE, etc.) and open source private cloud options (OpenStack, CloudStack, etc.) came along, enterprises had to fundamentally change the way they thought about application development and deployment, as well as infrastructure operations. This, too, was a monumental shift, and this, too, took time.

And yet, today, in stark contrast to those two previous examples of plodding progress, enterprises are adopting container technologies at light speed (relatively speaking). Only two years ago enterprises were asking, “What’s this DevOps thing all about?” and now many are asking, “Do we even need VMs?”

That’s a big shift in a short timeframe. Why have containers caught on so quickly? All the conditions are just right for rapid adoption of container technologies:

1. Virtualization Paves the Way

First, the adoption of virtualization itself has facilitated the adoption of container technologies. With virtualization, enterprises started to become accustomed to faster application development and testing time and have seen application deployment times reduced from days or weeks to hours and minutes. Virtualization made possible, for many enterprises, their dramatic IT transformation from waterfall development to agile and DevOps.

Like infants who take their first steps one day and are cruising around the living room the next, enterprises have experienced the benefits of speed and agility, thanks to virtualization; now, there is no stopping their quest for more. Containers offer the next big leap beyond virtualization. Containers let you optimize your server resources further than VMs can: they are more efficient at maximizing memory and CPU use than VMs running a similar workload, and they start in milliseconds as opposed to the minutes it takes a VM to boot up.

Plus, for app developers, containers deliver “easy” on a platter: containers are portable, consistent environments for development, testing and deployment, enabling developers to build once and deploy anywhere with a single click.
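To make “build once and deploy anywhere” concrete, here is a minimal Dockerfile sketch; the base image, file names and the Python app are assumptions for illustration, not something prescribed by any particular team:

```dockerfile
# Hypothetical Dockerfile: the same image artifact runs unchanged
# on a laptop, a shared test server, or a production cluster.
FROM python:3-slim
WORKDIR /app

# Copy the application source and install its dependencies into the image,
# so the environment travels with the code.
COPY . .
RUN pip install --no-cache-dir -r requirements.txt

# The command baked into the image is the same in every environment.
CMD ["python", "app.py"]
```

Once built with something like `docker build -t myapp .`, the identical image can be pulled and run with `docker run myapp` anywhere a container engine is available, which is what gives developers a consistent environment from development through deployment.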

2. Open Source Goes Mainstream

Second, open source technologies are becoming more prevalent in large enterprises. Open source has been making its way into the enterprise at a pretty steady pace. If you go back over 15 years, you would see open source in places like mail relays and web servers. Slowly but surely, the use of open source has migrated into more and more areas of the enterprise.

Some organizations can consume open source from the code base and operate it on their own. There are some clear advantages here, especially if your organization is willing and able to contribute code back to a project. Most organizations, however, consume open source via a standard support and maintenance model, otherwise known as a distribution (or distro, for short). This is very common, and we can point to historical examples ranging from Linux distributions to OpenStack distributions.

Today you will be hard pressed to find a large enterprise that is not using open source for significant parts of their infrastructure. In fact, open source technologies are now recognized and valued by enterprises not only for scalability and avoidance of vendor lock-in but also for quality, adaptability, innovation and feature development. Strength of the development community is also a critical factor. Even the security concerns that some enterprises cited once upon a time have given way to the high value placed on transparency of the code base as well as the software development process.

Because enterprises have now embraced open source (a 2015 Future of Open Source survey reported that 78 percent of respondents run their businesses on open source software), they are leaning in on Docker and other container technologies at a faster pace than they have leaned in on other technologies in the past.

3. Enterprises Get Cloudy

Third, enterprises have taken the leap to build “cloudy” applications, and this has reduced the intellectual barrier to adopting containers. Not long ago, “Shadow IT” was a big enterprise IT issue; developers secretly stepped out beyond the corporate IT boundaries to find suitable testing grounds for their applications. Enterprises used to be surprised and alarmed when they discovered that their developers were using AWS; now, we’re all surprised if they are not.

As enterprises have embraced the cloud, Shadow IT is going mainstream. Now developers and IT operators are linking arms and asking, “Now that we have cloud, how can we move faster?” You might even say that the cloud has spawned the DevOps mindset: How can I converge development and operational processes to capitalize on the cloud infrastructure that is now available?

These three developments — virtualization, open source and cloud — have transformed the enterprise IT mentality and opened the floodgates to container land.

Should Containers Be in Your Future?

If you are the kind of enterprise that builds your own apps, and these apps are designed to make your business money, containers are one bandwagon you’d be well advised to hop on. Containers are real; the technology has a low barrier to entry, and the economics of running containers at scale will deliver real value.

That said, the answer to “Do we even need VMs?” is yes. Virtual machines will have their place in the enterprise and are not going away in the near term. Not all applications are ready for containers, and yet those applications still make money and add value to your business. So for the time being, we still need VMs to support some of the packaged applications that run our business.

A first project? Why not start by tackling one of the biggest IT constraints enterprises face: developer wait cycles. In most enterprises, developers have to wait inordinately long for everything from getting a laptop to getting a server to test their code. Containers are a great tool for overcoming these wasteful and maddening bottlenecks.

Consider the benefits containers can offer here. With a container framework, developers can test code on their desktops or on shared server hardware. Their deployment code can be used in many locations for testing. The control groups and namespace isolation that are core to container land make this a very reasonable approach for development teams. Not only can developers gain real time savings here, but operators can also achieve a tremendous reduction in idle infrastructure. Plus, if you are developing this way, the leap to test/prod environments is much shorter as well. Provenance and immutability are key tenets of any CI/CD process, and both are core to the container framework.
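As a sketch of what a shared development setup like this might look like, here is a hypothetical docker-compose file; the service names, ports and images are illustrative assumptions. Because each instance gets its own namespaces and cgroups, many developers can run isolated copies on the same shared server:

```yaml
# Hypothetical docker-compose.yml: each developer launches an isolated
# app + database pair on shared hardware without colliding with others.
version: "2"
services:
  app:
    build: .            # builds the developer's working copy into an image
    ports:
      - "8080"          # ephemeral host port, so parallel instances don't clash
    depends_on:
      - db
  db:
    image: postgres:9.5 # per-instance backing database for test data
```

A developer would bring this up with `docker-compose up`, run tests against it, and tear it down with `docker-compose down`, leaving no idle VM behind between test runs.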

Start with an application that can utilize this new container framework in a sane manner. Find an internally facing application where people can learn by making mistakes and get away with it, without putting external relationships at risk. There are many new constructs from orchestration to operations that will be different on a container platform, so give the whole organization, from the developers to the operators, the time to learn what these new frameworks mean. Find a way for the team to cut their teeth on this stuff before you decide that you should roll out containers to customer facing applications.

Lessons Learned

The rapid adoption of a “new” technology typically comes with a few stubbed toes. Here are some boulders to look out for as you journey into container land.

Tooling: Tooling in the container space is, to put it mildly, immature by enterprise standards. The ecosystem has not yet found a unified course and is running in many directions at once. In December alone, according to GitHub stats, the Kubernetes project saw 592 code commits from 117 contributors; over the same period, Docker had 408 code commits from 97 contributors.

Distributions: As we have seen with open source projects in the past, distributions will form in this space as well. Today when you are assembling the tools you need to support your container environment, you will be getting components from different sources. A sampling of these are listed here: 

  • API Driven Infrastructure Services — AWS, GCE, OpenStack
  • Container Host — CoreOS, RancherOS, CentOS, Ubuntu, Red Hat Atomic
  • Container Networking — Project Calico, Flannel, Weave
  • Container Engine — Docker, Rocket (rkt)
  • Container Registry — Docker Trusted Registry, Quay, Artifactory, Nexus (beta at the time of this writing)

Change Management: For most organizations, aligning to a DevOps model is a real change and therefore a real challenge. Getting the operations teams to believe in the magic of containers is a daunting task. These teams have been on the front lines of the enterprise code base for many years and have been key to keeping environments up and running as other technology promises have fallen flat. Container frameworks tout new ways to simplify operational life cycles, but, based on their past experiences, most operators will need to see it to believe it.

 

Are You Ready for Containers?
You’d better be.

If you tell me your enterprise does not build applications that will run in containers, you are looking at the problem from the wrong angle. Perhaps you don’t do it now, but you will be doing it in the future. What I am talking about are applications that are “cloud ready” — that is, stateless, shared-nothing applications that rely on back-end data sets to persist data. If your apps don’t behave that way, they should — and they will over time. And be warned: if you are not building container-based applications and microservices in the next few years, you will not be able to hire the developers that you need to take your business forward.

Author: Seth Fox

 

Interested in learning more? Sign up for the September 29 Webinar on this topic today!

 

Solinea specializes in 3 areas: 

  • Containers and Microservices –  As enterprises look for ways to drive even more efficiency, we help organizations with Docker and Kubernetes implementations – containerizing applications and orchestrating the containers in production.
  • DevOps and CI/CD Automation –  Once we build the infrastructure, the challenge is to gain agility from the environment, which is the primary reason people adopt cloud. We work at the process level and tool chain level, meaning that we have engineers that specialize in technologies like Jenkins, Git, Artifactory, Cliqr and we build these toolchains and underlying processes so organizations can build and move apps more effectively to the cloud.
  • Cloud Architecture and Infrastructure –  We are design and implementation experts, working with a variety of open source and proprietary technologies, and we have built numerous private, public, and hybrid cloud platforms for globally recognized enterprises for over three years.