Embrace custom development: why containers are your new friend
22 October, 2021 - 7 min read
We are now in the midst of the biggest revolution enterprise software has seen. As extinction events in the business world go, this one is likely to give way to many new upstarts.
I'm of course referring to the impending A.I. meteor that's flying just within eyesight of our solar system as we speak. Amid all the choices facing enterprise companies these days, it's hard to know where to start and how to prepare.
Especially since the idea of the moment is to move everything to cloud platforms. But in order to get the most out of an algorithm, it typically takes lots of insider knowledge and many, many late nights of tinkering.
So, how can companies get past their legacy architecture, systems and databases, while maintaining all of the great advances the cloud providers bring, and build themselves up for a future that will require A.I.?
Enter: containerization.
What is containerization?
I'm going to attempt to explain this concept in the most foundational way.
Imagine a computer system as a closed environment, like a sterile laboratory. At first the lab is empty; then you bring in an operating system, which is made up of many different little bots that do all sorts of different things.
Think of the operating system, or OS, as the people working in the lab, the tables, and so on: all of the basic things you get from the start, or in computer lingo, out of the box. From there you can bring in other tools, which outside this analogy would be libraries or programs, that suit the specific needs you have.
What containerization allows for is this: when you develop a specific feature, anything from a single script to a whole app, you can build a laboratory, or container, that comprises only what is needed to run that code.
In theory, this means only the code it needs lives in that environment, which means fewer chances of failure from interdependence or conflicts.
Going back to the lab analogy, this would be like not having burners in the lab if you don't need them, eliminating that risk of fire.
In practice, this means that when you update a whole app, fiddle around with changing this or that, or, god forbid, something breaks, you can easily isolate the containers affected.
linux has entered the chat
Companies like Docker have built an ecosystem for doing this, creating a platform to select or build containers based on an OS stripped down to just what's fundamental to support that need.
So for machine learning, computer vision or the like, one can spin up a Docker container that has Ubuntu 18.04 as the OS, install OpenCV, Nvidia drivers and so on, and with a single Dockerfile have a self-contained system to power those features.
No bloat, no conflicting libraries, just what you need.
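To make that concrete, here's a minimal sketch of what such a Dockerfile might look like. The package list and the detect.py script are illustrative stand-ins rather than a production recipe, and GPU drivers would usually be handled on the host rather than inside the image.

```dockerfile
# Start from a plain Ubuntu 18.04 base image -- nothing else comes along for the ride.
FROM ubuntu:18.04

# Pull in only what this feature needs (an illustrative package list, not a production recipe).
RUN apt-get update && \
    apt-get install -y --no-install-recommends python3 python3-opencv && \
    rm -rf /var/lib/apt/lists/*

# The script that does the actual work -- detect.py is a hypothetical placeholder.
WORKDIR /app
COPY detect.py .

# Run the feature when the container starts.
CMD ["python3", "detect.py"]
```

Everything the feature depends on is written down in that one file, which is the whole appeal: anyone on the team can rebuild the exact same laboratory from it.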
Or, if you'd like, create a container that holds an API, an integration with another system, a translation script, access to a database -- you name it.
It's elegant in its simplicity. Faster run times, more stability and more room to experiment, which means more innovation and improvement.
powered by a penguin
Most of this container revolution has been powered by Linux. Why? Well, it's quite easy to see.
Remember when buying a new Windows computer, sometimes you'd have to spend hours, or pay someone, to get rid of all the bloatware that it came pre-installed with? All those maddening little pop-ups for programs you don't want, or won't ever use, that flood your screen when turning the thing on.
For every one of those things you see through the graphical interface, there are countless others running in the background. And with Windows, or Mac for that matter, unlike Linux, there's absolutely nothing you can do to stop or remove some of these processes.
Linux is open source, meaning not only can you access the code at every level of the system, you can alter it. That opens up a whole new world for developers to strip the OS down to just the bare bones of what's needed. A great example of this is my new favorite hacking find, Puppy Linux.
It's such a thin version of Linux that it can run on practically any computer from the past 20+ years.
what does this have to do with custom development, don't platforms provide more?
Well yes, and no. Many SaaS companies now provide exactly what is needed for a business to operate up and down the line.
But, and this is a big but, because they provide so many different tools, they risk being designed for everyone and therefore useful to no one. Not to mention the reality of where most companies are right now: in a transitional state from legacy systems into a new mixed-cloud setup -- one that hopefully, if they're looking ahead, includes some machine learning.
I talked a bit about this transitional period in a previous post, The Waves of Building Software.
I've seen this time and time again working with clients. There's inevitably some feature within the platform they're using that's limiting. Whether it's SAP, Oracle, Salesforce, Azure, AWS-- you name it; something is going to be a fly in the ointment.
It can be cost, feature limits, a legacy system that flat out won't play nice with the platform, or compliance not wanting something taken off-site. There are also sometimes very legitimate reasons why you may want to keep the development and processing in house.
microservices for a macro world
Now here's where containers take off. Instead of upgrading to a new feature set from your SaaS provider, you can consider building a small app in a container (called a microservice) that serves exactly your need.
No more having to compromise on this or that; you can have it do exactly what you need -- because you're the one building it from scratch. Many people worry about then having to maintain that code, but containers extend its lifecycle by reducing it to just the dependencies inside the container.
Limit those to the smallest possible set of variables, and you'll be surprised at the lifetime you get out of it.
I have an API for fetching news stories that I built for myself I-don't-even-know-how-long-ago. And while all of its components are horribly outdated, it's self-contained and chugging mightily along. It's stable within its universe, or micro-verse, and until that changes it'll stay the same.
And the same would be true of a microservice you build.
Furthermore, pick a language your company is proficient in, libraries your team knows, and versions of those libraries and of Linux that are ubiquitous in your stack -- or better yet, that are where you want to take your architecture. You'll be building for the future, one micro step at a time.
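As a rough sketch of what that looks like in practice -- assuming a Python shop and a hypothetical news_api.py along the lines of my little news fetcher, so the language, file names and port here are all placeholders -- the microservice's entire world can be declared in a few pinned lines:

```dockerfile
# A slim, pinned base image: the only moving parts are the ones you chose.
FROM python:3.9-slim

WORKDIR /app

# Dependencies are pinned inside the container, so nothing outside can drift them.
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# The microservice itself -- one small, hypothetical script serving one need.
COPY news_api.py .

# One designated port, one entry point.
EXPOSE 8080
CMD ["python", "news_api.py"]
```

Freeze the base image and the requirements file, and the micro-verse stays exactly as you left it until you decide otherwise.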
but wait, there's more
Containers also mean you can limit the data that comes in and out, and what's visible to the outside world. It's incredibly difficult to get inside one if you don't have access: the only open ports for exchanging information, and what information passes through them, are designated at the start.
Which means that not only will your container have a long shelf life thanks to its stability, it will also be considerably more secure than if it were built and deployed in a more traditional way.
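For example, designating that open port is a single flag when you start the container; anything you don't publish simply isn't reachable from outside it. The image name and port below are illustrative, not a prescription:

```sh
# Publish only port 8080; every other port inside the container stays unreachable from outside.
docker run -d --name news-api -p 8080:8080 news-api:latest

# Or bind it to localhost only, if the service should never be visible beyond the machine itself.
docker run -d --name news-api-local -p 127.0.0.1:8080:8080 news-api:latest
```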
in summary, get cozy with containers
Now, if you really embrace containerization, you'll be able to spin up little apps that can interoperate with one another. And as I touched on briefly in the previous article, you can now tap into those pre-built features from other apps and increase your ability to build faster.
That is, so long as you keep track of your architecture and make sure all your business units stay in solid communication. But that, dear reader, is for another post.