published on 30 August 2016

Technology has moved into almost every part of our lives. The internet of things surrounds us with knick-knacks that rely on silicon — even our apparel contains microcontrollers. We are customers of sellers clamouring for market supremacy in a sea of feature-richness. We are faced with billboards lit up to display dizzying commercial colour in endless motion. This spectacle is the essence of western industrial life.

How did a hairless ape make all of this happen? Is it Moore’s law? Is it consumer demand? Is it large industrial corporations? Is it simply clever marketing? It’s all of that, plus something so embedded that we can’t see it unless we take a step back: abstraction.

It’s what we all do every day without thinking. It’s how we make sense of the world, how we generalize and see the essence of things. This gives us the ability to manage complexity and novelty, and it has been with us since we were foraging for food on the Serengeti. The thing we call abstraction lets us mentally collect like things and treat them as a group of entities. We classify, encapsulate and draw generalizations, and it’s incredibly powerful. When people say “simplify,” what they mean is “manage complexity effectively.” Our systems aren’t getting any simpler, but we can manage the complexity so our systems don’t cause chaos. This is our discipline.

Now let’s take a look at abstraction and consider how it lets us make sense of all the complexity.

Abstraction gives us a creative landscape derived from the most mechanistic edifice imaginable: a zero and a one. Our digital world is built upon ridiculously simple mechanistic rules, yet the emergent properties of those rules can be transformed into something organic. When they are well designed, rules are a catalyst for creative capabilities, and they should serve to build more layers of robust, well-managed thought levers.

Every enabling function is based on well-conceived abstractions. Amazing things and unexpected capabilities emerge when abstracted patterns are overlaid on suitable domains. The transformation is often elegant to the point where it resembles art.

But it all has to be built on a solid foundation. The elemental rules must be remembered, no matter how high we climb. When you’re building a complex system, you’ll need to ask yourself these questions:

  • What is good design and how can I tell if I have it?
  • How do I manage complexity without adding chaos?
  • How do I write solid predictable code?
  • How might I deliver emergent properties from my engineering disciplines?

Abstraction and the orchestra

[Image: Puppet and the orchestra]

Let’s take a look at an example of the principles of abstraction coming together to produce something grand. We all know about the orchestra, but have we looked at what is happening here with our abstraction hats on?

In the example above, the orchestra is an instance of hierarchy and abstraction. How does that relate to a business or a computer?

The orchestra is analogous to so many things we see every day, including IT systems, games, factories, cars or cities. They all share a common thread — collections and layers of rules, specifications, archetypes and abstractions.

In an orchestra, the conductor is leading the system. He’s not playing an instrument, but he is directing the events and acting as quality control. He’s the highest level in a hierarchy of abstractions. If we look at the musical score, we see embedded instructions. We see notes written, but they are not the sounds. They encapsulate the rules of the music — the sequence, pitch, volume and timing. But nothing happens until we get to the executive layer: the individual musicians and their instruments.

The score contains the high-level sequential instructions; executing them in specific ways at the lower level produces the intended result. The musicians do that job, performing a role analogous to the Puppet agent on a target server.

Orchestrating with Puppet

The musician (Puppet agent) executes parts (automation) within the score (high-level commands) on a particular instrument (operating system/endpoint). The musician translates the encoded information into the sounds necessary for the audience to appreciate the outcome (business function), all managed by the top layer (orchestration) and intended by the composer (IT service owner) to please the audience (business owner).
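To make the analogy concrete, here is a minimal sketch of such a score: declarative Puppet code that an agent performs on its particular instrument. The package and service names are illustrative assumptions, not prescriptions.

    # A minimal "score": it declares what to achieve, not how.
    # The agent (musician) translates it for its platform (instrument).
    # Package and service names are illustrative.
    package { 'ntp':
      ensure => installed,
    }

    service { 'ntp':
      ensure  => running,
      enable  => true,
      require => Package['ntp'],
    }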

These abstractions enable useful hierarchies. They use interfaces between layers, and they are built on trusted behaviours.

The system works because each layer is aware of the rules agreed between the adjacent layers. There is a common language between those layers: music. A transformation happens within each layer when something more specific is added. If the conductor had to play all the instruments himself, the outcome would be impossible. Here we see specialization and abstraction work to form a useful hierarchy, from general case to specific. There are technical elements, encoding elements and transformational elements. Without these structures, we would not have an orchestra. Abstract layers link together via a common language, and internally take care of translation.

While we are all aware of this example on the surface, it is worth examining it in the light of our new awareness. This idea is as natural to us as breathing, but the sophistication of the abstractions would not have happened without recognizing the incremental disciplines involved. Humans built this musical result after years of evolved thinking. It is an example where far more has been realized because of imposed structural rules. It has yielded far greater returns than if we simply sat the entire group of musicians in a room and shouted, “Play!”

Instead, we sat the musicians down and had them agree to a set of operating principles over an agreed common target. Then we filled the agreed structure with the rules of musical notation (software) that we enabled and tested a priori, and we executed the sequence within them. The result is robust, recognizable and repeatable, and can deliver far more impact than random sound, or even one man trying to play each instrument individually.

Here’s the lesson. Good abstractions:

  • Are robust.
  • Depend on well-formed rules.
  • Manage complexity.
  • Need evolution to refine them.
  • Facilitate useful hierarchies.
  • Take care of translation between layers.
  • Use common interfaces between layers.
  • May yield or empower results that exceed expectation.

Bits and bytes

[Image: Puppet and the orchestra]

In this diagram, I want to show how a simple reorganization of the same thing (bits) can yield powerful results because of well-formed structural artefacts (abstractions).

So how do we get organic capabilities from something built on such simple rules? What are good or bad abstractions? Let’s begin at the very beginning: the humble one and zero.

In the mid-70s, I bought a hobby microprocessor kit. It was a National Semiconductor Mini SCMP. The kit let me enter opcodes and data at addresses. The opcodes, addresses and data were entered with a row of switches on the front panel and an enter button. Lights displayed the data you had entered. Simple and entertaining, but slow to program. After a while I got bored, but it did give me an insight into the inner workings of the magic. It showed me the practical need for better management of digital bits. Granted, we had automatic address generation and an instruction set. We had a CPU that could respond to bit patterns as opcodes and make crude decisions. But how useful was it for the real world? Not very.

From bits to code

In other parts of the world, more insightful people had grappled with this problem. They had encoded information in punched cards, an idea borrowed from Jacquard’s loom. Then they moved to magnetic tape. They took bits that by themselves are pretty limited, and aligned them in bytes. They took those bytes and represented them as binary numbers with arithmetic properties and rules. They used Boolean logic to manipulate these bits in a CPU that had a set of fixed responses to each combination of numbers (an instruction set). They added an abstraction layer, and made the binary numbers into hexadecimal numbers that mapped over the bits perfectly. They encoded these numbers into magnetic signals and made them available for reuse by the rules engine — a CPU.

Then they did some of the most powerful things in the shortest time. They made a text-editable assembler that notated the hexadecimal as mnemonics: words that could be typed, printed and read by people. The first elements of symbolic computer language were born. What people soon realized was that groups of instructions or variables often reappeared when solving similar problems. Concepts like loops, decisions, variable storage templates and developed algorithms appeared as patterns. The next leap was to encode these as subroutines, linked as reusable sections of code, whose behavior could be predictable and trusted.

Subroutines could be named with English labels that gave a mnemonic hint as to their function. On top of that, engineers added a link function that enabled relocatable address calculation to be done without human intervention. The improved readability and comprehension immeasurably streamlined the development of useful code. Labels formed abstractions as named placeholders for the function or variable at hand, and they could be put anywhere you wanted without concern for the call or return address. The result was code on machines that could truly help solve powerful scientific problems, and could be programmed faster for more diverse uses. An organization of symbols gave logical abstractions that translated into power. Yet this was only the start.
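The same pattern survives in Puppet’s own language. A defined type is a named, reusable section of code, called by label rather than by address. Here is a rough sketch, with type and parameter names that are purely illustrative:

    # A reusable, named "subroutine" in Puppet's language.
    # The type and parameter names here are illustrative.
    define myapp::vhost (String $docroot) {
      file { $docroot:
        ensure => directory,
      }
    }

    # Invoked by label, with no concern for call or return addresses.
    myapp::vhost { 'example.com':
      docroot => '/var/www/example.com',
    }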

Engineers also saw the correlation between human linguistic representations and program structure. They developed special interpreters that generated low-level machine code from English-like statements and went far beyond assembler notation. This new programming environment aligned much more closely with how people think. Programming changed from arcane technical wizardry to something that looked like human language. The first compilers were born. Under the hood were rules, templates and specialized logic that allowed for the planning and development of far more complex algorithms and program structures than ever before. This work represented another layer of amazingly useful abstraction, and the leap it provided amounted to much more than a bunch of labelled subroutines. The same results would have been incredibly error-prone and slow to code at the level of the assembler, and completely impossible if all you had were punched cards or switches and lights on a front panel.

The emerging value of a multiple-pass compilation process was groundbreaking. English-like statements as input gave rise to machine-executable binary code as output. By then, computers had become formidable computational companions, yet they still used simple 1-0 bit logic in their cores. Abstraction had empowered elementary digital principles beyond imagining. Solid, trusted and predictable, typewritten English-like words could be reliably translated into bits and bytes and then executed. The natural order of human cognition (the narrative, or story) had found an evolving analogue in digital space, and it was a powerful combination.

Puppet


So it is with Puppet-based systems, automation and the development of cloud service patterns. The cloud is an abstraction of IT services at a higher layer, and is involved with the provision and management of IT systems that support entire business functions. Puppet agents can act as a conduit for the full lifecycle management of servers, applications, individual components, and even systems of those components that make up an entire business.

Because we’re considering the top business-service level, the code to manage applications and infrastructure needs to be carefully thought out. With infrastructure as code, what can be done powerfully can also be done powerfully wrong.

Puppet automation is a lever to enable a new way to manage systems from the inside. Puppet agents allow carte-blanche access to system infrastructure and application deployment alike. Puppet agents allow infrastructure as code, and that is the new dimension of IT service provision. Making cloud-like services on internal brownfield systems using well-developed, stable automation and orchestration is the new challenge.
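One hedged sketch of what well-developed layering can look like is the community roles-and-profiles pattern, which stacks abstractions much as the orchestra does. The class names below are illustrative, and the ntp class with its servers parameter assumes a module such as puppetlabs/ntp is installed:

    # Layered abstraction in Puppet code (roles and profiles).
    class profile::ntp {
      # The profile layer hides resource-level detail behind one
      # testable interface; assumes an ntp module is available.
      class { 'ntp':
        servers => ['0.pool.ntp.org', '1.pool.ntp.org'],
      }
    }

    class role::app_server {
      # The role layer speaks the language of the business service.
      include profile::ntp
    }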

To do it properly, I suggest we need to align application development and infrastructure management. Puppet can make it happen. It’s time for new conversations to happen, and for a common outcome to be imagined.

Cloud hides the distinctions of application and infrastructure from the customer and the business owner. However, for the abstraction to work properly, the under-the-hood translations must be managed well. That is our new domain of abstraction and discipline. Like the bits-to-English abstraction pioneered by the first compiler writers, we are moving to the edge of a new abstraction layer, pulled further away from bits, files, operating systems and even servers, and centering on the end result for the consumer and business owner.

Virtualization and IT

Virtualization makes us think about IT components in new ways. ITIL may not be the only framework by which infrastructure is managed. Some ITIL capabilities may be redundant when code manages change, for example. Some ITIL-demanded checks that are done manually today will be done in code, or dispensed with altogether. If a unit of change is requested of Puppet agents, they will translate the function for the particular endpoint, and no manual intervention will be required.

As an example, a Linux VM may let Puppet manage the configuration on the file system, while a Windows server may require Puppet to call a function in SCCM. Either way, the question asked of Puppet will be the same. A system manager will not know the detail, except that Puppet has made the change. It’s similar to the C programmer who does not need to know the actual address of a function or variable. This is very useful when we consider systems as business services.
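A small sketch of that sameness, assuming the standard facts hash; the file path and content here are illustrative assumptions only:

    # The same question asked of Puppet, answered per platform by
    # the agent. Path and content are illustrative assumptions.
    $motd_path = $facts['os']['family'] ? {
      'windows' => 'C:/Windows/Temp/motd.txt',
      default   => '/etc/motd',
    }

    file { $motd_path:
      ensure  => file,
      content => "Managed by Puppet\n",
    }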

Control at our fingertips

We have transcended the barrier of platform silos, and now have control of business systems at our fingertips. We no longer have to coordinate changes across people and silo owners. Administrators will no longer have direct access to servers. Change will be managed by continuous integration over automation, and will occur more and more through code. Servers will be just another managed endpoint. Infrastructure can now be considered an amorphous layer that means nothing and does nothing until it is configured — whether server, storage or network.

Production transition will have a different emphasis tomorrow than it does today. Updates and regular changes expected of applications will also be frequent for infrastructure, with changes to both possibly occurring together. Change, the bane of the old world order, must be prepared for, embraced and accepted as a benefit to service availability, rather than regarded as a threat. Applications are moving to this paradigm, and so must infrastructure. DevOps disciplines will need to be welcomed into both houses to accommodate agile delivery and full lifecycle management of services.

Even though there will be differentiated streams under the hood, both will have the same service obligation to the business user or system manager. Gone are the days when infrastructure lights were simply on, and the job of a platform service was deemed to be done. Gone are the days when an application could be written without some awareness of infrastructure. Now they’re one and the same, and the common theme is Puppet automation, cloud service principles and an agile change mentality. But it happens only if the two silos are merged correctly. Agile change across the silos can’t occur in isolation — it must occur in an integrated service model, and that model today is the cloud. The challenge is to bring it to existing IT shops on brownfield implementations.

The old adage was, “prevent outages by keeping change minimized.” Now the adage is, “provide service availability by preparing for and accepting well-managed change.”

"Well managed" means agile in this context, and happens well before production. It’s about publishing what is current, and making the transition to expose it to customers. It’s about moving the old frameworks into the new world while retaining the desired service values. ITIL is not dead; it just lives in a different house with new guests. Application development now has a closer bedfellow (infrastructure) because code is code.

But “code is code” makes true sense only within proper confines: written to be robust, written to add value, written to provide layers of transformation, written to hide complexity and written to have uniform interfaces.

We are still dealing with the same disciplines as in the examples cited above. The same ideas present in our highest layers of service definition existed at the lowest bit levels of computing. Qualities of robustness, trustworthiness and added value must link up across the layers, especially where we have a very powerful capability like Puppet.

Stephen Curtis is an IT infrastructure architect at ANZ Bank in Australia.
