Ending the DevOps vs. Software Engineer Cold War
If we were that metaphorical fly on the wall, following an all-too-common Slack conversation between a software engineer and a DevOps engineer, it might go something like this:
Software Engineer: This is gonna take forever. “I need a new environment for my app.”
Two hours later…
DevOps: Why do software engineers think I have telepathy!? “Okay, what instance types do you need?”
An hour later (after consulting with the team…)
Software Engineer: “I need a g3.8xlarge to test out our latest visualization feature.”
DevOps: “Cool, and what AZ do you need it in? Also, which security group should it be associated with?”
Software Engineer: Don’t they know all this operational stuff, like automatically! “Any AZ in us-west-1, and it’s sg-3164z279.”
48 hours later, after a few volleys back and forth over some additional parameters, permissions, and other lovely details, DevOps gets the green light to spin up the environment, steps out to buy some licorice as a reward for all their suffering, and forgets to notify the software engineer that the request has been approved until 5 minutes before taking off for the day.
Sound familiar? If so, there might be a highly unproductive cold war going on deep in the heart of your software engineering and DevOps departments.
The Heart of the War
What’s at the heart of this war? To understand that, let’s unpack two major issues that emerge from this not-so-smooth but all-too-familiar scenario. First, without a common language and clear communication channels, no two parties can work together even on simple tasks, let alone complex ones. Second, even with a common language, all the excess work, context switching, delays, and inevitable friction lead to cold-war-level frustration brewing within your organization.
Adding to these issues are the blurred lines of responsibility that the DevOps model has created for both software engineering and DevOps (aka operations) teams. But the reality is that:
- Software engineers want to code, implement features and run them on infrastructure (so the customers can use them), without a lot of hassle and without getting bogged down in the operational details.
- DevOps engineers want to focus on streamlining, keeping production stable, optimizing infrastructure, improving monitoring, and innovating in general, without getting sucked into the rabbit hole of end-user (i.e., software engineers’) service and access requests.
When both sides spend multiple cycles on operational bottlenecks—in between throwing invisible daggers of hate at one another—the organization loses software development productivity as well as potential innovation from the DevOps side because nobody is getting what they want.
This massive productivity loss can’t be measured by DORA metrics alone. It goes deep, right to the heart of your organization’s culture. But now, at last, there’s an end in sight to the decades-long cold war between software engineers and DevOps.
Evolution of a DevOps Peace Prize Winner
Nobody thinks the situation we’ve seen here—the radical disconnect between software engineers and DevOps—is okay. No business can work efficiently with this level of wasted time and effort. That’s why, a few years ago, insiders started proclaiming the dawn of self-service DevOps.
When you think of self-service DevOps, it probably calls to mind little robots provisioning all the infrastructure your devs could possibly need. If only that were true.
At the moment, self-service DevOps is still in its adolescent stage with cumbersome internal developer portals, service catalogs, workflow automation tools, and other shiny toys.
But the space is rapidly maturing, thanks to an evolutionary process that’s already well underway.
Yesterday: It Started With Chatbots
Simulated conversation goes all the way back to the dawn of computing. Modern chatbots are definitely smarter but have still failed to live up to their initial promise, which was that chatbots would come to understand any request, easily integrate with DevOps tools, and automate workflows.
In reality, chatbots are somewhat useful but face a number of fundamental problems. Essentially, they rely on simple, predetermined flows and rule-based, canned linear interactions. But we all know that in the real world, questions and requests can be varied and unique… essentially, not something a chatbot is equipped to handle.
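To see why predetermined, rule-based flows break down, here is a minimal sketch of a canned-command chatbot. The command strings and replies are purely illustrative; the point is that only an exact, memorized phrase works, while a natural paraphrase of the same request falls through to the fallback.

```python
# A toy rule-based chatbot: it only matches exact, predetermined phrasings.
# All commands and replies here are illustrative, not from any real tool.

CANNED_COMMANDS = {
    "create environment": "Starting the new-environment workflow...",
    "request instance": "Which instance type do you need?",
}

def rule_based_bot(message: str) -> str:
    # Exact lookup only -- anything off-script falls through to the fallback.
    reply = CANNED_COMMANDS.get(message.lower().strip())
    return reply or "Sorry, I didn't understand that. Please ask #devops."

# The "magic command" works...
print(rule_based_bot("create environment"))
# ...but a natural paraphrase of the same request does not.
print(rule_based_bot("I need a new environment for my app"))
```

This is exactly the guessing game described above: the engineer either discovers the magic phrase or ends up back in the DevOps queue.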
Plus, let’s face it—a chatbot that doesn’t do the job is worse than nothing at all: Software engineers try to guess the magic commands that will make the chatbot do their bidding, and if and when they fail (probably), they have to go running to the DevOps team anyway—except now it’s 24, 48, or 72 hours later, and they’re as frustrated as [fill in the blank].
To create an interface that will truly save time on the Engineering and Operations sides, you need more intelligence than a simple chatbot can provide.
Today and Tomorrow: It Continues With Conversational AI
Chatbots are limited because they don’t understand the languages that our (human) developers use. But what if you could use AI to power more sophisticated understanding? To achieve that level of understanding, you need two distinct strategies in place:
- Natural language processing (NLP) uses AI to systematically parse your words and, through the extensive training that most AI solutions require, tries to determine the meaning.
- Then, natural language understanding (NLU) goes one step further, learning to recognize variations in language that reflect the imprecise way that people communicate in the real world, including taking into consideration factors like sentiment, semantics, context, and intent.
Essentially, NLP focuses on building algorithms to parse and recognize natural language, while NLU focuses on the meaning of a sentence. Putting these together, you finally arrive at true conversational AI.
NLP and NLU are just two of a number of essential building blocks that go into conversational AI to provide genuine understanding and intelligence that will ultimately replace chatbots. Let’s look at some of those building blocks:
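The parsing-vs-understanding split can be caricatured in a few lines of code. Real conversational AI uses trained models for both stages; this hand-rolled sketch only illustrates the division of labor, with a "parse" stage that breaks the text into tokens and an "understand" stage that maps varied phrasings onto an intent and its entities.

```python
import re

# Toy illustration of NLP vs. NLU. Real systems use trained models;
# the regexes and keyword sets here are illustrative assumptions.

def parse(message: str) -> list[str]:
    # "NLP" stage: break the raw text into normalized tokens.
    return re.findall(r"[a-z0-9.\-]+", message.lower())

def understand(tokens: list[str]) -> dict:
    # "NLU" stage: map many phrasings onto one intent, and extract the
    # entities (instance type, region) the request actually refers to.
    intent = "provision" if {"need", "want", "create", "spin"} & set(tokens) else "unknown"
    instance = next((t for t in tokens if re.fullmatch(r"[a-z]\d[a-z]?\.\w+", t)), None)
    region = next((t for t in tokens if re.fullmatch(r"(us|eu|ap)-\w+-\d", t)), None)
    return {"intent": intent, "instance_type": instance, "region": region}

request = understand(parse("I need a g3.8xlarge in us-west-1"))
print(request)  # {'intent': 'provision', 'instance_type': 'g3.8xlarge', 'region': 'us-west-1'}
```

The same `understand` step would return the same structured request for "spin up a g3.8xlarge in us-west-1, please", which is precisely what a canned-phrase chatbot cannot do.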
- Natural language processing engine (using NLP/NLU) to evaluate user input and understand what is being requested
- Integration with an identity provider (IdP) like Okta and other sources, such as a knowledge base or cloud providers, to keep tight control of permissions and security
- Iterative machine learning to identify new data sets and test user behavior predictions to drive continuous improvement of responses
- Dialog management system to retain context of the conversation and allow the conversational AI to respond accordingly
- Context management, or “keeping state,” to track exactly where the conversation left off and the last step that was reached
- Interface to interact with the user, usually via text or speech, ideally through their favorite workflow tool
As an example of context management: in the fictionalized developer/DevOps conversation above, even after an interruption of 24 hours or longer, the AI needs to remember what stage was reached so it can carry on once it receives authorization.
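A minimal sketch of "keeping state", assuming a fixed sequence of steps like the ones in the opening conversation: each conversation records which questions have been answered, so after any interruption the assistant can pick up at the first unanswered step instead of starting over. The step names and class are illustrative, not from any real product.

```python
# Toy context management: persist which step a conversation reached.
# STEPS mirrors the back-and-forth from the opening scenario.

STEPS = ["instance_type", "availability_zone", "security_group", "approval"]

class Conversation:
    def __init__(self, user):
        self.user = user
        self.answers = {}  # step name -> the user's answer

    def record(self, step, value):
        self.answers[step] = value

    def next_step(self):
        # The first unanswered step is exactly where the conversation left off.
        return next((s for s in STEPS if s not in self.answers), None)

convo = Conversation("dev-alice")
convo.record("instance_type", "g3.8xlarge")
convo.record("availability_zone", "us-west-1a")
# ...24 hours pass; the state survives, so the assistant asks only what's missing.
print(convo.next_step())  # security_group
```

In a real assistant this state would live in a database keyed by user and conversation, but the principle is the same: resume, don't restart.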
Furthermore, the last point—user interface—is critically important. To achieve true conversational intelligence that actually streamlines DevOps, you need an interface that meshes with the way your teams are already working. That way, you won’t add additional stress from context switching, which in itself is a big drain on productivity and focus.
And the Peace Prize Winner Is… Your DevOps Virtual Assistant
Remember the disconnect I mentioned earlier between Dev and DevOps? Well, a virtual assistant can bridge the gap, giving both sides exactly what they want—and need. Developers want to code, getting the infrastructure they need without a hassle. DevOps engineers want security and efficiency to avoid over-permissioning and excess cloud costs; they also don’t want to waste time on repetitive, tedious tasks with lots of context switching.
With a virtual assistant in place, here’s how the interaction might go:
- The software engineer uses their preferred work environment: Slack, Microsoft Teams, etc.
- They provide all the details in plain English.
- The virtual assistant then executes their request. For example:
  - Provisions new cloud resources
  - Triggers a complex workflow
  - Provides some hard-to-find data
- Finally, the virtual assistant provides confirmation within the software engineer’s chosen work environment so they can get to work right away.
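The steps above can be sketched end to end. Here `provision_instance` and `post_to_channel` are hypothetical stand-ins for a real cloud SDK call and a real chat API (Slack, Teams, etc.); the point is the shape of the flow: request in, action executed, confirmation delivered back in the engineer's own channel.

```python
# Hedged end-to-end sketch of the virtual-assistant flow. The two helper
# functions are placeholders for real cloud and chat APIs, not actual SDKs.

def provision_instance(instance_type, region):
    # Placeholder for a real cloud API call made with tightly scoped permissions.
    return {"id": "i-0abc123", "type": instance_type, "region": region}

def post_to_channel(channel, text):
    # Placeholder for a real chat API call; prints instead of messaging.
    print(f"[{channel}] {text}")

def handle_request(channel, instance_type, region):
    resource = provision_instance(instance_type, region)
    # The confirmation lands where the engineer already works -- no context switch.
    post_to_channel(channel, f"{resource['type']} is ready in {resource['region']} ({resource['id']})")
    return resource

handle_request("#eng-infra", "g3.8xlarge", "us-west-1")
```

No ticket queue, no 48-hour gap, and the DevOps team only gets involved when an approval policy says they should.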
With conversational AI, both sides stay focused and productive. Software engineers can focus on development, and DevOps won’t have to waste time on context switching or endless, repetitive requests. So they can all leave the building arm in arm at the end of the day, ready to work out those old resentments at the bowling alley.
So let’s all hand conversational AI a Nobel Peace Prize; after decades of conflict, peace has broken out at last. And the big winner? Your organization, living in DevOps harmony happily ever after.