Culture, schmulture. DevOps, agile need to be software-first again
Decades of preaching about meatware complicated dev life
"The talks get a little repetitive, don't they?" she said as we were walking out of the elevator and through the lobby, escaping the latest two-day DevOpsDays nerd fest. Unable to resist the urge to mansplain, I meekly volunteered that most of the attendees are first-timers, so, you know, maybe it's new to them.
Upstairs someone had said they'd like to see more technical talks, and fewer, as they're called, "culture" talks. Of course, I hadn't attended any of the talks because, you know, a thought lord like myself goes to many of these and has seen "all the talks". Many years into DevOps, even I'm sick of all this culture stuff!
Everything was going well until the people showed up
This emphasis on "culture" is now well known to induce agenda and presentation nausea on the DevOps circuit. For example, the most fashionable architectural style of the moment starts with humans: one wants to do microservices to take advantage of how humans work, building systems that mimic how they organise themselves and, thus, communicate with one another. It's all people, the latest microservices deck-flipper will say.
And then there's handling failure: instead of (only) hardening systems so that they never fail, accept that they'll always fail, and rapidly learn from failure, even relishing and rewarding it. Failure is learning, comrade! This push to improve by failing brings about the "blameless postmortem", perhaps the most baffling concept for the sassy old-timers in the glasshouse.
In the tech industry, we're never really sure which is more important: the tool, or how people use the tool. There have always been at least two humans involved, the builders and the users. The builders are the ones who create the software: developers, designers, operators, QA staff, product managers. And, of course, there are the people who actually use the software, the users, sometimes called "the customer", especially when it comes to consumer tech.
The Hyborian age of computing
Before DevOps, way before the recorded time of the web, The Mythical Man-Month by Fred Brooks emphasised the best way to organise developers, namely in something analogous to surgical teams – a sort of great man theory. Getting the right builders in place was key to great software. And of course there was Conway's observation, much revived now and drawn up into a "law", which (put slightly loosely) says software architecture will model the structure of the organisation that created it. Getting software to work well and do a job was something of a dancing bear for a long time: the quality of the bear's dancing was not the axis of judgement; the fact that the bear could dance at all was the point!
In response to this, you saw a horde of "usability" experts descend on the land. There were things like one-way mirrors and user-interaction testing festooned with cameras recording the user's every move. It was expensive, and slow. And in most cases the results seemed trivial: this button's text should be bigger; no one understands this error message; the configuration wizard should probably have fewer than 30 panels.
Nonetheless, the cat was out of the bag. The technology was now good enough that we could pay attention to how well actual users – humans – can use this software to get things done.
Things get extreme
Around this time, in the 1990s, early notions of agile software development formed. Any history of agile is fraught with a parade of agilesplainers with talk of Boehm's spirals, roses, and wikis. That's fine, and delightful over some snifters, but let's simplify it. In 1999, Kent Beck's eXtreme Programming Explained described a method that integrated the builders and users together in a novel, just-crazy-enough-to-work way. The method crystallised while Beck was working on Chrysler's payroll system, so it certainly had "enterprise" chops: this wasn't some pizza-based method for creating new Space Quest episodes. It was for real jobby-jobs!
One of eXtreme Programming Explained's core insights is that we have no idea what our software should actually do, and especially how it should be implemented, until we start trying. Rather than imagining the requirements a priori, it's only through an ongoing conversation with the user that we'll discover the right features. To do this, you would slice the release window down to something like a week, incrementally co-innovating with the users, creating small pieces of functionality and asking them "whaddya make of that?" You'd conquer the unknown by shipping, and changing your approach as you learned more.
To do this, you had to do less each cycle, automate quality control with tests, and optimise the labour of the developers with pair programming. Even more bonkers, you'd pluck someone from "the business" – or even actual users! – to embed in the team to be the voice of reason and fight for the users, as they say.
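That "automate quality control with tests" bit is less abstract than it sounds: each weekly slice of functionality ships with executable checks that run on every cycle, standing in for a manual QA pass. A minimal sketch, using a hypothetical feature (an invoice formatter, invented here for illustration) and Python's standard unittest module:

```python
import unittest

# Hypothetical feature sliced off in one weekly XP cycle:
# render an integer penny amount as a currency string.
# Small, testable, shippable.
def format_total(amount_pennies: int) -> str:
    pounds, pennies = divmod(amount_pennies, 100)
    return f"£{pounds}.{pennies:02d}"

class FormatTotalTest(unittest.TestCase):
    # Automated checks run every cycle, so regressions surface
    # the week they are introduced, not at a big-bang release.
    def test_whole_pounds(self):
        self.assertEqual(format_total(500), "£5.00")

    def test_pennies_are_zero_padded(self):
        self.assertEqual(format_total(501), "£5.01")

if __name__ == "__main__":
    unittest.main()
```

The point isn't the formatter, it's the feedback loop: the tests are cheap enough to rerun constantly, which is what makes the short release cycle survivable.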
These ideas ruffled the feathers of contemporary practitioners no end. They'd scoff and call agile people "cowboys" and other such derogatory grunts. "Agile" seemed bananas. Instead, people trusted their ability to predict what the software should do, confident that they could maximise requirements, fidelity and quality far beyond those absurd, agile short release loops.