April 18, 2024

Why I Don't Hate the Term "Cloud Computing"

I see now why they credit old people with having wisdom. If you hang around long enough, you begin to see patterns in the way the world works, mostly through sheer repetition.

I think I’ve been hanging around long enough—first as a technology journalist and now as an analyst—to explain why the concept of cloud computing seems to be gaining acceptance when other descriptions of the very same thing did not.

The evolution of the terminology for cloud computing mirrors what the best software developers are trying to do with computing in general: abstract away the complexity. The more you make the details of computing go away—at least for those of us who have to use the stuff rather than design it—the better off we all are.

Remember MS-DOS and the terror of the blinking C-prompt? There wasn’t a day when I confronted that big, blank screen with that cursor blinking impatiently at me when I didn’t have at least a fleeting sensation that I had forgotten the secret code, that I would be struck numb, that the cursor would just keep blinking and blinking and blinking and never stop. That a co-worker would happen by as I sat there staring into the black abyss and say what the aggro-geeks said about us users all the time back then: “Are you so freakin’ stoooopid that you can’t get past the C-prompt?!”

And then the question that allows people to pretend that they’re being helpful but which they know merely confirms your fecklessness and makes you feel like a five-year-old: “What is it that you are trying to do?”

How about kill the questioner?

We know what happened to the C-prompt. Apple killed it—or rather forced Microsoft to hide it. Today it’s like the monster behind a flimsy wooden door—you catch a terrifying glimpse every now and then, and when the monster breaks down the door and the blue screen of death appears, well, you just have to run away.

But at least we don’t have to remember what MS-DOS means, or A:, B:, or C:. We have terminology for that now that abstracts away the complexity: Windows, folders, and of course “my computer.” I still feel like a five-year-old when I see “my computer,” but at least I don’t have to rely as heavily on the deeply embedded memories in my basal ganglia when I start up my computer now.

But then things got complicated again. When we pulled computing out of the boxes and onto the network, we had client/server computing, which, in terms of terminology, was a lot like the C-prompt. It took many tries before you could absorb the difference between a client and a server—the forward slash alone was scary enough, like if you had to ask what the slash meant you were one of those stoooopid people who needed conjunctions. And the definition leaked like oil from an old British sports car. Did the client hold the entire application? Well sometimes, though usually just part of it. Did the server hold all the data? Sort of—eventually.

Then along came network computing. Ah, here was simplicity itself. The network is the computer. Just to prove it, Larry Ellison came out with a computer. But I thought you just said that… Makes you want to sit on the floor and cry, doesn’t it?

To clear up the confusion, we gave up the metaphors and got back to our terminological roots (psychologists would use a slightly darker term: regression)—and came up with an acronym: ASP. An application service provider was someone who managed your applications and data for you.

But isn’t that what they used to call outsourcing? Well, not really, because they do it at their data center, not your data center. But wait a minute; don’t IBM and Accenture have their own data centers that they use to serve clients? Yes, but…

Clearly, what we needed to do in the aftermath of the ASP era was to think bigger. Metaphors weren’t the problem; it was that the metaphors weren’t big enough. We needed to go huge with this thing. So along comes IBM with on-demand computing. Computing was going to be like plugging into a socket in the wall. Everything would just flow through the wire like power from the electric company. Except not for a while yet. For now, companies would continue to build their own data centers, or have IBM build them for them, sort of like how companies used to build their own power generators, umm, before … we … had … power … utilities.

In a fit of candor, we then started talking about grid computing. ’Cause after all, that’s what this thing really was: a grid of computers linked together to create one huge, all-knowing, all-powerful computer somewhere. Except now you couldn’t do little things on the big computer. You had to do big things, like figure out the human genome. The way I understood grid computing, I assumed that eventually someone would come and take my computer and my Internet connection away. The grid couldn’t be wasted on people as dumb and lacking in ambition as me. I cried (again) when I heard about grid computing.

But now we’ve finally broken free with cloud computing. It isn’t just a big metaphor, it’s an infinite metaphor. It’s downright existential. Clouds are everywhere and nowhere, big and tiny—puffy white things from far away and microscopic droplets of water up close. Nothing says that computing happens somewhere else—but relax, it doesn’t matter where or how—as well as cloud computing.

If I were still a journalist, my life would be wonderful. I wouldn’t have to explain the cloud to my readers. There is no loophole in the definition that invites me to delve deeper to understand what it really means, or to determine whether it’s a valid description of what’s going on here. The complexity has been completely abstracted away.

I know this all sounds cynical, but in the end it’s probably just as well that the cloud metaphor lets us off the hook of rigor. Like the DOS prompt, it’s not something most people need to know about.

It’s a lesson for all of us. Tech companies are complexity junkies, needing to explain and justify every detail until the marketing becomes as monstrous as the DOS prompt. We could all use a few more layers of abstraction—and some good metaphors—in our thinking and our communications.
