Wide Awake Developers

Main

Why Do Enterprise Applications Suck?

What is it about enterprise applications that makes them suck?

I mean, have you ever seen someone write 1,500 words about how much they love their corporate expense reporting system? Or spend their free time mashing up the job posting system together with Google maps? Of course not. But why not?

There's a quality about some software that inspires love in its users, and it's entirely absent from enterprise software. The best you can ever say about enterprise software is that it doesn't get in the way of the business. At its worst, enterprise software creates more work than it automates.

For example, in my company, we've got a personnel management tool that's so unpredictable that every admin in the company keeps his or her own spreadsheet of requests that have been entered. They have to do that because the system itself randomly eats input, drops requests, or silently trashes forms. It's not a training problem, it's just lousy software.

We've got a time-tracking system that has a feature where an employee can enter a vacation request. A little workflow is triggered so the supervisor can approve the vacation request. I've seen it used inside two groups. In both cases, the employee negotiates the leave request via email, then enters it into the time-tracking system. I know several people who use Travelocity to find their flights before they log in to our corporate travel system. And you wouldn't even believe how hard our sales force automation system is to use compared to Salesforce.com.

Way back in 1937, Ronald Coase elaborated his theory about why corporations exist. He said that a firm's boundaries should be drawn so as to minimize transaction costs... search and information costs, bargaining costs, and cost of policing behavior. By almost every measure, then, external systems offer lower transaction costs than internal ones. No wonder some people think IT doesn't matter.

If the best you can do is not mess up a nondifferentiating function like personnel management, it's tough to claim that IT can be a competitive advantage. So, again I'll ask, why?

I think there are exactly four reasons that internal corporate systems are so unloved and unlovable.

1. They serve their corporate overlords, not their users.

This is simple. Corporate systems are built according to what analysts believe will make the company more efficient. Unfortunately, this too often falls prey to penny-wise-pound-foolish decisions that micro-optimize costs while suboptimizing the overall value stream. Optimizing one person's job with a system that creates more work for a number of other people doesn't do any good for the company as a whole.

2. They only do gray-suited, stolidly conservative things.

Corporate IM now looks like an obvious idea, but messaging started frivolously. It was blocked, prohibited, and firewalled. In 1990, who would have spent precious capital on something to let cubicle-dwellers ask each other what they were doing for lunch? As it turns out, a few companies were on the leading edge of that wave, but their illicit communications were done in spite of IT.  How many companies would build something to "Create Breakthrough Products Through Collaborative Play?"

3. They have captive audiences.

If your company has six purchasing systems, that's a problem. If you have a choice of six online stores, that's competition.

4. They lack "give-a-shitness".

I think this one matters most of all. Commerce sites, Web 2.0 startups, IM networks... the software that people love was created by people who love it, too. It's either their ticket to F-U money, it's their brainchild, or it's their livelihood. The people who build those systems live with them for a long time, often years. They have reason to care about the design and about keeping the thing alive.

This is also why, once acquired, startups often lose their luster. The founders get their big check and cash out. The barnstormers that poured their passion into it discover they don't like being assimilated and drift away.

Architects, designers, and developers of corporate systems usually have little or no voice in what gets built, or how, or why. (Imagine the average IT department meeting where one developer says the system really ought to be built using Scala and Lift.) They don't sign on; they get assigned. I know that individual developers do care passionately about their work, but they usually have no way to really make a difference.

The net result is that corporate software is software that nobody gives a shit about: not its creators, not its investors, and not its users.

 

Beyond the Village

As an organization scales up, it must navigate several transitions. If it fails to make these transitions well, it will stall out or disappear.

One of them happens when the company grows larger than "village-sized". In a village of about 150 people or less, it's possible for you to know everyone else. Larger than that, and you need some kind of secondary structures, because personal relationships don't reach from every person to every other person. Not coincidentally, this is also the size where you see startups introducing mid-level management.

There are other factors that can bring this on sooner. If the company is split into several locations, people at one location will lose track of those in other locations. Likewise, if the company is split into different practice areas or functional groups, those groups will tend to become separate villages on their own. In either case, the village transition will happen well before headcount reaches 150.

It's a tough transition, because it takes the company from a flat, familial structure to a hierarchical one. That implicitly moves the axis of status from pure merit to positional. Low-numbered employees may find themselves suddenly reporting to a newcomer with no historical context. It shouldn't come as a surprise when long-time employees start leaving, but somehow the founders never expect it.

This is also when the founders start to lose touch with day-to-day execution. They need to recognize that they will never again know every employee by name, family, skills, and goals. Beyond village size, the founders have to be professional managers. Of course, this may also be when the board (if there is one) brings in some professional managers. It shouldn't come as a surprise when founders start getting replaced, but somehow they never expect it.

 

Social Factors

I mentioned Tom DeMarco just a couple of days ago. I'm re-reading his great book, Why Does Software Cost So Much? for the first time in about ten years.

Personally, I credit Tom as one of the unsung progenitors of the agile movement. Long before we had "Agile" or even "lightweight methods", Tom was talking about the psycho-social nature of software development. 

For instance, here's an excerpt from essay 8, "Nontechnological Issues in Software Engineering":

Imagine your boss just plunked a specification on your desk and asked, "How long will it take you and one other person to get this job done?" What's the first question out of your mouth?

Would you ask, "Can we use object-oriented methods?" or "What CASE system can we buy?" or "Is it okay to use rapid prototyping?" Of course not. Your first question is,

Who is the other person?

Absolutely. Right on, Tom. 

Coach and Team From Same Firm

Is it an antipattern to have a consulting firm provide both the coach and developers?  By providing the developers, the firm is motivated to deliver on the project, with coaching as an adjunct.  If, instead, the firm provides just the coach, it will be judged by how well the client adopts the process.  These two motives can easily conflict.

Case in point: at a previous client of mine, my employer was charged with completing the project, using a 50-50 mix of contractors and client developers.  My employer, a consulting firm, provided several developers experienced with XP and Scrum, as well as an agile coach.  The firm was thus charged with two imperatives: first, deliver the project; second, introduce agile methods within the client. 

With project success as a requirement, the firm decided to interview the developers at the outset of the project. The client's developers (rightly) perceived that they were interviewing for their own jobs.  This started a negative dynamic that ultimately resulted in 80% attrition among the client's developers.

On a pure coaching engagement, the coach would probably have "made do" with whomever the client provided. 

We delivered all the features, basically on time, with very high quality. Financially speaking, it was a success, generating more orders and more revenue per order than its predecessor.  It is harder to say that the engagement as a whole was a success, though.  Almost all of the developers were contractors, so the client got their product, but very little adoption of agile methods.

Perhaps if the coach and the contract developers had come from different firms, the motivations would not have been as tangled, and more of the client's valuable people would have stayed.  The team might not have suffered from the strained, unhealthy environment of the project's early days.

Then again, perhaps not.  The client may have been expecting that level of attrition. Maybe that's just to be expected when you're trying to bring a random selection of corporate developers over to agile methods, especially if the methods are decreed from above instead of grown from the grass roots. Maybe the dynamic would have existed even with a coach who was totally disinterested in the project outcome.

Planning to Support Operations

In 2005, I was on a team doing application development for a system that would be deployed to 600 locations. About half of those locations would not have network connections. We knew right away that deploying our application would be key, particularly since it was a "rich-client" application. (What we used to call a "fat client", before they became cool again.) Deployment had to be done by store associates, not IT. It had to be safe, so that a failed deployment could be rolled back before the store opened for business the next day. We spent nearly half of an iteration setting up the installation scripts and configuration. We set up our continuous build server to create the "setup.exe" files on every build. We did hundreds of test installations in our test environment.

Operations said that our software was "the easiest installation we've ever had." Still, that wasn't the end of it. After the first update went out, we asked operations what could be done to improve the upgrade process. Over the next three releases, we made numerous improvements to the installers:

  • Make one "setup.exe" that can install either a server or a client, and have the installer itself figure out which one to do.
  • Abort the install if the application is still running. This turned out to be particularly important on the server.
  • Don't allow the user to launch the application twice. Very hard to implement in Java. We were fortunate to find an installer package that made this a check-box feature in the build configuration file!
  • Don't show a blank Windows command prompt window. (An artifact of our original .cmd scripts that were launching the application.)
  • Create separate installation discs for the two different store brands.
  • When spawning a secondary application, force its window to the front, avoiding the appearance of a hang if the user accidentally gives focus to the original window.

These changes reduced support call volume by nearly 50%.
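The single-launch guard deserves a note. Our installer package gave it to us as a check-box, but the underlying idea can be approximated in plain Java with a file lock. This is a sketch under assumed paths and class names, not the installer's actual mechanism:

```java
import java.io.File;
import java.io.RandomAccessFile;
import java.nio.channels.FileChannel;
import java.nio.channels.FileLock;

/**
 * Sketch of a single-instance guard: the first process to lock the
 * file wins; later launches see the lock and exit. The lock file
 * name and location here are illustrative only.
 */
public final class SingleInstanceGuard {
    private static FileChannel channel;  // held open for the process lifetime
    private static FileLock lock;

    /** Returns true if this process acquired the lock (i.e., is the first instance). */
    public static boolean tryAcquire(File lockFile) {
        try {
            channel = new RandomAccessFile(lockFile, "rw").getChannel();
            lock = channel.tryLock();    // null (or an exception) if already held
            return lock != null;
        } catch (Exception e) {
            // Treat any failure, including an overlapping lock in this JVM,
            // as "application already running."
            return false;
        }
    }

    public static void main(String[] args) {
        File f = new File(System.getProperty("java.io.tmpdir"), "myapp.lock");
        if (!tryAcquire(f)) {
            System.err.println("Application is already running.");
            System.exit(1);
        }
        // ... launch the application ...
    }
}
```

The lock is released automatically when the process exits, so a crashed instance does not leave the application permanently "running" the way a stale PID file would.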

My point is not to brag about what a great job we did. (Though we did a great job.) To keep improving our support for operations, we deliberately set aside a portion of our team capacity each iteration. Operations had an open invitation to our iteration planning meetings, where they could prioritize and select story cards the same as our other stakeholders. In this manner, we explicitly included Operations as a stakeholder in application construction. They consistently brought us ideas and requests that we, as developers, would not have come up with.

Furthermore, we forged a strong bond with Operations. When issues arose---as they always will---we avoided all of the usual finger-pointing. We reacted as one team, instead of two disparate teams trying to avoid responsibility for the problems. I attribute that partly to the high level of professionalism in both development and operations, and partly to the strong relationship we created through the entire development cycle.

"Us" and "Them"

As a consultant, I've joined a lot of projects, usually not right when the team is forming. Over the years, I've developed a few heuristics that tell me a lot about the psychological health of the team. Who lunches together? When someone says "whole team meeting," who is invited? Listen for the "us and them" language. How inclusive is the "us" and who is relegated to "them?" These simple observations speak volumes about the perception of the development team. You can see who they consider their stakeholders, their allies, and their opponents.

Ten years ago, for example, the users were always "them." Testing and QA was always "them." Today, particularly on agile teams, testers and users often get "us" status (As an aside, this may be why startups show such great productivity in the early days. The company isn't big enough to allow "us" and "them" thinking to set in. Of course, the converse is true as well: us and them thinking in a startup might be a failure indicator to watch out for!). Watch out if an "us" suddenly becomes "them." Trouble is brewing!

Any conversation can create a "happy accident;" some understanding that obviates a requirement, avoids a potential bug, reduces cost, or improves the outcome in some other way. Conversations prevented thanks to an armed-camp mentality are opportunities lost.

One of the most persistent and perplexing "us" and "them" divisions I see is between development and operations. Maybe it's due to the high org-chart distance (OCD) between development groups and operations groups. Maybe it's because development doesn't tend to plan as far ahead as operations does. Maybe it's just due to a long-term dynamic of requests and refusals that sets each new conversation up for conflict. Whatever the cause, two groups that should absolutely be working as partners often end up in conflict, or worse, barely speaking at all.

This has serious consequences. People in the "us" tent get their requests built very quickly and accurately. People in the "them" tent get told to write specifications. Specifications have their place. Specifications are great for the fourth or fifth iteration of a well-defined process. During development, though, ideas need to be explored, not specified. If a developer has a vague idea about using the storage area network to rapidly move large volumes of data from the content management system into production, but he doesn't know how to write the request, the idea will wither on the vine.

The development-operations divide virtually ensures that applications will not be transitioned to operations as effectively as possible. Some vital bits of knowledge just don't fit into a document template. For example, developers have knowledge about the internals of the application that can help diagnose and recover from system failures. (Developer: "Oh, when you see all the request handling threads blocked inside the K2 client library, just bounce the search servers. The app will come right back." Operations: "Roger that. What's a thread?") These gaps in knowledge degrade uptime, either by extending outages or preventing operations from intervening. If the company culture is at all political, one or two incidents of downtime will be enough to start the finger-pointing between development and operations. Once that corrosive dynamic gets started, nothing short of changing the personnel or the leadership will stop it.

Inviting Domestic Disaster

We had a minor domestic disaster this morning. It's not unusual. With four children, there's always some kind of crisis. Today, I followed a trail of water along the floor to my youngest daughter. She was shaking her "sippy cup" upside down, depositing a full cup of water on the carpet... and on my new digital grand piano. 

Since the entire purpose of the "sippy cup" is to contain the water, not to spread it around this house, this was perplexing.

On investigation, I found that this failure in function actually mimicked common dynamics of major disasters. In Inviting Disaster, James R. Chiles describes numerous mechanical and industrial disasters, each with a terrible cost in lives. In Release It!, I discuss software failures that cost millions of dollars---though, thankfully, no lives. None of these failures comes as a bolt from the blue. Rather, each one has precursor incidents: small issues whose significance is only obvious in retrospect. Most of these chains of events also involve humans and human interaction with the technological environment.

The proximate cause of this morning's problem was inside the sippy cup itself. The removable valve was inserted into the lid backwards, completely negating its purpose. A few weeks earlier, I had pulled a sippy cup from the cupboard with a similarly backward valve. I knew it had been assembled by my oldest, who has the job of emptying the dishwasher, so I made a mental note to provide some additional instruction. Of course, mental notes are only worth the paper they're written on. I never did get around to speaking with her about it.

Today, my wonderful mother-in-law, who is visiting for the holidays, filled the cup and gave it to my youngest child. My mother-in-law, not having dealt with thousands of sippy cup fillings, as I have, did not notice the reversed valve, or did not catch its significance.

My small-scale mess was much easier to clean up than the disasters in "Release It!" or "Inviting Disaster". It shared some similar features, though. The individual with experience and knowledge to avert the problem--me--was not present at the crucial moment. The preconditions were created by someone who did not recognize the potential significance of her actions. The last person who could have stopped the chain of events did not have the experience to catch and stop the problem. Change any one of those factors and the crisis would not have occurred.

New Interview Question

So many frameworks... so much alphabet soup on the resumes.

Anyone who ever reads The Server Side or Monster.com knows exactly which boxes to hit when writing a resume. The recruiters telegraph their needs from a mile away. (Usually because they couldn't care less about the differences or similarities between Struts, JSF, WebWork, etc.) As long as the candidate knows how to spell Spring and Hibernate, they'll get submitted to the "preferred vendor" system.

Being one of those candidates is tough, but that's not the part I'm concerned about now. I'm interested in weeding out the know-nothings, the poseurs, and the fast talkers.

When I'm interviewing somebody, my main criterion is this: would I want to work on a two-person project with this candidate? My secondary criterion is "Would I feel comfortable leaving this person alone at a client site? Will they deliver value to the client? Will they look like an idiot, and by extension, make me look like an idiot?"

My friend Dion Stewart had a great idea for a weed-out question. No matter what frameworks the candidate shows on the resume, ask them what they disliked the most about the framework. (I have my top three list for each framework I've worked in... except NeXT's Enterprise Objects Framework. But that's another story.)

If they can't answer at all, then they haven't actually worked with the framework. They're just playing buzzword bingo.

If they answer, but it sounds like bullshit, then odds are they're bullshitting you.

If they have never thought about it, haven't formed an opinion, or say "it's all good", then they lack passion about what they do.

A candidate that is driven, that cares about the quality-without-a-name should be able to go on a rant about something in each framework they've actually worked with. In fact, you've really hit the jackpot if your candidate *can* go on a rant, but does it in a professional, reasoned way. I love to see a candidate that can show some fire without seeming like a loon. That's when I can see how they'll react when the client makes a decision the candidate considers boneheaded. (I've seen some spectacular pyrotechnics from consultants that forgot whose money they're spending. But that's another story.)


On Relativism and Social Constructions

The key operative precept of post-modernism is that all reality is a social construct. Since no institution or normative behavior stems from natural cause, and there is no objective, external reality, then all institutions and attitudes are just social constructs. They exist only through the agreement of the participants.

Nothing can be sacred, since sanctification comes from outside, by definition.

If nothing is sacred, and institutions have no more reality than a children's amorphous game of ball, they deduce that any construct can be reconstructed through willful choice.

Even if you accept the precept that there is no objective, external (let alone universal) value system, you can still see the fundamental fallacy in this thinking.

Anyone who has ever tried to bring change into a hidebound organization knows that social constructs are far harder to change than any physical or legal structure. You can reorganize units, bring locations together, shuffle management, or get rid of half of the people. Still, underlying social organization will re-emerge as long as there is any vestige of continuity.

Much of the heat energy in the ongoing culture war arises from this inertia. Those who are so tiresomely labelled as "liberal", "progressive", the "Left", the "Cultural Elite", etc. represent a large force of people aimed at deliberately reconstructing every institution in Western life. They have decided, based on their own feelings, bereft of natural or religious law, that any institution observed by men for more than one hundred years cannot be endured. They are organized around the post-modern paradigm--armed with Hayakawa and Chomsky--and don't accept that some hidebound Neanderthals will not welcome forceful re-education.

I suppose that I follow a third way. I can agree that our institutions are social constructs. That does not mean that they can, or should be, tampered with lightly. The concept of "natural law" teaches that certain modes of behavior, certain morals, generate a more successful society. Our social institutions--like marriage--have undergone the same forces of competitive pressures and differential reproduction that drive neo-Darwinian evolution. That means the institutions we observe today--such as preserving the integrity of personal property--are the ones that worked.

There is an argument to be made that I'm advocating cultural imperialism. It could perhaps be seen that way, though such is not my intent. Rather, just as we should justifiably be wary of changing our own genetic code, we should be wary of making large changes to our social institutions. We do not know what will result. There are many paths down the mountain, but only one upward. Most random mutations result in death. Even well-planned changes have unintended, sometimes catastrophic, effects.


Uniting Reason and Passion

Reason and Passion need not conflict. Reason without passion is dusty, dry, and dead. Reason without passion leads to moral relativity. If nothing moves the thinker to passion, then all subjects are equal and without distinction. As well to discuss the economic benefits of the euthanasia of infants as the artistic merits of urinals.

Passion without reason brings the indiscriminate energy of a summer's thunderstorm. Too much energy unbound, without direction, its fury as constant as the winds of the air.

Passion provides energy, the drive to accomplish, change, improve, or destroy. Reason provides direction. Reason channels Passion and achieves goals by identifying targets, foci, leverage points. Passion powers Reason. It brings motive power. Passion knows that things must be done and that change is possible. Reason knows how change may be effected.

I was reminded of the fallacy of Passion without Reason recently. At lunch with a friend, she talked about working with a non-profit organization. Workers for non-profits epitomize those who are driven by Passion. Agree or disagree with their aims, you must admit that they earnestly mean to change the world. My friend, who comes from the profit-driven corporate world, was explaining some aspects of statistical process control and how it could be applied to improve fundraising results on their website. She was told that she needed to have more heart and feel for those unfortunates that this group helps.

Her critic obviously felt that her approach was too analytical. Too driven by Reason, not enough Passion. In fact, the opposite was true. She was applying the combination of Reason and Passion. Passion showed her that the cause was worthy and that she could help. Reason showed her where leverage could be gained and a small effort input could result in a large change in output.

In various dysfunctional organizations which I have inhabited, I've seen many examples of the opposite. Reason reveals problems and solutions to those poor sapient cogs in the low levels of the machine. They lack the Passion to see that change is possible and so divest themselves of the power to improve their own lot in life. Problems or challenges will always overcome such people, because they give the problem power and remove it from themselves.


More Wiki

My personal favorite is TWiki. It has some nice features like file attachments, a great search interface, high configurability, and a rich set of available plugins (including an XP tracker plugin.)

One cool thing about TWiki: configuration settings are accomplished through text on particular topics. For example, each "web" (set of interrelated topics) has a topic called "WebPreferences". The text on the WebPreferences topics actually controls the variables. Likewise, if you want to set personal preferences, you set them as variables--in text--on your personal topic. It's a lot harder to describe than it is to use.

There are some other nice features like role-based access control (each topic can have a variable that says which users or groups can modify the topic), multiple "webs", and so on.

The search interface is available as variable interpolation on a topic, so something like the "recent changes" topic just ends up being a date-ordered search of changes, limited to ten topics. This means that you can build dynamic views based on content, metadata, attachments, or form values. I once put a search variable on my home topic that would show me any task I was assigned to work on or review.
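A view like that is just inline search markup on the topic. As a rough sketch (the web name, form field, and exact parameters here are made up for illustration; TWiki's own VarSEARCH documentation is the authority for your version):

```
%SEARCH{ "MichaelNygard" web="Tasks" scope="text" nonoise="on" order="modified" reverse="on" limit="10" }%
```

Rendered, that produces a table of the ten most recently modified topics in the Tasks web that mention my name, which is all a "my open work" dashboard really needs to be.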

I've also been looking at Oahu Wiki. It's an open source Java wiki. It's fairly short on features at this point, but it has by far the cleanest design I've seen yet. I look forward to seeing more from this project.


Moving on

The latest in my not-exactly-daily news and commentary...

As of December 10th, I will be leaving Totality Corporation. It has been a challenge and an education. It has also been an interesting time, as we uncovered the hidden linkages from daily activities to ultimate profitability. The managed service provider space is still new enough that the business models are not all so well-defined and understood as in consulting. I earnestly hope that I am leaving Totality in a much better place than it was when I joined.

Still, a number of positive attractions to the new position and some negative forces away from my current position have overcome inertia.

I will be joining Advanced Technologies Integration as a consultant. I will be forming a team with Kyle Larson, Dale Schumacher, and Dion Stewart to do a development project for one of ATI's clients. The project itself has some moderately interesting requirements... it's not just another random commerce site. (I'm really, really bored with shopping carts!) The thing that really attracted me though, is that this is a hardcore agile methods project. We'll be using a combination of Scrum and XP.

For a long time, I've advocated small teams of highly skilled developers. I have seen such teams produce many times the business value (and ROI) of the typical team. ATI and this client are willing to subscribe to the theory that a small, high-caliber team will outperform an army of cheap morons.

It's going to be a blast proving them right!

The Lights Are On, Is Anybody Home?

We pay a lot of attention to stakeholders when we create systems. The end users get a say, as do the Gold Owners. Analysts put their imprimatur on the requirements. In better cases, operations and administration add their own spin. It seems like the only group that doesn't have any input during requirements gathering is the development team itself. That is truly unfortunate.

Not even the users have to live with the system as much as the developers do. Developers literally inhabit the system for most of their waking hours, just as much as (or maybe more than) they inhabit their cubes or offices. When the code is messy, nobody suffers more than the developers. When living in the system becomes unpleasant, morale will suffer. Any time you hear a developer ask for a few weeks of "cleanup" after a release, what they are really saying is, "This room is a terrible mess. We need to remodel."

A code review is just like an episode of "Trading Spaces". Developers get to trade problems for a while, to see if somebody else can see possibilities in their dwelling. Rip out that clunky old design that doesn't work any more! Hang some fabric on the walls and change the lighting.

Whether your virtual working environment becomes a cozy place, a model of efficiency, or a cold, drab prison, you create your own living space. It is worth taking some care to create a place you enjoy inhabiting. You will spend a lot of time there before the job is done.

The Paradox of Honor

You can use a person's honor against him only if he values honor. Only the honest man is threatened by the pointed finger. The liar is unaffected by that kind of accusation. I think it is because there is no such thing as "dishonesty". There is only honesty or its lack. Not a thing and its opposite, but a thing and its absence. One or zero, not one or minus-one. One who lacks a thing cannot be threatened with the prospect of its loss.


Bill Joy Knocks the Open Source Business Model

Bill Joy had some doubts to voice about Linux. Of course, like so many others he immediately jumps to the wrong conclusion. "The open-source business model hasn't worked very well," he says.

Tough nuts. Here's the point that seems to get missed over and over again. There is no "open source business model". There never was, and I doubt there ever will be. It doesn't exist. It's a contradiction in terms.

Open source needs no business model.

Look, GNU existed before anyone ever talked about "open source". Linux was built before there were companies like RedHat and IBM interested (let alone Sun). The thing that the corps and the pundits cannot seem to grasp is their absolute irrelevance.

It's like Bruce Sterling's speech. Harangue. Whatever you want to call it. I see it as yet another person getting up and trying to tell the "open-source community" what they need to do. Getting on their case about not being organized enough... or something.

Or it's like those posters on Slashdot that wish either GNOME or KDE would shut down so everyone can focus on one "standard" desktop.

Or Scott McNealy, lamenting the fact that open source Java application servers inhibit the expenditure of dollars that could be used to market J2EE against .Net.

Or the UI designers who froth at the mouth about how terrible an open source application's user interface may be. They say moronic things like "when will coders learn that they shouldn't design user interfaces?" (Or the more extreme form, "Programmers should never design UIs.")

Or it's like anyone who looks at an application and says, "That's pretty good. You know what you really need to do?"

None of these people get the point. I'll say it here as baldly as I can.

There is nobody in charge. Not IBM, not Linus Torvalds, not Richard Stallman. Nobody.

All you will find is an anarchic collection of self-interested individuals. Sometimes they collaborate. Some of them work together, some work apart, some work against each other. To the extent that some clusters of individuals share a vision, they collaborate to tackle bigger, cooler projects.

There is no one in control. Nobody gets to decree what open source projects live or die, or what direction they go in. These projects are an expression of free will, created by those capable of expressing themselves in that medium. Decisions happen in code, because coders make them happen.

Free will, baby. It's my project, and I'll do what I want with it. If I want to create the most god-awful user interface ever seen by Man, that's my prerogative. (If I want lots of users, I probably won't do that, but who says I have to want lots of users? It's my choice!)

As long as one GNOME hacker wants to keep working on GNOME, it will continue to evolve. As long as one Linux kernel hacker keeps coding, Linux will continue. None of these things require corporations, IPOs, or investment dollars to continue. The only true investments in open source are time and brainpower. Money is useful in that it can be used to purchase time, the greatest gift you can give a coder. Corporations are useful in that they are effective at aggregating and channeling money. "Useful", not "required".

As long as coders have free will and the tools to express it, open source software will continue. In fact, even if you take away their tools, they'll build new ones! To truly kill open source software, you must kill free will itself.

(And, by the way, there are those who want to do exactly that.)

Needles, Haystacks

So, this may seem a little off-topic, but it comes round in the end. Really, it does.

I've been aggravated with the way members of the fourth estate have been treating the supposed "information" that various TLAs had before the September 11 attacks. (That used to be my birthday, by the way. I've since decided to change it.) We hear that four or five good bits of information scattered across the hundreds of FBI, CIA, NSA, NRO, IRS, DEA, INS, or IMF offices "clearly indicate" that terrorists were planning to fly planes into buildings. Maybe so. Still, it doesn't take a doctorate in complexity theory to figure out that you could probably find just as much data to support any conclusion you want. I'm willing to bet that if the same amount of collective effort were invested, we could prove that the U.S. Government has evidence that Saddam Hussein and aliens from Saturn are going to land in Red Square to re-establish the Soviet Union and launch missiles at Guam.

You see, if you already have the conclusion in hand, you can sift through mountain ranges of data to find those bits that best support your conclusion. That's just hindsight. It's only good for gossipy hens clucking over the backyard fence, network news anchors, and not-so-subtle innuendos by Congresscritters.

The trouble is, it doesn't work in reverse. How many documents does just the FBI produce every day? 10,000? 50,000? How would anyone find exactly those five or six documents that really matter and ignore all of the chaff? That's the job of analysis, and it's damn hard. A priori, you could only put these documents together and form a conclusion through sheer dumb luck. No matter how many analysts the agencies hire, they will always be crushed by the tsunami of data.

Now, I'm not trying to make excuses for the alphabet soup gang. I think they need to reconsider some of their basic operations. I'll leave questions about separating counter-intelligence from law enforcement to others. I want to think about harnessing randomness. You see, government agencies are, by their very nature, bureaucratic entities. Bureaucracies thrive on command-and-control structures. I think it comes from protecting their budgets. Orders flow down the hierarchy, information flows up. Somewhere, at the top, an omniscient being directs the whole shebang. A command-and-control structure hates nothing more than randomness. Randomness is noise in the system, evidence of inadequate procedures. A properly structured bureaucracy has a big, fat binder that defines who talks to whom, and when, and under what circumstances.

Such a structure is perfectly optimized to ignore things. Why? Because each level in the chain of command has to summarize, categorize, and condense information for its immediate superior. Information is lost at every exchange. Worse yet, the chance for somebody to see a pattern is minimized. The problem is this whole idea that information flows toward a converging point. Whether that point is the head of the agency, the POTUS, or an army of analysts in Foggy Bottom, they cannot assimilate everything. There isn't even any way to build information systems to support the mass of data produced every day, let alone correlate reports over time.

So, how do Dan Rather and his cohorts find these things and put them together? Decentralization. There are hordes of pit-bull journalists just waiting for the scandal that will catapult them onto CNN. ("Eat your heart out Wolf, I found the smoking gun first!")

Just imagine if every document produced by the Minneapolis field office of the FBI were sent to every other FBI agent and office in the country. A vast torrent of data flowing constantly around the nation. Suppose that an agent filing a report about suspicious flight school activity could correlate that with other reports about students at other flight schools. He might dig a little deeper and find some additional reports about increased training activity, or a cluster of expired visas that overlap with the students in the schools. In short, it would be a lot easier to correlate those random bits of data to make the connections. Humans are amazing at detecting patterns, but they have to see the data first!

This is what we should focus on. Not on rebuilding the $6 Billion Bureaucracy, but on finding ways to make available all of the data collected today. (Notice that I haven't said anything that requires weakening our 4th or 5th Amendment rights. This can all be done under laws that existed before 9/11.) Well, we certainly have a model for a global, decentralized document repository that will let you search, index, and correlate all of its contents. We even have technologies that can induce membership in a set. I'd love to see what Google Sets would do with the 19 hijackers' names, after you have it index the entire contents of the FBI, CIA, and INS databases. Who would it nominate for membership in that set?
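For the curious, the set-induction idea can be sketched in a few lines. This is a hypothetical toy of my own, with invented item names; it bears no resemblance to Google Sets' real scale or algorithm. The principle is simply to score every item by how strongly it co-occurs with the seed items across documents, then nominate the high scorers.

```python
from collections import Counter

# Toy "documents" (sets of items). A real system would index
# millions of records, not four hand-made sets.
documents = [
    {"alpha", "bravo", "charlie"},
    {"alpha", "bravo", "delta"},
    {"bravo", "charlie", "echo"},
    {"golf", "hotel"},
]

def expand_set(seeds, docs, top_n=2):
    """Score each non-seed item by how many seed items it
    co-occurs with, summed over all documents, and nominate
    the top_n highest-scoring candidates."""
    scores = Counter()
    for doc in docs:
        overlap = len(seeds & doc)
        if overlap:
            for item in doc - seeds:
                scores[item] += overlap
    return [item for item, _ in scores.most_common(top_n)]

result = expand_set({"alpha", "bravo"}, documents)
# "charlie" co-occurs most strongly with the seeds, then "delta".
```

Crude as it is, even this version captures the interesting property: the nominations emerge from the data, not from anyone's prior conclusion.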

Basically, the recipe is this: move away from ill-conceived ideas about creating a "global clearinghouse" for intelligence reports. Decentralize it. Follow the model of the Internet, Gnutella, and Google. Maximize the chances for field agents and analysts to be exposed to that last, vital bit of data that makes a pattern come clear. Then, when an agent perceives a pattern, make damn sure the command-and-control structure is ready to respond.

Here's my number one frustration

Here's my number one frustration with the state of the industry today. I am a professional. I regard my work as a craft to be studied and learned. Yet, in most domains, there is no benefit to developing a high level of skill. You end up surrounded by people who don't understand a word you say, can't work at that level, and don't really give a damn. They'll get the same rewards and go home happy at 5:00 every day. It's like, once you achieve a base level of mediocrity, there's no benefit for further personal development. In fact, there's a distinct disadvantage, in that you end up pulling ridiculous hours to clean up their garbage.

Bah, there I go being bitter again. Maybe I just need to work in some other domain--one where skills count for something, and being good at your job is a benefit, not a hindrance. I'm sick of writing Address classes, anyway.


Debating "Web Services"

There is a huge and contentious debate under way right now related to "Web services". A sizable contingent of the W3C and various XML pioneers are challenging the value of SOAP, WSDL, and other "Web service" technology.

This is a nuanced discussion with many different positions being taken by the opponents. Some are critical of the W3C's participation in something viewed as a "pay to play" maneuver from Microsoft and IBM. Others are pointing out serious flaws in SOAP itself. To me, the most interesting challenge comes from the W3C's Technical Architecture Group (TAG). This is the group tasked with defining what the web is and is not. Several TAG members, including the president of the Apache Foundation, are arguing that "Web services" as defined by SOAP fundamentally are not "the web". ("The web" being defined crudely as "things are named via URIs" and "every time I ask for the same URI, I get the same results". My definition, not theirs.) With a "Web service", a URI doesn't name a thing, it names a process. What I get when I ask for a URI is no longer dependent solely on the state of the thing itself. Instead, what I get depends on my path through the application.
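To make that distinction concrete, here's a toy sketch of my own (not anything from the debate itself, and not real HTTP): a resource-oriented web where a URI names a thing, so repeated requests return the same representation, versus an RPC-style endpoint where one URI names a process and the reply depends on the conversation so far.

```python
class ResourceWeb:
    """URIs name things: a GET is repeatable and path-independent."""

    def __init__(self):
        # Hypothetical resource and URI, purely for illustration.
        self.resources = {"/orders/42": "status=shipped"}

    def get(self, uri):
        # The answer depends only on the state of the named thing.
        return self.resources[uri]


class RpcEndpoint:
    """One URI names a process: the reply depends on session state."""

    def __init__(self):
        self.step = 0

    def post(self, uri, body):
        # Each call advances the conversation, so the same request
        # to the same URI yields a different result each time.
        self.step += 1
        return f"step-{self.step} reply to {body}"


web = ResourceWeb()
assert web.get("/orders/42") == web.get("/orders/42")  # same thing, same answer

rpc = RpcEndpoint()
first = rpc.post("/soap/endpoint", "getOrderStatus")
second = rpc.post("/soap/endpoint", "getOrderStatus")
assert first != second  # same URI, different result
```

The asserts are the whole argument in miniature: in one model the URI identifies state, in the other it identifies a port into an application.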

I'd encourage you all to sample this debate, as summarized by Simon St. Laurent (a longtime XML author and commentator).


Prison of our own Making

We who build worlds dwell in a dank and dismal prison of our own construction, though not our design. Why so difficult? Where is the green grass? Where is the sunshine?


Lately, I have been struggling

Lately, I have been struggling to find the meaning in my work. I suppose that's not surprising. I am a human being--a mortal creature. My age will soon flip a decimal digit. (I decline to specify which.) These can certainly cause one to spend time reflecting on one's legacy. They can also cause one to buy a flaming red sports car. I may explore that option later.

 

I also work in a field of incredible transience. Two hundred years from now, no cathedral will bear my mark. No train depot of my design will grace the National Register of Historic Places. No literary critics will deconstruct the significance of my characters' middle initials. In truth, the shelf life of my work compares poorly to that of a gallon of milk.

I am a programmer.

I and my comrades can usually be found behind our glowing screens, working hour after hour to bring some other person's vision to life. We who grapple with chaos and ether and mud expend our spirit, energy, life, time, soul, and qi in the name of creation. We work long after the managers have left. We learn the janitors' names. I have often gazed out my window to the neon street below, full of theater signs, restaurants, and wandering crowds seeking to be entertained. I have wondered what kind of life I should have led to be in that crowd instead of watching it. I've wondered how I could rejoin that human mass. I think I'd have to change careers.

I cannot deny, however, that my work brings me deep--if ephemeral--satisfaction. The harsh joy of self-sacrifice combines with the exultant delight of success when a project comes together. When I finally get my programs to work, it's a kind of magic, dense and layered. At one level, the thought that my work will be useful to someone--that it will make dozens, hundreds, maybe millions of people more individually powerful--is heady and exciting.

At another level, I have a fierce pride that my software works at all. Knowing that my creation is strong enough, powerful enough to survive the threat of millions of users doing their damnedest to destroy it. Despite the teeming millions trying to prove that there is no such thing as "foolproof", my software keeps working. "Robust", we call it. "Resilient". "Come on", it says, "bring it on."

Deeper still, I take a craftsman's pride in a job well done. Like a mason or a carpenter, I know what is under the surface. I know how well it is put together. I know what skill went into its construction. No one else may see this, but I know.