Sunday, May 3, 2009

Hacking the HOV Lanes

The city of Pittsburgh has high occupancy vehicle (HOV) lanes to expedite the flow of traffic during the morning and evening rush hours. Sounds good, huh? I don't know whether it was a result of poor planning, space constrained by the terrain, or a restricted budget, but the architects of the HOV solution decided to use the same two lanes in the morning and in the evening. In the morning the two lanes carry traffic into the city, and in the evening the traffic flows outward.

Are the merest glimpses of a problem beginning to emerge? There is a set of gates at each end of the HOV lanes, structured in a way that allows or prohibits traffic from entering the lanes. Typically, the gates into the city are open in the morning, and the gates leaving the city are open at the end of the day. A human operator is responsible for opening and closing the gates. It should be noted that it is impossible to see both ends of the HOV lanes from any single location. The operator therefore has to close one set of gates at one end of the HOV lanes before proceeding to the other end and opening those gates.

It is therefore possible to have both sets of gates closed, or both sets of gates open, at the same time. Having both ends closed is a bit of an inconvenience. Having both ends open is tempting disaster. Here is an excerpt from the website:

The worst accident to occur on the HOV lanes happened in 1995 between two cars which hit each other head-on and cost the lives of six people. A PennDOT employee did not close the gates to the outbound entrances, and was later convicted of improperly changing the lanes while under the influence of cocaine. After the accident, the number of HOV users dropped by more than 1,000 per day. Wrong-way accidents are unheard of today; however, this hasn't helped to increase the ridership.

The latest improvements to the HOV lanes were unveiled on May 18, 2006 in the form of a $770,000 automated "fast-acting" gate system which are the latest in a series of improvements such as CCTV cameras, automated interlocks on permanent gates, and improved signage since the 1995 accident. The new gates will be down during morning rush hours with overhead sensors to detect approaching inbound vehicles. If one is detected, the gate will raise to allow it to pass. During afternoon rush hours and weekends when the HOV lanes are open in the outbound direction, the gates will be up.

Did you catch the phrase, "convicted of improperly changing the lanes while under the influence of cocaine"? I will make no excuses for someone in dereliction of their duties as a result of self-inflicted, judgment-impairing activities. There's simply no excuse. That being said, was the architect of the solution also convicted? Tried? Admonished? Told to sit in a corner without crayons? Consider the cost of the "improvements" which were made after the HOV lanes were operational.

In his book "Why We Make Mistakes," Joseph Hallinan argues that when mistakes happen we tend to look down, not so much with our eyes as in our attempt to understand where to place the blame. We look down the chain of events to the people closest to the accident rather than looking upward to determine how the situation was allowed to exist and (more importantly) how to avoid a recurrence. I don't mean avoid as in band-aids and patches; I mean avoid as in designing solutions that cannot fail.

Even if we could wrap our heads around the insanity of a system which relied on human memory to avoid a catastrophic loss of life, how does one implement such a solution in the absence of closed-circuit television cameras, automated fast-acting gate systems, and other fail-safe devices? Make no mistake about it, these band-aids and hacks (there aren't any more-appropriate terms) are only necessary because of the fundamental design flaw in the system. Creating a high-speed roadway where traffic flows in both directions in a shared set of lanes is fundamentally flawed. We have learned that we need concrete barriers between high-speed lanes of opposing traffic, or maybe a wide grassy median; at the very least an ugly three-foot steel guide rail.
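The interlock idea translates directly into software. Here is a minimal sketch (my own illustration, not how PennDOT's actual interlocks work) of a gate controller whose API simply refuses to enter the deadly state, instead of relying on an operator's memory:

```python
from enum import Enum

class Gate(Enum):
    OPEN = "open"
    CLOSED = "closed"

class HovLanes:
    """Interlocked gate controller: the both-ends-open state is unreachable."""

    def __init__(self):
        # Fail-safe default: both ends closed (inconvenient, never deadly).
        self.inbound = Gate.CLOSED
        self.outbound = Gate.CLOSED

    def open_inbound(self):
        if self.outbound is Gate.OPEN:
            raise RuntimeError("outbound gates still open; close them first")
        self.inbound = Gate.OPEN

    def open_outbound(self):
        if self.inbound is Gate.OPEN:
            raise RuntimeError("inbound gates still open; close them first")
        self.outbound = Gate.OPEN

    def close_inbound(self):
        self.inbound = Gate.CLOSED

    def close_outbound(self):
        self.outbound = Gate.CLOSED
```

Note that the failure mode of this design is the inconvenient one (a refused request) rather than the catastrophic one, which is exactly the trade-off the original gate operators could not enforce.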

In his book "The Design of Everyday Things," Donald Norman states his credo about errors:

If an error is possible, someone will make it. The designer must assume that all possible errors will occur and design so as to minimize the chance of the error in the first place, or its effect once it gets made. Errors should be easy to detect, they should have minimal consequences, and, if possible, their effects should be reversible.

Imagine if the architect, project managers, budget holders, and users of the HOV lanes had embraced Norman's credo; even if only the first sentence, "If an error is possible, someone will make it."

In my industry, banking, no one dies from the mistakes we make. Any fundamental design flaws are, in the grand scheme of life, relatively minor. But that is no reason to be careless. Consider the solution you are working on right now. Forget malicious users for a moment (although you should never stop considering malicious users) and ask yourself: does this solution assume that users will always perform their job correctly? Will batch job 'B' always be executed after batch job 'A'? What, you say? Users don't execute batch jobs; only highly trained professionals do that!

If an error is possible, someone will make it. Batch jobs are usually initiated in sequence as a result of some script kicked off by a job scheduler. Batch job 'B' should contain logic to ensure that its input files are correct rather than assume that 'A' must have executed. User interface gurus will tell you, never trust the input. Actually, that's good advice for any API.
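For illustration, here is a minimal sketch of that defensive check in Python. The file name, the "status" marker, and the record count are all hypothetical conventions that job 'A' is assumed to follow; the point is that 'B' verifies its input rather than trusting the schedule:

```python
import json
from pathlib import Path

# Hypothetical hand-off file produced by batch job 'A'. The name, the
# "status" marker, and the record count are illustrative conventions,
# not any real banking system's format.
INPUT_FILE = Path("daily_transactions.json")

def validate_input(path: Path) -> dict:
    """Refuse to run unless the input provably came from a completed 'A' run."""
    if not path.exists():
        raise RuntimeError(f"{path} is missing -- did job 'A' run?")
    data = json.loads(path.read_text())
    if data.get("status") != "complete":
        # 'A' is assumed to stamp "complete" only as its very last step.
        raise RuntimeError(f"{path} exists, but job 'A' never finished")
    if len(data.get("records", [])) != data.get("record_count"):
        raise RuntimeError(f"{path} is truncated or corrupt")
    return data

def run_job_b() -> int:
    """Process the day's records; fail loudly instead of assuming 'A' ran."""
    data = validate_input(INPUT_FILE)
    return len(data["records"])  # placeholder for the real processing
```

The scheduler still runs 'B' after 'A', but 'B' no longer depends on that ordering being true; a missing or half-written input is the inconvenient failure, not the disastrous one.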

If an error is possible, someone will make it. If it is possible to shoot yourself with a nail gun, someone will do it, or rather, 37,000 people will do it. Every year. Have you looked at your current project with a Donald Norman perspective and eliminated (as opposed to hacking over) all of the HOV design flaws?
