Stan Kelly-Bootle graduated from Cambridge University, U.K., where he earned the world's first postgraduate diploma in computer science. Kelly-Bootle is the author of The Devil's DP Dictionary and four less serious computer books.
The traditional English folk song misquoted in my title was actually a "Come All Ye Fair Maids" broadside, warning us not to misdally beyond a certain critical point, lest our "Thyme" be irreversibly assailed. Many of the sweet-sounding, sweetly blooming flowers and herbs of the Elizabethan country garden packed a far from innocently bucolic symbolism.
These down-to-earth ciphers eventually suffered Victorian gentrification. When The Language of Flowers appeared in 1913 (facsimile edition, London, U.K.: Michael Joseph Ltd., 1968), thyme had lost its sexual connotations. Thyme now simply means activity, an interesting example of what George Steiner might call "euphemistic double metonymy," substituting a neutral instrument for the unspoken target of the unspoken action.
The 1913 floral mapping, by the way, could possibly provide a context-free LALR(1) grammar, but I fear that Michael Swaine might promote this as the Primarose Paradigm. The extra folksy syllable in "primarose" is good etymology as well as mandatory for scansion, as in "The Banks of the Sweet Primaroses." And you would see Interflora blossoming as a data-communications leader, with GarLAN protocols for downloading structured, wreath-only bouquets. Just a guess.
If the Elizabethan herbal double entendres were still in vogue, Ophelia's mad scene would provoke even more blushes. Our present corporate culture, though, would support my homophone: time is the precious commodity to be defended more fiercely than one's virginity. Let no person or event interfere with your time-management (TM) strategy.
But what, as they ask rhetorically in primers, is time and how can it be managed? The poets are not too helpful:
O let not time deceive you,
You cannot conquer time!
--W. H. Auden
And the arch enemy of TM:
What is this life, if full of care
We have no time to stand and stare?
--W. H. Davies
Programmers get close to the quintessential paradox of time when they realize that the nanosecond is finite -- add enough of them sequentially and you get what is known in the trade as "unacceptable response." We learn to respect those epsilons of clockticks and spend weeks shaving cycles. Our image of time is a 25-MHz crystal rather than Old Father Time rolling along with Old Man River. We realize, of course, that if you are an electron in orbit (or whatever they do nowadays), the nanosecond is an eternity, but we are still haunted by Zeno's stationary, moving arrow. Is time infinitely divisible? If so, how do all those zero-length instants add up to blink my cursor and then press on to make a long winter's night at a terminal? The smallest discussable time period at the moment (sic) is the first 10^-43 seconds following the Big Bang, yet apparently much was accomplished. Great planning and TM, no doubt, that should inspire us to make every cycle pay.
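A back-of-the-envelope sketch of how those epsilons pile up (the machine and workload figures below are my own illustrative assumptions, not measurements of anything):

    # How finite nanoseconds become "unacceptable response."
    # Illustrative assumptions: a 25-MHz clock (40 ns per cycle)
    # and a hypothetical transaction costing 50 million cycles.
    ns_per_cycle = 1e9 / 25e6            # 40 ns at 25 MHz
    cycles_per_transaction = 50_000_000  # assumed workload
    response_s = cycles_per_transaction * ns_per_cycle / 1e9
    print(f"response = {response_s:.1f} seconds")  # 2.0 seconds

Two full seconds of a user's life, assembled entirely from quantities too small to notice.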
The smallest interval in everyday life, the cynics claim, is the time between the traffic lights turning green and the Klaxon hoot from the car behind you. Or perhaps the gap between subscribing to a computer magazine and receiving a letter urging you to renew. I recall on one occasion getting so mad at Byte's hectic requests to renew a subscription that still had five months to run that I ticked the three-year box and sent my check. Almost by return mail, I received a note explaining that under the new tax laws pending, a six-year subscription was strongly advised.
Physicists use a T in their equations quite remote from our biological clock and its daily scheduling impositions. Their T ranges from 0 seconds to 10^10 years and can even be made to run backward by changing a sign here and there. Stephen Hawking has written a best-seller on the subject, A Brief History of Time (New York, N.Y.: Bantam Books, 1988), covering the entire span from T0 to Tmax.
However, a New York Times survey revealed that Hawking's book was the least-read, least-understood best-seller of all time! Apparently, the American publishers insisted that Professor Hawking remove all obfuscating mathematical formulae from the book, with the exception of E = mc^2, which was considered to be well-known and harmless to the general reader's synapses.
The highlight of the book is Hawking's explanation of why a (3+1)-dimensional universe (three for space, one for time) is just about right for earthlings. Two-dimensional creatures have a topologically difficult job ingesting food and evacuating waste since a proper digestive tract would literally split them asunder. In the other direction, having more than three spatial dimensions would play havoc with the force of gravity, leading to planetary instability. If other spatial dimensions are lurking, as mooted by the non-ASCII super-string theorists, they must be incredibly tiny, coy, and wrapped up in themselves.
The general reader, unaccustomed to the tricks played by mathematicians, will be startled by the fact that a point event in our (3 + 1) space-time manifold is expressed with four coordinates: x, y, z, it, where i is the frightening (to some) and imaginary (the misnomer survives) square root of -1. The so-called imaginary time axis, however, uses i for purely technical reasons, not because of any mystical properties. The i helps keep track of events' time components. A kind of ongoing cosmological TM, perhaps?
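To see the bookkeeping at work (a standard textbook identity, nothing peculiar to Hawking), square the interval with the time coordinate written as it, in units where c = 1:

    s^2 = x^2 + y^2 + z^2 + (it)^2
        = x^2 + y^2 + z^2 - t^2        (since i^2 = -1)

The minus sign that distinguishes the time axis from the three space axes emerges automatically; no mysticism required.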
You may wonder why we have only one time dimension, apart from mollycoddling the astrologists and diary manufacturers. Well, I have an old, battered copy of Sir Arthur Eddington's The Mathematical Theory of Relativity (Cambridge, U.K.: Cambridge University Press, 1923), which reveals that a (2 + 2) universe (two space and two time dimensions) is a perfectly reasonable object for mathematical investigation.
You may recall Eddington's role in the earliest recorded relativistic joke. Eddington, after giving the first public exposition of Einstein's new theory in England, was asked by a reporter if it was true that only three people in the world understood relativity. Eddington paused for a while, eventually saying, "I'm wondering who the other one could be."
Eddington considers the boundary conditions if a real (2 + 2) space-time continuum rubbed shoulders with our (3 + 1) version somewhere. The sums reveal that no electromagnetic signals could pass in either direction--a situation that will surely ring a bell with most readers. He concludes that a (2 + 2) universe cannot enjoy spatiotemporal relations with ours, which is a pedantic way of saying that it does not exist. More precisely, it means that if extraterrestrials are struggling with TM on two independent axes, we can safely ignore them and vice versa. So, back to our own time and place.
One common-sense definition reckons that time prevents everything from happening at once. If this is indeed the intended purpose, then in my experience time has not been consistently successful. A certain parallelism intrudes from time to time, prompting the cry that it's all happening here and now on this very desk. To counter the assault of parallel problems you need to develop parallel methods of problem solving, of which delegation is much touted but rarely available exactly when needed.
As with computing, the difficulty is determining which subtasks can be tackled independently and which must await the outcome of an earlier sequence. It is easy to fall for what I call the floss-while-you-are-eating strategy. Borrowing another computing analogy, the time-sharing concept, you can often service a group of demands in such a speedy, round-robin way that each demand is tricked into believing it has your undivided attention. Upon reflection, the ideas of job priority and time slicing are so basic to all human activity that they must predate the multiterminal computer and its trivial algorithms by countless millennia. The traditional TM solution is learning how to do things more effectively, not only the obvious elements of planning, scheduling priorities, speed, and efficiency, but also the art of avoiding too many commitments and deadlines. One must not forget, though, that situations arise where the appropriate time-management strategy involves stretching the job in hand.
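The time-slicing analogy is easy enough to make concrete. A toy sketch (entirely my own invention, with made-up demands and an arbitrary quantum), in which each demand receives a slice of undivided attention in strict rotation:

    from collections import deque

    def round_robin(demands, quantum=1):
        # Service each demand one quantum at a time until all are done.
        # `demands` maps a name to the units of attention it needs.
        queue = deque(demands.items())
        while queue:
            name, remaining = queue.popleft()
            remaining -= quantum                 # one slice of attention
            if remaining > 0:
                queue.append((name, remaining))  # back of the line
            else:
                print(name, "satisfied")

    # Three hypothetical demands on one columnist's morning:
    round_robin({"phone call": 2, "memo": 1, "column deadline": 3})

Each demand, note, is satisfied in due course, and none can prove it ever waited.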
On stretching the job in hand: a bystander, it has been reliably related, was watching a painter paint the railings outside Trinity College, Dublin, Eire. The painter would dip the brush in a small tin of paint, walk 50 yards or so to the railings, apply the paint, then return to the paint tin. After watching this performance for some time, the bystander felt compelled to comment: "I say, why don't you carry the paint tin nearer to where you are painting? You'd have less walking and you could probably paint twice as many railings." To which the painter replied indignantly, "And to be sure, now, there aren't twice as many railings to be painted."

This tale has an obvious moral for managers. Does your staff know what lies beyond the current project?

These preoccupations are prompted by my adoption of "Who-What-When," a PC package from Chronos Software subtitled "Software for People, Project, and Time Management," as my personal deadline-control system. I will not attempt an objective review of who-what-when since it is the only TM package I have tried. In fact, until recently, I viewed TM as a rather pompous, obvious obfuscation, the sort of arena where "methods" become "methodologies." I had perhaps been misconditioned by reading about expensive, full-time, six-month courses where you learn how to avoid wasting time. My reaction was that only those dropping out after the first day would merit a diploma.
Clearly I cannot compete with my fellow contributors who specialize in product reviews and comparisons. Nor can I unroll any of those feature charts with vast matrices of side-by-side Yes/No/Optional notations, each carrying several modifying footnotes. Nor can I provide performance charts superimposed against the vendors' claims. The latter should be measured in Ghallstones, a benchmark unit independently coined by Peter Dufault, myself, and possibly many others. Whoever claims priority must beat Feb. 16, 1989, when I first publicly defined the Ghallstone during an SD Conference '89 lecture before 65 unbribable witnesses.
The pleasant thing about who-what-when and its chief programmer, Greg Peterson, is a lack of gall and hyperbole. The package is not a flashy, windows-everywhere affair promising all things to all people, but rather a solid, easy-to-use planning tool for the nontechnical individual or team using minimal, character-based PCs or laptops. It was written using Logitech's Modula-2 (with a dash of MASM), an interesting choice for what Peterson calls his "one-man team." We often think of Modula-2's advantages in terms of groups independently developing component implementation modules for a set of communal definition modules.
But it's worth remembering that this approach also works for the individual programmer with nontrivial projects, allowing you to concentrate on the module du jour. The other advantage is the growing number of quality toolboxes for Modula-2, so that wheels like B-tree file management do not need reinventing. Now Peterson is reaping the benefits as the company has gained success and grown in size over the last two years. Maintenance, upgrades, and hiring and training are all eased through standardization on a modular, self-documenting language. I also found it interesting that the Chronos staff invites users to all their program-planning sessions so that decisions on features receive user input from step one.
Who-what-when takes a simple approach: a single entry of who you are going to meet or phone, what the meeting is about, and when it is scheduled. The three data areas are then cross-referenced so you can call up on your screen people, project, and time views. The input screens offer a consistent and obvious interface; you can dial your contacts automatically or set pop-up alarms, and the printed schedules, project lists, and calendars fit nicely into the quality "Paperware" binder provided. My desk is a tad less cluttered. Tomorrow, I will be fully organized.
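The cross-referencing is the whole trick, and a simple one. A minimal sketch of the idea as I understand it (the names and structures below are my own illustration; Chronos's actual file design is unknown to me):

    from collections import defaultdict

    # One who-what-when record, indexed three ways: a hypothetical
    # reconstruction of the single-entry idea, not Chronos's design.
    by_who = defaultdict(list)    # people view
    by_what = defaultdict(list)   # project view
    by_when = defaultdict(list)   # time view

    def add_entry(who, what, when):
        record = (who, what, when)
        by_who[who].append(record)    # one entry appears in
        by_what[what].append(record)  # all three views at once
        by_when[when].append(record)

    add_entry("G. Peterson", "discuss column deadline", "1989-06-01 10:00")
    print(by_when["1989-06-01 10:00"])  # everything scheduled for that slot

One keystroke's worth of data, three ways to be reminded of it.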
Returning to the subject of time, productivity, and wild claims, Tom Clune of the Eye Research Institute in Boston, Mass., has sent me a strange example of gall taken from a CASE '89 flier: "How do we truly achieve 50 to 100 times more productivity in system development through CASE, and how will we recognize when we have achieved it?"
Clune immediately dismissed my suggestion that the writer intended a 50% to 100% (rather than 5,000% to 10,000%) improvement on the grounds that no one in marketing today would ever admit to a mere doubling of productivity. The mystery is how one could possibly achieve such improvements without being immediately aware of them. Clune wonders if "there is a special kind of productivity, CASE productivity, which is defined as extreme but indetectable productivity...rather like the curvature of space, which is real and significant and yet requires the utmost in human ingenuity to measure."
I hasten to add that the CASE community should not be judged on this single aberration. I fully support the aims of CASE: the computer caused the SE crisis, so the computer will have to help us resolve the SE crisis. My sneaking reservation concerning CASE is a semantic one and applies to most of the CAxx acronyms for computer-aided or computer-assisted whatever (excluding my favorite chestnut: CAD for computer-aided delay).
In my work with the Association for Literary and Linguistic Computing, I meet many literary research projects where the CAxx part dominates at the expense of worthy scholastic results. Theses proudly announce that the number of rhyming couplets in Romeo and Juliet is 25% higher than in As You Like It. But rather than explore the literary implications, many papers o'erdwell on the hardware and software problems involved in the counting process.
The inappropriateness of elevating the tools above the end product can be seen if we imagine Shakespeare writing not plays but papers on Quill-Assisted Drama (QAD). And I've forgotten who first suggested a similar down-putting acronym: TAR for Typewriter-Assisted Research. Every area of human endeavor that can be computerized should be, is being, and will be. The CA tag becomes superfluous, tedious pedantry stretching from CAC (Computer-Aided Computing) to CA(CAC), since CAC clearly needs to be Computer Assisted, and so on. That's begged enough questions for one column.