Every year, the judges—many of whom have been participating for over a decade—ask themselves which outstanding offerings among hundreds of nominated tools, books and websites have jolted the industry with a burst of inspiration. In its thirteenth iteration, the Software Development Jolt Awards honored newcomers and lesser-known tools, but didn’t forget some perennial favorites. The results reaffirm the power of agility in transforming development, make it clear that Web services are pushing past the corporate firewall, and portend a bright future for Java. Thankfully, innovation in the lifecycle tools market is alive and well.
The Jolt Methodology
This was my first time managing the Software Development Jolt Awards,
a six-month marathon project. Not only was I new to the job, but this year,
the editors made some major changes to the 13-year-old process.
First, we decided to distinguish the vendor-influenced portion of the competition from the reader-appreciation portion by splitting the Jolts into two tracks: a juried track in which all for-profit vendors paid to nominate their products, and a reader-nominated track meant to reflect the sentiments of working developers. While the juried track more or less followed the traditional process, the readers posed a challenge. Because of the panoply of products, categories and potential voters, the process had to be automated. I quickly made it a priority to find voting software to make my life easier.
The first tool I found was an open-source product, and though it could count votes and produce reports, I needed more sophistication for the final vote: multiple ranking of products within categories, importing files rather than entering them manually, locking out judges who hadn't signed up for a particular category, and IP address checking. Though I evaluated several other open-source products, I found none that met these needs, so I started looking at the paid product marketplace. Satisfied with an evaluation copy's features, I selected Perseus. The company's support was excellent, and soon we had a full-featured voting website. Although our IT department didn't support an ODBC connection to the database, Survey Solutions could export a .tsv file that I downloaded to build a database and generate reports on my desktop, all within the product environment. I could also look at the raw data to see who was voting for what.
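For the curious, here's a rough sketch, in Python, of how a tab-separated export like that might be pulled into a local database for ad hoc reporting. The file name and column names (category, product, rank, ip_address) are invented for illustration and are not the actual Survey Solutions schema.

    # Sketch: load a tab-separated vote export into a local SQLite database
    # for ad hoc reporting. Column names are hypothetical examples.
    import csv
    import sqlite3

    conn = sqlite3.connect("jolt_votes.db")
    conn.execute("""CREATE TABLE IF NOT EXISTS votes
                    (category TEXT, product TEXT, rank INTEGER, ip_address TEXT)""")

    with open("votes_export.tsv", newline="") as f:
        reader = csv.DictReader(f, delimiter="\t")
        rows = [(r["category"], r["product"], int(r["rank"]), r["ip_address"])
                for r in reader]
    conn.executemany("INSERT INTO votes VALUES (?, ?, ?, ?)", rows)
    conn.commit()

    # Simple report: vote counts per product within each category.
    for category, product, count in conn.execute(
            """SELECT category, product, COUNT(*) FROM votes
               GROUP BY category, product ORDER BY category, COUNT(*) DESC"""):
        print(f"{category:30} {product:30} {count}")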
The juried voting went without a hitch, but the readers' choice voting was a bit rockier. We had two automatic levels of security: to participate, voters had to be registered readers at SDmagazine.com (we posted the poll on our gated site; one vendor asked if I could remove the gate so that his users could vote for his product. Sorry, no!), and voters were limited to one vote per item, with the help of cookies. We added a manual third level of inspection: we looked for duplicate IP addresses to detect multiple votes cast with cookies turned off. Some categories were well behaved, with no obvious vote-fixing detected, though we had our suspicions. Others were more troublesome: flagrant excesses, such as approximately 100 votes for one product from the same IP address (promptly dumped), and some clever manipulation, numerically stepping through the IP addresses; those ended up in the circular file as well. Next year, we'll turn to user validation with e-mail and ID from magazine subscriber mailing labels, stash the process behind a double-gated site with encrypted password validation, require cookies and hide the whole contraption in some obscure directory … or something like that.
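For readers who want a concrete picture of that inspection, here's a rough sketch of the sort of checks involved: counting repeat votes for a product from a single address, and looking for runs of numerically consecutive addresses. The sample data and the run-length threshold are hypothetical, not the actual ballots.

    # Sketch of two ballot-stuffing checks: repeat votes from one IP address,
    # and consecutive addresses voting for the same product. Sample data is invented.
    import ipaddress
    from collections import Counter

    votes = [("Product A", "192.0.2.17"), ("Product A", "192.0.2.17"),
             ("Product B", "198.51.100.10"), ("Product B", "198.51.100.11"),
             ("Product B", "198.51.100.12")]

    # Duplicate-IP check: the same address voting repeatedly for one product.
    for (product, ip), n in Counter(votes).items():
        if n > 1:
            print(f"suspicious: {n} votes for {product} from {ip}")

    # Sequential-IP check: addresses stepping upward one by one for a product.
    by_product = {}
    for product, ip in votes:
        by_product.setdefault(product, set()).add(int(ipaddress.ip_address(ip)))
    for product, addrs in by_product.items():
        ordered = sorted(addrs)
        run = max_run = 1
        for prev, cur in zip(ordered, ordered[1:]):
            run = run + 1 if cur == prev + 1 else 1
            max_run = max(max_run, run)
        if max_run >= 3:
            print(f"suspicious: {max_run} consecutive addresses voted for {product}")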
Bottom line? The Readers’ Choice Awards are posted on our website, but
don’t set your clock by them. Next year, we hope for more participation, greater confidence in the process, and results that more closely reflect our readers’ true choices.
—Rosalyn Lum
The Judging Panel
Thanks to the two dozen reviewers who felt both the thrill of brilliant software and the agony of buggy installs.