
How to write better review guidelines.

"When you're a small company, reviews drive much of what goes on in sales," says T/Maker chairman Royal Farros. "So it breaks your heart when a reviewer gets something wrong." In mid-1988, frustrated by reviewers who didn't seem to grasp key differences between T/Maker's WriteNow word processor and its competitors, Farros decided that reviews of the company's new version were too important to leave to chance.

Farros assembled a detailed, 35-page "Evaluator's Guide" that explained the company's design philosophy, provided competitive benchmarks, and offered an exhaustive feature-by-feature comparison with rival Macintosh word processors. "The project soaked up a ton of time," Farros says--"a couple of man-months at least."

A good many skeptics within the company wondered whether T/Maker would see any payback from this effort, Farros recalls. "I kept hearing that reviewers would never read it all, or that they'd think it was just a sales brochure." But reviewers gave the Guide a warm reception. "The best part is, people who are doing reviews become much more in tune with how they should design tests," says Farros. "The quality of reviews--ours and everybody else's--absolutely improved."

Inspired by T/Maker and a few other pioneers, a growing number of software companies have made in-depth review guidelines a key part of their launch strategies (often replacing self-serving "market backgrounders" and "white papers"). We recently asked Farros for advice on how to write better review guidelines:

* Target the real competition: Review guidelines shouldn't pretend the product exists in a competitive vacuum, Farros says. Features and performance should be compared against products that are the market leaders in a category (it's acceptable to ignore products with insignificant market share). "But the tough issue is whether you've really named all the competitors from related markets," he says. Thus, even though WriteNow competes primarily against MacWrite II as a mid-range product, Farros included two indirect competitors--a high-end word processor (Microsoft Word) and an integrated package (Microsoft Works)--in his comparisons.

* Highlight the product's special qualities: Reviewers may not notice (or understand) the "specialness" of a product, Farros warns, so the guidelines should focus on competitive differences rather than features that are generic to the category. "We concentrated on speed tests because that was one of the big differentiations between us and other products." Guidelines that don't identify the product's competitive edge may do more harm than good, he adds. "If you can't clearly articulate why your product is special, reviewers may ask why you're even in this business."

* Quantify the differences: "It's not enough to tell reviewers that your product is 'fast' or 'easy to use,'" Farros says. "You have to figure out how to make these differences measurable." To support T/Maker's claims about speed, the WriteNow guidelines published the results of dozens of performance tests against MacWrite, Microsoft Word and Microsoft Works 2.0. (Because the testing process began while WriteNow was in beta test, Farros notes, the tests also gave his programmers immediate feedback about changes they were making in their code.)

* Be scrupulously honest: Avoid the temptation to rig test results or conceal weaknesses, he warns, "or else you'll do serious and irreparable damage to your reputation." Says Farros: "Sweat every statement you make. Worry about the accuracy and the reproducibility of every word you write. When we were doing our timing tests, we knew that if even one of those tests was wrong, Bill Gates would come in and sue us."

* Explain your design strategy: There are often important tradeoffs in how features are implemented, Farros points out, so it's important to explain key design decisions. For example, the WriteNow guidelines talk about why, for the sake of consistency with Apple's human interface guidelines, T/Maker chose a slower scrolling technique than its competitors. "We were being penalized for playing by the rules," he says. Farros notes that reviewers don't always agree with such design decisions, but at least they'll have an appreciation of the company's philosophy.

Royal Farros, chairman, T/Maker Co., 1390 Villa St., Mountain View, Calif. 94041; 415/962-0195
Soft-Letter, July 15, 1991. Copyright 1991 Soft-letter.
