How to test the most newsletter ideas for the least amount of money.
Who among us hasn't been surprised, sometimes even astonished, at which newsletter ideas fly and which can't-miss ideas fail miserably? If, in the next 12 months, you could test every idea which has even one champion, you'd end up with more winners.
Unfortunately, there's a cost to testing. A direct mail test for a b-to-b newsletter costs around $3,000 to $15,000, the lower figure only if you can write your own package. (This assumes the package presents the newsletter as if it existed and asks for the order, although it says, "Send no money now.")
If you've been thinking about the low cost of new media (fax or e-mail) and wondering if you could reduce your testing costs, you are not alone. However, are fax and e-mail tests reliable?
A direct mail test and a computer model [*] can assure your future profitability with an idea that earns a response rate above a particular threshold. Can fax or e-mail do that? And what about the telephone?
How reliable are fax, e-mail and telephone results?
Unfortunately, tests via fax, e-mail and telephone are very unreliable compared to direct mail. With direct mail, your rollout results (assuming no errors in the test) can be only a modest surprise, not a stomach-clenching horror. Not so with other media, with one intriguing exception.
Let's look at each individually.
To find out how reliable telephone testing is, I went to the king of telephone testing, David Foster of IOMA (now part of Bureau of National Affairs).
David says they're reliable enough to keep you from losing your shirt (or the company), but still not that reliable. Here's an example he provided:
Two newsletter ideas they tested by telephone pulled almost the same (acceptable) response rate. Each had roughly the same market size (30,000) and each carried the same price. Yet, after a full year of roll-out to each, one newsletter had 2,500 subscribers and the other had 500. The one with 500 was losing money and had to be killed. Yet, I suspect telephone testing may have the same limited reliability as I found in e-mail testing (see below).
Based upon results I received testing fax vs. direct mail for two clients, I strongly recommend you never test via fax. Unreliable numbers are the least of your worries here--you're even likely to get misleading results. One newsletter idea that bombed terribly in direct mail pulled well via fax, and a newsletter that tested successful in direct mail looked like a dog in a fax test.
Why? I think it's because you have just enough copy room to get into trouble. With telephone or e-mail, you get 2-4 sentences max. Just enough to convey the idea. And either people like it or not.
However, with fax you have enough room to tout some benefits and premiums, but not enough to put them in the context of the whole newsletter. So if something in a premium or copy point hits a hot button, you can get a great response, even if readers couldn't care less about the whole package. And you'll miss people who would want the whole package, if they knew about it, but who aren't struck by what made it into the fax promotion.
With e-mail, again, tests were wildly off the mark, with one unusual exception. Testing four newsletter ideas for clients showed the following results:
                E-mail    Direct mail
Newsletter #1   22.18%       2.02%
Newsletter #2   21.92%       1.09%
Newsletter #3   18.90%       0.31%
Newsletter #4    5.57%       0.17%
As you can see, the e-mail test results actually ranked the newsletters in the same order as their direct mail scores.
How is this possible? Because you have only a couple of sentences to describe the newsletter and its key benefit, that's what the response is based upon--the key benefit. And because telephone testing has the same limitations, it may well also rank ideas in the correct order for direct mail results.
However, the e-mail results predicted close finishes for newsletters 1-3, while the actual direct mail responses ranged from 2.02% down to 0.31%.
If you went with your e-mail results, you would roll out numbers 1-3. And you would be rolling out two losers along with the one winner.
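For readers who like to see the arithmetic, here is a minimal sketch of the point, using the test figures above. The 1.5% direct-mail break-even rate is an assumed figure for illustration only; the article does not state one.

```python
# Response rates (%) from the four-newsletter test reported above.
email = {"Newsletter #1": 22.18, "Newsletter #2": 21.92,
         "Newsletter #3": 18.90, "Newsletter #4": 5.57}
direct_mail = {"Newsletter #1": 2.02, "Newsletter #2": 1.09,
               "Newsletter #3": 0.31, "Newsletter #4": 0.17}

def rank(rates):
    """Newsletter names ordered best-to-worst by response rate."""
    return sorted(rates, key=rates.get, reverse=True)

# The e-mail test puts the ideas in the same order as direct mail...
assert rank(email) == rank(direct_mail)

# ...but the magnitudes don't transfer. With a hypothetical 1.5%
# direct-mail break-even, only one of the top three e-mail scorers
# actually wins in direct mail.
BREAK_EVEN = 1.5  # assumed break-even response rate, for illustration
top_three = rank(email)[:3]
winners = [n for n in top_three if direct_mail[n] >= BREAK_EVEN]
print(winners)  # -> ['Newsletter #1']
```

The ranking survives the change of medium; the absolute numbers do not, which is exactly why e-mail scores can pick candidates but can't justify a rollout.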
An e-mail test is just not good enough to risk roll-out money on. But it might save you a bundle in direct mail testing expense.
Proposed new test strategy
Based upon these results, it seems there is a new test strategy for newsletters that should dramatically increase the number of ideas you test without increasing your test costs. How is that possible?
Suppose you normally do direct mail tests for three new newsletter ideas a year. And further suppose your think-tank sessions with editors and other employees come up with 20 potential ideas.
You could test all 20 by e-mail. Or by telephone, if you have an in-house telemarketing operation (otherwise, it would be too expensive and thus defeat the idea of cheap pretesting).
Then take only the top three scorers in the e-mail tests and give them a direct mail test. Result: practically no additional cost (only e-mail processing) and a much bigger chance of all three being winners. Yet, by not rolling out from e-mail results, you protect yourself against some of the three being bombs.
Components of an e-mail pretest
What should you say in an e-mail test? Only the following:
* Who the offer is coming from (why should they trust you?)
* The newsletter name
* Two sentences about the chief benefits of the newsletter to recipients
* The offer: Receive a free 2-issue trial subscription with no obligation (more issues if you publish more frequently than 12x/year)
* The e-mail address to respond to
* The information they must supply, including their name and mailing address.
Why should you ask for their mailing address? Because, believe it or not, it will increase your response.
When I tested e-mail offers for a client, I was told to just provide a link the recipients could click on to order. The company's records could match e-mail addresses to street addresses.
I suggested instead to require a street address to reduce uncommitted responses. They told me such a requirement would kill responses.
To resolve this question, I had each list separated into perfect A/B splits: half the names had only to click to order, and the other half had to type in their street address. Here are the results:
Total e-mail responses

                 Address required?
                 Yes       No
Newsletter #1     25        3
Newsletter #2     65       32
Newsletter #3     54       45
Totals           144       80
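Because each list was split into equal halves, the raw counts are directly comparable, and the overall lift from requiring an address is easy to check. A minimal sketch, using only the response counts reported above:

```python
# A/B split results from the article: e-mail responses when a street
# address was required to order vs. click-only ordering.
with_address = {"Newsletter #1": 25, "Newsletter #2": 65, "Newsletter #3": 54}
click_only   = {"Newsletter #1": 3,  "Newsletter #2": 32, "Newsletter #3": 45}

total_yes = sum(with_address.values())   # responses with address required
total_no  = sum(click_only.values())     # responses from click-only half

# Equal-size halves, so counts compare directly: requiring an address
# produced 1.8x the response of the click-only cells overall.
lift = total_yes / total_no
print(total_yes, total_no, round(lift, 2))  # -> 144 80 1.8
```

Note the lift is far from uniform across the three lists (25 vs. 3 for Newsletter #1, but 54 vs. 45 for Newsletter #3), which is worth keeping in mind before generalizing from the totals.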
Why should this be? The only rationale I can imagine is that, not being asked for a street address, the respondents assumed they were signing up for an e-mail newsletter. And, apparently, that was not very attractive to them compared to a print newsletter. (The copy said "a new newsletter" with no mention of print or e-mail.)
These responses might give you pause if you're considering launching an e-mail newsletter that is expected to be more than a promotion device. The three markets these newsletters went to were not computer-oriented or high tech, but neither were they at the dog-catcher level.
Of course, these numbers are gross responses, not net paid orders. But it seems unlikely that requiring someone to do extra work to get the subscription would result in a lower net.
* For a description of a computer model, see "Newsletter Launches: Part 1, How to reduce your risks in testing and launching newsletters," by Marlene Jensen, NL/NL, 9/15/99, p. 3.
Publication: The Newsletter on Newsletters
Date: Sep 30, 1999