Optimizing the return on your nonprofit’s direct marketing investment is both an art and a science. You develop creative ways to boost response and then measure the results.
The data from direct mail testing is empirical and factual. To paraphrase a popular speaker, these are facts that don't care about your feelings. But you have to design your tests carefully to ensure the accuracy of that data.
For example, when testing a change meant to improve the performance of a control package, you should change only one element of the package so you know exactly what caused any difference in results. If you change two factors, you can't tell which one drove the effect, or whether they canceled each other out. Knowing what effect that one test element had on your package helps you make decisions about future packages.
We recently helped a client test a design element we hoped would increase the average gift amount of a successful house file control package. The one simple design change on the reply (see image) was to circle the second ask amount and add a script-style note off to the side: "This amount would really help!"
The ask string amounts were calculated the same way in both packages. The first ask amount reflected the supporter's highest previous contribution (HPC), and the second ask amount (the one circled in the test) was calculated as 1.5 times HPC. So the test nudged donors to give just a little more than we knew they were capable of giving.
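As a rough sketch of how an ask string like this might be built, here is a minimal example; the function name, rounding rule, and default multiplier are our own illustrative assumptions, not the client's actual production logic.

```python
def build_ask_string(hpc, multiplier=1.5):
    """Build a two-amount ask string from a donor's highest
    previous contribution (HPC).

    The first ask mirrors the HPC; the second (the amount circled
    in the test package) is HPC times the multiplier, rounded to
    the nearest whole dollar. Names and rounding are illustrative.
    """
    first_ask = round(hpc)
    second_ask = round(hpc * multiplier)
    return [first_ask, second_ask]

# Example: a supporter whose highest previous contribution was $40
print(build_ask_string(40))  # -> [40, 60]
```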
The test package outperformed the control package by a substantial margin in terms of average gift: $67.99 for the test package versus $56.79 for the control. That is an increase of roughly 19.7 percent in average gift for this particular mailing.
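(The math: $67.99 minus $56.79 is $11.20, and $11.20 divided by $56.79 comes to about 0.197, or a 19.7 percent lift.)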
This test was successful because it increased average gift, as we hoped it would. We've since rolled this change out to other packages, for this client and others, and we'll report more results in the future.
For help with developing your nonprofit’s next fundraising campaign, contact LDMI today.