I’m not so sure about retention
I used to think I knew a lot about retention. I was so sure the key to keeping your donors around was to thank them, respect them, listen to them, nurture them, and show them how they’re making an impact.
Obvious, right? Intuitive, even.
But what does it mean to respect, listen to, and nurture donors? The truth is, people have very different ideas about what those words mean—let alone how they feel in action. And even if we all agreed with each other, where’s the evidence that any of it actually helps improve retention?
Despite some decent benchmarking data on retention rates (the best I’ve seen comes from the Fundraising Effectiveness Project), there simply isn’t much good, public data about what strategies work to retain donors.
That’s partly because it’s hard to get this data. Retention, by definition, takes a while to measure. And in that time lots of things happen and lots of variables intrude. Setting up a test to measure the impact of appeal volume, cultivation content, or donor satisfaction surveying is complex, time-consuming, and expensive.
And yet: retention is a really big deal. Depending on the size of your organization’s fundraising program, a 1% improvement in retention can mean hundreds of thousands of dollars in revenue. That’s why we’re including retention in our 2018 Benchmarks Study.
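To put a rough number on that claim, here’s a back-of-the-envelope sketch. The file size and average gift below are illustrative assumptions, not figures from the Benchmarks Study:

```python
# Rough, illustrative math only -- the file size and average gift
# are assumptions, not data from the Benchmarks Study.
donors = 200_000         # active donors on file (assumed)
avg_annual_gift = 150    # average annual giving per retained donor, in dollars (assumed)
retention_lift = 0.01    # a one-percentage-point improvement in retention

extra_donors_retained = donors * retention_lift
extra_revenue = extra_donors_retained * avg_annual_gift

print(f"{extra_donors_retained:,.0f} additional donors retained")  # 2,000
print(f"${extra_revenue:,.0f} in additional revenue")              # $300,000
```

Swap in your own file size and average gift to see what a single point of retention is worth to your program.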
We’re also running longer-term retention tests so that we—and our clients—can better understand how some of the more popular retention theories actually work.
The questions we want to answer include:
- Does dramatically increasing appeal volume significantly reduce an organization’s retention rates? Or will additional solicitations drive retention up, because the more you ask, the more people will give?
- Will sending regular content showing the impact of a gift improve retention?
- Can you improve retention by regularly asking your donors for their opinions about you and their donation experience, similar to how corporations ask their customers for feedback using a net promoter score? (See the sketch below for how that score is typically calculated.)
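For anyone who hasn’t run one of these surveys: a net promoter score is simply the percentage of promoters (people answering 9 or 10 to “How likely are you to recommend us?”) minus the percentage of detractors (people answering 0 through 6). The snippet below is a minimal, generic sketch of that standard calculation; the survey responses are made up, and it doesn’t describe how we run our own tests.

```python
# Standard Net Promoter Score calculation -- scores are 0-10 answers to
# "How likely are you to recommend us?" The sample responses are made up.
def net_promoter_score(scores):
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

responses = [10, 9, 9, 8, 7, 6, 10, 4, 9, 8]  # hypothetical donor answers
print(f"NPS: {net_promoter_score(responses):+.0f}")  # 5 promoters, 2 detractors -> NPS: +30
```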
While it’s going to take some time to get the results from these tests, looking at the data we do have on retention for some of our clients suggests that what may seem obvious isn’t always true.
For example, take the impact of appeal volume. Looking at two large organizations that have mature digital fundraising programs:
- Organization A – sends more than 4x as many emails as the benchmark for large organizations and 6x as many appeals, and has an online-only retention rate of 50%, with a new donor retention rate above 40%. This doesn’t even factor in the bump in retention all organizations see from adding in multichannel gifts. Those retention rates are above the average for the industry, and above the average for the peer organizations that we work with.
- Organization B – sends far fewer emails than Organization A (about the same number as other large organizations) and about 1.4x as many appeals as the benchmark, and has an online-only retention rate of 26%, with a new donor retention rate of 20%. Despite sending far fewer emails, they ended up with a retention rate roughly half of what Organization A experienced.
Of course, other factors likely play into these retention figures, including mission, brand awareness, and other issues. But at a minimum, this data calls into question the conventional wisdom that “bombarding” your list with appeals will drive down your donor retention.
I get why people think this. It does feel like a lot of email. We’ve likely spent hours drafting, editing, securing approvals, QA’ing, and finally reporting on each one. We also sign up to receive appeals from other organizations, so we can see what our peers are doing. At some point, it gets old seeing the non-stop flood of appeals.
But, once again, we fundraisers are not our target audience. And while we may think we know what our donors want, the only way to understand how to improve retention is to test some of these theories. If you have run long-term treatment tests on your file to test retention, I’d love to hear from you. Our hope is to publish some findings from our own retention tests a little later in 2018.