The A/B test tool on ActBlue, which allows you to test out Contribution Form titles and pitches, among other variables, has gotten a significant upgrade, just in time for campaign season.

The old A/B testing tool worked great, but it also forced you to wait around for both test variations to get enough traffic to gain statistical significance. If one version was performing way better than the second one, that meant you were losing out on potential contributions in order to gain valuable insight.

This is how most A/B testing tools work, and it’s a good system. But with the new ActBlue testing tools, which use a more advanced statistical algorithm than typical A/B testing, you can still achieve statistical significance without having to sacrifice a ton of traffic to a losing form.

As the test runs and one variation begins performing better, we’ll start sending more traffic to that form, roughly in proportion to how the variations are trending. You can see the traffic allocation listed just above each variation on the “A/B Test” tab of your Contribution Form. The allocation will change continuously as donations come in. It’s important to note that if a variation is receiving 75% of the traffic, that does not necessarily mean its conversion rate is 3X as high as the other variation(s). If you’re curious what it actually does mean and want to talk complicated stats, you can get in touch with us here.

If there’s a false positive and the losing form starts doing better, the traffic allocation will begin to reverse. The test will keep running until you click “Make Winner,” though if you never pick a winner manually, the tool will eventually send 100% of traffic to the winning variation.
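The adaptive allocation described above is the classic multi-armed bandit approach. Here’s an illustrative sketch using Thompson sampling with Beta posteriors — the variation names and numbers are made up, and this is not ActBlue’s actual algorithm:

```python
import random

random.seed(0)  # for reproducibility of this sketch

def allocate_traffic(variations, draws=10_000):
    """Estimate each variation's share of traffic via Thompson sampling.

    `variations` maps a name to (conversions, impressions) observed so far.
    """
    wins = {name: 0 for name in variations}
    for _ in range(draws):
        # Sample a plausible conversion rate for each variation from its
        # Beta posterior; credit whichever variation samples highest.
        sampled = {
            name: random.betavariate(conversions + 1, impressions - conversions + 1)
            for name, (conversions, impressions) in variations.items()
        }
        wins[max(sampled, key=sampled.get)] += 1
    return {name: count / draws for name, count in wins.items()}

# Variation B converts at 7% vs. A's 4%, so B gets most of the traffic.
shares = allocate_traffic({"A": (40, 1000), "B": (70, 1000)})
```

Note how the shares track the *posterior probability* that each variation is best, which is why a 75% allocation doesn’t mean a 3X conversion rate.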

The new A/B testing tool makes your tests more efficient, which means you can try out more of them. If you have radically different language you want to try on a form, alongside three more standard pitches, there’s little risk. If it doesn’t work out, we’ll send fewer and fewer people to that losing form.

We wanted to give special thanks to Jim Pugh from ShareProgress for sharing notes on the multi-armed bandit method used in their software and helping us out with building this tool (and for hanging out in the ActBlue office for a week)!

As always, let us know what tests you’re running and what’s working for you at info@actblue.com!

Here at ActBlue, we’re always optimizing our contribution form by testing different variations against each other to see which performs best. And, whenever possible, we like to share our results. Needless to say, it’s great to discuss tests that end up winning; every percentage point increase in conversion rate on our contribution form benefits every one of the more than 11,000 active committees that fundraise on ActBlue.

Just as important to this process, however, are the tests that fail to bring about a positive change to our contribution form. Failing to openly discuss and reflect on losing tests belies the experimental nature of optimization. So I’m here to talk about an A/B test that we just ran on our contribution form that lost. (Bonus: it lost twice!)

We tried coalescing our “First name” and “Last name” fields into one “Full name” input. The theory was that one fewer input would reduce friction along the contribution path, thereby increasing conversions. Here’s what it looked like:

Control

Variation

The control version, it turns out, was actually associated with a higher conversion rate than the “Full name” variation, though not statistically significantly.1 We even tested another slight variation of the “Full name” field with slightly different placeholder text and a more expressive label, but it lost again.

If you’re wondering why it lost, that makes two of us; in a case like this, it’s tough to say what actually happened. Was it aesthetics? An anti-novelty effect? If we speculate like this ad infinitum, we’ll end up with more questions than answers — the world is full of uncertainty, after all. But that’s no reason to skip this kind of reflection; it’s the origin story of many new testing ideas.

Footnotes:

1: Pr(>|t|) > .05, n = 63,159
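For readers who want to run this kind of check themselves, a simple two-proportion z-test is a reasonable stand-in for the regression test cited above. The counts here are made up for illustration, not our actual data:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for a difference between two conversion rates."""
    p_a, p_b = conversions_a / n_a, conversions_b / n_b
    # Pool the two groups to estimate the standard error under the null.
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Made-up counts: a small difference that fails to reach significance.
z, p = two_proportion_z(1620, 31600, 1570, 31559)
```

With a difference this small relative to the sample size, the p-value lands well above .05 — the same “no significant difference” conclusion as the footnote.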

Our team is always thinking through ways to make our contribution forms easier to fill out and more streamlined. When donors have too many options and abandon a form, that’s known as choice paralysis. Eliminating that choice paralysis is a big part of building better contribution forms.

Tandem contribution forms list multiple candidates, which require more decisions to be made by donors. But the vast majority of people choose to just split their contribution evenly between all the candidates on the form. That used to look like this:

Too many options and too many boxes for our liking. Do you want to give more to candidate A than organization B? How much do you want to give in total?

We boiled the form down to that last question — how much do you want to give? This made it a lot easier for donors to give (spoiler alert: this A/B test was a huge success).

Now, when you land on a tandem form, you’ll see the normal amount buttons with a note underneath saying who the donation will be split among. You can still click a button to allocate different amounts to each candidate, but donors are less overwhelmed when they land on the page.

Here’s the new form:

So how successful was our A/B test? We saw a 7.16% overall improvement in conversion. That’s unheard-of-huge. We’ve done so many optimizations of our forms that we cheer for a test that leads to a 0.5% increase in conversions.

Part of that overall group consisted of non-Express users (people who haven’t saved their payment information with us) who land on our traditional multi-step form. Among that group we saw a 26% improvement in getting people to move from the first step of the process (choosing an amount to give) to the second step (entering their information).

There are so many candidates and organizations running really thoughtful tandem fundraising campaigns, and this is going to mean a huge bump for them. If you have questions, or want to tell us about a tandem campaign you’ve run, let us know at info AT actblue DOT com. We want to hear from you!

We’re fewer than six weeks from the election. That means, among other things, that optimal fundraising strategies become even more important than usual. Here at ActBlue, we’ve been running tests on a nearly daily basis on all kinds of Express Lane strategies.

Typically, we see the largest (statistically significant) improvements when optimizing factors related to the Express Lane askblock structure like amounts, number of links, and intervals between the links. For our own list, we find that, statistically speaking, the flashier aspects you see in some fundraising emails — emojis in subject lines, e.g. — do not do much (if anything) to improve donation outcomes. Here’s a tactic we recently tested, though, that’s a bit more on the fun side of things and definitely brought in a lot more money.

A little while ago, we started using our weekly recurring feature to great success. (By the way, if you haven’t tried this feature yet, shoot us an email at info [at] actblue [dot] com and we’ll turn it on for you.) After testing which amounts brought in the most money, we landed on this1:

We wanted to see if we could raise more money by asking for “$7 because there are 7 weeks until the election!” Gimmicky? Sure, but we had a hunch that it would perform well.2 Here’s what it looked like:

So what happened? The segment with the “7 for 7” ask performed much better than the control; it brought in 87.6% more money, a statistically and practically significant improvement.3 Cool!

What will be interesting is seeing when this tactic loses its optimality. The key factor is that $7 (with the gimmick) performed better than $10 (the control and previously optimal ask amount) despite being a lower dollar amount. At some point, though, a too-low number-of-weeks-to-election dollar ask will negate the positive ceteris paribus effect of the gimmick. Based on other testing we’ve done, my guess is that that point will be at 4 weeks and $4. We’re doing follow-up testing on this “n weeks until the election!” tactic, so we’ll see!
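As a sketch, the tactic with a cut-off could look like this — the $5 floor and the dates are hypothetical, loosely based on the 4-week/$4 guess above:

```python
from datetime import date

def gimmick_ask(today, election_day, default_ask=10, floor=5):
    """Return the "$n for n weeks!" ask amount, or the standard ask near the end.

    The $5 floor is a guess at where the gimmick stops paying off.
    """
    weeks_left = max(0, (election_day - today).days // 7)
    if weeks_left >= floor:
        return weeks_left   # e.g. "$7 because there are 7 weeks left!"
    return default_ask      # fall back to the previously optimal ask

# Seven weeks before the 2014 midterms (Nov 4):
ask = gimmick_ask(date(2014, 9, 16), date(2014, 11, 4))
```

Once the weeks-remaining count drops below the floor, the function simply reverts to the control ask rather than pushing an ever-smaller dollar amount.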

If you decide to test something similar, send me an email and we can chat! Emails to info [at] actblue [dot] com with my name in the subject line will be directed to me.

P.S. Doing a lot of testing in the election run-up? Want a tool to help you manage your test groups? I wrote something in R for you! I’ll post something on the blog about it soon, but if you want it in the meantime, shoot me a note (emails to info [at] actblue [dot] com with my name in the subject line will be directed to me).

FOOTNOTES:

1 Actually, we built a model that predicts how a given Express user will respond to different types of donation requests based on previous donation information. Using those predicted values, we decide what type of donation ask they receive (of one-time, weekly recurring, monthly recurring) and for how much money they are asked. Math! The point: this is what we landed on for a certain subset of our list.

2 Of course, all else equal, it’s tough to distinguish whether any difference was due to the gimmick or because $7 is lower than $10. The theory would be that with a lower amount, more people would give, and even though the mean donation amount would likely be lower, the increase in number of donors would outweigh the decrease in mean donation size. This is definitely possible, but so is the opposite; it’s all about finding the optimal point.

In fact, we included a segment in the test which received an askblock starting with a lower amount and saw this dynamic in action, though the overall treatment effect was not statistically significantly different from the control. This lends support to interpreting the effect in the gimmick segment as the gimmick per se, but a detailed discussion is excluded from the body of the post for the sake of brevity. More rigorous follow-up testing on this “n weeks until the election!” tactic is already in the field; shoot us an email to chat!

3: Pr(>|t|) < .01, controlling for other significant factors, including previous donation history.

We’re less than 8 weeks out from Election Day and are now making the weekly recurring feature available to campaigns and organizations. Just drop us a line at info [AT] actblue [DOT] com and we’ll turn it on for you.

Yep, weekly recurring is exactly what it sounds like. You can ask your donors to sign up to make a recurring contribution that processes on that same day of the week every week until Election Day. After Election Day, the recurring contribution automatically ends.

So, if you get someone to sign up today for a weekly recurring contribution, they’d then have 7 more contributions scheduled to process every Friday.
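The schedule logic is simple date arithmetic: one charge on the same weekday each week, stopping at Election Day. Here’s a sketch (the function name is mine, not ActBlue’s API):

```python
from datetime import date, timedelta

def weekly_charge_dates(signup_day, election_day):
    """Dates of the remaining weekly charges: same weekday each week,
    ending on or before Election Day."""
    charges = []
    next_charge = signup_day + timedelta(weeks=1)
    while next_charge <= election_day:
        charges.append(next_charge)
        next_charge += timedelta(weeks=1)
    return charges

# Signing up on Friday, Sept 12, 2014 leaves seven more Friday charges
# before Election Day (Nov 4, 2014).
dates = weekly_charge_dates(date(2014, 9, 12), date(2014, 11, 4))
```

This also makes the post’s timing advice concrete: every week you wait, the number of scheduled charges per signup drops by one.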

Election Day is getting closer and closer though, so if you’re going to use weekly recurring, we suggest getting started soon.

Once we turn on the feature for you, create a new contribution form and open the “Show recurring options” section in the edit tab. You will see a new option there for weekly recurring. Make sure you also turn off popup recurring if you have it enabled — these two features aren’t compatible (yet!).

It looks like this:

We’ve run a few tests on weekly recurring this week with our own email list and have had a good deal of success. As always, a donor needs to know exactly what amount and for how long they’ll be charged before they click a link. If you’re going to use weekly recurring with Express Lane (and you should!), here is the disclaimer language we used and recommend you use as well:

Based on our testing, certain segments of your list will respond better than others to a weekly recurring ask (not exactly a shocking revelation). We sort our list into those likely to give to a recurring ask and those who are more likely to give a one-time gift. For the recurring pool, the weekly ask has been performing strongly. Unsurprisingly, the same can’t be said for our one-time folks.

Test it out with the portion of your list that is more likely to give recurring gifts. And try fun things like offering a small package of swag like bumper stickers in return for signing up for a weekly recurring gift.

And if you find an angle that’s working really well for weekly recurring, let us know!

Recurring pledges are like gold. There’s a reason why they’re often called sustaining contributions. Building a base of recurring donors can have a huge impact on the sustainability of any organization, including campaigns.

And now we’re making it easier for you to raise more long-term recurring contributions. Introducing: infinite recurring!

You’ve got a choice: ask people for a recurring contribution for a defined number of months (the old standard), or ask them for one with no expiration date (new!). You can also choose not to have a recurring option at all, but we don’t recommend it (I’ll explain later).

Here’s how you do it: Go to the edit page of any contribution form. Scroll down till you see this:

recurring toggle

Click on it to expand. It’ll look like this:

recurring options expanded

Select your radio button and then scroll down and hit submit. Yep, that’s it.

ActBlue got its start helping candidates raise money for their campaigns, which run in two-year cycles, so we allowed folks to set up recurring contributions for up to 48 months. The assumption was that donors would feel more comfortable signing up for a recurring contribution that was sure to end at some point. These days, more and more organizations that stick around cycle after cycle are using ActBlue. Plus, the way people use credit cards has changed, and we have a whole system that lets you extend, edit, or add a new card to your recurring contribution, complete with prompts from us. It doesn’t make a ton of sense to have time-limited recurring contributions anymore.

So we tested it. Would forms with an infinite recurring ask perform the same (or better) as forms with a set number of months? AND would you raise more money if you didn’t have a recurring ask on the form, but asked people with a pop-up recurring box after their contribution was submitted?

We’ve got some answers. Several committees have run tests, confirming that conversion rates on time-limited forms and infinite recurring forms are similar. So if you’re around longer than election day, go ahead and turn on infinite recurring.

Generally speaking, making a form shorter and giving people fewer options leads to higher conversion rates. So theoretically, taking the recurring option off of a form should lead to more donations. We have a pop-up recurring box that campaigns can turn on to try and persuade a one-time donor to make their donation recurring, and there seemed to be a reasonable chance that having no recurring ask on the form would raise more money.

Nope! Turns out that we got a statistical tie on conversion rates between having the recurring option on the form or off. Just having pop-up recurring turned on did not generate as many recurring contributions as having it both on the form and as a post-donation action.

There were slightly more contributions processed on forms without a recurring option, but not enough to generate a statistically significant result. Add to that the lost revenue from fewer recurring donations, and you end up with a pretty clear takeaway: leave the recurring option on the form. Sure, you can turn it off, but you’ll likely lose money. And nobody wants that.

That’s why recurring contributions have been on every ActBlue contribution form since the beginning. These days we run anywhere from 8-14% recurring, and over $11 million is pledged to thousands of campaigns and organizations.

There is one big question we haven’t answered yet: will you raise more money overall from an infinite recurring contribution than say one with a 48 month expiration date? We’re currently working on a long-term experiment to test exactly that.

The answer might seem self-apparent, but the truth is nobody really knows. Credit cards expire and people cancel their pledges. You never know for sure how much money you’ll raise from a recurring contribution, but if you pay attention to your long-term data, you’ll be able to figure out your pledge completion rate.
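One way to reason about it is to model each charge as succeeding only if the pledge has survived so far. A rough sketch, assuming a constant month-over-month retention rate (the 90% figure is hypothetical — real attrition from expired cards and cancellations won’t be constant):

```python
def expected_pledge_revenue(amount, periods, retention=0.9):
    """Expected total from a recurring pledge, assuming a constant
    per-period retention (completion) rate."""
    # Period k's charge is collected only if the pledge survived k periods.
    return sum(amount * retention ** period for period in range(1, periods + 1))

# A $10/month pledge at a hypothetical 90% month-over-month retention:
capped = expected_pledge_revenue(10, 48)      # 48-month expiration
infinite = expected_pledge_revenue(10, 600)   # "infinite", approximated
```

Under these assumptions the infinite pledge is worth more, but only slightly — nearly all the attrition has happened long before month 48, which is why the question needs a real long-term experiment rather than arithmetic.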

If you’re interested in figuring out a recurring donor strategy, we’re more than happy to give you some (free) advice. Just drop us a line at info@actblue.com.

In the fast-paced digital campaigns world, if you’re not innovating and testing constantly, you’re headed for obsolescence. And, more importantly, you’re letting your users down, especially those in short-term competitive environments (aka elections). At ActBlue, we’re always developing our platform with metrics-driven decision making, aka testing.

The result is that today’s ActBlue isn’t the same as the ActBlue of a month ago, and that’s a great thing. Sometimes our tests fail. Others result in a barely statistically significant bump in conversion rates. But that’s ok because all of those little bumps add up. Occasionally we hit on a big winner that dramatically increases conversion rates. We do it in a methodical, constant way that allows us to identify improvements big and small.

One advantage we have is the sheer volume of contributions we process, which allows us to A/B test small tweaks to the form and get statistically sound results. If one organization tried running an identical test on their own, they’d never be able to identify as many improvements.
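A back-of-the-envelope power calculation shows why volume matters so much. The rates here are hypothetical, and this uses the standard normal approximation for comparing two proportions:

```python
from math import ceil
from statistics import NormalDist

def required_n(base_rate, relative_lift, alpha=0.05, power=0.80):
    """Approximate sample size per variation needed to detect a relative
    lift in conversion rate (two-sided test, normal approximation)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)
    z_power = NormalDist().inv_cdf(power)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return ceil(variance * (z_alpha + z_power) ** 2 / (p2 - p1) ** 2)

# On a hypothetical 5% base rate, a tiny 0.5% relative lift takes millions
# of visitors per variation to detect; a 10% lift takes tens of thousands.
n_tiny_lift = required_n(0.05, 0.005)
n_big_lift = required_n(0.05, 0.10)
```

The small tweaks worth half a percent each are exactly the ones a single organization could never resolve on its own traffic.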

We’ve got thousands of campaigns and organizations counting on us to have the best system possible, so they can focus on winning. It drives our work and testing every single day.

Our tech team makes changes to the platform daily. Some are minor tweaks, others major changes. They’ve developed a rock-solid platform where we can easily roll out a significant feature or a layout change, even in the middle of the crazy busy end-of-quarter period. That’s no easy feat; it’s a deliberate design choice so we can be as nimble as the party needs.

Today we thought we’d pull back the curtain just a little bit and break down some of our favorite A/B tests from the past few months.

Test 1: Employer Address Checkbox

We know from our data that a lot of donors mark retired or unemployed on the forms and we wanted to see if we could use that knowledge to increase conversions. Turns out: yes! We A/B tested our normal form with one that has a checkbox they can click if they’re not employed. The checkbox automatically provides us with the information, which fulfills the legal requirement and bumps up conversion rates.

Original:

Checkbox:

We saw a 4.7% improvement in conversions (p < 0.05, for those of you keeping score), so we switched over to the new checkbox version. Bonus points for cutting waaaaay down on customer service questions about the occupation/employer boxes.

Test 2: Shrinking the Contribution Form

Speed is essential in online contributions, so we’re always looking for ways to make the Contribution Form shorter and faster to load, and the rapid increase in mobile donations has made that more important than ever. We ran a number of tests aimed at shrinking the contribution form, including the following:

– Removed the credit card tooltip (which popped up when you clicked the credit card box) so the form would load better on mobile
– Removed the “Employment” section header
– Switched to horizontal employer fields rather than stacking them vertically

All of these tests ended without statistically significant results, but that was a win for us, because it meant we could make our forms less cluttered. If a feature isn’t adding value, that means it’s time to go. And bye bye those three things went on every single form in our system.

You can see the evolution of the Employment section below.

Version 1 (original):

Version 2 (horizontal):

Version 3 (no header, checkbox added):

Test 3: Multi-step Contribution Forms

We already wrote a whole blog post about this test, but it’s worth mentioning again here. This was one of those big wins, with a 25.86% increase in conversion rates with 99% significance. That was after just a few days of running the test. We had tested multi-step Contribution Forms a few years back, and they lost to our standard one page forms, which just goes to show how important it is to test and test again.

One page form (losing version):

Multi-step form (winning version):

We do one thing at ActBlue and we’re the best at it in the business. And the biggest reason is that we’re constantly upgrading our platform. We push changes out to everyone ASAP so that thousands of campaigns and groups big and small can get the best right away.

In a few months when we get down to the crunch of election time, know that we’ve got your backs and you will always be using the most optimized and tested form out there.
