Subject Line Test
One of the most common A/B tests is the subject line test. No matter how engaging or helpful the content in the body of your email may be, if no one opens it, did it really matter?
Here are a few A/B tests you can run on subject lines:
- Is a longer or shorter subject line more likely to drive opens?
- Will adding time-based messaging drive urgency?
- Does a question spark more interest in the recipient to open the email than a statement?
- Does adding personalization, like a person's or office's name in the subject line, improve open rates?
For example, in the public affairs industry, you could use a subject line test when trying to schedule meetings with legislators during your fly-in. In your emails to their schedulers, you could run these subject line tests:
- Length: “Let’s talk about [Topic] at [Fly-in]” vs “We need to change [Topic]. Let’s discuss at your convenience during [Fly-in] on [Date]”
- Time-based: “Two weeks left to schedule a meeting about [Topic] at [Fly-in]!” vs “Schedule a meeting about [Topic] at [Fly-in]”
- Asking a question: “Can we count on you to discuss [Topic] at [Fly-in]?” vs “Let’s talk about [Topic] at [Fly-in]”
- Personalization: “[Office Name], see you at [Fly-in]?” vs “See you at [Fly-in]!”
- Framing: “Understand your constituents’ focus on [Topic] with the 2022 [Topic] Research Report” vs “2022 [Topic] Research Report”
The main metric you’ll use to measure the success of a subject line test is open rate — of all people who received each variation, what percent opened the email?
However, subject line tests can also affect engagement after the open, depending on how well your subject line sets expectations for what readers will find inside. For example, a clickbait subject line may drive opens, but if the content within the email doesn’t deliver on that promise, the opens won’t translate into progress on your goals.
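If you want to check whether an open-rate difference between two subject lines is more than chance, a quick calculation helps. Below is a minimal Python sketch, using made-up recipient and open counts (not figures from this article), that computes each variation’s open rate and runs a standard two-proportion z-test; the function names and numbers are illustrative only.

```python
# Minimal sketch: compare open rates from a subject line A/B test.
# All counts below are hypothetical, for illustration only.
from math import sqrt, erfc

def open_rate(opens: int, delivered: int) -> float:
    """Open rate = opens divided by emails delivered for a variation."""
    return opens / delivered

def two_proportion_z_test(x_a: int, n_a: int, x_b: int, n_b: int):
    """Two-sided z-test for the difference between two proportions."""
    p_a, p_b = x_a / n_a, x_b / n_b
    pooled = (x_a + x_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = erfc(abs(z) / sqrt(2))  # two-sided p-value from the normal approximation
    return z, p_value

# Variation A: question subject line; Variation B: statement subject line (hypothetical counts)
opens_a, delivered_a = 230, 1000
opens_b, delivered_b = 180, 1000

print(f"A open rate: {open_rate(opens_a, delivered_a):.1%}")
print(f"B open rate: {open_rate(opens_b, delivered_b):.1%}")

z, p = two_proportion_z_test(opens_a, delivered_a, opens_b, delivered_b)
print(f"z = {z:.2f}, p = {p:.3f}")  # a small p suggests the difference isn't just chance
```

With these hypothetical numbers, the question subject line wins (23% vs. 18%) and the gap is unlikely to be random noise; with smaller lists or smaller gaps, the same check can tell you the result isn’t conclusive yet.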
Email Sender Test
How does the name of the email sender impact a recipient’s likelihood to open and respond to an email?
Here are a few A/B tests you can run based on email sender:
- High-Profile Executive vs. Peer Employee – Are employees more likely to take action on a grassroots campaign when the ask comes from a high-profile name, like your CEO? Or are they more likely to respond to peer-to-peer outreach?
- Person’s Name vs. Organization Name – Are advocates more likely to respond to an email that looks like it comes from an individual person, even if they don’t recognize the name? Or to an email that clearly comes from an organization they can immediately identify?
As with a subject line test, open rate and click-through rate are your measures of success for this test.
Email Content Test
The test that leaves the most opportunity for creativity is in the actual body of the email. From the language you use on a button to whether you include an image or not, there are countless variables within the body of your email that can help determine what will drive greater engagement from your recipients.
What’s essential in A/B testing is changing only one variable and keeping everything else the same. If you change the text on a button AND an image in the same email, you won’t know which change drove the difference you saw.
- Designed vs. plain text – are your stakeholders more likely to engage with content in a heavily designed and branded email or a plain text one? Do certain types of emails work better with design than others?
- Button CTA text – much like a subject line test, how can you label your buttons in a way that drives clicks?
- Methods of persuasion – If you’re looking to convince a legislator to take action on a key bill, you could test a data-based argument versus a constituent story to make your case.
When running tests on the email copy, you’ll want to observe differences in click-through rate to determine a winner. The click-through rate measures the percentage of people who clicked something in your email out of everyone who opened the email. By looking only at the subset of recipients who opened the email, you know that they actually saw the test and were influenced by it.
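As a quick illustration of that denominator choice, here is a short Python sketch, again with hypothetical counts rather than real campaign data, that computes click-through rate over openers as described above. The variation labels are illustrative, and the significance check from the earlier sketch applies here as well.

```python
# Minimal sketch: measure click-through rate for an email content test.
# All counts are hypothetical; they are not drawn from a real campaign.

def click_through_rate(clicks: int, opens: int) -> float:
    """Clicks divided by the number of recipients who opened the email."""
    return clicks / opens

# Variation A: data-based argument; Variation B: constituent story (hypothetical counts)
variations = {
    "A (data-based argument)": {"opens": 540, "clicks": 81},
    "B (constituent story)": {"opens": 560, "clicks": 112},
}

for name, counts in variations.items():
    ctr = click_through_rate(counts["clicks"], counts["opens"])
    print(f"{name}: {ctr:.1%} click-through rate")
```

Because the denominator is opens rather than deliveries, a variation with a weaker subject line but stronger body content can still win this comparison.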
Take it to the next level: testing different list segments or email types
You ran an A/B test on a blast to your whole advocate network and learned that your CEO’s name drives great open rates. Congrats! That’s a great takeaway for future blasts. But would the same result hold true if you emailed just one department within your organization? Or just the new advocates you’re trying to move to action for the first time?
Or maybe you found that designed emails aren’t great for call-to-action emails asking employees to call their legislator, because they clearly read as a blast and don’t feel personal. That doesn’t mean design won’t be a good fit for sharing community affairs updates with legislators; with a designed email, their offices may have better brand recall for the ways you’re engaging their constituents.
You shouldn’t assume every audience will react the same way to your variations, so it’s worth continuing to test and seeing how much more specific you can make your best practices.
Then, take it another step further and bring your insights to other channels. If you’ve found that certain button language performs really well in your emails at driving action to your campaigns, that same language may work well on your website buttons.
Conclusion
The possibilities for A/B testing are endless, and the tests provide powerful insights for your team. As you run tests, make sure your team has a way of tracking what’s working so you can keep applying the best practices you’ve learned from your winning variations. Over time, the incremental changes you make to your subject lines, email senders, and email content will build your emails into a powerful tool for moving the needle on the issues your team cares about.