Email A/B Testing Guide: What Works (and What Doesn’t)

Email marketing is all about continuous optimization, but let’s be honest… who has time to test everything? I’ve already run these A/B tests so you don’t have to.

Below, I’ll break down the most effective email A/B tests, what we learned, and how you can use these insights to boost your email engagement, conversions, and sales.

In my experience, the last point (#8) is where the most significant performance gains lie.

1. Subject Line Testing 📩

Your subject line is your first impression, so make it count!

- **Short vs. long subject lines.** Shorter, punchier subject lines get higher open rates (3-10% higher in our ecommerce promotional emails).

- **Personalization (e.g., “Hey [First Name]” vs. no name).** Personalized subject lines can increase open rates by 26%. If you try this, handle missing first names gracefully (see the sketch after this list).

- **Curiosity-driven vs. straightforward subject lines.** Curiosity-based subject lines (e.g., “You won’t believe this...”) drove more opens but fewer click-throughs and conversions.

- **Emoji vs. no emoji.** Emojis slightly improved open rates in our tests because they draw attention in a crowded inbox. Use them sparingly; they can come across as spammy in some industries.
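
A quick note on that personalization test: a “Hey [First Name]” opener backfires when the name field is empty. Here’s a minimal Python sketch of the fallback logic (your ESP’s merge-tag syntax will differ, and the subject copy is just an example):

```python
def personalized_subject(first_name: str | None, base: str = "your faves are on sale") -> str:
    """Fall back to a generic subject when the first name is missing."""
    name = (first_name or "").strip()
    if name:
        return f"Hey {name.title()}, {base}"
    return f"Hey, {base}"  # avoids an awkward "Hey ," landing in the inbox

print(personalized_subject("ryan"))  # Hey Ryan, your faves are on sale
print(personalized_subject(None))    # Hey, your faves are on sale
```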


2. Sender Name & Email Address 📢

Would you rather open an email from “Marketing Team” or “Ryan from Campaign Geeks”?

- **Brand name (e.g., “Campaign Geeks”) vs. personal name (e.g., “Ryan at Campaign Geeks”).** Emails from a real person’s name feel more personal and often get higher open rates. Try this on a high-priority campaign you want to make sure your subscribers see.

- **No-reply@company.com vs. hello@company.com.** Avoiding “no-reply” addresses is said to improve deliverability and engagement (source: SendGrid). In our case, though, the results were statistically insignificant (no winner here!).


3. Email Copy & Tone 📝

How you write your email impacts whether people read it or ignore it. Finding the right tone for your audience is important.

- **Formal vs. casual tone.** Conversational emails often increase engagement and reply rates.

- **Short vs. long email copy.** Shorter emails usually drive higher CTRs, but longer, value-packed emails work better for educational content. In our test, a longer flyer-style email outperformed a shorter version. I’ve seen lots of mixed results here, so make sure you test this one on your audience across templates.

- **Storytelling approach vs. direct-to-the-point messaging.** Storytelling can increase conversions by making emails more relatable. We tested this approach in a welcome email series and drove 12% more conversions because it resonated more than generic “about us” content.


4. CTA (Call-to-Action) Placement & Wording 🎯

Your CTA determines whether subscribers take action.

- **CTA above the fold vs. at the end of the email.** Placing the CTA higher boosted click-through rates (CTR) by 2-7%, depending on the template.

- **Button CTA vs. text link CTA.** CTA buttons generally get more clicks than text links. Bonus: test the button copy itself (e.g., “Shop Sale” tends to out-click a generic “Shop Now”).

- **Action-oriented CTA (“Get Started Now”) vs. curiosity-driven CTA (“See What’s Inside”).** Action-driven CTAs often outperform vague or generic ones.


5. Plain Text vs. HTML Design 🖼️

Pro Tip: Keep designs clean with a balance of images and text. Try not to overload on images; it can hurt deliverability.

- **Plain text email vs. beautifully designed HTML email.** Plain text emails often perform better in B2B or relationship-based industries, but this isn’t the case in B2C, so be sure to test this out to find the right fit for your business.

- **Image-heavy email vs. text-heavy email.** HTML emails with visuals tend to boost engagement in eCommerce and B2C. Too many images can trigger spam filters, so balance is key.


6. Send Time & Frequency 📆

When you send emails affects whether they get read or buried in busy inboxes.

- **Morning vs. evening send times.** Best send times vary, but many studies suggest Tuesdays and Thursdays at 10 AM work well. For the ecommerce retailers I’ve worked with, sending in the evening at 7 PM has been successful; a recent test saw a 9% lift in open rates vs. sending at 7 AM.

- **Weekdays vs. weekends.** For B2B, weekday mornings tend to work best. For B2C it’s more nuanced, depending on the typical habits of your target market. Test sending to your key segments in the morning and in the evening; I’ve seen mixed results with B2C businesses.

- **1 email per week vs. 3 emails per week.** More emails can mean more engagement… until subscribers feel overwhelmed and start unsubscribing, so monitor your unsubscribe rate during this test. We found that sending 3 times a week was optimal for ecommerce brands; more than that has diminishing returns and can make a dent in your email list over time.


7. Segmentation Optimization 🎯

Make sure you're sending the right message to the right customer.

- **Demographics (gender and age).** Segmenting by gender is typically appropriate in B2C, but I’ve seen mixed results depending on the business:

  - Vitamin & health ecommerce retailer: male subscribers had a 50% lower CTR on a women’s health campaign. They clearly weren’t interested, so we helped adjust the segments going forward.

  - Beauty & skincare ecommerce retailer: male subscribers were still engaging with traditionally female content at a high rate (only a 2% lower CTR on a makeup email), so we suggested not segmenting by gender.

- **Past purchase behaviors.** We implemented a series of tests to refine key segments for an online retailer (see the sketch after this list):

  - Segment by brand and category purchased.

  - Test the recency window for the last purchase.

  - Where a segment shows weak engagement, keep refining until the engagement goal is reached.

  Incorporating this approach with a baby & kids retailer helped refine the baby vs. toddler vs. kids segments, leading to an overall 12% improvement in CTR across all campaigns.
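
As a rough illustration of those rules, here’s a minimal pandas sketch; the file and column names (orders.csv, category, last_order, email) are assumptions about your own export:

```python
import pandas as pd

# Hypothetical order export; adjust names to match your own data.
orders = pd.read_csv("orders.csv", parse_dates=["last_order"])

# Rule 1: segment by category purchased. Rule 2: test the recency window.
window = pd.Timestamp.today() - pd.DateOffset(months=6)  # try 3, 6, 12 months
segment = orders[(orders["category"] == "baby") & (orders["last_order"] >= window)]

# Rule 3: if this segment's engagement is weak, tighten the rules and re-test.
segment["email"].drop_duplicates().to_csv("baby_segment.csv", index=False)
print(f"{segment['email'].nunique()} subscribers in the baby segment")
```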

- **Predictive algorithms.** We tested predictive models that segment customers by the category they’re most likely to purchase next. The past-purchase segment rules we’d already established still worked best… perhaps the inputs for the predictive model weren’t accurate, so more testing to come here!

- **Unengaged subscribers.** It’s important to keep your email list clean, so refine it constantly. Do this by splitting out sends to your least engaged subscriber cohorts and watching CTR and campaign conversions:

  - Subscribers who haven’t clicked through in over 3 months.

  - Subscribers who haven’t made a purchase in over a year.

  If these cohorts still aren’t clicking through, it’s time to unsubscribe them (a sketch of this split follows).
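
Here’s a minimal sketch of that split in Python; subscribers.csv, last_click, and last_purchase are hypothetical names for whatever your ESP exports:

```python
import pandas as pd

subs = pd.read_csv("subscribers.csv", parse_dates=["last_click", "last_purchase"])
today = pd.Timestamp.today()

# Flag both cohorts; isna() catches people who have never clicked or purchased.
no_click = subs["last_click"].isna() | (subs["last_click"] < today - pd.DateOffset(months=3))
no_purchase = subs["last_purchase"].isna() | (subs["last_purchase"] < today - pd.DateOffset(years=1))

# Send this cohort a re-engagement test before suppressing them for good.
cold = subs[no_click & no_purchase]
cold.to_csv("sunset_candidates.csv", index=False)
print(f"{len(cold)} subscribers flagged for the sunset test")
```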


8. Personalization & Dynamic Content 🔄

Take your email campaigns to the next level by leveraging personalized content. In my experience, this is where the most significant performance gains lie. Take some time to think about the tailored offers and recommendations you can serve your customers based on their unique behaviors.

- **Generic vs. dynamically personalized email content.** Every test that included a personalized content block won (the banner logic is sketched after this list):

  - A rewards banner for loyalty members increased conversions by 8% within that cohort.

  - A dynamic banner featuring a new-customer coupon code to drive a first purchase increased conversions by 4%.

  - Shoppers who had recently abandoned a cart received “Still looking for something?” content in the next promotional email (a dynamic banner, separate from the triggered abandoned-cart email).

  - Calling out a customer’s favorite brand in the promotional email subject line and header drove a 23% lift!
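
Here’s the promised sketch of that banner logic in plain Python; the field names are assumptions about your customer data, and in practice the selection would live in your ESP’s dynamic-content rules:

```python
def pick_banner(sub: dict) -> str:
    """Choose one dynamic banner per subscriber, most specific rule first."""
    if sub.get("abandoned_cart_recently"):
        return "still_looking"          # “Still looking for something?”
    if sub.get("is_loyalty_member"):
        return "rewards_balance"        # loyalty rewards banner
    if sub.get("orders_count", 0) == 0:
        return "first_purchase_coupon"  # new-customer coupon code
    if sub.get("favorite_brand"):
        return "favorite_brand_promo"   # echo the brand in the SL and header
    return "generic_promo"              # control / fallback content

print(pick_banner({"is_loyalty_member": True, "orders_count": 4}))  # rewards_balance
```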

- **Including past purchase history in emails vs. no personalization.** Recommending products based on past purchases can increase repeat sales. We saw significant lift across various tactics:

  - “Similar Items” recommendations in the abandoned-cart email template boosted CTR by over 15% (a toy version is sketched below).

  - “Your faves on sale now” in the weekly promotional email boosted CTR by a whopping 21%!

  - “You may also like” recommendations in the post-purchase email series boosted CTR by 8%.
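
Most commerce platforms generate “Similar Items” for you, but here’s a toy co-purchase version in Python to show the idea (the baskets are made up):

```python
from collections import Counter
from itertools import permutations

baskets = [  # made-up order histories
    {"stroller", "swaddle"},
    {"stroller", "monitor"},
    {"stroller", "swaddle", "monitor"},
]

# Count how often each ordered pair of items was bought together.
co_purchases = Counter()
for basket in baskets:
    co_purchases.update(permutations(basket, 2))

def similar_items(item: str, k: int = 2) -> list[str]:
    scores = {b: n for (a, b), n in co_purchases.items() if a == item}
    return sorted(scores, key=scores.get, reverse=True)[:k]

print(similar_items("stroller"))  # e.g. ['swaddle', 'monitor']
```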


Final Thought

A/B testing isn’t about one-size-fits-all answers; it’s about continuous improvement. Test, analyze, and iterate to find what works for YOUR audience!

Word of advice: don’t try to test too many elements at once. If you do want to test multiple elements, set it up as a proper multivariate test (sketched below); otherwise, it will be unclear why the winning version actually outperformed.
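
For illustration, here’s a minimal sketch of a 2x2 multivariate assignment in Python; the variant names are placeholders, and remember each of the four cells needs enough volume to reach significance:

```python
import random
from itertools import product

# Two elements under test -> four cells, so the subject-line effect, the CTA
# effect, and their interaction can all be separated cleanly.
subject_lines = ["short_subject", "long_subject"]
ctas = ["Shop Sale", "Shop Now"]
cells = list(product(subject_lines, ctas))  # 4 combinations

def assign(subscriber_id: str) -> tuple[str, str]:
    rng = random.Random(subscriber_id)  # deterministic: same subscriber, same cell
    return rng.choice(cells)

print(assign("subscriber-42"))  # e.g. ('long_subject', 'Shop Sale')
```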

If you have any suggestions on A/B tests to add to the list, let me know in the comments 😄