Which Test Won? A/B Testing a Button Color on Mobile Devices

We love the folks over at Which Test Won, who do amazing a/b testing; we’ve written about them more than once because they’re so great! This test was awesome. Extra Space Storage tested two versions of its mobile website: one with a blue call-to-action button and one with an orange button. Visitors were using the site to reserve a storage unit and arrange a move-in.

Most visits came from mobile devices, so they really wanted to test the button color on mobile to increase storage rentals. Version A had the blue CTA; Version B had the orange CTA button.
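
Never run one of these yourself? The mechanics are pretty simple: split traffic evenly between the two buttons and count reservations. Here’s a minimal sketch of the assignment step; the variant names, hash, and hex colors are our own illustration, not Extra Space Storage’s actual setup.

```ts
// Minimal sketch of a 50/50 split for a button-color test. The variant
// names, hash, and hex colors are illustrative, not Extra Space
// Storage's actual setup.
type Variant = "A_blue" | "B_orange";

// Tiny deterministic hash so a returning visitor always lands in the
// same bucket (and keeps seeing the same button color).
function hashVisitorId(visitorId: string): number {
  let hash = 0;
  for (const char of visitorId) {
    hash = (hash * 31 + char.charCodeAt(0)) >>> 0; // keep unsigned 32-bit
  }
  return hash;
}

function assignVariant(visitorId: string): Variant {
  return hashVisitorId(visitorId) % 2 === 0 ? "A_blue" : "B_orange";
}

// Usage: decide the CTA color once per visitor, and log the variant
// alongside any completed reservation so the lift can be measured.
const variant = assignVariant("visitor-12345");
const ctaColor = variant === "A_blue" ? "#0066cc" : "#ff6600";
console.log(variant, ctaColor);
```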


The Winner?

The blue button was the control. The orange button killed it with a 7.8% increase in reservations!

From Which Test Won:

“Research shows blue tends to be most people’s favorite color. Blue is used on many social media sites and in the financial industry to convey trust and security.

“Research has also found yellow and orange typically elicit cheerful feelings. For many, yellow is the color of happiness. But, for many Extra Space Storage customers, it may not be a happy time. Again, the color may have been incongruent with visitors’ feelings.”

Something so small can be oh-so-right!

Which Test Won? A/B Testing Video vs. Datasheet

We always love the folks over at Which Test Won; they geek out on data like we do. So when Continuum, a company that offers remote monitoring and management services, networking solutions, and backup/recovery assistance, ran a lead-capture test pitting a data sheet offer against a video, the result surprised even us. See for yourself…

Which Test Won: Video vs. Datasheet for leads


Version A, the data sheet, won! Can you believe it? It drove a 63% increase in clickthroughs compared to the video offer.

What does Continuum say? Know your visitors! Figuring out how to appeal to them is crucial. Part of the a/b testing challenge is identifying the best format to serve them content.

Continuum wanted to learn whether its partners preferred to get an email offering informational content in the form of a data sheet or a video.

According to Continuum, many of their customers are: “…former IT employees from larger organizations who’ve left to start their own managed IT services business. These people are technicians at heart, but want to become better business owners.” They also thought that their “viewers wouldn’t take the time to watch a video, especially during business hours. However, the data sheet could be quickly scanned or printed out for later review.”

Continuum says that you should test your call to action; if your visitors prefer to quickly scan a sheet, send a data sheet. If they want to look at something more engaging, send them a link to your video. Then see what works best.

Great test!

Which Test Won: A/B Testing a Countdown Timer

We love our buds at Which Test Won, the masters of a/b testing, since we’re all geeky and into testing and data. Sign up for their emails; their testing is awesome!

And this one rocks too. Would you say Version A, with a countdown timer, or Version B, without a timer, won?

Version A of course!

This one included a countdown timer on the page selling the jacket. Hokey? Some might say. But hokey just got Miss Etam, a leading Dutch women’s outfitter, an 8.6% lift in conversions. And not surprisingly, a few hours before the timer was due to finish, the lift went nuts.
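
Want to try the hokey thing yourself? A countdown timer is only a handful of lines of front-end code. A bare-bones sketch, where the deadline, element id, and copy are placeholders rather than Miss Etam’s actual implementation:

```ts
// Bare-bones countdown timer of the kind tested here. The deadline,
// element id, and copy are placeholders, not Miss Etam's actual code.
function startCountdown(deadline: Date, elementId: string): void {
  const timerId = setInterval(() => {
    const el = document.getElementById(elementId);
    if (!el) return;
    const msLeft = deadline.getTime() - Date.now();
    if (msLeft <= 0) {
      el.textContent = "Offer ended";
      clearInterval(timerId);
      return;
    }
    const hours = Math.floor(msLeft / 3_600_000);
    const minutes = Math.floor((msLeft % 3_600_000) / 60_000);
    const seconds = Math.floor((msLeft % 60_000) / 1_000);
    el.textContent = `${hours}h ${minutes}m ${seconds}s left`;
  }, 1_000);
}

// Usage: count down to the end of the jacket promotion.
startCountdown(new Date("2015-04-15T23:59:59"), "offer-timer");
```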

A total of 50,000 viewers saw the test. It ran for two weeks and achieved 99% confidence.
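
For the stats-curious, a “99% confidence” figure like that typically comes from a two-proportion z-test on the conversion counts. A quick sketch; the counts below are invented for illustration, since the write-up only reports the lift and the total sample size.

```ts
// Two-proportion z-test: the usual way a "99% confidence" figure is
// computed for a conversion-rate test. The counts below are invented;
// the write-up only reports an 8.6% lift across 50,000 viewers.
function twoProportionZ(
  convA: number, visitorsA: number,
  convB: number, visitorsB: number,
): number {
  const pA = convA / visitorsA;
  const pB = convB / visitorsB;
  const pooled = (convA + convB) / (visitorsA + visitorsB);
  const se = Math.sqrt(pooled * (1 - pooled) * (1 / visitorsA + 1 / visitorsB));
  return (pA - pB) / se;
}

// Hypothetical counts: 25,000 visitors per arm, timer converting 8.6% better.
const z = twoProportionZ(2715, 25_000, 2500, 25_000);
// |z| >= 2.576 is the two-sided threshold for 99% confidence.
console.log(z.toFixed(2), Math.abs(z) >= 2.576 ? "significant at 99%" : "not significant");
```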

De Nieuwe Zaak, the agency that performed the test for Miss Etam, felt the limited-time offer won on urgency: if people didn’t “act now” they’d miss out, and that made for a great result for their client. Oh, and they tested the timer as a visual and as text, and the visual won.


Thanks Which Test Won!

Which Test Won? Awesome A/B Testing of a Mobile Button

The great folks over at Which Test Won are at it again with some brand-new a/b testing, and this one’s a doozy. This test was run by Oxfam, a British-born movement that believes in a world without poverty, where people are valued and treated equally, enjoy their rights as full citizens, and can influence decisions affecting their lives. Very cool.

So obviously they rely on donations to keep their non-profit going. This test is pretty cool.

They tested:

  • Version A embedded a fillable donation form right on the page, so the user didn’t have to go to another landing page to fill it out.
  • Version B featured just a “Donate Now” button that led to a separate page with the form.


The Winner!

Version A, the fillable donation form, won, lifting donation clickthroughs by 23% compared to Version B and increasing completed donations by 131%!
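
There’s a fun bit of arithmetic hiding in those two numbers. Assuming every completed donation starts with a clickthrough, completions per visitor equal the clickthrough rate times the completion rate per click, so the two lift factors multiply:

```ts
// Sanity check on how the two reported lifts relate. Completions per
// visitor = clickthrough rate x completion rate per click, so the
// implied per-click lift is the ratio of the two reported factors.
const clickLift = 1.23;      // +23% donation clickthroughs (reported)
const completionLift = 2.31; // +131% completed donations (reported)

const perClickLift = completionLift / clickLift; // ~1.88
console.log(`Completion rate per click up ~${((perClickLift - 1) * 100).toFixed(0)}%`);
```

In other words, the inline form didn’t just attract more clicks; the people who clicked were far more likely to finish the donation.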

We think Version A just made it easier to donate. You never know how many pages lie behind a “Donate Now” button, and a simple one-page way to give makes sense. Oxfam agrees with this “transparency.”

Check out the rest of the story here! Thanks again to Which Test Won, we love all of your a/b testing!

Daily Beast Does an Email Marketing Design Makeover

We are pretty adamant about getting our news headlines from two sources: The Daily Beast and Nextdraft (thanks to Dave Pell for THE most compelling headlines you’ll ever read in your entire life on this planet Earth).

So recently we saw that The Daily Beast did a bit of an email marketing design makeover on their campaigns and we felt compelled to dissect the old vs. the new. Here goes:

An example of what the Daily Beast email used to look like and what it looks like now.

  • The old had just 2 spots for ads in the email; the new one has spots for 4 ads. More ads = more revenue. Good job.
  • The old doesn’t have social links on each article; the new one lets readers share each article. We like that one!
  • The new one is very text-heavy; they may be embracing the “less is more” rule on design.
  • The old one doesn’t look super-great on mobile; the new one has larger text and is a bit more optimized for mobile, so it’s easier to read.
  • The old one uses valuable space above the fold for sign-up and invite links, but these people are already getting the email. The new one uses the share links within each article and the follow links at the bottom. Once people are on the site, all information is very shareable.

They added more emails! They started a new Daily Digest for only the most important, can’t-miss news, complete with images, and it incorporates their new best practices from above. Now we have to get to the bottom of Brian Williams being on “leave”!

The new Daily Beast Daily Digest, AM edition.

So if you’re in the mood to do some testing with your email marketing campaigns, you might try a few of these ideas and see what happens!

Dataviz: How Color Affects Your Biz [Infographic]

Did you ever think that color could make a huge difference in your marketing? Well, it can, and it’s been tried and tested, though with no simple answer. Kissmetrics put together a few infographics (great dataviz, btw!) that show how color can really affect outcomes, based on testing they’ve uncovered. It’s all about measurement, isn’t it?

A few takeaways from the infographics:

  • Most people make decisions based on VISUALS.
  • It takes NO time for our minds to make a call!
  • Test it! One simple button color change got a 21% increase in clicks.
  • People in North America see color differently than people in other countries do.
  • Does green mean GO? Or are you too relaxed to move when you see it?
  • Selling high-end jewelry? Consider testing sky blue instead of orange!

Which Test Won: A/B Testing Arial vs. Calibri

The great people over at Which Test Won gave us more a/b testing to noodle on: a FONT test! Can you believe it? We love this stuff.

So, Hyland, a content management software company that uses Marketo, tested a font in an email: 12-point Arial vs. 14-point Calibri. It was a simple text-only email to their house list offering a data sheet to learn more about their system, and it contained one link.

Which do you think won?


Version B, the 14-point Calibri, won, increasing clicks by 70.7% at a 99% confidence level! Even though it’s a Microsoft font, we’ll give it props.

Have you done any testing like this?

Which Test Won? Clarks Does it Again A/B Testing Navigation

We’ve written about how Clarks Shoes does a fair amount of testing to squeeze out every little bit of response they can. This time they did some A/B testing on a long page that showed the navigation when you scrolled back up vs. NO navigation at all. Thanks to the fine folks at Which Test Won, we know the answer!

Which do you think won?


Version A Won!

How did it do?

It increased orders by 2.16% at a 96.5% confidence level, and it increased ‘Add to Basket’ clicks by 1.27% at a 92% confidence level.
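
The winning pattern, navigation that reappears as soon as the shopper scrolls back up, takes only a few lines of front-end code. A rough sketch, where the element id and class name are our own invention rather than Clarks’ actual markup:

```ts
// Rough sketch of the winning pattern: hide the navigation while the
// shopper scrolls down a long page, bring it back on scroll-up. The
// element id and CSS class are our invention, not Clarks' markup.
let lastScrollY = window.scrollY;

window.addEventListener("scroll", () => {
  const nav = document.getElementById("site-nav");
  if (!nav) return;
  const scrollingUp = window.scrollY < lastScrollY;
  // The "nav-visible" class would pin the nav to the top of the viewport.
  nav.classList.toggle("nav-visible", scrollingUp || window.scrollY === 0);
  lastScrollY = window.scrollY;
});
```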

According to Which Test Won: “The test ran in two sequences over a 39-day period. There were four total variations. After three weeks, the team eliminated the two worst performing variations. Next, they ran a split test between the two top performing variations for the next 18 days.”
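
For the programmatically inclined, that two-stage design boils down to “keep the top two, then split test them head-to-head.” A little sketch with invented numbers and variant names:

```ts
// Sketch of the two-stage design described above: run all variations,
// then keep the best two for a head-to-head split test. The visitor
// and order counts are invented for illustration.
interface Arm { name: string; visitors: number; orders: number; }

const stageOne: Arm[] = [
  { name: "V1", visitors: 40_000, orders: 880 },
  { name: "V2", visitors: 40_000, orders: 902 },
  { name: "V3", visitors: 40_000, orders: 851 },
  { name: "V4", visitors: 40_000, orders: 899 },
];

// After three weeks, eliminate the two worst performers by order rate...
const topTwo = [...stageOne]
  .sort((a, b) => b.orders / b.visitors - a.orders / a.visitors)
  .slice(0, 2);

// ...then split the next 18 days of traffic between the survivors.
console.log("Head-to-head:", topTwo.map((arm) => arm.name).join(" vs. "));
```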

Great test guys, we’re always learnin’ from ya!