What is A/B Testing?
A/B testing (also called split testing) compares two versions of social proof content against each other to determine which one performs better.
Think of an A/B test as an experiment where variants of a page are shown to your customers at random. Statistical analysis is then used to figure out which of the two variations performs better against your goal.
You can use Fera.ai to split test any running widget to determine whether showing your customers Version A or Version B improves conversion rate or revenue.
Why Should You Use A/B Testing?
A/B testing lets you make changes to your site and compare them against the unchanged version. Testing helps you collect data about the impact of the changes (whether they’re good or bad) and helps you decide which changes to make.
For example, you may want to test whether your countdown timer is more effective at pushing a customer to check out when you use red text vs. green text.
The importance of A/B testing is this: it can help you prove a hypothesis wrong, or it can be used to continually improve a given experience.
The A/B testing process
Collect Data
- Collecting data by going through your shopper journeys, or third-party analytics like Google Analytics, helps you understand where you could make changes and what areas could be optimized
Create a Hypothesis
- When you begin, you can think of A/B testing ideas and create hypotheses about why they’d be better than the current version
- E.g., if I make my “Add to Cart” button sticky on mobile, I think I can increase mobile conversion rates
- Having a countdown timer in red will create more urgency to check out
- Including an upsell in my cart will improve my average order value
- Using the A/B testing option in Fera.ai, you can make your changes to create a variation of the original and run a comparison
Run the Experiment
- Launch your A/B test and wait for customers to visit your website.
- Each visitor is randomly assigned either the control or the variation, and the data is segmented accordingly
- Once the time is up for your A/B test, you can check out the results. This will let you know if there’s a statistically significant difference between the control and variation. You’ll be able to decide whether or not making the change is worth it!
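To give a sense of what “statistically significant” means in the results step, here is a minimal sketch using a standard two-proportion z-test at 95% confidence. The numbers and function name are hypothetical illustrations, not Fera.ai’s internal calculation:

```python
import math

def ab_test_significant(control_visitors, control_conversions,
                        variation_visitors, variation_conversions,
                        z_threshold=1.96):
    """Two-proportion z-test: is the variation's conversion rate
    significantly different from the control's at 95% confidence?"""
    p_control = control_conversions / control_visitors
    p_variation = variation_conversions / variation_visitors
    # Pooled conversion rate under the null hypothesis (no real difference)
    pooled = ((control_conversions + variation_conversions)
              / (control_visitors + variation_visitors))
    std_err = math.sqrt(pooled * (1 - pooled)
                        * (1 / control_visitors + 1 / variation_visitors))
    z = (p_variation - p_control) / std_err
    return abs(z) >= z_threshold, z

# Hypothetical example: 5,000 visitors per group
significant, z = ab_test_significant(5000, 250, 5000, 320)
print(significant, round(z, 2))  # True 3.02 — the lift is significant
```

If the test comes back non-significant, the honest conclusion is that the data can’t distinguish the variation from the control yet, not that the change had no effect.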
How Does A/B Testing Work With Fera.ai?
Fera.ai uses A/B testing to show your customers a variation of your site with our social proof or urgency notifications and the control without them. This lets you determine if Fera.ai is having a positive effect on your store and boosting your conversion rate!
For an A/B test, you create two versions of a webpage. The change can be something as simple as where a button appears, or as big as a complete redesign of the original page.
When you run a test with Fera.ai, half of the visitors are shown the original version of the page (the control), and the other half are shown the modified version of the page (the variation).
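How the 50/50 split is typically implemented can be sketched with deterministic hash-based bucketing, so each visitor sees the same version on every page view. This is a common technique, not a description of Fera.ai’s internals, and the function and IDs are hypothetical:

```python
import hashlib

def assign_variant(visitor_id: str, experiment_id: str) -> str:
    """Deterministically bucket a visitor into the control or the variation.
    Hashing (experiment, visitor) keeps each visitor's assignment stable
    across page views while splitting traffic roughly 50/50."""
    digest = hashlib.sha256(f"{experiment_id}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # pseudo-random number in 0..99
    return "control" if bucket < 50 else "variation"

# The same visitor always lands in the same group:
print(assign_variant("visitor-123", "countdown-test"))
print(assign_variant("visitor-123", "countdown-test"))  # same result again
```

Because the assignment is derived from the visitor ID rather than stored, a returning visitor keeps seeing the same version without any extra bookkeeping.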
Once the data is collected and analyzed, you can determine whether changing the page’s design had a positive effect, a negative effect, or no effect at all.
How Can I Use A/B Testing?
Within the Fera.ai app, click on the “Research” tab, and then you will see the “A/B Tests” section. By default, this is disabled, and Fera.ai content will show to 100% of your site’s visitors.
Testing for large stores
If you have a lot of data and visitors (roughly 500+ orders a month), you can use A/B testing to measure results from your own traffic.
Testing for small stores
If you don’t have a lot of orders yet, you can leave this off, and we’ll use average numbers for stores of your size to calculate results.
When you navigate to other tabs within any of your social proof content, you will see a warning. This warning is a reminder not to modify or edit the campaign in any way while you have an A/B test running live.
Editing mid-test can skew your data and make your results invalid or inconclusive.
A/B Testing in Fera.ai: step-by-step
For more information, you can check out this VWO article.
Go ahead, prove yourself right…or wrong!