A/B testing is a new feature, currently available only on certain premium plans. It is an advanced feature for users who want to run automated split tests to understand how specific design changes affect a form's conversion rate and sign-up volume.
To get started, click the AB Tests tab at the top of your Privy dashboard:
When you click the "New Experiment" button, you will be prompted to name the experiment and choose the existing campaign you want to test.
Now you can decide how you want to configure the experiment. You can run it on a fixed schedule (e.g., end the experiment after 30 days) or base it on the number of participants (e.g., end the experiment after 50,000 visitors have seen the campaign).
Next, it's time to name the original variant. The variant name simply helps you keep track of what you're changing. In our example, the original campaign is a flyout-style opt-in form, so we name the original variant "flyout".
The Traffic Ratio setting determines what percentage of the test traffic is sent to each variant. Leaving the traffic ratio at 1 for every variant divides the traffic evenly across them.
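To make the traffic ratio concrete, here is a minimal sketch of weighted variant assignment. This is purely illustrative and not Privy's actual implementation; the `pick_variant` function and its inputs are hypothetical. It shows why equal ratios (e.g., all set to 1) produce an even split, while a ratio of 2 on one variant would send it roughly twice as many visitors:

```python
import random

def pick_variant(variants):
    """Assign a visitor to a variant, weighted by each variant's traffic ratio.

    `variants` maps a variant name to its traffic ratio. Equal ratios
    (e.g., all 1) yield an even split; unequal ratios split traffic
    proportionally. Illustrative sketch only -- not Privy's code.
    """
    names = list(variants)
    weights = [variants[name] for name in names]
    return random.choices(names, weights=weights, k=1)[0]
```

With `{"flyout": 1, "bar": 1}` each variant receives about half the visitors over a large sample.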
Now it's time to create your next variant. Click the add variant card at the bottom to name it. In our example, we're calling it "bar" since we want to change the display type to a bar, testing the effectiveness of a bar against the original flyout.
After creating the next variant, you will be brought into an editor to alter its design. Editing a variant within an experiment limits you to the design or emails section of the original campaign. The variant begins as an exact replica of the original, which simplifies the process and keeps as much as possible consistent. As with any campaign, you can open the designer if you just want to test small changes like copy or button color.
In this example, we deleted the flyout design from the variant and then added a new display, selecting the bar display type.
Next, you'll be brought back to the experiment summary page, where you need to click the "Start experiment" button to begin the A/B split test.
Once the experiment is live, you'll see an at-a-glance view of the number of completed participants, the conversion rate for each variant, and the overall impact of one variant over the others.
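The two numbers the summary reports can be sketched in a few lines. This is a generic illustration of how conversion rate and relative impact ("lift") are conventionally computed, not Privy's exact formulas; the function names are our own:

```python
def conversion_rate(conversions, participants):
    """Fraction of participants who converted, e.g. 150 of 1,000 -> 0.15."""
    return conversions / participants if participants else 0.0

def lift(variant_rate, original_rate):
    """Relative impact of a variant over the original.

    A variant converting at 15% against an original at 12% shows a
    lift of roughly +25% (0.25).
    """
    return (variant_rate - original_rate) / original_rate
```

So if the "bar" variant converts 150 of 1,000 visitors while the original "flyout" converts 120 of 1,000, the bar variant's impact is about +25% over the original.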
In this screenshot, you can see an experiment we're currently running on our own site, where the original popup with the purple button color is outperforming the two variants in the experiment, which use the same popup with two different button colors.
This feature is still in beta. Have feedback? Email us directly at firstname.lastname@example.org