In plans with performance features, you have the option of running split tests (also known as A/B tests). These let you test two landing page or webinar variants against each other to find out which one achieves better results.
I) Creating a split test
1. Open the "Split Test" menu.
2. Click on "New Split Test" in the upper left corner.
3. Fill in all the requested fields:
a) Split test name: Enter an internal name for the split test here.
b) Equalize page views / Equalize registrations
Here you can choose between two different variants:
Equalize page views: The landing pages of both webinar projects are shown to your visitors in alternation, so that by the end each landing page has been opened the same number of times. This variant is suitable for testing two different landing pages against each other.
Thanks to the integrated conversion optimization, visitors who open the page a second time are shown the other landing page.
Equalize registrations: With this variant, visitors see the landing page of webinar project 1 until a registration takes place; then they see the landing page of webinar project 2 until the next registration takes place, and so on. This variant is suitable if you want to test two different e-mail processes or webinar videos against each other.
c) Select the two webinar projects you want to test against each other.
d) Landing page / iFrame
Select the landing page or iFrame to be used for the split test. For an iFrame, you must also specify the URL of the page on which the iFrame is embedded.
e) Destination after the end of the split test
Select the landing page to which the split test link should point after the end of the split test.
4. Click on "Start split test" in the lower right corner.
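The two equalization variants described in step 3 b) can be pictured as simple assignment strategies. The following Python sketch is purely illustrative, assuming nothing about the software's actual implementation; the project names and the visit/registration inputs are placeholders.

```python
import itertools

def equalize_page_views(visits):
    """'Equalize page views': alternate strictly between the two
    landing pages, so after an even number of visits both pages
    have been shown equally often."""
    pages = itertools.cycle(["project_1", "project_2"])
    return [next(pages) for _ in visits]

def equalize_registrations(visit_events):
    """'Equalize registrations': show project 1's landing page until a
    registration occurs, then switch to project 2 until the next
    registration, and so on. `visit_events` is a list of booleans:
    True means that visit ended in a registration."""
    shown = []
    current = 0  # 0 -> project 1, 1 -> project 2
    for registered in visit_events:
        shown.append(f"project_{current + 1}")
        if registered:
            current = 1 - current  # switch pages after each registration
    return shown

# Four visits alternate 1, 2, 1, 2 under "Equalize page views".
print(equalize_page_views(range(4)))
# Under "Equalize registrations", the page only changes after a sign-up.
print(equalize_registrations([False, True, False, False]))
```

The first strategy balances exposure (useful for comparing landing pages); the second balances registrations, so each project's follow-up e-mail process or webinar video is tested on the same number of registrants.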
II) Evaluation of a split test
1. Open the "Analytics" menu.
2. Set the slider at the top to "Compare mode".
3. Select the split test you want to evaluate as the "Template" in the upper left corner. The two split-test projects and the split-test time span are then selected automatically.
4. Below you will find the respective values for each of the two split-test projects and can compare them directly with each other.
Currently, the comparison values are only displayed on the statistics dashboard; support for the other statistics views will be added over time.
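When comparing the two projects, the key figure is usually the conversion rate (registrations per page view). This minimal Python sketch shows one way to compute it and the relative lift of one project over the other; the numbers are made up for illustration, not values from the statistics dashboard.

```python
def conversion_rate(registrations, page_views):
    """Registrations per page view, as a fraction."""
    return registrations / page_views if page_views else 0.0

def compare_projects(stats_1, stats_2):
    """Compare two split-test projects. Each `stats_*` argument is a
    (page_views, registrations) tuple with placeholder numbers.
    Returns both conversion rates and the relative lift of
    project 2 over project 1."""
    r1 = conversion_rate(stats_1[1], stats_1[0])
    r2 = conversion_rate(stats_2[1], stats_2[0])
    lift = (r2 - r1) / r1 if r1 else float("inf")
    return r1, r2, lift

# Example: 50 registrations from 500 views vs. 65 from 500 views.
r1, r2, lift = compare_projects((500, 50), (500, 65))
print(f"Project 1: {r1:.1%}, Project 2: {r2:.1%}, lift: {lift:+.0%}")
```

Note that small differences can be due to chance; the longer a split test runs and the more registrations it collects, the more reliable such a comparison becomes.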