Stigg’s experimentation capability allows you to run A/B tests of different pricing strategies and determine which strategy results in an increased conversion rate, revenue, and overall customer lifetime value (LTV).
Enter the experiment name and an optional description. To allow easy analysis of the experiment results, the experiment name must be unique. An example name is “Freemium vs. reverse trial”.
When running multiple experiments using the same configuration, it’s recommended to add the traffic distribution as a suffix to the experiment name, for example: “Freemium vs. reverse trial - 50/50”.
Define how the traffic of new customers will be distributed between the experiment variations. You can do so by dragging the slider or by manually entering the distribution percentage.
After the experiment configuration has been defined, confirm the creation of the experiment by clicking the “Create experiment” button. Experiments are first created in a “Draft” (not yet running) status, which allows you to review and apply changes to the experiment definition before the experiment is started.
When an experiment has not yet started, simply open it from the list of experiments, apply the relevant changes, and click “Save changes” to confirm them.
To prevent skewing the experiment results, it’s not possible to apply changes to a running experiment or to an experiment that has ended. As an alternative, you can duplicate such experiments by clicking the “Duplicate” action.
Experiments affect only new customers that are created while the experiment is running. Existing customers are not affected.
Only one experiment can run at any given point in time.
When an experiment is running, you can still make additional changes to your product’s pricing; however, these changes can only be published when no experiment is running.
A notification bar with details about the running experiment will appear throughout the duration of the experiment, indicating that a pricing experiment is in progress.
An indication of a customer’s participation in an experiment is provided under the “Experiment participation” section of the customer details screen. The indication includes details about which experiments the customer participated in and which variation was assigned to the customer.
When a subscription is created as part of an experiment, it has an indication of whether it was created as part of an experiment that’s currently running or one that has since ended. The indication appears under the “Subscriptions” section of the customer details screen, as well as on the subscription details screen.
Stigg provides visibility into the experiment results of each variation, both while the experiment is running and after it has ended, specifically:
The total number of subscriptions that were created in each variation group while the experiment was running
The number of paid subscriptions that were created in each variation while the experiment was running. When a customer has more than one paid subscription, it’s counted only once in order to represent the first conversion event.
The conversion rate (%) - the number of paid subscriptions divided by the total number of subscriptions.
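For example, if 200 subscriptions were created in a variation group and 30 of them were paid, the conversion rate for that variation is 30 / 200 = 15%.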
In order to analyze the experiment results using third-party applications (such as Mixpanel and Amplitude), Stigg will need to be integrated with these solutions. Such an integration is possible using the Stigg API, SDKs, and webhooks. To allow analysis of the results, every customer and subscription object that’s returned by the Stigg platform includes details about the experiment that these entities relate to, specifically:
Experiment name
Experiment ID
Variation group name
When an experiment is running, these properties will be populated with the relevant information. As part of the integration with Stigg, make sure to propagate these properties to those applications. For example, when reporting user events to Mixpanel, report the experiment information on each event, as shown in the sketch below.
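Below is a minimal sketch of this pattern using the Stigg Node.js server SDK and the Mixpanel Node.js client. The exact shape of the experiment details on the customer object (the `experimentInfo` field and its `name`, `id`, and `groupName` properties here) is an illustrative assumption; refer to the Stigg API reference for the actual property names.

```typescript
import Stigg from '@stigg/node-server-sdk';
import Mixpanel from 'mixpanel';

// Initialize the Stigg server SDK and the Mixpanel client.
const stigg = Stigg.initialize({ apiKey: process.env.STIGG_SERVER_API_KEY! });
const mixpanel = Mixpanel.init(process.env.MIXPANEL_TOKEN!);

// Report a user event to Mixpanel, enriched with the customer's
// experiment details as returned by Stigg.
async function trackEvent(customerId: string, eventName: string): Promise<void> {
  const customer = await stigg.getCustomer(customerId);

  // NOTE: the property names below (experimentInfo, name, id, groupName)
  // are illustrative assumptions - consult the Stigg API reference for
  // the actual shape of the experiment details on the customer object.
  const experimentInfo = (customer as any).experimentInfo;

  mixpanel.track(eventName, {
    distinct_id: customerId,
    // Only attach experiment properties when the customer participated
    // in an experiment.
    ...(experimentInfo && {
      experiment_name: experimentInfo.name,
      experiment_id: experimentInfo.id,
      variation_group: experimentInfo.groupName,
    }),
  });
}
```

With these properties attached, events in Mixpanel can be segmented by experiment name and variation group to compare conversion behavior between variations.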
Applying the results to the rest of your customer base
Once the analysis of the experiment results has concluded and the winning variation has been determined, navigate to the details page of the relevant product and apply the same settings as the winning variation.
Learn more about applying changes to the customer journey settings