# Experiments
## A/B Testing
A/B tests are specialized statistical tests that compare different variations to determine which one performs better. They can be applied to almost anything, since they aim to answer the question "which one is better?"
In our case, the following are examples of questions an A/B test can answer:
- Should I price my product at 10₺ or 8₺?
- Should I call my product "Lite" or "Premium"?
- Should I describe my product as "life changing" or "valuable"?
A great way to answer any of these questions is to get feedback from users. A/B testing does this by splitting users into two groups (control and variant) and exposing each group to a different product. Drawing from our previous example, an A/B test could be:
- Group A sees the Product priced at 10₺
- Group B sees the Product priced at 8₺
At the end of the test, we look at how many people bought in each variant; the option with more conversions is assumed to be the better one.
Suppose Group B had a better conversion rate than Group A. The app owner can then decide to move forward with option B (8₺).
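Appmate assigns the groups for you, but a small sketch can make the idea concrete. The function below is purely illustrative and not part of the Appmate SDK: it deterministically maps a user ID to the control or variant group for a given split rate.

```kotlin
// Illustrative only: not part of the Appmate SDK. A deterministic split maps each
// user to a stable bucket in 0..99; the first `controlPercentage` buckets form
// Group A (control) and the rest form Group B (variant).
fun assignGroup(userId: String, controlPercentage: Int = 50): String {
    val bucket = Math.floorMod(userId.hashCode(), 100)
    return if (bucket < controlPercentage) "A (control, 10₺)" else "B (variant, 8₺)"
}

fun main() {
    println(assignGroup("user-42"))  // the same user always lands in the same group
}
```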
## What can you do with Appmate A/B testing?
- You can test subscription prices and measure how they affect your revenue and lifetime value.
- You can determine winning in-app purchase and subscription prices with little effort.
- You can test different trial periods and promotional offers to increase your conversion rate.
- You can make informed decisions based on renewals.
- You can find the best pricing and paywall variations.
- You can test hypotheses without additional software development.
## Create an A/B Test
You can create an experiment by defining a start date, end date, audience, products, and split rate.
Field | Description | Type |
---|---|---|
Experiment ID | A unique identifier for the test. Once created, it cannot be changed. | String |
Active | Controls whether the test keeps running. If it is disabled while the test is in RUNNING status, the status becomes PAUSE and data collection is turned off; no data is added to the test report until it is enabled again. | Boolean |
Experiment Name | The name of the A/B test. It cannot be changed while the test is RUNNING; it can only be updated in WFSD status. | String |
Status | Indicates the test status. See the A/B Testing Status section for details. | String |
Start Date | Indicates when the test will start. The earliest selectable date is one day later. | Date |
End Date | Indicates when the test will end. The earliest selectable date is one day later. | Date |
Audience | The audience of the A/B test. See the Audience Management section for details. | Audience |
A/B Split Rate | Indicates what percentage of users in the selected audience will see the Group A (control) product; the rest see the Group B (variant) product. It takes values between 5% and 95%. | Rate |
Group A Product (Control) | The product you want to keep in the control group. | Product |
Group B Product (Variant) | The product you want to compare against the control group. You can set this product's isVisible property to false so that it is shown only to the relevant percentage of the selected audience; other users will not see it. The product chosen for Group A (Control) cannot be selected here. | Product |
You can create an A/B test as shown below.

## Update A/B Test
You can make changes to an A/B test you have created, but with some restrictions.
You can update all parameters of a test in WFSD (Waiting for Start Date) status except the Experiment ID.
When an experiment is RUNNING, you can only change its active status. You cannot change the dates, products, name, ID, or split rate.

For a test that has passed the start date and is not in FINISHED status, only the "active" status can be changed.

You cannot update any field of an experiment in FINISHED status.

## Technical Requirements of an A/B Test
A/B tests rely on events to measure conversion rates. This section explains how events can be passed to Appmate.
note
Best practice for implementing events is to avoid any static declarations.
A dynamic implementation is required to allow no-code experimentation capabilities.
Appmate needs to receive user events to calculate A/B test results. Two events are required: UserEventType.VIEW and UserEventType.PURCHASE.
You should send these two events to Appmate via the setAppmateEvent() API. You can find detailed information in the Set Appmate Events section; a short sketch follows the table below.
Event | Description |
---|---|
UserEventType.VIEW | The event to be sent when the product is displayed. |
UserEventType.PURCHASE | The event to be sent when the product is purchased. |
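As a rough sketch, the two calls could look like the Kotlin snippet below. The exact parameters of setAppmateEvent() are documented in the Set Appmate Events section; the product identifier used here is made up for illustration.

```kotlin
// Sketch only: check the Set Appmate Events section for the exact setAppmateEvent()
// signature. "premium_monthly" is a hypothetical product id.

// Send VIEW when the product (or the paywall showing it) is displayed:
PurchaseClient.instance.setAppmateEvent(UserEventType.VIEW, "premium_monthly")

// Send PURCHASE once the user has bought that product:
PurchaseClient.instance.setAppmateEvent(UserEventType.PURCHASE, "premium_monthly")
```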
## A/B Testing List
You can see all the A/B tests you have created. From this list you can monitor each test's status, update a test, or view its report.

### Search
You can filter by Experiment Name, Status, and Start Date.
### Refresh
You can refresh the page to see the current status of each experiment.
### View Report
Takes you to the Experiment Report for the selected test.
### View Test Detail
Takes you to the Update A/B Test page for the selected test.
## A/B Testing Status
The table below describes the possible status values of an A/B test.
Status | Description |
---|---|
WFSD | Waiting for start date. The status from the time you create the A/B test until the start date. |
RUNNING | The status while the A/B test is running. |
INACTIVE | The status when the "active" switch is set to false. |
PAUSE | If the "active" switch is disabled while the test is in RUNNING status, the test is paused. During the pause no data is collected, and the report is empty for this interval. |
FINISHED | The status value after the end date. |
note
Once a test has entered RUNNING status, only the "active" switch can be updated. In WFSD status, you can edit everything except the Experiment ID.
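The rules above can be summarized in a small sketch. This is only an illustration of how the status relates to the dates and the "active" switch (treating a test disabled before its start date as INACTIVE is an interpretation of the table); Appmate determines the actual status itself.

```kotlin
import java.time.LocalDate

// Illustrative only: Appmate computes the real status server-side.
fun experimentStatus(
    active: Boolean,
    startDate: LocalDate,
    endDate: LocalDate,
    today: LocalDate = LocalDate.now()
): String = when {
    today > endDate   -> "FINISHED"                          // past the end date
    today < startDate -> if (active) "WFSD" else "INACTIVE"  // not started yet
    else              -> if (active) "RUNNING" else "PAUSE"  // between start and end
}
```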
## You Can Create an Audience for Your A/B Test
You can determine which users the A/B test will run on and which audiences will participate in it. If you do not select any audience, the test includes all users. You can find detailed information about creating an audience in the Audience Management section.
## Sample Audience-Product Scenarios
Some test product selection scenarios are described in the table below for illustrative purposes; a simplified sketch of the underlying rule follows the table.
Product A Audience | Product A isVisible | Product B Audience | Product B isVisible | A/B Test Audience | Behavior |
---|---|---|---|---|---|
Audience 1 | True | Audience 1 | False | Audience 2 | Audience 1 sees Product A. Audience 2 sees Product A or B. If users from Audience 1 are also in Audience 2, they see only Product A or Product B; the holdout product is therefore removed in the other steps of product list generation. |
Audience 1 | True | Audience 2 | True | Audience 3 | Audience 1 sees Product A. Audience 2 sees Product B. Audience 3 sees Product A or B. If users from Audience 1 or Audience 2 are also in Audience 3, they see only Product A or Product B because of the A/B test. |
Audience 1 | False | Audience 2 | False | Audience 3 | Audience 1 cannot see Product A. Audience 2 cannot see Product B. Audience 3 sees Product A or B. |
Audience 1 | True | All users | False | Audience 1 | Audience 1 sees Product A or B, depending on whether each user falls into the A/B test's Group A or Group B list. |
All users | True | All users | False | Audience 1 | Audience 1 sees Product A or B. All remaining users who are not in Audience 1 see Product A. |
All users | True | All users | False | All users | All users see Product A or B because of the A/B test. |
All users | False | All users | False | Audience 1 | Audience 1 sees Product A or B. |
Audience 1 | True | Audience 2 | True | All users | All users see Product A or B because of the A/B test. |
Audience 1 | True | All users | False | All users | All users see Product A or B because of the A/B test. |
Audience 1 | True | Audience 2 | True | Audience 1 | Audience 1 sees Product A or B. Audience 2 sees Product B. If users from Audience 1 are also in Audience 2 (or vice versa), they see only Product A or Product B. |
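The common pattern behind these scenarios can be sketched as follows. This is only a simplified illustration of the rules in the table (the types are invented for the example and are not Appmate SDK classes): users inside the test audience see exactly one of the two variants, while everyone else falls back to each product's own audience and isVisible setting.

```kotlin
// Simplified illustration of the table above; these types are NOT Appmate SDK classes.
data class SampleUser(val id: String, val audiences: Set<String>)
data class SampleProduct(val id: String, val audience: String?, val isVisible: Boolean)
data class SampleAbTest(
    val audience: String?,         // null represents "All users"
    val productA: SampleProduct,   // Group A (control)
    val productB: SampleProduct,   // Group B (variant)
    val groupAUserIds: Set<String> // users assigned to the control group
)

private fun SampleUser.isIn(audience: String?) = audience == null || audience in audiences

fun visibleProducts(user: SampleUser, test: SampleAbTest): List<SampleProduct> =
    if (user.isIn(test.audience)) {
        // Inside the test audience: exactly one variant is shown, the other is held out.
        listOf(if (user.id in test.groupAUserIds) test.productA else test.productB)
    } else {
        // Outside the test audience: fall back to each product's audience and isVisible flag.
        listOfNotNull(
            test.productA.takeIf { it.isVisible && user.isIn(it.audience) },
            test.productB.takeIf { it.isVisible && user.isIn(it.audience) }
        )
    }
```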
## Experiment Report
This is where you see the test results. The data collected during the test is shown as tables and charts, and the A/B test report gives you a final verdict.
note
The first report is generated one day after you create the test, and it is then updated daily at the end of each day.

The confidence interval helps you estimate how the results will generalize when the product is made available to your entire user base or selected audience: we would expect the conversion rate to fall within this interval.
You can decide which product is more advantageous based on the chi-square test result computed from this data. The Verdict value is the outcome of the statistical test and acts as a suggestion.
If the difference is not statistically significant, you may see a "no significant difference" verdict. This means that even though purchase counts and conversion rates may differ, neither alternative is demonstrably better than the other. A small illustrative calculation follows the table below.

Table Item | Description |
---|---|
Visits | Shows how many people viewed the product. |
Purchases | Shows how many of the viewers purchased the product. |
Conversion Rate | The share of product viewers who purchased the product. |
Confidence Interval | The range in which the conversion rate is expected to fall, given the sample size. |
Verdict | The chi-square test result: either "No significant difference" or the winning product. |
Test Confidence Level | 95% |
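To make the verdict concrete, the sketch below runs a plain 2x2 chi-square test on made-up view and purchase counts. Appmate computes this for you in the report; the example only illustrates the kind of calculation behind the Verdict and the 95% confidence level.

```kotlin
// Illustrative only: Appmate computes the report's verdict server-side.
// 2x2 chi-square statistic for "purchased vs. did not purchase" in groups A and B.
fun chiSquare(viewsA: Int, purchasesA: Int, viewsB: Int, purchasesB: Int): Double {
    val nonA = viewsA - purchasesA
    val nonB = viewsB - purchasesB
    val total = (viewsA + viewsB).toDouble()
    val purchases = (purchasesA + purchasesB).toDouble()
    val nonPurchases = (nonA + nonB).toDouble()
    // Expected counts under the null hypothesis of equal conversion rates.
    val expected = doubleArrayOf(
        viewsA * purchases / total, viewsA * nonPurchases / total,
        viewsB * purchases / total, viewsB * nonPurchases / total
    )
    val observed = doubleArrayOf(
        purchasesA.toDouble(), nonA.toDouble(), purchasesB.toDouble(), nonB.toDouble()
    )
    return observed.indices.sumOf { (observed[it] - expected[it]).let { d -> d * d } / expected[it] }
}

fun main() {
    // Made-up sample data: 5.0% vs 7.2% conversion.
    val stat = chiSquare(viewsA = 1000, purchasesA = 50, viewsB = 1000, purchasesB = 72)
    // 3.841 is the chi-square critical value for 1 degree of freedom at 95% confidence.
    val verdict = if (stat > 3.841) "significant difference" else "no significant difference"
    println("chi-square = %.2f -> %s".format(stat, verdict))
}
```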
## Offering
With offerings, you can control which products are shown to users without requiring an application update. Creating paywalls that are dynamic and responsive to different product configurations gives you maximum flexibility to perform remote updates.

Paywalls have a few typical characteristics:
- They describe the benefits of a paid service.
- They can act as a blocker in front of content.
- They consist of at least one product, though it is often preferable to show at least two alternatives.
- IAP purchases can be triggered directly from paywalls.
Appmate already includes all the relevant product information (content) needed to fill a paywall. Offerings add a grouping mechanism with additional metadata to bundle products into an offerwall / paywall. The app developer can then query a specific offer via the SDK and get all the details about the paywall and its products.
When end users are presented with a paywall, application administrators can also see reports for each offer.
## What can you do with offerings?
- Users can list offers.
- Users can create new offers or edit existing ones.
- Users can set an offer as the default.
- Users can add metadata to their offers.
- Users can edit, reorder, and deactivate offers.
- The SDK can fetch any offering by ID. If no ID is provided, the default offering is returned.
## Add Offering
Users can create an offering for the selected app via the Appmate console.


important
The first offer you create is automatically set as the default offering.
Field | Description |
---|---|
Offering Id | A unique offer ID. |
Description | A description of the offer. |
Status | Whether the offer is active or passive for the SDK. If you do not want to use an offer you have created, set it to passive. |
Metadata | Users can add metadata entries (see MetadataType) to their offer. |
Products | Products can be added and ordered within the offering. |
## Edit Offering
Created offers can be updated by clicking on the related offer.
important
The default offer cannot be deactivated. The Offering Id cannot be updated once created.

## List Offerings
You can see a list of all the offerings you have created in the console. You can set the default offering from this list and go to the create, edit, and reporting pages.

## Offering Report
Users can view a report for each offer. A date period and a currency must be selected for the offer report. The report includes a line chart with view and purchase counts for each date, along with the following metrics:
- Total Views
- Total Purchases
- Conversion Rate
- Total Revenue
- Product Based Purchase Counts
- Product Based Revenues

Field | Description |
---|---|
Date Period | Date range of the report |
Report Type | Specifies the y-axis of the report. Can be Views, Purchases, Revenue, or Conversion. |
Currency Type | One of the registered currency types. |
## View Select Offering
You can access offering information for a given offeringId through your application. If you send an empty offeringId, the default offering is returned.
### Request
Parameter Name | Type |
---|---|
offeringId | String? |
listener | ReceivedDataListener<Offerwall, GenericError> |
### Response
Parameter Name | Type |
---|---|
onSucceeded | Offerwall |
onFailed | GenericError |
Kotlin:

```kotlin
val selectOfferingId = "default_offer"
PurchaseClient.instance.getOfferwall(
    selectOfferingId,
    object : ReceivedDataListener<Offerwall, GenericError> {
        override fun onSucceeded(data: Offerwall) {
            Log.d("Success", data.toString())
        }

        override fun onError(error: GenericError) {
            Log.d("Error", error.errorMessage.toString())
        }
    })
```

Java:

```java
String selectOfferingId = "default_offer";
PurchaseClient.getInstance().getOfferwall(
    selectOfferingId,
    new ReceivedDataListener<Offerwall, GenericError>() {
        @Override
        public void onSucceeded(Offerwall data) {
            Log.d("Success", data.toString());
        }

        @Override
        public void onError(GenericError error) {
            Log.d("Error", error.getErrorMessage().toString());
        }
    });
```
### Offerwall
Parameter Name | Type |
---|---|
offeringId | String? |
description | String? |
metadata | List<MetaData>? |
products | List<Product>? |
defaultOffer | Boolean |
### Metadata
Parameter Name | Type |
---|---|
id | String? |
value | String? |
type | MetaDataType? |
numericValue | Double? |
### MetadataType
- Labels: Short strings that can be used for titles, bullet points, and similar fields in a paywall.
- Block of Text: Formatted text. For the MVP, only spacing and line breaks are considered.
- Number: Can be used for int or float values.
- Image: Users can upload PNG or JPG/JPEG images (max size: 5 MB). The image URL is returned to the SDK.
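As a rough sketch of how an offering's content might be consumed on the client, the snippet below iterates over the Offerwall fields described above. The field names follow the parameter tables; how you map each metadata entry to your layout (and the exact MetaDataType constant names) is left to your implementation.

```kotlin
// Sketch only: binds Offerwall content to a paywall using the fields listed above.
// The mapping of metadata ids to layout slots is application-specific.
fun renderPaywall(offerwall: Offerwall) {
    offerwall.metadata.orEmpty().forEach { meta ->
        // e.g. a title label, a body text block, or an image URL, keyed by meta.id
        Log.d("Paywall", "${meta.id} (${meta.type}): ${meta.value ?: meta.numericValue}")
    }
    offerwall.products.orEmpty().forEachIndexed { index, product ->
        // Products arrive in the order configured in the console.
        Log.d("Paywall", "product[$index] = $product")
    }
}
```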