Now that we’ve discussed the important textual and visual content that you need to incorporate in the app you’re building, we’ll move to another important step – Conversion Optimization.
Step 3: Optimizing Your Conversion Rate and A/B Testing
What is CRO?
CRO stands for Conversion Rate Optimization: the process of increasing the chances that your target users download your app.
A conversion rate is the percentage of users who complete an action you want them to take. In this case, that action is downloading your app.
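As a quick illustration, a conversion rate is just completed actions divided by opportunities. Here's a minimal sketch (the function name and numbers are made up for the example):

```python
def conversion_rate(downloads: int, page_views: int) -> float:
    """Share of store-page visitors who went on to download the app."""
    if page_views == 0:
        return 0.0
    return downloads / page_views

# Example: 300 downloads out of 10,000 store page views
rate = conversion_rate(300, 10_000)
print(f"{rate:.1%}")  # prints "3.0%"
```

Everything you do in this step is aimed at pushing that number up.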
When we optimize, we look not only at the number of downloads but also at user retention, the percentage of users who continue to use your app. To make this happen, you also need user ratings.
Reviews are a key indicator of your app’s performance and have a direct impact on download rates. People trust other people. So if a lot of people are excited about your app, more are likely to download it.
A four-star app is more likely to be downloaded than a three-star one. And if people write about how good your app is compared to others, that builds user confidence and directly impacts downloads.
App size also affects the decision to download. If your app is larger than 100MB, chances are most users won't download it unless they're on Wi-Fi or have a large or unlimited data plan.
A/B Testing: A Must for Improvement and Sustainability
A/B Testing is also known as bucket testing or split run testing. It’s when you promote two versions of your app simultaneously to determine which of them resonates best with your target audience.
This is important because it tells you exactly what works and what doesn't, which guides your app update process and saves you time and money. In this particular instance, you should test design elements, app naming style, and keyword arrangement.
We advise running your A/B test for at least 5-7 days; that provides enough traffic to make a sound judgment. If you want to dig into the finer details, create a daily, weekly, or even monthly plan, testing each design change one by one. Depending on the complexity of your app, though, that can prove to be a waste of time. It's usually best to group your changes so you get results in a more reasonable timeframe.
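Once the test window closes, "making a judgment" means checking whether the gap between the two variants is real or just noise. Below is a minimal sketch of a two-proportion z-test using only the standard library; the variant labels and numbers are hypothetical, and most A/B testing tools run an equivalent check for you:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Z-score for the difference between two observed conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both variants convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Icon A: 300 downloads from 10,000 views; Icon B: 360 from 10,000
z = two_proportion_z(300, 10_000, 360, 10_000)
significant = abs(z) > 1.96  # roughly the 95% confidence threshold
```

If `significant` is false, the honest conclusion is "no winner yet": either run the test longer or accept that the change didn't move the needle.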
One important note: make sure you thoroughly test which icon works best. The icon is the first thing people see, and from it they have to decipher what your app is all about. Spare no effort to get this right. Your screenshots and videos only serve as secondary support.
A/B testing can be done directly in the developer's console in Google Play using its built-in store listing experiments (often referred to as Google Play Experiments). For iOS, you need a third-party tool. Here are some:
These are very powerful tools for automating your A/B testing. Some of them also support Android, so you can keep all your testing in one place if you prefer.
A/B Testing Pitfalls:
Let’s take a look at what you need to avoid if you want to run a successful A/B test:
1.) Unclear hypotheses – You must have a clear idea of what you're looking for. Have a question that the test will answer; without one, you may end up with a test that adds no value to your app. Settle this in your mind before you design the test, and do your due diligence in research and planning.
2.) Relying too much on Google Play experiment results – Run A/B tests across multiple platforms as well. They can surface insights that the Play Console's experiments alone won't give you.
3.) Using the same creatives over and over – Don't be lazy. Test different designs. This is what A/B testing is all about. Test, test, test!
Now you're ready to run your first A/B test. Do that, and then move on to Part 4, where we'll look at 'Scaling the Analytics.'