A/B Testing: Optimizely vs. Google Optimize
Optimizely and Google Optimize are tools that help you develop variations for your A/B testing experiments. In my previous article I wrote briefly about all the phases of conversion rate optimisation and about A/B testing as a method, so if you need some background, check it out here.
When you create an experiment for the first time, Optimizely and Google Optimize come on the scene after you have generated a hypothesis about what could be changed on your web page to produce a higher conversion rate. They are tools for developing those variations, running the experiment and collecting data about user activity on both the control and the variation. But as you run experiments, you can also feed the results back into the first phase of CRO, data collection (e.g. through segmentation of results): a well-designed experiment, even one whose variation failed, can teach you a lot about user behaviour, help you spot problems, and lead to new hypotheses and new experiments.
Whether you own a business or you are a CRO expert or developer looking for the right tool for your client, I will try to cover the pros and cons of Google Optimize and Optimizely. In a way, this comparison might not seem fair, since I am comparing a free tool (Google Optimize) with the premium Optimizely, but the article should help you get an idea of what both tools provide and which one is suitable, or suitable enough, for your business and experiments.
I will make a comparison in terms of the following steps in the process of creating variations and running an experiment:
- Setting up
- Developing variations
- Defining and tracking conversion goals (clicks, page views, sign-ups etc.)
- Defining the audience
- Testing and running the experiment
Setting up and integration
Implementing Optimizely requires inserting the Optimizely code snippet in the head of your HTML file, and you are ready to go.
To install Google Optimize, you first need a Google Analytics account, and you add the Optimize snippet alongside your GA tags. You then have to link your Google Optimize project to the one in Google Analytics and install the Chrome extension to start developing variations.
|Feature|Optimizely|Google Optimize|
|---|---|---|
|Third-party integration with heat mapping technology|Yes|No|
|Integration with Google Analytics|Third-party|Native|
Let’s get back to simple changes for now. Both Optimizely and Google Optimize provide a visual editor: the page you want to run an experiment on is loaded into the editor, and simply by clicking elements like buttons, text and headlines you can make the desired changes. You can see in the screenshots below what type of changes you can easily make without writing a single line of code. The editors’ options are quite similar, and there is no big difference between them in this respect.
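For changes beyond what the editors offer, you end up writing the equivalent by hand. As a rough sketch (the selector, text and colour below are made up for illustration), a button text and colour change amounts to something like this:

```javascript
// Illustrative only: the kind of change a visual editor generates for you,
// written as hand-rolled variation code. In the browser you would pass in
// document.querySelector('.cta-button') (a made-up selector).
function applyVariation(button) {
  button.textContent = 'Start your free trial';  // text change
  button.style.backgroundColor = '#2ecc71';      // colour change
  return button;
}
```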
Here are requests you might stumble upon when developing new variations:
— you want to change only an element itself (its colour, text etc.)
— you want to make changes across the whole variation (reorder elements on the page, insert content before/after some element etc.)
— you want to make changes at the experiment level (run custom code that fires on both control and variation)
— you want to run code on the entire project (in each experiment)
Optimizely deals with all of these situations. It has a separate editor for each scenario and takes care of code execution timing: project JS is executed first, then experiment JS, and finally variation JS.
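That ordering guarantee can be pictured with plain functions (the function names here are mine, not Optimizely's):

```javascript
// Illustrative only: Optimizely's execution order for custom code,
// modelled as three plain functions appending to a log.
const executionLog = [];
const projectJS    = () => executionLog.push('project');    // runs first
const experimentJS = () => executionLog.push('experiment'); // runs second
const variationJS  = () => executionLog.push('variation');  // runs last

projectJS();
experimentJS();
variationJS();
```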
Google Optimize, on the other hand, deals well only with the first one. Code that relates to an element itself, like a colour change, a text change or adding custom styles, can be added in a simple way, but everything else requires workarounds. To run code on an entire variation, one solution is to target the <head> element and write a script there. To run the same code in several variations at once you have to repeat yourself, adding it to each variation’s head.
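A minimal sketch of that workaround, assuming you use the HTML editor on the <head> element (the helper name and the wrapped code are placeholders of mine):

```javascript
// Hypothetical helper: wrap variation-level code in a <script> tag string
// that you would paste into the <head> element via the HTML editor.
function wrapAsHeadScript(jsCode) {
  return `<script>${jsCode}</script>`;
}
```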
Optimizely also comes bundled with jQuery, but if you want to use the version already on your site, or not use jQuery at all, you can simply uncheck the option on the Settings page.
Setting up conversion goals
To actually track visitors’ interaction with both the control and your variation, you have to set up conversion goal tracking. Goals can be clicks on call-to-action buttons, landing on a specific page, or custom goals like completing the first step of a form.
Optimizely offers easy set-up of conversion goal tracking for:
— page views
— clicks
— custom goals
A custom goal is a goal triggered by code, but page-view and click goals are something a user can set up within a minute without writing any code.
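For the code-triggered case, Optimizely's web snippet exposes a command queue you push events into; a custom goal fires roughly like this (the event name is a made-up example):

```javascript
// Fall back to a plain object so this sketch also runs outside a browser;
// on a real page, `window.optimizely` is picked up by the Optimizely snippet.
const win = typeof window !== 'undefined' ? window : {};
win.optimizely = win.optimizely || [];

// Queue a custom event; 'signup_step_1_completed' is a hypothetical name.
win.optimizely.push({ type: 'event', eventName: 'signup_step_1_completed' });
```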
Conversion goals in Google Optimize are called objectives. You can set up:
— an objective based on a goal that is already defined in Google Analytics
— bounces (the number of single-page visits)
— session duration
— a custom objective (defining an event, as in Google Analytics)
Optimizely allows you to have as many goals as you want; Google Optimize has a limitation of 3 goals per experiment.
Defining the audience
After segmenting your results you will often need to run experiments targeting only a specific group of people (desktop visitors only, users coming from a UTM campaign, users in a certain location etc.), so for your A/B experiment you will need to define the audience. You can see from the lists below that the audience options are quite similar in both tools. Optimize 360 (the premium Google Optimize version) has a nice feature: the ability to build an audience based on GA data, meaning you can target a more specific group of people. For example, you can target only visitors that have purchased some item in the past, which is convenient if your experiment calls for it, since in Optimizely the solution won’t be so straightforward.
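Under the hood, an audience condition like "came from a given UTM campaign" boils down to a check like the one below (a hand-written sketch of mine; both tools let you configure this without code):

```javascript
// Illustrative only: matching a visitor's landing URL against a
// UTM-campaign audience condition.
function matchesUtmCampaign(pageUrl, campaign) {
  return new URL(pageUrl).searchParams.get('utm_campaign') === campaign;
}
```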
Testing and running the experiment
Before setting the experiment live, it is important to test it so you can be sure that everything works as expected. If your experiment runs across different devices, make sure it looks and works well on all screen sizes. It is also important to test whether goals/objectives fire correctly.
QA using Preview mode
Both tools provide a preview mode and give you the option to share a preview with other people involved in the experiment. Preview mode helps you test variations across browsers and devices; Optimizely’s preview mode also logs goals fired during testing. Preview mode won’t be enough, though, if you want to share the experiment with people who don’t have access to the Optimizely or Google Optimize project.
QA using a test cookie
You can test your experiment by setting a test cookie and targeting only an audience that has the cookie you defined as the test cookie. That way the experiment is actually running, so you can check whether goals fire, whether results are logged the way you expected and, of course, whether it looks the way it should.
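Setting the test cookie is a one-liner in the browser console; a small helper to build the cookie string (the cookie name and value here are made up):

```javascript
// Hypothetical helper: build the cookie string for QA audience targeting.
// In the browser: document.cookie = buildTestCookie('qa_test', 'true', 3600);
function buildTestCookie(name, value, maxAgeSeconds) {
  return `${name}=${encodeURIComponent(value)}; path=/; max-age=${maxAgeSeconds}`;
}
```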
Optimizely even gives you the option to force a variation: just add the query parameter optimizely_x=VARIATIONID to your URL and you will be bucketed into the desired variation. This way you don’t have to keep reopening an incognito window and waiting to be bucketed into the variation(s) you want to test. Google Optimize, unfortunately, doesn’t have this option.
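Building the forced-variation URL can be scripted; the helper below is my own sketch and the variation ID is a placeholder:

```javascript
// Append the optimizely_x query parameter to force a given variation.
function forceVariationUrl(pageUrl, variationId) {
  const url = new URL(pageUrl);
  url.searchParams.set('optimizely_x', String(variationId));
  return url.toString();
}
```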
In Optimizely, after you have tested the experiment, you can pause the test, reset the results, remove the test-cookie audience and set the experiment live, i.e. show it to “real” users.
After you pause an experiment in Google Optimize there is no option to reset the results and start it again. Once started, the experiment can no longer be edited, so if you spot bugs during QA you can’t pause the test and fix them in place: you need to copy the experiment, fix the issues and start it again. If you are testing the firing of goals (objectives), the whole process takes longer than it should, because results are generated with a delay of a few hours, so you can’t tell immediately whether your experiment works. After a few hours, when your testing activity is logged, you might see that it is not working as expected. Then you copy the experiment, fix what you assume or know is causing the problem, run it again with the test-cookie audience and wait a few more hours until new results are generated. If everything works now, you copy the experiment once more and set it live.
Optimizely and Google Optimize take different approaches to counting conversions. Google’s reports are session-based, meaning the same visitor may be counted twice if they start two different sessions. Optimizely, on the other hand, is unique-visitor-based, so it won’t count the same visitor and conversion twice. They also differ in the statistical method used to determine the winning variation: Google Optimize uses Bayesian methods, while Optimizely uses a frequentist approach. In my previous article I briefly described these two methods and linked some useful articles if you want to dig deeper into the subject.
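The counting difference is easy to see with made-up log records:

```javascript
// Illustrative only: the same three converting sessions, counted both ways.
const conversions = [
  { visitorId: 'v1', sessionId: 's1' },
  { visitorId: 'v1', sessionId: 's2' }, // same visitor, a second session
  { visitorId: 'v2', sessionId: 's3' },
];

// Session-based counting (Google Optimize style): every session counts.
const sessionConversions = new Set(conversions.map(c => c.sessionId)).size;

// Visitor-based counting (Optimizely style): each visitor counts once.
const visitorConversions = new Set(conversions.map(c => c.visitorId)).size;
```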
Reporting in Optimizely is almost live: you can see visitors’ conversions immediately or with a few minutes’ delay. Google Optimize, on the other hand, has a big delay of up to 12 hours, which can be very frustrating when you are testing experiments before setting them live, since you may not be able to see the results on the same workday if something needs fixing.
As I mentioned in the introduction, I didn’t want this article to be about which tool is better or which one you should go with. Different projects have different requirements. Google Optimize being completely tied to Google Analytics is both an advantage and a disadvantage. If you are already using GA, or planning to, and your experiments won’t reach really complex phases, then Google Optimize should be a very good solution, considering it’s a free tool with a fair amount of features and options. Optimizely is a more professional tool and will serve you well through all phases of creating an experiment.
There is another very popular tool on the A/B testing scene, VWO, and we have been testing it out too, so you can expect another comparison from us in the future.