However, many teams haven’t evaluated new software in several years, so Optimizely’s Strategy & Value Advisory team put together this vendor-agnostic guide to help you independently evaluate tools that fit your needs.
We already power A/B testing for thousands of companies, and to help you get started, we have a special offer for Google Optimize customers.
Four tips for finding the right fit for a potential replacement
Here’s how to map out your requirements when looking for a new A/B testing vendor.
1. Start early.
Most companies need 2-6 months to evaluate a new piece of software, plus an additional 2 months for implementation.
- Gradually migrate personalization campaigns and tests, keeping at least a 1-month overlap in the new tool before Google Optimize (GO) turns off.
- Ensure all personalization campaigns and A/B tests are live in your new tool well before the shutdown. No wonder CRO experts are conservative here, treating July 31 as your true deadline for turning off Google Optimize.
- Plan for 3-4 weeks for technical implementation, A/A tests, key integrations, and data validation. We can move faster, but teams typically need that time (see the A/A sanity-check sketch after this list).
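To make the A/A step concrete, here is a minimal sketch in Python of the kind of sanity check teams run before trusting a new tool: serve the identical experience to both arms and confirm the numbers don’t show a spurious “winner.” The conversion counts are hypothetical, and this is a classical two-proportion z-test for illustration only, not any vendor’s built-in statistics.

```python
# Minimal A/A sanity check: split traffic 50/50 with the *same* experience on
# both sides and confirm there is no "significant" difference.
# All conversion numbers below are hypothetical placeholders.
from math import sqrt, erfc

def two_proportion_p_value(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for a classical two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    return erfc(abs(z) / sqrt(2))  # equals 2 * (1 - Phi(|z|))

# Hypothetical A/A results pulled from both arms of the new tool.
p = two_proportion_p_value(conv_a=412, n_a=10_000, conv_b=398, n_b=10_000)
print(f"A/A p-value: {p:.3f}")
```

A large p-value is what you want here; a very small one on an A/A test usually points to a tracking, targeting, or snippet-installation issue worth fixing before any real experiments launch.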
2. Dream big.
Changing tools should unlock digital potential that was previously out of reach for your business. Know your must-have requirements, but also understand what could be better. Consider Optimizely’s key improvements over Google Optimize, then check whether you have answers to these questions:
- What if you reached statistical significance quicker with the industry’s best Stats Engine and delivered more relevant experiences? (See the sample-size sketch after this list.)
- What if you built an inclusive culture of experimentation that fosters team learning and collaboration?
- What if you leveraged products that speak to each other and integrated Optimizely seamlessly with best-of-breed tech platforms?
- What if you needed less developer or data science time to build or understand an experiment?
- What if you had optimal performance and page load times (which also affect your Google search ranking)?
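To put the “reach stat sig quicker” question in perspective, here is a rough, classical fixed-horizon sample-size estimate in Python. The baseline rate and minimum detectable lift are hypothetical, and Optimizely’s Stats Engine uses sequential statistics rather than this fixed-horizon formula; the sketch only shows why time-to-significance is worth asking about.

```python
# Rough fixed-horizon sample-size estimate per variant for a conversion-rate test.
# Baseline rate and minimum detectable lift below are hypothetical examples.
from math import ceil

Z_ALPHA = 1.96   # two-sided alpha = 0.05
Z_BETA = 0.84    # power = 0.80

def sample_size_per_variant(baseline: float, relative_lift: float) -> int:
    p1 = baseline
    p2 = baseline * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((Z_ALPHA + Z_BETA) ** 2 * variance) / (p1 - p2) ** 2
    return ceil(n)

n = sample_size_per_variant(baseline=0.04, relative_lift=0.10)  # 4% baseline, +10% relative lift
print(f"~{n:,} visitors per variant")
```

Dividing the result by your daily traffic per variant gives a ballpark test duration, which is the number worth comparing across tools and statistical approaches.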
3. Don’t underestimate internal approvals.
We have seen a sea change in the past six months: due to economic uncertainty, Finance and Procurement teams have added extra scrutiny to every purchase, whether for software, headcount, or other tools for growth. Even very flat organizations and fast-moving start-ups have found their decision-making delayed.
- Notify Finance, Procurement, and IT teams early of your need to implement a new tool and ask for their support.
- Any investment needs a structured plan presented internally for WHY you need it and HOW you will ensure you see ROI. (Ask your vendor for help!)
- Show why experimentation and personalization are essential for your business.
- Stakeholders may be on vacation, and you may need extra time for review. Plan for their approvals to take 1-2 months; if it’s faster, happy days!
- You can always tell the vendor that you want your contract to start in June or July (see tip #1; starting later than July could put you at risk of interrupted service).
4. Consult experts.
Read reports from analysts and CRO experts, not just the vendors themselves. Ask vendors for reference customers you can call about their experience.
- Check out Forrester’s latest Total Economic Impact™ (TEI™) study of Optimizely’s Experimentation Platform.
- GA4 will have some basic traffic-splitting capabilities, as Google Ads already does, but Google has specifically stated that it will not focus on the “features and services that our customers request and need for experimentation testing.” If you’re just getting started, GA4 may provide enough for your needs, but for any company already running tests, there’s a high risk it will not.
Maybe your company can move faster than the timeline below, but it’s better to build in a buffer. The biggest risk is that you can’t get a tool in place in time and your personalization campaigns, A/B tests, and MVT experiments go dark.
Example Timeline for Software Evaluation
| Date | Milestone |
| --- | --- |
| 1 February | Reach out to 4-5 vendors to start the process and schedule the first calls and demos. There is zero commitment at this stage; it’s purely research-gathering. Define your requirements, including: who from your team will use the tool, which types of tests or campaigns are essential, which integrations you need, and how you will manage collaboration across teams. |
| 1 March | Based on which tools meet your requirements, narrow your list to 2-3 choices. Then go deeper into your specific use cases: which tests or campaigns are most important to you, whether the new tool is easy to use for those use cases, and where you struggled with Google Optimize and would like to see improvement. |
| 15 April | Finalize your choice of preferred vendor and start circulating the contract and legal documents for review (legal review alone can take 1-2 months). Ensure engineering resources are allocated for implementation in July-August; most engineering teams operate in sprint cycles, so make sure this is on their roadmap and they understand the need for prioritization. |
| 1 May | Present the recommended vendor to your leadership and stakeholders, along with how you will achieve ROI; your selected vendor should be able to help you quantify the forecast value specific to your company. |
| 31 May | Finalize approvals from Legal, Procurement, and Finance. |
| 15 June | Sign the new contract; start provisioning and access to the new tool. |
| July & August | Implement the new tool: install the JavaScript snippet, run A/A tests, validate data, set up the GA4 integration, etc. At Optimizely, we can move as fast as your teams are able, but customers typically need at least 3-4 weeks to coordinate schedules with all their internal teams and get set up. Given the summer holidays in July and August, we recommend starting implementation by July 1 at the latest. (See the data-validation sketch below.) |
| 1 September | Keep this as the latest date to go live with the new tool and transfer experiments. Wrap up the final experiments and campaigns running in GO over the next 4 weeks. |
| 30 September | Google Optimize is shut down. Time to go fully live with the new tool. |
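To make the data-validation step in the July & August row concrete, here is a hedged Python sketch that compares daily conversion counts reported by the outgoing and incoming tools during the overlap period and flags days that diverge beyond a tolerance. The counts, dates, and 5% threshold are hypothetical assumptions, not a prescribed process.

```python
# Compare daily conversion counts reported by the old and new tools during the
# overlap period and flag days where they diverge beyond a tolerance.
# All counts, dates, and the 5% tolerance are hypothetical placeholders.
TOLERANCE = 0.05  # flag days with more than 5% relative difference

old_tool = {"2023-08-01": 512, "2023-08-02": 498, "2023-08-03": 530}
new_tool = {"2023-08-01": 509, "2023-08-02": 455, "2023-08-03": 527}

for day, old_count in sorted(old_tool.items()):
    new_count = new_tool.get(day, 0)
    diff = abs(new_count - old_count) / old_count
    status = "OK" if diff <= TOLERANCE else "CHECK"  # large gaps may indicate snippet or integration issues
    print(f"{day}: old={old_count} new={new_count} diff={diff:.1%} -> {status}")
```

Small day-to-day gaps are normal because tools count visitors and sessions differently; it’s the large or systematic discrepancies that should be resolved before Google Optimize goes dark.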