Guides / A/B testing

A/B testing combines Relevance Tuning and Analytics:

  • Relevance tuning lets you give your users the best search results.
  • Analytics makes relevance tuning data-driven, ensuring that your configuration choices are sound and effective.

Relevance tuning, however, can be tricky. The choices are not always obvious. It’s sometimes hard to know which settings to focus on and what values to set them to. It’s also hard to know if what you’ve done is useful or not. What you need is input from your users, to test your changes live.

This is what A/B Testing does. It lets you create two alternative search experiences with unique settings, put them both live, and see which one performs best.


Advantages of A/B testing

With A/B testing, you run alternative indices or searches in parallel, capturing click and conversion events to compare effectiveness.

You make small, incremental changes to your main index or search and have those changes tested, live and transparently, by your users before making them official.

A/B testing goes directly to an essential source of information, your users, by including them in the decision-making process in the most reliable and least burdensome way.

These tests are widely used in the industry to measure the usability and effectiveness of a website. Algolia’s focus is on measuring search and relevance: are your users getting the best search results? Is your search effective in engaging and retaining your users? Is it leading to more clicks, more sales, more activity for your business?

Implementing A/B testing

Algolia A/B Testing was designed with simplicity in mind. This user-friendliness enables you to perform tests regularly. Assuming you send click and conversion events, A/B Testing doesn’t require any coding intervention. It can be managed from start to finish by people with no technical background.

Collect clicks and conversions

To perform A/B testing, you need to send click and conversion events: this is the only way of testing how each of your variants is performing. While A/B testing itself doesn’t require coding, sending clicks and conversions does.
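
As a sketch, here's what sending those events can look like with Algolia's search-insights library. The event names, index name, user token, and objectIDs below are placeholders, and the queryID comes from a search made with clickAnalytics enabled:

```ts
import aa from 'search-insights';

// Placeholder credentials.
aa('init', { appId: 'YOUR_APP_ID', apiKey: 'YOUR_SEARCH_API_KEY' });

// After a user clicks the third hit of a search. The queryID is returned
// by the search response when the query sets clickAnalytics: true.
aa('clickedObjectIDsAfterSearch', {
  userToken: 'user-42',          // placeholder user identifier
  index: 'products',             // placeholder index name
  eventName: 'Product Clicked',
  queryID: 'queryID-from-search-response',
  objectIDs: ['product-123'],
  positions: [3],
});

// After the same user buys the product.
aa('convertedObjectIDsAfterSearch', {
  userToken: 'user-42',
  index: 'products',
  eventName: 'Product Purchased',
  queryID: 'queryID-from-search-response',
  objectIDs: ['product-123'],
});
```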

Set up the index or query

Algolia offers two kinds of A/B tests: comparing different index settings, which uses two indices, and comparing different search settings, which uses a single index with different query parameters.

Run the A/B test

After creating or selecting your indices, you can start your A/B tests in two steps.

  1. Use the A/B Testing tab of the Algolia dashboard to create your test.
  2. Run the A/B test.

After letting your test run and collect analytics, you can review, interpret, and act on the results. You can then create new A/B tests, iteratively optimizing your search.
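
Both steps can also be scripted. Here's a minimal sketch using the v4 JavaScript API client's analytics methods; the test name, index names, traffic split, and end date are illustrative, not prescriptive:

```ts
import algoliasearch from 'algoliasearch';

// Placeholder admin credentials.
const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
const analytics = client.initAnalytics();

// Create and start an A/B test: 90% of traffic keeps the main index,
// 10% goes to a replica with different settings (illustrative names).
const { abTestID } = await analytics.addABTest({
  name: 'Test new ranking with number of likes',
  variants: [
    { index: 'products', trafficPercentage: 90 },            // control (A)
    { index: 'products_likes_desc', trafficPercentage: 10 }, // variant B
  ],
  // Let the test run for 30 days.
  endAt: new Date(Date.now() + 30 * 24 * 60 * 60 * 1000).toISOString(),
});
console.log('Created A/B test', abTestID);
```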

A/B testing in the Algolia dashboard

Open Search > A/B Testing in the dashboard. The first screen is the Overview page: it lists every test, its status, start date, and test duration. Select any test to open its details page. Here you’ll see metrics for each variant, and a View analytics menu that opens the Search Analytics tab for the test’s variants.
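
If you'd rather monitor tests programmatically, the same information is exposed through the analytics API. A small sketch, again assuming the v4 JavaScript client; field names follow the A/B test response object:

```ts
import algoliasearch from 'algoliasearch';

const analytics = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY').initAnalytics();

// List every A/B test with its status and dates, mirroring the Overview page.
const { abtests } = await analytics.getABTests();
for (const test of abtests ?? []) {
  console.log(test.name, test.status, test.createdAt, test.endAt);
}
```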

Example A/B tests

Algolia offers two kinds of A/B tests:

  • Comparing different index settings
  • Comparing different search settings

For index-based testing, you can test:

  • Your index settings
  • Your data format

For search-based testing, you can test any search-time setting (see the sketch after this list), including:

  • Typo tolerance
  • Rules
  • Optional filters
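
To illustrate, a variant could override settings like these at query time. A minimal sketch with the v4 JavaScript API client; the credentials, index name, query, and filter value are placeholders:

```ts
import algoliasearch from 'algoliasearch';

// Placeholder credentials and index name.
const index = algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_API_KEY').initIndex('products');

// Search-time settings of the kind a variant might override: stricter
// typo tolerance plus an optional filter that boosts (not excludes)
// matching records.
const { hits } = await index.search('iphone case', {
  typoTolerance: 'min',
  optionalFilters: ['brand:Apple'],
});
```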

Example: changing your index settings

Add a new custom ranking with the number_of_likes attribute

You’ve recently offered your users the ability to like your items, which include music, films, and blog posts. You’ve gathered “likes” data, and you’d like to use this information to sort your search results.

Before implementing such a big change, you can use A/B testing to make sure it improves your search.

  1. Create your A/B test indices: add a number_of_likes attribute to your main catalog index (this is the control variant in your A/B test), then create a replica of your main catalog index to use as variant B.
  2. Adjust the replica's settings so it ranks records by number_of_likes (see the sketch after this list).
  3. Name your test “Test new ranking with number of likes”.
  4. Set variant B’s traffic split to 10% to minimize user disruption while evaluating the impact of the new ranking. This helps ensure the change improves the experience before expanding it to more users.
  5. Use the sample size estimator to estimate the duration of your test, or set the duration to whatever is reasonable for your traffic patterns. More traffic generally allows for shorter test durations.
  6. When your test reaches confidence, check whether your change improves your search, and whether the improvement is large enough to justify the implementation costs.
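
Here's a rough sketch of steps 1 and 2 with the v4 JavaScript API client. The attribute name comes from this example; the credentials and index names are placeholders:

```ts
import algoliasearch from 'algoliasearch';

// Placeholder admin credentials.
const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
const main = client.initIndex('products');    // control variant (A)
const replicaName = 'products_likes_desc';    // variant B (placeholder name)

// Step 1: declare the replica on the main index.
await main.setSettings({ replicas: [replicaName] });

// Step 2: on the replica only, rank records by number_of_likes.
await client.initIndex(replicaName).setSettings({
  customRanking: ['desc(number_of_likes)'],
});
```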

Example: reformatting your data

Add a new search attribute: short_description

Your company has added a new short description to each of your records.

To see if adding the short description as a searchable attribute improves your relevance, here’s what you need to do:

  1. Configure your test index (you only need one index for this test): add a new searchable attribute, short_description, to your main index. Because you’re testing a search-time setting, the main index serves as both the control and variant B.
  2. Create an A/B test named “Testing the new short description” in the dashboard’s A/B Testing page.
  3. For variant B, click Add Query Parameter, click the Facets tab, then enable “Restricts a given query to look in only a subset of your searchable attributes”. Include all the searchable attributes you want to test, except short_description (the sketch after this list shows the equivalent search parameter).
  4. Direct 30% of traffic to variant B to limit the potential impact of the untested attribute and avoid degrading search performance.
  5. Use the sample size estimator to determine the test duration, or choose a duration based on your traffic patterns: higher traffic typically allows shorter tests.
  6. When the results reach confidence, weigh the performance improvement against the implementation cost to decide whether the change is beneficial.
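
For reference, the query parameter configured on variant B corresponds to restrictSearchableAttributes. A hedged sketch with the v4 JavaScript client; the index, query, and attribute names are illustrative:

```ts
import algoliasearch from 'algoliasearch';

const index = algoliasearch('YOUR_APP_ID', 'YOUR_SEARCH_API_KEY').initIndex('products');

// Variant B's behavior: search every tested attribute except
// short_description (attribute names are illustrative).
const { hits } = await index.search('wireless headphones', {
  restrictSearchableAttributes: ['title', 'brand', 'description'],
});
```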

Example: turning rules on and off to compare a query with and without merchandising

You can use A/B testing to check the effectiveness of your rules. This example compares searching with rules enabled against searching with rules turned off.

Your company has just received the new iPhone. You want this item to appear at the top of the list for all searches that contain “apple”, “iphone”, or “mobile”.

To use an A/B test to see whether putting the new iPhone at the top of your results encourages traffic and sales, here’s what you need to do:

  1. Ensure that rules are enabled for your main index (the test’s control variant).
  2. Create a rule for your main index that promotes your new iPhone record (see the sketch after this list).
  3. On the Algolia dashboard’s A/B Testing page, create a test named “Testing newly released iPhone merchandising”.
  4. Use the main index for the control and variant B.
  5. For variant B, click Add Query Parameter, then turn Enable Rules off so this variant searches without merchandising.
  6. Assign 30% of traffic to variant B. This helps mitigate the potential impact of the untested configuration.
  7. Use the sample size estimator to determine your test duration, or choose a duration based on your traffic patterns: higher traffic typically suggests shorter tests.
  8. When the test reaches statistical significance, compare the results. Consider both the improvement and the implementation effort to decide whether to apply the change.
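
Here's a rough sketch of the rule from step 2 and of what the two variants effectively compare, using the v4 JavaScript client. The objectIDs, index name, and query are placeholders:

```ts
import algoliasearch from 'algoliasearch';

const client = algoliasearch('YOUR_APP_ID', 'YOUR_ADMIN_API_KEY');
const index = client.initIndex('products');

// Step 2: a rule that pins the new iPhone record (placeholder objectID)
// to the top of results for the targeted queries.
await index.saveRule({
  objectID: 'promote-new-iphone',
  conditions: [
    { pattern: 'apple', anchoring: 'contains' },
    { pattern: 'iphone', anchoring: 'contains' },
    { pattern: 'mobile', anchoring: 'contains' },
  ],
  consequence: {
    promote: [{ objectID: 'new-iphone-record-id', position: 0 }],
  },
});

// At query time, the variants differ only in whether rules apply:
const control = await index.search('iphone', { enableRules: true });    // with merchandising
const variantB = await index.search('iphone', { enableRules: false });  // without merchandising
```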