A/B testing the course page design

Lewis Wake
Monday 7 March 2022

At the end of 2021, we began A/B testing a new page design for the English MA and Biology BSc course pages. We have been using Google Optimize and Hotjar to measure how usage differs between the two designs.

What is A/B testing?

A/B testing is a method for comparing two versions of a webpage or app against each other to determine which one performs better. It is essentially an experiment in which two or more variants of a page are shown to users at random, and statistical analysis is used to determine which variation performs better for a given conversion goal.
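As a rough sketch of the idea (purely illustrative – the visitor IDs and conversion goal below are made up, and this is not how our own setup is built), an A/B test boils down to assigning visitors to a variant at random and comparing a conversion rate per variant:

```python
import random

def assign_variant() -> str:
    """Randomly assign a visitor to variant A or B."""
    return random.choice(["A", "B"])

def run_experiment(visitors, converted):
    """Count sessions and conversions per variant.

    `visitors` is an iterable of visitor IDs; `converted` is a set of
    visitor IDs that completed the conversion goal. Both are made up
    for this sketch.
    """
    totals = {"A": {"sessions": 0, "conversions": 0},
              "B": {"sessions": 0, "conversions": 0}}
    for visitor in visitors:
        variant = assign_variant()
        totals[variant]["sessions"] += 1
        if visitor in converted:
            totals[variant]["conversions"] += 1
    return totals

# Hypothetical traffic: 1,000 visitors, roughly 1 in 7 of whom "convert".
results = run_experiment([f"v{i}" for i in range(1000)],
                         {f"v{i}" for i in range(0, 1000, 7)})
for variant, counts in results.items():
    rate = counts["conversions"] / counts["sessions"]
    print(variant, counts, f"conversion rate: {rate:.1%}")
```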

How it works

We have been using Google Optimize to A/B test the pages. Google Optimize lets us randomly serve either the original page or the new page when a user navigates to it. Usage data is tracked via Google Analytics and Hotjar.
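For context, one common way this kind of swap is implemented in general (an illustrative pattern, not a description of Google Optimize's internals) is to bucket each visitor deterministically, for example by hashing a visitor ID, so a returning visitor keeps seeing the same version; the URLs below are placeholders:

```python
import hashlib

def choose_page(visitor_id: str, original_url: str, variant_url: str,
                variant_share: float = 0.5) -> str:
    """Deterministically pick which page a visitor sees.

    Hashing the visitor ID gives a stable bucket, so a returning
    visitor keeps getting the same version for the lifetime of the
    experiment. All names and URLs here are illustrative.
    """
    digest = hashlib.sha256(visitor_id.encode()).hexdigest()
    bucket = int(digest, 16) % 100 / 100  # stable value in [0, 1)
    return variant_url if bucket < variant_share else original_url

# Example: the same visitor always resolves to the same page.
print(choose_page("visitor-123",
                  "https://example.ac.uk/courses/english-ma",
                  "https://example.ac.uk/courses/english-ma-new"))
```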

What pages did we test?

We have had a new design for course pages in the works for a while now: a new layout that helps optimise the visual hierarchy of content on the course pages. We created variations of the English MA and Biology BSc course pages as a starting point – one course from the sciences and one from the arts.

Biology BSc

English MA

Google Analytics results

Using Google Analytics, we are able to compare the number of experiment sessions triggered and the bounce rate for each version of the page. Bounce rate is the percentage of visitors who navigate away from the site after viewing only one page. A higher bounce rate does not necessarily mean the page is a failure – if a user finds the information they are looking for, their journey will end there.

English MA (72 days of data)

Experiment sessions
  • Original page – 3,746 experiment sessions
  • New page – 1,929 experiment sessions
Experiment bounces
  • Original page – 940 experiment bounces
  • New page – 626 experiment bounces
Calculated bounce rate
  • Original page – 25.09% bounce rate
  • New page – 32.61% bounce rate

Biology BSc (77 days of data)

Experiment sessions
  • Original page – 4,010 experiment sessions
  • New page – 397 experiment sessions
Experiment bounces
  • Original page – 1,113 experiment bounces
  • New page – 107 experiment bounces
Calculated bounce rate
  • Original page – 27.76% bounce rate
  • New page – 26.95% bounce rate
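To make the calculated bounce rate figures above concrete, the short sketch below recomputes bounce rate (bounces divided by sessions) from the numbers reported for each page, and adds a standard two-proportion z-test as one possible way of doing the statistical comparison mentioned earlier – the test is our illustration, not part of the experiment itself:

```python
from math import sqrt

def bounce_rate(bounces: int, sessions: int) -> float:
    """Bounce rate as a percentage: bounces / sessions * 100."""
    return 100 * bounces / sessions

def two_proportion_z(b1: int, n1: int, b2: int, n2: int) -> float:
    """z statistic comparing two bounce rates (pooled standard error)."""
    p1, p2 = b1 / n1, b2 / n2
    pooled = (b1 + b2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# Figures reported above: (bounces, sessions) per page version.
english = {"original": (940, 3746), "new": (626, 1929)}
biology = {"original": (1113, 4010), "new": (107, 397)}

for course, pages in (("English MA", english), ("Biology BSc", biology)):
    for version, (b, n) in pages.items():
        print(f"{course} {version}: {bounce_rate(b, n):.2f}% bounce rate")
    z = two_proportion_z(*pages["original"], *pages["new"])
    print(f"{course} z statistic (new vs original): {z:.2f}")
```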

Hotjar results

We have used Hotjar to randomly and anonymously record sessions on the pages tested and to measure where users click. The recorded clicks have allowed us to capture heatmaps of the pages tested.
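As a toy illustration of what sits behind a click heatmap (not Hotjar's actual implementation), click coordinates can be binned into a grid and the counts per cell rendered as intensity; the clicks below are invented:

```python
from collections import Counter

def click_heatmap(clicks, cell_size=100):
    """Bin (x, y) click coordinates into a grid of cell_size pixels.

    Returns a Counter mapping (column, row) cells to click counts –
    the raw data behind a heatmap overlay.
    """
    return Counter((x // cell_size, y // cell_size) for x, y in clicks)

# Hypothetical clicks on a page; denser cells would render "hotter".
sample_clicks = [(120, 340), (130, 350), (125, 345), (800, 1200)]
print(click_heatmap(sample_clicks))
```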

Number of sessions captured with Hotjar

Over a two-month period, the number of sessions of each page recorded in Hotjar was as follows:

English MA

  • Original page – 164 sessions
  • New page – 224 sessions

Biology BSc

  • Original page – 170 sessions
  • New page – 22 sessions

Heatmaps

Hotjar created heatmaps based on the clicks recorded across all sessions. The JPEG files of these heatmaps are linked below.

English MA

Biology BSc

Conclusions

The heatmaps generated from the Hotjar recordings give us the biggest insight into how both variants are used. In all instances, the most popular sections of each page remain the same:

  • Entry requirements
  • Module information
  • Joint degree options.

These have previously been identified as top tasks for the course pages, so we know this information is what the majority of users navigating to these pages are looking for.

The higher bounce rate on the new English MA page could suggest that this information is displayed more prominently and is therefore easier to find. However, without verbal feedback from a user, this can only be an assumption. The next stage of validation would therefore be to carry out usability testing.
