Maria is the digital data analyst and content editor for the digital communications team. From Newcastle originally, Maria has lived in Dundee for the last five years while completing her degree in English and working in marketing for small, start-up businesses. Maria is usually found analysing Google Analytics and the results of usability testing.

Updating the undergraduate pages with HEFCE information

A couple of weeks ago, the content team spent some time going through an ‘Information for Students’ guide from the Quality Assurance Agency for Higher Education. We call it the HEFCE guide, because it was developed for the Higher Education Funding Council for England (HEFCE). The guide aims to help universities present good quality information to prospective students.

Digicomms are using this guide to help shape the new 2018 undergraduate pages – this blog post explains what the HEFCE guide is and looks at the information that the undergraduate pages are missing.

The guide

Including the right information is essential in terms of consumer protection legislation, and also for user experience. The guide states which information HEFCE thinks universities should be providing and when it should be provided – it also includes examples of how this information should be displayed. Digicomms worked through the document and made a note of all of the information that St Andrews does not currently include on its undergraduate pages.

It is important to note that some of the recommended information does not apply to the University of St Andrews because the HEFCE guide is a document written with English universities in mind.

However, digicomms feel it is important to align the University’s content with industry standards in England, as many prospective students will compare St Andrews with English universities. In addition, similar guidelines could be implemented or suggested in the future in Scotland.

What we don’t include

Overall, we found that we already include a lot of the recommended information. However, a few items were missing:

  • start and end date of undergraduate courses
  • geographic location of study
  • timetable information
  • how to apply through other routes
  • accreditation of prior learning
  • an explanation of how the academic year is organised
  • a copy of rules and regulations
  • optional modules and how those modules are chosen
  • teaching and learning information
  • assessment and progression information.

Some information on this list is easier to implement than other items, and digicomms have already drafted text for many of them. However, with things like optional module information, it gets a little more complicated.

This complication arises due to the fact that St Andrews is a Scottish university, where students have a far wider choice of modules at sub-honours. If every optional module was listed, there would be up to a thousand modules on the page. Furthermore, it’s difficult to state how these modules are chosen, because availability depends on a range of factors including staff availability, interest in modules and staff research interest.

Next steps

There isn’t a ‘one-size-fits-all’ approach to this information; we need to ensure that the right people are involved in deciding what, and how much, needs to be said about academic progression.

To date, the digicomms team have taken an undergraduate programme and added the information that was missing. The next steps for the team will be to sign off the new content, which can then be added to the new 2018 undergraduate pages.

Common content mistakes and how to fix them

Part of the work done by members of digicomms is to ensure all content on the University’s website meets digital standards. In particular, any text on the website or in print materials must meet the University’s house style. Here are some of the most common mistakes found when the team check over content.

Results from mobile usability testing

Last week I conducted some guerrilla usability testing on a mobile device in the University’s Library. The main purpose of the test was to find out whether the new digital prospectus pages can be used on mobile devices. This post provides an overview of the users’ experiences and what digicomms needs to do next with the pages to ensure they’re meeting user needs.

Who took the test?

In total, seven participants were tested; all were undergraduates and some were international students. The courses they studied varied, ranging from International Relations to Mathematics and English.

General feedback from participants was that their experience of the website is usually positive. Interestingly, as with the initial usability test, we found that participants highlighted the importance of awards and accreditations for Schools and the University over course information when exploring courses as a prospective student.

Some participants stated that they found the differences between iSaint/MySaint and MMS confusing. One explicitly stated: “there’s lots of different websites to use that all do the same thing”.

Highlights from test

In general, user experience with the new subject and undergraduate pages was improved compared with trying to find the same information on the existing School webpages.

However, one major drawback of the new pages, evident in mobile but not desktop testing, was that students did not use the navigation bar at the top of undergraduate pages and subject pages.

This, combined with students’ general unwillingness to scroll beyond 50% of the page, resulted in students remarking on how long the pages were and how much text there was. One participant noted that this probably wasn’t a problem on desktop, but on a phone extraneous text can seriously impact user experience.

We added the ‘Joint degree options’ title, reworded the text underneath it and linked to the courses. More needs to be done though.

Despite digicomms’ work rewording the joint degree section on the subject pages (an outcome of our last usability testing), joint degree information was still not easy to find. However, I feel that this isn’t due to the wording on the subject pages, as this time the text was noticed. Instead, this issue is connected to students not using the navigation or scrolling far enough down the page (joint degree information is nearer the bottom than the top of the page).

Finally, one other major factor that stopped students finding joint degree information was that the links within the joint degree text on the subject pages didn’t go directly to the joint degree sections. Instead, they linked only to the courses that could be taken as a joint degree.

Below is a video which highlights the effect of this on the student’s experience; what you’ll see is the participant in a loop between the subject page and the undergraduate page. The question we asked was “Can you tell me if you can take xxxxx as a joint Honours degree?”

Actions

Going forward, the digicomms team are going to further reassess how joint degree information is presented to prospective students. This will include linking directly to the joint degree section within individual undergraduate pages.

We are also considering creating a more general joint degree information page within Study. This will ensure that students searching for something like “joint Honours degrees” on Google or within the website will be taken directly to relevant information.

This round of testing also highlighted areas of improvement in the test itself. For one, we realised that students don’t want free donuts, so next time we’ll need an alternative incentive. More serious, however, is the need to clearly explain what a prospective student is, as one student began a task thinking and acting as a current student.

In addition, for the next round of testing, participants should be started on a specific webpage rather than the homepage or Google, to ensure they spend more time testing the area of the website we need them to. In this test, participants started on the University’s homepage, which led them to complete tasks on School sites and the Study pages, whereas we wanted to see how the new digital prospectus pages held up.

While allowing the student to ‘organically’ find information from a more general source can be illustrative, sometimes guiding them a little bit of the way can help produce more refined and useful results.

Guerilla testing on mobile devices

A couple of weeks ago, the University of Dundee’s external relations team released data from some guerrilla usability testing they had conducted during their freshers’ week. It was an incredible data sample, and fresh off the back of our most recent usability testing of the digital prospectus, it seemed as good an excuse as any to try out some guerrilla testing.


Usability testing of the undergraduate pages: the results

Usability testing is a method used to evaluate a product as it is developed by testing it on others. It has been used by the digital communications team in the past to gain insightful data about user experience. With the imminent launch of the new undergraduate and subject pages, conducting some usability testing allows us to see any potential design and content issues with these pages. This post looks into what was tested, and provides an overview of the users’ responses and our next steps for the pages.


Benchmarking the Study webpages

Over the next few weeks the new Subject and undergraduate course pages will go live and Course Search will be taken down. With such a drastic change about to occur on the University’s website it’s crucial that we monitor any impact these changes have. This corresponds with the digital communications team’s commitment to putting user needs first.  

One way of monitoring the effects of the new webpages is by measuring specific Google Analytics data relating to the Study section of the website before the changes are implemented. Then, further down the line this data will be used as a point of reference. 

What the report showed

Two time scales were used: one month (November 2016) and one year (2016). The report provided an overview of the Study webpages and looked into:

  • user flow
  • user location
  • traffic sources
  • devices used.

Overall in 2016, the Study webpages received 5,473,817 pageviews, 17% of the pageviews for the entire website. Of these, 96% (5,259,350) were from external users.

These figures were mirrored in November, when the Study webpages received 567,342 pageviews, 21% of the pageviews for the entire website; 95% of these (544,507) were from external users.

As the new Subject pages will be linked to from the Study pages, it is important to note the current pageview data for Study. Going forward, we expect to see an increase in pageviews in Study once the new pages are launched.
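
The figures above came straight from the Google Analytics interface, but the same benchmark could also be pulled programmatically. Below is a minimal Python sketch using the Analytics Reporting API v4 for the two time scales used in the report; the key file, view ID and the ‘/study/’ path filter are placeholders, and the exact filter would depend on how the Study URLs are actually structured.

"""Sketch: pull benchmark pageview totals for the Study section.

Assumes a service-account key with read access to the relevant Analytics view;
the key file, view ID and '/study/' path filter are placeholders.
"""
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'   # placeholder
VIEW_ID = '123456789'               # placeholder Analytics view ID

# Filter every request down to pages in the Study section.
study_filter = {'filters': [{
    'dimensionName': 'ga:pagePath',
    'operator': 'BEGINS_WITH',
    'expressions': ['/study/'],     # placeholder path for the Study section
}]}

def study_request(start, end):
    return {
        'viewId': VIEW_ID,
        'dateRanges': [{'startDate': start, 'endDate': end}],
        'metrics': [{'expression': 'ga:pageviews'}],
        'dimensionFilterClauses': [study_filter],
    }

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=credentials)

# The two time scales used for the benchmark: the whole of 2016 and November 2016.
response = analytics.reports().batchGet(body={'reportRequests': [
    study_request('2016-01-01', '2016-12-31'),
    study_request('2016-11-01', '2016-11-30'),
]}).execute()

for label, report in zip(['2016', 'November 2016'], response['reports']):
    pageviews = int(report['data']['totals'][0]['values'][0])
    print(f'{label}: {pageviews:,} pageviews')

Keeping a query like this to hand would make it easy to re-run the benchmark once the new pages have bedded in.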

User flow

We expect that user flow will be altered with the arrival of the new pages. Currently, the most popular path for users is University homepage > Study > Undergraduate Study.

In 2016, 80% of users accessed the main Study pages from the University’s homepage, 14% accessed the main Study pages from other University pages and 6% of users accessed the Study page directly.


After landing on the Study homepage, 94% of those users continued to another page within the website. However, only around 70% of them went on to view another page within Study.

When the new pages are established, the route could change to something more specific, such as Study > Subjects > Maths for a prospective undergraduate student.

Traffic sources

In November and across 2016, the majority of traffic was the result of organic searches in Google. The chart below summarises the top traffic sources to the Study pages.


One interesting finding was the high proportion of referrals within the top 25 sources of traffic. On a yearly basis, referrals made up 68% of the top 25 traffic sources, highlighting the importance of links on external websites as an entry route for users.

User location

Overall, the main users of Study are in the UK. Pageviews from the UK increased by 7.7% from 2015 (1,871,600 to 2,015,377). Other notable increases came from China (up 28%, from 152,046 to 195,548) and India (up 21%, from 92,076 to 111,672 pageviews).

Users in the USA viewed fewer pages in Study in 2016 compared with 2015, with pageviews down 14%. It will be interesting to see if these trends continue next year.

It was also interesting to note the substantial drop in pageviews for the countries that follow the UK and the USA: across the year there was a 74% difference in pageviews between the USA and China.

Devices used

Users predominantly accessed the Study pages via desktop, and when a mobile device was used, a phone was preferred over a tablet.


In both 2016 as a whole and in November 2016, mobile and tablet use made up around 25% of the pageviews the Study pages received.

 

Testing your website with Google Optimize

A few weeks ago an email from Google popped into my inbox, imploring me to accept an invite to try out Google Optimize. Sadly, as of this writing, Optimize has not yet been released, so I can’t provide a hands-on walkthrough of Google’s latest data powerhouse. However, I can offer a brief overview of what Optimize does and how you can benefit from all the testing possibilities it provides.

What is Google Optimize?

Google Optimize is a new analytics tool that allows you to quickly create, test and see the results of new online experiences through A/B testing. Optimize also allows you to conduct multivariate tests and redirect site experiments.

Optimize sits alongside Google Analytics, Tag Manager and other Google applications in the Analytics 360 suite. There will be both a free version of Optimize and a paid, ‘enterprise-level’ version available. As of the time of writing, both are in a beta state.

Usefully, it works alongside Google Analytics, meaning you can use Analytics data to identify areas of your site and then measure your site experiments against performance indicators in Analytics.

What is A/B testing?

A/B testing is a term for a randomised experiment with two variants (A and B). These variants could be two different versions of a webpage or a newsletter. Using a newsletter as an example, variant A is the control – the original newsletter you have been sending out for months – and variant B is essentially the same newsletter, but with a different design or altered content.

[Image: A/B test variants. Can you spot the difference?]

A/B testing is primarily used in marketing, so that ads are optimised to produce the most return on investment (ROI). Features that are usually tested using this method include:

  • calls to action
  • headlines
  • alternative images or graphics
  • copy
  • formatting.

More examples of pages and features to A/B test can be found in the Google Optimize help pages.
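
To make the comparison concrete, here is a small, hypothetical sketch of how the results of an A/B test might be read once the numbers are in: a standard two-proportion z-test comparing the click-through rate of the control newsletter (A) against the variant (B). The figures are invented purely for illustration, and Optimize handles this kind of significance calculation for you.

"""Sketch: compare conversion rates for two variants with a two-proportion z-test.

The counts below are invented for illustration only.
"""
from math import sqrt
from statistics import NormalDist

# Hypothetical results: recipients of each newsletter and how many clicked through.
sent_a, clicks_a = 5000, 410    # variant A (control)
sent_b, clicks_b = 5000, 475    # variant B (new design)

rate_a = clicks_a / sent_a
rate_b = clicks_b / sent_b

# Pooled click-through rate under the null hypothesis that A and B perform the same.
pooled = (clicks_a + clicks_b) / (sent_a + sent_b)
standard_error = sqrt(pooled * (1 - pooled) * (1 / sent_a + 1 / sent_b))

z = (rate_b - rate_a) / standard_error
p_value = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided test

print(f'A: {rate_a:.1%}  B: {rate_b:.1%}  z = {z:.2f}  p = {p_value:.3f}')
if p_value < 0.05:
    print('The difference is unlikely to be chance - prefer the better variant.')
else:
    print('No significant difference - keep testing or keep the control.')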

What is multivariate testing?

Multivariate testing (MVT) tests variants of two or more elements simultaneously to highlight the most effective variant of each element.

In concrete terms, say you wanted to conduct MVT on three versions of a landing page, each with a different headline and image. MVT would be able to tell you which of the varying elements were the most effective, as well as the overall success of each combination of heading and image.


So whereas A/B testing will tell you which landing page is the most successful, MVT will do that and then some, highlighting specifically the best combinations of media and text to use.
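
As a rough, hypothetical illustration of that difference, the sketch below enumerates every combination of two elements (headline and image) and ranks the combinations by conversion rate, which is essentially the question MVT answers. All of the headlines, images and figures are invented.

"""Sketch: rank element combinations in a multivariate test by conversion rate.

The headlines, images and results are all hypothetical.
"""
from itertools import product

headlines = ['Study at St Andrews', 'Find your course']
images = ['campus.jpg', 'students.jpg', 'graduation.jpg']

# Hypothetical results: (visitors shown the combination, enquiries made).
results = {
    ('Study at St Andrews', 'campus.jpg'): (1000, 52),
    ('Study at St Andrews', 'students.jpg'): (1000, 61),
    ('Study at St Andrews', 'graduation.jpg'): (1000, 47),
    ('Find your course', 'campus.jpg'): (1000, 58),
    ('Find your course', 'students.jpg'): (1000, 72),
    ('Find your course', 'graduation.jpg'): (1000, 49),
}

# Score every headline x image combination and sort by conversion rate.
ranked = []
for combo in product(headlines, images):
    visitors, conversions = results[combo]
    ranked.append((conversions / visitors, combo))
ranked.sort(reverse=True)

for rate, (headline, image) in ranked:
    print(f'{rate:.1%}  "{headline}" + {image}')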

What are redirect site experiments?

A redirect test is a type of A/B test that allows you to test separate web pages against each other. Variants in these tests are identified by URL, rather than the content on the page itself.


Why conduct this type of testing?

The main benefit of A/B testing is that it removes guesswork. Instead of siding with personal preference when deciding on a certain colour choice or call to action, you go with what the end user prefers.

From the University of St Andrews’ perspective, having content that is tailored for users aligns strongly with our user-centred approach. Overall, it allows users to interact with content in a way that makes sense to them, rather than in a way that we want them to act. In this sense, A/B testing possesses many of the same benefits as usability testing and generally using data, rather than ‘gut instinct’ to make decisions regarding content and design.

Using Google Analytics to define key markets

Recently, Admissions got in touch asking for some data on international markets and how they use the Study section of the site. Having access to this data will be useful going forward, as it will allow Admissions to analyse and track individual countries and easily compare them with others. This post looks at what the initial data showed and what the outcomes of having this data are for Admissions.

What the data showed

The data we collected covered external users of the Study section of the website only, meaning that anybody accessing those pages from within the University was excluded. Below is a list of countries in order of number of pageviews; it shows that the UK is the biggest user of the website.

  1. United Kingdom
  2. United States
  3. China
  4. Germany
  5. Italy
  6. India
  7. Canada
  8. France
  9. Spain
  10. Greece

Having access to data like this ensures that marketing efforts are focused and tailored to specific audiences, which should in turn increase engagement.

The next step is to analyse the data relating to a specific country, such as India, to see:

  • if there are any unusual ways Indian users interact with webpages
  • how India compares to the top market (the UK)  
  • if India is a growing market that needs to be targeted more through marketing
  • if any areas need to be refined in order to improve engagement with Indian users.

This process can be simplified after creating a country-specific dashboard. Segmenting specific user data through dashboards allows for at-a-glance readings of key markets and so makes analysis and comparison much faster. For example, after creating a dashboard for users from India, it was easier to see the percentage of new users year on year. The number of new Indian users increased by 15% since 2015, whereas new UK users only increased by 3.6%.
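
The dashboard gives the at-a-glance view, but the same year-on-year comparison can also be run on exported data. Below is a minimal pandas sketch, assuming a CSV export of new users by country and year; the file name and column names are placeholders for whatever the actual export contains.

"""Sketch: year-on-year growth in new users by country from an Analytics export.

Assumes a CSV with columns 'country', 'year' and 'new_users'; the file name
and column names are placeholders.
"""
import pandas as pd

df = pd.read_csv('new_users_by_country.csv')   # placeholder export

# Pivot to one row per country with a column per year, then compute growth.
by_year = df.pivot_table(index='country', columns='year', values='new_users', aggfunc='sum')
by_year['growth'] = (by_year[2016] - by_year[2015]) / by_year[2015]

# Ten fastest-growing markets by share of new users.
print(by_year.sort_values('growth', ascending=False).head(10))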

Outcome for Admissions

Monitoring these markets going forward will show whether user interest remains steady or begins to increase. Comparing data with previous years will also highlight whether there is any potential in certain markets, allowing Admissions to jump in and secure more admissions through marketing efforts.

Analysing leading markets more closely will also be beneficial, as they can be monitored against certain standards or towards certain goals. Such segmentation will also allow us to see whether certain countries have any nuances, such as the most popular times to view the Study pages, or a preference for viewing them on mobile or desktop.

If you would like more information on creating a dashboard in Google Analytics, I’ve written in the past about why and how to do this. If you’re a member of staff at the University of St Andrews and would like access to Google Analytics, please email itservicedesk@st-andrews.ac.uk.

Using the Users Flow report in Google Analytics

One of the most helpful pieces of information you can glean from your website is knowing where users have come from and where they go once they’ve landed on a particular page. While you can use the Source/Medium report to see how users first accessed your site, the Users Flow report presents the path users take on your site or through a specific page. And unlike the Source/Medium report, the information is presented through a handy diagram.

What is the Users Flow report?

The Users Flow report is a graphical representation of the paths users take through your site, from the source, through the various pages, and where along their paths they exited your site.

What can you find out from the Users Flow report?

The Users Flow report allows you to:

  • compare traffic from different sources
  • examine traffic patterns through specific sections of your site
  • see specific metrics (such as number of sessions) for connections and paths between pages in the user flow graph
  • measure A/B testing results
  • troubleshoot the efficacy of your site.

What this means is you’ll be able to see if there are any specific pages where users are leaving your site. Once these pages have been identified and updated with more appropriate or engaging content, you can go back to the Users Flow report to check whether the number of drop-offs has reduced.
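
As a rough illustration of that follow-up check, the sketch below compares exit rates per page before and after a content change, assuming two page-level CSV exports with pageview and exit counts; the file and column names are assumptions rather than anything Analytics produces by default.

"""Sketch: compare exit rates per page before and after a content change.

Assumes two CSV exports with columns 'page', 'pageviews' and 'exits';
the file names and columns are placeholders.
"""
import pandas as pd

before = pd.read_csv('exits_before.csv').set_index('page')
after = pd.read_csv('exits_after.csv').set_index('page')

comparison = pd.DataFrame({
    'exit_rate_before': before['exits'] / before['pageviews'],
    'exit_rate_after': after['exits'] / after['pageviews'],
})
comparison['change'] = comparison['exit_rate_after'] - comparison['exit_rate_before']

# Pages whose exit rate fell the most after the change.
print(comparison.sort_values('change').head(10))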

Are there any limitations to the Users Flow report?

Yes. The main limitation is that the Users Flow report is based on a data sample, not the complete range of data Google Analytics has access to. This means that when you’re viewing the Users Flow report, the results are based on only a small percentage of the data available. You’re being presented with the trends visible to Google in that sample, which means the data is not 100% accurate.

Another limitation is that if you want to see the typical user path to and from one specific page, you will sometimes be unable to see anything at all in your report. This is because there is too little data to take a sample from, analyse and present any trends. When this happens, this screen will appear:

[Screenshot: a Users Flow report with too little data to display]

Accessing the Users Flow report

  1. Sign in to Google Analytics.
  2. Select the Reporting tab.
  3. Select Audience from the Report navigation, then select Users Flow.


Using the Users Flow report

I’ll go into detail about how to see how many users go to and leave a specific page with the Users Flow report, but if you’d like to dig a bit deeper and try more with the report, Google has a helpful overview of all the key things you can measure.

Once you have accessed the Users Flow report, to narrow the report down to a specific page, follow these steps:

  1. Select the cog next to where it says 'Country' (it will display 'Country' by default).
  2. Using the drop-down, select 'Landing page'.
  3. Add the URI of the page you'd like to see in more detail, e.g. /library
  4. Select 'Apply'.
  5. Now, to see the flow of traffic through your page, left-click on the 'Starting pages' column and select 'Explore traffic through here'.
  6. You'll now be presented with a more detailed view of the traffic to and from the page you have chosen. It's worth noting that you can see the steps before and after the three presented on this page by selecting one of the arrow buttons.
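
If you'd rather pull similar data outside the interface, the sketch below uses the Reporting API's ga:previousPagePath dimension to list where traffic to a chosen page came from. It only shows the single page viewed immediately before, not the full multi-step flow the report draws, and the key file, view ID and '/library' page are placeholders.

"""Sketch: list the pages users viewed immediately before a chosen page.

A rough, single-step approximation of 'Explore traffic through here';
the key file, view ID and '/library' path are placeholders.
"""
from googleapiclient.discovery import build
from oauth2client.service_account import ServiceAccountCredentials

SCOPES = ['https://www.googleapis.com/auth/analytics.readonly']
KEY_FILE = 'service-account.json'   # placeholder
VIEW_ID = '123456789'               # placeholder Analytics view ID
PAGE = '/library'                   # placeholder page of interest

credentials = ServiceAccountCredentials.from_json_keyfile_name(KEY_FILE, SCOPES)
analytics = build('analyticsreporting', 'v4', credentials=credentials)

response = analytics.reports().batchGet(body={'reportRequests': [{
    'viewId': VIEW_ID,
    'dateRanges': [{'startDate': '30daysAgo', 'endDate': 'today'}],
    'metrics': [{'expression': 'ga:pageviews'}],
    'dimensions': [{'name': 'ga:previousPagePath'}],
    'dimensionFilterClauses': [{'filters': [{
        'dimensionName': 'ga:pagePath',
        'operator': 'EXACT',
        'expressions': [PAGE],
    }]}],
    'orderBys': [{'fieldName': 'ga:pageviews', 'sortOrder': 'DESCENDING'}],
}]}).execute()

# Each row is one previous page with the number of views it sent to PAGE.
for row in response['reports'][0]['data'].get('rows', []):
    previous_page, views = row['dimensions'][0], row['metrics'][0]['values'][0]
    print(f'{previous_page} -> {PAGE}: {views} pageviews')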

If you’re a member of staff at the University and would like further help using the Users Flow report, or would like access to Google Analytics to see how your site is doing, please email itservicedesk@st-andrews.ac.uk.

How to conduct usability testing

If you’re new to usability testing, you may find my post ‘What is usability testing and why do we do it’ useful before diving into the ‘how-to’. While that post looked at the what and why, this post will look at the how, focusing on the five planning and testing stages of usability testing. If you’d like more information on what to do post-testing, check out ‘An agile way to prioritize usability problems’.
