Maria is the digital data analyst and content editor for the digital communications team. From Newcastle originally, Maria has lived in Dundee for the last five years while completing her degree in English and working in marketing for small start-up businesses. Maria is usually found analysing Google Analytics data and the results of usability testing.

An update on the 2018 digital prospectus

Last month the digital communications team were busy launching the new site structure, a big milestone in our external website project. Now that’s out of the way, we have more time to focus on our business-as-usual work, which means ensuring all of the 2018-entry postgraduate taught and undergraduate course webpages are ready for September 2018.

New methods

2017 was the first year that the web publishing schedule for postgraduate programmes aligned with Publications, which meant that all drafts of the webpages were sent out at the same time as the draft print prospectus for each subject. This was done in an effort to make the proofing process easier for Schools, as all feedback could be sent back to Publications in one go. Undergraduate pages were still sent out separately, but digicomms hope that, in the future, these can also be sent out with print prospectus information.

Uploading

Once digicomms receives all the Word documents back from Schools, the next step is to upload the content to T4v8.

When each course is uploaded, the content team checks the page against specially written acceptance criteria for the undergraduate and postgraduate pages.

The first phase is to upload the postgraduate pages. Once they have all been uploaded, there will be a final overall QA in which the content team will read over for typos, test all links, and check each page against the acceptance criteria and the print prospectus.

Once the postgraduate pages have been signed off, we will begin uploading the undergraduate pages. The process will be the same for UG as for PGT, although, because UG pages have more subpages and more content on each page, we expect it will take longer.

The schedule

The deadline for Schools to return the postgraduate taught pages was 24 May, and the deadline for undergraduate page returns was 7 July.

We aim to have both undergraduate and postgraduate course pages live on the University website by 1 September 2017.

Once the new 2018 pages are live, both the 2017 postgraduate and undergraduate pages will move into the archive.

Launching the new site structure

On Thursday 6 July, the digital communications team will be changing the structure of the University’s website. All academic information that currently resides under Study at St Andrews will be moved from there into the Subjects section.

The digital communications team will be migrating Entry requirements, Study options, Non-degree programmes and Postgraduate taught courses.

The new site structure for the Subjects section will go live on Thursday 6 July 2017.



Prospective student information: usability testing and results

As part of the external website project we are retiring the Study at St Andrews section of the website and moving its content into either Subjects or Prospective students. Certain information will now live under Subjects, including:

  • Study options
  • Entry requirements
  • Non-degree courses

Last week we conducted a control usability test which asked users to find information in the pages listed above. The purpose of this test was to see how users currently find this information within Study.

Google Analytics training: an overview

Recently, the digital communications team launched the digital visa, a programme of training sessions which enables staff to feel competent working on digital communications projects. One of the courses within the visa is Google Analytics: creating a dashboard – this post outlines in more detail what the Google Analytics training session covers.

The session

By the end of every training session, each participant walks away with their own Google Analytics dashboard. A dashboard allows users to quickly access relevant data within Google Analytics.

Creating a dashboard means collating specific reports in one place, with the main purpose of having all the most important and most frequently used reports at hand. The training session covers five reports:

  1. Who visits your site? (Location report)
  2. How do users visit your site? (Mobile device report)
  3. How many pageviews does your site receive? (Pageviews report)
  4. How do users reach your site? (Traffic source report)
  5. How many sessions come from social media? (Social media report)

With each report, participants are shown what the report looks like in the dashboard, what the report can tell you about a site, and how to add the report to the dashboard.  

Once a report has been added, participants also have the option of editing the style of each widget in the dashboard so they present the data in a suitable manner. For example, instead of showing the mobile device report as a table, participants are shown how to change it to a pie chart.
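
The dashboards themselves are built in the Google Analytics web interface, but the same figures can also be pulled programmatically. Below is a minimal sketch (not part of the training session) using the googleapis Node client in TypeScript to fetch two of the five reports via the Reporting API v4; the view ID and service account key file are hypothetical placeholders.

```typescript
import {google} from 'googleapis';

async function fetchDashboardReports(): Promise<void> {
  // Authenticate with a service account that has read access to the GA view.
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json', // hypothetical credentials file
    scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
  });
  const reporting = google.analyticsreporting({version: 'v4', auth});

  const res = await reporting.reports.batchGet({
    requestBody: {
      reportRequests: [
        {
          // Who visits your site? (Location report)
          viewId: '123456789', // hypothetical GA view ID
          dateRanges: [{startDate: '30daysAgo', endDate: 'today'}],
          metrics: [{expression: 'ga:sessions'}],
          dimensions: [{name: 'ga:country'}],
        },
        {
          // How do users visit your site? (Mobile device report)
          viewId: '123456789',
          dateRanges: [{startDate: '30daysAgo', endDate: 'today'}],
          metrics: [{expression: 'ga:sessions'}],
          dimensions: [{name: 'ga:deviceCategory'}],
        },
      ],
    },
  });

  // Print each row: dimension value (country or device) and session count.
  for (const report of res.data.reports ?? []) {
    for (const row of report.data?.rows ?? []) {
      console.log(row.dimensions, row.metrics?.[0].values);
    }
  }
}

fetchDashboardReports().catch(console.error);
```

The remaining reports follow the same pattern: swapping in dimensions such as ga:sourceMedium or ga:socialNetwork gives the traffic source and social media reports.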

Alongside creating a dashboard, participants are introduced to key terminology associated with the tool. For example:

  • bounce rate
  • entrance
  • exit rate
  • organic
  • pageview
  • referral
  • session
  • unique pageviews.

Additional reports

There is time allocated in the training session to look at any additional reports that users would like to include but have not been covered in the session. For example, in the first training session, one participant asked if it were possible to monitor PDF downloads. The whole class was shown how to implement this report in the dashboard.
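
For background, Google Analytics does not record PDF downloads by default, because opening a file never runs the page tracking code; the usual workaround with the standard analytics.js tracker is to send a custom event when a PDF link is clicked. The sketch below is a generic illustration of that technique rather than the exact configuration shown in the session, and the category and label names are illustrative.

```typescript
// Generic sketch: report PDF downloads to Google Analytics as events.
// Assumes the standard `ga` global from analytics.js is already on the page.
declare const ga: (command: string, ...fields: unknown[]) => void;

document.addEventListener('click', (event) => {
  const target = event.target as HTMLElement | null;
  const link = target?.closest('a');
  if (link && link.href.toLowerCase().endsWith('.pdf')) {
    // Category, action and label are free-form; these names are illustrative.
    ga('send', 'event', 'Downloads', 'PDF', link.href);
  }
});
```

Once the events are being recorded, an events-based widget can be added to the dashboard like any other report.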

Resources

After the training session, each participant is emailed a training pack which includes information about the reports that were covered in the lesson, along with videos showing the steps involved if additional reports need to be added.

If you’re interested in signing up for Google Analytics: creating a dashboard or any of the other courses in the digital visa, please email digitalcommunications@st-andrews.ac.uk.

Updating the undergraduate pages with HEFCE information

A couple of weeks ago, the content team spent some time going through an ‘Information for Students’ guide from the Quality Assurance Agency for Higher Education, which we call the HEFCE guide (HEFCE stands for the Higher Education Funding Council for England, for whom the guide was developed). The guide aims to help universities present good quality information to prospective students.

Digicomms are using this guide to help shape the new 2018 undergraduate pages – this blog post explains what the HEFCE guide is and looks at the information that the undergraduate pages are missing.

The guide

Including the right information is essential both in terms of consumer protection legislation and for user experience. The guide states which information HEFCE thinks universities should provide and when it should be provided, and includes examples of how this information should be displayed. Digicomms worked through the document and made a note of all the information that St Andrews does not currently include on its undergraduate pages.

It is important to note that some of the recommended information does not apply to the University of St Andrews because the HEFCE guide is a document written with English universities in mind.

However, digicomms feel it is important to align the University’s content with industry standards in England, as many prospective students will compare St Andrews with English universities. In addition, similar guidelines could be implemented or suggested in the future in Scotland.

What we don’t include

Overall, the undergraduate pages already include a lot of the recommended information. However, a few items were missing:

  • start and end date of undergraduate courses
  • geographic location of study
  • timetable information
  • how to apply through other routes
  • accreditation of prior learning
  • an explanation of how the academic year is organised
  • a copy of rules and regulations
  • optional modules and how those modules are chosen
  • teaching and learning information
  • assessment and progression information.

Some items on this list are easier to implement than others, and digicomms have already drafted text for many of them. With things like optional module information, however, it gets a little more complicated.

The complication arises because St Andrews is a Scottish university, where students have a far wider choice of modules at sub-honours. If every optional module were listed, there would be up to a thousand modules on the page. Furthermore, it’s difficult to state how these modules are chosen, because availability depends on a range of factors, including staff availability, interest in modules and staff research interests.

Next steps

There isn’t a ‘one-size-fits-all’ approach to this information; we need to ensure that the right people are involved in deciding what, and how much, needs to be said about academic progression.

To date, the digicomms team have taken one undergraduate programme and added the information that was missing. The next step for the team will be to sign off the new content, which can then be added to the new 2018 undergraduate pages.

Common content mistakes and how to fix them

Part of the work done by members of digicomms is to ensure all content on the University’s website meets digital standards. In particular, any text on the website or in print materials must meet the University’s house style. Here are some of the most common mistakes found when the team checks over content.

Results from mobile usability testing

Last week I conducted some guerrilla usability testing on a mobile device in the University’s Library. The main purpose of the test was to find out whether the new digital prospectus pages can be used on mobile devices. This post provides an overview of the users’ experiences and what digicomms needs to do next with the pages to ensure they’re meeting user needs.

Who took the test?

In total, seven participants were tested; all were undergraduates and some were international students. The courses they studied varied, ranging from International Relations to Mathematics and English.

General feedback from participants was that their experience of the website is usually positive. Interestingly, as with the initial usability test, we found that participants highlighted the importance of awards and accreditations for Schools and the University over course information when exploring courses as a prospective student.

Some participants stated that they found the differences between iSaint/MySaint and MMS confusing. One explicitly stated: “there’s lots of different websites to use that all do the same thing”.

Highlights from the test

In general, user experience with new subject and undergraduate pages was improved when compared with trying to find the same information on the existing School webpages.

However, one major drawback of the new pages, evident in mobile testing but not desktop testing, was that students did not use the navigation bar at the top of undergraduate and subject pages.

This, combined with students’ general unwillingness to scroll beyond 50% of the page, resulted in students remarking on how long the pages were and how much text there was. One participant noted that this probably wasn’t the case on desktop, but that on a phone extraneous text can seriously impact user experience.

We added the ‘Joint degree options’ title, reworded the text underneath it and linked to the courses. More needs to be done, though.

Despite digicomms’ work rewording the joint degree section on the subject pages (an outcome of our last usability testing), joint degree information was still not easy to find. However, I feel that this isn’t due to the wording on the subject pages, as this time the text was noticed. Instead, this issue is connected to students not using the navigation or scrolling far enough down the page (joint degree information is nearer the bottom than the top of the page).

Finally, one other major factor that impacted students finding joint degree information was the fact that the links within the joint degree information text on the subject pages didn’t link directly to the joint degree sections. Instead, they were just linking to the courses that could be taken as a joint degree.

Below is a video which highlights the effect of this on the student’s experience; what you’ll see is the participant in a loop between the subject page and the undergraduate page. The question we asked was “Can you tell me if you can take xxxxx as a joint Honours degree?”

Actions

Going forward, the digicomms team are going to further reassess how joint degree information is presented to prospective students. This will include linking directly to the joint degree section within individual undergraduate pages.

We are also considering creating a more general joint degree information page within Study. This will ensure that students searching for something like “joint Honours degrees” on Google or within the website will be taken directly to relevant information.

This round of testing also highlighted areas of improvement in the test itself. For one, we realised that students don’t want free donuts, so next time we’ll need an alternative incentive. More serious, however, is the need to clearly explain what a prospective student is, as one student began a task thinking and acting as a current student.

In addition, for the next round of testing, participants should be started on a specific webpage rather than the homepage or Google, to ensure they spend more time testing the area of the website we need them to. In this test, participants started on the University’s homepage, which led them to complete tasks on School sites and the Study pages, whereas we wanted to see how the new digital prospectus pages held up.

While allowing the student to ‘organically’ find information from a more general source can be illustrative, sometimes guiding them a little bit of the way can help produce more refined and useful results.

Guerilla testing on mobile devices

A couple of weeks ago the University of Dundee’s external relations team released data from some guerrilla usability testing they had conducted during their freshers’ week. It was an incredible data sample, and fresh off the back of our most recent usability testing of the digital prospectus, it seemed as good an excuse as any to try out some guerrilla testing.


Usability testing of the undergraduate pages: the results

Usability testing is a method used to evaluate a product as it is developed by testing it on others. It has been used by the digital communications team in the past to gain insightful data about user experience. With the imminent launch of the new undergraduate and subject pages, conducting some usability testing allows us to see any potential design and content issues with these pages. This post looks into what was tested, and provides an overview of the users’ responses and our next steps for the pages.


Benchmarking the Study webpages

Over the next few weeks the new Subject and undergraduate course pages will go live and Course Search will be taken down. With such a drastic change about to occur on the University’s website, it’s crucial that we monitor any impact these changes have. This corresponds with the digital communications team’s commitment to putting user needs first.

One way of monitoring the effects of the new webpages is to measure specific Google Analytics data relating to the Study section of the website before the changes are implemented. Further down the line, this data will be used as a point of reference.
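
For readers curious about the mechanics, the sketch below shows the kind of query involved, using the googleapis Node client in TypeScript and the Reporting API v4 to pull pageviews for the Study section alone. The view ID is a hypothetical placeholder, and the page-path filter assumes the Study section sits under a /study/ path.

```typescript
import {google} from 'googleapis';

// Sketch: benchmark pageviews for the Study section only, over the two
// time scales used in the report (the full year 2016 and November 2016).
async function benchmarkStudySection(): Promise<void> {
  const auth = new google.auth.GoogleAuth({
    keyFile: 'service-account.json', // hypothetical credentials file
    scopes: ['https://www.googleapis.com/auth/analytics.readonly'],
  });
  const reporting = google.analyticsreporting({version: 'v4', auth});

  const res = await reporting.reports.batchGet({
    requestBody: {
      reportRequests: [{
        viewId: '123456789', // hypothetical GA view ID
        dateRanges: [
          {startDate: '2016-01-01', endDate: '2016-12-31'}, // full year
          {startDate: '2016-11-01', endDate: '2016-11-30'}, // November only
        ],
        metrics: [{expression: 'ga:pageviews'}],
        filtersExpression: 'ga:pagePath=~^/study/', // Study section only
      }],
    },
  });

  // One totals entry per date range: [year total, November total].
  console.log(res.data.reports?.[0].data?.totals);
}

benchmarkStudySection().catch(console.error);
```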

What the report showed

Two time scales were used: one month (November 2016) and one year (2016). The report provided an overview of the Study webpages and looked into:

  • user flow
  • user location
  • traffic sources
  • devices used.

Overall in 2016, the Study webpages received 5,473,817 pageviews, 17% of the pageviews for the entire website. Of these, 96% (5,259,350) came from external users.

These figures were mirrored in November, when the Study webpages received 567,342 pageviews, 21% of the pageviews for the entire website; 95% of these (544,507) came from external users.

As the new Subjects pages will be linked to from the Study pages, it is important to note the current pageview data for Study. Going forward, we expect to see an increase in pageviews within Study after the new pages are introduced.

User flow

We expect that user flow will be altered with the arrival of the new pages. Currently the most popular path for users is the University homepage > Study > Undergraduate study.

In 2016, 80% of users accessed the main Study pages from the University’s homepage, 14% accessed the main Study pages from other University pages and 6% of users accessed the Study page directly.

[Chart: Study user flow data, November 2016]

After landing on the Study homepage, 94% of users continued to another page within the website. Of those, however, only around 70% went on to view another page within Study.

When the new pages are established, the route could change to something more specific, such as Study > Subjects > Maths for a prospective undergraduate student.

Traffic sources

In November and across 2016, the majority of traffic was the result of organic searches in Google. The chart below summarises the top traffic sources to the Study pages.

[Chart: top traffic sources to the Study pages]

One interesting finding was the high proportion of referrals within the top 25 sources of traffic. On a yearly basis, referrals made up 68% of the top 25 traffic sources, highlighting the importance of links on external websites as an entry route for users.

User location

Overall, the main users of Study are in the UK, and UK pageviews increased by 7.7% from 2015 (1,871,600 to 2,015,377). Other notable increases came from China (up 28% from 152,046 to 195,548) and India (up 21% from 92,076 to 111,672 pageviews to Study).

Users in the USA viewed fewer pages in Study in 2016 compared with 2015, with pageviews down 14%. It will be interesting to see if these trends continue next year.

It was also interesting to note the substantial drop in pageviews from the countries following the UK and the USA: across the year there was a 74% difference in pageviews between the USA and China.
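
For reference, the year-on-year movements quoted above are simple percentage changes. A quick worked check of the UK figure, using the numbers from the report:

```typescript
// Year-on-year percentage change: (current - previous) / previous * 100.
function pctChange(previous: number, current: number): number {
  return ((current - previous) / previous) * 100;
}

// UK pageviews to Study, 2015 vs 2016 (figures quoted above).
console.log(pctChange(1_871_600, 2_015_377).toFixed(1)); // "7.7"
```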

Devices used

Users predominantly accessed the Study pages via desktop; when they did use a mobile device, a phone was preferred to a tablet.

[Chart: devices used to access the Study pages]

Mobile and tablet use made up around 25% of the pageviews the Study pages received, both across 2016 as a whole and in November 2016.