Staff digital insights surveys

Earlier this year we ran two RAU staff digital insights surveys – one for academic staff and one for professional services staff. The surveys were managed by Jisc and are part of an annual survey programme. They complement the student digital survey that we ran last year. The academic staff survey asks teaching staff across higher education (HE) and further education (FE) about their experiences of digital technology in their institution and in their teaching practice. This year the professional services staff survey ran as a pilot, and the RAU was part of the pilot group. The results from both surveys are benchmarked against other institutions in the sector.


Below are some of our key findings.

Response rates

                    | Academic staff | Professional services
Response number     | 24             | 67
Percentage of staff | ~50%           | ~25%
Time at RAU         | Even split between ‘4 years or more’ and ‘less than 4 years’ | Even split between ‘4 years or more’ and ‘less than 4 years’
Department          | From all four schools, plus 2 from Capel | Operations (43%), Student Services (31%), Commercial and Business Development (21%) and others

Key metrics: Academic staff

  • 21% rate the quality of their digital provision (software, hardware, learning environment) as good or above
  • 92% can access reliable Wi-Fi whenever they need it
  • 50% agree it is easy to design & organise their course materials in the VLE (Gateway)
  • 54% rate the support they get to develop their digital role as good or above
  • 21% agree software for teaching is industry standard and up-to-date
  • 8% agree they are informed about ensuring students behave safely online

Key metrics: Professional services staff

  • 58% rate the quality of their digital provision (software, hardware, learning environment) as good or above
  • 87% can access reliable Wi-Fi whenever they need it
  • 27% agree that our online systems support working as a team
  • 34% rate the support they get to develop their digital role as good or above
  • 43% agree systems are up to date
  • 72% agree systems are reliable


It appears that academics are more unhappy about the quality of the digital provision but happier with the support they receive to develop the digital aspects of their jobs, while for professional services staff it is the other way round. This may be to do with the lack of support for professional services staff training, and with academics' need for fit-for-purpose pedagogic tools.

Benchmark comparisons: Academic staff

Question                                                   | Our data | UK data
Quality of digital provision                               | 21%      | 58%
Reliable Wi-Fi                                             | 92%      | 85%
Support to develop digital role                            | 54%      | 36%
Software for teaching is industry standard and up-to-date  | 21%      | 35%
Easy to design & organise course materials in VLE          | 55%      | 48%
Are informed about ensuring students behave safely online  | 8%       | 18%

We are below the sector on quality of digital provision, industry-standard software and guidance on students behaving safely online, and above the sector on reliable Wi-Fi, support to develop the digital role and ease of designing course materials in the VLE.

Benchmark comparisons: Professional services staff

Question                                        | Our data | UK data
Quality of digital provision                    | 58%      | 68%
Reliable Wi-Fi                                  | 87%      | 85%
Support to develop digital role                 | 35%      | 56%
Systems are reliable                            | 72%      | 67%
Systems are up to date                          | 43%      | 46%
Our online systems support working as a team    | 27%      | 46%

We are below the sector on quality of digital provision, support to develop the digital role, up-to-date systems and online systems that support teamwork, and above the sector on reliable Wi-Fi and system reliability.

As you can see there is still lots to be done!

What can we do to help? Academic staff

  • Increased recognition by senior management of the importance of supporting innovative and good quality teaching (both digital and non-digital)
  • Better celebration of good practice
  • Support for a culture where experimentation is accepted and time/resource is allocated to it
  • More CPD in digital skills
  • Better digital teaching rooms
  • Further investment in academic and industry-standard digital tools
  • Improvements to Turnitin and integration with Quercus

What can we do to help? Professional services staff

  • Better support for flexible and remote working
  • More accessible training – from a more formal training structure to informal lunchtime drop-in training, at all levels (beginners to expert), and for new staff
  • More guidance, support and videos
  • Improved lab set-up
  • Provide a list of systems with an outline of what they do
  • Better equipment – headsets for making calls, AV equipment, laptops for all

The recently developed IT and Digital strategy and action plan addresses the vast majority of these areas including:

  1. Work to establish a cross-functional group to produce an action plan for developing our student and staff digital capabilities;
  2. Help to define a set of activities and processes that directly encourage and support staff digital capability, e.g. recruitment requirements, appraisals, promotions, etc.

Huge thanks to everyone who participated in either of the surveys!

These results were presented to the RAU senior managers by Alun Dawes (Head of IT) on 10 September. Going forward we hope to run the staff surveys and the student survey in alternate years. If you have any comments on the survey results, please do get in touch with IT Services.

Thinking holistically about data matters

Yesterday we (myself and one of our students, Alex Norris) attended the Data Matters conference and presented a case study on running our 2018 Student Digital Experience tracker survey.

Marieke Guy and Alex Norris just before giving their presentation


The conference was jointly organised by Jisc, the QAA and HESA and held at etc.venues next to the Museum of London. This was the second year of the event, which brings together data practitioners, quality professionals and digital service specialists to discuss topical issues around data and its use in higher education.

Our presentation was part of a session looking at the Jisc pilot work on better understanding the student digital experience. Ruth Drysdale and Mark Langer-Crame gave an overview of the survey and also shared some data from their version of the survey aimed at staff. There were some interesting variations: for example, staff want to use more digital technology in the classroom, while on the whole students are happy with the level of technology they already have. Note that we hope to run a survey exploring the digital experience of staff over the forthcoming year.


Marieke and Alex presenting – photo courtesy of Ruth Drysdale

Our case study was followed by one from Marc Griffiths, Head of Digitally Enhanced Learning at London South Bank University. Marc had planned to use insights from the survey to inform South Bank's digitally enhanced learning strategy. In reality the survey results have made South Bank question quite a lot of their previous assumptions, for example their mobile-first approach, given that a quarter of their students don't own smartphones.

The first keynote of the day was presented by Professor Nick Petford, Vice-Chancellor and CEO of the University of Northampton. Nick gave an incredibly open account of the data they collect, from social media, websites accessed (the highest hits definitely aren't learning and teaching related – think Facebook, YouTube, QQ and BitTorrent) and VLE usage. Considering this data has allowed the university to act fast to improve the student experience. Nick related his tale of the recent 'toastergate' fiasco, in which students had toasters that didn't work or only allowed two slices of bread. Northampton's aim is to pull together data from Salto, network use, VLE usage, security incidents, learning analytics, timetabling, heat maps etc. into one dashboard using Power BI. This in turn will support better decision making.

Nick's talk was followed by a panel discussion on Counting what's measured or measuring what counts: questioning education metrics. There were contributions from Professor Helen Higson, Provost and Deputy Vice-Chancellor, Aston University; Nick Hillman, Director, Higher Education Policy Institute; Charlie Kleboe-Rogers, Vice-President of Academia, Dundee University Students' Association; and Andy Youell, previously of HESA and now a writer, speaker and strategic advisor at Andy Youell Associates Ltd. The biggest tweetable points came from Nick Hillman, who suggested that we consider creating more metrics and league tables, as this will (to some extent) make them less meaningful; universities can then pick the data they want to be framed by. He also suggested that we start asking applicants and staff more, rather than just focusing on students.

David Boyle's data lessons


The post-lunch keynote, enticingly titled Analytics and the Student Experience: Lessons from Politicians, Pop Stars and Power Brands, was presented by David Boyle, Customer Insights Director, Harrods. David's talk considered the data answers to why Hillary Clinton lost the 2016 US presidential election. His answer explored four key lessons for the campaign team:

  • Cluster always – clustering tools like Affinio can offer major insights.
  • Multiple sources and data science – don't let any one data set sit separate from the others; think holistically. You need behavioural data, research data, data science and storytelling to find and share the insights. Work on your ocular regression and find the data patterns. If you are making big decisions, make all these people and all these data sets work together. You can even put all the sources into a data soup and come out with one indicator.
  • Augmented experts – combine human skills with the abilities of computer systems.
  • Insights – load data into the heads of decision-makers and train them to use it.

I also attended two more breakout sessions. The first, on understanding your student body through innovative data analysis and the new "Career Explorer" service, was presented by James Jackson, Head of System Development and Integration, Bishop Grosseteste University, and members of the UCAS team. The end result could be a nice little app for students, allowing them to see potential universities for a chosen subject and the likelihood of an offer given their actual or predicted grades.

And a final one on the Intelligent Campus from James Clay of Jisc. You could classify campuses into: dumb campuses (that know very little), smart campuses (that collect data) and intelligent campuses (that use that data to make decisions). So think better room usage (letting students know when the library is ridiculously busy and suggesting a better time to visit), better service provision (why do cleaners clean rooms that haven't been used?), clever alerts (you are walking past the library, so why not collect that book you've reserved?) and more. Some of this work could scale up to the intelligent estate, or focus in on the intelligent timetable, intelligent learning spaces or intelligent library. The Jisc Intelligent Campus project has been working on a list of data sources, use cases and general guidance. There is also the very useful code of practice developed for learning analytics projects.

James Clay outlines the intelligent campus


It was an interesting day. The biggest takeaway was the need to adopt a holistic view and ensure that you are using several data sets, including qualitative data. HE really needs people who can take on this overseeing role and provide narratives on data in order to make it meaningful for decision-makers. As Andy Youell put it: “Data soup is better than the blancmange of opinion”.

Results from the Student digital experience tracker

Back in April we ran the Jisc Student digital experience tracker and are now ready to share the data more widely.

The tracker is a brief (10 minute) online survey that all our students were invited to participate in. The survey gathers data on students' expectations and experiences of technology at their institution. It is run annually, and each institution's data is available for that institution to use; the data is also benchmarked against other institutions.

We asked all our students (in all year groups) to participate. They were sent the link by email, and the survey was also advertised on the student Facebook site and other social media, by posters and by academics. 218 of our students responded to the tracker (an 18% response rate), which breaks down into:

  • Male (44%), female (56%) gender split (Q2)
  • 1st year (56%), middle year (15%), final year (18%), Masters/postgrad (10%) stages of study (Q3)
  • 20% self-identified as needing to use assistive technologies (Q6)

We are really happy with the response rate, and the make-up of respondents seems representative of the University. We also had a good split across centres.

Our key metrics were interesting. We are clearly at the start of our digital transformation with lots to do, but the metrics show two clear priority areas: 1) digital as part of learning and teaching, and 2) the digital and data literacy of our staff and students.


The key findings were that:

  • The amount of technology available at RAU is fine – students want us to make it more user-friendly and get better at training staff and students in how to use it
  • Wi-Fi is very important to students, and they actually think our coverage is OK
  • Students like consistency – on Gateway, in lessons, in tools
  • Students want us to prioritise online, free, up-to-date resources
  • Students don’t feel they are getting the right amount of digital training in tools they need for their course or in the skills they need for the work place
  • Students like computer rooms, printers and charging points
  • Students want more multimedia – videos (of lectures) and images
  • Students appreciate IT support and would like more help with technical issues

We have been working on an action log, which takes the form of a ‘You said, we did’ spreadsheet. Highlighted areas are given a response in one of the following ways:

  • We did (current resolution)
  • We will do (future resolution)
  • Why we can’t do it (explanation)

The log covers areas from accessibility, communication and data, to digital literacy, digital spaces and library resources.

We will be sharing these with students over the next few months.


Academics have also been invited to look at a more detailed analysis of the data if interested – this includes further benchmarked data (against other institutions and GuildHE institutions) and a summary of all the free-text responses. Potentially the raw data could also be distilled and analysed at centre level.

There has already been agreement that the survey will run again, hopefully earlier in the academic year (November?) so as not to coincide with the NSS.

Student digital experience tracker

Today we launched the Jisc student digital experience tracker. The tracker is a national survey to find out about how students use digital technologies and how it affects their experience of learning.

Last year 74 UK institutions participated (there were 22,593 student responses) and the results allowed Jisc to benchmark areas and share themes and trends across the whole Higher Education sector. You can access the 2017 report online.

The tracker is aimed at all our students, undergrad and postgrad. It takes about 10 minutes to complete and will be available during April 2018.


We really want to get as many students as possible to participate in the survey. Although we are interested in the benchmarking aspect of the survey, the main draw is getting a better understanding of how students use the RAU digital environment and services and how we could improve those services. This information will feed into the RAU digital strategy, and into the priorities and spending of the IT Services team. The aim is that we can then target resources towards the issues that matter, improve the student experience and ultimately become a better institution.