Survey Says: The NSS 2023 Changes and What They Mean for Higher Education
With the 2023 NSS period closing at the end of April, how aware are you of the changes to the survey and their implications for analysis and reporting?
For 2023, the Office for Students' (OfS) National Student Survey (NSS) has undergone some significant changes, which have been met with mixed reactions from instructors and which will undoubtedly affect how institutions and students use the data moving forward. These changes follow a thorough review and consultation process between 2020 and 2022, aimed at ensuring the survey remains fit for purpose and reflects current pedagogic practice and the diversity of provision across the sector.
Naturally, however, the relaunch of a ‘new’ NSS has led to some debate about the purpose and standing of the survey, raising questions about the autonomy of universities and the role of student voice in the regulatory regime.
Has the new NSS become too much of a tool for public accountability and regulation, rather than serving its original purpose of enhancing the student experience?
The OfS argues that regulation of providers across the UK has been a key purpose of the survey since its introduction in 2005, and that this hasn't changed with the new iteration. The NSS sits within a broader set of measures aimed at enhancing the quality of higher education in England, working alongside the OfS's new and revised quality and standards conditions and guidance, and providing evidence of the quality of teaching and student outcomes as part of the TEF assessment process. "Use of data from the NSS as regulatory intelligence has [therefore] always been part of the regulatory framework that was consulted on with the sector and others," says the Consultation on changes to the NSS - Responses and decisions.
How appropriate is the addition of the two new questions on wellbeing and freedom of expression?
One of the most significant changes to the NSS this year is the addition of two new questions on wellbeing and freedom of expression. The question on wellbeing asks students to rate how well their university supported their mental and emotional wellbeing during their time there, while the question on freedom of expression asks students to rate how well their university promoted and protected freedom of speech and expression.
The addition of the two new questions has not been universally welcomed, with some concerned that their inclusion deviates from the survey's main emphasis on the student academic experience, and objections that these areas are not solely within a provider's control. Furthermore, unlike the core questions, which are common to all institutions and consistent over time, these questions could be deemed more topical and transitory. Some critics might even go so far as to view their addition as political interference in the running of universities.
However, others view this as a positive step towards promoting a more holistic view of the student experience. Mental health and freedom of expression are both crucial aspects of student life, and it is important that universities are held accountable for how they support and protect these areas.
The OfS states that freedom of expression is a theme consistently raised by stakeholders, with students seeing it as key to inclusion and belonging but increasing numbers of staff and students reporting that they feel unable to express cultural, religious or political views without fear of repercussion. The inclusion of this question is intended to protect against the growing phenomenon of ‘cancel culture’ across many western societies which can lead to self-censorship and a reluctance to express or even discuss views that may provoke backlash. The OfS believes freedom of expression to be an important element of academic freedom and that students should be “entitled to be taught by staff holding a wide range of views, even where these may be unpopular or controversial, and to similarly express their own views.”
In recognition that these questions sit outside of the core, the OfS has in fact added them as additional questions, signalling that they cover factors beyond the core academic experience that are not entirely within a provider's control but can be influenced by providers and should be monitored as part of the wider educational experience.
Change in the wording of the survey = impact on trend data
Three-quarters of respondents to the consultation were concerned about the change in the format of the questions from a Likert scale (agree, disagree, etc.) to direct questions with a four-point scale, which inevitably entails a full stop and a new beginning for trend data. From the practical consideration that it will be harder to evidence improvements over time for the TEF, to the wider impact on institutions' ability to analyse and use survey information when the response scale varies from question to question, the worry is that the new format could limit the usefulness of the survey and reduce clarity.
In rebuttal, the OfS points out that in phase one of the review there had been a strong desire from providers to update the questionnaire; they considered that the existing NSS questions could be improved to reflect current pedagogic practice and the diversity of provision across the sector. There was also a feeling that, with questions remaining static over several years, the momentum to keep driving improvement based on results was dropping off. Therefore, while recognising that changes to the questionnaire would result in a break in trend data, the OfS considered that the improvements outweighed the negatives, pointing out that any change, even minor updates to the Likert questionnaire, would impact trend data, and that the survey cannot be allowed to become gradually less impactful simply to preserve consistency.
Why has the summative question been removed for England but not for the devolved nations?
Perhaps the most confusing development in the NSS this year is the removal of the summative question for England, while it remains compulsory in the devolved nations of Northern Ireland, Scotland and Wales.
The summative question, which asks students to rate their level of agreement with the statement 'Overall, I am satisfied with the quality of the course', is viewed either as a simple and accessible way for students to provide an overall rating for their course, or as an overly narrow and unreliable indicator of quality that could pull in students' views on aspects unrelated to teaching and learning, such as student accommodation or the availability of parking.
Interestingly, the majority of respondents to the consultation were in favour of retaining the summative question across the board, believing that removal of the question would make the data less robust and impede comparison between countries, not least because the absence of data for England, where it is available for the devolved nations, could confuse international students.
Nevertheless, the OfS made the decision to remove it in England – stating that “the benefits that some respondents identified in maintaining the same summative question across the UK are outweighed in England by the need to ensure clear links between the information provided by the NSS and the aspects of quality that are subject to regulation in English providers. We consider that, in England, the importance of focusing on the more granular questions, as set out above, rather than a summative question outweighs any benefits of retaining a summative question in England that does not relate clearly to the OfS’s regulatory requirements.” In other words, the question is seen as insufficiently meaningful to be of use in determining quality.
In the devolved nations, on the other hand, the summative question is a key indicator used in their quality assurance processes, and respondents from the Higher Education Funding Council for Wales (HEFCW) and the Scottish Funding Council (SFC) felt strongly that the continuity and time series of results was important to maintain.
Give your students extra support and structure (without killing yourself)
Our academic consultants spend their time at the best universities and schools across the UK and Europe. Over the past several years we've been working with instructors to help them leverage technology as a key part of educational design to boost student engagement and satisfaction, carrying best practice from institution to institution.
We’re here to listen to your challenges and your objectives and help you develop an approach that allows you to teach your way while giving students extra support and structure.