HE policy update for the w/e 3rd September 2020

So it’s back to school for pupils and teachers, and Parliament is back (although still mostly virtually). What’s in the news?

Ofqual fight back

The House of Commons Education Committee grilled Ofqual this week in a fascinating session – the transcript is here. Before the session, Roger Taylor, the Chair of Ofqual, submitted a written statement, which you can read here.  We thought we would summarise the good bits for you.

Before you skip, though, the obvious question is “does it matter” – or is it all just a witch-hunt?  Clearly it does matter, because some of the same issues that led the government to cancel exams this year still apply – missed school time, uneven opportunities to learn, the implications of a second wave.  In our next segment, we look at the hints about next summer.

If you want to skip the next bit, the conclusion seems to be: Ofqual were handed an impossible brief by the Minister, who made it harder by changing policy on the hoof without asking them; that they had a solution to it all in the form of a better appeals process to address outlying results (like high-performing students in schools with poor previous performance) but never got a chance to roll it out because of the mocks fiasco; that they always thought exams should have gone ahead; and that the algorithm was fair and has been unfairly criticised by people who don’t understand the data!  Gavin Williamson is giving evidence soon, so that will be worth reading.  And Ofqual are going to publish correspondence so everyone can see that it wasn’t their fault….

David Kernohan has written about it for Wonkhe here.

The written statement starts with an apology to students, teachers, and HE and FE providers.  As widely reported on the news channels yesterday, it confirms that Ofqual didn’t want the exams to be cancelled – they wanted them held in a socially distanced way.  Gavin Williamson decided to cancel them because of concerns about lost schooling and the risks of getting students back into schools.  So the now well-known solution and the now well-known moderation process were adopted.

You will recall this decision was announced on 18th March – which was very early – and might be said to have shown decisiveness and the desire to provide certainty in a complex situation.  But of course that assumes that the alternative was going to be a good one and not a mutant one, which we all hoped it would be…..

In the evidence session, Roger Taylor said that after Ofqual offered advice on options:

  • It was the Secretary of State who then subsequently took the decision and announced, without further consultation with Ofqual, that exams were to be cancelled and a system of calculated grades was to be implemented. We then received a direction from the Secretary of State setting out what he wished Ofqual to implement.

In the statement, Ofqual say:

  • The principle of moderating teacher grades was accepted as a sound one, and indeed the relevant regulatory and examination bodies across the four nations of the United Kingdom separately put in place plans to do this. All the evidence shows that teachers vary considerably in the generosity of their grading – as every school pupil knows. Also, using teacher assessment alone might exacerbate socio-economic disadvantage. Using statistics to iron out these differences and ensure consistency looked, in principle, to be a good idea. That is why in our consultations and stakeholder discussions all the teaching unions supported the approach we adopted. Indeed, when we consulted on it, 89% of respondents agreed or strongly agreed with our proposed aims for the statistical standardisation approach.

And they knew there were risks, but believed that on the whole the averaged-out effect was correct:

  • We knew, however, that there would be specific issues associated with this approach. In particular, statistical standardisation of this kind will inevitably result in a very small proportion of quite anomalous results that would need to be corrected by applying human judgment through an appeals process.
  • For example, we were concerned about bright students in historically low attaining schools. We identified that approximately 0.2% of young people’s grades were affected by this but that it was not possible to determine in advance which cases warranted a change to grades. That is why the appeals process we designed and refined was so important. But we recognise that young people receiving these results experienced significant distress and that this caused people to question the process.

In the evidence session, Roger Taylor was asked about this and he said:

  • It was clear that to make a valid judgment would require a degree of human judgment and therefore a form of appeal would be necessary to make this work, but we were also exploring with the exam boards how we could implement a system of outreach to those students through the exam boards to let them know on the day, “Look, we think you’ve probably got a very good case for appeal.” That was the direction we were moving in. When the mock appeals route came in, that question became less relevant.

And they are still defending it:

  • The statistical standardisation process was not biased – we did the analyses to check and found there was no widening of the attainment gap. We have published this analysis. Indeed, ‘A’ and ‘A*’ grade students in more disadvantaged areas did relatively better with standardised results than when results were not standardised.

They were challenged on this in the evidence session.

  • Robert Halfon, the chair, asked about it: The Department for Education confirmed on 14 August that pupils from lower socioeconomic groups were more likely than their peers to have their centre assessed grades downgraded by Ofqual’s algorithm at grades C and above. The difference between Ofqual’s moderated grades and teacher centre assessed grades for lower socioeconomic groups was 10.42%. In contrast, the difference between Ofqual’s moderated grades and teacher centre assessed grades for higher socioeconomic groups was 8.34%.
  • Michelle Meadows, Executive Director for Strategy and Research, replied: We had done a full equalities analysis, looking at the grades not just by socioeconomic status but by other protected characteristics such as ethnicity, gender and so on, and what we were able to see and we were very confident about was that any fluctuation in outcomes seen for these various groups this year was extremely similar to the small changes in outcomes we had seen in previous years. In other words, there was nothing about the process that was biased.

And when challenged about the impact on individual students, Roger Taylor said in the evidence session:

  • I disagree with the notion that this algorithm was not fit for purpose or that a better algorithm would have produced a different result; but I strongly agree with your statement that to say this was fair just fails to recognise what happens to students—just the level of accuracy that was fundamentally possible with the information that was available was too low to be acceptable to individuals, and we recognised this right at the outset. We identified this as a risk.

And on small class sizes etc.:

  • However, the impossibility of standardising very small classes meant that some subjects and some centres could not be standardised, and so saw higher grades on average than would have been expected if it had been possible to standardise their results. This benefitted smaller schools and disadvantaged larger schools and colleges. It affected private schools in particular, as well as some smaller maintained schools and colleges, special schools, pupil referral units, hospital schools and similar institutions. We knew about this, but were unable to find a solution to this problem. However, we still regarded standardisation as preferable because overall it reduced the relative advantage of private schools compared to others.
  • Ultimately, however, the approach failed to win public confidence, even in circumstances where it was operating exactly as we had intended it to. While sound in principle, candidates who had reasonable expectations of achieving a grade were not willing to accept that they had been selected on the basis of teacher rankings and statistical predictions to receive a lower grade. To be told that you cannot progress as you wanted because you have been awarded a lower grade in this way was unacceptable and so the approach had to be withdrawn. We apologise for this.

And here is the killer statement:

  • With hindsight it appears unlikely that we could ever have delivered this policy successfully.

And whose fault is it?

  • Understandably, there is now a desire to attribute blame. The decision to use a system of statistical standardised teacher assessments was taken by the Secretary of State and issued as a direction to Ofqual. Ofqual could have rejected this, but we decided that this was in the best interests of students, so that they could progress to their next stage of education, training or work.
  • The implementation of that approach was entirely down to Ofqual. However, given the exceptional nature of this year, we worked in a much more collaborative way than we would in a normal year, sharing detailed information with partners.
  • We kept the Department for Education fully informed about the work we were doing and the approach we intended to take to qualifications, the risks and impact on results as they emerged. However, we are ultimately responsible for the decisions that fall to us as the regulator.
  • …. The blame lies with us collectively – all of us who failed to design a mechanism for awarding grades that was acceptable to the public and met the Secretary of State’s policy intent of ensuring grades were awarded in a way consistent with the previous year.

Autumn exams: It was clear to everyone that autumn exams would be a problem for those intending to start university this year.  No plan or proposal was made for this, apart from ministerial exhortations that universities should be flexible, and vague references to a January start.  Added to an absolute prohibition on unconditional offers, it was hard to see what universities were meant to do. Ofqual say:

  • “the original policy was adopted on the basis that the autumn series would give young people who were disappointed with their results, the opportunity to sit an examination. However, the extended lockdown of schools and the failure to ensure that such candidates could still take their places at university meant that this option was, for many, effectively removed. This significantly shifted the public acceptability of awarding standardised grades”

I have no idea what that means….but it looks like blaming the context for the problems.  Roger Taylor clarified it in the evidence session:

  • When the decision was originally made, there was a strong belief that the autumn series would be the compensation for that—that people would be given a chance and that university places could be held open for them that they could take in January, and that that would limit that damage. At the time, it was felt that it was a fair offer, but of course, over time, schools did not reopen; there were no arrangements for late entry to university; and by July, it was clear that the autumn series did not represent any sort of reasonable alternative that candidates felt would make up for being given an inaccurate calculated grade. At that point, we were in a situation where it was difficult to see how people would accept it as a fair way to have their grades awarded.

Autonomy and influence

  • Roger Taylor: The relationship is one in which the Secretary of State, as the democratically accountable politician, decides policy. Ofqual’s role is to have regard to policy and to implement policy, but within the constraints laid down by the statute that established Ofqual. Those constraints are that the awarding of grades must be valid, it must maintain standards year on year, and it must command public confidence. We can decide not to implement a direction from the Secretary of State if we feel that it would directly contradict those statutory duties, but if the policy does not directly contradict those statutory duties, our obligation is to implement policy as directed by the Secretary of State.

There was a bit more about this in the evidence session when Roger Taylor was asked about the mock appeals policy (see below) and he said:

  • It is important, in trying to manage public confidence, that we do not have a Secretary of State stating one policy and Ofqual stating a different policy. It also struck us that the way to resolve this was to move at pace and it needed to be negotiated and managed in an orderly fashion. But we were acting with full independence.

The comings and goings about the use of mock results in appeals were discussed at length:

  • Roger Taylor: the Secretary of State informed us that, effectively, they were going to change policy. Until that point, the policy had been calculated grades plus an appeals process. The Secretary of State informed me that they were planning to change this policy in a significant way by allowing an entirely new mechanism by which a grade could be awarded through a mock exams appeal. Our advice to the Secretary of State at this point was that we could not be confident that this could be delivered within the statutory duties of Ofqual, to ensure that valid and trustworthy grades were being issued. The Secretary of State, as he is entitled to do, none the less announced that that was the policy of the Government.
  • That having been announced as the policy of the Government, the Ofqual board felt—I think correctly—that we should therefore attempt to find a way to implement this in a way that was consistent with our statutory duties. We consulted very rapidly with exam boards and other key stakeholders. We were very concerned that this idea of a valid mock exam had no real credible meaning, but we consulted very rapidly and developed an approach that we felt would be consistent with awarding valid qualifications. We then agreed that with the Department for Education and, to our understanding, with the Secretary of State’s office. We then published this on the Saturday. We were subsequently contacted by the Secretary of State later that evening and were informed that this was in fact not, to his mind, in line with Government policy.
  • ….It was published about 3 o’clock on the Saturday. I think the call from the Secretary of State was probably at around 7 o’clock, 8 o’clock that evening. The Secretary of State first phoned the chief regulator. …
  • The Secretary of State telephoned me and said that he would like the board to reconsider. ….given the Secretary of State’s views, it felt appropriate to call the board together very late that evening. The board convened at, I think, around 10 o’clock that evening. I think at this stage we realised that we were in a situation which was rapidly getting out of control—that there were policies being recommended and strongly advocated by the Secretary of State that we felt would not be consistent with our legal duties, and that there was, additionally, a growing risk around delivering any form of mock appeals results in a way that would be acceptable as a reasonable way to award grades….

Grade inflation

  • Ian Mearns asked: This is the problem: Ministers are regularly telling us that we have more good and outstanding schools, with the most highly professional teaching profession that we have ever had. Given that process, that improvement and that continuing improvement, should there not be some increase in the levels of achievement by youngsters year on year that cannot be put down as grade inflation?
  • Roger Taylor replied: On your point about grade inflation, we were very aware that being very strict about grade inflation would only make this situation worse. That is why, in the design of the model, at every point where we could reasonably do this, we erred in the direction of making decisions that allowed grades to rise. Consequently, the final result of the moderated grades did allow for between 2% and 3% inflation in grades which, in assessment terms, is very significant and larger than would represent the sorts of effects that you talked about resulting from improvements in teaching, but we felt that that was appropriate in these extremely unusual circumstances, given the disruption happening in people’s lives as a result of the pandemic.

Issues with CAGs:

  • David Simmonds MP said that he has had more complaints about the U-turn and the fairness of the CAGs than about the original grades. There is concern about the lack of opportunity for students to appeal these grades.
  • Roger Taylor said: It goes to the nature of the problem: there is not an independent piece of information that can be used to determine between these two competing claims. That is why the lack of any form of standardised test or examination makes this a situation that people find very hard to tolerate.

On private students (who have to take exams in the autumn):

  • Roger Taylor: I have huge sympathy with these people. Clearly, they have been some of the people who have lost out most as a result of the decision to cancel exams. I will hand over to Julie to say a little bit more about this, but once the decision had been taken to cancel exams, it was very hard to find a solution. We explored extensive solutions, but ultimately the situation was one in which, once exams had been cancelled, these people had lost the opportunity to demonstrate their skills and knowledge in a way that would enable them to move forward with their lives. That was the situation we were in.

On the tiering problem (students getting a higher grade than their exam tier permits – for example, GCSE foundation tier students, who can’t normally get higher than a 5, being awarded a 6):

  • Michelle Meadows: In the absence of papers this year, we felt that the fairest thing to do was to remove those limits on students’ performance. So there were a very small number of cases where, for the tiered qualifications, less than 1% of foundation tier students received higher grades and, for the higher tier, less than 0.5% received lower grades than they would normally achieve. We felt that it was a decision in favour of students—that they would not be constrained in the normal way.

And on BTECs:

  • Roger Taylor: It was not inevitable that there would be a domino effect, because the use of calculated grades inside the BTEC system was completely different from what had gone on with general qualifications. They were two completely separate pieces: one Ofqual was closely involved with and where we had the authority to make a decision; and the second was one that Pearson were responsible for and where we had no authority to determine how they were going to respond to the situation. That was their call.

And did the algorithm mutate?

  • Ian Mearns: At what point did the algorithm mutate?
  • Dr Meadows: I don’t believe that the algorithm ever mutated.

So what about next year?

There are already discussions about delaying the exams, some elements have been changed, and there is talk of an online option with open-book exams.  Ofqual have now made it extremely clear in the evidence session referred to above that they didn’t want to cancel exams this summer and they certainly don’t want to next summer, but also that they don’t want to rely on moderated CAGs again.  So some form of formal assessment seems likely.  But this one has some way to run.

For what was announced in August, Schoolsweek have a nice round-up of the changes to A levels and GCSEs here.  The Ofqual statement about A levels, AS levels and GCSEs is here.

In their statement referred to above, Ofqual confirm that amongst the lessons learned from this year are some things that will influence next year:

  • any awarding process that does not give the individual the ability to affect their fate by demonstrating their skills and knowledge in a fair test will not command and retain public confidence
  • a ‘better’ algorithm would not have made the outcomes significantly more acceptable. The inherent limitations of the data and the nature of the process were what made it unacceptable

And they say there should have been better comms – and not just by them.

In the evidence session, Roger Taylor said:

  • I think we have been very clear that we think that some form of examination or standardised test, or something that gives the student an ability to demonstrate their skills and knowledge, will be essential for any awarding system that the students regard as fair. We have done some consultation, and have published the results of that consultation, but it is obviously a fast-moving environment, and the impact of the pandemic remains uncertain over the future, so it is something that we are keeping under constant review……I want to be really clear that, absolutely, we raised it in our initial consultation, and we are very conscious of the enormous benefit that would come from delay. We recognise the value in trying to find a way of making this work.

And Julie Swan said:

  • Content for GCSEs, AS and A-levels is of course determined by Ministers, and Ministers, as I am sure you will know, have agreed some changes to content for a couple of GCSE subjects—history, ancient history and English literature. We have published information about changes to assessment arrangements in other subjects that will free up teaching time, such as making the assessment of spoken language in modern foreign languages much less formal. …..as well as allowing, for example, GCSE science students to observe practical science, rather than to undertake it themselves….We are working with the DFE to get to conclusions within weeks, rather than months.

Gavin Williamson’s position

Gavin Williamson gave a statement to the House on Tuesday, on the first day back.  He said very, very little, really.  He apologised and then moved on quickly to talk about schools going back.  David Kernohan has written about this for Wonkhe too.

  • The problem with having a Prime Minister who will only sack officials is that we are forced to watch senior politicians descend into near-Grayling levels of farcical inadequacy without hope of respite. Williamson’s haunted soul screams for release, but still he has to field questions about next summer while struggling to get through the next five minutes.

Research Professional cover it here.

Meanwhile in HE

The Office for Students have today launched a call for evidence into Digital teaching and learning in English Higher Education during the pandemic.  It closes on 14th October 2020.

The review will consider:

  1. The use of digital technology to deliver remote teaching and learning since the start of the pandemic, and what has and has not worked.
  2. How high-quality digital teaching and learning can be continued and delivered at scale in the future.
  3. The opportunities that digital teaching and learning present for English higher education in the medium to longer term.
  4. The relationship between ‘digital poverty’ and students’ digital teaching and learning experience.

If you are interested in contributing to a BU institutional response please contact policy@bournemouth.ac.uk as soon as possible.

Inquiries and Consultations

Have you contributed to a Parliamentary Inquiry?  Many colleagues from across BU have done so over the last year, and inquiries can be relevant for both academic and professional services colleagues.  Your policy team (policy@bournemouth.ac.uk) can help you prepare and submit a response – there are some important rules to follow about content and presentation, but a good submission might result in a call to give oral evidence (by video, these days) or get people talking about your submission.

You can find the list of open Parliamentary inquiries here.  They include (just a few examples):

  • Police conduct and complaints (accepting written evidence until 14th September 2020)
  • Digital transformation in the NHS (until 9th September)
  • Reforming public transport after the pandemic (until 24th September)
  • Biodiversity and ecosystems (until 11th September)
  • Black people, racism and human rights (until 11th September)

And you can also find open government consultations – a small selection (these have longer dates):

  • A call for evidence on a future international regulation strategy
  • Pavement parking
  • Marine energy projects
  • Distributing Covid and flu vaccines
  • Recognition of professional qualifications
  • Marine monitoring
  • Deforestation in UK supply chains
  • Waste management plan for England
  • Front of pack nutrition labelling
  • Review of the Highway Code to improve road safety for cyclists, pedestrians and horse riders

Let us know if you are interested in responding to these or any others.

Subscribe!

To subscribe to the weekly policy update simply email policy@bournemouth.ac.uk.

Did you know? You can catch up on previous versions of the policy update on BU’s intranet pages here. Some links require access to a BU account – BU staff not able to click through to an external link should contact eresourceshelp@bournemouth.ac.uk for further assistance.

External readers: Thank you to our external readers who enjoy our policy updates. Not all our content is accessible to external readers, but you can continue to read our updates which omit the restricted content on the policy pages of the BU Research Blog – here’s the link.

JANE FORSTER                                            |                       SARAH CARTER

Policy Advisor                                                                     Policy & Public Affairs Officer

Follow: @PolicyBU on Twitter                   |                       policy@bournemouth.ac.uk

HE policy update for the w/e 6th December 2019

A fresh selection of educational reports were issued this week. When we issue next week’s policy update the election results will be out.  The campaign has already got a bit over-heated, with leaks from both main parties, edited videos, dodgy data and everyone trying to avoid making the ultimate error in today’s world – the soundbite in which you admit that the interviewer may have a point.  It is becoming increasingly hard to listen to interviews in which people read out their prepared lines and then repeat them over and over again. And it will probably get worse next week.  So we’re going election light in this update.

If you are interested in comparing manifesto pledges, the BBC have an interactive tool here.  And here is our own comparison of the major parties’ take on the key HE issues.

An English atlas of inequality

The Nuffield Foundation have published a new English Atlas of Inequality (created by University of Sheffield) challenging the current one-metric approach to disadvantage in distinguishing between ‘rich’ and ‘poor’ areas.

  • In late 2019, as the nation continues to experience political uncertainty and the machinations of the Brexit process roll on, it seems there is little room in the policy arena for taking action on persistent poverty, deprivation or the level of inequality in England. In fact, it seems like there is little room to even discuss the topic. However, as hard as it may be to envision a return to ‘normal’ politics, it is surely the case that at some point in the future attention will once again turn to the question of inequality, and the growing consensus that something needs to be done about it. Indeed, only two years ago it was one of the few topics where there was an element of consensus across the political spectrum…in their 2017 party political manifestos, all the major parties in England highlighted inequality as a policy challenge that needed to be tackled.

The research uses three separate measures of inequality and compares the results of each measure in ‘travel to work areas’ to outcomes for the population in mortality, poverty and entry to higher education, in order to understand how alternative approaches to understanding inequality can produce very different results. The measures used consider income distribution, a measure of economic imbalance within areas, and geographic clustering of different income groups. The report also stresses the risk of relying on one metric to understand an issue, so they compared all three measures across the geographical classifications of local authority districts, parliamentary constituencies, and ‘travel to work’ areas.

Professor Rae (author, Sheffield) said: 

  • “Our atlas highlights the fact that no one measure of inequality paints the full picture and that methodological diversity is needed before we start to think of solutions to inequality at a local, sub-national and national level. This is a reminder that a policy focus on inequality ought also to be linked to a focus on poverty alleviation and equality of opportunity, but also that how we understand inequality is inextricably linked to how we measure it in the first place.”

An example given in the report is that

  • if inequality alone was seen as a policy problem worth tackling, and the Gini coefficient [income distribution] was the only way we measured it, one could conclude that some of England’s most deprived seaside towns should not be the focal point. We believe such a conclusion would be incorrect.
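
To make the income-distribution measure concrete: the Gini coefficient summarises how evenly income is spread across a population, from 0 (everyone earns the same) to 1 (one person earns everything). Below is a minimal toy sketch of the idea – the function, figures and area names are our own made-up illustration, not the Atlas’s method or data:

```python
def gini(incomes):
    """Gini coefficient via the mean absolute difference between all pairs:
    G = sum(|x_i - x_j|) / (2 * n^2 * mean). 0 = perfect equality."""
    n = len(incomes)
    mean = sum(incomes) / n
    diff_sum = sum(abs(x - y) for x in incomes for y in incomes)
    return diff_sum / (2 * n * n * mean)

# Hypothetical incomes: a deprived but relatively equal seaside town
# versus a richer but far more unequal city.
seaside_town = [14_000, 15_000, 15_500, 16_000, 14_500]
big_city = [12_000, 20_000, 35_000, 60_000, 150_000]

print(f"Seaside town Gini: {gini(seaside_town):.2f}")  # ~0.03 - poor but equal
print(f"Big city Gini: {gini(big_city):.2f}")          # ~0.46 - richer but unequal
```

On this single metric the deprived seaside town looks ‘fine’ – which is exactly the trap the report warns against when the Gini coefficient is the only measure used.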

Alex Beer, Programme Head at the Nuffield Foundation, said:

  • “This English Atlas of Inequality advances our knowledge of how inequalities are distributed at the local level. The Atlas highlights the importance of taking a multi-faceted approach to the study of inequality and to policy making for a more inclusive society.”

The report makes four recommendations:

  1. Take into account the fact that many of the poorest local economies in the country are also the most equal. Methods which increase equality alone are not enough.
  2. Increase the policy focus on the links between geographic dislocation, deprivation and inequality. It is important to consider wider questions of regional and sub-national connectivity and links to the drivers of inequality. There are important connections to be made between transport policy and welfare policy and as such an inter-departmental approach to tackling geographic dislocation is likely to be necessary.
  3. Thorough review of evidence considering whether the ‘majority of deprived individuals and families [do] not live in the most deprived areas’ (Smith et al., 2001; Barnes and Lucas, 1975). Rather than viewing this issue as an arcane methodological question, finding a definitive answer should be a policy priority. When it comes to tackling persistent poverty through policy intervention, it may be right to focus on the most deprived locations if they contain the highest proportions of poor households and residents, yet doing this in isolation may lead to reduced effectiveness if poorer residents living elsewhere are overlooked. This is a fairly obvious point but it is a gap in the academic and policy literature – there is no definitive answer on the proportion of ‘poor people’ who do or don’t live in ‘poor areas’.
  4. Any approaches which seek to understand the true nature of inequalities should incorporate an explicit measure of spatial disparity: it’s clear from our analysis in this Atlas that the story of inequality in England is an inherently spatial one and as such we believe it should also be measured as one, in addition to [income]. The authors say this point is threaded through the literature on urban and regional inequalities (e.g. Beatty and Fothergill, 1996; Bell et al., 2018), which often highlights quite striking spatial imbalances at the regional level.

On local areas: Dorchester and Weymouth are rated the 11th least unequal in the country. Portsmouth and Southsea are among the most unequal. (Remember, areas can be poor but still equal.) You can also delve into all the map detail for different areas here (e.g. by constituency, by travel to work area, and by local authority area).

  • “Too often the debate takes place in silos, focusing on just one type of inequality, a specific alleged cause or a specific proposed solution. We need to step back and ask: how are different kinds of inequality related and which matter most? What are the underlying forces that come together to create them? And crucially, what is the right mix of policies to tackle inequalities?”

(Joyce, R. and Xu, X. (2019) Inequalities in the twenty-first century, Introducing the IFS Deaton Review, Institute for Fiscal Studies and the Nuffield Foundation, London.)

Academic Mismatch

UCL and the Nuffield Foundation have launched ‘Mismatch in Higher Education’. Mismatch is a term that’s become very popular in widening participation and governmental circles recently, particularly after the Behavioural Insights Team considered how they could use nudge theory to tackle academic mismatch. A ‘mismatch’ is when a student selects or attends a course/institution which is less or more selective (competitive) than their academic achievement might warrant.

In the Nuffield investigation a course was benchmarked using the median A-level (and equivalent) exam results of the students studying on the course, as well as the average earnings of previous graduates of the course. The report finds that there is significant under- and over-match in the UK. It also confirms the widely held belief that there are substantial socio-economic status (SES) and gender gaps in mismatch, with low SES students and women attending lower quality courses than their attainment might otherwise warrant. Past universities ministers Sam Gyimah, Chris Skidmore and (briefly) Jo Johnson all picked up the theme of ensuring the most capable students from disadvantaged backgrounds aspired to and were able to access the most selective institutions. Under-matching by disadvantaged students and women has ramifications for social mobility and the gender pay gap.
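
To illustrate the underlying idea, here is a toy sketch of how a match/mismatch classification could work – our own hypothetical simplification, in which the function name, percentile inputs and tolerance threshold are all illustrative assumptions rather than the study’s actual benchmarking:

```python
def classify_match(student_percentile, course_percentile, tolerance=10):
    """Compare a student's attainment percentile with their course's quality
    percentile (e.g. based on the median A-level points of its intake).
    A gap beyond the tolerance counts as a mismatch. (Illustrative only.)"""
    gap = student_percentile - course_percentile
    if gap > tolerance:
        return "under-matched"  # attainment well above the course benchmark
    if gap < -tolerance:
        return "over-matched"   # course benchmark well above attainment
    return "well-matched"

# e.g. a student at the 90th percentile of attainment on a course whose
# intake sits at the 60th percentile:
print(classify_match(90, 60))  # under-matched
```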

Key Points:

  • Up to 1 in 4 students from lower socio-economic backgrounds take courses at ‘less prestigious’ universities despite having the grades for ‘more selective’ institutions.
  • 15% of students were over-matched and 15% under-matched using the course quality measure; 23% were over-matched and 23% under-matched based on earnings.
  • The school attended accounted for much of the ‘mismatch’ among lower socio-economic students, most likely due to influential factors such as peers, school resources and what information, advice and guidance (IAG) is offered.
  • Disadvantaged students were more likely to attend universities close to home, but those who do so are worse matched than richer students who attend universities close to home.
  • High attaining disadvantaged students going to universities near home were more likely to attend a post-1992 institution, whereas high attaining advantaged students staying near home were more likely to attend a Russell Group university.
  • Interestingly, the report suggests 50% of US students are mismatched and that students from ethnic minority backgrounds are likely to under-match; however, this is not replicated in the UK context.

The data points above are taken from a report by Dods Political Consultants: at the time of writing the full report findings have not been released to the public (beyond those attending the launch event), so we have not been able to verify their accuracy.

Cheryl Lloyd, Programme Head at the Nuffield Foundation, says:

  • “This research highlights that students from different backgrounds but with similar abilities are making very different choices when it comes to the university courses they decide to study. To overcome the significant socio-economic and gender inequalities students face when choosing university courses, it is clear that they need equal access to the information, advice and support they require to make informed choices about their future.”

Co-author, Professor Lindsey Macmillan (UCL Institute of Education) explains:

  • “While women enrol in courses that are as academically prestigious as men, they are more likely to attend courses which command lower average earnings. This is, in large part, driven by the different subjects studied by men and women at university. These findings have important implications for the gender pay gap.”

The student take on data security

HEPI have published Students or data subjects? What students think about university data security.

The research stems from the volume of data HEIs collect on students, both for regulatory purposes and to gather information about the student experience. The authors suggest the volume of data collected will increase further as the Government’s focus on measuring universities’ performance through metrics and the internal analysis of data increases. Key points:

  • 32% of students surveyed agree they are aware of how their institution handles their personal data; 45% disagree and 22% are undecided.
  • Students surveyed do not feel they have been provided with clear information on how their personal data are used. 31% feel their institution has clearly explained how their personal data are used and stored, compared to 46% who disagree (24% who neither agree nor disagree).
  • When asked whether students are concerned about rumours of universities facing data security issues, 69% of students said they are concerned. Around one-fifth of students (19%) are unconcerned and 12% are unsure.
  • 65% of students said a poor security reputation would have made them less likely to apply, compared to around a third (31%) who said it would have made no difference and 4% who said it would have made them more likely to apply.
  • Under half of students feel their university will keep their data safe: only 45% of students feel confident that their institution will keep their personal data secure and private, while 22% are not confident. A third (33%) are unsure.
  • 64% of students say that when sharing personal information online, they check to see if the source is trustworthy and secure. 17% don’t check.
  • Students were split in their knowledge of data privacy and ethics news: 36% keep current on ethical developments whilst 37% don’t.
  • 93% of students feel they should have the right to view any personal information their university stores about them, 2% disagree. 86% also felt they should have the right to delete any personal data the institution holds about them.
  • Students do not want their health information shared widely. 83% of students expect their medical information to be kept private between their institution and themselves. 5% say they would expect it to be shared with commercial and business services, 10% with government services, and 2% more widely.
  • When asked about information provided to student support and welfare services, 78% say they expect the information to be kept private between them and their institution.
  • A quarter of students (26%) said they are comfortable with their HEI reviewing their social media posts, if it allows them to better identify and target struggling students with wellbeing support services. 57% were opposed to this and 17% neither agreed nor disagreed.
  • On sharing health or wellbeing information with a student’s parents/guardians 48% were happy for institutions to do this; 33% disagreed, 19% were undecided. However, on contacting parents/guardians over academic performance issues only 35% of students were happy for this to take place, 48% were opposed and 17% undecided.

Rachel Hewitt, HEPI’s Director of Policy and Advocacy, said:

  • ‘Students are required to provide large amounts of data to their universities, including personal and sensitive information. It is critical that universities are open with students about how this information will be used.
  • Under a third of students feel their university has clearly explained how their data will be used and shared and under half feel confident that their data will be kept secure and private. Universities should take action to ensure students can have confidence in the security of their data.’

Michael Natzler, HEPI’s Policy Officer, said:

  • ‘Students are generally willing for their data to be used anonymously to improve the experience of other students, for example on learning and mental wellbeing. Around half are even happy for information about their health or mental wellbeing to be shared with parents or guardians.
  • However, when it comes to identifiable information about them as individuals, students are clear they want this data to be kept confidential between them and their institutions. It is important that universities keep students’ data private where possible and are clear with students when information must be shared more widely.’

On learning analytics, the majority of students were happy for their anonymised data – on access to university buildings, online platform usage and library books checked out – to be aggregated into patterns and used as insights for other students and lecturers, to forecast whether future students will drop out, and to predict their own performance from the behaviour of similar past students (including the possibility of dropping out).

HEPI concluded:

  • A clear majority of students are happy for the university to use their own and other students’ data to enhance the learning and mental wellbeing of students at university. However, students do not want personal data and data related to learning to be shared outside the student-university relationship.
  • Students expect and demand privacy around their data, while being aware of the positive outcomes responsible usage can bring. Understanding of how student data are used is lower than it ought to be, which universities should work to address, but the message about how students want their data used is clear and must be listened to.

PISA results

The DfE have published the PISA (Programme for International Student Assessment) 2018 reports covering the four nations of the UK. Once every three years PISA measures 15-year-old school pupils’ abilities in reading, mathematics and science through ‘their competence to address real-life challenges’. PISA is administered by the OECD (Organisation for Economic Co-operation and Development). It is a snapshot assessment checking how countries are performing relative to each other.

  • In PISA 2018, mean scores in England were significantly above the OECD averages in all 3 subjects. The mean scores in reading and science in England have not changed significantly over successive PISA cycles, but in mathematics, England’s overall mean score showed a statistically significant increase compared with PISA 2015.
  • England’s mean score for reading was similar to scores for Scotland and Northern Ireland, and all 3 had scores significantly higher than Wales. In both science and mathematics, the mean scores for England were significantly higher than the scores for Wales, Scotland and Northern Ireland, which were not significantly different from each other
  • Closing the gap – the top performers in reading were the east Asian jurisdictions of China, Singapore, Macao and Hong Kong, with Estonia, Canada and Finland also scoring highly. In PISA 2018 there were 9 countries where the mean reading score was statistically significantly higher than that in England, compared to 12 countries in PISA 2015.
  • In common with all other participating countries, girls in England outperformed boys in reading. However, the gender gap in England was significantly smaller than the average gap across the OECD.
  • In England, the gap between high and low achievers in science was significantly larger than the OECD average, with a larger proportion of pupils in England performing at the highest proficiency levels.
  • There was no statistically significant gap between performance of boys and girls in science in England, which was also the case in PISA 2015. This differs from the OECD average where there was a small but statistically significant gender gap in favour of girls.
  • England’s mean score in mathematics was significantly higher than in PISA 2015, which is the first time performance has improved after a stable picture in all previous cycles of PISA. The size of the gap between scores of the highest and lowest achievers in England was similar to the OECD average.
  • Boys in England significantly outperformed girls in mathematics, as was also the case for the OECD average. The gap between boys and girls in England was similar to that in PISA 2015.

TES covered the release in ‘PISA results must be a relief for the government (but there are still many challenges that we must address)’. This includes England’s higher scores for pupil dissatisfaction and poorer wellbeing, and the finding that many pupils said they only read if they have to, not for enjoyment – which the article says is of concern, given the importance of reading for future learning and for stimulating the creativity and imagination sought after by employers.

Inquiries and Consultations

Click here to view the updated inquiries and consultation tracker. Email us on policy@bournemouth.ac.uk if you’d like to contribute to any of the current consultations. There are not any new consultations or inquiries this week because we are still in the purdah period.

Other news

Transnational Education: The Government have released statistics estimating the revenue generated through transnational education (TNE) and other education-related exports. The HE highlights:

  • HEIs contributed £14.4 billion (67%) of the total value of £21.4 billion. This represents +7% growth between 2010 and 2017. The revenue from other stages of education such as FE and schools is smaller, at £0.3 billion and £1.0 billion respectively.
  • The share of English Language Training (ELT) and FE (non-EU students) have both fallen – the ELT share dropping from 14% to 7% and FE dropping from 6% to 1%.

New Welsh Tertiary System: The Welsh Children, Young People and Education Committee have published their final report scrutinising the HE (Wales) Act 2015. This report aims to showcase evidence to learn the lessons of the 2015 Act, which is considered unsuccessful and set to be repealed. The report also sets the scene to influence the preparation of the forthcoming Tertiary Education Bill. The new bill will establish a new Tertiary Education and Research Commission for Wales, which will oversee the entire post-16 education system.  Lynne Neagle AM, Committee Chair, said:

  • We heard quite considerable criticism of the HE Act, mainly focusing on its failure to create a complete system of HE regulation, its unsatisfactory addressing of student interests, and it not providing an effective means to align providers behind national priorities. These issues are of such consequence, and are so much a part of the fabric of the 2015 Act, that we agree with the Minister’s intention to repeal. Because it is to be repealed, the recommendations we make in this report in relation to it are what we think are realistically possible before any new tertiary education and research Commission is established.

GCSE changes: The Sutton Trust has released the report Making the Grade analysing the impact of GCSE reforms on the attainment gap between disadvantaged pupils and their peers. Read the executive summary for a main synopsis. In general, the gaps between disadvantaged pupils and their more advantaged peers have not changed significantly (except for triple science), partly due to the conscious maintenance of grade boundaries and the comparable outcomes approach. Of concern is that fewer disadvantaged students are achieving the highest marks and grades – potentially impacting on future social mobility, as fewer disadvantaged students achieve the top grades needed to apply to the most selective institutions, and on their graduate wages, given the focus top employers place on recruiting from the selective institutions.

Kevin Courtney, Joint General Secretary of the National Education Union, commented on the report:

  • “It is absolutely not surprising that the attainment gap between disadvantaged pupils and others has widened as a result of the Government’s GCSE reforms. These reforms were unplanned, had no meaningful consultation with teachers and no proper lead-in time. The exams now cover an unmanageable amount of content for many students, and unlike in real life the students have to sit them once-and-for-all at the end of the course.
  • Both these issues are causing real problems… whilst under the previous system 2% of disadvantaged pupils achieved the top grade (of A*), it is now just 1% that achieve a grade 9. The Sutton Trust is right to say that this may have negative impacts on these students when they are applying for university places.
  • A survey of National Education Union members found that 73% thought that pupil mental health was worse due to the new GCSE reforms and 64% said the reformed courses did not reflect students’ abilities as accurately.
  • We need to see a system in place that plays to all pupils’ strengths to ensure they get the qualifications they deserve.”

Subscribe!

To subscribe to the weekly policy update simply email policy@bournemouth.ac.uk.

JANE FORSTER                                            |                       SARAH CARTER

Policy Advisor                                                                     Policy & Public Affairs Officer

Follow: @PolicyBU on Twitter                   |                       policy@bournemouth.ac.uk