
EU Info Day: ‘Health, demographic change and well-being’, Brussels 8/12/17

There will be an EU Societal Challenge 1 Health & Wellbeing Info Day on 8 December 2017 in Brussels. As you can see from the outline below, there are opportunities for BU across all four faculties within this part of the Work Programme for 2018-2020, which is due to be released in October.

 

With the principle of better health for all at its core, Horizon 2020’s Societal Challenge 1 (Health, demographic change and well-being) focuses on personalised health and care, infectious diseases and improving global health, innovative and sustainable health systems, decoding the role of the environment (including climate change) in health and well-being, and digital transformation and cybersecurity in health and care. The Horizon 2020 Societal Challenge 1 Work Programme 2018-2020 is expected to offer calls for proposals with an overall budget of about €2 billion.

The draft programme and registration details will follow at the end of September.

Related to the Health Open Info Day, the Directorate-General for Research & Innovation supports the following two events, both to be held on 7 December 2017, also in Brussels:

Partnering Event – organised by the EU-funded project Health-NCP-Net 2.0 – this event aims to help you find the right project partners for the upcoming 2018 health calls. Registration opens on 8/10/17.

Satellite event on Innovation Procurement in health care – limited to 80 participants on a first-come, first-served basis; booking is already open.

Booking links and further information are on the main Info Day page.

Please let Emily Cieciura, RKEO Research Facilitator: EU & International, know if you plan to attend.

HE policy update for the w/e 1st September 2017

We continue our series of summer updates focussing on themes rather than news with a look at learning gain. We also have updates on the Industrial Strategy and the Bell review of Life Sciences, and an update on the TEF from UUK.

Learning Gain

Learning gain has become a potential hot topic for universities over the last year – could it be the magic bullet for problems with TEF metrics? Why is it a policy issue, and what are the implications of the policy context for universities and students? Wonkhe published a helpful summary in July by Dr Camille B. Kandiko Howson of King’s College London.

Background – TEF – The Teaching Excellence Framework (TEF) includes learning gain alongside student outcomes more generally as one of its three main criteria for assessing teaching excellence (the others are teaching quality and learning environment).  The relevant TEF criteria are:

Student Outcomes and Learning Gain  
  • Employment and Further Study (SO1): Students achieve their educational and professional goals, in particular progression to further study or highly skilled employment
  • Employability and Transferable Skills (SO2): Students acquire knowledge, skills and attributes that are valued by employers and that enhance their personal and/or professional lives
  • Positive Outcomes for All (SO3): Positive outcomes are achieved by its students from all backgrounds, in particular those from disadvantaged backgrounds or those who are at greater risk of not achieving positive outcomes

Further definition was given in the “Aspects of Quality” guidance (see the TEF guidance issued by HEFCE):

“Student Outcomes and Learning Gain is focused on the achievement of positive outcomes. Positive outcomes are taken to include:

  • acquisition of attributes such as lifelong learning skills and others that allow a graduate to make a strong contribution to society, economy and the environment,
  • progression to further study, acquisition of knowledge, skills and attributes necessary to compete for a graduate level job that requires the high level of skills arising from higher education

The extent to which positive outcomes are achieved for all students, including those from disadvantaged backgrounds, is a key feature. The distance travelled by students (‘learning gain’) is included”.

And it goes on:

  • “Work across the sector to develop new measures of learning gain is in progress. Until new measures become available and are robust and applicable for all types of providers and students, we anticipate providers will refer to their own approaches to identifying and assessing students’ learning gain – this aspect is not prescriptive about what those measures might be.”

The TEF guidance issued by HEFCE included examples of the sorts of evidence that universities might want to consider including (amongst a much longer list):

  • Learning gain and distance-travelled by all students including those entering higher education part-way through their professional lives
  • Evidence and impact of initiatives aimed at preparing students for further study and research
  • Use and effectiveness of initiatives used to help measure and record student progress, such as Grade Point Average (GPA)
  • Impact of initiatives aimed at closing gaps in development, attainment and progression for students from different backgrounds, in particular those from disadvantaged backgrounds or those who are at greater risk of not achieving positive outcomes.

TEF Assessment – If you have been following the debates about the TEF in Year 2 (results now published), you will be aware that the assessment of institutions against these criteria was done in two ways – by looking at metrics (with benchmarking and subdivision into various sub-sets), and by review of a written provider submission.

  • The metrics used in TEF Year 2 for Student Outcomes and Learning Gain came from the Destinations of Leavers from Higher Education survey (DLHE), specifically the activity declared six months after graduation – were graduates in employment or further study, and if in employment, was it “highly skilled” as defined by SOC groups 1-3 (managerial and professional).
  • So the metrics used in Year 2 of the TEF do not cover learning gain at all. In fact they only really relate to SO1 above and are of limited use in terms of employability for SO2, because DLHE measures employment rather than employability. Of course, DLHE is being replaced, after major consultations by HESA throughout 2016 and 2017, with the new Graduate Outcomes survey, which will take a longer-term view and look at a broader range of outcomes (read more in our Policy Update of 30th June 2017).
  • So for TEF Year 2, any assessment of learning gain was done through the written submissions – and, as noted above, there are no prescribed measures for this; it was left to providers to “refer to their own approaches to identifying and assessing students’ learning gain”.

Universities UK have published their review of Year 2 of the TEF (see next section below) which includes a strong endorsement from UUK members for a comparative learning gain metric in future iterations of the TEF.

Measuring Learning Gain – As referred to above, there is a HEFCE project to look at ways of measuring learning gain.

They are running 13 pilot projects:

  • careers registration and employability initiatives – this  uses surveys and is linked most closely to SO2 – employability
  • critical-thinking ‘CLA+’ standardised assessment tool – also uses the UK Engagement Survey (UKES). CLA+ is a US assessment that is done on-line and asks students to assess data and evidence and decide on a course of action or propose a solution. As such, it measures general skills but is not subject specific.
  • self-efficacy across a range of disciplines
  • skills and self-assessment of confidence measures
  • a self-assessment skills audit and a situational judgement test
  • HE in FE
  • A multi-strand one: standardising entry and exit qualifications, new measures of critical skills and modelling change in learning outcomes
  • a project that will analyse the Affective-Behaviour-Cognition (ABC) model data for previous years
  • research skills in 6 disciplines
  • psychometric testing
  • learning gain from work-based learning and work placements
  • a project evaluating a range of methodologies including degree classifications, UKES, NSS, Student Wellbeing survey and CLA+
  • employability and subject specific learning across a range of methods – includes a project to understand the dimensions of learning gain and develop a way to measure them, one to look at R2 Strengths, one to look at career adaptability and one looking at international experience.

These are long-term (3-year) projects. HEFCE published a year 1 report on 6th July 2017 (you can read more in our 14th July policy update), which flags a couple of challenges, including how to get students to complete surveys and tests that are not relevant to their degree (a problem also encountered by the UKES). The report suggests embedding measurement “in the standard administrative procedures or formal curriculum” – which would mean a survey or test at enrolment and as part of our assessment programme.

To become a core TEF metric, there would need to be a national standard measure implemented across the sector. That would mean either mass testing (like SATs for university students) or another national survey alongside the NSS and the new Graduate Outcomes survey (the replacement for DLHE), with surveys at enrolment and at other points across the student lifecycle.

Some BU staff are taking a different approach – instead of looking at generic measures for generic skills, they have been measuring specific learning gain against the defined learning outcomes for cohorts of students on a particular course. This is a much more customised approach, but the team have set some basic parameters for the questions asked which could be applied to other courses. The methodology was a survey. (Dr Martyn Polkinghorne, Dr Gelareh Roushan, Dr Julia Taylor) (see also a more detailed explanation, March 2017)

Pros, cons and alternatives

In January 2016, HEPI published a speech delivered in December 2015 by Andreas Schleicher, Director for Education and Skills, and Special Advisor on Education Policy to the Secretary-General at the Organisation for Economic Co-operation and Development (OECD) in Paris. In the speech, Schleicher argues strongly for institutions worldwide to measure and use learning gain data. He supports the use of discipline-specific measures, although he points out the difficulties with this – not least in getting comparable data – so he also focuses on generic skills, but he doesn’t suggest a specific methodology.

An HEA presentation from December 2016 mentions a number of inputs that “predict both student performance and learning gains” – including contact hours, class size (and a host of other things including the extent and timing of feedback on assignments).

It is worth looking quickly at GPA (Grade Point Average), as this is also mentioned in the TEF specification as noted above. The HEA are now looking at degree standards for HEFCE, having run a pilot project on GPA in 2013-14. The report notes GPA’s “potential capacity to increase granularity of awards, transparency in award calculations, international recognition and student engagement in their programmes”. The summary says, “The importance to stakeholders of a nationally-agreed, common scale is a key finding of the pilot and is considered crucial for the acceptance and success of GPA in the UK”, and that “The pilot providers considered that the development of widespread stakeholder understanding and commitment would require clear communication to be sustained over a number of years.”

Wonkhe have a round-up of the background to the GPA debate from June 2016.

Although the big focus for the TEF was on outputs not inputs, the Department for Education has announced that it will start to look at including some of the inputs. See our HE policy update for 21st July where we look at the new teaching intensity measure that will be part of the subject level TEF pilots. You can read more about this in a THE article from 2nd August:

  • The pilot “will measure teaching intensity using a method that weights the number of hours taught by the student-staff ratio of each taught hour,” explains the pilot’s specification, published by the Department for Education. “Put simply, this model would value each of these at the same level: two hours spent in a group of 10 students with one member of staff, two hours spent in a group of 20 with two members of staff, one hour spent in a group of five students with one member of staff,” it explains. Once contact hours are weighted by class sizes and aggregated up to subject level, those running the pilot will be able to calculate a “gross teaching quotient” score, which would be an “easily interpretable number” used as a “supplementary metric” to inform subject-level assessments.
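The specification’s exact formula is not reproduced in the article, but the three quoted examples are consistent with weighting each taught hour by its staff-to-student ratio and summing across sessions. A minimal illustrative sketch only (the function name and data layout are our assumptions, not the DfE’s specification):

```python
# Illustrative only: one weighting scheme consistent with the quoted examples,
# in which each taught hour is weighted by its staff-to-student ratio.

def gross_teaching_quotient(sessions):
    """sessions: iterable of (hours, students, staff) tuples for one subject."""
    return sum(hours * staff / students for hours, students, staff in sessions)

# Each of the three examples quoted above gets the same weighted value (0.2):
print(gross_teaching_quotient([(2, 10, 1)]))  # 2 hours, 10 students, 1 member of staff
print(gross_teaching_quotient([(2, 20, 2)]))  # 2 hours, 20 students, 2 members of staff
print(gross_teaching_quotient([(1, 5, 1)]))   # 1 hour, 5 students, 1 member of staff
```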

The contact hours debate is very political – tied up with concerns about value for money and linked to the very topical debate on fees (see the speech on 20th July by Jo Johnson and our HE Policy Update for 21st July 2017).

This is all very interesting when, as mentioned above, the TEF specification for Year 2 put so much emphasis on measuring outcomes and not just inputs: “The emphasis in the provider submission should be on demonstrating the impact and effectiveness of teaching on the student experience and outcomes they achieve. The submission should therefore avoid focusing on descriptions of strategies or approach but instead should focus on impact. Wherever possible, impact should be demonstrated empirically.”

Experts and evidence – There will be a real push from the sector for evidence that the new teaching intensity measure and the reporting of contact hours really do make a difference to students before they are included in the TEF. The HEA’s position on this (2016) is a helpful summary of the debate about contact hours.

There is an interesting article in the HEPI collection of responses to the Green Paper in January 2016 from Graham Gibbs, former Professor at the University of Winchester and Director of the Oxford Learning Institute, University of Oxford, and author of Dimensions of quality and Implications of ‘Dimensions of quality’ in a market environment. He supports the use of learning gain metrics as a useful tool. He points out that “cohort size is a strong negative predictor of both student performance and learning gains”. He also adds that “Russell Group Universities have comparatively larger cohorts and larger class sizes, and their small group teaching is less likely to be undertaken by academics, all of which save money but reduce learning gains”. He does not accept that contact hours or institutional reputation (linked to high-tariff entry and research reputation) affect learning gain.

There is also an interesting article on the Higher Education Policy Institute (HEPI) website written by the authors of a study that looked at class size.

Impact so far – So what happened in the TEF? A very quick and incomplete look at TEF submissions suggests that not many institutions included much about learning gain (or GPA), and those that did seem to fall into two categories: those participating in the pilot, who mention the pilot, and those who mention it in the context of the TEF core data. For example, Birmingham mention their access project and learning gain (but don’t really evidence it except through employment and retention), and Huddersfield talk about it in the context of placements and work experience, again linked to employment outcomes, although they also mention assessment improvement.

Teaching Excellence Framework (TEF) – year 2 review

Universities UK have published their review of Year 2 of the TEF, following a survey of their members.

The key findings from the report are:

  • There appears to be general confidence that the overall process was fair, notwithstanding the outcomes of individual appeals. Judgements were the result of an intensive and discursive process of deliberation by the assessment panel. There was a slight correlation between TEF results, entry tariff and league table rankings.
  • It is estimated that the cost of participating in the TEF for 134 higher education institutions was approximately £4 million. This was driven by the volume of staff engagement, particularly senior staff.
  • Further consideration will need to be given to how the TEF accounts for the diversity of the student body, particularly part-time students, and how the TEF defines and measures excellence. [UUK also raises a concern about judgements possibly being skewed by prior attainment]
  • If subject-level TEF is to provide students with reliable information it must address the impact of increased metric suppression [this relates to metrics which could not be used because of small numbers, particularly for part-time students and for the ethnicity splits], how judgments are made in the absence of data [particularly an issue for those institutions affected by the NSS boycott], the comparability of subject groupings and the increase in cost and complexity of submissions and assessment.

[To address the issue with suppression, the report noted that the splits for ethnicity will be reduced from 6 to 3 for subject level TEF (p35)]

These findings also suggest that if the TEF is to make an effective contribution to the ongoing success of the whole UK sector, the following issues would merit consideration as part of the independent review:

  • How the TEF defines and measures excellence in a diverse sector and supports development of teaching and learning practice.
  • The role that the TEF plays across the student decision-making process and the relationship with the wider student information landscape.
  • The process for the future development of the TEF and the role of the sector, including students and devolved nations.
  • The relationship between the TEF and quality assessment, including regulatory baselines and the Quality Code.

Figure 4 shows the data benchmarking flags received by providers at each award level – these two charts are interesting because they show that providers with negative flags still received gold (and silver) awards.

The survey also asked about future developments for the TEF, with learning gain a clear leader – ahead of teaching intensity. HEFCE is running learning gain pilots, as discussed above, and teaching intensity will be the subject of a pilot alongside subject-level TEF. Interestingly, a chart on p. 33 shows that nearly 70% of respondents believed that “there is no proportionate approach for producing a robust subject level TEF judgement which will be useful for students”.

Industrial Strategy

Following our update on the Industrial Strategy last week, there are a couple of new developments. Innovate UK has announced funding for businesses to work on innovative technologies, future products and services. The categories link closely to the Industrial Strategy priorities, including digital technologies, robotics, the creative economy and design, and space applications, as well as emerging technologies and electronics.

There was also an announcement about funding for innovative medicines manufacturing solutions.

Sir John Bell has published his report for the government on Life Sciences and the Industrial Strategy. There are seven main recommendations under four themes, which are summarised below. You can read a longer summary on the BU Research Blog.

Some interesting comments:

  • “The key UK attribute driving success in life sciences is the great strength in university-based research. Strong research-based universities underpin most of the public sector research success in the UK, as they do in the USA and in Scandinavia. National research systems based around institutes rather than universities, as seen in Germany, France and China, do not achieve the same productivity in life sciences as seen in university-focussed systems.” (p22)
  • “The decline in funding of indirect costs for charity research is coupled to an increasing tendency for Research Councils to construct approaches that avoid paying indirect Full Economic Costs (FEC). Together, these are having a significant impact on the viability of research in universities and have led to the institutions raising industrial overhead costs to fill the gap. This is unhelpful.” (p24)
  • “It is also recommended, that the funding agencies, in partnership with major charities, create a high-level recruitment fund that would pay the real cost of bringing successful scientists from abroad to work in major UK university institutions.” (see the proposal to attract international scientists below).
  • On clusters: “Life sciences clusters are nearly always located around a university or other research institute and in the UK include elements of NHS infrastructure. However, evidence and experience suggests that governments cannot seed technology clusters and their success is usually driven by the underpinning assets of universities and companies, and also by the cultural features of networking and recycling of entrepreneurs and capital.” And: “Regions should make the most of existing opportunities locally to grow clusters and build resilience by working in partnership across local Government, LEPs (in England), universities and research institutes, NHS, AHSNs, local businesses and support organisations, to identify and coalesce the local vision for life sciences. Science & Innovation Audits, Local Growth Funds and Growth Hubs (in England), Enterprise Zones and local rates and planning flexibilities can all be utilised to support a vision for life sciences.” (see the proposal on clusters under “Growth and Infrastructure” – this was a big theme in the Industrial Strategy and something we also covered in our Green Paper response)
  • On skills: “The flow of multidisciplinary students at Masters and PhD level should be increased by providing incentives through the Higher Education Funding Council for England” and “Universities and research funders should embed core competencies at degree and PhD level, for example data, statistical and analytical skills, commercial acumen and translational skills, and management and entrepreneurship training (which could be delivered in partnership with business schools). They should support exposure to, and collaboration with, strategically important disciplines including computer and data science, engineering, chemistry, physics, mathematics and material science.”

Health Advanced Research Programme (HARP) proposal – with the goal of creating 2-3 entirely new industries over the next 10 years.

Reinforcing the UK science offer 

  • Sustain and increase funding for basic science to match our international competition – the goal is that the UK should attract 2000 new discovery scientists from around the globe
    • The UK should aim to be in the upper quartile of OECD R&D spending and sustain and increase the funding for basic science, to match our international competitors, particularly in university settings, encouraging discovery science to co-locate.
    • Capitalise on UKRI to increase interdisciplinary research, work more effectively with industry and support high-risk science.
    • Use Government and charitable funding to attract up to 100 world-class scientists to the UK, with support for their recruitment and their science over the next ten years.
  • Further improve UK clinical trial capabilities to support, over the next 5 years, a 50% increase in the number of clinical trials and a growing proportion of change-of-practice trials and trials with novel methodology.

Growth and infrastructure – the goal is to create four UK companies valued at >£20 billion market cap in the next ten years.

NHS collaboration – the Accelerated Access Review should be adopted with national routes to market streamlined and clarified, including for digital products. There are two stated goals:

  • The NHS should engage in fifty collaborative programmes over the next 5 years in late-stage clinical trials, real-world data collection, or the evaluation of diagnostics or devices.
  • The UK should be in the top quartile of comparator countries, both for the speed of adoption and the overall uptake of innovative, cost-effective products, to the benefit of all UK patients by the end of 2023.

Data – Establish two to five Digital Innovation Hubs providing data across regions of three to five million people.

  • Create a forum for researchers across academia, charities and industry to engage with all national health data programmes.
  • Establish a new regulatory, Health Technology Assessment and commercial framework to capture for the UK the value in algorithms generated using NHS data.
  • Two to five digital innovation hubs providing data across regions of three to five million people should be set up as part of a national approach and building towards full population coverage, to rapidly enable researchers to engage with a meaningful dataset. One or more of these should focus on medtech.
  • The UK could host 4-6 centres of excellence that provide support for specific medtech themes, focussing on research capability in a single medtech domain such as orthopaedics, cardiac, digital health or molecular diagnostics.
  • National registries of therapy-area-specific data across the whole of the NHS in England should be created and aligned with the relevant charity.

Skills

  • A migration system should be established that allows recruitment and retention of highly skilled workers from the EU and beyond, and does not impede intra-company transfers.
  • Develop and deliver a reinforced skills action plan across the NHS, commercial and third sectors based on a gap analysis of key skills for science.
    • Create an apprenticeship scheme that focuses on data sciences, as well as skills across the life sciences sector, and trains an entirely new cadre of technologists, healthcare workers and scientists at the cutting-edge of digital health.
    • Establish Institutes of Technology that would provide opportunity for technical training, particularly in digital and advanced manufacturing areas.
    • There should be support for entrepreneur training at all levels, incentivising varied careers and migration of academic scientists into industry and back to academia.
    • A fund should be established supporting convergent science activities including cross-disciplinary sabbaticals, joint appointments, funding for cross-sectoral partnerships and exchanges across industry and the NHS, including for management trainees.
    • High quality STEM education should be provided for all, and the government should evaluate and implement additional steps to increase the number of students studying maths to level 3 and beyond.

JANE FORSTER | SARAH CARTER
Policy Advisor | Policy & Public Affairs Officer
65111 | 65070
Follow: @PolicyBU on Twitter | policy@bournemouth.ac.uk

 

HE Policy update w/e 25th August 2017

Immigration, International Students and Brexit

The government have commissioned a series of assessments and reviews of the impact of immigration policy and Brexit via the Migration Advisory Committee:

  • Call for evidence and briefing note: EEA-workers in the UK labour market – we will be responding on the HE questions via UCEA and UUK, and we are considering a regional response; please let Sarah or me know if you have evidence that would be relevant. The call is looking at EEA migration trends, recruitment practices, and economic and social impacts.
  • a detailed assessment of the social and economic impact of international students in the UK. We would expect a call for evidence for this to follow. Looking at both EU and non-EU students, the MAC will be asked to consider:
    • the impact of tuition fees and other spending by international students on the national, regional, and local economy and on the education sector
    • the role students play in contributing to local economic growth
    • the impact their recruitment has on the provision and quality of education provided to domestic students.

The Commissioning Letter from Amber Rudd says: “The Digital Economy Act provides a unique opportunity to improve understanding of the migration data and as part of this work the Home Office will be working with the ONS and other Government departments to improve the use of administrative data. This will lead to a greater understanding of how many migrants are in the UK, how long they stay for, and what they are currently doing. The ONS will be publishing an article in September setting out this fuller work plan and the timetable for moving towards this landscape for administrative data usage”

As well as the post-Brexit future of students, the letter also makes reference to the Tier 4 visa pilot which was launched last year and included a handful of universities. Amber Rudd says “the pilot is being carefully evaluated and, if successful, could be rolled out more widely”.

The pilot covered Master’s courses at four universities:

  • Master’s courses of 13 months or less at the University of Oxford, University of Cambridge, University of Bath or Imperial College London.
  • Participating in the pilot allowed students to:
    • stay for six months after the end of the course;
    • submit fewer evidential documents with their applications – e.g. previous qualifications and documents relating to maintenance requirements

A deluge of other data and reports have also been published:

  • The Home Office has published its second report on statistics being collected under the exit checks programme – Exit checks data.
    • Of the 1.34m visas granted to non-EEA nationals which expired in 2016/17, where the individual did not obtain a further extension to stay longer in the UK, 96.3% departed in time (that is, before their visa expired)
  • A National Statistics update has been published which gives a breakdown of all the data
  • Additional analysis by the Office for National Statistics (ONS) on international students has been published
  • The Centre for Population Change has published the findings of a survey it carried out in March 2017 in partnership with the ONS and UUK. The survey looked at the intentions of graduating overseas students and found:
    • The majority of students do not intend to stay in the UK for more than a year after finishing their studies (and those that stated they intended to stay were not certain of their post-study plans, particularly non-EU students).
    • Fewer than one in ten international students plan to stay in the UK indefinitely and find a job.

According to UUK:

  • Exit checks data shows that student overstaying is at worst 3%, and much of that 3% of undetermined outcomes may be due to individuals leaving via routes where there are currently no exit checks (such as via the Common Travel Area). This means student visa compliance is at least 97%, far higher than previous (incorrect) claims.
  • The Home Office exit checks data provides a more accurate picture (than the International Passenger Survey – IPS) of what non-EU students do after their initial period of leave to study
  • The ONS report suggests that the IPS is likely to underestimate student emigration – therefore any implied student net migration figure is likely to be an overestimate
  • The ONS also commits to working with colleagues across the government statistics service to utilise all available administrative systems to further improve migration statistics. They have also asked for UUK’s input to this work.

Widening Participation

A survey of access agreements was published this week by the Office for Fair Access. In their press release, OFFA note that every university has committed to working with schools to help increase access to HE. The report also notes that universities will focus on improved evaluation of the impact of financial support and a more evidence-based approach generally, on white working-class males and BME attainment, and on more support for mental health issues. The amount universities spend on widening access will rise.

Responding to the survey, UUK Chief Executive, Alistair Jarvis, said: “The enhancements in support provided by universities has helped to increase the entry rate for disadvantaged young people to record levels. All UK universities work hard to widen participation and support disadvantaged students throughout their time at university. It is right to expect a continued focus on support for disadvantaged students to make further progress in closing the gap between different student groups.”

Industrial Strategy

The formal outcome of the Industrial Strategy consultation is still pending. However, there has been a reasonable amount of activity in the meantime and we thought it might be helpful to do a round up.

Clusters – The Arts and Humanities Research Council (AHRC) have set up a Creative Industries Clusters Programme, starting in 2018, to facilitate collaboration between the industry and universities. The pre-call announcement sets out the plan for at least 8 research and development partnerships, each led by an HEI, and a Policy and Evidence Centre. Calls will apparently open in October 2018.

Sector deals – As part of the Industrial Strategy green paper, the government announced that there were five sector reviews taking place and suggested that they would welcome more.

Other organisations are setting up consultations and reviews in response to the Industrial Strategy, such as the industrial digitalisation review.

The review’s interim findings are interesting; a final report is expected in autumn 2017:

  • It highlights a need for more leadership – with “much stronger marketing and messaging” – and proposes the establishment of a Digital Technology Institute and Digital Technology Networks
  • It discusses issues with technology adoption rates, particularly among SMEs, and suggests better support for businesses via LEPs and other organisations, and work on skills through interventions such as an Institute of Digital Engineering
  • On innovation, the interim review suggests looking at additive manufacturing and AI, creating new industries in autonomous operations, and also providing kitemarked content for businesses.

Industrial Strategy Challenge Fund – Innovate UK are running the Industrial Strategy Challenge Fund; in April 2017 they identified six “core industrial challenges”.

Interesting reading

JANE FORSTER | SARAH CARTER
Policy Advisor | Policy & Public Affairs Officer
65111 | 65070
Follow: @PolicyBU on Twitter | policy@bournemouth.ac.uk

Innovate UK announce Digital Technology for Healthcare call

Innovate UK is to invest up to £8 million in projects that develop new digital technology solutions to healthcare challenges.

This competition is being run under the digital health technology catalyst, which is part of the Industrial Strategy Challenge Fund. The catalyst is a new £35 million funding programme over 4 years, and the aim is to support the development of digital health products that meet NHS needs.

Innovate UK are seeking feasibility or development projects that advance digital health or digitally-enabled medical technologies. These should:

  • improve patient outcomes, for example through better clinical decision-making and by supporting patients to manage their own care
  • offer new approaches to healthcare that transform its delivery
  • reduce the demand on the health system, make it more efficient and create savings

Competition information

  • the competition opens on 31 July 2017, and the deadline for registrations is 4 October 2017
  • feasibility studies can range from £50,000 to £75,000 and last up to one year
  • industrial research and experimental development projects can range from £500,000 to £1 million and last up to 3 years
  • you can work alone or in collaboration with other organisations, but projects must be led by a UK-based SME
  • you could get up to 70% of your eligible project costs
  • projects must start by 1 February 2018

You can find more information and apply to the call here.

Innovate UK are holding a briefing webinar for applicants on Tuesday 1st August at 10:00am. To register click here.

UUK publish industrial strategy and universities regional briefings

Universities UK have published regional briefings examining how and why universities are important to the UK’s industrial strategy.

The briefings show that at the local and regional level, universities support growth by providing and creating jobs, and lead on local economic and social issues.  Areas of focus include local businesses, big businesses, communities, school leavers and local services.

Bournemouth University is included in the south-west briefing.

 

 

Masterclass: The clinical doctorate model – Enabling Practitioner Research

Monday 15th May 2017, 14.00 – 15.30, Lansdowne Campus

This masterclass will be presented by Professor Vanora Hundley, Deputy Dean for Research and Professional Practice, Faculty of Health & Social Sciences. The clinical PhD studentship model offers an opportunity to bring in research income while developing a bespoke educational offer that is attractive to employers and directly relevant to practice. Professor Hundley’s clinical doctorate model has been recognised nationally as an example of excellent practice which facilitates Knowledge Exchange and enhances future research collaborations.

This is part of the Leading Innovation Masterclasses series.

There are two other masterclasses in May: ‘Developing Interdisciplinarity’ with Professor Barry Richards, and ‘Benchmarking your students’ digital experience’ with Jisc’s Sarah Knight.

Find out more about these and book a place at the following link:
Leading Innovation – Masterclasses

Masterclass: Developing Interdisciplinarity

Thursday 4th May 2017, 9.30-11.00 at Talbot Campus

In this session Professor Barry Richards will take us through the story of how intellectual and political interdisciplinarity, established across both education and research, defined a new academic specialism which now has courses and departments in several universities, journals and a book series with major publishers, and growing connections with professional practices.

This is part of the Leading Innovation Masterclasses series.

There are two other masterclasses in May: ‘Benchmarking your students’ digital experience’ with Jisc’s Sarah Knight, and ‘The clinical doctorate model – Enabling Practitioner Research’ with Professor Vanora Hundley.

Find out more about these and book a place at the following link:
Leading Innovation – Masterclasses

Research Drove Me to Murder

“As reported by the National Policing Improvement Agency, the most frequently encountered evidence at the scenes of a crime is footwear impressions and marks. Unfortunately, recovery and usage of this kind of evidence has not achieved its full potential. Due to the cost benefit ratio (time consuming casting procedures, expensive scanners) footprints are often neglected evidence. As technology changes, the capabilities of forensic science should continue to evolve. By translating academic research and technical ‘know-how’ into software (www.digtrace.co.uk) the authors have placed 3D imaging of footwear evidence in the hands of every police force in the UK and overseas.”

This was the abstract submitted to accompany Dominika’s recent submission to the Research Photography Competition.

Dominika Budka is currently working on an innovation-funded (HEIF) project called "Dinosaurs to Forensic Science: Digital, Tracks and Traces". A BU alumna, Dominika graduated last year (2016), having completed an MSc in Forensic and Neuropsychological Perspectives in Face-Processing. Find out more about her role on the HEIF project.
Follow HEIF on Instagram to find out more about the innovation projects taking place at BU: https://www.instagram.com/heif_at_bu/

 

 

Sherlock’s Window: In search of an odourless growth medium

“A key aspect of forensic investigation is the assessment of the ‘window of opportunity’ during which death took place. Estimations using insects (e.g. blowflies) increase accuracy. Using blowflies to determine post-mortem period requires an understanding of the temperature dependent growth patterns that they develop through their life cycle. In order to understand this, blowfly larvae are reared on growth media in the laboratory.

Sherlock’s Window is a HEIF-funded project at BU which aims to produce an odourless growth medium that can be rolled out internationally for use in forensic investigation. Illustrated here is the head of a third instar blowfly larva. Maggots have no eyes, but the protrusions at the tip of the mouth area are palps, used for feeling and manipulating food particles. The rows of black barbs that are visible are used to pull the maggot forward through the food substrate.”

This was the abstract submitted to accompany Dr Andrew Whittington’s recent submission to the Research Photography Competition.

Find out more about the project in the latest edition of the Bournemouth Research Chronicle, in the section "Innovation in industry: how researchers and the wider community are working together".

Follow HEIF on Instagram to find out more about the innovation projects taking place at BU: https://www.instagram.com/heif_at_bu/

 

 

 

BU alumnus working on serious gaming project

Joshua (Josh) Cook graduated in 2016 with a first in BSc Games Programming. He is currently working on an innovation project led by Professor Wen Tang. "PLUS" is a gamified training application funded by HEIF, developed in collaboration with the Dorset, Devon and Cornwall (Strategic Alliance) police forces to provide a virtual learning environment that teaches trainees in a more engaging manner than traditional paper-based learning.

Commenting on Josh’s contribution as a project team member, Wen said: "Josh has been a pro-active and key member of the project team, working with academics, the College of Policing and police forces around the UK to develop this training application."

Key areas of focus for Josh have included:

  • Making the system more generic, so that the project can later be expanded to multiple areas and more situations with ease
  • Improving the visual environment of the game with shaders and animations
  • Including data analytics to understand how trainees are using the game, how long they take, how many mistakes they make, etc.

Josh didn’t take a placement year during university, so aside from a summer position at a local games company he did not have much work experience. On being given the opportunity to work on the project, Josh commented: "The PLUS project seemed like an interesting project to work on, and when I found out a position was open to work on it I applied. I’ve learned some useful things on this project, such as working from and improving upon an existing code base, what it’s like working directly with clients, implementing and using data analytics, and I’m sure I’ll learn more throughout the duration of my employment."

The project received HEIF funding (HEIF 5+1 and HEIF 5+1+1) from August 2015 to July 2017.

Read more about this project in full: Serious Games for Police Training. 

College of Policing Research Map