Category / data management

HE policy update for the w/e 1st September 2017

We continue our series of summer updates focussing on themes rather than news with a look at learning gain. We also have updates on the Bell review of Life Sciences for the Industrial Strategy, and on the TEF from UUK.

Learning Gain

Learning gain has become a potential hot topic for universities over the last year – could it be the magic bullet for problems with TEF metrics?  Why is it a policy issue and what are the implications of the policy context for universities and students?  Wonkhe published a helpful summary in July by Dr Camille B. Kandiko Howson, from King’s College London.

Background – TEF – The Teaching Excellence Framework (TEF) includes learning gain alongside student outcomes more generally as one of its three main criteria for assessing teaching excellence (the others are teaching quality and learning environment).  The relevant TEF criteria are:

Student Outcomes and Learning Gain  
Employment and Further Study (SO1) Students achieve their educational and professional goals, in particular progression to further study or highly skilled employment  
Employability and Transferable Skills (SO2) Students acquire knowledge, skills and attributes that are valued by employers and that enhance their personal and/or professional lives
Positive Outcomes for All (SO3) Positive outcomes are achieved by its students from all backgrounds, in particular those from disadvantaged backgrounds or those who are at greater risk of not achieving positive outcomes

Further definition was given in the “Aspects of Quality” guidance (see the TEF guidance issued by HEFCE):

Student Outcomes and Learning Gain is focused on the achievement of positive outcomes. Positive outcomes are taken to include:

  • acquisition of attributes such as lifelong learning skills and others that allow a graduate to make a strong contribution to society, economy and the environment,
  • progression to further study, acquisition of knowledge, skills and attributes necessary to compete for a graduate level job that requires the high level of skills arising from higher education

The extent to which positive outcomes are achieved for all students, including those from disadvantaged backgrounds, is a key feature. The distance travelled by students (‘learning gain’) is included.

And it goes on:

  • Work across the sector to develop new measures of learning gain is in progress. Until new measures become available and are robust and applicable for all types of providers and students, we anticipate providers will refer to their own approaches to identifying and assessing students’ learning gain – this aspect is not prescriptive about what those measures might be.

The TEF guidance issued by HEFCE included examples of the sorts of evidence that universities might want to consider including (amongst a much longer list):

  • Learning gain and distance-travelled by all students including those entering higher education part-way through their professional lives
  • Evidence and impact of initiatives aimed at preparing students for further study and research
  • Use and effectiveness of initiatives used to help measure and record student progress, such as Grade Point Average (GPA)
  • Impact of initiatives aimed at closing gaps in development, attainment and progression for students from different backgrounds, in particular those from disadvantaged backgrounds or those who are at greater risk of not achieving positive outcomes.

TEF Assessment – If you have been following the debates about the TEF in Year 2 (results now published), you will be aware that the assessment of institutions against these criteria was done in two ways – by looking at metrics (with benchmarking and subdivision into various sub-sets), and by review of a written provider submission.

  • The metrics that were used in TEF Year 2 for Student Outcomes and Learning Gain were from the Destinations of Leavers from Higher Education survey (DLHE), specifically the DLHE declared activity 6 months after graduation – were graduates in employment or further study, and if in employment, was it “highly skilled” as defined by SOC groups 1-3 (managerial and professional).
  • So the metrics used in Year 2 of the TEF do not cover learning gain at all. In fact they only really relate to SO1 above, and are of limited use for SO2, since DLHE measures employment rather than employability. Of course, DLHE is being replaced, after major consultations by HESA throughout 2016 and 2017, with the new Graduate Outcomes survey, which will take a longer-term view and look at a broader range of outcomes (read more in our Policy Update of 30th June 2017).
  • So for TEF Year 2, any assessment of learning gain was done through the written submissions – and as noted above there are no prescribed measures for this; it was left to providers to “refer to their own approaches to identifying and assessing students’ learning gain”.

Universities UK have published their review of Year 2 of the TEF (see next section below) which includes a strong endorsement from UUK members for a comparative learning gain metric in future iterations of the TEF.

Measuring Learning Gain – As referred to above, there is a HEFCE project to look at ways of measuring learning gain.

They are running 13 pilot projects:

  • careers registration and employability initiatives – this uses surveys and is linked most closely to SO2 (employability)
  • critical-thinking ‘CLA+’ standardised assessment tool – also uses the UK Engagement Survey (UKES). CLA+ is a US assessment that is done online and asks students to assess data and evidence and decide on a course of action or propose a solution. As such, it measures general skills but is not subject specific.
  • self-efficacy across a range of disciplines
  • skills and self-assessment of confidence measures
  • a self-assessment skills audit and a situational judgement test
  • HE in FE
  • A multi-strand one: standardising entry and exit qualifications, new measures of critical skills and modelling change in learning outcomes
  • a project that will analyse the Affective-Behaviour-Cognition (ABC) model data for previous years
  • research skills in 6 disciplines
  • psychometric testing
  • learning gain from work-based learning and work placements
  • a project evaluating a range of methodologies including degree classifications, UKES, NSS, Student Wellbeing survey and CLA+
  • employability and subject specific learning across a range of methods – includes a project to understand the dimensions of learning gain and develop a way to measure them, one to look at R2 Strengths, one to look at career adaptability and one looking at international experience.

These are long-term (3-year) projects – HEFCE published a year 1 report on 6th July 2017 (you can read more in our 14th July policy update). This flags a couple of challenges, including how to get students to complete surveys and tests that are not relevant to their degree (a problem also encountered by the UKES). The report suggests embedding measurement “in the standard administrative procedures or formal curriculum” – in other words, a survey or test administered at enrolment and as part of the assessment programme.

To become a core TEF metric there would need to be a national standard measure implemented across the sector. That means there would have to be mass testing (like SATs for university students) or another national survey alongside the NSS and the new Graduate Outcomes survey (the replacement for DLHE) – with surveys on enrolment and at other points across the student lifecycle.

Some BU staff are taking a different approach – instead of looking at generic measures for generic skills they have been looking at measuring specific learning gain against the defined learning outcomes for cohorts of students on a particular course. This is a much more customised approach but the team have set some basic parameters for the questions that they have asked which could be applied to other courses. The methodology was a survey. (Dr Martyn Polkinghorne, Dr Gelareh Roushan, Dr Julia Taylor) (see also a more detailed explanation, March 2017)

Pros, cons and alternatives

In January 2016, HEPI published a speech delivered in December 2015 by Andreas Schleicher, Director for Education and Skills, and Special Advisor on Education Policy to the Secretary-General at the Organisation for Economic Co-operation and Development (OECD) in Paris. In the speech, the author argues strongly for institutions worldwide to measure and use learning gain data. He supports the use of specific measures for disciplines although points out the difficulties with this – not least in getting comparable data. So he also focuses on generic skills – but he doesn’t suggest a specific methodology.

An HEA presentation from December 2016 mentions a number of inputs that “predict both student performance and learning gains” – including contact hours, class size (and a host of other things including the extent and timing of feedback on assignments).

It is worth looking quickly at GPA (Grade Point Average), as this is also mentioned in the TEF specification as noted above. The HEA are now looking at degree standards for HEFCE, having run a pilot project on GPA in 2013-14.  The report notes GPA’s “potential capacity to increase granularity of awards, transparency in award calculations, international recognition and student engagement in their programmes”. The summary says, “The importance to stakeholders of a nationally-agreed, common scale is a key finding of the pilot and is considered crucial for the acceptance and success of GPA in the UK”, and that “The pilot providers considered that the development of widespread stakeholder understanding and commitment would require clear communication to be sustained over a number of years.”

Wonkhe have a round-up of the background to the GPA debate from June 2016.

Although the big focus for the TEF was on outputs not inputs, the Department for Education has announced that it will start to look at including some of the inputs. See our HE policy update for 21st July where we look at the new teaching intensity measure that will be part of the subject level TEF pilots. You can read more about this in a THE article from 2nd August:

  • The pilot “will measure teaching intensity using a method that weights the number of hours taught by the student-staff ratio of each taught hour”, explains the pilot’s specification, published by the Department for Education. “Put simply, this model would value each of these at the same level: two hours spent in a group of 10 students with one member of staff, two hours spent in a group of 20 with two members of staff, one hour spent in a group of five students with one member of staff,” it explains. Once contact hours are weighted by class sizes, and aggregated up to subject level, those running the pilot will be able to calculate a “gross teaching quotient” score, which would be an “easily interpretable number” and used as a “supplementary metric” to inform subject-level assessments.
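The weighting quoted above amounts to a simple calculation. As a minimal sketch of the arithmetic only – not the DfE’s actual implementation, and with function and variable names of our own – it could look like this:

```python
# Sketch of the "gross teaching quotient" weighting described above:
# each taught session's hours are weighted by its staff-student ratio,
# then summed across sessions for the subject.

def gross_teaching_quotient(sessions):
    """sessions: iterable of (hours, staff, students) for each taught session."""
    return sum(hours * staff / students for hours, staff, students in sessions)

# The three examples from the pilot specification each carry the same weight:
examples = [(2, 1, 10),   # two hours, 1 staff member, 10 students
            (2, 2, 20),   # two hours, 2 staff members, 20 students
            (1, 1, 5)]    # one hour, 1 staff member, 5 students
print([gross_teaching_quotient([s]) for s in examples])  # [0.2, 0.2, 0.2]
```

As the specification says, the three scenarios come out equal once hours are weighted by class size; aggregating such scores up to subject level would give the “gross teaching quotient”.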

The contact hours debate is very political – tied up with concerns about value for money and linked to the very topical debate on fees (see Jo Johnson’s speech of 20th July, and our HE Policy Update for 21st July 2017).

This is all very interesting when, as mentioned above, the TEF specification for Year 2 put so much emphasis on measuring outcomes and not just inputs: “The emphasis in the provider submission should be on demonstrating the impact and effectiveness of teaching on the student experience and outcomes they achieve. The submission should therefore avoid focusing on descriptions of strategies or approach but instead should focus on impact. Wherever possible, impact should be demonstrated empirically.”

Experts and evidence – There will be a real push from the sector for evidence that the new teaching intensity measure and reporting of contact hours and other things really does make a difference to students before it is included in the TEF. The HEA’s position on this (2016) is a helpful summary of the debate about contact hours.

There is an interesting article in the HEPI collection of responses to the Green Paper in January 2016 from Graham Gibbs, former Professor at the University of Winchester and Director of the Oxford Learning Institute, University of Oxford, and author of Dimensions of quality and Implications of ‘Dimensions of quality’ in a market environment. He supports the use of learning gain metrics as a useful tool. He points out that “cohort size is a strong negative predictor of both student performance and learning gains”. He also adds: “Russell Group Universities have comparatively larger cohorts and larger class sizes, and their small group teaching is less likely to be undertaken by academics, all of which save money but reduce learning gains”. He does not accept that contact hours or institutional reputation (linked to high-tariff entry and research reputation) impact learning gain.

There is an interesting article on the Higher Education Policy Institute (HEPI) website here written by the authors of an article that looked at class size.

Impact so far – So what happened in the TEF – a very quick and incomplete look at TEF submissions suggests that not many institutions included much about learning gain (or GPA) and those that did seem to fall into two categories – those participating in the pilot who mention the pilot, and some who mention it in the context of the TEF core data – e.g. Birmingham mention their access project and learning gain (but don’t really evidence it except through employment and retention). Huddersfield talk about it in the context of placements and work experience but again linked to employment outcomes, although they also mention assessment improvement.

Teaching Excellence Framework (TEF) – year 2 review

Universities UK have published their review of Year 2 of the TEF following a survey that UUK did of their members.

The key findings from the report are:

  • There appears to be general confidence that the overall process was fair, notwithstanding the outcomes of individual appeals. Judgements were the result of an intensive and discursive process of deliberation by the assessment panel. There was a slight correlation between TEF results, entry tariff and league table rankings.
  • It is estimated that the cost of participating in the TEF for 134 higher education institutions was approximately £4 million. This was driven by the volume of staff engagement, particularly senior staff.
  • Further consideration will need to be given to how the TEF accounts for the diversity of the student body, particularly part-time students, and how the TEF defines and measures excellence. [UUK also raises a concern about judgements possibly being skewed by prior attainment]
  • If subject-level TEF is to provide students with reliable information it must address the impact of increased metric suppression [this relates to metrics which could not be used because of small numbers, particularly for part-time students and for the ethnicity splits], how judgments are made in the absence of data [particularly an issue for those institutions affected by the NSS boycott], the comparability of subject groupings and the increase in cost and complexity of submissions and assessment.

[To address the issue with suppression, the report noted that the splits for ethnicity will be reduced from 6 to 3 for subject level TEF (p35)]

These findings also suggest that if the TEF is to make an effective contribution to the ongoing success of the whole UK sector, the following issues would merit consideration as part of the independent review:

  • How the TEF defines and measures excellence in a diverse sector and supports development of teaching and learning practice.
  • The role that the TEF plays across the student decision-making process and the relationship with the wider student information landscape.
  • The process for the future development of the TEF and the role of the sector, including students and devolved nations.
  • The relationship between the TEF and quality assessment, including regulatory baselines and the Quality Code.

Figure 4 shows the data benchmarking flags received by providers at each award level – these two charts are interesting because they show that providers with negative flags still received gold (and silver).

The survey also asked about future developments for the TEF with learning gain being a clear leader – ahead of teaching intensity. HEFCE is running learning gain pilots, as discussed above, and teaching intensity will be the subject of a pilot alongside subject level TEF. Interestingly, on p 33 a chart shows that nearly 70% of respondents believed that “there is no proportionate approach for producing a robust subject level TEF judgement which will be useful for students”.

Industrial Strategy

Following our update on the Industrial Strategy last week there are a couple of updates. Innovate UK has announced funding for businesses to work on innovative technologies, future products and services. The categories link closely to the Industrial Strategy priorities including digital technologies, robotics, creative economy and design and space applications as well as emerging technologies and electronics.

There was also an announcement about funding for innovative medicines manufacturing solutions.

Sir John Bell has published his report for the government on Life Sciences and the Industrial Strategy. There are seven main recommendations under four themes, which are summarised below. You can read a longer summary on the BU Research Blog.

Some interesting comments:

  • “The key UK attribute driving success in life sciences is the great strength in university-based research. Strong research-based universities underpin most of the public sector research success in the UK, as they do in the USA and in Scandinavia. National research systems based around institutes rather than universities, as seen in Germany, France and China, do not achieve the same productivity in life sciences as seen in university-focussed systems.” (p22)
  • “The decline in funding of indirect costs for charity research is coupled to an increasing tendency for Research Councils to construct approaches that avoid paying indirect Full Economic Costs (FEC). Together, these are having a significant impact on the viability of research in universities and have led to the institutions raising industrial overhead costs to fill the gap. This is unhelpful.” (p24)
  • “It is also recommended, that the funding agencies, in partnership with major charities, create a high-level recruitment fund that would pay the real cost of bringing successful scientists from abroad to work in major UK university institutions.” (see the proposal to attract international scientists below).
  • On clusters: “Life sciences clusters are nearly always located around a university or other research institute and in the UK include elements of NHS infrastructure. However, evidence and experience suggests that governments cannot seed technology clusters and their success is usually driven by the underpinning assets of universities and companies, and also by the cultural features of networking and recycling of entrepreneurs and capital.” And “Regions should make the most of existing opportunities locally to grow clusters and build resilience by working in partnership across local Government, LEPs (in England), universities and research institutes, NHS, AHSNs, local businesses and support organisations, to identify and coalesce the local vision for life sciences. Science & Innovation Audits, Local Growth Funds and Growth Hubs (in England), Enterprise Zones and local rates and planning flexibilities can all be utilised to support a vision for life sciences.” (See the proposal on clusters under “Growth and Infrastructure” – this was a big theme in the Industrial Strategy and something we also covered in our Green Paper response.)
  • On skills: “The flow of multidisciplinary students at Masters and PhD level should be increased by providing incentives through the Higher Education Funding Council for England”, and “Universities and research funders should embed core competencies at degree and PhD level, for example data, statistical and analytical skills, commercial acumen and translational skills, and management and entrepreneurship training (which could be delivered in partnership with business schools). They should support exposure to, and collaboration with, strategically important disciplines including computer and data science, engineering, chemistry, physics, mathematics and material science.”

Health Advanced Research Programme (HARP) proposal – with the goal of creating 2-3 entirely new industries over the next 10 years.

Reinforcing the UK science offer 

  • Sustain and increase funding for basic science to match our international competition – the goal is that the UK should attract 2000 new discovery scientists from around the globe
    • The UK should aim to be in the upper quartile of OECD R&D spending and sustain and increase the funding for basic science, to match our international competitors, particularly in university settings, encouraging discovery science to co-locate.
    • Capitalise on UKRI to increase interdisciplinary research, work more effectively with industry and support high-risk science.
    • Use Government and charitable funding to attract up to 100 world-class scientists to the UK, with support for their recruitment and their science over the next ten years.
  • Further improve UK clinical trial capabilities to support a 50% increase in the number of clinical trials over the next 5 years, with a growing proportion of change-of-practice trials and trials with novel methodology.

Growth and infrastructure – the goal is to create four UK companies valued at >£20 billion market cap in the next ten years.

NHS collaboration – the Accelerated Access Review should be adopted with national routes to market streamlined and clarified, including for digital products. There are two stated goals:

  • NHS should engage in fifty collaborative programmes in the next 5 years in late-stage clinical trials, real world data collection, or in the evaluation of diagnostics or devices.
  • The UK should be in the top quartile of comparator countries, both for the speed of adoption and the overall uptake of innovative, cost-effective products, to the benefit of all UK patients by the end of 2023.

Data – Establish two to five Digital Innovation Hubs providing data across regions of three to five million people.

  • Create a forum for researchers across academia, charities and industry to engage with all national health data programmes.
  • Establish a new regulatory, Health Technology Assessment and commercial framework to capture for the UK the value in algorithms generated using NHS data.
  • Two to five digital innovation hubs providing data across regions of three to five million people should be set up as part of a national approach and building towards full population coverage, to rapidly enable researchers to engage with a meaningful dataset. One or more of these should focus on medtech.
  • The UK could host 4-6 centres of excellence that provide support for specific medtech themes, focussing on research capability in a single medtech domain such as orthopaedics, cardiac, digital health or molecular diagnostics.
  • National registries of therapy-area-specific data across the whole of the NHS in England should be created and aligned with the relevant charity.

Skills

  • A migration system should be established that allows recruitment and retention of highly skilled workers from the EU and beyond, and does not impede intra-company transfers.
  • Develop and deliver a reinforced skills action plan across the NHS, commercial and third sectors based on a gap analysis of key skills for science.
    • Create an apprenticeship scheme that focuses on data sciences, as well as skills across the life sciences sector, and trains an entirely new cadre of technologists, healthcare workers and scientists at the cutting-edge of digital health.
    • Establish Institutes of Technology that would provide opportunity for technical training, particularly in digital and advanced manufacturing areas.
    • There should be support for entrepreneur training at all levels, incentivising varied careers and migration of academic scientists into industry and back to academia.
    • A fund should be established supporting convergent science activities including cross-disciplinary sabbaticals, joint appointments, funding for cross-sectoral partnerships and exchanges across industry and the NHS, including for management trainees.
    • High quality STEM education should be provided for all, and the government should evaluate and implement additional steps to increase the number of students studying maths to level 3 and beyond.

JANE FORSTER                                                             |                               SARAH CARTER

Policy Advisor                                                                                               Policy & Public Affairs Officer

65111                                                                                                              65070

Follow: @PolicyBU on Twitter                      |                               policy@bournemouth.ac.uk

 

Bristol Online Surveys (BOS) are transferring to Jisc

Bristol Online Surveys (BOS)

BOS is currently managed by the University of Bristol and provided as a service to the UK HE community.  On 1 August 2017, ownership will be transferred to Jisc.  Following transfer to Jisc it is expected that the ‘look and feel’ of BOS should remain the same.

BOS account access is set up by IT Services, who are the account administrators.  Researchers wishing to use a BOS survey should put a request through the IT Service Desk (SNOW).

It is important to note that on 1 August 2017, BOS will be unavailable for around 48 hours.  We do not know the exact time period at the moment.  More information is available on the BOS site:

https://www.onlinesurveys.ac.uk/transfer-to-jisc-faqs-and-information/

Transfer to Jisc: FAQs
Transfer to Jisc: FAQs for Primary Contacts

The challenges and rewards of teaching qualitative analysis with software

Qualitative research is gaining momentum in social sciences, education and health, with new developments appearing every year for gathering, analysing and disseminating data. This session will focus on the teaching and learning potential of specialised programmes for the process of systematising and analysing qualitative data.

The session will cover the basic features of computer assisted qualitative data analysis software (CAQDAS) and their possible role in students’ understanding of qualitative analysis. Specifically, it will be suggested that the process of data analysis and related techniques (content, thematic, framework and discourse analysis, to name a few) should be advanced before students’ engagement with CAQDAS, but that CAQDAS have the potential to enhance students’ understanding of qualitative data analysis in practice. The session will outline some practice-based recommendations for engaging students when running interactive qualitative data analysis sessions in general and workshops for CAQDAS in particular.

Aims and objectives:

  • To introduce attendees to the basic and advanced features of CAQDAS
  • To discuss the challenges and rewards of teaching qualitative analysis using CAQDAS
  • To stimulate discussion around qualitative methods teaching

Save the date: Monday 24 April, 12.00-13.30. Talbot Campus.

Bookings should be made through the Intranet, with Organisational Development.

The session will be facilitated by Dr Jacqueline Priego, who has been delivering CAQDAS workshops and training postgraduate students and researchers on qualitative analysis since 2010. She is also available for queries relating to MAXQDA and QDAMiner (not supported at BU).

Making the Most of Writing Week Part 7: BUCRU – not just for Writing Week!

We’re coming to the end of Writing Week in the Faculty of Health and Social Sciences and by now you will have made a good start or have put the finishing touches to your academic writing projects. Over the last week, we have given you some tips on writing grant applications and highlighted some of the expertise within BUCRU. If you didn’t get the chance to pop in and see us we thought it would be useful to remind you what we’re about and how we can help.

Bournemouth University Clinical Research Unit (BUCRU) supports researchers in improving the quality, quantity and efficiency of research across the University and local National Health Service (NHS) Trusts. We do this by:

  • Helping researchers develop high quality applications for external research funding (including small grants)
  • Ongoing involvement in funded research projects
  • A “pay-as-you-go” consultation service for other work.

How can we help?

BUCRU can provide help in the following areas:

  • Study design
  • Quantitative and qualitative research methods
  • Statistics, data management and data analysis
  • Patient and public involvement in research
  • Trial management
  • Ethics, governance and other regulatory issues
  • Linking University and NHS researchers

Our support is available to Bournemouth University staff and people working locally in the NHS, and depending on the support you require, is mostly free of charge. There are no general restrictions on topic area or professional background of the researcher.

If you would like support in developing your research please get in touch through bucru@bournemouth.ac.uk or by calling us on 01202 961939. Please see our website for further information, details of our current and previous projects and a link to our recent newsletter.

Making the Most of Writing Week Part 6: What to do with your data

You don’t have to spend Writing Week working on grant applications. You may already have a dataset and now you finally have some time to do something with it. But where to start? It’s often a good idea to go back to your original research questions/aims/objectives. As we said yesterday, a well thought out research question can help shape your analysis strategy.
Hopefully you will have a record of which variables you were measuring and how data were coded. Were any calculations performed using the raw data to create new variables? How were these done? This is all part of good data management. To find out more visit the information pages created by the Library and Learning Support Team.
Once you are reacquainted with your data, it’s often a good idea (in the case of quantitative data) to start plotting graphs to find out more. Always keep in mind the original aims of the study – it’s easy to wander down a path of distraction. If you are feeling confused by all of this, or have got yourself lost down a data track, the BUCRU team are on hand to help.
Peter Thomas is available on Tuesday and Wednesday while Sharon Docherty is available Tuesday, Wednesday and Thursday this week. Why not drop us an email or pop by to see us in R505?

Research Data Management and you!

Research Data Management is a hot topic, especially when applying for grants. We all have our own strategies for managing our data as a product of research. Sometimes data management is in the form of a box or filing cabinet in a locked office, an external hard drive, purchased cloud storage or a hard drive. Whilst this approach is comfortable and familiar, it’s unlikely to comply with funder requirements, either currently or in the future.

The Library has created a guide that will help with navigating the diverse requirements of grant funding councils and the intricacies of writing data management plans. The guide, ‘Research Data Management’, is available here.

We welcome your feedback about this resource; please contact rdm@bournemouth.ac.uk.

There is also a very informative YouTube video, Data Sharing and Management, posted by NYU Health Sciences Library.

New frontiers in tech – big data, the cloud and the Internet of Things

IT giant, Intel Corporation, is undergoing a massive shift in strategy. While jobs fall by the wayside, Intel has its eye firmly on what analysts are calling ‘new frontiers in technology’, and there are signs that the other tech behemoths are set to follow suit.
Last week, Intel CEO Brian Krzanich outlined his strategy for the chip giant in the years ahead, as it struggles to move away from its dependence on the waning PC market.
The thrust of the new strategy is: ‘transforming Intel from a PC company to a company that powers the cloud and billions of smart, connected computing devices’ and this, says Krzanich, encompasses five core beliefs:
• The cloud is the most important trend shaping the future of the smart, connected world – and thus Intel’s future.
• The many “things” that make up the PC Client business and the internet of things are made much more valuable by their connection to the cloud.
• Memory and programmable solutions such as FPGAs will deliver entirely new classes of products for the data centre and the internet of things.
• 5G will become the key technology for access to the cloud as we move toward an always-connected world.
• Moore’s Law will continue to progress and Intel will continue to lead in delivering its true economic impact.
For Intel, these core beliefs form a clear virtuous cycle – the cloud and data center, the internet of things, and memory are all bound together by connectivity and enhanced by the economics of Moore’s Law. (Gordon Moore was an Intel co-founder, so Moore’s Law is ingrained in the company’s psyche.)
Key to this is the “internet of things,” every device, sensor and console that has potential to connect to the cloud. This means that everything that a “thing” does can be captured as a piece of data, measured in real-time, and becomes accessible from anywhere.
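To make that idea concrete, each reading from a “thing” can be packaged as a small timestamped record, ready to publish to the cloud. A minimal sketch (device names and field names here are illustrative, not any particular platform’s API) might be:

```python
import json
import time

def sensor_reading(device_id: str, metric: str, value: float) -> str:
    """Package one measurement from a connected 'thing' as a
    timestamped JSON record, ready to send to a cloud service."""
    record = {
        "device_id": device_id,
        "metric": metric,
        "value": value,
        "timestamp": time.time(),  # captured in (near) real time
    }
    return json.dumps(record)

# A hypothetical thermostat reporting one temperature reading
payload = sensor_reading("thermostat-42", "temperature_c", 21.5)
```

Once every reading arrives in this uniform, machine-readable form, it can be measured in real time and queried from anywhere, which is precisely what gives the “things” their value in the cloud.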
Krzanich believes that: “the biggest opportunity in the internet of things is that it encompasses just about everything in our lives today – it’s ubiquitous. For most areas of industry and retail – from our shoes and clothes to our homes and cars – the internet of things is transforming everything and every experience. At Intel, we will focus on autonomous vehicles, industrial and retail as our primary growth drivers of the internet of things.”
In a time when technology is valued not just for the devices it produces, but for the experiences it makes possible, Intel is banking on the fact that a broader focus, and sharper execution will enable the company to take a lead in a smart, connected world.
Several major corporations have already taken the initiative to push frontier technology, such as Google with its Google Cardboard and Apple’s eventual (maybe?) 3D printer. Layoffs are the inevitable result and, in many cases, are already happening, as more and more companies find themselves having to look in new directions.
With Apple’s iPhone production on the decline, there is further evidence that companies’ defining products won’t be what sustains them into the future, and that Intel and Krzanich’s new focus on experiences, rather than the devices that make them possible, is the way to go.

Research Data Management and Sharing – MOOC


Today, an increasing number of funding agencies, journals, and other stakeholders are requiring data producers to share, archive, and plan for the management of their data. In order to respond to these requirements, researchers and information professionals will need the data management and curation knowledge and skills that support the long-term preservation, access, and reuse of data. Effectively managing data can also help optimize research outputs, increase the impact of research, and support open scientific inquiry.

The Curating Research Assets and Data Using Lifecycle Education (CRADLE) Project, in collaboration with EDINA at the University of Edinburgh, has developed an online course which will provide learners with an introduction to research data management and sharing. After completing this course, learners will understand the diversity of data and their management needs across the research data lifecycle, be able to identify the components of good data management plans, and be familiar with best practices for working with data, including the organisation, documentation, storage and security of data. Learners will also understand the impetus and importance of archiving and sharing data, as well as how to assess the trustworthiness of repositories.

After completing this course, learners will also be better equipped to manage data throughout the entire research data lifecycle from project planning to the end of the project when data ideally are shared and made available within a trustworthy repository.
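By way of illustration (this is a generic sketch, not an example from the course, and the field names are a plausible guess rather than any repository’s required schema), documenting a dataset for deposit can be as simple as writing a small machine-readable metadata file alongside the data:

```python
import json

# Minimal, illustrative metadata describing a dataset for archiving.
metadata = {
    "title": "Example survey dataset",
    "creator": "A. Researcher",
    "description": "Responses to a 10-item questionnaire, anonymised.",
    "variables": {
        "participant_id": "Anonymised participant identifier",
        "q1_score": "Response to question 1 (Likert scale, 1-5)",
    },
    "licence": "CC-BY-4.0",
    "date_deposited": "2017-09-01",
}

# Write the metadata next to the dataset so the two travel together
with open("dataset_metadata.json", "w") as f:
    json.dump(metadata, f, indent=2)
```

A record like this is what makes data reusable at the end of the lifecycle: anyone retrieving the files from a repository can see what each variable means without contacting the original team.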

The course material is free to access and if you wish to complete the course with a certificate, there is a charge of £34.

Please click on this link to find out more – https://www.coursera.org/learn/data-management/.