
First BU Research Blog Poll Results

Are journal impact factors a good indicator of quality?

Following the launch of the first BU Research Blog Poll, we received 28 responses to the above question which were split as follows:

  • Yes – always: 2
  • Yes – but in STEM disciplines only: 1
  • Sometimes: 22
  • No, never: 2

The majority of responses indicate that there may be some doubt about the usefulness of impact factors as a proxy for journal quality. This is perhaps because a number of factors that could affect a journal’s perceived quality cannot be demonstrated through metrics alone. In addition, journal metrics such as impact factors are not yet perceived as robust enough to give an accurate indication of journal or article quality, hence HEFCE’s decision not to rely solely on metrics in the Research Excellence Framework (REF).

To continue the debate on this, do feel free to post a reply below, or suggest a topic for a future poll by responding to Julie’s original post. For more information about journal impact factors, have a look at the previous blog post on this subject.

In the meantime, why not get involved in the current poll which can be accessed from the top right-hand side of the blog homepage – it will take just seconds to complete and will help shape the support offered to BU academics in the future.

Investigating academic impact at the London School of Economics: Blogs, Twitter and bumblebees!


The ‘Current Thinking in Assessing Impact’ panel discussion during the LSE impact event on 13 June.

As someone who is still getting to grips with exactly how impact might be defined and operationalised for the REF, I went along to the Investigating Academic Impact Conference at LSE on 13 June looking forward to learning more about precisely how we could create more effective impact case studies for the REF.  The day was opened by Patrick Dunleavy from the Impact of Social Sciences Project at LSE with the challenging statement that we need to think about impact as a long-term, integral part of our research work, and that simply trying to maximise impact for the REF is a short-term strategy.

What followed were sessions on how to use blogging, Wikipedia and Twitter to help enhance your electronic footprint and to engage with the public in new ways.  Following their own advice, all the presentations are now available, along with blogs and tweets, at http://blogs.lse.ac.uk/impactofsocialsciences/presentations/.  On the site there is a comprehensive (200-page) handbook detailing exactly how to increase your citations and how to achieve external impacts and, for those with a shorter attention span, there are some short how-to guides.  These include standard information about citation tools (such as ISI Web of Science and Scopus) as well as more esoteric measures of citation impact (such as the G-index and H scores).
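To make the h-index (one of the citation measures mentioned above) concrete, here is a minimal sketch in Python. The function name and the citation counts are invented for illustration; the definition it implements is the standard one: the largest h such that an author has h papers with at least h citations each.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each.

    `citations` is a list of per-paper citation counts (illustrative data only).
    """
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, count in enumerate(ranked, start=1):
        if count >= rank:   # this paper still has at least `rank` citations
            h = rank
        else:
            break
    return h

# Five papers with made-up citation counts: four of them have >= 4 citations.
print(h_index([10, 8, 5, 4, 3]))  # 4
```

The same ranked-list idea underlies the G-index, which instead asks for the largest g such that the top g papers together have at least g² citations.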

There are also simple tips on how to get more widely read:

  • make sure your titles are informative
  • work on cross-disciplinary projects
  • build dissemination plans
  • have a distinctive name (many thanks to my parents on this one!).

The Impact of Social Sciences project at LSE has created a great resource which means that if you didn’t attend the day it doesn’t matter – the information is there for you to browse and look at anyway.

In the unexpected way that often happens at conferences, there are single pieces of information that are particularly memorable.  For this one it was the importance of the bumblebees!  At both the recent BU Research Impact Event and the LSE conference, one particular case study from the REF impact pilot exercise was singled out for particular praise. This was an elegant case study submitted by the University of Stirling on the conservation of bumblebees which was able to show tangible and far-reaching impact (for further details see http://www.hefce.ac.uk/research/ref/impact/ under Earth Systems and Environmental Sciences).  This is a great concrete example of how good impact case studies might be formulated and from which those still struggling with impact might be able to gain insights.

Perhaps one final message from the day was that, of course, if you wished to have academic impact then the best starting point of all is to have good research to talk about!

Siné McDougall

 

The Public Value of the Humanities

Demonstrating the public value of research will be a significant part of the forthcoming REF exercise. Most major funding bodies now require an impact statement as part of the application process. Universities are being required to demonstrate that their research offers value for money and tangible benefits outside of the academic sphere. This is easier in some disciplines than others, with many people believing the arts, humanities and social sciences (AHSS) will struggle to demonstrate impact.

The Public Value of the Humanities, recently published by Bloomsbury Academic and edited by Prof Jonathan Bate (University of Warwick), shows how the AHSS disciplines can demonstrate that their research has public impact, benefit and value.

For a full review of the book see the review on the THE website.

You can buy this book on Amazon.

Engaging Academic Social Scientists in Government Policy-Making and Delivery

Prof Martin Kretschmer, Professor of Information Jurisprudence and Research Centre Director for CIPPM in the Business School, recently attended a meeting organised by the British Academy and the ESRC on Engaging Academic Social Scientists in Government Policy-Making and Delivery. Here he provides an overview of the issues discussed at the event…

Making research relevant to policy is on the agenda of all Research Councils, as reflected in the Impact measure of REF 2014. The event was co-sponsored by the Government Heads of the Analytical Professions: Government Economic Service, Government Operational Research Service, Government Science & Engineering, Social Science in Government, and the Government Statistical Service. The programme and list of attendees is available here: British Academy event programme and delegate list

Some of the issues raised, and questions asked of the attendees included:

Q1: What do you think government should be doing more of to increase the influence of your research and expertise on government policy making and delivery?

Q2: What do you think the academic social science community should be doing more of to have a direct influence on government policy making and delivery?

Q3: What might encourage you to consider an advisory role to government, for example, as a social scientist on one of the government’s Scientific Advisory Committees?

I assume I was invited because I am just coming to the end of an ESRC Public Sector Fellowship in the UK Intellectual Property Office (within BIS). I also sit on the government’s Copyright Advisory Expert Group, and speak frequently on policy issues, for example last week (1 June) at a Hearing in the European Parliament on The Future of Copyright in the Digital Era

Below, I summarise a few points from the meeting that may be useful for the wider BU research community.

Prof Nick Pidgeon (Professor of Environmental Psychology, University of Cardiff, and Director of the Understanding Risk Research Group) offered four routes to influencing government:

  • Government contract research, including small review contracts.
  • RCUK (or similar) funding in policy relevant area.
  • Advisory Committees.
  • Indirectly, via dissemination through Royal Society, RSA, or similar.

Paul Johnson (Director of the Institute for Fiscal Studies): “Don’t expect to change government policy if your evidence points in a different direction.” There are two choices: either focus on points of detail within the policy direction set by government, or set the agenda for five years hence.

Sir John Beddington (Government Chief Scientific Advisor) stressed the tightrope walk between advice that is a “challenge” and being labelled “unhelpful” (in Sir Humphrey’s language). Academics should risk “challenge” even if it turns out to be “unhelpful”.

Prof Philip Lowe (Professor of Rural Economy, University of Newcastle, and Director of the Rural Economy and Land Use Programme): There is a paradox – how can a government department become a sophisticated consumer of research? Commissioning good research requires knowing what you don’t know, which is hard for civil servants and politicians. It is important to build and sustain links over many years.

Prof Helen Roberts (Professor, General Adolescent and Paediatrics Unit, University College London, and non-executive director of the National Institute for Health and Clinical Excellence NICE): Public sector placements are very useful, both for academic and government, but governance of these grants can be cumbersome. [I can confirm that from my own secondment experience. At some point, there were suggestions that detailed delivery contracts would have to be drawn up between ESRC and BU, ESRC and BIS/IPO, BIS/IPO and BU. In the end, I was simply shown the Official Secrets Act, and the Code of Conduct for Civil Servants, and that was it.]

Importance of human dimension: “Most implementation comes though good relationships, not good research.”

Sharon Witherspoon (Deputy Director of the Nuffield Foundation, and in charge of research in social science and social policy): Most policy advisors deal in “empirically informed counterfactuals”, and are normally grateful if offered help with “What would happen if…?” But academics can often make the most telling contribution through more radical reflection: “I wouldn’t start from here.” Governments are less likely to be open to that kind of challenge. Select Committees are becoming more independent of government (they now have elected chairs) and can be a route to influence.

Paul Doyle (CEO, ESRC): The ESRC is building a database of government policy leads/contacts. Often it is impossible from government websites to identify the civil servants and special advisors dealing with specific policy issues. Government scientists should be encouraged to become members of Learned Societies.

Key points from the open discussion:

  • It is important to maintain independence by constructing a portfolio of funders.
  • Economists are a separate breed in government. They have little concept of wider social research.
  • Responding to consultations is often a good first step to engagement.
  • Academics should use less jargon, shorter sentences.
  • Visual representation of research findings matters greatly.
  • Often it is useful to invite policy makers to academic events. They enjoy coming out of the office, and are less partisan/circumspect in a neutral environment.
  • There is an important corrective function for social scientists in assessing the presentation of data.
  • There is difficulty in presenting the audit trail required for REF Impact. Government has no interest in revealing the sources of its ideas, or it may be politically inconvenient to do so.

REF Impact Observations

Shortly we will learn the REF panel criteria and guidance on submissions, due to be published in July.  At BU we have engaged with some excellent initiatives, such as the recent REF event, to help focus our approach to the impact case studies. Measuring the benefit that UK research has on society is an opportunity to understand the value of our work. It is already well documented that the concept of research impact has, for the moment, protected some sources of external funding, at least for the next few years. As academics concerned with the REF we now need to deliver evidence of the impact that good BU research has had since 2008. That research could have happened almost as far back as 1993.

Impact is a difficult concept and not easy to measure in terms of the REF. From an engineering viewpoint impact is difficult to quantify because, like the REF, it involves knowledge of time in addition to force: engineering impact force increases as time decreases. This view of time relates to the general research impact debate. We now have the opportunity to look back and review BU research over almost the lifetime of the University.

Investigating Academic Impact event at LSE on 13 June

The LSE Public Policy Group is running a free one day event on evidencing the impact of research.

Date: Monday 13 June 2011 
Time: 10am-5pm 
Venue:  New Academic Building, LSE, London

Academics are increasingly being pressed to provide evidence of impact from their research on the world outside academia. And universities will have to provide evidence of impact as part of the new Research Excellence Framework. But there is confusion about the different definitions of impact that exist amongst funding bodies and research councils, and also about methods of measuring impact.

This one-day conference will look at a range of issues surrounding the impact of academic work on government, business, communities and public debate. We will discuss what impact is, how impacts happen and innovative ways that academics can communicate their work. Practical sessions will look at how academic work has impact among policymaking and business communities, how academic communication can be improved, and how individual academics can easily start to assess their own impact.

PANELS:
Research Impact and the REF
Professor Rick Rylance (Chief Executive, Arts and Humanities Research Council)
David Sweeney (Director of Research, Innovation and Skills, HEFCE)
Professor Paul Wiles (Panel Chair, social work and social policy panel, REF impact pilot)

Current Thinking in Assessing Impact
Professor Patrick Dunleavy (Impact of Social Sciences project, London School of Economics)
Professor Alan Hughes (Centre for Business Research, University of Cambridge)
Tomas Ulrichsen (Public and Corporate Economic Associates)

Innovative Methods for Impact and Engagement
Professor Stephen Curry (blogger, Imperial College London)
Martyn Lawrence (Senior Publisher, Emerald Insight)
Paul Manners (Director, National Coordinating Centre for Public Engagement, UWE)
Mike Peel (Jodrell Bank Centre for Astrophysics / Wikimedia UK)

BREAKOUT SESSIONS:
Academic impact on policy-making
Maria O’Beirne (Analysis and Innovation Directorate, Department for Communities and Local Government)
Jill Rutter (Better Policy Making Programme Director, Institute for Government)

Knowledge transfer and the role of research mediators
Nick Pearce (Director, IPPR)
Professor Judy Sebba (University of Sussex)

Academic impacts on industry and business
James John (Director of Strategy, Civil Government, HP)

A ‘how to’ guide to measuring your own academic impact
Jane Tinkler (Impact of Social Sciences project, London School of Economics)

Improving academic communication
Professor Patrick Dunleavy (Impact of Social Sciences project, London School of Economics)
Chris Gilson (Managing Editor, British Politics and Policy blog, London School of Economics)

This event is free and open to all but pre-registration is required. For more information phone the PPG team on 020 7955 6064 or 020 7955 6731, or email impactofsocialsciences@lse.ac.uk. You can find more information on the Investigating Academic Impact website.

BU Research Impact event is a success!

Last Friday BU held an internal Research Impact event to share the success of the excellent research that has been undertaken by BU academics. The focus of the event was on how this research has had an impact outside of academia, for example an impact on society, the economy, quality of life, culture, policy, etc.

For the forthcoming REF2014 BU will be required to include a number of research impact case studies as part of the submission. This is a new element of the REF (previously the RAE) and the HE sector has been grappling with the concept of impact for a number of years now.

The event, attended by over 75 BU staff, opened with a presentation from Prof Matthew Bennett (Pro Vice Chancellor – Research, Enterprise and Internationalisation) on BU’s future research strategy, planning for the REF, and how to develop and evidence research impact.

Part of the presentation focused on the BU Research Themes which are currently being identified and defined through academic consultation via the Research Blog. This is still in the early stages but Matthew presented the ten draft themes that are emerging. You can comment on the emerging themes here.

There were 35 impact case studies presented in total, with most units of assessment (UOAs) presenting three case studies. At the end of each presentation members of the audience critiqued the case study and offered advice on how to strengthen and maximise the impact claim.

Attendees were encouraged to go to impact case study presentations from different UOAs/Schools to find out about research that is undertaken in different areas of the University. Stronger impact case studies can also be developed with input from different disciplines.

The event was also attended by key staff from Marketing & Communications who will be working with UOA Leaders to develop and enhance impact case studies between now and the REF submission in autumn 2013.

There has been much positive feedback received from attendees and we are considering whether this should now be an annual event, celebrating the success of BU research and its benefit to society.

Many thanks to all the presenters and attendees, and everyone who supported the event and made it such a success! 😀

We are now seeking feedback on the impact case studies presented. These are all available on the I-drive (I:\CRKT\Public\RDU\REF\REF event May 2011\impact case study presentations). Please could you email your feedback to Anita Somner in the Research Development Unit by Friday 3 June. Anita will then anonymise and collate the feedback and share it with the UOA Leaders.

For further information on impact see the impact pages on the HEFCE website or our previous BU Research Blog posts on impact.

The excellent HEFCE REF event at BU!

Developing and Assessing Impact for the REF

Last week BU hosted a HEFCE-supported event for universities in the south of England outlining recent changes in how the quality of research in higher education is assessed.

The event, attended by over 150 delegates from 39 institutions, outlined the new Research Excellence Framework (REF) which includes a new assessment element focusing on research impact.

As Chris Taylor, Deputy REF Manager for HEFCE, explained: “REF will provide accountability for public investment in research and demonstrate its benefits.” He continued:

“Impact is defined as any contribution the research makes outside of academia. It is the higher education sector’s opportunity to shout about what it contributes to society.”

Professor Peter Taylor-Gooby (University of Kent), Professor Roy Harrison (University of Birmingham), Professor James Goodwin (Age UK), Dr Kathryn Monk (Environment Agency Wales) and Dr Mari Williams (RCUK) presented their experiences of assessing impact case studies in the REF pilot exercise. Professor Jim Griffiths (University of Plymouth) presented his experience of identifying and submitting impact case studies to the pilot exercise in the hope that others would learn from his experience.

Prevalent themes emerging from the pilot included the importance of a demonstrable chain of evidence from impact claim through to outcome, high-quality research underpinning the impact claim, and fostering the crucial relationship between academic and user.

Professor James Goodwin explained how research can change society for people’s benefit, stressing the importance of “converting research into a message that will influence people’s thinking”. He gave the recent removal of the default retirement age as an example of how this can influence policy.

The event closed with a Q&A session with all speakers, giving delegates the chance to obtain further clarity on the REF that will undoubtedly change the future of higher education research.

Matthew Bennett (BU’s PVC for Research, Enterprise and Internationalisation) said: “There has been sector-wide concern about how impact will be defined, collated and assessed in the REF, and this event provided excellent advice and guidance for academic staff likely to be submitting to the REF and those leading the submissions.”

The deadline for submissions is November 2013 and the assessment will take place in 2014.

We will be adding further posts to the Research Blog focusing on the good practice shared at the event (such as defining impact, what makes a strong impact case study, etc) over the next few weeks.

REF event 19 & 20 May 2011 – REGISTRATION IS OPEN!!

BU is hosting a two-day REF event on Thursday 19 and Friday 20 May 2011. All staff are invited to attend.

The event is of interest to BU academic staff and anyone who will be involved in the BU submission to the REF.

There will be three separate sessions:

Session 1
Thursday 19 May 9am-2pm
This session will be open to BU staff and external delegates.
There will be presentations from the REF team at HEFCE, REF impact pilot panel members, and a REF impact pilot institution (University of Plymouth).

Session 2
Thursday 19 May 2pm-5pm
This session is only open to BU staff.
This session will provide BU staff with the opportunity for internal networking, followed by a demonstration of BU’s new publications management system and a presentation on preparing a publication profile for the REF.

Session 3
Friday 20 May 9:45am-4:30pm
This session is only open to BU staff.
The focus of this session is the development of the BU impact case studies. There will be presentations of the impact case studies being developed at the moment.

All sessions will take place in Kimmeridge House and Poole House, Talbot Campus.

You must register separately for each session you will be attending.

See our previous REF Event blog post for further details. The provisional programmes are available on the registration forms (see links above).

REF event 19 & 20 May 2011 – SAVE THE DATE!

BU will be holding a two-day Research Excellence Framework (REF) event on 19 and 20 May, which all staff are invited to attend.

Day 1 (open to BU staff and external delegates)
9am-2pm – this will be an external event supported by HEFCE to which all HEIs in the South of England will be invited. The focus will be on developing and assessing impact for the REF. There will be speakers from HEFCE, an academic from one of the impact pilot institutions (University of Plymouth), and some of the impact pilot panel members. The event is aimed primarily at academics likely to be submitted to the REF and UOA Leaders. It will provide a forum for networking and discussion around preparations for the impact element of the REF.

Day 1 (open to BU staff only)
2pm-5pm – There will be an opportunity for internal networking, a demonstration of the publications management system BU will soon be implementing, and a talk by Prof Matthew Bennett on building a publication profile for the REF.

Day 2 (open to BU staff only)
9am-4:30pm – the focus of Day 2 is the development of the BU impact case studies. The day will open with a presentation by Prof Matthew Bennett on what impact actually is, followed by presentations of the impact case studies being developed at the moment (3 per Unit of Assessment). These will run in 9 concurrent sessions with 4 presentations taking place in parallel during each session. The main aims of Day 2 are to get academics thinking about the impact case studies in a structured way, to identify resource requirements to maximise potential impacts, and to engage staff from M&C with the case studies being developed. In addition this is a great opportunity to showcase the excellent research that is undertaken at BU, to meet colleagues from other Schools, and to stimulate ideas for future research collaborations.

The event is free to attend but booking is essential. Booking will open next week – further details to follow!

REF Highlight Report #8

The latest REF Highlight Report is now available from the Research Intranet.

Key points include updates on:

  • progress with the UOA Action Plans
  • the REF two-day event to be held at BU in May (19th/20th)
  • the second mock exercises for UOAs 7 and 26
  • the RASG and RALT meetings held in March

You can access the full document from here: REF Highlight Report #8

Journal Impact Factors Explained

There is often some confusion around Journal Impact Factors in terms of where they come from, how they’re calculated and what they mean. Hopefully the following will provide a brief explanation.


What are Journal Impact Factors?
Journal Impact Factors are just one of a number of journal analytical measures that form part of an online resource provided by Thomson Reuters on their Web of Knowledge called Journal Citation Reports® (JCR), which covers journals in the sciences, technology and social sciences. JCR provides a facility for the evaluation and comparison of journals across fields within the subject areas covered.

Other publications databases may provide their own tools for bibliometric or citation analysis (such as Elsevier’s Scopus) but Journal Impact Factors are only found on the Web of Knowledge.

A Journal Impact Factor is the average number of times that articles published in a particular journal during the previous two years were cited in the JCR year.

How are Journal Impact Factors calculated?
Journal Impact Factors are calculated by taking the number of citations received in the JCR year by articles the journal published in the two previous years, and dividing it by the total number of articles the journal published in those two years. For example, an Impact Factor of 2.5 means that, on average, articles published in that journal during the previous two years were each cited two and a half times in the JCR year. Citing articles may be from the same journal, although most citing articles are from different journals.
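As a worked sketch of this calculation (the journal and all numbers are invented for illustration; this is not how JCR itself is queried):

```python
def impact_factor(citations_in_jcr_year, articles_prev_two_years):
    """Two-year Journal Impact Factor.

    citations_in_jcr_year: citations received in the JCR year by items
        the journal published in the two previous years.
    articles_prev_two_years: citable articles the journal published in
        those two years.
    """
    return citations_in_jcr_year / articles_prev_two_years

# Hypothetical journal: 250 citations in 2011 to its 2009-2010 articles,
# which numbered 100 citable items in total.
print(impact_factor(250, 100))  # 2.5
```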

The number of articles given for journals listed in JCR primarily include original research and review articles. Editorials, letters, news items and meeting abstracts are usually not included in article counts because they are not generally cited. Journals published in non-English languages or using non-Roman alphabets may be less accessible to researchers worldwide, which can influence their citation patterns.

How are Journal Impact Factors used?
Journal Impact Factors can help in understanding how many citations journals have received over a particular period – it is possible to see trends over time and across subject areas, and they may help when you’re deciding where to publish an academic paper. However, as with all statistics, Journal Impact Factors should be used with caution and should ideally be combined with other metrics depending on how they’re being applied.

Equally, a journal’s Impact Factor is not necessarily a direct indicator of the quality of an individual paper published in that journal. Some published articles never receive any citations, for various reasons, even if they appear in a journal with a high Impact Factor.
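A toy example (with invented citation counts) shows why the average can mislead: a handful of highly cited papers can lift a journal’s mean citation rate while the typical paper receives very few citations.

```python
import statistics

# Hypothetical citation counts for ten articles in one journal.
# One blockbuster paper dominates; most papers are barely cited.
citations = [120, 15, 4, 2, 1, 1, 0, 0, 0, 0]

mean = statistics.mean(citations)      # the impact-factor-style average
median = statistics.median(citations)  # what the typical paper actually gets

print(f"mean = {mean}, median = {median}")  # mean = 14.3, median = 1
```

Here the journal-level average would look impressive, yet half the articles were cited once or not at all — which is why an Impact Factor says little about any single paper.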

Journal Impact Factors and the REF
In some subject areas, assessment panels will be provided with citation metrics as part of HEFCE’s Research Excellence Framework (REF), which will help inform the panel members’ judgements. However, journal impact factors or equivalent journal ranking systems (e.g. the ABS list) will NOT be used at all within the assessment process.

Publication, publication, publication!

It was with mixed feelings that I settled down to watch the first episode of Campus last night. Would it be funny, would I get the in-jokes, would they mention research, or would it be too close to the mark and therefore too painful to watch? The main thrust of the episode saw Vice Chancellor Jonty de Wolfe pressuring English professor Matt Beer to write a best-selling publication, as one of his colleagues in another department had recently managed, but unfortunately the professor was too distracted to comply. Replace distracted with another word (perhaps busy, unsure, pressured) and this may resonate a little better with BU.

Whilst Campus was far fetched and at times utterly ridiculous, the pressures on academics to produce high impact publications are very true, especially now as we are preparing for our submission to the REF. Rather than acting like tyrannical and eccentric VC de Wolfe, we’ve pulled together some sources of information for academics feeling the pressure of publication.

How to get published – The Times Higher Education has produced an excellent booklet, How to get Published: a Guide for Academics. The guide includes seven chapters, written by experts in academic publishing, with advice and information on the publication process, getting your work into an academic journal, and how to turn your research into a best seller (I’m sure this last chapter would have been useful for the professor in Campus last night).

How to get published in academic journals – The road to getting published in academic journals can be a daunting one. A booklet published by PSA/Wiley-Blackwell, Publishing in Politics: a Guide for New Researchers, is an excellent introduction to publishing, recommended for researchers in all disciplines, not just politics.

Professor Keith Dowding (LSE) has produced a couple of guides for those new to getting published in academic journals which are particularly useful. These were published in European Political Science and provide an overview of the journal publishing journey:

Individual journal publishers usually provide advice and guidelines for prospective authors – these can normally be found on their websites.

Open access publishing – BU has a central budget for paying for open access publishing costs. Read more here.

Do you have any advice on getting published that could benefit your colleagues? If so share it here by adding a comment to the BU Research Blog!