Posts By / sastringer

REF Week – REF 2021: An overview

Photo by Martin Sanchez on Unsplash

This week is REF Week on BU’s Research Blog and what better way to start than with an overview of the REF 2021 exercise.

What is the REF?

The REF was first carried out in 2014, replacing the previous Research Assessment Exercise. The REF is undertaken by the four UK higher education funding bodies: Research England, the Scottish Funding Council (SFC), the Higher Education Funding Council for Wales (HEFCW), and the Department for the Economy, Northern Ireland (DfE).

What is the REF’s purpose?

The funding bodies’ shared policy aim for research assessment is to secure the continuation of a world-class, dynamic and responsive research base across the full academic spectrum within UK higher education. We expect that this will be achieved through the threefold purpose of the REF:

  • To provide accountability for public investment in research and produce evidence of the benefits of this investment.
  • To provide benchmarking information and establish reputational yardsticks, for use within the HE sector and for public information.
  • To inform the selective allocation of funding for research.

How is the REF carried out?

The REF is a process of expert review, carried out by expert panels for each of the 34 subject-based units of assessment (UOAs), under the guidance of four main panels. Expert panels are made up of senior academics, international members, and research users.

For each submission, three distinct elements are assessed: the quality of outputs (e.g. publications, performances, and exhibitions), their impact beyond academia, and the environment that supports research.

Outputs

The output element makes up 60% of the assessment (a decrease from 65% in 2014) and consists of outputs produced by the University during the assessment period (1st January 2014 to 31st December 2020).

Unlike previous REF exercises, REF 2021 will not be a selective exercise – all staff employed by BU on the census date (31st July 2020) who are deemed to have a significant responsibility for research will be submitted, each with a minimum of one research output.

Each UOA will submit an output pool for the unit as a whole, the size of which will be the total FTE of staff submitted multiplied by 2.5. The output pool can also include outputs from former members of staff.
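The pool-size rule above is simple arithmetic, and can be sketched as follows (the FTE figure is invented for illustration; the exact rounding rules are set out in the REF Guidance on Submissions):

```python
# Sketch of the REF 2021 output-pool calculation described above.
# The FTE figure used below is illustrative only.

def output_pool_size(total_fte: float) -> int:
    """Pool size for a unit = total submitted staff FTE multiplied by 2.5."""
    return round(total_fte * 2.5)

# A unit submitting 20 FTE of staff would return a pool of 50 outputs:
print(output_pool_size(20.0))  # 50
```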

Outputs will be assessed in terms of their originality, significance and rigour and will be assigned a star rating on this basis: 4* (world-leading), 3* (internationally excellent), 2* (recognised internationally), 1* (recognised nationally) or unclassified.

Impact

The impact element makes up 25% of the assessment (an increase from 20% in 2014) and consists of case studies which describe specific impacts that have occurred during the assessment period (1 August 2013 to 31 July 2020) that were underpinned by excellent research undertaken by the university.

For the purposes of the REF, impact is defined as an effect on, change or benefit to the economy, society, culture, public policy or services, health, the environment or quality of life, beyond academia.

Each UOA will submit a number of impact case studies determined by the total FTE of staff submitted.

They will be assessed in terms of their reach and significance and will be assigned a star rating on this basis: 4* (outstanding), 3* (very considerable), 2* (considerable), 1* (recognised but modest) or unclassified.

Environment

The environment element makes up 15% of the assessment and consists of:

  • Quantitative data on:
    • Research doctoral degrees awarded
    • Research income
    • Research income-in-kind
  • An institutional-level environment statement, which includes information about the institution’s strategy and resources to support
    research and enable impact, relating to the period 1 August 2013 to 31 July 2020.
  • A unit-level environment template, which includes information about the environment for research and enabling impact in
    each submitting unit, relating to the period 1 August 2013 to 31 July 2020.

Each UOA will submit one environment template, which will be assessed in terms of its vitality and sustainability and assigned a star rating on the same five-point scale (4* to unclassified).

Want to know more?

For more information about REF 2021, have a look at the REF Guidance on Submissions and REF Panel Criteria and Working Methods.

Also, have a look at our other BU REF Week blog posts.

REF Week: Key Changes

Off with the old and on with the new…

Photo by Simon Migaj on Unsplash

Following REF 2014, Lord Stern conducted a major review of the exercise. This review has led to a number of changes to the exercise to be carried out in 2021. Here is a summary of some of the key changes:

  • Staff Submission – Unlike REF 2014, REF 2021 will not be a selective exercise. We will be required to include all staff who have a significant responsibility for research.

Further information on staff submission can be found in the REF Guidance on Submissions. There is a particularly useful flow diagram on page 36.

  • Decoupling Outputs from Staff – We will submit a pool of outputs produced at Bournemouth during the REF period, rather than the four papers per person in 2014. This will need to include one paper from every person in post on the census date (31/07/2020) but can also include outputs from staff who have left the University.

Further information on output submission can be found in the REF Guidance on Submissions. There is a particularly useful flow diagram on page 52.

  • Open Access – Journal articles and conference proceedings accepted for publication after 1st April 2016 must meet open access requirements.

Further information on the Open Access Policy can be found in the REF Guidance on Submissions page 54.

  • Impact – There is a broader definition of impact to emphasise public engagement and to include impact on teaching.

Further information on the definition of impact can be found in the REF Guidance on Submissions page 68.

  • Interdisciplinary Research – A number of additional measures have been introduced to support submission and assessment of interdisciplinary research.

Further information on measures to support interdisciplinary and collaborative research can be found in the REF Guidance on Submissions page 24.

  • Units of Assessment (UOAs) – The number of UOAs has been reduced from 36 to 34.

A full list of the UOAs can be found in the REF Guidance on Submissions Annex D page 91. The UOA descriptors can be found in the Panel criteria and working methods page 9.

  • Weightings – As in REF 2014, each submission will comprise three elements; however, the weightings have been revised to Outputs 60%, Impact 25% and Environment 15%.

Further information on the assessment criteria can be found in the REF Guidance on Submissions page 7.

Want to know more?

For more information about REF 2021, have a look at the REF Guidance on Submissions and REF Panel Criteria and Working Methods.

Also, have a look at our other BU REF Week blog posts.

REF 2021 – Final Guidance Published!

Research England have this morning published the final guidance for REF 2021 submissions. The following documents:

  • Guidance on submissions
  • Panel criteria and working methods
  • Guidance on codes of practice

are available on the publications page of the REF 2021 website: https://www.ref.ac.uk/publications/

For further information, and to read the official announcement, please visit the REF news page.

REF Internal Review Panels – Recruiting Now!

To help us prepare for our upcoming submission to the Research Excellence Framework (REF) 2021 we are establishing a number of internal review panels to review and assess BU’s research outputs and impact case studies.

Expressions of Interest (EoI) are invited from academic staff who are interested in being a Panel Member. There will be one panel per Unit of Assessment (UOA) listed below. Those interested should identify which UOA Panel they would like to be considered for and put forward a short case (suggested length of one paragraph) as to why they are interested in the role and what they think they could bring to it. EoIs should be emailed to ref@bournemouth.ac.uk by 14th December 2018.

UOA Teams would particularly welcome EoIs from those who have:

  • Experience reviewing for previous REF stocktake exercises
  • Experience in editorship
  • Experience of peer review

Full details of the role, the process of recruitment and terms of reference for the panels themselves can be found here.

Any queries regarding a specific panel should be directed to the UOA Leader. General enquiries should be directed to Shelly Anne Stringer, RKEO.

Unit of Assessment UOA Leader(s)
2 Public Health, Health Services and Primary Care Prof. Edwin Van Teijlingen
3 Allied Health Professions, Dentistry, Nursing and Pharmacy
4 Psychology, Psychiatry and Neuroscience Dr. Peter Hills
11 Computer Science and Informatics Prof. Hamid Bouchachia
12 Engineering Prof. Zulfiqar Khan
14 Geography and Environmental Studies Prof. Rob Britton
15 Archaeology Prof. Kate Welham and Prof. Holger Schutkowski
17 Business and Management Studies Prof. Dean Patton
18 Law Dr Sascha-Dominik Bachman
20 Social Work and Social Policy Prof. Jonathan Parker
23 Education Prof. Julian McDougall and Prof. Debbie Holley
24 Sport and Exercise Sciences, Leisure and Tourism Prof. Tim Rees (Sport) Prof. Adam Blake (Tourism)
27 English Language and Literature Prof. Bronwen Thomas
32 Art and Design: History, Practice and Theory Prof. Jian Chang
33 Music, Drama, Dance, Performing Arts, Film and Screen Studies Prof. Kerstin Stutterheim
34 Communication, Cultural and Media Studies, Library and Information Management Prof. Iain MacRury

 

BRIAN Down – 11th October 2018

IT are undertaking essential maintenance on the BRIAN servers at 8am on Thursday 11th October. This will involve BRIAN being unavailable to users for a short period of time.

We will communicate on the blog as soon as BRIAN is up and running again.

Publish Open Access in Springer Journals for Free!

BU has an agreement with Springer which enables its authors to publish articles open access in one of the Springer Open Choice journals at no additional cost.

There are hundreds of titles included in this agreement, some of which are Hydrobiologia, European Journal of Nutrition, Annals of Biomedical Engineering, Climatic Change, Marine Biology and the Journal of Business Ethics. A full list of the journals included can be found here.

To make sure that your article is covered by this agreement, Springer will ask you to confirm the following once it has been accepted for publication:

  • My article has been accepted by an Open Choice eligible journal
  • I am the corresponding author (please use your institutional email address not your personal one)
  • I am affiliated with an eligible UK institution (select your institution’s name)
  • My article matches one of these types: OriginalPaper, ReviewPaper, BriefCommunication or ContinuingEducation

Springer will then verify these details with us, and your article will be made available open access under a CC BY licence.

Please note that 30 Open Choice journals are not included in this agreement as they do not offer CC BY licensing.

If you have any questions about the agreement or the process, please contact OpenAccess@bournemouth.ac.uk.

Plan S – Making Open Access a reality by 2020

On 4 September 2018, 11 national research funding organisations, with the support of the European Commission including the European Research Council (ERC), announced the launch of cOAlition S, an initiative to make full and immediate Open Access to research publications a reality. It is built around Plan S, which consists of one target and 10 principles.

cOAlition S signals the commitment to implement, by 1 January 2020, the necessary measures to fulfil its main principle: “By 2020 scientific publications that result from research funded by public grants provided by participating national and European research councils and funding bodies, must be published in compliant Open Access Journals or on compliant Open Access Platforms.”

Further information on cOAlition S can be found here – https://www.scienceeurope.org/coalition-s/

Some reactions can be found here –

LERU

Nature

Science

STM

To Catch A Predatory Publisher

I often wonder if other scientists wake up every morning to delete a deluge of spam messages from no-name journals and questionable conferences. Sometimes one of these emails will escape my extermination efforts and I end up reading it by accident. The invitations from so-called “predatory” publishers are so transparently fake and poorly written that a part of me finds their annoying overtures oddly amusing.

I realize that predatory publishing and phishing emails are not laughing matters. There has been an explosion of predatory publishers trying to con scientists out of their money. For a fee, these journals or books are just frothing at the mouth to publish your work and collect your cash. Some may even invite you to serve on their “prestigious” editorial board, but this is just to lend an air of authenticity to their sham operation.

What separates a predatory publisher from a legitimate science publisher? Both charge you large sums of money for you to do all the work, but the latter employs a rigorous peer-review process that ensures the articles they publish have been properly vetted. Sting operations have revealed that predatory journals will publish absolute gibberish, proving they are phonies who just want to make fast cash. A recent sting operation involved the submission of a manuscript about midi-chlorians from Star Wars, written by Dr. Lucas McGeorge and Dr. Annette Kin. The fictional paper was accepted by four of these flimflam publications masquerading as legitimate scientific journals.

In an effort to help root out some of these predatory publishers, I’ve compiled some of my favorite lines from the suspect emails I receive on a daily basis. I hope this helps people spot dubious publishers. The typos, spelling mistakes, and grammatical errors were left in place intentionally—exactly as they were sent to me. These types of errors represent a big red flag that a predatory publisher is stalking you. Please don’t take them as a sign that the PLOS editorial staff is sleeping on the job!

 

The Greeting That Proves They Have No Idea Who You Are

Fake journals will address you in unusual ways, or not at all! Some make no effort to conceal that they merely cut and paste your name into the slot of a form letter. Here are some examples I have received.

Dear Dr. WJ William J,

Dear ,

Dear Dr.Jr WJ ,

Dear Dr. SULLIVAN,

Dear Dr. William J. Sullivan, Jr.1,2*,

Dear Dr. Jr,

Dear Dr. Jr William,

Dear Author,

Dear Researcher,

Dear Dr. Ferris (or someone else who is not me),

The opening is almost always followed by something like this:

“Greetings from the [PREDATORY] Journal!!!” or “Hope you are doing great!!” or “Hope our e-mail finds you well and in healthy mood.”

 

A New Type Of Sport: Extreme Flattery

Reading these emails can be a big boost to your ego, but keep in mind that thousands of others received the same exact praise. I’ve been called “esteemed,” “brilliant,” “magnificent,” and the “leader in the field.” I should show these emails to my mom; fake or not, she’d be very proud to see her son put on such a high pedestal.

They also make our work out to be the greatest thing since the microscope:

“It’s your eminence and reputation for quality of research and trustworthiness in the field of [insert a field that usually has no relationship to mine whatsoever] and for which you have been invited to become an honourable editorial board member.”

“I am impressed by your quality work and I really value your contribution towards recent work.”

“We have gone through your papers and find it is a wonderful resource for upcoming works.”

“It would be our honour and great fortune if you will share your manuscript.”

“It is our immense pleasure to invite you and your research allies to submit manuscript.” (I call all of my collaborators my Research Allies now, and warn everyone that if you’re not with us, you’re against us).

Some of the spam even tries to reassure you that it is not spam. Although I don’t doubt the last sentence (in bold).

“This is not a spam message, and has been sent to you because of your eminence in the field. If, however, you do not want to receive any email in future then reply us with the subject remove /opt-out. We are concern for your privacy.”

 

Smell the Desperation?

There is a palpable urgency in these invitations to get their hands on your work (read: cash) before someone else does…

“We consider Mini Reviews, Original Research/ Review Article, Case Reports, Short Communication, Conference Proceedings, Commentaries, Book Reviews etc.”

“[Our journal] publishes all kinds of papers in all areas of this field.”

“It is our immense pleasure to invite you to submit Research, Review, Mini review, Short commentary, Commentary, Case Reports, Methodologies, Systematic Reviews (or any type of article) for the upcoming issue of our journal.”

“We necessitate your plate of stuff in the frame of article.”

“We are in a deadly need of an article.”

 

The Closing That Is Too Close For Comfort

Emails from predatory publishers often close with more friendly compliments, with the hope that you will continue to provide them a constant research revenue stream…

“We look forward to a close and lasting scientific relationship for the benefit of scientific community.”

“Anticipating for a positive response!”

“Thank you and have a great day doctor!”

“Our journey is effectively heading.” (whatever that means)

And here are some of the more amusing ways they describe my anticipated work:

“We await your adorable paper.”

“Your paper will serve as a wave maker.”

“We aspire you also to be a significant part of our team by publishing your magnificent article.”

And one that is just plain baffling:

“For future anticipation to reach out scientific community we want your support towards the success of Journal.”

They may even try to get you to do their dirty work for them. I take the following sentence to mean that they want me to spread the word about their predatory journal and invite my colleagues to be suckered into submitting work to them as well:

“As this is an invited submission request, you can also suggest your colleagues.”

 

Some Final Warning Signs

Be wary of any inaugural issues or invitations that ask you to submit your paper through an email address.

These predatory publishers may also try to give you a false sense of the importance of their “journal.” One of these invitations boasted that “email newsletters are being circulated to 90,000+ subscribers.” All this tells me is that 89,999 other people received their unsolicited spam.

I’ve also noticed they are often willing to discount their publication fees if you send them something – “anything at all” – within seven to ten days. No legitimate journal expects you to whip up a quality article in that short a time.

Oh, and one more thing: many of these emails have an excessive number of exclamation points!!!

I hope that you are now better equipped to catch a predator. If you still aren’t sure if your email invitation describes a bona fide journal or not, you can check Beall’s List of Predatory Journals and Publishers or see if the journal is indexed on PubMed. Here’s hoping that you don’t get fooled!!!

Originally posted on the 4th October 2017 – https://blogs.plos.org/scicomm/2017/10/04/to-catch-a-predatory-publisher. Reposted here with permission.

Open Research Day – Today!!

Today colleagues will be available on both campuses to answer all your queries about Open Research.

We’ll be in BG11 on Lansdowne between 9am and 12pm, and FG04 on Talbot between 1pm and 4pm.

Pop on down… there is cake! 🙂


BRIAN Downtime – Monday & Tuesday

BRIAN will be unavailable to users on Monday 30th April and Tuesday 1st May for a scheduled upgrade.

All relevant guidance notes and video guides on the Staff Intranet will be updated in due course. If you need any help using the new system or if you encounter any problems after the upgrade, please do send an email to BRIAN@bournemouth.ac.uk and a member of staff will be able to assist you.

BRIAN training sessions are also available every two months and are bookable through Organisational Development. The next session scheduled is:

  • Wednesday 20th June 2018

In the meantime, if you do have queries relating to the upgrade, please get in touch with BRIAN@bournemouth.ac.uk

BRIAN Upgrade – Next Week!

BRIAN will be upgrading to a new version next week, so will be inaccessible to users on Monday 30th April and Tuesday 1st May.
The main improvement in this upgrade is the introduction of a new Assessment module to enable more efficient REF preparation. However, we hope also to introduce more user-friendly reporting over the next few months.

All relevant guidance notes and video guides on the Staff Intranet will be updated in due course. If you need any help using the new system or if you encounter any problems after the upgrade, please do send an email to BRIAN@bournemouth.ac.uk and a member of staff will be able to assist you.

BRIAN training sessions are also available every two months and are bookable through Organisational Development. The next session scheduled is:

  • Wednesday 20th June 2018

In the meantime, if you do have queries relating to the upgrade, please get in touch with BRIAN@bournemouth.ac.uk

Research & Knowledge Exchange Development Framework – give us your feedback

It’s been over 18 months since Bournemouth University launched its new Research & Knowledge Exchange Development Framework, which was designed to offer academics at all stages of their career opportunities to develop their skills, knowledge and capabilities.

 

Since its launch, over 150 sessions have taken place, including sandpits designed to develop solutions to key research challenges, workshops with funders such as the British Academy and the Medical Research Council and skills sessions to help researchers engage with the media and policy makers.

 

The Research & Knowledge Exchange Office is currently planning activities and sessions for next year’s training programme and would like your feedback about what’s worked well, areas for improvement and suggestions for new training sessions.

 

Tell us what you think via our survey and be in with a chance of winning a £30 Amazon voucher. The deadline date is Wednesday 28th March.

What is Open Access?

Open access is about making the products of research freely accessible to all. It allows research to be disseminated quickly and widely, enables the research process to operate more efficiently, and increases the use and understanding of research by business, government, charities and the wider public.

There are two complementary mechanisms for achieving open access to research.

The first mechanism is for authors to publish in open-access journals that do not receive income through reader subscriptions.

The second is for authors to deposit their refereed journal article in an open electronic archive.

These two mechanisms are often called the ‘gold’ and ‘green’ routes to open access:

  • Gold – This means publishing in a way that allows immediate access to everyone electronically and free of charge. Publishers can recoup their costs through a number of mechanisms, including through payments from authors called article processing charges (APCs), or through advertising, donations or other subsidies.
  • Green – This means depositing the final peer-reviewed research output in an electronic archive called a repository. Repositories can be run by the researcher’s institution, but shared or subject repositories are also commonly used. Access to the research output can be granted either immediately or after an agreed embargo period.

Article first published – http://www.hefce.ac.uk/rsrch/oa/whatis/

To encourage all academic communities to consider open access publishing, Authors Alliance has produced a comprehensive ‘Understanding Open Access‘ guide which addresses common open access related questions and concerns and provides real-life strategies and tools that authors can use to work with publishers, institutions, and funders to make their works more widely accessible to all.

To access and download the guide, please follow this link – http://authorsalliance.org/wp-content/uploads/Documents/Guides/Authors%20Alliance%20-%20Understanding%20Open%20Access.pdf

For any other open access related queries, please do get in touch with Shelly Anne Stringer in RKEO.

There’s no such thing as a bad metric.

Lizzie Gadd warns against jumping on ‘bad metrics’ bandwagons without really engaging with the more complex responsible metrics agenda beneath.

An undoubted legacy of the Metric Tide report has been an increased focus on the responsible use of metrics, and along with this a notion of ‘bad metrics’. Indeed, the report itself even recommended awarding an annual ‘Bad Metrics Prize’. This has never been awarded as far as I’m aware, but nominations are still open on their web pages. There has been a lot of focus on responsible metrics recently. The Forum for Responsible Metrics has done a survey of UK institutions and is reporting the findings on 8 February in London. DORA has upped its game and appointed a champion to promote their work, and they seem to be regularly retweeting messages that remind us all of their take on what it means to do metrics responsibly. There are also frequent Twitter conversations about the impact of metrics in the upcoming REF. In all of this I see an increasing amount of ‘bad metrics’ bandwagon-hopping. The anti-Journal Impact Factor (JIF) wagon is now full, and its big sister, the “metrics are ruining science” wagon, is taking on supporters at a heady pace.

It’s not a bad thing, this increased awareness of responsible metrics; all these conversations. I’m responsible metrics’ biggest supporter, and a regular slide in my slide-deck shouts ‘metrics can kill people!’. So why am I writing a blog post that claims there is no such thing as a bad metric? Surely these things can kill people? Well, yes, but guns can also kill people, yet they can’t do so unless they’re in the hands of a human. Similarly, metrics aren’t bad in and of themselves; it’s what we do with them that can make them dangerous.

In Yves Gingras’ book, “Bibliometrics and Research Evaluation”, he defines the characteristics of a good indicator as follows:

  • Adequacy of the indicator for the object that it measures
  • Sensitivity to the intrinsic inertia of the object being measured
  • Homogeneity of the dimensions of the indicator.

So, you might have an indicator such as ‘shoe size’, where folks with feet of a certain length get assigned a certain shoe size. No problem there: it’s adequate (length of foot consistently maps on to shoe size); it’s sensitive to the thing it measures (foot grows, shoe size increases accordingly); and it’s homogeneous (one characteristic, length, leads to one indicator, shoe size). However, in research evaluation we struggle on all of these counts, because the thing we really want to measure, this elusive, multi-faceted “research quality”, doesn’t have any adequate, sensitive and homogeneous indicators. We need to measure the immeasurable. So we end up making false assumptions about the meanings of our indicators, and then make bad decisions based on those false assumptions. In all of this, it is not the metric that’s at fault; it’s us.

In my view, the JIF is the biggest scapegoat of the Responsible Metrics agenda. The JIF is just the average number of cites per paper for a journal over two years. That’s it. A simple calculation. And as an indicator of the communication effectiveness of a journal for collection development purposes (the reason it was introduced) it served us well. It has simply been misused as an indicator of the quality of individual academics and individual papers. It wasn’t designed for that. This is misuse of a metric, not a bad metric. (Recent work has suggested that it’s not that bad an indicator for the latter anyway, but that’s not my purpose here.) If the JIF is a bad metric, so is Elsevier’s CiteScore, which is based on EXACTLY the same principle but uses a three-year time window rather than two, a slightly different set of document types and journals, and makes itself freely available.
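The calculation really is that simple. As a minimal sketch of the definition above (the figures are invented, not real journal data):

```python
# Sketch of the Journal Impact Factor calculation described above:
# the average number of citations per paper for a journal over two years.
# The figures used below are illustrative only.

def impact_factor(citations_to_prior_two_years: int,
                  citable_items_prior_two_years: int) -> float:
    """JIF for year Y = citations received in Y to items published in
    Y-1 and Y-2, divided by the number of citable items from Y-1 and Y-2."""
    return citations_to_prior_two_years / citable_items_prior_two_years

# e.g. 460 citations in 2021 to 200 articles published in 2019-2020:
print(round(impact_factor(460, 200), 2))  # 2.3
```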

I understand why DORA trumpets the misuse of JIFs; it is rife and there are less imperfect tools for the job. But there are also other metrics that DORA doesn’t get in a flap about – like the individual h-index – which are subject to the same amount of misuse, but are actually more damaging.  The individual h-index disadvantages certain demographics more than others (women, early-career researchers, anyone with non-standard career lengths); at least the JIF mis-serves everyone equally.  And whilst we’re at it peer review can be an equally inadequate research evaluation tool (which, ironically, metrics have proven). So if we’re to be really fair we should be campaigning for responsible peer review with as much vigour as our calls for responsible metrics.

Bumper stickers by Paul van der Werf (CC-BY)

 

It looks to me like we have moved from a state of ignorance about metrics, to a little knowledge. Which, I hear, is a dangerous thing. A little knowledge can lead to a bumper sticker culture (“I HEART DORA” anyone? “Ban the JIF”?) which could move us away from, rather than towards, the responsible use of metrics. These concepts are easy to grasp hold of, but they mask a far more complex and challenging set of research evaluation problems that lie beneath. The responsible use of metrics is about more than the avoidance of certain indicators, or signing DORA, or even developing your own bespoke Responsible Metrics policy (as I’ve said before, this is certainly easier said than done).

The responsible use of metrics requires responsible scientometricians: people who understand that there is really no such thing as a bad metric, but that it is very possible to misuse them; people with a deeper level of understanding of what we are trying to measure, what the systemic effects of this might be, what indicators are available, what their limitations are, where they are appropriate, and how best to triangulate them with peer review. We have good guidance on this in the form of the Leiden Manifesto, the Metric Tide and DORA. However, these are the starting points of often painful responsible-metrics journeys, not easy-ride bandwagons to be jumped on. If we’re not careful, I fear that in a hugely ironic turn, DORA and the Leiden Manifesto might themselves become bad (misused) metrics: an unreliable indicator of a commitment to the responsible use of metrics that may or may not be there in practice.

Let’s get off the ‘metric-shaming’ bandwagons, deepen our understanding and press on with the hard work of responsible research evaluation.

 


Elizabeth Gadd

Elizabeth Gadd is the Research Policy Manager (Publications) at Loughborough University. She has a background in Libraries and Scholarly Communication research. She is the co-founder of the Lis-Bibliometrics Forum and is the ARMA Metrics Special Interest Group Champion.


Original content posted on The Bibliomagician, reposted here with permission. Content is licensed under a Creative Commons Attribution 4.0 International License.

REF & TEF: the connections – 11th October 2017

The outcomes of this year’s Teaching Excellence Framework (TEF) and the direction for the Research Excellence Framework (REF) as set out in the 2017 consultation response are likely to have significant implications for the higher education sector.  The links between research and teaching are likely to become ever more important, but set against the context of increasing emphasis on student experience, how should the sector respond and where should it focus?

REF & TEF: the connections will be hosted at Bournemouth University and will bring together some of the leading experts in higher education research and teaching policy. During the morning, attendees will have the opportunity to hear from experts from across the higher education sector as they share their insights into the importance of the links between teaching and research. The afternoon will feature a number of case studies, with speakers from universities with a particularly good record of linking research and teaching.

Speakers confirmed to date include Kim Hackett (REF Manager and Head of Research Assessment, HEFCE), John Vinney (Bournemouth University), William Locke (University College London) and Professor Sally Brown (Higher Education Academy).

For more information or to book on visit: https://reftef.eventbrite.co.uk

Writing Days – Open for Booking!

As part of the Writing Academy, a series of writing days has been organised to help BU authors work on their publications by providing dedicated time and space, away from everyday distractions.

The days will have a collaborative focus on productive writing with other BU authors. The RKEO team will also be on hand to provide authors with help and guidance on all areas of the publication process.

Writing Days have been scheduled on the below dates:

  • Friday 15th September 2017
  • Thursday 2nd November 2017
  • Friday 5th January 2018
  • Wednesday 7th March 2018
  • Tuesday 22nd May 2018
  • Monday 23rd July 2018

Spaces are limited so please only book on if you are able to commit to attending for the whole day.

Click here to book on!