
Fake conferences are not fake news: beware predatory conferences

Introduction

Academics have been warned for a decade about predatory Open Access publishers (van Teijlingen 2014). These are commercial organisations that charge academics a publication fee on submission of their manuscripts, with a promise to publish their work quickly online. The problem is twofold: first, these commercial organisations don’t offer proper peer review and editorial quality assurance; and secondly, academics are being tricked into believing the journal is a legitimate scientific publication. The second author receives on average six to eight invitations a week to publish in this kind of predatory journal – see below for examples. The first author, despite not having worked in an academic institution for over three years, still receives such invitations to publish in ‘Journal X’.

Predatory conferences

A similar phenomenon to predatory journals is the predatory conference (Moital 2014; Nobes 2017; Grove 2017). These are pretend academic conferences of questionable value, established first and foremost to make money, not for the greater good of the academic discipline.

Both authors have received bogus and legitimate invitations to attend conferences. A predicament with such an invitation, which 99% of the time arrives by email, is that it is not easy to distinguish between fake and real offers. For example, the first author recently received an offer (at short notice) to attend a conference in Miami in November 2017 (see below). This was on the back of an editorial he had published a couple of months earlier. For a career researcher going from contract to contract, being invited to present a keynote at a conference can be flattering, not to mention an honour and a boost for one’s career. Therefore, while the idea that ‘if it seems too good to be true, it probably is’ is a prudent one to hold, there is also a temptation to follow through.

The first author replied to the request, querying why the invitation had arrived out of the blue. The answer was less than convincing, and a swift email from the author saying “Don’t tell me… You are offering me a keynote with travel and accommodation… Lol!!” called their bluff and ended the correspondence.

But digging a little deeper he found there was a webpage dedicated to taking payments to attend the conference. In the digital world, a fool can be easily and quickly separated from his or her money.

Of course, it may have been a real conference at a real venue, and they really wanted him to speak. But discerning this is not easy at first…

Some of the warning signs/What to look out for

  • The conference email invitation looks very convincing (if not, don’t even read it!).
  • The venue is in a good location; as Nobes (2017) highlighted, “the organizers are more interested in marketing the tourist destination rather than the academic value of the conference”.
  • The conference covers too many different aspects or topics, as if the advert is designed to catch the eye of as many people as possible who are vaguely connected to the discipline.
  • Mentions of associated predatory journals and ‘important’ organisations in the discipline.
  • Email addresses and bank accounts that don’t look professional or official.
  • Little mention of attendance fees up front but, after acceptance, emails demanding a high conference fee and other charges.
  • Conference organisers who are not academics, or are unknown names.
  • The conference does not peer-review submissions or provide proper editorial control over presentations.
  • Signs of copying the names of existing academic conferences or scientific organisations, and even copying their webpages.
  • Even more advertising than normal at a scientific conference.

Furthermore, Andy Nobes (2017) offered some helpful advice on judging the quality of conference websites. Andy is based at AuthorAID, a global network providing support, mentoring, resources and training for researchers in developing countries.

Who is at risk of falling for predatory conferences?

Academics need to be aware of money-making conferences and meetings that lack a true commitment to science. But some academics might be more at risk than others. Young researchers, PhD students and fledgling academics living from contract to contract may feel that any conference attendance is a potential career boost. Thus, such an invitation might seem flattering and an opportunity too good to miss, as well as a way to show that they are capable, independent academics.

Final thoughts

Most academics go to conferences for a combination of reasons: presenting their work to get critical feedback, making new contacts, sharing ideas and being inspired. With such a broad combination of motivating factors, the exact purpose of conferences is difficult to ascertain because there is no a priori agreed role and value of conferences (Nicolson 2017a). However, there is evidence that academic conferences function to facilitate commodity transactions, be that knowledge, tools, skills, reputations or connections, which reflects the neoliberal ethos in the modern academy (Nicolson 2017b). The predatory conference can be viewed in this light, where academia is more and more focused on generating revenue. It is at best scurrilous, and at worst criminal, for organisations to make money using such a confidence trick. Always check which conferences are organised and advertised by recognised scholarly organisations in your own discipline. If uncertain, ask a more experienced academic, a senior colleague or a mentor.

 

 

Donald J. Nicolson

(Health Services Researcher, NHS Fife, and Independent Scholar; twitter @_mopster )

Edwin R. van Teijlingen

(Centre for Midwifery, Maternal & Perinatal Health)

 

References:

Grove, J. (2017) Predatory conferences ‘now outnumber official scholarly events’ (26th Oct.).

Moital, M. (2014) Ten Signs of a Bogus/Fake Conference.

Nicolson, D.J. (2017a) Do conference presentations impact beyond the conference venue? Journal of Research in Nursing. 22(5), pp.422-425.

Nicolson, D.J. (2017b) Academic Conferences as Neoliberal Commodities, Palgrave Macmillan

Nobes, A. (2017) What are ‘predatory’ conferences and how can I avoid them?

van Teijlingen, E. (2014) Beware of rogue journals.

 

REF 2021 workshops – what makes a 2*, 3* or 4* output?

We have a series of externally-facilitated REF outputs workshops scheduled to take place in early 2018 as part of the RKE Development Framework. Each session is led by a REF 2014 sub-panel member, who will explain how the panel interpreted and applied the REF 2014 guidance when assessing the quality of outputs. The workshops are open to all academic staff.

The expected learning outcomes from the workshops are for attendees to:

  • Gain insight into how the REF panels applied the REF criteria when considering the significance, rigour and originality of outputs;
  • Understand the differences between outputs scored 4*, 3*, 2*, 1* and Unclassified;
  • Gain insight into what is meant by ‘world leading’ and ‘internationally excellent’;
  • Understand how scores for borderline cases were agreed and what the tipping points were, either to break the ceiling into the higher star level or to hold an output back a star level;
  • Understand how panels used other information such as metrics, markers of journal quality or prior knowledge in output assessment;
  • Gain insight into how future outputs could be strengthened for REF2021.

 

Workshops scheduled so far are:

  • UOA 2/3 – Prof Dame Jill Macleod Clark – 15 March 2018
  • UOA 4 – Prof Marion Hetherington – 10 January 2018
  • UOA 11 – Prof Iain Stewart – 29 January 2018
  • UOA 12 – Prof Chris Chatwin – 8 January 2018
  • UOA 14 – Prof Jon Sadler – 11 January 2018
  • UOA 15 – Prof Graeme Barker – 7 February 2018
  • UOA 17 – Prof Terry Williams – 17 January 2018
  • UOA 18 – tbc
  • UOA 20/21 – Prof Imogen Taylor – 15 January 2018
  • UOA 23 – Prof Jane Seale – 26 January 2018
  • UOA 24 – tbc
  • UOA 27 – Prof Pat Waugh – 16 January 2018
  • UOA 11/32 (computer animation) – Prof Anthony Steed – 31 January 2018
  • UOA 32/34 (practice-based) – Prof Stephen Partridge – date tbc
  • UOA 36 – Prof Peter Lunt – date tbc

Bookings for these can be made via the Staff Intranet: https://staffintranet.bournemouth.ac.uk/workingatbu/staffdevelopmentandengagement/fusiondevelopment/fusionprogrammesandevents/rkedevelopmentframework/researchexcellenceframework/

REF 2021 – final decisions published

HEFCE kept their word and published the final decisions on REF 2021 in the autumn. Having issued the initial decisions on the Research Excellence Framework 2021 in September, HEFCE released its further decisions on staff and outputs on 21 November 2017. These decisions have been informed by responses to key questions relating to staff and outputs and by a survey of staff in scope for submission. This blog post provides a summary of the key decisions.

 

Submitting staff:

Previous REF/RAE exercises asked institutions to select staff for submission. The Stern Review in 2016 recognised how divisive this practice was and instead recommended that all research-active staff be returned to the next REF. HEFCE are implementing this recommendation by expecting all staff with a ‘significant responsibility for research’ to be submitted, provided they are ‘independent researchers’. What do these terms mean in practice? The HEFCE definition is:

“those for whom explicit time and resources are made available to engage actively in independent research, and that is an expectation of their job role. Research is defined as a process of investigation leading to new insights, effectively shared. Staff engaged exclusively in scholarship would not be considered to have a significant responsibility for research.”

Working with the REF 2021 main panels, HEFCE will provide further guidance on identifying staff with significant responsibility. This will be published in the guidance on submissions and panel criteria. This guidance will not prescribe a fixed set of criteria that all staff would be required to meet, but will set out a ‘menu’ of what HEFCE would consider may be appropriate indicators of significant responsibility.

 

Recognising that there are staff who have more significant responsibility for other activities, HEFCE will implement an approach whereby institutions, working with their staff and with guidelines, identify who is in scope for submission among staff meeting core eligibility criteria. HEFCE has defined the core eligibility criteria as:

‘Category A eligible’

  • academic staff with a contract of employment of ≥0.2 FTE
  • on the payroll of the submitting institution on the census date (31 July 2020)
  • whose primary employment function is to undertake either ‘research only’ (independent researchers only) or ‘teaching and research’
  • have a substantive connection with the submitting institution (i.e. BU)
  • for staff on ‘research only’ contracts, the eligible pool should only include those who are independent researchers, not research assistants

‘Category A submitted’ describes the staff from the ‘Category A eligible’ pool who have been identified as having significant responsibility for research on the census date.

Where the ‘Category A eligible’ staff definition accurately identifies all staff in the submitting unit with significant responsibility for research, the unit should submit 100% of staff. Where it does not accurately identify all staff in the submitting unit who have significant responsibility for research, institutions will need to implement processes to determine this and document this in a code of practice, approved by the relevant funding body with advice from the Equality and Diversity Advisory Panel (EDAP).

 

Submitting outputs:

  • The average number of outputs required per submitted FTE will be 2.5 (up from 2 outputs as previously suggested by HEFCE).
  • A minimum of one output will be required for each staff member employed on the census date (as expected).
  • A maximum of five outputs may be attributed to individual staff members, including those who have left (down from 6 outputs as previously suggested by HEFCE); the sketch after this list illustrates these volume rules.
  • Data on the distribution of outputs across staff in the unit, including staff who have left, will be provided to the sub-panels for consideration in relation to the assessment of the environment.
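To make the arithmetic in the list above concrete, here is a minimal Python sketch. It is an illustration only, not an official calculator: the rounding of the 2.5-per-FTE pool is an assumption, and any reductions for special circumstances are ignored.

```python
def output_pool(total_fte: float) -> int:
    """Total outputs required by the unit: an average of 2.5 per submitted FTE.
    Rounding to the nearest whole number is an assumption, not official guidance."""
    return round(2.5 * total_fte)

def within_per_person_limits(n_outputs: int, current_staff: bool = True) -> bool:
    """Per-person rules: a maximum of five outputs may be attributed to anyone
    (including staff who have left); staff employed on the census date also
    need at least one."""
    minimum = 1 if current_staff else 0
    return minimum <= n_outputs <= 5

print(output_pool(20.0))              # a unit submitting 20 FTE -> 50 outputs in total
print(within_per_person_limits(6))    # False: exceeds the cap of five per person
```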

Output portability: A transitional approach is being adopted whereby outputs may be submitted by both the institution employing the staff member on the census date and the originating institution where the staff member was previously employed when the output was demonstrably generated. ‘Demonstrably generated’ will be determined by the date when the output was first made publicly available. This applies to the whole REF 2021 period.

Open access: The REF Open Access policy will be implemented as previously set out. This requires outputs within the scope of the policy (journal articles and some conference proceedings) to be deposited as soon after the point of acceptance as possible, and no later than three months after this date, from 1 April 2018. Due to concerns around deposit on acceptance, a deposit exemption will be introduced from 1 April 2018 and remain in place for the rest of the REF 2021 publication period. This will allow outputs unable to meet the deposit-on-acceptance timescale to remain compliant if they are deposited up to three months after the date of publication.

 

Number of impact case studies required

Submissions will include a total of one case study, plus one further case study per up to 15 FTE staff submitted, for the first 105 FTE staff returned (with a reduced requirement above this of one additional case study per 50 FTE staff). Submissions will therefore include a minimum of two case studies.
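As a rough illustration of that arithmetic, the sketch below computes the number of case studies implied by a given FTE figure. The handling of fractional FTE remainders here is an assumption; the official REF guidance should be consulted for the definitive banding.

```python
import math

def case_studies_required(fte: float) -> int:
    """Rough illustration of the formula above: one case study, plus one per
    (up to) 15 FTE for the first 105 FTE, plus one per (up to) 50 FTE beyond
    that. Treatment of fractional remainders is an assumption, not official
    guidance."""
    if fte <= 105:
        return 1 + max(1, math.ceil(fte / 15))
    return 1 + 7 + math.ceil((fte - 105) / 50)

print(case_studies_required(12))   # -> 2, the stated minimum
print(case_studies_required(40))   # -> 4 (1 + ceil(40/15))
```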

 

For the latest information, see our REF 2021 webpage.

Have you been involved with an event designed for the external community?

Then we want to hear from you!

The University is currently compiling the data for the annual Higher Education – Business & Community Interaction (HE-BCI) survey, due to be submitted to HESA shortly. Data returned is used to calculate our HEIF grant.

We are asked to submit details of social, cultural and community events designed for the external community (including both free and chargeable events) which took place between 1 August 2016 and 31 July 2017.


Event types that should be returned include, but are not limited to:

  • public lectures
  • performance arts (dance, drama, music, etc)
  • exhibitions
  • museum education
  • events for schools and community groups
  • business breakfasts

We cannot return events such as open days, Student Union activity, commercial conferences, etc.

All events that we ran as part of the Festival of Learning, ESRC Festival of Social Science and Cafe Scientifique series are likely to be eligible for inclusion and we will collate this information on your behalf centrally.

If you have been involved with any other event which could be returned, please could you let your contact (see below) know the event name and date, whether it was free or chargeable, the estimated number of attendees, and an estimate of how much academic time was spent preparing for (but not delivering) the event:

  • SciTech – Norman Stock
  • FoM – Rob Hydon
  • HSS – Tanya Richardson
  • FMC – Mark Brocklehurst
  • Professional Service – Rebecca Edwards (RKEO)

The data returned is used by HEFCE to allocate the HEIF funding so it is important that we return as accurate a picture as possible.


REF2021 – initial decisions finally published

On Friday there was an exciting update from the REF Team based at HEFCE – they published the initial decisions on REF 2021. Whilst this does not include decisions regarding submitting staff, output portability or the eligibility of institutions to participate in the REF, it does include key decisions regarding the UOA structure, institution-level assessment, and the assessment weightings.

The decisions published on Friday are summarised below:

 

OVERALL:

Assessment weightings:

  • Outputs 60% (down from 65%)
  • Impact 25% (up from 20%)
  • Environment 15% (same but now includes impact strategy)

The move of the impact template from the impact assessment to the environment assessment means impact will actually contribute to more than 25% of the weighting (see impact section).

Assessment will continue to use the five-point REF 2014 scale (1*-4* and Unclassified).

UOA structure:

  • Total UOAs reduced from 36 to 34
  • Engineering will be a single UOA – UOA 12
  • REF 2014 UOA 17 will be restructured to form UOA 14: Geography and Environmental Studies and UOA 15: Archaeology
  • ‘Film and Screen Studies’ will be located in, and included in the name of, UOA 33: Music, Drama, Dance, Performing Arts, Film and Screen Studies
  • HEFCE will continue consulting with the subject communities for forensic science and criminology to consider concerns raised about visibility. A decision is expected this autumn.

HESA cost centres will not be used to allocate staff to UOAs. Responsibility for mapping staff into UOAs will therefore remain with institutions.

 

TIMETABLE:

Impact:

  • Underpinning research must have been produced between 1 Jan 2000 – 31 Dec 2020.
  • Impacts must have occurred between 1 Aug 2013 – 31 Jul 2020.

Environment:

  • Environment data (such as income and doctoral completions) will be considered for the period 1 Aug 2013 – 31 Jul 2020.

Outputs:

  • The assessment period for the publication of outputs will be 1 Jan 2014 – 31 Dec 2020.

The draft REF 2021 guidance will be published in summer/autumn 2018 and the final guidance will be published in winter 2018-19. The submission will be in autumn 2020.

 

OUTPUTS:

Interdisciplinary research:

  • Each sub-panel will have at least one appointed member to oversee and participate in the assessment of interdisciplinary research submitted in that UOA.
  • There will be an interdisciplinary research identifier for outputs in the REF submission system (not mandatory).
  • There will be a discrete section in the environment template for the unit’s structures in support of interdisciplinary research.

Outputs due for publication after the submission date:

A reserve output may be submitted in these cases.

Assessment metrics:

Quantitative metrics may be used to inform output assessment. This will be determined by the sub-panels. Data will be provided by HEFCE.

 

IMPACT:

  • Impact will have a greater weighting in REF 2021 (25% overall plus impact included in the environment template and therefore weighting).
  • Harmonised definitions of academic and wider impact will be developed between HEFCE and the UK Research Councils.
  • Academic impacts will be assessed as part of the ‘significance’ assessment of the outputs and therefore not in the impact assessment.
  • Further guidance will be provided on the criteria for reach and significance and impacts arising from public engagement.
  • The guidance on submitting impacts on teaching will be widened to include impacts within, and beyond, the submitting institution.
  • Impacts remain eligible for submission by the institution in which the associated research was conducted. They must be underpinned by excellent research (at least REF 2*).
  • Impact case study template will have mandatory fields for recording standardised information, such as research funder, etc.
  • The number of case studies required – still not confirmed – HEFCE are exploring this in relation to the rules on staff submission and the number of outputs.
  • Case studies submitted to REF 2014 can be resubmitted to REF 2021, providing they meet the REF 2021 eligibility requirements.
  • The relationship between the underpinning research and impact will be broadened from individual outputs to include a wider body of work or research activity.

 Institutional-level assessment (impact case studies):

  • HEFCE will pilot this assessment in 2018 but it will not be included in REF 2021.

 

ENVIRONMENT:

The UOA-level environment template will be more structured, including the use of more quantitative data to evidence narrative content:

  • It will include explicit sections on the unit’s approach to:
    • supporting collaboration with organisations beyond HE
    • enabling impact – akin to the impact template in REF 2014
    • supporting equality and diversity
    • structures to support interdisciplinary research
    • open research, including the unit’s open access strategy and where this goes beyond the REF open access policy requirements

Institutional-level assessment (environment):

  • Institution-level information will be included in the UOA-level environment template, assessed by the relevant sub-panel.
  • HEFCE will pilot the standalone assessment of institution-level environment information as part of REF 2021, but this will not form part of the REF 2021 assessment. The outcomes will inform post-REF 2021 assessment exercises.

 

PANEL RECRUITMENT:

  • The sub-panel chair application process is now open (details available via the link).
  • The document sets out the plan for the recruitment of panel members (a multi-stage approach).

 

OUTSTANDING DECISIONS:

The announcement does not include decisions regarding submitting staff, output portability or the eligibility of institutions to participate in the REF. There is ongoing dialogue between HEFCE (on behalf of the funding councils) and the sector regarding this. The letter (accessed via the link above) sets out HEFCE’s current thoughts on these points and invites the sector to provide feedback by 29 September 2017. BU will be providing feedback, so if you have a view on this then please email me (jnortham@bournemouth.ac.uk).

 

SUMMARIES AVAILABLE:

REF & TEF: the connections – 11th October 2017

The outcomes of this year’s Teaching Excellence Framework (TEF) and the direction for the Research Excellence Framework (REF) as set out in the 2017 consultation response are likely to have significant implications for the higher education sector.  The links between research and teaching are likely to become ever more important, but set against the context of increasing emphasis on student experience, how should the sector respond and where should it focus?

REF & TEF: the connections will be hosted at Bournemouth University and will bring together some of the leading experts in higher education in both research and teaching policy. During the morning, attendees will have the opportunity to hear from experts from across the higher education sector as they share their insights into the importance of the links between teaching and research. The afternoon will feature a number of case studies, with speakers from universities with a particularly good record of linking research and teaching.

Speakers confirmed to date include Kim Hackett (REF Manager and Head of Research Assessment, HEFCE), John Vinney (Bournemouth University), William Locke (University College London) and Professor Sally Brown (Higher Education Academy).

For more information or to book on visit: https://reftef.eventbrite.co.uk

I’m an academic at BU. Will I be submitted to REF 2021?

Good question and, although no firm decisions have yet been announced by HEFCE, it is looking increasingly likely that all academic staff at BU will be included in the REF 2021 submission, each with at least one output published between 2014 and 2020.

In the midst of the sector waiting with bated breath for the initial decisions from the UK funding bodies on this and other REF questions, HEFCE held a webinar in July. During this webinar they shared some possible decisions with the sector (the webinar and the slides are available here on the HEFCE website). The key suggestions were:

  • 100% of academics with a “significant responsibility” to undertake research are likely to be included. It is unclear at this stage what “significant responsibility” means in practice, although it is anticipated this will be based on there being an expectation for an academic member of staff to undertake research.
  • Staff without a significant responsibility for research may be exempt from inclusion but auditable documentation would be required. This would need to explicitly evidence there is not an expectation of the individual to undertake research (examples given were workload models or career frameworks linked to the individual).
  • Everyone submitted is likely to need a minimum of 1 output. The average and maximum outputs per FTE are to be determined – in the consultation it was proposed these were an average of 2 outputs per submitted FTE and a maximum of six outputs per person.
  • There is likely to be a hybrid model for output portability (i.e. which HEI can submit the outputs authored by a member of academic staff who moves from one institution to another during the REF period) – HEFCE proposed two options:
    • Simple model whereby both old and new institutions can submit the outputs produced by the academic member of staff when he/she was employed at the old institution (this would, some might say unfortunately, result in double counting of outputs but this can probably be tolerated as it happens already in some cases, for example, where co-authors at different HEIs submit the same output).
    • Complex model whereby a census date and employment range date are used to determine which outputs can be submitted by which institution.

Whilst these are not yet firm decisions (these are expected in two communications – one on staff and outputs in the autumn and one on everything else later this month), these are the clearest indications yet that all academic staff at BU will be included in REF 2021, each with at least one output.

For further information on REF 2021, see BU’s REF 2021 webpage. If you have any queries, please contact Julie Northam or Shelly Anne Stringer.

REF Main Panel Chairs announced

The main panels will provide leadership and guidance to the sub-panels that undertake the REF assessment. As chairs designate, the appointees will at first advise the funding bodies on the initial decisions and on the further development of the framework. They will take up their roles as chairs later in the year*, once the outcomes of the ‘consultation on the second REF’ are announced and further appointments to the REF panels have been made.

The Main Panel Chairs (designate) for each of the four main panel areas are:

Biographies for the Main Panel Chairs are available here: Biographies

*Interesting to note that HEFCE have reaffirmed their previous commitment to announce the outcomes of the consultation later this year, despite rumours this would either be delayed or result in a second technical consultation.

REF 2021 – stocktake exercises

With the publication of the Stern Review last summer and the funding bodies’ Second Consultation on the REF earlier this year, there has been a lot of discussion at BU and across the sector around REF 2021 lately. Despite this, and indeed because of this, we’re still none the wiser as to what the next REF will look like. Like many other universities, we are progressing with our internal preparations whilst we await the publication of the initial decisions from the funding bodies in response to the feedback to their consultation (predicted to be later this year).

One of the ways BU is preparing is by running a stocktake exercise to see what outputs academic staff have published since 1 January 2014 and what potential impact BU research is having. Not only will this provide a summary of progress roughly halfway through the REF assessment period, it will also enable resources to be allocated to support further high-quality outputs and to accelerate research impact.

The stocktake exercise is being run in two cohorts:

  • Cohort 1 takes place this summer and involves UOAs – 2, 3, 4, 12, 22/23, 25, 34 and 36.
  • Cohort 2 takes place this autumn and involves UOAs – 11, 17 (archaeology), 17 (geography and environmental studies), 19, 20, 26 and 29.

The process will be the same for each cohort. On the outputs side, we are changing from individuals self-nominating for their inclusion in the exercise to a model where all academic staff (with a research-only or a teaching and research contract) are automatically included. This ensures the exercise is fully inclusive whilst reducing the burden on individual academic staff. In terms of impact, we are changing from colleagues writing impact case studies to inviting them to attend a meeting and deliver a short informal presentation of their research, its impact and their plans for generating further impact, followed by a discussion with the panel. This is linked to the launch of the new impact tracker in BRIAN.

The stocktake exercises are designed to be fully inclusive, positive and developmental. Further information about the REF is available on the Research Blog’s REF webpage.

SciVal’s Field-Weighted Citation Impact: sample size matters!

There’s been a buzz on social media recently about Field-Weighted Citation Impact (FWCI), particularly around the recent leak from the University of Manchester suggesting that FWCI is one of the measures proposed for assessing which academics are most at risk of redundancy.

In his recent post on The Bibliomagician blog (reposted here with permission), Ian Rowlands, a Research Information & Intelligence Specialist at King’s College London and a member of the LIS-Bibliometrics committee, questions the stability of the FWCI indicator for sets of fewer than 10,000 documents. Ian invites others to use his methodology to test his theory further…

SciVal’s field-weighted citation impact (FWCI) is an article-level metric that takes the form of a simple ratio: actual citations to a given output divided by the expected rate for outputs of similar age, subject and publication type.  FWCI has the dual merits of simplicity and ease of interpretation: a value of 2 indicates that an output has achieved twice the expected impact relative to the world literature.  It is a really useful addition to the benchmarking toolkit.
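As a minimal illustration of that ratio (a sketch only; the expected rate is taken as given here, whereas SciVal derives it from the world literature):

```python
def fwci(actual_citations: float, expected_citations: float) -> float:
    """Field-weighted citation impact as a simple ratio: citations actually
    received divided by the citations expected for outputs of similar age,
    subject and publication type (the expected figure is assumed to be supplied)."""
    return actual_citations / expected_citations

print(fwci(12, 6))   # -> 2.0, i.e. twice the expected impact
```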

The trouble is that, typically, the distribution of citations to outputs is highly skewed, with most outputs achieving minimal impact at one end and a small number of extreme statistical outliers at the other.  Applying the arithmetic mean to data distributed like this, as does FWCI, is not ideal because the outliers can exert a strong leveraging effect, “inflating” the average for the whole set.  This effect is likely to be more marked the smaller the sample size.

I explored this effect in a simple experiment.  I downloaded SciVal FWCI values for 52,118 King’s College London papers published up until 2014.  I then calculated mean FWCI and 95% confidence (or stability) intervals for the whole sample using the bootstrapping[1] feature in SPSS.  Then I took progressively smaller random samples (99%, 98%, and so on to 1%, then 0.1%), recalculating mean FWCI and stability intervals each time.
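Ian’s SPSS syntax isn’t reproduced in this post, but a rough Python equivalent of the experiment might look like the sketch below. The lognormal stand-in for the citation distribution, the sample fractions and the number of bootstrap resamples are all illustrative assumptions, not the King’s College London data.

```python
import numpy as np

rng = np.random.default_rng(42)

# Illustrative stand-in for a set of FWCI values: heavily right-skewed, with
# most values small and a few extreme outliers (not the actual KCL data).
population = rng.lognormal(mean=-0.3, sigma=1.0, size=52_118)

def bootstrap_mean_ci(values, n_boot=1_000, alpha=0.05):
    """Percentile-bootstrap 95% confidence (stability) interval for the mean."""
    means = [rng.choice(values, size=len(values), replace=True).mean()
             for _ in range(n_boot)]
    lo, hi = np.quantile(means, [alpha / 2, 1 - alpha / 2])
    return values.mean(), lo, hi

# Recompute the mean FWCI and its stability interval for progressively smaller
# random samples, mirroring the 99%, 98%, ..., 1%, 0.1% design described above.
for fraction in (1.0, 0.5, 0.1, 0.01, 0.001):
    sample = rng.choice(population, size=int(fraction * len(population)),
                        replace=False)
    mean, lo, hi = bootstrap_mean_ci(sample)
    print(f"{fraction:6.1%}: mean={mean:.2f}  95% interval=({lo:.2f}, {hi:.2f})")
```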

The findings show how the mean FWCI becomes less stable as sample size decreases. Highly cited outliers are relatively uncommon, but their chance inclusion or exclusion makes a big difference, especially as the number of outputs decreases. In this experiment, FWCI values ranged across four orders of magnitude, from 0.03 to 398.28.

[Chart: mean FWCI and 95% stability intervals plotted against decreasing sample size]

What does this mean for interpreting FWCI, especially when benchmarking? The table below offers some guidance.  It shows typical stability intervals around FWCI at different scales.  The final column assumes that SciVal spits out a value of 2.20 and shows how that figure should be interpreted in terms of its stability.

[Table: typical stability intervals around FWCI at different sample sizes]

It’s pretty clear from this analysis that you need to know when it’s time to stop when you are drilling down in SciVal!  Another implication is that there is no sensible justification for quoting FWCI to two let alone three decimal places of precision.  I’ve kept the second decimal place above simply for purposes of demonstration.

I am well aware that the guidance above is based on data from just one institution, and may not travel well. If you would like to replicate this experiment using your own data, I’m happy to share my SPSS Syntax file.  It automates the whole thing, so you just have to load and go off on a short holiday! Just drop me an email.

Ian Rowlands is a Research Information & Intelligence Specialist at King’s College London and a member of the LIS-Bibliometrics committee.

ian.rowlands@kcl.ac.uk