Tagged / bibliometrics

Books to understand academic publishing and research metrics

The library has just purchased two new titles about academic publishing and research metrics.

They give an overview of the main tools for measuring impact and a summary of main issues and terminology in academic publishing. These titles were recommended in the London School of Economics and Political Sciences’ Impact Blog.

Both titles are ebooks, so they can be accessed from anywhere:

Measuring research: what everyone needs to know – https://capitadiscovery.co.uk/bournemouth-ac/items/1056481

Scholarly communication: what everyone needs to know – https://capitadiscovery.co.uk/bournemouth-ac/items/1056480

Happy reading!

José López Blanco

HSS Faculty Librarian

Predatory journals and conferences – how your library team can help

Predatory journals are those that charge publication fees without providing proper editorial and publishing services. To help you, Library and Learning Support offers guidance on spotting predatory journals and conferences.

Familiarising oneself with journal rankings and bibliometrics is also a good way of recognising good-quality journals.

Remember that the BU Library subscribes to Web of Science and Scopus, two of the most important citation databases. These can be accessed through our alphabetical list of databases. Web of Science and Scopus index many of the highest-quality journals.

Scimago is another good source of information for confirming the quality of a journal; it provides additional journal rankings and indicators.

In case of doubt regarding a journal, please contact your faculty library team.

Web of Science: how journals are selected for inclusion

Web of Science is one of the main metrics tools that will be used to inform REF2021; however, not all journals are indexed within it.

This useful link explains how the journal selection process works.

Here is a summary of the key points:

The Web of Science Core Collection now contains four main citation indexes: the established Science Citation Index Expanded (SCIE), Social Sciences Citation Index (SSCI) and Arts & Humanities Citation Index (AHCI), plus the newer Emerging Sources Citation Index (ESCI). The ESCI mainly covers more recently established journal titles that are still being evaluated for their quality and influence within academic publishing; titles in this index do not have impact factors and do not appear in the Journal Citation Reports.

The following factors are taken into consideration when including journals within the indexes:

  • Publishing standards, such as the peer review process, format, timeliness and bibliographic information in English
  • Editorial content
  • International focus
  • Citation analysis

These stringent rules ensure that only the most cited journals are included in the three established indexes. However, some good-quality journals are excluded, especially those outside the sciences. This is why being familiar with other sources such as Scopus and rankings such as Scimago is important for getting a clearer picture of citations.

Remember to visit the Library & Learning Support Guides for additional information.

Jose Lopez Blanco,  jlopezblanco@bournemouth.ac.uk
HSS Faculty Librarian

Places available at BU researcher development sessions – Book Now!

There are spaces available at the following sessions for BU staff. To find out more and to book, simply follow the link to the BU intranet and log in:

This Wednesday – 4/7/18: 

Forthcoming…

11/7/18:

12/7/18:

24/7/18:

 

Bibliometrics workshops – 6th of March

Understanding bibliometrics and the impact of your publications is fundamental for the next REF.

The library academic liaison team is delivering two workshops on the 6th of March at Talbot Campus.

The Introduction to Bibliometrics session explains how to find journal- and article-level metrics, introduces Altmetrics, and covers using BRIAN for metrics.

The Advanced Bibliometrics session goes into more detail, covering researcher IDs and calculating your citations using the h-index and Google Scholar.

We look forward to seeing you at these workshops.

Jose

Faculty Librarian (HSS)

 

Bibliometrics: an introduction to research impact metrics

New training opportunity from the library’s academic liaison team

RKE Development Framework Workshop – “Bibliometrics: an introduction to research impact metrics”

Wednesday, 31st of May,  10am – 12pm

Understanding and demonstrating impact is becoming an essential part of any research activity.

Have you ever wondered how other people are citing your work? Do you know how to calculate your “h-index”? Have you heard of Altmetrics? Come along to this session to find out more.

Topics covered will include:

  • Journal quality (SCOPUS, Web of Science, Scimago)
  • Article quality
  • Researcher quality
  • Easy metrics via BRIAN
  • Your external research profile
  • Differences between disciplines
  • Other measures to show impact (Altmetrics)
  • Using impact data.

To book a place, follow this link:  https://staffintranet.bournemouth.ac.uk/workingatbu/staffdevelopmentandengagement/fusiondevelopment/fusionprogrammesandevents/rkedevelopmentframework/skillsdevelopment/bibliometrics/

Should metrics be used more widely in the next REF?

Back in 2008, as the dust settled from the RAE 2008 submission, HEFCE initiated a series of exercises to investigate whether bibliometric indicators of research quality (such as citation counts) could be used as part of the assessment for REF 2014. BU was one of 22 institutions that took part in the bibliometrics pilot, from which HEFCE concluded that citation information was not sufficiently robust to be used formulaically or as a primary indicator of quality, but that there might be scope for it to inform and enhance processes of expert review in some disciplines. The REF 2014 guidelines stated that citation data would be provided for outputs submitted to all sub-panels in Main Panel A and some sub-panels in Main Panel B.

In April 2014, the Minister for Universities and Science asked HEFCE to undertake a fresh review of the role of metrics in determining quality, impact and other key characteristics of research undertaken in the HE sector. The review is being chaired by Professor James Wilsdon, Professor of Science and Democracy at the Science Policy Research Unit (SPRU), University of Sussex.

HEFCE have launched a sector-wide call for evidence about research metrics and BU will be making an institutional response. BU colleagues are therefore invited to send feedback to me so that it can be considered as part of BU’s response. Colleagues are also invited to send individual responses to HEFCE.

Thinking back to 2008-09, I remember research metrics being an emotive subject; many researchers, both at BU and across the sector, were extremely sceptical of their use in research assessment. Although bibliometrics have moved on a long way since then, I think there will still be concern as to whether metrics are robust enough to be used formulaically, particularly in the arts, humanities and social sciences.

HEFCE have asked that responses focus on the following issues:

1. Identifying useful metrics for research assessment.

2. How should metrics be used in research assessment?

3. ‘Gaming’ and strategic use of metrics.

4. International perspective.

Further information about the call for evidence is available here: http://www.hefce.ac.uk/whatwedo/rsrch/howfundr/metrics/

It is anticipated that the outcome of the review will inform the framework for the next REF assessment so it is vitally important that HEFCE receive a high quality and quantity of feedback from all disciplines.

If you would like to contribute to the BU institutional response, please add your comments to this response form and email it to me (jnortham@bournemouth.ac.uk) by Friday 30th May.

Journal Citation Reports® (JCR) 2013 now available

The 2013 Edition of Journal Citation Reports® (JCR) provides a combination of impact and influence metrics, and millions of cited and citing journal data points that comprise the complete journal citation network of the Web of Science℠.

The 2013 Edition of JCR includes:

  • More than 10,800 of the world’s most highly cited, peer reviewed journals in 232 disciplines
  • Nearly 2,500 publishers and 83 countries represented
  • 379 journals receiving their first Journal Impact Factor

Data from the JCR can be used to provide a quantitative, systematic review of the world’s leading journals.

You can access the JCR and Scopus’s corresponding Journal Analyzer tool via the Library A-Z List of Databases.

If you need any help researching and finding information, using library researcher tools, navigating reference management software, or depositing your open access materials in BURO via BRIAN, please get in touch with your School Library Team.

Latest journal impact factors

Following the release of the latest Journal Citation Reports® on the Thomson Reuters ISI Web of Knowledge database, we have compiled a list of the top ranking journals in various fields related to BU research. BU staff can access these lists by going to the designated folder on the I-drive (copy and paste the following path into Windows Explorer and press return): I:\R&KEO\Public\RDU\Journal Impact Factors 2012.

If there are any additional subject areas that you would like to see included, please leave a comment to this post, below.

Related blog posts that may be of interest:

ResearcherID – an online tool for building a publications profile

What is ResearcherID?

ResearcherID is a facility provided via the Thomson Reuters ISI Web of Knowledge database. You can register for a unique researcher identification number which enables you to link directly to your published outputs and avoid the common problem of author misidentification. Users can update their profile information, build their publication list by using the Thomson Reuters Web of Science search facilities or by uploading files, and opt to make their profile public or private. Registered as well as non-registered users can search the ResearcherID registry to view profiles and find potential collaborators. You can also now search ResearcherID from within the Web of Science.

A ResearcherID number is a unique identifier that consists of alphanumeric characters. Each number contains the year you registered.

The benefits of registering for ResearcherID

The main benefit is the ability to build your own custom profile and correctly identify you as the author of your publications. Once registered, you will be provided with the tools to build your publication list by searching ISI Web of Knowledge and Web of Science, or by uploading your own list. You can choose whether or not to make your profile public.

ResearcherID is also integrated with EndNote Web, so you can build your online publication list either by searching Web of Science or uploading RIS files, or by using the EndNote Web online search. You can also manage your publication list via EndNote Web. For a tutorial on using EndNote Web, click here.

Once you’ve put together your publications list, you can then generate the following citation metrics for items available in the Web of Science:

  • H-index
  • Citation distribution per year
  • Total Times Cited count
  • Average Times Cited count

These metrics will be automatically updated in ResearcherID from the Web of Science as new data is added to the Web of Science.
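As a rough illustration of how such summary figures are derived (a simple Python sketch with invented citation counts, not the ResearcherID calculation itself), the totals and a per-year distribution can be computed from a publication list like this:

    from collections import Counter

    # Hypothetical publication list: (year of publication, times cited) -- invented numbers
    publications = [(2008, 12), (2009, 3), (2009, 7), (2010, 0), (2011, 5)]

    # Total and Average Times Cited
    total_times_cited = sum(cites for _, cites in publications)
    average_times_cited = total_times_cited / len(publications)

    # A simple per-year distribution, grouped here by year of publication
    # (the live tool tracks citations received in each calendar year)
    per_year = Counter()
    for year, cites in publications:
        per_year[year] += cites

    print(total_times_cited)              # 27
    print(round(average_times_cited, 1))  # 5.4
    print(dict(per_year))                 # {2008: 12, 2009: 10, 2010: 0, 2011: 5}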

ResearcherID can also help you find potential research collaborators by enabling you to search the ResearcherID registry. You can also explore an interactive map that can help you locate researchers by country and topic, or you can use the new country ‘tag cloud’.

Registering for ResearcherID

Go to the ResearcherID homepage then click on ‘Register’ and complete the short online form. You will be sent an email with a link back to ResearcherID, where a more detailed form appears for you to complete. Once you have completed this, and accepted the terms and conditions, you will receive your unique ResearcherID. Note that you will be sent these details in an email.

When you log in using your ResearcherID and password, you will be taken to your own publications web page with a unique URL. You can include this link on your email signatures so that others can easily access your publications.

Managing your publications

Under ‘MyPublications’ you will have an option to ‘Add’. Alternatively, there is an ‘Add Publications’ button on the right-hand side of the screen that takes you to the same place. This gives you three options for searching for publications or adding a file:

  • ISI Web of Science
  • EndNote Web
  • RIS text file

Clicking on the publications database will bring up a search screen. Ensure that your surname and initials are correctly entered and click search. Any publications that are already included on your ResearcherID web page will appear in the list and will already be ticked. If you wish to add others, tick the relevant box and click ‘Add selections to MyPublications’. The process works the same way for all three options. Please note that you need to have an EndNote Web account to use the EndNote Web option.
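For anyone choosing the RIS text file route, the short sketch below shows roughly what a minimal RIS record looks like (all field values are invented for illustration); a plain-text file of such records, saved with a .ris extension, can then be uploaded:

    # Write one minimal, invented RIS record to a file for upload (illustrative only)
    record = "\n".join([
        "TY  - JOUR",                      # reference type: journal article
        "AU  - Smith, J.",                 # author
        "TI  - An example article title",  # title
        "JO  - Journal of Examples",       # journal name
        "PY  - 2011",                      # publication year
        "VL  - 12",                        # volume
        "SP  - 34",                        # start page
        "EP  - 56",                        # end page
        "ER  - ",                          # end of record
    ])

    with open("my_publications.ris", "w", encoding="utf-8") as ris_file:
        ris_file.write(record + "\n")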

To delete publications, simply click on the ‘Manage’ button under MyPublications.

For a ResearcherID factsheet, click here.

REF week on the Blog! What were the HEFCE REF pilots?

This week is REF Week on the Blog! Each day we will be explaining a different element of the Research Excellence Framework (REF) as a quick reference guide to help you prepare for the forthcoming REF exercise – REF2014.

What were the HEFCE REF pilots? – HEFCE ran two pilot exercises with HEIs in the sector during the development of the REF. The first exercise was a bibliometrics pilot, and the second was an impact pilot.

Bibliometrics pilot – HEFCE ran a pilot exercise in the construction of bibliometric indicators of research quality in 2008-09, using Scopus and Web of Science as the test databases. BU was chosen as one of 22 institutions to be part of phase one of the pilot exercise. This involved providing publication details to HEFCE and cross-checking BU information on Web of Science and Scopus; where possible this was completed using BU’s institutional repository, BURO. The outcome of the bibliometrics pilot was that bibliometric indicators are not yet sufficiently robust in all disciplines to be used formulaically or as a primary indicator of research quality. However, HEFCE agreed that there was scope for bibliometrics to inform the process of expert review in some units of assessment. These findings resulted in the decision that some UOA sub-panels will receive citation data (the number of times an output has been cited, calculated via Scopus) as additional information about the academic significance of the outputs.

Impact pilot – During 2009-10, HEFCE ran a second pilot exercise, this time with the aim of developing proposals for how to assess research impact in the REF. The impact pilot involved 29 HEIs submitting evidence of impact (case studies and statements) which were assessed by pilot expert panels in five units of assessment:

  • Clinical Medicine
  • Physics
  • Earth Systems and Environmental Sciences
  • Social Work and Social Policy & Administration
  • English Language and Literature

The impact pilot completed in autumn 2010 and the final report (including recommendations and findings) was published on 11 November 2010. The full report can be accessed on the HEFCE website. For a brief summary of the report, please download the Impact Pilot Summary. You can also read our REF Impact FAQs.

You can access the latest presentation about the REF, written by the REF team, here: REF slide pack Sep 2011

Check out the posts appearing on the Blog every day this week as part of REF Week!

Bibliometrics need not be baffling!

What are bibliometrics?

Bibliometrics are a set of methods used to study or measure text and information. Citation analysis and content analysis are the most commonly used bibliometric methods. Bibliometric methods can help you explore the academic impact of a field, a set of researchers or a particular journal paper.

What is citation analysis?

Citation analysis looks at where a document has been referenced by others since it was originally published – this information can be used when searching for materials and in analysing their merit. Undertaking citation analysis on yourself is useful for assessing your own research performance. Specialist databases such as Web of Science and Scopus provide various tools for doing this analysis.

Searching for citation information on the Web of Science℠

Web of Science℠ is hosted by Thomson Reuters and consists of various databases containing information gathered from thousands of scholarly journals, books, book series, reports, conference proceedings, and more:

  • Science Citation Index Expanded (SCI-Expanded)
  • Social Sciences Citation Index (SSCI)
  • Arts & Humanities Citation Index (A&HCI)
  • Index Chemicus (IC)
  • Current Chemical Reactions (CCR-Expanded)
  • Book Citation Index – coming soon!

These databases enable you to perform a variety of tasks, such as search published literature, undertake citation analysis, track new research within a particular field, and identify chemical compounds and reactions. Data is available from around 1990, and even earlier in some cases.

By producing a Web of Science℠ Citation Report for yourself (or for others), you can find out who is citing your work and how it is being used in other people’s publications so that you can get a feel for the overall citation activity around your outputs. Search for an author’s name and then click on ‘Create Citation Report’ from the results page.

Producing this report will give you information such as the number of items published in each year, the number of citations to those items for each year, the average number of citations per item, and your h-index based on this information. Click here for web tutorials on how to use the Web of Science℠.

Searching for citation information on Scopus

Scopus, part of Elsevier’s SciVerse facility, was launched in November 2004 and is an abstract and citation database containing around 19,500 titles from more than 5,000 publishers. Scopus enables researchers to track, analyse and visualise research, and has broad coverage of the scientific, technical, medical and social sciences fields and, more recently, the arts and humanities. Data is currently largely available from 1996 but it does go back further than this in some cases. For more information about Scopus, click here.

By searching for yourself (or others) on the Scopus database using the author search facility, you can use the ‘View Citation Overview’ function to get a feel for the citations activity around your outputs. The information is presented and can be analysed in a number of ways, including pie charts, graphs and tables, and shows the breakdown of citation activity over a number of years and your h-index based on this data. Various tutorials on using Scopus can be accessed here.

Scopus and the Research Excellence Framework (REF): HEFCE has announced that Elsevier have been chosen as the provider of citation data services to the REF sub-panels that have chosen to make use of citation information as part of the assessment process. Using the Scopus database, HEFCE will provide the relevant sub-panels with raw citation data (i.e. not normalised) accompanied by contextual information, which will assist those panel members in making decisions about the outputs part of the REF submissions.

What is the h-index?

The h-index was conceived by Professor Jorge Hirsch in 2005 within the field of physics and is fast becoming one of the most widely used metrics for research evaluation. It is also becoming increasingly used as a measure of research activity and academic prominence across various subject areas.

The benefit of the h-index over other citation measures is that it is not influenced by a few highly cited papers and it ignores any papers that remain uncited. It is calculated based on the number of papers by a particular author that receive h or more citations. Therefore, an h-index of 15 means that a person has at least 15 papers that have been cited 15 times or more. Fortunately, the Web of Science and Scopus both automatically calculate the h-index as part of their citation analysis functions so there is no need to work it out manually.
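As a minimal illustration of that definition (a Python sketch with invented citation counts, not something you need to do in practice), the h-index can be computed from a list of per-paper citation totals like this:

    def h_index(citation_counts):
        # Largest h such that at least h papers have h or more citations
        counts = sorted(citation_counts, reverse=True)
        h = 0
        for rank, cites in enumerate(counts, start=1):
            if cites >= rank:
                h = rank
            else:
                break
        return h

    # Five papers cited 25, 18, 15, 15 and 2 times respectively
    print(h_index([25, 18, 15, 15, 2]))  # 4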

If you’d like to know more about the h-index, the original research document can be accessed from the Cornell University Library webpage.

What are journal impact factors?

Journal Impact Factors are published annually on the Web of Knowledge and provide a way of ranking journals based on citations in a given year to the articles those journals published in the previous two years. For more information about how impact factors are calculated and how they can be used, see my previous blog post.
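By way of a worked example (with invented numbers, not data for any real journal), the standard two-year impact factor is the number of citations received in a given year to items a journal published in the previous two years, divided by the number of citable items it published in those two years:

    # Hypothetical 2013 impact factor for an invented journal
    citations_2013_to_2011_items = 150
    citations_2013_to_2012_items = 210
    citable_items_2011 = 120
    citable_items_2012 = 130

    impact_factor_2013 = (
        (citations_2013_to_2011_items + citations_2013_to_2012_items)
        / (citable_items_2011 + citable_items_2012)
    )
    print(round(impact_factor_2013, 2))  # 1.44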

Other methods of ranking journals also exist, such as the ABS Academic Journal Quality Guide and the ERA journal ranking list. Journal rankings can be useful when choosing which journal to publish in, for example.

Latest journal impact factors

Following the release of the latest Journal Citation Reports® on the Thomson Reuters ISI Web of Science database, we have compiled a list of the top ranking journals in various fields related to BU research. BU staff can access these lists by going to the designated folder on the collaborative I-drive: I:\R&KEO\Public\RDU\Journal Impact Factors 2011. If there are any additional subject areas that you would like to see included, do send me an email.

Related blog posts that may be of interest:
Journal impact factors explained

Google Scholar’s new citations tracking tool

As the demand for information about the performance of research outputs increases, Google Scholar’s latest online citations tracking tool could prove to be a useful resource. Google Scholar Citations is currently being launched with a small number of users before widespread dissemination, but the early signs are that it could provide a simple way for academics to calculate key metrics and monitor them over time.

According to Google Scholar, the tool will ‘use a statistical model based on author names, bibliographic data, and article content to group articles likely written by the same author’. You can then identify your articles using these groups, and Google Scholar will automatically produce the citation metrics. Three metrics will be available: the h-index, the i10-index (the number of articles with at least ten citations), and the total number of citations to your articles. Each of these metrics will be available for all outputs as well as for articles published in the last five years, and will be automatically updated as new citations become available on the web.
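As a quick illustration of the i10-index (a short Python sketch with invented citation counts), it is simply a count of articles that have ten or more citations:

    def i10_index(citation_counts):
        # Number of articles with at least ten citations
        return sum(1 for cites in citation_counts if cites >= 10)

    # Five articles cited 25, 18, 15, 9 and 2 times respectively
    print(i10_index([25, 18, 15, 9, 2]))  # 3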

You will be able to update your ‘profile’ of outputs by adding missing items, merging duplicates or correcting bibliographic errors. You’ll also have the choice to make your profile public, the idea being that you’ll make it easier for colleagues globally to follow your research because your outputs will appear in Google Scholar search results.

To get a glimpse of Google Scholar Citations in action, there are some sample profiles available via the Google Scholar blog. You can also register your email address so that you will be notified when the tool is finally available to everyone.