Institutional Learning from Funder Feedback: Research Methods

An insight from Dr. Ian Jones, Associate Professor.

One of the great benefits of acting as a reviewer – whether of funding applications or research papers – is being able to learn what is happening at the ‘cutting edge’ of a field, not only in terms of subject knowledge, but also in terms of methodology. Here, we can learn from both good and not-so-good practice. Having recently reviewed a number of applications for the funding scheme associated with my own professional body, it is clear that the task has had a significant impact upon my own understanding of what makes ‘good’ research, and what makes a ‘good’ application for funding.

Perhaps the key term from the latest round of reviews – to me at least – was ‘coherence’: coherence between the various elements of a proposed methodology. Applications often focus, understandably, upon ‘methods’ rather than ‘methodology’. To me, this represents a missed opportunity to generate such coherence – and consequently a missed opportunity to justify the key methodological decisions. One example is the ontological and epistemological basis of the work (perhaps more relevant within the social than the natural sciences), which is frequently overlooked or only briefly addressed. Even a relatively brief acknowledgement of these ideas can help to justify choices of methods, sampling and data analysis. This can be taken further with reference to another frequently overlooked detail: the research design. Whilst research designs are usually outlined, their role as a ‘link’ between the epistemology of the study and the data collection and analysis methods is often omitted, yet it is here that a real sense of coherence within the methodology can be established. The best bids offered not only detail about the broader methodology, but also genuine coherence between each element, with a consistent story being told: the philosophical assumptions of the study guided the research design, and each method had a clear link both to the broader epistemological issues and to the subsequent analysis and interpretation of the data.

Finally, and crucially from a reviewer’s perspective, coherence between researcher, subject and methodology is essential – often the first question a reviewer is required to address. The research itself is not independent of the researcher, so does the study demonstrate coherence in terms of researcher and subject (does the researcher have an established record in the area?) and also researcher and methodology (what evidence is there that the researcher could undertake this methodology successfully?). Again, the focus should be not just on methods, but on the broader methodology as a whole: for example, is there coherence between the choice of research design and the researcher’s own experiences and attributes (often key, for instance, in ethnographic designs)?

None of these points is ground-breakingly original, but it is interesting to see that there is still great variation in how methodologies are constructed. Assessing such methodologies has proved to be of immense value when thinking about my own work.