Having led some seminars at BU, and dipped my toe into teaching as a useful mechanism and resource, I have often wondered what contexts make for a good workshop. I would suggest that an enjoyable workshop experience needs some or all of the following: insightful ways of relating content; inspiring delivery; a variety of taught and practical exercises; and an opportunity to network and socialise. These are the contexts which I hypothesise to be conducive to a good workshop outcome. My experiences of workshops in my early career researcher and PhD journey to date have been mostly positive, but I have never experienced all of the above in equal high measure – UNTIL NOW!
This week I attended a three-day workshop on Realist Methodologies. The workshop was hosted by the University of Liverpool, but delivered on its London campus in the heart of the city’s financial district.
The content and resources were communicated and contextualised by facilitators Justin Jagosh (University of Liverpool), Geoff Wong (University of Oxford) and Sonia Dalkin (Northumbria University) in a manner that was informative, insightful and engaging. There was a good mix of taught material and hands-on exercises. There were also chances to present your work to the wider, interdisciplinary group and discuss it constructively, and opportunities for one-on-ones with the facilitators to discuss and (de)construct your own realist projects. In addition, there was an opportunity to chat in an informal setting over some pizza, pasta, beer and gin & tonics! All of this led to enhanced reasoning, a mechanism, with an outcome of increased understanding.
So, in a way that is succinct and accessible: what is realist methodology and how can it be applied in research? I’ve actually dropped in some hints in the two larger paragraphs above… Before outlining the methodology, it is useful first to discuss the philosophical position on which realist methodologies are based.
Realist methodology and evaluation are underpinned by the critical realist philosophical works of the likes of Roy Bhaskar and Andrew Sayer (to name a few). This advances a philosophical position that “…there exists both an external world independent of human consciousness, and at the same time a dimension which includes our socially determined knowledge about reality.” (Danermark et al., 2002: 5-6). On this basis, it is possible to be a positivist and objective ontologist (what is) whilst, at the same time, being an epistemological interpretivist (what it is to know).
Going deeper (stay with me!), Roy Bhaskar proposed three realms of reality: the actual (objective entities and events that manifest in the real world), the real (subjective structures, phenomena and agency that act as causal mechanisms in the real world) and the empirical (observable human consciousness and perspectives on the actual and the real). As Easton states, “The most fundamental aim of critical realism is explanation; answers to the question “what caused those events to happen?”” (2010: 121).
Based on this, and in the context of evaluating social programmes, realist evaluation is a research approach that seeks to ‘scratch beneath the surface’ and offer a ‘real’ and plausible account of “…what works for whom, in what circumstances, in what respects and how.” (Pawson et al., 2005: 21). It does so by proposing that the outcome (O) of social programmes or interventions rests on the conceptual relationship between context (C) and mechanisms (M) – expressed as the ‘C + M = O’ formula.
However, integral to mechanisms are both resources (typically the programme or intervention) and reasoning. Because it can be hard to adequately illustrate and distinguish these two characteristics in the CMO configuration, Dalkin et al. (2015) propose a new iteration of Pawson and Tilley’s (1997) original formula – expressed as ‘M(Resources) + C → M(Reasoning) = O’. I’m afraid you’ll have to come and ask me in person for my CMO configuration!
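For readers who think in code, the refined formula can be sketched as a tiny data structure. This is purely illustrative – the class and field names below are my own hypothetical choices, not anything prescribed by Dalkin et al. or Pawson and Tilley – but it shows how resources, context, reasoning and outcome slot together:

```python
from dataclasses import dataclass

@dataclass
class CMOConfiguration:
    """Hypothetical sketch of Dalkin et al.'s (2015) refined configuration:
    M(Resources) + C -> M(Reasoning) = O."""
    resources: str  # M(Resources): what the programme or intervention offers
    context: str    # C: the circumstances the resources are introduced into
    reasoning: str  # M(Reasoning): how participants respond to the resources
    outcome: str    # O: the result of resources meeting context via reasoning

    def describe(self) -> str:
        # Render the configuration in the shape of the refined formula
        return (f"{self.resources} + {self.context} "
                f"-> {self.reasoning} = {self.outcome}")

# Illustrative example drawn from the workshop account above
workshop = CMOConfiguration(
    resources="taught material, exercises and informal socialising",
    context="a friendly, interdisciplinary group",
    reasoning="enhanced reasoning about realist methods",
    outcome="increased understanding",
)
print(workshop.describe())
```

In a real evaluation you would typically hold several candidate configurations like this and test each against the evidence, refining them as data comes in.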
Used in conjunction with findings and evidence from the existing literature to inform research protocols, this conceptual formula guides the gathering and interrogation of data to ‘scratch beneath the surface’ of what happens in social programmes and interventions, why, for whom and in what context. Finally, and importantly, realist evaluation has no methodological prescriptions – although it is particularly suited to mixed methods and qualitative research.
The realist methodology community is a very friendly and collegiate one. Do get in touch to discuss this approach. If I can’t help you (for example, I haven’t discussed realist synthesis – a kind of systematic review approach using realist philosophy and the CMO configuration), I can pass you on to someone who might be able to (the RAMESES JISCMail list is a good start).
My next workshop has a lot to live up to!
Dalkin, S. M., Greenhalgh, J., Jones, D., Cunningham, B. & Lhussier, M. 2015. What’s in a mechanism? Development of a key concept in realist evaluation. Implementation Science, 10.
Danermark, B., Ekstrom, M., Jakobsen, L. & Karlsson, J. C. 2002. Explaining Society: Critical realism in the social sciences, London, Routledge.
Easton, G. 2010. Critical realism in case study research. Industrial Marketing Management, 39, 118–128.
Pawson, R., Greenhalgh, T., Harvey, G. & Walshe, K. 2005. Realist review – a new method of systematic review designed for complex policy interventions. Journal of Health Services Research & Policy, 10, 21–34.
Pawson, R. & Tilley, N. 1997. Realistic Evaluation, London, Sage.