Eivind Engebretsen

A situated epistemological approach to research integration and implementation

Evidence, in the evidence-based medicine (EBM) sense of the word, is mainly associated with randomized controlled trials (RCTs), which are considered the ‘gold standard’ for assessing cause-effect relationships between an intervention and its outcome; the findings generated by RCTs are likely to be “closer to the true effect than the findings generated by other research methods” (Evans 2003). Indeed, EBM has developed a framework for ranking evidence in a hierarchy that features simple observational methods at the bottom and moves on to increasingly rigorous methodologies, notably RCTs and systematic reviews, at the top.


Such a hierarchy does not necessarily fit all purposes, nor does it incorporate all relevant types of evidence. Greenhalgh (2020) has, for instance, demonstrated that health authorities failed to acknowledge the whole fabric of evidence relevant to the question of using facemasks as a measure against the spread of COVID-19. An anthropological study of cultural attitudes or compliance, or a newspaper article reporting from an outbreak area, does not measure the ‘true effects’ of an intervention, but that does not make either of them unreliable as evidence in every sense of the word. They provide evidence of a different kind.


In our forthcoming book, Rethinking evidence in the time of pandemics (CUP, 2022), Mona Baker and I argue the need for situated epistemologies that are relative to the site or discourse where the knowledge is articulated and to the narrative location of those who articulate or evaluate that knowledge. Rather than taking the EBM hierarchy for granted as the only rational means of assessing evidence, we explore the various stories about COVID-19, and offer a theoretical basis for understanding how different individuals and communities decide which of a range of competing stories they should believe in and why. The truth qualities of a given story can never be assessed from a safe place outside and beyond the story itself, through reason as such, because reason always already dwells within a story. It follows that a story can only be evaluated based on the situated principles defined within the story itself, as well as the situated principles and values of the stories brought into the assessment by its audience and with which the assessed story resonates or competes. The empiricist notion of evidence underpinning EBM is thus only one possible situated interpretation or value according to which knowledge claims can be and are in practice assessed.


Based on this understanding, I will argue that so-called integration and implementation sciences (i2S) also need to be based on situated epistemologies. i2S is a transdisciplinary framework developed by Bammer (2013) for conducting actionable research to solve complex real-world problems. Bammer (2020) has convincingly argued that to tackle complex societal or environmental problems we need not only knowledge about the specific problem being tackled, but also generic expertise in research integration and implementation that is relevant to tackling any problem. Moreover, I maintain, this ability to combine knowledge from different disciplines and sources requires an awareness of the relevant types of knowledge and an ability to assess them based on the values and principles that each encodes. Hence, in addition to ‘knowledge of that’ and ‘knowledge of how’, as emphasized by Bammer (2020), such expertise also requires what Fisher (1987) refers to as ‘knowledge of whether’. Questions such as whether to impose lockdowns or make vaccination mandatory are not strictly scientific but political. At the point where experts cross the boundary of technical knowledge – where knowing that and knowing how dominate – and enter “the territory of life as it ought to be lived” (Fisher 1987:73), they are ‘off-duty’. They then pass from the domain of facts to the domain of values, from what they know to what they should do. In relation to such questions, the expert takes on the role of a counsellor, “which is, as Walter Benjamin notes, the true function of the storyteller” (Fisher 1987:73). Outside the controlled context of an experiment or trial, practical problems also become the focal point for competing expert stories that address the issue from different angles. The question of whether or not to impose lockdowns or make vaccination mandatory, for instance, might be framed very differently from the point of view of immunologists, psychiatrists, sociologists and educational scientists. From the point of view of situated epistemology, however, none of these experts can “pronounce a story that ends all storytelling” (Fisher 1987:73).


Expertise in research integration and implementation must therefore be based on an understanding of the stories people hold most dear and the values they embrace.


References

Bammer, Gabriele. Disciplining Interdisciplinarity: Integration and Implementation Sciences for Researching Complex Real-World Problems (Australian National University E Press, Canberra, 2013). Open access online: http://press.anu.edu.au?p=222171.


Bammer, Gabriele, et al. "Expertise in research integration and implementation for tackling complex problems: when is it needed, where can it be found and how can it be strengthened?" Palgrave Communications 6.1 (2020): 1–16.


Engebretsen, Eivind, and Baker, Mona. Rethinking Evidence in the Time of Pandemics: Scientific vs Narrative Rationality and Medical Knowledge Practices (Cambridge University Press, 2022/in press). Available online: Rethinking Evidence in the Time of Pandemics - Oslo Medical Corpus.


Evans, David. "Hierarchy of evidence: A framework for ranking evidence evaluating healthcare interventions." Journal of Clinical Nursing 12 (2003): 77–84.


Fisher, Walter. Human Communication as Narration: Toward a Philosophy of Reason, Value, and Action (University of South Carolina Press, 1987).


Greenhalgh, Trisha. "Face coverings for the public: Laying straw men to rest." Journal of Evaluation in Clinical Practice 26.4 (2020): 1070–1077.



