Disprove it! The science? of evidence based therapy

Evidence-based therapy is broadly defined as the clinical application of findings from the scientific literature. That literature documents research conducted with varying degrees of ‘good’ (or good-enough) science: sometimes evaluating an entire course of therapy, other times examining components of a therapy or a condition. As our Centre begins to think about how best to set up its research and evaluation framework, I have been reflecting deeply on the methods of scientific enquiry. In this short paper, I reflect on ‘science’ and call for a careful examination of what counts as ‘evidence’ in therapeutic work, and of the risk of looking for evidence to support existing beliefs, sell products, or prop up our ideologies. As we continue to evolve more collaborative and comprehensive approaches to methodology, we need to find effective and appropriate ways to keep ourselves honest.

What is ‘evidence’?

Serving on a Human Research Ethics Committee for psychiatric and mental health research, I examine research applications for ethical standards, regulatory compliance and scientific rigour. Here I see many and varied methodologies applied to medical and psychological research. Yet as diverse as the projects are, that diversity is still limited when it comes to efforts to test theories. We err on the side of what we think is science in the hope of producing some valid evidence.

Having studied physics, chemistry and mathematics, and been trained in the scientific method, I am aware of many approaches to understanding ‘evidence’. Theoretical physics contains several well-recognised theories that have little to no observable support but are valued because they rest on interesting, complex mathematical equations that “keep us honest…prevent us from lying to ourselves and to each other”, as Hossenfelder suggests in her book ‘Lost in Math’. Despite this high-level mathematical evidence, many of these theories cannot co-exist. Maths may well keep us honest, but is it any closer to getting to the truth? And is it any better than other forms of evidence?

The examination of Lived Experience through qualitative research or survey data has analytical methods specific to this kind of enquiry. In efforts to gather evidence that is generalisable, researchers will examine smaller groups or use this data as a springboard into larger-scale studies. An example of survey data comes from acute psychiatry. Over the years we have heard many women discuss gender-based violence in mental health facilities, and a report released by ANROWS and RMIT found that this danger continues. The report was based on interviews with 11 women, a small sample on its own, but collated with similar accounts told in other studies over the past 30 years, it amounts to a great deal of evidence that women have felt, and continue to feel, unsafe. Having worked in these spaces over the years, I have noted this reality for some women. The report called for the implementation of a list of recommendations. Good science would see us test those recommendations: compare and contrast various forms of them, and document and analyse how they affect women of different socio-cultural groups, intellectual and physical abilities, nursing staff, and so on. That approach would effectively test how the recommendations affect gender-based violence. But as confident as we are in supporting these recommendations, how will we know whether some aspect of them might cause harm, or whether they will be effective at all? Which evaluation method will honour and respect the dignity of these women’s experiences while also evaluating the recommendations objectively?

Sometimes, as therapists, we attempt to apply an approach to therapy with a client and find we have to adapt it in several ways. The N of 1 shows that our particular adaptation worked for this one individual, but it is not enough to generalise across populations. This is why you sometimes hear therapists talk about ‘practice-based evidence’ rather than ‘evidence-based practice’. But as confident as we are in our experience as therapists, how do we really know we have objectively evaluated what works in therapy? Which evaluation method will keep us honest?

Falsification 

Aside from complex theoretical mathematics, another approach to keeping us honest is falsification. One difficulty we see repeatedly is the intention of researchers to find support ‘for’ a theory, with the choice of enquiry or scientific method often reflecting this intention. Philosophers of science continue to debate what is good and true science. I question why some are no longer interested in setting out to disprove theories, as we once did, and why the emphasis now appears to be on gathering evidence ‘for’. If we approach research with the intention to disprove a theory, we are more likely to be kept honest than if we approach enquiry in order to confirm that we are right or effective. Applying falsification to smaller studies that are exploratory or narrative in nature runs counter to the spirit of that research. But applying falsification when evaluating our therapeutic programs, testing medicines or evaluating theory may be worth considering, keeping in mind Chalmers’ guidance all those years ago: if a theory has been falsified, “it needs to be ruthlessly rejected”.

Of course, some therapists may find applying this method to their interesting, insightful, artistic approaches to therapy too confronting. But when we see researchers with a pre-existing belief that a therapy works then move into evaluating and researching that therapy, we need to read with scrutiny. If their reputation and the branding of their clinical work largely revolve around something being true, then it is our duty to apply (on their behalf) falsification or some other way of keeping it honest.

Reductionism

Traditional scientific enquiry involves creating a very controlled, artificial examination of a number of key variables of interest for the purpose of disproving a theory or testing a hypothesis. The limitations of traditional methods as applied to psychology are obvious: controlling all variables except the one of interest to see if it makes a difference does not equate to real life; running controlled experiments on rodents and extrapolating to humans, or offering reductionist conclusions to complex questions, makes for lively debate but disappointing results. Consideration of the bio-psycho-social-spiritual and cultural aspects of humans is necessary, integrating knowledge from areas of psychology such as information processing, perception and attachment in a client-centred and trauma-sensitive way. But attempts to operationalise these multilayered factors often lead to preposterous ways of simplifying complex experiences. Consider an example Blood cites in her ‘Body Work: The Social Construction of Women’s Body Image’, where investigators operationalised the ‘sociocultural factor’ by having one group of participants watch a ten-minute video of advertising glorifying thinness and attractiveness, while the other group was not exposed to the video. Did the researchers really believe they could control for years of socio-cultural influence by separating these two groups into ‘exposed to culture’ and ‘not exposed to culture’? Well, no, of course they didn’t, but the researchers had an intention other than finding ‘truth’; they were not kept honest.

When we come from an understanding of how gender, culture, religion, power, socioeconomics and more can influence an individual, the controlled nature of experimentation seems woefully inadequate, neglecting important factors in a person’s lived experience. Applying lab results from studies based on samples of white, educated, middle-class Westerners with no outliers does not speak to the ‘real life’ experience of many of our clients. This is one of the justifications therapists offer for Therapist Drift (a phenomenon outlined well by Waller, describing a therapist’s ‘drift’ away from well-researched manualised treatment).

Anarchistic theory of knowledge: alternative methods of enquiry

Paul Feyerabend (Anarchistic Theory of Knowledge) wished for scientists to be liberated from the existing methodologies of science, generating a freedom to select from science and other ways of knowing. How delightful and terrifying for us that science should not take precedence over other forms of knowing; the hierarchy dismantled. As a scientist, artist and anarchist-sympathiser…I agree in principle, but the high stakes of healthcare mean we err on the side of science. It is too risky to apply an untested therapy or treatment, lest we cause harm (or at the very least waste someone’s time).

What do we do about art, the unobservable?

Therapists who don’t come from a science background are often more open to alternative approaches to therapy. Whilst studying Movement and Dance at The University of Melbourne, I took classes in Dance Movement Therapy; the idea was that dance could somehow be applied as a therapy across populations. There I studied alongside artists, dancers, musicians, educators and researchers from a field far removed from the ‘labs’ I was used to in the worlds of physics, chemistry and even the softer psychology. During this degree, lectures took place in dance studios, often cross-legged on the floor, and people were interested in the felt sense of the individual. They had unusual ways of ‘testing’ whether what they were doing was helpful: sometimes relying on a complicated observational system called Laban Movement Analysis, and often relying on no ‘control group’ or ‘placebo’, on sample sizes of a few people, or even an ‘N’ of 1?! As if the world of science were not comprehensively uncertain enough, here come the artists to well and truly mess with our heads. Despite my confusion and my objection to calling it research, I persisted with cautious curiosity and, over time, collected enough experiences to make sense of why the methods were different and how they did support the idea that the creative arts are therapeutic. But would these approaches hold up if we used falsification as a method of enquiry?

As I have worked more closely with artists over the years, I have met so many people drawn to creative arts therapies like art therapy and music therapy. I know that for a lot of us it feels therapeutic to move, sing and create art. Many are now trying to apply scientific rigour to examine this area in more detail: whether it is therapeutic, why it works and how it works are all under consideration. Methods of researching this area have understandably had to go well beyond observable data and mathematical equations. Although some try to apply research designs such as RCTs, it does seem at odds to apply methodology from the hard sciences to these kinds of experiences.

Listen to the lived experience

Another interesting area to consider is the inclusion of qualitative information, particularly the lived experiences of the client and therapist. Alternative forms of research are an interesting way to address the shortcomings of traditional methods. Whilst some therapies can be compared and contrasted in large-scale studies, the Lived Experience of therapy has received less attention than observable, measurable outcomes. It may be too risky not to be scientific enough in our enquiry, but it is riskier still not to listen to the lived experience of client and therapist, and too risky not to remain continually open to alternatives that accept the complexity of human wellbeing. Complex social, emotional creatures we are. But when examining therapies, we are bound by existing methodological limits developed with hard science in mind, and so we limit the possibilities of properly investigating the lived experience of those of us in the room (client and therapist). Our clinical observations may (arguably) be to therapy what mathematical equations are to physics, but we need to keep bias in check. With growing awareness of consumer-informed practices, and with Lived Experience and peer professionals becoming key contributors to program planning and evaluation, I hope to see substantial moves toward more collaborative approaches to enquiry.

Conclusion

After years of training in the arts and sciences, I continue to walk the tightrope of truth-seeking, perpetually aware of the long, disgraceful fall into bias, subjectivism and ‘non-evidence-based therapy’. Both the art and the science of research and therapy are filled with uncertainties and limited methodologies. By incorporating qualitative research, we may move closer to establishing better collaboration between the arts and sciences. Perhaps when we collaborate with open-mindedness and effective communication, we will start to see more elaborate and sophisticated methods of enquiry in the pursuit of evidence. Perhaps collaboration, consultation and communication are the only way to successfully navigate our way to the truth and keep ourselves honest in the mental health field.

Call to action

Research into women's mental health in particular can be liberated from existing limited and problematic methodologies, generating a freedom to select from science and other ways of knowing.

Join us in conducting research that is inclusive and collaborative.

We are looking to collaborate with researchers who have a passion for understanding mental health and wellbeing within the scientific method, from the perspective of lived experience, through the creative arts therapies and, more generally, through a feminist lens. We welcome enquiries from researchers interested in collaborating with us as we focus on developing our Centre’s all-important Research and Publications arm.

More articles by Dr Sonja Skocic