by Mickey Skidmore, AMHSW, ACSW, MACSW

*Editor’s note: this essay was initially considered for publication in the inaugural AASW online publication FOCUS. However, ultimately this did not occur …*

 

Preamble: Hard Sciences versus Soft Sciences

Perhaps the foremost reason western medicine is recognised as the accepted standard in healthcare is that it is rooted in science. And while medicine enjoys many areas where hard science offers straightforward, clear-cut, empirical evidence that its interventions are effective, there remains an aspect of medicine that is beyond science. Ever wonder why physicians are said to “practice” medicine? The reason is that, in addition to scientific knowledge, medicine also requires experience, clinical acumen, and critical thinking. In fact, most individuals would acknowledge that the practice of medicine is both a science and an art.

Would it be surprising to learn that many physicians hold the view that psychiatrists are not “real” doctors, or cast serious doubt on psychiatry as a valid medical specialty? Thus, it should not be too surprising that the allied health professions (psychology, social work, etc.) are often relegated to the realm of the “soft sciences”.

The world of hard science is empirical, measurable, and replicable. Facts and evidence can be clearly and easily measured, and the credibility of research is enhanced when outcomes can be consistently replicated. In the realm of hard science, one can say that under a clearly defined set or sequence of variables, or when the original methods are followed as closely as possible, the results are consistently the same — indisputable. And while the same empirical principles and approaches are applied in the soft sciences, it is often far more challenging to yield the straightforward, clear-cut outcomes that the hard sciences often enjoy.

It is perhaps oversimplified and somewhat ironic to draw parallels to the functionality of the human brain. Research methods and the hard sciences are in many respects much like the organisation and functioning of the left hemisphere of the brain — stressing a linear thinking mode that emphasises the scientific skills of writing, language, mathematics, lists, and logic. The soft sciences, by contrast, are more in line with the functioning of the right hemisphere — stressing a holistic thinking mode that emphasises emotional expression, spatial awareness, music, imagination/intuition, and creativity (Mitrovic, 2015).

The Linear Mode Bias of Western Research Methods

As but one of the Allied Health disciplines, Social Work practice is broadly based and incredibly diverse. It includes social welfare, child protection, social policy, general health care, and a wide range of clinical practice (including psychotherapy), to name a few. Social Work, like other Allied Health professions, offers distinct emphases and professional pathways in academia and research, as well as direct-service or clinical tracks. While research is not high on my list of professional interests, research methodology was part of my master’s degree training and education. Combined with my clinical experience and training, it provides me with an added dimension of knowledge that enables me to critically analyse evidence-based approaches and the claims made by researchers when considering what constitutes best practice.

Many private and public organisations emphasise, and in many cases insist on, evidence-based practice principles in health care delivery, which is often underscored and reflected in research. In fact, some public health care systems are moving towards requiring even entry-level positions not only to participate in but to lead research projects as part of their quality performance standards, and are tying professional advancement to this effort as well.

At first glance, this may seem impressive. However, a closer examination of this trend reveals notable shortcomings. It should not be surprising that today’s research methodology is dominated by a logic-oriented, linear approach. I argue, however, that this linear approach to research methodology is both over-emphasised and over-valued, to the point that the evidence it offers practitioners is limited and one-dimensional.

Recently, the Reproducibility Project, headed by Brian Nosek of the University of Virginia, led an international effort in which 270 scientists attempted to replicate 100 studies published in three top psychology journals. The results, released in a report in August 2015, found that only 36% of the rerun studies produced the same results as the originals.

To be fair, such provocative results do not necessarily mean that the original findings were incorrect or that the scientific process is flawed. There are several possible explanations: luck; poor execution; or an incomplete understanding of the circumstances needed to show the effect (Villiger, 2015). Moreover, psychology and the social sciences may be particularly difficult to replicate, as aspects of human behaviour are often difficult to measure empirically.

At the very least, this report raises several thought-provoking observations. As I indicated previously, one of the tenets of scientific research is reproducibility. The ability to repeat a study and find the same results twice is a prerequisite for building scientific knowledge and credibility. Replication allows us to ensure empirical findings are reliable and refines our understanding of when a finding occurs. However, it is rare that scientists conduct, much less publish, attempted replications of existing studies. Moreover, journals succumb to various pressures to publish novel, cutting-edge research. The political realities of academia tie professional advancement to making new discoveries, not to painstakingly confirming claims from previous studies. Furthermore, once a report or study appears in a reputable journal, it acquires a quality of unassailable authority that others can and will cite henceforth without a trace of scepticism. In short, such disincentives and political realities further contribute to the shortcomings of this approach (Villiger, 2015).

Evidence-Based Principles versus Critical Thinking

For at least the past decade, industry and business leaders in the United States have been clamouring for workers with critical thinking capabilities (Skidmore, 1999). And while political leaders have stressed the value of ensuring that workers receive the technical and vocational training they require to be successful in high-tech and/or factory employment, it is not at all clear that this training includes critical thinking skills. Likewise, it is debatable whether university graduates complete their studies with such capabilities either. From my perspective, this skill set seems lacking in more graduates and young professionals than ever before.

In the late 1990s and early 2000s, I often cited a claim that more than 300 forms of psychotherapy were being practised in the world. One criticism of these many approaches, at least from an empirical research perspective, was that the effectiveness reported by their practitioners was anecdotal at best, given the inherent inability to consistently or systematically measure results. EMDR emerged as a potentially promising approach during this time (especially in veterans’ programs), but enthusiasm for researching the approach gave way to an emphasis on marketing it to trainers.

Since coming to Australia, I have observed several “new” psychotherapy approaches, including Mindfulness; Acceptance and Commitment Therapy (ACT); Interpersonal Therapy (IPT); Mindfulness-based CBT (M-CBT); Trauma-Form DBT; Schema Therapy; Motivational Interviewing; Mentalisation; the Conversational Model; Narrative Therapy; and others — all touting evidence-based underpinnings. It is interesting that many of these efforts focus on providing effective treatment in the specific area of complex trauma and/or personality disorders — in particular, borderline personality disorder (BPD). This suggests that the social science community recognises the limitations of pharmacological approaches in treating this cluster of disorders, and is mobilising to discover and provide enhanced, complementary “talk therapy” approaches that offer hope of better outcomes for people suffering from these conditions. Such efforts risk criticism, however, of becoming increasingly “technique”-oriented rather than developing comprehensive treatment modalities.

I remain somewhat cynical about these approaches being “new”. What is new, however, is that there has been some effort to apply scientific research methods to various aspects or components of these approaches. Research efforts to apply scientific study to the spiritual/philosophical practices of mindfulness are far from conclusive. There does appear to be evidence that mindfulness can be beneficial for some (Mitrovic, 2015); however, nailing down the specifics (i.e. consistency, replicability, etc.) remains elusive. Some subsequent reports have even characterised the empirical research on mindfulness as “shoddy”.

From my own professional experience, I notice broad conceptual aspects of NLP (Neuro-Linguistic Programming) within Motivational Interviewing, Conversational and Narrative Therapies. I also note developmental, gestalt, and hypnotic strategies in Schema Therapy and Mentalisation. As a critical thinker, I recognise that older, well-established psychotherapeutic approaches are being revisited, with a more earnest effort to apply scientific research methods. When the research yields evidence of encouraging outcomes, the approach is re-packaged or renamed under a “newer”, more “evidence-based” title and re-introduced to the mental health community.

In an effort to provide a more holistic perspective, or right-brained contrast, I wish to offer some examples of critical thinking about evidence-based research claims. As a specific way to ground this analysis, I will begin with the development of cognitive behavioural therapies in recent years.

In my 30+ year mental health career, I have witnessed numerous psychotherapeutic modalities become trendy and popular. Many of these approaches were off-shoots stemming from the roots of behavioural theory. Rational Emotive Therapy (RET) was one of the early trend-setters, serving as a precursor to an even more popular and enduring model which most of us now recognise as Cognitive Behavioural Therapy (CBT). After CBT enjoyed many years atop the psychotherapy realm, Marsha Linehan, PhD, introduced Dialectical Behaviour Therapy (DBT), which became the next psychotherapeutic method to be all the rage in the mental health arena.

The reason CBT became so popular within the empirical community was that its proponents were, for a long time, the only ones who had gone to the trouble of developing a nomenclature, or language, that made their approach measurable. Employing CBT thus provided a pathway for social science research to be incorporated into empirical scientific research parameters and methodology for the first time.

DBT offers an intensive and structured program that combines specific skills training in a group setting to better manage emotional and behavioural dysregulation (often manifesting in self-harm, e.g. cutting or self-mutilation), while simultaneously reinforcing these skills through individual psychotherapy. The research reflected a noticeable decline in frequent presentations to emergency departments by a specific cohort of patients with Borderline Personality Disorder (BPD). This outcome was instrumental in DBT becoming the next darling of the mental health community. However, research shared at last year’s Project Air workshop at the University of Wollongong revealed no significant difference in outcomes for DBT participants whether they attended the group skills training alone or together with individual therapy.

Marsha Linehan has skilfully managed to capitalise on her model, leading many to accept that DBT may be more effective than the evidence indicates. I know many therapists who have carefully reviewed her approach and voiced concerns about several aspects of her theoretical model. For years she has promised a second phase to her model, presumably aimed at addressing the longstanding, unresolved issues that underlie the symptoms of complex PTSD and/or personality disorders, symptoms originally developed as coping strategies to survive. Yet we continue to rest on the laurels of the evidence from her initial work.

This is not to say that DBT is not a viable approach to working with patients suffering from BPD. But we should all be clear about what the evidence-based research actually suggests. The research indicates that this approach is effective in reducing repeat ED visits by patients with BPD, and it speculates that this may be due to providing an alternative set of skills for better managing emotional and behavioural dysregulation. It does not suggest that DBT adequately addresses unresolved trauma, cures PTSD, or resolves personality disorders.

One of the major components of empirical research methods is statistical analysis. These specialised and complex mathematical formulas enable the researcher to determine results and outcomes from data that has been gathered meticulously under precise and consistent protocols. The rigour of this exercise, and the seeming impartiality of numbers, has long been the appeal of what so many like to tout as “evidence-based”. Again, I imagine this is more straightforward within the realm of hard science: measuring molecules, mass, or density is much more linear, whereas the things social scientists endeavour to measure are less so. Regardless, most people intuitively recognise that anyone with strong mathematical skills and adequate experience with such techniques can often steer statistics towards an array of desired outcomes — to make them say virtually anything.
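One concrete mechanism behind this intuition, offered here as an illustrative aside rather than anything drawn from the studies cited in this essay, is the multiple-comparisons problem: run enough separate significance tests on pure noise, and a “significant” result becomes almost inevitable. A minimal arithmetic sketch, assuming only the conventional alpha = 0.05 significance threshold and independent tests:

```python
# Probability of at least one spurious "significant" result when
# running k independent tests on data that contains no real effect,
# using the conventional alpha = 0.05 threshold.
alpha = 0.05
for k in (1, 5, 20, 100):
    # Chance that at least one of k tests crosses the threshold by luck:
    # 1 minus the chance that all k tests correctly come back "not significant".
    p_any_false_positive = 1 - (1 - alpha) ** k
    print(f"{k:>3} tests -> {p_any_false_positive:.0%} chance of a false positive")
```

At twenty tests the chance of at least one spurious finding is roughly 64%, which is one reason selective reporting of analyses can make noise look like evidence.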

Holistic and Critical Thinking

Without question, the Western world is dominated by left-brain thinking. The strength of this bias is undeniable within scientific laboratories, and as a result the world has enjoyed significant medical advancements, not the least of which is an array of vaccines that have effectively eradicated many diseases and afflictions. However, human behaviour does not lend itself to straightforward measurement. The complexities of the human mind and behaviour make it difficult to measure, and the social environment in which human beings live is considerably different from microscopic components in a laboratory. Yet throughout the 1990s, the prevailing view espoused by many psychiatrists was that mental illness was the result of “a chemical imbalance of the brain” (Skidmore, 2000). This over-simplified explanation is an example of how left-brain thinking can result in a one-dimensional application to the complexities of mental illness. Such explanations were simplistic, easy, and even became popular among non-professionals. However, they were partial truths at best: vague, limited, and emphasising only the medical dimension of the conversation.

While efforts will and should continue to apply scientific research methods to the social sciences, and to psychotherapeutic treatment approaches in particular, it is more important than ever to bring critical thinking and a holistic framework to the application of these evidence-based outcomes. Without a holistic perspective, we run the risk of becoming increasingly compartmentalised and technique-oriented. Moreover, without right-brained incorporation, we risk becoming increasingly one-dimensional — a framework that will become increasingly unsatisfying in its application to the complexities of human thought and behaviour. And perhaps most important is embracing critical thinking to understand clearly what the evidence in reported research findings actually is, and what the implications might be of applying these outcomes to clinical practice.

Like medicine, the practice of psychotherapy is both a science and an art. Clinical acumen and experience are every bit as important and valid as research. Often in psychotherapy there is an unspoken debate underscoring the process — a schism of intellect versus emotion, which parallels the left-brain/right-brain discussion of this essay. However, the issue is not that intellect or emotion is superior to the other. Rather, the clinical reality is that intellect and emotion become cut off from each other, resulting in psychological dysregulation or impairment. Treatment then endeavours to reconnect the two so that the individual can re-establish balance and once again function in a more holistic manner — employing the strengths of both rather than emphasising one at the expense of the other.

I cannot stress enough that emphasising research at the expense of, or as superior to, critical thinking is short-sighted and will in the long run have detrimental consequences. At face value it may seem reasonable that private and public health care systems are emphasising research and tying professional advancement to conducting it; it is certainly an easier and more straightforward way to measure desired achievement. The concern, however, is that this ignores and devalues the validity of clinical acumen, experience, and critical thinking, along with other leadership skills and qualities altogether. Moreover, the implication of such policies is that management will recognise only research as an indicator of leadership and professional advancement.

I applaud the strengths and achievements that result from emphasising linear organisational principles. Moreover, I recognise the value of scientific research methods and anticipate that many more discoveries await humankind as a result of such efforts. I also, however, acknowledge the contributions of holistic organisational principles, which have not been fully realised and are in many cases ignored, leaving us vulnerable to becoming one-dimensional. There are numerous examples of how even the most basic behavioural research findings are routinely ignored, despite clear evidence.

For example, in basic parenting dynamics, research has shown the benefits of positive reinforcement over punishment. Yet punishment often reigns, in part because it is quicker, simpler, and easier than the more taxing effort of positive reinforcement. Another example is especially evident in bureaucratic organisational systems, where holistic, behavioural evidence often gives way to political or financial realities.

 

REFERENCES

1) Handwerk, Brian. “Scientists Replicated 100 Psychological Studies, and Fewer Than Half Got the Same Results.” smithsonian.com, 27 August 2015.

2) Villiger, Maggie. “We found only one-third of published psychology research is reliable — now what?” theconversation.com, 28 August 2015.

3) Mitrovic, Dana. “Meditation Techniques in Psychotherapy: Will They Work for Your Clients?” Presented at the SSW Psychiatric Training Network, Liverpool Hospital, NSW, 4 September 2015.

4) Skidmore, Mickey. “The 1990s: Decade of the Brain?” Originally posted February 2000 at www.turning-points.com (now available at www.turning-points.com.au).

5) Skidmore, Mickey. “Some Thoughts about Critical Thinking.” Originally posted August 1999 at www.turning-points.com (now available at www.turning-points.com.au).