What’s the biggest barrier to learning more?

From reading and engaging with clinicians online and face-to-face, it’s clear to me that effectively integrating psychosocial factors into daily clinical reasoning, especially amongst physical or manual therapists, is a real challenge. There’s enough research showing how poorly these factors are identified – and how rarely identifying them changes what we do and how we do it – for me to be convinced of this. What intrigues me, though, is why it’s still a problem, given that psychosocial risk factors have been part of practice in NZ since 1997.

It’s not ignorance. It’s not holding an alternative viewpoint. It’s not just that clinical reasoning models don’t seem to integrate these factors, or that our original training kinda partitioned off the various “bits” of being human – I think it’s probably that we believe we’re already doing well enough.

[Image: the Dunning-Kruger effect]

This effect has a name – the Dunning-Kruger effect. Now, don’t be put off by this term, because I know in some social media circles it’s used to bash people who are maybe naive, or haven’t realised their lack of knowledge, and it can feel really awful to be told “well actually you’re ignorant”, or “you’re inflating your skill level”. The thing is, it’s a common experience – we all probably think we’re great car drivers, but in reality most of us are pretty average.

The same thing occurs when we consider our ability to be:

  • empathetic
  • responsive
  • good listeners
  • client-centred
  • collaborative

Another important effect found in clinicians is the belief that our experience means we’re better at aspects of clinical care, and especially at clinical reasoning. Over time we do get better at recognising patterns – but this can actually be a problem for us. Humans are excellent at detecting patterns, but as a result we can jump to conclusions, have trouble stopping ourselves from fixating on the first conclusion we draw, begin looking for things to confirm our hunch, and overlook things that don’t fit the pattern we’ve identified. In effect, we begin to use stereotypes rather than really looking at the unique person sitting in front of us (see Croskerry, Singhal & Mamede, 2013a, 2013b).

The effect of these biases – particularly our tendency to think we do better than we actually do with communication skills and psychosocial factors – is that we’re often completely unaware of HOW we communicate, and HOW poorly we pick up on psychosocial factors.

So often I’ve heard people say “Oh, I use intuition, I just pick up on these psychosocial issues” – but the problem is that (a) we’re likely to over-estimate how well we pick up on them and (b) our intuition is a poor guide. The risk for our patients is that we fail to identify something important, or alternatively, that we label something as a psychosocial risk factor when it’s actually irrelevant to this person’s problem.

Clinical reasoning is difficult. Recognising patterns becomes easier over time because we’ve seen a far broader range of them before – but at the same time:

  • research is expanding all the time (we can be out of date)
  • we can prematurely fixate on something that isn’t relevant
  • we get hooked on things we’ve just read about, things that happen rarely, things that remind us of something or someone else

Hypothetico-deductive reasoning is an alternative approach to clinical reasoning. It suggests we hold some ideas about what’s going on in mind while collecting more information to test whether this is the case. The problem here is that we tend to look for information to confirm what we think is happening – rather than looking for something to disconfirm, or test, the hypothesis we hold. So, for example, we might observe someone’s pain behaviour and think to ourselves “oh, that person is doing that movement because of a ‘dysfunctional movement pattern’”. We can assume the reason for this movement pattern is some sort of underlying dysfunction – but we fail to test that assumption. It might in fact be a movement pattern developed because someone told the person “this is the way you should move”, or because of the person’s beliefs about what might happen if they move differently.

The problem with intuition and these other cognitive biases is that they simplify our clinical reasoning and reduce effort, so they’re easy traps to fall into. What seems to help is slowing down: deliberately putting a delay between collecting information and making a decision, and holding off before deciding what to do. Concurrently, we probably need to rely less on finding “confirming” information – and FAR more on collecting information across a range of domains, some of which we may not think are relevant.

That’s the tough bit. What we think is relevant helps us narrow down our thinking – great for reducing the amount of information we need to collect, but not so great for testing whether we’ve arrived at a reasonable conclusion. My suggested alternative is to systematically collect information across all the relevant domains of knowledge (based on what’s been found in our research), wait a bit and let it settle – then and only then begin to put those bits and pieces together.

Why doesn’t it happen? Well, we over-estimate how well we do this assessment process. We do jump to conclusions and sometimes we’re right – but we wouldn’t know whether we were right or not because we don’t check out alternative explanations. We’re pushed by expectations from funders – and our clients – to “set goals” or “do something” at the very first assessment. We feel guilty if we don’t give our clients something to take away after our initial assessment. We want to look effective and efficient.

A great quote sums this up:

“For every problem, there is a solution that is simple, elegant, and wrong.” – attributed to H.L. Mencken

If you’d like to question your own practice, try this: Record your session – and transcribe that recording. Notice every time you jump in to give advice before you’ve really heard your client. Notice how quickly you form an impression. Examine how often you look for disconfirmation rather than confirmation. See how often you ask about, and explore, those psychosocial factors. It’s tough to do – and sobering – but oh how much you’ll learn.

Croskerry, P., Singhal, G., & Mamede, S. (2013a). Cognitive debiasing 1: Origins of bias and theory of debiasing. BMJ Quality & Safety, 22(Suppl 2), ii58–ii64. doi:10.1136/bmjqs-2012-001712

Croskerry, P., Singhal, G., & Mamede, S. (2013b). Cognitive debiasing 2: Impediments to and strategies for change. BMJ Quality & Safety, 22(Suppl 2), ii65–ii72. doi:10.1136/bmjqs-2012-001713
