Often I hear AI described as a productivity driver: releasing clinician time so they can see more patients. That sounds good in theory, but it also risks more complex work and more burnt-out clinicians. We must not forget the person when applying tech to solve health care challenges.
And in some health systems (particularly those driven by profit or other targets), it creates an excuse to load those clinicians with more patients within the time saved. So in the end it doesn't really improve the clinician's workload and, as you say, possibly makes them even more vulnerable to burnout.
I totally agree. Two related issues:

If AI does the work, but does it in an inconsistent and unreliable way, it may make things more stressful and difficult. It's often easier to do the work yourself than to constantly check and correct mediocre or error-prone work.

If a human is doing the administrative tasks, then it's possible to converse with them, ask for updates, discuss processes, point out errors and develop a system that works for everyone. If a black-box piece of tech is doing the task, then the clinician has to fit in with how that tech works. If they can't modify the tech, query how it works or even understand what it's doing, then the danger is that the clinician either has to work in a way that doesn't suit them/that they find more risky and stressful, or they end up working around it, making the whole process less safe and productive.

I see so many examples of 'time-saving' and 'efficiency-boosting' tech that is so badly designed that it is basically a blocker to getting work done. Staff use it not because it helps them, but because they have to. In some cases the tech is so bad that switching to paper would be more efficient.