There are few instances in our lives when we place greater trust in the abilities of our fellow humans than in surgery. Even in relatively safe procedures, invasive surgery carries an inherent degree of risk. From a doctor’s perspective, there are specific challenges to overcome in trying to minimise the degree of invasiveness, not the least of which is our basic biological makeup. “The advantage of open surgery is that you [the surgeon] have full use of your wrists and fingers, which means a large degree of freedom and potential articulation,” explains Dr Michael Hsieh, a professor at Stanford School of Medicine in California and an expert in robot-enhanced surgery. “Another advantage is that you have a three-dimensional view, with depth perception,” he adds.
It is here that advances in robotics are creating striking new possibilities that augment the capabilities of humans. Dr Hsieh has been conducting so-called multi-port robotic surgery for some time: guiding robotic arms into a patient’s body through several tiny incisions about the size of a keyhole. This accelerates recovery times and reduces scarring. The next frontier is the potential for single-port surgery. In certain cases this may enable surgeons to avoid any scarring at all, by entering via the navel, while further speeding recovery.
Such technologies are not supplanting the role, skills or creativity of surgeons; instead they are augmenting surgeons' abilities, freeing them to make advances that humans cannot accomplish on their own. “Robotic technology is not inhibiting human creativity,” agrees Dr Hsieh. “If anything, it has perhaps expanded our horizons by allowing us to conceive of new ways to conduct old operations, or ways to take completely new approaches to disorders. I would say that creativity has been enhanced.”
Creativity plus efficiency
Robots in surgery are a dramatic example of how technology can help healthcare professionals become more creative as well as efficient in the effort to improve patient care. And much more efficient they will need to become if healthcare systems are to meet the daunting challenges facing them. In Europe, for example, the costs of providing care to ageing populations are soaring, while governments remain intent on maintaining near-universal levels of provision. To achieve this amidst tight public financing will require vast improvements in efficiency in all facets of healthcare operations. Making better use of the myriad technologies becoming available—in areas ranging from diagnostics to telehealth—is central to this objective. Nearly nine in ten health executives surveyed for this study[1] agree that there remains enormous room for technology-led efficiency gains in their organisations.
Unfortunately, the ease with which surgeons like Dr Hsieh are interacting with new technologies is less visible elsewhere in the sector. IT—and particularly the types of systems which connect the back office to the hospital floor or doctor's surgery, or provide the information necessary for effective patient care—has made slow inroads in healthcare in comparison with other sectors. The reasons are varied, but simple human resistance to change and difficulty in adapting to new technologies are prominent among them. Six of every ten healthcare respondents—more than in other sectors—say their organisations have become heavily reliant on technology in just the past three years, an indication of how recent significant technology penetration has been in some parts of the sector. Two-thirds report one or more instances of employee failure to learn a new technology in the past six months, suggesting that health employees' interaction with new technologies remains anything but smooth.
Failure to overcome difficulties in how doctors, nurses, administrative and other staff interact with technology can have expensive consequences. A salutary lesson was the 2011 scrapping of the UK’s £12.7bn effort to introduce electronic patient records. A range of factors plagued the implementation, but the thorniest was trying to convince doctors to accept and adopt new processes. (Germany, France and the Netherlands have experienced similar failures, although in Denmark such problems appear to have been surmounted.[2]) Beyond resistance to change, problems in connecting systems in different parts of the health service also undoubtedly play a role in such episodes. In our survey, sector executives point to such system disconnects as among the toughest challenges they face with technology. Another major challenge, according to the respondents, is that processes are not being written quickly enough to keep pace with technology advances.
Employees' adaptation to new technology will likely improve, and operational and cost efficiency along with it, but will there be a sacrifice in the types of human creativity and imagination needed for truly effective patient care? Our survey-takers are optimistic on this score. Close to 70% believe that increasing technology intensity has made their employees more, not less, creative in developing ideas for new health services and products, and 65% say the same about conceiving ideas to improve processes.
What’s my problem, Watson?
A look at medical diagnostics may help explain such optimism. It is an area where technology promises to enhance the abilities of health professionals, improving efficiency in the process. Diagnosis relies on the fundamental human capacity to draw on diverse pieces of information about patients—from how they describe their symptoms, to their prior medical history, to how they physically appear—and make an assessment of their likely condition. A plurality of our survey respondents (43%) point to diagnostics as the area of healthcare where the retention of human intuition is most critical.
Much work is under way to bring machine learning and computing power to bear in diagnosis, in order to maximise the power of data. The potential is clear: systems such as IBM's Watson supercomputer can “read” a million medical textbooks in just three seconds, while also drawing in diverse other information, from insurance claims to electronic medical records, to enhance its diagnostic calculations. Rick Robinson, an executive architect at IBM, notes that as many as 50,000 papers are published each year in the field of diabetes alone. “No human clinician can keep up with that,” he says. The result is inevitable errors. Studies suggest that doctors misdiagnose conditions as much as 10-15% of the time.[3]
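To make the general idea concrete, the sketch below shows, in a few lines of Python, how a statistical model can rank likely conditions from recorded symptoms. It is purely illustrative: the symptoms, conditions and training rows are invented, and nothing here reflects how Watson or any clinical system actually works.

```python
# Illustrative sketch only: a toy symptom-based diagnostic ranker.
# All symptom names, conditions and training records are invented.
from sklearn.feature_extraction import DictVectorizer
from sklearn.naive_bayes import BernoulliNB

# Hypothetical historical records: observed symptoms -> confirmed diagnosis
records = [
    ({"thirst": 1, "fatigue": 1, "frequent_urination": 1}, "diabetes"),
    ({"fever": 1, "cough": 1, "fatigue": 1}, "influenza"),
    ({"chest_pain": 1, "breathlessness": 1}, "angina"),
    ({"thirst": 1, "frequent_urination": 1}, "diabetes"),
]
symptoms, diagnoses = zip(*records)

vectoriser = DictVectorizer(sparse=False)   # map symptom dicts to feature vectors
X = vectoriser.fit_transform(symptoms)

model = BernoulliNB()
model.fit(X, diagnoses)

# A new patient presents with thirst and fatigue: rank the candidate conditions
patient = vectoriser.transform([{"thirst": 1, "fatigue": 1}])
for condition, prob in zip(model.classes_, model.predict_proba(patient)[0]):
    print(f"{condition}: {prob:.2f}")
```

In practice such a model would sit alongside, not replace, clinical judgment: it surfaces candidate conditions and their relative likelihoods for a clinician to weigh against the patient in front of them.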
There is no suggestion, however, that such systems would fully replace the role of humans in diagnosis. “I think there will always be the need for a human to decide and act in more complex situations,” says Mark Coeckelbergh, an assistant professor in the philosophy of technology at the University of Twente (Netherlands). More fundamentally, there are wide-ranging challenges to overcome, ranging from issues of accountability to rethinking the fundamental processes of healthcare.
The newly digital doctor
Dr Eric Topol is an American cardiologist, geneticist and researcher. Named “Doctor of the Decade” by the Institute for Scientific Information for his research contributions, he is the author of “The Creative Destruction of Medicine: How the Digital Revolution Will Create Better Health Care”.
Q. How will technology change the role of doctors?
Today doctors control everything. They order the data, the scans and any tests required. But tomorrow, the individual will drive that. Individuals will come to doctors—whether physically or virtually—with information in hand, seeking their guidance. Individuals will also have information well beyond what was formerly obtainable—for example, blood pressure readings for every minute of the last two weeks, or glucose levels for every minute of the last month. Those prospects are exciting.
Q. Will a traditional physical exam be replaced?
There will certainly be more data analysis, but a physical exam will still be useful. My physical exam, however, has changed dramatically. Since December 2009 I have not used a stethoscope to listen to a heart. Why would I bother when I can use a high-resolution ultrasound, which is a pocket device in my coat? So the stethoscope will eventually go, but I don't believe technology could ever replace the doctor-patient relationship in terms of empathy, compassion and understanding.
Q. Can technology reduce the pressure on overburdened health systems?
I think so. We are going to level the playing field, and this should mean the demand for doctors lessens. More and more things can be done remotely, or by individuals on their own, as long as there is Internet coverage. There will be times where you need a hospital and the physical presence of a physician, but that need—which puts pressure on health systems—will be dramatically reduced over time.
Pressure to automate
While technology may augment human potential in some healthcare domains, in others it is being viewed as a means to free up people to perform other activities. In this context, the case for remote patient monitoring and other elements of “telehealth” is clear. The aim is to free clinicians from the basic and time-consuming manual processing of information so that they can focus on where they are needed most—patient care. “The demographic challenge is such that the current way of working, which involves recruiting other countries' doctors and nurses, is not sustainable,” explains George MacGinnis, a telehealth expert at PA Consulting in the UK.
It is also an area where many see less need for human imagination or intuition: fewer than one-fifth of those polled in the sector think monitoring patients requires these capacities, and only 9% think the same of administering medicines. Both areas are ripe prospects for technology, from scales that monitor patients’ weight and flag possible risk conditions to automated alerts reminding people to take their pills.
Such technologies hold clear potential not only to free up personnel, but also to improve patient outcomes and quality of life. Mr MacGinnis cites the example of patients with certain heart conditions who must weigh themselves daily to look for early signs of excess fluid retention. This can be automated with the help of an Internet-enabled scale that alerts doctors of any worrying changes.
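The alerting logic behind such a scale can be remarkably simple. The sketch below is a minimal illustration of the kind of rule involved; the 2 kg-over-3-days threshold and the notify_clinician stub are assumptions chosen for illustration, not clinical guidance or any particular vendor's implementation.

```python
# Illustrative sketch of a remote-monitoring alert rule for daily weight
# readings from a connected scale. Threshold values are assumed, not clinical.
from datetime import date

def rapid_weight_gain(readings, window_days=3, threshold_kg=2.0):
    """readings: list of (date, weight_kg) pairs, sorted oldest-first.
    Returns True if weight rose by more than threshold_kg within the
    trailing window, a possible sign of fluid retention."""
    latest_date = readings[-1][0]
    recent = [w for d, w in readings if (latest_date - d).days <= window_days]
    return len(recent) >= 2 and (recent[-1] - min(recent)) > threshold_kg

def notify_clinician(patient_id):
    # Placeholder: a real deployment would raise an alert in the clinical
    # record system rather than print to the console.
    print(f"ALERT: review fluid status for patient {patient_id}")

readings = [(date(2013, 2, 1), 78.0), (date(2013, 2, 2), 78.4),
            (date(2013, 2, 3), 79.1), (date(2013, 2, 4), 80.3)]
if rapid_weight_gain(readings):
    notify_clinician("patient-042")
```

The point is not the sophistication of the rule but who it frees up: the routine checking happens automatically, and the clinician is only drawn in when the pattern warrants attention.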
Encouragingly, there is little fear in the industry that telehealth would somehow curtail the role of carers or nurses, or lessen their societal value. “There are certain things where you need emotions and where you need improvisation, imagination,” explains Mr Coeckelbergh. This is borne out in a variety of specific healthcare implementations, such as wide-ranging work at University Hospital Birmingham (UHB) in the UK to use technology to improve clinical decision support and increase process automation. “Contrary to negative perceptions, we’ve seen individuals empowered, obtain greater autonomy and achieve greater job satisfaction,” says Steve Chilton, UHB’s ICT director. He argues that such developments have pushed the role of human workers up the value chain, while new roles have emerged as a result, such as within process analytics. “Technology-led automation and development have freed up creativity,” he says.
The pain of disruption
Much of the wrenching change that healthcare organisations are destined to undergo over the next several years will be driven by technology. Robotics in surgery or video consultations between doctors and patients may get the headlines, but less exotic data analysis, knowledge sharing, website management and other systems will be at least as instrumental in creating the efficiencies that must be gained across under-pressure health systems. Technology disruption is part of almost any conceivable scenario for healthcare reform in the coming years.[4]
Pressure on healthcare professionals to adapt to technology change will thus remain relentless. How well they adapt will depend to some extent on the skill (and speed) with which processes are written to guide the interaction. The views of the health practitioners and experts, and the examples presented in this article, provide grounds for optimism that the frictions which have plagued interaction between people and technology in this sector will be smoothed out, and that human creativity will not be sacrificed in the process. Which is a good thing, because health organisations will need all the creativity their employees can muster to deliver the effective and cost-efficient care their patients will require and their stakeholders will demand.
* This article is excerpted from a forthcoming Economist Intelligence Unit report, Humans and machines: The role of people in technology-driven organisations. The report will be published on 5th March 2013 to coincide with the "Technology Frontiers 2013" summit, hosted by The Economist Events. Both the report and the summit are sponsored by Ricoh.
[1] A survey of 432 senior executives was conducted online in November and December 2012. Of the 40 respondents from healthcare, biotechnology and pharmaceutical organisations, roughly one-third hail from North America, one-third from Europe and one-quarter from Asia-Pacific. Almost half (48%) hold C-suite or board positions, with the rest being other senior managers. Over half of the organisations (57%) have annual revenue in excess of US$500m, with 28% having US$10bn or more.
[2] Future-proofing Western Europe's healthcare: A study of five countries, Economist Intelligence Unit, September 2011.
[3] How doctors think, Jerome Groopman, 2007.
[4] A variety of scenarios for how healthcare reform may play out in Europe are presented in The future of healthcare in Europe, Economist Intelligence Unit, March 2011.