Org Change is Missing in Health Care Tech

If you are an Organizational Change Management (OCM) professional you must read this.  If it doesn’t make you angry then you need to read it again.

Robert M. Wachter, professor of medicine at the University of California, San Francisco, and the author of “The Digital Doctor: Hope, Hype, and Harm at the Dawn of Medicine’s Computer Age” wrote an op-ed that appeared on Sunday, March 22, 2015, in the New York Times under the headline “Why Health Care Tech Is Still So Bad.”

As Org Change professionals, we are collectively pressured to provide hard data to prove the value of OCM.  We see project teams rigorously measured and rewarded for meeting scope, budget and schedule without any regard for actual adoption or business impact.  We deal with senior leaders who believe that they will realize 100% of a projected benefit on the day a project is completed.  We know that this is crazy, but can’t seem to do anything about it.

My recommendation: Download this article from the NY Times.  Circulate it among your sponsors.  Talk with them about it.  This is what OCM is all about.

Here is the unedited text of the op-ed:

LAST year, I saw an ad recruiting physicians to a Phoenix-area hospital. It promoted state-of-the-art operating rooms, dazzling radiology equipment and a lovely suburban location. But only one line was printed in bold: “No E.M.R.”

In today’s digital era, a modern hospital deemed the absence of an electronic medical record system to be a premier selling point.

That hospital is not alone. A 2013 RAND survey of physicians found mixed reactions to electronic health record systems, including widespread dissatisfaction. Many respondents cited poor usability, time-consuming data entry, needless alerts and poor work flows.

If the only negative effect of health care computerization were grumpy doctors, we could muddle through. But there’s more. A friend of mine, a physician in his late 60s, recently described a visit to his primary care doctor. “I had seen him a few years ago and I liked him,” he told me. “But this time was different.” A computer had entered the exam room. “He asks me a question, and as soon as I begin to answer, his head is down in his laptop. Tap-tap-tap-tap-tap. He looks up at me to ask another question. As soon as I speak, again it’s tap-tap-tap-tap.”

“What did you do?” I asked.

“I found another doctor.”

Even in preventing medical mistakes — a central rationale for computerization — technology has let us down. A recent study of more than one million medication errors reported to a national database between 2003 and 2010 found that 6 percent were related to the computerized prescribing system.

At my own hospital, in 2013 we gave a teenager a 39-fold overdose of a common antibiotic. The initial glitch was innocent enough: A doctor failed to recognize that a screen was set on “milligrams per kilogram” rather than just “milligrams.” But the jaw-dropping part of the error involved alerts that were ignored by both physician and pharmacist. The error caused a grand mal seizure that sent the boy to the I.C.U. and nearly killed him.

How could they do such a thing? It’s because providers receive tens of thousands of such alerts each month, a vast majority of them false alarms. In one month, the electronic monitors in our five intensive care units, which track things like heart rate and oxygen level, produced more than 2.5 million alerts. It’s little wonder that health care providers have grown numb to them.

The unanticipated consequences of health information technology are of particular interest today. In the past five years about $30 billion of federal incentive payments have succeeded in rapidly raising the adoption rate of electronic health records. This computerization of health care has been like a car whose spinning tires have finally gained purchase. We were so accustomed to staying still that we were utterly unprepared for that first lurch forward.

Whopping errors and maddening changes in work flow have even led some physicians to argue that we should exhume our three-ring binders and return to a world of pen and paper.

This argument is utterly unpersuasive. Health care, our most information-intensive industry, is plagued by demonstrably spotty quality, millions of errors and backbreaking costs. We will never make fundamental improvements in our system without the thoughtful use of technology. Even today, despite the problems, the evidence shows that care is better and safer with computers than without them.

Moreover, the digitization of health care promises, eventually, to be transformative. Patients who today sit in hospital beds will one day receive telemedicine-enabled care in their homes and workplaces. Big-data techniques will guide the treatment of individual patients, as well as the best ways to organize our systems of care. (Of course, we need to keep such data out of the hands of hackers, a problem that we have clearly not yet licked.) New apps will make it easier for patients to choose the best hospitals and doctors for specific problems — and even help them decide whether they need to see a doctor at all.

Some improvements will come with refinement of the software. Today’s health care technology has that Version 1.0 feel, and it is sure to get better.

But it’s more than the code that needs to improve. In the 1990s, Erik Brynjolfsson, a management professor at M.I.T., described “the productivity paradox” of information technology, the lag between the adoption of technology and the realization of productivity gains. Unleashing the power of computerization depends on two keys, like a safe-deposit box: the technology itself, but also changes in the work force and culture.

In health care, changes in the way we organize our work will most likely be the key to improvement. This means training students and physicians to focus on the patient despite the demands of the computers. It means creating new ways to build teamwork once doctors and nurses are no longer yoked to the nurse’s station by a single paper record. It means federal policies that promote the seamless sharing of data between different systems in different settings.

We also need far better collaboration between academic researchers and software developers to weed out bugs and reimagine how our work can be accomplished in a digital environment.

I interviewed Boeing’s top cockpit designers, who wouldn’t dream of green-lighting a new plane until they had spent thousands of hours watching pilots in simulators and on test flights. This principle of user-centered design is part of aviation’s DNA, yet has been woefully lacking in health care software design.

Our iPhones and their digital brethren have made computerization look easy, which makes our experience with health care technology doubly disappointing. An important step is admitting that there is a problem, toning down the hype, and welcoming thoughtful criticism, rather than branding critics as Luddites.

In my research, I found humility in a surprising place: the headquarters of I.B.M.’s Watson team, the people who built the computer that trounced the “Jeopardy!” champions. I asked the lead engineer of Watson’s health team, Eric Brown, what the equivalent of the “Jeopardy!” victory would be in medicine. I expected him to describe some kind of holographic physician, like the doctor on “Star Trek: Voyager,” with Watson serving as the cognitive engine. His answer, however, reflected his deep respect for the unique challenges of health care. “It’ll be when we have a technology that physicians suddenly can’t live without,” he said.

And that was it. Just an essential tool. Nothing more, and nothing less.


Elearning Cultural Adoption Factors

This posting is a summary of a discussion I facilitated with the Metro Milwaukee Society for Human Resource Management Performance & Development committee during their July ’09 meeting.

I was in a discussion recently with several corporate learning professionals and some Gen. Y folks.

We were discussing the application of Web 2.0 tools for learning, specifically the integration of wikis as a post-training tool that learners could use as reference and best-practice sharing sites.  The L&D pros were adamant: “We couldn’t allow that!  How could we control the accuracy of the information being shared?”  One of the younger folks replied, “You guys have been sharing bogus information at the water cooler and the lunch room for years; why does it bother you now?  At least this way you can see what we’re sharing!”


We were stunned by the obvious logic and the clarity of the point.  As I reflected on this exchange over the next few weeks, it dawned on me that the ground has shifted under the Training & Development department!  In the last few years, driven by the widespread adoption of Web 2.0 technologies, the factors that can impact the success or failure of an elearning deployment have changed radically from what they’ve been historically.

A brief survey of Technology-Enabled Learning / Elearning history

  • Gen. 1 – Laserdisc; specialized delivery systems (mid-’80s)
  • Gen. 2 – CD-ROM; desktop PC delivery (1990s)
  • Gen. 3 – Intranet-based; workplace or home PC delivery / Learning Management Systems (LMS) (2000s)
  • Gen. 4 – Internet-based, Web 2.0 (e.g., wikis, YouTube, podcasts, simulations, Twitter); PDA/smartphone delivery / Learning Content Management Systems (LCMS)

Summary of Current Trends:

  • Unprecedented cost pressure on business (travel & training impact)
  • Training design shifting from discrete events to distributed-over-time
  • Control shifting from Training Dept. to Learner
  • Cost of technology-enabled training production & deployment dropping
  • Complexity of learning content management increasing
  • Workplace generational gaps/differences with technology widening

Cultural adoption issues for Gen. 1 thru Gen. 3

For the first three generations of technology-enabled training, the training department was on the cutting edge of technology.  Their job included not only the development and deployment of elearning content, but also staying abreast of developments in elearning technology and marketing the technology to their client population.  During the rollout of an enterprise-wide elearning application at GE, as recently as 4-5 years ago, these were the typical cultural adoption issues:

  • Inconsistent navigation features from e-course to e-course – none offers the comfort of a “book metaphor” (i.e., table of contents, chapters, index, visual search, marking pages with tabs, writing in margins/highlighting, etc.)
  • Learner comfort/preference for face-to-face social interaction and value of spontaneous interaction found in a classroom vs. the solo elearning experience
  • Shift from listening/verbal skills/social skills needed in a classroom to reading/writing/technology skills needed for elearning
  • Classroom/off-site training frequently seen as a perquisite – elearning seen as a task
  • Overcoming user frustration with technology glitches and a lack of computer fluency/skills
  • Good time management skills and discipline in managing distractions (e.g., phone, email, drop-ins, boss) required for elearning.  Elearning also lacks the psychological benefit of changing venue (going to a classroom) for training, which helps refocus attention on learning.

Actual learner feedback: “When I go to a classroom event, I make training my A priority and I keep up with my day job before class, during lunch/breaks, and after class.  When I’m at my desk, my day job is my A priority and elearning is a C priority – I rarely make the time for it.”

Cultural adoption issues for Gen. 4

The widespread adoption of Web 2.0 technologies such as wikis, blogs, social media, YouTube, simulations, smartphones, etc., predominantly by younger (Gen. Y) workers, has created a larger “technology gap” for many of the boomers in the workforce who still have the Gen. 1-3 issues described above.  Now we add in a whole new dynamic: the Gen. Y folks, who bring:

  • Comfort with/preference for Web 2.0/social technologies and elearning rather than face-to-face social interaction
  • Expectations of highly-sophisticated technology experience (i.e. intuitive user interface & fast, glitch-free technology)
  • Different production values for online content (e.g., a cell phone video posted on YouTube is fine as long as it meets acceptable audio/video quality), as opposed to the emphasis on “glitzy” production still preferred for Gen. 1-3 elearning to support its internal marketing
  • Widespread adoption of smartphone/PDA technology seamlessly integrated with online applications, and expectations of truly anywhere/anytime learning


I will continue this discussion in another post soon.  I’d love to hear your comments!