Wednesday, August 7, 2013

Good ideas can't necessarily survive on their own

Slow Ideas: Some innovations spread fast. How do you speed the ones that don’t? by Atul Gawande

The capacity to create and innovate is integral to effective decision-making. Thinking imaginatively from multiple perspectives is part of the challenge. But once an idea is out there and proven, there is still the issue of uptake. A good idea that is not accepted is not worth much. Many consider this a simple matter of communication, or of creating the right reward (and punishment) incentives. Some dismiss the issue entirely on the grounds that if the idea is worthwhile, it will be adopted anyway.

It's not quite that simple, as Gawande points out. He points to the contrasting fates of anesthesia and hospital hygiene.

The use of anesthesia spread like lightning.
On October 16, 1846, at Massachusetts General Hospital, Morton administered his gas through an inhaler in the mouth of a young man undergoing the excision of a tumor in his jaw. The patient only muttered to himself in a semi-conscious state during the procedure. The following day, the gas left a woman, undergoing surgery to cut a large tumor from her upper arm, completely silent and motionless. When she woke, she said she had experienced nothing at all.

Four weeks later, on November 18th, Bigelow published his report on the discovery of “insensibility produced by inhalation” in the Boston Medical and Surgical Journal. Morton would not divulge the composition of the gas, which he called Letheon, because he had applied for a patent. But Bigelow reported that he smelled ether in it (ether was used as an ingredient in certain medical preparations), and that seems to have been enough. The idea spread like a contagion, travelling through letters, meetings, and periodicals. By mid-December, surgeons were administering ether to patients in Paris and London. By February, anesthesia had been used in almost all the capitals of Europe, and by June in most regions of the world.

There were forces of resistance, to be sure. Some people criticized anesthesia as a “needless luxury”; clergymen deplored its use to reduce pain during childbirth as a frustration of the Almighty’s designs. James Miller, a nineteenth-century Scottish surgeon who chronicled the advent of anesthesia, observed the opposition of elderly surgeons: “They closed their ears, shut their eyes, and folded their hands. . . . They had quite made up their minds that pain was a necessary evil, and must be endured.” Yet soon even the obstructors, “with a run, mounted behind—hurrahing and shouting with the best.” Within seven years, virtually every hospital in America and Britain had adopted the new discovery.
In contrast,
Sepsis—infection—was the other great scourge of surgery. It was the single biggest killer of surgical patients, claiming as many as half of those who underwent major operations, such as a repair of an open fracture or the amputation of a limb. Infection was so prevalent that suppuration—the discharge of pus from a surgical wound—was thought to be a necessary part of healing.

In the eighteen-sixties, the Edinburgh surgeon Joseph Lister read a paper by Louis Pasteur laying out his evidence that spoiling and fermentation were the consequence of microorganisms. Lister became convinced that the same process accounted for wound sepsis. Pasteur had observed that, besides filtration and the application of heat, exposure to certain chemicals could eliminate germs. Lister had read about the city of Carlisle’s success in using a small amount of carbolic acid to eliminate the odor of sewage, and reasoned that it was destroying germs. Maybe it could do the same in surgery.

During the next few years, he perfected ways to use carbolic acid for cleansing hands and wounds and destroying any germs that might enter the operating field. The result was strikingly lower rates of sepsis and death. You would have thought that, when he published his observations in a groundbreaking series of reports in The Lancet, in 1867, his antiseptic method would have spread as rapidly as anesthesia.

Far from it. The surgeon J. M. T. Finney recalled that, when he was a trainee at Massachusetts General Hospital two decades later, hand washing was still perfunctory. Surgeons soaked their instruments in carbolic acid, but they continued to operate in black frock coats stiffened with the blood and viscera of previous operations—the badge of a busy practice. Instead of using fresh gauze as sponges, they reused sea sponges without sterilizing them. It was a generation before Lister’s recommendations became routine and the next steps were taken toward the modern standard of asepsis—that is, entirely excluding germs from the surgical field, using heat-sterilized instruments and surgical teams clad in sterile gowns and gloves.
The key conclusion?
But technology and incentive programs are not enough. “Diffusion is essentially a social process through which people talking to people spread an innovation,” wrote Everett Rogers, the great scholar of how new ideas are communicated and spread. Mass media can introduce a new idea to people. But, Rogers showed, people follow the lead of other people they know and trust when they decide whether to take it up. Every change requires effort, and the decision to make that effort is a social process.

This is something that salespeople understand well. I once asked a pharmaceutical rep how he persuaded doctors—who are notoriously stubborn—to adopt a new medicine. Evidence is not remotely enough, he said, however strong a case you may have. You must also apply “the rule of seven touches.” Personally “touch” the doctors seven times, and they will come to know you; if they know you, they might trust you; and, if they trust you, they will change.
Read the whole thing.

