Clara V. was mopping the floor when her heart stopped. The paramedics arrived 12 minutes later and restarted her heart, but she would never wake up again.

When I became her intensive care unit (ICU) doctor four weeks later, Clara’s lungs were still inflating and deflating to the rhythm of the mechanical ventilator. Her blood oxygen level read normal. But she would never again recognize her family or know that she was alive. Her brain had been starved of oxygen for too long. She would never eat, smile or move her arm to scratch her face.

The neurologists had stopped making rounds on her: “Prognosis extremely poor for significant clinical improvement,” read their sixth and final note. “We will sign off for now. Please don’t hesitate to call us back if anything changes.”

The day before I was to take over her care, my colleague had scheduled her for surgeries to sew permanent tubes into her stomach and neck for artificial feeding and breathing support. With these tubes, she’d be able to live for months or even years in this vegetative state. Her family saw the operations as hopeful signs, not as chains keeping her tethered to her deathbed. And no one had told them otherwise.

In the “olden days,” death was a fact of life. People planned for it, experienced it close-up when loved ones died, saw bodies laid out in their homes for neighbors to pay respects.

But with the advent of the ICU and its life-prolonging tools in the early 20th century, the reality of being able to save lives on the battlefield or the polio ward morphed into the fantasy of being able to defeat all death. Numerous tributaries of medical specialization developed, with accompanying pharmacopeia and devices, for ensuing generations of doctors to explore. Today, there is always something else to do, another treatment to try. (…)
