Posted on Sunday, March 24th, 2013 at 6:30 am
By Graham Mooney
In Tijuana, Mexico, 43-year-old tuberculosis patient Maria Melero takes her daily medicines at home while her health worker watches on Skype. Thirteen thousand kilometers away in New Delhi, India, Vishnu Maya visits a neighborhood health center to take her TB meds. A counselor uses a laptop to record Maya’s fingerprint electronically. An SMS is then sent to a centralized control center to confirm that Maya has received today’s dose.
The global burden of TB is massive and such episodes are repeated daily across the world. Although incidence and death rates have been falling in many regions—TB mortality has dropped by 41% since 1990—the WHO estimates that there were 8.7 million new cases in 2011. Deaths totaled 1.4 million people, 30% of whom were also HIV positive. Maria and Maya belong to a growing number of people who suffer from multidrug-resistant (MDR) TB, which is caused by organisms that are stubbornly unmoved by the most effective anti-TB drugs, isoniazid and rifampicin. In 2011 there were about half a million new MDR-TB cases, and 9% of these were extensively drug-resistant (XDR) TB, failing to respond even to second-line drugs such as amikacin, kanamycin, or capreomycin.
The remote and electronic surveillance of thousands of patients like Maria and Maya is part of an attempt to expand and enhance Directly Observed Therapy, Short-course (DOTS), whereby health care workers watch patients take their daily medications—for up to two years in the case of XDR-TB. DOTS, which undoubtedly has saved countless lives, was once believed to hold the key to eliminating the scourge of TB, but it has come in for a great deal of criticism in recent years. Its focus on patients who were easiest to treat, along with the mismanagement of DOTS in India and elsewhere, has been identified as a key reason for the rise of MDR-TB, XDR-TB, and most recently, totally drug-resistant (TDR) TB. Reports of TDR-TB have now been made from as far afield as Italy, Iran, South Africa, and India and have prompted responses in the west that paint an apocalyptic future of superbugs run amok.
The Indian cases of TDR-TB were first recounted by Zarir Udwadia of the Hinduja Hospital and Research Center in Mumbai. Dr Udwadia recently described TDR-TB in India as “an iatrogenic disease that represents a failure of physicians, public and private, and a failure of public health.” The idea that some illnesses are caused by medical activity isn’t a new one; nor is the notion that medical, pharmaceutical, and public health systems do a gross disservice to the world’s poor. What’s more, the triumphal claims that tuberculosis is a disease of the past and well within the realms of human control are now long gone, rightfully consigned to the dustbin labeled “medical hubris”. But if we consider the historical arc of tuberculosis prevention and therapy from the late nineteenth century onwards, Udwadia’s accusation signifies an important and necessary concession in the long-running blame game of patient “recalcitrance”, “non-compliance”, and “adherence”.
Medical and public health authorities frequently associated high rates of tuberculosis mortality and morbidity with immoral behaviors and vice. Well into the twentieth century, when some of the social determinants of tuberculosis—such as domestic crowding, under-nutrition, and occupational exposures—were acknowledged (if not fully understood or recognized by legislation), the reasons for heightened susceptibility were often explained away by deploying racial, ethnic, and class stereotypes. As many historical studies in the United States have shown, African-Americans, Chinese, Jews, eastern Europeans, and the poor were portrayed as incorrigible, ignorant, and incapable of acting on the advice of (mostly) white, middle-class public health professionals. Often these critiques were underwritten by eugenic arguments: tuberculosis was a disease that picked off the feeble-bodied, preying on a weak urban residuum that lacked inherited resistance. Such prejudicial attitudes were replicated across the world and were particularly prevalent in the colonized parts of the globe.
Before the wide availability of anti-tuberculosis drugs in the 1940s (the first, streptomycin, was discovered in 1943), treatment options for tuberculosis were limited and of questionable efficacy. Educational campaigns aimed to change behaviors, but lacked direct focus. The emergence of sanatoria in the late nineteenth century offered new possibilities that served two purposes: isolating sufferers from the rest of the community and instilling “rules for health” that centered on diet, exercise, rest, and plenty of fresh air. However, sanatoria only treated a minority of patients with TB and an extended period of sequestration was an unrealistic option for many a working-class family’s main breadwinner. Consequently, the lion’s share of tuberculosis management was home-based and sought to replicate the sanatorium regime as much as possible. Patients were relied upon to perform the rituals of taking their own temperature and weight, regulating diet, and scheduling periods of rest and activity. Placing the responsibility for success firmly in the hands of sufferers, doctors cautioned that this sort of domesticated “preventive therapy” would fail if the patient lacked will-power, strength of character, and an even temperament. As mentioned earlier, these traits were not thought to be abundant in the types of people who usually succumbed to tuberculosis!
Not surprisingly, long-standing tropes of patient delinquency found their way into criticisms about the early versions of supervised ambulatory therapies in the late 1950s and early 1960s. Yet technological interventions such as fluorescent tracers, urine testing, and radioactive pill dispensers that were designed to overcome perceived cultural barriers to taking medicines ended up undermining patient-doctor relationships with secrecy and mistrust. By the late 1960s, direct observation seemed to be a palpable necessity, though it took almost 30 years to become WHO policy in 1995.
It is somewhat ironic that just as the structural, social, and economic barriers to patient “non-compliance” are being taken seriously by the medical mainstream, the chief medical officer for England recently invoked a dystopian scenario of a drug-resistant bacterial rampage reminiscent of the early nineteenth century. If we do indeed spiral towards some sort of pre-antibiotic dark age, then one can only hope that prejudice, discrimination, and infection can be overcome in equal measure.
Graham Mooney co-edits the journal Social History of Medicine, which has released a virtual issue on tuberculosis to mark World Tuberculosis Day on Sunday March 24, 2013. He is an Assistant Professor at the Institute of the History of Medicine at Johns Hopkins University. His most recent publication is ‘The material consumptive: Domesticating the tuberculosis patient in Edwardian England’ in the Journal of Historical Geography (2013).
Social History of Medicine is concerned with all aspects of health, illness, and medical treatment in the past. It is committed to publishing work on the social history of medicine from a variety of disciplines. The journal offers its readers substantive and lively articles on a variety of themes, critical assessments of archives and sources, conference reports, up-to-date information on research in progress, a discussion point on topics of current controversy and concern, review articles, and wide-ranging book reviews.