Die Struktur des Blocks ist ein Müllhaufen. Mit Glück findet man aber doch noch die fehlenden Teile.
Dieses Stück ist vom Mai 2011:
https://web.archive.org/web/20111007123316/http://shambook.blogspot.com/2011_05_01_archive.html
S H A M b l o g
Exposing the scams, shams, and shames of modern life.
Monday, May 30, 2011
You are free to read this. Or not.
I'm taking a break from my customary weekend habit of posting the latest installment of "Placebo" in order to post a link to this blog item by Sam Harris. It perfectly expresses my views on the subject of free will (although it does not express all of my views on the ramifications of a determinist system of thought; in fact, Harris makes what I regard as a naive assumption about the implications of deterministic thinking for the justice system, etc.).
Anyway, please read this.
(P.S. Thank you, Hilary.)
© Copyright by Steve Salerno at 5:21 PM 2 comments
Labels: determinism, free will
Friday, May 27, 2011
Still at the Center of controversy.
A few days ago I received from reader Stuart Anderson* the following email about the Midwest Center for Stress & Anxiety:
Thanks for your blog. I just read it and had to add to the nonsense. My gullible wife ordered this program from a TV spot a week ago. She is now in the process of trying to obtain a return authorization from the company to send the stuff back to them. They are pulling out all sorts of tricks to avoid having to accept the return – past the trial period, etc., even though it is all bogus. My wife had to have her credit card company block all their charge attempts, even though they were not authorized to charge any more than the shipping charges. Sounds like a reputable company to me.
As the sarcasm of Anderson's parting shot makes clear, some corporate leopards may give their web site a major overhaul, but they seldom change their (ad) spots. More evidence: The comments continue to trickle in on my long-ago posts from April 3, 2007 and July 29, 2008.
Indeed, the gamesmanship is as constant as the twisted smirk that seems forever pasted on founder Cindy Bassett's face.
* Once again here, I cannot vouch for the accuracy of the statements made by Anderson. I publish his email because it seems contextually similar to many complaints that I have received, and read, with regard to the Center.
© Copyright by Steve Salerno at 7:15 AM 3 comments
Labels: Lucinda Bassett, Midwest Center
Monday, May 23, 2011
Call it...a RoarShock test?
Last night as I was getting on a narrow entrance ramp after work, I was passed by a gargantuan, lifted, customized Hummer; the only things missing were the gun turrets. Actually, "passed" is far too mild a term for what transpired. The driver of the Hummer, concealed behind his not-street-legal blacked-out windows, clearly thought that I and my little Acura were trespassing in a sector of air and asphalt that properly belonged to him, so he asserted his claim. My sole option was to veer off to the right shoulder. ... OK, I suppose it's possible that I could've held my ground and just let him pass over me, but I didn't want to chance it.
A thought occurred.
In this society we have evolved all sorts of complicated and elaborate psychological profiling designed to uncover mental pathology. I propose a much simpler test: Does the person drive a Hummer? Especially the big, tricked-out one? That's all the evidence I need.
Now, I'm not saying that only people who drive Hummers are unbalanced. We must be mindful here of the old logical fallacy: The fact that psychos drive Hummers doesn't mean that Hummer drivers are the only psychos. Some unbalanced people (notably the ones with limited funds or bad credit) drive other cars. But I think it's safe to regard Hummer ownership as confirming proof of mental/emotional dysfunction. Only a true asshole among assholes would accouter himself with so audacious, so unapologetically grandiose a monstrosity.
After all, what does the Hummer do well besides offend sense and sensibility? For starters, it has always ranked among the most poorly made "luxury" vehicles in America, with an appalling number of initial-quality and reliability defects. Gas mileage? The most popular 8-cylinder version musters 14 mpg combined. Its handling characteristics are roughly equivalent to a Winnebago's, with skidpad marks ("lateral acceleration") in the laughable .60 range. (Come to think of it, maybe that's why the guy last night almost careened into me; he couldn't control the damn thing.) And for all its bulk and bravado, the Hummer isn't even that safe, at least not historically.
Oh, and guess who's widely "credited" with buying the very first one in the U.S.? Ah-nold. I rest my case.
In all seriousness, if you drive a Hummer, I'd like to hear from you. Tell me what motivated your purchase. I'd really like to know.
© Copyright by Steve Salerno at 10:02 PM 1 comments
Labels: mores, role models
Sunday, May 22, 2011
Placebo: How a sugar pill became a poison pill. Part 9 of a continuing saga...
Read Part 8.
In 1921, amid the early tumult of prohibition, a remarkable study took shape in Palo Alto, California. Stanford psychologist Lewis Madison Terman—as serious-looking a man as one is apt to find, with his specs, upright bearing and unsmiling mien—would one day be remembered mostly for designing and publishing the final accepted version of the Stanford-Binet IQ test.
In '21, however, Terman began work on another project that may have more lasting import for humankind, despite being known today to just a small circle of “longevity wonks.” Terman proposed to track the lives of 1528 American children from that point on. His subjects, encountered in the course of his study of intelligence, were all 10 years old. Terman himself was 44; he would follow them and their families for the rest of his life, and he obtained from his younger associates a pledge to do the same after he was gone. The goal was to note what kind of longevity the 10-year-olds achieved, and try to deduce the reasons why.
Terman recognized that scientists of that era—doctors in particular—associated longevity first with healthcare, then with nutrition and other environmental factors. Given his background, Terman understandably wondered how certain other factors might come into play: If a person is happy, does that bode for greater longevity? What about marriage? A stable circle of friends? Religious faith? Job satisfaction?
Although Terman died in 1956, his project went on to enjoy enviable longevity in its own right. Over the ensuing five decades, the Terman Life Cycle Study grew richer and more nuanced, piquing the interest of some of America's foremost psychologists, epidemiologists and gerontologists. Their conclusions about the so-called Terman Cohort challenge many popular beliefs about longevity. For instance, the workaholics in the sample often lived longer than the slackers; neurotic men in particular outlived their more laid-back counterparts. Unhappy individuals logged about as many years as the happy ones—and downright cheerful people could expect to end up on the wrong side of the longevity curve.
The notion that longevity has more to do with who we are than how we take care of ourselves is also legible in new research suggesting that lifespan may all be in the cards—or the genes. A controversial 2010 study reported in Science posits that in lesser species, genes determine variations in longevity both from species to species and within any given species. The study's authors inferred that by fully cracking the genetic code of a species or a representative individual, they could predict longevity with an accuracy of 77 percent.
Despite more objections from the AMA—rooted, one supposes, in the way the study implicitly marginalizes its members' efforts—the National Institute on Aging has agreed to fund this avenue of inquiry going forward. Wouldn't it be something if, among all factors that bode for long life, medicine turns out to be relatively unimportant....
In the meantime, and for all the progress since Hippocrates, human longevity remains a riddle, a moving target full of butterfly effects, unintended consequences and environmental variables that can't always be predicted, accounted for, or separated out. This much we do know: While people in 2011 may be dying of different things than people in 1911—whooping cough then, pancreatic cancer now—a lot of them are dying at the same approximate ages. “Over the past two thousand years there's been a lot of snake oil, a lot of claims made without any real impact on human lifespan,” says San Francisco biochemist Simon Melov, one of America's leading authorities on anti-aging. “Hygiene has changed tremendously, infant mortality has changed tremendously—but intervention in the basic aging process? It hasn't really changed all that much in two millennia.”
None of which stops the Franz Humers and Denton Cooleys (whom we met in earlier installments of this series) from selling their visions of perpetual life through chemistry or surgery. They realize that they make a mockery of the scientific method when they spin their syrupy vignettes about the nursing home Nana who's sitting in front of a cake with 100 candles on it thanks to some pill or procedure. They realize that a few cases here and there do not a trend-line or truism make; that there were plenty of instances of exceptional longevity centuries before CNN was on-hand to cover them. For if CNN had existed back in, say, 1635, surely some ancestral Anderson Cooper would've been camped out at Westminster Abbey to cover the big news that November: the demise of a man who'd attributed his stunning longevity not to regular visits with Elizabethan-era physicians, but to a regimen of “green cheese, onions, coarse bread, buttermilk or mild ale (cider on special occasions) and no smoking.” So said one Thomas Parr, buried at the Abbey by order of England's King Charles I upon Parr's death at age 152.*
To be continued...
* It must be said that some consider the story of "Old Tom Parr" apocryphal: one of mankind's earliest urban legends. Nonetheless, there are also respected sources who swear by it. In any case, it makes an anecdotal point that is increasingly borne out by more substantial data. And it was good enough for the whiskey manufacturers who decided to name their aged blend in tribute to him.
© Copyright by Steve Salerno at 6:06 AM 0 comments
Labels: medicine
Thursday, May 19, 2011
Nightmare on I-476.
It's been a long time since I've blogged just to blog. (That doesn't count my ongoing series on the folly of modern medicine, which is really in a different category, at least to my eye.) And I'd be the first to admit, it's not like there's been this mass outcry for me to resume my regular SHAMblog activities. ("Oh Steve, oh Steve, please come back and enlighten us once more...!")
But this evening, as I was making the hour-plus commute home from where I'm working these days, something I heard on the radio so shook me that it almost sent me over the divider on the northbound turnpike extension and into oncoming traffic. Actually, it was a couple of somethings, which I heard in succession as my radio scanned from station to station. First I heard theater critic-cum-conservative talk-show host Michael Medved say of former president George W. Bush, without apparent irony, this:
"I think history will look back on Bush as a fine president...even near-great."
Not long thereafter I heard former Reagan political functionary-cum-blowhard Mark Levin (he pronounces it Lev-IN) describe Barack Obama as an "idiot."
Somehow I righted the car and made it home safely.
Dubya and "near-great" in the same sentence? Medved is saying this of the man whose greatest accomplishment while in office was managing not to get his tongue stuck in one of the White House drawers? The man who got pissed off over 9/11 and promptly invaded a country that had nothing to do with it, thereby incurring trillion-dollar costs that may haunt us for generations to come? The man who turned government into an oligarchical ATM for his country club pals? The man who ultimately sundered the U.S. economy? The man who wasn't even a conservative's conservative? NEAR-GREAT??
Then we have Obama/"idiot." I realize that Levin was being figurative; he wasn't factually implying that Barack Obama's IQ is below the generally accepted "idiot threshold" of 30. He was using the term to derogate the policies espoused by the current occupant of the Oval Office. Still...these people lined up behind the aforementioned Dubya, then went that one better by getting all gooey over Sarah Palin (and later Christine O'Donnell, a candidate who launched her campaign by denying that she's a witch)...and Barack is an idiot?
This made me realize that there's no true hope for meaningful engagement or even mere dialogue anymore. Not when people who support a given ideology can become so myopic and rabid about The Cause that they say things like I heard on the way home tonight.
Maybe the world is indeed ending this Saturday. And you know, maybe I'm OK with that.
© Copyright by Steve Salerno at 7:36 PM 5 comments
Labels: demagoguery, medicine, Obama, politics
Sunday, May 15, 2011
Placebo: How a sugar pill became a poison pill. Part 8 of a continuing saga...
Read Part 7.
Healthcare apologists insist that medicine's true impact on longevity has been blurred by “lifestyle issues” that, in recent decades, worked to offset those gains. They note, for one thing, that up to a third of adults now meet the clinical definition for obesity. That line of argument slyly ignores the many positive ambient changes in the American way of life that should have produced even more robust longevity numbers than we now see. Among them:
► A cleaner environment. Amid the latter-day carping by environmentalists about vehicular emissions and greenhouse gases, it's easy to forget the challenge posed by mere breathing at the dawn of the Industrial Revolution. Pollution-control insiders divide the history of American environmental management into four periods, the first of which, 1900-1950, is tellingly labeled “the smoke era.” American city dwellers gazed up into a sky so clogged with soot that in theory a keen-eyed observer could count the particulate matter per cubic foot as if in some bizarro variant of bird-watching:
“Soot-filled industrial cities of the East and Midwest blackened skies in the early part of the 20th century. Emissions were first detected, and regulated, by sight [emphasis added]... By 1950, visible emissions from many industrial sources were controlled...and the effects of different air pollutants on health were being discovered.”
—from “Will the Circle Be Broken? A History of the U.S. Ambient Air Quality Standards,” in a 2007 journal put out by the Air & Waste Management Association
Over the second half of the 20th Century, many erstwhile steel or coal towns, formerly blighted by industrial pollutants, made a spectacular resurgence and today uphold high standards for clean air and water.
► Ever-improving rules governing occupational health and safety. The 1970 advent of both OSHA (Occupational Safety and Health Administration) and MSHA (Mine Safety and Health Administration), plus independent rating services like the International Organization for Standardization, has wrought a wholesale rethinking of how factories and warehouses operate. The Economic History Association portrays the U.S. as an “unusually dangerous” place to work prior to the 1930s, describing early factories as “extraordinarily risky by modern standards”—though still safer than railroads or mines. On-the-job fatalities nowadays are a fraction of what they were as recently as the 1950s. Far fewer Americans work with the toxic substances—like benzene, asbestos, and trichloroethylene—that gave rise to so many “unexplained” cancers during the 1960s and 1970s.
► Better insights about proper nutrition. Americans may eat too often at McDonald's and KFC, but in general, many more of them than in days past make a good-faith effort to work fruits, vegetables, vitamins, fiber and other desirable nutrients into their families' diets. This trio of factors alone could be expected to help Americans live longer even if no one ever went to a doctor for anything.
And, of course—the elephant in the room—there's the significant decline in cigarette smoking since the days when Gerald Ford was clanking golf balls and banging his head on airplane doors. As the 1970s drew to a close, 40 percent of Americans regularly used tobacco products. The number had dropped to 24 percent by 2009. Cultural taboos and legislated constraints on where that remaining 24 percent lights up have materially reduced the exposure of the rest of us to second-hand smoke: Some parents may still smoke three packs a day, but they don't usually do it in their cars with their kids present. Coworkers, air travelers and diners can go about their business without having cigarette smoke waft over them from an adjacent cubicle, seat or booth.
Medicine's true impact on the battle against top killers like cancer and heart disease cannot be properly reckoned without taking such (literal) atmospherics into account. As was the case a century ago with TB, any ostensible successes may have less to do with medical intervention and more to do with the broader context in which those interventions are taking place.
But then, all of this sound and fury, all of this bickering over contextual variables and decimal places, may be just a diverting numbers game that misses the larger point by failing to address the real-world consequences of those few added years for the people who must live them. Over the objections of the AMA and other groups representing healthcare interests, the World Health Organization has begun publishing an alternate longevity statistic, Health-Adjusted Life Expectancy (HALE). The HALE index suggests that if medicine is adding years of life, it is also adding years of pain and disability—at a roughly one-to-one ratio. Today's “increased lifespan” too often becomes a nonstop dirge of repetitive surgeries, the suffering of chemotherapy and radiation, the embarrassment and despair of incontinence and/or impotence. In too many cases, death, when it comes, is merciful.
Deeply troubled by that prospect, Dr. William J. Hall of the esteemed Highland Hospital Center for Healthy Aging in Rochester, New York, observed in a 2008 issue of The Archives of Internal Medicine: “Longevity is a Pyrrhic victory if those additional years are characterized by inexorable morbidity from chronic illness, frailty-associated disability and increasingly lowered quality of life.”
To be continued...
© Copyright by Steve Salerno at 5:54 AM 5 comments
Labels: medicine
Sunday, May 08, 2011
Placebo: How a sugar pill became a poison pill. Part 7 of a continuing saga...
Read Part 6.
Despite such subsequent advancements as the first influenza vaccine (1945), the first open heart surgery (1952), the first kidney transplant (1954), and the World Health Organization's official (if premature) declaration of the defeat of smallpox (1980), U.S. death rates remained remarkably level over the 60-year period between 1948 and 2007: 9.9-per-1000 in the earliest year, 8.0 in the final one. Adjusting for some of the plague-like factors that wiped out mass numbers of Americans in the bad old days, the longevity enhancements of the current era are shockingly modest.
Put simply, not that many adults are living that much longer than in years gone by. During the Civil War era, a 70-year-old man could expect to live to 80. In 1950, that same 70-year-old man could expect to live to—80. No measurable gain in a full century of medical progress!
Little has changed since, either, in spite of an endless array of pharmaceutical therapies, an aggressive, multifocal counterattack on cancer, and the myriad socially entrenched insights about proper health maintenance. Further, during the past half-century society has witnessed the proliferation of health-insurance plans that put these innovations within financial reach of most Americans. Nevertheless, the longevity of the average 70-year-old has increased by about 3.5 years over what it was when John F. Kennedy took office.
If this multigenerational parity still seems ludicrous on its face, consider the Founders. Washington died at 67, a bit young by present standards, but Franklin and Madison were 84 and 85 at their deaths. Jefferson died at 83, poetically on the same day, July 4, 1826, as his dear friend, John Adams, who was 90. Adams' son, John Quincy Adams, reached 80. Samuel Adams was 81. Andrew Jackson was 78. James Monroe attained 73. John Jay, 84. Hamilton died at 49—in the infamous duel with Aaron Burr, who lived to see 80. We can go even farther back. In her piece, “Dead at 40,” Carolyn Freeman Travers, research manager of the Plimoth* Plantation restoration, cites the supposition of modest life expectancy as one of “several common pieces of misinformation/mistaken beliefs about people in the past.” Of Massachusetts' Andover settlement she writes, “Circumstances evidently combined to encourage a high birth rate and an exceptionally low death rate, a combination which produced a population that grew at a rapid pace.” Citing the research of historian Philip Greven, Travers continues, “The average age of twenty-nine first-generation men at the time of their deaths was 71.8 years, and the average age at death of twenty first-generation wives was 70.8 years.”
The spectacularity of these trends was not lost on contemporaries. In 1644, William Bradford, long-time governor of the Plymouth Colony, wrote, “I cannot but here take occasion not only to mention but greatly to admire the marvelous providence of God! That notwithstanding the many changes and hardships that these people went through, and the many enemies they had and difficulties they met withal, that so many of them should live to a very old age!”
BY FAR THE most important variable in the science of longevity is the reckoning of individuals who become a death statistic at birth or soon after. No single factor has more decisively swayed the pendulum, for better or worse, than infant mortality.
Turn-of-the-Century America was an inhospitable place for newborns. In several American cities up to 30 percent of babies died before taking their first steps, and as you might imagine, some of the individual stories from this era are grievously tragic. One such story concerns the Coswells of Illinois. Between 1894 and 1907, Mary Coswell delivered no fewer than five stillborn children. With the fifth birth, Mary herself died. Another contemporaneous account mentions (but does not name) a husband and wife who, after the woman's second stillbirth, went to a nearby overlook and in front of horrified bystanders, joined hands and leaped to their death.
It's hard to overstate infant mortality's impact on longevity numbers from the early 20th Century. In 1900, a male child at birth had a life expectancy of about 48 years—but if he survived to age 1, his remaining life expectancy jumped instantly to 54 years. That gain represents the “write-off” of first-year mortality: With the appalling toll in infant deaths now shunted back to an earlier data set, the rest of the cohort “gains” an instant six-year longevity benefit (in much the same way that grading on a curve lops off the lowest marks and allows the arithmetic mean to rise commensurately). Over the ensuing decades the first-year gap narrowed, then disappeared. By 1980, that first completed year subtracted from remaining life expectancy, just as all subsequent years do.
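For the numerically inclined, the "write-off" effect can be sketched in a few lines of Python. The 13 percent first-year mortality fraction below is my own illustrative assumption (chosen so the toy numbers land near the 48-to-54 jump quoted above), not a figure from the actual 1900 life table:

```python
# Back-of-envelope model of the first-year "write-off" described above.
# Assumption (mine, for illustration): ~13% of the 1900 cohort died in
# infancy, at approximately age zero.

IMR = 0.13        # assumed first-year mortality fraction
E0 = 48.0         # life expectancy at birth, 1900 males (from the text)

# If a fraction q of the cohort dies at ~age 0 and the survivors die at a
# mean age m, then E0 = (1 - q) * m + q * 0, so m = E0 / (1 - q).
mean_age_of_survivors = E0 / (1 - IMR)
remaining_at_age_1 = mean_age_of_survivors - 1

print(f"mean age at death among survivors of infancy: {mean_age_of_survivors:.1f}")
print(f"remaining life expectancy at age 1: {remaining_at_age_1:.1f} years")
```

Shunting the infant deaths back to an earlier data set lifts the survivors' mean age at death to roughly 55, i.e., about 54 remaining years at age 1, which is the six-year "gain" the curve-grading analogy describes.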
Here's another way of looking at it. In 1920, when life expectancy at birth was a shade above 56 years, the infant-mortality rate stood at 85.8 deaths per 1000 live births, or 171,000 infant deaths in total; few of those aforementioned tenements remained untouched. By 2000, life expectancy at birth had risen to an even 77, and the infant-mortality rate had dropped to 6.9 per 1000 live births, or 28,000 infant deaths nationwide. Had 1920's rate of infant mortality still applied in 2000, the total number of infant deaths that year would've skyrocketed to well over 300,000. Those additional deaths at “age zero” would've chopped seven full years off 2000's overall life expectancy of 77.
And it gets worse. In 1920, the maternal death rate—representing women, like poor Mary Coswell, who died of pregnancy-related complications or during childbirth—was just under eight women for every 1000 births. By 2000 that grim statistic had been sliced to near-nonexistence: just one woman for every 100,000 births.
But again, let's assume 1920's death rate still applied in 2000, and that each of the 4 million births that year represented one mother (i.e. leaving out the nominal statistical impact of twins and other multiples). That simple exercise in “what if?” adds some 30,000 maternal deaths to our hypothetical mix. Inasmuch as the age of the typical American woman giving birth is 25, those 30,000 premature deaths lop another full year off overall longevity figures.
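The "what if?" arithmetic above is easy to reproduce. Here is a minimal sketch using only the round figures quoted in these paragraphs (4 million births, the two infant-mortality rates, a 77-year life expectancy) and ignoring the refinements of a real life table:

```python
# Crude cohort-averaging version of the infant-mortality "what if?" above.
# All inputs are the figures quoted in the text; the averaging logic is a
# simplification (a real life table recomputes survival at every age).

BIRTHS_2000 = 4_000_000          # approximate U.S. live births in 2000
IMR_1920 = 85.8 / 1000           # 1920 infant deaths per live birth
IMR_2000 = 6.9 / 1000            # 2000 infant deaths per live birth
LIFE_EXP_2000 = 77.0             # life expectancy at birth, 2000

# Infant deaths under each regime
extra_infant_deaths = BIRTHS_2000 * (IMR_1920 - IMR_2000)

# Each extra death at "age zero" trades a roughly average lifespan for
# zero years, so the cohort mean falls by (extra share) * mean lifespan.
drop_years = (extra_infant_deaths / BIRTHS_2000) * LIFE_EXP_2000

print(f"extra infant deaths: {extra_infant_deaths:,.0f}")
print(f"approx. drop in life expectancy at birth: {drop_years:.1f} years")
```

This crude version lands at roughly six years of lost life expectancy; the "seven full years" in the text presumably reflects a fuller life-table calculation, but the ballpark is the same.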
In 1912, President Taft signed a bill that ordered the creation of the Children's Bureau, which embraced as its charter mandate a full-out assault on the nation's alarming rate of infant mortality. Over the next decade that goal drew on the wisdom of the finest minds in public health, clinical medicine and social welfare. This tandem effort at first centered on the sanitary processing and handling of milk, then shifted its emphasis to other areas of hygiene and education, then took up the matter of promoting comprehensive infant- and maternal-welfare services. These initiatives wrought a sea change in the medical and cultural view of childbirth: from a historical model of care that kicked in only after delivery, to a comprehensive program of prenatal mentoring and monitoring.
The dividends began to show themselves almost immediately. Although healthcare indisputably played a supporting role in extending the lives of countless infants from that period, the bulk of the care was preventive, not interventional. It was seldom a case of “treating” newborns who fell ill. Rather, the goal was to prevent newborns from falling ill to begin with.
In any case, by the time World War II GIs returned home and put down roots, the major victories in this omnibus war on infant mortality had been won. Thus America's all-important triumph over infant death was achieved in large part without today's costly “miracles of modern medicine.” Sonograms and fetal heart monitors—now deemed obligatory elements of a proper prenatal regimen—weren't invented till the late 1950s, and wouldn't come into general usage for two decades more. It's fair to say that their impact on infant mortality has been negligible. Indeed, if one wanted to be a curmudgeon, one might point out that in recent years, America's infant-mortality rate has crept back up slightly—this, in an era when expectant mothers can avail themselves of a panoply of health services that their forebears from Model T America could not have imagined.
* She uses the Colonial spelling.
© Copyright by Steve Salerno at 7:04 AM 1 comments
Labels: medicine
Monday, May 02, 2011
Placebo: How a sugar pill became a poison pill. Part 6 of a continuing saga...
Read Part 5.
This is where you might expect journalists to “fact check” the self-serving fluff and, as necessary, set the record straight. Regrettably, the climate of arm's-length detachment that should separate reporters from their sources does not apply in medical journalism. Health reporters tend to be in the thrall of celebrity doctors and research scientists to begin with, and undertake little true investigative journalism that isn't spoon-fed to them by the rare healthcare dissident or some crusading personal-injury lawyer. News reports on major healthcare scandals—drugs that kill people, doctors whose subpar skills have invited scads of malpractice lawsuits—are framed as aberrations, departures from the norm. “NEW MEDICAL BREAKTHROUGH!” constitutes one of the timeless feel-good themes that the media rely on to leaven the otherwise-relentless onslaught of bad news. And an improvement in longevity is the supreme feel-good story.
No less a media heavyweight than Time, commenting on life-expectancy figures released by the CDC in December 2009, uncritically repeated researchers' contentions that “improvements in life expectancy are largely due to improvements in reducing and treating heart disease, stroke, cancer, and chronic lower respiratory diseases.” The merest peek behind the curtains would've resulted in a very different headline.
On paper, the upswing in American longevity since 1900 is difficult to ignore: about 49 for both genders then versus about 78 for both genders now, an apparent gain of nearly three full decades. But this striking then-and-now statistical juxtaposition has been framed in the public dialogue as if mass numbers of Will Rogers' contemporaries suddenly keeled over on their 49th birthdays. Nothing could be farther from the truth. Seldom has a data set been more deceiving or a statistical “fact” more spurious. The credulous reporting of those “facts” bespeaks a woeful misunderstanding of the concept of life expectancy.
Most laypeople (and too many journalists with a rudimentary knowledge of the health beat) unthinkingly use the terms longevity and life expectancy interchangeably. They regard the entire subject as a one-dimensional computation that yields a single fixed number—that number being the age at which an adult can expect to die: “Well, I'm a 74-year-old man, and male life expectancy is 75, so if there's anybody I've always wanted to tell off, I've got one year to do it!” Not so. Life expectancy—as the term is used by scientists, demographers, actuaries and allied professionals—is a sliding scale. Somewhat like a GPS navigational system that recalculates your route if you miss a turn, life-expectancy tables recompute your odds of dying at each new age plateau you attain. That new calculation is made based on the average number of additional years of life logged by others who have reached the same plateau. In scientific and actuarial circles, this is known more specifically as “life expectancy by age.” Among other things, it's the primary basis for life-insurance underwriting.
When the media and general public make casual reference to longevity, they actually mean “life expectancy at birth”: the average final age attained by all members of a given universe born in a given year, encompassing everyone from that rare centenarian in the nursing home down the street to babies who barely managed to take their first breaths before dying. Projections of future life expectancy are based on observed experience as that entire data universe inches forward a year at a time. The current figure for life expectancy at birth is 75.3 years for men and 80.4 years for women, which resolves to 77.9 years. In no way, however, does this imply that a man who actually reaches age 75 should spend his birthday shopping for caskets and lining up a favored eulogist. In actuarial terms, a male who attains that milestone today has a life expectancy of an additional 10.8 years.
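The "sliding scale" idea is simple enough to demonstrate with a toy cohort. In the sketch below the ages at death are invented for illustration; the point is only that expectancy at birth and expectancy at 75 answer two different questions:

```python
# Toy illustration of "life expectancy by age": given ages at death for a
# cohort, expectancy at age x is the mean of (age - x) among those who
# reached x. The cohort below is invented purely for illustration.

def life_expectancy_at(ages_at_death, x):
    """Mean remaining years at age x, among cohort members who reached x."""
    survivors = [a for a in ages_at_death if a >= x]
    return sum(a - x for a in survivors) / len(survivors)

# One infant death drags down expectancy at birth, yet those who make it
# to 75 can still expect years beyond the cohort's "average" lifespan.
cohort = [0, 62, 70, 74, 78, 81, 84, 88, 93]

print(life_expectancy_at(cohort, 0))    # expectancy at birth: 70.0
print(life_expectancy_at(cohort, 75))   # remaining years at 75: 9.8
```

Note the same pattern as the real figures quoted above: the cohort's expectancy at birth is 70, yet a member who reaches 75 still has nearly a decade in front of him, GPS-style recalculation in action.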
To no small degree, as the 19th Century gave way to the 20th, life expectancy was tied to one's luck at avoiding a trio of infectious diseases that stalked and killed with impunity. In 1900, the combined U.S. death rate from tuberculosis, flu and pneumonia was 396.6 per 100,000 population. (To put that in context, the current death rate from all cancers combined is 200 per 100,000 population.) TB alone claimed 194 lives per 100,000. The disease was commonly called “consumption” for its profoundly debilitating effect on late-stage victims, who appeared to waste away as if being consumed from the inside out. So severe was the panic surrounding TB that dedicated sanatoriums sprang up on the outskirts of dozens of cities for the express purpose of “treating”—that is, quarantining and warehousing—victims of the highly contagious killer.
So final a death sentence was a diagnosis of tuberculosis thought to be that doctors at these sanatoriums felt they had little to lose by attempting surgical interventions which, for sheer barbarism, rivaled anything thought up a few decades later by Josef Mengele. In the most gruesome of these, doctors would remove a patient's rib cage and encircling musculature on the theory that excising these “obstructions” might literally give a patient more breathing room. Such measures only inflicted horrific pain and in most cases hastened death.
Life in turn-of-the-century America was also marred by “slate-wiper” pandemics such as the Great Flu of 1918-1919, and random but regular outbreaks of polio and diphtheria, the latter disease one of the most feared blights among children prior to the 1930s. Terrified mothers kept their kids indoors after school, or kept them home from school altogether. The mere rumor of a child up the street who'd fallen ill was enough to drop attendance in city schools to levels that render modern America's worst truancy problems hardly worth discussing. Birthday parties ended the moment a child coughed or complained of feeling unwell. And yet here's the thing: The most dramatic breakthrough in the bedrock measure of U.S. mortality, deaths per 1000 population, happened between 1900 and 1930—when nothing dramatic at all was happening in medicine to account for it. In the earlier of the two years the Grim Reaper was frightfully busy, targeting 17.2 of every 1000 Americans; in a typical tenement of the sort that had begun to crowd the skyline of lower Manhattan by 1900 (as depicted in films like Hester Street and Godfather II), residents attended five or six funerals a year. By the latter year the Reaper was far less busy, at 11.3 deaths per 1000, largely because the cumulative toll from the aforementioned “big three” of TB, flu and pneumonia had plummeted by more than half, from 396 to 173 deaths per 100,000. That precipitous drop, it's clear, had little to do with medicine and everything to do with a massive public-awareness campaign emphasizing nutrition, sanitation and personal hygiene. After all, the gold-standard TB-zapper streptomycin was still years removed from being isolated (1943), doctors treating pneumonia would not have penicillin in their arsenals till after World War II, and human trials of Salk's polio vaccine would not commence until 1954. So it's not that the lifespan of homo sapiens Americanus was magically extended by onrushing healthcare know-how.
It's more that the unfortunate background circumstances that skewed the stats, leading to preposterously conservative assumptions about the limits of longevity, began to remit on their own.
If fewer people died, it was mostly because fewer people got sick to begin with.
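The arithmetic behind those rates is easy to check. A minimal sketch, using the rates quoted above; the tenement of 300 residents is an assumed size, purely for illustration:

```python
# Mortality figures quoted in the text
overall_1900 = 17.2    # deaths per 1,000 population, 1900
overall_1930 = 11.3    # deaths per 1,000 population, 1930
big3_1900 = 396.6      # TB + flu + pneumonia deaths per 100,000, 1900
big3_1930 = 173.0      # same causes, 1930

# The "big three" toll did fall by more than half between 1900 and 1930
big3_drop = 1 - big3_1930 / big3_1900
print(f"big-three decline: {big3_drop:.0%}")

# And at the 1900 overall rate, a tenement housing ~300 people
# (an assumed figure, for illustration) would see roughly five
# deaths -- five or six funerals -- in a typical year.
residents = 300
print(round(residents * overall_1900 / 1000, 1))  # ≈ 5.2 deaths/year
```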
To be continued...
© Copyright by Steve Salerno at 6:28 AM 0 comments
Labels: medicine
guru watch
I make no pretense of fairness or objectivity here. As I've said with regard...
Breatharian Institute of America
Byron Katie
Character Training Institute (Bill Gothard)
Dahn Yoga Centers
Eckhart Tolle
Educo Mind Power
Excellerated Business School/"Money & You"
Fellowship of Friends
Gabriel of Urantia (nee Tony of Pittsburgh)
I Can Do It! conferences (Louise Hay)
Impact Trainings
Instinct-Based Medicine (Leonard Coldwell)
James Arthur Ray
Joe Vitale (1)
Joe Vitale (2)
Kevin Trudeau
Landmark Forum (1)
Landmark Forum (2)
Lifespring
Lucinda Bassett/Midwest Center
Mankind Project
Maverick Money Makers
Melanie Tonia Evans
Millionaire Mind
NXIVM (Executive Success Programs)
Rapport Leadership International
Sterling Institute of Relationship
ThinkLove (www.thinklove.com)
Tony Robbins
Turning Point/People Knowhow
useful links
Steve's site: corporate/academic consulting
More on SHAM, including sample chapter
Featured works
Steve's resume
Amazon listing for SHAM
Interview in American Spectator
Skeptic magazine
Quackwatch
FTC (with links to fraud complaint forms)
Whirled Musings
The Straight Dope
Salty Droid
© Copyright 2005-2011 by Steve Salerno. All rights reserved. Except for material used in accordance with fair use guidelines, this blog may not be reproduced in any form, by any technological means, without the express written consent of Steve Salerno.
[*/quote*]