Thursday, December 30, 2010

Twenty Books I Read in 2010 That Really Influenced My Thinking

1. Sarah Bakewell, How to Live: Or A Life of Montaigne. This book is the best introduction in English to Michel Eyquem de Montaigne, the man who retired in 1571 to his Bordeaux estate to make wine and invent the modern essay. I had read the essays in college, but Bakewell inspired me to re-read them, which I am now doing. Montaigne, who lived through a time of almost constant civil war between Protestants and Catholics, is wise, and his advice, such as "question everything," still works today.

2. Kathryn Schulz, Being Wrong: Adventures in the Margin of Error. Schulz really made me re-think how I view being wrong and failure. This book is hilarious and well written, but ultimately quite serious. To read a blog post inspired by this book, go to http://ow.ly/3w5VC

3. Siddhartha Mukherjee, The Emperor of All Maladies: A Biography of Cancer. Both a biography of cancer and the story of an oncologist's training, this book is graceful, elegant, and well written. I am in awe.

4. David Blumenthal and James A. Morone, The Heart of Power: Health and Politics in the Oval Office. From FDR to George W. Bush, health care’s role in the office of the president and in American politics is dissected and analyzed. Reading this great book helps one keep the day-to-day ups and downs of health care reform in perspective. Partisan fighting and unfair attacks are nothing new when it comes to this divisive issue. Yes, David Blumenthal is the same David Blumenthal who runs the ONC under HHS.

5. John E. Wennberg, Tracking Medicine: A Researcher’s Quest to Understand Health Care. The founder of the Dartmouth Atlas and its approach makes a sensible and compelling argument for shared decision-making as the only way to decrease per-capita cost and increase quality. Using University of California data, he also refutes the UCLA argument. (http://ow.ly/3w6aL)

6. Jeffrey J. Kripal, Esalen: America and the Religion of No Religion. The J. Newton Rayzor Professor and chair of the Department of Religious Studies at Rice University not only tells the Esalen tale complete with characters like Hunter Thompson, Joan Baez, Fritz Perls and all the rest, but he more importantly cogently analyzes the influence of Eastern religions on America.

7. Robert Kegan and Lisa Laskow Lahey, Immunity to Change: How to Overcome It and Unlock the Potential in Yourself and Your Organization. The best book on change management that actually works; a must-read, in my humble opinion.

8. Sherry Turkle, Simulation and Its Discontents. Sherry Turkle is the university professor I wish I were. She is the one pundit on technology I really trust.

9. Dennis McCullough, MD, My Mother, Your Mother: Embracing Slow Medicine, the Compassionate Approach to Caring For Your Aging Loved Ones. McCullough is that wise primary care provider you wish you had as your doctor and as the doctor to your parents who live on the other side of the country. (http://ow.ly/3w6aL)

10. Carol S. Dweck, PhD, Mindset: The New Psychology of Success. A Stanford professor whose research has changed my mind about how important mindsets are to the individual, the organization, and the nation. (http://ow.ly/3w6tp)

11. Charles Seife, Proofiness: The Dark Arts of Mathematical Deception. The title says it all.

12. Eric Abrahamson and David H. Freedman, A Perfect Mess: The Hidden Benefits of Disorder – How Crammed Closets, Cluttered Offices, and On-the-Fly Planning Make the World a Better Place. Anyone who has seen my office or desk or talked to Elizabeth Melby knows why I love this book. And let’s face it, it is true.

13. Ian F. McNeely with Lisa Wolverton, Reinventing Knowledge: From Alexandria to the Internet. One of those brilliant big picture books that traces the institutions (library, monastery, university, republic of letters, disciplines, and the laboratory) that have nurtured, shaped, and changed human knowledge.

14. Faye Flam, The Score: How the Quest for Sex Has Shaped the Modern Man. The title says it all about this hilarious and true science book.

15. Daniel S. Greenberg, Tech Transfer: Science, Money, Love, and the Ivory Tower. It’s a novel, but after having been a medical school professor, I can tell you it rings true. A professor creates a rat that never sleeps, defecates, or urinates, and the US Army is very interested.

16. Tom Chatfield, Fun Inc.: Why Gaming Will Dominate the Twenty-First Century. The serious look at the future of gaming that inspired the blog post (http://ow.ly/3w6Ke)

17. Melvin L. Rogers, The Undiscovered Dewey: Religion, Morality, and the Ethos of Democracy. An important interpretation of America’s most important and influential philosopher, the main man for pragmatism.

18. H. Gilbert Welch, MD, MPH, Should I Be Tested for Cancer? Maybe Not and Here’s Why. The title tells you why this is an important book.

19. David H. Freedman, Wrong: Why Experts Keep Failing Us. (http://ow.ly/3w6S6)

20. Steve Hagen, How the World Can Be the Way It Is: An Inquiry for the Millennium into Science, Philosophy, and Perception. A science writer turned Buddhist monk really knows how to write about science and religion.

Monday, December 27, 2010

The Difficult Science, Part II


“Despite their great explanatory powers these laws [such as gravity] do not describe reality. Instead, fundamental laws describe highly idealized objects in models.” Nancy Cartwright, “Do the Laws of Physics State the Facts?”

In Part I the limitations of science in helping us make wise choices and decisions about our health were examined. (http://j.mp/dUg6mo) Because of an inherent difficulty in establishing causation, absolute certainty is unattainable even in science. Medical knowledge follows Karl Popper’s theory of science because the right answer, whether about what causes ulcers or if you should take hormone replacement therapy, keeps changing with the publication of new studies. And most depressingly of all, a respected expert on evidence-based medicine concludes, “The majority of published studies are likely to be wrong.” (http://ow.ly/3tKdM)

Part I ended with some suggestions that seemed to imply that savvy patients should enroll in a graduate-level statistics class and understand the subtleties of observational studies, meta-analyses, and randomized controlled clinical trials. Being an informed health care consumer is evidently difficult indeed.

Part II explores how we all have to change if we are to live wisely in a time of rapid transformation of the American healthcare system that everyone agrees needs to decrease per-capita cost and increase quality.

PATIENTS

When I talk to physicians about pay for performance programs, I am always asked why doctors should be responsible for patient behavior that they cannot control. Even if we were able to provide health care access for all and eliminate every error in medicine, we would address only about 10% of what determines whether an individual stays healthy. Environment and genetics account for about 35%, but the remaining 55% of whether one stays well depends on behavior (exercise, smoking, diet) and social support systems (families, communities, places of worship). (http://ow.ly/3uVgl)

Patients do need to change: more care is not always better care; more expensive care can be unproven and dangerous to your health; and understanding the trade-offs involved in all medical decisions is imperative. Becoming an empowered or e-Patient makes sense to me because nobody cares as much about your health as you do; having more financial skin in the game makes sense to me because right now those who pay (government and employers) are not represented when doctors and patients make decisions.

But, I think it is ludicrous and unreasonable to think that patients have to make all these changes on their own. They cannot possibly do it alone. And I think it is important to remember patients come in all kinds and shapes. Not everybody wants to be an empowered patient. One bioethicist in 1990 said, “We must render a patient’s responsibility to the physician unacceptable, and we must insist that patients take primary responsibility for making decisions related to their health care” (http://ow.ly/3uRR8). This bioethicist doesn’t understand patients; he doesn’t understand illness, and he certainly doesn’t understand the health care delivery system.

I have written elsewhere at length about the ideal doctor patient relationship (http://ow.ly/3uIop), but let me insert one paragraph from that blog post here:

“Schneider in The Practice of Autonomy finds that some patients may want to reasonably give up their right to make their own medical decisions because they feel less competent than their physicians, because they are too exhausted, depressed, irritable, and confused by their illness to think straight, and because they want to be manipulated into a course of action they desire but still resist. He describes patients’ desires as ‘complex, ambiguous, and ambivalent.’ A patient quoted in this important book says, ‘I needed the doctors to take control so I could use all my energy for recovering’” (http://ow.ly/3uRR8).

My friend and colleague e-Patient Dave who survived Stage IV kidney cancer weighed in on January 9, 2009 in his blog where he wrote, “As someone whose butt was saved by excellent medical care, I find it unimaginable to consider doctors ‘incapable of determining what will benefit’ me. What, like I was going to think up high-dosage Interleukin-2 on my own?”

Patients do need to become wiser and savvier consumers of health care, but it would help if others did a better job of supporting them in this endeavor.

THE MEDIA

Patients would be better served if the media did a superb job of putting new medical “breakthroughs” in proper context. Former US Senator David Durenberger has recently emphasized the importance of a responsible news media in an era of health care transformation:

“At no time in our history have we been more dependent on good reporting about things beyond our scope or our control than we are today. The ability of unreliable or biased information and its reporting to distort public opinion and destroy public confidence in policy-makers has made it well nigh impossible for elected officials to deliver the hard news we need to hear. Or the good news that is possible from appropriate behavior change.” (http://ow.ly/3uJ65)

The Reader’s Digest played an important role in informing Americans of the link between smoking and lung cancer in 1952 and in explaining why switching to filter cigarettes did not protect against cancer in 1957. (http://ow.ly/3tKdM)

However, the media today often falls woefully short of doing the superb job that we as patients and citizens need. The editors of The New England Journal of Medicine in 1994 wrote, “The problem is not in the research but in the way it is interpreted for the public…. An association between two events is not the same as a cause and effect.” They emphasized the importance of reporters articulating the limitations of any one single scientific study. (http://ow.ly/3uJkj)

Health journalist Gary Schwitzer has established HealthNewsReview.org, where a panel of more than a dozen experts grades health news stories for their accuracy, balance, and completeness. Using a zero-to-five-star rating system, this website is an important resource for anyone trying to become a more informed citizen or consumer. The project has been honored with the Mirror Award and the Knight-Batten Award for Innovation in Journalism. The problem is that many of the stories in our newspapers and on our television newscasts fall far short of a five-star rating.

The Science Literacy Project is another resource trying to improve the level of reporting, and they point out that journalists need to learn “enough about how studies are designed and conducted to be able to tell your listeners how solid the research really is…The gold standard is a randomized, double-blind, controlled trial. Other kinds of studies…can provide intriguing hints but not firm evidence.” (http://ow.ly/3tKdM)

Scott Maier, a veteran news reporter and university journalism professor, explains that the entire reporting culture works against the kind of health reporting that is needed, “We want to look at the positive aspects of medical breakthroughs, we want stories that pay off with some dividend. If you want the story to have its fullest impact, you’re more likely to exaggerate what the expert says than you are to question it.” (http://ow.ly/3tKdM)

We need a health care media that is skeptical and questions medical studies so that the public understands the limitations of any one “breakthrough.”

RESEARCHERS

With due respect to the editors of The New England Journal of Medicine, there are problems with research culture, and investigators need to change as much as journalists and patients.

A study in the Journal of Medical Ethics documenting 788 papers retracted from 2000 to 2010 concludes, “American scientists are significantly more prone to engage in data fabrication or falsification than scientists from other countries.” (http://ow.ly/3uKzt) American scientists were lead authors on 169 papers retracted for serious errors and 84 retracted for outright fraud. It is important to recognize that retractions for errors can be part of the normal process of scientific discovery and confirmation by replication, and retractions for any reason are rare. However, cases of outright fraud like that of Dr. Scott Reuben, the Massachusetts anesthesiologist who had 21 papers retracted, undermine the public’s confidence in medicine because his papers changed the way millions of patients were treated for postoperative pain (http://ow.ly/3uKzt). Ivan Oransky, MD, and Adam Marcus provide a valuable resource for us all with their Retraction Watch blog that keeps track of this phenomenon (http://ow.ly/3uL3b).

The publish or perish culture found at American universities may be compromising research objectivity and integrity, according to Daniele Fanelli, author of an analysis of 1,300 academic papers published in the United States (http://ow.ly/3uLev). ProPublica’s discovery that more than a dozen Stanford medical school faculty were paid speakers for pharmaceutical companies also lends support to the need for change in the research culture. Stanford had been applauded for its tough conflict of interest policy that prohibited such presentations, but enforcement and accountability appear to be lacking. The University of Pennsylvania, the University of Pittsburgh, and the University of Colorado Denver are also looking into similar situations. (http://ow.ly/3uLs5)

David H. Freedman’s book Wrong: Why Experts Keep Failing Us – And How to Know When Not to Trust Them does an admirable job of explaining how difficult it is to get research right and how the culture does not always help maintain objectivity and integrity. “The beliefs of researchers are shaped by ‘all of the vanities, vested interests, hunches, experiences, politics, careerism, grantsmanship tactics, competing cadres of collaborators, imperfections, and backgrounds of the scientists investigating problems at any time.’” Since research is a human activity, bias is rampant and unavoidable. Freedman details how researchers measure what doesn’t matter, mismeasure, toss out inconvenient data, keep reanalyzing data using different statistical models until they discover an association, don’t publish negative findings, and fail as referees in peer review to find obvious flaws in research papers. Tomaso Poggio, a tenured computer scientist at MIT, states, “There’s much more competition for tenure in academia now than there was twenty years ago. It’s almost a little sick.” UCLA cancer researcher Jeffrey H. Miller tells Freedman, “The way science works is, when you end up backing a theory, you can’t afford to be wrong or your grant will suffer.” (http://ow.ly/3tKdM)

Jonah Lehrer also points out how hard it is for researchers to find the truth. “It’s hard because reality is complicated, shaped by a surreal excess of variables. But it’s also hard because scientists aren’t robots: the act of observation is simultaneously an act of interpretation.” (http://ow.ly/3uUor) When Lehrer wrote an article on scientific replication and the decline effect, he was criticized by some in the scientific community for giving aid and comfort to those who deny climate change or evolution. Lehrer described the frustration of biologist Michael Jennions who looked at hundreds of papers and 44 meta-analyses and discovered a consistent decline effect over time. “This is a very sensitive issue for scientists. You know, we’re supposed to be dealing with hard facts, the stuff that’s supposed to stand the test of time. But when you see these trends you become a little more skeptical of things.” (http://ow.ly/3uUor) Lehrer emphasizes that the decline effect makes it imperative that we consider each single scientific study in context. (http://ow.ly/3uUor)

While randomized, double-blind clinical trials are today’s evidence-based medicine gold standard, they are so expensive and time-consuming that enough of them will never be finished to serve the needs of Medicare or of patients with various conditions. Computer simulation may provide an alternative to the clinical trial that is better than the status quo. Dr. David Eddy compared the computer simulation approach to the clinical trials approach by trying to predict the outcomes of the seven-year Collaborative Atorvastatin Diabetes Study (CARDS). Of the four principal findings of the CARDS trial, Eddy’s Archimedes computer model predicted two correctly, a third within the margin of error, and a fourth just below the margin of error. Eddy estimates that the computer simulation took a few months and cost one two-hundredth as much as the CARDS trial (http://ow.ly/3uTb6).

Having served on the tenure review committee at the University of Iowa Carver College of Medicine and having also been on the medical school faculty at UCSF, Allegheny University of the Health Sciences, and Michigan State College of Human Medicine, I can personally attest that the culture of medical research needs to be improved. Researchers need to follow the rules, be open to new approaches, and be transparent about the limits of science.

CLINICIANS

Physicians need to adjust to tremendous change in the practice of medicine due to the Internet, the widespread adoption of electronic medical records, e-Patients (http://e-patients.net/), the emergence of patient social media sites (http://ow.ly/3uQZU), the increase of patient-directed research (http://www.curetogether.com/), the movement to allow nurses and physician assistants to practice to the top of their licenses (http://ow.ly/3uQyp), the increased federal funding for comparative effectiveness research (http://ow.ly/3uR6l), the demand for more accountability (http://ow.ly/3uRc4), and even the use of avatars and video games for health and wellness (http://ow.ly/3urrA).

Physician discontent appears to be a growing problem both for patients and for providers (http://www.ncbi.nlm.nih.gov/pubmed/12928472), and the above-described changes are playing a role in increasing this lack of professional satisfaction.

I propose Dr. Lewis Blowers, a general surgeon, and Dr. Robert Parker, a pediatrician, as role models. When John E. Wennberg shared the variations in local tonsillectomy rates in Vermont with the State Medical Society, Drs. Blowers and Parker instituted a second-opinion process, which lowered the chance of a child in Morrisville undergoing a tonsillectomy from 60% to less than 10% (http://ow.ly/3uRM4). They did not challenge the data; they did not react defensively; they did not try to defend a lucrative way of treating sore throats. They did what was in the best interest of their patients, even though it required them to change their style of practice.

In order to be professionally fulfilled and undiscouraged, physicians need to develop the humility, courage, and existential strength required to view all of these disruptive changes as potential ways to better take care of their patients. Accepting and admitting that medical science does not have all the answers is part of developing humility.

This need for professional humility is nothing new; when Dr. Robert Lovett of Children’s Hospital in Boston told Franklin Delano Roosevelt there was no medical treatment for his polio-induced poor muscle control, Roosevelt developed his own exercise program. Roosevelt even bought and managed a spa in Warm Springs, Georgia, where he and others could exercise in a warm swimming pool. When the American Orthopedic Association refused to let Roosevelt speak in 1926, he crashed the meeting and negotiated an agreement to evaluate his patient-developed Warm Springs program. (http://ow.ly/3uT1K) Talk about an empowered patient.

The movie The King’s Speech movingly depicts the success of an uncredentialed, failed Australian actor in treating the King of England’s speech problem, and it also documents the failure of the knighted royal medical doctors’ approach (http://ow.ly/3uT49).

I would hope that physicians would become the leading advocates of shared decision making. This is the one approach that offers us the ability to both decrease per-capita cost and increase quality. “A recent Cochrane review of randomized clinical trials comparing shared decision making supported by decision aids to obtaining informed consent through usual care showed an average 24% decline in demand for a wide range of elective surgeries and tests.” (http://ow.ly/3uRM4) One estimate suggests that such an approach could save Medicare $4 billion a year. (http://ow.ly/3uRM4) The Foundation for Informed Decision Making is a valuable resource for such decision aids, which need to be much more widely adopted in the United States. (http://ow.ly/3uSAp)

CONCLUSION

Patients, the media, researchers, and clinicians all have to change if we are to live wisely in a time of enormous change and transformation for the American health care system. We must be willing to accept the limitations of science and be open to shared decision making that recognizes that there are trade-offs in any decision made in this uncertain and unpredictable place we call reality.

Part III will describe how medical schools, payers, and employers need to change (coming soon).

Friday, December 24, 2010

The Difficult Science Behind Becoming a Savvy Healthcare Consumer, Part I

“People are so prone to overcausation that you can make the reticent turn loquacious by dropping an occasional ‘why’ in the conversation.” Nassim Nicholas Taleb

“The mind leans over backward to transform a mad world into a sensible one, and the process is so natural and easy we hardly notice that it is taking place.” Jeremy Campbell

“In a substantialist view, the universe will be unborn, non-ceased, remaining immutable and devoid of variegated states.” Nagarjuna

“If he is weak in the knees, let him not call the hill steep.” Henry David Thoreau



On the same day in November, headlines from the Wall Street Journal and the New York Times reported on the same story about a federal panel’s recommendations on consumer intake of vitamin D. “Triple That Vitamin D Intake, Panel Prescribes” read the WSJ story; “Extra Vitamin D and Calcium Aren’t Necessary, Report Says” stated the New York Times. (http://ow.ly/3tJMe) Since I had recently started taking vitamin D daily, I was interested in what the experts in Washington, DC were recommending.

How should you decide what advice to follow about the relationship between your diet, lifestyle, medications, health, and wellness?

Is this just another example of how the media does a terrible job? Many of us resonate with the view of media watchdog Steven Brill who said, “When it comes to arrogance, power, and lack of accountability, journalists are probably the only people on the planet who make lawyers look good.” (http://ow.ly/3tKdM)

The media does play a role here and needs to improve, but it turns out that it is really complicated to figure out what the “truth” is about diet, exercise, medicines, and your individual well being. Everybody (journalists, government panel members, scientists, patients, physicians, and nurse practitioners) needs to change.

It is really hard to establish with certainty the cause of any disease. Pierre-Daniel Huet’s Philosophical Treatise on the Weaknesses of the Human Mind is my favorite skeptical analysis of causality. Writing in 1690, he argues that any event can have an infinite number of possible causes. (http://ow.ly/3tLy2) David Hume, the great Scottish philosopher, makes us realize that until we know the Necessary Connection, the cause of things, all human knowledge is uncertain: merely a habit of thinking based upon repeated observation (induction), one that depends upon the future being like the past. (http://ow.ly/3u5Fs) All involved in giving medical advice should read Nassim Nicholas Taleb, who studies how empirical decision makers need to concentrate on uncertainty in order to understand how to act under the inevitable conditions of incomplete information. (http://ow.ly/3tLy2)

But don’t scientific studies establish the cause of diseases so that we can either prevent them or treat them with evidence-based methods? The philosopher Karl Raimund Popper’s theory of science emphasized that there are really only two kinds of scientific theories: those that have been proven wrong and those that have yet to be proven wrong. For Popper, and Taleb who greatly admires him, one needs to be skeptical of definitive truths because the world is very unpredictable. (http://ow.ly/3tNOf)

However, many of us would agree with physicist James Cushing’s statement that “scientific theories are to be taken as giving us literally true descriptions of the world.” A straw poll at a university department of physics found ten out of eleven faculty members who believed that what they were describing with their equations was objective reality. (http://ow.ly/3tNIL)

And yet Edmund Gettier showed that one can have a justified, true belief and still fail to know. A man believes there is a sheep in a field because he mistakes a dog for a sheep, but hidden behind a rock, out of view in the field, is a real sheep. “The three criteria for knowledge (belief, justification, and truth) appear to have been met yet we cannot say that this person actually knows there is a sheep in the field, since his ‘knowledge’ is based on having mistaken a dog for a sheep.” (http://ow.ly/3tNIL)

Science does not give us truth or certainty. As Lys Ann Shore says, “The quest for absolute certainty must be recognized as alien to the scientific attitude, since scientific knowledge is fallible, tentative, and open to revision and modification.” (http://ow.ly/3tNIL)

When I graduated from Case Western Reserve School of Medicine in 1980, the evidence-based causes of peptic ulcer disease included stress, spicy food, chewing gum, and inadequate parenting. In 1982, Perth pathologists Robin Warren and Barry J. Marshall proposed that infection with Helicobacter pylori was the real cause, but physicians did not readily agree. When Marshall developed gastritis five days after drinking a Petri dish full of Helicobacter pylori, the scientific community slowly accepted the new theory. Treatment changed from a bland diet and psychotherapy to a combination of two antibiotics and a proton pump inhibitor. In 2005, Warren and Marshall were awarded the Nobel Prize in Physiology or Medicine. Medical knowledge is indeed “fallible, tentative, and open to revision and modification.”

So how can the average person evaluate the latest scientific breakthrough reported in the press? One must become an informed skeptic. Scientific studies come in different types. Observational studies are often untrustworthy. Epidemiological studies are better, but can still lead us astray. Meta-analyses try to aggregate the knowledge discovered by many studies and can be useful. Randomized controlled clinical trials are the most trustworthy and are considered the gold standard of evidence. (http://ow.ly/3tKdM)

John Ioannidis, MD, faculty member at Tufts-New England Medical Center and the University of Ioannina Medical School, has studied the accuracy of all those medical studies we read in the popular press, and he has discovered that, by a two-to-one margin, discoveries in the most prestigious medical journals are either refuted or found to be exaggerated by later papers. (http://ow.ly/3tQbX)

Ioannidis states, “Amazingly, most medical treatment simply isn’t backed up by good, quantitative evidence.” He also writes that the problem is not confined to medicine, “The facts suggest that for many, if not the majority, of fields, the majority of published studies are likely to be wrong.” (http://ow.ly/3tKdM)

To better understand how this can be so, let’s take a look at hormone replacement therapy for women. In 1966, the best-selling book Feminine Forever argued that menopause was a disease that could be treated by taking estrogen, and hormone replacement therapy became a best-selling drug in the United States. In 1985, the Harvard Nurses’ Health Study reported that women on estrogen had only a third as many heart attacks as women who did not receive the drug. In 1998, the Heart and Estrogen-progestin Replacement Study (HERS) found that estrogen increased the likelihood that women who already had heart disease would experience a myocardial infarction. In 2002, the Women’s Health Initiative (WHI) concluded that hormone replacement therapy increased the risk of heart disease, stroke, blood clots, and breast cancer.

One journalist estimates that tens of thousands of women suffered harm because they took a prescription drug that was prescribed by their physician to treat menopause and protect them from heart attacks. (http://ow.ly/3tSko) What happened and why?

The Harvard Nurses’ Health Study is a well-designed, large (122,000 subjects), and well-run prospective cohort study that examines disease rates and lifestyle factors to generate hypotheses about what caused the diseases. Although such studies can say there is an association between two events (women who took estrogen had fewer heart attacks), they cannot determine causation. (http://ow.ly/3tSko) Huet, writing in 1690, was right: there are a lot of possible causes for any one event. Ioannidis estimates that there are as many as three thousand different factors that might cause a condition like obesity, so it is not surprising that many hypotheses turn out to be wrong. (http://ow.ly/3tKdM)

That is why the hypotheses generated by such observational studies need to be tested by gold-standard randomized controlled clinical trials like HERS and WHI. There are three ways to reconcile the difference between the clinical trial results and the Nurses’ Health Study results. (http://ow.ly/3tSko) First, the association of estrogen with fewer heart attacks could be explained by the healthy-user and prescriber effects: the women who took hormone replacement therapy were different from those who did not take it, and the physicians who prescribed it were different from the physicians who treated women without it. Second, it is hard to accurately find out whether the women in the observational study actually took the estrogen before their heart attacks occurred. Third, both the clinical trials and the observational study may have gotten the right answer, but to different questions. The Nurses’ study had mostly younger women, and the clinical trials had mostly older women. It is possible that estrogen both protects the hearts of younger women and induces heart attacks in older women; this is now known as the timing hypothesis. (http://ow.ly/3tSko) We really don’t know which of these three possible explanations is correct.
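The healthy-user effect is easier to see with a toy simulation (all of the numbers below are invented for illustration; they come from none of these studies). In this sketch, estrogen has no effect whatsoever on heart attacks. Health-conscious women are simply more likely both to take the drug and to avoid heart attacks, yet the raw comparison makes the drug look protective:

```python
import random

random.seed(42)

N = 100_000
takers = takers_mi = nontakers = nontakers_mi = 0

for _ in range(N):
    # Hidden confounder: is this woman unusually health-conscious?
    health_conscious = random.random() < 0.5
    # Healthy-user effect: health-conscious women are far more likely
    # to seek out and stay on hormone replacement therapy.
    on_estrogen = random.random() < (0.7 if health_conscious else 0.2)
    # Estrogen itself does NOTHING here; only health-consciousness
    # changes the heart-attack risk (1% vs. 3% over the study period).
    heart_attack = random.random() < (0.01 if health_conscious else 0.03)
    if on_estrogen:
        takers += 1
        takers_mi += heart_attack
    else:
        nontakers += 1
        nontakers_mi += heart_attack

rate_takers = takers_mi / takers
rate_nontakers = nontakers_mi / nontakers
print(f"heart-attack rate on estrogen:  {rate_takers:.4f}")
print(f"heart-attack rate off estrogen: {rate_nontakers:.4f}")
```

Because the confounder drives both who takes the drug and who has heart attacks, the estrogen takers show a markedly lower heart-attack rate even though the drug does nothing in this model, which is exactly why hypotheses from cohort studies must be tested in randomized trials, where coin-flip assignment breaks the link between health-consciousness and treatment.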

What I take away from this is that the skeptical consumer needs to be wary of new advice from “scientific breakthrough” studies reported in the lay press. A principal investigator with the Nurses’ study from 1976 to 2001 warns, “Even the Nurses’ Health Study, one of the biggest and best of these studies, cannot be used to reliably test small-to-moderate risks or benefits. None of them can.” (http://ow.ly/3tSko) Skeptical consumers need to understand the inherent limitations of such observational studies.

David H. Freedman in Wrong: Why Experts Keep Failing Us – And How to Know When Not to Trust Them (http://ow.ly/3tKdM) writes about the certainty principle. Drawing upon behavioral economics studies, he shows that humans are biased toward advice that is simple, clear-cut, actionable, universal, and palatable. “If an expert can explain how any of us is sure to make things better via a few simple, pleasant steps, then plenty of people are going to listen.” And experts know that people will pay more attention if they make dramatic claims, tell interesting stories, and use a lot of statistics.

Freedman provides some guidance for those of us who want to be informed, skeptical, wise consumers of medical tests, therapies, and expert advice. Under characteristics of less trustworthy expert advice he lists:

• Simplistic, universal, and definitive advice
• Advice supported by a single study, or many small studies, or animal studies
• Groundbreaking advice
• Advice pushed by people or organizations that will benefit from its adoption
• Advice geared toward preventing the future occurrence of a recent crisis or failure.

Characteristics of expert advice that Freedman says we should ignore (they do not make advice any more trustworthy) include:

• It’s mildly resonant
• It’s provocative
• It gets a lot of positive attention
• Other experts embrace it
• It appears in a prestigious journal
• It’s supported by a big, rigorous study
• The experts backing it boast impressive credentials.

Freedman’s characteristics of more trustworthy advice:

• It does not trip the other alarms
• It’s a negative finding
• It’s heavy on qualifying statements
• It’s candid about refutational evidence
• It provides some context for the research
• It provides perspective
• It includes candid, blunt comments.
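Freedman’s guidance is qualitative, but his three lists can be read as a rough checklist. The sketch below is purely illustrative; the flag names and the simple net score are my own invention, not Freedman’s:

```python
# Invented scoring sketch: Freedman offers qualitative heuristics,
# not a numeric formula. Red flags subtract trust, green flags add it,
# and the "ignore" list contributes nothing either way.
RED_FLAGS = {
    "simplistic_universal_definitive",
    "single_small_or_animal_studies",
    "groundbreaking",
    "promoter_benefits_from_adoption",
    "geared_to_recent_crisis",
}
GREEN_FLAGS = {
    "negative_finding",
    "heavy_on_qualifiers",
    "candid_about_refutational_evidence",
    "provides_context",
    "provides_perspective",
    "candid_blunt_comments",
}

def trust_score(traits):
    """Crude net score for a piece of expert advice."""
    return sum(t in GREEN_FLAGS for t in traits) - sum(t in RED_FLAGS for t in traits)

breakthrough_claim = {"groundbreaking", "simplistic_universal_definitive"}
cautious_claim = {"heavy_on_qualifiers", "provides_context", "negative_finding"}
print(trust_score(breakthrough_claim))  # -2
print(trust_score(cautious_claim))      # 3
```

Note that traits like “appears in a prestigious journal” deliberately appear in neither set, mirroring Freedman’s point that they should not sway us.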


With all due respect to Freedman, who has written an informative book, his advice is confusing and not all that helpful to the layperson trying to decide whether to take extra Vitamin D. Although experts with impressive credentials have given advice that should be ignored, others with similar credentials have performed rigorous prospective, randomized clinical trials whose findings should be followed. Think of not smoking if you want to avoid lung cancer or heart disease. It is also true that the really skeptical epidemiologists accept very few diet and lifestyle factors as true causes of common diseases: smoking causes lung cancer and heart disease; sun exposure causes skin cancer; sexual activity spreads the papilloma virus that causes cervical cancer. (http://ow.ly/3tSko)

So where does all this reading get me as far as Vitamin D is concerned? Primary care physicians, relying on findings of an association between low Vitamin D levels and a higher risk of a variety of diseases, including heart disease, cancer, and autoimmune disease, started telling their patients to take supplemental Vitamin D. Sales of Vitamin D rose 82% from 2008 to 2009, reaching $430 million a year in the United States. One expert is quoted in the New York Times saying, “Everyone was hoping Vitamin D would be a kind of panacea.” (http://ow.ly/3u5YJ) As far as I know, none of these claims that Vitamin D prevents disease has been proven by a randomized controlled clinical trial.

And what about the conflicting WSJ and New York Times headlines? In a way, they were both right. The WSJ concentrated on the recommendation that people should get 600 IU of Vitamin D every day, three times the old standard of 200 IU a day. (http://ow.ly/3u63J) The New York Times concentrated on the finding that most of us get enough Vitamin D from our diet and exposure to sunlight. (http://ow.ly/3u5YJ)

So what did I decide? Following in the footsteps of Huet, Hume, Popper, Taleb, Ioannidis, and Freedman, I decided to stop taking Vitamin D and not to test my blood level. Sometimes you have to act based on incomplete knowledge of an unpredictable world, and I tend to be a skeptic and a minimalist when it comes to doctors and medical advice.

Part II: How and why we all have to change (coming soon)

Monday, December 20, 2010

Apologies, Echo Chambers, Open Minds, Learning, & Health Care

I have a confession to make. I sometimes overstate things to try to get people to read my blog posts and tweets. I really do not think that avatars and video games are going to replace doctors any time soon (http://ow.ly/3rRuh).

I felt a little guilty today when I read Irene Greif of IBM saying, “I do think of computers as augmenting people, not replacing them. We need help with the limits of the brain, but there are some things that our brains can do that computers can’t do.”
(http://ow.ly/3rOVR) I felt guilty because I agree with Greif, but my blog headline blared out about avatars and video games replacing doctors.

This guilt reminded me of my reaction to a tweet from @neuroconscience chastising me for retweeting one of his posts with the label “Brain wars.” “@KentBottles Hi, not exactly 'BRAIN WARS', and please don't retweet so sensationally. At least, don't make it look like I said it! Thnx” I was so ashamed and taken aback that I never apologized to @neuroconscience until now. The link I was trying to draw attention to was a blog post by LSU graduate student Gary Williams in which he defended Antonio Damasio from criticism by Ned Block in the New York Times Book Review and by Alison Gopnik at Slate. “Block and Gopnik level the exact same argument against Damasio: he has conflated the minimal self with the reflective self and mistakenly claimed that the minimal self depends on the reflective self.” (http://j.mp/fwVrG2)

What struck me today was how fortunate I am to be able, through Twitter, to interact with and learn from others who think differently than I do. @neuroconscience is listed on his Twitter profile as Micah Allen from Aarhus, Denmark, and I got in trouble by labeling as “brain wars” a thoughtful essay by Gary Williams, who obviously understands the minimal self and the reflective self a lot better than I do. It goes without saying that I have never met either Williams or Allen, but I used to work with Damasio at the University of Iowa College of Medicine.

Many of the people in my Twitter tribe (@ePatientDave, @SusannahFox, @healthythinker, @murzee, and @maggiemahar come immediately to mind) share the same opinions I do about many things and usually understand my sarcasm and exaggeration. Or at least I hope and think they do. Micah Allen is a Danish academic, and I thank him for making me think more deeply.

I, like IBM, am “trying to break out of the standard way people use social networks to navigate the flood of information. Typically, people interact with generally like-minded friends and thus create an ‘echo chamber’ where prejudices are reinforced. Such a posture also helps flatten the culture – reduces the culture to memes if you will; we float along the same YouTube clip everyone else is passing along. Treat serious news as gossip, and vice versa.” (http://ow.ly/3rOVR)

I do not want to exist in an echo chamber; I want to learn and grow. I try to read the Wall Street Journal and The New York Times every day, and I enjoy seeing how the right and the left view the same stories differently. I follow @mparent77772 and @pharmaguy on Twitter because they often do not see things the same way I do. I try to read John C. Goodman’s conservative Health Policy Blog (http://healthblog.ncpa.org/) as much as Maggie Mahar’s liberal Healthbeat blog (http://www.healthbeatblog.org/). I need all the help I can get to understand health care and patients and doctors and tests and myself.

Tuesday, December 14, 2010

Will Avatars, Robots, and Video Games Replace Doctors?

I have never met Dr. Joseph C. Kvedar of Partners HealthCare’s Center for Connected Health, Susannah Fox of Pew Research Center’s Internet and American Life Project, or Professor Andy Clark of Edinburgh University face to face in the real world. And yet they have all profoundly changed the way I think about health care’s most vexing problem: how are we going to take care of all these Baby Boomers who are starting to retire and get sick?

Kvedar nicely summarizes this supply and demand problem in one slide of a talk I watched on YouTube; he notes that there are currently 24 million Americans with diabetes, and that the rate is increasing 8% every year. One in three Americans over 20 years old has hypertension, and Kvedar wonders where we are going to get all the doctors to care for these patients. His answer is that we need to form trusting relationships with technology in a process he terms Emotional Automation. (http://e-patients.net/index.php?s=fox)
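Kvedar’s numbers compound quickly. As a back-of-envelope projection (the 10-year horizon is my own assumption, not Kvedar’s):

```python
# Compound 8% annual growth on 24 million diabetics, per Kvedar's slide.
diabetics = 24_000_000
for year in range(10):
    diabetics *= 1.08
print(round(diabetics / 1_000_000, 1))  # 51.8 -- more than double in a decade
```

Even crude compounding like this makes the mismatch with a roughly flat physician supply obvious.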

I had never heard of Kvedar or the Center for Connected Health until I saw a Twitter link from Fox to her blog post about robots, enchanted objects, and networks. (http://e-patients.net/index.php?s=fox) Fox and I follow each other on Twitter, so I read her blog, which included the embedded YouTube video of Kvedar speaking about Emotional Automation. In a way, Fox is also responsible for my knowing about Professor Clark’s views on “embodied cognition” and “the extended mind.” One Sunday Fox noted in a tweet that my habit of aggregating the health care news every morning at 5:30 AM was helpful to her and the rest of my Twitter tribe. That one pat on the back encouraged me months later to scour the New York Times blogs, where I found Professor Clark’s Opinionator blog titled “Out of Our Brains.”

Can technology really solve the supply and demand problem in American health care? Can humans love and trust electronic devices made of glass, silicon and plastic? What can video games teach us about changing behaviors to cope with chronic disease? Should we think about what the explosion of cognitive prosthetics means for our understanding of the interplay between brains, bodies, and the real world where we live?

Many of us have already formed trusting, loving relationships with technology, but we have not really thought through the implications for health care. People love and trust their iPhones and tablet computers because they are extensions of themselves. “It is different now that we carry our second self with us. We think with the objects we love and we love the objects we think with.” So says MIT’s Sherry Turkle, the pioneering student of evocative objects (http://ow.ly/3jjCG). Mark Rolston, chief creative officer of Frog Design, observes that people grieve when they lose a personal electronic device. “You are leaving your brain behind,” he says (http://ow.ly/3jjCG). I have blogged before about Lois Simmeth, 73, who lives in a Pittsburgh nursing home that provides her with a $6,000 harp seal robot to hold. “I love animals. I know you’re not real but somehow, I don’t know, I love you (http://ow.ly/21cj7).” Kvedar observes that humans find it easy and natural to anthropomorphize pet rocks and Tamagotchis. He also states that most of us initially believe that a trusting relationship requires two human beings who interact face to face in the real world. (http://e-patients.net/index.php?s=fox)

Philosopher Roger Scruton is not buying my argument that trusting relationships with technology are possible:

“In real life, friendship involves risk. The reward is great: help in times of need, joy in times of celebration. But the cost is also great: self-sacrifice, accountability, the risk of embarrassment and anger, the effort of winning another’s trust. Hence I can become friends with you only by seeking your company. I must attend to your words, gestures and body language, and win the trust of the person revealed in them, and this is risky business…. When I relate to you through the screen there is a marked shift in emphasis. Now I have my finger on the button. At any moment I can turn you off…Of course I may stay glued to the screen. Nevertheless, it is a screen that I am glued to, not the person behind it.”
(http://technology.timesonline.co.uk/tol/news/tech_and_web/the_web/article5139532.ece)

Tom Chatfield and I are betting Scruton is not addicted to World of Warcraft, or WoW, as it is fondly called by its 12 million monthly subscribers, who pay over $1 billion annually to play this Massively Multiplayer Online (MMO) video game. Chatfield in his book Fun Inc.: Why Gaming Will Dominate the Twenty-first Century (New York: Pegasus Books, 2010) describes the WoW social experience as friendly and accessible to both beginners and experts. The story of how Adam Brouwer’s orc warrior Mogwai, after 4,500 hours of play, became the leader of the guild Adelante, with 20,000 gold pieces and the two most powerful weapons in WoW, is instructive for those of us who do not play MMO games. Although Brouwer thinks he could sell Mogwai for $10,000 in real-world money on eBay, his obligations and allegiances to his fellow players won’t allow him to cash out. “The strange thing about Mogwai is that he doesn’t just belong to me. Every item he has got through the hard work of twenty or more other people. Selling him would be a slap in their faces. When I started, I didn’t care about the other people. Now they are the only reason I continue.” (Chatfield)

Video games have much to teach us about how to motivate humans to self-manage their chronic diseases, and they offer a research tool for large-scale studies of human behavior. Researchers are interested in why video gamers become so absorbed and focused and are able to easily achieve the state of flow usually associated with master musicians and champion athletes (http://ow.ly/3pgbZ). “Gamers are engaged, focused, and happy. How many employers wish they could say that about even a tenth of their work force?” says Edward Castronova of Indiana University (http://ow.ly/3pgbZ). How many doctors wish they could say that about a tenth of their patients managing their chronic illness? A recent Harvard Business Review article concluded “the best sign that someone’s qualified to run an internet startup may not be an MBA degree, but level 70 guild leader status” in an MMO video game. (Chatfield)

Nicole Lazzaro of the player experience and research company XEODesign has identified four key characteristics of video games that may help explain why the typical American has spent 10,000 hours playing computer games by the age of 21. “Hard fun” entails pursuing a goal that gets more difficult with each level of play and requires the player to use sophisticated strategies and be rewarded for progress. “Easy fun” entails sheer enjoyment of the game and satisfying the player’s need for curiosity and mystery. “Altered states” refers to player reports that video games changed how they felt inside by clearing the mind, eliminating boredom, changing their sense of time, and experiencing a sense of achievement. “The people factor” is important to gamers because they develop relationships with others. Remote interactions with fellow players from all over the world are increasingly taking place through microphones, speakers, and real time conversations as well as in-game interactions. (Chatfield)

These lessons from video games can be, and are being, incorporated into strategies to motivate patients to change behaviors to prevent and live with chronic conditions. Managing a chronic condition is full of failures manifested by high blood sugars and unexpected increases in body weight. Chatfield believes “One of the most profound transformations we can learn from games is how to turn the sense that someone has ‘failed’ into the sense that they ‘haven’t succeeded yet.’” (http://ow.ly/3pgbZ) Carnegie Mellon University’s Jesse Schell has described a system of awarding points for everything we do in real life in order to reward healthy behaviors. Lucy Bradshaw of Maxis explains, “You could strive to get the 10-stroke tooth brushing achievement, for instance, and then somehow you would collect all those points and utilize them.” (http://ow.ly/3pgjn) Dr. Jane McGonigal of the Institute for the Future plays the online Chorewars game in which she and her husband earn real rewards by doing chores in their San Francisco apartment. (http://ow.ly/3pgbZ) Anne McLaughlin of North Carolina State University’s Gains Through Gaming Lab says, “To make something into a game, you have to have a goal. You have to create the game. It’s more than just measurement…I know we keep talking about blurring the lines between gaming and reality, but I think it does that, and when it’s for a good cause it’s great.” While some think this is great, even the moderator of the South by Southwest Interactive Festival found it “rather ominous and spooky.” (http://ow.ly/3pgjn)

Video games also offer a research tool for understanding the real time interactions of complex systems involving people. Emergency triage and epidemic management are just two areas where game theory can reproduce complex systems and try out different strategies. Blitz Game Studios is developing a triage game that takes place in an interactive three-dimensional world. One physician favorably compared this approach to the traditional large-scale emergency training with volunteers covered with fake blood. “A virtual world can simulate the noise, the chaos, everything. You could assess, for example, the exact percentage and degree of someone’s burns from the way they looked in a game.” Most importantly such a game allows participants to try out different approaches and see if they work. Epidemiologist Nina H. Fefferman at the 2008 Games for Health Conference stated that studying thousands of people in games could model the unpredictable human behavior in epidemics. (Chatfield) Castronova says, “One reason that policy keeps screwing up – think Katrina – is because it never gets tested. In the real world, you can’t create five versions of New Orleans and throw five hurricanes at them to test different logistics. But you can do that in virtual environments.” (http://ow.ly/3pgbZ) Chatfield observes, “Game technologies excel at nothing so much as scoring, comparing and rewarding progress.”

Therapists are now using digital worlds with autonomous, virtual humans to help patients work through social anxiety, drinking, gambling, post-traumatic stress, and agoraphobia. (http://www.nytimes.com/2010/11/23/science/23avatar.html) Such therapists can discuss the patient’s feelings at the very moment that the virtual bartender asks the alcoholic if he wants to order another drink, and different coping techniques can be practiced time and time again in virtual situations that are experienced as real. One such patient said, “I just think it’s a fantastic idea to be able to experience situations where you know that the worst cannot happen. You know it’s controlled and gradual and yet feels somehow real…the great thing about it [is]…you get to practice.” (http://www.nytimes.com/2010/11/23/science/23avatar.html) USC psychologist Albert Rizzo has helped veterans with post-traumatic stress by using a virtual Humvee scenario that recreates ambushes by insurgents. “We can control the intensity of experience, and then work on the patient’s response,” breaking the association between reminders of the ambush and the panic the patient has been dealing with months later. (http://www.nytimes.com/2010/11/23/science/23avatar.html) In a USC study, people with social anxiety confessed more of their personal flaws, fears and fantasies to virtual figures programmed to be socially sensitive than to live therapists conducting video interviews. (http://www.nytimes.com/2010/11/23/science/23avatar.html)

Kvedar, who first introduced me to the concept of Emotional Automation, cites Karen, the virtual wellness coach/avatar who gets her human walkers to exercise more, and the Boston hospital patients who prefer a robot discharge planner to a human one, as examples of humans learning to trust technology. And why shouldn’t the patient prefer the robot that is not in a hurry, does not talk down to the patient, and encourages the patient to ask the same question over and over again? The busy human discharge planner may in this setting be less effective than the avatar. (http://e-patients.net/index.php?s=fox)

The term avatar comes from Sanskrit and is usually translated as incarnation or descent, describing the process in which a higher spiritual being (Rama or Krishna, for example) takes on mortal flesh. It is now commonly used to describe a player’s presence within a video game. (Chatfield) Palo Alto Research Center scientist Nick Yee, PhD, has described the Proteus Effect: how our video game avatars change how we behave in virtual environments and in real life. In several papers, Yee demonstrated that players given more attractive or taller avatars disclosed more personal information and bargained more aggressively than players given unattractive or shorter avatars. Yee also showed that a person’s perceptions of their own attractiveness persisted outside of the game environment to affect their participation in real-life online dating. Yee believes that providing users with “fit, athletic avatars in exergames may encourage longer and more engaged exercise sessions than if they were provided with normal-looking avatars or avatars that were modeled from their own bodies.” (http://www.healthgamesresearch.org/our-publications/research-briefs/the-proteus-effect)

Finally, what does all this do for our understanding of the interplay between brains, bodies, and the real world where we live? Professor Clark, who works in the “embodied cognition” and “extended mind” fields of philosophy, argues that a wire-free interface linking our brains to our notepad or iPhone should count as providing support for our cognitive processing. (http://ow.ly/3pgqK) Basically, I think he is saying that some of the activity that enables us to think occurs outside of our brain. He cites studies showing that hand gestures may play an active role in our ability to think; when research subjects were prevented from using hand gestures, they performed poorly on tests of mental abilities. He provocatively notes “evolution and learning don’t give a jot what resources are used to solve a problem. There is no more reason, from the perspective of evolution or learning, to favor the use of a brain-only cognitive strategy than there is to favor the use of canny (but messy, complex, hard-to-understand) combinations of brain, body, and world.” (http://ow.ly/3pgqK)

I have never spoken to Kvedar, Fox, or Clark face to face in real life, and yet they have indirectly convinced me that patients in the future will trust and use technology to prevent and treat illness in ways that we are just starting to understand and envision. The supply and demand problem of taking care of retiring Baby Boomers will include robots, avatars, video games, and physicians.

Will Avatars, Robots, and Video Games Replace Doctors?

I have never met Dr. Joseph C. Kvedar of Partners HealthCare’s Center for Connected Health, Susannah Fox of Pew Research Center’s Internet and American Life Project, or Professor Andy Clark of Edinburgh University face to face in the real world. And yet they have all profoundly changed the way I think about health care’s most vexing problem: how are we going to take care of all these Baby Boomers who are starting to retire and get sick?

Kvedar nicely summarizes this supply and demand problem on one slide in a talk I watched on YouTube; he notes that there are currently 24 million Americans with diabetes, and the rate is increasing 8% every year. One in three Americans over 20 years old have hypertension, and Kvedar wonders where we are going to get all the doctors to care for these patients. His answer is we need to form trusting relationships with technology in a process he terms Emotional Automation. (http://e-patients.net/index.php?s=fox)

I had never heard of Kvedar or the Center for Connected Health until I saw a Fox twitter link to her blog post about robots, enchanted objects, and networks. (http://e-patients.net/index.php?s=fox) Fox and I follow each other on Twitter, so I read her blog, which included the embedded YouTube video of Kvedar speaking about Emotional Automation. In a way Fox is also responsible for me knowing about Professor Clark’s views on “embodied cognition” and “the extended mind.” One Sunday Fox noted in a tweet that my habit of aggregating the health care news every morning at 5:30 AM was helpful to her and the rest of my twitter tribe. That one pat on the back encouraged me months later to scour the New York Times blogs where I found Professor Clark’s Opinionator blog titled “Out of Our Brains.”

Can technology really solve the supply and demand problem in American health care? Can humans love and trust electronic devices made of glass, silicon and plastic? What can video games teach us about changing behaviors to cope with chronic disease? Should we think about what the explosion of cognitive prosthetics means for our understanding of the interplay between brains, bodies, and the real world where we live?

Many of us have already formed trusting, loving relationships with technology, but we have not really thought through the implications for health care. People love and trust their iPhones and tablet computers because they are extensions of themselves. “It is different now that we carry our second self with us. We think with the objects we love and we love the objects we think with.” So says MIT’s Sherry Turkle, the pioneering student of evocative subjects (http://ow.ly/3jjCG). Mark Rolston, chief creative officer of Frog Design, observes that people grieve when they lose a personal electronic device. “You are leaving your brain behind,” he says (http://ow.ly/3jjCG). I have blogged before about Lois Simmeth, 73, who lives in a Pittsburgh nursing home that provides her with a $6,000 harp seal robot to hold. “I love animals. I know you’re not real but somehow, I don’t know, I love you (http://ow.ly/21cj7).” Kvedar observes that humans find it easy and natural to anthropomorphize pet rocks and tomagotchis. He also states that most of us initially believe that a trusting relationship requires two human beings who interact face to face in the real world. (http://e-patients.net/index.php?s=fox)

Philosopher Roger Scruton is not buying my argument that trusting relationships with technology are possible:

“In real life, friendship involves risk. The reward is great: help in times of need, joy in times of celebration. But the cost is also great: self-sacrifice, accountability, the risk of embarrassment and anger, the effort of wining another’s trust. Hence I can become friends with you only by seeking your company. I must attend to your words, gestures and body language, and win the trust of the person revealed in them, and this is risky business…. When I relate to you through the screen there is a marked shift in emphasis. Now I have my finger on the button. At any moment I can turn you off…Of course I may stay glued to the screen. Nevertheless, it is a screen that I am glued to, not the person behind it.”
(http://technology.timesonline.co.uk/tol/news/tech_and_web/the_web/article5139532.ece)

Tom Chatfield and I are betting Scruton is not addicted to World of Warcraft or WoW as it is fondly called by its 12 million monthly subscribers who pay over $1 billion annually to play this Massively Multiplayer Online (MMO) video game. Chatfield in his book Fun Inc.: Why Gaming Will Dominate the Twenty-first Century (New York: Pegasus Books, 2010) describes the WoW social experience as friendly and accessible to both beginners and experts. The story of how Adam Brouwer’s orc warrior Mogwai after 4,500 hours of play became the leader of the guild Adelante with 20,000 gold pieces and the two most powerful weapons in WoW is instructive for those of us who do not play MMO games. Although Brouwer thinks he could sell Mogwai for $10,000 on e-Bay for real world money, his obligations and allegiances to his fellow players won’t allow him to cash out. “The strange thing about Mogwai is that he doesn’t just belong to me. Every item he has got through the hard work of twenty or more other people. Selling him would be a slap in their faces. When I started, I didn’t care about the other people. Now they are the only reason I continue.” (Chatfield)

Video games have much to teach us about how to motivate humans to self manage their chronic diseases, and they offer a research tool for large-scale studies of human behavior. Researchers are interested in why video gamers become so absorbed and focused and are able to easily achieve the state of flow usually associated with master musicians and champion athletes (http://ow.ly/3pgbZ). “Gamers are engaged, focused, and happy. How many employers wish they could say that about even a tenth of their work force?” says Edward Castronova of Indiana University (http://ow.ly/3pgbZ). How many doctors wish they could say that about a tenth of their patients managing their chronic illness? A recent Harvard Business Review article concluded “the best sign that someone’s qualified to run an internet startup may not be an MBA degree, but level 70 guild leader status” in a MMO video game. (Chatfield)

Nicole Lazzaro of the player experience and research company XEODesign has identified four key characteristics of video games that may help explain why the typical American has spent 10,000 hours playing computer games by the age of 21. “Hard fun” entails pursuing a goal that gets more difficult with each level of play and requires the player to use sophisticated strategies and be rewarded for progress. “Easy fun” entails sheer enjoyment of the game and satisfying the player’s need for curiosity and mystery. “Altered states” refers to player reports that video games changed how they felt inside by clearing the mind, eliminating boredom, changing their sense of time, and experiencing a sense of achievement. “The people factor” is important to gamers because they develop relationships with others. Remote interactions with fellow players from all over the world are increasingly taking place through microphones, speakers, and real time conversations as well as in-game interactions. (Chatfield)

These learnings from video games can and are being incorporated into strategies to motivate patients to change behaviors to prevent and live with chronic disease conditions. Managing a chronic condition is full of failures manifested by high blood sugars and unexpected increases in body weight. Chatfield believes “One of the most profound transformations we can learn from games is how to turn the sense that someone has ‘failed’ into the sense that they ‘haven’t succeeded yet.’” (http://ow.ly/3pgbZ) Carnegie Mellon University’s Jesse Schell has described a system of awarding points for everything we do in real life in order to reward healthy behaviors. Lucy Bradshaw of Maxis explains, “You could strive to get the 10-stroke tooth brushing achievement, for instance, and then somehow you would collect all those points and utilize them.” (http://ow.ly/3pgjn) Dr. Jane McGonigal of the Institute for the Future plays the online Chorewars game in which she and her husband earn real rewards by doing chores in their San Francisco apartment. (http://ow.ly/3pgbZ) Anne McLaughlin of North Carolina State University’s Gains Through Gaming Lab says, “To make something into a game, you have to have a goal. You have to create the game. It’s more than just measurement…I know we keep talking about blurring the lines between gaming and reality, but I think it does that, and when it’s for a good cause it’s great.” While some think this is great, even the moderator of the South by Southwest Interactive Festival found it “rather ominous and spooky.” (http://ow.ly/3pgjn)

Video games also offer a research tool for understanding the real-time interactions of complex systems involving people. Emergency triage and epidemic management are just two areas where games can reproduce complex systems and let researchers try out different strategies. Blitz Game Studios is developing a triage game that takes place in an interactive three-dimensional world. One physician favorably compared this approach to traditional large-scale emergency training with volunteers covered in fake blood: “A virtual world can simulate the noise, the chaos, everything. You could assess, for example, the exact percentage and degree of someone’s burns from the way they looked in a game.” Most importantly, such a game allows participants to try out different approaches and see whether they work. At the 2008 Games for Health Conference, epidemiologist Nina H. Fefferman stated that studying thousands of people in games could model the unpredictable human behavior seen in epidemics. (Chatfield) Castronova says, “One reason that policy keeps screwing up – think Katrina – is because it never gets tested. In the real world, you can’t create five versions of New Orleans and throw five hurricanes at them to test different logistics. But you can do that in virtual environments.” (http://ow.ly/3pgbZ) Chatfield observes, “Game technologies excel at nothing so much as scoring, comparing and rewarding progress.”

Therapists are now using digital worlds populated with autonomous virtual humans to help patients work through social anxiety, drinking, gambling, post-traumatic stress, and agoraphobia. (http://www.nytimes.com/2010/11/23/science/23avatar.html) A therapist can discuss the patient’s feelings at the very moment the virtual bartender asks the alcoholic whether he wants to order another drink, and different coping techniques can be practiced again and again in virtual situations that are experienced as real. One such patient said, “I just think it’s a fantastic idea to be able to experience situations where you know that the worst cannot happen. You know it’s controlled and gradual and yet feels somehow real…the great thing about it [is]…you get to practice.” (http://www.nytimes.com/2010/11/23/science/23avatar.html) USC psychologist Albert Rizzo has helped veterans with post-traumatic stress by using a virtual Humvee scenario that recreates ambushes by insurgents. “We can control the intensity of experience, and then work on the patient’s response,” breaking the association between reminders of the ambush and the panic the patient has been dealing with months later. (http://www.nytimes.com/2010/11/23/science/23avatar.html) In a USC study, people with social anxiety confessed more of their personal flaws, fears, and fantasies to virtual figures programmed to be socially sensitive than to live therapists conducting video interviews. (http://www.nytimes.com/2010/11/23/science/23avatar.html)

Kvedar, who first introduced me to the concept of Emotional Automation, cites Karen, the virtual wellness coach/avatar who gets her human walkers to exercise more, and the Boston hospital patients who prefer a robot discharge planner to a human one as examples of humans learning to trust technology. And why shouldn’t patients prefer a robot that is not in a hurry, does not talk down to them, and encourages them to ask the same question over and over again? In this setting, the busy human discharge planner may be less effective than the avatar. (http://e-patients.net/index.php?s=fox)

The term avatar comes from Sanskrit and is usually translated as incarnation or descent, describing the process in which a higher spiritual being (Rama or Krishna, for example) takes on mortal flesh. It is now commonly used to describe a player’s presence within a video game. (Chatfield) Nick Yee, PhD, of the Palo Alto Research Center has described the Proteus Effect: the way our video game avatars change how we behave both in virtual environments and in real life. In several papers, Yee demonstrated that players given more attractive or taller avatars disclosed more personal information and bargained more aggressively than players given less attractive or shorter avatars. Yee also showed that a person’s perception of his or her own attractiveness persisted outside the game environment, affecting participation in real-life online dating. Yee believes that providing users with “fit, athletic avatars in exergames may encourage longer and more engaged exercise sessions than if they were provided with normal-looking avatars or avatars that were modeled from their own bodies.” (http://www.healthgamesresearch.org/our-publications/research-briefs/the-proteus-effect)

Finally, what does all this do for our understanding of the interplay among brains, bodies, and the real world where we live? Professor Clark, who works in the philosophical fields of “embodied cognition” and “the extended mind,” argues that a wire-free interface linking our brains to a notepad or iPhone should count as support for our cognitive processing. (http://ow.ly/3pgqK) Basically, I think he is saying that some of the activity that enables us to think occurs outside our brains. He cites studies showing that hand gestures may play an active role in our ability to think; when research subjects were prevented from using hand gestures, they performed poorly on tests of mental ability. He provocatively notes, “evolution and learning don’t give a jot what resources are used to solve a problem. There is no more reason, from the perspective of evolution or learning, to favor the use of a brain-only cognitive strategy than there is to favor the use of canny (but messy, complex, hard-to-understand) combinations of brain, body, and world.” (http://ow.ly/3pgqK)

I have never spoken to Kvedar, Fox, or Clark face to face in real life, and yet they have indirectly convinced me that patients in the future will trust and use technology to prevent and treat illness in ways we are just beginning to understand and envision. The solution to the supply-and-demand problem of caring for retiring Baby Boomers will include robots, avatars, video games, and physicians.