Public health’s evolving role in promoting U.S. military interests

The seal of the U.S. Department of Defense, representing seven branches of the U.S. military.

The U.S. Department of Defense (DoD) remains one of the most sophisticated media production machines on the planet. Its ubiquitous advertising filters into every aspect of our lives, from public schools to product placement in the lucrative gaming industry to traditional online ads.

In 2007 alone, according to a Rand Corp. study, the total recruiting budget for the Army, Navy, Air Force, and Marine Corps exceeded $3.2 billion. Rand Corp. analysts also deemed those investments successful, as measured by recruitment, even during two ongoing wars in Afghanistan and Iraq.

Events with military personnel always feature sophisticated press and social media coverage. One of the more nuanced and, I think, effective messages I have seen from the DoD is that the military is not just about defense, but about a more deeply and morally resonant “good.” The U.S. Navy’s very slick videos call the branch “a global force for good” and show Navy SEALs in action carrying that message.

This clip from a U.S. Navy recruiting video shows a successful branding effort by the U.S. Department of Defense to promote its global activities as a moral good, including operations by U.S. special forces.

Helping to prop up that messaging is the country’s long-standing integration of public health services into the DoD and overall military readiness. The military is successfully integrating public health activities and branding them as part of its global efforts, including on the new battlefield in Africa.

Through contracting opportunities that support these efforts, many U.S.-based firms that specialize in development and traditional public health activities are actively supporting these initiatives as a way to monetize their own business models.

Chasing contracts, serving two masters: public health and defense

I recently stumbled on a job posted on the American Public Health Association (APHA) LinkedIn page by a company called the QED Group, LLC. The position was similar to ones I see posted on the company’s job site now, for work on a “monitoring and evaluation” project in Africa.

This is one of many government-contracting firms that chase hundreds of millions of dollars in contracts with U.S. government agencies and major public health funders like the Bill and Melinda Gates Foundation.

In this case, the company was specifically targeting members of the public health community, whether entering the field or currently employed, with backgrounds in public health, economics, and science. The 15-year-old company actually began as a so-called 8(a) contractor, meaning it could win lucrative no-bid government contracts that are now at the center of an ongoing and intense controversy over government waste. (The provision was created by the late Alaska Sen. Ted Stevens to steer billions in government contracting to Alaska Native-owned firms that partner with companies like Halliburton and Blackwater, overseas and in the United States.)

The company QED Group showcases its recent work evaluating anti-terrorism-related efforts in North Africa.

Today, QED Group, LLC claims it is a “full-service international consulting firm committed to solving complex global challenges through innovative solutions” by providing clients “with best-value services so they increase their efficiency, learning capacity, and accountability to the public in an ever more complex and interconnected world.” It lists the standard international development and public health contract areas: health, economic growth, and democracy and governance.

QED Group is not the only multi-purpose public health and development firm chasing military and global health contracts in Africa. Another health contracting company, PPD, boasts of its “long history of supporting the National Institutes of Health, the nation’s foremost medical research agency,” and says it was “awarded a large contract by the U.S. Army.” It claims it is also a “preferred provider to a consortium of 14 global health Product Development Partners (PDPs), funded in part by the Bill & Melinda Gates Foundation.”

To a public health professional, QED Group looks like a great company to join. Scratch deeper, however, and one learns that the company also puts its public health competencies to work for the U.S. military, whose operations in Africa are spearheaded by U.S. Africa Command, or AFRICOM. This raises larger questions about the conflicting ethics of promoting human and public health while also serving the U.S. Department of Defense, whose primary mission is to “deter war and to protect the security of our country.”

AFRICOM’s emerging role in flexing U.S. power in Africa

AFRICOM’s demonstration of “hard power” is well-documented through its use of lethal firepower in Africa. AFRICOM is reportedly building a drone base in Niger and is expanding an already busy airfield at a Horn of Africa base in the tiny coastal nation of Djibouti. On Oct. 29, 2013, a U.S. drone strike killed an explosives expert with the al-Qaida-linked al-Shabaab terrorist group in Somalia, the group that had led a deadly assault on a Kenyan shopping mall the previous month.

One blog critical of U.S. foreign policy, Law in Action, reports that AFRICOM is involved in the A to Z of Africa: “They’re involved in Algeria and Angola, Benin and Botswana, Burkina Faso and Burundi, Cameroon and the Cape Verde Islands. And that’s just the ABCs of the situation. Skip to the end of the alphabet and the story remains the same: Senegal and the Seychelles, Togo and Tunisia, Uganda and Zambia. From north to south, east to west, the Horn of Africa to the Sahel, the heart of the continent to the islands off its coasts, the U.S. military is at work.”

U.S. efforts in Africa require health, public health, and development experts. As it turns out, QED Group won a USAID contract examining U.S. efforts promoting “counter-extremism” programs in the Sahel. That study evaluated the work using AFRICOM-commissioned surveys, all designed to promote U.S. national security interests in the unstable region.

The region is deeply divided between Christians and Muslims. It is also home to one of the largest al-Qaida-affiliated insurgencies, al-Qaida in the Islamic Maghreb, which has violent aspirations similar to those of the ultra-violent Boko Haram militant movement in northern Nigeria. Al-Qaida in the Islamic Maghreb seized control of northern Mali in 2012, an occupation that ended when U.S.-supported French military forces invaded the country and routed the Islamic extremists in January 2013.

Public health’s historic role with U.S. defense and national security

“Hard power” and “soft power” are tightly intertwined in U.S. overseas efforts, where health and public health personnel support U.S. interests. This is true in Afghanistan and is certainly true in North Africa. The QED-led project applied a traditional public health method, program evaluation, to determine whether a USAID counter-extremism program was changing views in Mali, Niger, and Chad, all extremely poor countries at the heart of a larger struggle between Islamists and the West.

That research methods used in public health, methods I have used to focus on health equity issues in Seattle, can be used equally well by U.S. development agencies to advance a national security agenda is not itself surprising.

However, faculty certainly did not make that case where I studied public health, at the University of Washington School of Public Health. I think courses should be offered on public health’s role in national defense and international security activities, because for many public health professionals it is nearly inevitable that their work will overlap with some form of security interest, whether they want to accept this or not.

U.S. Public Health Service Commissioned Corps members proudly serve their country and wear its uniforms. This photo published on the corps’ web site demonstrates that pride.

Public health in the United States began as part of the U.S. armed services, as far back as the late 1700s. It was formalized with the military title of U.S. Surgeon General in 1870. To this day, those who enter the U.S. Public Health Service Commissioned Corps wear military uniforms and hold military ranks.

A good friend of mine who spent two decades in the Indian Health Service, one of seven branches in the corps, retired as a colonel, or “full bird.” He was always bemused when much larger and far tougher service personnel had to salute him when he showed his ID entering Alaska’s Joint Base Elmendorf-Richardson, often looking like a fashion-challenged bum in his minivan. (He frequently had to see patients on base, and he did his job well.)

The U.S. Public Health Service Commissioned Corps’ web site shows the different uniforms worn by its members.

The U.S. Army’s Public Health Command traces its roots to WWII, and it remains active today. One of its largest centers is at Madigan Army Medical Center at Joint Base Lewis-McChord, in Pierce County, Washington. Public health activities are central to the success of the U.S. armed services, which promote population-based measures and recommendations outlined by Healthy People 2020 to maintain a healthy fighting force.

AFRICOM charts likely path for the future integration of public health and defense

This screen snapshot of an AFRICOM media file highlights the public health and health related efforts AFRICOM personnel undertake in the region, where military efforts are also underway to suppress and disrupt Islamic extremist groups.

Today, the U.S. military continues to use the “soft power” of international public health to advance its geopolitical interests in Africa. In April 2013, for example, AFRICOM hosted an international malaria partnership conference in Accra, Ghana, bringing together malaria experts and senior medical personnel from eight West African nations to share best practices for addressing the major public health threat posed by malaria.

At last count, the disease took an estimated 660,000 lives annually,  mostly among African children.

At the event, Navy Capt. (Dr.) David K. Weiss, command surgeon for AFRICOM, said: “We are excited about partnering with the eight African nations who are participating. We’ll share best practices about how to treat malaria, which adversely impacts all of our forces in West Africa. This is a great opportunity for all of us, and I truly believe that we are stronger together as partners.”

I have reported on this blog before how AFRICOM and the United States will increasingly use global health as a bridge to advance the U.S. agenda in Africa. Global health and public health professionals will remain front and center in those activities, apart from the far messier and more controversial use of drone strikes.

It is likely this soft and hard power mission will continue for years to come. Subcontractors like QED Group will likely continue chasing contracts with USAID related to terror threats. Global health experts will meet in another African capital to discuss major diseases afflicting African nations at AFRICOM-hosted events. And drones will continue flying lethal missions over lawless areas like Somalia and the Sahel, launching missiles at suspected terrorist targets.

Smallpox’s legacy in Oregon, a state now celebrated for vaccination deniers

Smallpox remains the only human disease that has been successfully eradicated. Its scourge has been global, impacting nearly every great civilization from the time of the Pharaohs onward.

In Europe, smallpox reportedly claimed 60 million lives in the 1700s. In the 1500s, up to 3 million Aztecs died after being infected by the invading Spanish, a plague that brought about the collapse of their culture and civilization more effectively than the violent conquistadores could have ever dreamed. The last reported case occurred in the 1970s. Since that time, the virus has existed only in two highly guarded labs.

Smallpox is also tragically rooted in the meeting of European and Native American cultures, and its horrific impact on the continent’s first peoples underlies the nation’s historic narrative as much as political and economic developments from colonial expansion to industrialization to slavery.

The pilgrims, like the Spanish, brought the dreaded scourge, which immediately took a toll on Native tribes along the Eastern seaboard. (The first outbreak also claimed 20 of the white settlers’ lives.) Founding Father Ben Franklin lost a son to smallpox in 1736. But smallpox, more than any army, made it possible for the young American nation to conquer Native areas, particularly in the Pacific Northwest’s Oregon territory, where many places were wiped clean of their Native inhabitants. I will talk more about the impacts in Oregon shortly, but first some background on the killer virus.

Smallpox’s enormous role in North American and Native American history

There are two smallpox variants: Variola major, the more severe form, and the less severe Variola minor. Symptoms begin with fever and lethargy about two weeks after exposure, followed by a sore throat and vomiting. A rash then appears on the face and body, along with sores in the mouth, throat, and nose. Infectious pustules emerge and expand, and by the third week scabs form and separate from the skin. The virus is spread by respiratory droplets, and also by contaminated bedding and clothes; many historians suspect the latter is how the disease was transmitted to Native Americans in North America.

French Jesuits in Canada in 1625, according to an account by Ian and Jennifer Glynn in The Life and Death of Smallpox, received great hostility from Natives because of the link made between the disease and contact with Europeans. The missionaries reported the local people “observed with some sort of reason that since our arrival in these lands those who had been the nearest to us had happened to be the most ruined by [smallpox], and that whole villages of those who had received us now appeared utterly exterminated.”

The first recorded use of smallpox as a weapon came during the siege of Fort Pitt in 1763, when Native tribes taking part in Pontiac’s uprising, in the aftermath of the French and Indian War, were reportedly given infected blankets by the British, possibly with the goal of infection. The science of the day did not yet include germ theory or an understanding of microbial infection, but there was an experience-based sense of how the disease might spread. Reports also exist of the British attempting to infect colonial areas during the Revolutionary War, all early cases of germ warfare.

Smallpox was reportedly used against the 10,000-man contingent of the Continental Army that invaded British-held Quebec. Half of that force was stricken by smallpox, and it was theorized the British commander may have intentionally spread it by sending infected persons to Continental Army camps. The invading army’s commander died, and the force retreated in 1776, keeping the Canadian territories intact and thus giving birth to Canada. John Adams noted: “Our misfortunes in Canada are enough to melt the heart of stone. The smallpox is 10 times more terrible than the British, Canadians and Indians together.”

Abraham Lincoln supposedly contracted it during the height of the Civil War in 1863. Had he died, the course of U.S. and global history could have turned out very differently. (I for one am glad he survived.)

The first vaccine, developed by Edward Jenner in 1796, was derived from cowpox. Jenner had observed how a milkmaid was protected from the more deadly Variola major and minor by a previous exposure to cowpox. It was not until 1947 that a frozen vaccine was introduced globally. After a costly global campaign, smallpox was declared eradicated in 1980.

The College of Physicians of Philadelphia has published an extremely useful illustrated timeline of the history of smallpox in the United States and globally.

It was less than 100 years ago that smallpox wreaked havoc in the United States. A photo provided by Dr. Bennet Lorbar shows a man with pox marks on his body, one of the victims of the 1925 Milwaukee outbreak that claimed 87 lives.

Today, many people in the United States, particularly those born after routine smallpox vaccinations were ended in 1972, have no memory of how awful such a disease can be. (The CDC has a plan to vaccinate the entire country should the virus ever break free from its labs.)

This may be a contributing factor in the rise of the anti-vaccination movement. It should be noted that opposition to smallpox vaccination in the United States dates to the 1920s, and indeed as far back as Jenner’s first vaccine.

Ex-Playmate McCarthy and the vaccination deniers

The most famous case of modern-day vaccination denialism is linked to controversies surrounding the measles, mumps, and rubella (MMR) vaccine and its alleged link to autism and autism spectrum disorder. This bogus claim was based entirely on a widely discredited study published by the British medical journal the Lancet in 1998 and formally retracted in 2010. It has been further debunked by extensive population-based studies.

Facts, of course, have still not stopped former 1994 Playmate of the Year Jenny McCarthy and her “Green our Vaccines” campaign from claiming that toxins in vaccines cause autism.

Would anyone care what Jenny McCarthy has ever said if she did not have large breasts and was not the Playmate of the Year in 1994?

Her campaign of disinformation got a boost in mid-July 2013, when Walt Disney Co.-owned ABC News gave her a national stage, hiring the vaccination extremist for its show The View. She begins her post in September.

As expected, a chorus of worried public health advocates and policy wonks decried ABC’s crass capitalistic gesture. This made no impact whatsoever on the parent corporation, Disney, all of which might lead a rational person to ask when Disney-owned ABC News might hire a blond, big-boobed Holocaust denier to co-host a lively, unscripted talk show, so long as she boosted ratings.

Smallpox wiped out Native Americans in the state that now has the highest rate of vaccination exemptions

It seems painfully ironic that the state with the highest rate of parents opting out of childhood vaccinations is Oregon. This is a major public health concern: when fewer people receive vaccinations, herd immunity is reduced, making it easier for a disease to spread.

Oregon currently has the highest rate of unvaccinated children in the nation, well above the national average of 1.2%.

As of 2013, Oregon schools had the nation’s highest rate of non-medical (meaning religious) immunization exemptions for kindergarten-age children, with an all-time high of 6.4% exempt. That same year, according to the Centers for Disease Control and Prevention (CDC), the state also recorded some of the highest rates of pertussis (whooping cough) cases the United States has seen in the past 50 years.

According to the newsletter the Lund Report: “In 2013, rates also showed that 17 counties have now surpassed the common 6 percent threshold whereby herd immunity may be compromised for some vaccine-preventable diseases such as pertussis and measles. In 2012, 13 counties were above 6 percent.”
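
To see why that 6 percent threshold matters, consider the standard herd immunity arithmetic. The sketch below is my own illustration, not the Lund Report’s analysis; the R0 and vaccine effectiveness values are rough, commonly cited assumptions, not measured Oregon figures.

```python
# Illustrative herd immunity arithmetic (a sketch; the R0 and vaccine
# effectiveness values are rough, commonly cited assumptions).

def herd_immunity_threshold(r0: float) -> float:
    """Fraction of a population that must be immune to block
    sustained transmission: 1 - 1/R0."""
    return 1.0 - 1.0 / r0

def effective_r(r0: float, coverage: float, effectiveness: float) -> float:
    """Effective reproduction number when (coverage * effectiveness)
    of the population is immune; outbreaks grow when this exceeds 1."""
    return r0 * (1.0 - coverage * effectiveness)

for disease, r0, eff in [("measles", 15.0, 0.97), ("pertussis", 14.0, 0.85)]:
    needed = herd_immunity_threshold(r0)
    r_eff = effective_r(r0, coverage=0.94, effectiveness=eff)  # 6% exempt
    print(f"{disease}: immune fraction needed ~{needed:.0%}, "
          f"effective R at 94% coverage ~{r_eff:.2f}")
```

Under these assumptions, both diseases come out above the critical value of 1 at 94 percent coverage, which is the arithmetic behind the warning that a 6 percent exemption rate can compromise herd immunity for highly transmissible diseases.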

Thanks to a new law signed in July 2013 by Gov. John Kitzhaber (D), himself a doctor, it will now be harder for Oregon parents to get exemptions from mandatory immunizations for children enrolling in schools.

Now, flash back more than two centuries, to when the scourge of smallpox was first recorded in the Northwest as a result of trade with Europeans. A smallpox epidemic, starting in the upper Missouri River country, swept through present-day Oregon to the Pacific Ocean in 1781-82 with horrific effects. Another scourge of “fever and ague,” likely malaria, ravaged Oregon in 1830-31. Other diseases such as tuberculosis, measles, and venereal infections also took a huge toll. Epidemics took an estimated nine of every 10 lives among the lower Columbia Indian population between 1830 and 1834.

A rest stop on the Columbia River Gorge provides historical background on the decimation of Oregon’s Native residents by disease in the 1800s.

In 1834, Dr. John Townsend, in the area that would become the Oregon Territory, wrote of a mass extermination of Native residents similar in scope to what one today only knows through zombie or science fiction films like World War Z and I Am Legend.

Townsend wrote: “The Indians of the Columbia were once a numerous and powerful people; the shore of the river, for scores of miles, was lined with their villages; the council fire was frequently lighted, the pipe passed round, and the destinies of the nation deliberated upon . . . Now alas! where is he? –gone; —gathered to his fathers and to his happy hunting grounds; his place knows him no more. The spot where once stood the thickly peopled village, the smoke curling and wreathing above the closely packed lodges, the lively children playing in the front, and their indolent parents lounging on their mats, is now only indicated by a heap of undistinguishable ruins. The depopulation here has been truly fearful. A gentleman told me, that only four years ago, as he wandered near what had formerly been a thickly peopled village, he counted no less than sixteen dead, men and women, lying unburied and festering in the sun in front of their habitations. Within the houses all were sick; not one had escaped the contagion; upwards of a hundred individuals, men, women, and children, were writhing in agony on the floors of the houses, with no one to render them any assistance. Some were in the dying struggle, and clenching with the convulsive grasp of death their disease-worn companions, shrieked and howled in the last sharp agony.”

An image shows the young then-U.S. officer Ulysses S. Grant during his tour of duty on the Pacific Coast, where he saw the devastation of smallpox firsthand.

While stationed in Fort Vancouver on the banks of the Columbia River in 1852 and 1853, future Union General and President Ulysses S. Grant recorded similar devastation: “The Indians, along the lower Columbia as far as the Cascades and on the lower Willamette, died off very fast during the year I spent in that section; for besides acquiring the vices of the white people they had acquired also their diseases. The measles and the small-pox were both amazingly fatal. … During my year on the Columbia River, the smallpox exterminated one small remnant of a band of Indians entirely, and reduced others materially. I do not think there was a case of recovery among them, until the doctor with the Hudson Bay Company took the matter in hand and established a hospital. Nearly every case he treated recovered. I never, myself, saw the treatment described in the preceding paragraph, but have heard it described by persons who have witnessed it. The decimation among the Indians I knew of personally, and the hospital, established for their benefit, was a Hudson’s Bay building not a stone’s throw from my own quarters.”

(Those interested in this topic may wish to buy, download, or borrow a study of smallpox’s impact on Native North Americans called Rotting Face: Smallpox and the American Indian. One reviewer wrote that smallpox “claimed more lives from the Northern Plains tribes in one year than all the military expeditions ever sent against American Indians.”)

Where is the statue or monument pointing out this critical event in Oregon’s history?

Yet I could find no record of any statue or memorial in Oregon today that notes this historic tragedy, which depopulated a region and left it wide open for white settlers in the mid-1800s. Perhaps if such physical reminders were present, with educational programs to accompany them, there might be a more lively debate in Oregon. But as of now, it is a state celebrated for its vaccination deniers and for denying the benefits of community water fluoridation to residents of its major urban center, Portland, for a fourth time since the 1950s.

Maybe a statue honoring ghost villages, dead tribes, and forgotten cultures could be dedicated on the banks of the scenic Willamette River in downtown Portland, kicked off with a special celebrity ceremony. The organizers could host a live broadcast of The View with Jenny McCarthy, in a revealing dress, describing why the state’s residents should keep their children from being vaccinated against diseases such as pertussis.

I would make sure this event included representatives of the remaining tribal groups who managed to survive the wholesale disease-driven extermination of their brethren not so many decades ago, much of it caused by illnesses now controlled through childhood immunizations. Now that would be an attention-grabbing event that might just propel the discussion in a new direction.

The crowded, congested, contested road: unsafe at nearly every speed

Seattle traffic is among the worst in the nation, and it can be downright deadly, according to those who track road-related fatalities.

Every day that I drive to work, I am literally putting my life on the line. I commute roughly 80 miles daily, round trip, from Seattle to Tacoma, navigating one of the most harrowing urban traffic corridors in the United States, on Interstate 5 and two state highways. (My story of why I commute this way is for another day, but there are good reasons.)

Routinely, erratic drivers pass me dangerously, putting our lives at risk in order to gain a few extra minutes by speeding. I have seen many accidents, some fatal, on this route over the years, and I am glad that I have my will and living will in proper order in case a truck jackknifes near me in the rain. And yes, I have seen that happen twice before on the freeway system around Seattle.

How deadly are roads in the Puget Sound? Take a look at the roadkill on this data map showing types of mortality by form of transportation for 2001-2009.

Judging by this map, we get a fair share of road kill in the metro area I call home.

The Centers for Disease Control and Prevention (CDC) put the number of road deaths in my home state at nearly 500 annually (2009). Nationally, 34,080 people died in motor vehicle traffic crashes in 2012, a 5.3% jump over 2011. That would rank as the 10th leading cause of death in the United States, if one pulled this form of death out of all accidental deaths, the category in which CDC epidemiologists group it.
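
As a quick sanity check on those national figures (my arithmetic, not the CDC’s or NHTSA’s), the quoted 2012 count and 5.3% jump imply a 2011 baseline of roughly 32,400 deaths:

```python
# Back out the implied 2011 baseline from the quoted 2012 figures.
deaths_2012 = 34_080
pct_jump = 0.053  # the reported 5.3% increase over 2011

implied_2011 = deaths_2012 / (1 + pct_jump)
print(f"Implied 2011 road deaths: {implied_2011:,.0f}")  # ~32,365
```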

So by all counts, getting in one’s car (or on one’s bike or in a bus or other form of transportation) and hitting the road can be deadly business in my country, especially given the proliferation of mobile-device users and drunk drivers.

In 2011, cell phone use in the good ole’ U.S.A. was a contributing factor in more than 3,300 deaths and, in the previous year, in 387,000 motor vehicle injuries. These are very sobering numbers, and I actually expected more, given that I have seen far too many texters during peak travel times in vehicles moving 70 mph. Normally I move over a lane or lay on my horn to snap them out of it.

But this is nothing compared to the perils that passengers and drivers experience globally. According to the World Health Organization (WHO), road accidents claimed 1.2 million lives globally in 2011, ranking as the No. 10 cause of death, on a list that has some pretty nasty company, including respiratory infections (3.5 million), tuberculosis (1.3 million), and the big killer of children ages 0-5 years, diarrhea (2.5 million).

The Institute for Health Metrics and Evaluation produced this data table showing how road injury globally compares to other burdens of disease (it is No. 10); go to: http://www.healthmetricsandevaluation.org/gbd/visualizations/gbd-heatmap

A typical story that one sees with mind-numbing frequency overseas is the bus colliding with motorcycles and motor scooters. This November 2012 story, “19-yr-olds crushed to death by bus,” notes that two young men were run over by an errant bus driver and dragged 40 feet in Chandigarh, India; the driver then fled the scene. Both men’s heads were crushed by the bus’s wheels.

I saw no fewer than three similar road maulings on the island of Java in 2009, when I visited Indonesia. That island, one of the most densely populated places in the world, is overwhelmed with low- and middle-income residents on scooters competing for space with trucks and an army of loosely regulated or unregulated van taxis and buses.

Indonesians who use these highly efficient and inexpensive 100-125cc motor scooters are frequently killed on the island nation’s infamously unsafe and crowded roads.

Road accidents alone in Indonesia account for more than 48,000 deaths annually, the 9th leading cause of death in the world’s largest Muslim nation.

The United States Department of State offers this stern warning to would-be American visitors to Indonesia, a country I really loved, by the way: “Air, ferry, and road accidents resulting in fatalities, injuries, and significant damage are common. … While all forms of transportation are ostensibly regulated in Indonesia, oversight is spotty, equipment tends to be less well maintained than that operated in the United States, amenities do not typically meet Western standards, and rescue/emergency response is notably lacking.”

During my two-week visit in 2009 to the island nation, I rode about a dozen different buses and as many microbuses, not to mention one flight on the country’s crash-prone domestic air carriers, the local train service (also unsafe at times), and the far less safe inter-island ferries. I saw about a half dozen crashes from my bus window, most of them fatal and usually with motorcycle riders as victims, and from my hotel room I heard one multi-vehicle crash in the middle of the night that clearly claimed many lives. I learned the next day it was between a bus and a truck; the bus was totaled.

Roads can really kill you overseas, and so can planes, boats, and trains too

Buses like these are cheap in Indonesia, but your life can be, as some locals would say, insha-Allah, or at the mercy of God.

The writer Carl Hoffman, author of the book The Lunatic Express: Discovering the World… via Its Most Dangerous Buses, Boats, Trains, and Planes, documents the horrendous conditions of ferries, public transportation, trains, planes, and other forms of transport. The book’s online promotion notes that it offers a “harrowing and insightful look at the world as it is, a planet full of hundreds of millions of people, mostly poor, on the move and seeking their fortunes.”

Anyone who has travelled in developing or “middle-income” countries (like, say, Chile or Turkey) knows their life is literally in the hands of drivers who may have no proper training, in buses with no proper maintenance or even reliable brakes. Worse, the drivers of buses and microbuses in countries from Uganda to India to Mexico may trust their fate to Allah, Saint Christopher, the Virgin Mary, or Krishna. Those who have travelled in such places know this to be true from the many religious icons dangling by the drivers’ seats at the front of public transportation.

Worse, the drivers will often play chicken with their competitors, darting into oncoming traffic at high speed while passing other vehicles or simply to “have fun.” I swear I thought I would die on many occasions: in Mexico, Guatemala, Nepal, Peru, Uganda, Indonesia, Egypt, Turkey, Chile, Argentina, India, and other places I’d rather forget just now.

When is an accident really an accident, and when is it linked to larger systemic issues? This analysis is provided by Anne-Emmanuelle Birn in her description of the social determinants of health (SDOH).

Three separate times, after living through a near mishap, I swore I would never, ever take a bus again in a developing nation. Yet I threw caution to the wind each time: I needed to get around, and I could not afford any other way. Not seeing the country I was visiting was not an option.

Is it really “just an accident” or something more?

Anne-Emmanuelle Birn, international health professor at the University of Toronto and co-author of the widely used global health tome Textbook of International Health, points out the deeper connections that road-related deaths have to poverty and social inequity in developing and middle-income countries. Birn writes that road traffic accidents are the second-leading cause of death for children between 5 and 14 years of age globally, and that the poor and working classes are disproportionately affected in most countries. In high-income countries, most of those killed are drivers and passengers, whereas in low- and middle-income countries pedestrians, cyclists, and public transport passengers make up nine out of every 10 road-related deaths.

In Nigeria, for instance, locals call one form of local transport molue (“moving morgue”), and in the country’s south they say danfo (“flying coffins”).

Duncan Green, an Oxfam policy adviser and development blogger, recently wrote an article asking when road traffic injuries would finally be recognized as a priority by the international development community.

In fact, a major report released in June 2013 by the Overseas Development Institute, the United Kingdom’s leading development think tank, notes that transportation is not recognized as a human right in the way access to water is, yet it remains fundamental to achieving many basic human rights. Well-run transportation systems, for people and for goods and services, deliver broad benefits, while unsafe and weak systems harm the most vulnerable citizens.

Given the debate now emerging over sustainable development after 2015, the deadline set for the Millennium Development Goals, road safety may finally find its way into the broader public health, development, and environment agenda as a way to tackle this clearly documented major global killer. Perhaps the threat may finally be treated as the international epidemic that it is, globally and closer to home in the United States. For me, that includes the Puget Sound roads where I spend more than two hours daily commuting to and from my public health job.

Musings on slavery, abolitionist John Brown, and Hollywood’s clumsy embrace of human bondage

News stories continue to highlight the growth of human trafficking in the United States, Europe, and especially Asia. One estimate puts the number of persons in captivity, either for forced bondage or sex trafficking and prostitution, at 12 million to 27 million. An increasing number of victims are girls 18 and younger, who become infected with sexually transmitted diseases such as HIV/AIDS.

Slavery seems to bring out the worst of humanity, and is perhaps a manifestation of our inglorious inhumanity. Sadly, it is about as American as the U.S. Constitution, which not only enshrined it but gave Southern states extra voting power for their slaves, via the notorious three-fifths clause, in the census allotment of Congressional seats.

I still remember visiting the Philippines in 2003. Male and female pimps repeatedly accosted me within seconds of exiting taxis in front of my hotels in Cebu City and Manila, where I was working on a photo-documentary project. I was sure their workers were sex slaves. When I told them to go away, they mocked me and even offered me young children. It was sobering to realize that I represented a market, a lucrative one, that eagerly comes to countries like the Philippines, Thailand, Cambodia, and Laos to exploit women and even young boys and girls. Though I am aware of the problem and have seen evidence of its freewheeling nature in Asia, the unrelenting media coverage of sex slavery has become overwhelming.

Time Magazine reported on slavery in Embassy Row in the nation’s capital three years ago, but it can happen anywhere in the United States.

In April 2013, European Union Home Affairs Commissioner Cecilia Malmström lamented: “It is difficult to imagine that in our free and democratic EU countries tens of thousands of human beings can be deprived of their liberty and exploited, traded as commodities for profit.” The United Nations estimates human trafficking nets $32 billion annually—a major transnational business. The United States fares no better. There are slaves being trafficked and sold in my home city of Seattle right now. A local KIRO News story recently reported: “Child sex trafficking – as easy in Seattle as ordering a pizza.”

Visiting Osawatomie, and its place in U.S. history

So slavery was on my mind when I drove across the country in late May from St. Louis to Seattle. I wanted to take a road less traveled and see some out-of-the-way places, including in Kansas. Most of my friends practically laughed at me when I described sightseeing there. So I pulled out my atlas and found Osawatomie on the map, about an hour southwest of Kansas City, along state Highway 169.

Osawatomie is home to one of the most important battles of the violent pre-Civil War era known as Bleeding Kansas, a conflict that claimed 56 lives.

Specifically, it is where America’s most famous abolitionist and violent revolutionary, John Brown (1800-1859), fought pro-slavery forces to prevent the then Kansas Territory from becoming a slave state. All told, 30 to 45 free-state defenders, known as Jayhawkers (the University of Kansas’ namesake), fought nearly 250 proslavery militiamen along the banks of the Marais des Cygnes River on Aug. 30, 1856. Brown’s son Frederick and others died. Many say the Civil War actually began in this small Kansas town, which pro-slavers burnt to the ground during the attack.

Entrance to John Brown Memorial Park in Osawatomie, Kan.

In May of that year, 800 Missouri ruffians had sacked Lawrence, Kan., burning a hotel and killing one abolitionist. Their strategic goal was to keep an entire race of persons in human bondage, treated as nothing more than property, and to expand the inhumane practice and trade into territories recently “ethnically cleansed” of their Indian population by the U.S. Army, based at Ft. Leavenworth.

On May 24 and 25, 1856, in the so-called Pottawatomie Massacre, Brown responded in kind, murdering five pro-slavery settlers with swords. The mass murder by Brown and his sons was inspired by Brown’s deep Christian faith that he had been called to a divine mission: to end slavery and to contest its brutality, and that of its violent supporters, with force.

The repeated and well-publicized examples of slavery’s inhumanity in the United States enraged Brown to the point where he dedicated his life to crushing it and freeing the slaves. (Unlike most men of his day, Brown also believed in the equality of the races, including Indians, and of the sexes.)

Just two years earlier, in 1854, a divided Congress had passed the Kansas-Nebraska Act, ending the fragile 34-year-old Missouri Compromise, which had allowed a balance of pro-slave and free states to join the Union. Under the 1854 act, settlers themselves would determine whether that “peculiar institution” of slavery, which held in bondage an estimated 4 million persons, or 13% of all residents in the young country, would be allowed. Pro-slavery voters won, but the resulting constitution was disavowed, the bogus legislature tossed out, and Kansas entered the Union as a free state in 1861.

One historic political outcome of the four years of fighting in the territory was the rise of a young Illinois politician of the nascent Republican Party, who noted in his political speeches, “Look at the magnitude of this subject! … about one-sixth of the whole population of the United States are slaves!” Abraham Lincoln emerged from the turbulence of the era as the standard bearer of his party in the divisive 1860 election that set in motion the war to address what Lincoln accurately called “the all absorbing topic of the day.”

As for Brown, after Osawatomie he travelled in and out of Kansas through the next two years of violence before returning East to plan his Oct. 16, 1859, raid on the federal armory at Harper’s Ferry, Va. The raid, meant to trigger a Southern slave uprising with a force of just 21 men, failed miserably.

A statue of the abolitionist and revolutionary John Brown stands guard at a park that bears his name in Osawatomie, Kan.

Brown was captured, tried in Charlestown, Va., and hanged on Dec. 2, 1859. During his trial he told the court, “Now, if it be deemed necessary that I should forfeit my life for the furtherance of the ends of justice, and mingle my blood further with the blood of my children, and with the blood of millions in this slave country whose rights are disregarded by wicked, cruel, and unjust enactments, I submit: so let it be done.”

Southern politicians were terrified by Brown’s decisive and violent insurrection against the U.S. government and their “cherished traditions.” Their paranoia over either a slave uprising or further such “meddling” precipitated their rebellion against the Union.

All of that history seemed long blown over and forgotten in modern-day Osawatomie (pop. 4,447). The memorial to Brown and the battle is the John Brown Museum State Historic Site. It includes the cabin of a local minister and his wife that served as an Underground Railroad station and survived the battle. The park features a bronze statue of Brown and historic battle markers. It looked a little shabby and unappreciated, like any small-town park without money for upkeep, except that it happens to have hosted two presidential visitors who delivered policy speeches there: Teddy Roosevelt in 1910 and Barack Obama in 2011.

Hollywood, Slavery, and the Battle for Kansas

For many of us, however, perceptions of slavery are shaped by popular culture. One of the two most recent Hollywood treatments of the subject was Steven Spielberg’s scholarly costume epic Lincoln. The film did not hide the brutality of slavery; in fact, it opens with a vicious hand-to-hand battle pitting Union soldiers, likely former slaves, in deadly embrace with their white Confederate adversaries. The film is basically a procedural drama about how Lincoln’s administration passed the 13th Amendment to the Constitution, ending slavery “forever” in the United States, while the nation’s most violent war raged outside of Washington.

The more controversial rendering of slavery is Quentin Tarantino’s 2012 blood-and-gore pre-Civil War spectacle, Django Unchained. This shoot-’em-up racks up a huge body count in a gratuitously violent revenge fantasy that follows a former slave, Django, played by Jamie Foxx. He kills perhaps two dozen Southerners, blows up a plantation mansion, and frees his true love. Unlike Lincoln, this film was heatedly debated. One review noted, “No single Hollywood film in the last decade has sparked the kind of controversy and wide-ranging response as Quentin Tarantino’s latest.”

The film triggered unrest not because of its brutal violence (nothing new for Hollywood splatter fests), but because of its rival view of history. “The most important thing about Django Unchained is that it’s a reaction against, or corrective of, movies like Birth of a Nation and Gone with the Wind. At every turn, it subverts or inverts the racist tropes that have defined Hollywood’s–and our culture’s–treatment of slavery, the Civil War, and Reconstruction,” according to Jamelle Bouie.

My black friends had a distinctly more positive personal reaction to the violent tale than did my white friends. While the film’s violence seems designed only to thrill audiences, the violence of slavery, and of the pro-slavery bushwhackers’ efforts to expand it in Kansas before and during the Civil War, was every bit as cruel if not more so, if historical records are accurate. Reality actually trumps anything Tarantino could dream up.

The magazine Harper’s printed an illustration of the 1863 raid by Southern bushwhackers on Lawrence, Kan., which killed about 180 people.

According to one account, a bushwhackers’ raid on Lawrence, Kan., during the Civil War is considered one of the worst cases of mass murder by pro-slavery forces.

On Aug. 21, 1863, 450 pro-Confederates led by Bill Quantrill staged an early-morning raid and mostly showed no mercy, slaughtering about 180 men and boys as young as 14. Most of the victims were unarmed and still in their beds when the killing began. Another famous bushwhacker in the region, a psychopath named “Bloody” Bill Anderson, reportedly scalped victims before he was tracked down and killed, and then beheaded as an example.

The official Hollywood rendering of “Bleeding Kansas” and John Brown’s efforts to end slavery remains Michael Curtiz’s unsavory, pro-slavery 1940 Western Santa Fe Trail (you can see the whole film here). The movie stars Errol Flynn as future Confederate General Jeb Stuart, then-actor Ronald Reagan as future Indian-killing General George Custer, and Olivia de Havilland as their mutual romantic interest. The film renders a staggering historical whitewash not only of slavery and pre-Civil War America, but of John Brown’s actions in Kansas contesting the bushwhackers during the mid- to late 1850s.

Brown is portrayed by Raymond Massey as a bug-eyed, villainous psychopath bent on murder and revolution to end slavery, while Southern gentlemen like Flynn’s Stuart are true Americans who claim the South can work out slavery on its own terms. There is no portrayal of slavery’s base cruelty, only of abolitionist violence in Kansas and at Harper’s Ferry.

Raymond Massey portraying John Brown on his hanging day, Dec. 2, 1859, an event that sped the nation faster toward Civil War.

In an even more bizarre twist, future Confederate President Jefferson Davis is rendered as a moral voice of wisdom, telling the graduating cadets: “You men have but one duty alone, America.” This was the same Davis who owned slaves and dedicated himself to ensuring slavery’s survival as head of the pro-slave states, which were doing everything they could to break away from that country.

The pro-slavery 1940 film Santa Fe Trail featured escaped slaves as subservient, pro-slavery fools who desired to return to plantation life rather than chase freedom with John Brown.

The only “black folk” seen in this disingenuous Dixiecrat rendering of reality are powerless, witless slaves who cannot think for themselves. After a firefight that sent Brown fleeing, a husband-and-wife slave couple from Texas, caught up in Brown’s violence, reveal themselves to Stuart as misguided lovers of the white slaveholding class: “Well, old John Brown said he gonna give us freedom but, shuckin’, if this here Kansas is freedom then I ain’t got no use for it, no sir,” drawls the wife. Her husband adds, “Me neither. I just want to get back home to Texas and set till kingdom come.” I suppose that means he’d get a good whipping if he fessed up to trying to win his freedom.

As one film commentator noted: “In the years before 1960 most portrayals of slavery in cinema were like it was in Gone with the Wind and Jezebel. The slaves were happy and contented and too simple to live on their own. The Civil War was unnecessary and brought on by a handful of fanatics in the North.” The film’s final scenes show Brown before he is hanged in 1859, followed by a happy kiss for the newlyweds, Flynn and de Havilland, all just two years before the entire country entered its greatest conflagration, which claimed more than half a million lives and finally “ended” slavery as a legal institution in the United States.

Former Klansman becomes part of Hollywood whitewash of Southern bushwhacking

The other noteworthy and historically inaccurate portrayal of Kansas-related bushwhacking violence is Clint Eastwood’s disturbing 1976 revisionist film The Outlaw Josey Wales. While supposedly based on a real Southern fighter, the film rewrites the script of historic events. Instead of violent Confederate bushwhackers who murdered indiscriminately, as they did in Lawrence, Southerners are portrayed as victims of murderous Jayhawkers and Union soldiers, who kill innocent women, slaughter surrendering prisoners, and hound Wales to Texas. The film was based on a novel, Gone to Texas, by Asa Carter, also the author of a popular children’s book called The Education of Little Tree.

At the time the film was made in 1976, it was not known that Carter had reinvented himself. Far from being the Cherokee Indian he claimed to be, Carter was in fact a former Alabama Klansman, avowed racist, and speechwriter for Alabama’s segregationist Governor George Wallace. The books served as a clever reinvention for a man who had preached against “government intrusion,” as Carter did for Wallace, in racist hate language. Even his supposed Cherokee words were fiction. As for Josey Wales, the film helped to reinforce Southern stereotypes of Northern aggression and Southern innocence (despite the South’s holding of 4 million people in captivity), while boosting Eastwood’s maverick filmmaking career.

In 2013, in an era when slavery seems to be as thriving an enterprise globally as it was in the antebellum South, perhaps it is time to reexamine on the big screen the complex events in Kansas and Virginia and the fanatical revolutionary who committed his life to ending the institution forever. I just do not want the filmmaker to be Eastwood, Tarantino, or even Spielberg, nor a vampire camp production. It is time to let someone else tell a tale that still needs to be told. Love him or hate him, Brown was right about slavery’s stain on the nation. Brown’s enemies “could kill him,” wrote the freed slave and fellow abolitionist Frederick Douglass, “but they could not answer him.”

Project Homeless Connect provides ‘disaster relief’ close to home

On May 17, 2013, I joined other employees of my public health department in working at Project Homeless Connect. This is, at present, a quarterly endeavor to provide a range of medical and social services to the estimated 2,000 homeless individuals of Pierce County, Washington.

However, the people who line up as early as 7 a.m. for a range of needed services are not all homeless. Many have jobs but lack health and dental insurance. They basically come for primary or even emergency care that they cannot access elsewhere.

The Washington State Department of Social and Health Services was one of many organizations participating in Project Homeless Connect, held on May 17, 2013, at Calvary Community Church in Sumner, Wash.

Project Homeless Connect, in its communications for its volunteer-run event, said it offered the following:

  • Medical and urgent care
  • Urgent dental care
  • Mental health services
  • Social service referrals
  • Vision/glasses
  • Haircuts
  • Child/adult immunizations
  • Veterinary care
  • Legal and financial advice
  • Housing, shelter, employment and education information
  • Tobacco cessation
  • Homeless assistance
  • Veterans services
  • Chemical dependency assessment

This was no small effort. Months of planning went into pulling off this disaster-relief-style engagement, a format more associated with hurricanes and tornadoes than with meeting the basic needs of residents of Pierce County, the second most populous county (pop. 812,000) in Washington State.

Large converted vans and trucks lined up to provide veterinary services, dental care, and other interventions. Yet, oddly, no media were present to put the story on the 5 p.m. news or in the daily newspaper the following day. (I checked via Google searches but found nothing.) Why not? Everyone who was homeless in Pierce County, and most social and medical service providers, likely knew the event was taking place months in advance.

I did not see any elected officials (they may have come, and they may even have volunteered). All of this took place in a county whose hospitals make profits per patient visit $1,000 higher than the state average, and where nonprofit hospitals earn profits of up to and more than $500 million.

I saw all kinds of people: young, old, white, black, Asian, Latino, Pacific Islander, disabled, able-bodied, veterans, you name it. Volunteers came in all stripes as well: military personnel, dental assistant students from Pierce County community colleges (Bates and Pierce), trained medical providers, church volunteers, hair stylists, and more. What struck me the most was how polite and appreciative the attendees were. Many drove or were driven from remote parts of the county to this semi-rural area southeast of Tacoma.

One of the providers, Medical Teams International, brought one of its full-service converted mobile home vans to provide dental care.

Medical Teams International brought one of its converted mobile home vans to Project Homeless Connect on May 17, 2013, in Sumner, Wash.

That program boasts a fleet of 11 mobile dental clinics in Oregon, Washington, and Minnesota that use 38-foot converted motor homes. Each clinic contains two full medical stations and all necessary equipment, instruments, and supplies. The organization claims it has helped more than 200,000 adults and children with its mobile medical program since 1989.

Medical Teams International defines itself as a Christian global health organization “demonstrating the love of Christ to people affected by disaster, conflict, and poverty.” The group works globally, including in Africa, South America, Asia, and North America.

Yet here it was in Pierce County, addressing what that organization clearly perceived as something akin to disaster and conflict.

In Washington State, 14 percent of all residents are without health insurance, according to the Kaiser Family Foundation. In Pierce County, the percentage is roughly the same.

All of this I find remarkable. Less than five miles from this revolving quarterly circus of human need sits a major shopping center, South Hill Mall, offering about every major electronic gadget and consumer good on the market. Truck and car lots were also close by, with vehicles selling for $25,000 and up. The disconnect was palpable, particularly in the same week the Republican-led U.S. House of Representatives passed its 37th legislative measure to repeal or defund the market-driven health care reform known to its detractors as “Obamacare.”

I recall what one of my University of Washington School of Public Health colleagues, the one I respected more than nearly all others, told me when we talked about peers who had worked or would work in public health in Africa or in developing nations. My friend asked, somewhat ironically, why they don’t work at home; we have plenty of problems here. Given what I saw at Project Homeless Connect in Pierce County in mid-May 2013, I could not agree more.

Why Joan of Arc matters to beleaguered public health

Milla Jovovich in her role as Joan of Arc in the film The Messenger: The Story of Joan of Arc.

Recently, I watched a movie about the life of Joan of Arc (Jeanne d’Arc) called The Messenger: The Story of Joan of Arc, directed by Luc Besson and starring Milla Jovovich. Though the movie got tepid reviews, I was mesmerized by it.

The period epic faithfully retells many key moments in the short life of the world-renowned young French leader, including her actual words as recorded in detailed written accounts. I found the movie intoxicating because of Jovovich’s exuberance as Joan, inspiring her countrymen to take up arms to free their nation, ensuring the crowning of the Dauphin as king in Reims Cathedral, and following what she saw as the will of God.

Few single individuals have had such an impact on world history as this illiterate peasant girl, who rose to prominence in a violent male world and became one of history’s greatest and most inspirational figures, and a saint for Catholic believers. At the mess hall at West Point, a mural depicting history’s greatest military leaders includes a rendition of Joan holding a sword and wearing full body armor.

No single historic figure from Europe during the Hundred Years War between France and England remains as famous today as Joan. By the age of 17, she unswervingly acted on voices in her head telling her to drive the English from France and crown Charles VII as King of France. This came at France’s weakest moment in its history, with the English and Burgundians in control of half the country.

Yet this virtually unknown girl never wavered. She gained access to the French court in the spring of 1429 in Chinon, France. She withstood questions from learned and suspicious church officials, as well as a virginity test. She arrived in the besieged city of Orleans in April that year, bearing a standard and ready for action.

In defiance of cautious male commanders, she helped lead the French to defeat the attacking English, suffering several nearly fatal injuries herself. Her foes called her a witch and remained fearful of her talismanic powers. She brought together violent, power-hungry men, like the Count of Dunois and the Duke of Alencon, around a common cause, to the point that they would even stop swearing and offered her blind loyalty. Most importantly, she restored the confidence of the French people around a common goal. Soon, all of Europe was talking about the Maid of Orleans and her battlefield exploits.

Joan of Arc being burned at the stake after being tried by the English and church leaders in 1431. She was only 19 years old.

By July that year, Charles VII was crowned king. Yet within a year, the young peasant who worked miracles was captured and ransomed to the English, tried as a heretic, and burned at the stake in Rouen on May 30, 1431, for having worn men’s clothes, no less.

Nearly five centuries after her murder, she was pronounced a saint by the Catholic Church for the miracles linked to her remarkable accomplishments. While she did promote violence, she always offered her opponents peaceful alternatives, and she reportedly showed great kindness to those captured.

So why should anyone in public health care about Joan of Arc?

As a student of history, I found many elements of her remarkable story relevant for my reality. Instead of beleaguered 15th century France, I find myself in the reality of the beleaguered U.S. public health system.

Religion, you say? That has nothing to do with healthcare and public health, right? Well, that ignores the fact that religion has everything to do with healthcare and public health, as faith-based providers like Medical Teams International demonstrate.

Well, an illiterate peasant girl can teach nothing of value to doctors, PhDs, and other well-educated professionals who run our nation’s public health system, right?

I recently read an article highlighting leadership and public health. Among the attributes associated with leadership: serving, complex thinking, being a change agent, empowering oneself to empower others, risking failure, creating the future one envisions, and being confident in one’s beliefs and then living the change one wants. I am hard-pressed to find examples of such traits in leaders in my field who resonate widely with the American public. Joan of Arc consistently showed all of these leadership traits, from risking her life on the battlefield, to being a catalyst, to having supreme confidence in her vision.

Former U.S. Surgeon General and "Public Health Hero" Dr. David Satcher.
Former U.S. Surgeon General and “Public Health Hero” Dr. David Satcher.

In the United States, there are always “unsung hero” awards for people whom no one outside the particular field giving the award has heard of, or even cares about, it seems. While these may help sustain the field’s providers, they likely do little to inspire the public.

The University of California, Berkeley in February held its annual event for “public health heroes,” awarding its 2013 prize to former U.S. Surgeon General Dr. David Satcher. However, I suspect few Americans know who Dr. Satcher is, what he accomplished, or why such facts matter to the nation’s crisis of promoting public health in the 21st century.

This is not to belittle Dr. Satcher’s many accomplishments, such as his calling attention to the oral health epidemic in the United States. (Oral health experts have been talking about his report for more than a decade because he and it were spot on.)

Public health, teetering like France before the arrival of Joan of Arc?

Of course medieval France has nothing in common with the reality of modern America and its healthcare system, right? But if you take the view that history can teach open-minded students of the present many valuable lessons, regardless of their field, you might find parallels.

France at Joan’s time was on the verge of collapse, lacking strong leadership and a vision to restore hope and unity. Joan arrived completely confident in her vision and religious mission, and she never wasted a day. She famously said, “Better today than tomorrow, better tomorrow than the day after.” She also is remembered for her words, “go forth boldly.” Such words and such inspiration are lacking in the U.S. public health system, to me at least.

Those working in the field of public health are constantly exposed to the reality of budget cuts that continue to hack away at programs ranging from chronic disease interventions to immunizations. Between 2008 and 2010 alone, in the aftermath of the Great Recession, more than half of all local public health departments cut core funding, and the field shed 23,000 jobs and cut programs, mainly due to falling tax revenues that hammered local and state funding.

Things continue to spiral downward as the recession’s effects linger, and mandatory across-the-board federal budget cuts known as the sequester will soon impact every local public health department in the country, as well as the national agencies that help fund local efforts. The Public Health Institute warned that sequester-related cuts will be “devastating to the public’s health.” Such cuts, the institute says, “will cost jobs and resources in the short run, and the long-term costs—in money and lives—will be borne by families and communities for years to come.”

Crises also prevent departments from pursuing innovation as they focus on life support and triage. Morale suffers, which degrades service and core functions. Leadership, what little may exist in this beleaguered environment, is lacking. Public health managers struggle to communicate to the public what public health is and why it matters.

They fail to show that the U.S. health system’s focus on treatment, not prevention, is largely unsustainable for the population’s health and the economy. In 2009, U.S. public health spending (at all governmental levels) amounted to $76.2 billion, only 3% of the nation’s overall healthcare outlays of $2.5 trillion. Yet chronic diseases, which public health efforts can address, account for three quarters of all healthcare costs.

Public health spending, in billions of dollars, versus all other healthcare spending in the United States, and spending on chronic diseases versus all other healthcare costs.

Reform does happen, and it can be bold when breakthroughs capture the public’s and globe’s attention.

HIV/AIDS assistance, now at the heart of a larger global public health agenda, was launched in the late 1990s when activists outside the medical and public health establishment demanded that antiretroviral drugs, or ARVs, be made available to many of the world’s poorest and most afflicted nations, most of them in Africa. ARVs suppress the virus inside the bodies of infected people, reducing its spread and making it possible for them to live long lives.

It was not reformers inside “the system” but radicals outside it who offered a clear vision and the groundswell for change that the establishment eventually fully embraced.

As someone who works inside “the bureaucracy,” however, I am ever mindful of how the great Joan of Arc was ultimately marginalized, tortured, and burned alive at the stake for her completely unorthodox ways, which challenged nearly all in authority in her day. The English did not trigger her downfall; palace politics, sexism, and likely fear of her power did.

One of many Joan of Arc statues in France honoring one of the French nation’s greatest heroes.

The lessons are telling today. You can work miracles, but the machinations of a bureaucratic system can be deadlier than the slings and arrows of a battlefield full of sworn enemies. You could transpose the palace intrigues of the 15th-century French and English courts to any bureaucracy today and find a near-perfect fit. Would any bureaucratic leader trust an uneducated, poor, unconnected interloper to provide a vision for change for a failing health and public health system, such as the one facing the United States in 2013?

Sure, such a thought is laughable, but it happened, and can happen again. It may even be needed if things continue on the present course.

In the end, no one remembers the bishops who tried and convicted Joan, or the weak king she helped bring to power, or in fact any of the kings of her day. Likewise, no one remembers or cares about bureaucrats in the end. Why? Quite simply, they are not visionaries.

It is Joan who has statues in her honor, countless biographies recounting her legend, and many movies and documentaries exploring her incredible exploits.

Coptic Christians under assault, and memories of my Egyptian travels

On April 7, a mob in Cairo attacked a funeral procession of Coptic Christians, a minority in the now Muslim Brotherhood-led nation of Egypt. The attackers turned violent during the siege, firing guns and throwing petrol bombs, according to press reports. Prior to the fall of former president and practical dictator-for-life Hosni Mubarak, state police protected Christian monasteries and churches in Egypt because of the historic persecution of the minority Christians over decades.

Coptic Egyptians protest the assault that killed two and left nearly 100 injured at St. Mark’s Cathedral in Cairo on April 7, 2013.

During the violent outburst at St. Mark’s Cathedral, two people were killed and nearly 100 were injured. Christians inside the walled compound sustained what was called a “frenzied assault” from unknown perpetrators.

I visited Egypt in 2004 and saw well-armed and well-manned police garrisons at multiple monasteries, including those in unpopulated areas, as well as at St. Mark’s Cathedral, the seat of the Coptic Christian Church. Amid the disintegration of Egyptian civil society and the ascendancy of the long-banned Muslim Brotherhood, Coptic Christians and their most sacred sanctuaries are now under direct assault. Tensions have escalated since the election of the U.S.-educated Islamist Mohamed Morsi as Egypt’s president in June 2012.

Egypt’s Coptic leaders had grown increasingly wary of worsening conditions over the last five years, particularly since the demise of the U.S.-backed Mubarak. Some media observers credit Muslim clerics, the Muslim Brotherhood, and its political wing, the Freedom and Justice Party, with inciting hostility toward the nation’s Christian minority.

Inside Bishoi Monastery, one of the oldest Coptic monasteries in Egypt (2004).
Coptic Christians, like the young men seen here from my 2004 photo, are a persecuted minority in Egypt.

In 2009, amid the swine flu scare, the Mubarak government destroyed more than 300,000 pigs, a move the United Nations rebuked as unnecessary. Many believed the act was motivated by Islam’s prohibition on eating pork and the fact that Egypt’s pork industry is run almost entirely by Copts, many of them urban poor.

One blogger wrote, “It is a national campaign to rid the country of its estimated 300,000 pigs in the name of public health.”

Copts allege the military council in the post-Mubarak era—the military still runs many Egyptian institutions and business sectors—is doing little against perpetrators of the attacks. Copts also have long complained of discrimination, including a law requiring presidential permission for churches to be built.

The Daily Star newspaper of Lebanon reports that many Copts question their future as Egyptians. The paper notes the latest round of violence is the worst since Morsi was elected in June 2012: “Christians have been worrying about the rise of militant Islamists since the fall of President Hosni Mubarak in 2011. But after days of fighting at the cathedral and a town outside Cairo killing eight – the worst sectarian strife since Islamist President Mohammad Morsi was elected in June [2012] – many Copts now question whether they have a future in Egypt.”

Who are the Copts?

Today, Copts purportedly number about one in every 10 of Egypt’s 85 million residents. Official statistics, however, placed them at roughly half that figure, or about 5 million. The Coptic Church challenges that estimate, pegging their numbers at 15-18 million.

Father Tawdros at St. Anthony’s Monastery in Egypt, taken in 2004.

The term “Copt” originally meant simply a native Egyptian, with no religious connotation; only later did it take on the religious meaning it carries today.

The Coptic Church is among the oldest Christian churches, preceding Islam’s arrival in Egypt by centuries in a land that is central to Judaism and Christianity. Some of the most important places to both faiths lie within Egypt’s borders, including Mt. Sinai and St. Catherine’s Monastery in the Sinai Peninsula.

The Copts split from the Eastern Orthodox and Roman Catholic Churches in 451 AD in a theological dispute over the nature of Christ. Today Copts are most similar to the Eastern Orthodox Church and perhaps the Armenian Orthodox Church. In addition, the Coptic language, which descends from the ancient Egyptian language and is written with the Greek alphabet, is still used in parts of Coptic services.

Increasing violence targets Christian minorities in the Middle East

Among the worst attacks on Egypt’s Coptic minority in recent years was the New Year’s Eve bombing in Alexandria that ushered in 2011. It targeted a Coptic church and killed 21 people. No one has been arrested or brought to trial for the terrorist attack in one of Egypt’s most cosmopolitan and historic cities. The deed was largely forgotten as the world’s attention focused on the “Arab Spring.”

Since the U.S.-led overthrow of Saddam Hussein in Iraq in 2003, Christians throughout the Middle East have felt increasingly under siege. Terrorist attacks on and murders of Christians have occurred widely in many countries. (See the map of the dispersion of Christians throughout the region; in all cases Christian communities preceded the ascendancy of Islam but today are distinct minorities.)

In Egypt, and to a greater degree in civil-war-plagued Syria, the “Arab Spring” has brought intense disorder and violence to many minorities and minority faiths (Christians, Chaldeans, Kurds, and Alawites, among others). Christians throughout the region remain fearful for a peaceful future of coexistence in the lands that gave birth to Christianity.

In Egypt, Copts now claim life was better under the dictator Mubarak, who dealt brutally with Islamists and their radical militant wings, which waged military and political campaigns for decades.

Many Copts believe Muslim radicals want to eradicate Christianity, whose roots in Egypt predate the Islamic era.

According to an article published by the Middle East Quarterly, Muslim rulers historically have denied collective minority rights to non-believers. The concept of dhimmitude, itself a controversial term, describes the Islamic practice of denying equality to Jews and Christians, who since the Middle Ages have lived within the political realm of Muslim rulers and nations. Islam provided religious autonomy, not national freedom. To be fair, political rights for many groups, women, economic classes, and faiths were not fully realized anywhere in the world until the last two centuries, and then only slowly, and not fully even today.

Memories of monasteries and my travels in Egypt

Whenever overseas events occur, it is often impossible to feel a connection to them. In the case of Egypt, however, the collapse of Egyptian civil society resonates deeply with me. I had a chance to tour many parts of the country in 2004, observing the great poverty experienced by tens of millions of Egyptians under Mubarak’s corrupt rule. I was treated well, and I met many wonderful people, Muslim and Christian alike.

My visit to the St. Tawdros Monastery near Luxor required the permission of the local army commander for the entire region around the Valley of the Kings (taken in 2004).
Suryani Monastery (2004).

I also visited many remote monasteries throughout the country—St. Catherine’s in the Sinai (run by the Greek Orthodox), St. Anthony’s in a remote inland oasis 30 miles from the Red Sea, Bishoi and Suryani monasteries in the Wadi Natrun oasis about 80 miles northwest of Cairo, and St. George’s and St. Tawdros’s monasteries, in the desert near Luxor.

The monasteries date as far back as the fourth century AD, preceding the Arab Islamic conquest that followed in the seventh century. Today about 50 monasteries remain.

I found the Coptic monasteries to be breathtakingly beautiful and peaceful. These are continuously inhabited facilities, but also significant cultural and historic sites.

The monks who greeted me were generous and gave me tours of their facilities. To visit St. Tawdros’s Monastery, I required a police escort from no less than the commander of the entire military contingent protecting the Valley of the Kings region, one of the most popular tourist destinations in Egypt and the scene of one of Egypt’s most violent terrorist assaults. At all of the compounds, armed guards were present in large numbers.

Those guards have now melted away. In fact, it was the Egyptian military that led a coordinated assault on the Bishoi Monastery in February 2011, shortly after the terrorist bombing in Alexandria.

Video of the incident shows nothing less than a full assault by armed men, equipped with armored personnel carriers and bulldozers, demolishing an outer protective wall that I recall seeing under construction during my 2004 visit. The government denied responsibility despite the glaring video evidence. Today the monastery, one of Egypt’s great historic treasures, is at risk of increased mob and organized violence by Islamic radicals and political extremists.

Click on the image to see the full video of the Egyptian military leading an attack on the Bishoi Monastery in February 2011, which destroyed a protective outer wall.

I’m not sure what will happen in Egypt. It is likely Egypt’s Christians will remain a persecuted minority and some of the world’s greatest historic treasures will be desecrated by extremists and opportunists, as was seen after the U.S.-led invasion of Iraq and as the world is observing in Syria amid its civil war.

America’s cultural zeitgeist and the emerging Don Corleone of public health

This has been one of the wildest weeks exposing the extremes of America’s cultural zeitgeist that I can remember. What could be more American than gay marriage moving into the mainstream of American life while semi-automatic weapons remain readily available at a Walmart near you, right?

Need a weapon of war to feel safe? Just drive to the nearest Walmart and select from its popular product lines.

On one hand, you have the U.S. Supreme Court hearing two landmark cases: one on the legality of a voter-approved ban on same-sex marriage, and another on the constitutionality of the federal Defense of Marriage Act, which ties hundreds of federal benefits to the premise that only a man and a woman can legally marry.

Meanwhile, a full-court press was taking place in Congress to advance legislation that would require criminal background checks on all gun purchases and close the so-called gun-show loophole, which allows up to 40% of all firearms sales to evade any scrutiny at all. However, efforts to include Sen. Dianne Feinstein’s amendment to restrict the sale of semiautomatic, military-style assault rifles, the kind used to slaughter 26 civilians at Newtown, were dashed when Senate Majority Leader Harry Reid (D-Nev.), on March 20, pulled it from the current gun legislation in the U.S. Senate. GOP members of Congress are already promising to filibuster the bill.

Will Ferrell, actor, comedian, and cultural clairvoyant, seemed to sum up the obvious best.

Will Ferrell’s now much repeated tweet seemed to put the pulse of the nation best: “I feel so blessed that the government protects my wife and me from the dangers of gay marriage so we can safely go buy some assault weapons.”

And, as we have so often seen in our country, sometimes tasteless but very popular comedians can best capture the seeming craziness of political reality where serious-minded commentators fall flat. Perhaps only through comedy can we see the absolute surreality of our current moment.

Bloomberg takes on the NRA: no quarter asked, and none given

This week also saw the launch of Mayor Michael Bloomberg’s $12 million campaign in 10 states to promote federal gun legislation through his national coalition of big-city mayors, Mayors Against Illegal Guns. “I don’t think there’s ever been an issue where the public has spoken so clearly, where Congress hasn’t eventually understood and done the right thing,” said the multi-billionaire leader of a national political movement to restrict the proliferation of weapons that claim more than 31,000 lives annually.

Bloomberg’s newly created super PAC, Independence USA PAC, infused millions into the last federal election cycle, helping elect four of the seven candidates it backed who promoted legislation to reduce gun violence in the United States, a major public health threat that only now is getting the attention of public health officials nationally after years of self-imposed silence.

Wayne LaPierre went head to head with Michael Bloomberg on the talk shows.

Likely fearing the emergence of a national political movement, the National Rifle Association (NRA) launched a counter-strike against Bloomberg’s media campaign. NRA head Wayne LaPierre sparred with Bloomberg on Meet the Press on March 24, framing Bloomberg as a plutocratic, public health-minded uber-nanny who threatened America’s freedoms, including the alleged right to own guns and the right to eat unhealthy food:

“And he can’t spend enough of his $27 billion to try to impose his will on the American public,” said LaPierre, the national face of the most powerful gun industry lobby. “They don’t want him in their restaurants, they don’t want him in their homes. They don’t want him telling them what food to eat; they sure don’t want him telling them what self-defense firearms to own. And he can’t buy America.”

Which multi-billionaire do you want to champion public health, Gates or Bloomberg?

Bloomberg’s effort to limit the size of sugary drinks in New York City was recently struck down by the courts. But Bloomberg remains determined to preserve his emerging national status as the Don Corleone of public health.

From pushing upstream interventions to tackle obesity to funding multiple efforts to reframe the national dialogue on guns and America, Bloomberg appears to be everywhere at once these days. In many ways, the bolder, tougher, more confrontational face for public health and the national voice for legislative action on clear public health threats is the 71-year-old Boston native.

By force of will and deep pockets, Bloomberg is emerging as a rival plutocratic public health warrior to the reigning champion, Bill Gates, whose Microsoft-based wealth helped fund the biggest non-governmental player in public health, the Bill and Melinda Gates Foundation. With $34 billion in assets, it is the largest openly run private foundation on the planet.

Which Don Corleone do you want to promote public health, Bill Gates or Michael Bloomberg?

Multi-billionaire Gates has carefully chosen non-confrontational public health initiatives that many limited-government and conservative-minded leaders can champion, such as poverty reduction, education, and technological efforts like genetically modified crops. Bloomberg’s approach is much more in-your-face, New York style. He has proven very effective on the bully pulpit, staking out public positions and articulating views that few in the field of public health or even elected office have championed since the assault weapons ban passed in 1994 as part of a major crime bill under the Clinton White House.

One thing is clear: leadership, in the wake of repeated gun-fueled tragedies like the Sandy Hook Elementary School mass murders, is making a difference. And for a change, the NRA’s seemingly unshakable momentum to promote ever-expanding firearms sales and legislation allowing the deadly use of force has been called into check.

This also has rippled down to public health departments, which are now showing greater resolve and passing measures declaring firearms-related deaths a preventable threat to public health. Maybe Bloomberg’s moxie is rubbing off. Such symbolic efforts by public health departments clearly are not a true fix, but they are a long-awaited and long-overdue baby step forward.

The politicization of public health (and everything else too)

Click on the photo to open a link to the video clip of Maher’s commentary.

Some might say TV host Bill Maher is so political that he cannot be trusted. I disagree.

On March 8, on his TV show, Maher delivered a very provocative commentary that everyone in the field of health promotion, public health, and public policy should watch. Maher rightly asked, “Since when in America did everything have to be so political?” It was a smart piece of punditry, because he correctly showed how efforts to promote public health, nutrition, and healthy eating had become as politicized as the debate over regulating the proliferation of firearms.

Showing pictures of First Lady Michelle Obama, a champion of a national nutrition and exercise campaign called Let’s Move, Maher opined, “If seeing this nice lady on TV saying she likes the movies, or nutrition, or exercise fills you with rage, get help.”

Maher further correctly noted, “Big portions, conservative; knowing where your food came from, liberal.” In short, Maher said what few in the public health profession are saying or have the courage to say—that a deep schism exists in the public space that taints and will continue to taint all efforts to tackle some of this country’s biggest health problems.

These include the obesity epidemic and the threat posed to our healthcare system and our national health by chronic disease.

Ever a political lightning rod ready to fan conservative flames, former half-term Alaska Gov. Sarah Palin used her speaking appearance at the 40th annual Conservative Political Action Conference (CPAC) on March 16 to lambaste New York City Mayor Michael Bloomberg’s efforts to tackle obesity by limiting the size of sugar-sweetened beverages. Bloomberg’s New York City rule limiting the serving size of such drinks to 16 ounces was overturned by a New York state judge on March 11.

This perfectly framed AP file photo from March 16 shows half-term former Alaska Gov. Sarah Palin’s eager embrace of red-meat politics that seeks to prevent even small measures to address the proliferation of obesity in the United States.

Completely ignoring the obesity crisis afflicting her own former state and the country, where two-thirds of all residents are obese or overweight, Palin slurped soda from a 7-Eleven Big Gulp. The theatrics, perfectly in line with Palin’s anti-government theology, again proved Maher’s point about the politicization of even micro-level efforts by local elected officials to address the public health threats facing the country. (Side note: Palin was briefly governor when I lived in Alaska, and I saw her at health promotion events like community runs, an activity she likely would brand as “liberal” today.)

Whenever I engaged Puget Sound area public health officials during my two years of study at the University of Washington School of Public Health (2010-’12), I always asked how they could keep efforts to promote healthy activity and nutrition from being perceived by conservative voters and Republican elected officials as part of a liberal, activist agenda. I never got a good answer, mainly because I do not believe those officials had one. I did not draw any great wisdom from my faculty or UW SPH peers either.

Some wonkish types have tried to investigate this issue in “philosophical terms,” along traditional axes of egalitarianism and choice-minded conservatism against regulation-minded “big government” liberalism. One 2005 article on responsibility in health care choices argued, “Holding individuals accountable for their choices in the context of health care is, however, controversial.” There may be some truth to this, but I discount the “core political values” explanation as a way of understanding the politicization of public health initiatives.

Perhaps the biggest fight in the U.S. political system today is over tax policy and the future of the major social and medical programs (Social Security, Medicare, Medicaid) that provide the true underpinning of the public wellness of our country. This is, at its core, a vicious political battle that will shape the public health of the country unlike any action taken by any regulatory or health agency of the U.S. government.

Regulation to promote health has been at the heart of the public health enterprise ever since the field emerged as a profession in the United States in the late 1800s. According to the Centers for Disease Control and Prevention, many of the most successful public health achievements of the 20th century (food safety, motor vehicle safety, identifying tobacco as a health hazard, etc.) were “upstream” interventions that, by definition, were regulatory in nature and thus inherently political.

Moreover, public health, as a public enterprise, is by definition a creature of the political process, and thus subject to the power of the purse, which can curtail its authority and stymie its reach. Public health departments today, for instance, are managed by publicly accountable officials. A local board of health, like King County’s, includes a broad range of elected officials and a few medical professionals.

The nation’s leading de facto public health official, the U.S. Surgeon General (Dr. Regina Benjamin), today remains a mostly toothless position with little if any sway over the public policy debate concerning the nation’s public health, according to New York Times health blogger Mark Bittman. He writes, “… there is no official and identifiable spokesperson for the nation’s public health, and the obfuscation and confusion sown by Big Food, along with its outright lies and lobbying might, has created a situation in which no one in power will speak the truth: that our diet is making us sick, causing millions of premature deaths each year and driving health care costs through the roof.”

I personally believe the position of Surgeon General remains a paper tiger because those who hold power, members of Congress and the Executive Branch, do not wish to allow an advocate for public health to embarrass them with pesky things like facts and science that call for action.

Dr. C. Everett Koop, former U.S. Surgeon General and effective communicator and advocate for public health.

The most effective Surgeon General in living memory, the late Dr. C. Everett Koop, who passed away in February, proved unpredictable. Though a staunch conservative appointed by President Ronald Reagan, Dr. Koop staked out very controversial political positions on moral and medical grounds, in defiance of his boss.

His notable actions still stand out today for their audacity to challenge powerful interests and their embrace of morality as a tactical advocacy tool:

  • Koop’s office produced the plainly worded, 36-page “Surgeon General’s Report on Acquired Immune Deficiency Syndrome,” which clinically detailed HIV transmission, making clear it was not spread by casual contact and affirming that, “We are fighting a disease, not people.” Koop promoted sex education and condom use, enraging conservative critics.
  • Koop also took on the all-powerful tobacco industry and lawmakers who received its many contributions with his pronouncements that smoking killed and should be banned. He famously called purveyors of cigarettes the “merchants of death.” (When is the last time anyone has heard a medical leader embrace such powerful language for a public health cause?)

Though Koop reportedly claimed morality never “clouded his judgment,” he remained an effective advocate on the bully pulpit by literally shaming those in power. “My whole career had been dedicated to prolonging lives,” he said, “especially the lives of people who were weak and powerless, the disenfranchised who needed an advocate: newborns who needed surgery, handicapped children, unborn children . . .people with AIDS.”

I keep waiting for someone, anyone besides billionaire Mayor Bloomberg, to enter the political discourse on behalf of public health and use straight language that cuts through the hype. The problem is, they cannot teach you leadership when you enter the fields of public health or politics. It is something you either are capable of, or simply lack. Right now, it is lacking.

Do community health fairs really make any difference at all?

As a frequent attendee of community events and festivals in Seattle and many other communities, I have always wondered how effective these events are at achieving their goals of promoting health and wellness. In the public health world, we call these “health fairs,” and they are fairly ubiquitous nationally and accepted as de rigueur. But do they really work?

Somewhat new to the field of public health, I am more familiar with trade shows, which I have been attending for many years. These far more common events provide a shared space where companies, governments, and a mass market meet to find audiences and, hopefully, make sales. They do not seem to be going out of fashion. One show I attended, the biennial Oil and Gas Expo in Calgary, one of the continent’s largest energy shows, draws 20,000 attendees from around the world and sells out every hotel room during its June run. The massive trade fair also attracts some of the world’s largest and most influential companies. So clearly, where money is to be made, “the show must go on.”

The super-sized Oil and Gas Expo in Calgary is a perfect example of how important trade fairs are in the private sector.

But what of health fairs, which cater to smaller subpopulations and sell messages, behavior change, and health awareness that may even be unwanted by the audience? I recall distinctly that one of my public health professors at the UW School of Public Health, who shall remain nameless, said s/he had never seen any evidence that this public health activity had any measurable outcomes, yet health fairs proliferate as a best practice.

Champions of the health fair model

One fan of community health fairs is Dr. Kevin Pho, an internal medicine specialist who runs a blog, KevinMD.com, that reaches out to a mass audience. There he gives space to another blogger who does not give his name, so we cannot verify that he is a true MD, though Dr. Pho vouches for him. The guest blogger offers a passionate defense of health fairs as a way of extending medical care without medical hierarchy: “Meeting in this context fosters rapprochement between patient and doctor. The once hierarchical encounter is no more. In this habitat, doctor and patient are in fellowship.” The mystery doctor, whose credentials we cannot fully validate, claims that health fairs:

  • Are an excellent way to engage underserved communities in caring for their health.
  • Offer a unique opportunity to engage patients in the community with which they self-identify, particularly when they are in the “precontemplation” phase of action.
  • Are a great opportunity to field patient questions; he claims to have fielded many about Bill Clinton’s post-bypass-surgery veganism.
  • Uncover and provide the platform to correct misconceptions, in a nonconfrontational setting that can lead to positive discussions.
  • Can grow a doctor’s practice.
  • Are fun.

    At the 2013 Tet Fest at the Seattle Center, a health clinic table was set up amid other tables hawking cell phone plans and new bank accounts.

The Centers for Disease Control and Prevention (CDC) publishes how-to guides on organizing events that engage target communities, such as this guide focusing on injury prevention for kids. Seattle, where I live, is virtually awash in corporate medical events that also involve local partners, like the Seattle Housing Authority and social service providers like Neighborhood House.

These events focus on many of the minority populations in King County, such as the Latino community, which was engaged at the annual Fiestas Patrias event held in September at the Seattle Center. That particular fair focused on HIV testing, behavioral health, dental care, long-term care, cancer, chronic disease, and culturally appropriate care for the Spanish-speaking community.

I was recently at the annual Tet celebration at the Seattle Center the weekend of Feb. 16-17, 2013, and, not to my surprise, saw a table promoting health-fair-style information for the almost entirely Vietnamese-American audience in attendance. I had no way of knowing whether anyone attending bothered with that booth or was more interested in the photo booth, the deep-fried tofu and Vietnamese coffee, or the stage shows.

A booth offering Tet pictures appeared to be more popular than the health clinic table at the Tet Fest in Seattle in February 2013.

What do we know from recent research?

One non-profit, Unite for Sight, published an article reporting inconclusive evidence about the benefits of health fairs and community screenings. The medical literature has often viewed them with great skepticism: “Health fairs are neither regulated nor routinely certified in the United States, and complete data on their numbers and content are not available.” The article further noted that tests at fairs may be more harmful than helpful because they may unnecessarily alarm participants with bad results or provide false reassurance that results are normal. The article cites a 1985 study that found “rates of false alarm of healthy people and false reassurance of those at risk may be high for some tests, and the benefits of detecting new disease are easily overestimated.”

A more recent 2011 study on blood pressure screenings at community health fairs, published in the Journal of Community Nursing, looked at outreach on hypertension. The article reported “nurse-operated health fairs, crafted to identify those with high BP readings, are promising as a simple and effective means in motivating individuals to seek follow-up care.”

Another study from 2003, Reconsidering Community-Based Health Promotion: Promise, Performance, and Potential, published in the American Journal of Public Health, found that “evidence from health promotion programs employing a community-based framework suggests that achieving behavioral and health change across an entire community is a challenging goal that many programs have failed to attain.” The authors, Cheryl Merzel and Joanna D’Afflitti, write that “interventions themselves probably are too limited in scope and intensity to produce large effects across a community. Many programs focus primarily on individuals, with most people receiving mass education alone, and interventions and messages are not sufficiently tailored to reach various population subgroups.”

How well do health tables compete with private-sector booths, like those run by banks, as seen at Seattle’s 2013 Tet Fest?

The article, however, reported that community interventions have been found to work for, say, HIV. The authors call this the “prevention paradox”: prevention measures that bring big benefits to the community offer little visible benefit to each individual. Thus, most community-based chronic disease prevention programs have reportedly found it hard to get individuals to change their behavior, while HIV-related programs have reportedly worked.

Merzel and D’Afflitti suggest that HIV programs may be more successful than other health promotion events because they target small, homogeneous groups, which is harder to do with large, diverse populations. In their words, “getting identifiable social groups to change specific behaviors with discrete levels of individual risk may be more achievable than developing multiple interventions designed to motivate numerous subgroups of varying risk found within a broad geographically defined community.”