As a native of Detroit, I present this first of several essays with a profound sense of sadness. (See my photo blog for my first photo essay.)
It is hard to accept that my birthplace, this once great global city, has become a symbol of American industrial decay and capitalism’s larger ills. At its peak in the 1950s, Detroit boasted nearly 2 million residents. Today it barely counts 700,000. [Updated census figures, 5/5/2015.]
In its heyday of bustling industrial production, Detroit served as a global icon of American ingenuity, industrial might, and economic power. During World War II, when the larger metro area produced the country’s war weaponry to defeat the Axis powers, Detroiters proudly called their city the Arsenal of Democracy. In the 1920s and 1930s, about 40 percent of all automobiles were manufactured in the Motor City, and the Ford River Rouge plant was the world’s largest.
Today, Detroit is known more as the murder capital of the United States, and the arson capital. All told, 90,000 fires were reported in 2008, double New York’s number—for a city 11 times larger—according to Mark Binelli, author of Detroit City Is the Place to Be. The city has also become the epitome of America’s racial politics. Binelli notes that 90,000 buildings are abandoned, and huge swaths of the 140-square-mile urban area are now returning to nature. Beavers, coyotes, deer, foxes, and packs of wild dogs are now reported in the city.
I just visited Detroit, and the trip had a more profound impact on me than I was prepared for. How is it that our country could undertake two overseas wars to conquer and rebuild nations—Iraq and Afghanistan—and yet abandon a city that helped to make the country the global power it once was?
National partisan politics have played a role, with Detroit becoming, in the eyes of white and conservative detractors, a symbol of the Democratic Party’s failure as a black city and a union city. Then there are NAFTA (pushed by Bill Clinton), industry fleeing the country for cheaper manufacturing from global suppliers, and gross mismanagement of the Big Three automobile companies, two of which were bailed out by U.S. taxpayers in 2009.
White flight eventually followed long-simmering racial tensions. Detroit saw race riots in 1863, 1943, 1967, and 1987, stoked by historic racism, redlining, job discrimination, and the building of freeways that helped destroy America’s inner cities. Today, some criminal fringe actors among Detroit’s mostly black residents are burning what’s left of their own city, at times just for the hell of it.
Charlie LeDuff, author of Detroit: An American Autopsy, paints a heartbreaking portrait of the city’s self-destructive conflagrations through the tales of firemen trying to combat the arsonists. “In this town, arson is off the hook,” said a firefighter to LeDuff. “Thousands of them a year bro. In Detroit, it’s so fucking poor that a fire is cheaper than a movie. A can of gas is three-fifty, and a movie is eight bucks, and there aren’t any movie theaters left in Detroit so fuck it.” (I will do a photo essay of fire-ravaged homes shortly.)
The latest malaise, on top of repeated political scandals and corruption by the city’s bureaucrats and criminal politicians, was a crushing bankruptcy filing in the face of an $18 billion debt. In December 2014, after a year and a half in limbo, a grand bargain was struck among creditors, the city, the state, and private industry that prevented the city from selling its city-owned artwork (Rembrandts, Van Goghs, and more) in the world-famous Detroit Institute of Arts.
As I wandered the glittering white palace that is the DIA, I wondered: what’s more important, this art or the blocks and blocks of emptied neighborhoods that most of this country has forgotten?
Coming back to Portland was hard. As soon as I arrived home, I posted a comment on Twitter about how bizarre it was to be back in Portland, the whitest city in North America, after spending time in the city that America defines as African-American.
News stories continue to highlight the growth of human trafficking in the United States, Europe, and especially Asia. One estimate puts the number of persons in captivity, either in forced bondage or in sex trafficking and prostitution, at 12 million to 27 million. An increasing number of victims are girls 18 and younger, who become infected with sexually transmitted diseases such as HIV/AIDS.
Slavery seems to bring out the worst of humanity, a manifestation of our inglorious inhumanity. Sadly, it is about as American as the U.S. Constitution, which not only enshrined it but, through the notorious three-fifths clause, gave Southern states extra voting power for their slaves in the census allotment of congressional seats.
I still remember when I visited the Philippines in 2003. Male and female pimps repeatedly accosted me within seconds of exiting taxis in front of my hotels in Cebu City and Manila, where I was working on a photo-documentary project. I was sure their workers were sex slaves. When I told them to go away, they mocked me and even offered me young children. It was sobering to realize that I represented a market, a lucrative market, that eagerly comes to countries like the Philippines, Thailand, Cambodia, and Laos to exploit women, even young boys and girls. Though I am aware of the problem, and have seen evidence of its freewheeling nature in Asia, the unrelenting media coverage of sex slavery has become overwhelming.
In April 2013, European Union Home Affairs Commissioner Cecilia Malmström lamented: “It is difficult to imagine that in our free and democratic EU countries tens of thousands of human beings can be deprived of their liberty and exploited, traded as commodities for profit.” The United Nations estimates human trafficking nets $32 billion annually—a major transnational business. The United States fares no better. There are slaves being trafficked and sold in my home city of Seattle right now. A local KIRO News story recently reported: “Child sex trafficking – as easy in Seattle as ordering a pizza.”
Visiting Osawatomie, and its place in U.S. history
So slavery was on my mind when I drove across the country in late May from St. Louis to Seattle. I wanted to take a road less traveled and see some out-of-the-way places, including in Kansas. Most of my friends practically laughed at me when I described sightseeing there. So I pulled out my atlas and found Osawatomie on the map, about an hour southwest of Kansas City along state Highway 169.
Specifically, it is where America’s most famous abolitionist and violent revolutionary, John Brown (1800-1859), fought pro-slavery forces to prevent the then Kansas Territory from becoming a slave state. All told, 30 to 45 free-state defenders, known as Jayhawkers (the University of Kansas’ namesake), fought nearly 250 pro-slavery militia along the banks of the Marais des Cygnes River on Aug. 30, 1856. Brown’s son Frederick and others died. Many say the war actually began in this small Kansas town, which the pro-slavers burnt to the ground during the attack.
In May of that year, 800 Missouri ruffians had sacked Lawrence, Kan., burning a hotel and killing one abolitionist. Their strategic goal was to keep an entire race of persons in human bondage, treated as nothing more than property, and to expand the inhumane practice and trade into territories recently “ethnically cleansed” of their Indian population by the U.S. Army, based at Ft. Leavenworth.
On May 24 and 25, 1856, in the so-called Pottawatomie Massacre, Brown responded in kind, murdering five pro-slavery settlers with a sword. The mass murder by Brown and his sons was inspired by Brown’s deep Christian faith that he had been called on a divine mission to end slavery and to contest its brutality, and that of its violent supporters, with force.
The repeated and well-publicized examples of slavery’s inhumanity in the United States enraged Brown to the point where he dedicated his life to crushing it and freeing the slaves. (Unlike most men of his day, Brown also believed in the equality of the races, including Indians, and of the sexes.)
Just two years earlier, in 1854, a divided Congress had passed the Kansas-Nebraska Act, ending the fragile 34-year-old Missouri Compromise, which had allowed a balance of pro-slave and free states to join the Union. Under the 1854 act, settlers themselves would determine whether that “peculiar institution” of slavery, which held in bondage an estimated 4 million persons, or 13 percent of all residents in the young country, would be allowed. Pro-slavery voters won, but their constitution was disavowed, the bogus legislature tossed out, and Kansas entered the Union as a free state in 1861.
One historic political outcome from the four years of fighting in the territory was the rise of a young Illinois politician of the nascent Republican Party, who noted in his political speeches, “Look at the magnitude of this subject! … about one-sixth of the whole population of the United States are slaves!” Abraham Lincoln emerged from the turbulence of the era as the standard bearer of his party in the divisive 1860 election that set in motion the war to address what Lincoln accurately called “the all-absorbing topic of the day.”
As for Brown, after Osawatomie he traveled in and out of Kansas during the next two years of violence before returning East to plan his Oct. 16, 1859, raid on the federal armory in Harper’s Ferry, Va. The raid, meant with just 21 men to trigger a Southern slave uprising, failed miserably.
Brown was captured, tried in Charlestown, Va., and sentenced to hang on Dec. 2, 1859. During his trial he told the court, “Now, if it be deemed necessary that I should forfeit my life for the furtherance of the ends of justice, and mingle my blood further with the blood of my children, and with the blood of millions in this slave country whose rights are disregarded by wicked, cruel, and unjust enactments, I submit: so let it be done.”
All of that history seemed overblown and forgotten in modern-day Osawatomie (pop. 4,447). The memorial to Brown and the battle is the John Brown Museum State Historic Site. It includes the cabin of a local minister and his wife, used as an Underground Railroad station, which survived the battle. The park features a bronze statue of Brown and historic battle markers. It looked a little shabby and unappreciated, like any small-town park without money for upkeep, except that it has hosted two presidential visitors who delivered policy speeches there: Teddy Roosevelt in 1910 and Barack Obama in 2011.
Hollywood, Slavery, and the Battle for Kansas
For many of us, however, our perception of slavery is shaped by popular culture. One of the two most recent Hollywood treatments of the subject was the scholarly costume epic Lincoln, by Steven Spielberg. The film did not hide the brutality of slavery; in fact, it opens with a vicious hand-to-hand battle pitting Union soldiers, likely former slaves, in a deadly embrace with their white Confederate adversaries. The film is essentially a procedural drama about how Lincoln’s administration passed the 13th Amendment to the Constitution, ending slavery “forever” in the United States, while the nation’s most violent war raged outside of Washington.
The more controversial rendering of slavery is Quentin Tarantino’s 2012 blood-and-gore pre-Civil War spectacle, Django Unchained. This shoot-’em-up racks up a huge body count in a gratuitously violent revenge fantasy that follows a former slave, Django, played by Jamie Foxx. He kills perhaps two dozen Southerners, blows up a plantation mansion, and frees his true love. Unlike Lincoln, this film was heatedly debated. One review noted, “No single Hollywood film in the last decade has sparked the kind of controversy and wide-ranging response as Quentin Tarantino’s latest.”
The film triggered unrest not because of its brutal violence (nothing new for Hollywood splatter fests), but because of its rival view of history. “The most important thing about Django Unchained is that it’s a reaction against, or corrective of, movies like Birth of a Nation and Gone with the Wind. At every turn, it subverts or inverts the racist tropes that have defined Hollywood’s–and our culture’s–treatment of slavery, the Civil War, and Reconstruction,” according to Jamelle Bouie.
I have black friends who had a distinctly more positive personal reaction to the violent tale than did my white counterparts. While the film’s violence seems designed only to thrill audiences, the violence of slavery, and of the efforts to expand it by pro-slavery bushwhackers in Kansas before and during the Civil War, was every bit as cruel, if not more so, if historical records are accurate. Reality actually trumps anything Tarantino could dream up.
According to one account, a bushwhacker raid on Lawrence, Kan., during the Civil War is considered one of the worst cases of mass murder by pro-slavery forces.
On Aug. 21, 1863, 450 pro-Confederates led by Bill Quantrill staged an early-morning raid and mostly showed no mercy, slaughtering about 180 men and boys as young as 14. Most of the victims were unarmed and still in their beds when the killing began. Another famous bushwhacker in the region, a psychopath named “Bloody” Bill Anderson, reportedly scalped his victims before he was tracked down, killed, and then beheaded as an example.
The official Hollywood rendering of “Bleeding Kansas” and John Brown’s efforts to end slavery remains Michael Curtiz’s unsavory pro-slavery 1940 Western, Santa Fe Trail (you can see the whole film here). The movie stars Errol Flynn as future Confederate General Jeb Stuart, then-actor Ronald Reagan as future Indian-killing General George Custer, and Olivia de Havilland as their mutual romantic interest. The film is a staggering historic whitewash not only of slavery and pre-Civil War America, but of John Brown’s actions in Kansas to contest the bushwhackers during the mid- to late 1850s.
Brown is portrayed by Raymond Massey as a bug-eyed, villainous psychopath bent on murder and revolution to end slavery, while Southern gentlemen like Flynn’s Stuart are true Americans who claim the South can work out slavery on its own terms. There is no portrayal of slavery’s base cruelty, only of abolitionist violence in Kansas and at Harper’s Ferry.
In an even more bizarre twist, future Confederate President Jefferson Davis is rendered as a moral voice of wisdom, telling the graduating cadets: “You men have but one duty alone, America.” This was the same Davis who owned slaves and dedicated himself to ensuring slavery’s survival as president of the pro-slave states doing everything they could to break away from that country.
The only “black folk” seen in this disingenuous Dixie-cratic rendering of reality are powerless, witless slaves who cannot think for themselves. After a firefight that sent Brown fleeing, a husband-and-wife slave couple from Texas, caught up in Brown’s violence, reveal themselves to Stuart as misguided lovers of the white slaveholding class: “Well, old John Brown said he gonna give us freedom but, shuckin’, if this here Kansas is freedom then I ain’t got no use for it, no sir,” drawled the wife. Her husband added, “Me neither. I just want to get back home to Texas and set till kingdom come.” I suppose that means he’d get a good whipping if he fessed up to trying to win his freedom.
As one film commentator noted: “In the years before 1960 most portrayals of slavery in cinema were like it was in Gone with the Wind and Jezebel. The slaves were happy and contented and too simple to live on their own. The Civil War was unnecessary and brought on by a handful of fanatics in the North.” The film’s final scenes show Brown before he is hanged in 1859, followed by a happy kiss from the newlyweds, Flynn and de Havilland, just two years before the entire country entered its greatest conflagration, which claimed more than half a million lives and finally “ended” slavery as a legal institution in the United States.
Former Klansman becomes part of Hollywood whitewash of Southern bushwhacking
The other noteworthy and historically inaccurate portrayal of Kansas-related bushwhacking violence is Clint Eastwood’s disturbing 1976 revisionist film The Outlaw Josey Wales. While supposedly based on a real Southern fighter, the film rewrites the script of historic events. Instead of violent Confederate bushwhackers who murdered indiscriminately, as they did in Lawrence, Southerners are portrayed as victims of murderous Jayhawkers and Union soldiers, who kill innocent women, slaughter surrendering prisoners, and hound Wales to Texas. The film was based on a novel, Gone to Texas, by Asa Carter, also the author of a popular children’s book, The Education of Little Tree.
When the film was made in 1976, it was not known that Carter had reinvented himself. Far from being the Cherokee Indian he claimed to be, Carter was in fact a former Alabama Klansman, an avowed racist, and a speechwriter for Alabama’s segregationist Governor George Wallace. The books served as a clever reinvention for a man who had preached against “government intrusion,” as Carter did for Wallace, in racist hate language. Even his supposed Cherokee words were fiction. As for The Outlaw Josey Wales, the film helped reinforce Southern stereotypes of Northern aggression and Southern innocence (despite the South’s holding 4 million people in captivity), while boosting Eastwood’s maverick filmmaking career.
In 2013, in an era when slavery seems to be as thriving an enterprise globally as it was in the antebellum South, perhaps it is time to reexamine on the big screen the complex events in Kansas and Virginia and the fanatical revolutionary who committed his life to ending the institution forever. I just do not want the filmmaker to be Eastwood, Tarantino, or even Spielberg, nor a vampire camp production. Time to let someone else tell a tale that still needs to be told. Love him or hate him, Brown was right about slavery’s stain on the nation. Brown’s enemies “could kill him,” wrote freed slave and fellow abolitionist Frederick Douglass, “but they could not answer him.”
As a former St. Louis-area resident, I first thought my friend was pulling a prank when he shared a story on Sept. 29, picked up by the Daily Mail tabloid in the United Kingdom, alleging that my old home city had been intentionally contaminated by U.S. military researchers during the Cold War. I nearly deleted the email, suspecting it was spam.
It turns out it was not a prank story from The Onion. During the last week of September 2012, St. Louis’ major broadcast news stations (KMOX and KSDK) broke a story on recently completed research of government documents showing that U.S. military researchers had conducted human-subjects testing, in violation of the Nuremberg Code, on poor and minority residents of St. Louis during the 1950s and 1960s. The bombshell, dropped by St. Louis Community College-Meramec sociology professor Lisa Martino-Taylor in her Ph.D. thesis, was that U.S. Army researchers sprayed an aerosol on human subjects that allegedly was laced with a fluorescent additive, a possible radiological compound, produced by U.S. Radium Corp. The company had been linked to the deaths of workers at a watch factory decades before.
The issue of the U.S. government testing on unwilling and non-consenting persons for military and medical research during the Cold War has long been established, both in St. Louis and in the Intermountain West and Washington state. At the Hanford Nuclear Reservation, in southeastern Washington, radioactive iodine (I-131) was intentionally released in 1949 (the Green Run test) to measure the impacts of exposure on human health, part of the U.S. Air Force’s efforts to better understand and track Soviet weapons testing. For its part, St. Louis was one of 33 U.S. and Canadian cities and rural areas intentionally exposed to the spray, which was dispersed from airplanes, rooftops, and vehicles. A subsequent National Research Council committee, in 1997, claimed these tests did not expose residents to chemical levels considered harmful. However, promised follow-up studies may not have been conducted. Residents in St. Louis were quoted in press reports as saying planes dropped a white powder on the people below, which they did not view as potentially harmful at the time.
According to Martino-Taylor, thousands upon thousands of St. Louis residents likely inhaled the zinc cadmium sulfide spray. In St. Louis, where tests were conducted in 1953-54 and 1963-64 by the U.S. Army Chemical Corps, Martino-Taylor said, “The powder was milled to a very, very fine particulate level. This stuff traveled for up to 40 miles. So really all of the city of St. Louis was ultimately inundated by the stuff.” The Daily Mail reported that one of the compounds sprayed unknowingly on St. Louis residents was FP2266 (radium 226), which according to the U.S. Army was made by U.S. Radium Corp. The compound was the same one linked to the deaths of former U.S. Radium Corp. workers.
According to press coverage, the U.S. Army has admitted that it added a fluorescent substance to the “harmless” compound, but the issue of whether the additive was radioactive remains classified.
The story was immediately picked up by a number of blogs, which repeated the allegations and news coverage. Almost immediately, Missouri’s two U.S. senators, Claire McCaskill (D) and Roy Blunt (R), wrote to Army Secretary John McHugh demanding answers and asking whether follow-up studies promised in 1997 by the National Research Council were ever completed. The full text of McCaskill’s letter and press release can be found here.
According to an Oct. 3, 2012, AP story, aides to Sens. McCaskill and Blunt said they have received no response. At the time of the story, the U.S. Army declined to be interviewed by the AP. The AP’s story notes that St. Louis was chosen for research because it resembled some Russian cities. However, one of the primary areas chosen for testing was the Pruitt-Igoe public housing complex, razed in the 1970s as a failed national public-housing experiment—and one of St. Louis’ legacies as a decaying city. At the time of the spraying by federal researchers, the complex housed 10,000 mostly African-American and low-income residents, 70 percent of whom were 12 or younger.
Martino-Taylor’s thesis (The Manhattan-Rochester Coalition, research on the health effects of radioactive materials, and tests on vulnerable populations without consent in St. Louis, 1945–1970) is worth examining firsthand. It describes how she was tipped to the improbable, almost unbelievable tales of two women, both sharing stories of having been unwilling human subjects of military spraying and of suffering health consequences from that research. At the time, she knew nothing about these allegations. Thus began her effort to request information from the federal government under the U.S. Freedom of Information Act, which often arrived in severely redacted form. A point that much of the media continues to miss is that her research focuses on the researchers as well as their victims. Her thesis states that her work looks at how a “large number of participants inside an organization will willingly participate in organizational acts that are harmful to others, and how large numbers of outsiders, who may or may not be victims of organizational activities, are unable to determine illegal or harmful activity by an organization.”
The leaders of the studies, which she calls the Manhattan-Rochester Coalition, were the researchers who conducted the human-subjects research on nuclear weapons as part of the country’s efforts to prepare for, and win, a possible nuclear confrontation with the U.S.S.R. During the tests in St. Louis and other areas, according to Martino-Taylor, the U.S. Army violated the 1947 Nuremberg Code, the standard set after trials of Nazi doctors and war criminals, which established that “voluntary consent of the human subject is absolutely essential” for any human-subjects testing. There was no such standard in these tests in St. Louis, Minneapolis, and elsewhere, Martino-Taylor maintains.
During the 1940s, the Nazi regime’s corrupt and criminal medical and scientific community committed horrific crimes at dozens of concentration and extermination camps in Nazi-occupied Europe, including vivisections, gassings, cold-water immersion tests, high-pressure tests, lethal injections, and intentional murder for “scientific purposes.” I visited many of the rooms and buildings where these crimes against humanity occurred during my tour of the camps in the summer of 2000, so it was especially painful to learn that my own government, in my former home city, may have been breaking established international guidelines that were codified following the defeat of the Nazis and their murderous state. (See my photo documentary here.) According to Martino-Taylor, the initial congressional investigation of the spraying program included testimony from experts who claimed the experiment team “chose to ignore Nuremberg.”
In the United States, following the Tuskegee syphilis experiments on African-American men, reforms were passed in 1979 through the Belmont Report, which theoretically was supposed to protect human research subjects from harm. However, even as the media report on this sensational story of testing on humans in two countries (Canada and the United States) in the 1950s and 1960s, researchers at elite universities and laboratories continue to violate the principles first set out at Nuremberg. Slate.com reported this year that “marginalized groups have frequently been coerced into studies that violate their right to consent. A recent review of the bio-ethics of human research in the U.S. offers little prospect for change.”
The Slate.com story, from Jan. 22, 2012, was gloomy in its overall assessment of the failure of safeguards to prevent unethical research on humans, particularly when large corporate interests are involved. The story said the Presidential Bioethics Commission issued a report on protecting human research subjects that trumpeted the United States’ so-called “robust” protections—rules that have repeatedly permitted and legitimized breaches of informed consent. “The failure to elicit consent is not confined to the U.S. One in every three U.S. corporate medical studies is now carried out abroad, usually in places where trials can be conducted more cheaply than in the U.S. Subjects are often unaware that the treatments are experimental.”
I am pretty sure the dust from this recent controversy will settle quickly, and even in St. Louis the community will focus more on its beloved Cardinals’ bid for another World Series title. It is likely that no one involved in these unethical, if not illegal, studies will ever be held accountable for their actions against the civilians they may have harmed.
A now-deceased doctor friend of mine, who dedicated his life to serving the Native community in the Indian Health Service, used the expression “Indian country” a lot to describe where he worked in New Mexico and Alaska. It is a legal term, codified in treaty rights, federal regulations, and court decisions. Indian country can be a physical place, associated with the customs and cultures of the continent’s first peoples. It is also a state of mind. You know you are in Indian country when you go there: there are the place names and, of course, the people. I grew up in St. Louis, Mo., which sits on the mighty Mississippi River (Ojibwe for “great river”), and I felt connected to Indian country there because of the great muddy and the phenomenal Cahokia Mounds just east of the city in Illinois. I knew I was living on historic Indian land even as a kid.
I have lived the last 16 years of my life in what I definitely consider to be Indian country: Alaska and Washington state. Alaska felt much more like Indian country to me. Anchorage, my home for six years, is very much a Native city in terms of population (about 16 percent). I rarely feel that connection in modern, congested, urban Seattle. But I recently took a four-day trip to the hot, upper plateau of central Washington, from the Methow Valley to Omak, and indeed felt I had landed four-square in Indian country again.
Federal law (18 U.S.C. § 1151) defines “Indian country” as:

-All land within the limits of any Indian reservation under the jurisdiction of the United States government, notwithstanding the issuance of any patent, and including rights-of-way running through the reservation;
-All dependent Indian communities within the borders of the United States whether within the original or subsequently acquired territory thereof, and whether within or without the limits of a State; and
-All Indian allotments, the Indian titles to which have not been extinguished, including rights-of-way running through the same.
Indian country also implies U.S. federal recognition of tribal bands as sovereign on their lands, enjoying a government-to-government relationship with the United States. As one source notes, recognized tribes “possess absolute sovereignty [and] are completely independent of any other political power,” though that sovereignty is shared with other jurisdictions (local, state, and federal).
In Washington state, federal definitions of “Indian country” apply to state law, in addition to provisions acknowledging tribes’ non-taxable status in some commerce, such as the sale of tobacco products to tribal members on their reservations. In Seattle, there is still a band, the sparsely populated Duwamish, who have lost their sovereign status and failed to win legal recognition, within the city limits, on some of the choicest real estate on the West Coast. Another nearby tribe, the Snoqualmie, regained their status in 1999, promptly built a casino, and became an economic and political player.
The decades-long fight over treaty-protected fishing and subsistence rights by the tribes culminated in the historic 1974 ruling in the landmark U.S. v. Washington case (the Boldt Decision) that unequivocally affirmed 19 federally recognized tribes’ fishing rights to salmon and steelhead runs in western Washington. That decision gave the tribes rights to half of the salmon, steelhead, and shellfish harvests in Puget Sound. It was a major game changer, and its impacts are still felt today—particularly legal squabbles over whether the decision should apply to land-use decisions affecting salmon habitat.
Yet, even as I gaze out on the beautiful Puget Sound, I am hard-pressed to think that I am on historic Indian lands, that I live in Indian country, where there are 29 federally-recognized tribes, in all corners of the state (see tribes and locations here). But this is very much Indian country in a historic and cultural sense.
In fact, more than half of the state was taken outright by military force, illegal land seizures, and treaties (which also provided fishing and resource rights to tribal members) from the 1850s to the 1890s. Many stories of the exploitation of Native tribes come to mind, notably the hanging of Yakima warrior Qualchan (also called Qualchew) by the reportedly violent Col. George Wright, in the campaign that defeated five tribes in the eastern half of what is now the state.
On Sept. 25, 1858, Qualchan surrendered under a white flag and was hanged within 15 minutes. That was followed by the hanging of six Palouse warriors the next day. Such incidents typified the period of conquest in my home state. Exploitation of tribal rights followed the signing of treaties. The Colville Tribes, for instance, had their lands taken without their consent, setting off decades of legal battles that continued into the 1930s and ended in historic settlements returning hundreds of thousands of stolen acres. Salmon and steelhead runs in the state were decimated by commercial fishing interests, harming tribal groups in the upper and lower Columbia River basin. The runs were further diminished by the dams built on the Columbia. Only with the Boldt Decision in 1974 did the tide turn, but with numbers nowhere near the great runs of 100 years earlier.
Again, all of this is very academic and abstract to me and most western Washington residents. Only when I traveled to the “World Famous Omak Stampede” rodeo and suicide race, in which Native riders charge down a 200-foot hill on horseback every second weekend of August, did I again realize I was truly in Indian country. Omak, in north central Washington, lies partially within the 1.4 million-acre Colville Reservation, in sparsely populated Okanogan and Ferry counties. The Confederated Tribes of the Colville Reservation number fewer than 10,000. I found the area to be amazingly beautiful. It’s hot in the summer, and bitterly cold in the winter. During my visit to Omak for the Stampede, the mercury hit 100 F.
Outside of agriculture (on non-tribal lands), there is little industry in this part of the state, but there is gold mining, forestry, and a limited personal use salmon fishery for tribal members. Forestry is the mainstay for generating tribal revenues. Gaming is also a big moneymaker at the tribes’ three casinos. If you can believe it, the casinos are attracting acts like blues legend Buddy Guy and rock has-beens like Foreigner and Joe Walsh in the next few weeks. I think it’s a bit sad that even stalwart Canadians are driving south from British Columbia to spend their loonies at the tribal gaming tables, but come they do.
Despite the flow of revenues, health issues remain a problem, as they do throughout Indian country. A June 9, 2012, story republished in the New York Daily News about Tribal Councilman Andy Joseph, Jr., profiles his efforts to address Native health funding issues. The story notes his tribal members and others nationally “are dying of cancer, diabetes, suicide and alcoholism. They are dying of many diseases at higher rates than the rest of the population. And instead of those rates getting better, they’re getting worse.” Joseph is the tribes’ representative to the Northwest Portland Area Health Board, which serves 41 tribes in Washington, Oregon, and Idaho, and is that group’s delegate to the National Indian Health Board, which speaks for all 566 federally-recognized tribes in the country. The story notes that, nationally, tribal members die an average of five years earlier than the rest of the U.S. population and are six times more likely to die of tuberculosis or alcoholism, three times more likely to die of diabetes, and also twice as likely to be killed in an accident. What’s more, they are also twice as likely to die from homicide or suicide. Pretty grim data indeed.
According to Joseph, the major health issues associated with diet and nutrition have occurred as a result of conquest and cultural assimilation. The story recounts: “Joseph holds up a jar of canned salmon sitting on his desk. ‘Our people crave this,’ he said. ‘It was taken away from us when they put Grand Coulee Dam in.’ He reaches for a string of dried camas root. ‘It’s what our bodies were raised with for thousands of years. Now, we have Safeway and Albertsons and Walmart.'”
In Omak, I got a taste of Native pride during the Omak Stampede Parade, which mainly featured local businesses, rodeo princesses, groups like firefighters, Republican office holders or candidates, and fewer than half a dozen Indian floats. (I saw no Latino groups in the parade, despite their large presence picking fruit and in agriculture–they “officially” number about 15 percent of Omak’s residents.)
The Stampede features a tribal encampment with teepees and a performance area where tribal members perform traditional dances and songs in gorgeous costumes. It reminded me a lot of Alaska, particularly the many gatherings I saw there, including the largest, the Alaska Federation of Natives Annual Convention. Yup, I was definitely in Indian country.
My only real regret was that I missed the Suicide Race, which features some of the state’s finest Native horsemen, who charge down the steep hill and swim across the Okanogan River on their way to the finish inside the Omak Stampede stadium. You can watch it on YouTube, and note that, yes, horses have sometimes died in this race.
The massive corporate, sport, and media spectacle that is the Olympics is underway. The Games’ charter, in idealistic language, calls for “respect for universal fundamental ethical principles.” Who doesn’t want that?
I’m a fan of the spirit of international cooperation and competition that are the “ideals” of the Games, but not the dark underbelly that has always accompanied them.
First, the positives. They bring out our best. They can lift you up. I simply loved watching Usain Bolt grab gold in the 100m and 200m sprints in Beijing in 2008, and the fact that he became an instant positive icon to hundreds of millions of people around the world. (Since then he has become a self-styled global brand.) Better yet, I continually marvel at the crop of Kenya’s world-class middle- and long-distance runners, such as Samuel Kamau Wanjiru, who won the marathon in Beijing in style with a blistering 2:06:32 time, and then tragically died in mysterious circumstances in 2011. Kenyans grabbed 14 medals in all in 2008, compared to Jamaica’s 11 – both amazing outcomes for countries that are relatively poor by all measures, and thus less capable of funding national sports programs. To me, these are the Games’ positives.
Kenya’s phenomenal male and female runners, who shine in the Games every four years, especially stand out for me, because most of them are running to escape poverty and build a better life for themselves and their families. A profile on NPR pegged the success of the Kenyan athletes to their training regime at high altitudes in the Rift Valley, their discipline, and the country’s limited economic opportunities, with many of the best runners hoping to win big-money marathons like Chicago’s or Berlin’s.
Of course, the dark side of the Olympics is the corporate control of every facet of the Games. As The Nation notes, the Olympics, under the leadership of Juan Antonio Samaranch, a certifiable Spanish fascist, “was transformed from Cold War spectacle into a neoliberal Trojan Horse: an invading corporate sledgehammer of privatization and payoffs.” The Games have a recent history of corruption (at both the Sydney and Salt Lake City Games). According to the Daily Mail, taxpayers are subsidizing what many critics of the Games call a giant corporate schmooze event, underwriting the Games by an 8-1 margin compared to private investors. A year before the Games began, the Daily Mail noted that “corporate fat cats” got more than half of the top tickets for showpiece events, which are handed out to corporate sponsors, Olympic bigwigs, and their various VIP guests. Only 32,000 out of 80,000 seats in the Olympic stadium – about four in ten – were available to the public for the marquee events.
Transparency International has pointed out the multiple ways corrupt practices taint the Games, through ticket allocations, corporate hospitality, media contracts, match fixing (not proven), construction allocation, old-fashioned cronyism from corrupt states who bring large entourages, and corporate sponsorship itself. According to Transparency International’s Robert Barrington, “of the 53 official corporate sponsors in London … several have also been subject to investigation under the U.S. Foreign Corrupt Practices Act (FCPA) or equivalent laws.”
The Olympics in my lifetime have always been mired in the problems and issues of the day, and they have reflected conflicts boiling on the international stage and larger cultural and racial currents. Memorable controversies in my lifetime include the massacre of hundreds of civilians by Mexican government forces just prior to the start of the Mexico City Games in 1968; the brutal killing of 11 Israeli athletes at the 1972 Munich Games by Palestinian gunmen, who were themselves killed in a botched rescue; the boycott by 62 nations of the 1980 Moscow Games following the USSR’s invasion of Afghanistan, and the retaliatory boycott of the 1984 Los Angeles Games by 14 nations; and the terrorist bombing that killed two persons at the 1996 Atlanta Games, carried out by the right-wing U.S. extremist, murderer, and abortion clinic bomber Eric Robert Rudolph. There are other scandals I could mention, but will not.
Perhaps the most abysmal moments in Olympic history were the summer and winter Games held in Nazi Germany in 1936. While many claim the Nazi ideology of racial superiority was destroyed by Jesse Owens’ four gold medals on the track, the Games were largely a massively successful propaganda operation by the Nazi state, according to the U.S. Holocaust Memorial Museum, as it marched toward implementing policies that later resulted in genocide against Jews and Gypsies and a war that claimed tens of millions of lives. It is worth watching the documentary of those Games by Nazi propagandist Leni Riefenstahl. The opening ceremonies show national squads like France’s marching into Berlin’s Olympic Stadium with their arms raised in a “sieg heil” salute to future mass murderer and then-dictator Adolf Hitler (go to 4:47 of the clip on this video — it is truly chilling).
In the city near where I grew up, St. Louis, the Olympics were hosted in conjunction with, and likely in the same spirit as, the 1904 World’s Fair (I wrote my undergraduate thesis on the fair, and know it well). This was a time of imperial expansion, of the prevalence of racial eugenics in scientific and political thinking, and of entrenched racism and segregation in the United States. The fair was the single largest gathering of human beings from other cultures ever put on display like a zoo. More than 5,000 persons from different cultures and countries were displayed, the largest group being Filipinos, whose country had been taken over by the United States a few years earlier during the Spanish-American War. Some of those humans on display – two tribesmen from South Africa who were part of the “Boer Exhibit” – actually competed in the Olympic marathon and did surprisingly well, finishing ninth and 12th. They were also the first Africans to compete in the then notoriously racially segregated Games.
The 1904 Olympic marathon is notable for many reasons, including a famous scandal and my odd connection to it. I used to live in University City, Mo. (next to St. Louis), and run on a road that was part of the marathon route, passing an official marathon mile marker every time I ran, about three times a week. It connected me to the Games in a personal way. That marathon had 32 starters; only 18 finished. The race began at 3 p.m., in 90-degree Fahrenheit heat (in Midwest humidity!). The winner, Thomas Hicks, was doped with strychnine and nearly died. A false winner, Fred Lorz, cheated by riding in a car for most of the race. Still another runner nearly died inhaling road dust kicked up by automobiles on the dirt roads used for the race.
While the scandals and problems that have plagued the Games nearly from their beginning remain to this day, I still would like to think of the Games as something that inspires. In many ways, I credit that Olympic mile marker sign for motivating many of my early morning runs in the dark in high school. I have not stopped running since.